Why do we go for BPM

Hi all,
Why do we go for BPM?

Hi,
BPM is used for error handling, handling messages from multiple senders, collecting messages, splitting messages, processing a message based on certain criteria, or when the receiver structure has to be created based on certain criteria.
It is also used to control complex document choreography - e.g. when you need to wait for two different types of messages to arrive and then send out another message of a 3rd type, or when you need to wait for a series of messages to arrive, and combine them into a single message to be sent out.
http://help.sap.com/saphelp_nw04/helpdata/en/de/766840bf0cbf49e10000000a1550b0/content.htm and
http://help.sap.com/saphelp_nw04/helpdata/en/69/4ad13fa69a4921e10000000a1550b0/content.htm
BPM is also one way to raise your alerts, in an exception step. This does not mean BPM is mandatory for raising alerts.
Hope this helps.
Regards,
Chetan Ahuja

Similar Messages

  • Why exactly do we need BPM in XI?

    Hi,
    Why exactly do we need BPM in XI?

    Hi Jeba,
    BPM in XI is ccBPM, i.e. cross-component BPM.
    It is used to control complex document choreography - e.g. when you need to wait for two different types of messages to arrive and then send out another message of a 3rd type, or when you need to wait for a series of messages to arrive, and combine them into a single message to be sent out.
    You don't need it for simple mappings.
    Strongly suggest you read the FAQ at the top of the forum and also take a look at http://service.sap.com/bpms to get some background.
    Regards,
    Jocelyn

  • Simple question:  I want to use iCloud as a back up disk for my documents folder.  How can I do this?  If I cannot do this, why am I paying for access to "the cloud?"

    Simple question:  I want to use iCloud as a back up disk for my documents folder.  How can I do this?  If I cannot do this, why am I paying for access to "the cloud?"

    iCloud does not provide general file storage or backup, so you cannot back up your Documents folder using it. You will need to find a third-party alternative - this page examines some options (some are free):
    http://rfwilmut.net/missing3
    iCloud at the basic level, with 5GB of storage, is free: you only pay anything if you want to increase the storage space.

  • Why are you asking for my Visa card information when I have a balance on my iTunes gift card?

    Why are you asking for my Visa card information when I have a balance on my iTunes gift card?

    Either because you're trying to buy some sort of gift, or the iTunes Store servers are checking the validity of your account and won't charge the card, or there's a problem at Apple's end.

  • Why is Domain required for an identity in the FIM Service?

    I have a scenario where FIM is managing identity, but not all identities have an Active Directory account. I have a flag in the FIM Portal (Service) that indicates if a particular
    user is entitled to an AD account or not. My provisioning setup adds or removes the AD account as appropriate. To support FIM Portal activities for those that do have AD accounts, I populate AccountName, Domain, and ObjectSID in the FIM Service from their
    corresponding attributes in AD.
    What I have noticed is that it does not seem possible to null out or delete the Domain attribute for a user in the FIM Service. I can delete the attributes for both AccountName
    and ObjectSID without issues.
    When attempting to remove the Domain attribute for a user I get the following in the event logs:
    Microsoft.ResourceManagement.WebServices.Exceptions.UnwillingToPerformException: Other ---> System.Data.SqlClient.SqlException: Procedure or function 'GetDomainConfigurationIdentifiersFromDomain'
    expects parameter '@domainName', which was not supplied.
    I assume that something internal to the FIM Service is trying to do some magic with validating the domain name and the domain configuration. I did find a post saying, “Yeah,
    you have to populate Domain”:
    http://social.technet.microsoft.com/Forums/en-US/f207caa9-3a6f-4f2d-8461-a83777280803/fim-service-ma-export-failedmodificationviawebservices-error?forum=ilm2
    My question is: why is Domain required for a user? It is obviously needed for users that have AD accounts and must authenticate with the Portal, but in the case where a user
    does not have an account (and therefore does not have a domain), it feels odd to store the incorrect data for the user. It also looks weird when you bring up list of users in the portal and see domain values for users that do not have accounts. In this particular
    case, the client has many domains and does have the Domain and AccountName attributes displayed on the user search results page.

    Hi Henry,
    Using another domain attribute and workflow to maintain the actual Domain and DomainConfiguration is a good suggestion, thanks.
    My original question still stands however... Why is Domain required in the FIM Service?
    It is sounding like the answer is "It is not really required on its own, but there is an internal process that requires it if there is a value for DomainContext set (and there is some magic that sets DomainContext, so you have to manually clear it)."
    Since DomainContext is automatically set when a client writes a value to Domain, I would suggest that it is a bug that DomainContext is not automatically cleared when Domain is cleared.
    I poked around a bit and the bug can be fixed by changing the stored procedure definition to allow null parameters. In the FIM Service database the stored procedure [fim].[GetDomainConfigurationIdentifiersFromDomain] has a parameter declaration of "@domainName
    NVARCHAR(448)". If this is changed to "@domainName NVARCHAR(448) = null" the problem appears to be solved.
    Making this change would of course be totally unsupported, but perhaps it can be included in a future product update.
    For now I will use Henry's workaround, or just live with potential out of date Domain data.
     Thanks

  • Want to delete all the mails in the mail box configured for BPM Portal

    Hi All,
    Do you have an idea how to perform this activity?
    I want to delete all the mails in the Dev mail box configured for BPM Portal.
    Server and mailbox details as given below :
    Mail a/c = Y00123
    Mail server = sap.mail.com
    Thanks, Sanjay

    http://java.sun.com/developer/onlineTraining/JavaMail/contents.html
    http://www.jguru.com/faq/view.jsp?EID=17035
    If you know the password of the account, I think you can also access the mailbox using a mail client, in the same way you use Outlook to deal with your company mail daily.
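    If the Dev mailbox is reachable over IMAP (or POP3), a small JavaMail program in the spirit of the tutorials above can flag every message as deleted and then expunge the folder. The sketch below is only an illustration, not a tested solution: the protocol and the password are assumptions, and the host and account are simply the values from the post.

    import java.util.Properties;
    import javax.mail.*;

    public class PurgeMailbox {
        public static void main(String[] args) throws Exception {
            Session session = Session.getInstance(new Properties());
            // Assumption: the mailbox is exposed via IMAP; use "pop3" if that is what the server offers.
            Store store = session.getStore("imap");
            // Host and account taken from the post; the password is a placeholder.
            store.connect("sap.mail.com", "Y00123", "your-password");
            Folder inbox = store.getFolder("INBOX");
            inbox.open(Folder.READ_WRITE);
            // Mark every message in the folder for deletion.
            for (Message m : inbox.getMessages()) {
                m.setFlag(Flags.Flag.DELETED, true);
            }
            inbox.close(true); // true = expunge the deleted messages
            store.close();
        }
    }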

  • [SOLVED] makepkg error: Why is it looking for '.part' files?

    I've been seeing the following problem with every package I try to build with makepkg.  The download works, but then makepkg complains that it can't find the source file:
    ==> Making package: android-sdk r20-2 (Fri Jun 29 11:01:35 CDT 2012)
    ==> Checking runtime dependencies...
    ==> Checking buildtime dependencies...
    ==> Retrieving Sources...
    -> Downloading android-sdk_r20-linux.tgz...
    % Total % Received % Xferd Average Speed Time Time Time Current
    Dload Upload Total Spent Left Speed
    100 78.7M 100 78.7M 0 0 1640k 0 0:00:49 0:00:49 --:--:-- 2166k
    mv: cannot stat ‘/usr/local/src/android-sdk_r20-linux.tgz.part’: No such file or directory
    ==> ERROR: Failure while downloading android-sdk_r20-linux.tgz
    Aborting...
    Why is it looking for the .part file, instead of the .tgz file?  When I run makepkg again, without doing anything in between, it finds the .tgz file and goes on its merry way.
    I'm using curl, and I haven't tried wget.  Judging by my unscientific sample of forum posts it looks like wget is the de facto standard, so this may not be on anyone else's radar.
    Here's my makepkg.conf:
    # /etc/makepkg.conf
    # SOURCE ACQUISITION
    #-- The download utilities that makepkg should use to acquire sources
    # Format: 'protocol::agent'
    DLAGENTS=('ftp::/usr/bin/curl -fC - --ftp-pasv --retry 3 --retry-delay 3 -o %o %u'
    'http::/usr/bin/curl -fLC - --retry 3 --retry-delay 3 -o %o %u'
    'https::/usr/bin/curl -fLC - --retry 3 --retry-delay 3 -o %o %u'
    'rsync::/usr/bin/rsync -z %u %o'
    'scp::/usr/bin/scp -C %u %o')
    # Other common tools:
    # /usr/bin/snarf
    # /usr/bin/lftpget -c
    # /usr/bin/wget
    # ARCHITECTURE, COMPILE FLAGS
    CARCH="x86_64"
    CHOST="x86_64-unknown-linux-gnu"
    #-- Compiler and Linker Flags
    # -march (or -mcpu) builds exclusively for an architecture
    # -mtune optimizes for an architecture, but builds for whole processor family
    CFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fstack-protector --param=ssp-buffer-size=4 -D_FORTIFY_SOURCE=2"
    CXXFLAGS="-march=x86-64 -mtune=generic -O2 -pipe -fstack-protector --param=ssp-buffer-size=4 -D_FORTIFY_SOURCE=2"
    LDFLAGS="-Wl,-O1,--sort-common,--as-needed,-z,relro"
    #-- Make Flags: change this for DistCC/SMP systems
    MAKEFLAGS="-j4"
    # BUILD ENVIRONMENT
    # Defaults: BUILDENV=(fakeroot !distcc color !ccache check !sign)
    # A negated environment option will do the opposite of the comments below.
    #-- fakeroot: Allow building packages as a non-root user
    #-- distcc: Use the Distributed C/C++/ObjC compiler
    #-- color: Colorize output messages
    #-- ccache: Use ccache to cache compilation
    #-- check: Run the check() function if present in the PKGBUILD
    #-- sign: Generate PGP signature file
    BUILDENV=(fakeroot !distcc color !ccache check !sign)
    #-- If using DistCC, your MAKEFLAGS will also need modification. In addition,
    #-- specify a space-delimited list of hosts running in the DistCC cluster.
    #DISTCC_HOSTS=""
    #-- Specify a directory for package building.
    #BUILDDIR=/tmp/makepkg
    # GLOBAL PACKAGE OPTIONS
    # These are default values for the options=() settings
    # Default: OPTIONS=(strip docs libtool emptydirs zipman purge !upx)
    # A negated option will do the opposite of the comments below.
    #-- strip: Strip symbols from binaries/libraries
    #-- docs: Save doc directories specified by DOC_DIRS
    #-- libtool: Leave libtool (.la) files in packages
    #-- emptydirs: Leave empty directories in packages
    #-- zipman: Compress manual (man and info) pages in MAN_DIRS with gzip
    #-- purge: Remove files specified by PURGE_TARGETS
    #-- upx: Compress binary executable files using UPX
    OPTIONS=(strip docs !libtool emptydirs zipman purge !upx)
    #-- File integrity checks to use. Valid: md5, sha1, sha256, sha384, sha512
    INTEGRITY_CHECK=(md5)
    #-- Options to be used when stripping binaries. See `man strip' for details.
    STRIP_BINARIES="--strip-all"
    #-- Options to be used when stripping shared libraries. See `man strip' for details.
    STRIP_SHARED="--strip-unneeded"
    #-- Options to be used when stripping static libraries. See `man strip' for details.
    STRIP_STATIC="--strip-debug"
    #-- Manual (man and info) directories to compress (if zipman is specified)
    MAN_DIRS=({usr{,/local}{,/share},opt/*}/{man,info})
    #-- Doc directories to remove (if !docs is specified)
    DOC_DIRS=(usr/{,local/}{,share/}{doc,gtk-doc} opt/*/{doc,gtk-doc})
    #-- Files to be removed from all packages (if purge is specified)
    PURGE_TARGETS=(usr/{,share}/info/dir .packlist *.pod)
    # PACKAGE OUTPUT
    # Default: put built package and cached source in build directory
    #-- Destination: specify a fixed directory where all packages will be placed
    PKGDEST=/usr/local/pkgs
    #-- Source cache: specify a fixed directory where source files will be cached
    SRCDEST=/usr/local/src
    #-- Source packages: specify a fixed directory where all src packages will be placed
    #SRCPKGDEST=/home/srcpackages
    #-- Packager: name/email of the person or organization building packages
    PACKAGER="Whitney Marshall <[email protected]>"
    #-- Specify a key to use for package signing
    GPGKEY="E4FB694E"
    # EXTENSION DEFAULTS
    # WARNING: Do NOT modify these variables unless you know what you are
    # doing.
    PKGEXT='.pkg.tar.xz'
    SRCEXT='.src.tar.gz'
    # vim: set ft=sh ts=2 sw=2 et:

    It looks like this code is the culprit.  I don't have time to dig into it right now, but presumably this works with wget.  I would have thought any cmdline downloader would manage renaming the .part file itself...?
    makepkg, lines 395-417 (pacman 4.0.3-2):
    395     # replace %o by the temporary dlfile if it exists
    396     if [[ $dlcmd = *%o* ]]; then
    397         dlcmd=${dlcmd//\%o/\"$file.part\"}
    398         dlfile="$file.part"
    399     fi
    ...
    414     # rename the temporary download file to the final destination
    415     if [[ $dlfile != "$file" ]]; then
    416         mv -f "$SRCDEST/$dlfile" "$SRCDEST/$file"
    417     fi

  • How to setup the cluster environment for BPM using weblogic

    I want to set up a cluster environment for BPM using WebLogic.
    I have installed Oracle WebLogic Server 10gR3 and Oracle BPM Enterprise for WebLogic 10gR3.
    I have used the Admin tools from Oracle BPM Enterprise for WebLogic to set up the configuration and create the WebLogic domain servers.
    I can launch the Process Administrator and import the project .exp file into the domain server.
    But what should I do to set up the cluster environment using WebLogic?
    What I want to do is:
    set up one admin machine,
    set up two production machines,
    and enable the cluster so the admin machine can monitor the status of the production machines.
    Thanks a lot.

    The install guide at http://download-llnw.oracle.com/docs/cd/E13154_01/bpm/docs65/config_guide/index.html gives a reasonable amount of info on how to do this.
    Personally I have not used the OBPM option to configure WebLogic; instead I've used the information in the above install guide to create the WebLogic domain in advance of configuring OBPM.
    Once you've set up WebLogic, configure OBPM using the values I mention in the following thread: How to set the JMX Engine parameter in Process Administation?
    Let me know any specific config questions and I'll do my best to answer them for you.
    Thanks,
    Mike

  • HT4889 Migration Assistant to update my new Mountain Lion iMac from an external USB hard disk drive, it is constantly saying that it is "looking for other computers". It doesn't find the external drive. Why is it looking for other computers?

    I'm trying to use Migration Assistant to update my new Mountain Lion iMac from an external USB hard disk drive. I told it to look for a drive, yet it is constantly saying that it is "looking for other computers". It doesn't find the external drive ... it just endlessly looks for other computers. Why is it looking for other computers at all, when I told it not to?

    Wow, the wording in Migration Assistant is misleading. I've never used it before, so I thought I would try to copy my files from the external drive ... my old iMac died, but I managed to get everything I need off it, using the 'cp' command in single-user mode. So I guess I'll just have to manually copy the files from the external drive to the new machine. I was hoping that Migration Assistant might help somehow, but obviously not.
    Thanks for the quick reply!

  • When updating itunes, it keeps saying invalid drive E, why does it look for drive e?

    When updating itunes, it keeps saying invalid drive E, why does it look for drive e?

    Try the following user tip:
    "Invalid drive X:\" install errors

  • When we go for BPM?

    BPM Team,
    Can you please give inputs on when we should go for BPM?
    Regards
    Swarna.

    Hi Swarna,
    Business Process Management is the automation of those employee activities that cost the company valuable time and money.
    It is readily understandable by all business users, from the business analysts who create the initial drafts of the processes, to the technical developers responsible for implementing the technology that will perform those processes, and finally to the business people who will manage and monitor those processes.
    It creates a standardized bridge across the gap between business process design and process implementation.
    Cheers,
    Divya

Why do we go for queued delta instead of unserialized and direct delta?

    Hi Experts,
    Why do we go for queued delta instead of unserialized and direct delta? Please specify the reasons.
    What happens internally when we use queued delta and direct delta?
    I will allocate points to those who help me in detail. My advance thanks to those who respond to my query.

    Hi,
    Direct Delta
    With this update mode, extraction data is transferred directly to the BW delta queues every time a document is posted. In this way, each document posted with delta extraction is converted to exactly one LUW in the related BW delta queues. If you are using this method, there is no need to schedule a job at regular intervals to transfer the data to the BW delta queues. On the other hand, the number of LUWs per DataSource increases significantly in the BW delta queues because the deltas of many documents are not summarized into one LUW in the BW delta queues as was previously the case for the V3 update.
    If you are using this update mode, note that you cannot post any documents during delta initialization in an application, from the start of the recompilation run in the OLTP until all delta init requests have been successfully updated in BW. Otherwise, data from documents posted in the meantime is irretrievably lost. The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.
    This update method is recommended for the following general criteria:
    a) A maximum of 10,000 document changes (creating, changing or deleting documents) are accrued between two delta extractions for the application in question. A (considerably) larger number of LUWs in the BW delta queue can result in terminations during extraction.
    b) With a future delta initialization, you can ensure that no documents are posted from the start of the recompilation run in R/3 until all delta-init requests have been successfully posted. This applies particularly if, for example, you want to include more organizational units such as another plant or sales organization in the extraction. Stopping the posting of documents always applies to the entire client.
    Queued Delta
    With this update mode, the extraction data for the affected application is compiled in an extraction queue (instead of in the update data) and can be transferred to the BW delta queues by an update collective run, as previously executed during the V3 update.
    Up to 10,000 delta extractions of documents to an LUW in the BW delta queues are cumulated in this way per DataSource, depending on the application.
    If you use this method, it is also necessary to schedule a job to regularly transfer the data to the BW delta queues ("update collective run"). However, you should note that reports delivered using the logistics extract structures Customizing cockpit are used during this scheduling. This scheduling is carried out with the same report which is used with the V3 update (RMBWV311, RMBWV312 or RMBWV313). There is no point in scheduling with the RSM13005 report for this update method, since this report only processes V3 update entries. The simplest way to perform scheduling is via the "Job control" function in the logistics extract structures Customizing cockpit. We recommend that you schedule the job hourly during normal operation - that is, after successful delta initialization.
    In the case of a delta initialization, the document postings of the affected application can be included again after successful execution of the recompilation run in the OLTP (e.g OLI7BW, OLI8BW or OLI9BW), provided that you make sure that the update collective run is not started before all delta Init requests have been successfully updated in the BW.
    In the posting-free phase during the recompilation run in OLTP, you should execute the update collective run once (as before) to make sure that there are no old delta extraction data remaining in the extraction queues when you resume posting of documents.
    Using transaction SMQ1 and the queue names MCEX11, MCEX12 or MCEX13 you can get an overview of the data in the extraction queues.
    If you want to use the functions of the logistics extract structures Customizing cockpit to make changes to the extract structures of an application (for which you selected this update method), you should make absolutely sure that there is no data in the extraction queue before executing these changes in the affected systems. This applies in particular to the transfer of changes to a production system. You can perform a check when the V3 update is already in use in the respective target system using the RMCSBWCC check report.
    In the following cases, the extraction queues should never contain any data:
    - Importing an R/3 Support Package
    - Performing an R/3 upgrade
    For an overview of the data of all extraction queues of the logistics extract structures Customizing Cockpit, use transaction LBWQ. You may also obtain this overview via the "Log queue overview" function in the logistics extract structures Customizing cockpit. Only the extraction queues that currently contain extraction data are displayed in this case.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method.
    This update method is recommended for the following general criteria:
    a) More than 10,000 document changes (creating, changing or deleting a document) are performed each day for the application in question.
    b) In future delta initializations, you must reduce the posting-free phase to executing the recompilation run in R/3. The document postings should be included again when the delta Init requests are posted in BW. Of course, the conditions described above for the update collective run must be taken into account.
    Un-serialized V3 Update
    Note: Before PI Release 2002.1 the only update method available was V3 Update. As of PI 2002.1 three new update methods are available because the V3 update could lead to inconsistencies under certain circumstances. As of PI 2003.1 the old V3 update will not be supported anymore.
    With this update mode, the extraction data of the application in question continues to be written to the update tables using a V3 update module and is retained there until the data is read and processed by a collective update run.
    However, unlike the current default (the serialized V3 update), the data is read in the update collective run without taking the sequence in the update tables into account, and then transferred to the BW delta queues.
    The restrictions and problems described in relation to the "Serialized V3 update" do not apply to this update method since serialized data transfer is never the aim of this update method. However, you should note the following limitation of this update method:
    The extraction data of a document posting, where update terminations occurred in the V2 update, can only be processed by the V3 update when the V2 update has been successfully posted.
    This update method is recommended for the following general criteria:
    a) Due to the design of the data targets in BW and for the particular application in question, it is irrelevant whether or not the extraction data is transferred to BW in exactly the same sequence in which the data was generated in R/3.
    Thanks,
    JituK

  • When we will go for BPM

    Hi,
    I want to know when exactly we go for BPM, and how to work through a simple BPM scenario.

    Hi Sreenu,
    BPM is used for stateful communication: suppose you have to delay message processing, or wait for other messages to arrive and then send them all together; in that case, use BPM.
    We will use BPM whenever we want to do the following:
    1. Control or monitor messages in XI
    2. Collect or merge messages in XI
    3. Split messages in XI
    4. Multicast a message
    5. Send an alert
    6. Perform transformations
    With its BPM capability, SAP NetWeaver:
    • Exploits business-process efficiency by giving your business users the ability to directly model, manage, monitor, and analyze business processes
    • Enables continuous process improvement and the dynamic modification of business processes
    • Extends the value of your company's core business investment and maximizes the return on its strategic assets by providing the ability to change process rules without additional IT investment
    • Provides greater visibility into critical business operations for better decision making by delivering the right information at the right time
    • Allows the integration of people, applications, and internal and external resources
    Process step types:
    Message relevant:
    Receive: We use it to receive a message. By receiving a message we bring the data into the process. We can use it to start a process, and to activate or use correlations.
    Send: We use it to send either an asynchronous or synchronous message or an acknowledgement.
    Receiver Determination: We use it to get the list of receivers for a subsequent send step. It calls the receiver determination that we configured in the Integration Directory and returns the list of receivers.
    Transformation: We use it to change a message inside the process, e.g. to bundle multiple messages into one or split a message into multiple messages.
    Using this we can create N:1, 1:N or 1:1 transformations. In a general scenario a 1:N transformation is possible.
    Process flow control Relevant:
    Container: We use it to set a value for a target container element at runtime. The target container element and the assigned value must have the same type.
    Control: We use it to terminate the current process and to trigger an exception and to trigger an alert.
    While Loop: To repeat the execution of steps within the loop.
    Fork: We use it when you want to continue a process in branches that are independent of each other, e.g. to communicate with two systems that are independent of each other.
    Block: We use it to combine steps that you want to execute one after the other and which are to access the local data.
    Empty: It has no influence on the process flow. We use it as a placeholder for a step that has not yet been defined, or as a step with no function for test purposes.
    Wait: We use it to incorporate a delay in the process.
    Switch: We use it to define different processing branches for a process
    T.Codes for B.P.M:
    SXMB_MONI_BPE
    SXWF_XI_SW11
    For example, here is a small explanation of a requirement for which we used BPM:
    A background program should be scheduled to run every 10 minutes to analyse any creations, changes or deletions that have occurred to the material master records since the last run.
    There are two message mappings involved in the whole scenario. The first mapping is an N:1 mapping which will be used in the BPM, and the second mapping is a 1:1 mapping:
    1. First message mapping, N:1 - a mapping between the IDoc (occurrence 0...unbounded in the "Messages" tab) and the IDoc with the occurrence of its top node (IDOC) changed to 0...unbounded. This message mapping will be used in the BPM transformation step.
    Description: This BPM collects, for 10 minutes, all IDocs of message type ZMATMAS05 according to receiver partner number (field RCVPRN) and calls the N:1 mapping to bundle the collected IDocs into one external definition for that IDoc.
    Use of the wait step:
    You use a wait step to incorporate a delay in a process. Usually, you use a delay to define when the next step in the process is to start. You can define a delay as either a point in time or a period of time.
    At runtime, the step waits until the specified point in time is reached or the specified period of time has passed. The system then continues the process by proceeding with the next step.
    This blog explains clearly how to do a file-to-file scenario with BPM:
    /people/krishna.moorthyp/blog/2005/06/09/walkthrough-with-bpm
    If it is File -> RFC -> File using BPM, then refer to this blog:
    /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit
    BPM-1 /people/krishna.moorthyp/blog/2005/06/09/walkthrough-with-bpm
    BPM-2 /people/krishna.moorthyp/blog/2006/04/08/reconciliation-of-messages-in-bpm
    BPM-3 /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit
    BPM-4 /people/michal.krawczyk2/blog/2005/06/11/xi-how-to-retrieve-messageid-from-a-bpm
    Integration Scenario
    /people/venkat.donela/blog/2006/02/17/companion-guide-to-integration-scenario
    /people/siva.maranani/blog/2005/08/27/modeling-integration-scenario146s-in-xi
    Schedule BPM
    /people/siva.maranani/blog/2005/05/22/schedule-your-bpm
    Use of Synch - Asynch bridge in ccBPM
    /people/sriram.vasudevan3/blog/2005/01/11/demonstrating-use-of-synchronous-asynchronous-bridge-to-integrate-synchronous-and-asynchronous-systems-using-ccbpm-in-sap-xi
    Use of Synch - Asynch bridge in ccBPM
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1403 [original link is broken]
    without BPM
    /people/henrique.pinto/blog/2007/08/02/syncasync-scenarios-without-bpm
    without BPM1
    /people/venkataramanan.parameswaran/blog/2007/01/18/syncasync-communication-in-jms-adapter-without-bpm-sp19
    IDOC BPM
    /people/pooja.pandey/blog/2005/07/27/idocs-multiple-types-collection-in-bpm
    To deal with multiple senders and receivers based on conditions, we could use BPM. It is one of the features of BPM, but it is not mandatory to go for BPM in each and every case. It depends upon the scenario.
    /people/marilyn.pratt/blog/2007/10/12/clubhouse-las-vegas-a-bpm-roadmap
    BPM Process Patterns:Repeatable Design for BPM Process Models
    http://www.bptrends.com/publicationfiles/05%2D06%2DWP%2DBPMProcessPatterns%2DAtwood1%2Epdf
    BPM Steps link : http://help.sap.com/search/highlightContent.jsp
    BPM - BUSINESS PROCESS MANAGEMENT
    Transformation Error and still stuck?
    /people/shabarish.vijayakumar/blog/2005/12/07/transformation-error-and-still-stuck
    Walkthrough with BPM
    /people/krishna.moorthyp/blog/2005/06/09/walkthrough-with-bpm
    Reconciliation of Messages in BPM
    /people/krishna.moorthyp/blog/2006/04/08/reconciliation-of-messages-in-bpm
    Reconciliation of Messages in BPM Contd. - Restart Workflow
    /people/krishna.moorthyp/blog/2006/04/08/reconciliation-of-messages-in-bpm-contd--restart-workflow
    XI: How to... retrieve MESSAGE_ID from a BPM
    /people/michal.krawczyk2/blog/2005/06/11/xi-how-to-retrieve-messageid-from-a-bpm
    XI: Do you realy enjoy clicking and waiting while tracing BPM steps?
    /people/michal.krawczyk2/blog/2005/09/04/xi-do-you-realy-enjoy-clicking-and-waiting-while-tracing-bpm-steps
    BPM:Single Sender and Multiple Receivers based on synchronous exchange(switch) part-1
    /people/prasadbabu.nemalikanti3/blog/2006/03/10/bpmsingle-sender-and-multiple-receivers-based-on-synchronous-exchangeswitch-part-1
    Collecting IDocs without using BPM
    /people/stefan.grube/blog/2006/09/18/collecting-idocs-without-using-bpm
    Multi-Mapping without BPM - Yes, it's possible!
    /people/jin.shin/blog/2006/02/07/multi-mapping-without-bpm--yes-it146s-possible
    Sync/Async scenarios without BPM
    /people/henrique.pinto/blog/2007/08/02/syncasync-scenarios-without-bpm
    XI/PI: BPM modeling in Aris for SAP Netweaver - a teaser
    /people/michal.krawczyk2/blog/2006/11/27/xipi-bpm-modeling-in-aris-for-sap-netweaver--a-teaser
    XI: who said he cannot be stopped? BPM JIM - SP17
    /people/michal.krawczyk2/blog/2006/06/27/xi-who-said-he-cannot-be-stopped-bpm-jim--sp17
    Schedule Your BPM
    /people/siva.maranani/blog/2005/05/22/schedule-your-bpm
    How to integrate unified worklist to XI-BPM
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/bb9100f8-0c01-0010-ac8e-e017351f3fc1
    Usage of Sync-Async when both Sender and Receiver are Synchronous Apps
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1403 [original link is broken]
    Using a BPM to collect messages for a set interval of time
    /people/daniel.graversen/blog/2006/09/07/using-a-bpm-to-collect-messages-for-a-set-interval-of-time
    Sync/Async scenarios without BPM
    /people/henrique.pinto/blog/2007/08/02/syncasync-scenarios-without-bpm
    Illustration of Multi-Mapping and Message Split using BPM in SAP Exchange Infrastructure
    /people/sudharshan.aravamudan/blog/2005/12/01/illustration-of-multi-mapping-and-message-split-using-bpm-in-sap-exchange-infrastructure
    Regards,
    Vinod.

  • XI Alerts not working for BPM

    We have been live with alert framework for some time, but we recently noticed we are not getting alerts when mapping fails within a BPM. 
    All other errors are being alerted to us properly.  Is there something we need to configure differently for BPM? 
    Thanks in advance,
    Michael

    Hi,
    Did you trigger the alert from the BPM? Check the alert-sending step in the BPM.
    This SAP Note on alert troubleshooting may help you: 913858
    Alerts triggered from BPM, e.g.: /people/michal.krawczyk2/blog/2005/03/13/alerts-with-variables-from-the-messages-payload-xi--updated
    In the BPM, you can configure an exception branch for the mapping. Whenever a mapping exception occurs, you can trigger an alert.
    Regards,
    Moorthy

Why does it take a month to get internet service?

    I'd like to know why it takes a month to get internet service. I ordered my service on Jan 18, 2012 and I got the installation kit on Jan 21, 2012. But I can't use the internet, and I have to wait until Feb 9, 2012 for my service-ready date. I think that is a long time. I need the internet to do my project, but I can't. I'm very disappointed right now. I thought I could use the internet soon after I placed my order (2 or 3 days would be OK, but having to wait for a month is terrible). Why can't they just turn on the service? I'd like to know who I can talk with.

    Have you tried hooking the modem up, and does it get sync and does the Internet light come on? Even though you technically shouldn't do this, the line might already be up and running but they haven't told you yet. I don't know why it takes a month to hook up DSL. In my area, it typically takes two weeks from order to live, whether it's new service or a speed upgrade/downgrade (those are usually one week, and can even be done on the spot!). No one has ever explained why it takes that long, but I presume it has something to do with the billing cycle, perhaps line conditioning if they still do that (removing bridged taps and load coils), and giving a technician time to pick up the service request and perform the task before someone builds the profile for the line, connects a few cables to an empty slot in a DSLAM, and does some other background tasks to bring the connection online.
    ========
    The first to bring me 1Gbps Fiber for $30/m wins!
