Pipeline structure

Can anyone please explain what the pipeline structure in XI is?
Regards,
Narasimha

Hi,
In XI terms, the pipeline is the set of services that process XML messages on the Integration Engine
(that is, after the adapter has sent the XML to the Integration Engine).
When a source message reaches the Integration Server, six steps are performed before the message reaches its destination. The steps are:
1) Receiver Determination: Determines the systems that participate in the exchange of the message.
2) Interface Determination: For each receiver, determines which interface should receive the message.
3) Message Split: If more than one receiver is found, XI instantiates a new message for each receiver.
4) Message Mapping: Transforms the source message into the destination message format.
5) Technical Routing: Binds a specific destination and protocol to the message.
6) Call Outbound Adapter: Sends the transformed message to the adapter or a proxy.
You can examine these steps using transaction SXMB_MONI.
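The six steps can be sketched as a plain control-flow outline. This is an illustrative Python sketch with hypothetical names, not SAP code:

```python
# Illustrative sketch of the six Integration Engine pipeline steps.
# All names here are hypothetical -- this shows only the control flow.

def process_message(msg, cfg):
    # 1) Receiver determination: which systems participate in the exchange.
    receivers = cfg["receivers"](msg)
    results = []
    # 3) Message split: one new message instance per receiver found.
    for receiver in receivers:
        m = dict(msg)
        # 2) Interface determination: which interface this receiver gets.
        m["interface"] = cfg["interface"](receiver)
        # 4) Message mapping: transform the source payload to the target format.
        m["payload"] = cfg["mapping"](receiver, m["payload"])
        # 5) Technical routing: bind destination and protocol.
        m["destination"] = cfg["route"](receiver)
        # 6) Call outbound adapter: hand the message to the adapter/proxy.
        results.append(cfg["send"](m))
    return results
```

The point of the sketch is the ordering: routing decisions happen before mapping, and the split produces one independent message per receiver.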
refer
What is a pipeline service -- answered by Agasthuri Doss & Jai Shankar
http://help.sap.com/saphelp_nw04/helpdata/en/41/b714f85ffc11d5b3ea0050da403d6a/frameset.htm
/people/dmitry.govberg/blog/2006/11/29/post-a-message-to-xi-pipeline
http://help.sap.com/saphelp_nw2004s/helpdata/en/80/942f3ffed33d67e10000000a114084/frameset.htm
/people/aravindh.prasanna/blog/2005/12/23/configuring-scenario-specific-e-mail-alerts-in-xi-ccms-part--1
/people/krishna.moorthyp/blog/2006/04/08/reconciliation-of-messages-in-bpm
/people/michal.krawczyk2/blog/2005/09/09/xi-alerts--step-by-step
/people/siva.maranani/blog/2005/05/25/understanding-message-flow-in-xi
Understanding message flow in XI
Thanks
Swarup

Similar Messages

  • Problem about OpenSPARC T1 pipeline structure

    Hi,
    I have a few questions regarding the OpenSPARC T1 core pipeline.
    1. Which .v files in the source code implement the pipeline stages such as fetch, thread select, decode, execute, memory, and writeback?
    Are the fetch, thread select, and decode stages in sparc_ifu.v or in other .v files?
    2. What is the sparc_pipe_flow.v file used for? Is it used for illustrating the pipeline?
    I'm a newcomer, thanks a lot!

    Hi Eric,
    The file sparc_pipe_flow.v is a verification and debug file used to track instructions through the pipeline. An engineer debugging a problem with the RTL can use the signals in this file to see what instruction, if any, is in each stage of the pipeline.
    In the RTL itself, there is no one file which contains the pipeline. Each unit contains pieces of the pipeline. For example the instruction fetch unit (IFU) contains the pipe stages f and s.
    Note that signals in the pipeline are usually coded with a suffix which indicates which stage of the pipeline they are in. For example, the signal tlu_flush_m is located in the M stage of the pipeline.
    Hope this helps,
    Tom
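The stage-suffix convention Tom describes can be illustrated with a tiny helper that maps a signal name to its stage. This is a hypothetical (non-Verilog) sketch; only the f, s, and m suffixes are confirmed by the text above, the rest of the letter-to-stage mapping is an assumption from the T1 stage names:

```python
# Hypothetical illustration of the OpenSPARC signal-naming convention:
# a trailing "_<letter>" suffix marks the pipeline stage a signal belongs to.
# Only f, s and m are explicitly mentioned above; d/e/w are assumed.
STAGES = {
    "f": "fetch", "s": "thread select", "d": "decode",
    "e": "execute", "m": "memory", "w": "writeback",
}

def stage_of(signal_name):
    """Return the pipeline stage encoded in a signal's suffix, if any."""
    suffix = signal_name.rsplit("_", 1)[-1]
    return STAGES.get(suffix, "unknown")
```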

  • Unable to connect multiple MAF components in a WPF host application

    I'm trying to connect my MAF component parts so the add-in extensibility works; however, I'm getting errors while trying to build my add-in store.
    string path = @"...\MyProject\Extensibility\Output";
    string[] errorList = AddInStore.Rebuild(path);
    That's the pipeline path, which according to the documentation should look like:
    Extensibility\
    Output\
    AddIns
    AddInSideAdapters
    AddInViews
    Contracts
    HostSideAdapters
    My WPF host app DLL file is located under this directory:
    MyProject\WPFApp\bin\debug
    So, I'm unsure where to build my HostView. Currently the output is the pipeline root directory (Extensibility\Output)
    I have 4 errors in my errorList:
    While examining an assembly for pipeline segments, got a ReflectionTypeLoadException: Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information. File Name: ...MyProject\Extensibility\Output\AddInSideAdapters\AddInSideAdapters.dll
    2:
    Could not connect up a part in a pipeline to its neighbors: Contract Name: "IAddInContract" Location: "Contracts\Contracts.dll".
    3:
    Could not connect up a part in a pipeline to its neighbors: AddInBase Name: "AddInView" Location: "AddInViews\AddInViews.dll".
    4:
    Could not connect 2 valid add-in model parts.
    And the code of the add-in side adapter, contract, and add-in view, respectively:
    [AddInAdapter]
    public class AddInSideAdapter : ContractBase, IAddInContract
    {
        private AddInView view;
        public AddInSideAdapter(AddInView view)
        {
            this.view = view;
        }
        public void Initialize(IPluginHandler handler)
        {
            view.Initialize(handler);
        }
        public INativeHandleContract GetCustomUI()
        {
            return FrameworkElementAdapters.ViewToContractAdapter(view.GetCustomUI());
        }
    }
    [AddInContract]
    public interface IAddInContract : IContract
    {
        void Initialize(IPluginHandler handler);
        INativeHandleContract GetCustomUI();
    }
    [AddInBase]
    public abstract class AddInView
    {
        public abstract void Initialize(IPluginHandler handler);
        public abstract FrameworkElement GetCustomUI();
    }
    Their output directories are fine; everything builds into the corresponding pipeline segments. I really cannot understand what's failing. The pipeline structure is fine and the build paths of the segments are fine (I'm just not sure about the HostView and the host app). The path variable
    is fine; I just omitted the full path for brevity.

    Hi Mefhisto,
    >> "So, I'm unsure where to build my HostView. Currently the output is the pipeline root directory"
    I found some information about your problem in "Pipeline Development Requirements". It says: "The host application and the host view are typically deployed in the same directory. The pipeline directory can be in any location but is typically in the same directory as the host application." This is the link to the document:
    https://msdn.microsoft.com/en-us/library/bb384240(v=vs.90).aspx. There are also several samples you can refer to:
    http://clraddins.codeplex.com/wikipage?title=Samples&referringTitle=Home
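As a quick sanity check of the layout the documentation expects, one could verify that the segment folders exist before calling AddInStore.Rebuild. This is a hypothetical Python sketch (the real rebuild is .NET code); the folder names come from the directory listing in the question:

```python
import os

# The five pipeline segment folders the MAF add-in model expects under the
# pipeline root (taken from the directory listing in the question above).
SEGMENTS = ["AddIns", "AddInSideAdapters", "AddInViews",
            "Contracts", "HostSideAdapters"]

def missing_segments(pipeline_root, isdir=os.path.isdir):
    """Return the expected segment folders that are absent under the root."""
    return [s for s in SEGMENTS
            if not isdir(os.path.join(pipeline_root, s))]
```

If this reports nothing missing but Rebuild still fails, the problem is more likely in the segment assemblies themselves (e.g. the ReflectionTypeLoadException above) than in the directory layout.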

  • Unable to set configured property "/Initial.initialServices"

    Hi All,
    I am getting the following errors:
    05:21:52,006 INFO  [STDOUT] Unable to set configured property "/Initial.initialServices" atg.nucleus.ConfigurationException: Unable to resolve component /com/progiweb/fbconnect/pipeline/FacebookAuthStatusServlet
    05:21:52,013 INFO  [STDOUT] **** Error
    05:21:52,064 INFO  [STDOUT] Unable to set configured property "/Initial.initialServices" atg.nucleus.ConfigurationException: Unable to resolve component /com/progiweb/fbconnect/pipeline/FacebookProfileRequestServlet
    I have checked the config path of both components. The servlet sources are:
    src/com/progiweb/fbconnect/pipeline/FacebookAuthStatusServlet.java and src/com/progiweb/fbconnect/pipeline/FacebookProfileRequestServlet.java
    The properties files I have created under config/atg/dynamo/servlet/dafpipeline and config/atg/dynamo/servlet/pipeline;
    inside these I keep the FacebookAuthStatusServlet.properties and FacebookProfileRequestServlet.properties files.
    The path of Initial.properties is config/atg/dynamo/servlet/Initial.properties, with the following content:
    The content of Initial.properties is -
    initialServices+=\
      /atg/dynamo/servlet/dafpipeline/FacebookAuthStatusServlet,\
      /atg/dynamo/servlet/dafpipeline/FacebookProfileRequestServlet
    Do I need to keep both pipeline directories (dafpipeline and pipeline), or only one of them?
    Is my pipeline structure correct?
    How do I resolve the above error?
    Please help with this!
    Regards,
    Prateek

    The content of config/atg/dynamo/servlet/dafpipeline/FacebookAuthStatusServlet.properties is as below-
    $class=com.progiweb.fbconnect.pipeline.FacebookAuthStatusServlet
    $scope=global
    serviceInfo=This servlet handle facebook redirection if it's a new user.
    insertAfterServlet=/atg/dynamo/servlet/dafpipeline/FacebookProfileRequestServlet
    redirect=true
    redirectURI=register.jsp
    profileUpdater=/fbconnect/userprofiling/FacebookProfileUpdater
    loggingDebug=false
    And the content of config/atg/dynamo/servlet/pipeline/FacebookAuthStatusServlet.properties is as below-
    $class=com.progiweb.fbconnect.pipeline.FacebookAuthStatusServlet
    $scope=global
    serviceInfo=This servlet handle facebook redirection if it's a new user.
    insertAfterServlet=/atg/dynamo/servlet/pipeline/FacebookProfileRequestServlet
    redirect=true
    redirectURI=register.jsp
    profileUpdater=/fbconnect/userprofiling/FacebookProfileUpdater
    bypassExtensions=.css,.gif,.jpg,.swf
    loggingDebug=false
    I think this will clear the confusion.
    Regards,
    Prateek
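For what it's worth, "Unable to resolve component" means Nucleus could not find a .properties file for that component path on the configuration path. Roughly, the mapping works like this (a simplified, hypothetical sketch, not ATG code; the real lookup layers several config roots):

```python
# Simplified, hypothetical sketch of how ATG Nucleus maps a component path
# to a .properties file under a config root (the real resolution walks a
# layered CONFIGPATH of several roots).
def properties_file(config_root, component_path):
    return config_root + component_path + ".properties"
```

So a component named /com/progiweb/fbconnect/pipeline/FacebookAuthStatusServlet would only resolve if a properties file existed at config/com/progiweb/fbconnect/pipeline/..., whereas the properties files above live under /atg/dynamo/servlet/..., which may explain why the paths in the error look like the Java package rather than the Nucleus path.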

  • Pipelined function with large amounts of data

    We would like to use pipelined functions as the source of our SELECT statements instead of tables. That way we can easily switch from our tables to structures holding data from an external module, given the need for integration with other systems.
    We know these functions are used in situations such as data warehousing to apply multiple transformations to data, but what will the performance be for real-time access?
    Does anyone have experience using pipelined functions with large amounts of data in interface systems?

    It looks like you have already determined that the datatable object will be the best way to do this. When you are creating the object, you must enter the absolute path to your spreadsheet file. Then, you have to create some type of connection (i.e. a pushbutton or timer) that will send a true to the import data member of the datatable object. After these two things have been done, you will be able to access the data using the A3 - K133 data members.
    Regards,
    Michael Shasteen
    Applications Engineering
    National Instruments
    www.ni.com/ask
    1-866-ASK-MY-NI

  • Pipeline Table Function returning a fraction of data

    My current project involves migrating an Oracle database to a new structure based on the new client application requirements. I would like to use pipelined table functions, as it seems that would provide the best performance.
    The first table has about 65 fields, about 75% of which require some type of recoding for the new app. I have written a function for each transformation and have all of these functions stored in a package. If I do:
    create table new_table as select
    pkg_name.function1(old_field1),
    pkg_name.function2(old_field2),
    pkg_name.function3(old_field3),
    ...
    it runs without any errors but takes about 3 1/2 hours. There are a little more than 10 million rows in the table.
    I wrote a function that is passed the old table as a cursor, runs all the transformation functions, and then pipes each new row back to the insert statement that called the function. It is incredibly fast but returns only 0.025% of the data (about 50 rows out of my sample table of 200,000). It does not throw any errors.
    So I am trying to determine what is going on. Perhaps one of my functions has a bug. If there were one, would it cause the row to be dropped? There are 40 or so functions, so tracking this down has been a bit of a bear.
    Any advice on how I might resolve this would be much appreciated.
    Thanks
    Dan

    > I would like to use pipelined table functions as it seems as though that would provide the best performance
    Uh huh...
    > it runs without any errors but takes about 3 1/2 hours. There are a little more than 10 million rows in the table.
    Not the first time a lovely theory has been killed by an ugly fact. Did you do any benchmarks to see whether the pipelined functions offered performance benefits over doing it some other way?
    From the context of your comments I think you are trying to populate a new table from a single old table. Is this the case? If so, I would have thought a straightforward CTAS with normal functions would be more appropriate: pipelined functions are really meant for situations in which one input produces more than one output. Anyway, if we are to help you, I think you need to give us more details about how this process works and post a sample transformation function.
    > There are 40 or so functions so tracking this down has been a bit of a bear.
    The teaching is: we should code one function and get that working before moving on to the next one. Which might not seem like a helpful thing to say, but the best lesson is often "I'll do it differently next time".
    Cheers, APC
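A pipelined table function behaves much like a generator: it pipes one output row at a time as the cursor is consumed. One classic way to lose rows without any visible error is a per-row exception handler that silently skips the row. A rough Python analogy of that pattern (not PL/SQL, and only a guess at the cause):

```python
def transform_rows(cursor, transforms):
    """Generator analogy of a pipelined table function: each input row is
    transformed and 'piped' to the consumer one at a time."""
    for row in cursor:
        try:
            yield tuple(fn(value) for fn, value in zip(transforms, row))
        except ValueError:
            # A swallowed per-row error drops the row silently: the query
            # finishes "successfully" but returns fewer rows.
            continue
```

If any of the 40 transformation functions has a WHEN OTHERS-style handler, checking those first would be a cheap way to test this hypothesis.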

  • Pipeline Materials: Purchase Price Variances

    Hi,
    Can anybody please guide me on how to handle Purchase Price Variances for Pipeline Materials? As pipeline materials don't have any POs or invoices, how is it possible to post the PPV so that the variances are recorded in the material ledger?
    Best Regards,
    Ganesh Perumalla

    Hi,
    I could see the following in the SAP Notes and Help:
    "If the consignment prices have changed since the original goods receipt, post the credit memo in Financial Accounting. Clear the invoice and the credit memo in Financial Accounting."
    But the above credit memo doesn't update the Material Ledger.
    The SAP Notes advise using the function modules EXIT_RMVKON00_001, EXIT_RMVKON00_002, and EXIT_RMVKON00_003.
    If somebody has already worked on this scenario, could you please provide details?
    Best Regards,
    Ganesh Perumalla
    <u><b>FU EXIT_RMVKON00_001</b></u>
    Text
    Customer Exit: Consignment Settlement - Change Invoice Data
    Functionality
    Report RMVKON00 (Settle Consignment/Pipeline Liabilities) calls this customer-specific function module.
    Report RMVKON00 calculates the invoice data from the withdrawals to be settled and calls the interface to Financial Accounting (function module AC_DOCUMENT_CREATE). Before calling the function module, you can change the following data:
    Document header (internal table T_BKPF), contains only one line
    Document items (internal table T_BSEG), the first line contains the vendor line item, the following lines contain the invoice items
    The following information is available:
    The company code (I_BUKRS) and the vendor (I_LIFNR), for which you are creating the document.
    Parameters
    I_BUKRS
    I_LIFNR
    T_BKPF
    T_BSEG
    T_BSET
    T_RKWA
    Exceptions
    ERROR
    Function Group
    XM08
    <u><b>FU EXIT_RMVKON00_002</b></u>
    Text
    Customer Exit: Consignment Settlement - Fill RKWA at Goods Withdrawal
    Functionality
    Report RMVKON00 (Settle Consignment/Pipeline Liabilities) calls this customer-specific function module and fills structure CI_RKWA.
    In report RMVKON00, database table RKWA is changed at settlement. In table RKWA, structure CI_RKWA exists as an include that can incorporate customer-specific fields. The system can fill structure CI_RKWA at goods withdrawal or at settlement. The system uses this function module to fill structure CI_RKWA at the time of settlement.
    The RKWA entry for the withdrawal (parameter I_RKWA) is available for filling structure CI_RKWA.
    For technical reasons, it is not possible to return structure CI_RKWA alone; instead, the whole of table RKWA is returned using the parameter E_RKWA. However, changes to table RKWA that are not part of structure CI_RKWA are ignored by report RMVKON00.
    Notes
    The system only adopts structure CI_RKWA if the value 'X' is passed on at parameter E_CHECK.
    Parameters
    I_RKWA
    E_RKWA
    E_CHECK
    Exceptions
    ERROR
    Function Group
    XM08

  • How to view fault schema under variable structures in alsb

    We are not able to view the fault element in our ALSB flow under the body variable, and when we get an error in the test console the error comes in the BEA fault schema, but we want the error in our own defined fault schema. Please guide.
    Below are the steps we followed to create a wsdl.
    Steps:
    1. We created a webservice project in workspacestudio and generated a wsdl.
    2. Then we have added a fault element manually in that wsdl.
    3. Now we have generated a webservice from that modified wsdl.
    4. Again we have generated a wsdl from webservice created in step 3.
    5. step 4 wsdl is tested and running well on server.
    When we try to configure that service as a Proxy/Business service in ALSB and view the XPath Expression Editor, we are not able to view the Fault element under variable structures under the body variable, as shown below. But when we see the design view of our running WSDL in Workspace Studio, it shows the fault element. How do we get the fault element into the body variable so that we can see the error in our logs in our own defined fault schema instead of the BEA fault schema? The WSDL is enclosed as an attachment.
    Thanks in advance.

    Have you tried using the "Log" action in your pipeline proxy and logging $body and $fault? This will definitely return $body and $fault if they were parsed correctly.

  • Re: Creation of Pipeline Purchase Order or Consignment

    Hello all,
    Can you explain how to create a Pipeline Purchase Order or Contract, so that a proof document is maintained?
    How is it done in SAP?
    Regards,
    Smitha

    Hi,
    For the Pipeline Process:
    1. Create the pipeline material under material type PIPE.
    2. Maintain the Pipeline Info Record in ME11. Here the price and the tax code are mandatory.
    Also check SPRO > Enterprise Structure > Assignment > Materials Management > Assign standard purchasing organization to plant.
    There is no PO and GR process for pipeline materials. We consider that the materials are available via pipes in our plant, and we directly book consumption and then settle it.
    If stock is directly consumed from pipeline stock via movement type 201 P or 261 P in transaction MB1A or MIGO, then the following FI entry will appear:
    (GBB-VBR) Consumption Account - Dr
    (KON) Pipeline Liabilities - Cr
    Now do the pipeline settlement in MRKO. Here you cannot change the invoice value. The following FI entry will appear:
    Vendor Account - Cr
    Pipeline Liabilities - Dr
    Prerequisite for MRKO:
    - Maintain a condition record for output type KONS in MRM1
    Then take a printout of the settlement document in MR91.
    Also refer following link;
    [Pipeline Handling 1|http://help.sap.com/saphelp_46c/helpdata/en/fd/45c3fe9d6411d189b60000e829fbbd/content.htm]
    [Pipeline Handling 2|http://help.sap.com/saphelp_erp2005/helpdata/en/4d/2b926443ad11d189410000e829fbbd/frameset.htm]
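The two settlement postings above can be sanity-checked like any double-entry document: debits must equal credits. A small illustrative sketch (hypothetical amount; the account keys GBB-VBR and KON come from the answer above):

```python
# Illustrative double-entry view of the pipeline postings described above
# (hypothetical amount; GBB-VBR and KON are the account keys from the answer).
goods_issue = [("Consumption (GBB-VBR)",      "Dr", 100.0),
               ("Pipeline Liabilities (KON)", "Cr", 100.0)]
settlement  = [("Pipeline Liabilities",       "Dr", 100.0),
               ("Vendor Account",             "Cr", 100.0)]

def balanced(document):
    """A document posts only if total debits equal total credits."""
    debit  = sum(amt for _, side, amt in document if side == "Dr")
    credit = sum(amt for _, side, amt in document if side == "Cr")
    return debit == credit
```

Note how the pipeline liabilities account is credited at consumption and debited again at MRKO settlement, so it nets to zero once the vendor is settled.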

  • Obligatory node 'ZMAT_SPO' missing in the structure IDOC

    Hello,
    Scenario is : File - IDOC
    I am trying to post the contents of the file to MATMAS.
    The file gets picked up and shows a red flag with the error: "Obligatory node 'ZMAT_SPO' missing in the structure Matmas04"
    I tried disabling the segment ZMAT_SPO and tried mapping it to a constant;
    nothing worked.
    When I copy the payload from MONI and run a test in the message mapping, it is successful.
    What could be the problem?
    thanks
    nikhil.
    Edited by: nikhil a on Jan 21, 2008 9:52 AM

    Here is the error message:
    <!--  ************************************
      -->
      <Trace level="1" type="T">Message-GUID = 8F7E3AA0C7FD11DC9E5600132165C741</Trace>
      <Trace level="1" type="T">PLNAME = CENTRAL</Trace>
      <Trace level="1" type="T">QOS = EO</Trace>
      <Trace level="1" type="B" name="CL_XMS_MAIN-CALL_PIPELINE_ASYNC" />
    - <!--  ************************************
      -->
      <Trace level="1" type="T">Get definition of external pipeline = CENTRAL</Trace>
      <Trace level="1" type="B" name="CL_XMS_MAIN-LOOKUP_INTERNAL_PL_ID" />
      <Trace level="1" type="T">Get definition of internal pipeline = SAP_CENTRAL</Trace>
      <Trace level="1" type="T">Queue name : XBTI0000</Trace>
      <Trace level="1" type="T">Generated prefixed queue name = XBTI0000</Trace>
      <Trace level="1" type="T">Schedule message in qRFC environment</Trace>
      <Trace level="1" type="T">Setup qRFC Scheduler OK!</Trace>
      <Trace level="1" type="T">----</Trace>
      <Trace level="1" type="T">Going to persist message</Trace>
      <Trace level="1" type="T">NOTE: The following trace entries are always lacking</Trace>
      <Trace level="1" type="T">- Exit WRITE_MESSAGE_TO_PERSIST</Trace>
      <Trace level="1" type="T">- Exit CALL_PIPELINE_ASYNC</Trace>
      <Trace level="1" type="T">Async barrier reached. Bye-bye !</Trace>
      <Trace level="1" type="T">----</Trace>
      <Trace level="1" type="B" name="CL_XMS_MAIN-WRITE_MESSAGE_TO_PERSIST" />
    - <!--  ************************************
      -->
      <Trace level="1" type="B" name="CL_XMS_MAIN-PERSIST_READ_MESSAGE" />
      <Trace level="1" type="T">Note: the following trace entry is written delayed (after read from persist)</Trace>
      <Trace level="1" type="B" name="SXMS_ASYNC_EXEC" />
    I disabled that field in the message mapping in the IR.
    Do I need to reimport when I do this? I don't think so...
    nikhil

  • Pipeline step not displying at SXMB_MONI

    Hi all,
    I have a newly configured PI box; everything is working fine and we tested a File-to-File scenario.
    The file gets picked up and we are able to create a new file in the receiver folder.
    The issue is that when I go to transaction SXMB_MONI, I am not able to view the pipeline steps.
    I get the successful white-and-black flag for the message, but when I expand the selected message I cannot find the pipeline steps under the tree structure.
    Only three messages are displayed: inbound message --> receiver grouping --> response message.
    But when I look at the XML file on the right-hand side, I can see that all the steps were called properly.
    Please tell me what went wrong.
    regards,
    navneet
    Edited by: navneet sumit on Jun 3, 2009 6:12 PM

    Hi,
    You have to switch on logging in transaction SXMB_ADM -> Specific Configuration:
    Category: RUNTIME
    LOGGING                value 1
    LOGGING_SYNC           value 1 (for synchronous messages)
    If you are using proxies and you want to see the steps from your source systems, then use:
    LOGGING_SYNC_PROPAGATION   value 1
    regards,
    Arvind R

  • OSB Service callout losing fault info xml structure

    Hi there
    I have a proxy service (getAccount) that is being called via a service callout from the getBilling proxy service.
    getAccount throws a fault message that I want to catch in getBilling.
    But the original fault from getAccount somehow loses its XML structure in the detail node: the "<" signs are changed into "&lt;".
    Am I missing something here? How does the XML structure get lost in the detail node?
    Here's the output:
    <soapenv:Body>
      <soapenv:Fault>
      <faultcode>soapenv:Server</faultcode>
      <faultstring>BEA-382502: OSB Service Callout action received an error response</faultstring>
      <detail>
         <con:fault xmlns:con="http://www.bea.com/wli/sb/context">
         <con:errorCode>BEA-382502</con:errorCode>
         <con:reason>OSB Service Callout action received an error response</con:reason>
        <con:details>
           <con1:ErrorResponseDetail xmlns:con1="http://www.bea.com/wli/sb/stages/transform/config">
           <con1:detail>
              &lt;ipms:fault xmlns:soap-env="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ipms="http://www.indosat.com/ipms">
              &lt;ipms:faultcode>IPMS-5000&lt;/ipms:faultcode>
              &lt;ipms:faultstring>Account Not Found&lt;/ipms:faultstring>
              &lt;ipms:faultlocation>
              &lt;ipms:faultsource>/billing-stream/proxy/getAccount&lt;/ipms:faultsource>
             &lt;ipms:faultnode>RouteTo_getAccountByMsisdn_db&lt;/ipms:faultnode>
        &lt;ipms:faultpath>response-pipeline&lt;/ipms:faultpath>
      &lt;/ipms:faultlocation>
      &lt;ipms:detail>
        &lt;con:errorCode xmlns:con="http://www.bea.com/wli/sb/context">IPMS-5000&lt;/con:errorCode>
        &lt;con:reason xmlns:con="http://www.bea.com/wli/sb/context">Account Not Found&lt;/con:reason>
        &lt;con:location xmlns:con="http://www.bea.com/wli/sb/context">
          &lt;con:node>RouteTo_getAccountByMsisdn_db&lt;/con:node>
          &lt;con:path>response-pipeline&lt;/con:path>
        &lt;/con:location>
      &lt;/ipms:detail>
    &lt;/ipms:fault>
      </con1:detail>
      </con1:ErrorResponseDetail>
      </con:details>
      <con:location>
      <con:node>PipelinePairNode1</con:node>
      <con:pipeline>PipelinePairNode1_request</con:pipeline>
      <con:stage>stage1</con:stage>
      <con:path>request-pipeline</con:path>
      </con:location>
      </con:fault>
      </detail>
      </soapenv:Fault>
    </soapenv:Body>

    What's the content-type in the HTTP 500 response from the backend service? (not OSB proxy, but the original source of the fault)
    I suspect it is text/plain or just missing. OSB doesn't know it is XML, and so it escapes it.
    Vlad
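The &lt; entities are exactly what you get when an XML payload is escaped as plain text. The effect is easy to reproduce (the Python standard library is shown purely to illustrate the escaping; OSB itself is Java):

```python
from xml.sax.saxutils import escape, unescape

# When a payload is treated as text rather than XML, markup characters are
# entity-escaped -- which is what appears inside <con1:detail> above.
fault = "<ipms:faultcode>IPMS-5000</ipms:faultcode>"
escaped = escape(fault)
# escaped is now "&lt;ipms:faultcode&gt;IPMS-5000&lt;/ipms:faultcode&gt;"

# The markup is not lost, only escaped: unescaping recovers it intact.
assert unescape(escaped) == fault
```

This supports Vlad's point: fix the content-type on the backend so OSB parses the body as XML, rather than unescaping downstream.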

  • Schema and execute Pipeline problem

    hello,
    I created an input schema with the following structure:
    Root
          any
    in order to accept any type of document after the document root. And an output schema with the structure:
    Root
          node A
          node B
          node C
    Now I want to map everything contained in the any node into node A.
    I made a wildcard map with XSLT and deployed everything in BizTalk. I set the receive pipeline to PassThruReceive and the send pipeline to XMLTransmit with the map. But I get the following errors:
    This Assembler cannot retrieve a document specification using this type: "Root".
    Finding the document specification by message type "Root" failed. Verify the schema deployed properly.
    But the schema is deployed properly.
    How can I fix the problem? I tried several solutions, but I did not succeed.

    This error is not to do with the map, but with the assembler not being able to resolve the deployed schema for the received message. So remove the map (just to test) and just use XMLTransmit in the send port, and
    see what you get. If the message passes through the XMLTransmit
    pipeline, then the schema resolution worked.
    You can also open your send pipeline properties and set the
    DocumentSpecNames property to the fully qualified name of the schema, in the format <schema type>+<root name>, <schema assembly full name>. This binds the BizTalk engine to the specific deployed
    schema.
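As a quick illustration of the DocumentSpecNames shape described above (the format string comes from the answer; the helper itself is hypothetical, and so are the names in the example):

```python
# Hypothetical helper that assembles a DocumentSpecNames value in the
# "<schema type>+<root name>, <schema assembly full name>" format quoted above.
def document_spec_name(schema_type, root_name, assembly_full_name):
    return "{}+{}, {}".format(schema_type, root_name, assembly_full_name)
```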

  • Pipeline Steps and JCo Calls

    Hi All,
    I wanted a few links which can provide me with indepth knowledge of these two points
    1. Pipeline Steps in XI
    2. JCo Calls during processing of messages
    I am looking specifically to understand message processing in the steps where the mapping is called and control is returned to the ABAP Integration Engine, i.e. the Enhanced Receiver Determination / Interface Determination steps.
    Rgds
    Aditya

    Hi Aditya,
      The pipeline steps are:
    1) Receiver Identification: Determine which system(s) should participate in an exchange with the incoming message.
    2) Interface Determination: For each receiver system, determine which interface(s) should receive a message.
    3) Message Branch: If multiple receivers are found, XI will instantiate a new message for each receiver.
    4) Request Message Mapping: Call the mapping program to transform the message structure to the receiver format.
    5) Outbound Binding: Bind a specific destination and protocol to the message.
    6) Call Adapter: Send the transformed message to the adapter or proxy.
    Check this weblog for full details on how a message flows in XI:
    /people/siva.maranani/blog/2005/05/25/understanding-message-flow-in-xi
    http://help.sap.com/saphelp_nw04/helpdata/en/ff/3eb33b553e436ee10000000a114084/content.htm
    Regds,
    Pinangshuk.

  • Payload and pipeline

    hi all
    Please define the terms payload and pipeline.
    Thanks.

    Hey Sri,
    The data you see in SXMB_MONI is called the payload.
    Pipeline: as soon as you send any data through XI, it goes through the XI pipeline. Below are the pipeline steps in sequence.
    <b>Receiver Identification</b>
    Determine which system(s) should participate in an exchange with the incoming message.
    <b>Interface Determination</b>
    For each receiver system, determine which interface(s) should receive a message.
    <b>Message Branch (Message Split)</b>
    If multiple receivers are found, XI will instantiate a new message for each receiver.
    <b>Request Message Mapping</b>
    Call the mapping program to transform the message structure to the receiver format.
    <b>Outbound Binding (Technical Routing)</b>
    Bind a specific destination and protocol to the message.
    <b>Call Adapter</b>
    Send the transformed message to the adapter or proxy.
    Regards,
    Sarvesh
