BPM for Processing Multiple records in a file

Hello All:
I am using a BPM to process goods issues with the BAPI BAPI_GOODSMVT_CREATE. The BPM steps are as follows:
Step 1: Receive request from the async file interface
Step 2: Transformation: map to the BAPI request structure
Step 3: Send sync request to the BAPI
Step 4: Transformation: map the BAPI response to the file structure
Step 5: Send async message to the file
My input file is as follows:
H,20050613,20050613,9999,HEAD TXT,03,
P,000001000108,0001,COMMON,2,EA,1000,1011,261,ITM TXT,
H,20050613,20050613,9999,HEAD TXT,03,
P,000001000108,0001,COMMON,3,EA,1000,1011,261,ITM TXT,
The output file is as follows:
<?xml version="1.0" encoding="UTF-8" ?>
<ns1:MT_SAP_RESPONSE_ITMS_DATA xmlns:ns1="http://testcompany.com/xi/ITMS">
  <BAPI_STR>
    <HEADER>
      <MAT_DOC>4900000696</MAT_DOC>
    </HEADER>
  </BAPI_STR>
</ns1:MT_SAP_RESPONSE_ITMS_DATA>
The problem I am facing is that although my file has two records of goods issue data (header and item data), the BPM processes only one record (header and item), creates one material document in SAP, and outputs a file with that material document number as above. Ideally I would like the XI system to process both records and produce two material documents as output.
Also, my input file adapter has been picking up both records, as this is visible in the SXMB_MONITOR logs.
Do I need to introduce a loop in the BPM? If so, can anybody give me an example or point me in the right direction?
Many Thanks
TBH

Udo:
You need a mapping that splits one message with multiple entries into multiple messages with one entry each, which you put into a multiline BPM container.
   Are you implying that I should loop Step 1 (receive request from the async file interface) and collect the messages into a multiline container? How does the loop know how many times to loop?
Would appreciate your comments
Thorsten

Similar Messages

  • Creation of multiple Records in the file as per multiple segments in IDOC

    Hi SapAll.
    I have got a requirement to create multiple records in a file based on multiple segments of the sending IDoc in an IDoc to File interface.
    The scenario is that the receiver message type is mapped with fields from three segments of the sending IDoc:
    SEG01   1..1        (parent segment)
      SEG02   0..999999   (child segment)
      SEG03   0..9999999  (child segment)
    In an instance where SEG01 occurs once, SEG02 occurs twice and SEG03 occurs twice, PI creates the file with two records in it. But when SEG01 occurs once, SEG02 occurs twice and SEG03 occurs only once, an error is raised in message mapping, where it is supposed to create two records in the file with empty values in the fields mapped from the SEG03 segment.
    Can anybody help me with this?
    regards.
    Varma

    You can create a UDF after you validate whether the counts match: if they match, create the message; if not, call the UDF.
    This UDF should receive two parameters, the queue of SEG02 and the queue of SEG03.
    Then loop over the count of SEG02. If you find a suppress value in the SEG03 queue, add an empty value to the result, for example:
    for (int i = 0; i < SEG2.length; i++) {
        if (SEG3[i].equals(ResultList.SUPPRESS)) {
            result.addValue("");
        } else {
            result.addValue(SEG3[i]);
        }
    }
    The result of the UDF is a queue which should be mapped to the target field directly, because it contains the context changes.
    I think that is what you are needing; if not, let me know.
    RP
    Edited by: Rodrigo Alejandro Pertierra on Jun 17, 2010 11:56 AM
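    For reference, a hedged sketch of the complete UDF, roughly as the PI 7.1 message-mapping UDF editor would present it (execution type "All Values of a Context"); the method name, parameter names and the bounds check are assumptions for illustration, not the poster's exact code:
    // queues seg2 and seg3 come from SEG02 and SEG03 with removeContexts applied
    public void fillSeg3Queue(String[] seg2, String[] seg3, ResultList result, Container container)
            throws StreamTransformationException {
        // emit exactly one SEG03 value per SEG02 entry
        for (int i = 0; i < seg2.length; i++) {
            if (i >= seg3.length || ResultList.SUPPRESS.equals(seg3[i])) {
                // SEG03 is missing for this record: write an empty field
                result.addValue("");
            } else {
                result.addValue(seg3[i]);
            }
        }
    }
    The output queue is then mapped directly to the target field, as suggested above.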

  • Apps for processing RAW or DNG picture files

    Are there any apps for processing RAW and/or DNG files on an iPad?

    In the normal situation, LR needs to see a PS installation for it to detect that the Edit In PS option is possible, so if you uninstall both, install PS first, then LR. Of course, with this release there may be some hiccup that requires installing either one or the other again, but if you've uninstalled both and are reinstalling both, then install PS first.

  • Can not create planning function types (for process empty records)

    Hi! SAP Experts,
         I want to copy 0RSPL_FORMULA to another function type for processing empty records, ZRSPL_FORMULA. I checked this function type via RSPLAN and there is no error. But when I try to use it in the Planning Modeler, I get the error message "InfoObject 1FORMULA is not available in version A". Do I need to create this InfoObject? If anyone knows, please let me know.
    Thanks,
    Sake

    Hi!,
    Thanks for your help, but I've already put that parameter, HIDDENFORMULAVARTAB -> 1FORMULA, in my planning function ZRSPL_FORMULA. I validated it and no error was found. But when I go to the Planning Modeler in the web browser and try to create a new planning function based on ZRSPL_FORMULA, I get the problem I mentioned in my previous message: "InfoObject 1FORMULA is not available in version A". I searched for this InfoObject in my system and it does not exist. So my question is: do I need to create it myself? What kind of InfoObject do I have to create, a key figure or a characteristic?
    Anyway, if I have to do this, why does 0RSPL_FORMULA still work as normal?
    Brgds,
    Sake

  • How B2B Adapter will process multiple statements in one file

    Hi All,
    I have a file in MT940 format. I am able to read the file using the B2B adapter. It contains multiple statements, and each statement has an opening and a closing balance.
    Now the B2B adapter is creating an instance for each statement simultaneously.
    Problem:
    The closing balance of the first statement is the opening balance of the next statement, and I need to record the updated closing and opening balances in a table. As B2B processes each statement independently, I am not able to correlate the statement details.
    Example -
    Let's say I have a 3001490311 - 16012013 MT940 file with multiple statements, say 3 statements.
    Statement 1 -
    Op Bal - 10,000, Cl Bal - 20,000
    Statement 2 -
    Op Bal - 20,000, Cl Bal - 30,000
    Statement 3 -
    Op Bal - 30,000, Cl Bal - 40,000
    I need to record the updated closing balance in the table. As the B2B adapter processes the statements independently, I am not able to correlate the statements. If anyone has a solution for this problem, please post it.

    Thanks Anuj,
    Thanks for your reply. I unchecked the Translate option. While testing I found that the B2B console shows the message as complete, but the SOA process is not getting started.
    Problem:
    B2B reads the file sequentially. In my case the file type is an MT940 positional flat file. When B2B finds the starting position of a statement, it creates an instance, and that instance keeps running independently while B2B keeps reading the other statements in the same file.
    As soon as B2B finds the starting position of the next statement, it creates another instance, and in the same way it keeps creating instances for the other statements in the file.
    But in my case the output of one process should be the input to the next process. As B2B processes them independently, I am not able to correlate the processes.
    In my case B2B should process the first statement, and until the first process completes the second process should not be initiated, so that I can use the output of one process as input to the next process.
    Thanks
    Dilllip
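    To illustrate the correlation being asked for, here is a minimal, hypothetical Java sketch of the sequential logic (the Statement class and the example balances are made up; it only shows why statement N+1 cannot start before statement N has produced its closing balance):
    import java.util.Arrays;
    import java.util.List;
    public class StatementChain {
        static class Statement {
            double opening;
            double closing;
            Statement(double opening, double closing) { this.opening = opening; this.closing = closing; }
        }
        public static void main(String[] args) {
            // the three example statements from the post
            List<Statement> statements = Arrays.asList(
                    new Statement(10000, 20000),
                    new Statement(20000, 30000),
                    new Statement(30000, 40000));
            double runningBalance = statements.get(0).opening;
            for (Statement st : statements) {
                // statement N+1 can only be handled once statement N has produced its closing balance
                st.opening = runningBalance;
                runningBalance = st.closing;
                System.out.println("opening = " + st.opening + ", closing = " + st.closing);
            }
        }
    }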

  • Creation of multiple records in reciever file as per occurance of 1 segment

    Hi SapAll.
    I have got a requirement in an IDoc to File interface.
    The requirement is that the sending IDoc will have multiple segments whose fields are mapped to the receiver file, and the business requirement is as follows:
    IDOC -
    Segment 1 - occurs 1 time
    Segment 2 - occurs 1 time
    Segment 3 - occurs 3 times
    So PI needs to create a file with 3 records, as segment 3 repeats 3 times. The segment 1 and segment 2 field values repeat for all 3 records in the file, and the segment 3 field values have to be mapped respectively to each of the 3 records from each segment 3 occurrence.
    I don't know how I can achieve this.
    can any of sap group provide me the solution.
    regards.
    Varma

    > Segment 3 occurs 3 times, so PI needs to create a file with 3 records as segment 3 repeats 3 times; segment 1 and segment 2 field values repeat for all 3 records in the file, and segment 3 field values have to be mapped respectively to each record.
    Create your target structure as mentioned and do the mapping as shown below.
    <Records> 0..unbounded
         <segment1_Fields> </segment1_Fields>
         <segment2_Fields> </segment2_Fields>
         <segment3_Fields> </segment3_Fields>
    </Records>
    Now do the mapping like this:
            segment3 ---> removeContexts ---> Records (parent node mapping)
            segment1 ---> copyValue(0)   ---> segment1_Fields
            segment2 ---> copyValue(0)   ---> segment2_Fields
            segment3 ---> splitByValue   ---> segment3_Fields
    Note: You can also use the "useOneAsMany" function instead of "copyValue" to repeat the segment1 and segment2 values in each record.

  • Sender JMS Content Conversion - How to process multiple records

    Hi All,
    I use a Sender JMS Channel with Content Conversion.
    My message structure is like this
    <root>
        <rec>    </rec>
        <rec>    </rec>
    </root>
    I have a fixed-length flat file with multiple records.
    I have given the parameters FixedFieldLength, FieldNames and StructureTitle.
    Which parameter do I need to use to specify the record delimiter?
    Because my input file will have more than one record.
    My input file -
    xxxx
    yyyy
    If I don't specify any delimiter value in the module parameters, then for each new line of the file a new message is created:
    <root>
      <rec>xxxx</rec>
    </root>
    <root>
      <rec>yyyy</rec>
    </root>
    But I want the output to be like this:
    <root>
      <rec>xxxx</rec>
      <rec>yyyy</rec>
    </root>

    Hi,
    You can do your FCC for the sender JMS by going through page 5 of this document:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/50061bd9-e56e-2910-3495-c5faa652b710

  • Handling Multiple Records in a file adapter

    Hi All,
    My source file message:
    <Source_MT>
        <Records>              <!-- 0..unbounded -->
            <Country>THAILAND</Country>
        </Records>
        <Records>
            <Country>ANGOLA</Country>
        </Records>
    </Source_MT>
    Target message:
    <Target_MT>
        <Employees>            <!-- 0..unbounded -->
            <Country>THAILAND</Country>
        </Employees>
    </Target_MT>
    Now in my scenario, the source message will have multiple records with different countries.
    I want to send these records to different receivers on the basis of this Country field. That is, after the target message mapping, all the records corresponding to THAILAND must be sent to the THA_RECVR system and all the records corresponding to ANGOLA must be sent to the Angola receiver system.
    Please help me club the target message records on the basis of the Country field and then send them to the corresponding receiver systems.
    Should I use BPM?

    Shweta,
    You need to do an enhanced receiver determination. You can handle this by determining the receivers during runtime.
    Check this out: Re: Condition In Receiver Determination Not Working [Page 3 : My reply]
    raj.
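    A minimal sketch of the idea behind enhanced receiver determination: a user-defined function in the mapping to the standard Receivers structure returns the receiver service name based on the Country value. The function name and the business-system names (THA_RECVR, ANGOLA_RECVR, DEFAULT_RECVR) are assumptions for illustration only.
    // hypothetical single-value UDF used in the receiver-determination mapping
    public String determineReceiver(String country, Container container)
            throws StreamTransformationException {
        if ("THAILAND".equalsIgnoreCase(country)) {
            return "THA_RECVR";       // receiver business system for Thailand records
        } else if ("ANGOLA".equalsIgnoreCase(country)) {
            return "ANGOLA_RECVR";    // receiver business system for Angola records
        }
        return "DEFAULT_RECVR";       // fallback receiver
    }
    The returned value would be mapped to the Service element under Receivers/Receiver; splitting the payload per country would then be handled in the mapping for each receiver.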

  • Saving multiple records into text file

    Can I save multiple records into a text file in one go?
    My application displays a list of data, and when the user clicks on the save button it should save all the records on the screen.
    It works, but it only saves the last record.
    Here is my code:
    // this is to display the list of data
    private JLabel[] subjects = new JLabel[20];
    private JLabel[] subTotal = new JLabel[20];
    private JLabel[] codes    = new JLabel[20];
    private JLabel[] getTotal = new JLabel[20];
    String moduleCodes;
    String getPrice;
    double price;
    int noOfNotes;
    public testapp(Subjects[] subList) {
        int j = 0;
        double CalTotal = 0;
        for (int i = 0; i < subList.length; i++) {
            subjects[i] = new JLabel();
            subTotal[i] = new JLabel();
            codes[i]    = new JLabel();
            getTotal[i] = new JLabel();
            if (subList[i].isSelected) {
                System.out.println(i + " is selected");
                subjects[i].setText(subList[i].title);
                subjects[i].setBounds(30, 140 + (j * 30), 400, 40);
                subTotal[i].setText("$" + subList[i].price);
                subTotal[i].setBounds(430, 140 + (j * 30), 100, 40);
                codes[i].setText(subList[i].code);
                getTotal[i].setText(subList[i].price + "");
                CalTotal += subList[i].price;
                contain.add(subjects[i]);
                contain.add(subTotal[i]);
                j++;
                // these fields are overwritten on every pass, so only the
                // last selected record survives when the file is written
                moduleCodes = codes[i].getText();
                getPrice = getTotal[i].getText();
                noOfNotes = 1;
            }
        }
    }
    // this is where the records are saved
    public void readRecords(String moduleCodes, String getPrice, int notes) throws IOException {
        price = Double.parseDouble(getPrice);
        String outputFile = "testing.txt";
        FileOutputStream out = new FileOutputStream(outputFile, true);
        PrintStream fileOutput = new PrintStream(out);
        SalesData[] sales = new SalesData[7];
        for (int i = 0; i < sales.length; i++) {
            sales[i] = new SalesData(moduleCodes, price, notes);
            fileOutput.println(sales[i].getRecord());
        }
        out.close();
    }

    I suggest writing a method that takes a SalesData[] parameter and a filename. Example:
    public void writeRecords(SalesData[] data, String filename) throws IOException {
        BufferedWriter out = new BufferedWriter(new FileWriter(filename, true));
        try {
            for (int i = 0; i < data.length; i++) {
                out.write(data[i].getRecord());
                out.newLine();
            }
        } finally {
            out.close();
        }
    }
    And it's good to get in the habit of doing things like closing resources in finally blocks.
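    As a follow-up, a hedged usage sketch of the suggestion above: collect one SalesData object per selected subject (reusing the subList fields from the question and assuming java.util.List / java.util.ArrayList are imported), then write them all with a single call instead of overwriting the same fields inside the loop.
    // gather every selected record first, then write the whole batch once
    List<SalesData> selected = new ArrayList<SalesData>();
    for (int i = 0; i < subList.length; i++) {
        if (subList[i].isSelected) {
            selected.add(new SalesData(subList[i].code, subList[i].price, 1));
        }
    }
    writeRecords(selected.toArray(new SalesData[selected.size()]), "testing.txt");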

  • SXDA split files for RMDATIND - First record in sequential file & not a session record (type 0)

    Hi gurus,
    I'm trying to perform a mass upload of material master records, and for this I have set up a Data Transfer Project in SXDA with the purpose of splitting an LSMW input file into multiple files. The file split task uses the data load program:
    Object Type: BUS1001006
    Obj. description: AD Standard Material
    Program type: DINP
    Program: RMDATIND
    The problem I'm facing is that once the .conv (converted data) file is split into multiple files, these files are generated without a session record. This causes the following error when running program RMDATIND: "First record in sequential file & not a session record (type 0)".
    So the question is: how can I specify in SXDA that I want to keep that session record in all my split files?
    Thanks in advance for your help!

    Hi Chris,
    Try to re-create the logical file path/files for the converted data.
    regards
    Prabhu

  • FM for reading total record in flat file

    Hi,
    Do we have any function module which can tell me the number of records in a flat file?
    I want only the FM name!
    Thanks for your reply, but the file is only on the application server.
    Thanks in advance.
    Message was edited by: Vipin Nagpal

    Hi,
    Then you need to call the Unix command (if your application server runs Unix):
    wc -l filename
    data: unixcom like rlgrap-filename.
    unixcom = 'wc -l filename'.
    data: begin of tabl occurs 500,
            line(400),
          end of tabl.
    data: lines type i.
    call 'SYSTEM' id 'COMMAND' field unixcom
                  id 'TAB'     field tabl[].
    loop at tabl.
      write: /01 tabl-line.
    endloop.
    Regards
    vijay

  • Using XSU to process multiple record sets

    I have a design question about XML, Java and Oracle's XSU. This is a common task when importing external data, so it should be of general interest.
    I have an incoming XML document with multiple records and a known DTD. Before the records can be inserted into the database, part of each record must be used to do a lookup for a foreign key. The resulting foreign key is then combined with the other part of the record to create a valid input record for the database.
    My question is: what's the most efficient way to do this? I assume the general approach must go something like:
    begin
      while (records in xml doc)
        1. parse the incoming XML to extract the next record
        2. if the lookup data change, do a lookup for this record using XSL
        3. combine the resulting foreign key with the data to create an insert query
        4. add the insert query as a document fragment to a second XML document
      use XSU to insert the second XML doc with multiple records
    end
    I've figured out how to do most of this, except for step 1 above: is there an easy way to extract the next record from an XML doc if I know the row nodes and the DTD? Or do I just have to traverse all the XML nodes until I hit my next row tag?
    Appreciate any suggestions; this is a handy discussion list!
    --Rick Casey
    Rick Casey, Graduate research assistant CADSWES, http://cadswes.colorado.edu
    University of Colorado at Boulder [email protected] 303.492.0892
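    For step 1, a minimal DOM-based sketch (assuming a DOM Level 3 parser; the file name and the element names ROW and LOOKUP_FIELD are hypothetical placeholders for the actual row tag and lookup field):
    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    public class RecordReader {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new File("incoming.xml"));
            // grab all row elements directly instead of walking every node
            NodeList rows = doc.getElementsByTagName("ROW");
            for (int i = 0; i < rows.getLength(); i++) {
                Element row = (Element) rows.item(i);
                // pull out the field used for the foreign-key lookup
                String lookupValue = row.getElementsByTagName("LOOKUP_FIELD")
                                        .item(0).getTextContent();
                System.out.println("record " + i + ": lookup value = " + lookupValue);
            }
        }
    }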

    Yes, the typical behavior is that when you have multiple outputs it will create a one-row IllumDoc to wrap them all together with OutputParameter=* (and the XML will be ignored unless it is the only output property or you've requested it by name).
    How about just using "/Lighthammer/Illuminator?QueryTemplate=xxxx/yyyy&Content-Type=text/xml" or "/Lighthammer/Runner?Transaction=xxxx/yyyy&OutputParameter=OutputXML", which will both return your multi-rowset output?
    I may be mistaken, but I believe the XacuteResponse was simplified to not include the Rowsets outer node because certain systems had problems digesting the layers; in fact you don't get the <Columns/> section either in the single Rowset.
    Rick - any feedback on this SOAP simplification from Xacute?

  • Custom Pipeline component for Removing Trailer record from .txt file

    public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(
        Microsoft.BizTalk.Component.Interop.IPipelineContext pc,
        Microsoft.BizTalk.Message.Interop.IBaseMessage inmsg)
    {
        IBaseMessagePart bodyPart = inmsg.BodyPart;
        Stream originalStrm = bodyPart.GetOriginalDataStream();
        StreamReader sReader = new StreamReader(originalStrm, System.Text.Encoding.UTF8);
        string sRecord = sReader.ReadToEnd();
        MemoryStream memStream = new MemoryStream();
        StreamWriter sw = new StreamWriter(memStream);
        inmsg.BodyPart.Data = memStream;
        inmsg.BodyPart.Data.Position = 0;
        // "\r\n" is the delimiter for the record
        string[] separator = new string[] { "\r\n" };
        string[] strArray = sRecord.Split(separator, StringSplitOptions.None);
        // loop until the last line (i.e. ignore the trailer)
        for (int n = 1; n < strArray.Length; n++)
        {
            sw.Write(strArray[n - 1] + "\r\n");
        }
        sw.Flush();
        memStream.Flush();
        memStream.Position = 0;
        inmsg.BodyPart.Data = memStream;
        inmsg.BodyPart.Data.Position = 0;
        return inmsg;
    }
    After deploying the component and adding it to the GAC, when I configure the receive pipeline it shows no properties in the Decode stage?
    MBH

    There is nothing wrong with your code: it removes the last line if there is no carriage return on it.
    If your input file is like:
        line1 <cr><lf>
        line2 <cr><lf>
        line3
    the result is:
        line1 <cr><lf>
        line2 <cr><lf>
    But if your input file is like:
        line1 <cr><lf>
        line2 <cr><lf>
        line3 <cr><lf>
    the result is:
        line1 <cr><lf>
        line2 <cr><lf>
        <empty line>
    So when there is a carriage return on the last line, it results in an empty line. Could this be the cause of your problem?

  • N:1 multimapping for multiple records in every file

    Hi ,
    I am merging two files into a single file. The first file has employee personal data and the second file has employee salary data. I have created a BPM and used correlation on EmpNumber to merge both files. In the BPM I use a fork step to receive the two files, then a transformation step to merge the files, and finally a send step to send the output file.
    If both input files have a single employee record, my scenario works fine. However, my requirement is that file one will have 10 employee records and file two will have the salary details of the same 10 employees. I have to merge both files and create 10 final employee records.
    Here I am not able to use correlation, as I don't have a unique key in the file.
    Can anyone tell me how to achieve this?
    Regards,
    Shabari

    Please check the blog below:
    http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/10526
    Refer to the blog for the BPM design. You can make changes such that you include a fork (with two branches) inside the loop. But without correlation, how will you merge similar employee records?

  • Pl/Sql for creating Multiple record type file

    Hi all,
    I have a scenario where I need to create a flat file that contains two different record types in the same file. Basically, a way of getting both the header record and the corresponding detail records. Following are the details:
    Table A (Header Records)
    REC_TYPE  M_ID  DATE    C_ID
    1         123   090807  222
    1         345   090907  333
    Table B (Detail Records)
    REC_TYPE  A_NO  LINE_NO  C_ID
    2         7564  1        222
    2         4535  2        222
    2         4656  1        333
    2         6576  2        333
    In the output file, the result set should be:
    222, 123, 090807  (Header Record)
    222, 7564, 1      (Detail Record)
    222, 4535, 2      (Detail Record)
    333, 345, 090907  (Header Record)
    333, 4656, 1      (Detail Record)
    333, 6576, 2      (Detail Record)
    Any input is greatly appreciated.
    Thank you!

    NOT TESTED! I don't remember when I last used loops. I won't have database access until September (on vacation).
    declare
      type header_t is record (
        c_id ... );
      type detail_t is record (
        c_id ... );
      header_r header_t;
      detail_r detail_t;
      cursor c_h is select c_id, ...
                      from ...
                     order by 1;                      -- header cursor
      cursor c_d is select c_id, ...
                      from ...
                     order by 1;                      -- detail cursor
      end_line    varchar2(2) := chr(13) || chr(10);  -- chr(10) to be used for non Windows
      a_separator varchar2(1) := ',';
      a_buffer    varchar2(32767);
      l_buffer    constant number := 32767;
      f_handle    utl_file.file_type;
      procedure buffer_put(p_line in varchar2) is
      begin
        if length(a_buffer) + length(p_line) < l_buffer then
          a_buffer := a_buffer || p_line;
        else
          utl_file.put(f_handle,a_buffer);
          a_buffer := p_line;
        end if;
      end;
      function build_header_record(p_record in header_t) return varchar2 is
        retval varchar2(4000) := '';
      begin
        retval := retval || to_char(p_record.c_id) || a_separator;
        retval := retval || ... || a_separator;
        retval := retval || ... || end_line;
        return retval;
      end;
      function build_detail_record(p_record in detail_t) return varchar2 is
        retval varchar2(4000) := '';
      begin
        retval := retval || to_char(p_record.c_id) || a_separator;
        retval := retval || ... || a_separator;
        retval := retval || ... || end_line;
        return retval;
      end;
    begin
      f_handle := utl_file.fopen ('THE_DIRECTORY','the_file.ext','w',l_buffer);
      open c_h;
      open c_d;
      loop
        fetch c_h into header_r;
        exit when c_h%notfound;
        buffer_put(build_header_record(header_r));
        if c_h%rowcount > 1 and (header_r.c_id = detail_r.c_id) then
          buffer_put(build_detail_record(detail_r));
        end if;
        loop
          fetch c_d into detail_r;
          exit when c_d%notfound or (detail_r.c_id != header_r.c_id);
          buffer_put(build_detail_record(detail_r));
        end loop;
      end loop;
      if length(a_buffer) > 0 then
        utl_file.put(f_handle,a_buffer);
      end if;
      utl_file.fflush(f_handle);
      utl_file.fclose(f_handle);
    end;
    Regards
    Etbin
    utl_file.fclose(f_handle); instead of utl_file.fclose;
    Edited by: Etbin on 11.8.2009 8:54
