Splitting large message (60MB) based on payload data

Hi,
I have a file (flat) to file (XML) scenario. The source flat file is read by FCC. Since the source flat file is large (up to 60 MB), I have to split it into smaller files, so I applied "Recordsets per Message" at FCC level to split the large file into smaller ones. But the client's requirement is to split the large document based on payload data (namely DeliveryDate). That means I can't split the message based on the number of rows of the flat file; instead I have to split it on the basis of DeliveryDate, so that after splitting, each small file contains data for exactly one date (say, one file with data for 15 Nov and another file with data for 16 Nov, and so on).
Please suggest a solution to split the large file (60 MB) based on payload data (DeliveryDate).
Br,
Madan Agrawal

Hi Madan,
In this case, split the message into smaller messages, e.g. 2 MB files.
XI doesn't handle 60 MB files well, so you have to split the flat file based on some condition.
I had the same requirement: a flat file with a huge amount of data. I split that data using a Java mapping,
then processed it, and it worked fine for me.
I think you can do the same.
In my case I split the message based on a unique sequence number to differentiate the data.
If there is a sequence number in your data, split the message on that.
Regards,
Raj
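
A minimal sketch of the kind of split Raj describes, written as a standalone Java program that writes one output file per DeliveryDate before XI picks the files up. The tab delimiter, the position of the DeliveryDate field (third column) and the output file naming are assumptions only; adjust them to your actual flat-file structure.

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Splits a large flat file into one file per DeliveryDate so that each
// output file contains the records of exactly one date.
public class SplitByDeliveryDate {

    public static void main(String[] args) throws IOException {
        File source = new File(args[0]);
        String targetDir = source.getAbsoluteFile().getParent();
        Map<String, BufferedWriter> writers = new HashMap<String, BufferedWriter>();
        BufferedReader in = new BufferedReader(new FileReader(source));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split("\t");   // assumed tab-separated
                if (fields.length < 3) {
                    continue;                         // skip malformed lines
                }
                String deliveryDate = fields[2];      // assumed 3rd column holds DeliveryDate
                BufferedWriter out = writers.get(deliveryDate);
                if (out == null) {
                    // one output file per date, e.g. orders_20061115.txt (name is illustrative)
                    out = new BufferedWriter(new FileWriter(
                            new File(targetDir, "orders_" + deliveryDate + ".txt")));
                    writers.put(deliveryDate, out);
                }
                out.write(line);
                out.newLine();
            }
        } finally {
            in.close();
            for (BufferedWriter w : writers.values()) {
                w.close();
            }
        }
    }
}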

Similar Messages

  • How To split large message using File Adapter

    Hello everyone,
    Here is my scenario FTP > XI > BI.
    I got 2 questions:
    (1) I am getting a large message, around a 70 MB file.
        How can we split the message into multiple files of 10 MB each before processing the file in XI?
    (2) Is there any way to find out the size of a file on the FTP server through XI, before processing the file,
        without contacting the FTP admin?
    Thanks
    Vick

    hi vick,
    check the blog
    Zip or Unzip your Payload with the new PayloadZipBean module of the XI Adapter Framework                                   
    Working with the PayloadZipBean module of the XI Adapter Framework
    SAP XI acting as a (huge) file mover                                   
    Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED                                   
    regards
    kummari

  • Processing large message payload with weblogic JMS

    Hi,
    I have a requirement in my project to process 50 MB / 500 MB / 1 GB files using a JMS queue.
    We were able to process a 50 MB payload, but it's taking almost 2-3 minutes to post the message and 2-4 minutes to consume it. Are there any configurable parameters available to fine-tune the JMS queue to handle large payloads? Or is there any limitation on the size of message the queue can process?
    Please advise.
    Thanks,
    Sri

    There is a way to defer large messages from the rest and process them using a job in a specific queue.
    Please check this configuration :
    http://help.sap.com/saphelp_nw2004s/helpdata/en/14/80243b4a66ae0ce10000000a11402f/frameset.htm (Message Selection Filter)
    in this path:
    Under Runtime -> Integration Engine ->
    Prioritized Message Processing
    Queues for Prioritized Messages
    Good luck
    Nimrod

  • Large Message Payload - Queue selection

    Hi,
    I have one message with a payload size of around 105 MB. When I try to send it through XI, the message fails in the inbound queue of the Integration Engine with a memory problem while processing the message. I have seen that it is being picked up by queue XBT09_000, but I want it to use an XBTL* queue, as those queues are for large messages. In short, when my message is around 100 MB I want it routed to the large-message processing queue, i.e. XBTL*. Is there any way to do this? Any help on this is greatly appreciated.
    Thanks.,
    Daniel.LA

    There is a way to defer large messages from the rest and process them using a job in a specific queue.
    Please check this configuration :
    http://help.sap.com/saphelp_nw2004s/helpdata/en/14/80243b4a66ae0ce10000000a11402f/frameset.htm (Message Selection Filter)
    in this path:
    Under Runtime -> Integration Engine ->
    Prioritized Message Processing
    Queues for Prioritized Messages
    Good luck
    Nimrod

  • Large payload data to  SOA platform

    How do we upload large payload data in our BPEL process? Please let me know your suggestions.
    Thank you
    Balaji
    Edited by: Hari.luckey on Dec 5, 2012 6:19 PM

    Hi,
    Thanks for your reply!
    Actually, I got this error:
    Error parsing envelope: (92635, 79) Expected 'EOF'.; nested exception is:
         javax.xml.soap.SOAPException: Error parsing envelope: (92635, 79) Expected 'EOF'.
    when the WS method returns a large payload.
    I got this error only when the WS returns a lot of data. So I supposed that was the problem?!
    I've tried your solution, but nothing has changed.
    any idea?
    Many thanks,
    Regards

  • Split one message and create N Files on target side based on FieldName

    Hi Experts
    How do I split one message into N messages?
    I have used a BPM with
    one Receiver step,
    one Transformation step and
    one Send step.
    In the Receiver step I have used a correlation on Field2.
    In the Transformation step I have done a one-to-one mapping.
    But on the receiver side only one file is being created.
    The message structure is:
    <Main_MT>                      (1..1)
        <test>01</test>            (1..1)
        <Sub_MT>                   (0..unbounded)
            <Field1> </Field1>
            <Field2>123</Field2>
        </Sub_MT>
        <Sub_MT>                   (0..unbounded)
            <Field1> </Field1>
            <Field2>234</Field2>
        </Sub_MT>
    </Main_MT>
    How to resolve this problem
    Thanks & Regards
    Sowmya

    Hey Jayson,
    I didn't search the blogs using the keyword "multiple mapping". I had used the two blogs I gave for creating my first 1:n mapping, so I know their names by heart.
    I believe both of the blogs I provided help in creating multiple separate messages on the target end.
    Claus's blog covers the mapping logic for the message split in more detail, and Jin's blog covers the ID pieces that need to be configured when we want to create multiple messages on the target end.
    Pardon me if I have misconstrued your point.
    Thanks,
    Pooja

  • Month Year values based on Posting Date

    In my super huge extra large InfoCube (0CFM_C10) I have a lot of data. I take Posting Date, some KFG and CalMonth/Year. Unfortunately, CalMonth/Year duplicates records; if I drop it from the columns/rows I get valid data by Posting Date.
    My question is this: is it possible to create some Month/Year calculated KFG/field/formula or something based on Posting Date? In other words, I need Month/Year in rows/columns or free characteristics...
    Edited by: Gediminas Berzanskis on Mar 18, 2008 10:18 AM

    Dear,
    When canceling a payment which was created in a previous posting period,
    we get the system message "Date deviates from permissible range", so
    the workaround is changing the posting period back to the previous one
    and then trying to cancel the payment.
    However, another system message pops up when we try to cancel the payment
    after changing back the posting period, asking for the "creation date" or
    "posting date".
    In this scenario, you should select the second option from the
    cancellation options window, which is the 'Creation date'. I would like
    to explain more below.
    Posting Date - means the posting date of the cancellation document; it is
    not the posting date of the incoming payment that you want to cancel.
    If you select this 'posting date' option, the system deems that you want
    to post this cancellation document on its own posting date.
    Creation Date - means the posting date/creation date of the incoming
    payment; it makes sense that the system works fine if you select this
    option. If you cancel the incoming payment and check the JE generated,
    you will find that the posting date of this cancellation document is
    actually recorded as the posting date of the incoming payment.
    I wish it helps you. If you have any problems, please kindly let me know.
    Thanks and best regards,
    Apple

  • JDBC Adapter: J2EE server crashes while sending large messages

    We want to use the following scenario to transfer data from a MS SQL Server to SAP BW via XI:
    JDBC Sender Adapter – XI – SAP ABAP Proxy.
    Everything works fine with a small amount of data. But if the select statement delivers too many record sets and the size of the transformed XML payload is greater than 50 MB, the J2EE server crashes. A complete restart is necessary. It seems to be a memory problem.
    Here are the entries from our log files:
    dev_server0
    [Thr 6151] Mon Jul 24 12:46:57 2006
    [Thr 6151] JLaunchIExitJava: exit hook is called (rc=666)
    [Thr 6151] **********************************************************************
    ERROR => The Java VM terminated with a non-zero exit code.
    Please see SAP Note 940893 , section 'J2EE Engine exit codes'
    for additional information and trouble shooting.
    [Thr 6151] SigISetIgnoreAction : SIG_IGN for signal 17
    [Thr 6151] JLaunchCloseProgram: good bye (exitcode=666)
    std_server0.out
    FATAL: Caught OutOfMemoryError! Node will exit with exit code 666java.lang.OutOfMemoryError
    Is this a general problem of the XI or a specific one of our configuration? Is it possible to transfer such large messages via XI? If not, is there a workaround for such scenarios?
    (Memory heap size of the J2EE server is 1024 MB.)

    > Hi Gil,
    >
    > I had nearly the same problem a few times in practice,
    > and the mapping was the reason. Just change your
    > interface determination temporarily, delete the mapping
    > and test again to find out whether the mapping is the
    > reason.
    >
    > Regards,
    > Udo
    I have changed my interface determination so that no message mapping is used. The J2EE server still crashes.
    > Hi Gil,
    > This does sound like a memory problem, especially
    > when it comes to a 50 MB message with minimum XI system
    > requirements...
    > To be sure, you can check the component monitoring
    > in the RWB for the JDBC adapters, look for your adapter,
    > and check the status of the adapter and the trace
    > there...
    Hi Nimrod
    In case of such an error I have no entries in the channel monitor, so I can't see anything there. I also have no entries in the message monitor of the RWB in this case. So I don't get any information from the standard XI tools.
    > My recommendation to you is to set the poll interval
    > to a shorter period; this way you'll make sure you get
    > fewer records... I hope you have remembered to add a
    > status/flag column on the table to be set after
    > selection, so that no duplicate records will be picked up
    > on subsequent polls.
    >
    The problem is that the source of my data is not a simple SQL statement but a stored procedure, so I don't know exactly how many records will be delivered. An update command is not possible.

  • How to Split the message content?

    Hi ....
    I have an input file in the format below.
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_ISO8583 xmlns:ns0="http://axis.com/bank_statement">
       <Field1></Field1>   <!-- 0..1 -->
       <Filed2></Filed2>   <!-- 0..1 -->
       <Filed3></Filed3>   <!-- 0..unbounded -->
    </ns0:MT_ISO8583>
    Here Field3 repeats many times. The value in Field3 has to be split and mapped to the target fields. Field3 stops repeating when the first character of its value is 'N'.
    The field is 142 characters in length.
    The field should be split based on the following structure.
    More data flag : 1 char ('Y'/'N') - indicates whether there are more records for the given criteria
    Number of statements : 2 char (00 to 20) - number of statements in this fetch
    The following fields are repeated as many times as the number of statements:
    Transaction date : 8 char ( YYYYMMDD )
    Transaction Id : 9 char (right justified, left padded with spaces)
    Part tran serial num : 4 char (right justified left padded with spaces)
    Tran Type : 1 char ( C - Cash, T - Transfer, L - Clearing )
    Tran sub type : 2 char ( BI, CI, NP, NR)
    Debit credit indicator : 1 char ( D - Debit, C - Credit )
    Tran value date : 8 char ( YYYYMMDD )
    Transaction Amount : 17 char ( with decimal )
    Transaction particulars : 50 Char (left justified, right padded with spaces)
    Transaction Posted date : 14 char ( YYYYMMDDHHMISS )
    Instrument number : 8 char (right justified, left padded with spaces)
    Balance at the end of the transaction : 17 char ( with decimal )
    Can you help me with how to split the message content in Field3?
    Below is the sample message.
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_ISO8583 xmlns:ns0="http://axis.com/bank_statement">
       <Field1>930000</Field1>
       <Filed2>234259901406</Filed2>
       <Filed3><Y0820090127S275433861005TBIC20090127 4872.14EDC/212-213-/M000044375340045 20090127151 959 32663.7620090127S278456264276TBIC20090127 5290.45EDC/214-215-/M000044375340045 20090127205840 37954.2120090131S299799314797TBIC20090131 1883.88EDC/216-216-/M000044375340045 20090131104028 39838.0920090202S31532814 662TBIC20090202 3105.37EDC/217-217-/M00004 4375340045 20090202121514 42943.4620090205S337981634779TBIC20090205 2052.20EDC/2 18-218-/M000044375340045 20090205104040 44995.6620090205 M53898 1CNPD20090205 42000.00TO CASH/SELF 20090205111633 12824 2995.6620090207S354072734740TBIC20 090207 4429.21EDC/219-219-/M000044375340045 20090207112958 7424.8720090209S3671 72631419TB Field 126: IC20090209 7161.55EDC/220-221-/M000044375340045 20090209122637 14586. 42]]></Filed3>
       <Filed3><Y0820090127S275433861005TBIC20090127 4872.14EDC/212-213-/M000044375340045 20090127151 959 32663.7620090127S278456264276TBIC20090127 5290.45EDC/214-215-/M000044375340045 20090127205840 37954.2120090131S299799314797TBIC20090131 1883.88EDC/216-216-/M000044375340045 20090131104028 39838.0920090202S31532814 662TBIC20090202 3105.37EDC/217-217-/M00004 4375340045 20090202121514 42943.4620090205S337981634779TBIC20090205 2052.20EDC/2 18-218-/M000044375340045 20090205104040 44995.6620090205 M53898 1CNPD20090205 42000.00TO CASH/SELF 20090205111633 12824 2995.6620090207S354072734740TBIC20 090207 4429.21EDC/219-219-/M000044375340045 20090207112958 7424.8720090209S3671 72631419TB Field 126: IC20090209 7161.55EDC/220-221-/M000044375340045 20090209122637 14586. 42]]></Filed3>
       <Filed3><Y0820090127S275433861005TBIC20090127 4872.14EDC/212-213-/M000044375340045 20090127151 959 32663.7620090127S278456264276TBIC20090127 5290.45EDC/214-215-/M000044375340045 20090127205840 37954.2120090131S299799314797TBIC20090131 1883.88EDC/216-216-/M000044375340045 20090131104028 39838.0920090202S31532814 662TBIC20090202 3105.37EDC/217-217-/M00004 4375340045 20090202121514 42943.4620090205S337981634779TBIC20090205 2052.20EDC/2 18-218-/M000044375340045 20090205104040 44995.6620090205 M53898 1CNPD20090205 42000.00TO CASH/SELF 20090205111633 12824 2995.6620090207S354072734740TBIC20 090207 4429.21EDC/219-219-/M000044375340045 20090207112958 7424.8720090209S3671 72631419TB Field 126: IC20090209 7161.55EDC/220-221-/M000044375340045 0090209122637 14586. 42]]></Filed3>
       <Filed3><N0820090127S275433861005TBIC20090127 4872.14EDC/212-213-/M000044375340045 20090127151 959 32663.7620090127S278456264276TBIC20090127 5290.45EDC/214-215-/M000044375340045 20090127205840 37954.2120090131S299799314797TBIC20090131 1883.88EDC/216-216-/M000044375340045 20090131104028 39838.0920090202S31532814 662TBIC20090202 3105.37EDC/217-217-/M00004 4375340045 20090202121514 42943.4620090205S337981634779TBIC20090205 2052.20EDC/2 18-218-/M000044375340045 20090205104040 44995.6620090205 M53898 1CNPD20090205 42000.00TO CASH/SELF 20090205111633 12824 2995.6620090207S354072734740TBIC20 090207 4429.21EDC/219-219-/M000044375340045 20090207112958 7424.8720090209S3671 72631419TB Field 126: IC20090209 7161.55EDC/220-221-/M000044375340045 20090209122637 14586. 42]]></Filed3>
    </ns0:MT_ISO8583>
    Thanks & Regards,
    Leela

    Use substring to get the values you want, and map them to the corresponding target nodes, except:
       Filed3 -> count -> Number of statements
    Regards,
    Liang
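
    A sketch of the substring-based parsing Liang suggests, following the fixed-width offsets listed in the question (1-char more-data flag, 2-char statement count, then 139-char statement records). In a real graphical UDF you would add each parsed row to the result list instead of returning a String[], and java.util.* would go into the UDF imports; this is illustrative only.

    import java.util.ArrayList;
    import java.util.List;

    // Parses the 142-char Filed3 block into one pipe-delimited row per statement.
    public class Filed3Parser {

        public static String[] parseStatements(String filed3) {
            List<String> rows = new ArrayList<String>();
            // char 0 is the more-data flag ('Y'/'N'); chars 1-2 are the statement count
            int count = Integer.parseInt(filed3.substring(1, 3).trim());
            int pos = 3;
            for (int i = 0; i < count && pos + 139 <= filed3.length(); i++) {
                String tranDate     = filed3.substring(pos,       pos + 8);
                String tranId       = filed3.substring(pos + 8,   pos + 17).trim();
                String serialNum    = filed3.substring(pos + 17,  pos + 21).trim();
                String tranType     = filed3.substring(pos + 21,  pos + 22);
                String tranSubType  = filed3.substring(pos + 22,  pos + 24);
                String drCrInd      = filed3.substring(pos + 24,  pos + 25);
                String valueDate    = filed3.substring(pos + 25,  pos + 33);
                String amount       = filed3.substring(pos + 33,  pos + 50).trim();
                String particulars  = filed3.substring(pos + 50,  pos + 100).trim();
                String postedDate   = filed3.substring(pos + 100, pos + 114);
                String instrumentNo = filed3.substring(pos + 114, pos + 122).trim();
                String balance      = filed3.substring(pos + 122, pos + 139).trim();
                rows.add(tranDate + "|" + tranId + "|" + serialNum + "|" + tranType
                        + "|" + tranSubType + "|" + drCrInd + "|" + valueDate + "|"
                        + amount + "|" + particulars + "|" + postedDate + "|"
                        + instrumentNo + "|" + balance);
                pos += 139;   // advance to the next fixed-width statement record
            }
            return rows.toArray(new String[rows.size()]);
        }
    }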

  • Determine the size of EDI payload data

    Hi Experts,
    As far as my B2B knowledge goes, in order to know the size of an EDI payload... we download the payload from the wire message or payload, copy the data and paste it into a file. The size of that file gives the EDI payload size.
    But this is a tedious task, particularly in cases where there is a huge amount of EDI data. As far as I know, the size of EDI files ranges from 1 MB to 1 GB.
    Please advise what I should do in the case of files with large payload sizes (more than 25 MB).
    Please advise if there is any B2B table whose one of the columns depicts the size of the payload.
    I have searched through some b2b table as b2b_messageinstance, b2b.ip_b2b_report but could not find any such columns.
    Please advise what exactly the procedure is to determine the size of EDI payload data, particularly for cases where payload data is larger than 25 MB.

    I am afraid that there is no direct way of finding the payload size in 10g. You may write your own standalone program or API which may calculate the size of payload by querying the b2b_instancemessage view or by calling the B2B InstanceMessage API -
    http://www.oracle.com/technetwork/testcontent/b19324-01-instance-msg-api-129535.zip
    Regards,
    Anuj
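
    A rough standalone sketch of the kind of check Anuj describes, computing payload sizes directly from the B2B repository instead of downloading each payload to a file. The connection details and the column names (MSG_ID, PAYLOAD) are placeholders only; check the actual b2b_instancemessage view definition in your instance before relying on this.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Prints approximate payload sizes in MB for recent B2B message instances.
    public class PayloadSizeReport {
        public static void main(String[] args) throws Exception {
            // placeholder connect string and credentials
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//host:1521/b2bdb", "b2b_user", "password");
            try {
                // MSG_ID and PAYLOAD are assumed column names, not confirmed
                PreparedStatement ps = con.prepareStatement(
                        "SELECT MSG_ID, dbms_lob.getlength(PAYLOAD) AS payload_bytes "
                        + "FROM b2b_instancemessage WHERE ROWNUM <= 100");
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    System.out.println(rs.getString(1) + " : "
                            + (rs.getLong(2) / (1024.0 * 1024.0)) + " MB");
                }
            } finally {
                con.close();
            }
        }
    }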

  • In Mail, one mailbox for Recovered Message (AOL) keeps showing 1 very large message that I cannot delete. How can I get rid of this recurring problem, please?

    In Mail on iMac, successfully running OS X Lion, one mailbox on My Mac for "Recovered Messages (from AOL)" keeps showing 1 very large message (more than 20 Mb) that I just cannot seem to delete. Each time I go into my In Box, the "loading" symbol spins and the message appears in the "Recovered Messages" mailbox. How can I get rid of this recurrent file, please?
    At the same time, I'm not receiving any new mails in my In Box, although, if I look at the same account on my MacBook Pro, I can indeed see the incoming mails (but on that machine I do not have the "recovery" problem).
    The help of a clear-thinking Apple fan would be greatly appreciated.
    Many thanks.
    From Ian in Paris, France

    Ian
    I worked it out.
    Unhide your hidden files ( I used a widget from http://www.apple.com/downloads/dashboard/developer/hiddenfiles.html)
    Go to your HD.
    Go to Users.
    Go to your House (home)
    there should be a hidden Library folder there (it will be transparent)
    Go to Mail in this folder
    The next folder ( for me ) is V2
    Click on that and the next one will be a whole list of your mail servers, and one folder called Mailboxes
    Click on that and there should be a folder called recovered messages (server).mbox
    Click on that; inside there is a random numbered/lettered folder -> data
    In that data folder is a list of random numbered folders (i.e. a folder called 2, one called 9, etc.) and in EACH of these, another numbered folder, and then a folder called messages.
    In the messages folder delete all of the ebmx files (I think that's what they were, from memory; sorry, I forgot, as I already deleted my trash after my golden moment).
    This was GOLDEN for me. Reason being, when I went to delete my "recovered file" in Mail, it would give me an error message "cannot delete 2500 files". I knew it was only 1 file, so this was weird. Why 2500 files? Because if you click on the ebmx files like I did, hey presto, it turned out that they were ALL THE SAME MESSAGE, 2500 times, in each of those random numbered folders, in their related messages folder.
    Now remember - DON'T delete the folder; make sure you have gone to the messages folder, found all those pesky ebmx files and deleted THOSE, not the folder.
    It worked for me. No restarting or anything. And recovered file. GONE.
    Started receiving and syncing mail again. Woohoo.
    Best wishes.

  • How to split the messages in the mapping

    Hi Gurus,
    I need to split the message into two XML messages based on the value in the plant field and send them to two receivers.
    How do I do this using graphical mapping?
    I'm working in PI 7.0. I don't know how to use enhanced receiver determination; please guide me.
    This is my input message format
    <ns0:Namespace>
       <row>
          <PlantCode>10</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
       <row>
          <PlantCode>40</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
       <row>
          <PlantCode>20</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
       <row>
          <PlantCode>50</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
    </ns0:Namespace>
    My output message should be
    Message1
       <row>
          <PlantCode>10</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
       <row>
          <PlantCode>20</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
    Message2
       <row>
          <PlantCode>40</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
       <row>
          <PlantCode>50</PlantCode>
          <element1>
          <element2>
          <element3>
       </row>
    Based on the plant I have to split the message:
    1. If the plant is 10 or 20, it has to go to message 1.
    2. If the plant is 40 or 50, it has to go to message 2.
    How do I do this?
    Regards,
    Rama

    1. Create a message mapping.
    2. In the mapping editor, switch to the Messages tab page.
    3. Specify the same target message type twice by choosing +.
    4. Switch to the Design tab page.
    PlantCode -> removeContext -> equalS(constant[10]) \
                                                         OR -> ifWithoutElse(then: PlantCode) -> Message1
    PlantCode -> removeContext -> equalS(constant[20]) /
    PlantCode -> removeContext -> equalS(constant[40]) \
                                                         OR -> ifWithoutElse(then: PlantCode) -> Message2
    PlantCode -> removeContext -> equalS(constant[50]) /
    Do a 1:1 mapping between row, PlantCode, element1, element2, element3 from the source to the target structure.
    Create 2 target message interfaces for the same target message type.
    Finally, create an interface mapping and reference both target interfaces as its target interfaces. Enter your message mapping in the interface mapping (1 source interface and 2 target interfaces).
    In ID, in the interface determination, choose the Enhanced radio button and add both target interfaces under inbound interfaces.
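
    If you would rather do the split in Java instead of the graphical multi-mapping above, a rough sketch of just the grouping logic follows. It is not a complete PI Java mapping (the StreamTransformation wrapper is omitted), the element names follow the structure in the question, and the ns0:Messages/Message1/Message2 multi-mapping envelope is written as plain strings; in a real mapping each MessageN body would also need the target root element around its rows.

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.io.OutputStreamWriter;
    import java.io.StringWriter;
    import java.io.Writer;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    // Partitions <row> elements by PlantCode: 10/20 -> Message1, 40/50 -> Message2.
    public class PlantSplitMapping {

        public static void split(InputStream in, OutputStream out) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(in);
            NodeList rows = doc.getElementsByTagName("row");

            StringBuffer msg1 = new StringBuffer();
            StringBuffer msg2 = new StringBuffer();
            for (int i = 0; i < rows.getLength(); i++) {
                Element row = (Element) rows.item(i);
                String plant = row.getElementsByTagName("PlantCode")
                        .item(0).getTextContent().trim();
                String rowXml = serialize(row);
                if ("10".equals(plant) || "20".equals(plant)) {
                    msg1.append(rowXml);
                } else if ("40".equals(plant) || "50".equals(plant)) {
                    msg2.append(rowXml);
                }
            }

            Writer w = new OutputStreamWriter(out, "UTF-8");
            w.write("<ns0:Messages xmlns:ns0=\"http://sap.com/xi/XI/SplitAndMerge\">");
            w.write("<ns0:Message1>" + msg1 + "</ns0:Message1>");
            w.write("<ns0:Message2>" + msg2 + "</ns0:Message2>");
            w.write("</ns0:Messages>");
            w.flush();
        }

        // Minimal element-to-string serialization, enough for this sketch.
        private static String serialize(Element e) throws Exception {
            javax.xml.transform.Transformer t =
                    javax.xml.transform.TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(javax.xml.transform.OutputKeys.OMIT_XML_DECLARATION, "yes");
            StringWriter sw = new StringWriter();
            t.transform(new javax.xml.transform.dom.DOMSource(e),
                    new javax.xml.transform.stream.StreamResult(sw));
            return sw.toString();
        }
    }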

  • Combining different measures based on different dates in a single table

    Hi,
    I'm attempting to produce a report that gives two counts of items in a database, the first based on the date added to the database, the second based on the date marked as deleted, all reported by month over the last 12 months.
    The following pertinent fields are available for use in the query builder:
    Item ID, Entry Date, Deleted Date, End date of previous month, End date of previous month last year.
    The report should look something like this:
    Month     Total Items   Additions   Deletions
    May 08    54            43          3
    Apr 08    654           600         0
    Mar 08    654           0           0
    Feb 08    654           10          10
    Jan 08    53            0           601
    Jul 08    96            3           2
    Jun 07    46            4           54
    Month:
    =If(DaysBetween([Last Day Of Prev Month Prev Year];[Entry Date])>0;
        Month([Entry Date]) + " " + FormatNumber(Year([Entry Date]);"####");
        "Previous Balance")
    Total Items:
    =RunningSum(Count([Item ID])) - RunningSum(Count([Item ID]) Where(Not(IsNull([Deleted Date]))))
    Item Additions:
    =Count([Item ID])
    Item Deletions:
    =Count([Item ID]) Where(Not(IsNull([Deleted Date])))
    The part I'm having problems with is splitting the results by month, such that the number of deletions equals the number of items deleted that month, rather than the number of items deleted that month of the items added that month. I.e. I need to split by month on the additions date for the additions column and by month on the deletions date for the deletions column.
    At the moment, if the 54 items deleted in Jul 07 were originally added to the DB in, say Mar 06, then they would show up as deletions in Mar 06 - because the Month column is being split out based on Additions date. Of course I could reverse this and split the months out by deletion date, but then I would have the same problem in reverse - I can't see any way of doing both.
    Any ideas?
    I hope I've explained that so that it makes vague sense, please ask if it doesn't!
    Many thanks in advance,
    Steve

    Have you tried splitting this out into 2 separate queries, the first to return the additions by month and the second to return the deletions by month? You should then be able to merge the date dimensions and report both additions and deletions against a single date dimension.
    Regards,
    Mike

  • ABAP Mapping for Large Messages

    Hi Folks,
    We are exploring different options for dealing with the fact that XI will choke on very large messages/files. One of the options that we are considering is a third party tool that bypasses XI. However, we've just learned that it may be possible to solve the large message problem by using ABAP Mapping, because supposedly, by doing so, one would bypass the large message being converted to XML as it comes into XI. The scenario involves messages coming into XI on their way to SAP R/3. Does the ABAP Mapping option appear to be viable to you experienced folks out there?
    Thanks
    Nic

    Hey
    Tunneling as a term is used mainly for IDocs, but we use the term bypass to implement the same concept with other interfaces.
    If you have a simple 1:1 mapping, you can do a bypass scenario in which you don't do anything in the IR; you simply do the configuration in the ID.
    You cannot drastically increase performance just by choosing some specific mapping type; mapping is not made for this.
    For best performance you can design a bypass JDBC to IDoc scenario (but then you won't be able to do any message mapping).
    Have a look at the following for a bypass scenario:
    /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository
    If you want to do message mapping, then design a JDBC to proxy scenario.
    A proxy is mainly used to enhance performance and would be the best bet for you, I guess.
    Thanx
    Aamir suhail
    Message was edited by:
            Aamir Suhail

  • Alert Configuration  in PI 7.31 using SOLMAN based on Payload

    Hello Experts,
    We have created alert configuration in PI which will be consumed by SOLMAN and alerts will be triggered to respective recipients based on alert rules.
    The alert rules are created using standard configuration objects in NWA. Everything is good up to now, but the issue we have here is as follows:
    We have interfaces segregated country-wise, so the configuration objects are different and alert rules are created for them without any issues.
    But for some countries the interfaces were developed by re-using the existing PI objects, and this is where the problem starts for us:
    Since the PI configuration objects are the same, we are not able to segregate the alert rules using configuration objects, resulting in only one common alert rule for all these countries.
    In this case, alerts will be routed to all these countries whenever an alert is generated. Obviously this is not the best way to proceed. But there is a difference in the message payload based on the country.
    Hope my issue is clear to you.
    Is there any way to set these alert rules based on payload information rather than PI configuration objects?
    Please help me to achieve this; your help is much appreciated.
    Thanks,
    Venkat

    Hi,
    XPI Service: AII Config Service
    - com.sap.aii.rwb.server.centralmonitoring.r3.ashost
    - com.sap.aii.rwb.server.centralmonitoring.r3.sysnr
    are set as expected, but there is still a problem.
