Reading Multiple Structures

Hi,
Could anyone tell me how to read multiple structures in a single BPEL process?
I have a requirement to read multiple files with different structures in a single BPEL process.
All these files will be FTPed to and read from a single source folder, and they need to be transformed into a common canonical XML format [OAGIS]. Any possibilities?
Thanks in Advance
Regards
Sathish

If "reading multiple structures" means reading multiple file types from an FTP location, then
the solution to the above problem is to either follow a naming convention or stick to one file type (*.* is not allowed in the FTP adapter).
Following the naming convention: uploaded files should start with somename_----.txt or csv or any structure, type...
Then we can put a condition like somename*.* that will pick up all files starting with somename, regardless of structure.
Try this link and study the documentation:
http://download.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/adptr_file.htm#CIABDJEF
Hope this helps.
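To see how a prefix pattern like somename*.* narrows the pickup, here is a small standalone Java sketch using java.nio.file.PathMatcher, which supports the same glob-style wildcards (the file names below are made up for illustration):

```java
import java.nio.file.FileSystems;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class GlobDemo {
    public static void main(String[] args) {
        // Same wildcard idea as the adapter's somename*.* include pattern
        PathMatcher matcher =
            FileSystems.getDefault().getPathMatcher("glob:somename*.*");

        String[] candidates = {"somename_orders.txt", "somename_items.csv", "other.txt"};
        List<String> picked = new ArrayList<>();
        for (String name : candidates) {
            if (matcher.matches(Paths.get(name))) {
                picked.add(name);
            }
        }
        System.out.println(picked);  // [somename_orders.txt, somename_items.csv]
    }
}
```

The adapter's include pattern works the same way: anything that doesn't start with somename is simply never picked up.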

Similar Messages

  • FTP Adapter to read multiple files from a directory. Not through polling.

    Dear Friends,
    I would like to know if it is possible to configure the FTP adapter in Oracle BPEL 10.1.3.4 to read multiple files (different names, same structure) from a given directory. I do not want the BPEL process to poll; instead, when I submit the BPEL process it should read all files from the directory.
    I was looking at the option of a synchronous read, but I am not able to specify a wildcard in the file name field, and I do not know the file names at the time of reading.
    Thanks for your help!

    Hi,
    While reading the file, you can configure an adapter property on the 'Receive' activity. This stores the filename, and that filename can then be used as the input parameter for the synchronous read.
    1. Create a message-type variable called 'fileheader'. This should be of type Inboundheader_msg (whatever is relevant to your Receive activity).
    2. This variable will contain three parts: filename, FTPhost, FTPPort.
    3. Copy this fileheader to 'syncheader'.
    4. 'syncheader' can be passed as an adapter property during the synchronous read of the file.
    On both Receive and Invoke, you need to navigate to the 'Adapter' tab to choose the created message-type variable.
    Let me know if you have further questions.
    regards,
    Rev

  • APD using Query with multiple structures as a data source

    All,
    I want to set up an automatic process which executes a query and exports it to a shared drive as a CSV file. I have tried various options. When I try to use APD to set up the extract, I get an error because the query I am trying to use has structures in both rows and columns; hence, I am unable to use this option. I tried RSCRM_BAPI; it works well, but there is an issue with scheduling it in a process chain. I created an event and scheduled the extract as a job triggered after the event, as per SAP instructions, but the job does not exist, and it is not possible to trigger it through the process chain unless the variables are hard-coded in the query, which I do not want to do.
    Can any one tell me if there is a way to deal with APD using Query with multiple structures?
    Would really appreciate if some one can give me the right solution...
    Thanks

    Hi Tanu ,
    APD is an option, but it is not very good with large amounts of data, hierarchies, or attributes in your query structure.
    One more option for this requirement is a report program using function module RRW3_GET_QUERY_VIEW_DATA.
    This works fine with multiple structures etc.
    There are some overheads with this FM, e.g. if the amount of data is too large, the program will dump. The solution for that is to call the FM in a loop, dividing the amount of data to be fetched, e.g. reading the data quarter by quarter.
    To use this function module, you can write an ABAP program (in SE38) which calls the FM and then writes the output to a flat file saved on the application server (AL11). From there, other systems can read it.
    To automate the whole process, you can add all the report programs to a process chain (RSPC), which can be scheduled as per your requirement.
    To pass input parameters, you can use variants that pass the values to the report.
    Check this link for sample code:
    [http://www.tricktresor.de/content/index.php?navID=696&aID=496]
    Hope this is helpful.
    Regards,
    Jaya Tiwari

  • Read multiple rows to do aggregation

    Hi All
    I need some help here:
    1. I have to read multiple rows to check where time stamp is same for example rownum (1-3) : *'2013/03/09 10:54:09 PM'*
    2. And status is unique for example *(H,L)*
    3. And if TIME_STAMP is between TIME_A and TIME_B then flag it as Y else flag it as N.
    and if the above condition is not true then
    1. Flag the row with ' Y ' where time_stamp is same i.e. : '2013/03/09 10:54:09 PM'
    2. And the status is *( ' O ')*
    3. the other rows where time_stamp is same, and unique_id is same, and STATUS is either or both ( ' H ' , ' L ' ) flag it as *' N '*
    table structure:
    CREATE TABLE T2
    (
      TIME_STAMP  DATE,
      STATUS      VARCHAR2(1 BYTE),
      UNIQUE_ID   NUMBER,
      TIME_A      DATE,
      TIME_B      DATE
    );
    Insert statement;
    SET DEFINE OFF;
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 22:54:09', 'MM/DD/YYYY HH24:MI:SS'), 'H', 6587797, TO_DATE('03/09/2013 23:00:22', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 23:03:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 22:54:09', 'MM/DD/YYYY HH24:MI:SS'), 'L', 6587797, TO_DATE('03/09/2013 22:48:35', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 23:00:22', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 22:54:09', 'MM/DD/YYYY HH24:MI:SS'), 'O', 6587797, TO_DATE('03/09/2013 23:03:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 23:05:54', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 22:54:57', 'MM/DD/YYYY HH24:MI:SS'), 'H', 6587797, TO_DATE('03/09/2013 23:00:22', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 23:03:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 22:54:57', 'MM/DD/YYYY HH24:MI:SS'), 'L', 6587797, TO_DATE('03/09/2013 22:48:35', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 23:00:22', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 22:54:57', 'MM/DD/YYYY HH24:MI:SS'), 'O', 6587797, TO_DATE('03/09/2013 23:03:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 23:05:54', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 18:26:55', 'MM/DD/YYYY HH24:MI:SS'), 'L', 6583483, TO_DATE('03/09/2013 18:15:51', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 18:26:28', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into T2
       (TIME_STAMP, STATUS, UNIQUE_ID, TIME_A, TIME_B)
    Values
       (TO_DATE('03/09/2013 18:26:55', 'MM/DD/YYYY HH24:MI:SS'), 'O', 6583483, TO_DATE('03/09/2013 18:27:01', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('03/09/2013 18:33:10', 'MM/DD/YYYY HH24:MI:SS'));
    COMMIT;
    Query used:
    SELECT time_stamp,
           UPPER (SUBSTR (status, 1, 1)) AS status,
           Unique_ID,
           Time_A,
           Time_B,
           CASE
               WHEN time_stamp BETWEEN Time_A AND Time_B
                   THEN 'Y'
               ELSE 'N'
           END AS flag
      FROM t2;
    the query provided above doesn't work with these rows:
    TIME_STAMP,                STATUS,     UNIQUE_ID,      TIME_A,                             TIME_B,                                    FLAG
    2013/03/09 6:26:55 PM,   L,              6583483,          2013/03/09 6:15:51 PM,     2013/03/09 6:26:28 PM,              N
    2013/03/09 6:26:55 PM,   O,              6583483,          2013/03/09 6:27:01 PM,      2013/03/09 6:33:10 PM,             N
    because here the time_stamp is the same and is not between time_a and time_b, therefore it should be flagged Y on the row where status is *' O '*
    Edited by: 855161 on Apr 10, 2013 7:36 AM

    Hi,
    Sorry, it's unclear what you want.
    855161 wrote:
    Hi All
    I need some help here:
    1. I have to read multiple rows to check where time stamp is same for example rownum (1-3) : *'2013/03/09 10:54:09 PM'*
    2. And status is unique for example *(H,L)*
    3. And if TIME_STAMP is between TIME_A and TIME_B then flag it as Y else flag it as N.
    and if the above condition is not true then
    1. Flag the row with ' Y ' where time_stamp is same i.e. : '2013/03/09 10:54:09 PM'
    2. And the status is *( ' O ')*
    3. the other rows where time_stamp is same, and unique_id is same, and STATUS is either or both ( ' H ' , ' L ' ) flag it as *' N '*
    the query provided above doesn't work with these rows:
    TIME_STAMP,                STATUS,     UNIQUE_ID,      TIME_A,                             TIME_B,                                    FLAG
    2013/03/09 6:26:55 PM,   L,              6583483,          2013/03/09 6:15:51 PM,     2013/03/09 6:26:28 PM,              N
    2013/03/09 6:26:55 PM,   O,              6583483,          2013/03/09 6:27:01 PM,      2013/03/09 6:33:10 PM,             N
    because here where time_stamp is same and time_stamp is not between time_a and time_b therefore it should be flagged Y on the row where status is *' O '*
    What are the complete, correct results you want from this sample data?
    Are they the same as what you are currently getting, except that the 2 rows above should have flag = 'O'?
    Why should the row that has status='L' get flag='O'?
    Why don't the other rows that currently have status='O' and flag='N' need to be changed?
    To see if there are other rows with the same time_stamp, you can use the analytic COUNT function, like this:
    SELECT time_stamp,
           UPPER (SUBSTR (status, 1, 1)) AS status,
           Unique_ID,
           Time_A,
           Time_B,
           CASE
               WHEN time_stamp BETWEEN Time_A AND Time_B
                   THEN 'Y'
               WHEN COUNT (*) OVER (PARTITION BY time_stamp) > 1
                AND UPPER (SUBSTR (status, 1, 1)) = 'O'
                   THEN 'O'
               ELSE 'N'
           END AS flag
      FROM t2
     ORDER BY time_stamp, status;
    The results that this produces are:
                             UNIQUE
    TIME_STAMP           S      _ID TIME_A               TIME_B               F
    09-Mar-2013 18:26:55 L  6583483 09-Mar-2013 18:15:51 09-Mar-2013 18:26:28 N
    09-Mar-2013 18:26:55 O  6583483 09-Mar-2013 18:27:01 09-Mar-2013 18:33:10 O
    09-Mar-2013 22:54:09 H  6587797 09-Mar-2013 23:00:22 09-Mar-2013 23:03:00 N
    09-Mar-2013 22:54:09 L  6587797 09-Mar-2013 22:48:35 09-Mar-2013 23:00:22 Y
    09-Mar-2013 22:54:09 O  6587797 09-Mar-2013 23:03:00 09-Mar-2013 23:05:54 O
    09-Mar-2013 22:54:57 H  6587797 09-Mar-2013 23:00:22 09-Mar-2013 23:03:00 N
    09-Mar-2013 22:54:57 L  6587797 09-Mar-2013 22:48:35 09-Mar-2013 23:00:22 Y
    09-Mar-2013 22:54:57 O  6587797 09-Mar-2013 23:03:00 09-Mar-2013 23:05:54 O
    I don't think this is what you want, but I'm not sure.
    Can you explain, in a different way than before, what flag is supposed to show? What role (if any) does the so-called unique_id play in this problem?

  • Multiple structures in BEx used in CR

    hi,
    I have a BEx query (P&L) which uses multiple structures.
    When I open it in CR, I see my key figures OK, but the other structure is displayed as one field.
    I want to format my report in CR so that some fields in the structure are indented, some should be bold, etc.
    When my list of fields in the structure is displayed as only ONE field in CR, this formatting is applied to all fields in the structure, and that's not what I want.
    I have a workaround for this, but it's quite complex for the end users, so I wonder if you have any news on if and when we will be able to get a BEx query with multiple structures into CR and, in CR, be able to format each field in the structure in an easy way.
    Kind regards
    /martin

    hey, I have the same problem... what's your workaround? Can you give me a hint?

  • Read multiple files and write data on a single  file in java

    Hello,
    I am facing a difficulty: I want to read multiple files in Java and write their data to a single file, i.e. write data into one file from multiple files in Java.
    Please help me out.
    Naveed.

    The algorithm should be something like:
    File uniqueFile = new File("combined.txt");
    try (PrintWriter out = new PrintWriter(new FileWriter(uniqueFile))) {
        for (File f : manyFilesToRead) {
            try (BufferedReader in = new BufferedReader(new FileReader(f))) {
                String line;
                while ((line = in.readLine()) != null)
                    out.println(line);
            }
        }
    }

  • Issue in reading multiple time properties file

    Use Case:
    I have a page on which I have to show some outage messages, so I have configured the outage message in properties files. The message should be rendered each time the properties file is found on the classpath and the values are looked up by their keys.
    So I decided to write a Java method and call this method on loading the page, but what happens is that every time the page loads, the file reader is called again, which will create a memory leak.
    So, can anyone help me with this? What would be the best approach for reading the properties file multiple times?
    I will be grateful for any help you can provide
    Thanks

    Hi,
    have a bean at applicationScope, somewhat like this:
    import java.util.HashMap;
    public class ApplnProperties {
        private HashMap _propMap;
        public HashMap getPropMap() {
            if (_propMap == null) {
                // read the properties file and populate _propMap
            }
            return _propMap;
        }
    }
    Here we put the condition _propMap == null, so the file is read only once during the application lifecycle.
    On the page you can refer to the prop map like the following:
    #{beanName.propMap['KEY1']}
    Regards,
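    The same load-once idea can be sketched in plain Java using java.util.Properties instead of a HashMap (the class name, resource path, and key below are illustrative, not from the original post):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class OutageMessages {
    private static Properties props;  // cached after the first load

    // Loads the properties file at most once; later calls reuse the cached copy.
    public static synchronized Properties get() {
        if (props == null) {
            props = new Properties();
            try (InputStream in = OutageMessages.class
                    .getResourceAsStream("/outage.properties")) {
                if (in != null) {
                    props.load(in);
                }
            } catch (IOException e) {
                // leave props empty if the file cannot be read
            }
        }
        return props;
    }
}
```

    A call like OutageMessages.get().getProperty("outage.message") would then hit the file at most once per application run, which avoids the repeated-read problem described above.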

  • Read MULTIPLE idocs data with all sgmn to my internal table in a single

    Dear SAP Folks!
    I have a question. I know that to read a SINGLE IDoc's data with all segments we have the FM IDOC_READ_COMPLETELY, but my requirement is to read MULTIPLE IDocs' data with all segments into my internal table in a single shot, without calling this FM in a loop, since that is a performance issue; at a time I may want to read 1000 IDocs' data!
    Could anyone please let me know if there is any way to do this, keeping performance in mind, or any other FM?
    Best Regards,
    Srini

    Hi,
    I know the IDoc numbers and I can write a SELECT query against DB table EDID4 to get all segment data into my internal table, but I am looking for an FM to do this, because I have a large number of IDocs and each IDoc has many segments (I am thinking from a performance point of view!). The FM IDOC_READ_COMPLETELY can accept only ONE IDoc number as an import parameter at a time to get segment data. Is there any other FM, or another way, apart from a SELECT query on the EDID4 table?
    Best Regards,
    Srini

  • How to read multiple Digital samples and plot a chart with time stamps

    Hi,
     Could anyone send me a code that:
    1. Reads 'multiple samples(lets say 4) from single digital input' 
    2. 'plot digital data as a chart with time stamps'
    3. Find frequency
    4. Log data into file with time stamps
    I have attached the code which i tried.
    Thanks,
    LK
    Attachments:
    DigitalNSample.vi ‏27 KB

    Hi,
     Could anyone send me a code that:
    1. Reads 'multiple samples(lets say 4) from single digital input' using NI USB 6009 or NI USB 6251.
    2. 'plot digital data as a chart with time stamps'
    3. Find frequency
    4. Log data into file with time stamps
    I have attached the code which i tried.
    Thanks,
    LK
    Attachments:
    DigitalNSample.vi ‏27 KB

  • Read multiple text files

    Hi everyone out there,
    I would like to know how to read multiple files from same directory, but files hve different name, using java. Any help is appreciate, need asap..
    Thnx,

    I want to process all the files that are in the folder.
    Okay. This is the critical piece of information that you hadn't provided previously.
    And you're saying to use a loop? How do you propose using a loop for this?
    There are two different tasks: identifying all the files, and processing each of them.
    You definitely need a loop for processing them. That's what loops are for--doing the same thing multiple times.
    You might or might not use a loop to identify the files.
    I see two main possible approaches:
    1) Provide the list of files as command line args when you start the VM, using your command shell's wildcard/globbing:
    java MyClass *
    Then inside your code:
    public static void main(String[] args) {
        for (int ix = 0; ix < args.length; ix++) {
            String fileName = args[ix];
            // process it
        }
    }
    2) Provide your class with the name of the directory you want to process (for instance as a command line arg). Create a java.io.File instance that corresponds to that directory. File has methods that list the files inside a directory. Use that to get a list of the files, and run through that list.
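    Approach 2 could be sketched like this (the directory argument and the processing step are placeholders):

```java
import java.io.File;

public class ProcessDir {
    public static void main(String[] args) {
        File dir = new File(args.length > 0 ? args[0] : ".");
        File[] files = dir.listFiles();  // returns null if dir is not a directory
        if (files == null) {
            System.err.println("Not a directory: " + dir);
            return;
        }
        for (File f : files) {
            if (f.isFile()) {
                System.out.println("processing " + f.getName());
                // ... read and process f here ...
            }
        }
    }
}
```

    Run it as java ProcessDir somedir: File.listFiles() does the identification, and the loop does the processing.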

  • Read multiple text files and sort them

    I am trying to read multiple text files and store the data from the files in a vector,
    but I have had no luck for days. Can anyone help me out with it? Any idea of how to sort them will be appreciated.
    Below is part of the code I implemented.
    public class packet {
        private int timestamp;
        private int user_id;
        private int packet_id;
        private int packet_seqno;
        private int packet_size;
        public packet(int timestamp0, int user_id0, int packet_id0,
                      int packet_seqno0, int packet_size0) {
            timestamp = timestamp0;
            user_id = user_id0;
            packet_id = packet_id0;
            packet_seqno = packet_seqno0;
            packet_size = packet_size0;
        }
        public void setTime(int atimestamp) {
            this.timestamp = atimestamp;
        }
        public void setUserid(int auserid) {
            this.user_id = auserid;
        }
        public void setPacketid(int apacketid) {
            this.packet_id = apacketid;
        }
        public void setPacketseqno(int apacketseqno) {
            this.packet_seqno = apacketseqno;
        }
        public void setPacketsize(int apacketsize) {
            this.packet_size = apacketsize;
        }
        public String toString() {
            return timestamp + "\t" + user_id + "\t" + packet_id + "\t" + packet_seqno + "\t" + packet_size + "\t";
        }
    }
    Here is the data from part of the text files. (The first column is timestamp, the second is user_id, the third is packet_id, ...)
    0 1 1 1 512
    1 2 1 2 512
    2 3 1 3 512
    3 4 1 4 512
    4 5 1 5 512
    5 6 1 6 512
    6 7 1 7 512
    7 8 1 8 512
    8 9 1 9 512
    9 10 1 10 512
    10 1 2 11 512
    11 2 2 12 512
    12 3 2 13 512
    13 4 2 14 512
    14 5 2 15 512
    15 6 2 16 512
    16 7 2 17 512

    Here's a standard idiom for object-list sorting:
    /* cnleafdata.txt *********************************************
    0 1 1 1 512
    1 2 1 2 512
    2 3 1 3 512
    3 4 1 4 512
    4 5 1 5 512
    5 6 1 6 512
    6 7 1 7 512
    7 8 1 8 512
    8 9 1 9 512
    9 10 1 10 512
    10 1 2 11 512
    11 2 2 12 512
    12 3 2 13 512
    13 4 2 14 512
    14 5 2 15 512
    15 6 2 16 512
    16 7 2 17 512
    */
    import java.util.*;
    import java.io.*;
    public class Packet implements Comparable<Packet>{
      private int timeStamp;
      private int userId;
      private int packetId;
      private int packetSeqno;
      private int packetSize;
      public Packet(int timeStamp0, int userId0, int packetId0,
       int packetSeqno0, int packetSize0) {
        timeStamp = timeStamp0;
        userId = userId0;
        packetId = packetId0;
        packetSeqno = packetSeqno0;
        packetSize = packetSize0;
      }
      public Packet(String timeStamp0, String userId0, String packetId0,
       String packetSeqno0, String packetSize0) {
        this(Integer.parseInt(timeStamp0), Integer.parseInt(userId0),
         Integer.parseInt(packetId0), Integer.parseInt(packetSeqno0),
         Integer.parseInt(packetSize0));
      }
      public Packet(String[] a){
        this(a[0], a[1], a[2], a[3], a[4]);
      }
      public void setTime(int aTimeStamp){
        timeStamp = aTimeStamp;
      }
      public void setUserId(int aUserId){
        userId = aUserId;
      }
      public void setPacketId(int aPacketId){
        packetId = aPacketId;
      }
      public void setPacketSeqno(int aPacketSeqno){
        packetSeqno = aPacketSeqno;
      }
      public void setPacketSize(int aPacketSize){
        packetSize = aPacketSize;
      }
      public int getUserId(){
        return userId;
      }
      public String toString(){
        return String.format
         ("%2d %2d %2d %2d %4d", timeStamp, userId, packetId, packetSeqno, packetSize);
      }
      public int compareTo(Packet otherPacket){
        return userId - otherPacket.getUserId();
      }
      /* main for test */
      public static void main(String[] args){
        String line;
        ArrayList<Packet> alp = new ArrayList<Packet>();
        try{
          BufferedReader br = new BufferedReader(new FileReader("cnleafdata.txt"));
          while ((line = br.readLine()) != null){
            // if (! recordValid(line)){
            //   continue;
            // }
            String[] ar = line.split("\\s");
            alp.add(new Packet(ar));
          }
          br.close();
        }
        catch (Exception e){
          e.printStackTrace();
        }
        System.out.println("[original]");
        for (Packet p : alp){
          System.out.println(p);
        }
        System.out.println();
        Collections.sort(alp);
        System.out.println("[sorted by user ID]");
        for (Packet p : alp){
          System.out.println(p);
        }
      }
    }

  • Unable to read multiple files in BODS

    hi all,
    I am unable to read multiple files [with the same format of fields] using wildcard characters in the file name.
    Scenario:
    I have 2 files: test1.xlsx & test2.xlsx.
    In the Excel file format, for the file name column, I have given test*.xlsx,
    and done the direct mapping to the target column.
    But when I run the job I get the error below.
    at com.acta.adapter.msexceladapter.MSExcelAdapterReadTable.ReadAllRows(MSExcelAdapterReadTable.java:1242)
    at com.acta.adapter.msexceladapter.MSExcelAdapterReadTable.readNext(MSExcelAdapterReadTable.java:1285)
    at com.acta.adapter.sdk.StreamListener.handleBrokerMessage(StreamListener.java:151)
    at com.acta.brokerclient.BrokerClient.handleMessage(BrokerClient.java:448)
    at com.acta.brokerclient.BrokerClient.access$100(BrokerClient.java:53)
    at com.acta.brokerclient.BrokerClient$MessageHandler.run(BrokerClient.java:1600)
    at com.acta.brokerclient.ThreadPool$PoolThread.run(ThreadPool.java:100)
    please let me know if there is any solution to this.
    regards,
    Swetha

    Hi,
    I just copied an xlsx file under 3 different names (Test_Data.xlsx, Test_1.xlsx, Test_2.xlsx) and tried the options below, and it worked for me.
    Note: I tried this on the same OS and DS 4.1 SP2 (14.1.2.378) versions. In Linux, file names are case sensitive.

  • Reading Multiple lines in a file Using File Adapter

    Hi All,
    I am new to this technology. How do I read multiple lines in a file using the file adapter? Please brief me on the methodology.

    I didn't look at anything else, but if you want to write more than one line ever to your file, you should change this
    out = new FileOutputStream("Calculation.log");
    to this...
    out = new FileOutputStream("Calculation.log", true);
    A quick look at the API reveals the following constructor: FileOutputStream(File file, boolean append). Append means: should I add on to the end of the file, or overwrite what is there?
    By default you overwrite, so in our case we say true instead, which says add on to what is there.
    At the end of that little snippet you should be closing that stream as well.
    So where you have
    p.close();
    you should have
    p.close();
    out.close();
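    Putting the append flag and the close together, a minimal self-contained sketch (reusing the hypothetical Calculation.log file name from the snippet above) could be:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PrintStream;

public class AppendLog {
    public static void main(String[] args) throws IOException {
        // true = append to the end of the file instead of overwriting it
        try (FileOutputStream out = new FileOutputStream("Calculation.log", true);
             PrintStream p = new PrintStream(out)) {
            p.println("one more line");
        }  // try-with-resources closes both streams, even on error
    }
}
```

    The try-with-resources form closes both streams automatically, so the explicit out.close() is no longer needed.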

  • Message Driven Bean reading multiple times from a jms queue

    Hi,
    I am facing a strange problem with my message driven bean. Its configured to read message from a jms queue. But sometimes it read the same message multiple times from the jms queue.
    We are using weblogic server 8.1 sp5.
    Please find below our descriptor files
    ejb-jar.xml  
    <ejb-jar>  
      <display-name>ClarifyCRM_Process_Manager_13.1</display-name>  
      <enterprise-beans>  
        <session>  
          <display-name>ProcessManager</display-name>  
          <ejb-name>ProcessManager</ejb-name>  
          <home>com.clarify.procmgr.ejb.ProcessManagerHome</home>  
          <remote>com.clarify.procmgr.ejb.ProcessManagerRemote</remote>  
          <ejb-class>com.clarify.procmgr.ejb.ProcessManagerEJB</ejb-class>  
          <session-type>Stateless</session-type>  
          <transaction-type>Container</transaction-type>  
        </session>  
        <message-driven>  
          <display-name>ProcessManagerListener</display-name>  
          <ejb-name>ProcessManagerListener</ejb-name>  
          <ejb-class>com.clarify.procmgr.ejb.ProcessManagerMDB</ejb-class>  
          <transaction-type>Bean</transaction-type>  
          <acknowledge-mode>Auto-acknowledge</acknowledge-mode>  
          <message-driven-destination>  
            <destination-type>javax.jms.Queue</destination-type>  
          </message-driven-destination>  
        </message-driven>  
      </enterprise-beans>  
      <assembly-descriptor>  
        <container-transaction>  
          <method>  
            <ejb-name>ProcessManager</ejb-name>  
            <method-name>*</method-name>  
          </method>  
          <trans-attribute>Required</trans-attribute>  
        </container-transaction>  
      </assembly-descriptor>  
    </ejb-jar>  
    weblogic-ejb-jar.xml  
    <weblogic-ejb-jar>  
      <weblogic-enterprise-bean>  
        <ejb-name>ProcessManager</ejb-name>  
        <stateless-session-descriptor>  
          <pool>  
            <max-beans-in-free-pool>100</max-beans-in-free-pool>  
            <initial-beans-in-free-pool>10</initial-beans-in-free-pool>  
          </pool>  
        </stateless-session-descriptor>  
        <enable-call-by-reference>False</enable-call-by-reference>  
        <jndi-name>ProcessManagerHome</jndi-name>  
        <dispatch-policy>PMExecuteQueue</dispatch-policy>  
        <remote-client-timeout>0</remote-client-timeout>  
      </weblogic-enterprise-bean>  
      <weblogic-enterprise-bean>  
        <ejb-name>ProcessManagerListener</ejb-name>  
        <message-driven-descriptor>  
          <pool>  
            <max-beans-in-free-pool>100</max-beans-in-free-pool>  
            <initial-beans-in-free-pool>10</initial-beans-in-free-pool>  
          </pool>  
          <destination-jndi-name>clarify.procmgr.jms.queue.Execution</destination-jndi-name>  
          <connection-factory-jndi-name>clarify.procmgr.jms.factories.ExecConnection</connection-factory-jndi-name>  
        </message-driven-descriptor>  
        <enable-call-by-reference>True</enable-call-by-reference>  
        <dispatch-policy>PMListenerExecuteQueue</dispatch-policy>  
        <remote-client-timeout>0</remote-client-timeout>  
      </weblogic-enterprise-bean>  
    </weblogic-ejb-jar>
    The MDB is sometimes reading multiple times from clarify.procmgr.jms.queue.Execution.
    Also i would like to add here that the connection factory we are using clarify.procmgr.jms.factories.ExecConnection is having the following properties
    ServerAffinity Enabled=true
    XA connection factory enabled=false.
    Please help me out here!!

    Maybe your MDB "sometimes" throws an Exception in onMessage.
    Check whether this happens when you set <max-beans-in-free-pool>1</max-beans-in-free-pool>.

  • If CSV file contains multiple structure...

    If a CSV file contains multiple structures, then how should I set values in file content conversion?
    Please mention any links regarding File Content Conversion.
    thanks in advance..
    Ramesh

    Hi,
    You are using RecordSet. Here are some scenarios.
    http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/frameset.htm
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/prateek.shah/blog/2005/06/08/introduction-to-idoc-xi-file-scenario-and-complete-walk-through-for-starters - IDoc to File
    /people/ravikumar.allampallam/blog/2005/03/14/abap-proxies-in-xiclient-proxy - ABAP Proxy to File
    /people/sap.user72/blog/2005/06/01/file-to-jdbc-adapter-using-sap-xi-30 - File to JDBC
    /people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy - File to ABAP Proxy
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1 - File to File Part 1
    /people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit - File to RFC
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1685 [original link is broken] - File to Mail
    Regards,
    Wojciech
