Row By Row File Processing using File Adapter

Hi,
Can anyone explain how I can read a file row by row using the File Adapter in BPEL?
That is, the second row should only start once the first row has been processed completely.
Regards,
En

For this, you need to make your service synchronous by adding a reply message to the interface WSDL of the BPEL process.
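A minimal sketch of what the request-reply operation in the interface WSDL could look like (the operation and message names are illustrative, not from this thread); the presence of the <output> is what makes the operation synchronous:

<!-- Hypothetical synchronous operation: the <output> turns the one-way read into request-reply -->
<wsdl:portType name="ProcessRow">
    <wsdl:operation name="processRow">
        <wsdl:input message="tns:ProcessRowRequestMessage"/>
        <wsdl:output message="tns:ProcessRowResponseMessage"/>
    </wsdl:operation>
</wsdl:portType>

The caller then blocks on each invoke until the reply returns, so row N+1 is only sent after row N has been fully processed.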
Regards,
En

Similar Messages

  • How to eliminate inserting duplicate rows into the database using the JDBC Adapter

    File->Xi->JDBC
    In the above scenario, if the file has two rows whose values are identical, how can we avoid inserting duplicate rows into the database using the JDBC Adapter?

    The database is a consumer of a service (this is SOA). The database plays the role of a business system here.
    The mapping is part of an ESB service, and the adapter is the technology that adapts a specific protocol to the ESB framework.
    The ESB carries out ESB duties such as transformation, translation, and routing. Routing uses a protocol accepted by the consumer; for a JDBC consumer that protocol is JDBC, hence a JDBC adapter.
    There is a clear separation of responsibilities between the business system and the ESB. The ESB does not participate in business decisions or reach into the business system's data layer.
    So whoever asks you to implement the duplicate check as part of the mapping (an ESB service) may not understand integration practice.
    Please use an adapter module that executes the duplicate check against the business system in a plug-and-play fashion, keeping it separate from the ESB service, so that integrations can be built in an agile way.
    Thanks

  • Row level commit using DB adapter

    We have a SOA 11g requirement where a file is polled using the File adapter and inserted into the database using the DB adapter. All commits should happen
    at record level: if one record fails due to improper data, the others should still commit. I am using a single DB insert rather than looping over
    each record in the BPEL process. I initially tried an XA datasource, then created a non-XA datasource and used it in the DB adapter, and also set the property
    Idempotent=false. But if one record fails, the whole transaction fails. How can I achieve record-level commits?

    kiran4soa wrote:
    "But if one of the records fails, the whole transaction is failed."
    That's the expected behaviour, and I'm afraid you won't find any option/property to change this...
    http://docs.oracle.com/cd/E21764_01/integration.1111/e10224/soa_transactions.htm
    "How to achieve the record level commit?"
    You'll have to redesign your process... One option would be to split your process in two: have a BPEL1 read the file and send individual messages to a JMS queue, and a BPEL2 listen to the queue and process the messages individually...
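    A rough sketch of what BPEL1's publishing loop could look like in BPEL 2.0 syntax (the partner link, variable, and element names are assumptions, not taken from this thread):
    <!-- Hypothetical: publish one JMS message per record read from the file -->
    <forEach parallel="no" counterName="idx" name="ForEachRecord">
        <startCounterValue>1</startCounterValue>
        <finalCounterValue>count($fileMsg.payload/ns:Records/ns:Record)</finalCounterValue>
        <scope>
            <sequence>
                <assign name="CopyRecord">
                    <copy>
                        <from>$fileMsg.payload/ns:Records/ns:Record[$idx]</from>
                        <to>$jmsMsg.body</to>
                    </copy>
                </assign>
                <invoke name="PublishRecord" partnerLink="JMSProduce"
                        operation="Produce" inputVariable="jmsMsg"/>
            </sequence>
        </scope>
    </forEach>
    Since BPEL2 then consumes each JMS message in its own transaction, a bad record only rolls back its own insert.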
    Cheers,
    Vlad

  • 11G BPEL Process using DB Adapter Polling Won't Create Instance

    I have a BPEL project that uses the DB Adapter to poll the database for changes. The process was developed in 10.1.3.3 and migrated to 11.1.1.2. I have successfully deployed it to the WebLogic server, but it will not run. It shows as up and in active mode, yet no instances are created. Any suggestions?
    Thanks

    The problem was on the OS side. The firewall was up after the installation, and the database name was not being resolved. The OS team made the changes and it began to run... So much for an easy migration...

  • File adapter doesn't ignore the header row in the file

    Hi gurus!
    In my sender communication channel I specified Document Offset = 1 to skip the first row of the file (the header),
    but it doesn't work.
    Do you have any ideas?

    Under Document Offset, specify the number of lines that are to be ignored at the beginning of the document.
    This enables you to skip comment lines or column names during processing. If you do not make an entry, the default value is zero lines.
    http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/frameset.htm
    In the Receiver File adapter, on the other hand,
    NameA.addHeaderLine=0
    is used to produce no header line in the target structure:
    http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/frameset.htm
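    As an illustration, a sender-channel content conversion that skips a one-line header might carry parameters along these lines (the structure and field names here are made up for the example):
    Document Name       = Orders
    Document Namespace  = urn:example:orders
    Recordset Structure = Row,*
    Document Offset     = 1
    Row.fieldSeparator  = ,
    Row.fieldNames      = id,name,amount
    Row.endSeparator    = 'nl'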

  • Can we process files with attachments using File Adapter

    Hi All,
    I want to process files which have attachments using the File Adapter.
    Is it possible to process such files?
    Maybe using standard modules?
    Regards,
    CBKLP

    Assuming you are talking about a sender FILE channel, you can make use of the PayloadSwapBean provided by SAP (a standard module) to read the data from the attachment.
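    For reference, a module configuration for this could look as follows (the swap.keyValue shown is only an example; use the actual payload name of your attachment):
    Processing Sequence (Module tab of the sender channel):
        1   AF_Modules/PayloadSwapBean   Local Enterprise Bean   swap
        2   CallSapAdapter               Local Enterprise Bean   0
    Module Configuration:
        swap   swap.keyName    payload-name
        swap   swap.keyValue   MailAttachment-1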
    Regards,
    Abhishek.

  • How to process files one by one using sender file adapter

    Hi,
    I have to process files one by one using my sender file adapter, because while doing the G/R, materials get locked if the same material exists in other files.
    I have maintained the QoS as "Exactly Once In Order" and the processing sequence as "By Date".
    Which tab does the processing sequence appear on? What should the queue name be?
    Thanks,
    krishna

    It comes under the Processing tab.
    There you need to set the first parameter, QoS, to EOIO;
    then the queue name and processing sequence fields will be enabled.
    NOTE: This works only for the NFS transport protocol, not for FTP.
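    In other words, the sender channel's Processing tab would carry settings along these lines (the queue name is arbitrary; this one is invented for the example):
    Quality of Service   = Exactly Once In Order
    Queue Name           = MATERIAL_GR
    Processing Sequence  = By Date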

  • Processing files in Sequence using FTP Adapter

    Hi Experts,
    I have searched several forums, but I am still not clear on how to process files based on their timestamp using the FTP Adapter.
    I want to process the files in sequence, i.e., FIFO, using the FTP Adapter.
    The files are named with a customer prefix and a timestamp: customer<yyyyMMddHHmmss>.
    There are around 50 files like this on the FTP server.
    I need to process these files according to their timestamp and place them, in the same processing sequence, at the receiver end using the File adapter.
    If I specify the parameters in the sender FTP Adapter as
    QoS = EOIO
    Queue name = ACCOUNT
    will these parameters make the processing follow the sequence according to the timestamp?
    Also, suppose the queue ID for inbound (SMQ2) is XBTI0_ACCOUNT; will it be the same for outbound (SMQ1)?
    Kindly suggest how I can process the files in sequence according to the timestamp using the FTP Adapter.
    Please reply..
    Thanks
    Sai

    Hi Shabarish,
    But this would require one more additional channel for processing,
    so I think it will take more time to process.
    Let me clarify my question once again:
    I need to pick the files from the FTP server based on their timestamp, and in sequence.
    The file names are like this: Customer<YYYYMMDDHHmmSS>.
    Suppose I have 3 files:
    Customer20050413044534
    Customer20050414053430
    Customer20050315034533
    I need to pick these files in timestamp order and place them in the same order at the receiver end (File Adapter):
    Customer20050315034533
    Customer20050413044534
    Customer20050414053430.
    As I am using the FTP sender adapter, I cannot use the processing sequence "By Date".
    Please suggest how I can do this.
    Thanks
    Sai.

  • Process multi-record & multi-record-format files using ESB & File Adapter

    I am looking to process/parse an in-bound file with the following make-up and to load into a database table using the File Adapter, ESB and DB Adapter.
    File Make-Up:
    - each line in the file is a record
    - each record is made up of 12 fields
    - there are multiple record types denoted by the first field in the line
    - record types may or may not have common fields
    - where there are common fields, the field may be in different columns
    - each record is to be inserted into a database table
    Sample File:
    3,,"03-0243-0188132-00",.20,26,075,"","000000006026","","","22/04/08",03 1303
    3,,"03-0243-0188132-00",20.00,26,075,"","","","","22/04/08",03 0579
    5,,"03-0243-0188132-00",99.60,,,"OPENING BALA",,,"ACME GROUP","22/04/08",
    6,,"03-0243-0188132-00",99.60,,,"CLOSING BALA",,,"ACME GROUP","22/04/08",
    8,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    8,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    Record Types and corresponding format:
    3, Corp ID, A/C Number, Trans Amt, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    5, Corp ID, A/C Number, Opening Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    6, Corp ID, A/C Number, Closing Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    Please note that record types 8 and 9 have 2 fixed fields.
    I have tried playing around with the File Adapter and the ESB but have not had much luck. Can someone provide any suggestions on how I can handle this?

    James,
    Thanks for your prompt response. I have come across your post previously.
    Please excuse my inexperience (I am very new to SOA and the Oracle SOA Suite), but I have not understood how your post regarding the manual creation of an XSD will help with my problem. Could you possibly elaborate further on the overall approach I should be taking?
    Regards,
    Simon
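    For what it's worth, the "manual XSD" James refers to would be a native format schema (nXSD) that dispatches on the first field of each line. A minimal sketch, with invented names, covering only record types 3 and 5 (types 6, 8 and 9 follow the same pattern):
    <?xml version="1.0" encoding="UTF-8"?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
                targetNamespace="http://example.com/bankfile"
                xmlns:tns="http://example.com/bankfile"
                elementFormDefault="qualified"
                nxsd:version="NXSD" nxsd:stream="chars" nxsd:encoding="US-ASCII">
         <xsd:element name="BankFile">
              <xsd:complexType>
                   <!-- choiceCondition reads up to the first comma and matches it against each conditionValue -->
                   <xsd:choice minOccurs="1" maxOccurs="unbounded"
                               nxsd:choiceCondition="terminated" nxsd:terminatedBy=",">
                        <xsd:element name="Transaction" nxsd:conditionValue="3">
                             <xsd:complexType>
                                  <xsd:sequence>
                                       <xsd:element name="CorpId" type="xsd:string"
                                                    nxsd:style="terminated" nxsd:terminatedBy=","/>
                                       <xsd:element name="AccountNumber" type="xsd:string"
                                                    nxsd:style="terminated" nxsd:terminatedBy=","
                                                    nxsd:quotedBy="&quot;"/>
                                       <!-- ...remaining fields, the last one terminated by ${eol}... -->
                                  </xsd:sequence>
                             </xsd:complexType>
                        </xsd:element>
                        <xsd:element name="OpeningBalance" nxsd:conditionValue="5">
                             <!-- same layout, with Opening Balance in place of Trans Amt -->
                        </xsd:element>
                   </xsd:choice>
              </xsd:complexType>
         </xsd:element>
    </xsd:schema>
    Once each line parses into its own element type, a single transformation can map every record type to the corresponding DB adapter insert.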

  • How to get the name of the input file read in a BPEL process using the File Adapter

    Hi,
    I am reading a .txt file from a configured directory using the File Adapter.
    I have configured the file adapter to read files matching the naming pattern "SalesOrder.*\.txt".
    Now I have a requirement to access the actual file name, e.g. "SalesOrder123.txt", in the BPEL process.
    How can I get the file name in the BPEL process?
    Any help is appreciated.
    Vidya.

    1) Create a variable of message type. Click on Browse Message Type and select Message Type --> Project WSDL Files --> fileAdapterInboundHeader --> Message Types --> Inboundheader_msg, then click OK.
    2) Next, double-click the Receive activity that receives your file from the File Adapter, go to the Adapter tab, click Browse Variable, and select the variable you just created.
    This puts the file name into the declared variable.
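    In the BPEL source this shows up as a header variable on the receive activity; a minimal sketch with placeholder names:
    <!-- Hypothetical: varFileHeader is the Inboundheader_msg variable created in step 1 -->
    <receive name="ReceiveFile" partnerLink="ReadSalesOrder" portType="ns1:Read_ptt"
             operation="Read" variable="inputVariable" createInstance="yes"
             bpelx:headerVariable="varFileHeader"/>
    The file name can then be read out of varFileHeader's inboundHeader part.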

  • How to start a BPEL Process using the File Adapter

    Hi
    I would like to automatically start a BPEL process when I store a file in a specific directory. Can this be done using the File Adapter?
    Regards,
    Néstor Boscán

    Yes, there are samples of how to do this in the BPEL samples directory.
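    The essential ingredient is a receive activity on the file adapter partner link with createInstance="yes", so that each polled file spawns a new process instance. A sketch with assumed names:
    <!-- Hypothetical: "FileIn" is the partner link generated by the File Adapter read wizard -->
    <receive name="receiveFile" partnerLink="FileIn" portType="ns1:Read_ptt"
             operation="Read" variable="fileMessage" createInstance="yes"/>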

  • Cannot process a Fixed Field Length file using the File Adapter (Sender)

    Hi -
    I have checked throughout these posts and blogs but I still have not been able to find a solution to my issue. When using the File Adapter (Sender) I get a conversion initialization failure: "xml.keyfieldName", no value found. Why would I need a key field when I am using fixed field lengths? The file is comprised of 2 structures - 1 header and multiple details (see below). There are no key fields in the flat file that I would be able to use. Any suggestions?
    011000390      Customer Americas        20080605164317 000000000000000800000008000000000016000000                              
    12345678          100500       100500       Supplier 1                         0000000000030000002008040400                    
    12345678          100501       100501       Supplier 2                         0000000000052000002008042100 
    The File Adapter is configured as follows:
    Document Name = Rfchke00
    Document Namespace = 'my namespace'
    Recordset Name = Rfchke00
    Recordset Structure = Dtachkh,1,Dtachkp,*
    Recordset Sequence = Ascending
    Recordsets per Message = 1
    Key Field Type = String (Case-Sensitive)
    Dtachkh.fieldFixedLengths = 15,25,8,6,1,8,8,8,15,3,31
    Dtachkh.fieldNames = F1,F2,F3,F4,F5,F6,F7,F8,F9,F10,F11
    Dtachkh.processFieldNames = fromConfiguration
    Dtachkp,fieldFixedLengths = 18,13,13,35,15,3,8,2,21
    Dtachkp,fieldNames = F1,F2,F3,F4,F5,F6,F7,F8,F9
    Dtachkp,processFieldNames = fromConfiguration
    Thanks,
    Dave

    Hi,
    You can use a module to convert your structure to
    H011000390 Customer Americas 20080605164317 000000000000000800000008000000000016000000
    D12345678 100500 100500 Supplier 1 0000000000030000002008040400
    D12345678 100501 100501 Supplier 2 0000000000052000002008042100
    Note the extra H and D added to the structure by the module.
    You can then use them as your key field values. The module should be deployed in Visual Admin, and can then be referenced on the Module tab of your adapter communication channel.
    When writing the content conversion afterwards, don't forget to account for the added character.
    Note also that the word "Supplier" repeats in all the Dtachkp records, so you could use that instead.
    If you feel that a 13-character field would cause a problem, don't worry: create a dummy field, e.g. split those 13 characters into two fields and use the common part as the key field value and identifier. In your case it is like "500 Supplier", "502 Supplier": split off the first 4 characters, and the remaining 9 characters serve as the key field value.
    Try this out.
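    With the H/D indicators in place, the content conversion could then be keyed roughly as follows (counting the new indicator as an extra one-character field; this is a sketch, not a tested configuration):
    Recordset Structure       = Dtachkh,1,Dtachkp,*
    Key Field Name            = recType
    Dtachkh.fieldFixedLengths = 1,15,25,8,6,1,8,8,8,15,3,31
    Dtachkh.fieldNames        = recType,F1,F2,F3,F4,F5,F6,F7,F8,F9,F10,F11
    Dtachkh.keyFieldValue     = H
    Dtachkp.fieldFixedLengths = 1,18,13,13,35,15,3,8,2,21
    Dtachkp.fieldNames        = recType,F1,F2,F3,F4,F5,F6,F7,F8,F9
    Dtachkp.keyFieldValue     = D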
    Rgds
    Aditya

  • How to process huge files using File Adapter without ESR and no FCC?

    Hi All,
    I have a scenario where I need to pick up a file of around 500 MB from an R/3 system and transport it to a third-party system using PI 7.1. This is simply pass-through data, i.e., no mapping is required (no ESR), and I don't want to make use of FCC or BPM.
    My question is: can the File Adapter pick up a file of 500 MB?
    If not, what would be the best way to do it?
    Do I need to split the file into smaller pieces, say around 100 MB each, in the R/3 system and then pick up those files?
    Can I do it without splitting the files in the R/3 system?
    Thanks in advance.
    Sai

    Hi Bhaskar,
    If you zip the file, the size will probably come down, which may improve performance a little. But I feel SAP does not recommend this much load on PI either.
    If I were in his shoes I would do a simple FTP transfer without using PI at all, or else look into other options. That's my two cents.
    I hope this blog by Alessandro will give Sai an idea:
    /people/alessandro.guarneri/blog/2007/02/21/sap-xi-acting-as-a-huge-file-mover
    Regards,
    ---Satish

  • Test case problem using two file adapters and a transformation

    We've got an input which looks like this :
    <?xml version="1.0" encoding="UTF-8"?>
    <rows>
    <row>
    <id>10</id>
    <naam>A</naam>
    </row>
    <row>
    <id>20</id>
    <naam>B</naam>
    </row>
    </rows>
    I've created an XSD for this message which looks like this ( straightforward ) :
    <?xml version="1.0" encoding="UTF-8"?>
    <schema xmlns="http://www.w3.org/2001/XMLSchema" targetNamespace="http://www.test.nl/testschema" xmlns:test="http://www.test.nl/testschema" elementFormDefault="unqualified">
         <element name="row">
              <complexType>
                   <sequence>
                        <element ref="test:id"/>
                        <element ref="test:naam"/>
                   </sequence>
              </complexType>
         </element>
         <element name="rows">
              <complexType>
                   <sequence>
                        <element ref="test:row" maxOccurs="unbounded"/>
                   </sequence>
              </complexType>
         </element>
         <element name="naam">
              <simpleType>
                   <restriction base="string">
                   </restriction>
              </simpleType>
         </element>
         <element name="id">
              <simpleType>
                   <restriction base="byte">
                   </restriction>
              </simpleType>
         </element>
    </schema>
    I've imported this XSD into my ESB project and created a file adapter which reads this type of file.
    I've created another file adapter to write the files 1:1 to an output directory.
    In the routing service I've created a very straightforward XSL mapping which maps everything 1:1.
    Now the problem
    When I use this input :
    <?xml version="1.0" encoding="UTF-8"?>
    <rows>
    <row>
    <id>10</id>
    <naam>Martin</naam>
    </row>
    <row>
    <id>20</id>
    <naam>Edward</naam>
    </row>
    </rows>
    my output result is:
    <?xml version="1.0" ?><imp1:rows xmlns:imp1="http://www.test.nl/testschema"/>
    I know this is a namespace issue. When I add the namespace
    <?xml version="1.0" encoding="UTF-8"?>
    <rows xmlns="http://www.test.nl/testschema">
    <row>
    <id>10</id>
    <naam>Martin</naam>
    </row>
    <row>
    <id>20</id>
    <naam>Edward</naam>
    </row>
    </rows>
    I get the correct (1:1) output.
    The problem is that in the scenario I'm about to build, the input XML messages do not have a namespace. How can I alter my XSD file, or anything within my ESB project, so that all files are picked up correctly and processed without having a default namespace?
    Any help is appreciated!

    True,
    but it's the other way around that is causing my problem.
    Because the input XML files contain no namespace at all, the XML messages are transformed but result in an almost empty XML message (i.e., the root element is there and that's it).
    This is because the XML transformation mapper in ESB (as well as BPEL) explicitly needs a namespace.
    I solved it by editing the XSL by hand, removing the imp1: namespace prefixes in the select="" attributes, e.g.
    <xsl:for-each select="/imp1:rows/imp1:row"> is changed to
    <xsl:for-each select="/rows/row">
    As far as I know this is the only workaround at the moment that I could find.
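    Put together, the hand-edited stylesheet could look something like this minimal sketch (matching the no-namespace input while still emitting the qualified output):
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0"
            xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
            xmlns:imp1="http://www.test.nl/testschema">
         <xsl:template match="/">
              <imp1:rows>
                   <!-- the select paths carry no prefix because the source document has no namespace -->
                   <xsl:for-each select="/rows/row">
                        <imp1:row>
                             <imp1:id><xsl:value-of select="id"/></imp1:id>
                             <imp1:naam><xsl:value-of select="naam"/></imp1:naam>
                        </imp1:row>
                   </xsl:for-each>
              </imp1:rows>
         </xsl:template>
    </xsl:stylesheet>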

  • Merging two files using the File adapter in BPEL 2.0

    Hi All,
    I have two different files (File 1 and File 2) of the same format (two columns each). I want to merge these two files on their data and form a consolidated file (File 3 with four columns).
    Please advise on how to make this possible using the file adapter.
    Example:
    File 1 format: A1 and A2 - two columns of number and string type
    File 2 format: B1 and B2 - two columns of number and string type
    Consolidated File 3 format: A1 B1 A2 B2 - forming a single row where A1=B1, and also populating A2 and B2.
    Thanks
    Karthick.

    Hi Karthick,
    I would say read both files completely, then create a transform and select both messages as input, creating one output. With XSLT it should not be too hard to combine the two inputs: you could loop over one file and then select the matching row from the other file. That output can be written to a file; see the sketch below.
    I don't know what triggers the process, but you could either let BPEL be triggered by polling on one file and read the other synchronously, or read both files synchronously.
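    A sketch of the matching logic inside such a transform, assuming the second file's content is handed in as a stylesheet parameter (as a node-set) and using made-up element names (File1/File2, Row, A1/A2/B1/B2):
    <!-- Hypothetical: $file2 holds the second file's parsed XML -->
    <xsl:param name="file2"/>
    <xsl:template match="/">
         <Merged>
              <xsl:for-each select="/File1/Row">
                   <xsl:variable name="key" select="A1"/>
                   <xsl:variable name="a2" select="A2"/>
                   <!-- emit a merged row only where a matching B1 exists in file 2 -->
                   <xsl:for-each select="$file2/File2/Row[B1 = $key]">
                        <Row>
                             <A1><xsl:value-of select="$key"/></A1>
                             <B1><xsl:value-of select="B1"/></B1>
                             <A2><xsl:value-of select="$a2"/></A2>
                             <B2><xsl:value-of select="B2"/></B2>
                        </Row>
                   </xsl:for-each>
              </xsl:for-each>
         </Merged>
    </xsl:template>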
    Regards,
    Martien
