ESB file to file scenario

hi,
I have made a small sample file-to-file project (read --> RS --> write) using file adapters that reads and writes a simple text file. The problem is that if the outbound (write) service is down, instead of rolling back, the file is deleted from the source (read) folder. Can anybody tell me why this is so?
regards,
Alok

Hello Eric,
Hmm, I will try out the scenario, but here is something I would like to share.
Here is a scenario I have played with:
AQ -> ESB (read) --> ROUTING RULE (SYNC) --> SOAP CALL
It makes the entire thing synchronous as well, but if the outbound SOAP call fails, the message doesn't get rolled back to AQ; instead, ESB puts it in Rejected Messages after trying 3 (configurable) times.
- Now, I agree that we can somehow make the ESB's AQ read transactional and do a sync read, and then the message should hopefully be rolled back to AQ,
- but if it is a FILE adapter, it doesn't participate in the transaction, and ESB will behave in the default way.
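The retry-then-reject behaviour described above can be sketched in a few lines. This is a hypothetical illustration (function names invented), not ESB's actual implementation:

```python
def deliver_with_retries(message, send, max_retries=3, rejected=None):
    """Try send(message) up to max_retries times; on repeated failure,
    park the message in the rejected list instead of re-queueing it,
    mirroring ESB's rejected-messages behaviour for non-transactional
    sources such as the file adapter."""
    rejected = rejected if rejected is not None else []
    for _ in range(max_retries):
        try:
            send(message)
            return True          # delivered: nothing to roll back
        except Exception:
            pass                 # outbound call failed: retry
    rejected.append(message)     # gave up: message is NOT rolled back
    return False
```

With a transactional source such as AQ, a failed synchronous delivery could instead roll the message back onto the queue; the file adapter has no transaction to join, which is why the source file is simply consumed.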
I will try this scenario, and let you know if it matches with what I said.
Regards,
Chintan

Similar Messages

  • Complete file not displayed on the file server (IDoc - XI File scenario)

    Hi,
    I am working on an IDoc-XI-File scenario.
    The target file is getting generated properly, and the complete XML shows in Adapter Monitoring in RWB.
    But on the file server (in the directory) only part of the XML is displayed. The size of the file is 82 KB.
    Is there any configuration needed to see the complete file on the file server?

    Hi Amit,
    This could be due to the network. How far is the file server from the XI server?
    I suggest putting the file on the local machine or in the same network and checking whether it is OK.
    - Pinkle

  • Steps for Processing a File to File scenario

    Hi All:
    I am implementing a File to File scenario... and I have done all the tasks in
    IR and ID; I have also put the input message (XML) on the server.
    Now could anyone tell me how I process it and get the output file
    in the mentioned directory?
    If there is any doc, please let me know.
    Thanks in advance.
    Regards,
    Farooq.

    Farooq,
    In short, for configuring a file-to-file scenario with BPM:
    1. One outbound interface, one abstract interface and one inbound interface.
    2. In the message mapping, select the source message type and the message type used for the abstract interface.
    3. Do the mapping, and in the interface mapping between the source interface and the abstract interface, select this mapping program.
    4. Assuming that the message type for both the abstract interface and the target interface is the same, activate the IR and go to ID. Import the integration scenario from the IR. For the receiver agreement, specify the sender interface and receiver service.
    5. In the interface determination, specify the appropriate interface and select the mapping program.
    6. Create a sender agreement with the receiver being the target (inbound) interface and the sender being the abstract interface.
    Activate and monitor. Hope this helps.

  • RFC to File scenario

    Hi,
    I've created an RFC to File scenario, but I am not able to get the file at the destination.
    1. Created a program in R/3 to execute the RFC, which is in R/3 only.
    2. Created an RFC destination in R/3.
    3. The same is maintained in the sender communication channel.
    I've tested the configuration in ID; it executes successfully without any error.
    So I am expecting that when I execute the program in R/3 it should generate the file... but it is not generating anything.
    Please let me know the possible cause, or direct me to where I should look to find the problem.
    -Thanks,

    Hi,
    Check a few basic but important things on the R/3 side as well.
    1. Is your RFC remote-enabled?
    2. Have you imported the updated RFC from R/3 into the XI repository?
    3. Have you used COMMIT WORK in your ABAP program?
    4. Check if your code looks like the snippet below:
    CALL FUNCTION 'Your RFC name here'
      IN BACKGROUND TASK
      DESTINATION 'RFC Destination name of type T'
      TABLES
        it_data = lt_data.   " pass your table parameters here
    COMMIT WORK.
    You can refer to this wiki: https://wiki.sdn.sap.com/wiki/display/XI/RFCtoFILE
    Regards,
    Sarvesh

  • XI and R/3 System connection settings for IDoc to File scenario

    Hi SAP champs,
    I am working on an IDoc to File XI scenario, but I am confused while configuring the connection settings on the XI system and the R/3 system. I am having trouble in transaction IDX2 maintaining the IDoc metadata. Please let me know the complete step-by-step system configurations in the XI and R/3 systems for this scenario. Thanks in advance.
    Regards
    Sanjeev.

    Hi Sanjeev,
    We upload metadata in IDX2 just to reduce runtime effort and hence improve performance. However, it is not a mandatory step for an R/3 to XI scenario.
    You are confused just because here you don't find any F4 help to select the metadata and the port.
    Thanks
    Sunil Singh

  • Process multi-record & multi-record-format files using ESB & File Adapter

    I am looking to process/parse an inbound file with the following make-up and to load it into a database table using the File Adapter, ESB and DB Adapter.
    File Make-Up:
    - each line in the file is a record
    - each record is made up of 12 fields
    - there are multiple record types denoted by the first field in the line
    - record types may or may not have common fields
    - where there are common fields, the field may be in different columns
    - each record is to be inserted into a database table
    Sample File:
    3,,"03-0243-0188132-00",.20,26,075,"","000000006026","","","22/04/08",03 1303
    3,,"03-0243-0188132-00",20.00,26,075,"","","","","22/04/08",03 0579
    5,,"03-0243-0188132-00",99.60,,,"OPENING BALA",,,"ACME GROUP","22/04/08",
    6,,"03-0243-0188132-00",99.60,,,"CLOSING BALA",,,"ACME GROUP","22/04/08",
    8,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    8,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    Record Types and corresponding format:
    3, Corp ID, A/C Number, Trans Amt, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    5, Corp ID, A/C Number, Opening Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    6, Corp ID, A/C Number, Closing Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    Please note that record types 8 and 9 have 2 fixed fields.
    I have tried playing around with the File Adapter and the ESB but am not having much luck. Can someone provide any suggestions on how I can handle this?

    James,
    Thanks for your prompt response. I have come across your post previously.
    Please excuse my inexperience (I am very new to SOA and Oracle SOA Suite), but I have not understood how your post regarding the manual creation of an XSD will assist with my problem. Could you elaborate further on the overall approach I should be taking?
    Regards,
    Simon
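    For what it's worth, the dispatch-on-record-type idea can be prototyped outside the adapter. The sketch below is a hypothetical Python pre-processor (all names invented, not File Adapter configuration) mapping the first field to the layouts listed in the question; only the amount-like third column differs per record type:

```python
import csv
import io

# Field that occupies the "amount" position for each record type,
# per the formats listed in the question.
AMOUNT_FIELD = {"3": "trans_amt", "5": "opening_balance",
                "6": "closing_balance", "8": "amount", "9": "amount"}

# The remaining 11 columns shared by all record types; None marks
# the slot whose name depends on the record type.
COMMON = ["corp_id", "ac_number", None, "serial_number", "trans_code",
          "particulars", "analysis_code", "reference", "other_party_name",
          "transaction_date", "originating_bank"]

def parse_line(line):
    """Parse one CSV record into a dict keyed by field name,
    dispatching on the leading record-type field."""
    fields = next(csv.reader(io.StringIO(line)))
    rtype, rest = fields[0], fields[1:]
    names = [AMOUNT_FIELD[rtype] if n is None else n for n in COMMON]
    record = dict(zip(names, rest))
    record["record_type"] = rtype
    return record
```

    Each resulting dict could then be handed to a per-record-type insert via the DB Adapter or plain JDBC.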

  • Listener on ESB file adapter does not appear to listen

    I am new to the world of SOA, ESB, and BPEL, and am starting with something simple: reading a file from one directory, performing a data transformation, and writing the file to a different directory. I understand that Oracle's ESB and BPEL are very similar in functionality. I have successfully been able to read files using a file adapter and listener in BPEL. When I try to set up a new ESB application with a file adapter and listener, I register the project with the Integration Server without any problems and am able to see all components in the ESB Control. But when I drop a new file into the directory that the adapter is listening to, nothing happens. The file is not picked up and there are no new instances of any ESB services. Am I missing something? Is there a log that could be checked to see if the service is even checking the directory?
    Along the same lines, I am wondering if there is an easy way to update the directory being listened to once the service has been registered/deployed (with either ESB or BPEL). Is there a config file that someone other than a developer could update without having to redeploy the service?
    Any help would be greatly appreciated. Thanks.

    What platform are you running, and which install option did you use, basic or advanced?
    There are known issues with adapters enabling automatically on Linux that require you to restart the container. See: Re: Changing File Adapter target directory - container bounce required?
    You can change the polling directory by following the instructions in that thread:
    Changing File Adapter target directory - container bounce required?

  • ESB/File Adapter - XML files containing multiple messages

    Hi,
    In all the examples on file adapters that I have read, when files contain multiple messages, it always concerns non-XML files, such as CSV files.
    In our case, we have an XML file containing multiple messages, which we want to process separately (not in a batch). We selected "Files contain Multiple Messages" and set "Publish Messages in Batches of" to 1.
    However, the OC4J log files show the following error:
    ORABPEL-12505
    Payload Record Element is not DOM source.
    The Resource Adapter sent a Message to the Adapter Framework which could not be converted to a org.w3c.dom.Element.
    Does anyone know whether it's possible to do this for XML files?
    Regards, Ronald

    Maybe I need to give a little bit more background info.
    Ideally, one would only read/pick up small XML documents in which every XML document forms a single message. That way they can be processed individually.
    However, in our case an external party supplies multiple messages in a single batch file, which is in XML format. I want to "work" on individual messages as soon as possible and not put a huge batch file through our ESB and BPEL processes. Unfortunately we cannot influence the way the XML file is supplied, since we are not the only subscriber to it.
    So yes, we can use XPath to extract all individual messages from the XML batch file and start an ESB process instance for each individual message. But that would require the creation of another ESB or BPEL process whose only task is to "chop up" the batch file and start the original ESB process for each message.
    I was hoping that the batch option in the File Adapter could also do this for XML content and not only for e.g. CSV content. That way it would not require an additional process and manual coding.
    Can anyone confirm this is not supported in ESB?
    Regards,
    Ronald
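    For readers facing the same limitation: the "chop up" step described above can be done in a small pre-processing stage before the original process is invoked. A rough sketch in Python (hypothetical names, not the File Adapter API):

```python
import xml.etree.ElementTree as ET

def split_batch(batch_xml, message_tag):
    """Return each child message element of the batch root as its own
    standalone XML string, so that a separate process instance can be
    started per message."""
    root = ET.fromstring(batch_xml)
    return [ET.tostring(child, encoding="unicode")
            for child in root if child.tag == message_tag]
```

    The same extraction could equally be expressed as an XPath over the batch root; the point is only that the splitting happens once, up front, rather than pushing the whole batch through ESB and BPEL.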

  • ESB File Adapter data validation issue

    I have a really simple ESB process.
    File Adapter Read
    (XML format with XSD) --> Routing Service --> SOAP Call to BPEL Process
    Quite simply, the ESB uses the File Adapter to read an XML file which has an associated XSD. The contents are then routed to a BPEL Process.
    If the XML file is valid in syntax but contains an element which is not in the XSD, the file contents are pushed to the BPEL process which has validateXML set to true - thus the error is caught and handled by the BPEL process.
    This is what I want to happen.
    If the XML file is invalid in syntax - for example -
    <TAG>hello</WRONGTAG>
    The ESB process rejects the file and stops dead.
    I would rather the invalid file is pushed to the BPEL so that the BPEL can handle the error.
    Why?
    Because my error handling in the BPEL picks up the invalid XML format and emails the users to inform them there is a problem.
    Can someone help? Is this possible? Thanks.

    Hello again! You are now aware of the problem, as there has been a lot of conversation between myself and Dave Berry [and you!] regarding this issue.
    For the rest of the readers, here is a brief summary of progress so far. Sorry, but there has been so much communication that I think the best way is to direct you to various links.
    If an adapter fails in ESB, then it is possible to trap these failures and handle them as documented here:
    Dave Berry said:
    you need to look into the adapter rejection handlers which are documented on OTN ESB page in the exception handling lesson and the adapters OTN dev resources page http://www.oracle.com/technology/products/integration/adapters/dev_support.html
    I had also devised my own solution in the meantime, which involves reading the input file as an opaque schema and passing it to a BPEL process which converts it to an XML document! This can then be validated, using standard BPEL error handling to find any problems in the file format, etc.
    See my blog for a tutorial on doing this:
    http://soastuff.wordpress.com/2007/06/06/handling-opaque-datatypes-in-bpelesb/
    The only problem with this at the moment is that the BPEL function ora:parseEscapedXML has a bug in it - so it fails if the XML does not comply with the XSD it tries to convert to.
    This in turn causes an ORABPEL-09500 error which does not seem to get handled by the BPEL process so the process fails.
    As things stand, this bug remains - though Dave Berry has been very helpful in identifying it and bringing it to Muruga's attention. Hopefully we can get a fix (?) and a workaround whilst the fix is being implemented (?)
    Cheers.

  • Set polling time with BPEL and ESB File Adapters

    We have created file adapters in BPEL and ESB. We see where you can set the frequency for polling but do not see a place to set the polling time. We would like it to check a directory every 24 hours at midnight. We can set the 24 hours but not the midnight. How do ESB and BPEL determine the time to run based upon the interval?

    When Oracle SOA is started, the interval counting starts from the time the adapter is loaded. It is not tied to a particular time of day.
    If you want to start a BPEL process at a certain time, you could use the Quartz timer functionality of BPEL to start an instance at a particular time (like a Unix cron tab).
    Marc
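    Marc's point, that the interval is anchored to adapter start-up rather than the clock, is why a timer/cron approach is needed for "every day at midnight". A minimal sketch (illustrative only) of the delay such a scheduler would wait before its first run:

```python
from datetime import datetime, timedelta

def seconds_until_midnight(now=None):
    """Seconds from `now` until the next midnight, i.e. the delay a
    cron-like timer (such as BPEL's Quartz scheduler) would sleep
    before the first firing."""
    now = now or datetime.now()
    next_midnight = (now + timedelta(days=1)).replace(
        hour=0, minute=0, second=0, microsecond=0)
    return (next_midnight - now).total_seconds()
```

    With Quartz itself, a cron expression such as `0 0 0 * * ?` expresses the same daily-at-midnight schedule declaratively.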

  • Can we read a ZIP file through the ESB file adapter?

    Hi,
    Is it possible to read a ZIP file from a specified location, either through an FTP adapter or a file adapter, as an opaque element, and place it in another specified location?
    So far the available options are to read a txt, csv, fixed-length or COBOL file.
    I tried to create an ESB reading a ZIP file and placing it in another location, but it was not reading the ZIP file.
    Is it possible at all for the ESB to read a ZIP file? If so, is there any specific configuration we have to do for it to work?

    How can we read an MS Word file through Java?
    You can read it like any other file, using FileInputStream and similar.
    However, if you want to do something with it, like display it, you have a problem. AFAIK Microsoft do not release information about their document formats, and I have not seen a .doc viewer for Java.

  • Idoc to file scenario query

    Hi,
    I have done an XI scenario (SAP IDoc -> XI -> file) and it is working perfectly in XID. I have observed a strange thing...
    In the configuration scenario I have defined the sender business system as BS_ECC_400_001. But this is not there in the SLD, and yet the whole scenario is working... Can anyone please tell me why this is so?
    FYI, the business system was there in the object tab; I had not created it.
    Regards
    Niladri

    Hello,
    Check whether the SLD is pointing to the development environment; it might be pointing to another server. The same thing happened in my project: the SLD was pointing to the production environment.
    Click on the SLD link from your home page.
    Regards,
    Sreenivas.

  • Idoc to file scenario

    Hi,
      My scenario is to send message type HRMD_A, basic type HRMD_A07, to a flat file.
      I need to map the following fields only:
    E1P0001 or E1P0002 or E1P0105   PERNR
    E1P0001                         BUKRS
    E1P0001                         KOSTL
    E1P0001                         PERSK
    E1P0001                         ORGEH
    E1P0002                         VORNA
    E1P0002                         NACHN
    E1P0105                         USRID_LONG
    I do this through graphical mapping. In HRMD_A the highest segment is E1PLOGI, under which we have the infotype segments. In one IDoc I can have many E1PLOGI segments (about 100).
    When my scenario is executed, ONLY the data from the first E1PLOGI is sent to the file. I want to send all the data.
    Can anyone help me?
    Regards

    hi Pooja,
       The target message is :
        Mt_traget
             Record                 0....Unbounded
                  PERNR                  1..1
                  BUKRS                   1..1
                  KOSTL                    1..1
                  PERSK                   1..1
                  ORGEH                  1..1
                  VORNA                  1..1
                 NACHN                   1..1
                 USRID_LONG          1..1
    I map the fields from the different IDOC segments to the target fields.
    so my target XML is like:
    <Record>
      <pernr>
      <usrid_long>
    </Record>
    the problem is that only the data from the first segment is mapped to the target.

  • How to use BPMs in a File 2 File scenario

    Hi All,
    Please let me know when we use BPMs in a file-to-file scenario, and also give me an example of the same.
    Thanks
    Sudharshan

    Hey,
       BPM is generally not required in a file-to-file scenario,
       and as far as possible, avoid the use of BPM.
    In most file-to-file scenarios BPM is not required,
    but in certain (file-to-file) scenarios it is required,
    for example an N:1 mapping,
    that is, when you have two senders and one receiver.
    In this case it is mandatory to use BPM,
    because during configuration it is not possible to give two senders in your interface determination.
    Hence, if you use a BPM, the interface mapping will be referred to in your BPM, and there is no need for the mapping to be referred to in the interface determination.
    To design a BPM for this scenario, the steps are:
    1) Create a data type and message type for the sender and receiver file structures.
    2) Do the required mapping.
    3) Create abstract asynchronous interfaces for the two message types (BPM requires abstract types).
    4) Create a BPM.
    5) Create two container-type variables for the two abstract interfaces (you cannot refer to the abstract interfaces directly).
    6) Add a receive step of type async; in this step, select the container variable corresponding to the sender abstract interface.
    7) Add a transformation step and refer to the interface mapping in it.
    8) Add a send step and select the container variable corresponding to the receiver abstract interface.
    Your BPM is configured.
    Reward points if useful.
    regards,
           Milan

  • ESB File Adapter Service question

    I am experimenting with ESB to enable an inbound file adapter to read multiple records from an XML file. I am using a router to pass each of the child records to another file adapter, which writes them to separate XML files. So basically I am just trying to test reading multiple records from a single XML file and then putting each record in its own separate XML file. This seems like it should be a pretty simple initial experiment with the ESB.
    But when I drop the inbound XML file, it is getting picked up properly, yet the entire inbound XML file is written to just a single outbound XML file (it is not getting split up). My confusion is over how the file adapter works. I have checked the "Files contain multiple messages" checkbox and indicated batches of "1". Shouldn't this tell the file adapter to process one record from the inbound file at a time, passing it through the router and ultimately to the outbound file adapter? If so, how does the file adapter know what delimits each record in the XML file?
    Here is the inbound XML that I am trying to read, break apart, and put each service request event in its own outbound file. Please advise what I am missing here.
    <Events xmlns="http://www.example.org/test">
    <ServiceRequestEvent>
    <summary>test1</summary>
    <severity_id>9</severity_id>
    <urgency_id>64</urgency_id>
    <status>103</status>
    <type_id>11124</type_id>
    <owner_id>100003631</owner_id>
    <current_serial_number>1748AC16206</current_serial_number>
    <inventory_item_id>2320011231602</inventory_item_id>
    <problem_code>1001R3</problem_code>
    </ServiceRequestEvent>
    <ServiceRequestEvent>
    <summary>test2</summary>
    <severity_id>9</severity_id>
    <urgency_id>64</urgency_id>
    <status>103</status>
    <type_id>11124</type_id>
    <owner_id>100003631</owner_id>
    <current_serial_number>1748AC16206</current_serial_number>
    <inventory_item_id>2320011231602</inventory_item_id>
    <problem_code>1001R3</problem_code>
    </ServiceRequestEvent>
    </Events>
    here is the schema....
    <?xml version="1.0" encoding="windows-1252" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns="http://www.example.org/test"
    targetNamespace="http://www.example.org/test"
    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
    elementFormDefault="qualified">
    <!-- type declarations -->
    <xsd:element name="Event" abstract="true" type="EventType"/>
    <xsd:element name="ServiceRequestEvent" substitutionGroup="Event" type="ServiceRequestEventType"/>
    <xsd:element name="PartRequestEvent" substitutionGroup="Event" type="PartRequestEventType"/>
    <xsd:element name="CounterEvent" substitutionGroup="Event" type="CounterEventType"/>
    <xsd:element name="Events">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="Event" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:element name="ServiceRequestEvents">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="ServiceRequestEvent" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:complexType name="EventType">
    <xsd:sequence/>
    </xsd:complexType>
    <xsd:complexType name="ServiceRequestEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    <xsd:element name="summary" type="xsd:string"/>
    <xsd:element name="severity_id" type="xsd:double"/>
    <xsd:element name="urgency_id" type="xsd:double"/>
    <xsd:element name="status" type="xsd:double"/>
    <xsd:element name="type_id" type="xsd:double"/>
    <xsd:element name="owner_id" type="xsd:double"/>
    <xsd:element name="current_serial_number" type="xsd:string"/>
    <xsd:element name="inventory_item_id" type="xsd:string"/>
    <xsd:element name="problem_code" type="xsd:string"/>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    <xsd:complexType name="PartRequestEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    <xsd:complexType name="CounterEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    </xsd:schema>
    The next experiment is to try performing some content based routing based on the type of event. But I need to learn how to crawl first. I have been scanning thru the Oracle tutorials and googling, but have not found any examples where a single XML document is broken apart in this way into multiple output files.
    Thanks for any help,
    Todd

    Thanks for your reply Allan!
    I went with your recommendations and was able to get it partially working. I modified my XSLT mapping as you indicated. I also ran the transform through Stylus Studio, and it appears to have processed properly.
    What I am seeing now is that the writer is still only writing one file, but rather than both children being included in the file, just the very first child element is included. I am not sure what happens to the second one; I was expecting it to be written out to a second file. Below are the configuration settings I am using for the adapters...
    File reader:
    Batching is currently turned off (i.e. I unchecked "files contain multiple messages"). I originally had this checked, with publishing in batches of 1; it appears to make no difference. The Schema Element is set to the parent element "Events".
    Router:
    Transform via router. This change appears to be working correctly, since I was able to run the transform in Stylus Studio and see both child elements transformed in the output document.
    <xsl:template match="/">
    <xsl:for-each select="/imp1:Events/imp1:ServiceRequestEvent">
    <imp1:ServiceRequestEvent>
    <imp1:summary>
    <xsl:value-of select="imp1:summary"/>
    </imp1:summary>
    <!-- remaining fields mapped the same way -->
    </imp1:ServiceRequestEvent>
    </xsl:for-each>
    </xsl:template>
    File writer:
    File naming convention: event_%yyMMddHHmmssSS%.xml (so the name should be unique).
    I tried checking and unchecking "Number of Messages Equals = 1"; no files are written when this is checked. No errors are found in the ESB instance console; it appears to have written successfully, but there is no file. The schema element is set to the child element "ServiceRequestEvent", since I want each output file to include a single ServiceRequestEvent.
    So I am a bit confused about what I am missing. It is difficult to gain visibility into what is going on under the covers. Is there any debug logging that may show exactly what is happening as the input file is processed, transformed and then written out? I assume that if I can transform the input document via Stylus Studio into a document with just the 2 child elements, then that part is OK; but how does the file adapter know how to break this output up into separate files without some kind of record delimiter?
    Thanks again for your help!
    -T
