File Adapter in ESB

Hi,
I am creating a file adapter to read a fixed-length file. I have hand-written the schema and configured the adapter to poll a directory every 10 seconds, with a file name filter expression of *.txt.
The first time, everything works as expected. When I change the schema and reconfigure the adapter, the adapter no longer picks up the file; it picks the file up only after OAS is restarted. What is the problem here?
It works if I specify the schema as opaque.
Will the adapter not pick up files that do not conform to the input schema?
Can someone clarify?

Hi,
Actually, in the file and FTP adapters we can poll from multiple locations.
Put the following property in your .jca file: <property name="DirectorySeparator" value="," />
When giving the path in the File Adapter configuration, add a comma and then the next location; the file will then be picked up from each of the locations you gave.
If you want to write the file to multiple locations, you can try the same approach; if that doesn't work, create two file adapter references, each with a different output location.
Hope this helps.
Thanks,
N
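A minimal sketch of what that property looks like in context (the surrounding activation spec, class name, and paths are illustrative, not taken from any real project; in SOA Suite 10g the same properties typically live in the adapter's generated WSDL rather than a separate .jca file):

```xml
<!-- Hypothetical inbound file adapter endpoint configuration.
     PhysicalDirectory lists two folders; DirectorySeparator declares
     the comma as the delimiter between them. -->
<activation-spec className="oracle.tip.adapter.file.inbound.FileActivationSpec">
  <property name="PhysicalDirectory" value="C:\folder1,C:\folder2"/>
  <property name="DirectorySeparator" value=","/>
  <property name="IncludeFiles" value=".*\.txt"/>
  <property name="PollingFrequency" value="10"/>
</activation-spec>
```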

Similar Messages

  • ESB or BPEL file adapter and special characters

    Hi,
    We have a scenario where we import rows from .csv file through an ESB project into a database. We use the file adapter for this. There appears to be a problem with special characters (like é). Both in the ESB control (with variable tracking) and in the database, they appear as upside down questionmarks (¿). I've tried doing the same with a BPEL project (file adapter as client PL) and in the BPEL console, I also see strange characters instead of the expected special characters (diamond shaped characters, like ♦ to be precise).
    I can't find anything about character sets or character set conversions in the documentation. What am I missing?
    Regards,
    Arjan

    see
    http://download-west.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/nfb.htm#CIAEFBHH

    I've looked into the properties mentioned. They are set when you go through the wizard. Everything is set to UTF-8, which should provide me with all the special characters I need.
    BPEL does the exact same thing, so I'm starting to believe that the problem really is with the file adapter.
    Regards,
    Arjan
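    For reference, the encoding the file translator assumes is declared on the native-format schema root via the nxsd:encoding attribute. A sketch (the target namespace and encoding value are invented; the point is that the declared encoding must match the file's actual encoding):

```xml
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
            targetNamespace="http://www.example.org/invoice"
            nxsd:version="NXSD"
            nxsd:stream="chars"
            nxsd:encoding="ISO-8859-1">
  <!-- If the .csv is really Latin-1 but the schema declares UTF-8,
       characters such as é are read as malformed byte sequences and
       surface as replacement characters downstream. -->
</xsd:schema>
```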

  • Process multi-record & multi-record-format files using ESB & File Adapter

    I am looking to process/parse an in-bound file with the following make-up and to load into a database table using the File Adapter, ESB and DB Adapter.
    File Make-Up:
    - each line in the file is a record
    - each record is made up of 12 fields
    - there are multiple record types denoted by the first field in the line
    - record types may or may not have common fields
    - where there are common fields, the field may be in different columns
    - each record is to be inserted into a database table
    Sample File:
    3,,"03-0243-0188132-00",.20,26,075,"","000000006026","","","22/04/08",03 1303
    3,,"03-0243-0188132-00",20.00,26,075,"","","","","22/04/08",03 0579
    5,,"03-0243-0188132-00",99.60,,,"OPENING BALA",,,"ACME GROUP","22/04/08",
    6,,"03-0243-0188132-00",99.60,,,"CLOSING BALA",,,"ACME GROUP","22/04/08",
    8,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    8,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    Record Types and corresponding format:
    3, Corp ID, A/C Number, Trans Amt, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    5, Corp ID, A/C Number, Opening Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    6, Corp ID, A/C Number, Closing Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    Please note that record types 8 and 9 have 2 fixed fields.
    I have tried playing around with the File Adapter and the ESB but am not having much luck. Can someone provide any suggestions on how I can handle this?

    James,
    Thanks for your prompt response. I have come across your post previously.
    Please excuse my inexperience (I am very new to SOA and Oracle SOA Suite), but I have not understood how your post regarding the manual creation of an XSD will assist with my problem. Could you possibly elaborate further on the overall approach I should be taking?
    Regards,
    Simon
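    A hand-written native-format schema along these lines would typically let the first comma-terminated field select the record layout. A rough sketch, not a working schema: the element and type names are invented, and each record type from the spec above would need its own complexType listing its 12 fields:

```xml
<xsd:element name="TransactionFile">
  <xsd:complexType>
    <xsd:choice minOccurs="1" maxOccurs="unbounded"
                nxsd:choiceCondition="terminated" nxsd:terminatedBy=",">
      <!-- The first comma-terminated field (the record type) picks the branch. -->
      <xsd:element name="Transaction"    nxsd:conditionValue="3" type="TransactionType"/>
      <xsd:element name="OpeningBalance" nxsd:conditionValue="5" type="BalanceType"/>
      <xsd:element name="ClosingBalance" nxsd:conditionValue="6" type="BalanceType"/>
      <xsd:element name="Summary"        nxsd:conditionValue="8" type="SummaryType"/>
      <xsd:element name="Total"          nxsd:conditionValue="9" type="SummaryType"/>
    </xsd:choice>
  </xsd:complexType>
</xsd:element>
```

Record types 8 and 9 share a layout here, since their fixed "DEBITS"/"CREDITS" fields can be modeled as ordinary data fields within one type.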

  • Listener on ESB file adapter does not appear to listen

    I am new to the world of SOA, ESB, and BPEL and am starting with something simple such as reading a file from one directory performing a data transformation and writing the file to a different directory. I understand that Oracle's ESB and BPEL are very similar in functionality. I have successfully been able to read files in using a file adapter and listener in BPEL. When I try to set up a new ESB application with a file adapter and listener in ESB, I register the project with the Integration Server without any problems, and am able to see all components in the ESB Control. When I drop a new file into the directory that the adapter is attached and listening to, nothing happens. The file is not picked up and there are no new instances of any ESB services. Am I missing something? Is there a log that could be checked to see if the service is even checking the directory?
    Along the same lines, I am wondering if there is an easy way to update the directory that is being listened to once the service has been registered/deployed (with either ESB or BPEL). Is there a config file that someone other than a developer could update without having to redeploy the service?
    Any help would be greatly appreciated. Thanks.

    What platform are you running and what install option did you use, basic or advanced?
    There are known issues with adapters enabling automatically on Linux that require you to restart the container. See Re: Changing File Adapter target directory - container bounce required?
    You can change the polling directory by following the instructions in this thread.
    Changing File Adapter target directory - container bounce required?

  • ESB/File Adapter - XML files containing multiple messages

    Hi,
    In all the examples on file adapters I read, if files contain multiple messages, it always concerns non-XML files, such as CSV files.
    In our case, we have an XML file containing multiple messages, which we want to process separately (not in a batch). We selected "Files contain Multiple Messages" and set "Publish Messages in Batches of" to 1.
    However, the OC4J log files show the following error:
    ORABPEL-12505
    Payload Record Element is not DOM source.
    The Resource Adapter sent a Message to the Adapter Framework which could not be converted to a org.w3c.dom.Element.
    Anyone knows whether it's possible to do this for XML files?
    Regards, Ronald

    Maybe I need to give a little bit more background info.
    Ideally, one would only read/pick-up small XML documents in which every XML document forms a single message. In that way they can be processed individually.
    However, in our case an external party supplies multiple messages in a single batch file, which is in XML format. I want to work on individual messages as soon as possible and not push a huge batch file through our ESB and BPEL processes. Unfortunately we cannot influence the way the XML file is supplied, since we are not the only subscriber to it.
    So yes, we can use XPath to extract the individual messages from the XML batch file and start an ESB process instance for each one. But that would require creating another ESB or BPEL process whose only task is to "chop up" the batch file and start the original ESB process for each message.
    I was hoping that the batch option in the File adapter could also do this for XML content and not only for e.g. CSV content. That way it will not require an additional process and manual coding.
    Can anyone confirm this is not supported in ESB?
    Regards,
    Ronald
    Message was edited by:
    Ronald van Luttikhuizen
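    For what it's worth, the "chop up" step Ronald describes can be a very small transform. A sketch of an XSLT 1.0 splitter that an orchestrating process could call once per message, passing the message index as a parameter (the batch's element names are unknown here, so it just selects the i-th child of the document element):

```xml
<!-- Hypothetical splitter: emits only the $index-th message from a batch. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="index" select="1"/>
  <xsl:template match="/*">
    <xsl:copy-of select="*[position() = $index]"/>
  </xsl:template>
</xsl:stylesheet>
```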

  • ESB - How to add filter to avoid file adapter to start executing?

    I have an ESB project with, first, a file adapter that reads incoming XML files. It is mapped to a database adapter that calls a stored procedure which inserts data into certain tables. The output parameter of this stored procedure is of type PL/SQL table and contains error data if an error occurred while the procedure was running. A second file adapter then creates a new file with the content of the output parameter, but if the output parameter is empty, this raises an error.
    Is there a way to add a filter so that the file adapter does not try to create the error file when there is nothing to write?
    Thanks,
    Els
    Message was edited by:
    user644760

    Hi Els,
    A possibility is to add a routing service in between the database invocation and file adapter with a single routing rule. Add a filter to the routing service to test whether the input is empty.
    Regards, Ronald
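    As a sketch, the filter on that routing rule would be an XPath boolean over the routing service's input. The element names below are invented placeholders for whatever structure the stored procedure's output parameter maps to:

```
count(/ns1:OutputParameters/ns1:ErrorRecord) > 0
```

    When the expression evaluates to false, the rule does not fire and the file adapter is never invoked.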

  • ESB File Adapter data validation issue

    I have a really simple ESB process.
    File Adapter Read
    (XML format with XSD) --> Routing Service --> SOAP Call to BPEL Process
    Quite simply, the ESB uses the File Adapter to read an XML file which has an associated XSD. The contents are then routed to a BPEL Process.
    If the XML file is valid in syntax but contains an element which is not in the XSD, the file contents are pushed to the BPEL process which has validateXML set to true - thus the error is caught and handled by the BPEL process.
    This is what I want to happen.
    If the XML file is invalid in syntax - for example -
    <TAG>hello</WRONGTAG>
    The ESB process rejects the file and stops dead.
    I would rather the invalid file is pushed to the BPEL so that the BPEL can handle the error.
    Why?
    Because my error handling in the BPEL picks up the invalid XML format and emails the users to inform them there is a problem.
    Can someone help? Is this possible? Thanks.
    Message was edited by:
    IanBaird

    Hello again! You are now aware of the problem as there has been a lot of conversation between myself and Dave Berry [and you!] regarding this issue.
    For the rest of the readers - here is a brief summary of progress so far. Sorry but there has been so much communication that I think the best way is to direct you to various links.
    If an adapter fails in ESB, then it is possible to trap these failures and handle them as documented here:
    Dave Berry said:
    you need to look into the adapter rejection handlers which are documented on OTN ESB page in the exception handling lesson and the adapters OTN dev resources page http://www.oracle.com/technology/products/integration/adapters/dev_support.html
    I had also devised my own solution in the meantime, which involves reading the input file as opaque schema and passing this to a BPEL process which converts it to an XML document! This can then be validated and use standard BPEL error handling to find any problems in the file format etc.
    See my blog for a tutorial on doing this:
    http://soastuff.wordpress.com/2007/06/06/handling-opaque-datatypes-in-bpelesb/
    The only problem with this at the moment is that the BPEL function ora:parseEscapedXML has a bug in it - so it fails if the XML does not comply with the XSD it tries to convert to.
    This in turn causes an ORABPEL-09500 error which does not seem to get handled by the BPEL process so the process fails.
    As things stand, this bug remains - though Dave Berry has been very helpful in identifying it and bringing it to Muruga's attention. Hopefully we can get a fix (?) and a workaround whilst the fix is being implemented (?)
    Cheers.

  • Soa Suite 10g. ESB Project. File Adapter - load in 2 folders

    Hi, friends!
    Please help me solve the following task.
    I have an ESB project with 2 adapters: a DbAdapter (in) and a FileAdapter (out). When new rows appear in a table in the DB (in a CLOB field), I must save them to a folder on disk. Saving to ONE folder is simple, but I need to save simultaneously to 2 different folders (for example, C:\folder1 and C:\folder2). Which settings can help me?

    Hi,
    Actually, in the file and FTP adapters we can poll from multiple locations.
    Put the following property in your .jca file: <property name="DirectorySeparator" value="," />
    When giving the path in the File Adapter configuration, add a comma and then the next location; the file will then be picked up from each of the locations you gave.
    If you want to write the file to multiple locations, you can try the same approach; if that doesn't work, create two file adapter references, each with a different output location.
    Hope this helps.
    Thanks,
    N

  • Can we read a ZIP file through ESB file adapter?

    Hi,
    Is it possible to read a ZIP file from a specified location, either through an FTP adapter or a file adapter, as an opaque element, and place it in another specified location?
    So far the available options are to read a txt, csv, fixed-length, or COBOL file.
    I tried to create an ESB that reads a ZIP file and places it in another location, but it would not read the ZIP file.
    Is it possible at all for the ESB to read a ZIP file? If so, is there any specific configuration we have to do for it to work?

    how can we read an MS Word file through Java?

    You can read it like any other file, using FileInputStream and similar.
    However, if you want to do something with it, like display it, you have a problem. AFAIK Microsoft does not release information about its document formats, and I have not seen a .doc viewer for Java.
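    Back to the original question: when an adapter is configured with the "opaque" option, it does no parsing or translation at all, so binary content such as a ZIP can pass through untouched. The generated WSDL imports a schema of roughly this shape (this is the standard opaque schema; treat the exact namespace as version-dependent):

```xml
<schema xmlns="http://www.w3.org/2001/XMLSchema"
        targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/"
        elementFormDefault="qualified">
  <!-- The file bytes travel base64-encoded inside this single element. -->
  <element name="opaqueElement" type="base64Binary"/>
</schema>
```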

  • ESB File Adapter Service question

    I am experimenting with ESB to enable an inbound file adapter to read multiple records from an XML file. I am using a router to pass each of the child records to another FileAdapter, which writes them to separate XML files. So basically I am just trying to test reading multiple records from a single XML file and then putting each record in its own separate XML file. This seems like a pretty simple initial experiment with the ESB. But when I drop in the inbound XML file, it is picked up properly, yet the entire inbound XML file is written to just a single outbound XML file (it is not split up). My confusion is over how the file adapter works. I have checked the "Files contain multiple messages" checkbox and indicated batches of "1". Shouldn't this tell the file adapter to process one record from the inbound file at a time, passing it through the router and ultimately to the outbound file adapter? If so, how does the file adapter know what delimits each record in the XML file? Here is the inbound XML that I am trying to read, break apart, and write out with each service request event in its own outbound file. Please advise on what I am missing here.
    <Events xmlns="http://www.example.org/test">
    <ServiceRequestEvent>
    <summary>test1</summary>
    <severity_id>9</severity_id>
    <urgency_id>64</urgency_id>
    <status>103</status>
    <type_id>11124</type_id>
    <owner_id>100003631</owner_id>
    <current_serial_number>1748AC16206</current_serial_number>
    <inventory_item_id>2320011231602</inventory_item_id>
    <problem_code>1001R3</problem_code>
    </ServiceRequestEvent>
    <ServiceRequestEvent>
    <summary>test2</summary>
    <severity_id>9</severity_id>
    <urgency_id>64</urgency_id>
    <status>103</status>
    <type_id>11124</type_id>
    <owner_id>100003631</owner_id>
    <current_serial_number>1748AC16206</current_serial_number>
    <inventory_item_id>2320011231602</inventory_item_id>
    <problem_code>1001R3</problem_code>
    </ServiceRequestEvent>
    </Events>
    here is the schema....
    <?xml version="1.0" encoding="windows-1252" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns="http://www.example.org/test"
    targetNamespace="http://www.example.org/test"
    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
    elementFormDefault="qualified">
    <!-- type declarations -->
    <xsd:element name="Event" abstract="true" type="EventType"/>
    <xsd:element name="ServiceRequestEvent" substitutionGroup="Event" type="ServiceRequestEventType"/>
    <xsd:element name="PartRequestEvent" substitutionGroup="Event" type="PartRequestEventType"/>
    <xsd:element name="CounterEvent" substitutionGroup="Event" type="CounterEventType"/>
    <xsd:element name="Events">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="Event" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:element name="ServiceRequestEvents">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="ServiceRequestEvent" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:complexType name="EventType">
    <xsd:sequence/>
    </xsd:complexType>
    <xsd:complexType name="ServiceRequestEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    <xsd:element name="summary" type="xsd:string"/>
    <xsd:element name="severity_id" type="xsd:double"/>
    <xsd:element name="urgency_id" type="xsd:double"/>
    <xsd:element name="status" type="xsd:double"/>
    <xsd:element name="type_id" type="xsd:double"/>
    <xsd:element name="owner_id" type="xsd:double"/>
    <xsd:element name="current_serial_number" type="xsd:string"/>
    <xsd:element name="inventory_item_id" type="xsd:string"/>
    <xsd:element name="problem_code" type="xsd:string"/>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    <xsd:complexType name="PartRequestEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    <xsd:complexType name="CounterEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    </xsd:schema>
    The next experiment is to try performing some content based routing based on the type of event. But I need to learn how to crawl first. I have been scanning thru the Oracle tutorials and googling, but have not found any examples where a single XML document is broken apart in this way into multiple output files.
    Thanks for any help,
    Todd

    Thanks for your reply Allan!
    I went with your recommendations and was able to get it partially working. I modified my XSLT mapping as you had indicated. I also ran the transform through Stylus Studio, and the transform appears to have processed properly.
    What I am seeing happen now is that the writer is still only writing one file, but now, rather than both children being included in the file, just the very first child element is included. I am not sure what happens to the second one; I was expecting it to be written out to a second file. Below are the configuration settings I am using for the adapters:
    File reader:
    Batching is currently turned off (i.e., I unchecked "Files contain multiple messages"). I originally had this checked, publishing in batches of 1; it appears to make no difference. The Schema Element is set to the parent element "Events".
    Router:
    Transform via router. This change appears to be working correctly since I was able to transform via stylus studio and see both child element transformed in the output document.
    <xsl:template match="/">
    <xsl:for-each select="/imp1:Events/imp1:ServiceRequestEvent">
    <imp1:ServiceRequestEvent>
    <imp1:summary>
    <xsl:value-of select="imp1:summary"/>
    </imp1:summary>
    File writer:
    File naming convention: event_%yyMMddHHmmssSS%.xml (so should be unique)
    I tried checking and unchecking "Number of Messages Equals = 1". No files are written when this is checked, yet no errors are found in the ESB instance console; the write appears to have succeeded, but there is no file. The Schema Element is set to the child element "ServiceRequestEvent", since I want each output file to include a single ServiceRequestEvent.
    So I am a bit confused about what I am missing. It is difficult to gain visibility into what is going on under the covers. Is there any debug logging that may show exactly what is happening as the input file is processed, transformed, and then written out? I assume that if I can transform the input document via Stylus Studio into a document with just the two child elements, then that part is OK; but how does the file adapter know how to break this output up into separate files without some kind of record delimiter?
    Thanks again for your help!
    -T
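    For the transform itself, a complete per-record map for the schema quoted above could be as small as this sketch (imp1 bound to http://www.example.org/test, matching the posted fragment; note that one XSLT invocation still yields one output document, so producing one file per event has to come from the adapter's batching rather than from the map):

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:imp1="http://www.example.org/test">
  <xsl:template match="/">
    <imp1:ServiceRequestEvents>
      <xsl:for-each select="/imp1:Events/imp1:ServiceRequestEvent">
        <!-- Copy each event wholesale instead of field by field. -->
        <xsl:copy-of select="."/>
      </xsl:for-each>
    </imp1:ServiceRequestEvents>
  </xsl:template>
</xsl:stylesheet>
```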

  • ESB file adapter on shared directory

    Hi
    I am trying to use file adapter on a mapped network drive, windows machine.
    I am getting this error:
    ORABPEL-11001
    Invalid Input Directory.
    The value specified for the input (Physical/Logical)Directory activation parameter has an invalid value "O:\BCRSignatures"
    Ensure that the following conditions are satisfied for the input directory :
    1) It exists and is a directory (not a file). and
    2) It is readable (file read permissions). and
    3) If activation parameter "DeleteFile" is set to "true" then the directory should also have granted write permissions. and
    4) If using a logical name, then ensure that the mapping from logical name<->physical directory is correctly specified in the deployment descriptor
    The location is accessible for read and write within Windows.
    The error persists even with the "no delete files after processing" and "no archive processed files" options.
    Any ideas? Suggestions?
    Metalink is poor on this.
    Thank you,
    Liviu

    Hi BPEL gurus,
    I keep facing ORABPEL-11001 errors and I don't know why they occur. Can someone give a suggestion in this regard? I attach the logs as follows:
    <2007-11-23 09:53:29,825> <INFO> <dev.collaxa.cube.activation> <File Adapter::Inbound> Recovery still not possible after 151110 attempts due to ORABPEL-11001
    Invalid Input Directory.
    The value specified for the input (Physical/Logical)Directory activation parameter has an invalid value "D:\igefi\bpelinterfaces\mfond\dev\Wrk_Imp_Swap_Price\WORKDIRS\input".
    Ensure that the following conditions are satisfied for the input directory :
    1) It exists and is a directory (not a file). and
    2) It is readable (file read permissions). and
    3) If activation parameter "DeleteFile" is set to "true" then the directory should also have granted write permissions. and
    4) If using a logical name, then ensure that the mapping from logical name<->physical directory is correctly specified in the deployment descriptor.
    I am getting the above message in domain.log and opmn.log.
    Regards
    Hameed

  • Getting filename in FTP Adapter in ESB

    I am trying to get the filename on an inbound FTP adapter in an ESB. I have followed the directions I found, with these steps:
    1. The namespace xmlns:ehdr="http://www.oracle.com/XSL/Transform/java/oracle.tip.esb.server.headers.ESBHeaderFunctions" is automatically generated in the xsl file.
    2. Create a variable in the xsl file, with value ehdr:getRequestHeader('/fhdr:InboundFileHeaderType/fhdr:fileName','fhdr=http://xmlns.oracle.com/pcbpel/adapter/file/;'). Let's give it the name INFILENAME.
    3. Reference the variable in some expression in the xsl file: $INFILENAME
    4. Set UseHeaders="true" in the FTP inbound WSDL file.
    When I do this in a file adapter, I can use the value. When I do this in an ftp adapter, no value comes back.
    Any ideas?

    Yes, that did it, thanks. I had a typo:
    getRequestHeader('/fhdr:InboundFtpHeaderType/fhdr:fileName','fhdr=http://xmlns.oracle.com/pcbpel/adapter/ftp/;')
    which should have had "FTP" in upper case. The correct code is as you stated:
    getRequestHeader('/fhdr:InboundFTPHeaderType/fhdr:fileName','fhdr=http://xmlns.oracle.com/pcbpel/adapter/ftp/;')
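    Putting the steps above together, the variable declaration in the .xsl file would look roughly like this (the variable name INFILENAME is from step 2; the ehdr prefix is the one the wizard generates, as noted in step 1):

```xml
<xsl:variable name="INFILENAME"
              select="ehdr:getRequestHeader('/fhdr:InboundFTPHeaderType/fhdr:fileName','fhdr=http://xmlns.oracle.com/pcbpel/adapter/ftp/;')"/>
```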

  • Issue in invoking a BPEL process (having a DB Adapter) from ESB

    Dear All,
    I am having an issue while invoking an ESB process (batch file mode) which in turn calls a second BPEL process, which invokes the Oracle Apps standard API ego_item_pub.process_items through a DB adapter to create an item in Inventory.
    I am able to create the item, and the API works, when I pass the input XML directly through the second BPEL process.
    But when I invoke it through the ESB process, which calls the BPEL process based on polling the batch file from the ESB (done through an FTP adapter in ESB), the standard PL/SQL API 'ego_item_pub.process_items' throws an error.
    It works fine if I invoke the BPEL process individually.
    What might be the problem, and how can I fix it?
    Please update.
    Many thanks in advance.

    Hi,
    We tried the same thing and it worked for us. In our scenario we had a BPEL process calling an ESB, and this ESB calling another BPEL process which in turn calls the API. I agree with James: it seems more of a data issue, or your XSD (the one your ESB process is using) is not correct. If you created the ESB and then changed the BPEL process afterward, the issue you mention will occur. Please run the process in DEBUG mode and provide the domain.log file.
    Regards
    Sahil
    http://soab2bsahil.blogspot.com

  • Discarding messages in file adapter

    Hi,
    I wish to process a file that has invoice data. The structure of the file is thus:
    $H$<<Supplier Name>>,<<Supplier Site>>,<<Invoice Number>>
    $D$<<Line Number>>     ,<<Business Unit>>,<<Location>>,<<Store Name>>
    $D$<<Line Number>>     ,<<Business Unit>>,<<Location>>,<<Store Name>>
    $H$<<Supplier Name>>,<<Supplier Site>>,<<Invoice Number>>
    $D$<<Line Number>>     ,<<Business Unit>>,<<Location>>,<<Store Name>>
    $D$<<Line Number>>     ,<<Business Unit>>,<<Location>>,<<Store Name>>
    The intent is to configure the file adapter in such a fashion that it rejects an entire invoice if there are any errors in any of the lines belonging to that invoice. The invoices that follow the errored invoice should be processed normally. How do I configure the adapter to achieve this? Any help is greatly appreciated.

    Have a look at the startWith example in the link Jason provides.
    Also, see this post, as it provides examples:
    Re: ESB- Flat file records to different DB tables based on first field
    cheers
    James
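    In that spirit, a native-format schema for the $H$/$D$ layout above could dispatch on the three-character record prefix. A sketch only (type contents elided, names invented); note that the schema alone only classifies records, so the "reject the whole invoice on any bad line" behavior would still need handling downstream, e.g. via the adapter rejection handlers:

```xml
<xsd:choice minOccurs="1" maxOccurs="unbounded"
            nxsd:choiceCondition="fixedLength" nxsd:length="3">
  <!-- The first three characters ($H$ or $D$) select header vs. detail. -->
  <xsd:element name="InvoiceHeader" nxsd:conditionValue="$H$" type="HeaderType"/>
  <xsd:element name="InvoiceLine"   nxsd:conditionValue="$D$" type="LineType"/>
</xsd:choice>
```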

  • Changing File Adapter target directory - container bounce required?

    I edited USPSshipment.wsdl in the FulfillmentESB project to reflect my desired target directory (/home/oracle/usps_ship instead of C:\temp) and then re-registered the project with ESB.
    The registration completed normally, and I verified that the edited USPSshipment.wsdl file exists in the deploy and content directories in the ESB home. Also, checking the WSDL file from the ESB console showed the new value.
    However, until I bounced the OC4J_soa container, order submissions continued to use the C:\Temp target. After I bounced the container, the next order submission went to the desired /home/oracle/usps_ship.
    What is the expected caching behavior here? Is there a programmatic way to clear the cache after an ESB deploy? It is too restrictive if I have to bounce the whole container serving ESB and BPEL (perhaps many processes and ESB adapters) just to make a change in one particular project.
    Thanks.

    Using the SOA Demo, FulfillmentESB project ~
    Example: Immediate reflection of change w/o further action
    If I change the Shipment router filter criteria (Fulfillment_Shipment.esbsvc) to cut at say $750 instead of $500, then re-register FulfillmentESB with OrderBookingIS, JDeveloper reports:
    Registration of Services Successful
    Fulfillment Shipment Updates
    The next submitted order follows the new criteria (e.g. a $600 order goes out USPS).
    Example: Change not reflected in runtime, until OC4J container restarted
    If I change the target physical directory of the USPS Shipment file adapter (USPSShipment.wsdl), then re-register FulfillmentESB with OrderBookingIS, JDeveloper reports:
    Registration of Services Successful
    However, the next submitted orders continue to be written to the previous target physical directory. Checking the value of the re-deployed USPSShipment.wsdl on the ESB host filesystem and through the ESB console shows the updated value, but the old copy is actually being used to process the message.
    Restarting the OC4J container, causes the updated USPSShipment.wsdl to be referenced for messages submitted thereafter.
    -Todd
    Message was edited by:
    tbeets
