File adapter initiated process

I've created a simple process that is supposed to start by polling for a file, reading it, and then deleting it. The process is deployed and the file is in the correct directory, but the process never starts. The file remains in the directory. Is there something I'm overlooking?

It should not happen. Can you send me the process? My ID is
[email protected]

Similar Messages

  • JDBC Adapter and File Adapter not processing messages

    Hi
    I noticed that messages are being delivered to the Adapter Engine and are visible in the Runtime Workbench with status "to be delivered". But the JDBC Adapter and File Adapter are not processing these messages.
    Any idea where I can find the problem?
    I was able to re-deliver successfully via the JDBC Adapter using the MessagingSystem GUI using XISUPER user.
    Regards
    Chandu

    Hi,
    1. Status: TO_BE_DELIVERED
    This means the message was successfully delivered from the Integration Server's point of view: the message has been handed over to the Messaging System.
    TO_BE_DELIVERED is set while the message is put into the Messaging System receive queue.
    Regards
    Agasthuri Doss

  • File Adapter BPEL Process getting switched off

    The file adapter BPEL process reads a csv file, which has a series of records in it, from /xfer/chroot/data/aramex/accountUpdate/files. While reading the files, the BPEL process gets switched off. The snippet below is the error we found in domain.log. Can anybody please suggest what to do?
    <2010-11-25 16:22:28,025> <WARN> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound>
    java.io.FileNotFoundException: /xfer/chroot/data/aramex/accountUpdate/files/VFQ-251120101_1000.csv (No such file or directory)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:106)
    at oracle.tip.adapter.file.FileUtil.copyFile(FileUtil.java:947)
    at oracle.tip.adapter.file.inbound.ProcessWork.defaultArchive(ProcessWork.java:2341)
    at oracle.tip.adapter.file.inbound.ProcessWork.doneProcessing(ProcessWork.java:614)
    at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:445)
    at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:227)
    at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
    at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:280)
    at java.lang.Thread.run(Thread.java:619)
    <2010-11-25 16:22:28,025> <INFO> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound> Processer thread calling onFatalError with exception /xfer/chroot/data/aramex/accountUpdate/files/VFQ-251120101_1000.csv (No such file or directory)
    <2010-11-25 16:22:28,025> <FATAL> <PreActivation.collaxa.cube.activation> <AdapterFramework::Inbound> [Read_ptt::Read(root)]Resource Adapter requested Process shutdown!
    <2010-11-25 16:22:28,025> <INFO> <PreActivation.collaxa.cube.activation> <AdapterFramework::Inbound> Adapter Framework instance: OraBPEL - performing endpointDeactivation for portType=Read_ptt, operation=Read
    <2010-11-25 16:22:28,025> <INFO> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound> Endpoint De-activation called in adapter for endpoint : /xfer/chroot/data/aramex/accountUpdate/files/
    <2010-11-25 16:22:28,095> <WARN> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound> ProcessWork::Delete failed, the operation will be retried for max of [2] times
    <2010-11-25 16:22:28,095> <WARN> <PreActivation.collaxa.cube.ws> <File Adapter::Outbound>
    ORABPEL-11042
    File deletion failed.
    File : /xfer/chroot/data/aramex/accountUpdate/files/VFQ-251120101_1000.csv as it does not exist. could not be deleted.
    Delete the file and restart server. Contact oracle support if error is not fixable.
    at oracle.tip.adapter.file.FileUtil.deleteFile(FileUtil.java:279)
    at oracle.tip.adapter.file.FileUtil.deleteFile(FileUtil.java:177)
    at oracle.tip.adapter.file.FileAgent.deleteFile(FileAgent.java:223)
    at oracle.tip.adapter.file.inbound.FileSource.deleteFile(FileSource.java:245)
    at oracle.tip.adapter.file.inbound.ProcessWork.doneProcessing(ProcessWork.java:655)
    at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:445)
    at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:227)
    at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
    at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:280)
    at java.lang.Thread.run(Thread.java:619)
    <2010-11-25 16:22:28,315> <ERROR> <PreActivation.collaxa.cube> <BaseCubeSessionBean::logError> Error while invoking bean "cube delivery": Process state off.
    The process class "BulkAccountUpdateFileConsumer" (revision "1.0" ) has not been turned on. No operations on the process or any instances belonging to the process may be performed if the process is off.
    Please consult your administrator if this process has been turned off inadvertently.

    This patch is not for 10.1.3.1.
    I have provided a response on the following post:
    BPEL Process Going into Dead State Automatically.
    cheers
    James

  • Sender File adapter to process each file with time gap

    Hi All,
    I have a sender file adapter which is picking up 2 files in a folder. I want to delay the processing of each file by 5 minutes. For example, PI should process the first file, then wait 5 minutes, then process the next, etc. Is this achievable?
    Is there a way to create 2 sender agreements if I create 2 separate communication channels for each file?
    I am using PI 7.0
    Edited by: Dev Noronha on Sep 17, 2010 8:48 AM

    Hi Sarvesh,
    You are correct... but I am not wrong either... just joking.
    Yes, you are correct in normal cases.
    But this interface also involves a BPM.
    Dev didn't describe the whole problem clearly. The problem is as follows:
    When he runs the sender CC and there are multiple files, every file will initiate a BPM instance.
    In turn, every BPM instance waits for a message from the proxy.
    The proxy messages coming from ECC are not able to reach the exact BPM instance,
    e.g. 2 proxy messages going to 1 BPM instance, etc.
    FYI, there is no correlation on the messages.
    Babu

  • Sender File Adapter: stop processing all files

    Hello all,
    the file adapter picks up all files in the directory by default.
    If a large number of files are in the directory, this can slow down PI processing.
    Is there any way to process only one file per polling interval?
    regards

    >
    Ralf Zimmerningkat wrote:
    > Hello all,
    > the file adapter pick up all files in the directory by default.
    > if  a large number of files are in the directory then this could slow down the pi processing.
    > is there any way to process only one file per polling??
    >
    > regards
    I do not have any problem if you have got the answer, but this blog only says how to exclude the other files from the same folder.
    Your case: for example, your sender CC wants to pick up file ABC.txt from the /xyz dir. Now suppose there are ten thousand similarly named files in the dir and you want them picked up one by one. How is this blog going to help? Can you please explain, to me and others too?
    @Sachin: maybe you can throw some light on this... maybe I am missing something.

  • File Adapter Not Processing Large File

    Hi Everyone,
    We are experiencing a problem in an interface that includes file adapter. The scenario is as follows:
    1. A .csv file containing multiple records needs to be processed; for each line of the .csv file, an XML file should be generated. <b>We are using the 'Recordsets Per Message' field in the Content Conversion Parameters to achieve this.</b> After the source .csv file is processed, it is deleted.
    We were testing with small input files of 15-30 records each. To test scalability, we increased the number of records in a file to nearly 300, but the file adapter did not pick up the file. The communication channel in RWB shows green, the MDT contains no entries, and SXMB_MONI also shows no messages. What can be the problem? Is there any limit on the size of file that can be converted in this way?
    Awaiting your replies,
    Regards,
    Amitabha

    Amitabha,
    300 records should not be a problem at all. If you are not getting any error message in CC monitoring, then it is time to take a look into the VA logs.
    Ref: /people/michal.krawczyk2/blog/2005/09/07/xi-why-dont-start-searching-for-all-errors-from-one-place
    Regards,
    Jai Shankar

  • File Adapter not processing files which have a COMMA in the name

    Hi Experts,
    I am working on Project in SOA 11.1.1.6.
    I am trying to process a file which has a comma in it, e.g. XXXXXXX-XX_XX_XXX11,99.pdf.
    I am picking the file up from the SOURCE directory with a file adapter and using another file adapter to WRITE it to the destination directory. Polling and processing are working fine.
    I am picking the file from the SOURCE directory as an ATTACHMENT and WRITING it to the DESTINATION folder through the File Adapter.
    The file WRITE is failing. While reading the file, the values of Content Type, Character Set and Encoding are NULL for the ATTACHMENT, as they are optional properties.
    The error I am getting on file WRITE is:
    <remoteFault>
    <part name="summary">
    <summary>java.lang.RuntimeException: Failed to decode properties string att.encoding=null,att.charset=null,att.contentId=AReadFile/XXXXXXXX_XX_XXX11,99.pdf_1352184009640,att.contentType=null,att.partName=attach</summary>
    </part>
    <part name="detail">
    <detail>1</detail>
    </part>
    </remoteFault>

    A comma is usually treated as a separator/delimiter, so it is not really common to have a comma in a filename. I think for this reason the File adapter does not support it (I still have to cross-verify). If you really have a requirement to read/write files with a comma in the filename, you had better log an SR with Oracle support.
    Regards,
    Anuj
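
    A quick sketch of the failure mode Anuj describes: if the adapter serializes attachment properties as a comma-separated key=value string (the property names below come from the fault text; the decoder logic itself is an assumption for illustration), a comma inside the filename splits one value into two tokens and the decode fails:

    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class PropertyDecodeDemo {
        // Assumed decoder: split the properties string on ',' and each token on '='.
        // This mirrors the "Failed to decode properties string" fault above.
        static Map<String, String> decode(String props) {
            Map<String, String> m = new HashMap<>();
            for (String token : props.split(",")) {
                String[] kv = token.split("=", 2);
                if (kv.length != 2) {
                    // A token with no '=' (e.g. the "99.pdf" left over from a
                    // comma in the filename) cannot be decoded.
                    throw new RuntimeException("Failed to decode properties string " + props);
                }
                m.put(kv[0], kv[1]);
            }
            return m;
        }
    }
    ```

    A filename like report.pdf decodes cleanly, but XXXXXXX-XX_XX_XXX11,99.pdf leaves the stray token 99.pdf without an '=', which is the same failure the remoteFault reports.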

  • Sender File Adapter missing processing parameters

    Hi All,
    On my Dev box in ID for configurations of Sender File adapter I am able to see the following options:
    1) Handling of Empty Files
    2) Archive Source Files with Errors
    But I am unable to see them on the QA box. What could be the problem?

    Hi All,
    I checked the metadata in IR in Dev and QA.
    They are different.
    In QA I don't see "Handling of Empty Files".
    What could be the problem?
    Thanks

  • Rename file in file adapter after processing

    Hi All,
    My requirement is: I need to rename the file in the receiver file adapter after the file is written to the folder, and this file should have a counter attached to its name. For example, if a file is created with the name Material, I want to change it to Order001, Order002, Order003... and so on, and the counter should reset after 999. How can I achieve this?
    thanks a lot in advance.
    Regards,
    Rashmi
    Edited by: Rashmi H S on Jul 15, 2009 12:15 PM

    Hi Rashmi,
    Use this UDF; I hope it will work fine:
    DynamicConfiguration conf = (DynamicConfiguration) container
         .getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
    String a = conf.get(key);
    int inc = 1;
    Integer seqNo = (Integer) container.getParameter("seqNo");
    // Reset to 1 on the first call and after reaching 999; otherwise increment.
    if (seqNo == null || seqNo.intValue() == 999) {
        seqNo = new Integer(inc);
    } else {
        seqNo = new Integer(seqNo.intValue() + inc);
    }
    container.setParameter("seqNo", seqNo);
    return var1 + seqNo.toString();
    Regards,
    Raj
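
    The wrap-at-999 logic, together with the Order001-style zero padding the question asks for (the UDF above does not pad), can be sketched standalone as follows; the class and method names are illustrative, not SAP API:

    ```java
    public class FileNameCounter {
        private int seq = 0;

        // Returns the next name in the series Order001 ... Order999,
        // then wraps back to Order001 after 999.
        public String next(String prefix) {
            seq = (seq % 999) + 1;              // counts 1..999, then resets
            return prefix + String.format("%03d", seq);
        }
    }
    ```

    The first call to next("Order") yields Order001; the 999th yields Order999; the 1000th wraps back to Order001.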

  • HA File Adapter still processing same file twice

    We are using the eis/HAFileAdapterMSSQL to watch for ZIP files that get dropped into a directory. We are running SOA Suite 11.1.1.6 on WebLogic 10.3 with a 2-server cluster. We set our PollingFrequency to 120 seconds.
    When our ZIP files are small enough to where our BPEL process (which does some unzipping and merging of files therein) completes within those 120 seconds, everything is fine. One, and only one, node in our cluster processes the file.
    But when the ZIP files are large and our BPEL process takes 30 minutes to complete (i.e. longer than the 120-second polling frequency), we see a new instance of the BPEL process start on soa_server1 as expected, and then a second process instance start on soa_server2 120 seconds later.
    It's almost as if, when the entire BPEL process kicked off by the HAFileAdapterMSSQL does not complete within our polling frequency, the file is not blocked from being picked up and processed by other servers in our cluster. Is there a way to "lock" the file to the first instance of the HAFileAdapterMSSQL that picks it up until it is completely finished with it?
    Has anyone encountered this?
    Thanks,
    Michael

    You need to define the property below in the composite.xml file of your BPEL. This keeps only one adapter instance active (and the other passive) for picking up the file.
    <activationAgents>
    <activationAgent className="oracle.tip.adapter.fw.agent.jca.JCAActivationAgent" partnerLink="PickupFile/FTPConsumer">
    <property name="clusterGroupId">BPELProcessNameCluster</property>
    <property name="portType">FTP/FileConsumer_Message_ptt</property>
    </activationAgent>
    </activationAgents>
    Thanks,
    Vijay

  • File Adapter only processes first row

    Trying to read a csv file into a database table. I have successfully read the file, but the BPEL process seems to only read the first row. I can't see any obvious errors.
    Any help would be appreciated.
    cheers
    James

    I found the solution: I needed to use the for-each functionality in the transformation process.
    cheers
    James

  • Receiver content conversion for File adapter

    Hi ,
    I am doing an IDoc-to-file scenario. It works fine without any FCC in the receiver file adapter, but it fails when I use FCC, with an error like:
    "Conversion initialization failed: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found: Parameter 'STRUC.fieldFixedLengths' or 'STRUC.fieldSeparator' is missing "
    I know this error is because of an FCC problem.
    I get this XML structure without FCC in the receiver file adapter:
    <?xml version="1.0" encoding="UTF-8" ?>
    - <ns0:DataRec_MT xmlns:ns0="http://xyz.....">
    - <SET>
    - <STRUC>
    <PERNR>0000185</PERNR>
    <ENAME>xyz m xyz</ENAME>
    <NACHN>xyz</NACHN>
    </STRUC>
    </SET>
    </ns0:DataRec_MT>
    I want to get the format 00000185||xyz m xyz||xyz.
    I am using these FCC parameters:
    STRUC.fieldNames fld1,fld2,fld3 etc.
    STRUC.fieldSeparator || (do I need to put this in quotes?)
    STRUC.endSeparator 'nl'
    Do I need to add any more parameters, or am I not supposed to use some of these?
    Please help.
    I tried different combinations but no luck.
    I have a DT created with a structure like:
    DT_xyz
      Recset
         Record set structure
    and I maintained the same case everywhere for field names, recordset structure, record set, etc.
    I have searched blogs and links, and I will keep on.
    Also, what if we don't get a value for a field in the record structure? Will the whole FCC fail?
    thank you,
    Babu

    Ramana and Raj,
    I tried both options, but I am still getting the error.
    Recordset structure : STRUC
    STRUC.fieldSeparator ||
    STRUC.endSeparator 'nl'
    or
    Recordset Structure: STRUC
    Try with these parameters
    STRUC.addHeaderLine=0
    STRUC.fieldSeparator = ||
    STRUC.endSeparator = 'nl'
    I get this error in the File CC:
    <b>Channel has not been correctly initialized and cannot process messages
    Conversion initialization failed: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found: Parameter 'STRUC.fieldFixedLengths' or 'STRUC.fieldSeparator' is missing </b>
    and when I further click on the message ID in the CC I get this error:
    <b>Success File adapter receiver: processing started; QoS required: ExactlyOnce
    2007-08-01 08:02:57 Error File adapter receiver channel CC_File_RECV is not initialized. Unable to proceed: null
    2007-08-01 08:02:57 Error Exception caught by adapter framework: Channel has not been correctly initialized and cannot process messages
    2007-08-01 08:02:57 Error Delivery of the message to the application using connection File_http://sap.com/xi/XI/System failed, due to: com.sap.aii.af.ra.ms.api.RecoverableException: Channel has not been correctly initialized and cannot process messages</b>
    any suggestions?
    thank you.
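
    For what it's worth, the output the FCC is being asked to produce is simply each STRUC record's fields joined by the fieldSeparator and terminated by the endSeparator 'nl' (newline). A tiny standalone sketch of that flattening (illustrative Java, not SAP code):

    ```java
    public class FccFlattenDemo {
        // Joins each record's fields with the separator and ends the line with
        // a newline -- what fieldSeparator "||" and endSeparator 'nl' describe.
        static String flatten(String[][] records, String fieldSep) {
            StringBuilder sb = new StringBuilder();
            for (String[] record : records) {
                sb.append(String.join(fieldSep, record)).append('\n');
            }
            return sb.toString();
        }
    }
    ```

    Flattening the single STRUC record from the XML above with "||" yields 0000185||xyz m xyz||xyz followed by a newline.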

  • 'lastFieldsOptional' parameter in Inbound File adapter ?

    Hi there,
    I'm getting the following error from the Adapter Engine when trying to process a flat file.
    The adapter recommends using the 'lastFieldsOptional' parameter; however, this parameter cannot be found in the provided documentation. Can someone tell me what exactly the problem is here? How can I specify this 'lastFieldsOptional' parameter in the inbound file adapter configuration? I have tried putting this parameter in the config file, but the results are the same:
    xml.lastFieldsOptional
    Tue Nov 02 11:14:35 CET 2004 *****
    11:14:35 (4023): File adapter initialized successfully
    11:14:35 (4007): File adapter started
    11:14:35 (4051): Process 1 file(s):
    11:14:35 : /tmp/TF20041024.txt
    11:14:35 (4052): Start processing "TXT" file "/tmp/TF20041024.txt" size 264324 in "EO mode
    11:14:35 (4058): Start converting to XML format 
    11:14:35 (4060): ERROR: Stop processing file "/tmp/TF20041024.txt" (exception "java.lang.Exception: ERROR converting document line no. 1 : java.lang.Exception: Consistency error: field(s) missing - specify 'lastFieldsOptional' parameter to allow this" in XML renderer)
    11:14:35 (4077): Retry  mode -  wait 1200 sec, 0 msec interval before retry

    Hi there,
    I just found the solution. The correct syntax of this "unknown" parameter for the file adapter is:
    xml.lastFieldsOptional=YES
    After this change in the config file the adapter is able to process the file.
    Cheers,
    Rob.
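
    The behavior behind the error can be sketched as follows: content conversion applies a fixed field list to every line, and a line with fewer fields is a consistency error unless missing trailing fields are allowed. The parameter name comes from the adapter; the parsing logic here is an assumed illustration, not the adapter's actual code:

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class LastFieldsOptionalDemo {
        // Maps one flat-file line onto the expected field names. With
        // lastFieldsOptional=false a short line raises a consistency error;
        // with true, missing trailing fields become empty values.
        static List<String> toFields(String line, String[] names,
                                     boolean lastFieldsOptional) {
            String[] values = line.split(";", -1);
            if (values.length < names.length && !lastFieldsOptional) {
                throw new RuntimeException(
                    "Consistency error: field(s) missing - specify 'lastFieldsOptional'");
            }
            List<String> out = new ArrayList<>();
            for (int i = 0; i < names.length; i++) {
                out.add(names[i] + "=" + (i < values.length ? values[i] : ""));
            }
            return out;
        }
    }
    ```

    A two-field line against three expected fields fails unless the flag is set, in which case the third field comes back empty.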

  • File Adapter Service debatching issue

    I am experimenting with the ESB to enable an inbound file adapter to read multiple records ("debatching") from a file with XML content. I am using a router to pass each of the child records to another FileAdapter, which writes each one to a separate XML file. So basically I am just trying to test reading multiple records from a single XML file and then putting each record in its own separate outbound file. This seems like it should be a pretty simple initial experiment with the ESB. But when I drop the inbound XML file, it is picked up properly, yet the inbound file is not split up, or debatched, into separate XML messages by the file input adapter.
    My confusion is over how the file adapter works with files containing XML content. I have checked the "Files contain multiple messages" checkbox and indicated batches of "1". Shouldn't this tell the file adapter to process one record from the inbound file at a time, passing it through the router and ultimately to the outbound file adapter? If so, how does the file adapter know what delimits each record in the XML file? Below is the inbound XML that I am trying to read, break apart, and write out so that each child "ServiceRequestEvent" element lands in its own outbound file. Please advise on what I am missing here. I am not sure that debatching in the File Adapter works with files containing XML content; perhaps someone has a workaround or an example of how to do this.
    <?xml version="1.0" encoding="UTF-8"?>
    <ns4:Events xmlns:ns4="http://xmlns.arl.psu.edu/OffboardServicesEvent" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://xmlns.arl.psu.edu/OffboardServicesEvent Events.xsd">
    <ns4:ServiceRequestEvent>
    <ns4:summary>Planetary over temp test1</ns4:summary>
    <ns4:severity_id>9</ns4:severity_id>
    <ns4:urgency_id>64</ns4:urgency_id>
    <ns4:status>103</ns4:status>
    <ns4:type_id>11124</ns4:type_id>
    <ns4:owner_id>100003631</ns4:owner_id>
    <ns4:current_serial_number>1748AC16206</ns4:current_serial_number>
    <ns4:inventory_item_id>2320011231602</ns4:inventory_item_id>
    <ns4:problem_code>1001R3</ns4:problem_code>
    </ns4:ServiceRequestEvent>
    <ns4:ServiceRequestEvent>
    <ns4:summary>Planetary over temp test2</ns4:summary>
    <ns4:severity_id>9</ns4:severity_id>
    <ns4:urgency_id>64</ns4:urgency_id>
    <ns4:status>103</ns4:status>
    <ns4:type_id>11124</ns4:type_id>
    <ns4:owner_id>100003631</ns4:owner_id>
    <ns4:current_serial_number>1748AC16206</ns4:current_serial_number>
    <ns4:inventory_item_id>2320011231602</ns4:inventory_item_id>
    <ns4:problem_code>1001R3</ns4:problem_code>
    </ns4:ServiceRequestEvent>
    </ns4:Events>
    here is the schema....
    <?xml version="1.0" encoding="UTF-8" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns="http://xmlns.arl.psu.edu/OffboardServicesEvent"
    targetNamespace="http://xmlns.arl.psu.edu/OffboardServicesEvent"
    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
    elementFormDefault="qualified">
    <xsd:element name="ServiceRequestEvent" type="ServiceRequestEventType"/>
    <xsd:element name="Events">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="ServiceRequestEvent" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:complexType name="ServiceRequestEventType">
    <xsd:sequence>
    <xsd:element name="summary" type="xsd:string"/>
    <xsd:element name="severity_id" type="xsd:double"/>
    <xsd:element name="urgency_id" type="xsd:double"/>
    <xsd:element name="status" type="xsd:double"/>
    <xsd:element name="type_id" type="xsd:double"/>
    <xsd:element name="owner_id" type="xsd:double"/>
    <xsd:element name="current_serial_number" type="xsd:string"/>
    <xsd:element name="inventory_item_id" type="xsd:string"/>
    <xsd:element name="problem_code" type="xsd:string"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:schema>
    I have been scanning thru the Oracle tutorials and googling, but have not found any examples where a single XML document is debatched via the File adapter in this way.
    Thanks for any help,
    Todd

    I figured it out. I was using 10.1.3.1, which does not support XML debatching -- just flat file debatching. I upgraded to 10.1.3.3 and it works now. Before I did that, however, I followed the instructions here: http://www.oracle.com/technology/products/ias/bpel/pdf/10133technotes.pdf (section: Transferring Large Payloads in Oracle BPEL Process Manager) and downloaded those jarfiles. Perhaps they come with 10.1.3.3, but I did not remove them before I upgraded so I'm not sure if those steps were necessary.
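
    Once debatching is supported, its effect is equivalent to splitting the batch document on its repeating child element. A standalone sketch of that split using the JDK's DOM APIs (this only illustrates the outcome; it is not the adapter's internal implementation):

    ```java
    import java.io.StringReader;
    import java.io.StringWriter;
    import java.util.ArrayList;
    import java.util.List;

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;

    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;
    import org.xml.sax.InputSource;

    public class XmlDebatchDemo {
        // Returns one serialized XML string per repeating child element --
        // the per-record messages debatching would hand downstream.
        static List<String> split(String xml, String recordLocalName) throws Exception {
            DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
            f.setNamespaceAware(true);
            Document doc = f.newDocumentBuilder().parse(new InputSource(new StringReader(xml)));
            NodeList records = doc.getDocumentElement()
                                  .getElementsByTagNameNS("*", recordLocalName);
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            List<String> out = new ArrayList<>();
            for (int i = 0; i < records.getLength(); i++) {
                StringWriter w = new StringWriter();
                t.transform(new DOMSource((Element) records.item(i)), new StreamResult(w));
                out.add(w.toString());
            }
            return out;
        }
    }
    ```

    Run against the two-record Events document above, this produces two strings, one per ServiceRequestEvent, each of which could then be written to its own file.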

  • ESB File Adapter Service question

    I am experimenting with the ESB to enable an inbound file adapter to read multiple records from an XML file. I am using a router to pass each of the child records to another FileAdapter, which writes each one to a separate XML file. So basically I am just trying to test reading multiple records from a single XML file and then putting each record in its own separate XML file. This seems like it should be a pretty simple initial experiment with the ESB. But when I drop the inbound XML file, it is picked up properly, yet the entire inbound XML file is written to just a single outbound XML file (it is not split up). My confusion is over how the file adapter works. I have checked the "Files contain multiple messages" checkbox and indicated batches of "1". Shouldn't this tell the file adapter to process one record from the inbound file at a time, passing it through the router and ultimately to the outbound file adapter? If so, how does the file adapter know what delimits each record in the XML file? Below is the inbound XML that I am trying to read and break apart, putting each service request event in its own outbound file. Please advise on what I am missing here.
    <Events xmlns="http://www.example.org/test">
    <ServiceRequestEvent>
    <summary>test1</summary>
    <severity_id>9</severity_id>
    <urgency_id>64</urgency_id>
    <status>103</status>
    <type_id>11124</type_id>
    <owner_id>100003631</owner_id>
    <current_serial_number>1748AC16206</current_serial_number>
    <inventory_item_id>2320011231602</inventory_item_id>
    <problem_code>1001R3</problem_code>
    </ServiceRequestEvent>
    <ServiceRequestEvent>
    <summary>test2</summary>
    <severity_id>9</severity_id>
    <urgency_id>64</urgency_id>
    <status>103</status>
    <type_id>11124</type_id>
    <owner_id>100003631</owner_id>
    <current_serial_number>1748AC16206</current_serial_number>
    <inventory_item_id>2320011231602</inventory_item_id>
    <problem_code>1001R3</problem_code>
    </ServiceRequestEvent>
    </Events>
    here is the schema....
    <?xml version="1.0" encoding="windows-1252" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns="http://www.example.org/test"
    targetNamespace="www.example.org/test"
    xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
    elementFormDefault="qualified">
    <!-- type declarations -->
    <xsd:element name="Event" abstract="true" type="EventType"/>
    <xsd:element name="ServiceRequestEvent" substitutionGroup="Event" type="ServiceRequestEventType"/>
    <xsd:element name="PartRequestEvent" substitutionGroup="Event" type="PartRequestEventType"/>
    <xsd:element name="CounterEvent" substitutionGroup="Event" type="CounterEventType"/>
    <xsd:element name="Events">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="Event" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:element name="ServiceRequestEvents">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element ref="ServiceRequestEvent" maxOccurs="unbounded"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    <xsd:complexType name="EventType">
    <xsd:sequence/>
    </xsd:complexType>
    <xsd:complexType name="ServiceRequestEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    <xsd:element name="summary" type="xsd:string"/>
    <xsd:element name="severity_id" type="xsd:double"/>
    <xsd:element name="urgency_id" type="xsd:double"/>
    <xsd:element name="status" type="xsd:double"/>
    <xsd:element name="type_id" type="xsd:double"/>
    <xsd:element name="owner_id" type="xsd:double"/>
    <xsd:element name="current_serial_number" type="xsd:string"/>
    <xsd:element name="inventory_item_id" type="xsd:string"/>
    <xsd:element name="problem_code" type="xsd:string"/>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    <xsd:complexType name="PartRequestEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    <xsd:complexType name="CounterEventType">
    <xsd:complexContent>
    <xsd:extension base="EventType">
    <xsd:sequence>
    </xsd:sequence>
    </xsd:extension>
    </xsd:complexContent>
    </xsd:complexType>
    </xsd:schema>
    The next experiment is to try performing some content based routing based on the type of event. But I need to learn how to crawl first. I have been scanning thru the Oracle tutorials and googling, but have not found any examples where a single XML document is broken apart in this way into multiple output files.
    Thanks for any help,
    Todd

    Thanks for your reply, Allan!
    I went with your recommendations and was able to get it partially working. I modified my XSLT mapping as you indicated. I also ran the transform through Stylus Studio, and the transform appears to process properly.
    What I am seeing now is that the writer still only writes one file, but rather than both children being included in the file, just the very first child element is included. I am not sure what happens to the second one; I was expecting it to be written out to a second file. Below are the configuration settings I am using for the adapters...
    File reader:
    Batching is currently turned off (i.e. I unchecked "Files contain multiple messages"). I originally had this checked, publishing in batches of 1; it appears to make no difference. The Schema Element is set to the parent element "Events".
    Router:
    Transform via the router. This change appears to be working correctly, since I was able to run the transform via Stylus Studio and see both child elements transformed in the output document.
    <xsl:template match="/">
    <xsl:for-each select="/imp1:Events/imp1:ServiceRequestEvent">
    <imp1:ServiceRequestEvent>
    <imp1:summary>
    <xsl:value-of select="imp1:summary"/>
    </imp1:summary>
    </imp1:ServiceRequestEvent>
    </xsl:for-each>
    </xsl:template>
    File writer:
    File naming convention: event_%yyMMddHHmmssSS%.xml (so should be unique)
    I tried checking and unchecking "Number of messages Equals = 1". No files are written when this is checked: no errors are found in the ESB instance console and it appears to have written successfully, but there is no file. The Schema Element is set to the child element "ServiceRequestEvent", since I want each output file to include a single ServiceRequestEvent.
    So I am a bit confused about what I am missing. It is difficult to gain visibility into what is going on under the covers. Is there any debug logging that would show exactly what happens as the input file is processed, transformed and written out? I assume that if I can transform the input document via Stylus Studio into a document with just the two child elements, that part is OK; but how does the file adapter know how to break that output up into separate files without some kind of record delimiter?
    Thanks again for your help!
    -T
