File Adapter: Handling Large Documents

Hi,
I am currently working on the File Adapter, reading large documents and writing them out to another file location.
I came across the following techniques:
1. Scalable DOM
2. File Chunk Read.
Can anyone explain the exact use cases of the above-mentioned techniques in the File Adapter?
Thanks

1. Scalable DOM is used to move/copy large files intact.
2. File ChunkedRead is used to process large documents (it uses a while loop).
When you're using File ChunkedRead, you can take a large document with many elements and perform some operations on each of those elements.
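To make the pattern concrete, here is a minimal plain-Java sketch of the chunked-read idea. This is only an illustration of the while-loop pattern, not the adapter's actual API; the file name and chunk size are assumptions for the example.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    public class ChunkedReadSketch {
        public static void main(String[] args) throws IOException {
            int chunkSize = 500; // records per chunk; tune to your payload
            try (BufferedReader reader = new BufferedReader(new FileReader("large-input.txt"))) {
                List<String> chunk = new ArrayList<String>();
                String line;
                // the while loop reads and processes one chunk at a time,
                // so the whole document is never held in memory
                while ((line = reader.readLine()) != null) {
                    chunk.add(line);
                    if (chunk.size() == chunkSize) {
                        process(chunk);
                        chunk.clear();
                    }
                }
                if (!chunk.isEmpty()) {
                    process(chunk); // last partial chunk
                }
            }
        }

        private static void process(List<String> records) {
            // placeholder: apply per-element operations here,
            // e.g. transform each record and append it to the target file
            System.out.println("processed " + records.size() + " records");
        }
    }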
-----------Documentation-----------
**Oracle File Adapter Scalable DOM
http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#BABCHCEI
This use case demonstrates how a scalable DOM process uses the streaming feature to copy/move huge files from one directory to another.
The streaming option is not supported with DB2 hydration store.
You can obtain the Adapters-103FileAdapterScalableDOM sample by accessing the Oracle SOA Sample Code site.
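Conceptually, the streaming option behaves like the plain-Java sketch below: the payload is passed through as a stream of bytes and is never materialized as an in-memory DOM. This illustrates the idea only, not the adapter's API; the paths are assumptions.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class StreamingCopySketch {
        public static void main(String[] args) throws IOException {
            Path source = Paths.get("in/huge-file.xml");   // hypothetical source
            Path target = Paths.get("out/huge-file.xml");  // hypothetical target
            // copy the bytes straight through: no parsing, no DOM in memory
            Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }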
**Oracle File Adapter ChunkedRead
http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#BABJFCBH
This is an Oracle File Adapter feature that uses an invoke activity within a while loop to process the target file. This feature enables you to process arbitrarily large files.
You can obtain the Adapters-106FileAdapterChunkedRead sample by accessing the Oracle SOA Sample Code site.
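In the ChunkedRead sample, the chunking is driven from the outbound WSDL. Below is a rough sketch of the kind of jca:operation entry involved; the directory, file name, and chunk size are illustrative, so check the Adapters-106 sample for the exact form used by your version.

    <jca:operation
        InteractionSpec="oracle.tip.adapter.file.outbound.ChunkedInteractionSpec"
        PhysicalDirectory="/tmp/chunked/in"
        FileName="large-input.txt"
        ChunkSize="500"/>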
An additional reference that may be helpful is Handling Binary Content and Large Documents in Oracle SOA Suite 11g:
http://www.oracle.com/technetwork/middleware/soasuite/learnmore/binarycontentlargepayloadhandling-1705355.pdf

Similar Messages

  • File adapter and large file

    Hi experts, I have a problem:
    I have configured a sender file adapter to pick up a 4 MB file from an FTP server, but when I upload the file to FTP, the file adapter doesn't pick it up; the file just stays in the folder.
    I have tried uploading a small file of about 16 KB, and it works fine without any errors.
    I checked the communication channel log in RWB; there are no errors, and all LEDs are green.
    So I don't know how to upload the 4 MB file. I also checked all rights and permissions for the file and user; they all have admin rights ("777").
    Can anybody give me a suggestion or solution?
    Thanks to all who reply.

    Hi,
    Try increasing your server parameters as below; then you should be able to process large data.
    •  UME parameters: we may need to look into the pool size and pool max wait parameters; UME recommended parameters (like poolmaxsize=50, poolmaxwait=60000).
    •  Tuning parameters: we may need to look at/define the message size limit (like EO_MSG_SIZE_LIMIT = 0000100) under the tuning category.
    •  ICM parameters: we may need to consider ICM parameters (e.g., icm/conn_timeout = 900000, icm/HTTP/max_request_size_KB = 2097152).
    Regards,
    Naveen

  • Can the File Adapter handle Multiple Record Layouts in COBOL?

    I am trying to parse flat files that have a record type field in the first four characters on each record. As a test I tried to use the following record definitions:
    01 CONTRACT-HEADER.
       05 RECORD-TYPE PIC X(4).
       05 TRANSACTION-TYPE PIC X.
       05 CONTRACT-NUMBER PIC X(32).
       05 CONTRACT-RELEASE-NBR PIC X(30).
       05 CONTRACT-FILLER PIC X(2025).
    01 CONTRACT-LINEITEM.
       05 RECORD-TYPE PIC X(4).
       05 TRANSACTION-TYPE PIC X.
       05 LINE-FILLER PIC X(1664).
    I also tried smaller records with simple test data. When I press the "test" button in the native file adapter the XML is never generated if I have more than one record type as above. Suppose that the contract-header record type always has the string "X001" in it, while the contract-lineitem always has "X015" in the record-type. I chose Us-ascii and chose $eol as a delimiter. The file on my windows computer has carriage return line feed after each record. I know that the fixed length converter works, but because the records really have 50 fields on each record type and are long, I don't think I can actually use that one in one sitting. Has anyone ever gotten two different record lenght cobol definitions to work? Do I have to hand edit the XML to specify the delimiter? I don't see any delimter specified in the generated XML before I attempt the test.

    Rahul,
    I recommend you use "|", as "," can be embedded in a text string. I guess you can pass a directive to deliver a "|"-delimited file :). If you want to handle both, you might have to write a shell script or batch file that takes the delimiter as an argument and edits/replaces your control file.
    Regards
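    For the multiple-record-layout question itself: the Native Format Builder schema (NXSD) can discriminate record types on a fixed-length prefix via a choice condition. A rough sketch of the relevant annotations is below; the element and type names are hypothetical, and the exact syntax may differ by version, so verify against the NXSD documentation.

        <xsd:choice nxsd:choiceCondition="fixedLength" nxsd:length="4">
            <xsd:element name="ContractHeader" nxsd:conditionValue="X001" type="tns:ContractHeaderType"/>
            <xsd:element name="ContractLineItem" nxsd:conditionValue="X015" type="tns:ContractLineItemType"/>
        </xsd:choice>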

  • File Adapter: handling erroneous files while letting the channel continue to poll files

    Hello,
    Even though there is no out-of-the-box solution in PI, I'd like to hear your suggestions. When you have a file sender communication channel to pick up and send files, there are cases where we get errors at the adapter level. These errors are generally due to errors in the file, which will, for example, lead to conversion errors. When this happens, the communication channel will show an error and will continuously try to pick up the erroneous file until we fix the error in that file. Now imagine the situation where, in the same folder, you have another 1000 files to be picked up and these are all fine. Despite this, the communication channel will remain in an error state until it can process the first file. My question is: what is your generic, standard solution for automatically placing the erroneous file in an error folder, enabling the communication channel to process the other 1000 files?
    Thank you for your suggestions.
    Goncalo Mouro Vaz

    Hi,
    As said above, that option needs to be utilised in the case of error files; search SDN if you need more details.
    Coming to the second point you mentioned, where 1000 files are waiting for processing: if the file adapter's criteria for processing a file match and the file is correct, then those files will get processed in spite of the one error file.
    Hope this clears your queries.
    HTH
    Rajesh

  • Problem about Handling of Empty Files in File Adapter

    Hello everyone,
    NetWeaver 2004s --- XI
    On the sender side I have a File Adapter.
    Now I have a problem with the handling of empty files: when I send an empty file, I don't want an empty message to be created.
    I have seen the following text in the help documentation, but I cannot find the corresponding parameter in the adapter configuration.
    Can you give me some tips?
    Thx in advance
    best regards
    Yaning
    SAP help documentation on the File Adapter:
    Handling of Empty Files
    Specify how empty files (length 0 bytes) are to be handled.
    ○       Do Not Create Message
    No XI messages are created from empty files.
    The files are processed according to the selected Processing Mode.
    For example, if the processing mode is Delete, empty files are deleted in the source directory.
    ○       Process Empty Files
    XI messages are created with an empty main payload.
    The files are processed according to the selected Processing Mode.
    ○       Skip Empty Files
    No XI messages are created from empty files.
    Empty files are skipped and remain in the source directory.

    Hi,
    It's available since SP19 for XI 3.0 and the corresponding SPS for XI 7.0:
    http://help.sap.com/saphelp_nw04/helpdata/en/44/f565854b7341e6e10000000a1553f6/frameset.htm
    So you probably need to install the new SP.
    Regards,
    michal
    XI / PI FAQ - Frequently Asked Questions: /people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions

  • How to handle large data in file adapter

    We have a scenario Proxy -> PI -> File Server using the file adapter.
    The file adapter is using FCC for conversion.
    Recently we went live with wave 2 products, and this interface suddenly has an increased volume of messages, due to which the file adapter is not performing well: PI slows down or frequently disconnects from the file server. As a result, we either get duplicate records in the file or the file format created is wrong.
    The file size is somewhere around 4.07 GB, which I also think is quite high for PI to handle.
    Can anybody suggest how we can handle such large data?
    Regards,
    Vikrant

    Check this blog for huge file processing:
    Night Mare - Processing Huge Files in SAP XI
    You can also take a look at this blog about high-volume messages:
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    PI Performance Tuning Best Practice:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2016a0b1-1780-2b10-97bd-be3ac62214c7?QuickLink=index&overridelayout=true&45896020746271

  • Java exception appears when the file adapter handles large XML

    Hi guys,
    I am using the file adapter to poll a local folder: it picks up input.xml, does some processing, and then writes out output.xml.
    The input.xml format is:
    <mes:DataRequestIns xmlns:mes="http://xml.netbeans.org/schema/MessageDefined">
    <!--1 or more repetitions:-->
         <mes:DataIns>
              <mes:address>Ang Mo Kio 5,63 street</mes:address>
              <mes:birthday>14/12/1970</mes:birthday>
              <mes:firstName>Sing1</mes:firstName>
              <mes:gender>0</mes:gender>
              <mes:ID>1001</mes:ID>
              <mes:lastName>NorthMan</mes:lastName>
              <mes:postcode>2002</mes:postcode>
              <mes:telephone>6665222</mes:telephone>
              <mes:surname>Sur</mes:surname>
              <mes:status>single</mes:status>
              <mes:location>sgp</mes:location>
              <mes:HP1>11111</mes:HP1>
              <mes:HP2>22222</mes:HP2>
              <mes:passport>123456878</mes:passport>
              <mes:company>ST</mes:company>
              <mes:department>TO</mes:department>
              <mes:occupation>engineer</mes:occupation>
              <mes:experience>10</mes:experience>
              <mes:level>2</mes:level>
              <mes:certificate>high</mes:certificate>
              <mes:specialisation>computer</mes:specialisation>
              <mes:staffNumber>1020</mes:staffNumber>
              <mes:fax>00200</mes:fax>
              <mes:email>[email protected]</mes:email>
              <mes:national1>sgp</mes:national1>
              <mes:national2>sgp</mes:national2>
              <mes:interestingField>bpel</mes:interestingField>
              <mes:national>earth</mes:national>
              <mes:oversea>yes</mes:oversea>
         </mes:DataIns>
    <mes:DataIns>
    </mes:DataIns>
    </mes:DataRequestIns>
    If my input.xml contains fewer than 1000 records, it works; once there are more than 1000, an exception appears. In the normal case there are more than 10000 records to handle, and the file size is 11.2 MB.
    So, is there another way, or how should I use the file adapter to handle this?
    Thank you
    Exception:
    SEVERE: An error occured while scanning for the next trigger to fire.
    org.quartz.JobPersistenceException: Failed to obtain DB connection from data source 'soaNonManagedDS': java.sql.SQLException: Could not retrieve datasource via JNDI url 'jdbc/SOALocalTxDataSource' weblogic.jdbc.extensions.ConnectionDeadSQLException: weblogic.common.resourcepool.ResourceDeadException: 0:weblogic.common.ResourceException: Could not create pool connection. The DBMS driver exception was: Socket read timed out [See nested exception: java.sql.SQLException: Could not retrieve datasource via JNDI url 'jdbc/SOALocalTxDataSource' weblogic.jdbc.extensions.ConnectionDeadSQLException: weblogic.common.resourcepool.ResourceDeadException: 0:weblogic.common.ResourceException: Could not create pool connection. The DBMS driver exception was: Socket read timed out]
    at org.quartz.impl.jdbcjobstore.JobStoreCMT.getNonManagedTXConnection(JobStoreCMT.java:167)
    at org.quartz.impl.jdbcjobstore.JobStoreSupport.executeInNonManagedTXLock(JobStoreSupport.java:3652)
    at org.quartz.impl.jdbcjobstore.JobStoreSupport.acquireNextTrigger(JobStoreSupport.java:2654)
    at org.quartz.core.QuartzSchedulerThread.run(QuartzSchedulerThread.java:235)
    Caused by: java.sql.SQLException: Could not retrieve datasource via JNDI url 'jdbc/SOALocalTxDataSource' weblogic.jdbc.extensions.ConnectionDeadSQLException: weblogic.common.resourcepool.ResourceDeadException: 0:weblogic.common.ResourceException: Could not create pool connection. The DBMS driver exception was: Socket read timed out
    Another exception:
    Message handle error.
    error while attempting to process the message "com.collaxa.cube.engine.dispatch.message.invoke.InvokeInstanceMessage"; the reported exception is: Transaction Rolledback.: weblogic.transaction.internal.TimedOutException: Transaction timed out after 302 seconds
    BEA1-3468FC5E3C698D2A82F7
    at weblogic.transaction.internal.ServerTransactionImpl.wakeUp(ServerTransactionImpl.java:1742)
    at weblogic.transaction.internal.ServerTransactionManagerImpl.processTimedOutTransactions(ServerTransactionManagerImpl.java:1609)
    at weblogic.transaction.internal.TransactionManagerImpl.wakeUp(TransactionManagerImpl.java:1885)
    at weblogic.transaction.internal.ServerTransactionManagerImpl.wakeUp(ServerTransactionManagerImpl.java:1519)
    at weblogic.transaction.internal.WLSTimer.timerExpired(WLSTimer.java:35)
    at weblogic.timers.internal.TimerImpl.run(TimerImpl.java:273)
    at weblogic.work.SelfTuningWorkManagerImpl$WorkAdapterImpl.run(SelfTuningWorkManagerImpl.java:516)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    ; nested exception is: weblogic.transaction.internal.TimedOutException: Transaction timed out after 302 seconds
    BEA1-3468FC5E3C698D2A82F7
    This error contained an exception thrown by the message handler.

    Hi,
    You have to explicitly specify batching while configuring the file adapter:
    i.e., in step 5, check "File contains multiple messages" and specify how many messages you want to process in a batch.
    For example, if you have 10000 messages in your file, specify the batch size as 500; it will then read the 10000 records in 20 batches.
    Now, while writing to the file, use the Append="true" property.
    You have to add this property to fileadapter.wsdl (in the jca:operation tag), which is created when you complete the configuration of a file adapter with a write operation; a sketch follows.
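    A rough sketch of what the edited jca:operation tag might look like; the Append attribute is the point here, while the other attribute values are illustrative and will differ in your generated WSDL:

        <jca:operation
            InteractionSpec="oracle.tip.adapter.file.outbound.FileInteractionSpec"
            PhysicalDirectory="/tmp/output"
            FileNamingConvention="output_%SEQ%.xml"
            NumberMessages="1"
            Append="true"/>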
    Regards
    PavanKumar.M

  • Receiver file adapter creates empty files, Empty-Message Handling SP19

    Hello,
    We have just upgraded the system to SP19.
    One of the new features is that it should be possible to determine how XI messages with an empty main payload are to be handled in the receiver file adapter.
    If the parameter Empty-Message Handling is set to 'Ignore' no file should be created if the main payload is empty. In our case an empty file (size 0 kb) is still created even though the main payload is empty and the flag is set to 'Ignore'.
    Has anybody experienced the same problem?
    //  Best regards  Hans

    This should work:
    Use your own adapter module that parses the incoming message and checks whether it has any record sets in the document. If it does not have any record sets, set the message content to empty and then pass the modified message to the file receiver.
    For example, see the example code below:
    // Module, audit log, and DOM/SAX imports go here
    // (abbreviated as "Module imports.." etc. in the original post)
    public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) throws ModuleException {
         try {
              // get the XI message from the environment
              Message msg = (Message) inputModuleData.getPrincipalData();
              AuditMessageKey amk = new AuditMessageKey(msg.getMessageId(), AuditDirection.INBOUND);
              Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "RemoveRootTag: Module called");
              XMLPayload payLoad = msg.getDocument();
              Document doc = parseXmlFile(payLoad.getInputStream());
              if (doc != null) {
                   if (!doc.getDocumentElement().hasChildNodes()) {
                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "Document is empty!!");
                        // blank out the payload so the receiver's empty-message handling applies
                        payLoad.setContent("".getBytes());
                        msg.setDocument(payLoad);
                   }
              }
              // provide the XI message for returning
              inputModuleData.setPrincipalData(msg);
         } catch (Exception e) {
              // raise an exception when an error occurs
              throw new ModuleException(e);
         }
         // return the XI message
         return inputModuleData;
    }

    private Document parseXmlFile(InputStream xmlpayload) {
         try {
              DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
              factory.setValidating(false);
              // create the builder and parse the payload
              return factory.newDocumentBuilder().parse(xmlpayload);
         } catch (SAXException e) {
              return null;
         } catch (ParserConfigurationException e) {
              return null;
         } catch (IOException e) {
              return null;
         }
    }
  • Ways to handle large volume data (file size = 60MB) in PI 7.0 file to file

    Hi,
    In a file-to-file scenario (flat file to XML file), the flat file is picked up with FCC and then sent to XI. In XI, message mapping and then an XSL transformation are performed in sequence.
    The scenario works fine for small files (up to 5 MB), but when the input flat file is larger than 60 MB, XI shows lots of problems, like (1) JCo call errors, or (2) sometimes XI even stops and we have to start it manually again to function properly.
    Please suggest some way to handle large volumes (file sizes up to 60 MB) in a PI 7.0 file-to-file scenario.
    Best Regards,
    Madan Agrawal.

    Hi Madan,
    If every record of your source file were processed in a target system, you could perhaps split your source file into several messages by setting this up in the Recordsets per Message parameter.
    However, you just want to convert your .txt file into an .xml file. So first try setting up the EO_MSG_SIZE_LIMIT parameter in SXMB_ADM.
    This may solve the problem in the Integration Engine, but the problem will persist in the Adapter Engine, I mean, the JCo call error...
    Take into account that the file is first processed in the Adapter Engine (File Content Conversion and so on) and is then sent to the pipeline in the Integration Engine.
    Carlos

  • Handling Large files in PI scenarios?

    Hello,
    We have a lot of scenarios (almost 50) where we deal with file interfaces on at least the receiver or sender side. Some of them are just file transfers, where we use AAE, and some are ones where we have to do message mapping (sometimes very complex mappings).
    The interfaces work perfectly fine with a normal file that doesn't have many records, but recently we started testing big files with over 1000 records, and they are taking a lot of time to process. This also causes other messages lined up in the same queue to wait for the amount of time it takes the first message to process.
    This must be a very common scenario where PI has to process large files, especially files coming from banks. What is the best way to handle their processing? Apart from having better system hardware (we are currently in the test environment; the production environment will definitely be better), is there any technique that might help us improve the processing of large files without data loss and without interrupting other messages?
    Thanks,
    Yash

    Hi Yash,
    Check these blogs for the structure you are mentioning:
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    Regards,
    ---Satish

  • Handling large files with FTP in XI

    Hi All,
    I have a file scenario where I have to post a file larger than 500 MB, with up to 700 fields in each line.
    Another scenario worked fine when the file size was below 70 MB with fewer fields.
    Could anyone help me in handling such a scenario, with a large file size and without splitting the file?
    1) From your previous experience, did you use any tools to help with the development of the FTP interfaces?
    2) The client looked at ItemField but is not willing to use it due to the licensing costs. Did you use any self-made pre-processors?
    3) Please let me know the good and bad experiences you had when using the XI File Adapter.
    Thanks & Regards,
    Raghuram Vedagiri.

    500 MB is huge. XI will not be able to handle such a huge payload, for sure.
    Are you using XI as a mere FTP, or are you using content conversion with mapping, etc.?
    1. Either use splitting logic to split the file outside XI (using scripts) and then have XI handle the resulting files; a rough sketch of such a splitter is below.
    2. Or size your hardware (Java heap, etc.) to make sure that XI can handle this file (not recommended, though). SAP recommends a size of 5 MB as the optimum.
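    As a sketch of option 1, assuming a line-oriented flat file (the file names and part size are hypothetical; the original suggestion was an OS-level script, and this just shows the same idea in Java):

        import java.io.BufferedReader;
        import java.io.BufferedWriter;
        import java.io.FileReader;
        import java.io.FileWriter;
        import java.io.IOException;

        public class FileSplitterSketch {
            public static void main(String[] args) throws IOException {
                int linesPerPart = 100000; // tune so each part stays within what XI handles comfortably
                int part = 0;
                int count = 0;
                BufferedWriter out = null;
                try (BufferedReader in = new BufferedReader(new FileReader("huge-input.txt"))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        // start a new part file every linesPerPart lines
                        if (count % linesPerPart == 0) {
                            if (out != null) out.close();
                            out = new BufferedWriter(new FileWriter("part-" + (part++) + ".txt"));
                        }
                        out.write(line);
                        out.newLine();
                        count++;
                    }
                } finally {
                    if (out != null) out.close();
                }
            }
        }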
    Regards
    Bhavesh

  • Read Large Files Using BPEL File Adapter

    Hi,
    I have a scenario where large text-format files are to be read and sent to a 3rd party. An MTOM policy has to be attached. Files of size 3 MB or less are polled, but files greater than 3 MB are not retrieved. How can I resolve the issue? Do I have to break up the file and send the data? If so, how?
    Thanks
    Ranga

    You have to use the streaming feature of the JCA file adapter for handling huge files.
    You can go through the following link:
      http://docs.oracle.com/cd/E23943_01/integration.1111/e10231/adptr_file.htm#CIAHHEBF

  • Handling large files in scope of WSRP portlets

    Hi there,
    just wanted to ask if there are any best practices with respect to handling large file uploads/downloads when using WSRP portlets (apart from bypassing WebCenter altogether for these use cases, that is). We continue to get OutOfMemoryErrors and TimeoutExceptions as soon as the file being transferred becomes larger than a few hundred megabytes. The portlet is happily streaming the file as part of its javax.portlet.ResourceServingPortlet.serveResource(ResourceRequest, ResourceResponse) implementation, so the problem must somehow lie within WebCenter itself.
    Thanks in advance,
    Chris


  • Hardware Question – Handling large files in Photoshop

    I'm working with some big TIFF files (~1 GB) for large-scale hi-res printing (60" x 90", 10718 x 14451), and my system is lagging hard like never before (2012 Retina MacBook Pro, 2.6 GHz i7, 8 GB RAM, 512 GB HD).
    So far I've tried:
    1) converting to .psd and .psb
    2) changing the scratch disk to an external Thunderbolt SSD
    3) allocating all available memory to the program within Photoshop preferences
    4) closing all other applications
    In general I'm being told that I don't have enough RAM. So what are the minimum recommended system requirements to handle this file size more comfortably? The newest Retina Pro with 16 GB of RAM? Or a switch to an iMac with 32? A Mac Pro?
    Thanks so much!


  • Handling Large File

    Hi all,
    We need to handle a large file (880 MB). Is there any provision at the adapter level to break the file into smaller chunks?
    Can we avoid using shell scripts and OS-level commands?
    Thanks,
    Srinivas

    Hi Srinivas,
    If it is a text file, then you could break the file up into multiple recordsets, e.g.:
    [Converting Text Format in the Sender File/FTP Adapter to XML|http://help.sap.com/saphelp_nwpi711/helpdata/en/44/658ac3344a4de0e10000000a1553f7/frameset.htm]
    and
    [#821267 File Adapter FAQ|http://service.sap.com/sap/support/notes/821267]
    (see 14. Memory Requirements)
    Regards
      Kenny
