Problem while reading multiple files through FTP Adapter

Hi,
We have a requirement to read Excel files placed in an FTP location. As there is no adapter to read Excel files directly,
we are using the FTP adapter and reading the header values of the file (the name of the
file), which we pass as input to Java code that reads the data and inserts it into the database.
If we place more than 20 files, only some of them are read and some are left behind; if we delete the files and place the unread ones again, some of those are read, and if we repeat the procedure, eventually all the files are read.
Any help regarding this is appreciated.
Thanks and Regards,
Nagaraju .D

Are you doing anything complex with your polling, e.g. files that must be a certain age (n units of time old) before they are picked up?
Can you post the WSDL so I can see the polling configuration? I only need to see the adapter configuration, not the whole file.
cheers
James

Similar Messages

  • Problem while reading the file from FTP server

    Hi Friends,
    I have a problem while fetching files from an FTP server.
    I used the FTP_CONNECT and FTP_COMMAND function modules. I am able to put files onto the FTP server,
    but I am not able to pick files up from the FTP server.
    If anyone has faced similar issues, kindly let me know.
    Thanks
    Gowrishankar

    Hi,
    try this way:
    to read (get) a file over FTP you need to issue a different command than for putting it, e.g. a 'get' via FTP_COMMAND (or use a function module such as FTP_SERVER_TO_R3 to pull the file into the SAP system).
    Prabhuda

  • Problem while reading XML file from Application server (AL11)

    Hi Experts
    I am facing a problem while reading an XML file from the application server using OPEN DATASET.
    OPEN DATASET v_dsn IN BINARY MODE FOR INPUT.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    READ DATASET v_dsn INTO v_rec.
    WHILE sy-subrc = 0.
      " process/collect v_rec here, then read the next record
      READ DATASET v_dsn INTO v_rec.
    ENDWHILE.
    CLOSE DATASET v_dsn.
    The XML file contains the details of an IDoc. The expected output is an XML file with all the segment details on a single page, sent to the user as an attachment in Lotus Notes. But in the present output, after opening the attachment, I get a single XML file that contains most of the segments; at the bottom, however, it gives the error below.
    - <E1EDT13 SEGMENT="1">
      <QUALF>001</QUALF>
      <NTANF>20110803</NTANF>
      <NTANZ>080000</NTANZ>
      <NTEND>20110803<The XML page cannot be displayed
    Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
    Invalid at the top level of the document. Error processing resource 'file:///C:/TEMP/notesD52F4D/SHPORD_0080005842.xml'.
    /SPAN></NTEND>
      <NTENZ>000000</NTENZ>
    All the XML files give this error at the bottom, but once we open the source code and save it without changing anything, the file is shown as valid XML without any error.
    Could anyone help to solve this issue?

    Hi Oliver
    Thanks for your reply.
    See the latest output:
    - <E1EDT13 SEGMENT="1">
      <QUALF>003</QUALF>
      <NTANF>20110803</NTANF>
      <NTANZ>080000</NTANZ>
      <NTEND>20110803</NTEND>
      <NTENZ>000000</NTENZ>
      <ISDD>00000000</ISDD>
      <ISDZ>000000</ISDZ>
      <IEDD>00000000</IEDD>
      <IEDZ>000000</IEDZ>
      </E1EDT13>
    - <E1EDT13 SEGMENT="1">
      <QUALF>001</QUALF>
      <NTANF>20110803</NTANF>
      <NTANZ>080000</NTANZ>
      <NTEND>20110803<The XML page cannot be displayed
    Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
    Invalid at the top level of the document. Error processing resource 'file:///C:/TEMP/notesD52F4D/~1922011.xml'.
    /SPAN></NTEND>
      <NTENZ>000000</NTENZ>
    The E1EDT13 segment with QUALF 003 and the E1EDT13 segment with QUALF 001 contain almost the same data,
    but the segment with QUALF 003 is populated completely, while the segment with QUALF 001 is cut off partway through.

  • Problem While reading a file in text mode from Unix in ECC 5.0

    Hi Experts,
    I am working on Unicode Upgrade project of ECC5.0.
    Here I have a problem reading a file format that is handled successfully in 4.6 but not in ECC 5.0.
    My file format was as follows:
    *4 000001862004060300000010###/#######L##########G/##########G/########
    In 4.6 this was successfully converted into the corresponding field values:
    *4
    00000186
    2004
    06
    03
    00000010
    25
    0
    4
    0
    54.75
    0
    54.75
    0.00
    But in ECC 5.0 I get a problem during conversion of the same line:
    *4 000001862004060300000010###/#######L##########G/##########G/########
    It is read with the values still appearing as '#' characters (i.e. they are not converted).
    I have used the following statements to open and read the dataset:
    OPEN DATASET i_dsn IN LEGACY TEXT MODE FOR INPUT.
    READ DATASET i_dsn INTO pos_rec.
    Thanks for your help.
    Regards,
    Gopinath Addepalli.

    Hi
    You might be facing this problem because of Unicode. When opening or reading the file there is an ENCODING addition (for example ENCODING DEFAULT, ENCODING UTF-8 or ENCODING NON-UNICODE); use that option with the code page you need. Then the problem may be solved.
    Thanks & Regards.
    Harish.

  • Error while reading files through FTP Adapter

    Hi,
    I am using the FTP adapter to read files and archive them in local folders.
    But I got the following errors when I deployed the process:
    <2008-12-18 13:48:31,140> <INFO> <default.collaxa.cube.activation> <File Adapter::Inbound> Connection Created
    <2008-12-18 13:48:31,390> <ERROR> <default.collaxa.cube.activation> <File Adapter::Inbound> Unable to get Binary file '/MySharedFolders/abc/AAAA_TT_Trrrrplan Bireeee Baaaa? ??ketmmllri Ltd.?ti._4673651.pdf'; FTP command RETR returned unexpected reply code : 550
    <2008-12-18 13:48:44,156> <INFO> <default.collaxa.cube.activation> <File Adapter::Inbound> Managed Connection Created
    <2008-12-18 13:48:44,156> <INFO> <default.collaxa.cube.activation> <File Adapter::Inbound> Processer thread calling onFatalError with exception Error getting binary file from FTP Server.
    Unable to get binary file from server.
    Check the error stack and fix the cause of the error. Contact oracle support if error is not fixable.
    <2008-12-18 13:48:44,156> <FATAL> <default.collaxa.cube.activation> <AdapterFramework::Inbound> [Get_ptt::Get(opaque)]Resource Adapter requested Process shutdown!
    Please help me out on this issue.
    Thanks,
    Synthia

    Hi James,
    We have upgraded to 10.1.3.4, but we still cannot read the file. The problem is that it cannot read file names containing UTF-8 characters.
    The error is below:
    <2009-12-01 15:43:33,109> <INFO> <default.collaxa.cube.activation> <FTP Adapter::Inbound> Managed Connection Created
    <2009-12-01 15:43:33,109> <INFO> <default.collaxa.cube.activation> <FTP Adapter::Inbound> Connection Created
    <2009-12-01 15:43:38,406> <ERROR> <default.collaxa.cube.activation> <FTP Adapter::Inbound> Unable to get Binary file '/MySharedFolders/Invoice History/EMEA_TR_Bimta?-Bo?aziτi Peysaz ?n?aat Mⁿ?avirlik Teknik Hizmetler A?aτ Sanayi ve Ticaret Anonim ?irketi_44596501.pdf'; FTP command RETR returned unexpected reply code : 550
    <2009-12-01 15:43:51,156> <INFO> <default.collaxa.cube.activation> <FTP Adapter::Inbound> Processer thread calling onFatalError with exception Error getting binary file from FTP Server.
    Unable to get binary file from server.
    Check the error stack and fix the cause of the error. Contact oracle support if error is not fixable.
    <2009-12-01 15:43:51,156> <FATAL> <default.collaxa.cube.activation> <AdapterFramework::Inbound> [Get_ptt::Get(opaque)]Resource Adapter requested Process shutdown!
    Thank you.

  • Problem while uploading text file through portal into WebDAV repository

    Hi all,
    I am not able to upload any file through the portal onto my WebDAV repository on a remote server; however, the reverse is possible, i.e. any document created on the remote server is reflected in the portal.
    Every time I try to upload a file through the portal, I get the following error:
    The item could not be created because an exception occurred in the framework.
    Kindly suggest what to do....
    ThankS

    Hi Chetna,
    Have you specified any user information in the WebDAV repository, such as 'always connect through this user', on the WebDAV repository tab? That user may not have write permission in Windows.
    Also, are you sure that the user you logged into the portal with (the one that failed to create new files or folders) and the user that was able to create files in Windows are one and the same?
    Regards,
    Ganesh N

  • Dynamic name for File through FTP Adapter

    I am working on an FTP put and have a requirement to generate the file with dynamic content in its name. I actually need to embed a purchase order number in the file name, followed by a sequence number. I don't need any timestamp. Should I create a variable for this?
    Help in this regard would be highly appreciated.

    Hi,
    you have to enable the adapter-specific message attribute 'File Name' in both the sender and the receiver communication channels to get the input file name as the output file name.
    Refer this blog:
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    You also need to use this UDF:
    DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File","FileName");
    String ourSourceFileName = conf.get(key);
    return  ourSourceFileName; 
    Also refer this blog:
    /people/william.li/blog/2006/04/18/dynamic-configuration-of-some-communication-channel-parameters-using-message-mapping
    Regards,
    Nithiyanandam
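    To go the other way and actually build the dynamic file name (purchase order number plus sequence number, as asked above), the same DynamicConfiguration can be written to instead of read. Below is a minimal sketch of such a UDF, assuming the receiver channel has the adapter-specific attribute 'File Name' enabled; the parameter names poNumber and seqNo and the ".txt" extension are illustrative assumptions, and the imports (the com.sap.aii.mapping.api classes plus the UDF Container class) are assumed to be declared in the UDF's import list.

    public String setDynamicFileName(String poNumber, String seqNo, Container container)
            throws StreamTransformationException {
        // Access the dynamic configuration of the message being mapped
        DynamicConfiguration conf = (DynamicConfiguration) container
                .getTransformationParameters()
                .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
        DynamicConfigurationKey key = DynamicConfigurationKey
                .create("http://sap.com/xi/XI/System/File", "FileName");
        // Build the target file name from the purchase order number and sequence number
        String fileName = poNumber + "_" + seqNo + ".txt";   // e.g. 4500001234_001.txt
        conf.put(key, fileName);                              // receiver channel picks this up via ASMA
        return fileName;
    }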

  • Problem while reading XML File from Memory

    I want to read the data from an XML file which is residing in memory. The XML file should be identified not by the name or location of the file but by a response string.
    Can anybody help me out with how to achieve this?
    I tried it with BufferedInputStream, but BufferedInputStream needs an InputStream as a parameter, and I am not able to work out what should be passed into it, as its signature is
    BufferedInputStream(InputStream in)
    It's very urgent!
    Thanks in Advance

    I have created the parsing method.
    The code is like this.
    if (response.toString().equals("CS_NEWS_RESPONSE")) {
        // InputStream in = new BufferedInputStream(InputStream());
        // InputStream in = new BufferedInputStream(response.toString());
        // BufferedInputStream b = new BufferedInputStream(in);
        DocumentBuilderFactory docBuilderFactory = DocumentBuilderFactory.newInstance();
        DocumentBuilder docBuilder = docBuilderFactory.newDocumentBuilder();
        Document doc = docBuilder.parse(in);
        Element root = doc.getDocumentElement();
    }
    The code goes on like this.
    response.toString() contains the response type, which is the identification of that particular XML file. Using this, the file is identified, and after identification I should read its contents and parse them.
    The commented-out lines above are the approaches I tried. Is this the correct way?
    Amit Will you be able to help me or is there anybody else who knows about this.
    Edited by: Monadear on Oct 5, 2007 9:37 AM
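    For parsing XML that is already in memory, one approach is to wrap the XML text in an InputSource instead of hunting for an InputStream over a file. The sketch below assumes the actual XML content is available as a String (here called xmlContent, a hypothetical variable distinct from the response-type identifier); it is not the poster's exact setup, just one way to parse in-memory XML with the standard JAXP API.

    import java.io.StringReader;
    import javax.xml.parsers.DocumentBuilder;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.xml.sax.InputSource;

    public class InMemoryXmlReader {

        // Parses XML held in memory as a String; no file name or file location is needed.
        public static Document parse(String xmlContent) throws Exception {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            DocumentBuilder builder = factory.newDocumentBuilder();
            // InputSource wraps the in-memory String via a StringReader
            return builder.parse(new InputSource(new StringReader(xmlContent)));
        }

        public static void main(String[] args) throws Exception {
            String xmlContent = "<news type=\"CS_NEWS_RESPONSE\"><item>Hello</item></news>";
            Element root = parse(xmlContent).getDocumentElement();
            System.out.println("Root element: " + root.getTagName());
        }
    }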

  • First row is repeating while accessing multiple rows through Database adapter

    Hi,
    I am using the database adapter in a BPEL process to retrieve data from a remote database. In the database adapter I have selected the select operation and I am passing an input parameter.
    The adapter needs to return 10 different rows through its output variable.
    It is returning 10 rows, but the first row is repeated 10 times.
    I am not able to get the 10 distinct rows; only the first row is repeated.
    Please help me in this.
    Thanks in Advance

    Hi Arik,
    I got the solution. The problem was the primary key.
    Your solution is correct.
    Thank you very much for answering this question quickly.
    I have changed the primary key and it is working fine.
    Could you please explain what exactly the problem with the primary key was?
    Thanks in advance,
    Sreeni

  • Problem while updating multiple records through standard RFC

    Hi all,
    I am updating multiple objects using a standard RFC (BAPI_PROJECT_MAINTAIN), which I am calling from the Enterprise Portal. I am sending data to the RFC one object at a time, but the error I get is that the object is locked by a user, so the data can't be saved.
    Although I am using lock and unlock methods before and after calling the RFC, the project lock error still comes up.
    What might be the reason?
    regards
    sandeep

    Use 'save and wait' instead of a plain save with the BAPI, i.e. follow the BAPI call with a commit that waits for the update to finish (BAPI_TRANSACTION_COMMIT with WAIT = 'X') so the lock is released before the next call.
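    Since the RFC is being called from the Enterprise Portal (Java side), the commit-with-wait pattern can be applied there as well. The sketch below assumes direct JCo 3 access with a destination named "ABAP_AS" (a hypothetical name; a portal application may go through a different connection layer) and omits filling the BAPI's actual parameters.

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;

    public class ProjectUpdater {

        // Calls BAPI_PROJECT_MAINTAIN and then commits with WAIT = 'X', so the update task
        // finishes (and the enqueue lock is released) before the next object is sent.
        public static void updateProject(JCoDestination destination) throws JCoException {
            JCoFunction maintain = destination.getRepository().getFunction("BAPI_PROJECT_MAINTAIN");
            // ... fill the project definition / method and object tables here ...
            maintain.execute(destination);

            JCoFunction commit = destination.getRepository().getFunction("BAPI_TRANSACTION_COMMIT");
            commit.getImportParameterList().setValue("WAIT", "X");  // wait until the update is posted
            commit.execute(destination);
        }

        public static void main(String[] args) throws JCoException {
            JCoDestination destination = JCoDestinationManager.getDestination("ABAP_AS");
            updateProject(destination);
        }
    }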

  • FTP Adapter to read multiple files from a directory. Not through polling.

    Dear Friends,
    I would like to know whether it is possible to configure the FTP adapter in Oracle BPEL 10.1.3.4 to read multiple files (different names, same structure) from a given directory. I do not want BPEL to poll; instead, when I submit the BPEL process it should read all files from the directory.
    I was looking at the synchronous read option, but I am not able to specify a wildcard in the file name field, and I do not know the file names at the time of reading.
    Thanks for your help!

    Hi,
    When you read the file, you can configure an adapter property on the 'Receive' activity. This will store the file name, and that file name can be used as the input parameter for the synchronous read.
    1. Create a message type variable called 'fileheader'. This should be of type Inboundheader_msg (of the relevant Receive activity).
    2. This variable will contain three parts: filename, FTPhost and FTPPort.
    3. Copy this 'fileheader' to 'syncheader'.
    4. 'syncheader' can be passed as an adapter property during the synchronous read of the file.
    On both the Receive and the Invoke you need to navigate to the 'Adapter' tab to choose the created message type variable.
    Let me know if you have further questions.
    regards,
    Rev

  • I am facing a problem while reading values from a properties file; I am getting a NullPointerException. Earlier I was using JDeveloper 10g, now I am using 11g

    I am facing a problem while reading values from a properties file; I am getting a NullPointerException. Earlier I was using JDeveloper 10g, now I am using 11g.

    Hi TimoHahn,
    I am getting the following exception in JDeveloper (11g Release 2) Studio Edition Version 11.1.2.4.0, but it works perfectly fine in JDeveloper 10.1.2.1.0:
    Root cause of ServletException.
    java.lang.NullPointerException
    at java.util.PropertyResourceBundle.handleGetObject(PropertyResourceBundle.java:136)
    at java.util.ResourceBundle.getObject(ResourceBundle.java:368)
    at java.util.ResourceBundle.getString(ResourceBundle.java:334)
    at org.rbi.cefa.master.actionclass.UserAction.execute(UserAction.java:163)
    at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
    at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
    at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
    at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:432)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
    at java.security.AccessController.doPrivileged(Native Method)
    at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
    at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
    at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
    at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
    at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
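    The NullPointerException thrown inside PropertyResourceBundle.handleGetObject typically means the key handed to getString was null (for example, a value that is no longer populated after the migration), rather than the bundle file itself being missing. A minimal defensive-read sketch is below; the bundle name "cefa" and key "db.url" are hypothetical placeholders.

    import java.util.Locale;
    import java.util.MissingResourceException;
    import java.util.ResourceBundle;

    public class BundleReader {

        // Reads a key from a properties bundle on the classpath, guarding against the two
        // usual failure modes: a null key (NullPointerException from handleGetObject) and a
        // missing key or bundle (MissingResourceException).
        public static String readValue(String bundleName, String key, String fallback) {
            if (key == null) {
                return fallback;  // a null key is what triggers the NPE seen in the stack trace
            }
            try {
                ResourceBundle bundle = ResourceBundle.getBundle(bundleName, Locale.getDefault());
                return bundle.containsKey(key) ? bundle.getString(key) : fallback;
            } catch (MissingResourceException e) {
                return fallback;  // the bundle itself was not found on the classpath
            }
        }

        public static void main(String[] args) {
            System.out.println(readValue("cefa", "db.url", "not-configured"));
        }
    }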

  • Reading huge flat file through SOAP adapter

    Hi Everybody,
    In one of our interfaces we need to read a big flat file into XI using the SOAP adapter on the sender side, and we are using a Java mapping to convert it into XML. Before that, I need to split this flat file into multiple files in the first message mapping, and in the second step we have to write a Java mapping to do the flat-file-to-XML conversion. But I got stuck on reading this big flat file into XI, because I need to declare some data type to hold the entire file. Can anybody tell me how I can do this? Is it possible at all with the SOAP adapter?
    Thanks
    raj

    Hi Vijay,
    Thanks for your prompt reply. For certain reasons I am not allowed to use the file adapter; I can only use the JMS adapter or the SOAP adapter. We tried a few scenarios with JMS content conversion, but the scenario I am asking about here is complex at multiple levels, so I can't even use JMS in this case. So we are thinking of reading the whole file using the SOAP adapter, then splitting it into multiple files using a Java mapping (since the file can be huge), and at the next level using another mapping to do the content conversion. I still have to experiment to find out whether this is a feasible solution, because when you declare the sender-side data type as
    <ffdata_MT>
      <Recordset>
        <ROW>   (String type)
    and send the flat file using the SOAP adapter at the sender side, we get the whole file we sent as a string inside the "ROW" element. Inside the Java mapping I need to see whether I can split this in XI, so that I can use the split files in the next mapping for content conversion. I hope I am clear now; I want to know whether this is a feasible solution or not.
    I would really appreciate it if somebody could give me some ideas on this.
    Thanks
    raj
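    As a feasibility check for the splitting step described above, the splitting logic itself is plain Java and independent of the adapter. The sketch below is a standalone illustration (not tied to the XI mapping API, where the payload would arrive as the mapping's InputStream): it streams the flat-file content line by line and groups it into chunks of a configurable size.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;
    import java.util.ArrayList;
    import java.util.List;

    public class FlatFileSplitter {

        // Reads the flat-file payload line by line and groups the lines into chunks of
        // linesPerChunk, so each chunk can be handed to the next mapping step separately.
        // Streaming avoids holding the whole file in a single String.
        public static List<String> split(InputStream payload, int linesPerChunk) throws IOException {
            List<String> chunks = new ArrayList<>();
            StringBuilder current = new StringBuilder();
            int count = 0;
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(payload, StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    current.append(line).append('\n');
                    if (++count == linesPerChunk) {
                        chunks.add(current.toString());
                        current.setLength(0);
                        count = 0;
                    }
                }
            }
            if (current.length() > 0) {
                chunks.add(current.toString());   // remainder that did not fill a full chunk
            }
            return chunks;
        }
    }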

  • Problem while processing large files

    Hi
    I am facing a problem while processing large files.
    I have a file which is around 72 MB, with more than 100,000 (1 lakh) records. XI is able to pick up the file if it has up to 30,000 records. If the file has more than 30,000 records, XI still picks up the file (and deletes it once picked up), but I don't see any information under SXMB_MONI: no error, no success, not even 'processing'. It simply picks up and ignores the file. If I process these records separately, it works.
    How do I process this file? Why is it simply being ignored? How can I solve this problem?
    Thanks & Regards
    Sowmya.

    Hi,
    XI picks up the file subject to a maximum processing limit as well as the memory and resource consumption of the XI server.
    Processing a 72 MB file is on the higher side; it increases the memory utilization of the XI server, and processing may fail once that limit is reached.
    You should divide the file into small chunks and allow multiple instances to run. It will be faster and will not create any problems; a simple way to split the file is sketched at the end of this reply.
    Refer
    SAP Network Blog: Night Mare-Processing huge files in SAP XI
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    Processing huge file loads through XI
    File Limit -- please refer to SAP note: 821267 chapter 14
    File Limit
    Thanks
    swarup
    Edited by: Swarup Sawant on Jun 26, 2008 7:02 AM
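    One simple way to produce the "small chunks" recommended above is to pre-split the large file into part files before XI polls the directory. The sketch below illustrates that idea only and is not an XI feature; the paths and the 30,000-line chunk size are assumptions taken from the numbers mentioned in the thread.

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class FileChunker {

        // Splits a large source file into numbered part files of at most linesPerPart lines,
        // so the integration server only ever has to process small files.
        public static void chunk(Path source, Path targetDir, int linesPerPart) throws IOException {
            Files.createDirectories(targetDir);
            try (BufferedReader reader = Files.newBufferedReader(source, StandardCharsets.UTF_8)) {
                String line;
                int lineCount = 0;
                int partNo = 0;
                BufferedWriter writer = null;
                while ((line = reader.readLine()) != null) {
                    if (writer == null || lineCount == linesPerPart) {
                        if (writer != null) {
                            writer.close();   // finish the previous part file
                        }
                        partNo++;
                        writer = Files.newBufferedWriter(
                                targetDir.resolve(source.getFileName() + ".part" + partNo),
                                StandardCharsets.UTF_8);
                        lineCount = 0;
                    }
                    writer.write(line);
                    writer.newLine();
                    lineCount++;
                }
                if (writer != null) {
                    writer.close();
                }
            }
        }

        public static void main(String[] args) throws IOException {
            chunk(Paths.get("/data/in/bigfile.txt"), Paths.get("/data/in/parts"), 30000);
        }
    }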

  • FILE and FTP Adapter file size limit

    Hi,
    Oracle SOA Suite ESB related:
    I see that there is a file size limit of 7 MB for transfers using the File and FTP adapters, and that debatching can be used to overcome this. I also see that debatching can be done only for structured files.
    1) What can be done to transfer unstructured files larger than 7 MB from one server to the other using the FTP adapter?
    2) For structured files, could someone help me with debatching a file with the following structure?
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_2
    300|Line_id_2|1234|Location_ID_2
    400|Location_ID_2|1234|Dist_ID_2
    100|PO_226355|1234|7136890
    200|PO_226355|1234|Line_id_N
    300|Line_id_N|1234|Location_ID_N
    400|Location_ID_N|1234|Dist_ID_N
    999|SSS|1234|88|158
    I need the complete data in a single file at the destination for each file in the source. If instead there are as many files at the destination as there are batches, I would need the output file structure to be as follows:
    000|SEC-US-MF|1234|POPOC|679
    100|PO_226312|1234|7130667
    200|PO_226312|1234|Line_id_1
    300|Line_id_1|1234|Location_ID_1
    400|Location_ID_1|1234|Dist_ID_1
    999|SSS|1234|88|158
    Thanks in advance,
    RV
    Edited by: user10236075 on May 25, 2009 4:12 PM
    Edited by: user10236075 on May 25, 2009 4:14 PM

    OK, here are the steps:
    1. Create an inbound file adapter as you normally would. The schema is opaque; set the polling as required.
    2. Create an outbound file adapter as you normally would; it doesn't really matter what xsd you use, as you will modify the WSDL manually.
    3. Create an xsd that will read your file. This would typically be the xsd you would use for the inbound adapter. I call this address-csv.xsd.
    4. Create an xsd that is the desired output. This would typically be the xsd you would use for the outbound adapter. I have called this address-fixedLength.xsd. So I want to map csv to fixed-length format.
    5. Create the XSLT that will map between the two xsds. Do this in JDev: select the BPEL project, right-click -> New -> General -> XSL Map.
    6. Edit the outbound file partner link WSDL, setting the jca operations as the doc specifies; this is my example.
    <jca:binding  />
            <operation name="MoveWithXlate">
          <jca:operation
              InteractionSpec="oracle.tip.adapter.file.outbound.FileIoInteractionSpec"
              SourcePhysicalDirectory="foo1"
              SourceFileName="bar1"
              TargetPhysicalDirectory="C:\JDevOOW\jdev\FileIoOperationApps\MoveHugeFileWithXlate\out"
              TargetFileName="purchase_fixed.txt"
              SourceSchema="address-csv.xsd" 
              SourceSchemaRoot ="Root-Element"
              SourceType="native"
              TargetSchema="address-fixedLength.xsd" 
              TargetSchemaRoot ="Root-Element"
              TargetType="native"
              Xsl="addr1Toaddr2.xsl"
              Type="MOVE">
          </jca:operation>
    7. Edit the outbound header to look as follows:
        <types>
            <schema attributeFormDefault="qualified" elementFormDefault="qualified"
                    targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/file/"
                    xmlns="http://www.w3.org/2001/XMLSchema"
                    xmlns:FILEAPP="http://xmlns.oracle.com/pcbpel/adapter/file/">
                <element name="OutboundFileHeaderType">
                    <complexType>
                        <sequence>
                            <element name="fileName" type="string"/>
                            <element name="sourceDirectory" type="string"/>
                            <element name="sourceFileName" type="string"/>
                            <element name="targetDirectory" type="string"/>
                            <element name="targetFileName" type="string"/>                       
                        </sequence>
                    </complexType>
                </element> 
            </schema>
        </types>
    8. The last trick is to have an assign between the inbound header and the outbound header partner link that copies the headers. You only need to copy the sourceDirectory and sourceFileName.
        <assign name="Assign_Headers">
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:fileName"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceFileName"/>
          </copy>
          <copy>
            <from variable="inboundHeader" part="inboundHeader"
                  query="/ns2:InboundFileHeaderType/ns2:directory"/>
            <to variable="outboundHeader" part="outboundHeader"
                query="/ns2:OutboundFileHeaderType/ns2:sourceDirectory"/>
          </copy>
    </assign>
    You should be good to go. If you just want pass-through, then you don't need the native format; set it to opaque, with no XSLT.
    cheers
    James
