Processing a CSV file in batches with the FTP Adapter gives a translation error

Hi All,
I have a CSV with 2000 records and want to process it in batches of 500.
When I don't use batching in the FTP adapter, everything works fine.
But when I enable batching in the adapter and try to process the same file, it gives:
<2009-09-09 12:09:15,997> <ERROR> <VDSTServices.collaxa.cube.translation> <NXSDTranslatorImpl::logError> translateFromNative Failed with exception = 50
<2009-09-09 12:09:15,997> <INFO> <VDSTServices.collaxa.cube.activation> <FTP Adapter::Inbound> Error while translating inbound file : VDST_CNP_EMR5110026_20090904_000000.csv
<2009-09-09 12:09:15,997> <INFO> <VDSTServices.collaxa.cube.activation> <FTP Adapter::Inbound>
ORABPEL-11100
Translation Failure.
[Line=27, Col=1] Translation from native failed. 50.
Check the error stack and fix the cause of the error. Contact oracle support if error is not fixable.
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.doTranslateFromNative(NXSDTranslatorImpl.java:754)
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.translateFromNative(NXSDTranslatorImpl.java:489)
     at oracle.tip.adapter.file.inbound.ProcessWork.doTranslation(ProcessWork.java:748)
     at oracle.tip.adapter.file.inbound.ProcessWork.processMessages(ProcessWork.java:336)
     at oracle.tip.adapter.file.inbound.ProcessWork.run(ProcessWork.java:218)
     at oracle.tip.adapter.fw.jca.work.WorkerJob.go(WorkerJob.java:51)
     at oracle.tip.adapter.fw.common.ThreadPool.run(ThreadPool.java:280)
     at java.lang.Thread.run(Thread.java:595)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 50
     at oracle.tip.pc.services.translation.xlators.nxsd.ErrorList.addError(ErrorList.java:108)
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.terminateLoop(NXSDTranslatorImpl.java:1688)
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.parseNXSD(NXSDTranslatorImpl.java:1307)
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.parseNXSD(NXSDTranslatorImpl.java:1216)
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.parseNXSD(NXSDTranslatorImpl.java:1084)
     at oracle.tip.pc.services.translation.xlators.nxsd.NXSDTranslatorImpl.doTranslateFromNative(NXSDTranslatorImpl.java:706)
     ... 7 more
I tried deleting rows 24-28 from the CSV and processing it again, and I still got the same error at [Line=27, Col=1], so it's not an issue with the CSV itself.
Please suggest.
Thanks

Can you post your CSV and the native schema you are using? Please also mention your SOA version. Thanks.
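
For reference, when "Publish Messages in Batches" is enabled, the batch size ends up as a property on the inbound activation in the generated adapter artifacts; as far as I recall the property is called PublishSize, but verify against what the wizard generated for you. A minimal sketch, 11g-style .jca shown (in 10g the same settings appear as attributes of the jca:operation element in the adapter WSDL); the class name, directory and filter values here are only placeholders:

<activation-spec className="oracle.tip.adapter.ftp.inbound.FTPActivationSpec">
  <!-- placeholder polling settings -->
  <property name="PhysicalDirectory" value="/incoming"/>
  <property name="IncludeFiles" value=".*\.csv"/>
  <property name="PollingFrequency" value="60"/>
  <!-- debatching: publish the translated records in batches of 500 -->
  <property name="PublishSize" value="500"/>
</activation-spec>

If the translator still fails at the same position with batching on, posting the nxsd schema (as requested above) is the quickest way to spot where the batch boundary and the record definition disagree.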

Similar Messages

  • BPEL Process with multiple file types using one FTP adapter is not working

    I created a BPEL process which fetches files from a remote location using the FTP adapter.
    The process currently works for only one format or file type, such as *.xls.
    How can I use more than one file format with one FTP adapter, or is there another way to do it?
    File type selection is the 5th step in the FTP adapter configuration.
    I have tried *.xls,*.csv and *.xls;*.csv and *.xls:*.csv, separating the patterns with a comma, colon and space, but it still doesn't work.
    I read in the documentation that *.* will not work; for a single file format it works fine.
    Looking forward to a reply as soon as possible.

    Are you positive that it is not working? I'm not sure how you can use one FTP adapter for multiple file types unless the underlying data is exactly the same format or you are processing it as opaque data. Sometimes when an FTP adapter chokes on a file with a bad structure it doesn't create a BPEL instance; it simply moves the bad file to a separate folder.
    So I assume you are using opaque as the data type instead of an XSD element?
    That said, I don't think you can put two separate file types in the filter. Is it possible for you to do something like CommonFileName*.*, or do you have similar files with other extensions?
    I know the above probably isn't of much help, but I had so many problems with the FTP adapter and its lack of features that I am writing my own. Unfortunately that is a large undertaking and there isn't any good documentation on JCA resource adapter / BPEL PM integration.
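    One workaround worth trying, assuming a 10g/11g File/FTP adapter where the include filter is a Java regular expression rather than a shell glob: a single pattern with an alternation can match both extensions. This only helps if both file types share the same native schema or are read as opaque. A minimal sketch of the generated .jca (property and class names as I recall them; treat everything here as a placeholder to check against your own artifacts):

    <activation-spec className="oracle.tip.adapter.ftp.inbound.FTPActivationSpec">
      <!-- one regex matching both .xls and .csv files -->
      <property name="IncludeFiles" value=".*\.(xls|csv)"/>
    </activation-spec>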

  • Processing a .CSV file to Table

    Hi Friends,
    Please let me know how to load a .CSV file into a table using the SQL*Loader approach.
    No other technique required! I know it can be done through various methods, but I need it done the above way.
    Thanks in advance!
    tempdbs
    www.dwforum.net

    here:
    http://forums.oracle.com/forums/search.jspa?threadID=&q=loading+csv+sql+loader&objID=c84&dateRange=lastyear&userID=&numResults=15

  • How to find the File name using the FTP Adapter

    Hi all,
    How do I find the file name received through the FTP adapter in BPEL?
    Regards

    Found the solution for this.
    First, in the mediator's routing rule, assign the property $in.property.jca.file.FileName to $out.property.jca.file.FileName.
    Then, in the BPEL receive activity, go to the Properties tab and map the property to a BPEL variable. That should do it.
    Thanks for the posts.
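    For reference, the underlying BPEL source for that receive usually ends up with a property-to-variable mapping along these lines (BPEL 2.0 extension syntax from memory; the partner link, operation and variable names are placeholders, so compare with what JDeveloper actually generates):

    <receive name="ReceiveFile" partnerLink="FtpIn" operation="Get"
             variable="inputVariable" createInstance="yes">
      <bpelx:fromProperties>
        <!-- copy the inbound file-name header into a plain xsd:string variable -->
        <bpelx:fromProperty name="jca.file.FileName" variable="fileName"/>
      </bpelx:fromProperties>
    </receive>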

  • Uploading & Processing of CSV file  fails in clustered env.

    We have a csv file which is uploaded to the weblogic application server, written to a temporary directory and then manipulated before being written to the database. This process works correctly in a single server environment but fails in a clustered environment.
    The file gets uploaded to the server into the temporary directory without problem. The processing starts. When running in a cluster the csv file is replicated to a temporary directory on the secondary server as well as the primary.
    The manipulation process is running but never finishes and the browser times out with the following message:
    Message from the NSAPI plugin:
    No backend server available for connection: timed out after 30 seconds.
    Build date/time: Jun 3 2002 12:27:28
    The server which is loading the file and processing it writes this to the log:
    17-Jul-03 15:13:12 BST> <Warning> <HTTP> <WebAppServletContext(1883426,fulfilment,/fulfilment) One of the getParameter family of methods called after reading from the ServletInputStream, not merging post parameters>.

    Anna Bancroft wrote:
    > We have a csv file which is uploaded to the weblogic application server, written
    > to a temporary directory and then manipulated before being written to the database.
    > [...]
    It doesn't make sense. Who is replicating the file? How long does it take to process the file? Which plugin are you using as a proxy to the cluster?
    -- Prasad

  • Processing files in Sequence using FTP Adapter

    Hi Experts,
    I have searched several forums but I am not clear on how to process files with the FTP adapter based on timestamp, i.e. in FIFO sequence.
    I have files named with a customer prefix and a timestamp: customer<yyyyMMddHHmmss>. There are around 50 files like this on the FTP server.
    I need to process these files according to the timestamp and place them in the same processing sequence at the receiver end using the file adapter.
    If I specify the parameters in the sender FTP adapter as
    QoS = EOIO
    Queue name = ACCOUNT
    will these parameters make the processing follow the timestamp sequence?
    If the queue ID for inbound (SMQ2) is XBTI0_ACCOUNT, will it be the same for outbound (SMQ1)?
    Kindly suggest how to process the files in sequence according to the timestamp using the FTP adapter.
    Please reply.
    Thanks
    Sai

    Hi Shabarish,
    But this would require one more additional channel, so I think it will take more time to process.
    Let me clarify my question once again.
    I need to pick the files from the FTP server based on their timestamp, in sequence.
    The file names are like this: Customer<YYYYMMDDHHmmSS>.
    Suppose I have 3 files:
    Customer20050413044534
    Customer20050414053430
    Customer20050315034533
    I need to pick these files in this order and place them in the same order at the receiver end (file adapter):
    Customer20050315034533
    Customer20050413044534
    Customer20050414053430
    As I am using the FTP sender adapter, I cannot use processing sequence "By Date".
    Please suggest.
    Thanks
    Sai.

  • How to set file permission using Oracle FTP Adapter

    Hi,
    I am using Oracle SOA Suite 11.1.1.4. I am trying to put a file onto a Unix box using the Oracle FTP adapter. The file that gets written to the target system has file permissions RW/R/R. But this is a legacy system, and for them to consume this file they expect the file permissions to be RW/RW/R.
    They have set the .profile with the required permissions for the FTP user account that we are using. But still, when my BPEL process writes a file to the Unix box through the FTP adapter, the file permissions are RW/R/R.
    Is there any way to control file permissions while writing files using the FTP adapter? Any help would be highly appreciated.

    907597 wrote:
    > But these settings need to be done on the unix server.
    Yes, that's the way to go... There's no config for that on the FtpAdapter as far as I know...
    > This setting will enable the same configuration for all other ftp accounts on the server, which doesn't sound correct. Any other way of doing this? Or can this be done only for one ftp account?
    You have to check if your ftp server is capable of having a different umask for different ftp users... I believe most do not...
    http://h30499.www3.hp.com/t5/System-Administration/Setting-FTP-umask-per-user/td-p/2590101#.UMZ0TeEe7ng
    Cheers,
    Vlad

  • File size restriction on FTP Adapter (GET)

    A customer has deployed a BPEL process that we had created that uses the FTP adapter to download (GET) files. The customer is reporting that when they try to download a large file (about 800 MB in size), the process does not seem to kick in for more than one hour. At this point I don't know how good or bad the network speed is and how long an FTP download will take for a file of that size. But before looking into that, I wanted to know if there is any restriction on how big a file can be for it to be downloaded by the BPEL FTP adapter. If not, is there some way to find out what is going wrong with the download or to track the progress of the FTP GET?
    The user guide talks about increasing the value of the transaction-config timeout in server.xml for the File adapter. Is this applicable to the FTP adapter also, and will it work if I increase it?
    Thanks in advance for your attention.
    Jay

    Thanks for the pointer. The processes are deployed on a 10.1.3.5 instance.
    Of the info I found in that article, only the following seemed applicable:
    53.2.1 Setting Audit Levels from Oracle Enterprise Manager for Large Payload Processing
    I can ask the customer to turn off auditing if it is on. But I am skeptical that this will resolve this issue.
    The following seem to be applicable only to 11g version and I am not sure if they have any corresponding options in 10.1.3.5 (please let me know if I am mistaken):
    1. 53.2.4 Using Adapter Support for Streaming Large Payloads
    2. 53.2.6 Processing Large Documents in Oracle B2B
    Section 53.3.1 Opaque Schema for Processing Large Payloads says the following:
    There is a limitation when you use an opaque schema for processing large payloads. The entire data for the opaque translator is converted to a single Base64-encoded string. An opaque schema is generally used for smaller data. For large data, use the attachments feature instead of the opaque translator.
    Based on your experience, has the FTP Adapter ever been used to download files of 800MB or larger size on a 10.1.3.5 instance? I would like to know if there is any insurmountable system limitation with this approach or is it just a question of proper configuration or is this a bug that the FTP adapter product team can look at. I assume the attachment feature is not available in 10.1.3.5.
    The remaining sections don't seem applicable to the customer scenario.
    Can you please explain what ODI is?
    Appreciating your input,
    Jay
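    Regarding the transaction-config timeout mentioned in the question: on 10.1.3 that is the element of the same name in the OC4J server.xml. Purely as an illustration of the shape (the value is assumed and, as far as I recall, is in milliseconds; double-check the 10.1.3 documentation before changing it):

    <transaction-config timeout="7200000"/>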

  • CSV file with text qualifiers around each field causing error on Import

    Hi
    I have a csv file which I am trying to import; a one-line extract is shown below. It is delimited by semicolon and each field has a text qualifier around it.
    XXX Drinks Ltd;"BR01";"1";"001.2008";"2008";"Distribution";"-186";"-186";"-186"
    When importing I get the following issues:
    1) BPC doesn't seem to handle the text qualifier for the fields. For example, the "BR01" field above requires me to set up a conversion as follows: ""BR01"", i.e. I have to double the quotes because BPC adds them.
    2) Even after the required conversion, BPC does not like the double quotes around the amounts. Although validating the transform gives no error message, when running the import package I get the following message:
    Record Count: 1
    Accept Count: 1
    Reject Count: 0
    Skip Count  
    The number of failing rows exceeds the maximum specified. (Microsoft Data Transformation Services (DTS) Data Pump (8004202c): TransformCopy 'DTSTransformation__9' conversion error:  General conversion failure on column pair 1 (source column 'SIGNEDDATA' (DBTYPE_STR), destination column 'SIGNEDDATA' (DBTYPE_NUMERIC)).)
    Does this mean my source file can't have double quotes as a text qualifier?
    thanks in advance
    Scott Farrington

    James, thanks for your reply.
    Does that mean that BPC can't deal with the double quotes? I understand about removing them and using a comma as the delimiter, but this is the file format I have been given.
    What I really need to know is: given this format, using a transformation and/or mapping function, can I import the data the way it is?
    And I still need an answer to my second point, about the error message received when running the import package.
    Thanks
    Scott

  • Dynamic file name creation using FTP adapter wired from a mediator

    Hi All,
    My requirement is as follows:
    A mediator is wired to three FTP adapters to create three files.
    The file names are dynamic.
    In the mediator, the three routing rules, mappings and assignments for the directory and file name have been created.
    But out of the 3 files, 2 are being created with the names specified during FTP adapter configuration and only the last one is created with the dynamic value.
    Any help in this regard is highly appreciated.
    Thank you.
    Srivatsasa.

    Create a UDF in the mapping that takes the counter from the IDoc as an input parameter (the DynamicConfiguration classes come from com.sap.aii.mapping.api, which the UDF editor makes available):
    // get the dynamic configuration of the current message
    DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    // key for the adapter-specific attribute "FileName" of the File/FTP adapter
    DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
    // build the dynamic file name and write it into the message header
    String totalFilename = "AAAA_" + counter;
    conf.put(key, totalFilename);
    return totalFilename;
    Map the output of this UDF to the top node of the target.
    You will not be able to see the result in the test tab of the mapping, but it works end to end.
    Select "Adapter-Specific Message Attributes" in the receiver file adapter and tick the file name checkbox.
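    Note that the reply above describes the SAP PI approach. For the Oracle SOA composite described in the question, the usual lever is the jca.ftp.FileName header on each outbound reference: either assign a value to $out.property.jca.ftp.FileName in each of the three routing rules (the same idea as in the "How to find the File name" reply further up), or set it on the invoke if a BPEL component sits in between. A minimal, assumed sketch of the BPEL variant (BPEL 1.1 extension syntax from memory; partner link, operation and variable names are placeholders):

    <invoke name="WriteFile" partnerLink="FtpOut" operation="Put"
            inputVariable="writeRequest">
      <!-- drive the target file name from a string variable built at runtime -->
      <bpelx:inputProperty name="jca.ftp.FileName" variable="dynFileName"/>
    </invoke>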

  • CSV File attachment in Receiver Mail Adapter

    Hi Experts,
    Mine is a proxy-to-mail scenario. Information from the proxy needs to be converted to a CSV file and sent across as an attachment to multiple receivers.
    I would appreciate suggestions from the experts.
    Regards,
    Arnab

    Hi,
    First approach:
    1. In the client proxy report, call the 'GUI_UPLOAD' function module with exporting parameters filetype = 'txt' and filename.
    Second approach:
    2. You can use the MessageTransformBean module in your mail receiver communication channel.
    http://help.sap.com/saphelp_nw04/helpdata/en/57/0b2c4142aef623e10000000a155106/frameset.htm
    For sending to multiple receivers:
    3. In the receiver determination, select "Extended" as the type of receiver determination.
    http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/3343.
    regards,
    ganesh.

  • Get the File Name received from FTP Adapter

    Hi,
    How do I get the file name received through the FTP adapter? I have created a variable with the message type from ftpAdapterinboundheader.wsdl and mapped its filename attribute to a local string variable.
    But I did not receive the file name. The output in the audit trail is as follows:
    <?xml version="1.0" encoding="UTF-8" ?>
    - <Invoke_File_Process_FileProcess_InputVariable>
    - <part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="InputParameters">
    - <InputParameters xmlns="http://xmlns.oracle.com/pcbpel/adapter/db/PIERS/SP_FILE_PROCESS_UPDATE/">
    <AS_DIR xmlns="">I</AS_DIR>
    <AS_FILE_NAME xmlns="" />
    </InputParameters>
    </part>
    </Invoke_File_Process_FileProcess_InputVariable>
    Can anyone let me know how to get the received file name from the FTP adapter?
    Thanks

    You have to define a variable of type InboundHeader_msg. Then, in the receive activity, click on the Adapter tab and choose your newly created variable (InboundHeader_msg) as the header variable. Once you receive a message from FTP, you should see the fileName in this variable.
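    In source form that header-variable binding usually shows up as an attribute on the receive activity (10g syntax from memory; the partner link, operation and variable names are placeholders), and the file name then appears inside that header variable:

    <receive name="ReceiveFile" partnerLink="FtpIn" operation="Get"
             variable="ReceiveFile_InputVariable"
             bpelx:headerVariable="Variable_Header" createInstance="yes"/>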

  • Assign file permissions dynamically through FTP adapter

    In my composite, I have a requirement to assign file permissions dynamically.
    I know the static way of doing it is to use the 'Permission' property in the .jca file of the FTP adapter.
    Is there any way to pass the value for this property dynamically?
    I tried using jca.ftp.Permission, but it is not working.
    Any help on this is much appreciated.
    Thanks,
    Naveen Kumar T.

    Hi,
    The document with the properties is this... It doesn't mention any properties for permissions though...
    http://docs.oracle.com/cd/E25178_01/integration.1111/e10231/adptr_propertys.htm#CHDJBDHC
    Cheers,
    Vlad

  • Garbage name of the Archive File while using the FTP adapter

    Hello All,
    I am using the Oracle FTP adapter to poll a file. After polling, I delete the file from the current directory and archive it into another directory.
    The problem is that after archiving, the name of the file in the archive directory is changed to something like:
    d2wXEgGZrfypNsGa15uzOA==_20081015_082233_0015
    I want the archived file to keep its original name, plus a timestamp.
    Please help to solve this problem.
    Thanks
    Satendra Pare

    You cannot change the file name in the save dialog; it always uses the original file name of the form.
    But you can use scripting to save a form under another name.
    Sample.
    http://thelivecycle.blogspot.com/search/label/Save

  • Zero byte files not transferred with FTP Adapter

    All,
    I am seeing that zero-byte files are not being transferred using the FTP adapter (SFTP type). Is there a known bug around this? Please respond if anybody has seen this behavior and resolved it. I am using 10.1.3.3.0.
    Thanks,
    Dipal

    Thanks Marc,
    I still have doubts that that is the case. When I tested the same thing from 10.1.3.3.1 on a Windows platform, with the FTP server also on Windows, zero-byte files transferred OK. Do you think this has to do with the OS or the SOA version?
    Thanks,
    Dipal
