File adapter SyncRead operation adds "???" to first record of CSV file

Hi All,
One interface reads a CSV file (containing two fields only) using the SyncRead operation of the File Adapter. I found that when the file adapter parses the data, it appends "???" to the first field of the first record, which breaks further data validation in the process.
Test.csv file
123456,159357
147258,987654
Trace of the FileAdapter:
<messages>
<Invoke_FILE_Inbound_InputVariable>
<part  name="Empty">
<empty/>  
</part>
</Invoke_FILE_Inbound_InputVariable>
<Invoke_FILE_Inbound_OutputVariable>
<part  name="body">
<Root-Element>
<TestElements>
<Emplid>���123456</Emplid>  
<SupId>159357</SupId>
</TestElements>
<HRRespIDElements>
<Emplid>147258</Emplid>  
<SupId>987654</SupId>
</HRRespIDElements>
</Root-Element>
</part>
</Invoke_FILE_Inbound_OutputVariable>
</messages>
Can anyone help me understand what the reason is, or whether I have missed anything?
Thanks & Regards,
Sharmistha

Unfortunately, this is documented behaviour (solution/workaround, anyone?!):
Elapsed Time Exceeds:
Specify a time which, when exceeded, causes a new outgoing file to be created.
Note:
The Elapsed Time Exceeds batching criteria is evaluated and a new outgoing file is created, only when an invocation happens.
For example, if you specify that elapsed time exceeds 15 seconds, then the first message that is received is not written out, even after 15 seconds, as batching conditions are not valid. If a second message is received, then batching conditions become valid for the first one, and an output file is created when the elapsed time exceeds 15 seconds.
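For reference, a minimal sketch of how these batching criteria typically appear as properties in an outbound file adapter's .jca file; the directory, naming convention, and values below are placeholders, not taken from this thread:

    <!-- Hypothetical outbound interaction spec; only batching-related
         properties are shown, everything else depends on the project. -->
    <interaction-spec className="oracle.tip.adapter.file.outbound.FileInteractionSpec">
      <property name="PhysicalDirectory" value="/out"/>
      <property name="FileNamingConvention" value="out_%SEQ%.csv"/>
      <property name="ElapsedTime" value="15"/>    <!-- seconds -->
      <property name="NumberMessages" value="10"/> <!-- messages per batch -->
      <property name="FileSize" value="1024"/>     <!-- kilobytes -->
    </interaction-spec>

As the quoted note says, the ElapsedTime criterion is only evaluated when a new invocation arrives, so a lone message can sit past the configured time.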

Similar Messages

  • SXDA split files for RMDATIND - First record in sequential file & not a session record (type 0)

    Hi gurus,
I'm trying to perform a mass upload of material master records, and for this I have set up a Data Transfer Project in SXDA with the purpose of splitting an LSMW input file into multiple files. The file split task is using the Data Load Program:
    Object Type: BUS1001006
    Obj. description: AD Standard Material
    Program type: DINP
    Program: RMDATIND
The problem I'm facing is that once the .conv (Converted Data) file is split into multiple files, these files are transferred without a session record. This causes the following error when running program RMDATIND: "First record in sequential file & not a session record (type 0)".
So the question is: how can I specify in SXDA that I want to keep that session record in all my split files?
    Thanks in advance for your help!

Hi Chris,
try to recreate the logical file paths/files for the converted data.
regards
Prabhu

  • Count total no of records in csv file

hello
I made a function which counts the total number of records in a CSV file,
but it always returns 0: it goes into the exception handler when no data is found, even though
it also displays the total number of records in another parameter.
The code is here:
CREATE OR REPLACE FUNCTION COUNT_RECORDS_CSV (storeid1 NUMBER) RETURN NUMBER
IS
  wfile_handl UTL_FILE.FILE_TYPE;
  v_file      VARCHAR2(100);
  s           VARCHAR2(32767);
  c           NUMBER := 0;
  v_dir       VARCHAR2(50) := 'REPORTS';
BEGIN
  v_file := 'DATA_' || storeid1 || '.CSV';
  DBMS_OUTPUT.PUT_LINE('V_FILE=' || v_file);
  wfile_handl := UTL_FILE.FOPEN(v_dir, 'DATA_5138.CSV', 'R');
  LOOP
    UTL_FILE.GET_LINE(wfile_handl, s);
    c := c + 1;
  END LOOP;
  UTL_FILE.FCLOSE(wfile_handl);
  RETURN c;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN 0;
END COUNT_RECORDS_CSV;
If I print the value of C it shows me the correct result, but the function always returns 0. Can anybody tell me why this is happening?
    thanks

The fact that you have written this:
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    RETURN 0;
means you didn't bother to read the UTL_FILE documentation: UTL_FILE.GET_LINE raises NO_DATA_FOUND when it reaches the end of the file, so control always lands in this handler and the function returns 0 instead of C.
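
A minimal corrected sketch (assuming the REPORTS directory object exists; the handler now closes the file and returns the accumulated count instead of 0, and the computed file name is actually used):

    CREATE OR REPLACE FUNCTION count_records_csv (storeid1 NUMBER) RETURN NUMBER
    IS
      wfile_handl UTL_FILE.FILE_TYPE;
      s           VARCHAR2(32767);
      c           NUMBER := 0;
    BEGIN
      -- Open the computed file name rather than a hard-coded one.
      wfile_handl := UTL_FILE.FOPEN('REPORTS', 'DATA_' || storeid1 || '.CSV', 'R');
      LOOP
        UTL_FILE.GET_LINE(wfile_handl, s); -- raises NO_DATA_FOUND at end of file
        c := c + 1;
      END LOOP;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        UTL_FILE.FCLOSE(wfile_handl);
        RETURN c; -- the count accumulated so far, not 0
    END count_records_csv;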

  • File Adapter read Operation

I am using the file adapter to read files from a given location.
With this read operation it picks up all the files at once.
I want to configure the file adapter so that it picks one file at a time from the given location, with a given polling interval.
Can we do this with file adapters in BPEL?
    I am using SOA Suite 11g.
    Thanks,
    Arun Jadhav

    Hi, in order to achieve that you need to set the "SingleThreadModel" and "MaxRaiseSize" properties for your file adapter.
    Edit the adapter's jca file and add the following properties:
    <property name="SingleThreadModel" value="true"/>
    <property name="MaxRaiseSize" value="1"/>
You can also set these properties through JDeveloper, by opening composite.xml, selecting the adapter, and then changing the properties through the Properties panel.
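For illustration, a hypothetical fragment of the read adapter's .jca file with these properties in place (the directory, file filter, and polling values are placeholders; the class name is the one usually generated for an inbound file adapter):

    <activation-spec className="oracle.tip.adapter.file.inbound.FileActivationSpec">
      <property name="PhysicalDirectory" value="/incoming"/>
      <property name="IncludeFiles" value=".*\.csv"/>
      <property name="PollingFrequency" value="60"/>
      <!-- one polling thread, raising at most one file per poll -->
      <property name="SingleThreadModel" value="true"/>
      <property name="MaxRaiseSize" value="1"/>
    </activation-spec>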

  • Sender file adapter Need to Get Only one record Using FCC

    Hi All,
I am using a File to ABAP Proxy interface, where I need to trigger my proxy. So in my sender file adapter I need to configure it such that it takes only one record from the whole file. I am using the key field parameters in FCC, but it is taking all the records, even those which do not have the key fields.
    please help me
    Thanking you
    Sridhar

    2A64310       1 6V83970       03751650016001154000                    1 6V9961XT-3   13236157001160
    2A64310       1 6V83970                        000                    1 6V9962XT-3   23236162
    2A64730       1 6V83970       03751650016000106000                    1 6V9962XT-3   13236162000112
    2A64730       1 6V83970                        000                    1 6V9961XT-3   23236157
    2A64741 6V99481 6V83971 4S541403751650016000152360                    1 6V9964XT-3   13236168000166
    2A64741 6V99471 6V83971 4S5414                 000                    1 6V9963XT-3   23236167
    2A64772 6V99492 6V83982 3S860605001650017000244000                    2 6V9965XT-3   13289090000248
    2A65690       1 6V97460       07501226872000110000                    11242153L&MP   1
    2A65690       1 6V97460                        000                    11242121L&MP   2
    EOF
This is a sample file. I need just any one line, because I only need to trigger a proxy. I am not going to take the entire file, because in real time I will be getting a 100 MB file, which would be a performance issue.
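For context, sender-side key-field handling in File Content Conversion is normally driven by parameters along these lines; every structure, field name, and width below is hypothetical, since the actual FCC configuration isn't shown in the thread:

    Recordset Structure   : Row,*
    Key Field Name        : RecType             (hypothetical field name)
    Row.fieldFixedLengths : 1,7,...             (hypothetical widths)
    Row.fieldNames        : RecType,Field2,...  (hypothetical)
    Row.keyFieldValue     : 2                   (keep only rows whose RecType is "2")

This is only a sketch of the parameter style being discussed, not a fix for the behaviour described above.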

  • File Adapter Write Operation inserts a new line at the end

    Hi,
I am using the write operation of the File Adapter from a BPEL process. The data is written successfully and the file is created in the location specified in the adapter WSDL. But the created file has an extra new line at the end. This new line has to be avoided when writing data to a file.
    Has anyone faced this and solved this ?
I am on SOA 10.1.3.4.
    Cheers,
    - AR

    It is a bug in 10g and will be resolved in one of the versions of 11g.
    Cheers,
    -AR

  • File adapter Read operation not working.

    Hi All,
I have a simple file adapter which reads from a CSV file and inserts into a table. I have deployed the application successfully, but it is not reading from the specified location, and I am not getting any error in the EM console. Below is the XSD that I used.
    <?xml version="1.0" encoding="UTF-8" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
                xmlns:tns="http://TargetNamespace.com/InboundService"
                targetNamespace="http://TargetNamespace.com/InboundService"
                elementFormDefault="qualified"
                attributeFormDefault="unqualified"
                nxsd:version="NXSD"
                nxsd:stream="chars"
                nxsd:encoding="US-ASCII"
    >
      <xsd:element name="Root-Element">
        <xsd:complexType>
          <xsd:choice minOccurs="1" maxOccurs="unbounded" nxsd:choiceCondition="terminated" nxsd:terminatedBy=",">
            <xsd:element name="Student" nxsd:conditionValue="R">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="Roll" type="xsd:int" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
                  <xsd:element name="Name" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
                  <xsd:element name="Flag" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;" />
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
          </xsd:choice>
        </xsd:complexType>
      </xsd:element>
    This is the file that I want to read:
    5,Ram,N
    6,Shyam,N
    Please help friends.

You should work through the samples given in Oracle's documentation guide for modelling native XSDs; that will give you a better idea of how to go about native XSD schemas. Anyway, I looked into your schema. You need to make 2 changes to make this XSD work:
(1) Change the "xsd:choice" tags to "xsd:sequence".
(2) Include a closing "xsd:schema" tag to make the XSD well-formed.
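For reference, an untested sketch of the schema with both changes applied (element and field names kept from the original post; the nxsd:choiceCondition and nxsd:conditionValue attributes are dropped, since a plain sequence does not need them):

    <?xml version="1.0" encoding="UTF-8" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd"
                xmlns:tns="http://TargetNamespace.com/InboundService"
                targetNamespace="http://TargetNamespace.com/InboundService"
                elementFormDefault="qualified"
                attributeFormDefault="unqualified"
                nxsd:version="NXSD"
                nxsd:stream="chars"
                nxsd:encoding="US-ASCII">
      <xsd:element name="Root-Element">
        <xsd:complexType>
          <xsd:sequence>
            <xsd:element name="Student" minOccurs="1" maxOccurs="unbounded">
              <xsd:complexType>
                <xsd:sequence>
                  <xsd:element name="Roll" type="xsd:int" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;"/>
                  <xsd:element name="Name" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;"/>
                  <xsd:element name="Flag" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;"/>
                </xsd:sequence>
              </xsd:complexType>
            </xsd:element>
          </xsd:sequence>
        </xsd:complexType>
      </xsd:element>
    </xsd:schema>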

  • How to get file name from file adapter (Read Operation)

    Hi All,
I am reading files from a local location. In the .jca file I set the FileName property to *.* (which reads all files), and now I want to read the incoming file's name inside BPEL and copy it to some variable. Is this possible?

Yes, it's possible.
In your receive activity, set the below property in its Properties tab:
    <bpelx:property name="jca.file.FileName" variable="FileName"/>
Before that, make sure to create a variable "FileName" of string type.
Whatever file your file adapter reads, its name will be stored in this variable.
    Hope this helps.
    Regards,
    Karan
    http://learn-oraclesoa.blogspot.com/
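
Putting that together, the BPEL source might look roughly like this (the receive, partner link, and variable names other than "FileName" are hypothetical; the property line is exactly the one above):

    <variable name="FileName" type="xsd:string"/>
    ...
    <receive name="ReceiveFile" partnerLink="ReadFileService"
             operation="Read" variable="Read_InputVariable" createInstance="yes">
      <bpelx:property name="jca.file.FileName" variable="FileName"/>
    </receive>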

  • Loading records from .csv file to SAP table via SAP Program

    Hi,
I have a .csv file with 132,869 records and I am trying to load it into an SAP table with a customized SAP program.
After executing the program, only 99,999 records are loaded into the table.
Is there some setting that defines how many records can be loaded into a table? Or what else could be the problem?
Please advise.
    Thanks!!!

hi Arun,
A datasource needs an extract structure to fetch data. It is nothing but a temp table to hold data.
First you need to create a table in SE11 with fields matching those coming from the CSV file.
Then you need to write a report program to read your CSV file and populate your table in BW.
Then you can create a datasource on top of this table.
After that, replicate and load the data into the PSA and on to the upper flow.
Regards,
Jaya Tiwari

  • Trying to omit first line of CSV file (the header)

    Hello everybody,
Newbie here having a little problem. I have this program below that reads in a CSV file. My test CSV file has the following structure:
    ID, LastName, FirstName, Phone, Email,
    2345, Yez, Wei do, 456-789-0123, [email protected],
    2744, Avanto, Aldo, 562-593-6500, [email protected],
    1212, Dewy, Cheatem, 123-456-7890, [email protected],
    1234, No, Wei Foo, 123-456-7890, [email protected],
The first line is my header (here is the heart of my problem).
I have the program reading in the file and then sorting it. The problem I have is that the first line, my header, gets sorted with everything else. Here is my feeble attempt, with the resulting error:
import java.io.*;
import java.util.*;
public class File2Array {
    public static void main(String[] args) {
        File file = new File("person_test1.csv"); // File to read in
        ArrayList persons = (ArrayList) getPersons(file); // call to arraylist
        Collections.sort(persons);
        for (Iterator i = persons.iterator(); i.hasNext(); )
            System.out.println(i.next());
    }
    public static List getPersons(File file) {
        boolean first = true; // NEW PIECE OF CODE
        ArrayList persons = new ArrayList();
        try {
            BufferedReader buf = new BufferedReader(new FileReader(file));
            while (buf.ready()) {
                String line = buf.readLine();
                // NEW PIECE OF CODE
                while ((line = buf.readLine()) != null) {
                    if (first) {
                        first = false;
                        continue;
                    }
                    // ORIGINAL IF
                    // if (!line.equals(""))
                    // NEW IF
                    if (line != null && !line.equals("")) {
                        // END OF NEW CODE
                        StringTokenizer tk = new StringTokenizer(line, ",");
                        ArrayList params = new ArrayList();
                        while (tk.hasMoreElements())
                            params.add(tk.nextToken());
                        Person p = new Person(
                                (String) params.get(0),
                                (String) params.get(1),
                                (String) params.get(2),
                                (String) params.get(3),
                                (String) params.get(4));
                        persons.add(p);
                    }
                }
            }
        } catch (IOException ioe) {
            System.err.println(ioe);
        }
        return persons;
    }
}
class Person implements Comparable {
    private String ID;
    private String LastName;
    private String FirstName;
    private String Phone;
    private String Email;
    public Person(
            String ID,
            String LastName,
            String FirstName,
            String Phone,
            String Email) {
        if (FirstName == null || LastName == null)
            throw new NullPointerException();
        this.ID = ID;
        this.LastName = LastName;
        this.FirstName = FirstName;
        this.Phone = Phone;
        this.Email = Email;
    }
    public String toString() {
        StringBuffer sb = new StringBuffer();
        sb.append(ID);
        sb.append(" ");
        sb.append(LastName);
        sb.append(" ");
        sb.append(FirstName);
        sb.append(" ");
        sb.append(Phone);
        sb.append(" ");
        sb.append(Email);
        sb.append(" ");
        return sb.toString();
    }
    public boolean equals(Object o) {
        if (!(o instanceof Person))
            return false;
        Person p = (Person) o;
        return p.FirstName.equals(FirstName) &&
               p.LastName.equals(LastName);
    }
    public int hashCode() {
        return 31 * FirstName.hashCode() + LastName.hashCode();
    }
    public int compareTo(Object o) {
        Person p = (Person) o;
        int lastCmp = LastName.compareTo(p.LastName);
        return (lastCmp != 0 ? lastCmp :
                FirstName.compareTo(p.FirstName));
    }
}
This code compiles and runs but does not produce any output, and I don't understand why.
Question: what is the proper way to have the code omit the first line of a CSV file?
By the way, I do have another file called person.java that goes with this, but I didn't think I had to post it.
Also, am I posting correctly? Am I putting in too much code?
Thank you!!

Hot diggity! jverd,
I got it to work!! Here's what I did, all thanks to you :-)
ArrayList persons = new ArrayList();
try {
    BufferedReader buf = new BufferedReader(new FileReader(file));
    // read through file and populate array
    // while (buf.ready()) {
    String firstline = buf.readLine(); // gets header, renamed line to firstline
    String line;                       // added this line
    System.out.println(firstline);     // added this to print out firstline
    // try this
    while ((line = buf.readLine()) != null) {
        /* commented out this portion
        if (first) {
            first = false;
            continue;
        }
        */
        // end try this
        // if (line != null && !line.equals(""))   // down to here
        if (!line.equals("")) {            // added this
            StringTokenizer tk = new StringTokenizer(line, ",");
            ArrayList params = new ArrayList();
            while (tk.hasMoreElements())
                params.add(tk.nextToken());
            Person p = new Person(
                    (String) params.get(0),
                    (String) params.get(1),
                    (String) params.get(2),
                    (String) params.get(3),
                    (String) params.get(4));
            persons.add(p);
        }
    }
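
For what it's worth, a more compact sketch of the same header-skipping idea (hypothetical, reusing the poster's Person class and persons list):

    BufferedReader buf = new BufferedReader(new FileReader(file));
    buf.readLine(); // read and throw away the header row
    String line;
    while ((line = buf.readLine()) != null) {
        if (!line.equals("")) {
            String[] f = line.split(",");
            persons.add(new Person(f[0].trim(), f[1].trim(), f[2].trim(),
                                   f[3].trim(), f[4].trim()));
        }
    }
    buf.close();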

  • Checking for existence of a record in csv file

Hey all, I have two CSV files residing on the server that I will be processing using the UTL_FILE package. One of these files (File A) has only one column, and the other (File B) has over ten. The data in File A is the same as in column 2 of File B. File A is mainly there for lookup purposes, whereas File B is the one with all the data that will be processed. The following example should describe it better:
File A
    Number
    "TR_56575"
    "TY_76756"
    etc
    File B
    Column1, Number, Column3, Column4
    "Mine","TR_56575","uhsht","76744"
    "Yours","TY_76756","nghdjd","45645"
What I have to do is check whether a Number in File B exists anywhere in File A. If it does, I just skip that record and move on to the next. So basically, I wanted to find out if there is any way to compare the two columns on the server side. I know there is a dbms_lob.fileexists function to check the existence of a file on a server; I just wanted to see if there is an extension to that, or something that checks for the existence of a string in a file. Any feedback would be appreciated. Thanks.

Yeah, I thought about that. It would be the ideal way to go; however, in this case I already have a PL/SQL procedure that grabs File B and processes it from the server. It works fine, but now that this extra logic and file have been added, where I have to check for the existence of a corresponding record in File A, I'm just looking for something to add to my code to make the comparison on the fly and leave the rest the same.
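
One way to do that on the fly, sketched under the assumption that both files sit in the same directory object (the directory and file names here are hypothetical): read File A once into an associative array at the start of the procedure, then probe the array for each File B record.

    DECLARE
      TYPE t_lookup IS TABLE OF BOOLEAN INDEX BY VARCHAR2(100);
      lookup t_lookup;
      f      UTL_FILE.FILE_TYPE;
      s      VARCHAR2(32767);
    BEGIN
      -- Pass 1: load every Number from File A into an in-memory lookup.
      f := UTL_FILE.FOPEN('MY_DIR', 'file_a.csv', 'R');
      BEGIN
        LOOP
          UTL_FILE.GET_LINE(f, s);
          lookup(TRIM(BOTH '"' FROM TRIM(s))) := TRUE; -- strip the quotes
        END LOOP;
      EXCEPTION
        WHEN NO_DATA_FOUND THEN
          UTL_FILE.FCLOSE(f);
      END;
      -- Pass 2, inside the existing File B loop: skip matching records.
      -- IF lookup.EXISTS(v_number) THEN NULL; -- skip this record
      -- ELSE ... process as before ... END IF;
    END;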

  • Inserting Record into CSV file from BizTalk Orchestration

    Scenario:
1. Receive the file from the source system via a receive pipeline.
2. In the orchestration, extract some values like ENO, Ename, Salary etc. These values are to be added to a CSV file from an Expression shape. How do I append/add emp records to the CSV without overwriting the existing rows?
Ex: if we submit 10 files, then the CSV file should contain 10 rows.
Let me know how to create a CSV file from an orchestration and how to add rows to that CSV.
    Regards BizTalkWorship

    Simple.
    Receive the message through a Receive Port/Location.
    Create a flat-file schema representing the CSV file structure. Ensure each row is delimited by “{CR}{LF}”. 
    This flat-file schema should only contain the element which you want to see in the destination CSV file like ENO,Ename,Salary etc.
    Have a map where the source schema should be the one which represents the received file and destination schema should be the one which is above created flat-file schema.
Map the source schema to the destination schema, mapping the fields ENO, Ename, Salary etc.
Have a custom send pipeline with a flat-file assembler component in it. Use this send pipeline in the send port.
In the send port, configure a send filter like “BTS.ReceivePortName == YourReceivePortName”, and configure the send port’s “Outbound Maps” to the map created in the above step.
Key point: in your send port, set the “Copy Mode” property to “Append” from the default “Create New”.
With the “Copy Mode” property configured to “Append”, the output is appended to the existing file. Since in your flat-file schema each record is delimited by “{CR}{LF}”, the records accumulate one per line. So if 10 files are received, instead of 10 output files you will have 1 CSV file with 10 rows.
If you want to construct the message in the orchestration, as you do, rather than mapping in the send port’s outbound map, you can still do that.
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply.

  • File Adapter going into infinite mode while polling for the file

    Hi,
    I am facing an issue while the file adapter is reading the data from the file.
If the file from which it is reading has been deleted, the adapter keeps polling for it indefinitely and does not raise any error message. Is there any way that, after polling for some time period, we can surface a message that the particular file is not available, and have the file adapter stop polling for it? Anyone have ideas about this? Any pointers will be of great help.
    Thanks
    Vinay
    Message was edited by:
    soachd

    Hi soachd,
As far as I understand, polling is an infinite process. You need to adopt this strategy if you are expecting something to happen all the time, or when you want to be ready whenever an expected event happens. This provides a real-time solution, but with the downside of infinite execution.
    Hope it helps.
    Kalyan.

  • Duplicate record in CSV file

    Hi
Needed your inputs to take this scenario forward.
It is an IDOC-to-CSV-file scenario where WBS elements are sent across in the file to the legacy system.
9 fields are sent in the file, and of those, 2 fields are mapped with constants.
The requirement here is to duplicate each record in the file with only one constant changed. My CSV file should look like this for each IDOC occurrence:
         AAA,0123.45,Edgbaston,1234,WBS_Code1,,Airline,
         AAA,0123.45,Edgbaston,1234,WBS_Code2,,Airline,
It would just be a duplication of the record with only one field changed (WBS_Code1 and WBS_Code2), so that each IDOC occurrence comprises one WBS element's information sent from the IDOC, and the generated file has 2 lines of the same record.
    Thanks
    Anusha

Hi
I'm unable to follow you. I have my receiver structure as mentioned below; only the last 2 fields are populated from the IDOC, and the rest are all mapped with constants.
<?xml version="1.0" encoding="UTF-8"?>
<ns0:MT_WBSCodes xmlns:ns0="http://us.abc.com/PI/aaaat">
  <WBSFile> <!-- 0..unbounded -->
    <RecordTypeIdentifier>FIELDLISTVALUES</RecordTypeIdentifier>
    <CompanyID>UNIT_TOP</CompanyID>
    <CustomFieldID>CL_WBS_CODE_1</CustomFieldID>
    <CustomFieldValue1></CustomFieldValue1>
    <CustomFieldValue2></CustomFieldValue2>
    <Status></Status>
  </WBSFile>
  <WBSFile> <!-- 0..unbounded -->
    <RecordTypeIdentifier>FIELDLISTVALUES</RecordTypeIdentifier>
    <CompanyID>UNIT_TOP</CompanyID>
    <CustomFieldID>CL_WBS_CODE_2</CustomFieldID>
    <CustomFieldValue1> - </CustomFieldValue1>
    <CustomFieldValue2></CustomFieldValue2>
    <Status></Status>
  </WBSFile>
</ns0:MT_WBSCodes>
    Edited by: Anusha  Ramsiva on Sep 3, 2010 4:58 PM

  • Power shell script to add multiple aliases with a .csv file.

I have an email address, let's say [email protected], and I need 1000 aliases added to it. For example [email protected], [email protected], etc. I have a .csv file with 990 columns containing just project1, project2, and so on.
Is there a way to run a command in the Exchange Management Shell to import all the aliases I need in a few commands? Can someone point me in the right direction if I am in the wrong forum? Thanks
Not sure if this is correct or not:
> $mbx = Get-Mailbox Project
> Import-CSV "C:\SomePath\wherever.csv" | foreach { $mbx.EmailAddresses += $_.SmtpAddress }
> Set-Mailbox project -EmailAddresses $mbx.EmailAddresses

    Help Import-Csv -FULL
    $aliases=Import-Csv aliases.csv
    ¯\_(ツ)_/¯
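
Putting the two hints together, a hypothetical sketch (assumes the CSV has a header column named Alias with values like project1, project2, and that this runs in the Exchange Management Shell):

    # Load the alias list from the CSV (see: help Import-Csv -Full).
    $aliases = Import-Csv "C:\SomePath\aliases.csv"

    # Accumulate the new smtp addresses onto the existing mailbox object.
    $mbx = Get-Mailbox "Project"
    foreach ($row in $aliases) {
        $mbx.EmailAddresses += "smtp:$($row.Alias)@email.com"
    }

    # Write the updated address collection back in one call.
    Set-Mailbox "Project" -EmailAddresses $mbx.EmailAddresses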
