Read Fixed length file

I have a text file that has data in specific positions of the file.
i.e.:
Pos 1-20 Name
Pos 21-30 Division
Pos 31-38 Birthdate
I saved the file in the DB and want to process it to read the data in the specific positions I listed above.
I started with the code below, but I'm not sure how to get to the specific positions in the file.
DECLARE
    v_blob_data BLOB;
    v_blob_len  NUMBER;
    v_clob_data CLOB := 'anything';
    v_clob_len  NUMBER;
    dest_offset NUMBER := 1;
    src_offset  NUMBER := 1;
    blob_csid   NUMBER := dbms_lob.default_csid;
    lang_ctx    INTEGER := dbms_lob.default_lang_ctx;
    warning     INTEGER;
BEGIN
    -- note: this select needs a WHERE clause (e.g. on name or id) so that
    -- only the uploaded file's row is returned
    select blob_content into v_blob_data from wwv_flow_files;
    v_blob_len := dbms_lob.getlength(v_blob_data);
    dbms_lob.converttoclob(v_clob_data, v_blob_data, v_blob_len, dest_offset, src_offset, blob_csid, lang_ctx, warning);
END;

bobmagan wrote:
Unfortunately, I have to do this inside of an APEX page using pl/sql.

What is unfortunate about that!? Apex is good. PL/SQL is awesome. :-)
I would not have it any other way. Seriously.
If the file is uploaded into Apex, then you can still use external tables.
The basic steps are:
1. spool the uploaded file (from the Apex flow files table) to the local file system of the Oracle server using UTL_FILE (see the sketch after this list)
2. alter the external table and specify the newly created file's location
3. select from the external data to load the file contents
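As a rough idea of what step 1 can look like, here is a minimal sketch that spools the CLOB to a file in chunks (the directory object DATA_DIR and the file name upload.dat are assumptions for illustration, and v_clob_data is assumed to already hold the converted file contents from your code above):
DECLARE
    v_clob_data CLOB;                  -- assumed to already hold the converted file contents
    v_file      UTL_FILE.FILE_TYPE;
    v_amount    PLS_INTEGER := 32000;  -- chunk size, below the 32767 byte UTL_FILE limit
    v_offset    PLS_INTEGER := 1;
    v_len       PLS_INTEGER;
BEGIN
    v_len  := dbms_lob.getlength(v_clob_data);
    v_file := utl_file.fopen('DATA_DIR', 'upload.dat', 'w', 32767);
    WHILE v_offset <= v_len LOOP
        utl_file.put(v_file, dbms_lob.substr(v_clob_data, v_amount, v_offset));
        utl_file.fflush(v_file);
        v_offset := v_offset + v_amount;
    END LOOP;
    utl_file.fclose(v_file);
END;
From there the external table can simply point at the spooled file.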
Alternatively, you need to write your own CLOB parser - that can read lines from the CLOB, parse these, and output the tokens to use (e.g. token 1 is the invoice id, token 2 the invoice date, etc).
This all is very much doable in PL/SQL. And Apex is PL/SQL - and allows you to call your own PL/SQL packages and code to process data.
Keep in mind that the simpler a process is, the easier it is to write, maintain and debug. Doing manual parsing is not something that should be treated lightly, as it can become quite complex.
Dealing with an external table, and its easy definition of the format of the file content to load, is a lot easier and more flexible. The only downside is having to spool the CLOB to an external file in order to use the external table... However, I do recall looking a few months ago at the "cartridge" used by Oracle to load external files as an external table, and I read/saw something that seemed to also support CLOBs?
So I'm not sure if one can now define an external table and use it to load CLOB contents - worth looking at... and even if not, it is also worth considering spooling a CLOB to an external file for loading via the external table interface.
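To give an idea of the external table side, a definition matching the layout in your question (name 1-20, division 21-30, birthdate 31-38) could look something like this - the table name, directory object and file name are just placeholders:
CREATE TABLE person_ext (
    name      VARCHAR2(20),
    division  VARCHAR2(10),
    birthdate VARCHAR2(8)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS (
            name      POSITION(1:20)  CHAR(20),
            division  POSITION(21:30) CHAR(10),
            birthdate POSITION(31:38) CHAR(8)
        )
    )
    LOCATION ('upload.dat')
)
REJECT LIMIT UNLIMITED;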
Unless of course you're keen to put your regex kung fu skills to use in writing a custom CLOB parser for the file contents.
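And if you do go the parser route, a minimal sketch (assuming the CLOB from your code above, LF line terminators, and the positions you listed) could be along these lines:
DECLARE
    v_clob_data CLOB;  -- the CLOB produced by dbms_lob.converttoclob above
    v_pos       PLS_INTEGER := 1;
    v_eol       PLS_INTEGER;
    v_line      VARCHAR2(4000);
    v_name      VARCHAR2(20);
    v_division  VARCHAR2(10);
    v_birthdate VARCHAR2(8);
BEGIN
    WHILE v_pos <= dbms_lob.getlength(v_clob_data) LOOP
        -- find the next line terminator (0 means last line without a trailing LF)
        v_eol := dbms_lob.instr(v_clob_data, chr(10), v_pos);
        IF v_eol = 0 THEN
            v_eol := dbms_lob.getlength(v_clob_data) + 1;
        END IF;
        IF v_eol > v_pos THEN  -- skip empty lines
            v_line := rtrim(dbms_lob.substr(v_clob_data, v_eol - v_pos, v_pos), chr(13));
            -- fixed positions from your question
            v_name      := substr(v_line, 1, 20);
            v_division  := substr(v_line, 21, 10);
            v_birthdate := substr(v_line, 31, 8);
            -- insert into your target table (or collection) here
        END IF;
        v_pos := v_eol + 1;
    END LOOP;
END;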

Similar Messages

  • Reading fixed length file with different record types

    Hi,
    I need to read a fixed-length file with different record types, but the record identifier is in the 31st position, not the 1st position.
    If I give 31 as the position in the File Adapter wizard, BPEL takes the whole of positions 1-31 as the identifier.
    How do we read such files?
    Thanks
    Ravdeep

    Hi,
    You cannot use the default wizard for this. Use something like this: nxsd:lookAhead="30" nxsd:lookFor="S"
    Have a look at the link below; it has some examples:
    http://download.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/nfb.htm

  • Please help with reading fixed-length file

    I am reading a fixed-length file from a batch program and I need to read through each line, parsing out records in each, and replacing the first character in each line after I finish processing. I am using a RandomAccessFile, but I am not sure if this is best for my needs.
    Here is essentially the code that I am using:
    RandomAccessFile raf = new RandomAccessFile(file, "rw");  // file is the File (or path) to process
    int recordLength = 65;          // fixed record length, including the line terminator
    int counter = 0;
    String test = "X";
    String line;
    while ((line = raf.readLine()) != null) {
        if (!line.substring(0, 1).equals(test)) {
            String var1 = line.substring(7, 22);
            String var2 = line.substring(23, 41);
            String var3 = line.substring(42, 52);
            String var4 = line.substring(53, 54);
            //PROCESSING
            raf.seek(counter * recordLength);        // back to the first character of this record
            raf.writeBytes(test);                    // flag the record as processed
            raf.seek((counter + 1) * recordLength);  // continue with the next record
        }
        counter++;
    }

    Thanks for the suggestion. I think that the BufferedReader is the way that I will go. Do you happen to know the best way to update the 1st char of each line as I'm reading each line using BufferedWriter?
    Hi,
    To update the first char of each line, you could read each line of the file into a StringBuffer, modify the value, and then write the value back out to the file.
    I had to do something similar.
    eg.
    StringBuffer buf = new StringBuffer();
    BufferedReader in = new BufferedReader(
        // data file source
        new FileReader(db_location));
    String s;
    // while there are still items to read
    while ((s = in.readLine()) != null) {
        System.out.println("reading file");
        // split into component parts so we can modify
        StringTokenizer t = new StringTokenizer(s, ",");
        int id = Integer.parseInt(t.nextToken());
        String code = t.nextToken();
        int quantity = Integer.parseInt(t.nextToken());
        // copy component parts into StringBuffer
        buf.append(id + ",");
        buf.append(code + ",");
        buf.append("\n");
        // use StringBuffer methods here to isolate first character
        // use StringBuffer methods to replace first value of input in buffer
        // eg.
        //     buf.replace(index, length, "," + new_code);
    }
    in.close();
    // print out the StringBuffer to rewrite updated file
    PrintWriter output = new PrintWriter(
        new BufferedWriter(
            new FileWriter(db_location, false)));
    output.println(buf.toString());
    output.close();
    As always, there is more than one way to skin a cat. But this is my way.
    I think that JDK 1.4 has some new methods, but not 100% sure of that.
    //Daniel.

  • Reading a fixed length file

    Hi All,
    I am trying to read a fixed length file with .DAT extension through an FTP Adapter. I am using a read(polling) operation.
    In the file there are three record types (H, D, T). The length of all three records is more than 100. When I try to build a schema for that file using the native format builder, the maximum position that I get is 100, and if I manually try to put the position beyond 100 (say 120) while building the schema and then run the composite using the generated schema, the file is not polled from the location. So could anyone help me clarify my doubts regarding reading of fixed-length files:
    a. Can I read a fixed-length file which contains records whose positions go beyond 100?
    b. If yes, then how do I do that?
    Any help is appreciated.
    Thanks in Advance.

    Here's one way to start. Extend this class for the particular functionality you want:
    import java.io.DataInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    /** Processes binary files which have embedded data records. */
    public abstract class BinaryInputFile extends File {
        /**
         * Class constructor.
         * @param fileName The input file name.
         * @throws NullPointerException if the <code>pathname</code> parameter is <code>null</code>
         */
        protected BinaryInputFile(String fileName) throws NullPointerException {
            super(fileName);
        }
        /**
         * Opens the file and passes the resulting stream to the subclass' process()
         * function.
         * @throws IOException if an error occurs.
         */
        public void process() throws IOException {
            DataInputStream stream = new DataInputStream(new FileInputStream(this));
            process(stream);
        }
        /**
         * Process the file data.
         * @param stream The input stream.
         */
        protected abstract void process(DataInputStream stream);
    }

  • File Adapter: Fixed length file read fails when all data not present

    Hi
    We have a BPEL process that reads fixed-length data files. It works fine when all the data elements are available in the file, but it fails with 'rejected:10002' when even a single data element is missing.
    How do we handle this situation in the BPEL file adapter?
    Are we doing something wrong, or is this normal functionality?
    If it is, is there any workaround, as this is a very common business situation where not all data elements are mandatory.
    fixedLength
    ==========
    2,3,3,2
    Data - Successful
    ============
    1234567890
    2345678901
    3456789012
    Data - Failed
    ===========
    1234567890
    2345678901
    345678901
    Thanks in advance
    Buddhadev

    Hi Naveen,
          Do check the following things,
    >>Note : I have been asked to give the Transport Protocol as "NFS" (Whether this is the problem???) I have summarized the complete details below. Please help me
           1. If your file resides on your local network/local computer, give NFS (Network File System). If your file resides on an FTP location, give FTP and also give the FTP logon parameters.
    Additional Parameters
    File_MT.fieldFixedLengths 10,10,5
    File_MT.fieldNames VendorNumber,VendorName,City
    File_MT.fieldSeparator
    File_MT.processFieldNames fromConfiguration
           2. If this structure does not match the input file structure, the file adapter won't pick up the file. So check the help document provided by SAP under the following path.
    help.sap.com  --> Documentation  --> SAPNetWeaver --> SAPNetWeaver '04 --> English --> process integration --> SAP Exchange Infrastructure --> connectivity --> Adapters --> File Adapter
           Your file contains three records
    V123456789 A123456789 Bosto
    V234567890 B123456789 Atlan
    V334587900 C123456789 Austi
    You have mentioned the fieldSeparator as space, but there is no File_MT.endSeparator '\n' which differentiates between each and every row (record).
        The parameters for the Recordset Structures mentioned in the sender adapter configuration do not match the actual file structure.
        Try giving the exact structure in the configuration of the sender file adapter.
    regards,
    Aravindh.

  • XSLT Mapping : XML to Fixed Length File

    Hi,
    I have to code an XSLT mapping which converts the XML into a fixed-length file format. I am getting the output, but it has some garbage values (some extra spaces in front of the first record and also extra blank lines before the first record).
    I am pasting my xsl sheet :
    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
         <xsl:output method="text" indent="yes" media-type="text/plain"/>
         <xsl:template match="Employees">
              <xsl:for-each select="Employee">
                   <xsl:value-of select="Name"/>
                   <xsl:value-of select="ID"/>
                   <xsl:value-of select="ADD"/>
                   <xsl:text>&#xA;</xsl:text>
              </xsl:for-each>
          </xsl:template>
     </xsl:stylesheet>
    My input XML file is as follows:
    <?xml version="1.0"?>
    <p1:Test02 xmlns:p1="http://www.infosys.com/xi/training/hyd/66289">
            <Employees>
              <Employee>
                 <Name>Anurag</Name>
                 <ID>1121</ID>
                 <ADD>Hyderabad</ADD>
             </Employee>
             <Employee>
                 <Name>Divya</Name>
                 <ID>1122</ID>
                 <ADD>Hyderabad</ADD>
             </Employee>
             <Employee>
                 <Name>Rasmi</Name>
                 <ID>1123</ID>
                 <ADD>Bangalore</ADD>
                </Employee>
         </Employees>
    </p1:Test02>
    And the output i am receiving is as follows:
        Anurag1121Hyderabad
    Divya1122Hyderabad
    Rasmi1123Bangalore
    Please do help.....

    Hi,
    >>>>
    <xsl:output method="text" indent="yes" media-type="text/plain"/>
    You allow the spaces by using indent="yes".
    Try with indent="no".
    Regards,
    michal
    <a href="/people/michal.krawczyk2/blog/2005/06/28/xipi-faq-frequently-asked-questions"><b>XI / PI FAQ - Frequently Asked Questions</b></a>

  • Upload a Fixed Length file in terms of Bytes..

    Hi,
    Here is my query.
    I have a fixed length file that I need to upload into my program from my presentation server.
    The file is in a Shift-JIS Format.
    The file is a fixed length format. But it is fixed interms of the number of bytes that each column occupies.
    E.g. the 1st column takes 8 bytes, the second 15 bytes, and so on. We do not know the number of characters each column takes, just the number of bytes.
    This is how I had approached the upload.
    I created an internal table with just one field of type XSTRING.
    I used the GUI_UPLOAD FM with CODEPAGE = '8000'.
    But I noticed during debugging that in each record, the moment a SPACE occurred in the input file, it would stop reading and go to the next record in the file. Meaning, I lose all the data after the first occurrence of SPACE.
    Am I missing something here? Why does the FM truncate after the first SPACE?
    Do I need to declare the internal table in any other format?

    " May be placing a carriage return end of each records
    " will solve your problem
    class cl_abap_char_utilities definition load.
    data : begin of itab,
            field1(1) type c,
            field2(2) type c,
            field3(3) type c,
            field4(4) type c,
    crlf(2) type c value cl_abap_char_utilities=>cr_lf. "<<<See this line<<<
    data : end of itab.
    Data : begin of itab1 occurs 0.
            Field(20) type c.
    Data : end of itab1.
    Loop at itab.
         Move itab to itab1.
         Append itab1.
    Endloop.
    Open dataset  ........
    Loop at itab1.
       Transfer itab1 TO dataset.
    Endloop.

  • Creating a Fixed Length File

    Greetings,
    I'm creating an application that needs to create a fixed-length file on a UNIX system, and I need help. I have internal tables which contain structures with fields of different lengths (type c), and so I have a routine that concatenates these fields into a single record to be sent to a file using OPEN DATASET. This process is squeezing out all my spaces, and so my fixed-length layout is lost. Can someone assist in creating a fixed-length file from an internal table without using delimiters?
    Thanks!

    " May be placing a carriage return end of each records
    " will solve your problem
    class cl_abap_char_utilities definition load.
    data : begin of itab,
            field1(1) type c,
            field2(2) type c,
            field3(3) type c,
            field4(4) type c,
    crlf(2) type c value cl_abap_char_utilities=>cr_lf. "<<<See this line<<<
    data : end of itab.
    Data : begin of itab1 occurs 0.
            Field(20) type c.
    Data : end of itab1.
    Loop at itab.
         Move itab to itab1.
         Append itab1.
    Endloop.
    Open dataset  ........
    Loop at itab1.
       Transfer itab1 TO dataset.
    Endloop.

  • How to handle a fixed length file without newline?

    Hi Experts,
    I'd like to handle a fixed length file without newline by sender file adapter.
    A file like following.
    It contains three records. "AAXBBBXCCCCX" is one record.
    AA1BBB1CCCC1AA2BBB2CCCC2AA3BBB3CCCC3
    I tried setting the following two parameters, but only the first record was read.
    fieldFixedLengths
    fieldFixedLengthType
    Please tell me how to handle.
    Thanks
    Shinya Kawagoe.

    For this case we wrote a simple Adapter Module that inserts an end-of-line character after an offset.
    This way it can be reused in many interfaces.
    Reading the whole file may not be an option in the case of large source files; it may cause performance / memory issues.
    eolbean.offset = <recordLen>
    XMLPayload xmlpayload = msg.getDocument();
    byte[] content = xmlpayload.getContent();
    byte crlf = 0x0A;
    int current = 0;
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int lines = content.length / recordLen;
    do {
        lines--;
        baos.write(content, current, recordLen);
        if (lines > 0) {       // if more lines follow, an EOL is required
            baos.write(crlf);
        }
        current += recordLen;
    } while (lines > 0);
    xmlpayload.setContent(baos.toByteArray());
    baos.close();
    Audit.addAuditLogEntry(key, AuditLogStatus.SUCCESS, MODULE + " Done EOLing.");

  • How to create a fixed length file

    Can anyone suggest a simple technique for creating fixed
    length files? CSV and delimited files are pretty simple using
    CFFILE. Is there an elegant way to create fixed length
    files?

    Simplest: Use LJustify() or RJustify() on your lines of data.
    EG:
    <cfset FileLine = RJustify (LineOfText, 128)>

  • Fixed length file sending to user.

    My output fixed-length file contains records of length 750 chars.
    My problem is that I need to email this file to the user, so the user can download the file and, if there is an error, correct the values in the file and then rerun the program with the corrected fixed-length file.
    I can also send this fixed-length file as an attachment.
    How can I send this fixed-length file to the user's mail ID?
    Could you please help me with the code?
    THANKS IN ADV.

    It is possible in the case of binary files, not text files; just read Thomas' comments in the blog:
    Sending E-Mail from ABAP - Version 610 and Higher - BCS Interface

  • Fixed Length File attachment to Mail Receiver

    Hi,
    I have an interface requirement IDOC - PI - Fixed Length File attached to email.
    Is it possible to do this? I need to pass the IDOC through a mapping and content conversion to build the fixed-length file; I then need to attach this file to an email and send it to a user.
    Any help would be greatly appreciated.
    Thanks
    Gareth

    Hi,
    I actually need to convert the IDOC to a different file format before sending it on. Eg.
    <IDOC>
         <Segment 1...n>
               <username>User</username>
               <address>Address 1</address>
         </Segment>
         <Segment>
               <username>User2</username>
               <address>Address 2</address>
         </Segment>
    </IDOC>
    Converted to flat file with structure:
    User......Address 1.....
    User2....Address 2.....
    The file needs to have fields of Fixed Length (thats what the .... after each field is to represent)
    So I think the Data Type I am using for the file needs to go through Content Conversion to do this change, and then the file attached to an email.
    I hope this makes sense.

  • Option to read fixed-length format, EXCEL or ASCII TAB delimited

    Hi Experts,
                      I have a requirement to provide an option on the selection screen to read a fixed-length format, EXCEL, or ASCII TAB-delimited file. Kindly help me: how can I read the fixed-length format?
    Thanks,
    Ankita

    As far as I have understood your problem, you have to read the file with
    OPEN DATASET 'file path+name' FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    and move its contents to a string-type variable:
    READ DATASET 'file path+name' INTO v_string.
    Now, as per your requirements, you need to use offsets to transfer the data into a work area.
    For example:
    wa_task-year  = v_string(4).
    wa_task-month = v_string+4(2).
    Please confirm if this is your requirement.

  • Gui_download issue - trailing spaces getting truncated for fixed length fil

    Hi All,
    I have a requirement where I need to download an internal table as a fixed length file.
    The code is as follows:
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
    *   BIN_FILESIZE            =
        FILENAME                = L_FILE
        FILETYPE                = 'ASC'
        APPEND                  = 'X'
        WRITE_FIELD_SEPARATOR   = ' '
        HEADER                  = '00'
        TRUNC_TRAILING_BLANKS   = ' '
        WRITE_LF                = ' '
        COL_SELECT              = ' '
        COL_SELECT_MASK         = ' '
        DAT_MODE                = ' '
    * IMPORTING
    *   FILELENGTH              =
      TABLES
        DATA_TAB                = IT_TEXT
      EXCEPTIONS
        FILE_WRITE_ERROR        = 1
        NO_BATCH                = 2
        GUI_REFUSE_FILETRANSFER = 3
        INVALID_TYPE            = 4
        NO_AUTHORITY            = 5
        UNKNOWN_ERROR           = 6
        HEADER_NOT_ALLOWED      = 7.
    Each row in the internal table IT_TEXT is 242 chars long.
    The FM is truncating the trailing blanks in the file. How do I get the FM to not truncate the trailing blanks in each row?
    My internal table has multiple rows and the number of rows on the table should be same as the number of rows on the downloaded file.
    I tried setting the WRITE_LF parameter to space.
    In this case, the trailing spaces are not truncated (which is as per my requirement), BUT all the rows of the internal table appear on a single line in the downloaded file instead of as multiple rows.
    I also tried setting the TRUNC_TRAILING_BLANKS field to space but that does not work either. Spaces at the end of the row are still truncated.
    so the requirement is: the spaces at the end of each row should not be truncated and
    each row on the internal table should have a corresponding row on the downloaded file.
    (it is a fixed length file)
    I also tried using the following code
    class cl_abap_char_utilities definition load.
    DATA: BEGIN OF IT_TEXT OCCURS 0,
           TEXT(242) TYPE C,
           cr_lf TYPE c VALUE cl_abap_char_utilities=>cr_lf,
          END OF IT_TEXT.
    When I compile, I get the following error:
    The type "CL_ABAP_CHAR_UTILITIES" is unknown.
    I'm using R/3 4.6C. Could this be the problem?
    Please suggest a solution for this problem.
    Thanks!
    Sandeep
    Edited by: sandeep reddy on Jul 25, 2008 7:16 PM

    Hi,
    Try this; it worked for me. Add a dummy character at the end of the internal table structure, then pass trunc_trailing_blanks = ' '.
    PARAMETERS: p_file TYPE rlgrap-filename
                DEFAULT 'c:\test_download.txt'.
    DATA: BEGIN OF s_data,
            data TYPE char10,
            dummy,      " Added this.
          END OF s_data.
    DATA: t_data LIKE TABLE OF s_data.
    s_data-data = 'Test'.
    APPEND s_data TO t_data.
    s_data-data = 'Test2'.
    APPEND s_data TO t_data.
    s_data-data = 'Test3'.
    APPEND s_data TO t_data.
    s_data-data = 'Test4'.
    APPEND s_data TO t_data.
    * Download.
    DATA: v_file TYPE string.
    v_file = p_file.
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename                = v_file
        trunc_trailing_blanks   = ' '
      TABLES
        data_tab                = t_data
      EXCEPTIONS
        file_write_error        = 1
        no_batch                = 2
        gui_refuse_filetransfer = 3
        invalid_type            = 4
        no_authority            = 5
        unknown_error           = 6
        header_not_allowed      = 7
        separator_not_allowed   = 8
        filesize_not_allowed    = 9
        header_too_long         = 10
        dp_error_create         = 11
        dp_error_send           = 12
        dp_error_write          = 13
        unknown_dp_error        = 14
        access_denied           = 15
        dp_out_of_memory        = 16
        disk_full               = 17
        dp_timeout              = 18
        file_not_found          = 19
        dataprovider_exception  = 20
        control_flush_error     = 21
        OTHERS                  = 22.
    Thanks
    Naren

  • External tables-Fixed length file

    Hi All,
    I have a fixed-length file that I load daily using an external table. Recently, the length of one of the fields, IP, was changed, and the customer wants to send both old records with an 8-byte IP and new records with an 11-byte IP in the same data file until the complete migration takes place.
    Will it be possible for external tables to handle this requirement? Or is there any other way to treat it?
    The old file contains 104 fields, with the IP field at positions 490 to 498.
    The new file contains 104 fields, with the IP field at positions 490 to 501.
    Thanks,
    Sri.

    If the two record types are mixed in the same file, then you will have problems loading them. I can see two possible solutions, in no particular order of preference (using your example data):
    1. Redefine the external table something like:
       Position (record_type (1:1)
                 version     (2:5)
                 data        (6:41))
       then parse the remaining fields based on the version number when you select from the external table.
    2. Create two external tables over the same file, one for version 1.00 and one for version 1.01, using the LOAD WHEN clause to determine which set of data to load when you select. Something like:
    CREATE TABLE version1 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY newline
      LOAD WHEN (version = 1.00)
    < definition for the old format >
    and
    CREATE TABLE version101 ...
    ORGANIZATION EXTERNAL ...
    ACCESS PARAMETERS
    (RECORDS DELIMITED BY newline
      LOAD WHEN (version = 1.01)
    < definition for the new format >
    Then your processing would use something like:
    SELECT ip, last_name
    FROM version1
    UNION ALL
    SELECT ip, last_name
    FROM version101
    HTH
    John
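    To make option 2 a bit more concrete, one of the two definitions could be sketched as follows (the field names, positions, directory object and file name here are illustrative assumptions, not the real 104-field layout):
    CREATE TABLE version1_ext (
      record_type VARCHAR2(1),
      version     VARCHAR2(4),
      last_name   VARCHAR2(20),
      ip          VARCHAR2(8)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        LOAD WHEN (version = '1.00')
        FIELDS (
          record_type POSITION(1:1)     CHAR(1),
          version     POSITION(2:5)     CHAR(4),
          last_name   POSITION(6:25)    CHAR(20),
          ip          POSITION(490:497) CHAR(8)
        )
      )
      LOCATION ('data.dat')
    )
    REJECT LIMIT UNLIMITED;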
