Option to read fixed-length format, EXCEL or ASCII TAB delimited

Hi Experts,
I have a requirement to give an option on the selection screen to read a fixed-length format, EXCEL, or ASCII TAB-delimited file. Kindly help me: how can I read the fixed-length format?
Thanks,
Ankita

As far as I have understood your problem, you have to read the file with
OPEN DATASET 'file path+name' FOR INPUT IN TEXT MODE ENCODING DEFAULT.
and move its contents to a string-type variable with
READ DATASET 'file path+name' INTO v_string.
Now, as per your requirements, you need to use offsets to transfer the data into a work area.
For example:
wa_task-year  = v_string(4).
wa_task-month = v_string+4(2).
Please confirm if this is your requirement.
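A minimal sketch of that fixed-length branch, assuming a hypothetical path parameter p_file and hypothetical field lengths (4-character year, 2-character month, 10-character description); adjust the offsets and lengths to your actual record layout:

PARAMETERS p_file(128) TYPE c LOWER CASE.     " hypothetical file path parameter

DATA: v_string TYPE string.
DATA: BEGIN OF wa_task,                       " hypothetical target work area
        year(4)   TYPE c,
        month(2)  TYPE c,
        descr(10) TYPE c,
      END OF wa_task.

OPEN DATASET p_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    READ DATASET p_file INTO v_string.
    IF sy-subrc <> 0.
      EXIT.                                   " end of file
    ENDIF.
    IF strlen( v_string ) >= 16.
      " cut the line into fields by offset and length
      wa_task-year  = v_string(4).
      wa_task-month = v_string+4(2).
      wa_task-descr = v_string+6(10).
      " ... append wa_task to an internal table here
    ENDIF.
  ENDDO.
  CLOSE DATASET p_file.
ENDIF.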

Similar Messages

  • Reading fixed length flatfile & splitting it into header and lineitem data

    Hi Friends,
    I am reading a fixed-length flat file from the application server into an internal table.
    The problem is that the data in the flat file is in the format below, with fixed start and end positions.
    1 - 78 -  control header
    1 - 581 - Invoice header data
    1 - 411 - Invoice Line item data
    1 - 45 -   trailer record
    There will be one control header and one trailer record per file and number of invoice headers and its line items can vary.
    There are unique identifiers to identify them, as below.
    Control header - starts with 'CHR'
    Invoice Header starts with - '000'
    Invoice Lineitem - starts with '001'
    trailer record - starts with 'TRL'
    So it's like:
    CHR...control data...(79)000...header data...(660)001...lineitem1...(1481)001...lineitem2...multiples of 411 and 581, ending with TRL...trailer record
    (the numbers in parentheses are positions)
    I first read the dataset and store it in an internal table with a field of 255 characters.
    By looping over the above ITAB I have to split it into header records and line item records.
    Has anyone faced this kind of scenario before? If yes, I would appreciate it if you could throw some ideas on the logic to split this data.
    Any help in splitting up the data is highly appreciated.
    Regards,
    Simha
    ITAB declaration
    DATA: BEGIN OF ITAB OCCURS 0,
            FIELD(255),
          END OF ITAB,
          lt_header   TYPE TABLE OF ZTHDR,
          lt_lineitem TYPE TABLE OF ZTLINITM.

    Hi,
    I am sending sample code which resembles your requirement.
    DATA: BEGIN OF it_input OCCURS 0,  "used to store all the data, one line per row
            line TYPE string,
          END OF it_input,
          it_header TYPE TABLE OF string WITH HEADER LINE, "all header records with their items
          it_item   TYPE TABLE OF string WITH HEADER LINE. "all item records
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename                      = 'd:\test.txt'
    *   FILETYPE                      = 'ASC'
    *   HAS_FIELD_SEPARATOR           = ' '
    *   HEADER_LENGTH                 = 0
    *   READ_BY_LINE                  = 'X'
    *   DAT_MODE                      = ' '
    *   CODEPAGE                      = ' '
    *   IGNORE_CERR                   = ABAP_TRUE
    *   REPLACEMENT                   = '#'
    *   CHECK_BOM                     = ' '
    *   VIRUS_SCAN_PROFILE            =
    *   NO_AUTH_CHECK                 = ' '
    * IMPORTING
    *   FILELENGTH                    =
    *   HEADER                        =
      TABLES
        data_tab                      = it_input
      EXCEPTIONS
        FILE_OPEN_ERROR               = 1
        FILE_READ_ERROR               = 2
        NO_BATCH                      = 3
        GUI_REFUSE_FILETRANSFER       = 4
        INVALID_TYPE                  = 5
        NO_AUTHORITY                  = 6
        UNKNOWN_ERROR                 = 7
        BAD_DATA_FORMAT               = 8
        HEADER_NOT_ALLOWED            = 9
        SEPARATOR_NOT_ALLOWED         = 10
        HEADER_TOO_LONG               = 11
        UNKNOWN_DP_ERROR              = 12
        ACCESS_DENIED                 = 13
        DP_OUT_OF_MEMORY              = 14
        DISK_FULL                     = 15
        DP_TIMEOUT                    = 16
        OTHERS                        = 17.
    IF sy-subrc <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    LOOP AT  it_input.
       write : it_input-line.
    ENDLOOP.
    Before doing the steps below, just take the control record and the trailer record and cut them off from the input table, based on the length.
      split it_input-line AT '000' INTO TABLE it_header IN CHARACTER MODE.
      LOOP AT  it_header.
        split it_header AT '0001' INTO TABLE it_item IN CHARACTER MODE.
        write :/ it_header.
      ENDLOOP.
    After this you need to cut the records into the corresponding fields of the respective tables by putting a loop on the item and header tables as I declared above.
    I think this may solve your problem; you just need to put in some minor effort.
    Regards,
    Sree..
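    If SPLIT on the identifier strings feels fragile (they can also occur inside the data), a hedged alternative is to key off the first three characters of each record and move by offset/length into the lt_header/lt_lineitem tables from the declaration above. This sketch assumes each record arrives as its own line, that the raw field is widened to hold the longest record, and that ZTHDR and ZTLINITM are flat, character-only structures whose field lengths add up to 581 and 411:

    DATA: BEGIN OF lt_raw OCCURS 0,        " raw file lines, widened to the longest record
            line(600) TYPE c,
          END OF lt_raw,
          ls_header TYPE ZTHDR,            " assumed flat character structures
          ls_item   TYPE ZTLINITM.

    LOOP AT lt_raw.
      CASE lt_raw-line(3).
        WHEN 'CHR' OR 'TRL'.
          " control header / trailer record: validate or skip
        WHEN '000'.
          ls_header = lt_raw-line(581).    " offset/length move; fields fill by position
          APPEND ls_header TO lt_header.
        WHEN '001'.
          ls_item = lt_raw-line(411).
          APPEND ls_item TO lt_lineitem.
      ENDCASE.
    ENDLOOP.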

  • FILE model support for CSV and fixed-length format!!

    Hi all,
    As I'm working on a data conversion project, we are in the middle of deciding whether the extraction for the data source should be in CSV or fixed-length format. Here are the pros and cons:
    CSV:
    1. There are a lot of remarks field in the data which are all free text. It's really hard to define the record delimiter.
    2. The extracted file size for CSV is relatively small compared with fixed-length.
    fixed-length:
    Even for fixed length, we can either
    1. Extract the file without a separator/delimiter (one row) with fixed column lengths
    2. Extract the file with fixed column lengths per record and define a record delimiter to separate the records.
    I've tried all the above formats and it seems ODI prefers CSV or the combined mode (fixed column lengths) because:
    1. The LKM (file to external table) needs to be customized to handle a file format without any separator/delimiter
    2. It is very hard to reverse-engineer the column definitions for fixed-length (no delimiter)
    Any suggestion?
    Tao

    You can transform your files to an xml format using respective tags:
    eg:
    <?xml version="1.0" encoding="UTF-8"?>
    <MONEY>
    <MoneyID>MoneyID0</MoneyID>
    <ContractID>ContractID0</ContractID>
    <EffectiveDate>2006-05-04T18:13:51.0Z</EffectiveDate>
    <MessageDate>2006-05-04T18:13:51.0Z</MessageDate>
    <ReversalIndicator>false</ReversalIndicator>
    <PriorMoneyID>PriorMoney, something "ID0" for life;</PriorMoneyID>
    <Amount>0</Amount>
    <MoneyType>0</MoneyType>
    <ExchangeDetails>
    <CostBasis>0</CostBasis>
    <ExchangeType>0</ExchangeType>
    <MEC>false</MEC>
    <LoanAmount>0</LoanAmount>
    </ExchangeDetails>
    </MONEY>
    So, as you see, you can embed any symbol in a tag.
    After creating this xml file, you can use XML technology in Topology Manager to map to this file and then create a data store based on this technology.
    And you will be able to retrieve the data in this file under respective column headers without worrying about delimiters existing in the actual data.

  • Reading fixed length file with different record types

    Hi,
    I need to read a fixed-length file with different record types, but the record identifier is in the 31st position, not the 1st position.
    If I give 31 as the position in the File adapter wizard, BPEL takes the whole of positions 1-31 as the identifier.
    How do we read such files?
    Thanks
    Ravdeep

    Hi,
    You cannot use the default wizard for this.
    Use something like nxsd:lookAhead="30" nxsd:lookFor="S". Have a look at the link below; it has some examples:
    http://download.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/nfb.htm

  • Please help with reading fixed-length file

    I am reading a fixed-length file from a batch program and I need to read through each line, parsing out records in each, and replacing the first character in each line after I finish processing. I am using a RandomAccessFile, but I am not sure if this is best for my needs.
    Here is essentially the code that I am using:
    RandomAccessFile raf = new RandomAccessFile(file, "rw");
    long pointer = 0;
    int recordLength = 65;
    int counter = 0;
    String test = "X";
    String line;
    pointer = raf.getFilePointer();
    while ((line = raf.readLine()) != null) {
        // set the file pointer to the start of the current record
        raf.seek(counter * recordLength);
        counter++;
        if (line.substring(0, 1).equals(test)) {
            continue;   // already processed
        }
        String var1 = line.substring(7, 22);
        String var2 = line.substring(23, 41);
        String var3 = line.substring(42, 52);
        String var4 = line.substring(53, 53);
        // PROCESSING
        raf.writeBytes(test);   // write the flag as a single byte
    }

    Thanks for the suggestion. I think that the BufferedReader is the way that I will go. Do you happen to know the best way to update the 1st char of each line as I'm reading each line using a BufferedWriter?
    Hi,
    To update the first char of each line, you could read each line of the file into a StringBuffer, modify the value, and then write the value back out to the file.
    I had to do something similar.
    eg.
    StringBuffer buf = new StringBuffer();
    BufferedReader in = new BufferedReader(
        // data file source
        new FileReader(db_location));
    String s;
    // while there are still items to read
    while ((s = in.readLine()) != null) {
        System.out.println("reading file");
        // split into component parts so we can modify
        StringTokenizer t = new StringTokenizer(s, ",");
        int id = Integer.parseInt(t.nextToken());
        String code = t.nextToken();
        int quantity = Integer.parseInt(t.nextToken());
        // copy component parts into StringBuffer
        buf.append(id + ",");
        buf.append(code + ",");
        buf.append("\n");
        // use StringBuffer methods here to isolate the first character
        // and to replace the first value of the input in the buffer, e.g.
        // buf.replace(index, length, "," + new_code);
    }
    in.close();
    // print out the StringBuffer to rewrite the updated file
    PrintWriter output = new PrintWriter(
        new BufferedWriter(
            new FileWriter(db_location, false)));
    output.println(buf.toString());
    output.close();
    As always, there is more than one way to skin a cat. But this is my way.
    I think that JDK 1.4 has some new methods, but not 100% sure of that.
    //Daniel.

  • Dealing with Fixed Length formats both at sender and receiver side

    Hi all,
    I have a File-to-File scenario with fixed-length formats.
    The source system will generate a text file with fixed lengths, and the target also needs the same file in the same format.
    So we have an idea to implement the interface without IR development. This is fine up to now.
    But now the problem is that we are receiving more fields from the source than we require for the target process; we are not using them at the receiver side.
    So receiving system needs only few required fields from the source file.
    From source
    Ex: Source....  DocNum   Invoice No   Bill date   Bill amount vendor code
    We require at target
    Ex: Target.....   DocNum  Bill Date Vendor code
    Please suggest me
    Regards

    Hi,
    >>The source file content conversion is simple... In the target, when you specify the target field names, just leave out the ones you don't need...
    From the source we will read all the fields; this is fine. But in the target, how do we mention the field positions?
    Here all fields come into the picture; we can't mention the positions at the receiver.
    Source side Ex:  Docnum  InvoiceNo  BillDate  BillAmt  Vendor
    Fixed lengths:   10      20         10        10       20
    Target expects:  Docnum  BillDate   Vendor
    Fixed lengths:   10      10         20
    On the receiver side, if you mention fixed lengths like 10 10 20, it may read DocNum, InvoiceNo, BillDate.
    It won't ignore the InvoiceNo field in this case.
    Regards

  • Issue related to Excel to Text tab delimited conversion

    Hi,
    I am facing a strange issue in converting an excel file to a text tab delimited file. I have an excel sheet which contains two columns:
    vendor         new_zip
    153357      99645-6340
    162642         99669-7631
    $209930-1   91320-1201
    $209989-1   91710-5766
    When I save this file as a text tab-delimited file, this is what it turns into:
        vendor  new_zip
        153357  99645-6340
        162642  99669-7631
        $209930-1     91320-1201
        $209989-1     91710-5766
    I am using WS_UPLOAD since I am not using the enterprise version of SAP. In the above you can see that the zip values (first character) of the first two records and the vendor numbers (last character) of the last two records line up with each other. So when I load this file it gives a "File open error" exception. If I move the zip codes of the first two records by one space so that they are not in line with the vendor numbers of the last two records, the file upload works fine, but this is something which can't be done from the user's point of view. So can anyone tell me what can be done to avoid this problem without changing the concept of using a text tab-delimited file?
    thanks & regards,
    Bala.

    The following program uses ALSM_EXCEL_TO_INTERNAL_TABLE, but I want to use SPLIT at the tab delimiter. Please help (see the tab-split sketch after the program below).
    REPORT ZSAINAL_BDC_TRAN .
    *INCLUDE BDCRECX1.
    *definition of selection screen
    SELECTION-SCREEN BEGIN OF BLOCK bl1 WITH FRAME .
    SELECTION-SCREEN BEGIN OF BLOCK bl2 WITH FRAME TITLE text-t01.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN BEGIN OF LINE.
    SELECTION-SCREEN COMMENT 10(20) text-c01.
    SELECTION-SCREEN END OF LINE.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN BEGIN OF LINE.
    SELECTION-SCREEN COMMENT 10(10) text-c02.
    SELECTION-SCREEN POSITION 22.
    PARAMETERS: p_txt RADIOBUTTON GROUP grp1 DEFAULT 'X'.
    SELECTION-SCREEN COMMENT 30(12) text-c03.
    SELECTION-SCREEN POSITION 46.
    PARAMETERS: p_xls RADIOBUTTON GROUP grp1 .
    SELECTION-SCREEN END OF LINE.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN END OF BLOCK bl2.
    *PARAMETERS
    PARAMETERS ctumode LIKE ctu_params-dismode DEFAULT 'A'.
    PARAMETERS cupdate LIKE ctu_params-updmode DEFAULT 'S'.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN END OF BLOCK bl1.
    *DATA
    DATA : bdc_data LIKE STANDARD TABLE OF bdcdata INITIAL SIZE 10
           WITH HEADER LINE.
    *DATA: bdc_tab LIKE TABLE OF bdcdata INITIAL SIZE 0 WITH HEADER LINE.
    *Internal table to get legacy data records structured in
    *target format
    DATA: BEGIN OF inrec OCCURS 0,
              lifnr LIKE lfa1-lifnr,
              name1 LIKE lfa1-name1,
              stras LIKE lfa1-stras,
              ort01 LIKE lfa1-ort01,
          END OF inrec.
    *Internal table of structure ALSMEX_TABLINE suitable for
    *uploading records from an Excel worksheet
    DATA: it_excel TYPE STANDARD TABLE OF alsmex_tabline
                  INITIAL SIZE 0 WITH HEADER LINE,
          it_excel_dummy TYPE  alsmex_tabline.
    data: messtab like bdcmsgcoll occurs 0 with header line.
    DATA: L_MSTRING(480).
    DATA: L_SUBRC LIKE SY-SUBRC.
    *START-OF-SELECTION
    START-OF-SELECTION.
      PERFORM read_data.
      PERFORM populate_bdcdata.
    *START NEW SCREEN
    FORM bdc_dynpro USING program dynpro.
      CLEAR bdc_data.
      bdc_data-program  = program.
      bdc_data-dynpro   = dynpro.
      bdc_data-dynbegin = 'X'.
      APPEND bdc_data.
    ENDFORM.
    *INSERT FIELD
    FORM bdc_field USING fnam fval.
    IF FVAL <> NODATA.
      CLEAR bdc_data.
      bdc_data-fnam = fnam.
      bdc_data-fval = fval.
      APPEND bdc_data.
    ENDIF.
    ENDFORM.
    *&      Form  read_data
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM read_data.
      IF p_txt = 'X'.
        CALL FUNCTION 'GUI_UPLOAD'
          EXPORTING
            filename                      = 'C:\URMILA.TXT'
            filetype                      = 'ASC'
            has_field_separator           = 'X'
    *       HEADER_LENGTH                 = 0
    *       READ_BY_LINE                  = 'X'
    *     IMPORTING
    *       FILELENGTH                    =
    *       HEADER                        =
          TABLES
            data_tab                      = inrec
          EXCEPTIONS
            file_open_error               = 1
            file_read_error               = 2
            no_batch                      = 3
            gui_refuse_filetransfer       = 4
            invalid_type                  = 5
            no_authority                  = 6
            unknown_error                 = 7
            bad_data_format               = 8
            header_not_allowed            = 9
            separator_not_allowed         = 10
            header_too_long               = 11
            unknown_dp_error              = 12
            access_denied                 = 13
            dp_out_of_memory              = 14
            disk_full                     = 15
            dp_timeout                    = 16
            OTHERS                        = 17.
        IF sy-subrc <> 0.
          WRITE: /'ERROR UPLOADING TXT FILE FROM PRESENTATION SERVER!',
                 / 'RETURN CODE : ', sy-subrc.
          MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                  WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ENDIF.
      ELSE.  "filetype is excel
    *--read excel file into an int. table in row/col format--*
        CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
             EXPORTING
                  filename                = 'C:\URMILA.XLS'
                  i_begin_col             = 1
                  i_begin_row             = 1
                  i_end_col               = 4
                  i_end_row               = 3
             TABLES
                  intern                  = it_excel
             EXCEPTIONS
                  inconsistent_parameters = 1
                  upload_ole              = 2
                  OTHERS                  = 3.
        IF sy-subrc <> 0.
          WRITE: /'ERROR UPLOADING XLS FILE FROM PRESENTATION SERVER!',
                 / 'RETURN CODE : ',sy-subrc.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ELSE.
    *--now fill data from it_excel into final legacy data itab inrec--*
          IF NOT it_excel[] IS INITIAL.
            CLEAR inrec.
            REFRESH inrec[].
            LOOP AT it_excel.
              it_excel_dummy = it_excel.
              AT NEW col.
                CASE it_excel_dummy-col.
                  WHEN 1.
                    inrec-lifnr = it_excel_dummy-value(10).
                  WHEN 2.
                    inrec-name1 = it_excel_dummy-value(35).
                  WHEN 3.
                    inrec-stras = it_excel_dummy-value(35).
                  WHEN 4.
                    inrec-ort01 = it_excel_dummy-value(35).
                    APPEND inrec.
                    CLEAR inrec.
                ENDCASE.
              ENDAT.
             AT END OF row.
             ENDAT.
            ENDLOOP.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDFORM.
    *&      Form  populate_bdcdata
    *       to populate BDCDATA table from legacy file data contained in
    *       internal table inrec.
    *  -->  p1        text
    *  <--  p2        text
    FORM populate_bdcdata.
      IF  not inrec[] IS INITIAL.
        LOOP AT inrec.
          CLEAR bdc_data.
          REFRESH bdc_data.
          PERFORM bdc_dynpro      USING 'SAPMF02K' '0106'.
          PERFORM bdc_field       USING 'BDC_OKCODE'
                                        '/00'.
          PERFORM bdc_field       USING 'RF02K-LIFNR'
                                        inrec-lifnr.
          PERFORM bdc_field       USING 'RF02K-BUKRS'
                                        '1000'.
          PERFORM bdc_field       USING 'RF02K-D0110'
                                        'X'.
          PERFORM bdc_dynpro      USING 'SAPMF02K' '0110'.
          PERFORM bdc_field       USING 'BDC_OKCODE'
                                        '=PF03'.
          PERFORM bdc_field       USING 'LFA1-NAME1'
                                       inrec-name1.
          PERFORM bdc_field       USING 'LFA1-SORTL'
                                        'ALLGEMEIN'.
          PERFORM bdc_field       USING 'LFA1-STRAS'
                                        inrec-stras.
          PERFORM bdc_field       USING 'LFA1-PFACH'
                                        '645793'.
          PERFORM bdc_field       USING 'LFA1-ORT01'
                                        inrec-ort01.
          PERFORM bdc_field       USING 'LFA1-PSTLZ'
                                        '12001'.
          PERFORM bdc_field       USING 'LFA1-ORT02'
                                        'COOK'.
          PERFORM bdc_field       USING 'LFA1-PSTL2'
                                        '76905'.
          PERFORM bdc_field       USING 'LFA1-LAND1'
                                        'DE'.
          PERFORM bdc_field       USING 'LFA1-REGIO'
                                        '11'.
          PERFORM bdc_field       USING 'LFA1-SPRAS'
                                        'DE'.
          PERFORM bdc_field       USING 'LFA1-TELF1'
                                        '06894/55501-0'.
          PERFORM bdc_field       USING 'LFA1-TELFX'
                                        '06894/55501-100'.
          PERFORM bdc_dynpro      USING 'SAPLSPO1' '0100'.
          PERFORM bdc_field       USING 'BDC_OKCODE'
                                        '=YES'.
          CALL TRANSACTION 'FK02'
                           USING bdc_data
                           MODE ctumode
                           UPDATE cupdate
                           MESSAGES INTO MESSTAB.
        ENDLOOP.
        perform mess_handling.
      ENDIF.
    ENDFORM.                    " populate_bdcdata
    *&      Form  mess_handling
    *       ERROR MESSAGE HANDLING IN CALL TRANSACTION
    *  -->  p1        text
    *  <--  p2        text
    form mess_handling.
    tables:T100.
    LOOP AT MESSTAB.
            SELECT SINGLE * FROM T100 WHERE SPRSL = MESSTAB-MSGSPRA
                                      AND   ARBGB = MESSTAB-MSGID
                                      AND   MSGNR = MESSTAB-MSGNR.
            IF SY-SUBRC = 0.
              L_MSTRING = T100-TEXT.
              IF L_MSTRING CS '&1'.
                REPLACE '&1' WITH MESSTAB-MSGV1 INTO L_MSTRING.
                REPLACE '&2' WITH MESSTAB-MSGV2 INTO L_MSTRING.
                REPLACE '&3' WITH MESSTAB-MSGV3 INTO L_MSTRING.
                REPLACE '&4' WITH MESSTAB-MSGV4 INTO L_MSTRING.
              ELSE.
                REPLACE '&' WITH MESSTAB-MSGV1 INTO L_MSTRING.
                REPLACE '&' WITH MESSTAB-MSGV2 INTO L_MSTRING.
                REPLACE '&' WITH MESSTAB-MSGV3 INTO L_MSTRING.
                REPLACE '&' WITH MESSTAB-MSGV4 INTO L_MSTRING.
              ENDIF.
              CONDENSE L_MSTRING.
              WRITE: / MESSTAB-MSGTYP, L_MSTRING(250).
            ELSE.
              WRITE: / MESSTAB.
            ENDIF.
          ENDLOOP.
    endform.                    " mess_handling
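    A hedged sketch of the tab-split alternative asked for above, assuming the file is uploaded unconverted (one raw line per table row) and that each line carries the four fields in order; cl_abap_char_utilities=>horizontal_tab supplies the tab character and inrec is the work area declared in the program above:

    DATA: BEGIN OF lt_raw OCCURS 0,
            line(255) TYPE c,          " one raw tab-delimited line
          END OF lt_raw.

    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\URMILA.TXT'
        filetype = 'ASC'               " no field separator: the tabs stay in the line
      TABLES
        data_tab = lt_raw
      EXCEPTIONS
        OTHERS   = 1.

    LOOP AT lt_raw.
      SPLIT lt_raw-line AT cl_abap_char_utilities=>horizontal_tab
            INTO inrec-lifnr inrec-name1 inrec-stras inrec-ort01.
      APPEND inrec.
    ENDLOOP.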

  • Read Fixed length file

    I have a text file that has data in specific positions of the file.
    ie.
    Pos 1-20 Name
    Pos 21-30 Division
    Pos 31-38 Birthdate
    I saved the file in the DB and want to process it to read the data in the specific positions I listed above.
    I started with the code below, but not sure how to get to the specific positions in the file.
    DECLARE
    v_blob_data BLOB;
    v_blob_len NUMBER;
    v_clob_data CLOB := 'anything';
    v_clob_len NUMBER;
    dest_offset NUMBER := 1;
    src_offset NUMBER := 1;
    blob_csid NUMBER := dbms_lob.default_csid;
    lang_ctx INTEGER := dbms_lob.default_lang_ctx;
    warning INTEGER;
    BEGIN
    select blob_content into v_blob_data from wwv_flow_files;
    v_blob_len := dbms_lob.getlength(v_blob_data);
    dbms_lob.converttoclob(v_clob_data, v_blob_data, v_blob_len,dest_offset,src_offset,blob_csid,lang_ctx,warning);
    END;

    bobmagan wrote:
    Unfortunately, I have to do this inside of an APEX page using pl/sql.
    What is unfortunate about that!? APEX is good. PL/SQL is awesome. :-)
    I would not have it any other way. Seriously.
    If the file is uploaded into Apex - then you still can use external tables.
    The basic steps are:
    1. spool the uploaded file (from the Apex flow files table) to the local file system of the Oracle server using UTL_FILE
    2. alter the external table and specify the newly created file's location
    3. select from the external data to load the file contents
    Alternatively, you need to write your own CLOB parser - that can read lines from the CLOB, parse these, and output the tokens to use (e.g. token 1 is the invoice id, token 2 the invoice date, etc).
    This all is very much doable in PL/SQL. And Apex is PL/SQL - and allows you to call your own PL/SQL packages and code to process data.
    Keep in mind that the simpler a process is, the easier it is to write, maintain and debug. Doing manual parsing.. not something that should be treated lightly as that can become quite complex.
    Dealing with an external table and its easy definition of the format of the file content to load, is a lot easier and more flexible. The only downside is having to spool the CLOB to an external file in order to use the external table... However... I do recall a few months ago looking at the "cartridge" used by Oracle to load external files as an external table, and read/saw something that seemed to also support CLOBs..?
    So I'm not sure if one can now define an external table and use it to load CLOB contents - worth looking at... and even if not, also worth considering spooling a CLOB to external file for loading via the external table interface.
    Unless of course you're keen to put your regex kung fu skills to use in writing a custom CLOB parser for the file contents.

  • Reading a file format that's part ascii and part binary

    Hey
    I'm trying to write an importer for the PLY file format. That file format starts off with a header in ascii and then continues with the data either in ascii or in binary. When the data is in ascii I have no problem reading the file, but when the data is in binary I'm not quite sure how I should do it. Right now I do something like this:
    BufferedInputStream stream = new BufferedInputStream(new FileInputStream(file)); // I first create this
    BufferedReader reader = new BufferedReader(new InputStreamReader(stream));  // then this, for reading the ascii part
    DataInputStream dataInputStream = new DataInputStream(stream); // and this, for reading the binary part
    Then I read the ascii header with the BufferedReader, and when it turns out that the data is in ascii I continue reading with the reader; otherwise I start reading with the DataInputStream. But the data from the DataInputStream is not the same as I get from the BufferedReader. I try it on these files, which are in different endians and either ascii or binary:
    http://www.cs.ucl.ac.uk/staff/Joao.Oliveira/ply.html in the Nbunny.zip.
    Anyone that can help me with this? Or know about a PLY file reader in java already?
    Discordia
    Message was edited by:
    Discordia

    DataInputStream is NOT the right class to use for general binary input. It is used only for the specific purpose of reading Java primitives. The input obviously needs to be in the right format for that to work, i.e. it needs to be written using a DataOutputStream.
    I would read everything using an InputStream, not a reader. Store the bytes of the header until you reach the end of the header (not sure how to determine that since I don't know the fileformat). Then convert those bytes into a String using the proper charset ("US-ASCII" in this case, if by "ascii" you really mean ASCII).
    Then just continue reading the binary data.

  • File Adapter: Fixed length file read fails when all data not present

    Hi
    We have a BPEL process that reads fixed-length data files. It works fine when all the data elements are available in the file but fails with 'rejected:10002' when even a single data element is missing.
    How to handle this situation in BPEL file adapter?
    Are we doing something wrong, or is this normal functionality?
    If so, is there any workaround, as this is a very common business condition: not all data elements are mandatory.
    fixedLength
    ==========
    2,3,3,2
    Data - Successful
    ============
    1234567890
    2345678901
    3456789012
    Data - Failed
    ===========
    1234567890
    2345678901
    345678901
    Thanks in advance
    Buddhadev

    Hi Naveen,
          Do check the following things,
    >>Note : I have been asked to give the Transport Protocol as "NFS" (Whether this is the problem???) I have summarized the complete details below. Please help me
       1. If your file resides on your local network/local computer, give NFS (Network File System). If your file resides on an FTP location, give FTP and also give the FTP logon parameters.
    Additional Parameters
    File_MT.fieldFixedLengths 10,10,5
    File_MT.fieldNames VendorNumber,VendorName,City
    File_MT.fieldSeparator
    File_MT.processFieldNames fromConfiguration
       2. If this structure does not match the input file structure, the file adapter won't pick up the file. So check the help document provided by SAP in the following path.
    help.sap.com  --> Documentation  --> SAPNetWeaver --> SAPNetWeaver '04 --> English --> process integration --> SAP Exchange Infrastructure --> connectivity --> Adapters --> File Adapter
           Your file contains three records
    V123456789 A123456789 Bosto
    V234567890 B123456789 Atlan
    V334587900 C123456789 Austi
    You have mentioned the fieldSeparator as space, but there is no File_MT.endSeparator '/n', which differentiates between each and every row (record).
            The parameters for the Recordset Structures mentioned in the sender adapter configuration do not match the actual file structure.
            Try giving the exact structures in the configuration of the sender file adapter.
    regards,
    Aravindh.

  • Upload a Fixed Length file in terms of Bytes..

    Hi,
    Here is my query.
    I have a fixed length file that I need to upload into my program from my presentation server.
    The file is in a Shift-JIS Format.
    The file is in a fixed-length format, but it is fixed in terms of the number of bytes that each column occupies.
    E.g. the 1st column takes 8 bytes, the second 15 bytes, and so on. We do not know the number of characters each column takes... just the number of bytes.
    This is how I had approached the upload.
    I created an internal table with just one field of type XSTRING.
    I used GUI_UPLOAD FM with CODEPAGE = `8000`.
    But I noticed during debugging that in each record, the moment a SPACE occurred in the input file, it would stop reading and go to the next record in the file. Meaning, I lose all the data after the first occurrence of a SPACE.
    Am I missing something here? Why does the FM truncate after the first SPACE?
    Do I need to declare the internal table in some other format?

    " May be placing a carriage return end of each records
    " will solve your problem
    class cl_abap_char_utilities definition load.
    DATA: BEGIN OF itab OCCURS 0,
            field1(1) TYPE c,
            field2(2) TYPE c,
            field3(3) TYPE c,
            field4(4) TYPE c,
            crlf(2)   TYPE c VALUE cl_abap_char_utilities=>cr_lf, "<<<See this line<<<
          END OF itab.
    DATA: BEGIN OF itab1 OCCURS 0,
            field(20) TYPE c,
          END OF itab1.
    LOOP AT itab.
      MOVE itab TO itab1.
      APPEND itab1.
    ENDLOOP.
    OPEN DATASET ........
    LOOP AT itab1.
      TRANSFER itab1 TO dataset.
    ENDLOOP.
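    For the original upload question, a hedged sketch of one possible approach, assuming the file can be uploaded as raw binary and that code page '8000' (Shift-JIS) is available in the system: the bytes are cut by BYTE offsets first (8 and 15 bytes for the first two columns, as in the question) and each slice is then converted to characters with cl_abap_conv_in_ce. The file path is hypothetical, and looping over further records is only indicated:

    TYPES ty_raw TYPE x LENGTH 255.
    DATA: lt_bin  TYPE STANDARD TABLE OF ty_raw,   " raw file content
          lv_len  TYPE i,
          lv_xstr TYPE xstring,
          lv_x1   TYPE xstring,
          lv_x2   TYPE xstring,
          lv_col1 TYPE string,
          lv_col2 TYPE string,
          lo_conv TYPE REF TO cl_abap_conv_in_ce.

    " upload unconverted, so SPACEs and multi-byte characters survive
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename   = 'C:\data\input.txt'           " hypothetical path
        filetype   = 'BIN'
      IMPORTING
        filelength = lv_len
      TABLES
        data_tab   = lt_bin
      EXCEPTIONS
        OTHERS     = 1.

    " glue the binary rows into one xstring
    CALL FUNCTION 'SCMS_BINARY_TO_XSTRING'
      EXPORTING
        input_length = lv_len
      IMPORTING
        buffer       = lv_xstr
      TABLES
        binary_tab   = lt_bin.

    " cut the first record by byte offsets, then convert each slice from Shift-JIS
    lv_x1 = lv_xstr(8).                            " column 1: 8 bytes
    lv_x2 = lv_xstr+8(15).                         " column 2: 15 bytes
    lo_conv = cl_abap_conv_in_ce=>create( encoding = '8000' ).
    lo_conv->convert( EXPORTING input = lv_x1 IMPORTING data = lv_col1 ).
    lo_conv->convert( EXPORTING input = lv_x2 IMPORTING data = lv_col2 ).
    " ... advance the byte offset by the record length (plus any CR/LF) for further records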

  • How to export data in fixed length format

    I want to take data from 4 tables and export it to a fixed-length flat file format, then have that file sent to an email address. How can I do this?

    Suzie,
    I have never used Toad ;-) but I always use Oracle SQL Developer instead (which is free, by the way).
    SQL Developer gets installed with every 11g database installation but you could install it also to your 9.2 or 10.x environment - just grab it from otn.oracle:
    http://www.oracle.com/technology/software/products/sql/index.html
    (it will take two minutes to install). Then you'd just define your connection to your target database, go into REPORTS (left side) and define a USER DEFINED REPORT. That's pretty easy because, for instance, you'd just say 'select * from scott.emp;' - once the report gets executed, just move your mouse over one of the rows, hit the right mouse button and choose the last entry: EXPORT DATA - you'll have the options of TEXT, CSV, SQL LOADER FORMAT, INSERT, XML, HTML and XLS ;-)
    Another possibility within SQL*Plus would be using the column format options (the font of this webpage scrambles the output columns, but in SQL*Plus you'll get everything formatted in a fixed-length format):
    column "info" format A11
    column "P1" format 99999
    column "P2" format 99999
    column "P3" format 99999
    SELECT Info " ", P1, P2, P3
    FROM tr_PortfolioSummary
    ORDER BY SortOrder
                     P1     P2     P3
    Nb Bonds         10     40     40
    Nb Warrants      20     50     50
    Nb Shares        30     60     60
    Kind regards
    Mike
    Edited by: Mike Dietrich on Sep 12, 2008 9:41 AM

  • Uploading a fixed length file into an internal table

    hi folks,
    I have a file in a fixed-length format and need to upload it into an internal table. How do I do that? I have done it with a comma-delimited file using the 'SPLIT' command; how do I go about this one?
    I cannot change the file format, so I have to use it as is.
    Thanks in advance for your help.
    Vinu

    Hi,
    Create a structure exactly as the length of the file..
    Example
    Assuming your file has the first 10 characters for the customer and the next 4 characters for the company code:
    CUST1234561000
    DATA: BEGIN OF FILE_STRUC,
            KUNNR(10),
            BUKRS(4),
          END OF FILE_STRUC.
    If you assign the file line to the structure FILE_STRUC, the first ten characters will automatically be stored in the KUNNR field and the next characters will be stored in the BUKRS field.
    Move the file line to the internal table ITAB of the same structure FILE_STRUC.
    THanks,
    Naren
    Message was edited by: Narendran Muthukumaran
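    A minimal sketch of that suggestion, assuming the file has already been uploaded into a hypothetical internal table it_raw of raw 255-character lines (e.g. via GUI_UPLOAD without a field separator):

    DATA: BEGIN OF it_raw OCCURS 0,       " hypothetical raw-line table
            line(255) TYPE c,
          END OF it_raw.

    DATA: BEGIN OF file_struc,
            kunnr(10),
            bukrs(4),
          END OF file_struc,
          itab LIKE TABLE OF file_struc.

    LOOP AT it_raw.
      file_struc = it_raw-line.           " character move: first 10 chars -> KUNNR, next 4 -> BUKRS
      APPEND file_struc TO itab.
    ENDLOOP.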

  • Export Data To Fixed Length File

    Hello,
    I have a requirement to export a couple of files that need to be in a fixed length format. 
    I've tried opening the dataset and writing the records, specifying the column at which each field should appear:
        WRITE /, field1, 10 field2, etc...
    I've also tried to do it through SAP_CONVERT_TO_CSV_FORMAT and the text version as well.
    Is there a predefined function that will allow me to export as fixed length? 
    Thank you in advance

    Hi Jeff,
    Following is the code to do the same:
    DATA: file TYPE string VALUE `flights.dat`,
          wa   TYPE spfli.
    FIELD-SYMBOLS <hex_container> TYPE x.
    OPEN DATASET file FOR OUTPUT IN BINARY MODE.
    SELECT *
           FROM spfli
           INTO wa.
      ASSIGN wa TO <hex_container> CASTING.
      TRANSFER <hex_container> TO file.
    * TRANSFER <hex_container> LENGTH 200 TO file.
    ENDSELECT.
    CLOSE DATASET file.
    Best regards,
    Prashant
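    If the target really has to be a character-based fixed-length file rather than a binary image, a hedged alternative sketch: move the fields into a flat character structure whose component widths define the columns, and TRANSFER that structure in text mode. The record layout and widths below are hypothetical:

    DATA: BEGIN OF ls_out,                    " hypothetical fixed-length record layout
            carrid(3)    TYPE c,
            connid(4)    TYPE c,
            cityfrom(20) TYPE c,
            cityto(20)   TYPE c,
          END OF ls_out,
          lv_file  TYPE string VALUE `flights_fixed.dat`,
          ls_spfli TYPE spfli.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    SELECT * FROM spfli INTO ls_spfli.
      MOVE-CORRESPONDING ls_spfli TO ls_out.  " character fields are padded with blanks
      TRANSFER ls_out TO lv_file LENGTH 47.   " LENGTH keeps the full fixed width even with trailing blanks
    ENDSELECT.
    CLOSE DATASET lv_file.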

  • Files with fixed length

    I have an input file which is in a fixed-length format. I need to configure the Sender File Adapter for a fixed-length file (i.e. each field length is known and I need to mention that in my sender file adapter).
    Can anyone suggest an appropriate blog, that explains my scenario.
    (I searched this site and it keeps hanging.)

    Hi Naveen
      check these links
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    for sender conversion (check the sub links in this topic)
    http://help.sap.com/saphelp_nw04/helpdata/en/0b/9a50465ccf84479e39a6d50c90fb3f/frameset.htm
    and my reply to your previous post for idoc to flat file.
    Regards
    Vishnu
