FM for reading the total number of records in a flat file

Hi,
Is there any function module that can tell me the number of records in a flat file?
I only want the FM name.
Thanks for your reply, but the file exists only on the application server.
Thanks in advance.
Message was edited by: Vipin Nagpal

Hi,
then you need to call the Unix command (if your application server is Unix):
wc -l filename
data: unixcom like rlgrap-filename.
unixcom = 'wc -l filename'.
data: begin of tabl occurs 500,
        line(400),
      end of tabl.
data: lines type i.
  call 'SYSTEM' id 'COMMAND' field unixcom
                id 'TAB'     field tabl[].
loop at tabl.
    write:/01 tabl-line.
  endloop.
Regards
vijay
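If you prefer not to call an OS command, here is a minimal sketch (not from the original reply; the file path is a placeholder) that counts the records of an application-server file with OPEN DATASET:

* Count the lines of a file on the application server (illustrative path)
DATA: lv_file  TYPE string VALUE '/usr/sap/trans/data/myfile.txt',
      lv_line  TYPE string,
      lv_count TYPE i.

OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    READ DATASET lv_file INTO lv_line.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    lv_count = lv_count + 1.
  ENDDO.
  CLOSE DATASET lv_file.
  WRITE: / 'Number of records:', lv_count.
ENDIF.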

Similar Messages

  • Creating BOM using BDC: How to display no. of records from flat file under

    Hi,
          How can I display the number of records from a flat file under one alternative BOM, vertically?
        When I execute, the records replace one another instead of being listed.
    Here is my code:
    report ZBOM1
           no standard page heading line-size 255.
    *include bdcrecx1.
    DATA: BEGIN OF bdc OCCURS 0,
           matnr(18),
           werks(4),
           stlan(1),
          END OF BDC.
    DATA: BEGIN OF BDC1 OCCURS 0,
           idnrk(18),
           MENGE(18),
           MEINS(3),
           postp(1),
          END OF bdc1.
    DATA: BEGIN OF BDCDATA OCCURS 0,
             matnr(18),
             werks(4),
             stlan(1),
             idnrk(18),
             MENGE(18),
             MEINS(3),
             postp(1),
             posnr(4),
          END OF BDCDATA.
    data: ibdcdata type  standard table of bdcdata WITH HEADER LINE.
    *start-of-selection.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        FILENAME                = 'C:\Documents and Settings\dilipkumar.b\Desktop\soft.txt'
        FILETYPE                = 'ASC'
        HAS_FIELD_SEPARATOR     = 'X'   " flag only; the file fields must then be tab-separated
      TABLES
        DATA_TAB                = BDCDATA
      EXCEPTIONS
        FILE_OPEN_ERROR         = 1
        FILE_READ_ERROR         = 2
        NO_BATCH                = 3
        GUI_REFUSE_FILETRANSFER = 4
        INVALID_TYPE            = 5
        NO_AUTHORITY            = 6
        UNKNOWN_ERROR           = 7
        BAD_DATA_FORMAT         = 8
        HEADER_NOT_ALLOWED      = 9
        SEPARATOR_NOT_ALLOWED   = 10
        HEADER_TOO_LONG         = 11
        UNKNOWN_DP_ERROR        = 12
        ACCESS_DENIED           = 13
        DP_OUT_OF_MEMORY        = 14
        DISK_FULL               = 15
        DP_TIMEOUT              = 16
        OTHERS                  = 17.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    *perform open_group.
    loop at bdcdata.
    perform bdc_dynpro      using 'SAPLCSDI' '0100'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RC29N-STLAN'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    perform bdc_field       using 'RC29N-MATNR'
                                  'SOFTDRINKS'.
    perform bdc_field       using 'RC29N-WERKS'
                                  'WIND'.
    perform bdc_field       using 'RC29N-STLAN'
                                  '1'.
    perform bdc_field       using 'RC29N-DATUV'
                                  '16.09.2008'.
    perform bdc_dynpro      using 'SAPLCSDI' '0110'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    perform bdc_field       using 'RC29K-BMENG'
                                  '1'.
    perform bdc_field       using 'RC29K-STLST'
                                  '1'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RC29K-EXSTL'.
    perform bdc_dynpro      using 'SAPLCSDI' '0111'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RC29K-LABOR'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    perform bdc_dynpro      using 'SAPLCSDI' '0140'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RC29P-POSTP(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '=FCBU'.
    perform bdc_field       using 'RC29P-IDNRK(01)'
                                  BDCDATA-IDNRK.
    perform bdc_field       using 'RC29P-MENGE(01)'
                                  BDCDATA-MENGE.
    perform bdc_field       using 'RC29P-MEINS(01)'
                                  BDCDATA-MEINS.
    perform bdc_field       using 'RC29P-POSTP(01)'
                                  BDCDATA-POSTP.
    perform bdc_dynpro      using 'SAPLCSDI' '0130'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RC29P-POSNR'.
    perform bdc_field       using 'RC29P-POSNR'
                                   BDCDATA-POSNR.            "'0010'.
    perform bdc_field       using 'RC29P-IDNRK'
                                  BDCDATA-IDNRK.             "'15'.
    perform bdc_field       using 'RC29P-MENGE'
                                  BDCDATA-MENGE.             "'1'.
    perform bdc_field       using 'RC29P-MEINS'
                                  BDCDATA-MEINS.             "'ml'.
    perform bdc_dynpro      using 'SAPLCSDI' '0131'.
    perform bdc_field       using 'BDC_OKCODE'
                                  '/00'.
    perform bdc_field       using 'BDC_CURSOR'
                                  'RC29P-POTX1'.
    perform bdc_field       using 'RC29P-SANKA'
                                  'X'.
    *perform bdc_transaction using 'CS01'.
    *perform close_group.
    CALL TRANSACTION 'CS01' USING IBDCDATA MODE 'A' UPDATE 'S'.
    REFRESH IBDCDATA.
    endloop.
    *       Start new screen                                              *
    FORM BDC_DYNPRO USING PROGRAM DYNPRO.
      CLEAR iBDCDATA.
      iBDCDATA-PROGRAM  = PROGRAM.
      iBDCDATA-DYNPRO   = DYNPRO.
      iBDCDATA-DYNBEGIN = 'X'.
      APPEND ibDCDATA .
    ENDFORM.
    *       Insert field                                                  *
    FORM BDC_FIELD USING FNAM FVAL.
    IF FVAL <> NODATA.
        CLEAR iBDCDATA.
        iBDCDATA-FNAM = FNAM.
        iBDCDATA-FVAL = FVAL.
        APPEND iBDCDATA .
    ENDIF.
    ENDFORM.

    Hi,
    The BDCDATA structure must have the fields PROGRAM, DYNPRO, DYNBEGIN, FNAM and FVAL.
    Declare your batch-input table with this structure and pass it in your CALL TRANSACTION statement.
    Please use some other name for the tables you declared as BDCDATA and IBDCDATA, because BDCDATA is the standard dictionary structure.
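    A minimal sketch of the suggested change (the names ty_file, gt_file and gt_bdcdata are illustrative): keep the uploaded file data in its own structure and use the standard dictionary structure BDCDATA only for the batch-input table.

    * Illustrative only - keep the file data separate from the BDC table
    TYPES: BEGIN OF ty_file,
             matnr(18),
             werks(4),
             stlan(1),
             idnrk(18),
             menge(18),
             meins(3),
             postp(1),
             posnr(4),
           END OF ty_file.

    DATA: gt_file    TYPE STANDARD TABLE OF ty_file WITH HEADER LINE,
          gt_bdcdata TYPE STANDARD TABLE OF bdcdata WITH HEADER LINE.

    * ...fill gt_bdcdata via the bdc_dynpro / bdc_field forms...
    CALL TRANSACTION 'CS01' USING gt_bdcdata MODE 'A' UPDATE 'S'.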

  • Code for reading particular  fields from the file placed in application

    hi,
    I need code for reading particular fields from a file placed on the application server into an internal table.

    Hi,
    Use the GUI_UPLOAD FM to upload the file into your internal table.
    DATA: FILE_TABLE TYPE FILETABLE,     " table of file names returned by the dialog
          FWA        TYPE FILE_TABLE,    " work area (line type FILE_TABLE)
          FILENAME   TYPE STRING,
          RC         TYPE I.
    " Internal table for the file contents - adjust the structure to your file layout
    DATA: BEGIN OF ITAB OCCURS 0,
            LINE(1000),
          END OF ITAB.
    CALL METHOD CL_GUI_FRONTEND_SERVICES=>FILE_OPEN_DIALOG
      EXPORTING
        WINDOW_TITLE            = 'Open File'
      CHANGING
        FILE_TABLE              = FILE_TABLE
        RC                      = RC
      EXCEPTIONS
        FILE_OPEN_DIALOG_FAILED = 1
        CNTL_ERROR              = 2
        ERROR_NO_GUI            = 3
        NOT_SUPPORTED_BY_GUI    = 4
        OTHERS                  = 5.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    READ TABLE FILE_TABLE INDEX 1 INTO FWA.
    FILENAME = FWA-FILENAME.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        FILENAME                = FILENAME
        FILETYPE                = 'DAT'
      TABLES
        DATA_TAB                = ITAB
      EXCEPTIONS
        FILE_OPEN_ERROR         = 1
        FILE_READ_ERROR         = 2
        NO_BATCH                = 3
        GUI_REFUSE_FILETRANSFER = 4
        INVALID_TYPE            = 5
        OTHERS                  = 6.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Regards,
    Balakumar.G
    Reward Points if helpful.
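    Note that GUI_UPLOAD reads from the presentation server. Since the original question mentions a file on the application server, here is a minimal, illustrative sketch (path, separator and field layout are placeholders) that uses OPEN DATASET and SPLIT to pick particular fields out of each line:

    * Illustrative only - adjust path, separator and target fields
    DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/input.txt',
          lv_line TYPE string,
          lv_f1   TYPE string,
          lv_f2   TYPE string,
          lv_rest TYPE string.
    DATA: BEGIN OF ls_rec,
            field1 TYPE string,
            field2 TYPE string,
          END OF ls_rec,
          lt_rec LIKE STANDARD TABLE OF ls_rec.

    OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      DO.
        READ DATASET lv_file INTO lv_line.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        " keep only the first two comma-separated fields of each line
        SPLIT lv_line AT ',' INTO lv_f1 lv_f2 lv_rest.
        ls_rec-field1 = lv_f1.
        ls_rec-field2 = lv_f2.
        APPEND ls_rec TO lt_rec.
      ENDDO.
      CLOSE DATASET lv_file.
    ENDIF.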

  • BAPI_PO_CREATE1 not able to create PO's for multiple rows from the flat fil

    Hi
    I am uploading POs from a flat file into SAP using BAPI_PO_CREATE1. Everything works fine if the flat file has only one record.
    If the flat file has more than one record, then while loading the second record the BAPI returns an error message. I am calling the BAPI in a loop.
    The strange thing is that if I load the second record individually, the program is able to create the PO. So only when I have multiple records in the flat file am I unable to load the POs into SAP. I debugged and checked all the internal tables passed to the BAPI. All seem to hold the data correctly, but the BAPI still fails.
    Any idea where I am going wrong?
    The code looks something like this:
    LOOP AT HEADER_ITAB.
       PERFORM FILL_HEADER_RECORDS.
    LOOP AT ITEM_ITAB WHERE EBELN eq HEADER_ITAB-EBELN.
         PERFORM FILL_ITEM_RECORDS.
    ENDLOOP.
      PERFORM CERATE_PO_VIA_BAPI.
    ENDLOOP.

    What is the error message? Are you trying something like this:
        LOOP AT T_DATA1.
          AT NEW LIFNR.
            READ TABLE T_DATA1 INDEX SY-TABIX.
            PERFORM INIT_TABLES.
            PERFORM FILL_DATA.
    *       Call the BAPI to create the PO
            PERFORM CREATE_PO.
          ENDAT.
        ENDLOOP.
    FORM CREATE_PO .
      CALL FUNCTION 'BAPI_PO_CREATE1'
        EXPORTING
          POHEADER          = POHEADER
          POHEADERX         = POHEADERX
        IMPORTING
          EXPPURCHASEORDER  = EXPPURCHASEORDER
          EXPHEADER         = EXPHEADER
          EXPPOEXPIMPHEADER = EXPPOEXPIMPHEADER
        TABLES
          RETURN            = RETURN
          POITEM            = POITEM
          POITEMX           = POITEMX
          POSCHEDULE        = POSCHEDULE
          POSCHEDULEX       = POSCHEDULEX
          POACCOUNT         = POACCOUNT
          POACCOUNTX        = POACCOUNTX
          POCOND            = POCOND
          POCONDX           = POCONDX
          POTEXTITEM        = POTEXTITEM
          POPARTNER         = POPARTNER.
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          WAIT   = 'X'
        IMPORTING
          RETURN = RETURN1.
      DATA: L_NAME TYPE LFA1-NAME1.
      CLEAR L_NAME.
      SELECT SINGLE NAME1
               FROM  LFA1
               INTO L_NAME
               WHERE LIFNR = POHEADER-VENDOR.
      LOOP AT RETURN.
        WRITE : / RETURN-TYPE,
                  RETURN-ID,
                  RETURN-MESSAGE.
        WRITE : '--> For vendor:',
                 POHEADER-VENDOR,
                 L_NAME.
      ENDLOOP.
    ENDFORM.                    " CREATE_PO
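    The INIT_TABLES routine called in the loop above is not shown in the post; a minimal sketch of what it would typically contain (the names match the parameters used in CREATE_PO) is clearing the BAPI structures and refreshing the tables between loop passes, since leftover rows from the previous pass are a common reason the second PO fails:

    * Illustrative sketch only: reset all BAPI parameters before the next PO
    FORM INIT_TABLES.
      CLEAR: POHEADER, POHEADERX, EXPPURCHASEORDER, EXPHEADER.
      REFRESH: RETURN, POITEM, POITEMX, POSCHEDULE, POSCHEDULEX,
               POACCOUNT, POACCOUNTX, POCOND, POCONDX, POTEXTITEM,
               POPARTNER.
    ENDFORM.                    " INIT_TABLES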

  • Reg: how to count no.of records in flat file

    Hi,
    Very good morning to all. I have a problem and I am new to Jython: I want to count the number of records in a flat file before loading it into the table. Is there an easier way to count the number of records in the file other than Jython?
    Please share.
    Regards,
    sh

    Hi actdi,
    Thanks for the reply. I am working in a Windows environment, and the developers on my previous project used code like the one
    below:
    filesrc = open('#filename', 'r')
    first = filesrc.readline()
    lines = 0
    while first:
        lines += 1
        first = filesrc.readline()
    # the row count ends up in 'lines'; it is surfaced through the raise below
    s1 = str(lines)
    s2 = ' in the file ------'
    s3 = '#filename'
    final = s1 + s2 + s3
    raise ' \n\n The Number of Lines in the File are ---', final
    They wrote this code in a procedure, but I have a doubt: where does it finally store the row count?
    Kindly share if you have any solutions for this in a Windows environment.
    Regards,
    sh.

  • Reading and loading a flat file to a table line by line

    Hi All
    My requirement is like this: I receive a daily file containing transaction-related data.
    For example, the first row has the transaction number, the second and third rows contain information related to that transaction, the 4th record has a new transaction number, and the following rows relate to that 2nd transaction number. In the flat file I am able to identify which record carries the transaction number; the issue is that the following rows do not carry the transaction number themselves, so rows 2 and 3 simply belong to transaction number 1.
    TRAN123
    'SALE'
    '450'
    TRAN245
    'SALE'
    '123'
    My expected output: TRAN123 has a sale of 450 and TRAN245 has a sale of 123.
    For this I need to read the data from the flat file in order.
    Thanks
    Ranga

    Ranganathan wrote:
    Hi All
    My requirement is like this, i receive a daily a file which are having transaction realted data,
    For ex: The first row will also have the transaction num and the second and third rows will be transaction realted infomation and in the 4th record i will have new transaction num and the following rows are realted to 2nd trans num details In the flat file i able to identify which record is transn num related data, now the issue will be the following will not have the transaction num informations, blindly 2 and 3 rows are having transaction num1 info only
    TRAN123
    'SALE'
    '450'
    TRAN245
    'SALE'
    '123'
    my output : TRAN123 is having a sale of 450 and TRAN245 is having a sale 123
    For this i need to read the data from the flat file by an order only
    Thanks
    Ranga
    UTL_FILE, but consider using an EXTERNAL TABLE instead.

  • How to capture error records from a flat file in BDC

    hi,
    I would like to know how to capture error records while uploading a flat file through BDC.
    Helpful solutions will be rewarded.
    thanks,
    shan

    Hi shan,
    write this code, it will solve your problem.
    DATA : BDCTAB LIKE BDCDATA OCCURS 0 WITH HEADER LINE.
    DATA : I_MSG LIKE BDCMSGCOLL OCCURS 0 WITH HEADER LINE.
    DATA : BEGIN OF I_ERR OCCURS 0,
              MATNR(18),
              FLAG(1),
              MSG(100),
           END OF I_ERR.
    DATA :V_LINES TYPE I.
    LOOP AT ITAB.
    REFRESH BDCTAB.
    PERFORM SCREEN USING: 'SAPLMGMM' '0060'.
    PERFORM FIELD USING: 'RMMG1-MATNR'  ITAB-MATNR,
                          'RMMG1-MBRSH' ITAB-MBRSH ,
                          'RMMG1-MTART' ITAB-MTART,
                          'BDC_OKCODE' '/00'.
    PERFORM SCREEN USING: 'SAPLMGMM' '0070'.
    PERFORM FIELD USING: 'MSICHTAUSW-KZSEL(01)' 'X' ,
                         'MSICHTAUSW-KZSEL(02)' 'X' ,
                         'MSICHTAUSW-KZSEL(09)' 'X' ,
                         'BDC_OKCODE' '=ENTR'.
    PERFORM SCREEN USING: 'SAPLMGMM' '0080'.
    PERFORM FIELD USING: 'RMMG1-WERKS' ITAB-WERKS,
                         'BDC_OKCODE' '=ENTR'.
    PERFORM SCREEN USING: 'SAPLMGMM' '4004'.
    PERFORM FIELD USING:  'MAKT-MAKTX' ITAB-MAKTX,
                          'MARA-MEINS' 'EA' ,
                          'MARA-MATKL' '001',
                          'BDC_OKCODE' '/00'.
    PERFORM SCREEN USING: 'SAPLMGMM' '4004'.
    PERFORM FIELD USING:  'BDC_OKCODE' '/00'.
    PERFORM SCREEN USING: 'SAPLMGMM' '4000'.
    PERFORM FIELD USING:  'MAKT-MAKTX' ITAB-MAKTX,
                          'MARA-MEINS' 'EA',
                          'MARC-EKGRP' '001',
                          'MARA-MATKL' '001',
                          'BDC_OKCODE' '=BU'.
    CALL TRANSACTION 'MM01' USING BDCTAB
                            MODE 'A'
                            MESSAGES INTO I_MSG.
    *****FINDING LAST MESSAGE IN THE I_MSG TABLE*****
    DESCRIBE TABLE I_MSG LINES V_LINES.
    *****ACCORDING TO THE V_LINES NUMBER THE TABLE WILL BE READ*****
    READ TABLE I_MSG INDEX V_LINES.
    CALL FUNCTION 'FORMAT_MESSAGE'
    EXPORTING
       ID              = I_MSG-MSGID
       LANG            = '-D'
       NO              = I_MSG-MSGNR
       V1              = I_MSG-MSGV1
       V2              = I_MSG-MSGV2
       V3              = I_MSG-MSGV3
       V4              = I_MSG-MSGV4
    IMPORTING
       MSG             = I_ERR-MSG
    EXCEPTIONS
       NOT_FOUND       = 1
        OTHERS          = 2.
     IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
             WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    IF I_MSG-MSGID EQ 'M3' AND I_MSG-MSGNR EQ '800'.
    I_ERR-FLAG = 'S'.
    ELSE.
    I_ERR-FLAG = 'E'.
    ENDIF.
    I_ERR-MATNR = ITAB-MATNR.
    APPEND I_ERR.
    CLEAR I_ERR.
    ENDLOOP.
    WRITE:/ 'SUCCESS RECORDS' COLOR COL_POSITIVE.
    SKIP.
    WRITE:/ 'MATERIAL' COLOR COL_HEADING, 20 'MESSAGE' COLOR COL_HEADING.
    LOOP AT I_ERR WHERE FLAG EQ 'S'.
    WRITE:/ I_ERR-MATNR, 20 I_ERR-MSG.
    ENDLOOP.
    SKIP 2.
    WRITE:/ 'ERROR RECORDS' COLOR COL_NEGATIVE.
    SKIP.
    WRITE:/ 'MATERIAL' COLOR COL_HEADING, 20 'MESSAGE' COLOR COL_HEADING.
    LOOP AT I_ERR WHERE FLAG EQ 'E'.
    WRITE:/ I_ERR-MATNR, 20 I_ERR-MSG.
    ENDLOOP.
    *&      Form  SCREEN
    *       SCREEN
    form SCREEN  using P_PROG P_SCREEN.
    BDCTAB-PROGRAM = P_PROG.
    BDCTAB-DYNPRO = P_SCREEN.
    BDCTAB-DYNBEGIN = 'X'.
    APPEND BDCTAB.
    CLEAR  BDCTAB.
    endform.                    " SCREEN
    *&      Form  FIELD
    *       FIELD
    form FIELD  using  FNAME FVAL .
    BDCTAB-FNAM = FNAME.
    BDCTAB-FVAL = FVAL.
    APPEND BDCTAB.
    CLEAR BDCTAB.
    endform.                    " FIELD
    Thanks,
    Murali

  • Is there a case for creating more than one flat file source system in BI 7

    Hello BI/BW Friends:
    What are the Best practices / recommendations for the number of
    flat file interfaces, Flat file source systems to be created in a BI/BW system.
    I need to load 20 flat files into a BI 7.0 system.
    Is there a best practice in this case?
    Is there a case for more than one flat file source system being created to increase the speed or
    parallelism of extraction of these flat files?
    Am I creating any system contention by trying to extract all 20 of these files through the same flat file source system?
    Any and all comments are appreciated.
    Regards,
    Joey G.
    856 912 1136

    You don't need to create several flat file source systems for uploading several files; one will do.
    You can parallelize the loads with one file source system. From a performance point of view it is the same whether you create several file source systems or just one; the only things to take care of are to fine-tune the routines, if any, and to load master data first and then transaction data.
    If you have several file source systems, it can get confusing and you have to maintain them across the landscape.
    Best practice would be to upload all the files to the application server and automate the loads through process chains.

  • How to read relevant record type from file into LSMW ?

    Here my requirement is like this:
    I have a file with 5 records.
    Some of the records in the file begin with 'AD'.
    I want to read only those records while reading the data from the file.
    I created one field in the source structure as the record type field,
    and I entered AD as its Identifying Field Content.
    But after reading the data from the file, it still reads all records,
    while it should read only the records of type AD from the source file.
    How can we achieve this?
    Thanks in advance.

    To do this, follow these steps:
    Go to step 3, Maintain Source Fields.
    Press the Change mode button, then double-click the field that carries the AD value.
    A popup window will appear; in this window there is a check box labelled "Selection parameter for import/convert data".
    Tick this check box and save.
    Now, when you read the data in step 9, you will find a select-option there; enter AD* and you will get only the records starting with AD.

  • Custom Pipeline component for Removing Trailer record from .txt file

    public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(
        Microsoft.BizTalk.Component.Interop.IPipelineContext pc,
        Microsoft.BizTalk.Message.Interop.IBaseMessage inmsg)
    {
        IBaseMessagePart bodyPart = inmsg.BodyPart;
        Stream originalStrm = bodyPart.GetOriginalDataStream();
        StreamReader sReader = new StreamReader(originalStrm, System.Text.Encoding.UTF8);
        string sRecord = sReader.ReadToEnd();

        MemoryStream memStream = new MemoryStream();
        StreamWriter sw = new StreamWriter(memStream);

        // "\r\n" is the delimiter for the records
        string[] separator = new string[] { "\r\n" };
        string[] strArray = sRecord.Split(separator, StringSplitOptions.None);

        // Loop until the last line (i.e. ignore the trailer record)
        for (int n = 1; n < strArray.Length; n++)
        {
            sw.Write(strArray[n - 1] + "\r\n");
        }
        sw.Flush();
        memStream.Flush();
        memStream.Position = 0;

        inmsg.BodyPart.Data = memStream;
        inmsg.BodyPart.Data.Position = 0;
        return inmsg;
    }
    After deploying it and adding it to the GAC, when configuring the receive pipeline it shows no properties in the Decode stage?
    MBH

    There is nothing wrong with your code: it removes the last line only if there is no carriage return on it.
    If your input file is like:
        line1 <cr><lf>
        line2 <cr><lf>
        line3
    the result is:
        line1 <cr><lf>
        line2 <cr><lf>
    But if your input file is like:
        line1 <cr><lf>
        line2 <cr><lf>
        line3 <cr><lf>
    the result is:
        line1 <cr><lf>
        line2 <cr><lf>
        <empty line>
    So when you have a carriage return on the last line, it results in an empty line. Can this be the cause of your problem?

  • Reading Data from a flat file in UCCX

    We have UCCX 7.0 and I need to do a lookup on a flat file or CSV file from a script.
    Can that be done or does it have to be from a database?

    Anything's possible if you want to write a custom Java class that parses the file. Out of the box you need to use XML and XPath or an ODBC connection though.

  • Need to read a field from flat file

    Hi All,
        In the flat file I have one field which may contain up to 100 entries separated by the delimiter ',' (comma). There can be 1, 10 or 100 entries. I need to put them in an internal table. Please advise.
        I have used the logic below:
    I split the field into 100 variables and then append them to the internal table. The problem is that even if there is only one entry, the logic will append 99 blank entries. Can anyone suggest another way to do it? Are any function modules available?
    Thanks.

    Hi Chetan,
    If I understand your requirement, it is as below.
    Flat file.
    1,2,3,4,5,6,7,8,9,....
    1,2,3,4,5,6,7,8,9,....
    1,,,,,,,,,,,,,,,,,,,,
    Now you read the first line and split it into an itab of type string, and you are worried that the third line, where not all fields have values, ends up creating blank entries. To fix this, simply
    sort the itab and use a DELETE itab statement with a WHERE clause that removes the lines whose value is ''.
    This should solve it.
    data: li_file_line type standard table of string,
          l_file_line  type string.   " one line of the flat file, e.g. '1,2,3,,,'

    split l_file_line at ',' into table li_file_line.
    delete li_file_line where table_line is initial.
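    Putting the reply together, a rough sketch (the file path is a placeholder) that uploads the file and splits every comma-delimited line into single entries, dropping the blanks:

    * Illustrative only - upload the file, split each line, drop empty values
    data: lt_raw   type standard table of string,
          lt_vals  type standard table of string,
          lt_split type standard table of string,
          lv_line  type string.

    call function 'GUI_UPLOAD'
      exporting
        filename = 'C:\temp\values.txt'    " placeholder path
        filetype = 'ASC'
      tables
        data_tab = lt_raw
      exceptions
        others   = 1.

    loop at lt_raw into lv_line.
      split lv_line at ',' into table lt_split.
      delete lt_split where table_line is initial.
      append lines of lt_split to lt_vals.
    endloop.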

  • Processing records using Flat file ActiveSync resource

    Hi Folks,
    I was working on the Flat File ActiveSync resource adapter. The documentation says the following:
    The Flat File Active Sync adapter can track the timestamp of a flat file. In addition, the adapter can archive the last file processed and then compare it to the most recent version. Identity Manager will then act on the accounts that are different in the two files.
    If these features are enabled, the first time Identity Manager polls the source flat file, the system copies the file and places it in the same directory. The copied (archived) file is named FFAS_timestamp.FFAS, with the timestamp indicating the last time the original file was changed. The format of the timestamp is determined by the operating system on which the source file resides.
    On each subsequent poll, Identity Manager compares the timestamp on the original file with the most recent timestamp. If the new timestamp value is the same as the previous value, then the file has not changed, and no further processing is performed until the next poll. If the timestamp values are different, Identity Manager checks for the presence of the FFAS file. If the file does not exist, Identity Manager processes the updated source file as if it were a new file.
    If the timestamps are different and the archived FFAS file exists, Identity Manager compares the source file with the archived file. The comparison will filter any users that have not changed. If a user has changed, then it will be sent through the adapter in the normal manner, and the configured process, correlation and delete rules determine what to do with the user.
    I have set the following attributes in synchronization policy as
    Track Last Processed Timestamp : true     
    Process Differences Only : true
    Unique Key for Diff : <some attribute>
    I have the following flat file..
    PERSONID,accountID,FIRSTNAME,LASTNAME,NATID,BIRTHDATE,EXTERNALID,TYPE,COUNTRY
    10004,,test,4,123444,12/01/1804,204,A,USA
    10005,,test,5,,12/01/1805,205,M,USA
    There is no user account in IDM repository. After processing this file, my code creates the above accounts in IDM.
    I deleted the above two accounts from IDM using the admin interface, and the accounts are not created again. So far so good.
    Now I opened the above flat file and just saved it again. It creates the accounts in IDM again.
    According to the documentation above, it should not create the accounts, as the flat file has already been processed based on the diff operation. So why is it creating the accounts again in IDM? Any ideas?
    Your response will be appreciated.

    Hi,
    I am pretty sure this has been working for me but with IAPI.Cancel. Please try to capitalize the C.
    Regards,
    Patrick

  • Duplicate records in flat file extracted using openhub

    Hi folks
    I am extracting data from the cube through an open hub destination into a flat file, and I see duplicate records in the file.
    I am doing a full load to a flat file.
    I cannot have a technical key because I am using a flat file.
    Poonam

    I am using aggregates (in the DTP there is an option to use aggregates) and the aggregates are compressed, and I am still facing this issue.
    Poonam

  • Transferring over billion records to flat file destination.

    I have a requirement where I need to create a flat file that holds over a billion records. Can you please suggest the best way to do this?
    Thank you in advance!

    SSIS, or even the SqlBulkCopy class. Make sure you have plenty of space on the disk...
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/
