Null data in the flat file

Hi,
I have generated a flat file after extracting data from an Oracle database.
My table has 10 fields, of which the second and third fields are null.
But after populating the data in the file, I cannot see empty positions for those fields; the first column is immediately followed by the fourth column.
Is there anything I have to do to get placeholders for the null fields?
Thanks in advance,
Naveen

Which delimiter are you using? For a comma-delimited (or otherwise delimited) file, the extract should still write the delimiters for null values, so an empty field shows up as e.g. ,, between its neighbours.
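If the extract is built with a SQL query, one simple way to keep the empty positions is to concatenate the columns with an explicit delimiter. A minimal sketch, where the table and column names are only placeholders for your own:

    -- Oracle treats NULL as an empty string in string concatenation, so a row
    -- with NULLs in col2 and col3 comes out as: val1,,,val4
    SELECT col1 || ',' || col2 || ',' || col3 || ',' || col4 AS line
      FROM my_extract_table;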

Similar Messages

  • Format data in the flat file while downloading from internal table

    I used 'GUI_DOWNLOAD' with WRITE_FIELD_SEPARATOR = 'X'. In my case one of the fields contains a large quantity value; because of this value, the remaining field positions in that particular row get pushed out of alignment. The fields in the internal table refer to standard data elements. I need the field values to appear under the corresponding field names. Can anyone help me?

    Hi,
    Declare another internal table in the format in which you want to download, say itab_text (itab is your original internal table). Then:
    LOOP AT itab.
      MOVE-CORRESPONDING itab TO itab_text.
      APPEND itab_text.
    ENDLOOP.
    Now pass itab_text to GUI_DOWNLOAD, as sketched below.
    Thanks
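    A minimal sketch of the declaration and the download call; the field names and file path are only illustrative:

    TYPES: BEGIN OF ty_text,
             fld1(18) TYPE c,
             qty(17)  TYPE c,     " the large quantity, already moved into a character field
           END OF ty_text.
    DATA: itab_text TYPE TABLE OF ty_text WITH HEADER LINE.

    " ... fill itab_text in the LOOP above, then:
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename              = 'C:\output.txt'
        write_field_separator = 'X'
      TABLES
        data_tab              = itab_text[]
      EXCEPTIONS
        OTHERS                = 1.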

  • What is the best way to load and convert data from a flat file?

    Hi,
    I want to load data from a flat file and convert dates, numbers and some fields with custom logic (e.g. 0/1 into N/Y) to the correct format.
    The rows where all TO_NUMBER, TO_DATE and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), the rows where the conversion raises an exception should go into table STG_ERR.
    What is the best and easiest way to achieve this?
    Thanks,
    Carsten.

    Hi,
    thanks for your answers so far!
    I gave them some thought and came up with two different alternatives:
    Alternative 1
    I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
    The columns of the staging table have the target format (date, number).
    The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
    Alternative 2
    The columns of the staging table are all of type varchar2 regardless of the target format.
    I define data rules for all columns that require a later conversion.
    I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
    The rows that cannot be loaded go automatically into the error table.
    When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
    What I dislike in alternative 1 is that I manually have to create a second file and a second mapping (ok, I can automate this using OMB*Plus).
    Further, I would prefer using expressions in the mapping for converting the data.
    What I dislike in alternative 2 is that I have to create both a data rule and a conversion expression and then keep them in sync (in case the file format changes).
    I would also prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns in the target format, but that's another mapping and a lot of I/O.
    As far as I know I need the Data Quality option for using data rules, is that true?
    Is there another alternative without any of these drawbacks?
    Otherwise I think I will go for alternative 1.
    Thanks,
    Carsten.
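    For what it's worth, outside of OWB a plain PL/SQL variant of alternative 2 is to keep the staging columns as VARCHAR2 and attempt the conversions row by row, catching any conversion exception. A minimal sketch, where stg_raw, the extra columns of stg_err and all column names are only placeholders:

    DECLARE
      v_err VARCHAR2(4000);
    BEGIN
      FOR r IN (SELECT id_txt, created_on_txt, amount_txt, flag_txt FROM stg_raw) LOOP
        BEGIN
          -- all conversions for one row; if any of them fails, the exception
          -- handler below sends the raw row to the error table instead
          INSERT INTO stg_ok (id, created_on, amount, flag)
          VALUES (TO_NUMBER(r.id_txt),
                  TO_DATE(r.created_on_txt, 'YYYY-MM-DD'),
                  TO_NUMBER(r.amount_txt),
                  DECODE(r.flag_txt, '0', 'N', '1', 'Y'));
        EXCEPTION
          WHEN OTHERS THEN
            v_err := SQLERRM;
            INSERT INTO stg_err (id_txt, created_on_txt, amount_txt, flag_txt, err_msg)
            VALUES (r.id_txt, r.created_on_txt, r.amount_txt, r.flag_txt, v_err);
        END;
      END LOOP;
      COMMIT;
    END;
    /
    Row-by-row processing is slow for large volumes, though; FORALL ... SAVE EXCEPTIONS or DML error logging (LOG ERRORS) are the usual set-based alternatives.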

  • Error while loading  data into External table from the flat files

    Hi,
    We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
    While loading the data, we are encountering the following error:
    Error occurred (Error Code : -29913 and Error Message : ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04063: un) while loading data into table_ext
    Please let us know what needs to be done in this case to solve this problem.
    Thanks,
    Kartheek

    Kartheek,
    I used Google (mine still works)... please check these links:
    http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
    http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
    HTH,
    Thierry
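    For what it is worth, ORA-29913/ORA-29400 with a KUP-04063 detail usually means the access driver could not open the data file or its log/bad file (wrong directory, missing file, or missing privileges). A few things worth checking; the directory, user and table names below are only placeholders:

    -- Does the directory object point at the unix path holding the .bcp files?
    SELECT directory_name, directory_path FROM all_directories;

    -- Can the loading schema read the data files and write the log/bad files?
    GRANT READ, WRITE ON DIRECTORY ext_data_dir TO load_user;

    -- In the external table DDL, send the LOGFILE/BADFILE to a writable directory, e.g.:
    --   CREATE TABLE table_ext ( ... )
    --   ORGANIZATION EXTERNAL (
    --     TYPE ORACLE_LOADER
    --     DEFAULT DIRECTORY ext_data_dir
    --     ACCESS PARAMETERS (
    --       RECORDS DELIMITED BY NEWLINE
    --       BADFILE ext_data_dir:'table_ext.bad'
    --       LOGFILE ext_data_dir:'table_ext.log'
    --       FIELDS TERMINATED BY '|' )
    --     LOCATION ('data.bcp') )
    --   REJECT LIMIT UNLIMITED;

    The OS user that runs the database also needs filesystem read access to the files and write access to the log directory.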

  • CUNIT error in data loading from flat file after r/3 extraction

    Hi all
    After the R/3 Business Content extraction, when I load data from a flat file into the InfoCube I get a conversion exit CUNIT error. What might be the reason? The data in the flat file's 0UNIT column is accurate and the mapping rules are also correct, but I still get the CUNIT error.

    Check your units: if you are loading amounts or quantities, check what mapping you have and what unit values you are loading from the flat file. The CUNIT conversion exit typically fails when the file contains a unit in its external/lowercase form (e.g. 'kg') that cannot be converted to the internal unit; one way to handle that is sketched below.
    BK
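    A minimal sketch of converting the unit in a transfer/field routine; the variable names are only placeholders for your transfer structure:

    DATA: lv_unit_ext(3) TYPE c,      " unit as it appears in the flat file, e.g. 'kg'
          lv_unit_int    TYPE msehi.  " internal unit expected by 0UNIT

    lv_unit_ext = source_unit.
    TRANSLATE lv_unit_ext TO UPPER CASE.

    CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
      EXPORTING
        input          = lv_unit_ext
        language       = sy-langu
      IMPORTING
        output         = lv_unit_int
      EXCEPTIONS
        unit_not_found = 1
        OTHERS         = 2.
    IF sy-subrc <> 0.
      " the unit is not maintained in customizing - this is what raises the CUNIT error
    ENDIF.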

  • Data misplaced during Flat file data load

    Hi all,
    I have to load data from a flat file (txt format), but I found that some records are not populating in the PSA.
    The volume of data is very large. How do I resolve this? The data load appears green in the InfoPackage monitor.

    Hi,
    How did you conclude that records are missing from the PSA? Is it because the count is lower than expected?
    There is a chance that records are duplicated in the flat file.
    Perhaps you can increase the number of records per data package in the settings and check the count.
    If you feel a particular record is not getting loaded, load that record selectively through a filter.
    Split the load into convenient portions so that it is easy to keep track of the records.
    Thanks.

  • Refreshing the data in a flat file

    Hi
    I am working on a data integration task.
    I need to fetch data from an Oracle database and then write it to a flat file.
    It is working fine now, but on the next fetch I don't want the old data to remain in the file: the data should be refreshed so that only the newly fetched data is present.
    After the data is written to the flat file, can I rename the file?
    I need the file name to be in the format 'File_yyyyMMDDHHmmss.txt'.
    My final question is: how should I FTP this to the target?
    Please help me with this as soon as possible, since it is needed for an urgent part of the delivery.

    All you ask is achievable:
    1) The IKM SQL to File has a TRUNCATE option which, if set to YES, will start from a clean file.
    2) You could rename the file after writing it, but why not just write it with that name? If you set the resource name to be a variable (e.g. #MyProj.MyFilename), and make sure to set the variable in the package before executing your interface, you should be able to get the file name you want. Otherwise, you can use the OdiFileMove tool in your package to rename the file.
    To set the value of the variable you can use a query on the database (if you are using Oracle, something like SELECT 'File_' || TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS') || '.txt' FROM dual).
    3) ODI has FTP classes built in; you can find the documentation under doc\webhelp\en\ref_jython\jyt_ex_ftp.htm
    Hope this helps

  • Error while uploading data from a flat file to the hierarchy

    Hi guys,
    after I upload data from a flat file to the hierarchy, I get the error message "Please select a valid InfoObject". I am loading data using the PSA and have activated all external characteristics, but I still get the problem. Some help on this please.
    regards
    Sri

    There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
    Please check the objects in BW, their lengths and types, and check whether your flat file has the same types.
    Then check the sequence of the objects in the transfer rules and activate them.
    There you go.

  • Uploading the data from a flat file into ztable

    Hi,
    I have a requirement where I have to upload data from 2 flat files into 2 Z tables (ZRB_HDR, ZRB_ITM). From the 1st flat file, data for only a few fields has to be uploaded into the Z table ZRB_HDR. From the 2nd flat file, data for all the fields has to be uploaded into the Z table ZRB_ITM. How can I do this?
    Regards,
    Hema

    Hi,
    Declare two internal tables with the structure of your tables.
    Your flat files should be .txt files.
    Now use the GUI_UPLOAD function module to upload each flat file into an internal table:
    CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename            = 'c:\file1.txt'
          has_field_separator = 'X'
        TABLES
          data_tab            = itab1
        EXCEPTIONS
          OTHERS              = 1.
    Use this function twice, once for each file/table.
    Then loop over each internal table and use the INSERT command to write the rows to the Z tables.
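    A minimal sketch of that last step, assuming itab1 was declared with the structure of ZRB_HDR (only the required fields filled) and itab2 with the structure of ZRB_ITM:

    DATA: wa_hdr TYPE zrb_hdr.

    LOOP AT itab1 INTO wa_hdr.
      INSERT zrb_hdr FROM wa_hdr.
      IF sy-subrc <> 0.
        " the key already exists - decide whether to MODIFY the row or skip it
      ENDIF.
    ENDLOOP.

    " or, as one statement per table:
    " INSERT zrb_itm FROM TABLE itab2 ACCEPTING DUPLICATE KEYS.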

  • How to load data from a  flat file which is there in the application server

    Hi All,
    How do I load data from a flat file which is on the application server?

    Hi,
    Firstly you will need to place the file(s) in the AL11 path. Then, in your InfoPackage, on the "Extraction" tab, select the "Application Server" option. Then specify the path as well as the exact file you want to load by using the browse button.
    If your file name keeps changing on a daily basis, i.e. name_ddmmyyyy.csv, then on the Extraction tab you have the option to write an ABAP routine that generates the file name. Here you will need to append sy-datum to "name" and then append ".csv" to generate the complete filename, as sketched below.
    Please let me know if this is helpful or if you need any more inputs.
    Thanks & Regards,
    Nishant Tatkar.
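    A minimal sketch of the filename-building logic for such a routine (the routine frame itself is generated by the InfoPackage; lv_filename stands in for the routine's filename parameter):

    " sy-datum is YYYYMMDD; the file is named name_ddmmyyyy.csv
    DATA: lv_filename TYPE string.

    CONCATENATE 'name_'
                sy-datum+6(2)    " day
                sy-datum+4(2)    " month
                sy-datum(4)      " year
                '.csv'
           INTO lv_filename.
    " return lv_filename through the routine's filename parameter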

  • Updating Table using the data from a flat file

    Hi,
    I have a table called emp with the columns:
    name,
    empno,
    accountno,
    amount.
    This table is already filled with values for name and empno.
    The columns accountno and amount are empty.
    I have a flat file created in some folder.
    The contents of the flat file can be like this:
    mani | 23 | 123 | 1000
    spr | 22 | 342 | 2133
    asjf | 54 | 432 | 2345
    I need to access the file in the specified location, read all the records in the flat file one after the other, and update the table "emp" with the values from the flat file.
    Row after row, all the records in the file should be applied to the table.
    I found one way to do this - SQL*Loader - but it loads the data from a file into the table; I don't need that, I need to update the table.
    Let me know how this can be solved.
    Thanks in advance.

    Just to clarify Andrew's point, you can use external tables as the source table for an UPDATE statement. You cannot use them as a target for an UPDATE statement (i.e. you can't update the text file from an Oracle table).
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC
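    A minimal sketch of the external-table approach Justin and Andrew describe; the directory, table and column names are illustrative, and the file is assumed to be '|'-delimited as in the sample above:

    -- External table: a read-only view over the flat file
    CREATE TABLE emp_ext (
      name      VARCHAR2(30),
      empno     NUMBER,
      accountno NUMBER,
      amount    NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY flat_file_dir
      -- LRTRIM strips the blanks around the '|' separators in the sample rows
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|' LRTRIM )
      LOCATION ('emp_data.txt')
    )
    REJECT LIMIT UNLIMITED;

    -- Use the file as the source of an update against emp
    MERGE INTO emp e
    USING emp_ext x
       ON (e.empno = x.empno)
     WHEN MATCHED THEN
       UPDATE SET e.accountno = x.accountno,
                  e.amount    = x.amount;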

  • Problem in the BDC program to upload the data from a flat file.

    Hi,
    I am required to write a BDC program to upload data from a flat file. The conditions are as mentioned below:
    1) A selection screen is presented to the user, who needs to provide: the file path on the presentation server (with F4 help for this obligatory parameter) and the file separator, e.g. @, #, $, %, etc. (fields in the file are separated by this special character), or the fields may be tab-delimited.
    2) Finally, after the data is uploaded, the following messages need to be displayed:
    a) Total number of records successfully uploaded
    b) Session name
    c) Number of sessions created
    The problem is that when each record is fetched from the flat file, it needs to be split into its individual fields at the delimiter (or at tabs, if tab-separated) before proceeding in the usual manner.
    It would be great if you could provide either the logic, pseudocode, or sample code for this BDC program.
    Thanks,

    Here is an example program. If you require the delimiter to be a TAB, then enter TAB on the selection screen; if you require the delimiter to be a comma, slash, pipe, whatever, then simply enter that value. This example covers only the uploading of the file, not the BDC; I assume that you know what to do once you have the data in the internal table.
    REPORT zrich_0001.
    TYPES: BEGIN OF ttab,
            rec TYPE string,
           END OF ttab.
    TYPES: BEGIN OF tdat,
           fld1(10) TYPE c,
           fld2(10) TYPE c,
           fld3(10) TYPE c,
           fld4(10) TYPE c,
           END OF tdat.
    DATA: itab TYPE TABLE OF ttab.
    data: xtab like line of itab.
    DATA: idat TYPE TABLE OF tdat.
    data: xdat like line of idat.
    DATA: file_str TYPE string.
    DATA: delimitor TYPE string.
    PARAMETERS: p_file TYPE localfile.
    PARAMETERS: p_del(5) TYPE c.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      DATA: ifiletab TYPE filetable.
      DATA: xfiletab LIKE LINE OF ifiletab.
      DATA: rc TYPE i.
      CALL METHOD cl_gui_frontend_services=>file_open_dialog
        CHANGING
          file_table = ifiletab
          rc         = rc.
      READ TABLE ifiletab INTO xfiletab INDEX 1.
      IF sy-subrc = 0.
        p_file = xfiletab-filename.
      ENDIF.
    START-OF-SELECTION.
      TRANSLATE p_del TO UPPER CASE.
      CASE p_del.
        WHEN 'TAB'.
          delimitor = cl_abap_char_utilities=>horizontal_tab.
        WHEN others.
          delimitor = p_del.
      ENDCASE.
      file_str = p_file.
      CALL METHOD cl_gui_frontend_services=>gui_upload
        EXPORTING
          filename = file_str
        CHANGING
          data_tab = itab.
      LOOP AT itab into xtab.
        CLEAR xdat.
        SPLIT xtab-rec AT delimitor INTO xdat-fld1
                                         xdat-fld2
                                         xdat-fld3
                                         xdat-fld4.
        APPEND xdat to idat.
      ENDLOOP.
      LOOP AT idat into xdat.
        WRITE:/ xdat-fld1, xdat-fld2, xdat-fld3, xdat-fld4.
      ENDLOOP.
    Regards,
    Rich Heilman

  • Rejecting the Data to a flat file

    Hi,
    I want to reject bad data to a flat file. For example, on a primary key violation the bad data should be written to a flat file instead of raising an error and stopping my mapping job (after the error limit of 50).
    Can I do this? Please help me.
    Thanks,
    Harsha

    Hi,
    What is your actual requirement?
    If you do not want the mapping to fail after raising 50 errors, then configure your mapping and increase the value of "Maximum number of errors" under the runtime parameters.
    If you actually want to capture all the PK violations in a flat file, there is no direct way of doing this.
    Either you have to query the RT repository views (write your own SQL) to produce a file, OR,
    while in the Mapping Configure window:
    expand Sources and Targets,
    expand Constraint Management,
    set Enable Constraints to FALSE,
    set Exceptions Table Name to your own exceptions table.
    Have a look in the Oracle documentation for constraint management; that should give you an idea of how to create your own exceptions table etc.
    HTH
    mahesh
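    For reference, the exceptions-table option maps onto Oracle's constraint exceptions mechanism. A plain-SQL sketch of the idea outside OWB; the table and constraint names are placeholders:

    -- Create the standard EXCEPTIONS table once with ?/rdbms/admin/utlexcpt.sql, then:
    ALTER TABLE target_tab DISABLE CONSTRAINT target_tab_pk;
    -- ... run the load ...
    ALTER TABLE target_tab ENABLE CONSTRAINT target_tab_pk
      EXCEPTIONS INTO exceptions;
    -- The ENABLE itself fails if duplicates exist, but EXCEPTIONS is populated with
    -- the ROWIDs of the violating rows, which you can then select and spool to a file:
    SELECT t.*
      FROM target_tab t
     WHERE t.rowid IN (SELECT row_id FROM exceptions);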

  • Steps to load the data by using flat file for hierarchies in BI 7.0

    Hi Gurus,
    What are the steps to load data using a flat file for hierarchies in BI 7.0?

    Hi,
    You will find the steps in the following blog by Prakash Bagali:
    Hierarchy Upload from Flat files
    regards,
    Rathy

  • Date and Time in the flat file

    Hi All,
    I am trying to design a flow which will get data from a flat file. The file has a field which contains both the time and the date. How can I handle this in BW? Do I need to create 2 InfoObjects and split the flat file field in a start routine or transfer rule? Or is there any standard InfoObject which can hold both the date and time?
    Also, the client asked me whether I want a .CSV file or an .XLS file. Which one is better for uploading into BW? Any pros and cons?
    Best Regards,

    Hi,
    There is no single data type which accepts both the date and the time. One option is to treat the field as CHAR; otherwise an update routine/formula is the best approach.
    In the update rule or transfer rule use a formula.
    Let the InfoObjects be
    0DATE
    0TIME
    The transfer rule/update rule formulas would be
    0DATE --> LEFT( 8, 'DateTime' )
    0TIME --> RIGHT( 6, 'DateTime' )
    Then CSV is the best option, as the data is ready for upload as it is.
    Regards
    Happy Tony
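    If a routine is preferred over the formulas, the equivalent split in ABAP would look roughly like this (field names are illustrative; the flat file field is assumed to be a 14-character YYYYMMDDHHMMSS value):

    DATA: lv_datetime(14) TYPE c,
          lv_date         TYPE d,
          lv_time         TYPE t.

    lv_datetime = source_field_datetime.
    lv_date     = lv_datetime(8).      " first 8 characters  -> 0DATE
    lv_time     = lv_datetime+8(6).    " last 6 characters   -> 0TIME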
