Export Table to flat file and upload to FTP server for external vendor

I have done some extensive searches on the forums and I think I have an idea, but I really need some help. Basically, I have a table that needs to be exported to a flat file and uploaded to an FTP server so that our vendor can get the data Monday through Saturday, with the whole process automated and the file named by sysdate. This data is used to produce daily letters that go out to customers.
After doing some searching I came across the UTL_FILE package and am reading more into it right now, but I am not sure if it is what I should be using to create the file.
I am using TOAD right now and can see how to do it manually; however, I really need the file to create itself and be sent automatically at 6 am Monday through Saturday, with the date at the end of the file name so that each day's file is unique.
Thanks in advance for all of your help.

As Justin and others have mentioned, I wrote [XUTL_FTP|http://www.chrispoole.co.uk/apps/xutlftp.htm], so you won't find it in the Oracle documentation. It's pure PL/SQL, built on UTL_TCP, and implements an integrated PL/SQL FTP client / API. One of its many overloaded procedures FTPs the contents of a ref cursor straight to an FTP server of your choice, meaning no intervening file is required. Since it is pure PL/SQL, it is trivial to schedule using DBMS_JOB/DBMS_SCHEDULER (see the sketch below).
Any questions just ask.
HTH Chris
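For the scheduling piece, here is a minimal DBMS_SCHEDULER sketch (my own illustration, not part of XUTL_FTP): it assumes a hypothetical procedure export_and_ftp_letters that does the actual export/FTP work, and shows how to run it at 6 am Monday through Saturday with the date appended to the file name.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'DAILY_LETTER_EXPORT',
    job_type        => 'PLSQL_BLOCK',
    -- export_and_ftp_letters is a placeholder for your own export/FTP routine
    job_action      => q'[BEGIN export_and_ftp_letters('letters_' || TO_CHAR(SYSDATE,'YYYYMMDD') || '.txt'); END;]',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYDAY=MON,TUE,WED,THU,FRI,SAT; BYHOUR=6; BYMINUTE=0; BYSECOND=0',
    enabled         => TRUE);
END;
/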

Similar Messages

  • Export table to flat file and need to insert sysdate in flat file column

    Hi, I created an interface to export an Oracle table to a CSV file. All of the table columns are working well. I then need to insert the sysdate into a CSV file column.
    I made the mapping execute on the staging area; the implementation is to_char(sysdate,'dd/mm/yyyy'). But the result is insert 14 to the column.
    I have tried to create a variable refreshed as select to_char(sysdate,'dd/mm/yyyy') from dual, then mapped it to the column in the CSV file, but it only inserts one row and the format is yyyymmdd.
    I have tried to use SELECT '<%=odiRef.getSysDate( "yyyyMMdd")%>' from dual for the variable, and it also inserts only one row into the flat file.
    I used the same methodology in ODI10g, it works fine.
    So, I am wondering how it can be implemented in 11g.
    Thanks

    The first option you have stated seems like the obvious choice - I don't see any reason why this shouldn't work. What do you mean by "But the result is insert 14 to the column."? Do you mean that it inserted the string "14" (I can't imagine why this would be the case), or that it inserted 14 rows?

  • Exporting tables to flat files

    Hi,
    Can I export tables to flat files (CSV) using PL/SQL? I mean, suppose I have 2 tables 'student' and 'subject' and I want to create two flat files student.csv and subject.csv. Please let me know if I can do it using PL/SQL for both tables in one go.
    Even if I can do it for one table, then I can write a script to automate it for all the tables. Please let me know your suggestions. Appreciate your help.
    Thanks

    As the SYS user:
    CREATE OR REPLACE DIRECTORY TEST_DIR AS '\tmp\myfiles'
    /
    GRANT READ, WRITE ON DIRECTORY TEST_DIR TO myuser
    /
    As myuser:
    CREATE OR REPLACE PROCEDURE run_query(p_sql IN VARCHAR2
                                         ,p_dir IN VARCHAR2
                                         ,p_header_file IN VARCHAR2
                                         ,p_data_file IN VARCHAR2 := NULL) IS
      v_finaltxt  VARCHAR2(4000);
      v_v_val     VARCHAR2(4000);
      v_n_val     NUMBER;
      v_d_val     DATE;
      v_ret       NUMBER;
      c           NUMBER;
      d           NUMBER;
      col_cnt     INTEGER;
      f           BOOLEAN;
      rec_tab     DBMS_SQL.DESC_TAB;
      col_num     NUMBER;
      v_fh        UTL_FILE.FILE_TYPE;
      v_samefile  BOOLEAN := (NVL(p_data_file,p_header_file) = p_header_file);
    BEGIN
      c := DBMS_SQL.OPEN_CURSOR;
      DBMS_SQL.PARSE(c, p_sql, DBMS_SQL.NATIVE);
      d := DBMS_SQL.EXECUTE(c);
      DBMS_SQL.DESCRIBE_COLUMNS(c, col_cnt, rec_tab);
      FOR j in 1..col_cnt
      LOOP
        CASE rec_tab(j).col_type
          WHEN 1 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
          WHEN 2 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_n_val);
          WHEN 12 THEN DBMS_SQL.DEFINE_COLUMN(c,j,v_d_val);
        ELSE
          DBMS_SQL.DEFINE_COLUMN(c,j,v_v_val,2000);
        END CASE;
      END LOOP;
      -- This part outputs the HEADER
      v_fh := UTL_FILE.FOPEN(upper(p_dir),p_header_file,'w',32767);
      FOR j in 1..col_cnt
      LOOP
        v_finaltxt := ltrim(v_finaltxt||','||lower(rec_tab(j).col_name),',');
      END LOOP;
      --  DBMS_OUTPUT.PUT_LINE(v_finaltxt);
      UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
      IF NOT v_samefile THEN
        UTL_FILE.FCLOSE(v_fh);
      END IF;
      -- This part outputs the DATA
      IF NOT v_samefile THEN
        v_fh := UTL_FILE.FOPEN(upper(p_dir),p_data_file,'w',32767);
      END IF;
      LOOP
        v_ret := DBMS_SQL.FETCH_ROWS(c);
        EXIT WHEN v_ret = 0;
        v_finaltxt := NULL;
        FOR j in 1..col_cnt
        LOOP
          CASE rec_tab(j).col_type
            WHEN 1 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_v_val);
                        v_finaltxt := ltrim(v_finaltxt||',"'||v_v_val||'"',',');
            WHEN 2 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_n_val);
                        v_finaltxt := ltrim(v_finaltxt||','||v_n_val,',');
            WHEN 12 THEN DBMS_SQL.COLUMN_VALUE(c,j,v_d_val);
                        v_finaltxt := ltrim(v_finaltxt||','||to_char(v_d_val,'DD/MM/YYYY HH24:MI:SS'),',');
          ELSE
            v_finaltxt := ltrim(v_finaltxt||',"'||v_v_val||'"',',');
          END CASE;
        END LOOP;
      --  DBMS_OUTPUT.PUT_LINE(v_finaltxt);
        UTL_FILE.PUT_LINE(v_fh, v_finaltxt);
      END LOOP;
      UTL_FILE.FCLOSE(v_fh);
      DBMS_SQL.CLOSE_CURSOR(c);
    END;
    /
    This allows the header row and the data to be written to separate files if required.
    e.g.
    SQL> exec run_query('select * from emp','TEST_DIR','output.txt');
    PL/SQL procedure successfully completed.
    Output.txt file contains:
    empno,ename,job,mgr,hiredate,sal,comm,deptno
    7369,"SMITH","CLERK",7902,17/12/1980 00:00:00,800,,20
    7499,"ALLEN","SALESMAN",7698,20/02/1981 00:00:00,1600,300,30
    7521,"WARD","SALESMAN",7698,22/02/1981 00:00:00,1250,500,30
    7566,"JONES","MANAGER",7839,02/04/1981 00:00:00,2975,,20
    7654,"MARTIN","SALESMAN",7698,28/09/1981 00:00:00,1250,1400,30
    7698,"BLAKE","MANAGER",7839,01/05/1981 00:00:00,2850,,30
    7782,"CLARK","MANAGER",7839,09/06/1981 00:00:00,2450,,10
    7788,"SCOTT","ANALYST",7566,19/04/1987 00:00:00,3000,,20
    7839,"KING","PRESIDENT",,17/11/1981 00:00:00,5000,,10
    7844,"TURNER","SALESMAN",7698,08/09/1981 00:00:00,1500,0,30
    7876,"ADAMS","CLERK",7788,23/05/1987 00:00:00,1100,,20
    7900,"JAMES","CLERK",7698,03/12/1981 00:00:00,950,,30
    7902,"FORD","ANALYST",7566,03/12/1981 00:00:00,3000,,20
    7934,"MILLER","CLERK",7782,23/01/1982 00:00:00,1300,,10The procedure allows for the header and data to go to seperate files if required. Just specifying the "header" filename will put the header and data in the one file.
    Adapt to output different datatypes and styles are required.
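    For example, to write the header row and the data to two separate files with the same procedure (the file names here are just illustrative):
    SQL> exec run_query('select * from emp','TEST_DIR','emp_header.txt','emp_data.txt');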

  • Automate a BI report to flat file on to a FTP server.

    Hello all,
    I have the same issue as the following thread: the CSV file extracts successfully to the directory, but I am unable to find the file. Any thoughts?
    Automate a BI report to flat file on to a FTP server.
    Regards!

    You need to analyze the business purpose; however, here's my take on it.
    You can use the following three options to export the query result onto the SAP directories and then map that path to your FTP server.
    1. Transaction RSCRM_BAPI (still valid in NW2004s)
    2. Transaction RSANWB (Analysis Process Designer)
    3. Information Broadcasting - KM folder has to be mapped as a file system location.
    Hope it Helps
    Chetan
    @CP..

  • Export table to flat file programmatically

    Hi all,
    I want to know how to export a table (or tables) programmatically to a flat file.
    I know that we can do that by right-clicking on the table and choosing Save As, etc. (in TOAD).
    But how do I do it with a script?
    Thanks

    http://osi.oracle.com/~tkyte/flat/index.html
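    If a plain script is enough, a simple SQL*Plus spool will also do the job. A minimal sketch, assuming a student table with made-up column names (adjust the SELECT and file name for each table):
    set heading off feedback off pagesize 0 linesize 32767 trimspool on
    spool student.csv
    select student_id || ',' || student_name || ',' || to_char(enrol_date,'DD/MM/YYYY')
    from   student;
    spool off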

  • Flat  file to upload data using BDC for transaction MM01

    Hi
    I am trying to update data using the BDC code attached below, using a txt file.
    It updates the first set of data into the table MARA, but not the rest.
    All the data from the txt file has been loaded into the internal table, but the problem is that it does not get updated from the internal table to the database.
    Only the first set of data has been loaded; the rest of the data is not loaded.
    Content of txt file:
    zsc     zsc     kg     
    zsv     zsv     kg     
    zsb     zsb     kg
    Actual code:
    report ZMAT_UPLOAD
           no standard page heading line-size 255.
    * types declaration ..........................................................................
    types : begin of t_mat,
       matnr(20),
       desc(50),
       uom(5),
    end of t_mat.
    * internal table and work area declaration .......................................
    data : i_mat type table of t_mat.
    data : wa_mat type t_mat.
    include bdcrecx1.
    start-of-selection.
    * moving the flat file content to internal table ................................
    CALL FUNCTION 'UPLOAD'
         EXPORTING
             FILETYPE   = 'DAT'
         TABLES
             data_tab   = i_mat.
    IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    perform open_group.
    loop at i_mat into wa_mat.
    perform bdc_dynpro      using 'SAPLMGMM' '0060'.
    perform bdc_field       using 'BDC_CURSOR'
                                       'RMMG1-MATNR'.
    perform bdc_field       using 'BDC_OKCODE'
                                        '=AUSW'.
    perform bdc_field       using 'RMMG1-MATNR'
                                        wa_mat-matnr.
    perform bdc_field       using 'RMMG1-MBRSH'
                                        'P'.
    perform bdc_field       using 'RMMG1-MTART'
                                        'ZOH'.
    perform bdc_dynpro   using 'SAPLMGMM' '0070'.
    perform bdc_field       using 'BDC_CURSOR'
                                        'MSICHTAUSW-DYTXT(01)'.
    perform bdc_field       using 'BDC_OKCODE'
                                        '=ENTR'.
    perform bdc_field       using 'MSICHTAUSW-KZSEL(01)'
                                        'X'.
    perform bdc_dynpro   using 'SAPLMGMM' '4004'.
    perform bdc_field       using 'BDC_OKCODE'
                                        '=BU'.
    perform bdc_field       using 'MAKT-MAKTX'
                                        wa_mat-desc.
    perform bdc_field       using 'BDC_CURSOR'
                                        'MARA-MEINS'.
    perform bdc_field       using 'MARA-MEINS'
                                        wa_mat-uom.
    perform bdc_field       using 'MARA-MTPOS_MARA'
                                        'NORM'.
    perform bdc_transaction using 'MM01'.
    endloop.
    Perform close_group.

    Hi Sumant,
    Just concentrate on the changed lines: the t_mat table re-declared with a header line, the commented-out i_mat/wa_mat declarations, and the lines marked with ---> below.
    report ZMAT_UPLOAD
    no standard page heading line-size 255.
    * types declaration ..........................................................................
    data : begin of t_mat occurs 0,
    matnr(20),
    desc(50),
    uom(5),
    end of t_mat.
    * internal table and work area declaration .......................................
    *data : i_mat type table of t_mat.
    *data : wa_mat type t_mat.
    include bdcrecx1.
    start-of-selection.
    * moving the flat file content to internal table ................................
    CALL FUNCTION 'UPLOAD'
    EXPORTING
    FILETYPE = 'DAT'
    TABLES
    data_tab = i_mat. ---> t_mat.
    IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    perform open_group.
    loop at i_mat into wa_mat. ------> loop at t_mat.
    perform bdc_dynpro using 'SAPLMGMM' '0060'.
    perform bdc_field using 'BDC_CURSOR'
    'RMMG1-MATNR'.
    perform bdc_field using 'BDC_OKCODE'
    '=AUSW'.
    perform bdc_field using 'RMMG1-MATNR'
    wa_mat-matnr. ----> t_mat-matnr (change this and the other wa_mat references to t_mat).
    perform bdc_field using 'RMMG1-MBRSH'
    'P'.
    perform bdc_field using 'RMMG1-MTART'
    'ZOH'.
    perform bdc_dynpro using 'SAPLMGMM' '0070'.
    perform bdc_field using 'BDC_CURSOR'
    'MSICHTAUSW-DYTXT(01)'.
    perform bdc_field using 'BDC_OKCODE'
    '=ENTR'.
    perform bdc_field using 'MSICHTAUSW-KZSEL(01)'
    'X'.
    perform bdc_dynpro using 'SAPLMGMM' '4004'.
    perform bdc_field using 'BDC_OKCODE'
    '=BU'.
    perform bdc_field using 'MAKT-MAKTX'
    wa_mat-desc.
    perform bdc_field using 'BDC_CURSOR'
    'MARA-MEINS'.
    perform bdc_field using 'MARA-MEINS'
    wa_mat-uom.
    perform bdc_field using 'MARA-MTPOS_MARA'
    'NORM'.
    perform bdc_transaction using 'MM01'.
    endloop.
    Perform close_group.
    Reward points for helpful answers.
    Thanks
    Naveen khan
    Message was edited by:
            Pattan Naveen

  • How to display read file and put in ftp server in jsp

    Hi all
    Since this is a very important question, please understand it and reply soon.
    I am using the PowerAnalyzer SDK, with JSP as the front end.
    I want to read a file using the PowerAnalyzer SDK (how do I read it?) and put it on an FTP server (how do I put it there?).
    thanks and regards
    Kasim
    Bangalore

    http://java.sun.com/docs/books/tutorial/uiswing/components/editorpane.html

  • Exporting R3 tables into Flat Files for BPC

    Dear BPC experts,
    I understand currently the way for BPC to extract data from R3 is via flat files. I have the following queries:
    1) What exactly are the T codes and steps required to export R3 tables into flat files (without going through the OpenHub in BI7)? Can this process be automated?
    2) Is Data Manager of BPC equivalent to SSIS (Integration Services) of SQL Server?
    Please advise. Thanks!!
    SJ

    Hi Soong Jeng,
    I would take a look at the existing BI Extractors for the answer to Q1; I am working on finishing up an HTG (How-To Guide) regarding this. Look for it very soon.
    Here is the code to dump out data from a BI Extractor directly from ERP.
    You need dev permissions in your ERP system and access to an app server folder. ...Good Luck
    *& Report  Z_EXTRACTOR_TO_FILE                                         *
    report  z_extractor_to_file                     .
    type-pools:
      rsaot.
    parameters:
      p_osrce type roosource-oltpsource,
      p_filnm type rlgrap-filename,
      p_maxsz type rsiodynp4-maxsize default 100,
      p_maxfc type rsiodynp4-calls default 10,
      p_updmd type rsiodynp4-updmode default 'F'.
    data:
      l_lines_read type sy-tabix,
      ls_select type rsselect,
      lt_select type table of rsselect,
      ls_field type rsfieldsel,
      lt_field type table of rsfieldsel,
      l_quiet type rois-genflag value 'X',
      l_readonly type rsiodynp4-readonly value 'X',
      l_desc_type type char1 value 'M',
      lr_data type ref to data,
      lt_oltpsource type rsaot_s_osource,
      l_req_no type rsiodynp4-requnr,
      l_debugmode type rsiodynp4-debugmode,
      l_genmode type rois-genflag,
      l_columns type i,
      l_temp_char type char40,
      l_filename    like rlgrap-filename,
      wa_x030l      like x030l,
      tb_dfies      type standard table of dfies,
      wa_dfies      type dfies,
      begin of tb_flditab occurs 0,
    *   field description
        fldname(40)  type c,
      end of tb_flditab,
      ls_flditab like line of tb_flditab,
      l_file type string.
    field-symbols:
      <lt_data> type standard table,
      <ls_data> type any,
      <ls_field> type any.
    call function 'RSA1_SINGLE_OLTPSOURCE_GET'
      exporting
        i_oltpsource   = p_osrce
      importing
        e_s_oltpsource = lt_oltpsource
      exceptions
        no_authority   = 1
        not_exist      = 2
        inconsistent   = 3
        others         = 4.
    if sy-subrc <> 0.
    * ERROR
    endif.
    create data lr_data type standard table of (lt_oltpsource-exstruct).
    assign lr_data->* to <lt_data>.
    call function 'RSFH_GET_DATA_SIMPLE'
      exporting
        i_requnr                     = l_req_no
        i_osource                    = p_osrce
        i_maxsize                    = p_maxsz
        i_maxfetch                   = p_maxfc
        i_updmode                    = p_updmd
        i_debugmode                  = l_debugmode
        i_abapmemory                 = l_genmode
        i_quiet                      = l_quiet
        i_read_only                  = l_readonly
      importing
        e_lines_read                 = l_lines_read
      tables
        i_t_select                   = lt_select
        i_t_field                    = lt_field
        e_t_data                     = <lt_data>
      exceptions
        generation_error             = 1
        interface_table_error        = 2
        metadata_error               = 3
        error_passed_to_mess_handler = 4
        no_authority                 = 5
        others                       = 6.
    * get table/structure field info
    call function 'GET_FIELDTAB'
      exporting
        langu               = sy-langu
        only                = space
        tabname             = lt_oltpsource-exstruct
        withtext            = 'X'
      importing
        header              = wa_x030l
      tables
        fieldtab            = tb_dfies
      exceptions
        internal_error      = 01
        no_texts_found      = 02
        table_has_no_fields = 03
        table_not_activ     = 04.
    * check result
    case sy-subrc.
      when 0.
    *      copy fieldnames
        loop at tb_dfies into wa_dfies.
          case l_desc_type.
            when 'F'.
              tb_flditab-fldname = wa_dfies-fieldname.
            when 'S'.
              tb_flditab-fldname = wa_dfies-scrtext_s.
            when 'M'.
              tb_flditab-fldname = wa_dfies-scrtext_m.
            when 'L'.
              tb_flditab-fldname = wa_dfies-scrtext_l.
            when others.
    *         use fieldname
              tb_flditab-fldname = wa_dfies-fieldname.
          endcase.
          append tb_flditab.
    *        clear variables
          clear: wa_dfies.
        endloop.
      when others.
        message id sy-msgid type sy-msgty number sy-msgno
        with  sy-subrc raising error_get_dictionary_info.
    endcase.
    describe table tb_flditab lines l_columns.
    " MOVE DATA TO THE APPLICATION SERVER
    open dataset p_filnm for output in text mode encoding utf-8
      with windows linefeed.
    data i type i.
    loop at <lt_data> assigning <ls_data>.
      loop at tb_flditab into ls_flditab.
        i = sy-tabix.
        assign component i of structure <ls_data> to <ls_field>.
        l_temp_char = <ls_field>.
        if i eq 1.
          l_file = l_temp_char.
        else.
          concatenate l_file ',' l_temp_char  into l_file.
        endif.
      endloop.
      transfer l_file to p_filnm.
      clear l_file.
    endloop.
    close dataset p_filnm.
    Cheers,
    Scott
    Edited by: Jeffrey Holdeman on May 25, 2010 4:44 PM
    Added  markup to improve readability

  • How can one  read a Excel File and Upload into Table using Pl/SQL Code.

    How can one read an Excel file and upload it into a table using PL/SQL code?
    1. Excel File is on My PC.
    2. And I want to write a Stored Procedure or Package to do that.
    3. DataBase is on Other Server. Client-Server Environment.
    4. I am Using Toad or PlSql developer tool.

    If you would like to create a package/procedure in order to solve this problem, consider using the built-in UTL_FILE package. Here are a few steps to get you going:
    1. Get your DBA to create a directory object in Oracle using the following command:
    create directory TEST_DIR as 'directory_path';
    Note: This directory is on the server.
    2. Grant read, write on directory directory_object_name to username;
    You can find out the directory_object_name value from the dba_directories view if you are using the system user account.
    3. Log on as the user mentioned above.
    Sample code to read a plain text file; you can modify this code to suit your needs (e.g. to read a CSV file):
    function getData(p_filename in varchar2,
                     p_filepath in varchar2) return varchar2 is
      input_file   utl_file.file_type;
      -- declare a buffer to read text data
      input_buffer varchar2(4000);
    begin
      -- using the built-in UTL_FILE package
      input_file := utl_file.fopen(p_filepath, p_filename, 'R');
      utl_file.get_line(input_file, input_buffer);
      -- debug
      -- dbms_output.put_line(input_buffer);
      utl_file.fclose(input_file);
      -- return data
      return input_buffer;
    end;
    Hope this helps.
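    As a rough extension of the above (a sketch only, with hypothetical table and column names), the same UTL_FILE calls can be wrapped in a loop that reads every line of a CSV saved from the Excel file and inserts the first two fields into a table:
    CREATE OR REPLACE PROCEDURE load_csv(p_dir IN VARCHAR2, p_file IN VARCHAR2) IS
      v_fh   UTL_FILE.FILE_TYPE;
      v_line VARCHAR2(4000);
    BEGIN
      v_fh := UTL_FILE.FOPEN(p_dir, p_file, 'R', 32767);
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(v_fh, v_line);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN EXIT;  -- end of file reached
        END;
        -- my_table, col1 and col2 are placeholders; simple comma split, no quoted fields
        INSERT INTO my_table (col1, col2)
        VALUES (REGEXP_SUBSTR(v_line, '[^,]+', 1, 1),
                REGEXP_SUBSTR(v_line, '[^,]+', 1, 2));
      END LOOP;
      UTL_FILE.FCLOSE(v_fh);
      COMMIT;
    END;
    /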

  • Size of Infocube Export to a flat file (.CSV)

    First off, I'm new to BW so I apologize if I use the wrong terminology. I was wondering if there was any way to estimate the size of a flat file infocube export. For example, if I look up all the associated DB tables for a cube, let's call it TESTCUBE, would the total size of those tables be about the size of the TESTCUBE export to a flat file. This assumes that the cube is not compressed in any way. So if the the total DB table size is 100 MB then is the flat file about 100 MB? If not, is there a way to estimate this.
    Thanks!

    Can't say for sure how all the DBs that BW runs on work, but with Oracle, space is consumed in blocks, not by rows/records.
    Let's say you have a table where the average row width (record length) is 100 bytes. In a tablespace that has 8K blocks, in theory, you could fit 81 rows in the block.
    However, there is usually a certain amount of free space left in a block when rows are added, to accommodate subsequent inserts/updates. The BW tables I have looked at default this to 10%. So that means you have over 800 bytes of the block that might not get filled. There are other things that can result in space being consumed, such as deletions that leave dead space, and updates that can cause a row to migrate to another block, etc. Also, space is allocated in chunks as a table grows, and you may add one row to a table that causes a 10 MB chunk of storage to be allocated, but it is holding only 1 row.
    So, as you can begin to see, a flat file export should require less space than the source table (unless you are using database table compression, which could completely invalidate what I have said), but by how much depends on all the things I mentioned, and could change from day to day.
    You could certainly use as a rough rule of thumb that a flat file export will require the same space, and you would be safe (except if you are using db compression).
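    As a rough worked example of the arithmetic above (assuming 8 KB blocks, 100-byte rows and 10% free space per block):
    rows per block ≈ floor(8192 × (1 − 0.10) / 100) = 73, versus floor(8192 / 100) = 81 in theory.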

  • Data convertion while exporting data into flat files using export wizard in ssis

    Hi ,
    while exporting data to a flat file through the export wizard, the source table has NVARCHAR types.
    Could you please help me with how to do the data conversion when using the export wizard?
    Thanks.

    Hi Avs sai,
    By default, the columns in the destination flat file will be non-Unicode columns, e.g. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input column when outputting to the destination file, you can check the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings…" button to check the data types.
    Regards,
    Mike Yin
    TechNet Community Support

  • Download data from internal table to flat file.

    I need to download the data from an internal table to a flat file. Can anyone suggest how to do it? I suppose WS_DOWNLOAD or GUI_DOWNLOAD,
    but if so, please guide me on how to use them.
    Is there any other function module? Please provide the information.
    Thanks in advance

    Hi,
    Try this,
    * File download, uses older techniques but achieves a perfectly
    * acceptable solution which also allows the user to append data to
    * an existing file.
      PARAMETERS: p_file like rlgrap-filename.
    * Internal table to store export data  
      DATA: begin of it_excelfile occurs 0,
       row(500) type c,
       end of it_excelfile.
      DATA: rc TYPE sy-ucomm,
            ld_answer TYPE c.
      CALL FUNCTION 'WS_QUERY'
           EXPORTING
                query    = 'FE'  "File Exist?
                filename = p_file
           IMPORTING
                return   = rc.
      IF rc NE 0.                       "If file already exists
        CALL FUNCTION 'POPUP_TO_CONFIRM'
          EXPORTING
    *          TITLEBAR              = ' '
    *          DIAGNOSE_OBJECT       = ' '
               text_question         = 'File Already exists!!'
               text_button_1         = 'Replace'
    *          ICON_BUTTON_1         = ' '
               text_button_2         = 'New name'
    *          ICON_BUTTON_2         = ' '
    *          DEFAULT_BUTTON        = '1'
    *          DISPLAY_CANCEL_BUTTON = 'X'
    *          USERDEFINED_F1_HELP   = ' '
    *          START_COLUMN          = 25
    *          START_ROW             = 6
    *          POPUP_TYPE            =
          IMPORTING
               answer                = ld_answer
    *     TABLES
    *         PARAMETER              =
          EXCEPTIONS
              text_not_found         = 1
              OTHERS                 = 2.
    * Option 1: Overwrite
        IF ld_answer EQ '1'.
          CALL FUNCTION 'GUI_DOWNLOAD'
            EXPORTING
    *            BIN_FILESIZE            =
                 filename                = p_file        "File Name
                 filetype                = 'ASC'
    *       IMPORTING
    *            FILELENGTH              =
            TABLES
                data_tab                = it_excelfile   "Data table
            EXCEPTIONS
                file_write_error        = 1
                no_batch                = 2
                gui_refuse_filetransfer = 3
                invalid_type            = 4
                OTHERS                  = 5.
          IF sy-subrc <> 0.
            MESSAGE i003(zp) WITH
                     'There was an error during Excel file creation'(200).
            exit. "Causes short dump if removed and excel document was open 
          ENDIF.
    * Option 2: New name.
        ELSEIF ld_answer EQ '2'.
          CALL FUNCTION 'DOWNLOAD'
            EXPORTING
                 filename            = p_file          "File name
                 filetype            = 'ASC'           "File type
    *             col_select          = 'X'            "COL_SELECT
    *             col_selectmask      = 'XXXXXXXXXXXXXXXXXXXXXXXXXX'
    *                                                   "COL_SELECTMASK
                 filetype_no_show    = 'X'     "Show file type selection?
    *       IMPORTING
    *             act_filename        = filename_dat
            TABLES
                 data_tab            = it_excelfile    "Data table
    *            fieldnames          =
            EXCEPTIONS
                 file_open_error     = 01
                 file_write_error    = 02
                 invalid_filesize    = 03
                 invalid_table_width = 04
                 invalid_type        = 05
                 no_batch            = 06
                 unknown_error       = 07.
        ENDIF.
      ELSE.                               "File does not already exist.
        CALL FUNCTION 'GUI_DOWNLOAD'
          EXPORTING
    *          BIN_FILESIZE            =
               filename                = p_file         "File name
               filetype                = 'ASC'          "File type
    *     IMPORTING
    *          FILELENGTH              =
          TABLES
               data_tab                = it_excelfile   "Data table
          EXCEPTIONS
               file_write_error        = 1
               no_batch                = 2
               gui_refuse_filetransfer = 3
               invalid_type            = 4
               OTHERS                  = 5.
        IF sy-subrc <> 0.
          MESSAGE i003(zp) WITH
                   'There was an error during Excel file creation'(200).
          exit. "Causes short dump if removed and excel document was open 
        ENDIF.
      ENDIF.
    Regards,
    Raghav

  • How to convert from SQL Server table to Flat file (txt file)

    I need to ask how to convert from a SQL Server table to a flat file (txt file).

    Hi
    1. Import/Export wizard
    2. bcp utility
    3. SSIS
    1. Import/Export Wizard
    The first and very manual technique is the import wizard. This is great for ad-hoc and just-slam-it-in tasks.
    In SSMS right click the database you want to import into.  Scroll to Tasks and select Import Data…
    For the data source we want our zips.txt file. Browse for it and select it. You should notice the wizard tries to fill in the blanks for you. One key thing here with the file I picked is that there are " qualifiers, so we need to make sure we add " into the text qualifier field. The wizard will not do this for you.
    Go through the remaining pages to view everything.  No further changes should be needed though
    Hit next after checking the pages out and select your destination.  This in our case will be DBA.dbo.zips.
    Following the destination step, go into the edit mappings section to ensure we look good on the types and counts.
    Hit next and then finish.  Once completed you will see the count of rows transferred and the success or failure rate
    Import wizard completed and you have the data!
    bcp utility
    Method two is bcp with a format file http://msdn.microsoft.com/en-us/library/ms162802.aspx
    This is probably going to win for speed on most occasions but is limited to the formatting of the file being imported.  For this file it actually works well with a small format file to show the contents and mappings to SQL Server.
    To create a format file all we really need is the type and the count of columns for the most basic files.  In our case the qualifier makes it a bit difficult but there is a trick to ignoring them.  The trick is to basically throw a field into the
    format file that will reference it but basically ignore it in the import process.
    Given that, our format file in this case would appear like this:
    9.0
    9
    1 SQLCHAR 0 0 "\"" 0 dummy1 ""
    2 SQLCHAR 0 50 "\",\"" 1 Field1 ""
    3 SQLCHAR 0 50 "\",\"" 2 Field2 ""
    4 SQLCHAR 0 50 "\",\"" 3 Field3 ""
    5 SQLCHAR 0 50 "\"," 4 Field4 ""
    6 SQLCHAR 0 50 "," 5 Field5 ""
    7 SQLCHAR 0 50 "," 6 Field6 ""
    8 SQLCHAR 0 50 "," 7 Field7 ""
    9 SQLCHAR 0 50 "\n" 8 Field8 ""
    The bcp call would be as follows
    C:\Program Files\Microsoft SQL Server\90\Tools\Binn>bcp DBA..zips in "C:\zips.txt" -f "c:\zip_format_file.txt" -S LKFW0133 -T
    Given a successful run you should see this in command prompt after executing the statement
    Starting copy...
    1000 rows sent to SQL Server. Total sent: 1000
    1000 rows sent to SQL Server. Total sent: 2000
    1000 rows sent to SQL Server. Total sent: 3000
    1000 rows sent to SQL Server. Total sent: 4000
    1000 rows sent to SQL Server. Total sent: 5000
    1000 rows sent to SQL Server. Total sent: 6000
    1000 rows sent to SQL Server. Total sent: 7000
    1000 rows sent to SQL Server. Total sent: 8000
    1000 rows sent to SQL Server. Total sent: 9000
    1000 rows sent to SQL Server. Total sent: 10000
    1000 rows sent to SQL Server. Total sent: 11000
    1000 rows sent to SQL Server. Total sent: 12000
    1000 rows sent to SQL Server. Total sent: 13000
    1000 rows sent to SQL Server. Total sent: 14000
    1000 rows sent to SQL Server. Total sent: 15000
    1000 rows sent to SQL Server. Total sent: 16000
    1000 rows sent to SQL Server. Total sent: 17000
    1000 rows sent to SQL Server. Total sent: 18000
    1000 rows sent to SQL Server. Total sent: 19000
    1000 rows sent to SQL Server. Total sent: 20000
    1000 rows sent to SQL Server. Total sent: 21000
    1000 rows sent to SQL Server. Total sent: 22000
    1000 rows sent to SQL Server. Total sent: 23000
    1000 rows sent to SQL Server. Total sent: 24000
    1000 rows sent to SQL Server. Total sent: 25000
    1000 rows sent to SQL Server. Total sent: 26000
    1000 rows sent to SQL Server. Total sent: 27000
    1000 rows sent to SQL Server. Total sent: 28000
    1000 rows sent to SQL Server. Total sent: 29000
    bcp import completed!
    BULK INSERT
    Next, we have BULK INSERT given the same format file from bcp
    CREATE TABLE zips (
    Col1 nvarchar(50),
    Col2 nvarchar(50),
    Col3 nvarchar(50),
    Col4 nvarchar(50),
    Col5 nvarchar(50),
    Col6 nvarchar(50),
    Col7 nvarchar(50),
    Col8 nvarchar(50)
    )
    GO
    INSERT INTO zips
    SELECT *
    FROM OPENROWSET(BULK 'C:\Documents and Settings\tkrueger\My Documents\blog\cenzus\zipcodes\zips.txt',
    FORMATFILE='C:\Documents and Settings\tkrueger\My Documents\blog\zip_format_file.txt'
    ) as t1 ;
    GO
    That was simple enough given the work on the format file that we already did.  Bulk insert isn’t as fast as bcp but gives you some freedom from within TSQL and SSMS to add functionality to the import.
    SSIS
    Next is my favorite playground in SSIS
    We can do many methods in SSIS to get data from point A, to point B.  I’ll show you data flow task and the SSIS version of BULK INSERT
    First create a new integrated services project.
    Create a new flat file connection by right clicking the connection managers area.  This will be used in both methods
    Bulk insert
    You can use format file here as well which is beneficial to moving methods around.  This essentially is calling the same processes with format file usage.  Drag over a bulk insert task and double click it to go into the editor.
    Fill in the information starting with connection.  This will populate much as the wizard did.
    Example of format file usage
    Or specify your own details
    Execute this and again, we have some data
    Data Flow method
    Bring over a data flow task and double click it to go into the data flow tab.
    Bring over a flat file source and SQL Server destination. Edit the flat file source to use the flat file connection manager we already created. Connect the two once they are there.
    Double click the SQL Server Destination task to open the editor.  Enter in the connection manager information and select the table to import into.
    Go into the mappings and connect the dots per say
    Typical issue of type conversions is Unicode to non-unicode.
    We fix this with a Data conversion or explicit conversion in the editor.  Data conversion tasks are usually the route I take.  Drag over a data conversation task and place it between the connection from the flat file source to the SQL Server destination.
    New look in the mappings
    And after execution…
    SqlBulkCopy Method
    Sense we’re in the SSIS package we can use that awesome “script task” to show SlqBulkCopy.  Not only fast but also handy for those really “unique” file formats we receive so often
    Bring over a script task into the control flow
    Double click the task and go to the script page. Click Design Script to open up the code-behind.
    Ahsan Kabir

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1.     Users would like to upload data from flat file and subsequently view their reports.
    2.     SAP BW support team would not be involved in data upload process.
    3.     Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4.     Users consist of two groups, external and internal users. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What is the best practice we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We get the files from the web to the application server, in a path dedicated to this process. The file placed on the server has a naming convention based on your project; you can name it as you like. Every day a file with the same name is placed on the server with different data. The path in the InfoPackage is fixed to that location on the server. After this the process chain triggers and loads the data from that particular path fixed in the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    So this happens every day.
    Rgds
    SVU123
    Edited by: svu123 on Mar 25, 2011 5:46 AM

  • Method to dump all tables to flat files?  (spool is slow)

    Hey all,
    I have been using spool with a bunch of SET commands and it works great, but performance is a bit slow, especially for our partitioned tables (4.2 million records). Is there a better way to dump flat files?
    I want to use SQL*Loader to import (faster than IMP), but that is no good if creating the flat files is 10x slower than the EXP utility.
    Any ideas?
    Thanks,
    Chris G

    Hi Chris,
    Maybe spooling itself is not the key issue slowing down your dump. In my daily job, I need to dump a big bunch of data (even over 10M records in one table) to .txt and .csv files, and it is all done via spooling.
    If you find the query itself is slow, caused by partitioning or joins, you can try to create materialized tables to hold the query results and then dump the data from those tables. This is an approach I used when working on a big data warehouse project.
    Good luck,
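    If you stay with spooling, a few SQL*Plus settings usually help throughput. A sketch (table and column names are placeholders):
    set termout off feedback off heading off pagesize 0
    set arraysize 5000 linesize 32767 trimspool on
    spool big_table.dat
    select col1 || '|' || col2 || '|' || col3 from big_table;
    spool off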
