Export data to a flat file

Hi all,
I need to export a table's data to a flat file.
The problem is that the data is huge: about 200 million rows occupying around 60 GB of space.
If I use SQL*Loader in Toad, the export takes a huge amount of time.
After a few months, I need to import the same data back into the table.
So please help me find the most efficient and least time-consuming method to do this.
I am very new to this field.
Can someone help me with this?
My Oracle database version is 10.2.
Thanks in advance

OK, so first of all I would ask the following questions:
1. Why must you export the data and then re-import it a few months later?
2. Have you read through the documentation for SQL*Loader thoroughly? I know it reads like stereo instructions, but it has valuable information that will help you.
3. Does the table the data is being re-imported into have anything attached to it, e.g. triggers, indexes, or anything else the DB must do on each record? If so, re-read the SQL*Loader documentation; you can turn all of that off during the import and re-index, etc. at your leisure.
I ask these questions because:
1. I would find a way for whatever happens to this data to be accomplished while it stays in the DB.
2. Pumping data over the wire is going to be slow when you are talking about that kind of volume.
3. If you insist that the data must be dumped, massaged, and re-imported, do it on the DB server. Disk I/O is an order of magnitude faster than over-the-wire transfer.
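Since you are on 10.2, one server-side option is to unload the table into a Data Pump file via an external table and reload it later with a direct-path insert, so nothing crosses the wire. A minimal sketch, assuming a hypothetical table BIG_TABLE and a server folder /u01/exports (all names here are illustrative, not from this thread):

-- One-time setup: a directory object pointing at a folder on the DB server
CREATE DIRECTORY dump_dir AS '/u01/exports';

-- Unload: writes big_table.dmp on the server and leaves behind an external
-- table definition (big_table_unload) that can read the file back later
CREATE TABLE big_table_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY dump_dir
    LOCATION ('big_table.dmp')
  )
  AS SELECT * FROM big_table;

-- Months later: disable triggers and indexes on BIG_TABLE, reload with a
-- direct-path insert, then rebuild the indexes at your leisure
INSERT /*+ APPEND */ INTO big_table
  SELECT * FROM big_table_unload;
COMMIT;

Running expdp/impdp on the server would achieve much the same thing; the point is to keep the 60 GB off the network.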

Similar Messages

  • Export HRMS data to a flat file

    Hi All!
    Are there any ways of exporting employee-related data to a flat file without using a client app (PeopleCode or Integration Broker), that is, simply generating a CSV feed from the UI?

    You can schedule a query and specify the output format as text. Note that when you select View Log/Trace in Process Monitor, you will see a file with a .csv extension. However, it will open by default in Excel, and even if you select Save instead of Open it will try to change the extension to .xls; you will have to change it back to .csv.

  • Export table data to a flat file without using FL

    Hi,
    I am looking for options to export table data into a flat file without using FL (File Layout), i.e., by using an App Engine only.
    Please share your experience if you have done anything like this.
    Thanks

    A simple way to export any record (table/view) to a CSV file is to create a rowset and loop through all record fields, as in the example code below:
    Local Rowset &RS;
    Local Record &Rec;
    Local File &MYFILE;
    Local string &FileName, &strRecName, &Line, &Seperator, &Value;
    Local number &numRow, &numField;
    &FileName = "c:\temp\test.csv";
    &strRecName = "PSOPRDEFN";
    &Seperator = ";";
    &RS = CreateRowset(@("Record." | &strRecName));
    &RS.Fill();
    &MYFILE = GetFile(&FileName, "W", %FilePath_Absolute);
    If &MYFILE.IsOpen Then
       For &numRow = 1 To &RS.ActiveRowCount
          &Rec = &RS(&numRow).GetRecord(@("RECORD." | &strRecName));
          For &numField = 1 To &Rec.FieldCount
             &Value = String(&Rec.GetField(&numField).Value);
             If &numField = 1 Then
                &Line = &Value;
             Else
                &Line = &Line | &Seperator | &Value;
             End-If;
          End-For;
          &MYFILE.WriteLine(&Line);
       End-For;
    End-If;
    &MYFILE.Close();
    You can of course create an application class to call this piece of code generically.
    Hope it helps.
    Note:
    Do not come complaining to me about performance issues ;)

  • Extract PSA Data to a Flat File

    Hello,
    I would like to download PSA data into a flat file, so I can load it into SQL Server and run SQL statements on it for analysis purposes. I have tried creating a PSA export DataSource; however, I cannot find a way to cleanly get the data into a flat structure for analysis and/or download.
    Can anyone suggest options for doing this?
    Thanks in advance for your help. 
    Sincerely,
    Sonya

    Hi Sonya,
    In the PSA screen, try pressing Shift and F8. If this does not bring up the file types, then you can try the following: Settings > Change Display Variants > View tab > Microsoft Excel > click SAP_OM.xls and then the green check mark. The data will be displayed in Excel format, which you can save to your PC.
    Hope this helps...

  • Uploading the data from a flat file into ztable

    Hi,
    I have a requirement where I have to upload the data from 2 flat files into 2 Z tables (ZRB_HDR, ZRB_ITM). From the first flat file, only data for a few fields has to be uploaded into the Z table (ZRB_HDR). From the second flat file, data for all the fields has to be uploaded into the Z table (ZRB_ITM). How can I do this?
    Regards,
    Hema

    hi,
    Declare two internal tables with the structure of your tables.
    Your flat files should be .txt files.
    Now use the GUI_UPLOAD function module to upload your flat files into the internal tables.
    CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename            = 'c:\file1.txt'
          has_field_separator = 'X'
        TABLES
          data_tab            = itab1
        EXCEPTIONS
          OTHERS              = 1.
    Use this function twice, once for each table.
    Then loop over each internal table and use the INSERT command to insert the records into the corresponding Z table.

  • Problem in the BDC program to upload the data from a flat file.

    Hi,
    I am required to write a BDC program to upload the data from a flat file. The conditions are as mentioned below:-
    1) A selection screen will be presented to the user, and the user needs to provide: the file path on the presentation server (with F4 help for this obligatory parameter) and the file separator, e.g. @, #, $, %, etc. (fields in the file will be separated using this special character), or the fields may be tab-delimited.
    2) Finally, after the data is uploaded, the following messages need to be displayed:
    a) Total Number of records successfully uploaded.
    b) Session Name
    c) Number of Sessions created.
    The problem is that when each record is fetched from the flat file, it needs to be split into individual fields separated by the delimiter (or handled as tab-separated data, as the case may be), before proceeding in the usual manner.
    It would be great if you provide me either the logic, pseudocode, or sample code for this BDC program.
    Thanks,

    Here is an example program. If you require the delimiter to be a TAB, then enter TAB on the selection screen; if you require the delimiter to be a comma, slash, pipe, whatever, then simply enter that value. This example covers only the uploading of the file, not the BDC; I assume that you know what to do once you have the data in the internal table.
    REPORT zrich_0001.
    TYPES: BEGIN OF ttab,
            rec TYPE string,
           END OF ttab.
    TYPES: BEGIN OF tdat,
           fld1(10) TYPE c,
           fld2(10) TYPE c,
           fld3(10) TYPE c,
           fld4(10) TYPE c,
           END OF tdat.
    DATA: itab TYPE TABLE OF ttab.
    data: xtab like line of itab.
    DATA: idat TYPE TABLE OF tdat.
    data: xdat like line of idat.
    DATA: file_str TYPE string.
    DATA: delimitor TYPE string.
    PARAMETERS: p_file TYPE localfile.
    PARAMETERS: p_del(5) TYPE c.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      DATA: ifiletab TYPE filetable.
      DATA: xfiletab LIKE LINE OF ifiletab.
      DATA: rc TYPE i.
      CALL METHOD cl_gui_frontend_services=>file_open_dialog
        CHANGING
          file_table = ifiletab
          rc         = rc.
      READ TABLE ifiletab INTO xfiletab INDEX 1.
      IF sy-subrc = 0.
        p_file = xfiletab-filename.
      ENDIF.
    START-OF-SELECTION.
      TRANSLATE p_del TO UPPER CASE.
      CASE p_del.
        WHEN 'TAB'.
          delimitor = cl_abap_char_utilities=>horizontal_tab.
        WHEN others.
          delimitor = p_del.
      ENDCASE.
      file_str = p_file.
      CALL METHOD cl_gui_frontend_services=>gui_upload
        EXPORTING
          filename = file_str
        CHANGING
          data_tab = itab.
      LOOP AT itab into xtab.
        CLEAR xdat.
        SPLIT xtab-rec AT delimitor INTO xdat-fld1
                                         xdat-fld2
                                         xdat-fld3
                                         xdat-fld4.
        APPEND xdat to idat.
      ENDLOOP.
      LOOP AT idat into xdat.
        WRITE:/ xdat-fld1, xdat-fld2, xdat-fld3, xdat-fld4.
      ENDLOOP.
    Regards,
    Rich Heilman

  • Exporting R3 tables into Flat Files for BPC

    Dear BPC experts,
    I understand that currently the way for BPC to extract data from R3 is via flat files. I have the following queries:
    1) What exactly are the T-codes and steps required to export R3 tables into flat files (without going through Open Hub in BI7)? Can this process be automated?
    2) Is Data Manager of BPC equivalent to SSIS (Integration Services) of SQL Server?
    Please advise. Thanks!!
    SJ

    Hi Soong Jeng,
    I would take a look at the existing BI extractors for the answer to Q1; I am working on finishing up a how-to guide (HTG) regarding this. Look for it very soon.
    Here is the code to dump out data from a BI extractor directly from ERP.
    You need dev permissions in your ERP system and access to an app server folder. ...Good luck
    *& Report  Z_EXTRACTOR_TO_FILE                                         *
    report  z_extractor_to_file                     .
    type-pools:
      rsaot.
    parameters:
      p_osrce type roosource-oltpsource,
      p_filnm type rlgrap-filename,
      p_maxsz type rsiodynp4-maxsize default 100,
      p_maxfc type rsiodynp4-calls default 10,
      p_updmd type rsiodynp4-updmode default 'F'.
    data:
      l_lines_read type sy-tabix,
      ls_select type rsselect,
      lt_select type table of rsselect,
      ls_field type rsfieldsel,
      lt_field type table of rsfieldsel,
      l_quiet type rois-genflag value 'X',
      l_readonly type rsiodynp4-readonly value 'X',
      l_desc_type type char1 value 'M',
      lr_data type ref to data,
      lt_oltpsource type rsaot_s_osource,
      l_req_no type rsiodynp4-requnr,
      l_debugmode type rsiodynp4-debugmode,
      l_genmode type rois-genflag,
      l_columns type i,
      l_temp_char type char40,
      l_filename    like rlgrap-filename,
      wa_x030l      like x030l,
      tb_dfies      type standard table of dfies,
      wa_dfies      type dfies,
      begin of tb_flditab occurs 0,
    *   field description
        fldname(40)  type c,
      end of tb_flditab,
      ls_flditab like line of tb_flditab,
      l_file type string.
    field-symbols:
      <lt_data> type standard table,
      <ls_data> type any,
      <ls_field> type any.
    call function 'RSA1_SINGLE_OLTPSOURCE_GET'
      exporting
        i_oltpsource   = p_osrce
      importing
        e_s_oltpsource = lt_oltpsource
      exceptions
        no_authority   = 1
        not_exist      = 2
        inconsistent   = 3
        others         = 4.
    if sy-subrc <> 0.
    * ERROR
    endif.
    create data lr_data type standard table of (lt_oltpsource-exstruct).
    assign lr_data->* to <lt_data>.
    call function 'RSFH_GET_DATA_SIMPLE'
      exporting
        i_requnr                     = l_req_no
        i_osource                    = p_osrce
        i_maxsize                    = p_maxsz
        i_maxfetch                   = p_maxfc
        i_updmode                    = p_updmd
        i_debugmode                  = l_debugmode
        i_abapmemory                 = l_genmode
        i_quiet                      = l_quiet
        i_read_only                  = l_readonly
      importing
        e_lines_read                 = l_lines_read
      tables
        i_t_select                   = lt_select
        i_t_field                    = lt_field
        e_t_data                     = <lt_data>
      exceptions
        generation_error             = 1
        interface_table_error        = 2
        metadata_error               = 3
        error_passed_to_mess_handler = 4
        no_authority                 = 5
        others                       = 6.
    * get table/structure field info
    call function 'GET_FIELDTAB'
      exporting
        langu               = sy-langu
        only                = space
        tabname             = lt_oltpsource-exstruct
        withtext            = 'X'
      importing
        header              = wa_x030l
      tables
        fieldtab            = tb_dfies
      exceptions
        internal_error      = 01
        no_texts_found      = 02
        table_has_no_fields = 03
        table_not_activ     = 04.
    * check result
    case sy-subrc.
      when 0.
    *      copy fieldnames
        loop at tb_dfies into wa_dfies.
          case l_desc_type.
            when 'F'.
              tb_flditab-fldname = wa_dfies-fieldname.
            when 'S'.
              tb_flditab-fldname = wa_dfies-scrtext_s.
            when 'M'.
              tb_flditab-fldname = wa_dfies-scrtext_m.
            when 'L'.
              tb_flditab-fldname = wa_dfies-scrtext_l.
            when others.
    *         use fieldname
              tb_flditab-fldname = wa_dfies-fieldname.
          endcase.
          append tb_flditab.
    *        clear variables
          clear: wa_dfies.
        endloop.
      when others.
        message id sy-msgid type sy-msgty number sy-msgno
        with  sy-subrc raising error_get_dictionary_info.
    endcase.
    describe table tb_flditab lines l_columns.
    " MOVE DATA TO THE APPLICATION SERVER
    open dataset p_filnm for output in text mode encoding utf-8
      with windows linefeed.
    data i type i.
    loop at <lt_data> assigning <ls_data>.
      loop at tb_flditab into ls_flditab.
        i = sy-tabix.
        assign component i of structure <ls_data> to <ls_field>.
        l_temp_char = <ls_field>.
        if i eq 1.
          l_file = l_temp_char.
        else.
          concatenate l_file ',' l_temp_char  into l_file.
        endif.
      endloop.
      transfer l_file to p_filnm.
      clear l_file.
    endloop.
    close dataset p_filnm.
    Cheers,
    Scott

  • Procurement Card data upload from flat file to database

    Hi All,
    I need to upload Procurement Card data from a flat file to the database in the table BBP_PCMAS.
    I found a BAPI, BAPI_PCARD_CREATEMULTIPLE, which uploads the data perfectly; however, the structure PCMASTER that it takes as input does not contain the field for the blocking reason, PCBLOCK (Reason for blocking procurement card). I need to upload this field as well from the flat file.
    Any suggestions?
    Thanks

    Hi,
    You are correct, the function module BAPI_PCARD_CREATEMULTIPLE does not contain the PCBLOCK field.
    Alternatively, what you can do is read the PC data after it is created and modify it with the PCBLOCK set appropriately. The necessary function modules are given below.
    BBP_PCMAS_READ_PCMAS - Read Data
    BBP_PCMAS_MODIFY_PCMAS - Modify Data
    Note: BBP_PCMAS_MODIFY_PCMAS is an update-task FM. Hence it should be called as shown below (refer to FORM write_data of the FM BAPI_PCARD_CREATEMULTIPLE):
      call function 'BBP_PCMAS_MODIFY_PCMAS' in update task
           exporting
                i_pcmas     = i_pcmas
    *         I_PCMAS_OLD =
    *         I_DELETE    =
          tables
               t_pcacc     = i_pcacc
    *         T_PCACC_OLD =
          exceptions
               not_found   = 1
               others      = 2.
    Regards
    Kathirvel

  • BAPI to upload data from a flat file to VA01

    Hi guys,
    I have a requirement wherein I need to upload data from a flat file to VA01. Please tell me how to go about this.
    Thanks and regards,
    Frank.

    Hi,
    I previously posted this code as well.
    *&  Include           YCL_CREATE_SALES_DOCU                         *
    *&      Form  salesdocu
    *       This subroutine is used to create a sales order
    *      -->P_HEADER           Document header data
    *      -->P_HEADERX          Checkbox for header data
    *      -->P_ITEM             Item data
    *      -->P_ITEMX            Item data checkboxes
    *      -->P_LT_SCHEDULES_IN  Schedule line data
    *      -->P_LT_SCHEDULES_INX Checkbox schedule line data
    *      -->P_PARTNER  text    Document partner
    *      <--P_w_vbeln  text    Sales document number
    DATA:
      lfs_return like line of t_return.
    FORM create_sales_document changing P_HEADER  like fs_header
                                       P_HEADERX like fs_headerx
                                       Pt_ITEM   like t_item[]
                                       Pt_ITEMX  like t_itemx[]
                                       P_LT_SCHEDULES_IN  like t_schedules_in[]
                                       P_LT_SCHEDULES_INX like t_schedules_inx[]
                                       Pt_PARTNER  like t_partner[]
                                       P_w_vbeln  like w_vbeln.
    * This perform is used to fill the required data for sales order creation
      perform sales_fill_data changing p_header
                                       p_headerx
                                       pt_item
                                       pt_itemx
                                       p_lt_schedules_in
                                       p_lt_schedules_inx
                                       pt_partner.
    * Function module to create the sales and distribution document
      perform sales_order_creation using p_header
                                         p_headerx
                                         pt_item
                                         pt_itemx
                                         p_lt_schedules_in
                                         p_lt_schedules_inx
                                         pt_partner.
      perform return_check using p_w_vbeln .
    ENDFORM.                                 " salesdocu
    *&      Form  commit_work
    *       To execute an external commit
    FORM commit_work .
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          wait = c_x.
    ENDFORM.                                 " Commit_work
    * Include ycl_sales_order_header        " To fill header data and item data
    Include ycl_sales_order_header.
    *&      Form  return_check
    *       To validate the sales order creation
    FORM return_check using pr_vbeln type vbeln.
    if pr_vbeln is initial.
        LOOP AT t_return into lfs_return .
          WRITE / lfs_return-message.
          clear lfs_return.
        ENDLOOP.                             " Loop at return
      else.
        perform commit_work.                 " External Commit
        Refresh t_return.
        fs_disp-text = text-003.
        fs_disp-number = pr_vbeln.
        append fs_disp to it_disp.
      if p_del eq c_x or p_torder eq c_x or
        p_pgi eq c_x or p_bill eq c_x.
        perform delivery_creation.           " Delivery order creation
        endif.                               " If p_del eq 'X'......
      endif.                                 " If p_w_vbeln is initial
    ENDFORM.                                 " Return_check
    *&      Form  sales_order_creation
    *       text
    *      -->P_P_HEADER  text
    *      -->P_P_HEADERX  text
    *      -->P_PT_ITEM  text
    *      -->P_PT_ITEMX  text
    *      -->P_P_LT_SCHEDULES_IN  text
    *      -->P_P_LT_SCHEDULES_INX  text
    *      -->P_PT_PARTNER  text
    FORM sales_order_creation  USING    P_P_HEADER like fs_header
                                        P_P_HEADERX like fs_headerx
                                        P_PT_ITEM like t_item[]
                                        P_PT_ITEMX like t_itemx[]
                                        P_P_LT_SCHEDULES_IN like t_schedules_in[]
                                        P_P_LT_SCHEDULES_INX like t_schedules_inx[]
                                        P_PT_PARTNER like t_partner[].
        CALL FUNCTION 'BAPI_SALESDOCU_CREATEFROMDATA1'
        EXPORTING
          sales_header_in     = p_p_header
          sales_header_inx    = p_p_headerx
        IMPORTING
          salesdocument_ex    = w_vbeln
        TABLES
          return              = t_return
          sales_items_in      = p_pt_item
          sales_items_inx     = p_pt_itemx
          sales_schedules_in  = p_p_lt_schedules_in
          sales_schedules_inx = p_p_lt_schedules_inx
          sales_partners      = p_pt_partner.
    ENDFORM.                    " sales_order_creation
    Please reward points if this was useful.

  • Error while uploading data from a flat file to the hierarchy

    Hi guys,
    After I upload data from a flat file to the hierarchy, I get the error message "Please select a valid InfoObject". I am loading data using the PSA and have activated all external characteristics, but I still get the problem. Some help on this please.
    regards
    Sri

    There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
    Please check the objects in BW, their lengths and types, and check whether your flat file has the same types there.
    Now check the sequence of the objects in the transfer rules and activate them.
    There you go.

  • Delete data from a flat file using PL/SQL -- Please help.. urgent

    Hi All,
    We are writing data to a flat file using the Text_IO.Put_Line command. We want to delete data / records from that file after the data has been written to it.
    Please let us know if this is possible.
    Thanks in advance.
    Vaishali

    There's nothing in UTL_FILE to do this, so your options are either to write a Java stored procedure to do it or to use whatever mechanism the host operating system supports for editing files.
    Alternatively, you could write a PL/SQL procedure that reads each line in from the file and writes it out to a second file, discarding the lines you don't want along the way.
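    A minimal sketch of that copy-and-filter approach with UTL_FILE, assuming a directory object DATA_DIR already exists on the database server and that every line containing the marker text 'DELETE ME' should be dropped (the directory, file names, and marker are purely illustrative):

    CREATE OR REPLACE PROCEDURE strip_lines AS
      l_in   UTL_FILE.FILE_TYPE;
      l_out  UTL_FILE.FILE_TYPE;
      l_line VARCHAR2(32767);
    BEGIN
      l_in  := UTL_FILE.FOPEN('DATA_DIR', 'report.txt',     'R', 32767);
      l_out := UTL_FILE.FOPEN('DATA_DIR', 'report_new.txt', 'W', 32767);
      LOOP
        BEGIN
          UTL_FILE.GET_LINE(l_in, l_line);
        EXCEPTION
          WHEN NO_DATA_FOUND THEN
            EXIT;                               -- end of file reached
        END;
        IF INSTR(l_line, 'DELETE ME') = 0 THEN
          UTL_FILE.PUT_LINE(l_out, l_line);     -- keep this line
        END IF;
      END LOOP;
      UTL_FILE.FCLOSE(l_in);
      UTL_FILE.FCLOSE(l_out);
      -- optionally swap the filtered copy in for the original:
      -- UTL_FILE.FRENAME('DATA_DIR', 'report_new.txt', 'DATA_DIR', 'report.txt', TRUE);
    END strip_lines;
    /

    Note that UTL_FILE works on files visible to the database server (hence the directory object), whereas Text_IO in Forms writes on the client.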

  • What is the best way to load and convert data from a flat file?

    Hi,
    I want to load data from a flat file and convert dates, numbers, and some fields with custom logic (e.g. 0,1 into N,Y) to the correct format.
    The rows where all TO_NUMBER, TO_DATE, and custom conversions succeed should go into table STG_OK. If some conversion fails (due to an illegal format in the flat file), those rows (where the conversion raises an exception) should go into table STG_ERR.
    What is the best and easiest way to achieve this?
    Thanks,
    Carsten.

    Hi,
    thanks for your answers so far!
    I gave them some thought and came up with two different alternatives:
    Alternative 1
    I load the data from the flat file into a staging table using sqlldr. I convert the data to the target format using sqlldr expressions.
    The columns of the staging table have the target format (date, number).
    The rows that cannot be loaded go into a bad file. I manually load the data from the bad file (without any conversion) into the error table.
    Alternative 2
    The columns of the staging table are all of type varchar2 regardless of the target format.
    I define data rules for all columns that require a later conversion.
    I load the data from the flat file into the staging table using external table or sqlldr without any data conversion.
    The rows that cannot be loaded go automatically into the error table.
    When I read the data from the staging table, I can safely convert it since it is already checked by the rules.
    What I dislike about alternative 1 is that I manually have to create a second file and a second mapping (OK, I can automate this using OMB*Plus).
    Further, I would prefer using expressions in the mapping for converting the data.
    What I dislike about alternative 2 is that I have to create a data rule and a conversion expression and then keep the data rule and the conversion expression in sync (in case the file format changes).
    I also would prefer to have the data in the staging table in the target format. Well, I might load it into a second staging table with columns having the target format, but that's another mapping and a lot of I/O.
    As far as I know I need the Data Quality option for using data rules; is that true?
    Is there another alternative without any of these drawbacks?
    Otherwise I think I will go for alternative 1.
    Thanks,
    Carsten.
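    A third option, outside the mapping itself, is a small PL/SQL routine over an all-varchar2 staging table that attempts the conversions row by row and routes failures to the error table. A rough sketch, assuming hypothetical tables STG_RAW(amount_txt, posted_txt, flag_txt), STG_OK(amount, posted, flag) and STG_ERR with the raw columns plus an err_msg column; it is slower than a set-based load, but the error handling is explicit:

    BEGIN
      FOR r IN (SELECT amount_txt, posted_txt, flag_txt FROM stg_raw) LOOP
        BEGIN
          INSERT INTO stg_ok (amount, posted, flag)
          VALUES (TO_NUMBER(r.amount_txt),
                  TO_DATE(r.posted_txt, 'YYYY-MM-DD'),
                  CASE r.flag_txt WHEN '1' THEN 'Y' ELSE 'N' END);
        EXCEPTION
          WHEN OTHERS THEN
            -- a conversion failed for this row: keep the raw values and the reason
            INSERT INTO stg_err (amount_txt, posted_txt, flag_txt, err_msg)
            VALUES (r.amount_txt, r.posted_txt, r.flag_txt, SQLERRM);
        END;
      END LOOP;
      COMMIT;
    END;
    /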

  • Sqlplus – spool data to a flat file

    Hi,
    Does any Oracle expert here know why the SQL*Plus commands below cannot spool all the data into a flat file in one go?
    I have tried the commands below, and it seems like every time I get a different file size :(
    a) sqlplus -s $dbUser/$dbPass@$dbName <<EOF|gzip -c > ${TEMP_FILE_PATH}/${extract_file_prefix}.dat.Z
    b) sqlplus -s $dbUser/$dbPass@$dbName <<EOF>> spool.log
    set feedback off
    set trims on
    set trim on
    set feedback off
    set linesize 4000
    set pagesize 0
    whenever sqlerror exit 173;
    spool ${extract_file_prefix}.dat

    For me, this is working. What exactly are you getting and what exactly are you expecting?
    (t352104@svlipari[GEN]:/lem) $ cat test.ksh
    #!/bin/ksh
    TEMP_FILE_PATH=`pwd`
    extract_file_prefix=emp
    dbUser=t352104
    dbPass=t352104
    dbName=gen_dev
    dataFile=${TEMP_FILE_PATH}/${extract_file_prefix}.dat
    sqlplus -s $dbUser/$dbPass@$dbName <<EOF > $dataFile
    set trims on
    set trim on
    set tab off
    set linesize 7000
    SET HEAD off AUTOTRACE OFF FEEDBACK off VERIFY off ECHO off SERVEROUTPUT off term off;
    whenever sqlerror exit 173;
    SELECT *
    FROM emp ;
    exit
    EOF
    (t352104@svlipari[GEN]:/lem) $ ./test.ksh
    (t352104@svlipari[GEN]:/lem) $ echo $?
    0
    (t352104@svlipari[GEN]:/lem) $ cat emp.dat
          7369 SMITH      CLERK           7902 17-DEC-80        800                    20
          7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
          7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
          7566 JONES      MANAGER         7839 02-APR-81       2975                    20
          7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
          7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
          7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
          7788 SCOTT      ANALYST         7566 09-DEC-82       3000                    20
          7839 KING       PRESIDENT            17-NOV-81       5000                    10
          7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
          7876 ADAMS      CLERK           7788 12-JAN-83       1100                    20
          7900 JAMES      CLERK           7698 03-DEC-81        950                    30
          7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
          7934 MILLER     CLERK           7782 23-JAN-82       1300                    10
    (t352104@svlipari[GEN]:/lem) $

  • How to load data from a flat file that is on the application server

    Hi All,
    How do I load data from a flat file that is on the application server?

    Hi,
    Firstly, you will need to place the file(s) in the AL11 path. Then, in your InfoPackage, on the "Extraction" tab, you need to select the "Application Server" option. Then you need to specify the path as well as the exact file you want to load by using the browse button.
    If your file name keeps changing on a daily basis, i.e. name_ddmmyyyy.csv, then in the Extraction tab you have the option to write an ABAP routine that generates the file name. There you will need to append sy-datum to "name" and then append ".csv" to generate the complete filename.
    Please let me know if this is helpful or if you need any more inputs.
    Thanks & Regards,
    Nishant Tatkar.

  • Loading CLOB data to a flat file in owb 11.2.0.3

    Hi,
    Also, I have one more question. I want to load a table with CLOB data into a flat file using my OWB client 11.2.0.3, but when I run this simple mapping I am getting this error:
    ORA-06502: PL/SQL: numeric or value error: character string buffer too small.
    Please suggest what needs to be done.

    I was up all night working on this as I had a deadline...
    It turned out my mapping was corrupted.
    If you do a series of synchronizations by object name, position, or ID, the mapping can end up corrupted.
    I had to delete my staging table operator within the mapping, drag and drop it back in, and reconnect everything, and now my loads work just fine.
    Thanks
