Upload data from flat file to OVKK using BDC

Hi All,
I want to upload data from a flat file to transaction OVKK using BDC.
OVKK is a maintenance view with a table control.
So please send me code for uploading data to OVKK through BDC.
Thanks & Regards,
Siva.B

Hi,
Welcome to SDN!
Have a look at this example for table control:
http://www.sap-img.com/abap/bdc-example-using-table-control-in-bdc.htm
Today this is the second post on the same issue and the same transaction.
First record the transaction through SHDB and check the generated code.
Thanks.
If this helps you, reward with points.
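For orientation, here is a minimal, hedged sketch of the table-control pattern that link describes. The screen SAPML080Z 0100 and field V_T683V-KALSM come from an SHDB recording of OVKK that appears further down this page; treat the okcodes and the sample value as assumptions to verify against your own recording:

* Minimal sketch of filling one table-control row in a BDC session.
* Verify screen, field and okcode names in your own SHDB recording.
DATA: lv_idx(2)   TYPE n,      " two-digit row index: '01', '02', ...
      lv_fnam(40) TYPE c.
lv_idx = 1.
PERFORM bdc_dynpro USING 'SAPML080Z' '0100'.
PERFORM bdc_field  USING 'BDC_OKCODE' '=NEWL'.        " new entry row
CONCATENATE 'V_T683V-KALSM(' lv_idx ')' INTO lv_fnam.
PERFORM bdc_field  USING lv_fnam 'ZVAA01'.            " illustrative value
PERFORM bdc_field  USING 'BDC_OKCODE' '=SAVE'.

The bdc_dynpro and bdc_field subroutines are the usual two helpers that append to a BDCDATA table; they appear in full further down this page.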

Similar Messages

  • How to upload data from flat file to datastore object in BI 7.0

    Dear friends,
    Please tell me the step-by-step process for uploading data from a flat file to a DataStore object in BI 7.0.
    <removed by moderator>
    please help me
    Thanks,
    D.prabhu
    Edited by: Siegfried Szameitat on Aug 17, 2011 11:40 AM

    Create a transformation on the DataSource, keep the DSO as the target, and load.
    Ravi Thothadri

  • Error when uploading data from Flat file

    I am uploading data from a flat file.
    When I am uploading, it gives an error:
    An error occurred in record 1 during execution of conversion exit CONVERSION_EXIT_CUNIT_INPUT for field UNIT.
    Can anyone help?
    Thanks in advance.
    Regards,
    Pradeep.

    Hi Pradeep,
    Refer this link:
    CONVERSION_EXIT_CUNIT_INPUT error in flat file load
    Re: Error in Flat files upload
    Also try this - in the transfer structure, check the checkbox for UNIT and retry the load. Hopefully it should be fine.
    Bye
    Dinesh
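    As an aside, the failing conversion exit can be called directly in ABAP to pre-validate unit values before the load. A small, hedged sketch (variable names are illustrative):

    * Sketch: convert an external unit (e.g. 'KG') to its internal SAP
    * format up front, so CONVERSION_EXIT_CUNIT_INPUT cannot fail at load time.
    DATA: lv_unit_ext(3) TYPE c VALUE 'KG',
          lv_unit_int    LIKE t006-msehi.       " internal unit of measure
    CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
      EXPORTING
        input          = lv_unit_ext
        language       = sy-langu
      IMPORTING
        output         = lv_unit_int
      EXCEPTIONS
        unit_not_found = 1
        OTHERS         = 2.
    IF sy-subrc <> 0.
      WRITE: / 'Invalid unit in file:', lv_unit_ext.  " flag record for correction
    ENDIF.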

  • Upload data from flat file

    Is there a way to upload data from a flat file, check for existing records, and perform updates and inserts respectively on a table using MERGE as available in Oracle 9i?
    I don't want to use the SQL*Loader utility for this purpose as I will have existing data which might raise exceptions for primary keys, etc.
    Is it possible to do this using the UTL_FILE utility together with the MERGE statement?
    Any inputs will be highly appreciated.
    Thanks

    This is what I am doing as of today: this process reads the flat file and inserts/updates data in two different tables.
    create or replace procedure EPS.GET_DATA(filelocation varchar2, filename varchar2)
    is
      my_file utl_file.file_type;
      -- variables used to parse the input data line ...
      line                      char(45);
      l_EPCI_CID_ID             char(25);
      l_cid                     number := 0;
      l_EPCI_BILLING_ACCOUNT_ID char(15);
      l_account                 number := 0;
      l_EPCI_BILLING_COMPANY_CD char(2);
      l_EPCI_SOURCE_SYSTEM_CD   char(3);
      -- timing variables ...
      l_start_time date;
      l_end_time   date;
      l_difference number;
      l_run_id     varchar2(40) := '';
      counter      number := 0;
      mrow         number := 0;
      oldCustomerCid number := 0;
      rowsFound    number := 0;
      rowsFoundForCustomer number := 0;
      rowsFound2   number := 0;
      rowsFound3   number := 0;
      mUpdate      number := 0;
      cUpdate      number := 0;
      cInsert      number := 0;
      cSkipped     number := 0;
      mrowupd      number := 0;
      rowsSkipped  number := 0;
    begin
      -- Open the input file ...
      begin
        dbms_output.put_line('opening file :'||filename||' from location :'||filelocation);
        my_file := utl_file.fopen(filelocation, filename, 'r');
      exception
        when utl_file.invalid_filehandle then
          dbms_output.put_line('invalid_filehandle');
          raise_application_error(-20100, 'file error');
        when no_data_found then
          dbms_output.put_line('no data found exception');
          raise_application_error(-20100, 'file error');
        when others then
          null; -- note: any other error while opening is silently ignored here
      end;
      -- Read data and insert it into the table ...
      dbms_output.put_line('program started: now = '||to_char(sysdate, 'HH:MI:SS'));
      l_start_time := sysdate;
      begin
        dbms_output.put_line('looping cursor to search existing groups');
        loop
          utl_file.get_line(my_file, line);
          counter := counter + 1; -- count the lines parsed
          l_EPCI_CID_ID             := substr(line, 1, 25);
          l_cid                     := to_number(l_EPCI_CID_ID);
          l_EPCI_BILLING_ACCOUNT_ID := substr(line, 26, 15);
          l_account                 := to_number(l_EPCI_BILLING_ACCOUNT_ID);
          l_EPCI_BILLING_COMPANY_CD := substr(line, 41, 2);
          l_EPCI_SOURCE_SYSTEM_CD   := substr(line, 43, 3);
          -- Populate MAD.MAD_CUSTOMER_PROFILE with the EPS CID; if the CID
          -- exists, update MCP_LAST_INVOICE_LOAD_DT with the system date
          if oldCustomerCid <> l_cid then
            select count(*) into rowsFoundForCustomer
              from MAD.MAD_CUSTOMER_PROFILE
             where MCP_CID_ID = l_cid;
            if rowsFoundForCustomer = 0 then
              insert into MAD.MAD_CUSTOMER_PROFILE
              values (l_cid, sysdate, sysdate, 'EBILL');
              cInsert := cInsert + 1;
            else
              update MAD.MAD_CUSTOMER_PROFILE
                 set MCP_LAST_INVOICE_LOAD_DT = sysdate,
                     MCP_UPDATE_DATE = sysdate,
                     MCP_USER_ID = 'EBILL'
               where MCP_CID_ID = l_cid;
              cUpdate := cUpdate + 1;
            end if;
          else
            cSkipped := cSkipped + 1;
          end if;
          oldCustomerCid := l_cid;
          -- Populate EPS.EPS_CBS_INVOICE_CUSTOMER with the flat file data.
          select count(*) into rowsFound
            from EPS.EPS_CBS_INVOICE_CUSTOMER
           where EPCI_CID_ID = l_cid
             and to_number(EPCI_BILLING_ACCOUNT_ID) = l_account
             and rtrim(EPCI_BILLING_COMPANY_CD) = rtrim(l_EPCI_BILLING_COMPANY_CD)
             and rtrim(EPCI_SOURCE_SYSTEM_CD) = rtrim(l_EPCI_SOURCE_SYSTEM_CD)
             and rtrim(EPCI_ACTIVE_IND) = 'Y';
          if rowsFound = 0 then
            declare
              cursor c1 is
                select *
                  from EPS.EPS_CBS_INVOICE_CUSTOMER
                 where to_number(EPCI_BILLING_ACCOUNT_ID) = l_account
                   and rtrim(EPCI_BILLING_COMPANY_CD) = rtrim(l_EPCI_BILLING_COMPANY_CD)
                   and rtrim(EPCI_SOURCE_SYSTEM_CD) = rtrim(l_EPCI_SOURCE_SYSTEM_CD)
                   and EPCI_ACTIVE_IND = 'Y';
              myRec EPS.EPS_CBS_INVOICE_CUSTOMER%rowtype;
            begin
              if not c1%isopen then
                open c1;
              end if;
              loop
                fetch c1 into myRec;
                exit when c1%notfound;
                -- deactivate the rows that are currently active for this account
                update EPS.EPS_CBS_INVOICE_CUSTOMER
                   set EPCI_ACTIVE_IND = 'N',
                       EPCI_UPDATE_DT = sysdate,
                       EPCI_UPDATE_USER_ID = 'EBILL'
                 where --EPCI_CID_ID = l_cid and
                       to_number(EPCI_BILLING_ACCOUNT_ID) = l_account
                   and rtrim(EPCI_BILLING_COMPANY_CD) = rtrim(l_EPCI_BILLING_COMPANY_CD)
                   and rtrim(EPCI_SOURCE_SYSTEM_CD) = rtrim(l_EPCI_SOURCE_SYSTEM_CD);
                mrowupd := mrowupd + 1;
              end loop;
              close c1;
              insert into EPS.EPS_CBS_INVOICE_CUSTOMER
                (EPCI_CID_ID, EPCI_BILLING_ACCOUNT_ID, EPCI_BILLING_COMPANY_CD,
                 EPCI_SOURCE_SYSTEM_CD, EPCI_ACTIVE_IND, EPCI_UPDATE_DT,
                 EPCI_UPDATE_USER_ID)
              values
                (l_cid, l_EPCI_BILLING_ACCOUNT_ID, l_EPCI_BILLING_COMPANY_CD,
                 l_EPCI_SOURCE_SYSTEM_CD, 'Y', sysdate, 'EBILL');
              mrow := mrow + 1;
            end;
          else
            rowsSkipped := rowsSkipped + 1;
          end if;
        end loop;
      exception
        when no_data_found then -- end of file reached
          utl_file.fclose(my_file);
          dbms_output.put_line('Number of lines parsed = '||counter);
        when value_error then
          dbms_output.put_line('value error');
          raise_application_error(-20100, 'file error');
        when utl_file.invalid_path then
          dbms_output.put_line('invalid path');
          raise_application_error(-20100, 'file error');
        when utl_file.invalid_mode then
          dbms_output.put_line('invalid_mode');
          raise_application_error(-20100, 'file error');
        when utl_file.invalid_filehandle then
          dbms_output.put_line('invalid_filehandle');
          raise_application_error(-20100, 'file error');
        when utl_file.invalid_operation then
          dbms_output.put_line('invalid_operation');
          raise_application_error(-20100, 'file error');
        when utl_file.read_error then
          dbms_output.put_line('read_error');
          raise_application_error(-20100, 'file error');
        when utl_file.write_error then
          dbms_output.put_line('write_error');
          raise_application_error(-20100, 'file error');
        when utl_file.internal_error then
          dbms_output.put_line('internal_error');
          raise_application_error(-20100, 'file error');
        when dup_val_on_index then
          dbms_output.put_line('DUPLICATE RECORD FOUND WHILE INSERTING RECORD');
        when others then
          dbms_output.put_line('un-handled');
          raise_application_error(-20100, 'file error');
      end;
      -- Now commit and close the file ...
      commit;
      dbms_output.put_line('program finished: now = '||to_char(sysdate, 'HH:MI:SS'));
      dbms_output.put_line('Rows Inserted into EPS table :'||mrow);
      dbms_output.put_line('Rows Updated in EPS table :'||mrowupd);
      dbms_output.put_line('Rows Skipped :'||rowsSkipped);
      dbms_output.put_line('Rows Inserted into Customer table :'||cInsert);
      dbms_output.put_line('Rows Updated in Customer table :'||cUpdate);
      dbms_output.put_line('Duplicate CID''s Skipped :'||cSkipped);
      l_end_time := sysdate;
      l_difference := l_end_time - l_start_time;
      utl_file.fclose(my_file);
    end;

  • Scheduling Problem for uploading Data from Flat file to SAP

    Hi guys,
    I am facing a weird problem in uploading some leave records into a Z table. The code works fine if we run it through SE38 after selecting the file from a shared location on the production server, which has all the access rights.
    This folder lies in the \usr folder of SAP Production.
    I have kept all the flat files in the shared path
    "\\Tis-mum-iz-s1\migration\SAP-INT\leave\" ...
    To give you the exact directory structure:
    Tis-mum-iz-s1 is the server name
    usr is the SAP system folder used for uploads and downloads
    usr
      -> Migration
           -> SAP-INT
                -> leave -> (flat files)
    The Migration folder is shared with all rights.
    Obviously, we cannot give the shared drive as the variant in the scheduler.
    So I use the system path, i.e. \usr\sap\tmp\migration\sap-int\leave\, as the variant.
    All my other download programs work fine with this path as a variant,
    but this particular upload program does not work with this path.
    Here is my code:
    *----------------------------------------------------------------------*
    * TATA INTERACTIVE SYSTEMS (A Division of TATA INDUSTRIES LIMITED)
    * REPORT      : ZMIGRATE_ZLEAVE
    * DESCRIPTION : To upload the leave data (ZLEAVE)
    * CREATED BY  : Abhishek Bachhawat
    * CREATED ON  : 01.09.2005
    * CONSULTANT  : ANAND
    *----------------------------------------------------------------------*
    REPORT  ZMIGRATE_ZLEAVE.
    TABLES: ZLEAVE.
    data: begin of wtab,
              MANDT(3),
              ZLVID(8),
              PERNR(8),
              ZSTDT(8),
              ZENDT(8),
              ZDAYS(4),
              AEDAT(8),
              ERDAT(8),
          end of wtab,
          itab like WTAB occurs 0 WITH HEADER LINE.
    data: temp like zleave occurs 0 WITH HEADER LINE.
    SELECTION-SCREEN BEGIN OF BLOCK file
                   WITH FRAME TITLE text-005.
    parameters: file like rlgrap-filename Obligatory.
    Concatenate File SY-DATUM '_Leave.txt' into File.
    SELECTION-SCREEN END OF BLOCK file.
    at SELECTION-SCREEN ON VALUE-REQUEST FOR file .
      CALL FUNCTION 'WS_FILENAME_GET'
        IMPORTING
          FILENAME = file.
      IF SY-SUBRC <> 0.
      ENDIF.
    start-of-selection.
      if file ne space.
        CALL FUNCTION 'WS_UPLOAD'
          EXPORTING
            FILENAME = FILE
            FILETYPE = 'DAT'
          TABLES
            DATA_TAB = ITAB.
      else.
        message e000(zps) with 'Specify a file'.
      endif.
      SORT ITAB BY ZLVID.
      LOOP AT ITAB.
        REFRESH TEMP.
        CLEAR TEMP.
        TEMP-MANDT = sy-mandt.
        TEMP-ERDAT = SY-DATUM.
        TEMP-ZLVID = ITAB-ZLVID.
        TEMP-PERNR = ITAB-PERNR.
        TEMP-ZSTDT = ITAB-ZSTDT.
        TEMP-ZENDT = ITAB-ZENDT.
        TEMP-ZDAYS = ITAB-ZDAYS.
        TEMP-AEDAT = ITAB-AEDAT.
        TEMP-ERDAT = ITAB-ERDAT.
        APPEND TEMP.
        SELECT SINGLE *
               FROM   ZLEAVE
               WHERE  ZLVID = TEMP-ZLVID
               AND    PERNR = TEMP-PERNR.
        IF SY-SUBRC = 0.
          UPDATE ZLEAVE SET ZSTDT = TEMP-ZSTDT
                            ZENDT = TEMP-ZENDT
                            ZDAYS = TEMP-ZDAYS
                            AEDAT = TEMP-AEDAT
                            ERDAT = TEMP-ERDAT
                 WHERE ZLVID = TEMP-ZLVID
                 AND   PERNR = TEMP-PERNR.
        ELSE.
          INSERT ZLEAVE FROM TABLE TEMP.
          COMMIT WORK.
        ENDIF.
      ENDLOOP.

    Hi,
    open dataset file for input in text mode.
    check sy-subrc = 0.
    while sy-subrc = 0.
      read dataset file into wa.
      if sy-subrc = 0.
        append wa to itab.
      else.
        exit.
      endif.
    endwhile.
    close dataset file.
    regards
    Siggi
    PS: check also the F1-help for open, read and close statements!

  • Problem while uploading data from flat file

    hi friends,
    Suppose there are 100 records in a flat file. If 20 records are uploaded without any problem and an error occurs while uploading the remaining data, is it necessary to upload the entire file again, or should only the remaining data be uploaded?
    I have used CALL TRANSACTION for a purchase order application.
    Please reply soon, it's urgent.
    thanks & regards
    priya

    Hi Hari,
    you have to upload the remaining data.
    As you have used the Call Transaction method, do one thing: trap the error records at runtime and prepare another flat file. Next time, correct the data in the new flat file and run the BDC program again with this new flat file.
    Reward if useful
    Regards
    ANUPAM
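    A rough sketch of that approach, under assumptions: it presumes the usual BDCDATA/BDCMSGCOLL setup of a CALL TRANSACTION program, and t_record, fill_bdcdata and the file path are illustrative names, not code from this thread:

    * Sketch: keep the records whose CALL TRANSACTION returned an error
    * message and download them as a new flat file for correction.
    DATA: it_msg LIKE bdcmsgcoll OCCURS 0 WITH HEADER LINE,
          it_err LIKE t_record   OCCURS 0 WITH HEADER LINE. " t_record = your file record
    LOOP AT t_record.
      REFRESH: it_bdcdata, it_msg.
      PERFORM fill_bdcdata.                     " build BDC data for this record
      CALL TRANSACTION 'ME21' USING it_bdcdata  " purchase order, as in the post
                              MODE 'N'
                              UPDATE 'S'
                              MESSAGES INTO it_msg.
      READ TABLE it_msg WITH KEY msgtyp = 'E'.  " also check type 'A' aborts
      IF sy-subrc = 0.
        APPEND t_record TO it_err.              " failed record goes to rerun file
      ENDIF.
    ENDLOOP.
    CALL FUNCTION 'GUI_DOWNLOAD'
      EXPORTING
        filename = 'C:\upload_errors.txt'       " illustrative path
        filetype = 'ASC'
      TABLES
        data_tab = it_err.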

  • Upload data from flat file into internal table

    Hi friends,
    I want to upload the data from a flat file into an internal table, but the problem is that all the columns in that flat file are separated by the "|" character instead of tabs.
    Please help me out.

    Hello,
    Do it like this.
      CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
      FILENAME = LV_FILENAME
      FILETYPE = 'ASC'
      HAS_FIELD_SEPARATOR = 'X'  " Check here
    * HEADER_LENGTH = '1'
    * READ_BY_LINE = 'X'
    * DAT_MODE = ' '
    * CODEPAGE = ' '
    * IGNORE_CERR = ABAP_TRUE
    * REPLACEMENT = '#'
    * CHECK_BOM = ' '
    * IMPORTING
    * FILELENGTH =
    * HEADER =
      TABLES
      DATA_TAB = IT_COJRNL
      EXCEPTIONS
      FILE_OPEN_ERROR = 1
      FILE_READ_ERROR = 2
      NO_BATCH = 3
      GUI_REFUSE_FILETRANSFER = 4
      INVALID_TYPE = 5
      NO_AUTHORITY = 6
      UNKNOWN_ERROR = 7
      BAD_DATA_FORMAT = 8
      HEADER_NOT_ALLOWED = 9
      SEPARATOR_NOT_ALLOWED = 10
      HEADER_TOO_LONG = 11
      UNKNOWN_DP_ERROR = 12
      ACCESS_DENIED = 13
      DP_OUT_OF_MEMORY = 14
      DISK_FULL = 15
      DP_TIMEOUT = 16
      OTHERS = 17.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
        WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
    VAsanth
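    One caveat on the example above: HAS_FIELD_SEPARATOR makes GUI_UPLOAD split at tab characters. For a '|'-separated file, a common alternative (sketched here with purely illustrative structure and field names) is to upload raw lines and split them yourself:

    * Sketch: read the file as plain text lines, then SPLIT each line at '|'.
    DATA: BEGIN OF it_line OCCURS 0,
            raw(200) TYPE c,
          END OF it_line,
          BEGIN OF it_rec OCCURS 0,
            field1(10) TYPE c,
            field2(10) TYPE c,
            field3(10) TYPE c,
          END OF it_rec.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\data.txt'   " illustrative path
        filetype = 'ASC'           " no HAS_FIELD_SEPARATOR: keep whole lines
      TABLES
        data_tab = it_line.
    LOOP AT it_line.
      SPLIT it_line-raw AT '|' INTO it_rec-field1 it_rec-field2 it_rec-field3.
      APPEND it_rec.
    ENDLOOP.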

  • Uploading data from flat file to table control

    HI All,
    I want to upload data to transaction OVKK using BDC. For this I wrote a Z program, shown below:
    REPORT ZSD_BDC_OVKK_UPLOAD
           NO STANDARD PAGE HEADING LINE-SIZE 255.
    *INCLUDE bdcrecx1.
    DATA : BEGIN OF T_DUMMY OCCURS 0,
           VAR(100) TYPE C,
           END OF T_DUMMY.
    DATA:  BEGIN OF ITAB OCCURS 0,
           VKORG(4) TYPE C,
           VTWEG(2) TYPE C,
           SPART(2) TYPE C,
           KALVG(1) TYPE C,
           KALKS(1) TYPE C,
           KALSM(6) TYPE C,
           KARTV(4) TYPE C,
           END OF ITAB.
    DATA : IT_BDCDATA LIKE BDCDATA OCCURS 0 WITH HEADER LINE,
           IT_BDCMSGCOLL LIKE BDCMSGCOLL OCCURS 0 WITH HEADER LINE.
    DATA : v_filename TYPE string.
    PARAMETER : filename LIKE rlgrap-filename OBLIGATORY.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR filename.
      CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
        EXPORTING
          field_name = filename
        CHANGING
          file_name  = filename.
    START-OF-SELECTION.
      v_filename = filename.
      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename                      = v_filename
          filetype                      = 'ASC'
         has_field_separator           = 'X'
        TABLES
          data_tab                      = T_DUMMY.
    LOOP AT T_DUMMY.
        ITAB-VKORG = T_DUMMY-VAR+0(4).
        ITAB-VTWEG = T_DUMMY-VAR+5(2).
        ITAB-SPART = T_DUMMY-VAR+8(2).
        ITAB-KALVG = T_DUMMY-VAR+11(1).
        ITAB-KALKS = T_DUMMY-VAR+13(1).
        ITAB-KALSM = T_DUMMY-VAR+15(6).
        ITAB-KARTV = T_DUMMY-VAR+22(4).
        APPEND ITAB.
    ENDLOOP.
    DATA: FNAM(20) TYPE C,
          IDX      TYPE C.
        MOVE 1 TO IDX.
    LOOP AT ITAB.
    REFRESH IT_BDCDATA.
        PERFORM bdc_dynpro      USING 'SAPML080Z' '0100'.
        PERFORM bdc_field       USING 'BDC_CURSOR'
                                      'V_T683V-KALSM(01)'.
        PERFORM bdc_field       USING 'BDC_OKCODE'
                                      '=NEWL'.
        PERFORM bdc_dynpro      USING 'SAPML080Z' '0100'.
        PERFORM bdc_field       USING 'BDC_CURSOR'
                                      'V_T683V-VKORG(01)'.
        PERFORM bdc_field       USING 'BDC_OKCODE'
                                      '=SAVE'.
        CONCATENATE 'V_T683V-KALSM(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-kalsm.
        CONCATENATE 'V_T683V-KARTV(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-kartv.
        CONCATENATE 'V_T683V-VKORG(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-vkorg.
        CONCATENATE 'V_T683V-VTWEG(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-vtweg.
        CONCATENATE 'V_T683V-SPART(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-spart.
        CONCATENATE 'V_T683V-KALVG(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-kalvg.
        CONCATENATE 'V_T683V-KALKS(' IDX ')' INTO FNAM.
         PERFORM bdc_field       USING FNAM
                                       itab-kalks.
    PERFORM bdc_dynpro      USING 'SAPLSTRD' '0300'.
        PERFORM bdc_field       USING 'BDC_CURSOR'
                                      'KO008-TRKORR'.
        PERFORM bdc_field       USING 'BDC_OKCODE'
                                      '=LOCK'.
       PERFORM bdc_field       USING 'KO008-TRKORR'
                                       'D47K919377'.
       PERFORM bdc_field       USING 'BDC_CURSOR'
                                      'V_T683V-VKORG(01)'.
       PERFORM bdc_field       USING 'BDC_OKCODE'
                                      '/00'.
       PERFORM bdc_field       USING 'BDC_CURSOR'
                                      'V_T683V-VKORG(01)'.
        PERFORM bdc_field       USING 'BDC_OKCODE'
                                       '=SAVE'.
    CALL TRANSACTION 'OVKK' USING IT_BDCDATA
                            MODE  'A'
                           UPDATE 'S'
                         MESSAGES INTO IT_BDCMSGCOLL.
    IDX = IDX + 1  .
    endloop.
    FORM BDC_DYNPRO USING PROG SCR.
      CLEAR IT_BDCDATA.
      IT_BDCDATA-PROGRAM = PROG.
      IT_BDCDATA-DYNPRO  = SCR.
      IT_BDCDATA-DYNBEGIN = 'X'.
      APPEND IT_BDCDATA.
    ENDFORM.
    FORM BDC_FIELD USING FNAM FVAL.
      CLEAR IT_BDCDATA.
      IT_BDCDATA-FNAM = FNAM.
      IT_BDCDATA-FVAL  = FVAL.
      APPEND IT_BDCDATA.
    ENDFORM.
    I checked in debugging mode and found that the data is passed to the internal table, but it is not uploaded to OVKK; no data is displayed there.
    Please tell me the solution for this.
    regards,
    ravindra.

    Hi,
    Look at the following link; you will find a good example of table control:
    http://www.sap-img.com/abap/bdc-example-using-table-control-in-bdc.htm
    Here you need to take care of which line of the table control you are populating with data.
    Make sure to handle page-down in the table control.
    Try CALL TRANSACTION in MODE 'N'.
    Check your OK_CODE once again.
    Try to give BACK also after saving.
    The table control is mainly used to update a number of line items in one shot; there is no need to come BACK for each and every record.
    Try the recording in SHDB once again and watch the changes closely.
    Thanks.
    If this helps you, reward with points.
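    One concrete thing to check in the program above: IDX is declared as TYPE C with length 1, so CONCATENATE builds 'V_T683V-KALSM(1)' while the screen field is 'V_T683V-KALSM(01)'. A hedged sketch of the usual correction:

    * Sketch: use a two-digit numeric index so the generated field name
    * matches the table-control field on the screen.
    DATA: idx(2)   TYPE n,   " gives '01', '02', ... instead of '1'
          fnam(20) TYPE c.
    idx = 1.
    LOOP AT itab.
      CONCATENATE 'V_T683V-KALSM(' idx ')' INTO fnam.
      PERFORM bdc_field USING fnam itab-kalsm.
      " ... repeat for the other columns, then SAVE ...
      idx = idx + 1.         " or always fill row 01 after =NEWL per record
    ENDLOOP.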

  • How to upload data from  flat to ztables with in the same client by idocs

    Hi Experts,
    I have a requirement in IDocs: I need to create a custom IDoc. I am working on IDES 4.6C. The requirement is that there are Z tables with header and item data, say for example Authors and Books. I need to upload data from a flat file, which is available on the presentation server of the same client, and update the Z tables by using IDocs. For this I need to do the ALE settings also. The client is 800; there is no other client available, so I need to do all of the above within the same client.
    How should I approach this (step by step) to accomplish it?
    Thanks in Advance.
    Regards
    J.S.Varma

    Hi,
    This is the procedure:
    1. Create segments using WE31 (don't forget to release them).
    2. Create the IDoc type from the segments using WE30 (don't forget to release it).
    3. Create the message type using WE81.
    4. Create the function module to process the data using SE37.
    5. Maintain the process code using WE42.
    6. Create partner profiles in WE20.
    In the function module itself, write the code for reading the data from the presentation server with GUI_UPLOAD.
    Then update the database tables directly by insert through another internal table, in the same client itself.
    Thanks
    Manju
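    As a rough sketch of the core of such a function module (ZAUTHORS and the file path are invented names for illustration; the real logic would sit inside the inbound processing FM):

    * Sketch: read a tab-delimited flat file from the presentation server
    * and insert the rows directly into a Z table.
    DATA: it_authors LIKE zauthors OCCURS 0 WITH HEADER LINE.  " zauthors is hypothetical
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename            = 'C:\authors.txt'  " illustrative path
        filetype            = 'ASC'
        has_field_separator = 'X'               " tab-delimited file assumed
      TABLES
        data_tab            = it_authors.
    INSERT zauthors FROM TABLE it_authors.      " direct update of the Z table
    IF sy-subrc = 0.
      COMMIT WORK.
    ENDIF.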

  • Loading from flat file to dso using process chains

    Hi,
    I am using BI 7.0.
    I am new to process chains.
    Can anyone explain how to load data from a flat file to a DSO using process chains (I have already created all the objects), preferably explained with an example?

    You can find a lot of info if you search SDN.
    Metachain
    Steps for Metachain :
    1. Start (in this variant, set your schedule times for this metachain)
    2. Local Process Chain 1 (say it is a master data process chain - go into the start variant of this chain (a subchain, like any other chain) and check the second radio button "Start Using Meta Chain or API")
    3. Local Process Chain 2 (say it is a transaction data process chain; do the same as in step 2)
    Steps for Process Chains in BI 7.0 for a Cube:
    1. Start
    2. Execute Infopackage
    3. Delete Indexes for Cube
    4. Execute DTP
    5. Create Indexes for Cube
    For a DSO:
    1. Start
    2. Execute Infopackage
    3. Execute DTP
    4. Activate DSO
    For an IO:
    1. Start
    2. Execute Infopackage
    3. Execute DTP
    4. Attribute Change Run
    Data to Cube through a DSO:
    1. Start
    2. Execute Infopackage (loads up to PSA)
    3. Execute DTP (to load the DSO from PSA)
    4. Activate DSO
    5. Delete Indexes for Cube
    6. Execute DTP (to load the Cube from the DSO)
    7. Create Indexes for Cube
    3.X
    Master data loading (Attr, Text, Hierarchies)
    Steps:
    1. Start
    2. Execute Infopackage (say if you are loading 2 InfoObjects, just have them all in parallel)
    3. You might want to load in sequence - Attributes - Texts - Hierarchies
    4. AND process (connecting all Infopackages)
    5. Attribute Change Run (add all relevant InfoObjects)
    Start
      Infopackage1A (Attr)  |  Infopackage2A (Attr)
      Infopackage1B (Txts)  |  Infopackage2B (Txts)
      Infopackage1C (Txts)  |
    AND process (connect Infopackage1C & Infopackage2B)
    Attribute Change Run (add InfoObject 1 & InfoObject 2 to this variant)
    For a Cube:
    1. Start
    2. Delete Indexes for Cube
    3. Execute Infopackage
    4. Create Indexes for Cube
    For a DSO:
    1. Start
    2. Execute Infopackage
    3. Activate DSO
    For an IO:
    1. Start
    2. Execute Infopackage
    3. Attribute Change Run
    Data to Cube through a DSO:
    1. Start
    2. Execute Infopackage
    3. Activate DSO
    4. Delete Indexes for Cube
    5. Execute Infopackage
    6. Create Indexes for Cube
    Some Links
    http://help.sap.com/saphelp_nw2004s/helpdata/en/8f/c08b3baaa59649e10000000a11402f/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/8da0cd90-0201-0010-2d9a-abab69f10045
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/19683495-0501-0010-4381-b31db6ece1e9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/36693695-0501-0010-698a-a015c6aac9e1
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/9936e790-0201-0010-f185-89d0377639db
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3507aa90-0201-0010-6891-d7df8c4722f7
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/263de690-0201-0010-bc9f-b65b3e7ba11c

  • Data loading from flat file to cube using bw3.5

    Hi Experts,
    Kindly give me the detailed steps, with screens, for data loading from a flat file to a cube using BW 3.5.
    Please.

    Hi ,
    Procedure
    You are in the Data Warehousing Workbench in the DataSource tree.
    1. Select the application components in which you want to create the DataSource and choose Create DataSource.
    2. On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
    The DataSource maintenance screen appears.
    3. Go to the General tab page.
        a. Enter descriptions for the DataSource (short, medium, long).
        b. As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
        c. Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed it is not generated in a typed structure but is generated with character-like fields of type CHAR only.
    Use this option if conversion during loading causes problems, for example, because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
    In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
    4. Go to the Extraction tab page.
        a. Define the delta process for the DataSource.
        b. Specify whether you want the DataSource to support direct access to data.
        c. Real-time data acquisition is not supported for data transfer from files.
        d. Select the adapter for the data transfer. You can load text files or binary files from your local workstation or from the application server.
    Text-type files only contain characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which specifies this character as a component of the value if required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length. The defined field length in the file must be the same as the assigned field in BI.
    Binary files contain data in the form of bytes. A file of this type can contain any type of byte value, including bytes that cannot be displayed or read as text. In this case, the field values in the file have to be the same as the internal format of the assigned field in BI.
    Choose Properties if you want to display the general adapter properties.
        e. Select the path to the file that you want to load or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
    You can also create a routine that determines the name of your file. If you do not create a routine to determine the name of the file, the system reads the file name directly from the File Name field.
        f. Depending on the adapter and the file to be loaded, make further settings.
    ■ For binary files:
    Specify the character record settings for the data that you want to transfer.
    ■ For text-type files:
    Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
    Specify the character record settings for the data that you want to transfer.
    For ASCII files:
    If you are loading data from an ASCII file, the data is requested with a fixed data record length.
    For CSV files:
    If you are loading data from an Excel CSV file, specify the data separator and the escape character.
    Specify the separator that your file uses to divide the fields in the Data Separator field.
    If the data separator character is a part of the value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
    Say you chose the ; character as the data separator, but your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters.
    If the escape characters do not enclose the value but are used within the value, the system interprets the escape characters as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45 and 12"45" is transferred as 12"45".
    In a text editor (for example, Notepad) check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
    Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
    If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two-character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry.
        g. Make the settings for the number format (thousands separator and character used to represent a decimal point), as required.
        h. Make the settings for currency conversion, as required.
        i. Make any further settings that are dependent on your selection, as required.
    5. Go to the Proposal tab page.
    This tab page is only relevant for CSV files. For files in different formats, define the field list on the Fields tab page.
    Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
        a. Specify the number of data records that you want to load and choose Upload Sample Data.
    The data is displayed in the upper area of the tab page in the format of your file.
    The system displays the proposal for the field list in the lower area of the tab page.
        b. In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
    6. Go to the Fields tab page.
    Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
        a. To define a field, choose Insert Row and specify a field name.
        b. Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
        c. Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
    Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
        d. Change the data type of the field if required.
        e. Specify the key fields of the DataSource.
    These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
        f. Specify whether lowercase is supported.
        g. Specify whether the source provides the data in the internal or external format.
        h. If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
        i. If required, specify a conversion routine that converts data from an external format into an internal format.
        j. Select the fields that you want to be able to set selection criteria for when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
        k. Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
        l. Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
    7. Check, save and activate the DataSource.
    8. Go to the Preview tab page.
    If you select Read Preview Data, the number of data records you specified in your field selection is displayed in a preview.
    This function allows you to check whether the data formats and data are correct.
    For More Info:  http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm

  • Reading data from flat file Using TEXT_IO

    Dear Gurus
    I already posted this question, but this time I need some other changes. Sorry for that.
    I am using Forms 10g and TEXT_IO for reading data from a flat file.
    My data is like this:
    0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
    9|2430962.89|000111111111|
    1|61304.88|000014104113|
    1|41961.73|000022096086|
    1|38475.65|000023640081|
    1|49749.34|000032133154|
    1|35572.46|000033093377|
    1|246671.01|000042148111|
    Here each column is separated by |. I want to read all the columns and do some validation.
    How can I do that?
    Initially my requirement was to read only 2 or 3 columns, so I did it like this:
    Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
    IS
    v_handle utl_file.file_type;
    v_filebuffer varchar2(500);
    line_0_date VARCHAR2 (10);
    line_0_Purp VARCHAR2 (10);
    line_0_count Number;
    line_0_sum number(12,2);
    line_0_ccy Varchar2(3);
    line_9_sum Number(12,2);
    line_9_Acc_no Varchar2(12);
    Line_1_Sum Number(12,2);
    Line_1_tot Number(15,2) := 0;
    Line_1_flag Number := 0;
    lval number;
    lacno varchar2(16);
    v_file varchar2(20);
    v_path varchar2(50);
    Begin
    v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- for the file name
    v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- for the path
    -- v_path := SUBSTR(lfile_name, 0, INSTR(lfile_name, '\', -1));
    v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
    LOOP
    UTL_FILE.get_line (v_handle, v_filebuffer);
    IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
    SELECT line_0 INTO line_0_date
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    SELECT line_0 INTO line_0_Purp
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 4;
    SELECT line_0 INTO line_0_count
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 7;
    SELECT line_0 INTO line_0_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 8;
    SELECT line_0 INTO line_0_ccy
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 9;
    ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
    SELECT line_9 INTO line_9_Acc_no
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    SELECT line_9 INTO line_9_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 2;
    ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
    line_1_flag := line_1_flag+1;
    SELECT line_1 INTO line_1_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    Line_1_tot := Line_1_tot + line_1_sum;
    END IF;
    END LOOP;
    DBMS_OUTPUT.put_line (Line_1_tot);
    DBMS_OUTPUT.PUT_LINE (Line_1_flag);
    UTL_FILE.fclose (v_handle);
    END;
    But now what should I do? Shall I use a SELECT statement like this for all the columns?

    Sorry for that.
    As per our requirement:
    I need to read the flat file, and it looks like this:
    0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106
    9|2430962.89|000111111111|
    1|61304.88|000014104113|
    1|41961.73|000022096086|
    1|38475.65|000023640081|
    1|49749.34|000032133154|
    1|35572.46|000033093377|
    1|246671.01|000042148111|
    1|120737.25|000053101979|
    1|151898.79|000082139768|
    1|84182.34|000082485593|
    I have to check the file as follows:
    Validations: the 1st line should start with 0, otherwise raise an error and insert that error into a table.
    The 2nd line likewise should start with 9, otherwise raise an error and insert that error into a table.
    The 3rd line onward should start with 1, otherwise raise an error and insert that error into a table.
    After that I have to validate the 1st line's 2nd column: it should be BP-V1, otherwise raise an error and insert it into a table. Then I will check the 3rd column, which is 20100928; it should be in YYYYMMDD format, otherwise the same thing: ERROR.
    Like this I have a different validation for every column.
    Then it will check the 2nd line's 3rd column. This is an account number. First I will check that it is 12 characters, otherwise ERROR. Then I will check what the user entered in the form. For example, if the user entered 111111111, I will compare it with the 000111111111 in the 2nd line; I have to add 000 before the account number the user entered.
    Then, for the lines starting with 1, I have to take the 2nd column of all those lines and compute a sum. I have to compare that sum with the value in the 6th column of the 1st line (the one starting with 0). They should be the same, otherwise ERROR.
    The same way, I have to count all the lines starting with 1 and compare the count with the 7th column of the 1st line. They should be the same; in this file it should be 9.
    MY CODE IS:
    Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
    IS
    v_handle TEXT_IO.file_type;
    v_filebuffer varchar2(500);
    line_0_date VARCHAR2 (10);
    line_0_Purp VARCHAR2 (10);
    line_0_count Number;
    line_0_sum number(12,2);
    line_0_ccy Varchar2(3);
    line_9_sum Number(12,2);
    line_9_Acc_no Varchar2(12);
    Line_1_Sum Number(12,2);
    Line_1_tot Number(15,2) := 0;
    Line_1_flag Number := 0;
    lval number;
    lacno varchar2(16);
    v_file varchar2(20);
    v_path varchar2(50);
    LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
    LC$Token VARCHAR2(100) ;
    i PLS_INTEGER := 2 ;
    lfirst_char number;
    lvalue Varchar2(100) ;
    Begin
    v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- for the file name
    v_path := rtrim(regexp_substr(lfile_name, '.*\\'), '\'); -- for the path
    -- v_path := SUBSTR(lfile_name, 0, INSTR(lfile_name, '\', -1));
    Message(lfile_name);
    v_handle := TEXT_IO.fopen(lfile_name, 'r');
              BEGIN
                        LOOP
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        --Message('First Char '||lfirst_char);
                                  IF lfirst_char = '0' Then
                                  Loop
                                  LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                                  Message('VAL - '||LC$Token);
                                  lvalue := LC$Token;
                                  EXIT WHEN LC$Token IS NULL ;
    i := i + 1 ;
    End Loop;
                                  Else
                                       Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
                                       Forms_DDL('Commit');
                                       raise form_Trigger_failure;
                                  End if ;
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                        --Message('Row '||LC$Token);
                             IF lfirst_char = '9' Then
                                  Null;
                             Else
                                  Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
                                  Forms_DDL('Commit');
                                  raise form_Trigger_failure;
                             End IF;
                        LOOP
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                        --Message('Row '||LC$Token);
                                  IF lfirst_char = '1' Then
                                  Null;
                                  Else
                                       Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
                                       Forms_DDL('Commit');
                                       raise form_Trigger_failure;
                                  End if;
                        END LOOP;
                        --END IF;
                        END LOOP;
              EXCEPTION
                   When No_Data_Found Then
              TEXT_IO.fclose (v_handle);
              END;
    Exception
         When Others Then
         Message('Other error');
    END;
    I am calling the SPLIT function you gave as mcb_simulator_pkg.Split.

  • Wrong century for the dates from flat file

    I am in the process of uploading data from a flat file (a CSV) into Oracle via an external table.
    All the dates have a year of 20xx. My understanding was that 51-99 would be prefixed with 19 (such as 1951-1999) and 00-50
    would be prefixed with 20 (such as 2004 ...).
    The column below (startdate) is of type timestamp.
    Is my understanding wrong?
    SQL> select startdate , to_date(startdate , 'dd/mm/yyyy' ) newdate from tab;
    NEW_DATE STARTDATE
    09/18/2090 18-SEP-90 12.00.00.000000000 AM
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    PL/SQL Release 11.1.0.7.0 - Production
    CORE 11.1.0.7.0 Production
    TNS for 64-bit Windows: Version 11.1.0.7.0 - Production
    NLSRTL Version 11.1.0.7.0 - Production
    SQL> show parameter nls
    NAME TYPE VALUE
    nls_calendar string
    nls_comp string BINARY
    nls_currency string
    nls_date_format string
    nls_date_language string
    nls_dual_currency string
    nls_iso_currency string
    nls_language string AMERICAN
    nls_length_semantics string BYTE
    nls_nchar_conv_excp string FALSE
    nls_numeric_characters string
    nls_sort string
    nls_territory string AMERICA
    nls_time_format string
    nls_time_tz_format string
    nls_timestamp_format string
    nls_timestamp_tz_format string
    SQL>
    SQL>

    Offense? None taken.
    I literally typed the SQL (as I did not copy & paste the SQL / result sets). I did use to_char in my original SQL. I don't want to expose the real table; that's why I was giving the made-up test case. If you look carefully at the SQL and the equivalent result sets, you will notice it.
    Here is the SQL used (of course, I have changed the table name):
    select to_char(startdate , 'mm/dd/yyyy') sdate, startdate from t
    SDATE STARTDATE
    09/18/2090 18-SEP-90 12.00.00.000000 AM
    The flat file had the value 091890 as the data.

  • UPLOAD DATA FROM EXCEL FILE

    hi guru,
    I want to upload data from an Excel file for MM02. First of all, help on how to upload data from Excel.
    I have used the FM ALSM_EXCEL_TO_INTERNAL_TABLE, but it is not working; it uploads garbage values. So tell me how to use it.
    Help me on this matter.

    Check the example below.
    parameters :  p_file  LIKE rlgrap-filename OBLIGATORY.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      PERFORM get_file_name.
      PERFORM get_file_to_excel.
    *&      Form  get_file_name
    FORM get_file_name.
      DATA: lv_name LIKE sy-repid.
      lv_name = sy-repid.
      CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
           EXPORTING
                program_name  = lv_name
                dynpro_number = syst-dynnr
                static        = 'X'
           CHANGING
                file_name     = p_file.
    ENDFORM. " get_file_name
    *&      Form  get_file_to_excel
    FORM get_file_to_excel.
      DATA: idata LIKE alsmex_tabline OCCURS 0 WITH HEADER LINE.
      CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
           EXPORTING
                filename                = p_file
                i_begin_col             = '1'
                i_begin_row             = '2'  "Do not require headings
                i_end_col               = '2'
                i_end_row               = '60000'
           TABLES
                intern                  = idata
           EXCEPTIONS
                inconsistent_parameters = 1
                upload_ole              = 2
                OTHERS                  = 3.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        STOP.
      ENDIF.
    * Get first row retrieved
      READ TABLE idata INDEX 1.
    * Set first row retrieved to current row
      DATA: gd_currentrow TYPE i.
    * Target record for the extracted values (only BELNR is mapped below)
      DATA: BEGIN OF f OCCURS 0,
              belnr(10) TYPE c,
            END OF f.
      gd_currentrow = idata-row.
      LOOP AT idata.
    *   Reset values for next row
        IF idata-row NE gd_currentrow.
          APPEND f.  CLEAR f.
          gd_currentrow = idata-row.
        ENDIF.
        CASE idata-col.
          WHEN '0001'.
            f-belnr   = idata-value.
        ENDCASE.
      ENDLOOP.
      APPEND f.  CLEAR f.
    ENDFORM.                    " get_file_to_excel

  • Upload data from excel file to mii without UDS and PCo

    Hi Experts,
    I am trying to upload data from an Excel file to the MII DB without using UDS and PCo. Are there any other ways we can achieve this?
    I am thinking of one solution, writing a stored procedure. Any other solutions?
    Thanks in advance,
    Eswar.

    Hi,
    Thanks for the reply.
    I tried to create an OLEDB data server which points to my Excel file on the D drive. It was created successfully, and I tried to query the data source using a SQL query, but I am not getting any modes in the SQL query template.
    Can you provide some steps on how to access this file?
    Thanks, Eswar
