Reading data from Lookup [qualified flat] (multi-valued) table

I have a main table 'ABC' which has a Lookup [qualified flat] (multi-valued) field 'XYZ' along with some other text fields. I am able to read data from table ABC, including the lookup field data, but I do not see the relation between the main-table records and the lookup records.
For example: for a contract in table 'ABC' there are multiple line items in the lookup table. I have a contract with 2 line items in 'ABC', but 4 records in the lookup table (2 records per line item, each carrying the item number). Now, when I read the data from table 'ABC' for that contract, I get all 4 lines in the result table, but the item number is blank even though there is an item number field in table 'XYZ'.
Below is the sample code:
  wa_query-parameter_code  = <Contract ID field code>.
  wa_query-operator        = 'EQ'.
  wa_query-dimension_type  = 1. "mdmif_search_dim_field
  wa_query-constraint_type = 8. "mdmif_search_constr_text
  wa_string                = <Contract Number>.
  GET REFERENCE OF wa_string INTO wa_query-value_low.
  APPEND wa_query TO lt_query.
  CLEAR wa_query.

  CALL METHOD lcl_api->mo_core_service->query
    EXPORTING
      iv_object_type_code = 'ABC'
      it_query            = lt_query
    IMPORTING
      et_result_set       = lt_result_set_key.

  LOOP AT lt_result_set_key INTO ls_result_set_key.
    lt_keys = ls_result_set_key-record_ids.
  ENDLOOP.
* Result set definition for the lookup field 'XYZ' (contract price for lower bound)
  ls_result_set_definition-field_name = 'XYZ'.
  APPEND ls_result_set_definition TO lt_result_set_definition.

  CALL METHOD lcl_api->mo_core_service->retrieve
    EXPORTING
      iv_object_type_code      = 'ABC'
      it_result_set_definition = lt_result_set_definition
      it_keys                  = lt_keys
    IMPORTING
      et_result_set            = lt_result_set.
In table lt_result_set I see 4 records, but I cannot tell which 2 records belong to item 1 of the contract and which 2 belong to item 2.
Can anyone help me in resolving this? Any kind of help is appreciated.
Thanks,
Shekar

I do not quite understand your problem statement,
but if you have items in the main table to which QLUT (qualified lookup table) entries are linked,
then when you read the main table record you get LinkIds at the record level, and those give you the relation to the qualified lookup values.
I mean, for your case, you would get only the 2 QLUT values that are linked to the "selected" record in the main table.
I am not sure the behaviour you describe is correct, i.e. that you really get "unlinked" values from the QLUT.
So you should see 2 values in lt_result_set and not 4; check whether you are using the LinkIds properly, roughly along the lines of the sketch below.
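Something like this, as a minimal sketch only: it reuses the keys from your own query and the retrieve_simple pattern shown later in this thread. The DDIC structure names and the component name for the lookup field are assumptions, not your real repository definitions.

* Read the main records into a DDIC structure whose component for the
* qualified lookup field 'XYZ' carries the linked lookup record ids.
DATA: lt_main  TYPE STANDARD TABLE OF zabc_ddic_str, "structure mirroring main table ABC (assumed name)
      ls_main  LIKE LINE OF lt_main,
      lt_xyz   TYPE STANDARD TABLE OF zxyz_ddic_str, "structure mirroring lookup table XYZ (assumed name)
      lt_links TYPE mdm_keys.                        "record id table (type assumed)

CALL METHOD lcl_api->mo_core_service->retrieve_simple
  EXPORTING
    iv_object_type_code = 'ABC'
    it_keys             = lt_keys        "keys returned by the query above
  IMPORTING
    et_ddic_structure   = lt_main.

LOOP AT lt_main INTO ls_main.
* Only the ids linked to THIS main record sit in its lookup field, so the
* retrieve below should return just the line items that belong to it.
  CLEAR lt_links.
  APPEND ls_main-xyz TO lt_links.        "component name 'xyz' is an assumption;
                                         "a multi-valued field may come back as a table of ids
  CALL METHOD lcl_api->mo_core_service->retrieve_simple
    EXPORTING
      iv_object_type_code = 'XYZ'
      it_keys             = lt_links
    IMPORTING
      et_ddic_structure   = lt_xyz.
* lt_xyz now holds only the linked line items, including the item number.
ENDLOOP.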
hth
thanks
-Adrivit

Similar Messages

  • Search by qualified parameter in a LookUP Qualified Flat Multi Valued Table

I am working on an MDM development and I am having some problems. The development consists of searching MDM for a certain key, but this value is stored in a lookup qualified flat multi-valued table. I have implemented code that gives the correct results, but it takes a lot of time because it goes through the whole main table. The ideal would be to search directly for this value, but as it is in a lookup qualified multi-valued table I don't know how to implement this kind of search. Do you have an idea how to implement it?
    Main Table : ING_ROLE_PLAYER
Lookup qualified flat multi-valued table: ILYTICS_SEARCH_KEY; this table has two qualifiers, KEY_TYPE and SEARCH_KEY.
I receive a string that represents a SEARCH_KEY as a parameter to look up in MDM and retrieve some data, i.e. I want to use a qualifier as the search parameter.
If anybody could help me it would be very helpful.
    Best Regards
    Michele González

    Hi Michele,
    Try the following code :
    String searchKey = "/the value of the input searchkey/";
    // create a catalog object
    CatalogData catalog = new CatalogData();
    // create a search object on the main table
    Search search = new Search("ING_ROLE_PLAYER");
    // create a result set definition on the main table
    ResultSetDefinition rsd = new ResultSetDefinition("ING_ROLE_PLAYER");
    A2iResultSet result = null;
    // add a free-form limiting parameter for the main table
    FreeFormTableParameter fftpRole = search.GetParameters().NewFreeFormTableParameter("ING_ROLE_PLAYER");
    // add the qualified lookup table 'ILYTICS_SEARCH_KEY' to the search object
    FreeFormTableParameter fftpKeys = search.GetParameters().NewFreeFormTableParameter("ILYTICS_SEARCH_KEY");
    // get the field 'SEARCH_KEY' of the 'ILYTICS_SEARCH_KEY' table for free-form search
    FreeFormParameterField fftpKey = fftpKeys.GetFields().New("SEARCH_KEY", FreeFormParameterField.SEARCH_OPERATOR_AND);
    // add a free-form search parameter for 'SEARCH_KEY'
    fftpKey.GetFreeForm().NewString(searchKey, FreeFormParameter.EqualToSearchType);
    // obtain the result set
    result = catalog.GetResultSet(search, rsd, "/any field name in main table and resultset definition/", true, 0);
    Hope it helps.
    Pls reward, if helpful.
    Regards,
    Karambir Singh.

  • Insert Value(s) in Lookup Flat Multi-valued Table with Java API

    I've been looking in the MDM Java API Library Reference Guide, the MDM SP4 API JavaDoc, and the SDN forums for information on how to insert/update multiple values in a main-table field of a given repository that points to a lookup flat multi-valued table, using the Java API, with no success.
    I also haven't been successful adding these values the same way I would add a single value to a single-valued lookup table using the MDM Java API (a2iFields.Add(new A2iField(FIELD_CODE, FIELD_VALUE)) for each value I want to add), for example:
    a2iFields.Add(new A2iField("Country","USA"));
    a2iFields.Add(new A2iField("Country","Mexico"));
    a2iFields.Add(new A2iField("Country","Germany"));
    Can anybody point me to the correct documentation I need to read to fulfill this task? Or, even better, if someone can post a piece of code, I'll be even more thankful.
    Thanks for your help.

    Hi,
    A little code example where you add existing lookup values based on their record ids:
    int USA = 1;
    int GERMANY = 2;
    A2iValueArray countryArray = new A2iValueArray();
    countryArray.Add(new Value(USA));
    countryArray.Add(new Value(GERMANY));
    A2iFields record = new A2iFields();
    record.Add(new A2iField("Country", new Value(countryArray)));
    Please reward points if helpful.
    Regards,
    Robert

  • How to read data from a CLUSTER STRUCTURE not cluster table.

    Hi,
    How do I read data from a CLUSTER STRUCTURE, not a cluster table?
    regards,
    Usha.

    Hello,
    A structure doesn't contain data, so you cannot read from it. You need to find out the table behind that structure and read the data from there, for example:
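    A tiny illustration of the principle only (it uses the standard client table T000, not your cluster structure): the structure is just a typed work area, while the data comes from a database table.

    * T000 is both a DDIC structure type and a transparent table.
    DATA ls_client TYPE t000.
    * The structure ls_client holds no data of its own; the data is read
    * from the database table T000 into it.
    SELECT SINGLE * FROM t000 INTO ls_client WHERE mandt = sy-mandt.
    IF sy-subrc = 0.
      WRITE: / ls_client-mandt, ls_client-mtext.
    ENDIF.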
    Regards,
    Mansi.

  • How to read data from spreadsheet as a look up table

    Hello, can anybody please help me out with this?
    I want to read data from a spreadsheet file and use it as a lookup table.
    I want to create a program that lets the user enter an element to be found in the spreadsheet data, and which returns the number in the column adjoining that element in the spreadsheet file.
    Example i have the following data in spreadsheet file:
    Range  Count
    2              10
    4              49
    6              60
    Etc.
    If I enter 2 to search the data in the spreadsheet file, I expect the program to reply with the answer 10, and so on...
    Can anyone please help out...
    Thanks in advance...

    apok wrote:
    Why autoindex the output?  You should only have 1 output.  Besides, the Search 1D Array is simpler (no loop needed).
    Attachments:
    Lookup Table.png (11 KB)

  • Read data from xml files and  populate internal table

    Hi,
    How do I read data from XML files into internal tables?
    Can you tell me the classes and methods to read XML data?
    Can you explain it with a sample program?

    <pre>DATA itab_accontextdir TYPE TABLE OF ACCONTEXTDIR.
    DATA struct_accontextdir LIKE LINE OF itab_accontextdir.
    DATA l_o_error TYPE REF TO cx_root.
    DATA: filename type string ,
                 xmldata type xstring .
    DATA: mr      TYPE REF TO if_mr_api.
    mr = cl_mime_repository_api=>get_api( ).
    mr->get( EXPORTING  i_url     = 'SAP/PUBLIC/BC/xml_files_accontext/xml_accontextdir.xml'
                  IMPORTING  e_content = xmldata ).
    WRITE xmldata.
    TRY.
    CALL TRANSFORMATION id
          SOURCE XML xmldata
          RESULT shiva = itab_accontextdir.
      CATCH cx_root INTO l_o_error.
    ENDTRY.
    LOOP AT itab_accontextdir INTO struct_accontextdir.
      WRITE: / struct_accontextdir-context_id,
               struct_accontextdir-context_name,
               struct_accontextdir-context_type.
      NEW-LINE.
    ENDLOOP.</pre>
    Description:
    In the above code snippet the data is stored in an XML file (XML is used to store and transport data) called 'xml_accontextdir.xml' that has been uploaded into the MIME repository at the path 'SAP/PUBLIC/BC/xml_files_accontext/xml_accontextdir.xml'.
    The API below is used to read a file in the MIME repository and convert it into a string stored in 'xmldata' (this is just the raw data obtained by appending each line of the XML file):
    mr = cl_mime_repository_api=>get_api( ).
    mr->get( EXPORTING  i_url     = 'SAP/PUBLIC/BC/xml_files_accontext/xml_accontextdir.xml'
             IMPORTING  e_content = xmldata ).
    Once the 'xmldata' string is available, we use the transformation to parse the XML string returned by the above API and convert it into the internal table.
    <pre>TRY.
    CALL TRANSFORMATION id
          SOURCE XML xmldata
          RESULT shiva = itab_accontextdir.
      CATCH cx_root INTO l_o_error.
    ENDTRY.</pre>
    Here the transformation 'id' is used to convert the source XML 'xmldata' into the resulting internal table itab_accontextdir, which has the same structure as our XML file 'xml_accontextdir.xml'. In RESULT, the root of the XML file has to be specified (in my case the root is 'shiva').
    Things to take care of:
    One of the major problems when reading an XML file is a 'format not compatible with the internal table' error on the internal table you are reading into. In order to get rid of this issue, use one more transformation to convert the data from the internal table into an XML file.
    <pre>TRY.
          CALL TRANSFORMATION id
            SOURCE shiv = t_internal_tab
            RESULT XML xml.
        CATCH cx_root INTO l_o_error.
      ENDTRY.
      WRITE xml.
      NEW-LINE.</pre>
    This is the same transformation that we used above, but the difference is that the SOURCE and RESULT parameters are swapped: the source is now the internal table and the result is the XML string. Use the XML browser available in the ABAP Workbench to read the XML string displayed with proper indentation. In this way you get the XML format that is compatible with the given internal table.
    Thank you, hope this will help you!

  • Table unable to read data from CSV file

    Dear All,
    I have created a table which has to read data from an external CSV file. The table gives an error if the file is not at the specified location, but when I put the file at that location there is no error, yet no rows are returned.
    Please suggest what I should do.
    Thanks in advance.

    No version.
    No operating system information.
    No DDL.
    No help is possible.
    I want to drive home the point here to the many people who write posts similar to yours.
    "My car won't start, please tell me why" is insufficient information. Perhaps it is out of gas. Perhaps the battery is dead. Perhaps you didn't turn the key in the ignition.

  • Problem reading data from two tables

    Hi experts,
    I'm developing a JDBC-to-IDoc scenario that needs to read data from two Oracle tables. I have created a BPM with an initial fork for the two channels, and it works fine.
    The problem is that I need to read data from the first table, the second table, or both, depending on whether there is data to read. If there is data in both tables it works, but if there is data in only one of the two tables, I have read problems. I have tried setting 'necessary branches' to 1, but that is a problem when I do have data in both tables.
    Any idea?
    Best Regards,
    Alfredo Lagunar.

    Hi,
    Put your fork step inside a block, then right-click the block and insert a deadline branch into your BPM process, specifying the time after which the process should be cancelled. If you receive data from both tables within that time, your BPM works as before; otherwise, once the deadline is over, the BPM process is cancelled.
    Regards,
    Rajeev Gupta

  • To read data from exel file into sap

    hi all,
    How can I read data from an Excel file into an internal table in ABAP?
    Regards,
    sugeet.

    Hi Sugeet,
    Use the following code.
    DATA : BEGIN OF tbl_asset occurs 0,
             anlkl LIKE anla-anlkl,          " Asset Class
             bukrs LIKE anla-bukrs,          " Company Code
             ranl1 LIKE ra02s-ranl1,         " Asset #
             txt50 LIKE anla-txt50,          " Description 1
             txa50 LIKE anla-txa50,          " Description 2
             sernr LIKE anla-sernr,          " Serial #
             invnr LIKE anla-invnr,          " Inventory #
             menge LIKE anla-menge,          " Quantity
             meins LIKE anla-meins,          " Base UOM
             inken LIKE anla-inken,          " Inventory
    END OF tbl_asset.
    DATA : w_filename TYPE IBIPPARMS-path,
           w_file     TYPE string.
    start-of-selection.
    *popup for file path from user
    CALL FUNCTION 'F4_FILENAME'
    EXPORTING
       PROGRAM_NAME        = SYST-CPROG
       DYNPRO_NUMBER       = SYST-DYNNR
    IMPORTING
       FILE_NAME           = w_filename          .
    MOVE w_filename TO w_file .
    * upload data
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        FILENAME                      =  w_file
        FILETYPE                      = 'ASC'
        HAS_FIELD_SEPARATOR           = 'X'
      TABLES
        DATA_TAB                      = tbl_asset
      EXCEPTIONS
       FILE_OPEN_ERROR               = 1
       FILE_READ_ERROR               = 2
       NO_BATCH                      = 3
       GUI_REFUSE_FILETRANSFER       = 4
       INVALID_TYPE                  = 5
       NO_AUTHORITY                  = 6
       UNKNOWN_ERROR                 = 7
       BAD_DATA_FORMAT               = 8
       HEADER_NOT_ALLOWED            = 9
       SEPARATOR_NOT_ALLOWED         = 10
       HEADER_TOO_LONG               = 11
       UNKNOWN_DP_ERROR              = 12
       ACCESS_DENIED                 = 13
       DP_OUT_OF_MEMORY              = 14
       DISK_FULL                     = 15
       DP_TIMEOUT                    = 16
       OTHERS                        = 17.
    IF SY-SUBRC <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    For HAS_FIELD_SEPARATOR use:
    'X': fields are separated by tabs.
    SPACE: fields are not separated by tabs; in this case the table must contain only one column, or all columns must be contained in the file in their entire length.
    (Note: with FILETYPE = 'ASC' the file has to be a plain, e.g. tab-delimited, text file; a native .xls workbook is not read correctly this way.)
    Hope it helps...
    Lokesh
    Pls. reward appropriate points

  • Read data from MDM For Lookup and Flat table using MDM ABAP API

    Hi,
    I have a requirement to read data from MDM, from a flat table and a lookup table, using the MDM ABAP API. My design is like this:
    I have one main table ITEMS in MDM and, inside it, one lookup flat table ITEM_TYPE. My requirement is to read the item number and its related item type
    from ABAP.
    Please help if anybody has any idea.
    Regards,
    Shyam

    Hi guys,
    I found the solution myself. Below is the solution; hope it will help others:
    Retrieve data from MDM using the MDM ABAP API.
    Step-1. Create a structure in SAP with the same names as the MDM field codes of the MDM main table.
    Step-2. Create another structure in SAP having all lookup fields of MDM; the field names in ECC must be the same as the MDM field codes.
    Step-3. Create a structure in SAP for each individual lookup field (single field only) with the same name as the MDM field code; see the illustrative sketch below.
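    For illustration only, a rough sketch of what steps 1 to 3 could look like. In practice these are DDIC structures, and the names and field codes below are pure assumptions that must be replaced by your own MDM field codes:
    * Step 1 (illustrative): structure mirroring the MDM main table ITEMS.
    TYPES: BEGIN OF ty_items,
             item_number TYPE string, "component names = MDM field codes (assumed)
             item_type   TYPE string,
           END OF ty_items.
    * Step 2 (illustrative): structure with all lookup fields of the main table.
    TYPES: BEGIN OF ty_items_lookups,
             item_type TYPE string,   "lookup field code (assumed)
           END OF ty_items_lookups.
    * Step 3 (illustrative): structure for the single lookup field of table ITEM_TYPE.
    TYPES: BEGIN OF ty_item_type,
             item_type TYPE string,   "assumed field code
           END OF ty_item_type.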
    Step-4.
    DATA: IT_QUERY            TYPE STANDARD TABLE OF MDM_QUERY,  "MDM_QUERY_TABLE,
          WA_QUERY            TYPE  MDM_QUERY,
          WA_CDT_TEXT         TYPE  MDM_CDT_TEXT,
          IT_RESULT_SET_KEY   TYPE  MDM_SEARCH_RESULT_TABLE,
          WA_RESULT_SET_KEY   TYPE  MDM_SEARCH_RESULT,
          WA_STRING           TYPE  STRING.
    DATA: <internal table 1> TYPE STANDARD TABLE OF <SAP structure having all lookup fields>.
    DATA: <internal table 2> TYPE STANDARD TABLE OF <SAP structure with one lookup field>,
          <work area>        LIKE LINE OF <internal table 2>.
    *PASS LOGICAL OBJECT NAME.
    V_LOG_OBJECT_NAME = 'Logical object name defined in Customization'.
    * Define logon language, country & region for the server.
    WA_LANGUAGE-LANGUAGE = 'eng'.
    WA_LANGUAGE-COUNTRY = 'US'.
    WA_LANGUAGE-REGION = 'USA'.
    TRY.
        CREATE OBJECT LR_API
          EXPORTING
            IV_LOG_OBJECT_NAME = V_LOG_OBJECT_NAME.
    ENDTRY.
    * Connect to the repository; apply the particular logon language info.
    CALL METHOD LR_API->MO_ACCESSOR->CONNECT
      EXPORTING
        IS_REPOSITORY_LANGUAGE = WA_LANGUAGE.
    *NOW PASS ITEM NO AND GET KEY FROM MDM.
    CLEAR WA_QUERY.
    WA_QUERY-PARAMETER_CODE  = <MDM FIELD CODE>. "Field code
    WA_QUERY-OPERATOR        = 'EQ'. "Equals
    WA_QUERY-DIMENSION_TYPE  = 1. "Field search
    WA_QUERY-CONSTRAINT_TYPE = 8. "Text search
    WA_STRING                = <Field Value>.
    GET REFERENCE OF WA_STRING INTO WA_QUERY-VALUE_LOW.
    APPEND WA_QUERY TO IT_QUERY.
    CLEAR WA_QUERY.
    *PASS ITEM NUMBER AND GET RELATED KEY FROM MDM.
    TRY.
        CALL METHOD LR_API->MO_CORE_SERVICE->QUERY
          EXPORTING
            IV_OBJECT_TYPE_CODE = <MDM Main Table>
            IT_QUERY            = IT_QUERY
          IMPORTING
            ET_RESULT_SET       = IT_RESULT_SET_KEY.
      CATCH CX_MDM_COMMUNICATION_FAILURE .
      CATCH CX_MDM_KERNEL .
      CATCH CX_MDM_NOT_SUPPORTED .
      CATCH CX_MDM_USAGE_ERROR .
      CATCH CX_MDM_PROVIDER .
      CATCH CX_MDM_SERVER_RC_CODE .
    ENDTRY.
    * Pass the record ids into the keys table.
    LOOP AT IT_RESULT_SET_KEY INTO WA_RESULT_SET_KEY.
      WA_KEYS = WA_RESULT_SET_KEY-RECORD_IDS.
    ENDLOOP.
    WA_RESULT_SET_DEFINITION-FIELD_NAME = <Lookup field name>.
    APPEND WA_RESULT_SET_DEFINITION TO IT_RESULT_SET_DEFINITION.
    CALL METHOD LR_API->MO_CORE_SERVICE->RETRIEVE
      EXPORTING
        IV_OBJECT_TYPE_CODE      = <MDM Main Table>
        IT_RESULT_SET_DEFINITION = IT_RESULT_SET_DEFINITION
        IT_KEYS                  = WA_KEYS
      IMPORTING
        ET_RESULT_SET            = IT_RESULT_SET.
    LOOP AT IT_RESULT_SET INTO WA_RESULT_SET.
    * Pass the keys into the main table to get the structure for the flat / lookup table.
      TRY.
          CALL METHOD LR_API->MO_CORE_SERVICE->RETRIEVE_SIMPLE
            EXPORTING
              IV_OBJECT_TYPE_CODE = <MDM Main Table>
              IT_KEYS             = WA_KEYS
            IMPORTING
              ET_DDIC_STRUCTURE   = <internal table with all lookup fields>.
      ENDTRY.
      LOOP AT <internal table with all lookup fields> INTO <Work area>.
        CLEAR WA_KEYS.
        APPEND <Work area>-<lookup field> TO WA_KEYS.
        CALL METHOD LR_API->MO_CORE_SERVICE->RETRIEVE_SIMPLE
          EXPORTING
            IV_OBJECT_TYPE_CODE = <MDM Lookup table name>
            IT_KEYS             = WA_KEYS
          IMPORTING
            ET_DDIC_STRUCTURE   = <Single Structure in SAP For Lookup field>.
        READ TABLE <Single Structure in SAP For Lookup field> INTO <Work Area> INDEX 1.
    * Here you get the values of the related lookup fields associated with the main table data.
      ENDLOOP.
    ENDLOOP.
    LR_API->MO_ACCESSOR->DISCONNECT( ).

  • [Forum FAQ] How to configure a Data Driven Subscription which get multi-value parameters from one column of a database table?

    Introduction
    In SQL Server Reporting Services, we can define a mapping between the fields returned by the query and specific delivery options and report parameters in a data-driven subscription.
    For a report with a parameter (such as YEAR) that allows multiple values, when creating a data-driven subscription, how can we pass a record like the one below so that the report shows the correct data (data for the years 2012, 2013 and 2014)?
    EmailAddress                  Parameter           Comment
    [email protected]       2012,2013,2014      NULL
    In this article, I will demonstrate how to configure a data-driven subscription which gets multi-value parameters from one column of a database table.
    Workaround
    Generally, if we pass the "Parameter" column directly to the report in step 5 when creating the data-driven subscription, the value "2012,2013,2014" is treated as a single value and Reporting Services uses "2012,2013,2014" to filter the data. However, there are no records whose YEAR field equals "2012,2013,2014", so when the subscription executes we get an error in the log (C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles):
    Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Name' is not a valid value.
    This means there is no such value in the parameter's available-value list, i.e. it is an invalid parameter value. If we change the parameter records as below:
    EmailAddress                  Parameter    Comment
    [email protected]       2012         NULL
    [email protected]       2013         NULL
    [email protected]       2014         NULL
    then Reporting Services generates three reports for the one data-driven subscription, each report covering only one year, which obviously does not meet the requirement.
    Currently there is no direct solution for this issue. The workaround is to create two reports: one used by end users to view the report, and another used only to create the data-driven subscription.
    On the report used to create the data-driven subscription, uncheck the "Allow multiple values" option for the parameter and do not specify available values or default values for it. Then change the filter
    From
    Expression: [ParameterName]
    Operator:   In
    Value:      [@ParameterName]
    To
    Expression: [ParameterName]
    Operator:   In
    Value:      Split(Parameters!ParameterName.Value,",")
    In this case, we can pass a value like "2012,2013,2014" from the database to the data-driven subscription.
    Applies to
    Microsoft SQL Server 2005
    Microsoft SQL Server 2008
    Microsoft SQL Server 2008 R2
    Microsoft SQL Server 2012
    Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.

    For every Auftrag there are multiple Position entries; the rest of the blocks don't seem to have any relation.
    So you can check this code to see how internal table lt_str is built: its first 3 fields hold the Auftrag data and the next 3 fields hold the Position data. The structure is flat, assuming that every Position record is related to the preceding Auftrag.
    Try out this snippet.
    DATA lt_data TYPE TABLE OF string.
    DATA lv_data TYPE string.
    CALL METHOD cl_gui_frontend_services=>gui_upload
      EXPORTING
        filename = 'C:\temp\test.txt'
      CHANGING
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 19.
    CHECK sy-subrc EQ 0.
    TYPES:
    BEGIN OF ty_str,
      a1 TYPE string,
      a2 TYPE string,
      a3 TYPE string,
      p1 TYPE string,
      p2 TYPE string,
      p3 TYPE string,
    END OF ty_str.
    DATA: lt_str TYPE TABLE OF ty_str,
          ls_str TYPE ty_str,
          lv_block TYPE string,
          lv_flag TYPE boolean.
    LOOP AT lt_data INTO lv_data.
      CASE lv_data.
        WHEN '[Version]' OR '[StdSatz]' OR '[Arbeitstag]' OR '[Pecunia]'
             OR '[Mita]' OR '[Kunde]' OR '[Auftrag]' OR '[Position]'.
          lv_block = lv_data.
          lv_flag = abap_false.
        WHEN OTHERS.
          lv_flag = abap_true.
      ENDCASE.
      CHECK lv_flag EQ abap_true.
      CASE lv_block.
        WHEN '[Auftrag]'.
          SPLIT lv_data AT ';' INTO ls_str-a1 ls_str-a2 ls_str-a3.
        WHEN '[Position]'.
          SPLIT lv_data AT ';' INTO ls_str-p1 ls_str-p2 ls_str-p3.
          APPEND ls_str TO lt_str.
      ENDCASE.
    ENDLOOP.

  • Reading data from flat file Using TEXT_IO

    Dear Gurus,
    I already posted this question, but this time I need some other changes; sorry for that.
    I am using 10g Forms and TEXT_IO for reading data from a flat file.
    My data is like this:
    0|BP-V1|20100928|01|1|2430962.89|27|2430962.89|MUR|20100928120106
    9|2430962.89|000111111111|
    1|61304.88|000014104113|
    1|41961.73|000022096086|
    1|38475.65|000023640081|
    1|49749.34|000032133154|
    1|35572.46|000033093377|
    1|246671.01|000042148111|
    Here each column is separated by '|'. I want to read all the columns and do some validation on them.
    How can I do that?
    Initially my requirement was to read only 2 or 3 columns, so I did it like this:
    Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
    IS
    v_handle utl_file.file_type;
    v_filebuffer varchar2(500);
    line_0_date VARCHAR2 (10);
    line_0_Purp VARCHAR2 (10);
    line_0_count Number;
    line_0_sum number(12,2);
    line_0_ccy Varchar2(3);
    line_9_sum Number(12,2);
    line_9_Acc_no Varchar2(12);
    Line_1_Sum Number(12,2);
    Line_1_tot Number(15,2) := 0;
    Line_1_flag Number := 0;
    lval number;
    lacno varchar2(16);
    v_file varchar2(20);
    v_path varchar2(50);
    Begin
    v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
    v_path := rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
    v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
    v_handle := UTL_FILE.fopen (v_path, v_file, 'r');
    LOOP
    UTL_FILE.get_line (v_handle, v_filebuffer);
    IF SUBSTR (v_filebuffer, 0, 1) = '0' THEN
    SELECT line_0 INTO line_0_date
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    SELECT line_0 INTO line_0_Purp
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 4;
    SELECT line_0 INTO line_0_count
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 7;
    SELECT line_0 INTO line_0_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 8;
    SELECT line_0 INTO line_0_ccy
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_0, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 9;
    ELSIF SUBSTR (v_filebuffer, 0, 1) = '9' THEN
    SELECT line_9 INTO line_9_Acc_no
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    SELECT line_9 INTO line_9_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_9, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 2;
    ELSIF SUBSTR (v_filebuffer, 0, 1) = '1' THEN
    line_1_flag := line_1_flag+1;
    SELECT line_1 INTO line_1_sum
    FROM (SELECT LTRIM (REGEXP_SUBSTR (v_filebuffer, '[^|]+{1}', 1, LEVEL)) line_1, ROWNUM rn
    FROM DUAL
    CONNECT BY LEVEL <= LENGTH (REGEXP_REPLACE (v_filebuffer, '[^|]*')) + 1)
    WHERE rn = 3;
    Line_1_tot := Line_1_tot + line_1_sum;
    END IF;
    END LOOP;
    DBMS_OUTPUT.put_line (Line_1_tot);
    DBMS_OUTPUT.PUT_LINE (Line_1_flag);
    UTL_FILE.fclose (v_handle);
    END;
    But now how can I do it? Should I use a SELECT statement like this for all the columns?

    Sorry for that.
    As per our requirement, I need to read the flat file, which looks like this:
    0|BP-V1|20100928|01|1|2430962.89|9|2430962.89|MUR|20100928120106
    9|2430962.89|000111111111|
    1|61304.88|000014104113|
    1|41961.73|000022096086|
    1|38475.65|000023640081|
    1|49749.34|000032133154|
    1|35572.46|000033093377|
    1|246671.01|000042148111|
    1|120737.25|000053101979|
    1|151898.79|000082139768|
    1|84182.34|000082485593|
    I have to check the file. The validations are:
    The 1st line should start with 0, otherwise raise an error and insert that error into a table.
    The same for the 2nd line: it should start with 9, otherwise raise an error and insert it into the table.
    The 3rd line onward should start with 1, otherwise raise an error and insert it into the table.
    After that I validate the 2nd column of the 1st line: it should be BP-V1, otherwise raise an error and insert it into a table. Then I check the 3rd column, which is 20100928; it should be in YYYYMMDD format, otherwise the same error handling. Like this I have a different validation for each column.
    Then, for the 2nd line, 3rd column: this is an account number. First I check that it is 12 characters, otherwise error. Then I compare it with what the user entered in the form; for example, if the user entered 111111111, I compare it against the 000111111111 in the 2nd line, i.e. I have to prefix the user's account number with 000.
    Then, for all the lines starting with 1, I take the 2nd column and sum it. I compare that sum with the 6th column of the 1st line (the line starting with 0); it should be the same, otherwise error.
    In the same way I count all the lines starting with 1 and compare the count with the 7th column of the 1st line; it should be the same. Here, in this file, it should be 9.
    MY CODE IS :-
    Procedure Pay_Simulator(lfile_type varchar2,lac_no varchar2,lcur varchar2,lno_item number,ltotal number,ldate date,lpay_purp varchar2,lfile_name varchar2)
    IS
    v_handle TEXT_IO.file_type;
    v_filebuffer varchar2(500);
    line_0_date VARCHAR2 (10);
    line_0_Purp VARCHAR2 (10);
    line_0_count Number;
    line_0_sum number(12,2);
    line_0_ccy Varchar2(3);
    line_9_sum Number(12,2);
    line_9_Acc_no Varchar2(12);
    Line_1_Sum Number(12,2);
    Line_1_tot Number(15,2) := 0;
    Line_1_flag Number := 0;
    lval number;
    lacno varchar2(16);
    v_file varchar2(20);
    v_path varchar2(50);
    LC$String VARCHAR2(50) ;--:= 'one|two|three|four|five|six|seven' ;
    LC$Token VARCHAR2(100) ;
    i PLS_INTEGER := 2 ;
    lfirst_char number;
    lvalue Varchar2(100) ;
    Begin
    v_file := mcb_simulator_pkg.GET_FILENAME(lfile_name); -- For the file name
    v_path := rtrim(regexp_substr( lfile_name , '.*\\' ),'\'); -- For the Path
    --v_path := SUBSTR (lfile_name,0, INSTR (lfile_name, '\', -1));
    Message(lfile_name);
    v_handle := TEXT_IO.fopen(lfile_name, 'r');
              BEGIN
                        LOOP
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        --Message('First Char '||lfirst_char); 
                                  IF lfirst_char = '0' Then
                                  Loop
                                  LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                                  Message('VAL - '||LC$Token);
                                  lvalue := LC$Token;
                                  EXIT WHEN LC$Token IS NULL ;
    i := i + 1 ;
    End Loop;
                                  Else
                                       Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (9999,'0002','First line should always start with 0');
                                       Forms_DDL('Commit');
                                       raise form_Trigger_failure;
                                  End if ;
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                        --Message('Row '||LC$Token);
                             IF lfirst_char = '9' Then
                                  Null;
                             Else
                                  Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (8888,'0016','Second line should start with 9');
                                  Forms_DDL('Commit');
                                  raise form_Trigger_failure;
                             End IF;
                        LOOP
                        TEXT_IO.get_line (v_handle, v_filebuffer);
                        lfirst_char := Substr(v_filebuffer,0,1);
                        LC$Token := mcb_simulator_pkg.Split( v_filebuffer, i , '|') ;
                        --Message('Row '||LC$Token);
                                  IF lfirst_char = '1' Then
                                  Null;
                                  Else
                                       Insert into MU_SIMULATOR_output_ERR (load_no,ERR_CODE,ERR_DESC) values (7777,'0022','The third line onward should start with 1');
                                       Forms_DDL('Commit');
                                       raise form_Trigger_failure;
                                  End if;
                        END LOOP;
                        --END IF;
                        END LOOP;
              EXCEPTION
                   When No_Data_Found Then
              TEXT_IO.fclose (v_handle);
              END;
    Exception
         When Others Then
         Message('Other error');
    END;
    I am calling the function you gave, SPLIT, as mcb_simulator_pkg.Split.

  • Read data from ODS table using value mapping

    Hi all,
    Can anyone help with how to read data from an ODS table using value mapping?

    Mudit,
    Take a look at this blog,
    DB Lookup in Mapping: /people/siva.maranani/blog/2005/08/23/lookup146s-in-xi-made-simpler
    Regards,
    Bhavesh

  • Static lookup lists:read data from a Java *.properties file

    Hi
    I need to make static lookup lists that read their data from a Java *.properties file.
    I am using the class "PropertyFileBasedLookupViewObjectImpl" written by Steve Muench in ToyStore,
    but I need to use the default language, so I updated the loadDataFromPropertiesFile()
    method to find the correct properties file:
    String temp = Locale.getDefault().getLanguage();
    String propertyFile =
        getViewDef().getFullName().replace('.', '/') + "_" + temp + ".properties";
    The problem:
    For English (TEST_en.properties) it is good and working.
    For Arabic (TEST_ar.properties) it reads from the correct file _ar.properties,
    but the displayed characters are wrong.
    When debugging:
    In the file: 1=دمشق
    In debug: 1=/u32423

    Depending on your use case, you can either use a programmatic VO or directly expose the Java class as a data control.
    http://docs.oracle.com/cd/E18941_01/tutorials/jdtut_11r2_36/jdtut_11r2_36.html

  • Reading data from flat file in aplication server

    Hi all,
    Can anybody provide code for how to read data from a flat file that is on the application server?
    Thanks in advance.

    Hi,
    Check this sample code:
    parameters: p_file like rlgrap-filename obligatory
                default '/usr/sap/upload.xls'.
    types: begin of t_data,
             vbeln like vbap-vbeln,
             posnr like vbap-posnr,
             matnr like vbap-matnr,
             werks like vbap-werks,
             menge like vbap-zmeng,
           end of t_data.
    data: it_data type standard table of t_data,
          wa_data type t_data.
    * open the dataset for reading (input), not for output
    open dataset p_file for input in text mode encoding default.
    if sy-subrc ne 0.
      write: / 'Unable to open file:', p_file.
    else.
      do.
        read dataset p_file into wa_data.
        if sy-subrc ne 0.
          exit.
        else.
          append wa_data to it_data.
        endif.
      enddo.
      close dataset p_file.
    endif.
    Rgds
    Anver
    Rgds
    Anver
