How to implement a state-of-charge Kalman filter algorithm in C code

hi,
I am going to implement the Kalman filter algorithm in C code. Has anyone here done this before? Please share your ideas and tell me how I can implement it. It would be highly appreciated. Thanks in advance.
Here I have attached the Kalman filter algorithm file.
regards,
usman
Attachments:
1.docx ‏74 KB

Hi,
did you already have a look at some implementations of the Kalman filter? For example, this one:
Kalman filter c code implementation. How to tune required variables?
http://forums.udacity.com/questions/1021647/kalman-filter-c-code-implementation-how-to-tune-required...
or this one:
http://alumni.media.mit.edu/~wad/mas864/psrc/kalman.c.txt
Hope it helps!
Aurelie
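
For reference, below is a minimal sketch of the kind of filter those links implement, reduced to a single state (the SoC itself) so the predict/update structure is easy to see. Everything model-specific here is an illustrative assumption: the battery model, the tuning values, and the idea of turning the measured terminal voltage into an SoC reading. A production state-of-charge estimator is usually an extended Kalman filter built around a measured OCV-vs-SoC curve and a cell model, but the skeleton is the same.

#include <stdio.h>

/*
 * Minimal 1-state Kalman filter for battery state of charge (SoC).
 * State x = SoC in [0,1]. The predict step integrates the measured
 * current (coulomb counting); the update step corrects the prediction
 * with an SoC value inferred from the measured terminal voltage.
 * All parameter values are illustrative placeholders.
 */
typedef struct {
    double x;   /* state estimate: SoC           */
    double p;   /* estimate error covariance     */
    double q;   /* process noise covariance      */
    double r;   /* measurement noise covariance  */
} kf1d_t;

void kf_init(kf1d_t *kf, double x0, double p0, double q, double r)
{
    kf->x = x0; kf->p = p0; kf->q = q; kf->r = r;
}

/* Predict: SoC drops by (I * dt / capacity); F = 1 for this model. */
void kf_predict(kf1d_t *kf, double current_a, double dt_s, double capacity_as)
{
    kf->x -= current_a * dt_s / capacity_as;  /* x = F x + B u          */
    kf->p += kf->q;                           /* P = F P F' + Q (F = 1) */
}

/* Update with z = an SoC reading inferred from voltage (H = 1). */
void kf_update(kf1d_t *kf, double z)
{
    double k = kf->p / (kf->p + kf->r);       /* Kalman gain            */
    kf->x += k * (z - kf->x);                 /* blend in measurement   */
    kf->p *= (1.0 - k);                       /* shrink the covariance  */
}

int main(void)
{
    kf1d_t kf;
    kf_init(&kf, 0.90, 0.01, 1e-7, 1e-2);     /* made-up tuning values  */

    /* One 1 s step at 2 A discharge on a 10 Ah (36000 As) cell, with a
       noisy voltage-derived SoC reading of 0.93. */
    kf_predict(&kf, 2.0, 1.0, 36000.0);
    kf_update(&kf, 0.93);
    printf("SoC estimate: %.4f\n", kf.x);
    return 0;
}

The ratio of q to r is exactly the tuning question the first link discusses: a large r relative to q trusts coulomb counting and smooths the voltage-based correction, while a small r lets the voltage reading pull the estimate around quickly.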

Similar Messages

  • How to implement state machine architecture

Earlier in this forum I got help improving the subparts of my main VI. Now I have developed the main VI and it works correctly without any error, but I used a sequence architecture in it, and as everyone says, a state machine should be used instead of a sequence architecture. I am posting my VI; please help me with how to implement a state machine architecture in it. In my VI I have 5 tabs (presently I have developed only 2 of them): 1. patient info, 2. naadi acquisition, 3. nostril temp., 4. vision acquisition, 5. other info. Presently, with the tab control I am using a case structure so that only one particular page runs at a particular time. I want to know how I can improve this code to work more efficiently. (A minimal sketch of the state machine pattern, in C, follows this thread.)
    Attachments:
    intelligent diagnostic bench.lvproj ‏68 KB

I am sorry, I thought that VIs are attached with project files by default. I am attaching the main VI and all the other associated subVIs as well.
    Attachments:
    Final Diagnostic Project.vi ‏375 KB
    createfolder(SubVI).vi ‏65 KB
    checkstring(SubVI).vi ‏11 KB
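
    Since LabVIEW is graphical, the state machine pattern is easiest to describe in text. Below is the pattern as a minimal C sketch, one case per page; the state names mirror the tabs described in the question and are purely illustrative, and the transition choices are placeholders for the logic each page would decide.

    #include <stdio.h>

    /* Minimal state machine: one state variable, one loop, one case
     * per state. Each case does its page's work, then picks the next
     * state; this free choice of "what runs next" is what replaces a
     * fixed sequence structure. */
    typedef enum {
        STATE_PATIENT_INFO,
        STATE_NAADI_ACQUISITION,
        STATE_NOSTRIL_TEMP,
        STATE_VISION_ACQUISITION,
        STATE_OTHER_INFO,
        STATE_EXIT
    } state_t;

    int main(void)
    {
        state_t state = STATE_PATIENT_INFO;

        while (state != STATE_EXIT) {
            switch (state) {
            case STATE_PATIENT_INFO:
                /* handle the patient info page, then decide what runs next */
                state = STATE_NAADI_ACQUISITION;
                break;
            case STATE_NAADI_ACQUISITION:
                state = STATE_NOSTRIL_TEMP;
                break;
            case STATE_NOSTRIL_TEMP:
                state = STATE_VISION_ACQUISITION;
                break;
            case STATE_VISION_ACQUISITION:
                state = STATE_OTHER_INFO;
                break;
            case STATE_OTHER_INFO:
                state = STATE_EXIT;
                break;
            default:
                state = STATE_EXIT;
                break;
            }
        }
        printf("done\n");
        return 0;
    }

    The LabVIEW equivalent is a while loop around a case structure with the state kept in a shift register (or in a queue, for a queued state machine), so any case can jump to any other instead of running in a fixed order.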

  • How to implement classes with ALVs

    hi
    how to implement classes with ALVs?

    Hi Jyotsna,
    check these example programs.
    *"Table declarations...................................................
    TABLES:
    EKKO, " Purchasing Document Header
    CDHDR, " Change document header
    SSCRFIELDS. " Fields on selection screens
    *"Selection screen elements............................................
    SELECT-OPTIONS:
    S_EBELN FOR EKKO-EBELN, " Purchasing Document Number
    S_LIFNR FOR EKKO-LIFNR, " Vendor's account number
    S_EKGRP FOR EKKO-EKGRP, " Purchasing group
    S_BEDAT FOR EKKO-BEDAT, " Purchasing Document Date
    S_UDATE FOR CDHDR-UDATE. " Creation date of the change
    " document
    *" Data declarations...................................................
    * Field String to hold Purchase Document Number *
    DATA:
    BEGIN OF FS_EBELN,
    EBELN(90) TYPE C, " Purchase Document Number
    ERNAM TYPE EKKO-ERNAM, " Name of Person who Created
    " the Object
    LIFNR TYPE EKKO-LIFNR, " Vendor's account number
    EKGRP TYPE EKKO-EKGRP, " Purchasing group
    BEDAT TYPE EKKO-BEDAT, " Purchasing Document Date
    END OF FS_EBELN,
    * Field String to hold Purchase Document Header *
    BEGIN OF FS_EKKO,
    EBELN TYPE EKKO-EBELN, " Purchasing Document Number
    ERNAM TYPE EKKO-ERNAM, " Name of Person who Created the
    " Object
    LIFNR TYPE EKKO-LIFNR, " Vendor's account number
    EKGRP TYPE EKKO-EKGRP, " Purchasing group
    BEDAT TYPE EKKO-BEDAT, " Purchasing Document Date
    END OF FS_EKKO,
    * Field String to hold Account Number and name of the Vendor *
    BEGIN OF FS_LFA1,
    LIFNR TYPE LFA1-LIFNR, " Account Number of Vendor
    NAME1 TYPE LFA1-NAME1, " Name1
    END OF FS_LFA1,
    * Field String to hold Change date and the name of the user *
    BEGIN OF FS_CDHDR,
    OBJECTCLAS TYPE CDHDR-OBJECTCLAS, " Object Class
    OBJECTID TYPE CDHDR-OBJECTID, " Object value
    CHANGENR TYPE CDHDR-CHANGENR, " Document change number
    USERNAME TYPE CDHDR-USERNAME, " User name
    UDATE TYPE CDHDR-UDATE, " Creation date of the change
    " document
    END OF FS_CDHDR,
    * Field String to hold Change document items *
    BEGIN OF FS_CDPOS,
    OBJECTCLAS TYPE CDPOS-OBJECTCLAS," Object class
    OBJECTID(10) TYPE C, " Object Value
    CHANGENR TYPE CDPOS-CHANGENR, " Document change number
    TABNAME TYPE CDPOS-TABNAME, " Table Name
    FNAME TYPE CDPOS-FNAME, " Field Name
    VALUE_NEW TYPE CDPOS-VALUE_NEW, " New contents of changed field
    VALUE_OLD TYPE CDPOS-VALUE_OLD, " Old contents of changed field
    END OF FS_CDPOS,
    * Field String to hold Data Element Name *
    BEGIN OF FS_DATAELE,
    TABNAME TYPE DD03L-TABNAME, " Table Name
    FIELDNAME TYPE DD03L-FIELDNAME, " Field Name
    ROLLNAME TYPE DD03L-ROLLNAME, " Data element (semantic domain)
    END OF FS_DATAELE,
    * Field String to hold Short Text of the Data Element *
    BEGIN OF FS_TEXT,
    ROLLNAME TYPE DD04T-ROLLNAME, " Data element (semantic domain)
    DDTEXT TYPE DD04T-DDTEXT, " Short Text Describing R/3
    " Repository Objects
    END OF FS_TEXT,
    * Field String to hold data to be displayed on the ALV grid *
    BEGIN OF FS_OUTTAB,
    EBELN TYPE EKKO-EBELN, " Purchasing Document Number
    ERNAM TYPE EKKO-ERNAM, " Name of Person who Created the
    " Object
    LIFNR TYPE EKKO-LIFNR, " Vendor's account number
    EKGRP TYPE EKKO-EKGRP, " Purchasing group
    BEDAT TYPE EKKO-BEDAT, " Purchasing Document Date
    WERKS TYPE LFA1-WERKS, " Plant
    NAME1 TYPE LFA1-NAME1, " Name1
    USERNAME TYPE CDHDR-USERNAME, " User name
    UDATE TYPE CDHDR-UDATE, " Creation date of the change
    " document
    DDTEXT TYPE DD04T-DDTEXT, " Short Text Describing R/3
    " Repository Objects
    VALUE_NEW TYPE CDPOS-VALUE_NEW, " New contents of changed field
    VALUE_OLD TYPE CDPOS-VALUE_OLD, " Old contents of changed field
    END OF FS_OUTTAB,
    * Internal table to hold Purchase Document Number *
    T_EBELN LIKE STANDARD TABLE
    OF FS_EBELN,
    * Internal table to hold Purchase Document Header *
    T_EKKO LIKE STANDARD TABLE
    OF FS_EKKO,
    * Temp Internal table to hold Purchase Document Header *
    T_EKKO_TEMP LIKE STANDARD TABLE
    OF FS_EKKO,
    * Internal table to hold Account number and Name of the Vendor *
    T_LFA1 LIKE STANDARD TABLE
    OF FS_LFA1,
    * Internal Table to hold Change date and the name of the user *
    T_CDHDR LIKE STANDARD TABLE
    OF FS_CDHDR,
    * Internal Table to hold Change document items *
    T_CDPOS LIKE STANDARD TABLE
    OF FS_CDPOS,
    * Temp. Internal Table to hold Change document items *
    T_CDPOS_TEMP LIKE STANDARD TABLE
    OF FS_CDPOS,
    * Internal Table to hold Data Element Name *
    T_DATAELE LIKE STANDARD TABLE
    OF FS_DATAELE,
    * Temp. Internal Table to hold Data Element Name *
    T_DATAELE_TEMP LIKE STANDARD TABLE
    OF FS_DATAELE,
    * Internal Table to hold Short Text of the Data Element *
    T_TEXT LIKE STANDARD TABLE
    OF FS_TEXT,
    * Internal Table to hold data to be displayed on the ALV grid *
    T_OUTTAB LIKE STANDARD TABLE
    OF FS_OUTTAB.
    * C L A S S D E F I N I T I O N *
    CLASS LCL_EVENT_HANDLER DEFINITION DEFERRED.
    *" Data declarations...................................................
    * Work variables *
    DATA:
    W_EBELN TYPE EKKO-EBELN, " Purchasing Document Number
    W_LIFNR TYPE EKKO-LIFNR, " Vendor's account number
    W_EKGRP TYPE EKKO-EKGRP, " Purchasing group
    W_VALUE TYPE EKKO-EBELN, " Reflected Value
    W_SPACE VALUE ' ', " Space
    W_FLAG TYPE I, " Flag Variable
    W_VARIANT TYPE DISVARIANT, " Variant
    * ALV Grid
    W_GRID TYPE REF TO CL_GUI_ALV_GRID,
    * Event Handler
    W_EVENT_CLICK TYPE REF TO LCL_EVENT_HANDLER,
    * Field catalog table
    T_FIELDCAT TYPE LVC_T_FCAT.
    * AT SELECTION-SCREEN EVENT *
    AT SELECTION-SCREEN ON S_EBELN.
    * Subroutine to validate Purchase Document Number.
    PERFORM VALIDATE_PD_NUM.
    AT SELECTION-SCREEN ON S_LIFNR.
    * Subroutine to validate Vendor Number.
    PERFORM VALIDATE_VEN_NUM.
    AT SELECTION-SCREEN ON S_EKGRP.
    * Subroutine to validate Purchase Group.
    PERFORM VALIDATE_PUR_GRP.
    * START-OF-SELECTION EVENT *
    START-OF-SELECTION.
    * Subroutine to select all Purchase orders.
    PERFORM SELECT_PO.
    CHECK W_FLAG EQ 0.
    * Subroutine to select Object values.
    PERFORM SELECT_OBJ_ID.
    CHECK W_FLAG EQ 0.
    * Subroutine to select Changed values.
    PERFORM SELECT_CHANGED_VALUE.
    CHECK W_FLAG EQ 0.
    * Subroutine to Select Purchase Orders.
    PERFORM SELECT_PUR_DOC.
    * Subroutine to select Vendor Details.
    PERFORM SELECT_VENDOR.
    * Subroutine to select Text for the Changed values.
    PERFORM DESCRIPTION.
    * END-OF-SELECTION EVENT *
    END-OF-SELECTION.
    IF NOT T_EKKO IS INITIAL.
    * Subroutine to populate the Output Table.
    PERFORM FILL_OUTTAB.
    * Subroutine to build Field Catalog.
    PERFORM PREPARE_FIELD_CATALOG CHANGING T_FIELDCAT.
    CALL SCREEN 100.
    ENDIF. " IF NOT T_EKKO...
    * CLASS LCL_EVENT_HANDLER DEFINITION
    * Defining Class which handles events
    CLASS LCL_EVENT_HANDLER DEFINITION .
    PUBLIC SECTION .
    METHODS:
    HANDLE_HOTSPOT_CLICK
    FOR EVENT HOTSPOT_CLICK OF CL_GUI_ALV_GRID
    IMPORTING E_ROW_ID E_COLUMN_ID.
    ENDCLASS. " LCL_EVENT_HANDLER DEFINITION
    * CLASS LCL_EVENT_HANDLER IMPLEMENTATION
    * Implementing the Class which can handle events
    CLASS LCL_EVENT_HANDLER IMPLEMENTATION .
    *---Handle Double Click
    METHOD HANDLE_HOTSPOT_CLICK .
    * Subroutine to get the HotSpot Cell information.
    PERFORM GET_CELL_INFO.
    SET PARAMETER ID 'BES' FIELD W_VALUE.
    CALL TRANSACTION 'ME23N'.
    ENDMETHOD. " HANDLE_HOTSPOT_CLICK
    ENDCLASS. " LCL_EVENT_HANDLER
    *& Module STATUS_0100 OUTPUT
    * PBO Event
    MODULE STATUS_0100 OUTPUT.
    SET PF-STATUS 'OOPS'.
    SET TITLEBAR 'TIT'.
    * Subroutine to fill the Variant Structure
    PERFORM FILL_VARIANT.
    IF W_GRID IS INITIAL.
    CREATE OBJECT W_GRID
    EXPORTING
    I_SHELLSTYLE = 0
    * I_LIFETIME =
    I_PARENT = CL_GUI_CONTAINER=>SCREEN0
    * I_APPL_EVENTS =
    * I_PARENTDBG =
    * I_APPLOGPARENT =
    * I_GRAPHICSPARENT =
    * I_NAME =
    I_FCAT_COMPLETE = SPACE
    EXCEPTIONS
    ERROR_CNTL_CREATE = 1
    ERROR_CNTL_INIT = 2
    ERROR_CNTL_LINK = 3
    ERROR_DP_CREATE = 4
    OTHERS = 5.
    IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF. " IF SY-SUBRC <> 0
    CALL METHOD W_GRID->SET_TABLE_FOR_FIRST_DISPLAY
    EXPORTING
    * I_BUFFER_ACTIVE =
    * I_BYPASSING_BUFFER =
    * I_CONSISTENCY_CHECK =
    * I_STRUCTURE_NAME =
    IS_VARIANT = W_VARIANT
    I_SAVE = 'A'
    I_DEFAULT = 'X'
    * IS_LAYOUT =
    * IS_PRINT =
    * IT_SPECIAL_GROUPS =
    * IT_TOOLBAR_EXCLUDING =
    * IT_HYPERLINK =
    * IT_ALV_GRAPHICS =
    * IT_EXCEPT_QINFO =
    * IR_SALV_ADAPTER =
    CHANGING
    IT_OUTTAB = T_OUTTAB
    IT_FIELDCATALOG = T_FIELDCAT
    * IT_SORT =
    * IT_FILTER =
    EXCEPTIONS
    INVALID_PARAMETER_COMBINATION = 1
    PROGRAM_ERROR = 2
    TOO_MANY_LINES = 3
    OTHERS = 4.
    IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF. " IF SY-SUBRC <> 0.
    ENDIF. " IF W_GRID IS INITIAL
    CREATE OBJECT W_EVENT_CLICK.
    SET HANDLER W_EVENT_CLICK->HANDLE_HOTSPOT_CLICK FOR W_GRID.
    ENDMODULE. " STATUS_0100 OUTPUT
    *& Module USER_COMMAND_0100 INPUT
    * PAI Event
    MODULE USER_COMMAND_0100 INPUT.
    CASE SY-UCOMM.
    WHEN 'BACK'.
    LEAVE TO SCREEN 0.
    WHEN 'EXIT'.
    LEAVE PROGRAM.
    WHEN 'CANCEL'.
    LEAVE TO SCREEN 0.
    ENDCASE.
    ENDMODULE. " USER_COMMAND_0100 INPUT
    *& Form PREPARE_FIELD_CATALOG
    * Subroutine to build the Field catalog
    * <--P_T_FIELDCAT Field Catalog Table
    FORM PREPARE_FIELD_CATALOG CHANGING PT_FIELDCAT TYPE LVC_T_FCAT .
    DATA LS_FCAT TYPE LVC_S_FCAT.
    * Purchasing group...
    LS_FCAT-FIELDNAME = 'EKGRP'.
    LS_FCAT-REF_TABLE = 'EKKO'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Purchasing Document Number...
    LS_FCAT-FIELDNAME = 'EBELN'.
    LS_FCAT-REF_TABLE = 'EKKO' .
    LS_FCAT-EMPHASIZE = 'C411'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    LS_FCAT-HOTSPOT = 'X'.
    APPEND LS_FCAT TO PT_FIELDCAT .
    CLEAR LS_FCAT .
    * Name of Person who Created the Object...
    LS_FCAT-FIELDNAME = 'ERNAM'.
    LS_FCAT-REF_TABLE = 'EKKO'.
    LS_FCAT-OUTPUTLEN = '15' .
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Purchasing Document Date...
    LS_FCAT-FIELDNAME = 'BEDAT'.
    LS_FCAT-REF_TABLE = 'EKKO'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Vendor's account number...
    LS_FCAT-FIELDNAME = 'LIFNR'.
    LS_FCAT-REF_TABLE = 'EKKO'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Account Number of Vendor or Creditor...
    LS_FCAT-FIELDNAME = 'NAME1'.
    LS_FCAT-REF_TABLE = 'LFA1'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    LS_FCAT-COLTEXT = 'Vendor Name'(001).
    LS_FCAT-SELTEXT = 'Vendor Name'(001).
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Creation date of the change document...
    LS_FCAT-FIELDNAME = 'UDATE'.
    LS_FCAT-REF_TABLE = 'CDHDR'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    LS_FCAT-COLTEXT = 'Change Date'(002).
    LS_FCAT-SELTEXT = 'Change Date'(002).
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * User name of the person responsible in change document...
    LS_FCAT-FIELDNAME = 'USERNAME'.
    LS_FCAT-REF_TABLE = 'CDHDR'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '10'.
    LS_FCAT-COLTEXT = 'Modified by'(003).
    LS_FCAT-SELTEXT = 'Modified by'(003).
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Short Text Describing R/3 Repository Objects...
    LS_FCAT-FIELDNAME = 'DDTEXT'.
    LS_FCAT-REF_TABLE = 'DD04T'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '15'.
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * Old contents of changed field...
    LS_FCAT-FIELDNAME = 'VALUE_OLD'.
    LS_FCAT-REF_TABLE = 'CDPOS'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '12'.
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    * New contents of changed field...
    LS_FCAT-FIELDNAME = 'VALUE_NEW'.
    LS_FCAT-REF_TABLE = 'CDPOS'.
    LS_FCAT-INTTYPE = 'C'.
    LS_FCAT-OUTPUTLEN = '12'.
    APPEND LS_FCAT TO PT_FIELDCAT.
    CLEAR LS_FCAT.
    ENDFORM. " PREPARE_FIELD_CATALOG
    *& Form SELECT_PO
    * Subroutine to select all the Purchase Orders
    * There are no interface parameters to be passed to this subroutine.
    FORM SELECT_PO .
    SELECT EBELN " Purchasing Document Number
    ERNAM " Name of Person who Created
    " the Object
    LIFNR " Vendor's account number
    EKGRP " Purchasing group
    BEDAT " Purchasing Document Date
    FROM EKKO
    APPENDING TABLE T_EBELN
    PACKAGE SIZE 10000
    WHERE EBELN IN S_EBELN
    AND BEDAT IN S_BEDAT.
    ENDSELECT.
    IF SY-SUBRC NE 0.
    W_FLAG = 1.
    MESSAGE S401(M8).
    ENDIF. " IF SY-SUBRC NE 0
    ENDFORM. " SELECT_PO
    *& Form SELECT_OBJ_ID
    * Subroutine to select Object ID
    * There are no interface parameters to be passed to this subroutine.
    FORM SELECT_OBJ_ID .
    IF NOT T_EBELN IS INITIAL.
    SELECT OBJECTCLAS " Object Class
    OBJECTID " Object value
    CHANGENR " Document change number
    USERNAME " User name
    UDATE " Creation date
    FROM CDHDR
    INTO TABLE T_CDHDR
    FOR ALL ENTRIES IN T_EBELN
    WHERE OBJECTID EQ T_EBELN-EBELN
    AND UDATE IN S_UDATE
    AND TCODE IN ('ME21N','ME22N','ME23N').
    IF SY-SUBRC NE 0.
    W_FLAG = 1.
    MESSAGE S833(M8) WITH 'Header Not Found'(031).
    ENDIF. " IF SY-SUBRC NE 0.
    ENDIF. " IF NOT T_EBELN IS INITIAL
    ENDFORM. " SELECT_OBJ_ID
    *& Form SELECT_CHANGED_VALUE
    * Subroutine to select Changed Values
    * There are no interface parameters to be passed to this subroutine.
    FORM SELECT_CHANGED_VALUE .
    IF NOT T_CDHDR IS INITIAL.
    SELECT OBJECTCLAS " Object class
    OBJECTID " Object value
    CHANGENR " Document change number
    TABNAME " Table Name
    FNAME " Field Name
    VALUE_NEW " New contents of changed field
    VALUE_OLD " Old contents of changed field
    FROM CDPOS
    APPENDING TABLE T_CDPOS
    PACKAGE SIZE 10000
    FOR ALL ENTRIES IN T_CDHDR
    WHERE OBJECTCLAS EQ T_CDHDR-OBJECTCLAS
    AND OBJECTID EQ T_CDHDR-OBJECTID
    AND CHANGENR EQ T_CDHDR-CHANGENR.
    ENDSELECT.
    IF SY-SUBRC NE 0.
    W_FLAG = 1.
    MESSAGE S833(M8) WITH 'Item Not Found'(032).
    ENDIF. " IF SY-SUBRC NE 0.
    ENDIF. " IF NOT T_CDHDR IS INITIAL
    T_CDPOS_TEMP[] = T_CDPOS[].
    ENDFORM. " SELECT_CHANGED_VALUE
    *& Form SELECT_PUR_DOC
    * Subroutine to select Purchase Order Details
    * There are no interface parameters to be passed to this subroutine.
    FORM SELECT_PUR_DOC .
    IF NOT T_CDPOS IS INITIAL.
    SORT T_EBELN BY EBELN.
    LOOP AT T_CDPOS INTO FS_CDPOS.
    READ TABLE T_EBELN INTO FS_EBELN WITH KEY EBELN =
    FS_CDPOS-OBJECTID BINARY SEARCH.
    IF SY-SUBRC NE 0.
    DELETE TABLE T_EBELN FROM FS_EBELN.
    ENDIF. " IF SY-SUBRC NE 0.
    ENDLOOP. " LOOP AT T_CDPOS...
    LOOP AT T_EBELN INTO FS_EBELN.
    MOVE FS_EBELN-EBELN TO FS_EKKO-EBELN.
    MOVE FS_EBELN-ERNAM TO FS_EKKO-ERNAM.
    MOVE FS_EBELN-LIFNR TO FS_EKKO-LIFNR.
    MOVE FS_EBELN-EKGRP TO FS_EKKO-EKGRP.
    MOVE FS_EBELN-BEDAT TO FS_EKKO-BEDAT.
    APPEND FS_EKKO TO T_EKKO.
    ENDLOOP. " LOOP AT T_EBELN...
    T_EKKO_TEMP[] = T_EKKO[].
    ENDIF. " IF NOT T_CDPOS IS INITIAL
    ENDFORM. " SELECT_PUR_DOC
    *& Form SELECT_VENDOR
    * Subroutine to select Vendor details
    * There are no interface parameters to be passed to this subroutine.
    FORM SELECT_VENDOR .
    IF NOT T_EKKO IS INITIAL.
    SORT T_EKKO_TEMP BY LIFNR.
    DELETE ADJACENT DUPLICATES FROM T_EKKO_TEMP COMPARING LIFNR.
    SELECT LIFNR " Account Number of Vendor or
    " Creditor
    NAME1 " Name 1
    FROM LFA1
    INTO TABLE T_LFA1
    FOR ALL ENTRIES IN T_EKKO_TEMP
    WHERE LIFNR EQ T_EKKO_TEMP-LIFNR.
    IF SY-SUBRC NE 0.
    MESSAGE S002(M8) WITH 'Master Details'(033).
    ENDIF. " IF SY-SUBRC NE 0.
    ENDIF. " IF NOT T_EKKO IS INITIAL
    ENDFORM. " SELECT_VENDOR
    *& Form DESCRIPTION
    * Subroutine to get the description
    * There are no interface parameters to be passed to this subroutine.
    FORM DESCRIPTION .
    IF NOT T_CDPOS IS INITIAL.
    SORT T_CDPOS_TEMP BY TABNAME FNAME.
    DELETE ADJACENT DUPLICATES FROM T_CDPOS_TEMP COMPARING TABNAME FNAME.
    SELECT TABNAME " Table Name
    FIELDNAME " Field Name
    ROLLNAME " Data element
    FROM DD03L
    INTO TABLE T_DATAELE
    FOR ALL ENTRIES IN T_CDPOS_TEMP
    WHERE TABNAME EQ T_CDPOS_TEMP-TABNAME
    AND FIELDNAME EQ T_CDPOS_TEMP-FNAME.
    IF NOT T_DATAELE IS INITIAL.
    T_DATAELE_TEMP[] = T_DATAELE[].
    SORT T_DATAELE_TEMP BY ROLLNAME.
    DELETE ADJACENT DUPLICATES FROM T_DATAELE_TEMP COMPARING ROLLNAME.
    SELECT ROLLNAME " Data element
    DDTEXT " Short Text Describing R/3
    " Repository Objects
    FROM DD04T
    INTO TABLE T_TEXT
    FOR ALL ENTRIES IN T_DATAELE_TEMP
    WHERE ROLLNAME EQ T_DATAELE_TEMP-ROLLNAME
    AND DDLANGUAGE EQ SY-LANGU.
    IF SY-SUBRC NE 0.
    EXIT.
    ENDIF. " IF SY-SUBRC NE 0.
    ENDIF. " IF NOT T_DATAELE IS INITIAL.
    ENDIF. " IF NOT T_CDPOS IS INITIAL.
    ENDFORM. " DESCRIPTION
    *& Form FILL_OUTTAB
    * Subroutine to populate the Outtab
    * There are no interface parameters to be passed to this subroutine.
    FORM FILL_OUTTAB .
    SORT T_CDHDR BY OBJECTCLAS OBJECTID CHANGENR.
    SORT T_EKKO BY EBELN.
    SORT T_LFA1 BY LIFNR.
    SORT T_DATAELE BY TABNAME FIELDNAME.
    SORT T_TEXT BY ROLLNAME.
    LOOP AT T_CDPOS INTO FS_CDPOS.
    READ TABLE T_CDHDR INTO FS_CDHDR WITH KEY
    OBJECTCLAS = FS_CDPOS-OBJECTCLAS
    OBJECTID = FS_CDPOS-OBJECTID
    CHANGENR = FS_CDPOS-CHANGENR
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    MOVE FS_CDHDR-USERNAME TO FS_OUTTAB-USERNAME.
    MOVE FS_CDHDR-UDATE TO FS_OUTTAB-UDATE.
    READ TABLE T_EKKO INTO FS_EKKO WITH KEY
    EBELN = FS_CDHDR-OBJECTID
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    MOVE FS_EKKO-EBELN TO FS_OUTTAB-EBELN.
    MOVE FS_EKKO-ERNAM TO FS_OUTTAB-ERNAM.
    MOVE FS_EKKO-LIFNR TO FS_OUTTAB-LIFNR.
    MOVE FS_EKKO-EKGRP TO FS_OUTTAB-EKGRP.
    MOVE FS_EKKO-BEDAT TO FS_OUTTAB-BEDAT.
    READ TABLE T_LFA1 INTO FS_LFA1 WITH KEY
    LIFNR = FS_EKKO-LIFNR
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    MOVE FS_LFA1-NAME1 TO FS_OUTTAB-NAME1.
    ENDIF. " IF SY-SUBRC EQ 0.
    ENDIF. " IF SY-SUBRC EQ 0.
    ENDIF. " IF SY-SUBRC EQ 0.
    MOVE FS_CDPOS-VALUE_NEW TO FS_OUTTAB-VALUE_NEW.
    MOVE FS_CDPOS-VALUE_OLD TO FS_OUTTAB-VALUE_OLD.
    READ TABLE T_DATAELE INTO FS_DATAELE WITH KEY
    TABNAME = FS_CDPOS-TABNAME
    FIELDNAME = FS_CDPOS-FNAME
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    READ TABLE T_TEXT INTO FS_TEXT WITH KEY
    ROLLNAME = FS_DATAELE-ROLLNAME
    BINARY SEARCH.
    IF SY-SUBRC EQ 0.
    MOVE FS_TEXT-DDTEXT TO FS_OUTTAB-DDTEXT.
    ENDIF. " IF SY-SUBRC EQ 0.
    ENDIF. " IF SY-SUBRC EQ 0.
    APPEND FS_OUTTAB TO T_OUTTAB.
    CLEAR FS_OUTTAB.
    ENDLOOP.
    ENDFORM. " FILL_OUTTAB
    *& Form GET_CELL_INFO
    * Subroutine to get the Cell Information
    * --> W_VALUE Holds the value of Hotspot clicked
    FORM GET_CELL_INFO .
    CALL METHOD W_GRID->GET_CURRENT_CELL
    IMPORTING
    * E_ROW =
    * E_COL =
    * ES_ROW_ID =
    * ES_COL_ID =
    * ES_ROW_NO =
    E_VALUE = W_VALUE.
    ENDFORM. " GET_CELL_INFO
    *& Form VALIDATE_PD_NUM
    * Subroutine to validate Purchase Document Number
    * There are no interface parameters to be passed to this subroutine.
    FORM VALIDATE_PD_NUM .
    IF NOT S_EBELN[] IS INITIAL.
    SELECT EBELN " Purchase Document Number
    FROM EKKO
    INTO W_EBELN
    UP TO 1 ROWS
    WHERE EBELN IN S_EBELN.
    ENDSELECT.
    IF SY-SUBRC NE 0.
    CLEAR SSCRFIELDS-UCOMM.
    MESSAGE E717(M8).
    ENDIF. " IF SY-SUBRC NE 0
    ENDIF. " IF NOT S_EBELN[]...
    ENDFORM. " VALIDATE_PD_NUM
    *& Form VALIDATE_VEN_NUM
    * Subroutine to validate Vendor Number
    * There are no interface parameters to be passed to this subroutine.
    FORM VALIDATE_VEN_NUM .
    IF NOT S_LIFNR[] IS INITIAL.
    SELECT LIFNR " Vendor Number
    FROM LFA1
    INTO W_LIFNR
    UP TO 1 ROWS
    WHERE LIFNR IN S_LIFNR.
    ENDSELECT.
    IF SY-SUBRC NE 0.
    CLEAR SSCRFIELDS-UCOMM.
    MESSAGE E002(M8) WITH W_SPACE.
    ENDIF. " IF SY-SUBRC NE 0
    ENDIF. " IF NOT S_LIFNR[]...
    ENDFORM. " VALIDATE_VEN_NUM
    *& Form VALIDATE_PUR_GRP
    * Subroutine to validate the Purchase Group
    * There are no interface parameters to be passed to this subroutine.
    FORM VALIDATE_PUR_GRP .
    IF NOT S_EKGRP[] IS INITIAL.
    SELECT EKGRP " Purchase Group
    FROM T024
    INTO W_EKGRP
    UP TO 1 ROWS
    WHERE EKGRP IN S_EKGRP.
    ENDSELECT.
    IF SY-SUBRC NE 0.
    CLEAR SSCRFIELDS-UCOMM.
    MESSAGE E622(M8) WITH W_SPACE.
    ENDIF. " IF SY-SUBRC NE 0
    ENDIF. " IF NOT S_EKFRP[]...
    ENDFORM. " VALIDATE_PUR_GRP
    *& Form FILL_VARIANT
    * Subroutine to fill the Variant Structure
    * There are no interface parameters to be passed to this subroutine
    FORM FILL_VARIANT .
    * Filling the Variant structure
    W_VARIANT-REPORT = SY-REPID.
    W_VARIANT-USERNAME = SY-UNAME.
    ENDFORM. " FILL_VARIANT
    REPORT YMS_HIERSEQLISTDISPLAY .
    * Program with FM REUSE_ALV_HIERSEQ_LIST_DISPLAY *
    * Author : Michel PIOUD *
    * Email : mpioud@yahoo.fr HomePage : http://www.geocities.com/mpioud *
    TYPE-POOLS: slis. " ALV Global types
    CONSTANTS :
    c_x VALUE 'X',
    c_gt_vbap TYPE SLIS_TABNAME VALUE 'GT_VBAP',
    c_gt_vbak TYPE SLIS_TABNAME VALUE 'GT_VBAK'.
    SELECTION-SCREEN :
    SKIP, BEGIN OF LINE,COMMENT 5(27) v_1 FOR FIELD p_max. "#EC NEEDED
    PARAMETERS p_max(02) TYPE n DEFAULT '10' OBLIGATORY.
    SELECTION-SCREEN END OF LINE.
    SELECTION-SCREEN :
    SKIP, BEGIN OF LINE,COMMENT 5(27) v_2 FOR FIELD p_expand. "#EC NEEDED
    PARAMETERS p_expand AS CHECKBOX DEFAULT c_x.
    SELECTION-SCREEN END OF LINE.
    TYPES :
    * 1st Table
    BEGIN OF ty_vbak,
    vbeln TYPE vbak-vbeln, " Sales document
    kunnr TYPE vbak-kunnr, " Sold-to party
    netwr TYPE vbak-netwr, " Net Value of the Sales Order
    erdat TYPE vbak-erdat, " Creation date
    waerk TYPE vbak-waerk, " SD document currency
    expand TYPE xfeld,
    END OF ty_vbak,
    * 2nd Table
    BEGIN OF ty_vbap,
    vbeln TYPE vbap-vbeln, " Sales document
    posnr TYPE vbap-posnr, " Sales document
    matnr TYPE vbap-matnr, " Material number
    netwr TYPE vbap-netwr, " Net Value of the Sales Order
    waerk TYPE vbap-waerk, " SD document currency
    END OF ty_vbap.
    DATA :
    * 1st Table
    gt_vbak TYPE TABLE OF ty_vbak,
    * 2nd Table
    gt_vbap TYPE TABLE OF ty_vbap.
    INITIALIZATION.
    v_1 = 'Maximum of records to read'.
    v_2 = 'With ''EXPAND'' field'.
    START-OF-SELECTION.
    * Read Sales Document: Header Data
    SELECT vbeln kunnr netwr waerk erdat
    FROM vbak
    UP TO p_max ROWS
    INTO CORRESPONDING FIELDS OF TABLE gt_vbak.
    IF NOT gt_vbak[] IS INITIAL.
    * Read Sales Document: Item Data
    SELECT vbeln posnr matnr netwr waerk
    FROM vbap
    INTO CORRESPONDING FIELDS OF TABLE gt_vbap
    FOR ALL ENTRIES IN gt_vbak
    WHERE vbeln = gt_vbak-vbeln.
    ENDIF.
    PERFORM f_display.
    * Form F_DISPLAY
    FORM f_display.
    * Macro definition
    DEFINE m_fieldcat.
    ls_fieldcat-tabname = &1.
    ls_fieldcat-fieldname = &2.
    ls_fieldcat-ref_tabname = &3.
    ls_fieldcat-cfieldname = &4. " Field with currency unit
    append ls_fieldcat to lt_fieldcat.
    END-OF-DEFINITION.
    DEFINE m_sort.
    ls_sort-tabname = &1.
    ls_sort-fieldname = &2.
    ls_sort-up = c_x.
    append ls_sort to lt_sort.
    END-OF-DEFINITION.
    DATA:
    ls_layout TYPE slis_layout_alv,
    ls_keyinfo TYPE slis_keyinfo_alv,
    ls_sort TYPE slis_sortinfo_alv,
    lt_sort TYPE slis_t_sortinfo_alv," Sort table
    ls_fieldcat TYPE slis_fieldcat_alv,
    lt_fieldcat TYPE slis_t_fieldcat_alv." Field catalog
    ls_layout-group_change_edit = c_x.
    ls_layout-colwidth_optimize = c_x.
    ls_layout-zebra = c_x.
    ls_layout-detail_popup = c_x.
    ls_layout-get_selinfos = c_x.
    IF p_expand = c_x.
    ls_layout-expand_fieldname = 'EXPAND'.
    ENDIF.
    * Build field catalog and sort table
    m_fieldcat c_gt_vbak 'VBELN' 'VBAK' ''.
    m_fieldcat c_gt_vbak 'KUNNR' 'VBAK' ''.
    m_fieldcat c_gt_vbak 'NETWR' 'VBAK' 'WAERK'.
    m_fieldcat c_gt_vbak 'WAERK' 'VBAK' ''.
    m_fieldcat c_gt_vbak 'ERDAT' 'VBAK' ''.
    m_fieldcat c_gt_vbap 'POSNR' 'VBAP' ''.
    m_fieldcat c_gt_vbap 'MATNR' 'VBAP' ''.
    m_fieldcat c_gt_vbap 'NETWR' 'VBAP' 'WAERK'.
    m_fieldcat c_gt_vbap 'WAERK' 'VBAP' ''.
    m_sort c_gt_vbak 'KUNNR'.
    m_sort c_gt_vbap 'NETWR'.
    ls_keyinfo-header01 = 'VBELN'.
    ls_keyinfo-item01 = 'VBELN'.
    ls_keyinfo-item02 = 'POSNR'.
    * Display hierarchical list
    CALL FUNCTION 'REUSE_ALV_HIERSEQ_LIST_DISPLAY'
    EXPORTING
    i_callback_program = sy-cprog
    i_callback_user_command = 'USER_COMMAND'
    is_layout = ls_layout
    it_fieldcat = lt_fieldcat
    it_sort = lt_sort
    i_tabname_header = c_gt_vbak
    i_tabname_item = c_gt_vbap
    is_keyinfo = ls_keyinfo
    TABLES
    t_outtab_header = gt_vbak
    t_outtab_item = gt_vbap
    EXCEPTIONS
    program_error = 1
    OTHERS = 2.
    IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    ENDFORM. " F_LIST_DISPLAY
    * Form USER_COMMAND *
    FORM user_command USING i_ucomm TYPE sy-ucomm
    is_selfield TYPE slis_selfield. "#EC CALLED
    DATA ls_vbak TYPE ty_vbak.
    CASE i_ucomm.
    WHEN '&IC1'. " Pick
    CASE is_selfield-tabname.
    WHEN c_gt_vbap.
    WHEN c_gt_vbak.
    READ TABLE gt_vbak INDEX is_selfield-tabindex INTO ls_vbak.
    IF sy-subrc EQ 0.
    * Sales order number
    SET PARAMETER ID 'AUN' FIELD ls_vbak-vbeln.
    * Display Sales Order
    CALL TRANSACTION 'VA03' AND SKIP FIRST SCREEN.
    ENDIF.
    ENDCASE.
    ENDCASE.
    ENDFORM. " USER_COMMAND
    Kindly Reward Points If You Found The Reply Helpful,
    Cheers,
    Chaitanya.

  • How to implement User Exit in APO?

    Hello All,
    I am not sure how to use user exits. I was wondering if anyone can help me understand how to implement a user exit. Is there a T.Code where you do that, or is ABAP coding required? Also, how different is a BAdI from a user exit?
    Any advice is welcome.
    Thanks,
    Sanju

    Hi Sanju,
        Check this link.
    http://www.sap-img.com/abap/difference-between-badi-and-user-exits.htm
    BAdI or user exit: I don't think it is a choice between the two. Both serve the same purpose, adding some custom logic to the standard logic. What you use depends on your requirement: the point at which the user exit or BAdI is called, the information that is available to you in that user exit/BAdI, and the information that you can change in that user exit/BAdI. There is no difference in implementing a user exit in APO versus R/3.
    Regards,
    Siva.

  • How is the Kalman filter estimator implemented in an MPC?

    I have a physical process with 2 inputs, 4 states and 2 outputs. Two of the states are measured, while the other two need to be estimated using a Kalman filter. I have developed the controller (2 input / 2 output) with an MPC. I can see the 4 states in the implemented MPC, but how do I implement the Kalman filter required to estimate the 2 unmeasured states in the physical process? Any help appreciated.
    Attached is the implemented MPC.vi for comments.
    Attachments:
    MPC controller.zip ‏116 KB

    Hi Vicky,
    I don't have LabVIEW installed on this laptop, but I have implemented the LabVIEW MPC on a distillation column. You have 2 blocks that are necessary for the MPC, one outside the loop and one inside. You can connect all the necessary settings, like the model, the control horizon, the Kalman filter settings, and so on, to the block outside the loop. I think the control for the Kalman filter is called estimation parameters. There you select between no estimation, pole placement and Kalman filter. For the Kalman filter you have to give the controller 3 matrices that define your Kalman filter: N, Q and R. These matrices are related to your noise model; I don't know the exact theory, unfortunately.
    regards
    KF
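
    To make the role of the Q and R matrices concrete, here is a small, self-contained sketch of the discrete Kalman predict/update steps in C (the language of the original question above) for a 2-state system with one scalar measurement. Q enters the covariance prediction and R enters the gain computation. The A, C, Q and R values in main() are placeholders for illustration, not a model of the poster's process.

    #include <stdio.h>

    /* Discrete Kalman filter for a 2-state system with one scalar
     * measurement. */
    enum { NX = 2 };

    /* Predict: x = A x;  P = A P A' + Q */
    static void kf_predict(double x[NX], double P[NX][NX],
                           const double A[NX][NX], const double Q[NX][NX])
    {
        double xp[NX] = {0}, AP[NX][NX] = {{0}}, Pp[NX][NX] = {{0}};
        int i, j, k;

        for (i = 0; i < NX; i++)
            for (j = 0; j < NX; j++)
                xp[i] += A[i][j] * x[j];
        for (i = 0; i < NX; i++)
            for (j = 0; j < NX; j++)
                for (k = 0; k < NX; k++)
                    AP[i][j] += A[i][k] * P[k][j];
        for (i = 0; i < NX; i++)
            for (j = 0; j < NX; j++) {
                for (k = 0; k < NX; k++)
                    Pp[i][j] += AP[i][k] * A[j][k];   /* (A P) A' */
                Pp[i][j] += Q[i][j];                  /* + Q      */
            }
        for (i = 0; i < NX; i++) {
            x[i] = xp[i];
            for (j = 0; j < NX; j++)
                P[i][j] = Pp[i][j];
        }
    }

    /* Update with scalar measurement z = C x + v, var(v) = R. */
    static void kf_update(double x[NX], double P[NX][NX],
                          const double C[NX], double R, double z)
    {
        double PCt[NX] = {0}, K[NX], S = R, y = z;
        int i, j;

        for (i = 0; i < NX; i++)
            for (j = 0; j < NX; j++)
                PCt[i] += P[i][j] * C[j];   /* P C'           */
        for (i = 0; i < NX; i++)
            S += C[i] * PCt[i];             /* S = C P C' + R */
        for (i = 0; i < NX; i++)
            K[i] = PCt[i] / S;              /* Kalman gain    */
        for (i = 0; i < NX; i++)
            y -= C[i] * x[i];               /* innovation     */
        for (i = 0; i < NX; i++)
            x[i] += K[i] * y;
        for (i = 0; i < NX; i++)            /* P = (I - K C) P */
            for (j = 0; j < NX; j++)
                P[i][j] -= K[i] * PCt[j];
    }

    int main(void)
    {
        double x[NX] = {0.0, 0.0};
        double P[NX][NX] = {{1.0, 0.0}, {0.0, 1.0}};
        const double A[NX][NX] = {{1.0, 0.1}, {0.0, 1.0}};  /* placeholder model */
        const double Q[NX][NX] = {{1e-4, 0.0}, {0.0, 1e-4}};
        const double C[NX] = {1.0, 0.0};
        const double R = 0.01;

        kf_predict(x, P, A, Q);
        kf_update(x, P, C, R, 0.5);         /* one measured sample */
        printf("x = [%.4f, %.4f]\n", x[0], x[1]);
        return 0;
    }

    With one output the innovation covariance S is a scalar, so the gain needs only a division; with 2 outputs, as in the thread above, S becomes a 2x2 matrix and the update needs a small matrix inversion.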

  • How to implement sort and filter functionality in a normal Web Dynpro ABAP table

    Hello,
    How do I implement sort and filter functionality in a normal Web Dynpro ABAP table?
    Thanks

    hi,
    Check out this link for sorting in Table.
    Sorting option in WebDynPro ABAP UI Table
    steps to follow (the same idea is sketched in C after this reply):
    ->Have the data in an internal table (itab).
    ->Use the sort command on whichever column you want to sort,
      e.g. sort itab descending by <column>.
    ->Bind the internal table to the context node that is bound to the table.
    I hope it helps.
    Thanx.
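
    That recipe is language-independent: sort the backing collection by the chosen column, then re-display it. The same idea as a minimal C sketch (the row type and sample data are hypothetical):

    #include <stdio.h>
    #include <stdlib.h>

    /* Sort an array of rows descending by one column with qsort,
     * the C analogue of "sort itab descending by <column>". */
    typedef struct {
        char country[4];
        int  value;
    } row_t;

    static int by_value_desc(const void *a, const void *b)
    {
        const row_t *ra = a, *rb = b;
        return (rb->value > ra->value) - (rb->value < ra->value);
    }

    int main(void)
    {
        row_t itab[] = { {"CHI", 2}, {"IND", 7}, {"GER", 5} };
        size_t n = sizeof itab / sizeof itab[0];

        qsort(itab, n, sizeof itab[0], by_value_desc);  /* sort descending */
        for (size_t i = 0; i < n; i++)                  /* "re-bind": show */
            printf("%s %d\n", itab[i].country, itab[i].value);
        return 0;
    }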

  • How to implement a filter

    Can anyone tell me how to implement a filter (like the working sets filter in the project toolbar in JDeveloper)?

    I am developing an extension and want to implement a funnel-shaped "Filter" with features similar to the one present in JDeveloper (project toolbar -> working sets).

  • How to implement result states in custom web dynpro components

    Hi all,
    My callable objects are custom-implemented Web Dynpro components.
    How am I to implement result states in them so that I can use them to take logical decisions?
    There is a decision dialog component in the Process Control Callable Object. It has exit states. I need my component to have exit states like that as well.
    Help me to implement this.
    Points assured for help

    Hi Shobhendra,
    You can define the result states of your custom Web Dynpro callable object like this in the getDescription() method:
    //add success result state
    IGPCOResultStateInfo success = technicalDescription.addResultState("Success");
    success.setDescriptionKey("Success");
    //add failure result state
    IGPCOResultStateInfo failure = technicalDescription.addResultState("Failed");
    failure.setDescriptionKey("Failure");
    And in the custom complete() method (which will be called at the end of the execution of the WDP component from GP) you can set the actual result state at runtime:
    executionContext.setResultState("Success");
    or
    executionContext.setResultState("Failed");
    The result states defined in the WDP callable object will appear in the GP design time and you can set a target for each result state.
    For more info on how to implement the WDP callable object, check the doc "Implementing Web Dynpro Callable Objects":
    http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/50d74ada-0c01-0010-07a8-8c118d408e59
    Please let me know if you need any further help.
    Thanks,
    Dipankar
    [P.S. Award points for helpful answer]

  • Filter using 2 input images (Flash). How to implement?

    Hi. I know how to implement filters that process a single image, as described in the Pixel Bender Developer's Guide, but how can I apply a filter using 2 input images, one of which is computed? I wrote a filter that copies the alpha channel of one picture and applies it to another, but the Flash implementation is quite a puzzle for me... Any help appreciated.

    In our tutorial from last year's MAX, I showed how to use a two-input Pixel Bender kernel as a blend filter for two images on the stage. That is posted here: http://blogs.adobe.com/kevin.goldsmith/2008/12/materials_from_1.html
    In one of my recent blog postings, I showed how to use a ShaderJob with multiple inputs. I was doing it for audio processing, but it works almost identically for images: http://blogs.adobe.com/kevin.goldsmith/2009/08/pixel_bender_au.html
    Hopefully one of these should help, but if you have specific questions, don't hesitate to ask...
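
    If it helps to reason about the per-pixel operation separately from the Flash wiring, the kernel described in the question is trivial to prototype in plain C. The interleaved RGBA8 buffer layout and the function name below are assumptions for illustration:

    #include <stdint.h>
    #include <stddef.h>

    /* Two-input per-pixel kernel in plain C: take RGB from `color`
     * and the alpha channel from `mask`, both RGBA8 buffers of the
     * same dimensions, mirroring the alpha-copy filter described
     * in the question. */
    void apply_alpha(const uint8_t *color, const uint8_t *mask,
                     uint8_t *out, size_t pixel_count)
    {
        for (size_t i = 0; i < pixel_count; i++) {
            out[4*i + 0] = color[4*i + 0];  /* R */
            out[4*i + 1] = color[4*i + 1];  /* G */
            out[4*i + 2] = color[4*i + 2];  /* B */
            out[4*i + 3] = mask[4*i + 3];   /* A from the second input */
        }
    }

    In Flash the same two buffers map to the two input parameters of the kernel (or of the ShaderJob), which is what the links above walk through.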

  • How to implement a psophometric filter in LabVIEW software without using the Sound and Vibration Toolkit?

    Does someone know a method to implement a psophometric or weighting filter in LabVIEW software without using the Sound and Vibration Toolkit?
    Thanks

    The simplest way to implement a psophometric filter is to use the Sound and Vibration toolkit.
    Anyway if you don't want to purchase it, you can have a look at this forum discussion and this one.
    Serena Monti
    Applications Engineer
    National Instruments
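
    If you do roll your own, a psophometric weighting curve (for example the ITU-T O.41 telephone weighting) is typically realized as a cascade of second-order IIR sections whose coefficients are fitted to the published weighting table. Below is a minimal C sketch of one such section; in any real use the coefficients would come from your filter design, not from this example.

    #include <stddef.h>

    /* One second-order IIR section (biquad), direct form II transposed.
     * A weighting filter is a cascade of a few of these sections. */
    typedef struct {
        double b0, b1, b2;   /* feed-forward coefficients      */
        double a1, a2;       /* feedback coefficients (a0 = 1) */
        double z1, z2;       /* delay-line state               */
    } biquad_t;

    static double biquad_step(biquad_t *s, double x)
    {
        double y = s->b0 * x + s->z1;
        s->z1 = s->b1 * x - s->a1 * y + s->z2;
        s->z2 = s->b2 * x - s->a2 * y;
        return y;
    }

    static void filter_block(biquad_t *s, const double *in,
                             double *out, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = biquad_step(s, in[i]);
    }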

  • Noise reduction using extended kalman filter

    How do I go about eliminating noise from a signal using an extended Kalman filter in LabVIEW?
    Regards,
    KM

    Dear Kamasani,
    Are there any updates regarding the subject of this thread?
    I also want to use one of the Kalman filters for noise removal, but so far I cannot figure out how to create the input matrices for the state-space and noise models.
    Thank you
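
    As a starting point for the matrix question: for plain noise reduction of a slowly varying signal, the simplest state-space model is a random walk, x[k+1] = x[k] + w with measurement z[k] = x[k] + v, i.e. A = 1, B = 0, C = 1, Q = var(w), R = var(v). With that linear model the (extended) Kalman filter reduces to a scalar recursion, sketched below in C; the q and r values and the sample data are assumptions to be tuned.

    #include <stdio.h>

    /* Scalar Kalman filter for the random-walk signal model:
     * x[k+1] = x[k] + w (Q = var(w)), z[k] = x[k] + v (R = var(v)). */
    int main(void)
    {
        const double q = 1e-5;    /* process noise: how fast the signal drifts */
        const double r = 1e-2;    /* measurement noise power                   */
        double x = 0.0, p = 1.0;  /* state estimate and its covariance         */
        const double z[] = {0.48, 0.52, 0.47, 0.51, 0.50};  /* noisy samples   */

        for (int k = 0; k < 5; k++) {
            p += q;                     /* predict:  P = P + Q   */
            double g = p / (p + r);     /* gain:     K = P/(P+R) */
            x += g * (z[k] - x);        /* update the estimate   */
            p *= (1.0 - g);             /* update the covariance */
            printf("k=%d filtered=%.4f\n", k, x);
        }
        return 0;
    }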

  • What is BI? How do we implement it, and what is the cost to implement?

    What is BI? How do we implement it, and what is the cost to implement?
    Thanks,
    Sumit.

    Hi Sumit,
    Below is a description addressing your query.
    Business Intelligence is a process for increasing the competitive advantage of a business by intelligent use of available data in decision making.
    The five key stages of Business Intelligence:
    1. Data Sourcing
    2. Data Analysis
    3. Situation Awareness
    4. Risk Assessment
    5. Decision Support
    Data sourcing
    Business Intelligence is about extracting information from multiple sources of data. The data might be: text documents - e.g. memos or reports or email messages; photographs and images; sounds; formatted tables; web pages and URL lists. The key to data sourcing is to obtain the information in electronic form. So typical sources of data might include: scanners; digital cameras; database queries; web searches; computer file access; etcetera.
    Data analysis
    Business Intelligence is about synthesizing useful knowledge from collections of data. It is about estimating current trends, integrating and summarising disparate information, validating models of understanding, and predicting missing information or future trends. This process of data analysis is also called data mining or knowledge discovery. Typical analysis tools might use:
    • probability theory - e.g. classification, clustering and Bayesian networks;
    • statistical methods - e.g. regression;
    • operations research - e.g. queuing and scheduling;
    • artificial intelligence - e.g. neural networks and fuzzy logic.
    Situation awareness
    Business Intelligence is about filtering out irrelevant information, and setting the remaining information in the context of the business and its environment. The user needs the key items of information relevant to his or her needs, and summaries that are syntheses of all the relevant data (market forces, government policy etc.).  Situation awareness is the grasp of  the context in which to understand and make decisions.  Algorithms for situation assessment provide such syntheses automatically.
    Risk assessment
    Business Intelligence is about discovering what plausible actions might be taken, or decisions made, at different times. It is about helping you weigh up the current and future risk, cost or benefit of taking one action over another, or making one decision versus another. It is about inferring and summarising your best options or choices.
    Decision support
    Business Intelligence is about using information wisely. It aims to warn you of important events, such as takeovers, market changes, and poor staff performance, so that you can take preventative steps. It seeks to help you analyse and make better business decisions, to improve sales or customer satisfaction or staff morale. It presents the information you need, when you need it.
    This section describes how we are using extraction, transformation and loading (ETL) processes and a data warehouse architecture to build our enterprise-wide data warehouse in incremental project steps. Before an enterprise-wide data warehouse could be delivered, an integrated architecture and a companion implementation methodology needed to be adopted. A productive and flexible tool set was also required to support ETL processes and the data warehouse architecture in a production service environment. The resulting data warehouse architecture has the following four principal components:
    • Data Sources
    • Data Warehouses
    • Data Marts
    • Publication Services
    ETL processing occurs between data sources and the data warehouse, between the data warehouse and data marts and may also be used within the data warehouse and data marts.
    Data Sources
    The university has a multitude of data sources residing in different Data Base Management System (DBMS) tables and non-DBMS data sets. To ensure that all relevant data source candidates were identified, a physical inventory and logical inventory was conducted. The compilation of these inventories ensures that we have an enterprise-wide view of the university data resource.
    The physical inventory was comprised of a review of DBMS cataloged tables as well as data sets used by business processes. These data sets had been identified through developing the enterprise-wide information needs model.
    The logical inventory was constructed from "brain-storming" sessions which focused on common key business terms which must be referenced when articulating the institution's vision and mission (strategic direction, goals, strategies, objectives and activities). Once the primary terms were identified, they were organized into directories such as "Project", "Location", "Academic Entity", "University Person", "Budget Envelope" etc. Relationships were identified by recognizing "natural linkages" within and among directories, and the "drill-downs" and "roll-ups" that were required to support "report by" and "report on" information hierarchies. This exercise allowed the directories to be sub-divided into hierarchies of business terms which were useful for presentation and validation purposes.
    We called this important deliverable the "Conceptual Data Model" (CDM) and it was used as the consolidated conceptual (paper) view of all of the University's diverse data sources. The CDM was then subjected to a university-wide consultative process to solicit feedback and communicate to the university community that this model would be adopted by the Business Intelligence (BI) project as a governance model in managing the incremental development of its enterprise-wide data warehousing project.
    Data Warehouse
    This component of our data warehouse architecture (DWA) is used to supply quality data to the many different data marts in a flexible, consistent and cohesive manner. It is a 'landing zone' for inbound data sources and an organizational and re-structuring area for implementing data, information and statistical modeling. This is where business rules which measure and enforce data quality standards for data collection in the source systems are tested and evaluated against appropriate data quality business rules/standards which are required to perform the data, information and statistical modeling described previously.
    Inbound data that does not meet data warehouse data quality business rules is not loaded into the data warehouse (for example, if a hierarchy is incomplete). While it is desirable for rejected and corrected records to occur in the operational system, if this is not possible then start dates for when the data can begin to be collected into the data warehouse may need to be adjusted in order to accommodate necessary source systems data entry "re-work". Existing systems and procedures may need modification in order to permanently accommodate required data warehouse data quality measures. Severe situations may occur in which new data entry collection transactions or entire systems will need to be either built or acquired.
    We have found that a powerful and flexible extraction, transformation and loading (ETL) process is to use Structured Query Language (SQL) views on host database management systems (DBMS) in conjunction with a good ETL tool such as SAS® ETL Studio. This tool enables you to perform the following tasks:
    • The extraction of data from operational data stores
    • The transformation of this data
    • The loading of the extracted data into your data warehouse or data mart
    When the data source is a "non-DBMS" data set it may be advantageous to pre-convert this into a SAS® data set to standardize data warehouse metadata definitions. Then it may be captured by SAS® ETL Studio and included in the data warehouse along with any DBMS source tables using consistent metadata terms. SAS® data sets, non-SAS® data sets, and any DBMS table will provide the SAS® ETL tool with all of the necessary metadata required to facilitate productive extraction, transformation and loading (ETL) work.
    Having the ability to utilize standard structured query language (SQL) views on host DBMS systems and within SAS® is a great advantage for ETL processing. The views can serve as data quality filters without having to write any procedural code. The option exists to "materialize" these views on the host systems or leave them "un-materialized" on the hosts and "materialize" them on the target data structure defined in the SAS® ETL process. These choices may be applied differentially depending upon whether you are working with "current only" or "time series" data. Different deployment configurations may be chosen based upon performance issues or cost considerations. The flexibility of choosing different deployment options based upon these factors is a considerable advantage.
    Data Marts
    This component of the data warehouse architecture may manifest as the following:
    • Customer "visible" relational tables
    • OLAP cubes
    • Pre-determined parameterized and non-parameterized reports
    • Ad-hoc reports
    • Spreadsheet applications with pre-populated work sheets and pivot tables
    • Data visualization graphics
    • Dashboard/scorecards for performance indicator applications
    Typically a business intelligence (BI) project may be scoped to deliver an agreed upon set of data marts in a project. Once these have been well specified, the conceptual data model (CDM) is used to determine what parts need to be built or used as a reference to conform the inbound data from any new project. After the detailed data mart specifications (DDMS) have been verified and the conceptual data model (CDM) components determined, a source and target logical data model (LDM) can be designed to integrate the detailed data mart specification (DDMS) and conceptual data model (CDM). An extraction, transformation and loading (ETL) process can then be set up and scheduled to populate the logical data models (LDM) from the required data sources and assist with any time series and data audit change control requirements.
    Over time as more and more data marts and logical data models (LDMs) are built the conceptual data model (CDM) becomes more complete. One very important advantage to this implementation methodology is that the order of the data marts and logical data models can be entirely driven by project priority, project budget allocation and time-to-completion constraints/requirements. This data warehouse architecture implementation methodology does not need to dictate project priorities or project scope as long as the conceptual data model (CDM) exercise has been successfully completed before the first project request is initiated.
    McMaster's Data Warehouse design
    [Architecture diagram: DB2 and Oracle operational sources feed a staging area (SAS data sets) via ETL; further ETL populates the development, test and production warehouses and their data marts, which users access through a BI tool.]
    Publication Services
    This is the visible presentation environment that business intelligence (BI) customers will use to interact with the published data mart deliverables. The SAS® Information Delivery Portal will be utilized as a web delivery channel to deliver a "one-stop information shopping" solution. This software solution provides an interface to access enterprise data, applications and information. It is built on top of the SAS Business Intelligence Architecture, provides a single point of entry and provides a Portal API for application development. All of our canned reports generated through SAS® Enterprise Guide, along with a web-based query and reporting tool (SAS® Web Report Studio) will be accessed through this publication channel.
    Using the portal's personalization features we have customized it for a McMaster "look and feel". Information is organized using pages and portlets and our stakeholders will have access to public pages along with private portlets based on role authorization rules. Stakeholders will also be able to access SAS® data sets from within Microsoft Word and Microsoft Excel using the SAS® Add-In for Microsoft Office. This tool will enable our stakeholders to execute stored processes (a SAS® program which is hosted on a server) and embed the results in their documents and spreadsheets. Within Excel, the SAS® Add-In can:
    • Access and view SAS® data sources
    • Access and view any other data source that is available from a SAS® server
    • Analyze SAS® or Excel data using analytic tasks
    The SAS® Add-In for Microsoft Office will not be accessed through the SAS® Information Delivery Portal as this is a client component which will be installed on individual personal computers by members of our Client Services group. Future stages of the project will include interactive reports (drill-down through OLAP cubes) as well as balanced scorecards to measure performance indicators (through SAS® Strategic Performance Management software). This, along with event notification messages, will all be delivered through the SAS® Information Delivery Portal.
    Publication is also channeled according to audience with appropriate security and privacy rules.
    SECURITY - AUTHENTICATION AND AUTHORIZATION
    The business value derived from using the SAS® Value Chain Analytics includes an authoritative and secure environment for data management and reporting. A data warehouse may be categorized as a "collection of integrated databases designed to support managerial decision making and problem solving functions" and "contains both highly detailed and summarized historical data relating to various categories, subjects, or areas". Implementation of the research funding data mart at McMaster has meant that our stakeholders now have electronic access to data which previously was not widely disseminated. Stakeholders are now able to gain timely access to this data in the form that best matches their current information needs. Security requirements are being addressed taking into consideration the following:
    • Data identification
    • Data classification
    • Value of the data
    • Identifying any data security vulnerabilities
    • Identifying data protection measures and associated costs
    • Selection of cost-effective security measures
    • Evaluation of effectiveness of security measures
    At McMaster access to data involves both authentication and authorization. Authentication may be defined as the process of verifying the identity of a person or process within the guidelines of a specific
    security policy (who you are). Authorization is the process of determining which permissions the user has for which resources (permissions). Authentication is also a prerequisite for authorization. At McMaster business intelligence (BI) services that are not public require a sign on with a single university-wide login identifier which is currently authenticated using the Microsoft Active Directory. After a successful authentication the SAS® university login identifier can be used by the SAS® Meta data server. No passwords are ever stored in SAS®. Future plans at the university call for this authentication to be done using Kerberos.
    At McMaster aggregate information will be open to all. Granular security is being implemented as required through a combination of SAS® Information Maps and stored processes. SAS® Information Maps consist of metadata that describe a data warehouse in business terms. Through using SAS® Information Map Studio which is an application used to create, edit and manage SAS® Information Maps, we will determine what data our stakeholders will be accessing through either SAS® Web Report Studio (ability to create reports) or SAS® Information Delivery Portal (ability to view only). Previously access to data residing in DB-2 tables was granted by creating views using structured query language (SQL). Information maps are much more powerful as they capture metadata about allowable usage and query generation rules. They also describe what can be done, are database independent and can cross databases and they hide the physical structure of the data from the business user. Since query code is generated in the background, the business user does not need to know structured query language (SQL). As well as using Information Maps, we will also be using SAS® stored processes to implement role based granular security.
    At the university some business intelligence (BI) services are targeted for particular roles such as researchers. The primary investigator role of a research project needs access to current and past research funding data at both the summary and detail levels for their research project. A SAS® stored process (a SAS® program which is hosted on a server) is used to determine the employee number of the login by checking a common university directory and then filtering the research data mart to selectively provide only the data that is relevant for the researcher who has signed onto the decision support portal.
    Other business intelligence (BI) services are targeted for particular roles such as Vice-Presidents, Deans, Chairs, Directors, Managers and their Staff. SAS® stored processes are used as described above with the exception that they filter data on the basis of positions and organizational affiliations. When individuals change jobs or new appointments occur the authorized business intelligence (BI) data will always be correctly presented.
    As the SAS® stored process can be executed from many environments (for example, SAS® Web Report Studio, SAS® Add-In for Microsoft Office, SAS® Enterprise Guide) authorization rules are consistently applied across all environments on a timely basis. There is also potential in the future to automatically customize web portals and event notifications based upon the particular role of the person who has signed onto the SAS® Information Delivery Portal.
    ARCHITECTURE (PRODUCTION ENVIRONMENT)
    We are currently in the planning stages for building a scalable, sustainable infrastructure which will support a scaled deployment of the SAS® Value Chain Analytics. We are considering implementing the following three-tier platform which will allow us to scale horizontally in the future:
    Our development environment consists of a server with 2 x Intel Xeon 2.8GHz Processors, 2GB of RAM and is running Windows 2000 - Service Pack 4.
    We are considering the following for the scaled roll-out of our production environment.
    A. Hardware
    1. Server 1 - SAS® Data Server
    - 4 way 64 bit 1.5Ghz Itanium2 server
    - 16 Gb RAM
    - 2 73 Gb Drives (RAID 1) for the OS
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for Itanium
    2. Mid-Tier (Web) Server
    - 2 way 32 bit 3Ghz Xeon Server
    - 4 Gb RAM
    - 1 10/100/1Gb Cu Ethernet card
    - 1 Windows 2003 Enterprise Edition for x86
    3. SAN Drive Array (modular and can grow with the warehouse)
    - 6 x 72GB Drives (RAID 5), total 360GB for SAS® and Data
    B. Software
    1. Server 1 - SAS® Data Server
    - SAS® 9.1.3
    - SAS® Metadata Server
    - SAS® WorkSpace Server
    - SAS® Stored Process Server
    - Platform JobScheduler
    2. Mid -Tier Server
    - SAS® Web Report Studio
    - SAS® Information Delivery Portal
    - BEA Web Logic for future SAS® SPM Platform
    - Xythos Web File System (WFS)
    3. Client-Tier Server
    - SAS® Enterprise Guide
    - SAS® Add-In for Microsoft Office
    REPORTING
    We have created a number of parameterized stored processes using SAS® Enterprise Guide, which our stakeholders will access as both static (HTML as well as PDF documents) and interactive reports (drill-down) through SAS® Web Report Studio and the SAS® Add-In for Microsoft Office. All canned reports along with SAS® Web Report Studio will be accessed through the SAS® Information Delivery Portal.
    NEXT STEPS
    Next steps of the project include development of a financial data mart along with appropriate data quality standards, monthly frozen snapshots and implementation of university-wide financial reporting standards. This will facilitate electronic access to integrated financial information necessary for the development and maintenance of an integrated, multi-year financial planning framework. Canned reports will include monthly web-based financial statements with drill-down capability, along with budget templates automatically populated with data values and saved in different workbooks for different subgroups (for example, by Department). The latter will be accomplished using Microsoft Dynamic Data Exchange (DDE).
    As well, we will begin the implementation of SAS® Strategic Performance Management Software to support the performance measurement and monitoring initiative that is a fundamental component of McMaster's strategic plan. This tool will assist in critically assessing and identifying meaningful and statistically relevant measures and indicators. This software can perform causal analyses among various measures within and across areas providing useful information on inter-relationships between factors and measures. As well as demonstrating how decisions in one area affect other areas, these cause-and-effect analyses can reveal both good performance drivers and also possible detractors and enable 'evidence-based' decision-making. Finally, the tool provides a balanced scorecard reporting format, designed to identify statistically significant trends and results that can be tailored to the specific goals, objectives and measures of the various operational areas of the University.
    LESSONS LEARNED
    Lessons learned include the importance of taking a consultative approach not only in assessing information needs, but also in building data hierarchies, understanding subject matter, and in prioritizing tasks to best support decision making and inform senior management. We found that a combination of training and mentoring (knowledge transfer) helped us accelerate learning the new tools. It was very important to ensure that time and resources were committed to complete the necessary planning and data quality initiatives prior to initiating the first project. When developing a project plan, it is important to

  • How do I use the charging adapter for the UK

    I just received an HTC Droid rental for use in the UK next week. I'm confused about the charging adapters that came with it. Do I just plug the United States USB charger into the UK wall adapter and that's how one uses it?

    Ok. Thank you
    Sent from my iPhone

  • How to implement SEARCH HELP for input field in WDA

    Hi All,
    I am building a tool for my team. In this tool there are 4 input fields representing different processes. I implemented the 4 input fields and bound the respective tables to these fields successfully. Now I want to filter the data between 2 fields, i.e. the data in the 2nd input field should be based on the 1st input field. For example, when I select 'CHI' (the code for CHINA) in the 1st input field, the 2nd field, meant for country name, should show all country names with the value 'CHINA'. But the problem is that the values for each field are in different tables with no foreign key relationship, and some combinations are in a single table.
    I have tried the import and export parameter method for search help but no luck till now.
    Can anyone please suggest how to implement this scenario?
    Thanks in advance...
    sekhar

    Hi sekhar,
    Your requirement can be implemented with OVS (Object Value Selector). This is a custom search help, so you can define on what basis the values are fetched.
    Look at the below link
    http://wiki.sdn.sap.com/wiki/display/WDABAP/ABAPWDObjectValueSelector(OVS)
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/606288d6-04c6-2c10-b5ae-a240304c88ea?quicklink=index&overridelayout=true
    You need to use the component WDR_OVS.
    It has 3 phases:
    In phase 1,
         the value from the input field is passed to the F4 help dialog box.
    In phase 2,
         the search results are generated; in this phase, look at the input value given in field 2 and generate the search results for field 3 accordingly.
    In phase 3,
         the selected value is returned to the original screen.
    Regards
    Karthiheyan M

  • How to implement row level security using external tables

    Hi All Gurus/ Masters,
    I want to implement row-level security using external tables, but I'm not sure how to do that; I'm only aware of doing it via RPD-level authentication.
    I can use a filter condition at the user level so that each user can access only his own data.
    But I have these 4 tables as external tables:
    users
    groups
    usergroups
    webgroups
    In which table do I need to apply the filter conditions?
    Please let me know.

    You pull the group into a repository variable using a session variable init block, then reference that variable in the data filters, either in the LTS directly or in the security management as filters. You reference it with the syntax VALUEOF("NQ_SESSION.Variable Name").
    Hope this helps
