BCS performance

Hello
I have an external SQL database (SQL 2012) with a table that might have 100,000 or more rows. Currently I have designed a page connected to that table, which holds 600 test records (using an external content type and an external list). Even with 600 records, the initial load of the page sometimes takes a minute or even longer. I will filter out some records in SQL, but that could still leave up to 5,000 records. The user then has additional filters on the page that cut the number of records down to 20-50 or so. As I understand it, BCS will try to load all records first and then filter them based on the user's input. What can be done to improve performance? I have read some Microsoft articles about BCS performance, but most suggestions are about filtering the data before it is loaded into SharePoint, which is impossible for this project. Please advise! Thank you!
Alla Sanders

Hi,
According to your post, my understanding is that you want to improve BCS performance.
I recommend caching the data so that you only load changes; the goal is to give users a pleasant experience after the first load.
This article on MSDN explains how to take advantage of the BCS Client Cache to optimize your solutions.
For more information, you can refer to:
SharePoint 2010: Basic BCS Performance Troubleshooting
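Since you mention that you will already filter some records on the SQL side, another option worth considering is to back the external content type's Read List operation with a filtered view or a parameterized stored procedure and map a BCS filter (for example a Comparison or Limit filter) to the parameter, so that SQL Server returns only the rows the user asks for instead of the whole table. A rough sketch, where the table, view, procedure and column names are placeholders only:

-- Placeholder objects for illustration; substitute your own table and columns.
CREATE VIEW dbo.vw_OrdersForSharePoint AS
    SELECT OrderID, CustomerName, Region, OrderDate, Amount
    FROM dbo.Orders
    WHERE IsArchived = 0;          -- static pre-filter applied on the SQL side
GO

CREATE PROCEDURE dbo.usp_GetOrdersByRegion
    @Region NVARCHAR(50)           -- mapped to a filter on the Read List operation
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, CustomerName, Region, OrderDate, Amount
    FROM dbo.vw_OrdersForSharePoint
    WHERE Region = @Region;        -- the user's filter is applied by SQL Server, not by SharePoint
END;

Combined with a Limit filter on the finder, this keeps the number of rows SharePoint has to pull per page load small, and the client cache then only has to track changes.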
Thanks,
Linda Li                
TechNet Community Support

Similar Messages

  • BCS: Performance Issue - Most of the time is spent doing commit work.

    Hello,
    We are experiencing performance issues with our BI server. This performance issue can be seen during our BCS runs. Our DBA has indicated that he sees a very high percentage of time being spent doing "commit work".
    Currently, we are running BI 7 (NW2004s), Basis 700, support pack 14.
    Has anyone else experienced this? As the BCS run is mainly standard SAP code, I was wondering if there are some SAP Notes that correct this?
    Thank you for any help you could provide us.

    If it is related to SEM-BCS and the new EHP releases, then there are still big problems with the monitor and task status management (meaning performance problems). Is that the case? If so, you'd better look for already released notes regarding this and raise your own OSS messages if you don't find anything relevant.

  • BW BCS cube(0bcs_vc10 ) Report huge performance issue

    Hi Masters,
    I am working on a solution for a BW report developed on the 0bcs_vc10 virtual cube.
    Some of the queries are taking 15 to 20 minutes or more to execute.
    This is a huge performance issue. We are using BW 3.5, and the report is developed in BEx and published through the portal. If anyone has faced a similar problem, please advise how you tackled it, and please describe in detail the analysis approach you used to resolve it.
    The current service packs we are using are:
    SAP_BW 350 0016 SAPKW35016
    FINBASIS 300 0012 SAPK-30012INFINBASIS
    BI_CONT 353 0008 SAPKIBIFP8
    SEM-BW 400 0012 SAPKGS4012
    Best of Luck
    Chris

    Ravi,
    I already did that; it is not helping the performance much. Reports are still taking 15 to 20 minutes. I wanted to know whether anybody in this forum has had the same issue and how they resolved it.
    Regards,
    Chris

  • BCS Virtual Cubes Performance Tuning

    Hi All,
    We are working on improving the performance tuning of queries on BCS virtual cubes (with services).
    Any specific changes (from RAM to specific properties on the queries or on the virtual cubes with services) that you have seen improve performance in your environment would be greatly appreciated.
    Thanks,
    - Shashi

    Thanks a lot Marc,
    We are on NW2004, with the following support pack levels
    SAP_BW     350     0016
    FINBASIS     300     0012
    BI_CONT     353     0008
    SEM-BW     400     0012
    I have checked the service market place and the current SP's available are:
    SAP_BW     350     0019
    FINBASIS     300     0015
    BI_CONT     353     0013
    SEM-BW     400     0015
    Which service packs would you suggest we go with for the performance issues on BCS virtual InfoProvider queries?
    Thanks,
    - Shashi

  • BCS Load Data Stream Performance

    Data stream upload performance is not good. It takes about 50 minutes to load the BCS cube. 0FIGL_O02 is the data source.
    Have you had similar problems? Do you have any benchmark data on the number of records per hour?
    Thanks
    Tim

    Tim,
    This issue can be controlled by restricting the characteristic values, for example consolidation units, G/L accounts, etc.
    Thanks
    RJ

  • Can we use BCS component  in ECC 6.0, without installing BI 7.0?

    We have a unique situation. We have planned to migrate from 4.6B to ECC 6.0. We have FI-LC and we want to move to SEM-BCS 6.0 in ECC 6.0, without installing a separate BI (NetWeaver 2004s) instance. Is that possible? If so, will there be any performance issues? If we use BCS in ECC 6.0, does it eliminate the job of FS data extraction to BW-SEM?
    Do we need a separate BI system, which can also include SEM-BCS? What are the advantages?
    We look forward to your advice.

    We found that, without installing BI 7.0, the BCS component (with its own internal BI) works in ECC 6.0. In earlier ECC 5.0 or R/3 releases, such a facility was not available.
    My question is: is it good to use SEM-BCS within ECC 6.0, or is it better to have a separate BI instance, which can include BCS? Either answer should be supported with reasons. We appreciate any assistance.

  • First Project - BCS

    ***INCLUDE LUCR_LSTF03 .
    *&       Form  display_detail_list
    FORM display_detail_list_subscreen.
      IF g_display_detail_again IS INITIAL.
        IF g_display_detail_next IS INITIAL.
    * Header text is displayed as dyn. document
          IF go_dydo IS INITIAL.
            CREATE OBJECT go_dydo.
          ELSE.
            FREE  go_dydo.
            CLEAR go_dydo.
            CREATE OBJECT go_dydo.
            PERFORM free_create_screen_objects.
          ENDIF.
        ELSE.
          IF NOT go_dydo->html_control IS INITIAL.
            CALL METHOD go_splitter_container->remove_control
              EXPORTING
                row    = 1
                column = 1.
          ENDIF.
        ENDIF.
      ENDIF.
    * Initialize document
      CALL METHOD go_dydo->initialize_document.
      IF  LINES( gt_header ) > 0.
    * Build and display detail-header
        PERFORM build_header TABLES gt_header
                             USING  go_dydo
                                    gs_header.
      ENDIF.
    * Merge header-document
      CALL METHOD go_dydo->merge_document.
    * set toolbar-added functions for detailed list
      REFRESH gt_toolbar.
      IF go_model->ds_tx_data_io_type-documents IS BOUND.       "mb210303
        PERFORM fill_tool_tab USING:
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '1',
                     gt_toolbar gs_toolbar 'UCR_ENTRIES' '2',
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '3',
                     gt_toolbar gs_toolbar 'UCR_PREV'    '4',
                     gt_toolbar gs_toolbar 'UCR_NEXT'    '5',
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '6'.
      ELSE.                                                   "mb210202 beg.
        PERFORM fill_tool_tab USING:
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '1',
                     gt_toolbar gs_toolbar 'UCR_PREV'    '2',
                     gt_toolbar gs_toolbar 'UCR_NEXT'    '3',
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '4'.
      ENDIF.                                                  "mb210202 end.
      SET HANDLER go_event_grid->handle_toolbar_comp   FOR go_grid.
    * Display header
      CALL METHOD go_dydo->display_document
        EXPORTING
          parent = go_container_head.
    * Set header-control visible with user set or
    * predefined height
      IF NOT go_splitter_container IS INITIAL.
        IF g_height IS INITIAL.
          g_height = 27.
        ENDIF.
        CALL METHOD go_splitter_container->set_row_height
          EXPORTING
            id     = 1
            height = g_height.
      ENDIF.
    * display detail-list
      CALL METHOD go_grid->set_table_for_first_display
        EXPORTING
          i_bypassing_buffer   = 'X'
          is_layout            = gs_layout_save
          it_toolbar_excluding = gt_exclude_toolbar
        CHANGING
          it_fieldcatalog      = gt_fieldcat_d
          it_outtab            = <gt_outtab_detail>.
      IF NOT g_display_detail_next IS INITIAL.
        CALL METHOD go_grid->refresh_table_display.
      ENDIF.
    * Set cursor on selected row if necessary
      IF NOT gt_row_id_detail IS INITIAL.
        CALL METHOD go_grid->set_selected_rows
          EXPORTING
            it_index_rows = gt_row_id_detail.
        REFRESH gt_row_id_detail.
      ENDIF.
      CLEAR: g_display_detail_next,
             g_display_detail_again.
    ENDFORM.                    " display_detail_list_subscreen
    *&      Form  display_main_list
    FORM display_main_list_subscreen. "using i_reporting_logic type uc_flg.
      CLEAR g_detail.
    * set toolbar-added functions
      REFRESH gt_toolbar.
      IF go_model->ds_tx_data_io_type IS NOT INITIAL.
        IF go_model->ds_tx_data_io_type-documents IS BOUND.     "mb210303
    *        and i_reporting_logic is initial.
          PERFORM fill_tool_tab USING:
                       gt_toolbar gs_toolbar 'UCR_DUMMY'   '1',
                       gt_toolbar gs_toolbar 'UCR_ENTRIES' '2',
                       gt_toolbar gs_toolbar 'UCR_DUMMY'   '3',
                       gt_toolbar gs_toolbar 'UCR_SEL_COND' '4',
                       gt_toolbar gs_toolbar 'UCR_DUMMY'   '5'.
        ELSE.
          PERFORM fill_tool_tab USING:
                       gt_toolbar gs_toolbar 'UCR_DUMMY'   '1',
                       gt_toolbar gs_toolbar 'UCR_SEL_COND' '2',
                       gt_toolbar gs_toolbar 'UCR_DUMMY'   '3'.
        ENDIF.                                                  "mb210303
        SET HANDLER go_event_grid->handle_toolbar_comp   FOR go_grid.
      ELSE    .
        PERFORM fill_tool_tab USING:
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '1',
                     gt_toolbar gs_toolbar 'UCR_SEL_COND' '2',
                     gt_toolbar gs_toolbar 'UCR_DUMMY'   '3'.
        SET HANDLER go_event_grid->handle_toolbar_comp   FOR go_grid.
      ENDIF.
    * Set header-control invisible while main-list is displayed
      IF NOT go_splitter_container IS INITIAL.
    * Header container can be set visible by user on main-screen
        CALL METHOD go_splitter_container->set_row_height
          EXPORTING
            id     = 1
            height = 0.
        IF NOT go_dydo IS INITIAL.
    * Display last header if user wants to set it visible
          CALL METHOD go_dydo->display_document
            EXPORTING
              parent = go_container_head.
        ENDIF.
      ENDIF.
    * subscreen with main list simply is started once again
    * prepared with current user settings
      IF ( LINES( gt_fieldcat_save ) > 0 ) AND
         NOT g_display_main_again IS INITIAL.
        REFRESH gt_fieldcat.
        gt_fieldcat[] = gt_fieldcat_save[].
    *    clear g_display_main_again.
    * is necessary, because current_frontend_fieldcat must be set
        CALL METHOD go_grid->set_table_for_first_display
          EXPORTING
            i_bypassing_buffer   = 'X'
            i_save               = g_save
            is_layout            = gs_layout
            is_variant           = gs_variant
            it_special_groups    = gt_sgrp                      "mb110403
            it_toolbar_excluding = gt_exclude_toolbar
          CHANGING
            it_fieldcatalog      = gt_fieldcat
            it_outtab            = <gt_outtab>.
    * set current fieldcat
        CALL METHOD go_grid->set_frontend_fieldcatalog
          EXPORTING
            it_fieldcatalog = gt_fieldcat.
    * now reset other current layout-infos if necessary
        PERFORM set_current_layout USING go_grid
                                         gt_filter_save
                                         gt_sort_save
                                         gs_layout_save.
    * display current main-list
        CALL METHOD go_grid->refresh_table_display.
        REFRESH: gt_fieldcat_save, gt_filter_save, gt_sort_save.
        CLEAR  : gs_layout_save.
      ELSE.
    * Show ALV
        CALL METHOD go_grid->set_table_for_first_display
          EXPORTING
            i_bypassing_buffer   = 'X'
            i_save               = g_save
            is_layout            = gs_layout
            is_variant           = gs_variant
            it_special_groups    = gt_sgrp                      "mb110403
            it_toolbar_excluding = gt_exclude_toolbar
          CHANGING
            it_sort              = gt_sort
            it_fieldcatalog      = gt_fieldcat
            it_outtab            = <gt_outtab>.
        CLEAR g_display_detail_again.
        g_display_main_again = 'X'.
      ENDIF.
    * Set cursor on selected row if necessary
      IF NOT gt_row_id IS INITIAL.
        CALL METHOD go_grid->set_selected_rows
          EXPORTING
            it_index_rows = gt_row_id.
      ENDIF.
    ENDFORM.                    " display_main_list_subscreen
    *&      Form  display_message
    FORM display_message USING lo_grid_object TYPE REF TO cl_gui_alv_grid
                               lf_refresh_msg_table TYPE c.
      DATA: l_lines       TYPE i,
            ls_layout     TYPE lvc_s_layo.
    * header_main is not used by Workbench
      IF g_subscreen IS INITIAL.
        PERFORM header_main.
      ENDIF.
      g_no_data = 'X'.
    * if first data selection results no data gt_parameter_save must be
    * filled  from lt_parameter to take care of further user changed
    * parameters from list screen
      DESCRIBE TABLE gt_parameter_save LINES l_lines.
      IF l_lines =  0.
        gt_parameter_save[] = gt_parameter[].
      ENDIF.
      PERFORM no_data_selected IN PROGRAM (gc_progname)
                                USING      gt_fieldcat_mess
                                           gs_fieldcat_mess
                                           gt_message
                                           gs_message
                                           text-101
                                           text-102
                                           'TEXT'
                                           'LT_MESSAGE'
                                           lf_refresh_msg_table   " 'X'
                                           space.      "  'C610'. no color, accessibility
    *** Start of Comment By Ramesh Babu N,IBM - C1DK900725 ***
    * Set header-control invisible while main-list is displayed
    *  IF NOT go_splitter_container IS INITIAL.
    *    CALL METHOD go_splitter_container->set_row_height
    *      EXPORTING
    *        id     = 1
    *        height = 0.
    *    CALL METHOD go_splitter_container->set_row_sash
    *      EXPORTING
    *        id    = 1
    *        type  = cl_gui_splitter_container=>type_sashvisible
    *        value = cl_gui_splitter_container=>false.
    *  ENDIF.
    *** End of Comment By Ramesh Babu N,IBM - C1DK900725  ***
    * display messages
    *  ls_layout-no_toolbar = 'X'.
      CALL METHOD lo_grid_object->set_table_for_first_display
        EXPORTING
          i_bypassing_buffer   = 'X'
          it_toolbar_excluding = gt_exclude_toolbar_mess
          is_layout            = ls_layout
        CHANGING
          it_fieldcatalog      = gt_fieldcat_mess
          it_outtab            = gt_message.
    *** Start of Comment By Ramesh Babu N,IBM - C1DK900725 ***
    * Event-Handler for docking-container
    *  IF go_event_dock IS INITIAL.
    *    CREATE OBJECT go_event_dock.
    *  ENDIF.
    *** End of Comment By Ramesh Babu N,IBM - C1DK900725  ***
      IF go_parameter IS INITIAL.
        CALL METHOD cl_uc_parameter=>get_instance
          IMPORTING
            eo_instance = go_parameter.
      ENDIF.
    *** Start of Comment By Ramesh Babu N,IBM - C1DK900725 ***
    *  SET HANDLER go_event_dock->handle_new_parameters FOR go_parameter.
    *** End of Comment By Ramesh Babu N,IBM - C1DK900725  ***
    ENDFORM.                    " display_message
    *&      Form  sub_download_data
    * Used for downloading BCS data to a TAB delimited file in Custom task *
    *      <--ct_data  HASHED TABLE
    FORM f_download_data  USING     ct_task     TYPE uc_task
                                    ct_sel      TYPE uc0_ts_sel
                                    gt_param    TYPE ucm_ts_parameter
                          CHANGING  ct_message  TYPE uc0_t_message
                                    ct_data     TYPE HASHED TABLE.
    * SUBROUTINE DESCRIPTION: Used for downloading BCS data to a file in a custom task
    *           DEVELOPER: Ramesh Babu Nalla , IBM
    *       CREATION DATE: 2007-10-11
    *          DER NUMBER: None
    * TRANSPORT NUMBER(S): C1DK900725
    * REVISION HISTORY-----------------------------------------------------*
    *       REVISION NO: C1DK900725      REFERENCE NO:  None
    *         DEVELOPER: Ramesh Babu N,IBM       DATE:  2007-10-11
    *       DESCRIPTION: Copied from SAP FM UCR_LST_LOGIC *
      CONSTANTS : c_task01 TYPE uc_task VALUE 'T2700',
                  c_task02 TYPE uc_task VALUE 'T2490'.
      TYPES:  BEGIN OF ty_download,
    **            /bic/zcs_comp   TYPE char08,      " Unilever-Company
    **            /bic/zcs_item   TYPE char10,      " Unilever-Item
    **            /1fb/move_type  TYPE char03,      " Unilever-Movement type
    **            /bic/zcs_pcom   TYPE char08,      " Unilever-Partner Company
    **            /bic/zcs_invc   TYPE char08,      " Unilever-Investee Company
    **            /bic/zcs_cang   TYPE char03,      " Unilever-Cost Analysis Group
    **            /bic/zcs_ad     TYPE char08,      " Unilever-Aquisitions/Disposals
    **            /bic/zcs_cd     TYPE char08,      " Unilever-Continued/Discontinued Ops.
    **            /bic/zcs_prgp   TYPE char08,      " Unilever-Product Category
                /bic/zfb_comp   TYPE char08,      " Unilever-Company
                /bic/zfb_item   TYPE char10,      " Unilever-Item
                /bic/zfb_move   TYPE char03,      " Unilever-Movement type
                /bic/zfb_pcom   TYPE char08,      " Unilever-Partner Company
                /bic/zfb_invc   TYPE char08,      " Unilever-Investee Company
                /bic/zfb_cang   TYPE char03,      " Unilever-Cost Analysis Group
                /bic/zfb_ad     TYPE char08,      " Unilever-Acquisitions/Disposals
                /bic/zfb_cd     TYPE char08,      " Unilever-Continued/Discontinued Ops.
                /bic/zfb_prgp   TYPE char08,      " Unilever-Product Category
                /1fb/cs_trn_lc  TYPE string,      " Unilever-Period value in Local currency
                /1fb/cs_trn_qty TYPE string,     " Unilever-Periodic quantity
                unit            type string,
              END OF ty_download.
    * flag
      DATA flg_chk TYPE c.
      DATA: lr_s_data_out TYPE REF TO data,
            lr_t_data_out TYPE REF TO data,
            l_outtype     TYPE field_type VALUE 'UCR_SX_TX_DATA_LST',
            lr            TYPE REF TO data,
            lo_conv       TYPE REF TO lcl_convert_output,
            lt_char       TYPE lcl_convert_output=>th_comp,
            ls_comp       TYPE lcl_convert_output=>s_comp,
            lr_s_data     TYPE REF TO data,
            lr_t_data     TYPE REF TO data,
            l_filename    TYPE string,
            l_action      TYPE i,
            l_path        TYPE string,
            l_seperator   TYPE char01 VALUE 'X',
            l_fullpath    TYPE string,
            l_mmyy        TYPE string,
            lr_sel_data   TYPE REF TO data,
            lr_t_val      TYPE REF TO data,
            lr_t_final    TYPE REF TO data,
            lr_val        TYPE REF TO data,
            ls_download   TYPE REF TO data,
            lt_download   TYPE REF TO data,
            ls_sel        TYPE REF TO data,
            ls_msg        TYPE uc0_s_message.
    ** SOC by Dpak-------------------------------------------------------------------------------------------
    ** Changed by Deepak N Jain, IBM on 16/11/2006, as the corresponding field names assigned by Ramesh from
    ** the structure <LS_DATA_OUT> do not match. This is because of the new cube from which the financial
    ** data is coming now. The new cube is ZFBCS_T1, which has replaced the old cube ZCS_T1.
    ** Hence, short dump on execution.
    **  DATA: l_comp        TYPE string VALUE '/BIC/ZCS_COMP',  " Unilever-Company
    **        l_cgcomp      TYPE string VALUE '/1FB/SEM_CGCOMP'," Unilever-Consolidation Group
    **        l_pcomp       TYPE string VALUE '/BIC/ZCS_PCOM',  " Unilever-Partner Company
    **        l_doct        TYPE string VALUE '/BIC/ZCS_DOCT',  " Unilever-Document type
    **        l_plevel      TYPE string VALUE 'CS_PLEVEL',      " Unilever-Posting Level
    **        l_tc          TYPE string VALUE '/1FB/CS_TRN_TC', " Unilever-Period value in Transaction currency
    **        l_qty         TYPE string VALUE '/1FB/CS_TRN_QTY'," Unilever-Periodic quantity
    **        l_trn_lc      TYPE string VALUE '/1FB/CS_TRN_LC', " Unilever-Period value in Local currency
    **        l_bu          TYPE string VALUE '/BIC/ZCS_PROF',  " Unilever-Business Unit/Cost Centre
    **        l_low         TYPE string VALUE 'LOW'.
      DATA: l_comp        TYPE string VALUE '/BIC/ZFB_COMP',  " Unilever-Company
            l_cgcomp      TYPE string VALUE '/BIC/ZFB_CG1',   " Unilever-Consolidation Group
            l_pcomp       TYPE string VALUE '/BIC/ZFB_PCOM',  " Unilever-Partner Company
            l_doct        TYPE string VALUE '/BIC/ZFB_DOCT',  " Unilever-Document type
            l_invc        TYPE string VALUE '/BIC/ZFB_INVC',  " Unilever-Investee Company
            l_plevel      TYPE string VALUE 'CS_PLEVEL',      " Unilever-Posting Level
            l_tc          TYPE string VALUE '/1FB/CS_TRN_TC', " Unilever-Period value in Transaction currency
            l_qty         TYPE string VALUE '/1FB/CS_TRN_QTY'," Unilever-Periodic quantity
            l_trn_lc      TYPE string VALUE '/1FB/CS_TRN_LC', " Unilever-Period value in Local currency
            l_bu          TYPE string VALUE '/BIC/ZCS_PROF',  " Unilever-Business Unit/Cost Centre
            l_low         TYPE string VALUE 'LOW',
            l_unit        TYPE string  VALUE 'UNIT'.
    ** EOC by Dpak-----------------------------------------------------------------------------------------------
    ** future use **
    **  FISCVARNT TYPE L0002FISCVARNT,
    **  FISCPERIOD TYPE L0002FISCPERIOD,
    **  /BIC/ZFB_VERS TYPE L0002/BIC/ZFB_VERS,
    **  /BIC/ZFB_CG1 TYPE L0002/BIC/ZFB_CG1,
    **  /BIC/ZFB_COMP TYPE L0002/BIC/ZFB_COMP,
    **  /1FB/CS_CHART TYPE L0002/1FB/CS_CHART,
    **  /BIC/ZFB_MOVE TYPE L0002/BIC/ZFB_MOVE,
    **  /BIC/ZFB_PCOM TYPE L0002/BIC/ZFB_PCOM,
    **  ACQ_YEAR TYPE L0002ACQ_YEAR,
    **  ACQ_PER TYPE L0002ACQ_PER,
    **  /BIC/ZFB_INVC TYPE L0002/BIC/ZFB_INVC,
    **  /BIC/ZFB_ALCO TYPE L0002/BIC/ZFB_ALCO,
    **  CS_PLEVEL TYPE L0002CS_PLEVEL,
    **  /BIC/ZFB_DOCT TYPE L0002/BIC/ZFB_DOCT,
    **  BCS_CTFLG TYPE L0002BCS_CTFLG,
    **  UNIT TYPE L0002UNIT,
    **  /1FB/CURKEY_TC TYPE L0002/1FB/CURKEY_TC,
    **  /1FB/CURKEY_LC TYPE L0002/1FB/CURKEY_LC,
    **  /1FB/CURKEY_GC TYPE L0002/1FB/CURKEY_GC,
    **  /BIC/ZFB_PRGP TYPE L0002/BIC/ZFB_PRGP,
    **  /BIC/ZFB_COUN TYPE L0002/BIC/ZFB_COUN,
    **  /BIC/ZFB_CANG TYPE L0002/BIC/ZFB_CANG,
    **  /1FB/FUNC_AREA TYPE L0002/1FB/FUNC_AREA,
    **  /BIC/ZFB_CT TYPE L0002/BIC/ZFB_CT,
    **  /BIC/ZFB_CD TYPE L0002/BIC/ZFB_CD,
    **  /BIC/ZFB_AD TYPE L0002/BIC/ZFB_AD,
    **  /BIC/ZFB_FR01 TYPE L0002/BIC/ZFB_FR01,
    **  /BIC/ZFB_FR02 TYPE L0002/BIC/ZFB_FR02,
    **  FISCYEAR TYPE L0002FISCYEAR,
    **  /BIC/ZFB_ITEM TYPE L0002/BIC/ZFB_ITEM,
    ** future use **
      FIELD-SYMBOLS: <lt_data_out> TYPE STANDARD TABLE,
                     <ls_data_out> TYPE ANY,
                     <ls_data>     TYPE ANY,
                     <ls_data_cop> TYPE ANY,
                     <lt_data_std> TYPE STANDARD TABLE,
                     <ls_data_std> TYPE ANY,
                     <ls_download>  TYPE ANY,
                     <lt_download> TYPE STANDARD TABLE,
                     <comp>        TYPE ANY,
                     <cgcomp>      TYPE ANY,
                     <pcomp>       TYPE ANY,
                     <doct>        TYPE ANY,
                     <plevel>      TYPE ANY,
                     <tc>          TYPE ANY,
                     <invc>        TYPE ANY,
                     <qty>         TYPE ANY,
                     <unit>        TYPE ANY,
                     <trn_lc>      TYPE ANY,
                     <comp1>       TYPE ANY,
                     <cgcomp1>     TYPE ANY,
                     <pcomp1>      TYPE ANY,
                     <doct1>       TYPE ANY,
                     <plevel1>     TYPE ANY,
                     <tc1>         TYPE ANY,
                     <invc1>       TYPE ANY,
                     <qty1>        TYPE ANY,
                     <trn_lc1>     TYPE ANY,
                     <unit1>       TYPE ANY,
                     <bu>          TYPE ANY,
                     <bu1>         TYPE ANY,
                     <low>         TYPE ANY,
                     <l_year>      TYPE ANY ,
                     <l_period>    TYPE ANY ,
                     <l_mmyy>      TYPE ANY,
                     <ls_sel_data> TYPE ANY,
                     <ls_val>      TYPE zbcs_check_val,
                     <lt_t_val>    TYPE STANDARD TABLE,
                     <lt_t_final>  TYPE STANDARD TABLE.
    **************** Begin Of Addtion - ASIF MAQBOOL ******************
      Data: BEGIN OF l_s_tab,
            values TYPE C LENGTH 400,
           end OF l_s_tab,
          gs_param LIKE LINE OF gt_param,
          l_temp_store LIKE gt_param.
      Field-SYMBOLS: <gt_param> TYPE ANY TABLE,
                     <final_gt> TYPE any ,
                     <gs_fieldname> TYPE any,
                     <gs_value> TYPE any.
      Data: l_t_tab TYPE TABLE OF string INITIAL SIZE 0,
           str TYPE string,
           str1 TYPE string.
    **************** End Of Addtion - ASIF MAQBOOL ******************
    * create Line-structure of data table
      CREATE DATA lr_s_data LIKE LINE OF ct_data.
      ASSIGN lr_s_data->* TO <ls_data_std>.
    ** create cumulation table
      CREATE DATA lr_t_data LIKE STANDARD TABLE OF <ls_data_std>.
      ASSIGN lr_t_data->* TO <lt_data_std>.
      LOOP AT ct_data INTO <ls_data_std>.
        COLLECT <ls_data_std> INTO <lt_data_std>.
      ENDLOOP.
      FREE ct_data.
    * get reference for outtab / create outtab
      CALL METHOD go_model->create_data_reference
        EXPORTING
          io_tx_data_io_type = go_model->ds_tx_data_io_type-totals
          i_type             = l_outtype "'UCR_SX_TX_DATA_LST'
        IMPORTING
          er_data            = lr_s_data_out.
      ASSIGN lr_s_data_out->* TO <ls_data_out>.
      ASSIGN lr_s_data_out->* TO <ls_data_cop>.
      CREATE DATA lr_t_data_out LIKE STANDARD TABLE OF <ls_data_out>.
      ASSIGN lr_t_data_out->* TO <lt_data_out>.
      CREATE DATA lr LIKE LINE OF <lt_data_std>.
      ASSIGN lr->* TO <ls_data>.
      CALL METHOD lcl_convert_output=>get_instance
        EXPORTING
          io_model    = go_model
          it_char     = lt_char
        IMPORTING
          eo_instance = lo_conv
        CHANGING
          cs_data     = <ls_data>.
      ASSIGN: l_comp   TO <comp>,
              l_cgcomp TO <cgcomp>,
              l_pcomp  TO <pcomp>,
              l_invc   TO <invc>,
              l_doct   TO <doct>,
              l_plevel TO <plevel>,
              l_tc     TO <tc>,
              l_qty    TO <qty>,
              l_trn_lc TO <trn_lc>,
              l_bu     TO <bu>,
              l_mmyy   TO <l_mmyy>,
              l_low    TO <low>,
              l_unit   TO <unit>.
      LOOP AT <lt_data_std> INTO <ls_data>.    "  loop
        CALL METHOD lo_conv->convert_output.      "end of "wis240605
    *     fill outtab
        ASSIGN COMPONENT if_uc_model=>gc_type_comp_s_char
        OF STRUCTURE <ls_data> TO <ls_data_cop>.
        MOVE-CORRESPONDING <ls_data_cop> TO <ls_data_out>.
        ASSIGN COMPONENT if_uc_model=>gc_type_comp_s_kfig
        OF STRUCTURE <ls_data> TO <ls_data_cop>.
        MOVE-CORRESPONDING <ls_data_cop> TO <ls_data_out>.
        IF ct_task = c_task01 OR ct_task = space . "'it can be T2700 or blank
    *****  aggregate the transactionaldata for given Rules  *****
          ASSIGN COMPONENT : <comp>   OF STRUCTURE <ls_data_out> TO <comp1>,
                             <cgcomp> OF STRUCTURE <ls_data_out> TO <cgcomp1>,
                             <pcomp>  OF STRUCTURE <ls_data_out> TO <pcomp1>,
                             <doct>   OF STRUCTURE <ls_data_out> TO <doct1>,
                             <invc>   OF STRUCTURE <ls_data_out> TO <invc1>, "added by Asif M.
                             <plevel> OF STRUCTURE <ls_data_out> TO <plevel1>,
                             <tc>     OF STRUCTURE <ls_data_out> TO <tc1>,
                             <qty>    OF STRUCTURE <ls_data_out> TO <qty1>,
                             <trn_lc> OF STRUCTURE <ls_data_out> TO <trn_lc1>.
    **** 1st Rule ****
    * dont include records where '/1FB/CS_TRN_QTY' and '/1FB/CS_TRN_LC' are blank
          IF <trn_lc1> = 0 AND <qty1> = 0.
            CONTINUE.
          ENDIF.
    **** 2nd Rule ****
    * delete the Posting levels if it is > 1 and clear to blank  CS_PLEVEL
          CHECK <plevel1> LE 1.
          CLEAR <plevel1>.
    **** 3rd Rule ****
    * replace the Unilever Company with CG without prefix G and compare with Partner comp for deletion
    * /BIC/ZCS_COMP with   /1FB/SEM_CGCOMP and check with /BIC/ZCS_PCOM
          IF ct_task <> space.
    ***  code added by Ramesh for the removal GBRNCH  records while downloading the file.
    **** code for removal of GBRNCH records only - Hardcode - sample code
    *        IF <cgcomp1> <> 'GBRNCH'.   " to avoid BRNCH records
    ** replace Company with CG without prefix G when task name is not blank
    *          <comp1> = <cgcomp1>+1.
    *        ELSE.
    *          CONTINUE.
    *        ENDIF.
    **** code for removal of GBRNCH records only - Hardcode - sample code
    ***  the assumption here is to consider only records whose legal entity is numeric except for the first character
            IF <cgcomp1>+1 CN sy-abcde.   " to avoid BRNCH records
    * replace Company with CG without prefix G when task name is not blank
              <comp1> = <cgcomp1>+1.
            ELSE.
              CONTINUE.
            ENDIF.
          ENDIF.
    * removing leading zeros, as SAP sometimes adds them to <pcomp1>.
    *      CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
    *        EXPORTING
    *          input  = <pcomp1>
    *        IMPORTING
    *          output = <pcomp1>.
          PERFORM f_alpha_conversion USING <comp1>
                                          CHANGING <comp1>.
          PERFORM f_alpha_conversion USING <pcomp1>
                                     CHANGING <pcomp1>.
          " removing leading zero's from Investee Unit company - Added By Asif Maqbool, IBM/Unilever, 25/01/2008.
          PERFORM f_alpha_conversion USING <invc1>
                                     CHANGING <invc1>.
          CHECK <comp1> <> <pcomp1>.
    **** 4th Rule ****
    * initialise Document type & PV TC to blank   /BIC/ZCS_DOCT & /1FB/CS_TRN_TC
          CLEAR: <doct1>,<tc1>.
        ENDIF.
        COLLECT: <ls_data_out> INTO <lt_data_out>.
      ENDLOOP.          " end loop
      IF <lt_data_out> IS INITIAL.
        PERFORM f_build_msgs USING 'UCM0'
                                   'W'
                                   '053'
                                   text-102
                                   space "ct_task
                                   space
                                   space
                          CHANGING ct_message.
        RETURN.
        MESSAGE e208(00) WITH text-102.
        EXIT.
      ENDIF.
    * perform various steps based on task name
      CASE ct_task.
        WHEN c_task01 OR space.              " 'T2700' or blank
    * download the data into a tab delimited file
    ****             start of Task 0001              ****
    * create Line-structure of download table
          CREATE DATA ls_download TYPE ty_download.
          ASSIGN ls_download->* TO <ls_download>.
    ** create Download data internal table for task T2700
          CREATE DATA lt_download LIKE STANDARD TABLE OF <ls_download>.
          ASSIGN lt_download->* TO <lt_download>.
          LOOP AT <lt_data_out> INTO <ls_data_out>.
            MOVE-CORRESPONDING <ls_data_out> TO <ls_download>.
            ASSIGN COMPONENT <trn_lc> OF STRUCTURE <ls_download> TO <trn_lc1>.
    *** 6th Rule **********************
            " Check for values, if present remove decimals, if not present clear it of (blank).
            UNASSIGN <qty1>.
            ASSIGN COMPONENT <qty> of STRUCTURE <ls_download> to <qty1>.
            ASSIGN COMPONENT <unit> of STRUCTURE <ls_download> to <unit1>.
            if <unit1> = '' OR <unit1> <> '%'.
              REPLACE ALL OCCURRENCES OF '.' in <qty1> WITH '' RESPECTING CASE.
              <qty1> = ''.
            endif.
            if <unit1> <> '' And <qty1> <> ''.
              <unit1> = ''. " We dont need unit downloaded.
              REPLACE ALL OCCURRENCES OF '.' in <qty1> WITH '' RESPECTING CASE.
              <qty1> = <qty1>+0(2).
            endif.
    **** 5th Rule ****
    * Move the Negative sign to front
            PERFORM f_put_sign_in_front CHANGING <trn_lc1>.
            INSERT <ls_download>  INTO TABLE <lt_download>.
          ENDLOOP.
    * move the aggregated data to final table for display
          FREE ct_data.
          ct_data = <lt_data_out>.
          CLEAR: l_path,l_filename,l_fullpath,l_action.
          CALL FUNCTION 'GUI_FILE_SAVE_DIALOG'
           EXPORTING
             window_title            = 'Download aggregated BCS data to Tab Delimited file'
             default_extension       = 'txt'
    *   DEFAULT_FILE_NAME       = dynamic file name as like ALE settings
             file_filter             = 'Text files (*.txt)'
           IMPORTING
             filename                = l_filename
             path                    = l_path
             fullpath                = l_fullpath
             user_action             = l_action .
          CALL METHOD cl_gui_cfw=>flush.
          " *************************** START OF CHANGE - ASIF MAQBOOL ************************
          IF l_action = 0 OR l_action = 1.
            ASSIGN gt_param TO <gt_param>.
            APPEND '100' to l_t_tab.
            READ TABLE gt_param INDEX 6 INTO gs_param.
            ASSIGN COMPONENT 2 OF STRUCTURE gs_param to <gs_value>.
            APPEND <gs_value> to l_t_tab.
            Clear gs_param.
            READ TABLE gt_param INDEX 4 INTO gs_param.
            ASSIGN COMPONENT 2 OF STRUCTURE gs_param to <gs_value>.
            APPEND <gs_value> to l_t_tab.
    *        LOOP AT <gt_param> INTO gs_param.
    *          ASSIGN COMPONENT 1 OF STRUCTURE gs_param to <gs_fieldname>.
    *          CASE <gs_fieldname>.
    *             WHEN '/BIC/ZFB_VERS'.
    *              APPEND '100' to l_t_tab.
    *             WHEN 'FISCPERIOD'.
    *                ASSIGN COMPONENT 2 OF STRUCTURE gs_param to <gs_value>.
    *                APPEND <gs_value> to l_t_tab.
    *             WHEN 'FISCYEAR'.
    *                ASSIGN COMPONENT 2 OF STRUCTURE gs_param to <gs_value>.
    *                APPEND <gs_value> to l_t_tab.
    *              WHEN OTHERS.
    *                ENDCASE.
    *        ENDLOOP.
            CONCATENATE LINES OF l_t_tab INTO str1 SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
            CLEAR l_t_tab.
            INSERT str1 INTO TABLE l_t_tab.
            ASSIGN l_t_tab TO <final_gt>.
            "  To Start by adding the Header Data.
            CALL METHOD cl_gui_frontend_services=>gui_download
              EXPORTING
                filename              = l_fullpath
                write_field_separator = l_seperator "SPACE
              CHANGING
                data_tab              = <final_gt>
              EXCEPTIONS
                file_write_error      = 1.
            " Now append the Data after the header data insertion.
            CALL METHOD cl_gui_frontend_services=>gui_download
              EXPORTING
                filename                = l_fullpath
                write_field_separator   = l_seperator "SPACE
                append                  = 'X'
              CHANGING
                data_tab                = <lt_download>
              EXCEPTIONS
                file_write_error        = 1
                no_batch                = 2
                gui_refuse_filetransfer = 3
                invalid_type            = 4
                no_authority            = 5
                unknown_error           = 6
                header_not_allowed      = 7
                separator_not_allowed   = 8
                filesize_not_allowed    = 9
                header_too_long         = 10
                dp_error_create         = 11
                dp_error_send           = 12
                dp_error_write          = 13
                unknown_dp_error        = 14
                access_denied           = 15
                dp_out_of_memory        = 16
                disk_full               = 17
                dp_timeout              = 18
                file_not_found          = 19
                dataprovider_exception  = 20
                control_flush_error     = 21
                not_supported_by_gui    = 22
                error_no_gui            = 23
                OTHERS                  = 24.
            " *************************** END OF CHANGE - ASIF MAQBOOL ************************
            IF sy-subrc <> 0.
    *          MESSA

    Hi,
    I am also from the same project.
    We are facing a problem with this code.
    How can I increase the length of the field obtained by this method?
    CALL METHOD go_model->create_data_reference
        EXPORTING
          io_tx_data_io_type = go_model->ds_tx_data_io_type-totals
          i_type             = l_outtype "'UCR_SX_TX_DATA_LST'
        IMPORTING
          er_data            = lr_s_data_out.
    ASSIGN lr_s_data_out->* TO <ls_data_out>.
    We are getting a data overflow error when we try to move a large value to one of the fields in <ls_data_out>.
    We can avoid this if the field length is increased.
    Waiting for your reply.
    Regards
    Madhu G S

  • Back up for BCS data

    Hello,
    I have an issue and I don't know how to solve it.
    The client wants to create a backup of the data by creating additional cubes. After they finish working on the data and approve the financial reports, they want to copy the transactional data to a backup cube. In case some data gets lost or damaged in the original cube (ZBCS_C11), they want to be able to analyze the data in the backup cube (ZBCS_C11G).
    I asked the BW team to create a transactional cube (a copy of the ZBCS_C11 cube). They did. But they say they cannot create a copy of all the other cubes (the virtual InfoProvider, the InfoCube for reporting (delta load) and the MultiProvider for reporting (delta load)). So it is impossible to design BEx reports on the backup cube the way they are designed on the original cube.
    They built a MultiProvider on top of the two cubes, the original one (ZBCS_C11) and the backup cube (ZBCS_C11G). But the BEx report designed on this MultiProvider doesn't show all the data. I think it is because the "Reporting mode" object is missing.
    Any ideas?
    Thank you,
    Lilia

    I see the point...
    Actually, you only need a virtual cube to read the backup cube data (in order to apply the consolidation logic), not necessarily the complete delta-load scenario (indeed, the delta-load scenario was designed to improve performance).
    And I agree that it is not easy to create this virtual cube manually: in the standard it is generated by the system from the BCS workbench, from a specific data basis / data model.
    What is maybe more realistic is to find another server where you could copy your BCS/BW customizing, such as a pre-production system, and fill that system with your production data as a backup. Please look at the following blog to perform the data transfer from one BCS system to the other very easily (without a system copy):
    /people/collet.thibaud/blog/2010/10/18/sem-bcs--refresh-your-test-system-with-productive-data-using-a-datamart-interface

  • Performance of Query

    Hi BW Folks,
    I am working on the virtual cube 0bcs_vc10 for BCS (Business Consolidation); the base cube is 0bcs_c10. We compressed and partitioned the base cube since it was having huge performance issues. The queries which I developed earlier are running fine and are in production.
    Now the new queries I have developed are taking 20 to 25 minutes to run in DEV, i.e. the partitioning and compression of the cube are not helping us.
    I went to RSRV and checked the indexes of the cube, and I got this
    yellow signal: ORACLE: Index /BI0/ICS_ITEM~0 has possibly degenerated
    I need your suggestions on what my next step should be. Will assign full points.
    Thanks

    Hi Ravi,
    I did RSRV and corrected the error, but it still shows me the same error and there is a yellow signal. Could you please tell me where else I should look in order to get the query performance right? I have already done partitioning and compression.
    It was running fine until two days ago, and all of a sudden there is a huge runtime for the queries.
    Your suggestions will be appreciated with full points.
    Thanks

  • Huge Performance issue and RSRT

    Hi BW Gurus,
    We are using the BCS cube for our consolidation queries and reports. There is a huge performance problem.
    I need to know what the appropriate size of the global cache should be compared to the local cache. My local cache size is 100 MB and the global cache size is 200 MB.
    Also, when I go to the RSRT properties:
    Read mode is H (query to read when you navigate or expand hierarchies),
    Cache mode is 4 (persistent cache across each application server),
    Persistence mode is 3 (transparent table (BLOB)).
    Do I have to change these settings? Please give your suggestions;
    they will be appreciated with lots of points.
    Thanks

    Hi Folks,
    Could you please tell me where exactly we should put the breakpoint? I will paste my code. I ran SE30 and the list cube extraction simultaneously, and it gave me the message "error generating the test frame".
    FUNCTION RSSEM_CONSOLIDATION_INFOPROV3.
    ""Lokale Schnittstelle:
    *"  IMPORTING
    *"     REFERENCE(I_INFOPROV) TYPE  RSINFOPROV
    *"     REFERENCE(I_KEYDATE) TYPE  RSDRC_SRDATE
    *"     REFERENCE(I_TH_SFC) TYPE  RSDD_TH_SFC
    *"     REFERENCE(I_TH_SFK) TYPE  RSDD_TH_SFK
    *"     REFERENCE(I_TSX_SELDR) TYPE  RSDD_TSX_SELDR
    *"     REFERENCE(I_FIRST_CALL) TYPE  RS_BOOL
    *"     REFERENCE(I_PACKAGESIZE) TYPE  I
    *"  EXPORTING
    *"     REFERENCE(E_T_DATA) TYPE  STANDARD TABLE
    *"     REFERENCE(E_END_OF_DATA) TYPE  RS_BOOL
    *"     REFERENCE(E_T_MSG) TYPE  RS_T_MSG
    *"  EXCEPTIONS
    *"      ERROR_IN_BCS
      statics:
    * UT begin:
    * this flag is switched in order to record data returned by the current query in UT
    * it can only be switched on/off in debug mode.
        s_record_mode  type rs_bool,
        s_qry_memo     type char256,    " at the moment, for query name
    * package No, UUID, for unit testing
        s_packageno    type i,
        s_guid         type guid_22,
    * UT end.
        s_first_call   like i_first_call,
        s_destination  type rfcdest,
        s_basiccube    type rsinfoprov,
        s_dest_back    type rfcdest,
        s_report       type programm,
        s_bw_local     type rs_bool,
        sr_data        type ref to data,
        sr_data_p      type ref to data,
        st_sfc         type t_sfc,
        st_sfk         type t_sfk,
        st_range       type t_seqnr_range,
        st_hienode     type t_seqnr_hienode,
        st_hienodename type t_seqnr_hienodename,
        st_seltype     type t_seqnr_seltype,
        st_datadescr   type T_DATADESCR,
        s_end_of_data  type rs_bool.
      data:
        l_ucr_data_read_3 type funcname value 'UCR_DATA_READ_3',
        l_packagesize like i_packagesize,
        lt_message type t_message,
        ls_message like line of e_t_msg,
        l_xstring type xstring,
        l_nr type i.
      field-symbols:
        <ls_message> type s_message,
        <lt_data>   type standard table,
        <ls_data>   type any,"nos100804
        <lt_data_p> type hashed table."nos100804
      clear: e_t_data, e_end_of_data, e_t_msg.
    * react on packagesize -1
      if i_packagesize le 0.    "nos050705
        l_packagesize = rssem_cs_integer-max.
      else.
        l_packagesize = i_packagesize.
      endif.
      if i_first_call = rs_c_true.
        s_first_call = rs_c_true.
        clear s_end_of_data.
    * begin "nos100804
        data:
          lo_structdescr type ref to cl_abap_structdescr
         ,lo_tabledescr type ref to cl_abap_tabledescr
         ,lo_typedescr   type ref to cl_abap_typedescr.
        data:
          lt_key     type table of abap_compname.
        field-symbols <ls_component> type abap_compdescr.
        create data sr_data_p like line of e_t_data.
        assign sr_data_p->* to <ls_data>.
        CALL METHOD CL_ABAP_STRUCTDESCR=>DESCRIBE_BY_DATA
          EXPORTING
            P_DATA      = <ls_data>
          RECEIVING
            P_DESCR_REF = lo_typedescr.
        lo_structdescr ?= lo_typedescr.
    * collect all key components to lt_key
        loop at lo_structdescr->components assigning <ls_component>.
          insert <ls_component>-name into table lt_key.
          if <ls_component>-name = '&KEYEND'.
            exit.
          endif.
        endloop.
        data ls_sfk like line of i_th_sfk.
        data l_key     type abap_compname.
        loop at i_th_sfk into ls_sfk.
          l_key = ls_sfk-kyfnm.
          if l_key is not initial.
            delete table lt_key from l_key.
          endif.
          l_key = ls_sfk-value_returnnm.
          if l_key is not initial.
            delete table lt_key from l_key.
          endif.
        endloop.
        create data sr_data_p like hashed table of <ls_data>
            with unique key (lt_key).
    *   create data sr_data_p like e_t_data.
        create data sr_data   like e_t_data.
    * end "nos100804
        perform determine_destinations  using    i_infoprov
                                        changing s_destination
                                                 s_dest_back
                                                 s_report
                                                 s_basiccube.
        perform is_bw_local changing s_bw_local.
    ***--> convert the selection, enhance non-Sid-values.
    *--> Handle fiscper7
        data:
          lt_SFC      TYPE  RSDRI_TH_SFC
         ,lt_sfk      TYPE  RSDRI_TH_SFK
         ,lt_range    TYPE  RSDRI_T_RANGE
         ,lt_RANGETAB TYPE  RSDRI_TX_RANGETAB
         ,lt_HIER     TYPE  RSDRI_TSX_HIER
         ,lt_adj_hier type  t_sfc.            "nos290704
        statics: so_convert type ref to lcl_sid_no_sid
               , sx_seldr_fp34 type xstring
               , s_fieldname_fp7 type RSALIAS
               , st_sfc_fp34    TYPE  RSDD_TH_SFC.
        create object so_convert type lcl_sid_no_sid
                  exporting i_infoprov = i_infoprov.
    * Transform SIDs...
        perform convert_importing_parameter
                         using    i_th_sfc
                                  i_th_sfk
                                  i_tsx_seldr
                                  so_convert
                                  e_t_data
                         changing lt_sfc
                                  lt_sfk
                                  lt_range
                                  lt_rangetab
                                  lt_hier
                                  sx_seldr_fp34
                                          "Complete SELDR as XSTRING
                                  st_sfc_fp34
                                          "SFC of a selection with
                                          "FISCPER3/FISCYEAR
                                  s_fieldname_fp7.
                                          "Name of Field for 0FISCPER
                                          "(if requested)
    * This is the old routine, but ST_HIENODE and ST_HIENODENAME can
    * be neglected, since they are not used at all.
        perform prepare_selections
                         using    lt_sfc
                                  lt_sfk
                                  lt_range
                                  lt_rangetab
                                  lt_hier
                         changing st_sfc
                                  st_sfk
                                  st_range
                                  st_hienode
                                  st_hienodename
                                  st_seltype.
      endif.
      assign sr_data->*   to <lt_data>.
      assign sr_data_p->* to <lt_data_p>.
      describe table <lt_data_p> lines l_nr.
      while l_nr < l_packagesize and s_end_of_data is initial.
        if s_dest_back is initial and s_bw_local = rs_c_true.
    *   Local call
          call function l_UCR_DATA_READ_3
            EXPORTING
              IT_SELTYPE      = sT_SELTYPE
              IT_HIENODE      = sT_HIENODE        "not used
              IT_HIENODENAME  = sT_HIENODENAME    "not used
              IT_RANGE        = sT_RANGE
              I_PACKAGESIZE   = i_packagesize
              I_KEYDATE       = i_Keydate
              IT_SFC          = sT_SFC
              IT_SFK          = sT_SFK
              i_infoprov      = i_infoprov
              i_rfcdest       = s_destination
              ix_seldr        = sx_seldr_fp34
              it_bw_sfc       = st_sfc_fp34
              it_bw_sfk       = i_th_sfk
              i_fieldname_fp7 = s_fieldname_fp7
            IMPORTING
              ET_DATA         = <lT_DATA>
              E_END_OF_DATA   = s_END_OF_DATA
              ET_MESSAGE      = lT_MESSAGE
              et_adj_hier     = lt_adj_hier         "nos290704
            CHANGING
              c_first_call    = s_first_call.
        elseif s_dest_back is initial and s_bw_local = rs_c_false.
    *   !!! Error !!! No SEM-BCS destination registered for infoprovider!
          if 1 = 2.
            message e151(rssem) with i_infoprov.
          endif.
          ls_message-msgty = 'E'.
          ls_message-msgid = 'RSSEM'.
          ls_message-msgno = '151'.
          ls_message-msgv1 =  i_infoprov.
          insert ls_message into table e_t_msg.
        else.
    *   remote call to SEM-BCS
    ** Call UCR_DATA_READ_3 ...
          if s_first_call is not initial.
    *   get the datadescription to create the requested return-structure
    *   in the RFC-System.
            perform get_datadescr
                      using <lt_data>
                      changing st_datadescr.
          endif.
          call function 'UCR_DATA_READ_4'
            destination s_dest_back
            exporting i_infoprov     = i_infoprov
                      i_rfcdest      = s_destination
                      i_first_call   = s_first_call
                      i_packagesize  = i_packagesize
                      i_keydate      = i_keydate
                      ix_seldr       = sx_seldr_fp34
                      it_bw_sfc      = st_sfc_fp34
                      it_bw_sfk      = i_th_sfk
                      it_datadescr   = st_datadescr
                      i_fieldname_fp7 = s_fieldname_fp7
            importing c_first_call   = s_first_call
                      e_end_of_data  = s_end_of_data
                      e_xstring      = l_xstring
            tables    it_seltype     = st_seltype
                      it_range       = st_range
                      it_hienode     = st_hienode      "not used
                      it_hienodename = st_hienodename  "not used
                      it_sfc         = st_sfc
                      it_sfk         = st_sfk
                      et_message     = lt_message
                      et_adj_hier    = lt_adj_hier.         "nos290704.
          clear <lt_data>.
          if lt_message is initial.
            call function 'RSSEM_UCR_DATA_UNWRAP'
              EXPORTING
                i_xstring = l_xstring
              CHANGING
                ct_data   = <lt_data>.
          endif.
        endif.
    * convert the returned data (SID & Hierarchy).
        call method so_convert->convert_nosid2sid
          exporting it_adj_hier = lt_adj_hier[]     "nos290704
          CHANGING
            ct_data = <lt_data>.
    *    e_t_data = <lt_data>.
    * Begin "nos100804
        data l_collect type sy-subrc.
        l_collect = 1.
        if <lt_data_p> is initial and
           <lt_data>   is not initial.
          call function 'ABL_TABLE_HASH_STATE'
            exporting
              itab          = <lt_data>
            IMPORTING
          HASH_RC       = l_collect.      "returns 0 if the hash key exists
        endif.
        if l_collect is initial.
          <lt_data_p> = <lt_data>.
        else.
          loop at <lt_data> assigning <ls_data>.
            collect <ls_data> into <lt_data_p>.
          endloop.
        endif.
    *    append lines of <lt_data> to <lt_data_p>.
    * End "nos100804
    * messages
        loop at lt_message assigning <ls_message>.
          move-corresponding <ls_message> to ls_message.
          insert ls_message into table e_t_msg.
        endloop.
        if e_t_msg is not initial.
          raise error_in_bcs.
        endif.
        describe table <lt_data_p> lines l_nr.
      endwhile.
      if l_nr <= l_packagesize.
        e_t_data = <lt_data_p>.
        clear <lt_data_p>.
        e_end_of_data = s_end_of_data.
      else.
    * Begin "nos100804
        <lt_data> = <lt_data_p>.
        append lines of <lt_data> to l_packagesize to e_t_data.
        data l_from type i.
        l_from = l_packagesize + 1.
        clear <lt_data_p>.
        insert lines of <lt_data> from l_from into table <lt_data_p>.
        clear <lt_data>.
    * End "nos100804
      endif.
    * UT begin: start to record data
      if s_record_mode = rs_c_true.
        if i_first_call = rs_c_true.
          clear: s_guid, s_packageno.
          perform prepare_unit_test_rec_param
                      using
                         e_end_of_data
                         i_infoprov
                         i_keydate
                         i_th_sfc
                         i_th_sfk
                         i_tsx_seldr
                         i_packagesize
                         lt_key
                         e_t_data
                         s_qry_memo
                      changing
                         s_guid.
        endif.
        add 1 to s_packageno.
        perform prepare_unit_test_rec_data
                      using
                         s_guid
                         s_packageno
                         e_t_data
                         i_infoprov
                         e_end_of_data.
      endif.  "s_record_mode = rs_c_true
    * UT end.
      if not e_end_of_data is initial.
    *   clean-up
        clear: s_first_call, s_destination, s_report, s_bw_local,
               st_sfc, st_sfk, st_range, st_hienode, s_basiccube,
               st_hienodename, st_seltype, s_dest_back, sr_data,
               so_convert , s_end_of_data, sr_data_p."nos100804
        free: <lt_data> , <lt_data_p>.
      endif.
    endfunction.
    * It stores query parameters into a cluster table
    form prepare_unit_test_rec_param using i_end_of_data type rs_bool
                                           i_infoprov    type rsinfoprov
                                           i_keydate     type rrsrdate
                                           i_th_sfc      type RSDD_TH_SFC
                                           i_th_sfk      type RSDD_TH_SFk
                                           i_tsx_seldr   type rsdd_tsx_seldr
                                           i_packagesize type i
                                           it_key        type standard table
                                           it_retdata    type standard table
                                           i_s_memo      type char256
                                     changing c_guid     type guid_22.
      data:
            ls_key          type g_rssem_typ_key,
            ls_cluster      type rssem_rfcpack,
            l_timestamp     type timestampl.
* get GUID, ret component type
      call function 'GUID_CREATE'
        importing
          ev_guid_22 = c_guid.
      ls_key-idxrid = c_guid.
      clear ls_key-packno.
* cluster record
      get time stamp field l_timestamp.
      ls_cluster-infoprov = i_infoprov.
      ls_cluster-end_of_data = i_end_of_data.
      ls_cluster-system_time = l_timestamp.
      ls_cluster-username = sy-uname.
* return data type
      data:
        lo_tabtype     type ref to cl_abap_tabledescr,
        lo_linetype    type ref to cl_abap_structdescr,
        lt_datadescr   type t_datadescr,
        ls_datadescr   like line of lt_datadescr,
        lt_retcomptab  type abap_compdescr_tab,
        ls_retcomptab  like line of lt_retcomptab,
        lt_rangetab    type t_seqnr_range.
      lo_tabtype   ?= cl_abap_typedescr=>describe_by_data( it_retdata ).
*      lo_linetype  ?= lo_tabtype->get_table_line_type( ).
*      lt_retcomptab = lo_linetype->components.
* call the subroutine to use the external format of C instead of the internal format (unicode);
* otherwise, when the data type is created from the internal format, it will not have the same length as stored in the cluster.
      PERFORM get_datadescr USING    it_retdata
                            CHANGING lt_datadescr.
      loop at lt_datadescr into ls_datadescr.
        move-corresponding ls_datadescr to ls_retcomptab.
        append ls_retcomptab to lt_retcomptab.
      endloop.
* range, excluding
* record param
      export p_infoprov        from i_infoprov
             p_keydate         from i_keydate
             p_th_sfc          from i_th_sfc
             p_th_sfk          from i_th_sfk
             p_txs_seldr       from i_tsx_seldr
             p_packagesize     from i_packagesize
             p_t_retcomptab    from lt_retcomptab
             p_t_key           from it_key
             p_memo            from i_s_memo
      to database rssem_rfcpack(ut)
      from ls_cluster
      client sy-mandt
      id ls_key.
    endform.
* It stores return data to cluster table
    form prepare_unit_test_rec_data using
                                      i_guid        type guid_22
                                      i_packageno   type i
                                      it_retdata    type standard table
                                      i_infoprov    type rsinfoprov
                                      i_end_of_data type rs_bool.
      data:
            l_lines         type i,
            ls_key          type g_rssem_typ_key,
            ls_cluster      type rssem_rfcpack,
            l_timestamp     type timestampl.
      ls_key-idxrid = i_guid.
      ls_key-packno = i_packageno.
      describe table it_retdata lines l_lines.
      if l_lines = 0.
        clear it_retdata.
      endif.
* cluster record
      get time stamp field l_timestamp.
      ls_cluster-infoprov = i_infoprov.
      ls_cluster-end_of_data = i_end_of_data.
      ls_cluster-system_time = l_timestamp.
      ls_cluster-username = sy-uname.
      export p_t_retdata       from it_retdata
      to     database rssem_rfcpack(ut)
      from   ls_cluster
      client sy-mandt
      id     ls_key.
    endform.
    form convert_importing_parameter
                   using    i_th_sfc    TYPE  RSDD_TH_SFC
                            i_th_sfk    TYPE  RSDD_TH_SFK
                            i_tsx_seldr TYPE  RSDD_TSX_SELDR
                            io_convert  type  ref to lcl_sid_no_sid
                            i_t_data    type  any table
                   changing et_sfc      TYPE  RSDRI_TH_SFC
                            et_sfk      TYPE  RSDRI_TH_SFK
                            et_range    TYPE  RSDRI_T_RANGE
                            et_rangetab TYPE  RSDRI_TX_RANGETAB
                            et_hier     TYPE  RSDRI_TSX_HIER
                            ex_seldr    type xstring
                            e_th_sfc    TYPE  RSDD_TH_SFC
                             e_fieldname_fp7 type  rsalias.
      data lt_seldr TYPE  RSDD_TSX_SELDR.
      data ls_th_sfc type RRSFC01.
* 0) rename 0BCSREQUID --> 0REQUID
      data l_tsx_seldr like i_tsx_seldr.
      data l_th_sfc like i_th_sfc.
      data l_th_sfc2 like i_th_sfc.                            "nos070605
      l_tsx_seldr = i_tsx_seldr.
      l_th_sfc = i_th_sfc.
      data ls_sfc_requid type   RRSFC01.
      data ls_seldr_requid type RSDD_SX_SELDR.
      ls_sfc_requid-chanm = '0BCS_REQUID'.
      read table l_th_sfc from ls_sfc_requid into ls_sfc_requid.
      if sy-subrc = 0.
        delete table l_th_sfc from ls_sfc_requid.
        ls_sfc_requid-chanm = '0REQUID'.
        insert ls_sfc_requid into table l_th_sfc.
      endif.
      ls_seldr_requid-chanm = '0BCS_REQUID'.
      read table l_tsx_seldr from ls_seldr_requid into ls_seldr_requid.
      if sy-subrc = 0.
        delete table l_tsx_seldr from ls_seldr_requid.
        ls_seldr_requid-chanm = '0REQUID'.
        field-symbols: <ls_range> like line of ls_seldr_requid-range-range.
        loop at ls_seldr_requid-range-range assigning <ls_range>.
          check <ls_range>-keyfl is not initial. "jhn190106
          if <ls_range>-sidlow is initial and <ls_range>-low is not initial.
            <ls_range>-sidlow = <ls_range>-low.
            clear <ls_range>-low.
          endif.
          if <ls_range>-sidhigh is initial and <ls_range>-high is not initial.
            <ls_range>-sidhigh = <ls_range>-high.
            clear <ls_range>-high.
          endif.
          clear <ls_range>-keyfl.     "jhn190106
        endloop.
        insert ls_seldr_requid into table l_tsx_seldr.
      endif.
    *1) Convert SIDs..., so that all parameter look like the old ones.
      call method io_convert->convert_sid2nosid
        EXPORTING
          it_sfc      = l_th_sfc
          it_sfk      = i_th_sfk
          it_seldr    = l_tsx_seldr
          it_data     = i_t_data
         IMPORTING
          et_sfc      = et_sfc
          et_sfk      = et_sfk
          et_range    = et_range
          et_rangetab = et_rangetab
            e_th_sfc    = l_th_sfc2.                 "nos070605
* Ignore the old hierarchy information:
      clear et_hier.
      delete et_range where chanm = '0REQUID'.
      delete table et_sfc with table key chanm = '0REQUID'.
    *2) Eliminate FISCPER7, from new strucutres:
*  lt_seldr = i_tsx_seldr. "nos131004
      e_th_sfc = l_th_sfc.
* the fiscper7 can be deleted completely from the SID-selection, because
* it is also treated within et_range...
      clear e_fieldname_fp7.
*    delete lt_seldr where chanm = cs_iobj_time-fiscper7. "nos131004
* Begin "nos131004
* Ensure that there is no gap in the seldr.
      data:
         ls_seldr   like line of lt_seldr
        ,l_fems_act like ls_seldr-fems
        ,l_fems_new like ls_seldr-fems.
      loop at l_tsx_seldr into ls_seldr
        where chanm ne cs_iobj_time-fiscper7.
        if ls_seldr-fems ne l_fems_act.
          l_fems_act = ls_seldr-fems.
          add 1 to l_fems_new.
        endif.
        ls_seldr-fems = l_fems_new.
        insert ls_seldr into table lt_seldr.
      endloop.
* end "nos131004
      e_th_sfc = l_th_sfc2.                                "nos070605
* Is fiscper7 in the query? (BCS always requires two fields)
      read table e_th_sfc with key chanm = cs_iobj_time-fiscper7
           into ls_th_sfc.
      if sy-subrc = 0.
* ==> YES
* --> change the SFC, so that FISCPER3 and FISCYEAR are requested.
* The table ET_RANGE also contains the selection for
* FISCPER3/FISCYEAR.
* But since E_FIELDNAME_FP7 is also transferred to BCS, the
* transformation of the data back to FISCPER7 is done on the BCS side.
        e_fieldname_fp7 = ls_th_sfc-KEYRETURNNM.
                                                "begin nos17060
        if e_fieldname_fp7 is initial.
          e_fieldname_fp7 = ls_th_sfc-sidRETURNNM.
          translate e_fieldname_fp7 using 'SK'.
        endif.
                                                "end nos17060
        delete table e_th_sfc from ls_th_sfc.
        ls_th_sfc-chanm       = cs_iobj_time-fiscper3.
        ls_th_sfc-keyreturnnm = ls_th_sfc-chanm.
        insert ls_th_sfc into table e_th_sfc.
        ls_th_sfc-chanm       = cs_iobj_time-fiscyear.
        ls_th_sfc-keyreturnnm = ls_th_sfc-chanm.
        insert ls_th_sfc into table e_th_sfc.
      endif.
* Store the SELDR in an XSTRING and unpack it just before selecting data
* in BW. It is not interpreted in BCS!
      export t_seldr  = lt_seldr
* Also store the SFC, because the BW systems might be on different releases/SPs.
             t_bw_sfc = e_th_sfc to data buffer ex_seldr compression on.
    endform.                    "convert_importing_parameter
*&      Form  get_datadescr
*       text
*      -->IT_DATA       text
*      -->ET_DATADESCR  text
    form get_datadescr
                  using it_data type any table
                   changing et_datadescr type t_datadescr.
      data: lr_data  type ref to data
          , lo_descr TYPE REF TO CL_ABAP_TYPEDESCR
          , lo_elemdescr TYPE REF TO CL_ABAP_elemDESCR
          , lo_structdescr TYPE REF TO CL_ABAP_structDESCR
          , lt_components  type abap_component_tab
          , ls_components  type abap_componentdescr
          , ls_datadescr type s_datadescr.
      field-symbols: <ls_data> type any
                   , <ls_components> type abap_compdescr.
      clear et_datadescr.
      create data lr_data like line of it_data.
      assign lr_data->* to <ls_data>.
      CALL METHOD CL_ABAP_STRUCTDESCR=>DESCRIBE_BY_DATA
        EXPORTING
          P_DATA      = <ls_data>
        RECEIVING
          P_DESCR_REF = lo_descr.
      lo_structdescr ?= lo_descr.
      CALL METHOD lo_structdescr->GET_COMPONENTS
        RECEIVING
          P_RESULT = lt_components.
      loop at lo_structdescr->components assigning <ls_components>.
        move-corresponding <ls_components> to ls_datadescr.
        if   ls_datadescr-type_kind = cl_abap_elemdescr=>typekind_char
          or ls_datadescr-type_kind = cl_abap_elemdescr=>typekind_num.
          read table lt_components with key name = <ls_components>-name
                                   into ls_components.
          if sy-subrc = 0.
            lo_elemdescr ?= ls_components-type.
    ls_datadescr-length = lo_elemdescr->output_length.
          endif.
        endif.
        append ls_datadescr to et_datadescr.
      endloop.
    endform.                    "get_datadescr
    Please share your inputs - I would appreciate it.
    Thanks

  • SEM-BCS Short Dump while executing Manual Posting Task

    Hi SEM-BCS Colleagues,
    I am having a peculiar problem related to the settings for Manual Postings. I have three scenarios for which I have defined manual document types and tasks: 1) Standardizing Entry (Data Collection), 2) Manual Document in IU for adjustment, and 3) COI group-level manual postings. I am only creating the document types and am not changing any settings for document field properties. I have configured a monthly consolidation frequency and period category, and I am posting document type 1 in local currency and types 2 and 3 in group currency, since they come after currency translation.
    Has any of you faced a similar problem? Please help me.

    Hello,
    I am also getting a short dump during data collection in BCS.
    We recently upgraded the system to SP20 - is it in any way related to the upgrade?
    I also checked for MYSELF as a source system, but did not find one.
    Could you please let me know how you solved this issue?
    short dump:
    Runtime Errors         PERFORM_TOO_MANY_PARAMETERS
    Exception              CX_SY_DYN_CALL_PARAM_NOT_FOUND
    Date and Time          04.12.2009 05:21:11
    Short text
         Too many parameters specified with PERFORM.
    What happened?
         In a subroutine call, there were more parameters than in the
         routine definition.
         Error in the ABAP Application Program
         The current ABAP program "SAPLRSDRI" had to be terminated because it has
         come across a statement that unfortunately cannot be executed.
    Error analysis
         An exception occurred that is explained in detail below.
         The exception, which is assigned to class 'CX_SY_DYN_CALL_PARAM_NOT_FOUND', was
          not caught in
         procedure "RSDRI_CUBE_WRITE_PACKAGE_RFC" "(FUNCTION)", nor was it propagated by
          a RAISING clause.
         Since the caller of the procedure could not have anticipated that the
         exception would occur, the current program is terminated.
         The reason for the exception is:
         A PERFORM was used to call the routine "CUBE_WRITE_PACKAGE" of the program
          "GPD1S8520HM8UV0U2XBDVPADF03".
         This routine contains 7 formal parameters, but the current call
         contains 10 actual parameters.
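    For what it's worth, the dump text itself just describes a mismatch between a FORM definition and the PERFORM call that reaches it. A minimal illustration (hypothetical report and form names, not the actual coding of SAPLRSDRI or the generated program) would be:
    REPORT zdemo_perform_mismatch.
    DATA: lv_1 TYPE i VALUE 1,
          lv_2 TYPE i VALUE 2,
          lv_3 TYPE i VALUE 3.
    * The FORM is defined with two formal parameters ...
    FORM write_pair USING iv_a TYPE i
                          iv_b TYPE i.
      WRITE: / iv_a, iv_b.
    ENDFORM.
    START-OF-SELECTION.
    * ... but is called dynamically with three actual parameters. Because the
    * call is only resolved at runtime, the syntax check cannot catch the
    * mismatch, and the program terminates with PERFORM_TOO_MANY_PARAMETERS
    * (exception class CX_SY_DYN_CALL_PARAM_NOT_FOUND).
      PERFORM ('WRITE_PAIR') IN PROGRAM ('ZDEMO_PERFORM_MISMATCH')
        USING lv_1 lv_2 lv_3.
    In the upgrade case above, the caller SAPLRSDRI and the generated program GPD1S8520HM8UV0U2XBDVPADF03 appear to be out of sync in the same way, so regenerating the affected objects is usually a better direction to investigate than custom code.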

  • SAP BPC Consolidations vs SEM-BCS

    Hi all,
    Has anyone compared the features/functionalities of SAP BPC Consolidation with SEM-BCS consolidation? What are the advantages of BPC Consolidation  over SEM-BCS, and vice-versa? For a brand new implementation, what is the preferred tool?
    Thanks,
    Sujoy Banerjee

    Hi Sujoy,
    In general if you are starting out with a new implementation, BPC is the recommended solution for several reasons:
    1) It's a unified solution offering both Consolidations and Planning
    2) It's designed to be owned and managed by business users, not IT
    3) It's integrated with the Enterprise Performance Management for Finance suite, which also includes profitability and strategy management
    If however you need support for complex consolidation scenarios, such as legal consolidations, you may want to consider BCS.
    You should be able to find several documents comparing the two solutions if you go to the Support Center (https://websmp104.sap-ag.de/support) and do a document search on BPC vs. BCS.
    Hope this helps.
    Chris Clay

  • Bad query performance - how to analyze it?

    Hi all,
    for the last 8 weeks we have been seeing bad query performance (roughly 30% worse than before) in our BW system. At the moment we use a BIA on revision 49 with 4 blades (16GB).
    I have already read note 1318214 and found that most of the time is spent in the virtual provider (over 80%!).
    I've seen that a lot of time is spent in the "Datamanager":
    For example: it takes 0.76s to select 3.5 million items in the relative provider and 78s!!! to select 0 items in the virtual provider.
    Information from RSDDSTATTREXSERV:
    RFC Server    BIA client    BIA Kernel    ABAP RFC
    497           464           450           619
    So it seems to be a problem on the BW side. What can we do to improve performance, or how can we analyze the query performance in more detail?
    Best Regards,
    Jens

    Hi Jens,
    A few checks you may consider doing.
    BIA Availability: check the BI connection with BIA.
    Check if you need to rebuild the BIA indexes. SAP recommends doing this regularly, to repair degenerated indexes or to delete indexes that are no longer referenced (e.g. data in the cube was compressed/deleted and the indexes are no longer needed).
    Check if a BIA reorganization is required - this is done to make sure the indexes are evenly distributed across the BIA landscape.
    Try to find out from the BI admin whether major administration work was done within these 8 weeks, e.g. copying a cube, dimension restructuring, copying data to a copy cube, archiving, etc.
    You can use the BIA monitor to perform checks and monitor alerts from the BIA servers:
    [ BIA monitor|http://help.sap.com/saphelp_nw70/helpdata/en/43/7719d270d81a6ee10000000a11466f/content.htm]
    This link describes the overall status of the BIA and any actions required.
    It also has sublinks to other important transactions for BIA monitoring and maintenance.
    To go to the BIA monitor: RSA1 ---> BIA monitor icon.
    Is your virtual provider reading data from R/3 or from BW?
    Generally virtual providers are used to read data from other systems, so in that case they would not have indexes in BIA - except for some applications like BCS, where you may be reading data from BW itself.
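    One way to narrow this down is to time a plain RSDRI read against the underlying basic cube with a selection similar to the query, and compare that with the Datamanager time you see for the virtual provider; if the direct read is fast, the time is being spent on the BCS side of the virtual provider rather than in BW itself. A rough sketch (the cube name, characteristic and key figure are placeholders to be replaced with your own, and the RSDRI call is reduced to its most common parameters):
    REPORT zdemo_time_rsdri_read.
    * Placeholder names - replace the cube, the characteristic and the key
    * figure with the ones from your own data model.
    CONSTANTS gc_cube TYPE rsinfoprov VALUE 'ZBCS_C10'.
    TYPES: BEGIN OF ty_row,
             company TYPE c LENGTH 6,              " alias COMPANY
             amount  TYPE p LENGTH 9 DECIMALS 2,   " alias AMOUNT
           END OF ty_row.
    DATA: lth_sfc  TYPE rsdri_th_sfc,
          ls_sfc   TYPE rsdri_s_sfc,
          lth_sfk  TYPE rsdri_th_sfk,
          ls_sfk   TYPE rsdri_s_sfk,
          lt_range TYPE rsdri_t_range,
          lt_data  TYPE STANDARD TABLE OF ty_row,
          lv_end   TYPE rs_bool,
          lv_first TYPE rs_bool VALUE 'X',
          lv_lines TYPE i,
          lv_rows  TYPE i,
          lv_t0    TYPE i,
          lv_t1    TYPE i.
    * Request one characteristic and one key figure, like the query would.
    ls_sfc-chanm    = '0COMPANY'.
    ls_sfc-chaalias = 'COMPANY'.
    INSERT ls_sfc INTO TABLE lth_sfc.
    ls_sfk-kyfnm    = '0CS_TRN_LC'.    " placeholder key figure
    ls_sfk-kyfalias = 'AMOUNT'.
    ls_sfk-aggr     = 'SUM'.
    INSERT ls_sfk INTO TABLE lth_sfk.
    GET RUN TIME FIELD lv_t0.
    WHILE lv_end IS INITIAL.
      CALL FUNCTION 'RSDRI_INFOPROV_READ'
        EXPORTING
          i_infoprov    = gc_cube
          i_th_sfc      = lth_sfc
          i_th_sfk      = lth_sfk
          i_t_range     = lt_range
          i_packagesize = 50000
        IMPORTING
          e_t_data      = lt_data
          e_end_of_data = lv_end
        CHANGING
          c_first_call  = lv_first
        EXCEPTIONS
          OTHERS        = 1.
      IF sy-subrc <> 0.
        WRITE: / 'RSDRI read failed, sy-subrc =', sy-subrc.
        EXIT.
      ENDIF.
      DESCRIBE TABLE lt_data LINES lv_lines.
      lv_rows = lv_rows + lv_lines.
    ENDWHILE.
    GET RUN TIME FIELD lv_t1.
    WRITE: / 'Rows read from basic cube:', lv_rows,
           / 'Read time (microseconds): ', lv_t1 - lv_t0.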
    Hope this helps
    Best regards,
    Sunmit.

  • BCS Task execution tracking

    Hello experts,
    I have a question regarding tracking task execution in BI-BCS.
    My client has a large structure of more than 100 companies. They all use the BCS solution on BI7 to load data, perform reclassifications, validations, balance carryforward, sign-offs, etc. during each reporting cycle.
    As time during reporting cycles is crucial, the Group reporting team would like to know the progress - they would like to track how many companies have executed each task:
    - how many loaded data,
    - how many ran reclassifications, etc.
    Does BCS have any mechanism to provide these statistics? If not, maybe it's possible to build a report (either BEx or an ABAP program) - but where are the task statuses stored? Any idea what the table name is where the status of each item from the task hierarchy is stored?
    Kind regards,
    Roman

    Hi Eugene,
    Many thanks for your reply!
    Yes, every company enters its own data and runs tasks through the Consolidation Monitor, and the status of each task is displayed for that particular company.
    However, as a Group controller, I would like to see a summarized progress report from all companies. Example - let's assume we have 108 companies in total. The report should show the number of companies that executed each task:
    Tasks:   |   Company count
    ---------|----------------
    Task 1   |   90
    Task 2   |   40
    Task 3   |   39
    or percentages (e.g. 80% of companies executed task 1, 35% executed task 2...),
    or another way - a BEx report with drilldown by company code for each task:
    Tasks:   |   Company
    ---------|----------
    Task 1   |   3001
             |   3003
             |   3014
             |   3015
             |   3020
    Task 2   |   3001
             |   3014
             |   3015
    Task 3   |   3001
    or anything else that would summarize where we are in terms of data preparation in the current reporting cycle.
    Any ideas?
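    To make it concrete: once the statuses can be read into an internal table (from whatever table or function module they live in - which is exactly what I'm asking about), the summary I have in mind is just a simple aggregation. A sketch with purely invented field names:
    REPORT zdemo_bcs_task_summary.
    * All names below (structure, fields, statuses) are made up for illustration;
    * the real source of the statuses is exactly the open question above.
    TYPES: BEGIN OF ty_status,
             task    TYPE c LENGTH 10,
             company TYPE c LENGTH 6,
             status  TYPE c LENGTH 1,        " 'X' = task executed
           END OF ty_status,
           BEGIN OF ty_summary,
             task TYPE c LENGTH 10,
             cnt  TYPE i,
           END OF ty_summary.
    DATA: lt_status  TYPE STANDARD TABLE OF ty_status,
          ls_status  TYPE ty_status,
          lt_summary TYPE STANDARD TABLE OF ty_summary,
          ls_summary TYPE ty_summary.
    * ... fill lt_status from the (unknown) status store here ...
    * One row per task/company is assumed; COLLECT then sums the counter per task.
    LOOP AT lt_status INTO ls_status WHERE status = 'X'.
      ls_summary-task = ls_status-task.
      ls_summary-cnt  = 1.
      COLLECT ls_summary INTO lt_summary.
    ENDLOOP.
    SORT lt_summary BY task.
    LOOP AT lt_summary INTO ls_summary.
      WRITE: / ls_summary-task, ls_summary-cnt.
    ENDLOOP.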

  • Group Currency In BCS

    HI,
    Our group currency is USD. There are some company codes whose company code currency is also USD.
    In such cases, how do we get the group currency field filled as USD for these company codes? Unless we fill the group currency, we cannot create consolidated reporting by group currency.
    BCS experts please let me know your inputs.
    Regards,
    Ram.

    Hi Dan,
    I don't agree with this paragraph of yours:
    "If a group currency is provided with data collection, as suggested by Eugene, the exchange rates are not always what is required for reporting (Generally Accepted Accounting Practice- GAAP). Thus translation task is used in this case as well."
    I mean, if I provide data for upload, say in a flat file, where I have
    -- the amount in local currency, which equals the group currency ($ZZZ)
    -- the amount in group currency ($ZZZ)
    then there is no exchange rate (let alone a wrong one) and no currency translation execution.
    Do you remember the system message after currency translation which says that not all consolidation units were translated?
    That's exactly our case. The system finds that the local currency equals the group currency and doesn't perform the CT.
    BUT, it does check whether both amounts are equal. Though I'm not sure that the check is made if the amount in GC is missing completely.
    Do you agree?
