Max data pull from Virtual Cube - is this a setting?

We have a user running a query against a remote cube in our BW system, and they're hitting a "maximum data" limit on this remote cube. Is this a setting for this cube or a global setting, and can it be modified?
Thanks,
Ken Little
RJ Reynolds Tobacco

Hi,
MAXSIZE = maximum size of an individual data packet in KB.
During data transfer to the Business Information Warehouse, the individual records are sent in packages of varying sizes. With these parameters you determine the maximum size of such a package, and therefore how much main memory may be used to create the data package. SAP recommends a data package size between 10 and 50 MB.
https://www.sdn.sap.com/irj/sdn/directforumsearch?threadid=&q=cube+size&objid=c4&daterange=all&numresults=15
MAXLINES = upper limit for the number of records per data packet.
The default setting is 'Max. lines' = 100000.
The maximum main memory requirement per data packet is approximately:
memory requirement = 2 * 'Max. lines' * 1000 bytes,
i.e. 200 MB with the default setting.
The formula for calculating the number of records in a data packet is:
packet size = MAXSIZE * 1000 / transfer structure size (ABAP length)
but not more than MAXLINES.
That is, if MAXLINES is smaller than the result of the formula, then only MAXLINES records per packet are transferred into BW. The size of the data packet is therefore the lower of MAXSIZE * 1000 / transfer structure size (ABAP length) and MAXLINES.
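For illustration, here is a small worked sketch of that calculation in ABAP; the MAXSIZE, structure length, and MAXLINES values are made-up examples, not system defaults:

REPORT z_packet_size_demo.
* Worked example of the packet-size formula above.
* All values are hypothetical examples, not system defaults.
DATA: maxsize_kb  TYPE i VALUE 20000,   " MAXSIZE in KB (= 20 MB)
      struct_len  TYPE i VALUE 500,     " transfer structure ABAP length in bytes
      maxlines    TYPE i VALUE 100000,  " MAXLINES upper limit
      packet_recs TYPE i,
      mem_bytes   TYPE i.

* packet size = MAXSIZE * 1000 / transfer structure size, capped at MAXLINES
packet_recs = maxsize_kb * 1000 / struct_len.  " 20000 * 1000 / 500 = 40000
IF packet_recs > maxlines.
  packet_recs = maxlines.                      " cap at MAXLINES
ENDIF.

* memory requirement = 2 * 'Max. lines' * 1000 bytes
mem_bytes = 2 * maxlines * 1000.               " 200,000,000 bytes = 200 MB

WRITE: / 'Records per data packet:', packet_recs,
       / 'Max. memory per packet (bytes):', mem_bytes.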
Go to transaction RSCUSTV6 and set it there.
Go to your InfoPackage; from the toolbar choose Scheduler -> Data packet settings. Here you can specify your data packet size.
Go to R/3, transaction SBIW --> General Settings --> Maintain Control Parameters for Data Transfer.
Here you can set the maximum values. The same values can then be reduced in BW:
InfoPackage -> Scheduler -> DataS. Default Data Transfer: here you can specify the size, but you can only reduce what was set on the R/3 side; you cannot increase it here.
In RSCUSTV6 you can set the package size. Press F1 on it for more information, and take a look at OSS Notes 409641 'Examples of packet size dependency on ROIDOCPRMS' and 417307 'Extractor Package Size Collective Note'.
Also Check SAP Note 919694.
This applies irrespective of the source system, i.e. it is applicable to all DataSources:
Go to SBIW -> General Settings -> Maintain Control Parameters for Data Transfer -> enter the entries in the table (see the sketch after these steps).
If you want to change it at the DataSource level:
IS -> IP -> Scheduler menu -> DataS. Default Data Transfer, and change the values.
Before changing the values, keep in mind the SAP recommended parameters.
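For reference, a minimal sketch of how you could check the currently maintained control parameters in the source system; SBIW stores them in table ROIDOCPRMS (the logical system name below is a made-up example, and the key field name SLOGSYS is an assumption to verify in SE11):

* Sketch: read the SBIW control parameters from ROIDOCPRMS.
* Assumptions: key field SLOGSYS, and 'R3PCLNT100' is a
* hypothetical logical system name.
DATA ls_prms TYPE roidocprms.
SELECT SINGLE * FROM roidocprms INTO ls_prms
  WHERE slogsys = 'R3PCLNT100'.
IF sy-subrc = 0.
  WRITE: / 'MAXSIZE (KB):', ls_prms-maxsize,
         / 'MAXLINES:', ls_prms-maxlines.
ENDIF.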
Hope this helps.
Best Regards,
VVenkat..

Similar Messages

  • Simple Max date pull from several rows with the same employee numbers.

    OK, so I'm hoping this is just flying over my head because it's almost 3am, but I haven't been able to figure this query out for the last two hours, and I like to figure stuff out on my own. I'm wondering why the following statement works fine:
    select emp_num, to_char(max(end_date)) as newestdate
    from bank_history
    where emp_num in ('22964', '21667', '20758', '12739', '12731', '20929', '22795', '20594', '23077', '12588', '21294', '20618', '21204', '22952', '19990', '20632', '03093', '19991', '22951', '07779', '20014', '11981', '06149', '20364', '21103')
    and bank_type = 'P' group by emp_num
    BUT! When I start to add more columns that I need to see, it starts returning multiple rows per employee number again.
    select emp_num, bank_type, earn_hrs, used_paid_hrs, used_unpaid_hrs, to_char(max(end_date)) as newestdate
    from bank_history
    where emp_num in ('22964', '21667', '20758', '12739', '12731', '20929', '22795', '20594', '23077', '12588', '21294', '20618', '21204', '22952', '19990', '20632', '03093', '19991', '22951', '07779', '20014', '11981', '06149', '20364', '21103')
    and bank_type = 'P' group by emp_num, bank_type, earn_hrs, used_paid_hrs, used_unpaid_hrs
    The original table looks something like this:
    EMP_NUM BANK_TYPE EARN_HRS USED_PAID_HRS END_DATE
    60393 P 0.58 0 3-Aug-2004
    60394 P 7.48 1 28-Oct-2003
    60394 P 40 40 28-Oct-2004
    60394 P 40 12.4 28-Oct-2005
    60395 P 40 40 21-Oct-2004
    60395 P 0 0 21-Oct-2003
    60395 P 40 40 21-Oct-2005
    60397 P 40 39.85 21-Oct-2004
    60397 P 0.97 0.97 21-Oct-2003
    Much thanks for any direction you can give to help guide me to a solution. If you can help it, please don't give me the answer, but try to give me the reason it's not working. I'll never learn if you give it to me right up front, lol. Thanks again.
    Luke
    Message was edited by: Luke22 (trying to fix the table formatting, sorry it's so unreadable)

    I'll just give you a hint: GROUP BY returns one row per distinct combination of all the grouped columns, so every extra column you add to the GROUP BY can split one employee back into several rows. To carry extra columns along with MAX(dt), aggregate them with KEEP (DENSE_RANK LAST ORDER BY dt), as the third example below shows:
    SQL> with t as (select 1 emp_num, 'A' bank, date '2007-08-01' dt from dual union all
      2             select 1 emp_num, 'B' bank, date '2007-08-02' dt from dual union all
      3             select 1 emp_num, 'C' bank, date '2007-08-03' dt from dual union all
      4             select 2 emp_num, 'A' bank, date '2007-08-05' dt from dual union all
      5             select 2 emp_num, 'B' bank, date '2007-08-04' dt from dual union all
      6             select 3 emp_num, 'A' bank, date '2007-08-07' dt from dual union all
      7             select 3 emp_num, 'C' bank, date '2007-08-08' dt from dual)
      8  --
      9             select emp_num, max(dt) from t
    10             group by emp_num
    11  /
       EMP_NUM MAX(DT)
             1 03.08.2007
             2 05.08.2007
             3 08.08.2007
    SQL>
    SQL> with t as (select 1 emp_num, 'A' bank, date '2007-08-01' dt from dual union all
      2             select 1 emp_num, 'B' bank, date '2007-08-02' dt from dual union all
      3             select 1 emp_num, 'C' bank, date '2007-08-03' dt from dual union all
      4             select 2 emp_num, 'A' bank, date '2007-08-05' dt from dual union all
      5             select 2 emp_num, 'B' bank, date '2007-08-04' dt from dual union all
      6             select 3 emp_num, 'A' bank, date '2007-08-07' dt from dual union all
      7             select 3 emp_num, 'C' bank, date '2007-08-08' dt from dual)
      8  --
      9             select emp_num, bank, max(dt) from t
    10             group by emp_num, bank
    11  /
       EMP_NUM BANK MAX(DT)
             1 A    01.08.2007
             1 B    02.08.2007
             1 C    03.08.2007
             3 A    07.08.2007
             2 A    05.08.2007
             2 B    04.08.2007
             3 C    08.08.2007
    7 rows selected
    SQL>
    SQL> with t as (select 1 emp_num, 'A' bank, date '2007-08-01' dt from dual union all
      2             select 1 emp_num, 'B' bank, date '2007-08-02' dt from dual union all
      3             select 1 emp_num, 'C' bank, date '2007-08-03' dt from dual union all
      4             select 2 emp_num, 'A' bank, date '2007-08-05' dt from dual union all
      5             select 2 emp_num, 'B' bank, date '2007-08-04' dt from dual union all
      6             select 3 emp_num, 'A' bank, date '2007-08-07' dt from dual union all
      7             select 3 emp_num, 'C' bank, date '2007-08-08' dt from dual)
      8  --
      9             select emp_num, max(bank) keep (dense_rank last order by dt) bank, max(dt) from t
    10             group by emp_num
    11  /
       EMP_NUM BANK MAX(DT)
             1 C    03.08.2007
             2 A    05.08.2007
             3 C    08.08.2007

  • Can i get the data file from the cube

    Hi All,
    Previously I created and activated one cube using one data file, and by mistake I lost that data file. Is there any chance to get that data file back from the cube?
    Thanks
    Bhaskar

    Hi Paul,
    Yes, you can:
    1) If you have loaded through the PSA, then go to that PSA table and, from the Settings menu > Change display variants > View tab, choose "View as Excel", then save this file.
    2) If you have not used the PSA, then use LISTCUBE to view the data, change the settings to Excel, and download.
    Don't forget to assign points if it is useful.
    Further queries are always welcome.
    Regards.
    Pradeep choudhari

  • A/P & Consignment data pull to Planning Cube.....

    I put this in "BC & Extractors" section, but did not get any response.....
    Need a couple of clarifications (brief background is provided):
    1. We do DELTA extractions using 0FI_AP_3 and 0FI_AR_4 to a base cube which includes both 'Open' and 'Cleared' items. From this cube we do a DELTA load to a Planning Cube on a DAILY basis. We filter the data on Company Code and 'Item Status' (only OPEN items) for planning purposes. The key figure pulled to the Planning Cube is '0DEB_CRE_DC'. The base cube has data at line-item document level, whereas the planning cube is only at Customer & Vendor level.
    The question is: will this daily DELTA to the Planning cube with a filter on open items bring the correct key figure? For example, if the status of one of the items pulled as OPEN in an earlier delta changes to 'CLEARED' the subsequent day, will the key figure reflect the change and show the correct value? Is there any impact on the key figure due to the filter on open items?
    My thinking is that the key figure 0DEB_CRE_DC will reflect the correct (adjusted) number since it is a summated key figure.
    I could be totally wrong and want to understand this with comments from experts.
    2. Has anyone worked with pulling Consignment (SMI) data into a Planning cube? We pull this data using a generic extractor from the SAP table RKWA with some logic. Since I pull A/P data, it contains both SMI and non-SMI transactions. To avoid double counting, how can I avoid bringing in the SMI transactions from A/P? There is no flag to differentiate the SMI / non-SMI transactions...
    I am hoping someone has worked in these areas and can offer expert comments/recommendations.
    Thanks.. Shaun
    Please post one question only once
    Edited by: Vikram Srivastava on Aug 13, 2010 1:02 PM

    Hello Lee,
    For your first question: as you are using a filter on the status of the record,
    let's say today the status is Open, and as part of the filter the delta will pick this record up because the status is Open; but if the status is changed to Closed tomorrow, this record will be ignored completely, and hence in the next-level target you will not have the right status at all.
    So you need to plan your filters accordingly.
    Hope this helps.
    Thanks
    Murali

  • Query Data cached? (Virtual Cube)

    Hi folks,
    I have some problems with a Query object which gets data out of a virtual cube. The virtual cube is based on a 3.x InfoSource and gets data out of a table in the ERP system.
    When I call the query from VC, everything works fine and current data is shown. But if I manipulate data in the table which the virtual cube points to and send a refresh event to the Query object, the manipulated data is not shown. It always returns the data fetched at the first call. If I refresh the whole application in the browser (via F5), the manipulated data is shown. I disabled the cache mode in RSRT for this query, but it doesn't work.
    Any chance to get the current data by just sending a refresh action and calling the Query object again, without reloading the whole application? Any ideas?
    Points will be awarded for useful information.

    Hello,
    The reason why the manipulated data does not show up in the query even after a refresh is sent is that the cache for the virtual provider does not get reset as it would for a normal InfoCube.
    So it does not know when to reset its cache, even when data is manipulated.
    The way we have worked around this is to have a process chain which runs on a frequent basis and executes the function module RSDMD_SET_DTA_TIMESTAMP for the virtual cube in question, as sketched below.
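    A minimal sketch of such a step, assuming the function module takes the InfoProvider name in a parameter called I_DTA (that parameter name is an assumption; check the actual interface of RSDMD_SET_DTA_TIMESTAMP in SE37 before using it):
    REPORT z_reset_vcube_timestamp.
    * Sketch only: bump the data timestamp of the VirtualProvider so
    * its OLAP cache entries are invalidated on the next query call.
    * Assumptions: parameter name I_DTA, and 'ZVCUBE01' is a made-up
    * VirtualProvider name.
    CALL FUNCTION 'RSDMD_SET_DTA_TIMESTAMP'
      EXPORTING
        i_dta = 'ZVCUBE01'.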
    Thanks
    Dharma.

  • Automate data load from BPC cube to BW cube

    Hi Gurus,
    I've got all my budgeting & forecasting data in a BPC cube. Now I need to load it to a BW cube, combine it with actuals in another BW cube through a MultiProvider, and build reports on the MultiProvider.
    My question is:
    What is the best way to automate the loading of BPC cube data to a BW cube?
    I should also be able to load the property values of the BPC dimensions into the BW InfoObjects.
    The methods I followed are:
    1. Run "Export" data package and load BPC data to a CSV file and run BW DTP/infopackage to process the CSV file into BW cube. Problem with this is - I canot automate these two steps, and even if I did, I cannot export property values to a flat file.
    2. Build transformations directly from BPC cube to an Infosource and from Infosource to BW cube. Problem with this is - in the transformations I cannot use the rule: "Read Master Data". I may have to write a routine, but my ABAP is not good enuf.
    Please help with an alternative solution
    Thanks,
    Venkat

    Thanks for the reply. I know I would have more options if I were on BPC 7.5 NW, but I'm on BPC 7.0 NW.
    I managed to pull the attribute values of the BPC dimensions using routines. But it's still risky, because the technical names of BPC objects may change if one modifies the dimensions or runs an optimization.
    It's a shame SAP hasn't provided a robust solution to load BPC cube data into a BW cube.
    I don't want to load BW cube data into the BPC cube and depend on EVDRE reports for my 'Plan vs Actual' reports. I (and the end users) always want to lean towards BEx reports.

  • Data source to Virtual Cube

    Hello friends,
    I have a virtual cube with services, based on a function module.
    How can I find out where the data coming into this virtual cube comes from?
    I would like to see the source data that is coming into this virtual cube.
    Thanks
    Tony

    Hi Tony,
    For a VirtualProvider with services, there is a function module in the BW system itself. All the connection and selection of data is done in that FM, so you do not have a separate DataSource for it. Just look into the FM for the logic of the connection and extraction; the rough shape of such an FM is sketched below.
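    For orientation, a service FM of this kind usually has a shape roughly like this sketch (parameter names follow the RSDRI-style template used by standard content FMs such as RS_BCT_FIGL_DATA_GET; treat them as indicative and compare with your actual FM in SE37):
    FUNCTION z_vcube_data_get.
    *"  IMPORTING
    *"    VALUE(I_INFOPROV)    TYPE RSINFOPROV
    *"    VALUE(I_TH_SFC)      TYPE RSDRI_TH_SFC    " requested characteristics
    *"    VALUE(I_TH_SFK)      TYPE RSDRI_TH_SFK    " requested key figures
    *"    VALUE(I_T_RANGE)     TYPE RSDRI_T_RANGE   " selections from the query
    *"    VALUE(I_FIRST_CALL)  TYPE RS_BOOL
    *"    VALUE(I_PACKAGESIZE) TYPE I
    *"  EXPORTING
    *"    VALUE(E_T_DATA)      TYPE STANDARD TABLE  " data returned to OLAP
    *"    VALUE(E_END_OF_DATA) TYPE RS_BOOL
    *"    VALUE(E_T_MSG)       TYPE RS_T_MSG
    * Inside, the FM typically reads the source (e.g. an ERP table via RFC),
    * maps the rows to the cube's characteristics and key figures, and
    * fills E_T_DATA package by package.
    ENDFUNCTION.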
    Thanks and Regards
    Subray Hegde

  • Data transfer from G4 cube to MacBook Pro w/ Snow Leopard

    Hi,
    I just purchased a MacBook Pro on Friday and was able to successfully transfer my data from my Cube (OS 10.4.11) via Migration Assistant after a second attempt. Unfortunately, that MacBook encountered constant prohibitory-sign screen errors even after re-installing the Leopard OS, requiring me to exchange it. Now, with the second new MBP, I can get the FireWire connection established, with the Cube HD mounting, but the MBP just spins as it tries to get to the next window (selecting the data to transfer over). The first time I tried it with the second MBP, I got a message that there was no OS on the Cube. The Cube works perfectly fine. The multiple later attempts produced nothing but a spinning circle. (P.S. Apple already shipped off the first MBP to be reset, so retrieving the data from it is not an option.)
    I looked into transferring via Ethernet, but I do not have the install DVD that is required for it (unless I'm misreading it). Any suggestions? From reading the other queries, users seem to have issues getting the connection established. I am able to do so and was able to transfer the data once before, but now nothing.
    I'm completely clueless re: the other suggestions of creating networks, encasing the HD, etc.
    If all else fails, what can I do with my Cube? I don't want to sell it and ideally would like to use it as a separate drive.
    Thanks,
    Christina

    One of the bigger problems with migrating from PowerPC to Intel is not the operating system, but the chip. Migration Assistant has been known to fail for unknown reasons when moving between the Mac platforms. Instead, this user tip has been recommended to avoid such issues:
    http://discussions.apple.com/thread.jspa?threadID=435350&tstart=0

  • Short Dump while Complete deletion of data Contents from a Cube.

    Hello Experts,
    I am facing a runtime short dump whenever I attempt to delete data from a cube, as shown below:
    Runtime Errors         MESSAGE_TYPE_X
    Error analysis
        Short text of error message:
    Data request to the OLTP
        Long text of error message:
        Technical information about t
        Message class....... "RSM"
        Number.............. 000
        Variable 1.......... " "
        Variable 2.......... " "
        Variable 3.......... " "
        Variable 4.......... " "
    How to correct the error
        Probably the only way to eliminate the error is to correct the program.
        If the error occurs in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "MESSAGE_TYPE_X" " "
        "SAPLRSM1" or "LRSM1U43"
        "RSSM_UPDATE_RSBKREQUEST"
    Information on where terminated
        Termination occurred in the ABAP program "SAPLRSM1" - in
         "RSSM_UPDATE_RSBKREQUEST".
        The main program was "RSAWBN_START ".
        In the source code you have the termination point in line 117
        of the (Include) program "LRSM1U43".
    Line  SourceCde
       87   call function 'RSSTATMAN_GET_TYPE_FOR_DTA'
       88     exporting
       89       i_dta  = l_dta
       90     importing
       91       e_type = l_dta_type
       92     exceptions
       93       error  = 1
       94       others = 2.
       95   if sy-subrc <> 0.
       96     message id sy-msgid type sy-msgty number sy-msgno
       97             with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
       98   endif.
       99   call function 'RSSTATMAN_DELETE_STATUS'
      100     exporting
      101       i_dta      = l_dta
      102       i_dta_type = l_dta_type
      103       i_process  = 'ALL'.
      104
      105   sort l_t_iccont by rnsidlast ascending.
      106   loop at l_t_iccont.
      107     data: l_s_reqdone like rsreqdone.
      108     select single * from rsreqdone into l_s_reqdone where
      109            rnr = l_t_iccont-rnr.
      110     if sy-subrc <> 0.
      111       read table l_h_dtpreq with key
      112            requid = l_t_iccont-rnsidlast.
      113       if sy-subrc = 0 and
      114          ( l_h_dtpreq-ustate = '0' or l_h_dtpreq-tstate = '0' ).
      115         clear l_s_reqdone-archived.
      116       else.
    >>>>>         message x000.
       118       endif.
      119     endif.
      120     if l_s_reqdone-archived <> rs_c_false.
      121       call function 'RSREQARCH_RELOAD_REQUEST'
      122         exporting
      123           i_rnr         = l_t_iccont-rnr.
      124     endif.
      125     select single * from rsbkrequest into l_s_req where
      126            requid = l_t_iccont-rnsidlast.
      127     if sy-subrc <> 0.
      128       continue.
      129     endif.
      130
      131     read table l_h_rsbkdtp with table key
      132          dtp = l_s_req-dtp.
      133     if sy-subrc <> 0.
      134       l_h_rsbkdtp-dtp = l_s_req-dtp.
      135       insert table l_h_rsbkdtp.
      136     endif.
    This issue only exists in the Production system; in QA and Development the cube deletion works fine.
    So far I have tried deleting the indexes and then deleting the cube contents, and I ran elementary checks from RSRV, but have not found any lead yet. I am also unable to find appropriate SAP Notes for this issue. I am using SAP_BW component 700 with SP26.
    Any suggestion  would be of great help .
    Thanks & Regards
    Aman

    Hi Ap_SAP & Arvind,
    Any recent upgrades or system refresh done?
    In recent update just from SP24 to SP26 was updated and a DB stats Referesh was carried on.But this proublem existed before these activities.
    Except deleting can you able to perform all other operations like loading data..etc?
    Other than deletion I am able to do load data into cube extract from R3 and also i can delete data from PSA.Only deletion giving dump.
    You are trying to delete transactional or master data.
    I am tring for transaction data.
    WHen I try to delete request by request it also lead to the same short dump.
    @ arvind : I checked Sm12 there i see not table which has been locked.
    Thanks
    Aman

  • Data archiving from a cube to another cube

    Hi All,
    We need to archive a huge volume of data from a cube to some other cube, perhaps like a backup cube.
    We have come across archiving to a file, but we need to archive to another cube.
    Kindly suggest steps or a way to execute this.
    Thanks,
    Dinesh

    Hi Dinesh,
    As you are looking for a backup of data in another target,
    you can collect the below information first, before loading the data from the main target to the backup target:
    1. Data volume present in the main target, in terms of number of years.
    2. Kind of application.
    3. Whether all the data from the lower-level targets has been updated to this main target or not.
    Once everything is there, you can split the data into multiple loads based on calendar year or fiscal year period. That way you run multiple loads to the backup target instead of pushing all the records in one load, which would take a long time and strain the system resources.
    So perform multiple loads with different selections, based on the availability of the system's processors.
    Perform this once all the selections are decided.
    Hope this helps.
    Murali

  • Inventory data load from Inventory Cube to another Copy Cube

    Hello Experts,
    I am trying to load inventory data from the inventory cube (say YCINV) to a copy cube, say YCOPY_CINV (a copy of the inventory cube), but the results appear inconsistent when I compare the reports on these two cubes. I am trying to populate a field in the copy cube so that I can populate the same data back to the original cube with the new field in it; I am doing this reload back and forth for historical data purposes only.
    I have seen a lot of posts on how to run the setups for inventory data, but my case does not need setup runs.
    Does note 1426533 solve the issue of loading from one cube to another? We are on SAP BI 7.01 with SP06, but the note specifies SP07.
    I have tried note 375098 to see if it works, but I do not see the options mentioned from the step "Using DTP (BW 7.x)" onwards in BI to perform this note.
    Please advise on whether to go with implementing note 1426533, or whether there is any other way to load inventory data from one cube to the other.
    Regards,
    JB

    Hi Luis,
    Thanks for your reply,
    I do not see any setting like "Initial stock" in the DTP, except "initial non-cumulative for non-cumulative". I did try using the option "initial non-cumulative for non-cumulative", but the results still do not match the inventory cube data. I do not see the check box for the marker in the copy cube (under the Rollup tab). Please let me know if we can really implement this solution, i.e. copying from the inventory cube to the copy cube and then re-loading it back to the inventory cube for historical data. Currently, I am comparing the queries on these two cubes; if the data matches then I can go ahead and implement it in Production, otherwise it would not be wise to do so.
    Regards,
    JB

  • Data pulling from R3 to BW server in BI7.0

    Hi all,
    I am very new to BI 7.0, but I know the concept of pulling data in BW 3.5.
    I created a generic DataSource and replicated it into BI 7.0. Now I am confused: on the BI server, how do I pull the data from the R/3 server? Every screen differs from BW 3.5. Can anybody show me the way data is pulled in BI 7.0, or give me some idea about the BI 7.0 data loading steps?
    Thanks and Regards.
    Ankit modi.

    Hi Ankit,
      As of BI 7.0 there were several changes in the data loading process.
      First of all, you need to replicate the DataSource (here you can choose between a 3.5 and a 7.0 replication; I recommend the latter).
      You are no longer obliged to create an InfoSource.
      Then you have to connect your InfoProvider (DSO, InfoCube, etc.) to the DataSource. This is done with a Transformation (the symbol is similar to the previous Update Rules, but without the small square/dot in the graphic). Once you have created the Transformation, you connect the fields of the DataSource to the InfoObjects of your InfoProvider.
      Finally, you'll create an InfoPackage to load data from your R/3 system into the PSA, and then a DTP to load the data from the PSA into the InfoProvider.
    Hope it helps.
    Kind regards,
    Germán.-

  • Error 8 during data transfer from PSA - Cube

    Hello,
    when I want to load data from the PSA to the cube, I get "error 8 in update".
    I have already seen error 7 or error 4 in the past, but what does this error 8 mean, and how can I solve it?
    Thanks in advance

    The reason is that you are trying to load data records to the cube for an interval that has already been archived. Please note that until the archives are reloaded back, the same interval cannot be loaded.
    The workaround in case of a semantic lock is to reload the data from the archive -> the semantic lock disappears -> load/update the data -> archive the data again.

  • Read Data from "virtual" Cube with different ConsUnit Hier. Version

    Dear all,
    I've got an odd request.
    I need to load data from the virtual Reporting BAPI cube for a certain data version.
    This version was attached to a new consolidation unit hierarchy a few weeks ago.
    Now we get a request to read past data for units which are no longer present in the current hierarchy.
    I know that this is a general issue... but does anyone have a workaround for it?
    Thanks in advance,
    regards
    Oliver

    Dear All,
    The restatement functionality did not help with this issue.
    What we did was make a copy of the version and attach the "old" hierarchy to it.
    This was the only solution.
    Regards
    Oliver

  • How to extract data from virtual cube..?

    Gurus,
    How can I read data from a virtual InfoCube through ABAP code? Is there any FM that I can use?
    Kindly help me with this...
    It's really urgent.
    Thanks
    Sam

    Subray,
    Thanks for the reply. I have created a wrapper for this FM in the same way the DEMO program shows, but it still doesn't work.
    No data is returned in the table.
    I also tried using the FM RSDRI_INFOPROV_READ_RFC; it does some processing but does not return the results.
    I am attaching my code... Can you please help me with it?
    TYPES:
      BEGIN OF gt_s_data,
      cs_version(3) TYPE c,
      cs_chart(2) TYPE c,
      bcs_llob(4) TYPE c,
      bcs_lcus(3) TYPE c,
      bcs_ldch(2) TYPE c,
      bcs_lprg(5) TYPE c,
      curkey_lc TYPE /BI0/OICURKEY_LC,
      curkey_tc TYPE /BI0/OICURKEY_TC,
      bcs_lmay TYPE /BIC/OIBCS_LMAY,
      figlxref3 TYPE /BIC/OIFIGLXREF3,
      unit TYPE /BI0/OIUNIT,
      CS_TRN_LC TYPE /BI0/OICURKEY_GC,
      CS_TRN_TC TYPE /BI0/OICURKEY_GC,
      CS_TRN_QTY TYPE /BI0/OIUNIT,
      END OF gt_s_data.
      DATA:
      l_msg_text TYPE string,
      g_s_sfc    TYPE rsdri_s_sfc,
      g_th_sfc   TYPE rsdri_th_sfc,
      g_s_sfk         TYPE rsdri_s_sfk,
      g_th_sfk        TYPE rsdri_th_sfk,
      g_s_range       TYPE rsdri_s_range,
      g_t_range       TYPE rsdri_t_range.
      DATA:
      g_s_data        TYPE gt_s_data,
      g_t_data        TYPE STANDARD TABLE OF gt_s_data,
      g_t_rfcdata     TYPE rsdri_t_rfcdata,
      g_t_sfc         TYPE rsdri_t_sfc,
      g_t_sfk         TYPE rsdri_t_sfk,
      g_t_field       TYPE rsdp0_t_field.
      DATA:
      g_first_call   TYPE rs_bool.
      g_first_call  = rs_c_true.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = '0CS_VERSION'.
    g_s_sfc-chaalias = 'CS_VERSION'. " uppercase, consistent with the other aliases
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = '0CS_CHART'.
    g_s_sfc-chaalias = 'CS_CHART'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = 'BCS_LLOB'.
    g_s_sfc-chaalias = 'BCS_LLOB'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = 'BCS_LCUS'.
    g_s_sfc-chaalias = 'BCS_LCUS'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = 'BCS_LDCH'.
    g_s_sfc-chaalias = 'BCS_LDCH'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = 'BCS_LPRG'.
    g_s_sfc-chaalias = 'BCS_LPRG'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = '0CURKEY_GC'.
    g_s_sfc-chaalias = 'CURKEY_LC'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = '0CURKEY_GC'.
    g_s_sfc-chaalias = 'CURKEY_TC'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = 'BCS_LMAY'.
    g_s_sfc-chaalias = 'BCS_LMAY'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = 'FIGLXREF3'.
    g_s_sfc-chaalias = 'FIGLXREF3'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    CLEAR g_s_sfc.
    g_s_sfc-chanm    = '0UNIT'.
    g_s_sfc-chaalias = 'UNIT'.
    g_s_sfc-orderby  = 0.
    INSERT g_s_sfc INTO TABLE g_th_sfc.
    ***Fill up the key figures data.
    CLEAR g_s_sfk.
    g_s_sfk-kyfnm    = '0CS_TRN_GC'.
    g_s_sfk-kyfalias = 'CS_TRN_LC'.
    g_s_sfk-aggr     = 'SUM'.
    INSERT g_s_sfk INTO TABLE g_th_sfk.
    CLEAR g_s_sfk.
    g_s_sfk-kyfnm    = '0CS_TRN_GC'.
    g_s_sfk-kyfalias = 'CS_TRN_TC'.
    g_s_sfk-aggr     = 'SUM'.
    INSERT g_s_sfk INTO TABLE g_th_sfk.
    CLEAR g_s_sfk.
    g_s_sfk-kyfnm    = '0CS_TRN_QTY'.
    g_s_sfk-kyfalias = 'CS_TRN_QTY'.
    g_s_sfk-aggr     = 'SUM'.
    INSERT g_s_sfk INTO TABLE g_th_sfk.
    * Fill up selection criteria.
    CLEAR g_s_range.
    g_s_range-chanm    = '0CS_VERSION'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = '100'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = 'BCS_VERS'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = 'ACT'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = '0CS_CHART'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = 'ZG'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = '0CO_AREA'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = 'AZ01'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = '0FISCVARNT'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = 'K2'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = '0FISCYEAR'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = '2007'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = '0FISCPER3'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = '003'.
    APPEND g_s_range TO g_t_range.
    CLEAR g_s_range.
    g_s_range-chanm    = '0SEM_CGCOMP'.
    g_s_range-sign     = rs_c_range_sign-including.
    g_s_range-compop   = rs_c_range_opt-equal.
    g_s_range-low      = 'US0075'.
    APPEND g_s_range TO g_t_range.
    g_t_sfc = g_th_sfc.
    g_t_sfk = g_th_sfk.
    CALL FUNCTION 'RSDRI_INFOPROV_READ_RFC'
      EXPORTING
        I_INFOPROV             = 'BCS_C1V11'
        I_REFERENCE_DATE       = SY-DATUM
        I_SAVE_IN_TABLE        = ' '
    *   I_TABLENAME            =
        I_SAVE_IN_FILE         = ' '
    *   I_FILENAME             =
        I_AUTHORITY_CHECK      = RSDRC_C_AUTHCHK-READ
        I_CURRENCY_CONVERSION  = 'X'
    *   I_S_RFCMODE            =
        I_MAXROWS              = 0
        I_USE_DB_AGGREGATION   = RS_C_TRUE
        I_USE_AGGREGATES       = RS_C_TRUE
        I_ROLLUP_ONLY          = RS_C_TRUE
        I_READ_ODS_DELTA       = RS_C_FALSE
        I_RESULTTYPE           = ' '
        I_DEBUG                = RS_C_FALSE
    * IMPORTING
    *   E_END_OF_DATA          =
    *   E_AGGREGATE            =
    *   E_RFCDATA_UC           =
    *   E_SPLIT_OCCURRED       =
      TABLES
        I_T_SFC                = g_t_sfc
        I_T_SFK                = g_t_sfk
        I_T_RANGE              = g_t_range
    *   I_T_TABLESEL           =
    *   I_T_RTIME              =
    *   I_T_REQUID             =
        E_T_RFCDATA            = g_t_rfcdata
    *   E_T_RFCDATAV           =
        E_T_FIELD              = g_t_field
      EXCEPTIONS
        ILLEGAL_INPUT          = 1
        ILLEGAL_INPUT_SFC      = 2
        ILLEGAL_INPUT_SFK      = 3
        ILLEGAL_INPUT_RANGE    = 4
        ILLEGAL_INPUT_TABLESEL = 5
        NO_AUTHORIZATION       = 6
        GENERATION_ERROR       = 7
        ILLEGAL_DOWNLOAD       = 8
        ILLEGAL_TABLENAME      = 9
        ILLEGAL_RESULTTYPE     = 10
        X_MESSAGE              = 11
        DATA_OVERFLOW          = 12
        OTHERS                 = 13.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    CALL FUNCTION 'RSDRI_DATA_UNWRAP'
      EXPORTING
        i_t_rfcdata = g_t_rfcdata
      CHANGING
        c_t_data    = g_t_data.
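    After the unwrap, a quick sanity check of the result table can confirm whether any rows came back at all (a minimal sketch; the field names come from the gt_s_data structure above):
    * Sketch: inspect the unwrapped result.
    DATA l_rows TYPE i.
    l_rows = lines( g_t_data ).
    WRITE: / 'Rows returned:', l_rows.
    LOOP AT g_t_data INTO g_s_data.
      WRITE: / g_s_data-cs_version, g_s_data-bcs_llob, g_s_data-cs_trn_lc.
    ENDLOOP.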
