Enhancement of 0FI_AP_30 doesn't work on virtual cube 0FIAP_R30

Hi Gurus,
I have made an enhancement for the extractors 0FI_AP_30 and 0FI_AP_3.
I added the purchasing document and item to the structure DTFIAP_3 and everything works fine.
After that I added the following code to the enhancement to fill the WBS element during extraction.
It works for the extraction into the ODS 0FIAP_O03 and the cube 0FIAP_C30, but not with the virtual cube 0FIAP_R30.
Thanks for your help,
Franck
* Assumed local declarations (not shown in the original post); the work
* area type is the enhanced extract structure DTFIAP_3.
DATA: LT_EKKN  TYPE STANDARD TABLE OF EKKN,
      LS_EKKN  TYPE EKKN,
      LT_PRPS  TYPE STANDARD TABLE OF PRPS,
      LS_PRPS  TYPE PRPS,
      LS_DATA  TYPE DTFIAP_3,
      LV_TABIX TYPE SY-TABIX.

CASE I_DATASOURCE.
  WHEN '0FI_AP_3' OR '0FI_AP_30'.
* Check whether the accounting document line is linked to a purchase
* order and PO item. If so, look up the corresponding WBS element.
    SELECT * FROM EKKN INTO TABLE LT_EKKN.
    SELECT * FROM PRPS INTO TABLE LT_PRPS.
    LOOP AT C_T_DATA INTO LS_DATA.
      LV_TABIX = SY-TABIX.
      IF LS_DATA-EBELN IS NOT INITIAL AND LS_DATA-EBELP IS NOT INITIAL.
*       Account assignment of the purchase order item
        READ TABLE LT_EKKN INTO LS_EKKN
                       WITH KEY EBELN = LS_DATA-EBELN
                                EBELP = LS_DATA-EBELP.
        IF SY-SUBRC = 0.
*         WBS element (external key POSID) for the internal number
          READ TABLE LT_PRPS INTO LS_PRPS
                         WITH KEY PSPNR = LS_EKKN-PS_PSP_PNR.
          IF SY-SUBRC = 0.
            LS_DATA-POSID = LS_PRPS-POSID.
            MODIFY C_T_DATA FROM LS_DATA INDEX LV_TABIX.
          ENDIF.
        ENDIF.
      ENDIF.
    ENDLOOP.
ENDCASE.
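
A side note on the lookup logic above: selecting EKKN and PRPS completely can become expensive on larger systems. Below is a minimal sketch of a more selective variant, assuming the same declarations and appended fields (EBELN, EBELP, POSID) as in the snippet above; it restricts both reads to the documents of the current data package.

* Hypothetical variant: read only the account assignments and WBS
* elements that are actually referenced by this data package.
  DATA: LT_KEYS TYPE STANDARD TABLE OF EKKN,
        LS_KEY  TYPE EKKN.

* Collect the purchase order keys of the package
  LOOP AT C_T_DATA INTO LS_DATA.
    IF LS_DATA-EBELN IS NOT INITIAL AND LS_DATA-EBELP IS NOT INITIAL.
      LS_KEY-EBELN = LS_DATA-EBELN.
      LS_KEY-EBELP = LS_DATA-EBELP.
      APPEND LS_KEY TO LT_KEYS.
    ENDIF.
  ENDLOOP.
  SORT LT_KEYS BY EBELN EBELP.
  DELETE ADJACENT DUPLICATES FROM LT_KEYS COMPARING EBELN EBELP.

  IF LT_KEYS IS NOT INITIAL.
*   Account assignments only for the purchase orders in this package
    SELECT EBELN EBELP PS_PSP_PNR
      FROM EKKN
      INTO CORRESPONDING FIELDS OF TABLE LT_EKKN
      FOR ALL ENTRIES IN LT_KEYS
      WHERE EBELN = LT_KEYS-EBELN
        AND EBELP = LT_KEYS-EBELP.

    IF LT_EKKN IS NOT INITIAL.
*     WBS master data only for the referenced WBS elements
      SELECT PSPNR POSID
        FROM PRPS
        INTO CORRESPONDING FIELDS OF TABLE LT_PRPS
        FOR ALL ENTRIES IN LT_EKKN
        WHERE PSPNR = LT_EKKN-PS_PSP_PNR.
    ENDIF.
  ENDIF.

The READ TABLE lookups in the loop can then stay exactly as posted; sorting the two tables and using BINARY SEARCH is a further optional refinement.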

Hi,
Check the link below:
Loading MultiProviders 0FIAP_M20 and 0FIAP_M30 (EhP3)
For direct update there is no need to extract separately; the virtual cube just reads the data at query time based on the extract structure.
Regards,
Satya

Similar Messages

  • Report on Virtual cube is not working.

    Hi,
    I created a report on a virtual cube. The virtual cube is based on a function module which picks its data from a MultiProvider; in the FM the MultiProvider name is passed via the import parameter.
    The MultiProvider is built on an SPO, and a BIA index has been created on the SPO. If I deactivate the BIA on the SPO, the report built on the virtual cube works in the portal; if I activate the BIA, the report does not work in the portal.
    The report works fine in RSRT and in the Analyzer, whether the BIA is active or inactive.
    Regards
    GK

    Hi
    The MultiCube you created must comprise InfoCubes, DSOs and InfoObjects. Now, is the error you are getting for
    - a specific characteristic, or
    - any characteristic that is entered fourth in the series of filter selections, or
    - the fourth value of a specific characteristic?
    Are the characteristic and the value which you use in the filter present in all the underlying objects included in the MultiCube? You can check this in each of the objects associated with the MultiCube independently. This will give you an idea as to whether the error reported by the system is genuine or not.
    Cheers
    Umesh

  • How to Virtual cube with services works

    Hi,
    How does a virtual cube with services work?
    Can anyone provide me with a real-time scenario?
    If possible, provide some code.
    Thanks,
    cheta.

    For which functionality are you trying to create a virtual cube with services?
    This is mostly used in SEM-BCS.
    You create the cube along the same path as a normal cube, but in the properties you select "virtual cube with services".
    There is a standard FM which connects the virtual cube to the required cube in BCS (this is done by the BCS data basis generation function), so we do not have to write the FM ourselves.
    In BCS we use the virtual cube only for reporting purposes, since the standard BCS cube cannot be used for reporting. Hence we create another virtual cube very similar to the BCS cube.
    The virtual cube and the BCS cube are then connected by the standard FM.
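
    To make the "standard FM" part more concrete: the read function module behind a virtual cube with services receives the requested characteristics, key figures and selections from the query and returns the data. A minimal sketch of such an FM, which simply delegates the read to a basic cube via RSDRI_INFOPROV_READ, is shown below. The function name Z_VC_READ_DATA and the cube name ZBCS_C01 are placeholders, and the interface parameter names are written from memory; copy the exact interface from the template FM in your system (for example RS_BCT_FIGL_DATA_GET_VC1, mentioned further down on this page) rather than from this sketch.

    FUNCTION z_vc_read_data.
      " Assumed interface, modelled on the SAP template FMs (verify in SE37):
      "   IMPORTING i_infoprov    TYPE rsinfoprov
      "             i_th_sfc      TYPE rsdri_th_sfc    (requested characteristics)
      "             i_th_sfk      TYPE rsdri_th_sfk    (requested key figures)
      "             i_t_range     TYPE rsdri_t_range   (selections from the query)
      "   EXPORTING e_t_data      TYPE STANDARD TABLE
      "             e_end_of_data TYPE rs_bool

      DATA: l_first_call TYPE c VALUE 'X'.   "first (and only) call

      " Delegate the read request to the basic cube; the generated BCS read
      " FMs do essentially this, plus BCS-specific mapping logic.
      CALL FUNCTION 'RSDRI_INFOPROV_READ'
        EXPORTING
          i_infoprov    = 'ZBCS_C01'        "placeholder basic cube
          i_th_sfc      = i_th_sfc
          i_th_sfk      = i_th_sfk
          i_t_range     = i_t_range
        IMPORTING
          e_t_data      = e_t_data
          e_end_of_data = e_end_of_data
        CHANGING
          c_first_call  = l_first_call.
    ENDFUNCTION.

    In SEM-BCS you do not write this FM yourself; as described above, the data basis generation creates and assigns it.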

  • Enhancing Virtual Cube 0FIGL_V01

    Hi all,
    We want to enhance the cube 0FIGL_C01 to add a few additional characteristics. We want to know whether we can generate balance sheet reports (in balance sheet format) from the 0FIGL_C01 cube itself, or whether we have to create a new virtual cube or enhance the virtual cube 0FIGL_V01. I could not find the name of the function module which is called by the standard virtual cube.
    Looking for your valuable inputs.
    Thanks in advance
    PKR

    hi,
    check whether the data is available in your source system; the selection values which you gave may not have any values. Check this.
    mahesh

  • Not able to fetch the data by Virtual Cube

    Hi Experts,
    My requirement is to fetch data from the source system (from a database table) using a virtual cube.
    What I have done is create the virtual cube and the corresponding function module in the source system.
    That function module works fine if the database table in the source system is small, but if the database table contains a huge amount of data (millions of records), I am not able to fetch the data.
    Below is the code I have incorporated in my function module.
      DATA: l_th_mapping TYPE CL_RSDRV_EXTERNAL_IPROV_SRV=>TN_TH_IOBJ_FLD_MAPPING,
            l_s_map      TYPE CL_RSDRV_EXTERNAL_IPROV_SRV=>TN_S_IOBJ_FLD_MAPPING,
            l_r_srv      TYPE REF TO CL_RSDRV_EXTERNAL_IPROV_SRV.

      " Map each InfoObject of the virtual cube to the corresponding field
      " of the underlying database table
      l_s_map-iobjnm = '0PARTNER'.
      l_s_map-fldnm  = 'PARTNER'.
      INSERT l_s_map INTO TABLE l_th_mapping.

      " Service instance for the external table
      CREATE OBJECT l_r_srv
        EXPORTING
          i_tablnm              = '/SAPSLL/V_BLBP'
          i_th_iobj_fld_mapping = l_th_mapping.

      " Open the cursor with the requested characteristics, key figures and
      " selections, then fetch the data
      l_r_srv->open_cursor(
        i_t_characteristics = characteristics[]
        i_t_keyfigures      = keyfigures[]
        i_t_selection       = selection[] ).

      l_r_srv->fetch_pack_data(
        IMPORTING
          e_t_data = data[] ).

      return-type = 'S'.
    In the above function module, the internal table L_TH_MAPPING maps the InfoObjects of the virtual cube to the corresponding fields of the underlying database table.
    The problem I am facing is that in the method FETCH_PACK_DATA the program initially tries to fetch all the records from the database table into an internal table. If the database table is very large, this approach does not work.
    Could you please help me with how to handle this kind of issue?

  • Virtual Cube with DTP for direct access

    Hello experts, I would like to hear some tips for improving reporting performance on this kind of InfoProvider. If there were no performance impact on reporting, these cubes would be perfect for the requirement we are facing, so I would like some help in deciding whether we can go ahead with this idea or not.
    The volume of information in the underlying cube staged in the APO source system is not going to be small, but we can take care in the reports by using restrictions and small horizons of information. There is no processing on the BW side; we would just transfer records from the source system to the report.
    What else besides restrictions in the reports can we do to reduce the query performance impact? Any work on the underlying cube, such as partitioning or compression?
    Any advice from your experience?
    Thanks so much.

    Hi Martin,
    There are two BCS documents that might give you a starting point on virtual cubes, if you have an S-user ID for service.sap.com/support. They generally discuss the use of MultiProviders helped by VirtualProviders, so you will have to skip through some redundant content.
    How to… use Delta Cache and Delta Pair in SEM-BCS:
    https://websmp208.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700000063832008E
    How to... configure the Delta Load based MultiProvider Scenario
    https://websmp208.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700000063842008E
    or
    https://websmp208.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700006323702006E
    How to... Set up a SEM-BCS Data Mart in BW;
    https://websmp208.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700006323962006E
    A generic Virtual provider link that I had lying around;
    https://websmp208.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700004124212004E

  • VIrtual Cube with user exit

    Hi,
    This is the Scenario :
    Version : Bi 7 / Nw04s patch 11.
    Source : ECC 6.
    1) Virtual Cube V1 based on a generic Data source D1 in ECC or R/3 source system
    2) The generic DataSource has 10 fields; 7 come from a standard table and the other 3 are populated during extraction via the transaction data user exit (ZRS01*).
    3) When I run the extractor checker for DataSource D1 in R/3, all 10 fields are populated.
    4) Problem: when I do a list cube in BI on the virtual cube, all the data from ECC is visible, but the enhanced fields derived in the user exit come in blank.
    Has anyone experienced this? Is it a limitation, or do I need to create a message with SAP?
    I would appreciate any thoughts or experience in this area,
    thanks
    TC

    Check the user exit and whether the code has been written only for the specific DataSource that was meant for the ODS (watch out for the CASE statements).
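
    To illustrate the point about the CASE statements: the transaction data exit usually branches on the DataSource name, and if the generic DataSource behind the virtual cube is missing from the WHEN clause, its derived fields stay blank even though the rest of the data comes through. A minimal sketch follows; the DataSource names are placeholders, and the exit parameter is assumed to be I_DATASOURCE as in the code at the top of this page.

    CASE i_datasource.
      WHEN 'ZDS_FOR_ODS'.     "placeholder: DataSource feeding the ODS
        " enhancement logic that fills the derived fields
      WHEN 'ZDS_D1'.          "placeholder: generic DataSource of the virtual cube
        " the same logic must also run for this DataSource, otherwise the
        " enhanced fields arrive initial when the virtual cube reads data
    ENDCASE.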

  • How to load data from a virtual cube with services

    Hello all,
    we have set up a virtual cube with services and created a BEx report to get the data from an external database. That works fine. The question now is:
    Is it somehow possible to "load" the data from this virtual cube with services (I know that there is not really any data in it...) into another InfoCube?
    If that is possible, can you please give me some guidance on how to set up this scenario?
    Thanks in advance
    Jürgen

    Hi:
    I don't have a system in front of me, so try this. I know it works for a RemoteCube.
    Right-click on the cube and select Generate Export DataSource.
    If you can do this successfully, then go to the Source Systems tab and select the BW system. There, right-click and choose Replicate DataSources.
    Next, go to InfoSources and click Refresh. Copy the name of the virtual cube, add 8 as a prefix, and search for the InfoSource.
    If you can see it, that means you can load data from this cube to anywhere you want, just as you do from an ODS.
    ELSE.
    Try and see if you can create an InfoSpoke on the virtual cube (transaction RSBO).
    Here you can load into a database table, and from this table you can then create a DataSource, etc.
    ELSE.
    Create a query, save it as a CSV file and load it anywhere you want. This is more difficult.
    Good luck
    Ram Chamarthy

  • Virtual cube with services after copy system from PRD to SBox

    Hi
    I have a problem.
    The virtual cube 0FIGL_VC1 with the FM "RS_BCT_FIGL_DATA_GET_VC1" works well on PRD but not on the sandbox.
    I think I have done everything on the sandbox in transaction SM59 and the WExx settings.
    InfoPackages load data from the connected system (R3-SBox, not R3-PRD) perfectly.
    What must I do with this virtual cube so that it works?
    Regards
    Regards

    Hi Adam,
    The FM "RS_BCT_FIGL_DATA_GET_VC1" takes its data from the basic cube 0FIGL_C01, not from R/3.
    See transaction SE37, tab "Import", parameter "I_BASIC_INFOPROV".
    If you have a problem with the data, check this parameter, or rather the basic cube and the data loads into that cube.
    Regards
    PWnuk

  • Performance issue on a virtual cube

    Hi BW gurus,
    I am working on a Consolidation virtual cube and the query performance through that cube is very bad. I know that we cannot build aggregates or partitions on a virtual cube, so what should my approach be?
    Your suggestions will be appreciated with lots of points

    Hi Nick,
    If you cannot move away from the virtual cube option, then I think you should try to improve the performance of the virtual cube itself. This is mainly ABAP work. You can use SE30 to analyze which parts of the code are taking too much time. Follow these steps:
    1) Set a breakpoint in the function module of your virtual cube.
    2) Go to LISTCUBE and initiate an extraction from your virtual cube.
    3) In a separate session, run SE30 to analyze the extraction process for your virtual cube.
    You can use the SE30 report as a starting point for your performance optimization work.
    Note: Transaction ST05 can help you determine what database calls are taking a long time.
    Hope this helps.

  • How to check the Datasource name for Virtual cube

    Hello Experts,
    I have a report built on a MultiProvider from which we report balance sheet data by company code. The end user told us that he wants to see the data at plant level.
    The problem is that when I checked the data flow for the MultiProvider, it shows that the data comes from a virtual cube.
    From there it does not show a DataSource or InfoSource underneath the virtual cube, so how can I check the DataSource name for it, so that I can go and enhance the DataSource in case the plant (WERKS) is not there?
    thanks
    reddy

    Hi Friend,
    Go to the virtual cube level and view the data flow with the "downward & upward" option in RSA1:
    Go to RSA1, select the virtual cube, right-click and choose Display Data Flow, then select "downward & upward".
    I hope this helps.

  • Query on BCS virtual cube is not using the aggregates on BCS basic cube

    Hi all,
    I have a BCS virtual cube which is linked to a BCS basic cube, and I have built aggregates on the BCS basic cube.
    I created a simple query on the BCS basic cube and ran it in RSRT debug mode; it showed the aggregates on the BCS basic cube. But when I created the same query on the BCS virtual cube and ran it in RSRT debug mode, the query did not show any aggregates, which was strange.
    So my question is whether a query built on the virtual BCS cube can utilize the aggregates built on the BCS basic cube. If this is possible, please let me know the tweaks.
    Thanks,
    Raj.

    1. Go to SE37, enter RSDRI_INFOPROV_READ and choose Display.
    2. In line 82 (in a BW 3.5 system) there is a line that says:
       CLEAR: e_t_data, e_end_of_data, e_aggregate, e_split_occurred.
       Put the cursor there and press the 'stop' icon, or use Ctrl+Shift+F12, to set a breakpoint.
    3. In the same mode, open transaction RSRT and choose your query. Execute it. When you stop at the breakpoint, enter I_TH_SFC into one of the fields in the lower left area and press Enter. You should see a table with the characteristics the system needs.
    As I said, I'm not quite sure whether it works. I have access to a BCS system on Monday; I'll try to find out more then.
    Best regards
    Dirk

  • Query on Virtual cube 0FIGL_V10 show old "Last Data Updated:" on Portal

    Hello all,
    We are using the SAP-delivered virtual cube for G/L financial statement reporting. We run these queries on the portal (BI 7.0) using the 0ANALYSIS_PATTERN template. Since we do not actually load data into this cube, the template shows a "Last Data Updated" date that is far in the past; I am guessing it is when the cube was last activated or transported.
    The template and time stamp work perfectly for queries on all other cubes and MultiProviders.
    The business users are asking whether we can show the actual time stamp there (when the data was actually last loaded).
    Does anyone have any idea how we can do this without actually rewriting the queries on a standard cube?
    Thanks in advance,

    Hi Kiran,
    The "Last Data Updated" date always shows the last time data was updated in the provider, which in this case may be the activation of data in the cube. Please check whether there is an option to activate data at the data flow level.
    Try to create one record in the remote data source and load it again; that may pick up the latest status.
    Thanks,
    Chandra

  • BCS Virtual Cubes Performance Tuning

    Hi All,
    We are working on improving the performance of queries on BCS virtual cubes (with services).
    Any specific changes (from RAM to specific properties of the queries or of the virtual cubes with services) that you have seen improve performance in your environment would be greatly appreciated.
    Thanks,
    - Shashi

    Thanks a lot Marc,
    We are on NW2004 with the following support package levels:
    SAP_BW 350, SP 0016
    FINBASIS 300, SP 0012
    BI_CONT 353, SP 0008
    SEM-BW 400, SP 0012
    I have checked the Service Marketplace and the currently available SPs are:
    SAP_BW 350, SP 0019
    FINBASIS 300, SP 0015
    BI_CONT 353, SP 0013
    SEM-BW 400, SP 0015
    Which support packages do you suggest we go to for the performance issues on BCS virtual InfoProvider queries?
    Thanks,
    - Shashi

  • Virtual cube reporting logic

    I have created a report directly on the BCS transactional totals cube and it gives correct output.
    But why does SAP suggest building BCS reports on the virtual cube rather than on the BCS totals cube? Is it more helpful for reporting when COI logic is implemented?
    What is the distinct difference between, and the reason for, creating reports on the virtual cube versus the transactional cube?
    What are the pros and cons of creating reports on the BCS transactional cube versus the virtual cube?
    Inputs are appreciated.

    Hi Christopher,
    In general, you are right.
    But in practice... the restrictions that you mentioned can be avoided very easily. As I wrote here:
    COI Equity method...
    it is not even necessary to have associated companies in the group's hierarchy. Hence the equity method limitation does not apply in our situation.
    The main point here is the general consolidation logic, and the virtual cube performs it perfectly.
    Of course, we can think of situations that the virtual cube will not be able to handle, such as running the report for a period in which a subsidiary company was already sold or became an associate (drastic organizational changes).

Maybe you are looking for

  • Time Machine over the Network

    Hello, I've been using Time Machine for over 3 years now with no issues over the network; the destination being a 1TB-limited sparseimage setup as instructed by this tutorial. The image resides on an external HD connected on a network-connected iMac.

  • "Error writing metadata to" message and extended attributes

    After being frustrated by this error message in Bridge while trying to apply keywords to various image files I finally took the time to investigate, and I think I found the cause of the issue. In a nutshell, if the extend attribute "com.apple.FinderI

  • Determine VKONTO only with the help of SERNR

    Hello to all, I'm new in IS-U EDM. I have to specify VKONTO out of the device SERNR. Is there a function module in order to do this or must this be done manually? What tables are involved in order to get this information? Are the duplicate devices. W

  • Storing Uploaded Audio Files in Your Database

    I have been using the cffile tag to upload files into a directory on the server and then writing the details of that file into a database, but I would like to now instead of uploading and writing the file to a directory, I would like to write that fi

  • Auditing drop tablespace in Oracle 10.1

    Hi all, I want to audit the drop table and drop tablespace statement for all users in the oracle database 10.1. So, could someone gives me a help to do this? The audit_trail is already set to db in the database. Regards