Impact of BWA on Multiprovider

Hi Friends,
If we build a MultiProvider on an InfoCube (BWA-enabled) and a DSO (which is not BWA-enabled, since our version is < 7.3), and we build a query on that MultiProvider, will the query use BWA for the cube portion of the data? Or will it not use BWA at all, since the MultiProvider is built on the DSO as well?
Any input will be greatly appreciated.
Thanks
lz70d7

Yes, in this case each part-provider is processed separately: the InfoCube data will come from BWA while the DSO data will come from the database, and then both subsets will be united by the Analytic Engine into the final result.
-Vitaliy
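To picture the behaviour described above, here is a purely conceptual sketch in Python (not SAP code; all names and structures are invented for illustration): each part-provider is read through its own access path, and the partial results are united afterwards.

```python
# Conceptual sketch (not SAP code): a MultiProvider query is split into one
# sub-query per part-provider; each sub-query uses that provider's own access
# path (BWA index vs. database), and the partial results are united.

def read_part_provider(provider):
    # Hypothetical dispatch: BWA-indexed providers are read from the
    # accelerator, everything else straight from the database.
    if provider["bwa_indexed"]:
        return [(row, "BWA") for row in provider["rows"]]
    return [(row, "DB") for row in provider["rows"]]

def run_multiprovider_query(part_providers):
    result = []
    for provider in part_providers:   # each part is processed separately
        result.extend(read_part_provider(provider))
    return result                     # the engine unions the subsets

cube = {"name": "CUBE", "bwa_indexed": True,  "rows": ["c1", "c2"]}
dso  = {"name": "DSO",  "bwa_indexed": False, "rows": ["d1"]}
print(run_multiprovider_query([cube, dso]))
```

The point is simply that the DSO part-provider does not drag the cube part down to the database path; each part keeps its own access method.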

Similar Messages

  • Impact on Master data Infoobject

    Hi Bwers,
    Currently we are making changes to an existing master data InfoObject (Business Partner) in the system by adding one more attribute. We are also including 2 new InfoObjects in an existing DSO.
    I was asked to investigate the impact of these changes in our systems. Please provide your valuable suggestions about this activity.
    1. What is the impact on other InfoObjects (say 0CUSTOMER) which reference Business Partner?
    2. If any InfoSet includes this InfoObject, what is the impact on that InfoSet / MultiProvider?
    3. What if Business Partner has been used as a navigational attribute for any other InfoObjects?
    Impact on the DSO:
    1. Impact on the above data targets.
    2. And on the transformations between the DSO and all the above InfoProviders.
    How can I investigate these things in the system?
    Regards,
    Venkat

    1. What is the impact on other InfoObjects (say 0CUSTOMER) which reference Business Partner?
    > Adding an attribute won't affect 0CUSTOMER.
    2. If any InfoSet includes this InfoObject, what is the impact on that InfoSet / MultiProvider?
    > You need to adjust your InfoSet/MP. If you want this attribute in the report, you have to select this object in the InfoSet/MP.
    3. What if Business Partner has been used as a navigational attribute for any other InfoObjects?
    > It can still be used like that.
    Impact on the DSO:
    1. Impact on the above data targets.
    > You need to activate the DSO and load the data again, and change the transformations as well.
    2. And on the transformations between the DSO and all the above InfoProviders.
    > Those will become inactive. You have to activate them again.
    Thanks...
    Shambhu

  • BW upgrade - Inactive Cubes / Objects

    Hi ,
    We are upgrading from BW3.5 to BI7.3.
    One of the pre-upgrade requirements is:
    All InfoCubes should be active -->  Execute ABAP RSUPGRCHECK to locate any inactive InfoCubes. See SAP note 449160.
    There are some MultiProviders in our development environment which are inactive and not in use. We are keeping them in case we need to reuse them later.
    1. Can the upgrade tool proceed successfully despite these remaining inactive cubes?
    2. If so, what is the impact on these inactive MultiProviders after the upgrade? Can they still be accessed as before and activated later in BI 7.3 in case we need them?
    Thank you.
    Regards
    Maili
    Edited by: Maili06 on Aug 22, 2011 4:41 AM

    hi Maili,
    If you are not using that MultiProvider now, there is no need to activate it.
    It will still be available in your system after the upgrade.
    You can activate it later and use it if you want.
    regards,
    venkatesh.

  • Multiprovider definition changed: impact on query

    Hi,
    I had created a query on a MultiProvider.
    Now the two underlying InfoCubes of the MultiProvider need to be changed, as the data model has completely changed.
    I have copied both cubes to new cubes.
    Now I need to include these two new cubes in the MultiProvider so the query can use them.
    What would be the best procedure for this?
    If I directly change the definition of the MultiProvider, the query gets affected,
    and the original query has a lot of formulas and selections, so I do not want to play with the original query definition.
    Any idea of how to proceed?
    Since this MultiProvider has been used in roles, I cannot create a new MultiProvider; if I had to create a new MultiCube, I would also need to create new roles (we have three kinds of roles).

    Hi,
       Changing the underlying cube means: are you going to remove any existing fields, or only going to include new fields?
       If you are going to remove existing objects from the cube, your original query will definitely be affected.
    Or, if you are only going to include some more fields alongside the existing ones, you can simply include the new cubes and remove the existing cubes from the MultiProvider (since the new cubes will also have the structure of the existing ones).
    Then just copy your original query into a new query (on the same MultiProvider); if you want, you can adjust the query with the newly added fields, otherwise use it as is.
    Regards,
    V.Senthil

  • Query takes long time on multiprovider

    Hi,
    When I execute a query on the MultiProvider, it takes a very long time and doesn't show any results; it just keeps processing. I have executed the report for only one day, but it still doesn't show any result. But when I execute it on the cube, it executes quickly and shows the result.
    Actually, I added one more cube to the MultiProvider and then transported that MultiProvider to QA and PRD. The transport went through successfully. After this I am unable to execute the reports on that MultiProvider. What might be the cause? Your help is appreciated.
    Thanks
    Annie

    Hi Annie,
    Here is a checklist for query performance, from a document:
    1. If exclusions exist, make sure they exist in the global filter area. Try to remove exclusions by subtracting out inclusions.
    2. Use Constant Selection to ignore filters in order to move more filters to the global filter area. (Use ABAPer to test and validate that this ensures better code)
    3. Within structures, make sure the filter order exists with the highest level filter first.
    4. Check code for all exit variables used in a report.
    5. Move Time restrictions to a global filter whenever possible.
    6. Within structures, use user exit variables to calculate things like QTD, YTD. This should generate better code than using overlapping restrictions to achieve the same thing. (Use ABAPer to test and validate that this ensures better code).
    7. When queries are written on multiproviders, restrict to InfoProvider in global filter whenever possible. MultiProvider (MultiCube) queries require additional database table joins to read data compared to those queries against standard InfoCubes (InfoProviders), and you should therefore hardcode the infoprovider in the global filter whenever possible to eliminate this problem.
    8. Move all global calculated and restricted key figures to local as to analyze any filters that can be removed and moved to the global definition in a query. Then you can change the calculated key figure and go back to utilizing the global calculated key figure if desired
    9. If Alternative UOM solution is used, turn off query cache.
    10. Set the read mode of the query based on static or dynamic use. Reading data during navigation minimizes the impact on the R/3 database and application server resources, because only data that the user requires will be retrieved. For queries involving large hierarchies with many nodes, it would be wise to select the "Read data during navigation and when expanding the hierarchy" option, to avoid reading data for hierarchy nodes that are not expanded. Reserve the "Read all data" mode for special queries, for instance when a majority of the users need a given query to slice and dice against all dimensions, or when the data is needed for data mining. This mode places heavy demand on database and memory resources and might impact other SAP BW processes and tasks.
    11. Turn off formatting and results rows to minimize Frontend time whenever possible.
    12. Check for nested hierarchies. Always a bad idea.
    13. If "Display as hierarchy" is being used, look for other options to remove it to increase performance.
    14. Use Constant Selection instead of SUMCT and SUMGT within formulas.
    15. Do review of order of restrictions in formulas. Do as many restrictions as you can before calculations. Try to avoid calculations before restrictions.
    16. Check Sequential vs Parallel read on Multiproviders.
    17. Turn off warning messages on queries.
    18. Check to see if performance improves by removing text display (Use ABAPer to test and validate that this ensures better code).
    19. Check to see where currency conversions are happening if they are used.
    20. Check aggregation and exception aggregation on calculated key figures. Before aggregation is generally slower and should not be used unless explicitly needed.
    21. Avoid Cell Editor use if at all possible.
    22. Make sure queries are regenerated in production using RSRT after changes to statistics, consistency changes, or aggregates.
    23. Within the free characteristics, filter on the least granular objects first and make sure those come first in the order.
    24. Leverage characteristics or navigational attributes rather than hierarchies. Using a hierarchy requires reading temporary hierarchy tables and creates additional overhead compared to characteristics and navigational attributes. Therefore, characteristics or navigational attributes result in significantly better query performance than hierarchies, especially as the size of the hierarchy (e.g., the number of nodes and levels) and the complexity of the selection criteria increase.
    25. If hierarchies are used, minimize the number of nodes to include in the query results. Including all nodes in the query results (even the ones that are not needed or blank) slows down the query processing. The "not assigned" nodes in the hierarchy should be filtered out, and you should use a variable to reduce the number of hierarchy nodes selected.
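    The effect of checklist point 7 can be pictured with a small conceptual sketch (plain Python, not SAP code; the provider names are invented for illustration):

```python
# Conceptual sketch of checklist point 7 (not SAP code): a global restriction
# on the InfoProvider characteristic (0INFOPROV) lets the OLAP engine skip
# part-providers entirely, instead of reading and uniting all of them.

def providers_to_read(part_providers, infoprov_filter=None):
    # Hypothetical pruning step: with no 0INFOPROV restriction every
    # part-provider is read; with one, only the listed providers are touched.
    if infoprov_filter is None:
        return list(part_providers)
    return [p for p in part_providers if p in infoprov_filter]

all_parts = ["CUBE_2010", "CUBE_2011", "CUBE_2012", "DSO_ACT"]
print(providers_to_read(all_parts))                 # reads all four parts
print(providers_to_read(all_parts, {"CUBE_2012"}))  # reads only one part
```

    This is why hardcoding the InfoProvider in the global filter pays off: the pruned parts never contribute database joins or reads at all.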
    Also check this: Recommendations for Modeling MultiProviders
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/5617d903f03e2be10000000a1553f6/frameset.htm
    Hope this helps......
    Regards,
    Debjani......

  • Error while executing query using BWA

    Hello BWA experts,
    I am facing an issue while running a query using BIA. I get the error message "Program error in class SAPMSSY1 method: UNCAUGHT_EXCEPTION".
    My query contains one basic key figure in the columns and a characteristic, say company code, in the rows. In the free characteristics, I have sales document number. The query executes without any issue at first. But when I go to 'sales document number' in the free characteristic area and select filter values, this error pops up. This document number cannot be removed, due to customer requirements.
    If I restrict the query output with a selection, then it is able to execute and filtering can be done without any errors. Also, if I deactivate BIA use in transaction RSDDBIAMON, the query executes and filtering can be done without any errors.
    If I change the query & put 0doc_number in the drill down and execute for the same selection, I get error as follows,
    Error subfield access to table (row 3, column 0 to row 3, column 0) outside of
    Error Serialization/Deserialization error
    Error An exception with the type CX_TREX_SERIALIZATION occurred, but was neither
    Error Error reading data of InfoProvider XXXXXX
    Error Error while reading data, navigation is possible.
    SAP suggested that I increase the max_cells_one_index parameter to 200 million. We are on revision 53, and the current value of max_cells_one_index is 40 million. They also caution that 'by changing this parameter it will bring more load to BW servers'.
    I would like to know how the BW server load would increase by increasing this parameter. Will there be any other impact on the BIA or BW servers?
    Secondly, how is the threshold reached that produces the memory-out / serialization error?
    Also, will this kind of issue be solved in the next revision? I know that revision 54 is available.
    Please help.
    Thanks,
    Sandeep

    Hi Marc,
    Thanks for the explanation.
    I am not able to even evaluate the amount of data with this document number in the drilldown. When I run this report I get an ABAP dump "TSV_TNEW_PAGE_ALLOC_FAILED", so I guess the number of documents must be very high. Is there a way to check, in the BIA monitor, the number of documents returned after execution for a particular selection?
    However, I have come across note 1157582, which talks about splitting the result set into different data packets. Will this be helpful in any case? Currently I see that the parameter 'chunk_size' is set to zero.
    Thanks,
    Sandeep

  • Authorization object impact

    Hi,
    I have an InfoCube (IC) and a MultiCube based on IC (MC).
    The authorization objects for IC and MC are A and B.
    A report on MC has B as its authorization object.
    Now, does object A have any impact on the report?
    As of now it reports inadequate authorization, even though authorization for B is given.
    Please clarify,
    Thanks in Advance,
    Naveen.A

    hi Naveen,
    Object A won't impact the report if you didn't mark the MultiCube/MultiProvider MC for it in RSSM. Check RSSM again for object A to see whether MC is marked (by default the system marks a newly created reporting authorization object for the relevant InfoProviders).
    Furthermore, what does the 'no authorization' message say? Does it say there is no authorization for A?
    You can try it with transaction RSRT,
    and set a trace with RSRTRACE to find out exactly what the authorization problem is.
    hope this helps.

  • Error in Multiprovider consistency check after technical upgrade

    Hello,
    We have now upgraded our BW system from 3.5 to 7.0.
    When I check the consistency of a MultiProvider, it gives me a warning message that a CMP problem occurs with one of the InfoProviders,
    because I have used the 0MAT_PLANT InfoObject in my cube.
    I have a MultiProvider which combines 4 basic cubes. Two of the cubes have the 0PLANT and 0MAT_PLANT InfoObjects.
    The other two cubes (C and D) have only 0PLANT, and no 0MAT_PLANT is used.
    When I checked the MultiProvider for consistency, it showed me warnings that a CMP error occurred in InfoProviders C and D, so it requires adding 0MAT_PLANT to the basic cubes C and D and then doing the identification in the MultiProvider.
    But this InfoObject does not contain any data in the basic cubes.
    Please help me understand the impacts of this issue.
    Thanks in Advance.
    Regards
    M.A

    Hi,
    Thanks for the response.
    But when I go through the OSS notes, they state that there will be a performance problem in executing the report.
    How can this be solved?
    Thanks,
    Regards
    M.A

  • 13 line item dimension cube into BWA?

    I heard earlier that BWA does not support line-item dimensions due to the swapping of data between blades; I am not sure whether this is still an issue.
    My cube has 13 line-item dimensions. For performance reasons, we created the cube with 13 line-item dimensions, storing 240 million records, and users want to analyse all this data.
    I want to push this line-item cube to BWA for better performance, and I would like to know the impact.
    thanks for advance replies.
    Thanks

    texas_user wrote:
    My assumption is that BWA has no limitations. That's why we spent so much on BWA, right?
    Rewording Mark Victor Hansen: "Only in imagination, there's no limitation".
    Best cases for BWA performance are documented; unfortunately the only link I have on hand is to BIExpert article [Tips and Techniques for Optimal Query Performance with SAP NetWeaver BI Accelerator|http://www.bw-biexpertonline.com/article.cfm?id=3665].
    > It should support all navigation, from single-record reads to the whole data set.
    > I have 300 million records in the cube. We don't know what kind of ad-hoc analysis users are planning to do.
    > We put all 14 fields from the base cubes in the query, allowing users to do anything with the data to make better decisions.
    > My question is: will making 13 line-item dimensions in the cube and putting 300 million records in BWA improve performance or not?
    > Space and cost are not a concern here. Are we able to do it?
    > Does the Web Intelligence query come back? Or die?
    So far there is nothing worrisome in your description. 300M records is not mind-blowing. The areas to watch out for:
    1/ There used to be performance implications with line-item dimensions in the past, when the number of records in a dimension was higher than the number of records in the fact table. But that was 2-3 years ago...
    2/ Once you put something on top of BW, like BO in your case, you need to check where the time is spent, because in the past the communication between the BI tool and BW used to have an impact as well. Make sure you have the latest patches there.
    Cheers,
    -Vitaliy

  • Long-running query on multiprovider

    Hi All,
    In our production system, there is one MultiProvider for which the queries are taking too much time to complete, and in the end they result in "TIME_OUT" dumps.
    This MultiProvider contains receipt-level data, and hence the data volume is quite huge; however, customers were able to execute queries on this MultiProvider before.
    We have just recently performed an EHP upgrade on our production environment. Is there anything related to this EHP upgrade which could be impacting performance?
    Note that the problem occurs ONLY for this one MultiProvider and not for others (these other MPs do not contain that much data).
    Customers are using a restriction on calendar month. There isn't any way to restrict the query further; rather, it's not meaningful to restrict it further.
    Can SAP Note 0000911939 (Optimization hint for logical MultiProvider partitioning) be considered to resolve this?
    Please let me know your opinions for resolving this issue.
    Thanks & Regards,
    Nilima

    Hi,
    We were not using any indexing earlier, but at that time the queries were working fine as well.
    There are aggregates on the InfoProviders, but they are not being used for these queries; they simply do not contain the data that the queries need.
    Regards,
    Nilima Rodrigues

  • Changing US Calendar to Custom Calendar - BI impact

    Hi,
    ECC is planning to change the US calendar to a custom calendar that includes the holidays. We have all the financial modules in BI. If the calendar is changed to a custom one including the holidays, can anybody tell me how we should measure the impact in BI?
    Thanks
    Vijay

    Keep your existing data in the present cubes as it is.
    Create copies of the existing cubes and load them. After the change, don't load the existing cubes any more.
    Combine them in a MultiProvider to analyse the data content and the changes to it.
    You can build similar reports on the new copy cubes to compare the changes in the results.
    If you use time-dependent master data or key-date intervals in queries, or if posting dates change (if a given day is a custom holiday in the new calendar), the query results are definitely prone to change.

  • BWA 7.2 RKF and CKF

    Following on from Jamie Hylton's thread...
    If we had BWA with the blades configured for both SBOE and BW access, in what scenario does BW access actually get to use the "query snapshot" functionality?
    We read that BWA 7.2 comes with RKF and CKF support, but as far as I can see this only applies to the CKFs and RKFs in the query snapshot load for SBOE, and not to normal BW access.
    Or did I read it wrongly?

    Hi
    You can use BWA 7.20 with BW 7.01 SP05; the RKF and CKF indexing functionalities are supported only with this combination of BWA and BW, and ultimately for reporting in Explorer.
    Also, as Marc has mentioned, exploring the data of InfoProviders having RKFs and CKFs is possible only in Explorer.
    CKFs and RKFs that are part of a MultiProvider snapshot index can also be used to explore the data in Explorer.
    Cf. note 1332392.
    Hope this helps!
    Regards, Hyma
    Edited by: Hymavathi Yanamadala on Mar 16, 2010 1:16 PM

  • Unable to create Aggregation level on Multiprovider

    Hello Experts,
    I am getting the following error when activating a newly created aggregation level on a certain MultiProvider:
    Error: Aggregation level B2MP01L02: InfoProvider 0COMP_CODE cannot be used for definition.
    0comp_code is a characteristic that I have included in a dimension of the MultiProvider. In the same dimension, I have another characteristic, opcomp_code, whose reference characteristic is also 0comp_code.
    Could this be the source of the error? There are already many queries built on the MultiProvider, so I am reluctant to delete opcomp_code and check the impact while activating the aggregation level.
    Has anyone encountered this issue?
    Any help will be greatly appreciated.
    Thanks,
    Nitish.

    Hi Nitish,
    It seems that you are getting message RSPLS192; cf. the long text of the message.
    Not all MultiProviders can be used as the basis for an aggregation level. The MultiProvider has to contain at least one real-time InfoCube and no aggregation levels (aggregation levels cannot be nested).
    Regards,
    Gregor

  • Running Query on ODS Vs Multiprovider

    I would like to know which is the more efficient method of running queries: 1) on the ODS directly, or 2) creating a MultiProvider on the ODS and running the queries via the MultiProvider.
    Which method should we adopt in which contexts? Please help me.
    thanks
    anuthalapti

    Dear Anila,
         You can always use a MultiProvider to build queries, as a change in the data model would then not impact the MultiProvider and the query design. You can use a query on an ODS if it is going to be only on that particular ODS. If you have queries on two ODS objects, always use InfoSets; this way you can eliminate any data mismatches, as they are both flat table structures.
    Hope it helps..
    Thanks,
    Krish

  • Impact of logical partitioning on BIA

    We are on BI release 7.01, SP 5. We are planning to create logical partitions for some of the InfoCubes in our system. These cubes are already BIA-enabled, so will the creation of logical partitions have any impact on BIA, or improve the BIA rollup runtime?

    Hi Leonel,
    Logical partitioning will have an impact on BIA in terms of performance.
    The current cube is already indexed on BIA. If you now divide the current cube's data into different cubes and create a MultiProvider on top of them, then each cube will have its own F-table index on BIA.
    You have to add the new cubes to BIA, execute the initial filling step, and set up rollups for the respective cubes.
    Point to be noted:
    As data is deleted from the current cube and moved to other cubes, the index entries will not get deleted from the corresponding F-table indexes in BIA. There will be index entries for records which are no longer present in the cube. Thus it is always good practice to flush the BIA index, which removes all the indexes for the current cube from BIA, and then create new indexes.
    In this case, we will have consistent indexes on BIA, which will not hamper performance.
    This will also improve rollup time, as there will be less data in each cube after logical partitioning. For further rollup time improvement, we can implement delta indexing on BIA as well.
    Question: Why do we want to create logical partitioning for cubes which are on BIA, given that queries will never go to the cubes in the BI system?
    Regards,
    Kishanlal Kumawat.
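    The index-consistency point above can be illustrated with a small conceptual sketch (plain Python, not SAP code; record names are invented for illustration):

```python
# Conceptual sketch (not SAP code): after logical partitioning, records
# deleted from the current cube can remain in the BWA F-table index, so the
# index must be flushed and rebuilt to stay consistent with the cube.

def orphan_index_entries(bwa_index, cube_records):
    # Entries still indexed in BWA but no longer present in the cube.
    return bwa_index - cube_records

bwa_index = {"r1", "r2", "r3", "r4"}   # indexed before repartitioning
cube_after_split = {"r1", "r2"}        # r3, r4 moved to another cube
stale = orphan_index_entries(bwa_index, cube_after_split)
print(sorted(stale))                   # these entries justify a flush

rebuilt_index = set(cube_after_split)  # flush + initial fill
assert orphan_index_entries(rebuilt_index, cube_after_split) == set()
```

    The flush-and-rebuild step is what restores the one-to-one correspondence between cube records and index entries.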
