Query issue in BPS cube

I have a BPS cube. The query executes fine the first time, but when I start drilling down it exits with an error saying it is running out of memory. There is hardly any data in the cube, so the data volume should not be causing a memory issue. I have already done all the usual performance tuning on the cube.
Please advise.

Arun,
This is not usual. What settings do you have in your query? Do you have any virtual characteristics or key figures? Check all the objects one by one and you will find it.
Please tell me the query settings.
Thanks
Ravi Thothadri
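To see why a drill-down can run out of memory even when the cube itself holds little data, consider the rough sketch below (plain Python, purely illustrative; the characteristic names, cardinalities and byte counts are invented). Each additional drill-down characteristic multiplies the number of result cells, and a virtual characteristic or key figure exit is evaluated for every cell it touches, so memory use scales with the cell count rather than with the stored data volume.
# Rough illustration (not SAP code): drill-downs multiply the number of result cells,
# and any virtual characteristic / key figure exit runs once per affected cell.
drilldown_steps = {            # hypothetical characteristics and their distinct values
    "0COSTCENTER": 200,
    "0FISCPER": 12,
    "0ACCOUNT": 500,
}
cells = 1
for characteristic, distinct_values in drilldown_steps.items():
    cells *= distinct_values
    print(f"after drilling down by {characteristic}: up to {cells:,} result cells")
bytes_per_cell = 2_000         # assumed working set per evaluated cell in the exit
print(f"worst-case exit working set: ~{cells * bytes_per_cell / 1e6:.0f} MB")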

Similar Messages

  • Query Performance Issues on a cube sized 64GB.

    Hi,
    We have a non-time-based cube whose size is 64 GB, so I effectively cannot use a time dimension for partitioning. The transaction table has ~850 million records. We have 20+ dimensions, two of which have 50 million records each.
    I have distributed the fact table records evenly across 60 partitions; each partition is around 900 MB.
    Processing the cube is not an issue, as it completes in 3.5 hours. The issue is the query performance of the cube.
    When an MDX query is submitted, in the majority of cases the storage engine unfortunately has to scan all the partitions, because the cube is not time-dependent and we cannot find a suitable dimension on which to partition the measure group.
    I am aware of the cache-warming and usage-based optimization (UBO) techniques for aggregations.
    However, the cube is available to users for ad hoc queries, so the benefits of cache warming and UBO may cease to contribute to the performance gain: with 20+ dimensions there is a high probability that each user will look at the data from a different perspective as the days progress.
    Also, we have 15+ average calculations (calculated measures) in the cube, so the storage engine sends all the granular data the formula engine requests (which could be millions of rows) and the average calculation is then performed on top of that.
    A look at the profiler suggests that a considerable amount of time is spent by the storage engine gathering records from the 60 partitions.
    FYI, our server has 32 GB of RAM and 8 cores and is dedicated to Analysis Services.
    I would appreciate comments from anyone who has worked on a large cube that is not time-dependent, and the steps they took to improve ad hoc query performance for their users.
    Thanks
    CoolP

    Hello CoolP,
    Here is a good article on tuning query performance in SSAS:
    Analysis Services Query Performance Top 10 Best Practices:
    http://technet.microsoft.com/en-us/library/cc966527.aspx
    Hopefully you can find some helpful clues there for tuning your SSAS server's query performance. Moreover, there are two ways to improve query response time for a growing number of end users:
    - Adding more power to the existing server (scale up)
    - Distributing the load among several smaller servers (scale out)
    For more detail, please see:
    http://technet.microsoft.com/en-us/library/cc966449.aspx
    A rough sketch of reworking the average calculations as sum/count measures follows at the end of this thread.
    Regards,
    Elvis Long
    TechNet Community Support
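    One common way to tame average-style calculated measures such as those described in the question above is to model each average as an additive sum measure divided by an additive count measure, so that the heavy lifting can be answered from per-partition aggregations instead of shipping granular rows to the formula engine. The following is only a rough conceptual sketch in plain Python (not MDX; all measure values are invented):
    # Conceptual sketch (plain Python, not SSAS): an "average" stored as additive pieces
    # (sum, count) can be combined from per-partition aggregates, whereas a row-level
    # average forces a scan of the granular data in every partition.
    partitions = [
        # (sum_of_measure, row_count) pre-aggregated per partition -- hypothetical values
        (1_250_000.0, 10_000),
        (980_000.0, 8_500),
        (2_100_000.0, 17_200),
    ]
    total_sum = sum(s for s, _ in partitions)
    total_count = sum(c for _, c in partitions)
    # The overall average falls out of the two additive aggregates:
    print("average =", total_sum / total_count)
    In SSAS terms this usually means two physical measures (a Sum and a Count) plus a cheap calculated measure that divides them, which partition aggregations and UBO can then serve directly.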

  • Authorization issue in BPS

    Hi guys,
    I have an authorization issue in a BPS application: a user can upload a flat file into a BPS cube, but only when I include 0BI_ALL in the authorization object S_RS_AUTH.
    Without 0BI_ALL (i.e. with another analysis authorization), the user gets a message saying that he does not have sufficient authorization.
    However, with 0BI_ALL the user gets access in BW reporting to data for all organizational characteristics, such as organizational unit (0ORGUNIT).
    How can the authorizations / analysis authorizations be designed so that the same user can upload data via flat file, but only gets access to transaction data for the organizational values he is supposed to see?
    How should the analysis authorization be designed? Does it have something to do with the technical characteristics such as 0TCAACTVT?
    Thanks in advance!
    Clemens

    Hi,
    Have you tried creating an authorization variable for Organizational Unit?
    This will give restricted access to data based on the authorization assigned.
    Thanks
    Pratyush

  • Universe Designer vs BEx query - Can't see the query in the OLAP Cubes list

    Hello Experts,
    We are facing a rather strange issue:
    While creating a connection to an existing BEx query - one that was not being used by BO before - we are not able to see that particular query in the long list of available MultiProviders/queries on the universe side.
    We have already checked the option in BEx Analyzer that allows external access to the query via ODBO.
    The connection type we're using is the same for all the other universes built on top of BEx queries: SAP BW Client. Again, we already have a bunch of universes built using this method, but we can't seem to find the BEx query in the OLAP Cubes list.
    Does anyone have a lead on what may be causing this issue?
    Our environment is:
    - SAP BOE XI R3.1 SP2
    - SAP BW 3.5
    - BEx Analyser
    - Integration Kit
    Best Regards,
    Francisco

    Hi,
    I don't know offhand; you'd have to search the BW-BEX component for SAP Notes.
    I noticed this with a revision of the SAP GUI front-end tools 720, SP7 I think.
    I applied the latest corrections (SP08 or SP09) and that tick box worked correctly again.
    You might notice - as I did - that when you go back to check the BEx query, the property has 'magically' become unselected.
    Regards.
    H

  • InfoCube QM Status=Green before Reporting on BPS Cube

    Mates,
    As I understand it, an InfoCube needs to have its QM status set to green before any query can be executed on it.
    For BPS cubes, as I understand it, the QM status only turns green once 50,000 records have been written.
    Please advise whether I am able to report on the BPS InfoCube while the QM status is yellow.
    Thanks in advance!

    As Anurag mentions, for queries that need to pick up the most recent information from an open (yellow) request, you need to include the Most Recent Data variable.
    As I recall, what happens is that there is one database query that reads the open request and another that reads the rest of the fact table (or two more if you have compressed some of the requests into the E fact table).
    If you have an aggregate created on the transactional cube, BW is smart enough to get the data from the open request and then get the rest of the data from the aggregate.
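    Purely to illustrate the read pattern described above (plain Python, not BW code; the version and cost-center values are invented): one read covers the still-open request, another covers the closed or compressed data (or an aggregate), and the two result sets are combined before the query result is built.
    # Illustrative only: merge rows from the open (yellow) request with rows from the
    # rest of the fact table / an aggregate, then aggregate by characteristic.
    from collections import defaultdict
    open_request_rows = [          # read 1: the still-open request
        ("PLAN", "C100", 50.0),
        ("PLAN", "C200", 30.0),
    ]
    closed_data_rows = [           # read 2: closed requests / E fact table / aggregate
        ("PLAN", "C100", 400.0),
        ("PLAN", "C200", 250.0),
    ]
    result = defaultdict(float)
    for version, costcenter, amount in open_request_rows + closed_data_rows:
        result[(version, costcenter)] += amount
    for key, amount in sorted(result.items()):
        print(key, amount)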

  • Query issue with exception aggregation

    Dear all,
    I have to solve the following reporting issue with BEx:
    Cube Structure:
    Cube A:
    Characteristics: Company Code, Article, Distribution Channel, Customer, FiscalYear/Period
    Key-Figures: Sales Val., Sales Qty.
    Cube B:
    Characteristics: Company Code, Article, FiscalYear/Period
    Key-Figures: COGS
    I simply want to multiply:  Sales Qty@COGS = NODIM(Sales Qty) * COGS,
    but this calculation should be valid for all characteristics of my cube A, even those that are not available in cube B (such as Customer and Distribution Channel). Additionally, the calculated totals of my characteristics must be correct so that in a second step I can apply a margin calculation, Sales Val. - Sales Qty@COGS, which has to be valid at the single-characteristic level as well as at the total level.
    I started to setup calculated key-figures
    COGS1 = NODIM(Sales Qty) * COGS   with Exception aggregation TOTAL by Company Code
    COGS2 = COGS1 with Exception Aggregation TOTAL by Article
    and this worked fine for both characteristics. If I use COGS2 in my report I get correct figures in a drilldown by Company Code and Article.
    I enhanced the calculation by
    COGS3 = COGS2 with Exception Aggregation TOTAL by Distribution Channel, but the result in this case is 0. I guess the result is 0 because the characteristic Distribution Channel is not available in Cube B.
    Any ideas how to solve this? Is there perhaps a different (more elegant) approach? I fear that with all these exception aggregations, my query runtime and resource consumption will be awful.
    Thanks for any idea,
    Andreas

    Hi,
    You should define a new selection for COGS with constant selection on Distribution Channel, as described in the following link for PRICE with CUSTOMER:
    [http://help.sap.com/saphelp_nw70/helpdata/en/46/91f0f090ea13e8e10000000a155369/content.htm]
    and then apply your formulas.
    Hope it solves the problem.
    Regards.
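    To picture what constant selection on Distribution Channel achieves here, the sketch below (plain Python with invented numbers, not BEx syntax) keeps COGS at the (Company Code, Article) level and reuses that value for every Distribution Channel and Customer of the sales rows, so both the line-level products and the totals come out consistently:
    # Conceptual sketch: COGS exists only at (company_code, article) granularity (cube B),
    # while sales quantities exist at a finer granularity (cube A). Constant selection on
    # the missing characteristics effectively replicates COGS across them.
    cogs = {                      # cube B: (company code, article) -> COGS per unit
        ("1000", "A1"): 4.0,
        ("1000", "A2"): 7.5,
    }
    sales = [                     # cube A: company, article, dist. channel, customer, qty
        ("1000", "A1", "10", "CUST1", 100),
        ("1000", "A1", "20", "CUST2", 50),
        ("1000", "A2", "10", "CUST1", 20),
    ]
    total = 0.0
    for company, article, channel, customer, qty in sales:
        line_cogs = qty * cogs[(company, article)]   # COGS value reused for every channel/customer
        total += line_cogs
        print(company, article, channel, customer, qty, "->", line_cogs)
    print("total Sales Qty@COGS:", total)            # the total stays consistent with the line level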

  • Reporting on BPS Cube

    Hi All:
    In a BPS cube, when the request is yellow I'm able to report on it by adding the variable (All Data) on Request ID in the query. Is this going to affect performance? What is the mechanism behind using this variable? If I have 50 requests in the cube, is it going to read all the requests and find the request I want in the query? Please explain in detail.
    Thanks a lot!
    Manasa.

    Hi Manasa,
    Regarding your case:
    There is a method to change the status from yellow to green automatically.
    You can use a process chain (transaction RSPC); there is a process node that switches the cube from planning mode to load mode (it is grouped under Other BW Processes).
    After you create the process chain, you can trigger it with the function module RSPC_API_CHAIN_START.
    With this method, you can run your reports on the planning cube.
    Hope it helps.
    Regards,
    Niel.
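    If you ever need to trigger the chain from outside the SAP system, it could look roughly like the sketch below. This is only an assumption-laden example: it assumes the SAP NetWeaver RFC SDK and the pyrfc package are installed, that RSPC_API_CHAIN_START is remote-enabled in your release, and that it accepts the chain name in a parameter named I_CHAIN; the connection details and chain name are placeholders, so check the function module's interface in SE37 first.
    # Hedged sketch: start a process chain via RFC from Python using pyrfc.
    # Host, credentials, chain name and the parameter name I_CHAIN are assumptions.
    from pyrfc import Connection
    conn = Connection(
        ashost="bwhost.example.com",   # hypothetical application server
        sysnr="00",
        client="100",
        user="RFC_USER",
        passwd="secret",
    )
    # Verify the exact import/export parameters of RSPC_API_CHAIN_START in SE37.
    result = conn.call("RSPC_API_CHAIN_START", I_CHAIN="ZBPS_SWITCH_TO_LOAD")
    print(result)                      # typically contains the log id of the started run
    conn.close()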

  • Using the ROLLUPTIME with a BPS Cube

    Hi, all.
    I built a MultiProvider from a BPS cube (transactional) plus a "normal" cube (not transactional). Then I put a query on this MultiProvider in a web template.
    When I use the text element ROLLUPTIME to display the last update of the data, only the "normal" cube is considered.
    I don't know whether this is a limitation of BW or whether there is some trick to make this work correctly.
    We're on BW 3.5.
    Any advice is welcome.
    Thanks,
    Henrique Teodoro

    This question is a duplicate.

  • Authorization issue to view cube contents

    Hi Gurus,
    I am getting an authorization issue when viewing cube contents on the production server. When I execute the cube, it shows me the following message:
    "You do not have sufficient authorization for the infoprovider ZMMG_C05".
    Please provide me a possible solution for this.
    Thanks,
    Jackie.

    Hi,
    Two things should be checked with respect to authorization here:
    1) Functional roles: check whether the InfoCube is present in the functional roles assigned to you. If not, you need to be given the functional role that contains the InfoCube.
    2) Data access roles: check whether the data access roles assigned to you cover the selection you are using to view the data in the InfoCube. If not, ask the BASIS team to assign the appropriate data access roles to you.
    Hope this helps.
    Regards,
    Bharat

  • Dataset query issues twice if the dataset is connected to matrix and used in multilookup function

    hello everybody.
    I could not find any information on whether this is intended behavior: a dataset query is issued twice if the dataset is connected to a matrix and used in a Multilookup function.
    The parameters in both queries are the same.
    ssrs: 2008 r2, sharepoint 2010 integrated
    sharepoint 2010: september 2014 cu
    thanks in advance
    Sergey Vdovin

    Hello, Wendy.
    I prepared a nearly empty sample report for you to demonstrate the problem; with this report, I hope, there is no room to attribute the behavior to data retrieval time.
    There is one dataset, one parameter and one lookup function. The query is executed twice.
    <?xml version="1.0" encoding="utf-8"?>
    <Report xmlns:rd="http://schemas.microsoft.com/SQLServer/reporting/reportdesigner" xmlns:cl="http://schemas.microsoft.com/sqlserver/reporting/2010/01/componentdefinition" xmlns="http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition">
      <AutoRefresh>0</AutoRefresh>
      <DataSources>
        <DataSource Name="DataSource1">
          <DataSourceReference>http://t005/ProjectBICenter/DocLib/IntegrationDBVdovin.rsds</DataSourceReference>
          <rd:SecurityType>None</rd:SecurityType>
          <rd:DataSourceID>7e554344-d6c2-48a5-a7f4-1d24608cb4b5</rd:DataSourceID>
        </DataSource>
      </DataSources>
      <DataSets>
        <DataSet Name="DataSet1">
          <Query>
            <DataSourceName>DataSource1</DataSourceName>
            <CommandText>select 1 as temp, '$' as tempname</CommandText>
            <rd:UseGenericDesigner>true</rd:UseGenericDesigner>
          </Query>
          <Fields>
            <Field Name="temp">
              <DataField>temp</DataField>
              <rd:TypeName>System.Int32</rd:TypeName>
            </Field>
            <Field Name="tempname">
              <DataField>tempname</DataField>
              <rd:TypeName>System.String</rd:TypeName>
            </Field>
          </Fields>
        </DataSet>
      </DataSets>
      <ReportSections>
        <ReportSection>
          <Body>
            <ReportItems>
              <Textbox Name="ReportTitle">
                <CanGrow>true</CanGrow>
                <KeepTogether>true</KeepTogether>
                <Paragraphs>
                  <Paragraph>
                    <TextRuns>
                      <TextRun>
                        <Value>=Lookup(1,Fields!temp.Value,Fields!tempname.Value,"DataSet1")</Value>
                        <Style>
                          <FontFamily>Verdana</FontFamily>
                          <FontSize>20pt</FontSize>
                        </Style>
                      </TextRun>
                    </TextRuns>
                    <Style />
                  </Paragraph>
                </Paragraphs>
                <rd:WatermarkTextbox>Title</rd:WatermarkTextbox>
                <rd:DefaultName>ReportTitle</rd:DefaultName>
                <Top>0mm</Top>
                <Height>10.16mm</Height>
                <Width>139.7mm</Width>
                <Style>
                  <Border>
                    <Style>None</Style>
                  </Border>
                  <PaddingLeft>2pt</PaddingLeft>
                  <PaddingRight>2pt</PaddingRight>
                  <PaddingTop>2pt</PaddingTop>
                  <PaddingBottom>2pt</PaddingBottom>
                </Style>
              </Textbox>
            </ReportItems>
            <Height>57.15mm</Height>
            <Style>
              <Border>
                <Style>None</Style>
              </Border>
            </Style>
          </Body>
          <Width>152.4mm</Width>
          <Page>
            <PageFooter>
              <Height>11.43mm</Height>
              <PrintOnFirstPage>true</PrintOnFirstPage>
              <PrintOnLastPage>true</PrintOnLastPage>
              <Style>
                <Border>
                  <Style>None</Style>
                </Border>
              </Style>
            </PageFooter>
            <PageHeight>29.7cm</PageHeight>
            <PageWidth>21cm</PageWidth>
            <LeftMargin>2cm</LeftMargin>
            <RightMargin>2cm</RightMargin>
            <TopMargin>2cm</TopMargin>
            <BottomMargin>2cm</BottomMargin>
            <ColumnSpacing>0.13cm</ColumnSpacing>
            <Style />
          </Page>
        </ReportSection>
      </ReportSections>
      <ReportParameters>
        <ReportParameter Name="ReportParameter1">
          <DataType>String</DataType>
          <DefaultValue>
            <Values>
              <Value>1</Value>
            </Values>
          </DefaultValue>
          <Prompt>ReportParameter1</Prompt>
          <ValidValues>
            <DataSetReference>
              <DataSetName>DataSet1</DataSetName>
              <ValueField>temp</ValueField>
              <LabelField>tempname</LabelField>
            </DataSetReference>
          </ValidValues>
        </ReportParameter>
      </ReportParameters>
      <rd:ReportUnitType>Mm</rd:ReportUnitType>
      <rd:ReportServerUrl>http://t005/ProjectBICenter</rd:ReportServerUrl>
      <rd:ReportID>cd1262ef-eca7-4739-a2ce-d3ca832d5cd6</rd:ReportID>
    </Report>
    Sergey Vdovin

  • Web Query issues

    Hi All,
    I have created a web query and assigned it to a role. I moved the role, the web template and the query from dev to QA and was able to run the query fine without issues.
    The problem I am having is that when I make changes to the query and transport it to QA, running the query from the role menu still shows the old query instead of the changed one.
    Can someone please explain the steps I need to take when I change an existing web query in dev and move it to QA so that I can see the changes? What are all the objects I need to collect? I would also like to know whether there are any settings on the role menu or web template that need to be changed, and whether there are any buffers I need to clear.
    Any help is appreciated; I will award max points.
    Thanks

    BWdesi,
    please avoid reposts - it could be that the post got posted twice due to network issues... close one of them as answered and then proceed with the other one:
    Web Query issues
    Arun

  • Query on BCS virtual cube is not using the aggregates on BCS basic cube

    Hi all,
    I have a BCS virtual cube which is linked to a BCS basic cube. I built aggregates on the BCS basic cube.
    I created a simple query on the BCS basic cube and ran it in RSRT debug mode; it showed the aggregates on the BCS basic cube. But when I created the same query on the BCS virtual cube and ran it in RSRT debug mode, the query did not show any aggregates, which was strange.
    So my question is whether a query built on the virtual BCS cube can utilize the aggregates built on the BCS basic cube. If this is possible, please let me know the tweaks.
    Thanks,
    Raj.

    1. Go to SE37, enter RSDRI_INFOPROV_READ and choose Display.
    2. In line 82 (in a BW 3.5 system) there is a line that says:
      CLEAR: e_t_data, e_end_of_data, e_aggregate, e_split_occurred.
    Put the cursor there and press the 'stop shield' button, or use Ctrl+Shift+F12, to set a breakpoint.
    3. In the same session, open transaction RSRT, choose your query and execute it. When you stop at the breakpoint, enter I_TH_SFC into one of the fields in the lower left area and press Enter. You should see a table with the characteristics the query requests from the system.
    As I said, I'm not quite sure whether this works. I will have access to a BCS system on Monday; I'll try to find out more then.
    Best regards
    Dirk

  • Query issue

    I have a query issue on a planning layout:
    Sold   Mat   Price   Qty   Amount
    100    3     5       5     25       (layout 1)
    100    4     10      5     50       (layout 2)
    When I remove Material from the drilldown, the calculation should come out like this:
    100          15      10    75
    but I am getting:
    100          15      10    150
    Can you please tell me how to solve this?
    Regards,
    Raju

    Dear Raju,
    In the query, for the calculated key figure (Amount), go to Properties > Enhanced and change the setting to:
                      "BEFORE AGGREGATION"
    It currently seems to be calculating after aggregation, which is the default, i.e. it aggregates first and then multiplies:
        Price  = 5 + 10 = 15
        Qty    = 5 + 5  = 10
        Amount = 15 * 10 = 150
    Good luck, BB
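    The difference can be reproduced with the numbers from the two layout rows. A quick sketch in plain Python (nothing BEx-specific):
    # Two layout rows: (price, quantity)
    rows = [(5, 5), (10, 5)]
    # "Before aggregation": calculate the amount per record, then sum the results.
    before = sum(price * qty for price, qty in rows)           # 25 + 50 = 75
    # "After aggregation" (the default): sum price and qty first, then multiply.
    after = sum(p for p, _ in rows) * sum(q for _, q in rows)  # 15 * 10 = 150
    print("before aggregation:", before)   # 75  -- the value Raju expects
    print("after aggregation: ", after)    # 150 -- the value the query currently shows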

  • Physical query issued by Obiee when cache is on is different and slow

    When the same report runs in OBIEE 10g with the cache OFF, it takes less than 1 minute to get results. If the cache is turned ON, the physical query issued by OBIEE is totally different and it takes 2 hours to get results. Has anyone experienced some queries performing poorly like this when the cache is on?
    Thanks,
    Tatjana

    We are using BI Apps Order Management and Fulfillment Analytics, and all tables are cached anyway. The dimensions used are not that huge, up to 40K rows. What should I check when it comes to the DB query? As I said, it is different from the one generated when the cache is disabled, although both have almost the same explain plan.

  • Do i have to select all the key figures when i design a query off of a cube

    Do I have to select all the key figures when I design a query on a cube? I have three key figures, and above the three key figures it says "Calculated key figure". I am kind of lost and don't know what to do. Please let me know.
    Thanks!
    York

    No, you have to select at least one.
    A calculated key figure (CKF) is a formula based on two or more other key figures. You can select it, right-click, choose Display Properties, and see how it is composed.
    Below the calculated key figures you see all the available key figures of the InfoProvider you are building the query on. Just move them by drag and drop from the left into the columns on the right.
    You can also have restricted key figures (RKF), which means a key figure restricted by one or more characteristics and their values.
    Hope this gives you an idea.
    Regards,
    Juergen
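    If it helps to picture the two key figure types, here is a tiny sketch in plain Python (invented records, not BEx): a calculated key figure is a formula over other key figures, while a restricted key figure is an ordinary key figure filtered by characteristic values.
    # Toy records: (sales_org, revenue, cost) -- invented data, not from any real cube
    records = [
        ("1000", 120.0, 80.0),
        ("2000", 200.0, 150.0),
        ("1000", 50.0, 30.0),
    ]
    # Calculated key figure: a formula over other key figures (here: margin = revenue - cost)
    margin = sum(rev - cost for _, rev, cost in records)
    # Restricted key figure: a key figure restricted by a characteristic value
    # (here: revenue restricted to sales organisation '1000')
    revenue_1000 = sum(rev for org, rev, _ in records if org == "1000")
    print("CKF margin:", margin)                     # 110.0
    print("RKF revenue (org 1000):", revenue_1000)   # 170.0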
