MDX Code - Performance for Calculated Measure

Hi
I have very little experience with MDX, but I was provided with the code below to create a calculated OLAP measure, and it appears to be the reason the performance of my report is so poor. I'm hoping someone can help me write something a lot more efficient.
sum(tail(nonemptycrossjoin(Descendants([Reporting Date Hierarchies].[YWD].currentmember,[Reporting Date Hierarchies].[YWD].[rep_dt_1])),1),[Measures].[outstandSUM])
The code essentially looks at daily data. In my report I have the time hierarchy YWD on rows and Months (a non-time-hierarchy dimension) on columns; it is a business requirement to present the report this way. At this initial level the yearly figures
represent the end-of-month position for the given year, and at week level the end-of-week positions for the month and year.
Could someone suggest alternative, more performance-friendly code?
Thanks in advance

Hi Noobiemoobie,
In your query
sum(tail(nonemptycrossjoin(Descendants([Reporting Date Hierarchies].[YWD].currentmember,[Reporting Date Hierarchies].[YWD].[rep_dt_1])),1),[Measures].[outstandSUM])
there is only one set inside the NonEmptyCrossJoin function, so why use NonEmptyCrossJoin at all? If you cross join even moderately sized sets (e.g., sets that each contain more than 100 items), you can end up with a result set that contains many thousands of items, which is enough
to seriously impair performance. For more detailed information, please see:
http://sqlmag.com/data-access/cross-join-performance
Please try removing the NonEmptyCrossJoin function and check whether the issue persists. In addition, here are some links about performance tuning for your reference:
http://www.bidn.com/blogs/DustinRyan/bidn-blog/2636/top-3-simplest-ways-to-improve-your-mdx-query
http://blogs.msdn.com/b/azazr/archive/2008/05/01/ways-of-improving-mdx-performance-and-improvements-with-mdx-in-katmai-sql-2008.aspx
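As a rough illustration of that suggestion, here is a minimal sketch of the measure with NonEmptyCrossJoin removed and NonEmpty used instead to keep only the dates that actually have data for [Measures].[outstandSUM]; the hierarchy and measure names are taken from your post, so please validate the results against the original definition:
// Last populated descendant at the rep_dt_1 level under the current YWD member
Sum(
    Tail(
        NonEmpty(
            Descendants(
                [Reporting Date Hierarchies].[YWD].CurrentMember,
                [Reporting Date Hierarchies].[YWD].[rep_dt_1]
            ),
            [Measures].[outstandSUM]
        ),
        1
    ),
    [Measures].[outstandSUM]
)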
Regards,
Charlie Liao
TechNet Community Support

Similar Messages

  • Constructing Calculated Measures in MDX for different measures using same columns in a fact table

    Hello,
    I have a fact table with 2 columns corresponding to dimensions Dim1 and Dim2. In the same table I have 4 other columns: Value_Type (int), INT_VALUE (int), FLOAT_VALUE (float) and TEXT_VALUE (string). There are a number of measures which are identified by Value_Type and,
    depending on their nature, could be written to one of the 3 value columns (INT_VALUE, FLOAT_VALUE, TEXT_VALUE). For clarity, let's say the measure with Measure_Type=1 is age, 2 is account balance and 3 is name. There could be other measure types that
    use these same 3 columns for their data. So the sample fact table looks like this:
    Dim1  Dim2  Measure_Type  INT_VALUE  FLOAT_VALUE  TEXT_VALUE
    10    10    1             25
    10    10    2                        2000,34
    10    10    3                                     John
    10    20    1             28
    10    20    2                        3490,23
    10    20    3                                     Frank
    My task is to write an MDX query that, for each Dim1, Dim2 combination, returns all 3 measures in the same row. The idea is to construct a calculated member for each measure that returns the value from the right field; for example, for Measure1 we take INT_VALUE
    where Measure_Type=1. The problem is I don't know how to construct the MDX for these calculated members. Can you please help me?
    So my final goal is to write an MDX query that returns all measures in one row for each combination of Dim1 and Dim2:
    SELECT [Measure1], [Measure2], [Measure3] ON COLUMNS,
    NON EMPTY [Dim1].[Dim1].[Dim1].Members*[Dim2].[Dim2].[Dim2].Members ON ROWS
    FROM [Cube]
    Dim1  Dim2  Measure1  Measure2  Measure3
    10    10    25        2000,34   John
    10    20    28        3490,23   Frank

    Hi Kosmipt,
    I would combine the "INT_VALUE", "FLOAT_VALUE" and "TEXT_VALUE" columns into a single column with a STRING data type in the fact table, and add one dimension to store the Measure_Type information. Then, in the cube, you can write an MDX SCOPE assignment for each member of the
    "Measure_Type" dimension. For example:
    SCOPE([Measures].[M], [DimMType].[MType].&[1]);
        THIS = CINT([Measures].[M]);
    END SCOPE;
    Once the cube is built this way, you can write MDX something like the following to achieve your purpose:
    WITH member [Measures].[Measure1] AS ([DimMType].[MType].&[1],[Measures].[M])
             member [Measures].[Measure2] AS ([DimMType].[MType].&[2],[Measures].[M])
             member [Measures].[Measure3] AS ([DimMType].[MType].&[3],[Measures].[M])
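    To return all three measures in one row per Dim1/Dim2 combination, as in your expected output, the final query could then look something like the sketch below (a sketch only, using the dimension, member and measure names from this thread):
    WITH
        MEMBER [Measures].[Measure1] AS ([DimMType].[MType].&[1], [Measures].[M])
        MEMBER [Measures].[Measure2] AS ([DimMType].[MType].&[2], [Measures].[M])
        MEMBER [Measures].[Measure3] AS ([DimMType].[MType].&[3], [Measures].[M])
    SELECT
        {[Measures].[Measure1], [Measures].[Measure2], [Measures].[Measure3]} ON COLUMNS,
        NON EMPTY [Dim1].[Dim1].[Dim1].Members * [Dim2].[Dim2].[Dim2].Members ON ROWS
    FROM [Cube];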
    thanks,
    Jerry

  • MDX calculated measure causing performance issue

    The calculated measure below against all product members is causing the excel pivot table to hang indefinitely. Any help on how to optimize the query for better performance?
    SCOPE ([MEASURES].[DIDaysInMonth]);
    THIS = CASE WHEN [Measures].[MonthDifference] < 0 THEN 0
    WHEN [MEASURES].[MonthDifference] >= 0 AND ProjectedEnd > 0 THEN [MEASURES].[DaysRemainingInMonth]
    WHEN [MEASURES].[MonthDifference] = 0 AND ProjectedEnd < 0 THEN
    [Measures].[Ordered Cases] / (([Measures].[Forecasted Sales]-[Measures].[Cases])/[measures].[DaysRemainingInMonth])
    WHEN [MEASURES].[MonthDifference] >= 0 AND ([Time Monthly].[Time Monthly].CurrentMember.PrevMember,[MEASURES].[ProjectedEnd]) <= 0 THEN 0
    WHEN [MEASURES].[MonthDifference] > 0 AND ([Time Monthly].[Time Monthly].CurrentMember.PrevMember,[MEASURES].[ProjectedEnd]) > 0 THEN
    ([Time Monthly].[Time Monthly].CurrentMember.PrevMember,[MEASURES].[ProjectedEnd]) /
    ([Forecasted Sales] / [daysInMonth]) END;
    END SCOPE;
    BI Developer

    Hi Abioye,
    According to your description, you created a calculated measure that is evaluated against all products in your AS cube, and now performance is poor when you use this calculated measure in an Excel pivot table, right? In this case, here are some links that describe
    tips about performance tuning in SSAS; please see:
    http://technet.microsoft.com/en-us/library/cc966527.aspx
    http://sqlmag.com/t-sql/top-9-analysis-services-tips
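    One commonly suggested tuning step for a CASE expression like the one above is to avoid writing (and evaluating) the same tuple several times per cell. As a rough sketch only, reusing the measure and hierarchy names from this thread rather than a verified rewrite, the previous month's projected end could be factored into a hidden calculated member and referenced from the scope:
    // Hypothetical hidden helper member so the PrevMember tuple only appears once in the script
    CREATE MEMBER CURRENTCUBE.[Measures].[PrevMonthProjectedEnd] AS
        ([Time Monthly].[Time Monthly].CurrentMember.PrevMember, [Measures].[ProjectedEnd]),
        VISIBLE = 0;
    SCOPE ([Measures].[DIDaysInMonth]);
        THIS = CASE
            WHEN [Measures].[MonthDifference] < 0 THEN 0
            WHEN [Measures].[MonthDifference] >= 0 AND [Measures].[ProjectedEnd] > 0
                THEN [Measures].[DaysRemainingInMonth]
            WHEN [Measures].[MonthDifference] = 0 AND [Measures].[ProjectedEnd] < 0
                THEN [Measures].[Ordered Cases]
                     / (([Measures].[Forecasted Sales] - [Measures].[Cases]) / [Measures].[DaysRemainingInMonth])
            WHEN [Measures].[MonthDifference] >= 0 AND [Measures].[PrevMonthProjectedEnd] <= 0 THEN 0
            WHEN [Measures].[MonthDifference] > 0 AND [Measures].[PrevMonthProjectedEnd] > 0
                THEN [Measures].[PrevMonthProjectedEnd]
                     / ([Measures].[Forecasted Sales] / [Measures].[DaysInMonth])
        END;
    END SCOPE;
    Whether this actually helps depends on how the formula engine caches the intermediate values, so it is worth profiling before and after; the links above cover that in more detail.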
    Hope this helps.
    Regards,
    Charlie Liao
    TechNet Community Support

  • SQL User Defined Functions for performing statistical calculations

    Hi!
       I hope you can help.  I just wasn’t sure where to go with this question, so I’m hoping you can at least point me in the right direction.
       I’m writing a SQL Server stored procedure that returns information for a facility-wide scorecard-type report.  The row and columns are going to be displayed in a SQL Server Reporting Services report. 
       Each row of information contains “Current Month” and “Previous Month” numbers and a variance column.  Some rows may compare percentages, others whole numbers, others ratios, depending on the metric they’re measuring.  For each row/metric the company has specified whether they want to see a t-test or a chi-squared statistical test to determine whether or not there was a statistically significant difference between the current month and the previous month. 
       My question is this:  Do you know where I can find a set of already-written user defined functions to perform statistical calculations beyond the basic ones provided in SQL Server 2005?  I’m not using Analysis Services, so what I’m looking for are real SQL User Defined Functions where I can just pass my data to the function and have it return the result within a stored procedure. 
       I’m aware that there may be some third-party statistical packages out there we could purchase, but that’s not what I’m looking for.   And I’m not able to do anything like call Excel’s analysis pack functions from within my stored procedure.   I’ve asked.   They won’t let me do that.   I just need to perform the calculation within the stored procedure and return the result.
       Any suggestions?  Is there a site where people are posting their SQL Server UDF’s to perform statistical functions?  Or are you perhaps aware of something like a free add-in for SQL that will add statistical functions to those available in SQL?   I just don’t want to have to write my own t-test function or my own chi-squared function if someone has already done it.
     Thanks for your help in advance!  Oh, and please let me know if this should have been posted in the TSQL forum instead.  I wasn't entirely sure.
    Karen Grube

    Oracle's STATS_T_TEST_* functions:
    docs.oracle.com/cd/B19306_01/server.102/b14200/functions157.htm
    STATS_T_TEST_ONE: A one-sample t-test
    STATS_T_TEST_PAIRED: A two-sample, paired t-test (also known as a crossed t-test)
    STATS_T_TEST_INDEP: A t-test of two independent groups with the same variance (pooled variances)
    STATS_T_TEST_INDEPU: A t-test of two independent groups with unequal variance (unpooled variances)

  • Code for calculating variance between plan and actual in Co module

    Hi ABAPers,
    I have to develop 2 new reports: one for calculating the variance between plan and actual in Controlling, and a second for the sales variance for the months January to December.
    I need your help as I have to complete these reports this week; it is a bit urgent.
    Can you please help me with sample code or any similar report you might have created, as I am a bit new to ABAP? Please reply ASAP.
    Thanks in advance!
    Best Regards.

    Hi Deepak,
    For finished goods materials, if you assigned price control V (Moving Average), it is irrelevant to run the standard cost estimate.
    At my earlier client we did not have variances because of the moving average price for finished goods.
    Please check whether the variance is really required. If it is, check whether the variance key is automatically assigned to the process order (Control tab).
    Ravi Polampalli

  • Unit of measure for calculation type C must be completed

    Hi All
    When releasing some contracts in SRM, we are getting the error message above.
    The error message doesn't specify any particular part and the contract has more than 3000 parts.
    There are no conditions at header or item level.
    There is no Target quantity also. Just a Target value for the contract and prices for individual items are maintained.
    We are using extended classic scenario and the contract is manually created in SRM.
    I have checked the OSS notes for this error, but they are mostly for contract creation in Bidding and not for a manually created SRM contract.
    We checked the UOMs for all the contract parts against the UOMs (Base and Purchasing) in the ECC material master and they are OK.
    In our scenario, MM changes in ECC are replicated to SRM in real time.
    Any pointers towards pinpointing the error will be rewarded!
    Regards
    Kedar

    Hi,
    We encountered this error message and resolved the problem by applying the following 3 SAP Notes:
    a)     Note #1022928 - Unit of Measure for Calculation Type C Must Be Completed
            (Previously Mentioned)
    b)     Note #1702484 - Unit of Measure for Calculation Type C Must Be Completed
    c)     Note #1517648 - Default Conditions Added for Product Category Item Type
    The required contract was similar to a "Limit" item, but it had to be placed for bid, so the RFx PS Item Type was set to "Service" in order to receive the bid responses. After accepting the bid response, we created a contract from it. Since it was to be a "Limit" type contract, we immediately removed the data from the following fields:
    Remove the Quantity
    Remove the Unit of Measure
    Remove the Price
    Remove the Price Condition
    After removing the values from these fields, we switched the PS Item Type from "Service" to "Product Category". We added the Guaranteed Minimum and the account assignment data to encumber the funds. We were then able to "Release" the contract successfully.
    Bob Zinsmeister

  • Are Cube organized materialized view with Year to Date calculated measure eligible for Query Rewrite

    Hi,
    I will appreciate it if someone can help me with a question regarding cube-organized MVs (OLAP).
    Is a cube-organized materialized view with calculated measures based on a time series (year to date, inception to date), e.g.
    SUM(FCT_POSITION.BASE_REALIZED_PNL) OVER (HIERARCHY DIM_CALENDAR.CALENDAR BETWEEN UNBOUNDED PRECEDING AND CURRENT MEMBER WITHIN ANCESTOR AT DIMENSION LEVEL DIM_CALENDAR."YEAR")
    eligible for query rewrite, or are such measures considered too advanced for query rewrite purposes?
    I was hoping to find an example of a YTD window function on the physical fact and dimension tables being rewritten by the optimizer to a cube-organized MV, but without much success.
    Thanks in advance

    I don't think this is possible.
    (My own reasoning)
    Part of the reason query rewrite works for base measures only (not calculated measures in OLAP, like YTD would be) is that the data is staged in OLAP but its lineage is still understandable via the OLAP cube mappings. That dependency/source identification is lost when we build calculated measures in OLAP, and I think it is almost impossible for the optimizer to understand the finer points of an OLAP calculation defined via OLAP DML or an OLAP expression and also match it to the equivalent calculation written as a relational SQL expression. The difficulty may be because both the OLAP YTD and the relational YTD defined via sum() over (partition by ... order by ...) have many non-standard variations of the same calculation/definition. For example, you can choose whether or not to use the IGNORE NULLS option within the SQL analytic function, while the OLAP definition may use NASKIP or NASKIP2.
    I tried to search for query rewrite solutions for inventory stock-based calculations (aggregation along time = last value along time) and to see whether an OLAP cube with the cube aggregation option set to "Last non-NA hierarchical value" works as an alternative to the relational calculation. My experience has been that it is not possible. You can do it relationally or you can do it via OLAP, but your application needs to be aware of each and make the appropriate back-end SQL/call. In such cases, you cannot make OLAP (AW/cubes/dimensions) appear magically behind the scenes to fulfil the query execution while appearing to work relationally.
    HTH
    Shankar

  • Performance for the below code

    Can anyone help me improve the performance of the code below?
    FORM RETRIEVE_DATA .
    CLEAR WA_TERRINFO.
    CLEAR WA_KNA1.
    CLEAR WA_ADRC.
    CLEAR SORT2.
    *To retrieve the territory information from ZPSDSALREP
    SELECT ZZTERRMG
           ZZSALESREP
           NAME1
           ZREP_PROFILE
           ZTEAM
         INTO TABLE GT_TERRINFO
         FROM ZPSDSALREP.
    *Preparing Corporate ID from KNA1 & ADRC and storing it in SORT2 field
    LOOP AT GT_TERRINFO INTO WA_TERRINFO.
      SELECT SINGLE * FROM KNA1 INTO WA_KNA1
                      WHERE KUNNR = WA_TERRINFO-SALESREP.
      SELECT SINGLE * FROM ADRC INTO WA_ADRC
                      WHERE ADDRNUMBER = WA_KNA1-ADRNR.
      IF NOT WA_ADRC-SORT2 IS INITIAL.
      CONCATENATE 'U' WA_ADRC-SORT2 INTO SORT2.
      MOVE SORT2 TO WA_TERRINFO-SORT2.
    MODIFY GT_TERRINFO1 FROM WA_TERRINFO.
      APPEND WA_TERRINFO TO GT_TERRINFO1.
      CLEAR WA_TERRINFO.
      ENDIF.
      CLEAR WA_KNA1.
      CLEAR WA_ADRC.
    ENDLOOP.
    ENDFORM.                    " RETRIEVE_DATA

    Hi
    The code is simple, so I don't think you can do much; you can only try to limit the reads of KNA1:
    FORM RETRIEVE_DATA .
      CLEAR WA_TERRINFO.
      CLEAR WA_KNA1.
      CLEAR WA_ADRC.
      CLEAR SORT2.
    *To retrieve the territory information from ZPSDSALREP
      SELECT ZZTERRMG
      ZZSALESREP
      NAME1
      ZREP_PROFILE
      ZTEAM
      INTO TABLE GT_TERRINFO
      FROM ZPSDSALREP.
      SORT GT_TERRINFO BY SALESREP.
    *Preparing Corporate ID from KNA1 & ADRC and storing it in SORT2 field
      LOOP AT GT_TERRINFO INTO WA_TERRINFO.
        IF WA_TERRINFO-SALESREP <> WA_KNA1-KUNNR.  "read KNA1 only when the sales rep changes
          SELECT SINGLE * FROM KNA1 INTO WA_KNA1
               WHERE KUNNR = WA_TERRINFO-SALESREP.
          IF SY-SUBRC <> 0.
            CLEAR: WA_KNA1, WA_ADRC.
          ELSE.
            SELECT SINGLE * FROM ADRC INTO WA_ADRC
                                     WHERE ADDRNUMBER = WA_KNA1-ADRNR.
            IF SY-SUBRC <> 0. CLEAR WA_ADRC. ENDIF.
          ENDIF.
        ENDIF.
        IF NOT WA_ADRC-SORT2 IS INITIAL.
          CONCATENATE 'U' WA_ADRC-SORT2 INTO SORT2.
          MOVE SORT2 TO WA_TERRINFO-SORT2.
    * MODIFY GT_TERRINFO1 FROM WA_TERRINFO.
          APPEND WA_TERRINFO TO GT_TERRINFO1.
          CLEAR WA_TERRINFO.
        ENDIF.
      ENDLOOP.
    ENDFORM. " RETRIEVE_DATA
    If the program takes a long time to read the data from ZPSDSALREP, you can try to split it into several packages:
    SELECT ZZTERRMG ZZSALESREP NAME1 ZREP_PROFILE ZTEAM
      INTO TABLE GT_TERRINFO PACKAGE SIZE <...>
      FROM ZPSDSALREP.
      SORT GT_TERRINFO BY SALESREP.
    *Preparing Corporate ID from KNA1 & ADRC and storing it in SORT2 field
      LOOP AT GT_TERRINFO INTO WA_TERRINFO.
        IF WA_TERRINFO-SALESREP <> WA_KNA1-KUNNR.  "read KNA1 only when the sales rep changes
          SELECT SINGLE * FROM KNA1 INTO WA_KNA1
               WHERE KUNNR = WA_TERRINFO-SALESREP.
          IF SY-SUBRC <> 0.
            CLEAR: WA_KNA1, WA_ADRC.
          ELSE.
            SELECT SINGLE * FROM ADRC INTO WA_ADRC
                                     WHERE ADDRNUMBER = WA_KNA1-ADRNR.
            IF SY-SUBRC <> 0. CLEAR WA_ADRC. ENDIF.
          ENDIF.
        ENDIF.
        IF NOT WA_ADRC-SORT2 IS INITIAL.
          CONCATENATE 'U' WA_ADRC-SORT2 INTO SORT2.
          MOVE SORT2 TO WA_TERRINFO-SORT2.
    * MODIFY GT_TERRINFO1 FROM WA_TERRINFO.
          APPEND WA_TERRINFO TO GT_TERRINFO1.
          CLEAR WA_TERRINFO.
        ENDIF.
      ENDLOOP.
    ENDSELECT.
    Max

  • How to Zero Debt After Account Closure Date in MDX Calculated Measure

    Thanks in advance to anyone who can help me with the below problem.
    I’m building a Finance cube, it holds all customers we have billed and any cash received against their debt. I have a few calculated measures that work out Total Debt, Total Cash. From these I have a calculated measure called Total Debt Outstanding which
    is (Total Debt - Total Cash)
    My problem, when an account is closed the debt is not always zero. We don’t always get all the cash to cover the debt so just write off the outstanding debt. I have a requirement to zero out the debt after closure date has passed. 
    If I look at the calculated measure [Total Debt Outstanding] today for all accounts it will include debt figures from closed accounts, I need the calculated measure
    [Total Debt Outstanding] to ignore this and not to inflate my current debt figure.
    Example
    Account 12345: Debt £3142 added on 15/03/2013, Cash £1,687
    received on 12/12/2013, Total Debt Outstanding £1107. The account closed on 15/12/2013 with a balance of £1107; after the closure date I need the Total Debt Outstanding figure to be £0 when browsing the cube with my date dimension.
    My date dimension Hierarchies is Year-Quarter-Month-Day
    I have tried using the Scope function to Zero the Total Debt Outstanding figure after closure date but can’t get it to work. 
    My scope calculation is below:
    Scope([Measures].[Total Debt Outstanding]);
    This = IIF([Date].[Year-Quarter-Month-Day].Currentmember > [ClaimData].[Closed].[Closed].Currentmember,0,[Measures].[Total Debt Outstanding]);
    End Scope;
    My Account Opening and Closing dates are in my ClaimData fact table, with Opening Date linked to the date dimension. I can’t link Closing date as we don’t always have a closing date until the account closes, so SSAS errors.
    Many Thanks

    Hi,
    Try comparing the key values of the members. For example:
    [Date].[Year-Quarter-Month-Day].CurrentMember.Properties("key") > [ClaimData].[Closed].[Closed].CurrentMember.Properties("key")
    Also think about using an UnknownMember in the Date dimension; in that case you can link the Closing date.
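    For reference, a minimal sketch of how that comparison might sit inside the scope, assuming both member keys are stored as dates and using the TYPED flag so the values are not compared as strings (an illustration only, not a verified calculation):
    SCOPE([Measures].[Total Debt Outstanding]);
        THIS = IIF(
            [Date].[Year-Quarter-Month-Day].CurrentMember.Properties("Key", TYPED)
                > [ClaimData].[Closed].[Closed].CurrentMember.Properties("Key", TYPED),
            0,
            [Measures].[Total Debt Outstanding]);
    END SCOPE;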
    Best regards,
    Aleksandr Severin

  • Performance reports and measures for Online Trading Customer's databases

    We have a client in the online commodity trading domain; their databases are running in a healthy state.
    On a daily basis we share with them an AWR report for the time span 9:30 AM to 10:30 AM (peak trading hours).
    Can you suggest what recommendations we can include in this AWR report, or how we can study the extracted AWR report on a daily basis in order to make recommendations to the customer?
    Apart from this, what other performance reports can we share with the customer that will let the client view their database state in terms of performance tuning?
    We can share daily database checks, but I am trying to find some good performance reports that can be extracted and shared with the client.
    What performance reports are you sharing, or can you suggest, in this regard?
    Thanks, friends, in advance.

    Hi
    Ankit Ashok Aggarwal wrote:
    We have a client in the online commodity trading domain; their databases are running in a healthy state. On a daily basis we share with them an AWR report for the time span 9:30 AM to 10:30 AM (peak trading hours).
    I have written a few blog posts about AWR analysis (http://savvinov.com/category/awr/) -- some of them may be useful to you. However, AWR reports are not that good for monitoring purposes. Monitoring is about trends -- AWR doesn't have that. Plus, application performance should be measured in its native metrics (KPIs, or "key performance indicators"). For instance, for an online shop that would be the number of orders processed, the average time it takes to process an order, etc. Low-level stats such as the number of table scans per unit of time or IO stats don't really tell you whether or not an application is performing satisfactorily.
    From the database point of view, a simple overview of the OEM Performance Page should be enough to get a basic idea of whether the database is OK.
    Can you suggest what recommendations we can include in this AWR report, or how we can study the extracted AWR report on a daily basis in order to make recommendations to the customer?
    I've seen a lot of such recommendations -- 99% of them are garbage, and the customers treat them accordingly. Unless there are clear signs of a performance issue, recommendations are generally neither necessary nor possible (without additional information about the application). If it ain't broken -- don't fix it.
    Apart from this, what other performance reports can we share with the customer that will let the client view their database state in terms of performance tuning? We can share daily database checks, but I am trying to find some good performance reports that can be extracted and shared with the client.
    Ideally, you should have SLAs with your customers which clearly define how much time a certain user action should take. Without an SLA and/or specific complaints from the users there is little you can do about database performance, except when something obvious shows up in the report.
    Best regards,
    Nikolay

  • Leave unit of measure for calculation type A empty

    Hi Gurus,
    We are in SRM Extended classic scenario with 5.0 version.
    Here in SRM we have a procurement contract with process type Global Outline Agreement (GCTR).
    Currently we are facing a problem while amending the condition type on a contract line item. When we change the condition type for a particular line item we get the error message "Leave Unit of Measure for calculation type A empty (item 10)".
    On the line item the condition could look like this
    Condition                    Price                 Start Date       End Date      
    Price(Contract/Bid)    100 EUR            2010-02-22    2010-12-31
    gross surcharge %   5,5 %                2010-02-22    2010-03-31
    gross surcharge %   6,5%                 2010-04-01    2010-06-30
    gross surcharge %   8,5 %                2010-07-01    2010-12-31
    When we try to add an additional condition such as "gross surcharge %" we get this error message:
    Leave unit of measure for calculation type A empty
    If I delete all "Gross Surcharge %" conditions the error disappears and I can try again; sometimes we can create those conditions without problems, but often we get this error.
    Can anyone help me get rid of this error message?
    Thanks In Advance,
    Jakob Bang

    Initially you could enter multiple periods, but when you change the condition it is not allowed.
    Can you inactivate that condition, add a new line item, and add the same item with the new condition?
    (SRM might have limitations.)
    I tried this and it works for me:
    MM 123 -
    FIRST :-
    price contract bid - 43.99 USD 2010.09.02  to 2010.09.02
    Discount (Absolute) - 3 %  2010.09.02  to 2010.09.02
    Discount (Absolute) - 5 % 2010.09.03  to 2010.09.03
    The same GOA was edited and a change version was created:
    price contract bid - 43.99 USD 2010.09.02  to 2010.09.02
    Discount (Absolute) - 4 %  2010.09.02  to 2010.09.02
    Discount (Absolute) - 5 % 2010.09.03  to 2010.09.03
    No error for me, and the GOA was released and reached the backend too.
    Muthu

  • Different transaction codes useful for Performance Monitoring

    Hi Experts,
    Please can you guide me on this question: what are the different transaction codes useful for performance monitoring, i.e. workload statistics and database statistics? What kind of statistics does each of these transactions provide?
    Many thanks,
    Mithun

    Hi Mithun
    For performance issues you need to look at things from several angles, that is:
    Workload analysis
    ST03N: Statistics records, locally
    ST03G: Statistics records, globally
    STAD: Individual statistics records
    STATTRACE: Individual statistics records trace
    ST07: User distribution
    Buffers and memory
    ST02: Buffers, memory and swap monitoring
    ST10: Table access
    OS monitoring
    OS04: Local monitoring
    OS07: Remote monitoring
    OS01: LAN check
    Database side
    ST04: Performance overview
    DB01: Exclusive locks
    DB02: Tables/Indexes
    Background jobs monitor
    SM37
    Other tcodes
    ST22: ABAP dumps
    SM12: Lock entries
    SM56: Number range buffers
    SU56: User buffer
    All of the above transactions need to be monitored for performance.
    Regards
    Bandla

  • How to get the measure total at row level for another measure calculation in DAX?

    Hi There,
    Using DAX, I am trying to write an expression for a calculated measure as follows.
    Basically, I would like to use the AMeasure total highlighted in yellow in the A2Measure calculation:
    SMeasure - (AMeasure total * WMeasure), where AMeasure total is the one highlighted.
    I think my question is really how to get the total of a measure in order to use it in the calculation of another measure.
    Thanks and best regards,
    Joss

    Hi Joss83,
    If you're trying to get the result for [AMeasure] at the total level (i.e. 5.09 in your example), you can do this by creating a version of the measure that is evaluated after ignoring the filters (implicit or explicit) placed on
    the 'Customer' and 'Player' columns. You may also need to ignore the filters placed on some other tables or columns in order for the correct total to be returned.
    Without knowing much more about your data model, I can only take a vague guess as to what this calculation would look like in your scenario:
    AMeasureTotal :=
    CALCULATE(
        [AMeasure],
        ALL(Customer[Customer]),
        ALL(Player[Player])
    )
    You can read more about the 'ALL' DAX function here:
    http://technet.microsoft.com/en-us/library/ee634802.aspx
    Hope this helps,
    Michael

  • How to modify formula for the calculated measure in a cube.

    Hi,
    I need to modify the formula for some measures within a cube. What would be the best way to do that? In the object view, if I update the formula and compile, it does not seem to be reflected in the model view.
    I cannot delete and recreate the measures, as deletion and recreation would change the order of the measures, which I do not want.
    Appreciate your inputs. Thanks.

    Yes, that's true: it doesn't change in the AWM front end if you change the formula in object view, but it will show you results according to the new formula.
    If you want AWM to show the correct expression in the front end, the way out is to change it in the XML and re-import the XML.
    Thanks,
    Brijesh
    Edited by: Brijesh Gaur on Nov 18, 2009 5:51 PM

  • Incorrect results for calculation based on diff dimensions - 11.1.1.5

    Hello All,
    OBIEE gives incorrect results when I try to perform a calculation (e.g. an addition) based on 2 measures. For example:
    (Note: "->" signifies 1:M)
    Rpd (Physical model & BMM): dim_fe -> dim_gl-> Fact_Legder <- Dim_param
    Fact_Ledger (agg measures) -> YTD_01, YTD_02...... YTD_12 ( here 01,02...12 represent month i.e. if "Feb" selected in prompt then we need to use YTD_02 and so on for other months)
    Answers: Created a report with following columns
    Column Name : Formula
    =================
    Line Item : 'Net Profit'
    Prev Yr Act: (filter("Fact Ledger"."YTD_12" using "Fact_Ledger"."YEAR"=@{pYear}{2013}-1 and "Dim_Param"."PL_Line" in ( 'Item 1','Item 2','Item 3') and "Fact_Ledger"."Code"=100)/1000) /
    (filter("Fact Ledger"."YTD_12" using "Fact_Ledger"."YEAR"=@{pYear}{2013}-1 and "Dim_FE"."Item" in ( 'L1','L2','L3') and "Fact_Ledger"."Code"=100)/1000)
    Curr Yr Act: (filter("Fact Ledger"."YTD_12" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_Param"."PL_Line" in ( 'Item 1','Item 2','Item 3') and "Fact_Ledger"."Code"=100)/1000) /
    (filter("Fact Ledger"."YTD_12" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_FE"."Item" in ( 'L1','L2','L3') and "Fact_Ledger"."Code"=100)/1000)
    Curr Yr Plan: case when '@{pmonth}{Jan}'='Jan' then
    (filter("Fact Ledger"."YTD_01" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_Param"."PL_Line" in ( 'Item 1','Item 2','Item 3') and "Fact_Ledger"."Code"=200)/1000)/
    (filter("Fact Ledger"."YTD_01" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_FE"."Item" in ( 'L1','L2','L3') and "Fact_Ledger"."Code"=200)/1000)
    when '@{pmonth}{Jan}'='Feb' then
    (filter("Fact Ledger"."YTD_02" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_Param"."PL_Line" in ( 'Item 1','Item 2','Item 3') and "Fact_Ledger"."Code"=200)/1000)/
    (filter("Fact Ledger"."YTD_02" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_FE"."Item" in ( 'L1','L2','L3') and "Fact_Ledger"."Code"=200)/1000)
    when '@{pmonth}{Jan}'='Dec' then
    (filter("Fact Ledger"."YTD_12" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_Param"."PL_Line" in ( 'Item 1','Item 2','Item 3') and "Fact_Ledger"."Code"=200)/1000)/
    (filter("Fact Ledger"."YTD_12" using "Fact_Ledger"."YEAR"=@{pYear}{2013} and "Dim_FE"."Item" in ( 'L1','L2','L3') and "Fact_Ledger"."Code"=200)/1000)
    end
    The results are incorrect. Any help is appreciated.
    Qry generated is like
    (select...
    case when year=.. and pl_lin=... and code=100 then ytd_01,
    case when year=.. and pl_lin=... and code=100 then ytd_03,
    case when year=.. and pl_lin=... and code=100 then ytd_04,....,
    case when year=.. and pl_lin=... and code=200 then ytd_01,
    case when year=.. and pl_lin=... and code=200 then ytd_03,
    case when year=.. and pl_lin=... and code=200 then ytd_04,....,
    from...
    where ... year in (2013-1,2013) and pl_line('Item1,'Item2','Item3') or fe.item('l1','l2','l3') and code in (100,200)... ) D1
    (select
    case when 'Apr'='Jan' then d1.c1 when 'Apr'='Feb' then d1.c2......
    from D1
    Regards..
    Shruti

    See if this explains it better for my crosstab with page items of Vendor Number 1234.
    Vendor 1234
                              Dc Nbr 1    Dc Nbr 2    Dc Nbr 4    AAAA
    Sum Invoice Amt           1387.04     300.82      327.29      2015.15
    Sum Cost                  44.86       57.43       25.54       127.83
    Sum Advanced Cost         102.44      0           0           102.44
    Sum Consolidation Cost    30.37       0           0           30.37
    Sum Allowance Amt         27.74       6.02        6.54        40.30
    Net Freight Cost          149.93      51.41       19          220.34
    Freight Percent           10.81       17.09       5.81        ****
    As stated before, Freight Percent is a calculation I created in Discoverer that looks like this:
    ( NVL(Sum Cost,0)+NVL(Sum Advanced Cost,0)+NVL(Sum Consolidation Cost,0)-NVL(Sum Allowance Amt,0) )/NVL(Sum Invoice Amt,0)*100
    Column AAAA was created in Discoverer using "Sum of field" and shown to the right.
    What I need is for the **** to be the correct calculation for the totals in column AAAA. If I do a total for Freight Percent using the Cell Sum I get 33.70; what I want is 10.93, which is (127.83 + 102.44 + 30.37 - 40.30)/2015.15*100.
    If I use an Average Total row for Freight Percent, I get 11.24, which is 33.70 / 3 (the 3 being the number of DC numbers).
    I did start by using the detail-level data to create this crosstab. Then I made a new version and used the SUM data. I seem to get the same results but am still having issues with the one **** value.
    Hopefully this explains it better.
    Thanks for the ideas so far.
