Regarding Percentile Calculation

Hi Experts,
Please help: is there any function module for calculating a percentile, or can you tell me the formula for percentile calculation?

I would not rely on the XI built-in arithmetic functions, as they are extremely inaccurate. Write your own UDF and use Java types such as double or BigDecimal, depending on the use case.
Yoni
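A minimal sketch of such a UDF helper, using the common linear-interpolation definition of a percentile (the class and method names here are illustrative, not a standard XI API):

```java
import java.util.Arrays;

// Minimal percentile helper, as might be called from a custom UDF.
// Uses the common linear-interpolation definition: the percentile p
// (as a fraction, e.g. 0.90) sits at zero-based rank p * (n - 1),
// interpolated between the two nearest sorted values.
public class PercentileUtil {

    public static double percentile(double[] values, double p) {
        if (values.length == 0) {
            throw new IllegalArgumentException("empty input");
        }
        double[] sorted = values.clone();
        Arrays.sort(sorted);
        double rank = p * (sorted.length - 1);
        int lo = (int) Math.floor(rank);
        int hi = (int) Math.ceil(rank);
        double frac = rank - lo;
        return sorted[lo] + frac * (sorted[hi] - sorted[lo]);
    }

    public static void main(String[] args) {
        double[] wages = {100, 200, 300, 400, 500};
        System.out.println(percentile(wages, 0.50)); // 300.0 (median)
        System.out.println(percentile(wages, 0.90)); // 460.0
    }
}
```

For exact decimal arithmetic (e.g. currency amounts), the same interpolation can be done with BigDecimal instead of double, as suggested above.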

Similar Messages

  • Is percentile calculation supported by QD or WAD?

    Hi!
    Customer does not have access to Business Objects, Crystal Reports or similar but requires percentiles to be calculated at runtime.
    Customer wants the 10th, 25th, 50th, 75th and 90th percentiles of the employees' individual wages to be presented for whatever combination or drill-down of characteristics is selected in the report. E.g. the first display is organizational units as a hierarchy; each unit should have the percentiles displayed, calculated on the wages of the employees for that unit. Also, the minimum wage for each unit and the total sum of wages for each unit should be presented.
    If user adds Gender from free characteristics, the percentile values should be re-calculated so that they are based on the employees' wages for each gender under each unit.
    Is this at all possible?
    Query Designer does not provide any options unless we predefine which characteristics we want to use and sort and index the data accordingly. This is not an option since user will analyse many scenarios and does not know beforehand which scenarios are applicable.
    Web Application Designer can offer a median value (50th percentile) via the context menu, but only based on the subtotals and totals presented in the query (e.g. total wage for the organizational unit, not employees' individual wages).
    The only option as we see it now is to try to create an enhancement of the context menu in WAD and use a customer exit to do the calculation and then send the result back to the query. But is this a solution? Will it even work? Can this work dynamically, or must the user make a selection on the menu for each re-calculation of the values? Will it hurt performance?
    There is very little information about percentile calculations on the forums or in SAP Help. Is this not supported at all if the customer does not use BO or similar?
    Please advise on a solution, or provide information about whether or not this is supported by BI.
    (links to info regarding enhancement and customer exit
    http://help.sap.com/saphelp_nw04/helpdata/en/a4/7f853b175d214ee10000000a11402f/content.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/47/a3d30269421599e10000000a42189c/content.htm)
    (System information: SAP NetWeaver 7.0 EHP 1, support pack 8)

    Well, you're right. There is a certain amount of manual action required. And the condition does indeed look at the characteristics in the drill-down, so if the employee number is not in the drill-down, then the lowest granular data will not be taken into account.
    If you want to avoid that, you'll need a different approach. You could create a vanilla query (without the condition) and write an APD to pull data from that query. The APD would then contain a routine to perform the calculation of the 10th percentile, and the result could be sent into a DSO and thereafter to a cube. The same APD would handle the 25th, 50th percentile, etc. Eventually the target cube would have, for each org unit, one record for the 10th percentile, one for the 25th, etc. The query could be built on this cube and could pick up the percentile required by the user.
    The trouble with this design is that you'll have to think of all possible combinations of characteristics (Org Unit, Gender etc) and use the APD to calculate them. But I can't think of any other way to handle it either.
    Regards,
    Suhas

  • Percentile Calculation

    Hi All,
    I want to calculate the percentile (per-50 and per-90).
    Referred: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:16695563776514
    and tried NTILE, but it's not giving correct results.
    DB: Oracle 10g
    Any suggestions?
    Thanks,
    Saichand.v

    Hi,
    thanks for the reply
    select percentile_cont(0.5) within group (order by measure_column), dim_col
    from table_name
    group by dim_col
    NTILE : In OBIEE we have NTILE function
    NTILE(measure_column,50)
    This is what I tried, but I am not sure about the expected results.
    When I use NTILE I am not getting exact results; it shows the same value for all my reports (OBIEE side).
    That's the reason I am trying to write SQL for the percentile calculation, so that I can manage things from the OBIEE side.
    Can you please shed some light?
    Sorry if I didn't state my problem (the expected results) correctly; the basic problem is that I don't have an idea of how percentile is handled.
    Thanks,
    Saichand.v
    Edited by: Saichand Varanasi on Jan 6, 2011 2:22 PM
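One likely source of the confusion: NTILE and PERCENTILE_CONT answer different questions. NTILE(measure_column, 50) only labels each row with a bucket number from 1 to 50; it never returns a value from the measure itself, whereas PERCENTILE_CONT interpolates an actual measure value. A small Java sketch of the two behaviours (a simplified illustration, not Oracle or OBIEE internals):

```java
import java.util.Arrays;

// Contrasts NTILE-style bucketing with PERCENTILE_CONT-style interpolation.
public class NtileVsPercentile {

    // NTILE(n): assigns each row of the sorted input a bucket number 1..n
    // (simplified: larger buckets come first when rows do not divide evenly).
    static int[] ntile(double[] sorted, int n) {
        int rows = sorted.length;
        int[] buckets = new int[rows];
        for (int i = 0; i < rows; i++) {
            buckets[i] = (int) ((long) i * n / rows) + 1;
        }
        return buckets;
    }

    // PERCENTILE_CONT(p): interpolates a measure value at rank p * (n - 1).
    static double percentileCont(double[] sorted, double p) {
        double rank = p * (sorted.length - 1);
        int lo = (int) Math.floor(rank);
        int hi = (int) Math.ceil(rank);
        return sorted[lo] + (rank - lo) * (sorted[hi] - sorted[lo]);
    }

    public static void main(String[] args) {
        double[] m = {10, 20, 30, 40, 50};
        System.out.println(Arrays.toString(ntile(m, 2))); // [1, 1, 1, 2, 2]
        System.out.println(percentileCont(m, 0.5));       // 30.0
    }
}
```

So NTILE(measure_column, 50) yields bucket IDs 1-50, not the 50th percentile; PERCENTILE_CONT(0.5) is the function that returns the median value.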

  • Regarding vat calculating wrongly in miro

    hi,
    Issue: Reg: MIRO tax differences
    The tax codes are showing small differences when posting the MIRO transactions. In all three tax codes, VAT and CST show correctly at the purchase order level, whereas in the MIRO transactions the VAT and CST are not calculated correctly.
                                 As per Bill    As per System    Diff
    Total Bill Basic Amount            72016            72016
    Excise R.off                        7419             7419    7417.66
    VAT @ 5%                            3972          3982.49
    Total Bill Value                   83407         83417.49      10.49
    See the VAT @ 5%: in the system it is showing Rs 10.49 more in MIRO. The tax code used is VM (input 10.30% CENVAT + 5% VAT), the condition type used is JVRD, and the tax procedure used is TAXINN.
    Po details
    material        Qty   Amount
    20                 301    44548
    30                  101   16968
    40                  100   10500
    If any details required please revert back
    Regards
    Badri

    Hi,
    check following path
    SPRO > Logistics - General > Tax on Goods Movements > India > Basic Settings > Determination of Excise Duty > Condition-Based Excise Determination > Classify Condition Types
    Here, check whether your VAT and CST condition types are present; they should be there.
    Regards
    Kailas ugale

  • Need help regarding complex calculation using Max value and limiting data after Max date in MDX

    I am working on a somewhat complex calculated measure in SSAS cube script mode.
    Scenario / data set:
    Date          A      B      C      A+B
    5/29/2014     Null   34     Null   34
    6/30/2014     Null   23     45     68
    7/15/2014     25     -25    Null   0
    8/20/2014     -34    Null   Null   -34
    9/30/2014     25     Null   60     25
    10/15/2014    45     -45    Null   0
    11/20/2014    7      8      Null   15
    a) Need to capture the latest non-null value of column C based on date; with the above example it should be 60, as of 9/30/2014.
    b) Need to capture column A+B for all dates.
    c) Add values from column A+B only for dates after the latest date from step a, i.e. after 9/30/2014; with the above example that is the last 2 rows, and the sum is 15.
    d) Finally, add the values from step a and step c, which means the calc measure value should be 75.
    I need to perform all this logic in MDX. I was able to successfully get steps a and b in separate calc measures; however, I am not sure how to limit the scope based on a date criterion, in this case date > max date (9/30/2014). Also, how should I add calculated members and regular members together?
    I was able to get max value of C based on date and max date to limit the scope.
    CREATE MEMBER CURRENTCUBE.[Measures].[LatestC] AS
    TAIL( 
      NONEMPTY(
        [Date].[Date].CHILDREN*[Measures].[C]),1).ITEM(0) ,visible=1;
    CREATE MEMBER CURRENTCUBE.[Measures].[MaxDateofC] AS
    TAIL( 
      NONEMPTY(
        [Date].[Date].CHILDREN,[Measures].[C]),1).ITEM(0).MemberValue ,visible=1;
    Please help with Scope statement to limit the aggregation of A+B for dates > MaxDateofC? Also further how to add this aggregation value to LatestC calc measure?
    Thank You
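Before wrestling with the SCOPE statement, the intent of steps a-d can be validated outside the cube. This plain-Java sketch (purely illustrative, not MDX) reproduces the expected result on the sample data: the latest non-null C (60 on 9/30) plus the A+B values for the later dates (0 + 15) gives 75:

```java
// Validates steps a-d from the post on the sample data set.
public class LatestCPlusTail {

    // c: column C per date (null where empty); aPlusB: column A+B per date.
    // Dates are assumed to be in ascending order.
    static int latestCPlusLaterAB(Integer[] c, int[] aPlusB) {
        int lastIdx = -1;
        for (int i = 0; i < c.length; i++) {
            if (c[i] != null) lastIdx = i;       // step a: latest non-null C
        }
        int total = c[lastIdx];
        for (int i = lastIdx + 1; i < aPlusB.length; i++) {
            total += aPlusB[i];                  // step c: A+B after that date
        }
        return total;                            // step d: a + c
    }

    public static void main(String[] args) {
        // Rows 5/29/2014 .. 11/20/2014 from the data set above.
        Integer[] c = {null, 45, null, null, 60, null, null};
        int[] aPlusB = {34, 68, 0, -34, 25, 0, 15};
        System.out.println(latestCPlusLaterAB(c, aPlusB)); // 75
    }
}
```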

    Hi Peddi,
    I applied TRUNC to both of the dates, but it is still the same issue. I think the problem is in returning the BlobDomain.
        return blobDomain;
    } catch (XDOException xdoe) {
        System.out.println("Exception in XDO :");
        throw new OAException("Exception in XDO : " + xdoe.getMessage());
    } catch (SQLException sqle) {
        System.out.println("Exception in SQL :");
        throw new OAException("SQL Exception : " + sqle.getMessage());
    } catch (OAException e) {
        System.out.println("Exception in OA :");
        throw new OAException("Unexpected Error :: " + e.getMessage());
    }
    Thanks and Regards,
    Myvizhi

  • Error when creating GL accounts regarding interest calculation

    Dear colleagues
    When I create a G/L account, the system comes up with the following error message: Field "Date of last interest run" contains an entry, although it is hidden. It is strange because we are not using this functionality and, as the message says, we have this field hidden in the account group. We have installed the baseline package, and I was wondering if there is any customizing point where the interest calculation functionality is activated or deactivated (like the global company code parameters). Your help will be appreciated.
    Kind regards
    Marc

    Hi
    The same is maintained in the G/L master. Please unhide the field and check the entries in the master.
    Regards
    Sanil Bhandari

  • Regarding depreciation calculation

    Hi experts,
    1. Can I assign two depreciation methods to one asset? Is that possible?
    For example, for machinery: assign SLM 5% for the first 6 months and WLM 2% for the next 6 months.
    2. If I want to maintain multiple percentages for one asset within a year, what steps do I need to follow?
    Thanks & regards
    vinod.

    Create/change the required depreciation key in AFAMA and assign it to your asset.
    Check the link below on calculation methods for the depreciation key.
    http://help.sap.com/saphelp_46c/helpdata/en/4f/71dd6b448011d189f00000e81ddfac/content.htm
    Regards,
    GK

  • Issue regarding MRP calculation

    Hi Gurus,
    We have an MTS scenario with strategy 10, requirement type LSF. In MD61, PIRs are present from 01/01/2011 until today, but when I check MD04 only requirements from 01/04/2011 are taken into consideration (for LSF requirements after 01/04, the system shows negative quantities and planned orders are generated for these independent requirements). I want to know why PIRs from 01/04/2011 onward are taken into account and why older PIRs are not considered.
    Current stock is 0, the planning time fence is 30 days and the planning horizon is 100 days. Please help me understand why the older demands are not taken into the calculation.
    Thank you,

    Hi.
    I assume you have run MRP with the processing key NETPL (Net Change in Planning Horizon). As you have defined a planning horizon of 100 days, the system only takes the requirements within 100 days into account. If you want all requirements to be considered, use the MRP processing key NETCH (Net Change in Total Horizon).
    Regards
    Vivek

  • Count and Percentile calculations for various Lines in BEx Query.

    Dear Experts
    I am working on a BEx query at the moment, and the output is required in the format below.
    This and another report (which will be displayed above this one) will eventually go into a WAD.
    I have a key figure for 'Document Count', but I am not sure how to get the percentage displayed in the second line, and also for the top 3 reason codes (nothing is displayed across 'Top 3 Reason Codes' at the moment). I have tried Selection and other ways, but nothing has worked so far.
    Do I have to use cell editing by any chance, or is there a way to work around this?
    Thanks in advance for your help.
    Kind regards,
    Chandu

    You can achieve your report format by creating a structure in the rows with cell definitions. Define each cell in the Count Document key figure according to the row logic.

  • Regarding fringe calculation

    Hi,
    What is fringe calculation? In my company we are using a custom transaction code for it, "ZFGT0024". When I execute this transaction, it asks for a "company code group". What is the meaning of company code group? And when I used the already existing groups, it says no data is available. How can we post a transaction by using this company code group?
    It's very urgent for me. Please kindly help solve my problem.

    Hi,
    Fringe benefit is one type of taxation.
    The company code group is the main company name; check with that group.
    Normally, fringe benefit is applicable at the company level.
    Try that; I will check for more information.
    Edited by: divya p on Jul 15, 2008 4:42 PM

  • Hai regarding excise calculation h cess

    Hi, this is Swamy. Regarding excise: the system is not calculating the higher education cess of 1%, and it is not displayed on the J1IIN screen. What inputs are required? We have maintained the condition records and everything from the SD side; it is calculated in pricing as well, but it is not displayed on the J1IIN screen. Please advise. Many thanks.

    Hi,
    Did you activate the AT1 indicator for your excise registration?
    Regards,
    Murali

  • Regarding date calculations

    Hi Experts,
    My problem is: I have a start date, and either the number of weeks or the number of months is entered dynamically. Based on this value (e.g. if 10 months is entered), I have to calculate the stock for 10 months
    and display it as:
    1st month               2nd month          3rd month...
    stock   qty...          stock   qty...     stock...
    Can you help me out, please? It's very urgent; kindly help.
    If possible, give me a sample program.
    Message was edited by:
            MADHANGI SAMRAJ

    Find the difference between two days in days, months and years
    * Type in two date and find the difference between the two dates in
    * days, months and years.
    REPORT ZDATEDIFF.
    DATA: EDAYS   LIKE VTBBEWE-ATAGE,
          EMONTHS LIKE VTBBEWE-ATAGE,
          EYEARS  LIKE VTBBEWE-ATAGE.
    PARAMETERS: FROMDATE LIKE VTBBEWE-DBERVON,
                TODATE   LIKE VTBBEWE-DBERBIS DEFAULT SY-DATUM.
    call function 'FIMA_DAYS_AND_MONTHS_AND_YEARS'
      exporting
        i_date_from          = FROMDATE
        i_date_to            = TODATE
    *   I_FLG_SEPARATE       = ' '
      IMPORTING
        E_DAYS               = EDAYS
        E_MONTHS             = EMONTHS
        E_YEARS              = EYEARS.
    WRITE:/ 'Difference in Days   ', EDAYS.
    WRITE:/ 'Difference in Months ', EMONTHS.
    WRITE:/ 'Difference in Years  ', EYEARS.
    INITIALIZATION.
    FROMDATE = SY-DATUM - 60.
    Reward points if it is useful.
    Girish

  • Possible new percentile function?

    Hi,
    I'm running "Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production".
    I need to calculate percentiles, so I believe I have two choices: PERCENTILE_CONT() or PERCENTILE_DISC().
    I also need to get the same results as the SPSS and SAS tools.
    Comparing the results, neither of the Oracle versions give the same result as SPSS/SAS. I think this is due to how Oracle determines what row numbers to use.
    Oracle takes row RN = 1 + P*(N-1), where P is the percentile and N is the number of ordered rows.
    For PERCENTILE_CONT(0.1) on a 16-row set, that gives row 2.5 as the starting point.
    http://docs.oracle.com/cd/B28359_01/server.111/b28286/functions115.htm
    SPSS and SAS use the formula RN = P*(N+1), which gives row 1.7 for a 16-row set.
    Now, would it be possible for me to write a PERCENTILE_SPSS() function?
    The samples of analytical/pipelined functions I have seen (and written) are happy to use the rows in the set one by one, to calculate some special min/max/avg/etc.
    For this function I would need either
    1 - to know the number of rows in the set, from the first call of my function, and calculate the percentile as my function "passes" the desired rows
    or
    2 - save all values in the set and calculate the percentile at the end
    The first sounds impossible, the second sounds expensive for a large table.
    Any suggestions?
    Kind regards
    Tomas
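For reference, the difference between the two rank conventions is easy to make concrete. The sketch below computes the 1-based fractional rank each formula selects; the interpolation between the surrounding rows would then be the same in both cases (illustrative code, not Oracle or SPSS internals):

```java
// Compares the 1-based fractional rank used by Oracle's PERCENTILE_CONT
// with the SPSS/SAS convention, for percentile p over n ordered rows.
public class PercentileRanks {

    static double oracleRank(double p, int n) {
        return 1 + p * (n - 1);   // Oracle: RN = 1 + P*(N-1)
    }

    static double spssRank(double p, int n) {
        return p * (n + 1);       // SPSS/SAS: RN = P*(N+1)
    }

    public static void main(String[] args) {
        // 10th percentile of a 16-row set, as in the question above.
        System.out.println(oracleRank(0.1, 16)); // 2.5
        System.out.println(spssRank(0.1, 16));   // ~1.7
    }
}
```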

    You'll be more successful in the forums if you focus on the issues, questions and suggestions rather than make assumptions about why those questions or suggestions are being asked or made. Your unwarranted personal comments are unprofessional and indicate a lack of maturity.
    My questions were, and are, solely to elicit information about your actual issue so that suitable solutions or workarounds can be determined. There was no 'lecture' anywhere in any of the information that I provided; just statements and quotes from experts.
    The articles and quotes I provided were intended to show you that there isn't a universal agreement on a definition for those 'percentile' calculations. That would signal at least a 'caution' flag.
    >
    I'm asking if it's possible to write a custom-made analytical function when the formula needs to know the record count.
    >
    Yes, it is possible. In my opinion it isn't feasible. Any attempt to reproduce the exact proprietary algorithm that any vendor uses would require knowledge about the internals of that vendor's code that just isn't likely to be known or available: What precision is being used for intermediate results? What rounding methods are being used? What is the exact order of operations? Were any of those factors changed between versions? Did the vendor make any bug fixes?
    Any testing you do with any manageable set of test data won't ensure that there aren't data sets that will produce slightly different results.
    So writing such a function is one thing; trying to duplicate existing functionality is something else. The vendor's own implementation may have changed due to bug fixes so the results from that vendor may depend on the actual version being used.
    >
    Oracle does not have to match, I've never said that.
    My report, running in Oracle Reports, does need to match the results of an old report.
    >
    Well I read that as saying that Oracle does need to match. Your second sentence pretty much says exactly that.
    Again, my question was to try to understand why Oracle needed to produce a NEW report from OLD data that matches information in a PRINTED report from the past. That is a very unusual requirement. I've never run across such a requirement in more than 30 years and I have written thousands of reports using Crystal Reports (now BO), Oracle Reports, RPG and many other tools.
    It is often not possible to reproduce ANY report from the past due to the ever-changing nature of the data in the database. Data gets updated, deleted, inserted even when it shouldn't.
    >
    I have considered typing in the old values and selecting them.
    >
    That is the only method guaranteed to produce the same results. And that is about the only method that will pass the stringent auditing and Sarbanes-Oxley requirements that many organizations have to abide by. Those auditors won't allow you to just change the data to make things come out the way you want. You have to PROVE the provenance of the data: where did every piece of data come from, what changes were made to it (and when and by whom), and how were any accumulations performed.
    Using a custom function and merely showing that the results, on paper, match would not sway any auditor I've ever had to deal with. You may not have an issue of having to prove the data. That's why we are asking questions. To try to understand what options are viable for you.
    >
    That would mean I'd have to use SPSS to produce future values to store.
    >
    Yep! That's one of the main drivers for a lot of the ETL processes that exist. There are multiple systems of record. A PeopleSoft system might be the SOR for Inventory while an Oracle financials system might be the SOR for the financial and payroll data. ETL tries to merge and match the data. Trying to keep those ETL processes up to date with the changes going on in all of the source systems is a MAJOR headache.
    The standard solution for that use case is to select an SOR for reporting and then create ETL processes that bring the SOR data from each source system to the reporting SOR. For your use case, that might mean using SPSS to produce data in report-ready tables and then using ETL to bring that data to the report server, where it can be merged with data from other systems.
    >
    That's just too much manual labour for my taste.
    >
    Welcome to the real world!

  • Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).

    Hi,
    Our environment is Essbase 11.1.2.2, working with the Essbase EAS and Shared Services components. One of our users tried to run a calc script for one application and faced this error:
    Dynamic Calc processor cannot lock more than [100] ESM blocks during the calculation, please increase CalcLockBlock setting and then retry(a small data cache setting could also cause this problem, please check the data cache size setting).
    I have done some Google and found that we need to add something in Essbase.cfg file like below.
    1012704 Dynamic Calc processor cannot lock more than number ESM blocks during the calculation, please increase CalcLockBlock setting and then retry (a small data cache setting could also cause this problem, please check the data cache size setting).
    Possible Problems
    Analytic Services could not lock enough blocks to perform the calculation.
    Possible Solutions
    Increase the number of blocks that Analytic Services can allocate for a calculation:
    Set the maximum number of blocks that Analytic Services can allocate to at least 500. 
    If you do not have an $ARBORPATH/bin/essbase.cfg file on the server computer, create one using a text editor.
    In the essbase.cfg file on the server computer, set CALCLOCKBLOCKHIGH to 500.
    Stop and restart Analytic Server.
    Add the SET LOCKBLOCK HIGH command to the beginning of the calculation script.
    Set the data cache large enough to hold all the blocks specified in the CALCLOCKBLOCKHIGH setting. 
    Determine the block size.
    Set the data cache size.
    Actually, in our server config file (essbase.cfg) we don't have the settings below:
    CalcLockBlockHigh 2000
    CalcLockBlockDefault 200
    CalcLockBlocklow 50
    So my doubt is: if we edit the essbase.cfg file, add the above settings, and restart the services, will it work? And if so, why should we change the server config file if the problem is with one application's calc script? Please guide me on how to proceed.
    Regards,
    Naveen

    Your calculation needs to hold more blocks in memory than your current set up allows.
    From the docs (quoting so I don't have to write it, not to be a smarta***):
    CALCLOCKBLOCK specifies the number of blocks that can be fixed at each level of the SET LOCKBLOCK HIGH | DEFAULT | LOW calculation script command.
    When a block is calculated, Essbase fixes (gets addressability to) the block along with the blocks containing its children. Essbase calculates the block and then releases it along with the blocks containing its children. By default, Essbase allows up to 100 blocks to be fixed concurrently when calculating a block. This is sufficient for most database calculations. However, you may want to set a number higher than 100 if you are consolidating very large numbers of children in a formula calculation. This ensures that Essbase can fix all the required blocks when calculating a data block and that performance will not be impaired.
    Example
    If the essbase.cfg file contains the following settings:
    CALCLOCKBLOCKHIGH 500
    CALCLOCKBLOCKDEFAULT 200
    CALCLOCKBLOCKLOW 50
    then you can use the following SET LOCKBLOCK setting commands in a calculation script:
    SET LOCKBLOCK HIGH; 
    means that Essbase can fix up to 500 data blocks when calculating one block.
    Support doc is saying to change your config file so those settings can be made available for any calc script to use.
    On a side note, if this was working previously and now isn't then it is worth investigating if this is simply due to standard growth or a recent change that has made an unexpected significant impact.

  • In MIRO excise not getting calculated

    Dear Users,
    I have an issue regarding tax calculation in MIRO. I have maintained the tax code in the PO, and in the analysis all the conditions show the exact values as maintained. But in MIRO these excise duties are not getting calculated, even though I checked all the master data, like J1ID, and also the tax conditions. Please provide your suggestions.
    Thanks

    Hi,
    Check whether EXYEAR is the same as what is updated in the J_1IEXCDTL table.
    If not, that is the reason why the excise values are not updated in the J_1IGRXREF table.
    The table has to be updated with the excise values; only then will the same flow into MIRO.
    Regards,
    Brinda
