Percentile Calculation

Hi All,
I want to calculate percentiles (the 50th and the 90th percentile).
Referred to: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:16695563776514
and tried NTILE, but it's not giving correct results.
DB: Oracle 10g
Any suggestions?
Thanks,
Saichand.v

Hi,
Thanks for the reply.
select percentile_cont(0.5) within group (order by measure_column), dim_col
from table_name group by dim_col
NTILE: In OBIEE we have the NTILE function:
NTILE(measure_column, 50)
This is what I tried, but I am not sure about the expected results.
When I use NTILE I am not getting exact results; it shows the same value for all my reports (OBIEE side).
That's why I'm trying to write SQL for the percentile calculation, so that I can manage it from the OBIEE side.
Can you please shed some light?
Sorry if I didn't describe my problem (the expected results) correctly; the basic issue is that I don't have a clear idea of how the percentile should be handled.
Thanks,
Saichand.v
Edited by: Saichand Varanasi on Jan 6, 2011 2:22 PM
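For reference, a minimal sketch of getting both percentiles per dimension value with PERCENTILE_CONT (using the placeholder table and column names from the query above); note that, as far as I know, NTILE(measure_column, 50) only assigns each row to one of 50 buckets rather than returning the 50th-percentile value, which would explain the unexpected results:
select dim_col,
       percentile_cont(0.5) within group (order by measure_column) as pct_50,
       percentile_cont(0.9) within group (order by measure_column) as pct_90
from   table_name
group  by dim_col;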

Similar Messages

  • Is percentile calculation supported by QD or WAD?

    Hi!
    Customer does not have access to Business Objects, Crystal Reports or similar but requires percentiles to be calculated at runtime.
    Customer wants the 10th, 25th, 50th, 75th and 90th percentile of the employees' individual wages to be presented for whatever combination or drill-down of characteristics is selected in the report. E.g. the first display is organizational units as a hierarchy; each unit should have the percentiles displayed, calculated on the wages of the employees of that unit. Also, the minimum wage for each unit and the total sum of wages for each unit should be presented.
    If user adds Gender from free characteristics, the percentile values should be re-calculated so that they are based on the employees' wages for each gender under each unit.
    Is this at all possible?
    Query Designer does not provide any options unless we predefine which characteristics we want to use and sort and index the data accordingly. This is not an option since user will analyse many scenarios and does not know beforehand which scenarios are applicable.
    Web Application Designer can offer median value (50th percentile) via context menu but only based on the subtotals and totals presented in query (eg total wage for organizational unit, not employees' individual wages).
    Our only option as we see it now is to try to create an enhancement of the context menu in WAD and use a customer exit to do the calculation and then send the result back to the query. But is this a solution? Will it even work? Can this work dynamically, or must the user make a selection on the menu for each re-calculation of the values? Will it damage performance?
    There is very little information about percentile calculations on forums or in SAP help; is this not supported at all if the customer does not use BO or similar?
    Please advise on a solution, or provide information about whether or not this is supported by BI.
    (links to info regarding enhancement and customer exit
    http://help.sap.com/saphelp_nw04/helpdata/en/a4/7f853b175d214ee10000000a11402f/content.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/47/a3d30269421599e10000000a42189c/content.htm)
    (System information: SAP NetWeaver 7.0 EHP 1, support pack 8)

    Well, you're right. There is an amount of manual action required. And the condition does indeed look at the characteristics in the drill-down - so if employee number is not in the drill-down, then the lowest-granularity data will not be taken into account.
    If you want to avoid that, you'll need a different approach. You could create a vanilla query (without the condition) and write an APD to pull data from that query. The APD would then contain a routine to perform this calculation of the 10th percentile, and that could be sent into a DSO and thereafter to a cube. The same APD for the 25th, 50th percentile etc. Eventually the target cube, for each Org Unit, would have one record for the 10th percentile, one for the 25th etc. The query could be built on this cube and could pick up the percentile required by the user.
    The trouble with this design is that you'll have to think of all possible combinations of characteristics (Org Unit, Gender etc) and use the APD to calculate them. But I can't think of any other way to handle it either.
    Regards,
    Suhas

  • Regarding Percentile Calculation

    Hi Experts,
    Please help me: is there any function module for calculating a percentile, or can you tell me the formula for the percentile calculation?

    I would not rely on XI's built-in arithmetic functions, as they are extremely inaccurate. Write your own UDF and use Java classes such as Double or BigDecimal, depending on the use.
    Yoni

  • Count and Percentile calculations for various Lines in BEx Query.

    Dear Experts
    I am working on a BEx query at the moment and the output is required in the format below:
    This and another report (which will be displayed above this report) will go into a WAD eventually.
    I have a key figure for 'Document Count' but am not sure how to get the percentage displayed in the second line and also for the Top 3 Reason Codes (nothing is displayed across 'Top 3 Reason Codes' though). I have tried Selection and other ways but nothing has worked so far.
    Do I have to use Cell Editing by any chance, or is there a way to work around this?
    Thanks in advance for your help.
    Kind regards,
    Chandu

    You can achieve your report format by creating a structure in the rows with cell definitions. Define each cell in the Document Count key figure according to the row logic.

  • Possible new percentile function?

    Hi,
    I'm running "Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production".
    I need to calculate percentiles, so I believe I have two choices: PERCENTILE_CONT() or PERCENTILE_DISC().
    I also need to get the same results as the SPSS and SAS tools.
    Comparing the results, neither of the Oracle versions give the same result as SPSS/SAS. I think this is due to how Oracle determines what row numbers to use.
    Oracle takes row RN = 1 + P*(N-1), where P is the percentile and N is the number of ordered rows.
    For PERCENTILE_CONT(0.1) on a 16-row set, that would give row 2.5 as the starting point.
    http://docs.oracle.com/cd/B28359_01/server.111/b28286/functions115.htm
    SPSS and SAS use the formula RN = P*(N+1), which gives row 1.7 for a 16-row set.
    Now, would it be possible for me to write a PERCENTILE_SPSS() function?
    The samples of analytical/pipelined functions I have seen (and written) are happy to use the rows in the set one by one, to calculate some special min/max/avg/etc.
    For this function I would need either
    1 - to know the number of rows in the set, from the first call of my function, and calculate the percentile as my function "passes" the desired rows
    or
    2 - save all values in the set and calculate the percentile at the end
    The first sounds impossible, the second sounds expensive for a large table.
    Any suggestions?
    Kind regards
    Tomas
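    For what it's worth, if the SPSS/SAS rule RN = P*(N+1) with linear interpolation is all that is needed, it can also be written with ordinary analytic functions instead of a custom aggregate. A minimal sketch, assuming a table t with a numeric column val (both hypothetical names) and P = 0.1; it does not handle positions that fall outside the range 1..N:
    WITH ranked AS (
      SELECT val,
             ROW_NUMBER() OVER (ORDER BY val) AS rn,
             0.1 * (COUNT(*) OVER () + 1)     AS pos  -- SPSS/SAS position for P = 0.1
      FROM   t
    )
    SELECT MIN(CASE WHEN rn = FLOOR(pos)     THEN val END)
           + (MIN(pos) - FLOOR(MIN(pos)))
           * ( MIN(CASE WHEN rn = FLOOR(pos) + 1 THEN val END)
             - MIN(CASE WHEN rn = FLOOR(pos)     THEN val END) ) AS p10_spss
    FROM   ranked;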

    You'll be more successful in the forums if you focus on the issues, questions and suggestions rather than make assumptions about why those questions or suggestions are being asked or made. Your unwarranted personal comments are unprofessional and indicate a lack of maturity.
    My questions were, and are, solely to elicit information about your actual issue so that suitable solutions or workarounds can be determined. There was no 'lecture' anywhere in any of the information that I provided; just statements and quotes from experts.
    The articles and quotes I provided were intended to show you that there isn't a universal agreement on a definition for those 'percentile' calculations. That would signal at least a 'caution' flag.
    >
    I'm asking if it's possible to write a custom-made analytical function when the formula needs to know the record count.
    >
    Yes - it is possible. In my opinion it isn't feasible. Any attempt to reproduce the exact proprietary algorithm that any vendor uses would require knowledge about the internals of that vendor's code that just isn't likely to be known or available: what precision is being used for intermediate results? What rounding methods are being used? What is the exact order of operations? Were any of those factors changed between versions? Did the vendor make any bug fixes?
    Any testing you do with any manageable set of test data won't ensure that there aren't data sets that will produce slightly different results.
    So writing such a function is one thing; trying to duplicate existing functionality is something else. The vendor's own implementation may have changed due to bug fixes so the results from that vendor may depend on the actual version being used.
    >
    Oracle does not have to match, I've never said that.
    My report, running in Oracle Reports, does need to match the results of an old report.
    >
    Well I read that as saying that Oracle does need to match. Your second sentence pretty much says exactly that.
    Again, my question was to try to understand why Oracle needed to produce a NEW report from OLD data that matches information in a PRINTED report from the past. That is a very unusual requirement. I've never run across such a requirement in more than 30 years and I have written thousands of reports using Crystal Reports (now BO), Oracle Reports, RPG and many other tools.
    It is often not possible to reproduce ANY report from the past due to the ever-changing nature of the data in the database. Data gets updated, deleted, inserted even when it shouldn't.
    >
    I have considered typing in the old values and selecting them.
    >
    That is the only method guaranteed to produce the same results. And that is about the only method that will pass the stringent auditing and Sarbanes-Oxley requirements that many organizations have to abide by. Those auditors won't allow you to just change the data to make things come out the way you want. You have to PROVE the provenance of the data: where did every piece of data come from, what changes were made to it (and when and by whom), and how were any accumulations performed.
    Using a custom function and merely showing that the results, on paper, match would not sway any auditor I've ever had to deal with. You may not have an issue of having to prove the data. That's why we are asking questions. To try to understand what options are viable for you.
    >
    That would mean I'd have to use SPSS to produce future values to store.
    >
    Yep! That's one of the main drivers for a lot of the ETL processes that exist. There are multiple systems of record. A PeopleSoft system might be the SOR for Inventory while an Oracle financials system might be the SOR for the financial and payroll data. ETL tries to merge and match the data. Trying to keep those ETL processes up to date with the changes going on in all of the source systems is a MAJOR headache.
    The standard solution for that use case is to select an SOR for reporting and then create ETL processes that bring the SOR data from each source system to the reporting SOR. For your use case that might mean using SPSS to produce data in report-ready tables and then using ETL to bring that data to the report server where it can be merged with data from other systems.
    >
    That's just too much manual labour for my taste.
    >
    Welcome to the real world!

  • How to calculate an interpolation...

    Hi Everybody,
    I'm struggling with the percentile calculation using simple interpolation.
    I have following inputs:
    Table:
    Day_bucket          1 |   2 |   3 |   4
    Responses          10 |  20 |  50 |  20
    Cumul_responses    10 |  30 |  80 | 100
    Percentile        0.1 | 0.3 | 0.8 |   1
    1. Day_bucket - days from 1 to 100 (dimension)
    2. Percentile - percentile distribution - how many % of all responses have been received in a particular day from the Day_bucket dimension (measure)
    3. Interpolation equation: d = d1 + ((g - g1) * (d2 - d1) / (g2 - g1))
    g - the percentile you are looking for (in my case 0.9)
    g1 - MAX percentile < 0.90 (in my case 0.8)
    g2 - MIN percentile > 0.90 (in my case 1)
    d1 - Day_bucket where the percentile is equal to g1 (in my case 3)
    d2 - Day_bucket where the percentile is equal to g2 (in my case 4)
    d - the value I'm looking for (in my case 3.5)
    Now what I need to do is get the result of the above equation into one field, and my problem is that I am not able to calculate g1, g2, d1, d2.
    Can anybody help me? Is it actually possible to make such a calculation in Business Objects?
    Edited by: gonosgon on Jul 20, 2010 2:46 PM
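    Not a Web Intelligence formula, but to make the four inputs concrete, here is how g1, g2, d1, d2 and the interpolated value could be derived in SQL, assuming a hypothetical table buckets(day_bucket, percentile) holding the cumulative percentile per bucket; it relies on percentile being non-decreasing in day_bucket, as in the sample above:
    -- returns 3.5 for the sample data (g1 = 0.8, d1 = 3, g2 = 1, d2 = 4)
    SELECT d1 + (0.9 - g1) * (d2 - d1) / (g2 - g1) AS d
    FROM  ( SELECT MAX(CASE WHEN percentile < 0.9 THEN percentile END) AS g1,
                   MAX(CASE WHEN percentile < 0.9 THEN day_bucket END) AS d1,
                   MIN(CASE WHEN percentile > 0.9 THEN percentile END) AS g2,
                   MIN(CASE WHEN percentile > 0.9 THEN day_bucket END) AS d2
            FROM   buckets );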

    Hi gonosgon.
    I do think this is possible in Web Intelligence. I think you would need to create some variables in order to do what you need. I am not 100% clear on what you need to do, but if you create some variables in your report, for instance, for g, g1, g2, d1, d2, etc. this may allow you to do what you need. These variables can then be built off other variables.
    If you have access to your universe, then you can simply create the objects you need with these pre-calculated.
    Hope that helps!

  • Percentage pie chart whose query return just one value

    Hi,
    I have a requirement to display a pie chart that shows a relative percentage based on a particular row's value.
    Here the SQL query returns just one value to generate the chart; however, APEX is creating its own seemingly random value to draw the rest of the chart and divide the region.
    Is there any way for APEX to take the single value that the query returns as a percentage and generate the chart by calculating the other value as (100 - queried value), splitting the region in the chart accordingly?
    Of course, one method is to store both the value that the query returns and 100 minus that value in a table, and then write the chart query to reference both these values, but I'd like to know if there is a way to go about it without storing this extra bit of information in a table.
    Thanks!

    You can create a hidden page item, populate it with the total value and the calculated percentage value, and use it in the dial chart query.
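    To expand on the alternative the question already mentions (computing the complement in the query itself), here is a minimal sketch of a chart series query that returns both slices; the table, column and page-item names are hypothetical, and the link/label/value column layout is the usual classic-chart series format:
    WITH q AS (
      -- the existing single-value query goes here
      SELECT completed_pct AS pct
      FROM   my_metrics
      WHERE  metric_id = :P1_METRIC_ID
    )
    SELECT NULL AS link, 'Completed' AS label, pct       AS value FROM q
    UNION ALL
    SELECT NULL AS link, 'Remaining' AS label, 100 - pct AS value FROM q;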

  • PERCENTILE_CONT with non constants

    Hi All,
    I'm currently learning PL/SQL and I'm having an issue with PERCENTILE_CONT where I want to pass a variable value instead of a constant. Can anyone shed any light on how I could coerce the variables into a constant yet still keep it dynamic or alternatively suggest any alternative functions that may perform the percentile calculations?
    My SQL is (very amateurish, improvements appreciated) below:
    SELECT
         TB1.FRONTEND_COUNTRY,ROUND(AVG(TB1.ER_RT),2) ER_RATE,ROUND(AVG(SFR_RT),2) SFR_RATE,ROUND(PERCENTILE_CONT(TB2.PERC1/100) WITHIN GROUP (ORDER BY TB1.SSV_SIV ASC)*30.5,0) SSV_SIV1, ROUND(PERCENTILE_CONT(TB2.PERC2/100) WITHIN GROUP (ORDER BY TB1.SSV_SIV ASC)*30.5,0) SSV_SIV2
    FROM     
         (SELECT PD.FRONTEND_COUNTRY, PD.PROJECT_CODE, PD.PROTOCOL, AVG(PD.PROJECT_MEAN_ERMTHS) ER_RT, AVG(PD.PROJECT_MEAN_SFR_PERCENT) SFR_RT, AVG(PD.COUNTRY_MEAN_SSU2) SSV_SIV
         FROM MYDB.TCTM_TEMP_USERACC_PROJDATA1 PD
         GROUP BY PD.FRONTEND_COUNTRY, PD.PROJECT_CODE, PD.PROTOCOL
         ORDER BY 1) TB1
         LEFT JOIN MYDB.TCTM_STRAT_NEWVAROFMIKE TB2 ON TB1.FRONTEND_COUNTRY=TB2.COUNTRY_NAME
    GROUP BY
         TB1.FRONTEND_COUNTRY
    ORDER BY 1
    Edited by: 955750 on 29-Aug-2012 05:53
    Edited by: 955750 on 29-Aug-2012 07:12

    Hi,
    Welcome to the forum!
    Whenever you have a problem, please post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) from all tables involved.
    Also post the results you want from that data, and an explanation of how you get those results from that data, with specific examples.
    It's good that you posted your code; always do that.
    Always say which version of Oracle you're using.
    See the forum FAQ {message:id=9360002}
    955750 wrote:
    Hi All,
    I'm currently learning PL/SQL
    Do you mean SQL?
    and I'm having an issue with PERCENTILE_CONT where I want to pass a variable value instead of a constant. Can anyone shed any light on how I could coerce the variables into a constant yet still keep it dynamic or alternatively suggest any alternative functions that may perform the percentile calculations?
    You can pass variables to PERCENTILE_CONT; the question is which values of the variables?
    >
    My SQL is (very amateurish, improvements appreciated) below:
    Your SQL isn't bad.
    It's hard to read if it's not formatted. When posting on this site, type these 6 characters: {code} (lower-case only, inside curly brackets) before and after sections of formatted text to preserve spacing.
    SELECT
         TB1.FRONTEND_COUNTRY,ROUND(AVG(TB1.ER_RT),2) ER_RATE,ROUND(AVG(SFR_RT),2) SFR_RATE,ROUND(PERCENTILE_CONT(TB2.PERC1/100) WITHIN GROUP (ORDER BY TB1.SSV_SIV ASC)*30.5,0) SSV_SIV1, ROUND(PERCENTILE_CONT(TB2.PERC2/100) WITHIN GROUP (ORDER BY TB1.SSV_SIV ASC)*30.5,0) SSV_SIV2
    FROM     
         (SELECT PD.FRONTEND_COUNTRY, PD.PROJECT_CODE, PD.PROTOCOL, AVG(PD.PROJECT_MEAN_ERMTHS) ER_RT, AVG(PD.PROJECT_MEAN_SFR_PERCENT) SFR_RT, AVG(PD.COUNTRY_MEAN_SSU2) SSV_SIV
         FROM MYDB.TCTM_TEMP_USERACC_PROJDATA1 PD
         GROUP BY PD.FRONTEND_COUNTRY, PD.PROJECT_CODE, PD.PROTOCOL
          ORDER BY 1) TB1
    There's no reason to use an ORDER BY clause in this sub-query.
         LEFT JOIN MYDB.TCTM_STRAT_NEWVAROFMIKE TB2 ON TB1.FRONTEND_COUNTRY=TB2.COUNTRY_NAME
    GROUP BY
         TB1.FRONTEND_COUNTRY
    ORDER BY 1
    Edited by: 955750 on 29-Aug-2012 05:53
    This problem concerns GROUP BY more than PERCENTILE_CONT.
    In the main query, you're GROUPing BY frontend_country, so there will only be 1 row of output for each distinct value of frontend_country.  Let's say, for a particular frontend_country, there are 3 rows, and they have perc1 values of 50, 50 and 80.  When you compute PERCENTILE_CONT(TB2.PERC1/100) WITHIN GROUP (ORDER BY TB1.SSV_SIV ASC)
    for that row, which value of perc1 should be used?  50? 80? Something in between?  And why should it use that value? For example, if you want it to use 50, is that because 50 is the lowest value? The most common value?  Some other reason?
    If you know that each distinct value of frontend_country will only have 1 distinct value of perc1 (and perc2) in the main query, then you can add perc1 (and perc2) to the main query's GROUP BY clause.
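    To make that suggestion concrete, an untested sketch of the same query with perc1 and perc2 added to the GROUP BY clause (same table and column names as above):
    SELECT TB1.FRONTEND_COUNTRY,
           ROUND(AVG(TB1.ER_RT), 2)  ER_RATE,
           ROUND(AVG(TB1.SFR_RT), 2) SFR_RATE,
           ROUND(PERCENTILE_CONT(TB2.PERC1/100)
                   WITHIN GROUP (ORDER BY TB1.SSV_SIV) * 30.5, 0) SSV_SIV1,
           ROUND(PERCENTILE_CONT(TB2.PERC2/100)
                   WITHIN GROUP (ORDER BY TB1.SSV_SIV) * 30.5, 0) SSV_SIV2
    FROM  (SELECT PD.FRONTEND_COUNTRY, PD.PROJECT_CODE, PD.PROTOCOL,
                  AVG(PD.PROJECT_MEAN_ERMTHS)      ER_RT,
                  AVG(PD.PROJECT_MEAN_SFR_PERCENT) SFR_RT,
                  AVG(PD.COUNTRY_MEAN_SSU2)        SSV_SIV
           FROM   MYDB.TCTM_TEMP_USERACC_PROJDATA1 PD
           GROUP  BY PD.FRONTEND_COUNTRY, PD.PROJECT_CODE, PD.PROTOCOL) TB1
          LEFT JOIN MYDB.TCTM_STRAT_NEWVAROFMIKE TB2
                 ON TB1.FRONTEND_COUNTRY = TB2.COUNTRY_NAME
    GROUP BY TB1.FRONTEND_COUNTRY, TB2.PERC1, TB2.PERC2  -- PERC1/PERC2 added here
    ORDER BY 1;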
    Whatever you want to do, there's probably some way to do it.  However, it's unclear what you want to do.  That's why it's so important to post a little sample data and the results you want from that data.

  • MDX Query using BottomCount to limit Median calculation returns null

    I'm building a new cube that includes some person age values that aren't useful when summed, but are
    useful when the median is determined. The measure group that contains the measure for the age has a 1-1 relationship with a dimension in the cube because both use the same table as the source. This is important because I use the key attribute of the dimension
    as the set expression in the Median function to prevent any summation before the median is found. Here is the code for the original median calculation:
    MEMBER Measures.[Median Age] AS
    MEDIAN(
      [Placement Dimension].[Id Removal Episode Fact].Members,
      [Measures].[Age At Removal Months]
    )
    This median naturally represents the half-way point in the series of values. My analysts have also requested
    median-type values at the quarter and three-quarter points in the same series. I've been able to accomplish this for the three-quarter point by nesting the TopCount function in the set expression of the Median function to limit the set to the last half of
    the records and then find the median point like this:
    MEMBER Measures.[75th Percentile] AS
    MEDIAN(
      TOPCOUNT(
        [Placement Dimension].[Id Removal Episode Fact].MEMBERS
        ,Measures.[Episode Count] / 2
        ,Measures.[Age At Removal Months]
      )
      ,Measures.[Age At Removal Months]
    )
    However, my attempt to use the BottomCount function in the same way as TopCount to find the quarter point
    in the whole data set by limiting the calculation's set to the first half of the data always returns null. Here is how I've formed the code:
    MEMBER Measures.[25th Percentile] AS
    MEDIAN(
      BOTTOMCOUNT(
        [Placement Dimension].[Id Removal Episode Fact].MEMBERS
        ,Measures.[Episode Count] / 2
        ,Measures.[Age At Removal Months]
      )
      ,Measures.[Age At Removal Months]
    )
    And here is the query that returns the values:
    SELECT
    {
      Measures.[Episode Count]
      ,Measures.[Median Age]
      ,Measures.[25th Percentile]
      ,Measures.[75th Percentile]
    } ON 0
    ,[Date Begin].[Calendar Hierarchy].Year.&[2011]:[Date Begin].[Calendar Hierarchy].Year.&[2014] ON 1
    FROM [POC Cube]
    WHERE
    [Age at Removal Mos].[Age in Years List].[Age Year].&[0]:[Age at Removal Mos].[Age in Years List].[Age Year].&[5]
    I don't know why the end result is always null. I don't have any null values in the data for this measure, and I know what values I should be seeing because I've found the median records manually in results from a SQL Server query. I've tried using TopCount
    and multiplying Measures.[Age At Removal Months] in the TopCount function by -1 to work around the descending sort, but I still get nulls. I've also tried separating these queries out so the quarter-point and three-quarter-point calculations aren't run together,
    but I still get nulls for the quarter point calculation.
    I'm open to any help fixing this situation by modifying my current code or by using alternate methods, but the end result has to be dynamic enough to be used as a calculation in the cube. Thanks!

    These links might help:
    http://technet.microsoft.com/en-us/library/ms144864.aspx
    http://www.mssqltips.com/sqlservertip/3034/sql-server-analysis-services-ssas-2012-top-and-bottom-functions/
    http://www.sqlservercentral.com/blogs/bradleyschacht/2012/03/12/mdx-functions-bottomcount/

  • Percentile & percentage function in SQL or PL/SQL ?

    Hi all,
    I was wondering if there are any percentile or percentage functions in SQL or PL/SQL to calculate with?
    I came across percentile_rank, but I didn't find a really good example online, so I wasn't sure if that's the right function for calculating a percentile. Any example would be great!
    Thank you so much!!!

    Thank you so much, Frank and Tubby!! It's just amazing to see so many powerful aggregate and analytic functions available in 11g.
    Here is the example data i was working on
    with x as (
    select '100' value_amt, '6' cnt  from dual union all
    select '200' value_amt, '5' cnt  from dual union all
    select '500' value_amt, '10' cnt  from dual union all
    select '700' value_amt, '12' cnt  from dual union all
    select '900' value_amt, '14' cnt  from dual )
    select * from x;
    I'm trying to calculate the column called "avg_value" by dividing value_amt / cnt. I was just wondering if I need to use any function to calculate it.
    Here is what the column "avg_value" looks like:
    with y as (
    select '100' value_amt, '6' cnt, '16' avg_value  from dual union all
    select '200' value_amt, '5' cnt,  '40' avg_value from dual union all
    select '500' value_amt, '10' cnt, '50' avg_value from dual union all
    select '700' value_amt, '12' cnt,  '58.33' avg_value from dual union all
    select '900' value_amt, '14' cnt, '64.28' avg_value   from dual )
    select * from y;
    Thank you so much!! I really appreciate it!!
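    No special function is needed for that column; plain division works. A small sketch on the sample data above, converting the string literals to numbers explicitly and rounding to two decimals (adjust the rounding or truncation to taste):
    WITH x AS (
      SELECT '100' value_amt, '6'  cnt FROM dual UNION ALL
      SELECT '200' value_amt, '5'  cnt FROM dual UNION ALL
      SELECT '500' value_amt, '10' cnt FROM dual UNION ALL
      SELECT '700' value_amt, '12' cnt FROM dual UNION ALL
      SELECT '900' value_amt, '14' cnt FROM dual
    )
    SELECT value_amt, cnt,
           ROUND(TO_NUMBER(value_amt) / TO_NUMBER(cnt), 2) AS avg_value
    FROM   x;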

  • Finding the 65-th percentile

    Dear Gurus,
    I understand that there are functions in BEx for calculating percentages in BI Reports.
    There is a requirement from the client for calculating the 65th Percentile (not Percentage) of a key figure based on a specific characteristic for a particular year. The report consists of only a single row. The value of the key figure pertaining to the 65th percentile is to be displayed under that key figure column.
    Your help and suggestions will be much appreciated. Thanks in advance.
    Regards,
    Arun.

    Dear Roberto, Thanks for the reply.
    The key figures have to be ranked based on their individual values. Then, say there are 100 values for that specific key figure, the 65th percentile refers to the value of the key figure corresponding to rank 65.
    The report consists of three columns and one row.
    The query churns out 1000 records based on the following filter conditions and finds the Keyfigure X and Y for each record.
    1. 0CalYear
    2. Characteristic A, B and C
    The report consists of 0Calyear, Characteristic B and Keyfigures X and Y. There will be only one row. Under the Keyfigure X, its corresponding 65th Percentile has to be displayed.
    Kindly advise me on how to proceed further.
    Regards,
    Arun.

  • How I can calculate statistical percentiles using LabVIEW?

    Hello,
    I was wondering if somebody could help me or send me a VI for the calculation of statistical percentiles.
    Many Thanks,
    Bijan

    What is it exactly that you are trying to do? Knowing this will help us determine whether we have (or can make) what you need.
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • Tax Calculation of free goods

    Hi
    We have a requirement where the client wants only VAT to be posted to GL accounts and not the price. For this I tried the free goods condition type R100 with AltCBV 28, but the VAT value is also coming out as zero. I want the VAT to be calculated on the pricing condition and then give a 100% discount.
    Apart from this, I also tried to make the pricing condition statistical and calculate VAT on it, but then also the system gives a zero value for VAT.
    Also wondering if the scenario can be addressed with a Free of Charge Delivery.

    Hi
    Normally in scenarios where you are dealing with samples, there are cases where you have to charge only the TAXES / EXCISE part.
    Here my SUGGESTION is that you can handle this through a simple sales order process, where you have to configure a new pricing procedure.
    1. Create a new document type to handle samples.
    2. Create a new pricing procedure where you keep the basic price condition as statistical and the rest the same as in your main pricing procedure. Just juggle with the TO-FROM to capture the TAX part in the total.
    3. Doc pricing procedure will help you to pick the new Pricing procedure
    4. Now Process this as a normal sales order
    5. This new document type will help you to have proper reporting.
    The only constraint here is that the USER has to punch in a fresh order separately for such a scenario. He cannot give a free good in the same sales order which is used as standard.
    The other way is to configure a FREE GOOD determination procedure as suggested, where you then need to have your single pricing procedure accommodating the same scenario.
    Thanks
    RB

  • IF statement in Calculated Field for SharePoint doesn't calculate sum in my Excel Pivot table.

    Hi Everyone
    I used this in SP calculated column field.
    =IF([Shift Sched]="1pm to 10pm","0",IF([Shift Sched]="2pm to 11pm","1",IF([Shift Sched]="3pm to 12am","2",IF([Shift Sched]="4pm to 1am","3",IF([Shift Sched]="5pm to 2am","4",IF([Shift
    Sched]="6pm to 3am","5",IF([Shift Sched]="7pm to 4am","6",IF([Shift Sched]="8pm to 5am","7",IF([Shift Sched]="9pm to 6am","8",IF([Shift Sched]="10pm to 7am","8",IF([Shift
    Sched]="11pm to 8am","7",IF([Shift Sched]="12pm to 9am","6",IF([Shift Sched]="1am to 10am","5",IF([Shift Sched]="2am to 11am","4",IF([Shift Sched]="3am to 12pm","3",IF([Shift
    Sched]="4am to 1pm","2",IF([Shift Sched]="5am to 2pm","1",IF([Shift Sched]="6am to 3pm","0",IF([Shift Sched]="7am to 4pm","0",IF([Shift Sched]="8am to 5pm","0",IF([Shift
    Sched]="9am to 6pm","0",IF([Shift Sched]="10am to 7pm","0",IF([Shift Sched]="11am to 8pm","0",IF([Shift Sched]="12pm to 9pm","0"))))))))))))))))))))))))    
    It was able to work fine; however, my issue is that when I extract the information to Excel and use a pivot table, the table is not able to calculate the sum of the values for this field. Can you please help me with this? This is for an attendance tracker for Night Differential pay for employees. They create a daily log of their shift schedule, and if I summarize this in a pivot the value in this calculated field is not getting summed.
    Thanks,
    Norman

  • If statement in Custom Calculation Script

    I have 16 fields, and if even one of them = "1" I have to list it in another field.  I do not want to count or sum.  If one of those fields has a 1 in it, I just want the other field to display Y, and if none have a 1, I want that field to display N.
    Please help.
    Thank you in advance~mjc

    You need to write a compound logical statement to evaluate all of the values and that statement needs to evaluate to true or false.
    Do you know how to write JavaScript?
    Do you know how to enter JavaScript calculations into a form field?
    For custom calculation of the text field I could write something like:
    // define an array of the field names to check
    var aNames = new Array("Text1", "Text2", "Text3", "Text4",
    "Text5", "Text6", "Text7", "Text8",
    "Text9", "Text10", "Text11", "Text12",
    "Text13", "Text14", "Text15", "Text16");
    // define a logical variable that is true if any field has a value of 1 - default is false or no field has a value of 1
    var bMatch = false;
    // value for text field
    var TextValue = 0;
    // logical value of field being equal to 1 test
    var FieldIs1 = false;
    // loop through all the fields and test the fields value
    for(i = 0; i < aNames.length; i++) {
    // logically OR the result of field i value equal to true with bMatch
    // get the value of field
    TextValue = this.getField( aNames[i] ).value;
    // test the value of the field
    FieldIs1 = Number(TextValue) == 1;
    // logically OR the 2 values
    bMatch = FieldIs1 | bMatch;
    } // end field processing
    // set the field value
    if(bMatch == true) {
    event.value = "Y";
    } else {
    event.value = "N";
    }
    You will need to change the field names to match your fields. You can add more field names or remove field names as needed, and the script will adjust for the number of field names.
