Barcode counting over groups

Hi all,
I am working on a barcode for my invoice printing. The barcode counts over the invoices from 1 to 16 and then starts over again. For example, if I have a set of 6 invoices of 3 pages each, the barcode numbering will be as follows:
invoice 1 page 1: 1
invoice 1 page 2: 2
invoice 1 page 3: 3
invoice 2 page 1: 4
invoice 2 page 2: 5
...
invoice 5 page 3: 15
invoice 6 page 1: 16
invoice 6 page 2: 1
invoice 6 page 3: 2
I have tried to solve this with some XSL-FO programming, but I can't get the page number or the total pages per invoice into a variable, so I can't determine how many pages have been printed for previous invoices and which number to print next in the barcode.
Hope you can give me some advice.

Similar Messages

  • Counting Over Last 3500 appearances with Where Clause

    I'm Using Sql Server Studio 2014
    I have a Table containing the following columns:
    AutoId  Assembly_No  [Rank]   
    1          Assembly1       2
    2          Assembly2       1
    3          Assembly1       2
    4          Assembly1       1
    5          Assembly1       0
    6          Assembly2       2
    7          Assembly2       1
    I'm trying to run a query that will count, over the last 3500 times that a specific Assembly_No has been run, the rows that have a rank > 0. For simplicity's sake we can look at the last 2 times that an assembly has been run. So the results that I'm expecting
    should look like this
    Assembly_NO   Count
    Assembly2        2
    Assembly1        1
    This results in Assembly2 being counted twice (for ids 6 and 7) and Assembly1 only once (for ids 4 and 5, where only one rank was > 0). AutoID is an identity column, so the most recent rows are determined using this number.
    The query below counts over all of the assemblies run; however, I'm having trouble counting over only the last 2 for each assembly:
    Select Assembly_No, Count(*) as Count
    From TblBuild
    Where Rank > 0
    Group By Assembly_No
    Returns the following 
    Assembly_no  Count
    Assembly2      3
    Assembly1      3

    Looks like this should return what you are expecting:
    --drop table #temp
    create table #temp (
    autoid int,
    assembly_no nvarchar(10),
    rank_no int
    )
    insert into #temp (autoid, assembly_no, rank_no) values (1, 'Assembly1', 2)
    insert into #temp (autoid, assembly_no, rank_no) values (2, 'Assembly2', 1)
    insert into #temp (autoid, assembly_no, rank_no) values (3, 'Assembly1', 2)
    insert into #temp (autoid, assembly_no, rank_no) values (4, 'Assembly1', 1)
    insert into #temp (autoid, assembly_no, rank_no) values (5, 'Assembly1', 0)
    insert into #temp (autoid, assembly_no, rank_no) values (6, 'Assembly2', 2)
    insert into #temp (autoid, assembly_no, rank_no) values (7, 'Assembly2', 1)
    select t.assembly_no, count(nullif(t.rank_no, 0))
    from #temp t
    inner join (
    select autoid, rank_no, row_number() over (partition by assembly_no order by autoid desc) as ct
    from #temp
    ) x
    on x.autoid = t.autoid
    and x.ct <= 2
    group by t.assembly_no
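    An equivalent way to write it, shown here only as a sketch against the table from the question (TblBuild with AutoID, Assembly_No and [Rank]), is to rank the rows once in a CTE and then take a conditional count:
    ;with ranked as (
        select Assembly_No, [Rank],
               row_number() over (partition by Assembly_No order by AutoID desc) as rn
        from TblBuild
    )
    select Assembly_No, count(case when [Rank] > 0 then 1 end) as [Count]
    from ranked
    where rn <= 2   -- use 3500 for the real requirement
    group by Assembly_No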

  • Count(field) group by having changed to an analytic function

    Goal: Be able to show columns with data not in the group by clause.
    I couldn't find this on the 'net either by looking here or by googling for it (unless I'm looking for the wrong thing). I'm doing a regular:
    select fielda, count(fielda)
    from some_table
    group by fielda
    having count(fielda) > 1;
    to find out where I have duplicates in a table (fielda). I get fielda and the count of those records back (ie X, 2). However, I want to add several more fields that may or may not be different for each of those records.
    For example:
    1 record has fielda = X, field2 = Test, field3 = Oracle, field4 = Unix
    1 record has fielda = X, field2 = Testa, field3 = Sybase, field4 = Windows.
    I want to be able to have the following output:
    X, 2 (count of records with fieldA = X), Test, Oracle, Unix
    X, 2 (count of records with fieldA = X), Testa, Sybase, Windows
    I'm not sure if this is one of those cases where I have to do a subquery or not. Any ideas?
    Thanks as always!
    Vic

    Sorry, I misread your statement - it is poorly formatted. Besides the incorrect use of aliases, you are referencing columns that are not in the inline view, so you must include them there:
    select  nsn,
            item_type_name,
            item_type_id,
            date_time_added,
            function_name
      from  (
             select  it.nsn,
                     it.item_type_name,
                     it.item_type_id,
                     it.date_time_added,
                     fn.function_name,
                     count(*) over (partition by it.nsn) cnt,
                     it.item_type_cat_id,
                     ic.item_category_id,
                     ic.function_id ic_function_id,
                     fn.function_id fn_function_id
               from  item_types it,
                     item_categories ic, functions fn
               where it.nsn is not null
                 and it.item_type_cat_id = ic.item_category_id
                 and ic.function_id = fn.function_id
            )
      where cnt > 1
        and nsn is not null
        and item_type_cat_id = item_category_id
        and ic_function_id = fn_function_id
    order by nsn;
    SY.
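    The same analytic count works for the generic case in the original question; this is only a sketch, since fielda, field2 etc. are the poster's illustrative names:
    select fielda, cnt, field2, field3, field4
      from  (
             select t.*,
                    count(*) over (partition by fielda) cnt
               from some_table t
            )
     where cnt > 1
     order by fielda;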

  • Guidance on use of "COUNT(*) OVER () * 5" in a select query.

    Hello Friends,
    I was reading an article in which a table was created for a demo. Following are the statements from there.
    CREATE TABLE source_table
    NOLOGGING
    AS
    SELECT ROWNUM AS object_id
    , object_name
    , object_type
    FROM all_objects;
    INSERT /*+ APPEND */ INTO source_table
    SELECT ROWNUM + (COUNT(*) OVER () * 5) AS object_id
    , LOWER(object_name) AS object_name
    , SUBSTR(object_type,1,1) AS object_type
    FROM all_objects;
    INSERT /*+ APPEND */ INTO source_table
    SELECT ROWNUM + (COUNT(*) OVER () * 10) AS object_id
    , INITCAP(object_name) AS object_name
    , SUBSTR(object_type,-1) AS object_type
    FROM all_objects;
    Can anyone please tell me the purpose of "ROWNUM + (COUNT(*) OVER () * 5)" in the above 2 insert statements, or suggest some documentation on it?
    I don't know about its usage, and want to learn it.
    Regards,
    Dipali..

    The insert statements that you have listed are using Oracle analytic functions. Some examples of these functions can be found here: http://www.psoug.org/reference/analytic_functions.html (Oracle Analytic Functions)
    Effectively what that says is the following:
    1. "COUNT(*) OVER ()" = return the number of rows in the entire result set
    2. Multiply that by 5 (or 10 depending on the insert)
    3. Add the current ROWNUM value to it.
    This can be shown with a simple example:
    SQL> edit
    Wrote file sqlplus_buffer.sql
      1  SELECT *
      2  FROM
      3  (
      4     SELECT ROWNUM r,
      5             (COUNT(*) OVER ()) AS ANALYTIC_COUNT,
      6             5,
      7             ROWNUM + (COUNT(*) OVER () * 5) AS RESULT
      8     FROM all_objects
      9  )
    10* WHERE r <= 10
    SQL> /
             R ANALYTIC_COUNT          5     RESULT
             1          14795          5      73976
             2          14795          5      73977
             3          14795          5      73978
             4          14795          5      73979
             5          14795          5      73980
             6          14795          5      73981
             7          14795          5      73982
             8          14795          5      73983
             9          14795          5      73984
            10          14795          5      73985
    10 rows selected.
    SQL> SELECT COUNT(*) from all_objects;
      COUNT(*)
         14795
    Hope this helps!
    Note that the statements you posted would not actually execute because of the extra "+" signs (forum markup) on either side of the expression; I have removed them above. The net effect of ROWNUM + (COUNT(*) OVER () * 5) is presumably to give the inserted copies object_id values in a range above the IDs already in the table.

  • Count(*) with group by max(date)

    SQL> select xdesc,xcust,xdate from coba1 order by xdesc,xcust,xdate;
    XDESC XCUST XDATE
    RUB-A 11026 01-JAN-06
    RUB-A 11026 05-JAN-06
    RUB-A 11026 08-JAN-06
    RUB-A 11027 10-JAN-06
    RUB-B 11026 02-JAN-06
    RUB-B 11026 08-JAN-06
    RUB-B 11026 09-JAN-06
    RUB-C 11027 08-JAN-06
    I want to write SQL that gives this result:
    XDESC     COUNT(*)
    RUB-A     2
    RUB-B     1
    RUB-C     1
    Criteria: group by XDESC and XCUST and take MAX(XDATE); the rows marked *** below are the ones selected in the count.
    XDESC XCUST XDATE
    RUB-A 11026 01-JAN-06
    RUB-A 11026 05-JAN-06
    RUB-A 11026 08-JAN-06 ***
    RUB-A 11027 10-JAN-06 ***
    ---------------------------------------------------------COUNT RUB-A = 2
    RUB-B 11026 02-JAN-06
    RUB-B 11026 08-JAN-06
    RUB-B 11026 09-JAN-06 ***
    ---------------------------------------------------------COUNT RUB-B = 1
    RUB-C 11027 08-JAN-06 ***
    --------------------------------------------------------COUNT RUB-C = 1
    Can Anybody help ?
    I tried :
    select xdesc,max(xdate),count(max(xdate)) from coba1 group by xdesc
    ERROR at line 1:
    ORA-00937: not a single-group group function
    Thanks

    This one is a duplicate. See the following thread:
    Count(*) with group by max(date)
    Thanks
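    For what it's worth, here is a sketch that produces the expected output from the data shown (assuming the table really is coba1 with columns xdesc, xcust, xdate): take the latest xdate per (xdesc, xcust) pair, then count those rows per xdesc:
    select xdesc, count(*) as cnt
      from  (
             select xdesc, xcust, max(xdate) as last_xdate
               from coba1
              group by xdesc, xcust
            )
     group by xdesc
     order by xdesc;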

  • IR Problem with COUNT (*) OVER () AS apxws_row_cnt

    Hi all,
    I have a query which runs in under 2 seconds when executed in SQL (Toad),
    but when I use this query in an interactive report it takes minutes.
    When I reviewed the query which APEX is sending to the DB I noticed an additional select clause:
    SELECT columns,
    COUNT ( * ) OVER () AS apxws_row_cnt
    FROM (
    SELECT *
    FROM ( query
    ) r
    ) r
    WHERE ROWNUM <= TO_NUMBER (:APXWS_MAX_ROW_CNT)
    When I remove the COUNT ( * ) OVER () AS apxws_row_cnt, the query is fast again.
    How can I change the IR so that the COUNT ( * ) OVER () AS apxws_row_cnt doesn't appear anymore?
    I removed the pagination (- no pagination selected -).
    I put the query in a new IR report on a new page, and the COUNT ( * ) OVER () AS apxws_row_cnt still appears.
    Any suggestions I can try?
    Regards,
    Marco

    Marco1975 wrote:
    "I have a query which runs in under 2 seconds when executed in SQL (Toad)."
    I doubt that. I think your query returns many rows. Did you see all the rows in Toad in 2 seconds?
    If the answer is no, then the query was not finished after 2 seconds; it had just shown you the first few rows.
    Almost every tool, including APEX, does the same.
    However, if you want to know how many rows are returned, there is no way around running the whole select to the last row; only then can the result be shown.
    APEX (or a developer) might use something like the analytic function "count(*) over ()" to get this total already on the first row. The database still needs to fetch all the rows, but the result set does not have to be transported over the network to the client, which can save a lot of time compared to doing the counting outside the database.

  • How to get count of group of current login user if AD Group is added in SharePoint Group?

    My client has 2 SharePoint applications. For the AD users they have created an AD group and added users to that AD group as required. Later the AD group was added to a SharePoint group. When I try to fetch the current user's group count, I can get the
    count of groups using the statement below.
    int groupCount = SPContext.Current.Web.CurrentUser.Groups.Count;
    The statement above always returns 0 for users who were added via the AD group; if I add the AD user directly instead, it returns the exact count.
    Please suggest a solution to get the group count for the current user. My application contains more than 60 SharePoint groups.

    Hello,
    I believe your code doesn't count those AD group users until they log in at least once. If this is the case, then try using "SPUtility.GetPrincipalsInGroup" as suggested in the post below:
    http://stackoverflow.com/questions/4314767/getting-members-of-an-ad-domain-group-using-sharepoint-api
    Hemendra:Yesterday is just a memory,Tomorrow we may never see
    Please remember to mark the replies as answers if they help and unmark them if they provide no help

  • Reg: counter over reading .

    What is "CntrOverReadg" in IK01 while creating a measurement point counter?
    Can anyone tell me the usage of this? What data should I enter in this CntrOverReadg field?

    It's the counter overflow reading. It describes the maximum possible reading that the physical measurement point can measure.
    For example: a car odometer can measure up to 99999 km of distance covered. Hence, after the odometer reaches 99999 km it starts again from 0 km, but the total distance covered by the car is 99999 plus the reading after the counter overflow.
    Let's say you have provided a counter overflow reading of 99 for a measurement point whose current counter reading is 10. If you then create a measurement document with a reading less than 10 (say 4), the system will treat it as a counter overflow and will show the total reading as 103 (i.e. 99 + 4) and the counter reading as 4.
    Hope it helps!
    Regards,
    Saif Ali Momin

  • Hu is related to undershipment and count over ?

    Hi everyone ,
    Can anyone tell me how HU is related to undershipment and count over?

    Did you try a reset? Hold down on the sleep and home buttons at the same time until the Apple logo appears on the screen, then let go of the buttons.

  • Counter over flow reading

    What is the use of counter overflow reading and annual estimate in counters? Is there any report or application where it helps us in tracking?
    Regards,
    VM
    Edited by: V M on Jun 1, 2008 9:35 AM

    Hi,
    Counter overflow reading is used where your counter reading overflows.
    For example, your milometer will only show 9999 miles; once this value is reached, overflow occurs, i.e. the counter starts counting upwards from 0000 again.
    Annual estimate is used for scheduling purposes, for performance-based scheduling and for multiple counter plans.
    Multiple counter plan:
    The system uses the current date as the start date and automatically calculates the planned dates based on the maintenance cycles, the scheduling parameters, the estimated annual performance and the last counter readings.
    Performance based:
    The system automatically calculates the planned date and call date based on the maintenance packages, the scheduling parameters, the estimated annual performance and the counter reading at the start of the cycle.
    Regards,
    Thyagarajan

  • Need help using count over function

    I have the following query
    Select student_id, OM, TM, TP, count(rownum) over (order by OM desc) PS from
    (select
    er.student_id, sum(er.obtained_marks) OM, sum(ds.max_marks) TM,
    to_char(sum(er.obtained_marks)/sum(ds.max_marks)*100,'990.00') TP
    from
    tbl_exam_results er, tbl_date_sheet ds
    where
    ds.date_sheet_id = er.date_sheet_id and ds.class_id = 77 and ds.exam_id = 3 and ds.session_id = 1 group by er.student_id)
    results in
    STUDENT_ID  OM  TM  TP     PS
    1825        291 300 97.00  1
    3717        290 300 96.67  2
    2122        289 300 96.33  3
    3396        287 300 95.67  5 <--
    4554        287 300 95.67  5 <--
    1847        281 300 93.67  6
    1789        279 300 93.00  7
    5254        277 300 92.33  8
    1836        258 300 86.00  9
    4867        250 260 96.15  10
    1786        249 300 83.00  11
    4659        245 300 81.67  12
    1835        241 300 80.33  15 <--
    1172        241 270 89.26  15 <--
    3696        241 300 80.33  15 <--
    3865        234 300 78.00  16
    5912        215 300 71.67  17
    5913        204 300 68.00  19 <--
    3591        204 300 68.00  19 <--
    1830        184 250 73.60  20
    But I want it as follows:
    STUDENT_ID  OM  TM  TP     PS
    1825        291 300 97.00  1
    3717        290 300 96.67  2
    2122        289 300 96.33  3
    3396        287 300 95.67  4 <=
    4554        287 300 95.67  4 <=
    1847        281 300 93.67  5  (the following entry)
    1789        279 300 93.00  6
    5254        277 300 92.33  7
    1836        258 300 86.00  8
    4867        250 260 96.15  9
    1786        249 300 83.00  10
    4659        245 300 81.67  11
    1835        241 300 80.33  12 <=
    1172        241 270 89.26  12 <=
    3696        241 300 80.33  12 <=
    3865        234 300 78.00  13  (the following entry)
    5912        215 300 71.67  14
    5913        204 300 68.00  15 <=
    3591        204 300 68.00  15 <=
    1830        184 250 73.60  16  (the following entry)
    Thanks in advance for any help
    Edited by: sabir786 on Jan 14, 2009 4:13 AM
    Edited by: sabir786 on Jan 14, 2009 4:17 AM

    Since I do not understand at all what you are trying to do, I cannot correct your query, but I can explain the results.
    The analytic function is doing a running count of the number of records that have been output so far. With no duplicates, this is somewhat clearer.
    SQL> WITH t AS (SELECT 1 om FROM dual UNION ALL
      2             SELECT 2 FROM dual UNION ALL
      3             SELECT 3 FROM dual UNION ALL
      4             SELECT 4 FROM dual UNION ALL
      5             SELECT 5 FROM dual)
      6  SELECT om, COUNT(rownum) OVER (ORDER BY om) ps
      7  FROM t;
            OM         PS
             1          1
             2          2
             3          3
             4          4
             5          5
    However, when you have duplicates, both duplicate values get the running count from the last of the duplicates (i.e. the highest running count). Here, I have duplicated 4; see what I get:
    SQL> WITH t AS (SELECT 1 om FROM dual UNION ALL
      2             SELECT 2 FROM dual UNION ALL
      3             SELECT 3 FROM dual UNION ALL
      4             SELECT 4 FROM dual UNION ALL
      5             SELECT 4 FROM dual UNION ALL
      6             SELECT 5 FROM dual)
      7  SELECT om, COUNT(rownum) OVER (ORDER BY om) ps
      8  FROM t;
            OM         PS
             1          1
             2          2
             3          3
             4          5
             4          5
             5          6
    The "second" 4 record had a running count of 5 (i.e. it was the fifth record output), so both 4's get the same count. Changing the order by to descending shows the same effect; it just changes the running count:
    SQL> WITH t AS (SELECT 1 om FROM dual UNION ALL
      2             SELECT 2 FROM dual UNION ALL
      3             SELECT 3 FROM dual UNION ALL
      4             SELECT 4 FROM dual UNION ALL
      5             SELECT 4 FROM dual UNION ALL
      6             SELECT 5 FROM dual)
      7  SELECT om, COUNT(rownum) OVER (ORDER BY om DESC) ps
      8  FROM t;
            OM         PS
             5          1
             4          3
             4          3
             3          4
             2          5
             1          6
    John
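    Judging by the desired output, the numbering you are after (ties share a position and the next distinct value continues from it) is what DENSE_RANK produces, not a running count. A sketch, reusing the inline view from your own query (so all table and column names are yours):
    Select student_id, OM, TM, TP,
           dense_rank() over (order by OM desc) PS
    from
    (select er.student_id, sum(er.obtained_marks) OM, sum(ds.max_marks) TM,
            to_char(sum(er.obtained_marks)/sum(ds.max_marks)*100,'990.00') TP
     from tbl_exam_results er, tbl_date_sheet ds
     where ds.date_sheet_id = er.date_sheet_id and ds.class_id = 77
       and ds.exam_id = 3 and ds.session_id = 1
     group by er.student_id)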

  • Use of SUM and COUNT with GROUP BY

    All,
    I have a set like below:
    State DOB StartDt EndDt Desc Category MemberID SubID Code
    NC 2003-01-30 2014-01-01 2014-01-31 Child B 123456 2 38
    NC 2003-01-30 2014-01-01 2014-01-31 Child B 123456 2 39
    NC 2003-01-30 2014-02-01 2014-05-31 Child B 123456 2 38
    NC 2003-01-30 2014-02-01 2014-05-31 Child B 123456 2 39
    NC 2003-01-30 2014-06-01 2014-07-31 Child B 123456 2 38
    NC 2003-01-30 2014-06-01 2014-07-31 Child B 123456 2 39
    NC 2014-01-17 2014-01-01 2014-07-31 Infant S 456789 1 49
    NC 2014-02-04 2014-02-01 2014-07-31 Infant S 246376 3 49
    -- MemberID and SubID identify 1 member
    -- Member 123456 has 2 distinct "Code" i.e. 38,39 but each code has different "StartDt" and "EndDt"
    -- Expected Result
    State Desc Category CountOfDistinctCode TotalMonthsEnrolled
    NC Child B 2 (38 and 39 for same member) 7 months (1 + 4 + 2) (Difference between StartDt and EndDt:1 (1/31/2014 - 1/1/2014) + 4 (5/31/2014 - 2/1/2014).. )
    NC Infant S 2 (Same code 49 but different member) 13 months (7+6) (7/31/2014 - 1/1/2014 = 7 months + 7/31/2014 - 2/1/2014 = 6 months)
    I tried doing a count of distinct Code and then summing up the member months, grouped by State, Desc, Category, but I am not able to get 2 and 7 for Child; it somehow calculates the months as 14 (7+7). Please let me know what you guys suggest.

    OK, so we need a different approach. We need a table of numbers, a concept that I discuss here:
    http://www.sommarskog.se/arrays-in-sql-2005.html#numbersasconcept
    (Only read down to the next header.)
    We join the numbers to the temp table with BETWEEN over the month numbers. Then we count the distinct combinations of member ID and number.
    We also need to make an adjustment to how we build the string for the DISTINCT. I initially assumed that your columns were integer, which is why I used str(), which produces a right-adjusted fixed-length string of 10 characters. (That is the default; you can specify a different width as the second parameter.) But str() expects a float value as input, and will fail if, for instance, the member id is non-numeric. We cannot just concatenate the strings, since in that case (MemberID, SubID) = ('12345', '61') would be
    the same as ('123456', '1'). So we must convert to char to get a fixed length. All that said, and with some more test data added, here is a solution:
    CREATE TABLE #Test (
    [State] CHAR(2),
    DOB DATE,
    StartDt DATE,
    EndDt DATE,
    [Desc] VARCHAR(8),
    Category CHAR(1),
    MemberID VARCHAR(10),
    SubID VARCHAR(2),
    Code VARCHAR(5)
    )
    INSERT INTO #Test
    VALUES
    ('NC', '20130130', '20140101', '20140120', 'Child', 'B', '123456', '2', '38'),
    ('NC', '20130130', '20140121', '20140131', 'Child', 'B', '123456', '2', '39'),
    ('NC', '20130130', '20140201', '20140531', 'Child', 'B', '123456', '2', '38'),
    ('NC', '20130130', '20140401', '20140613', 'Child', 'B', '123456', '2', '39'),
    ('NC', '20130130', '20140601', '20140731', 'Child', 'B', '123456', '2', '38'),
    ('NC', '20130130', '20140614', '20140731', 'Child', 'B', '123456', '2', '39'),
    ('NC', '20130129', '20140601', '20140731', 'Child', 'B', '9123456', '1', '38'),
    ('NC', '20130129', '20140614', '20140831', 'Child', 'B', '9123456', '1', '39'),
    ('NC', '20140117', '20140101', '20140731', 'Infant', 'S', '456789', '1', '49'),
    ('NC', '20140204', '20140201', '20140731', 'Infant', 'S', '246376', '3', '49')
    SELECT * FROM #Test ORDER BY MemberID, StartDt, EndDt
    SELECT
    [State]
    , [Desc]
    , Category
    , COUNT(DISTINCT convert(char(10), MemberID) +
    convert(char(2), SubID) +
    convert(char(5), Code)) AS TotalCnt
    , COUNT(DISTINCT convert(char(10), MemberID) +
    convert(char(2), SubID) +
    str(N.Number)) AS CalMonths
    FROM #Test t
    JOIN Numbers N ON N.Number BETWEEN
    datediff(MONTH, '19000101', t.StartDt) AND
    datediff(MONTH, '19000101', t.EndDt)
    GROUP BY [State], [Desc], Category
    go
    DROP TABLE #Test
    Erland Sommarskog, SQL Server MVP, [email protected]
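    The query assumes a Numbers table already exists (the concept from the linked article). If you do not have one, a minimal throwaway version that is more than large enough for the month numbers used here can be built like this (the table name and the 3000-row size are just assumptions for the example):
    CREATE TABLE Numbers (Number int NOT NULL PRIMARY KEY)
    INSERT INTO Numbers (Number)
    SELECT TOP (3000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
    FROM   sys.all_objects a CROSS JOIN sys.all_objects b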

  • Count of group by resultset

    This may be very simple but I am unable to
    figure this out ...
    How do I get the row count of
    a SQL query which has a GROUP BY clause
    and group functions?
    For example,
    how do I get the number of rows fetched
    by the following query:
    select asse_no,
    SUM( round(gross_amt + srg + hbt )) as total_outs
    from pls_demand_1
    having SUM(round(gross_amt + srg + hbt )) >80000
    group by asse_no
    In SQL Plus, I see 38 rows fetched.
    BUT ...
    I will run this query in Oracle Reports.
    I need to get the count somehow ...
    Thanks in advance ...

    Try:
    select asse_no,
    SUM( round(gross_amt + srg + hbt )) as total_outs,
    count(asse_no)
    over (order by asse_no) the_record_number
    from pls_demand_1
    having SUM(round(gross_amt + srg + hbt )) >80000
    group by asse_no
    This does work ... but if you are running this in Oracle Reports, why not just put a summary function against the query?
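    If all you need is the total number of groups returned, the parameterless form of the analytic count also works; a sketch against the same query:
    select asse_no,
           sum(round(gross_amt + srg + hbt)) as total_outs,
           count(*) over () as total_rows
    from pls_demand_1
    group by asse_no
    having sum(round(gross_amt + srg + hbt)) > 80000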

  • Count Over Count

    Hi, I was looking for some help in understanding the general rule for using the count function. I am trying to build my knowledge of the different functions so that I have a better understanding and write my SQL queries in the right way.
    What I was looking to find out is how the count function works. As I understand it, I can apply the following in order to give me a count of a particular set of data.
    select number, count(*) AS no_count
    from test
    group by job_id;
    What I am trying to understand is: if for some reason I wanted to use the results of the "count(*) AS no_count" to give me another set of results, i.e. sum all values over 2 for example, would I write it like the following:
    select number, count(*) AS no_count,
    select count(no_count) having etc...
    from test
    group by job_id;
    Or is the general rule that I would have to write a select within a select?
    If somebody could please help, I would really appreciate it. By the way, if there is any documentation on using count, as well as on using it with other functions such as sum, that would be very useful as well.
    Thanks in advance.

    Solomon, thanks for your help. Apologies if I have not explained my question properly. The problem is that I haven't created the tables; I have written my sample data and am trying to work out solutions before attempting to create the tables etc., which is probably where I am going wrong.
    The job_ids can be repeated, first job_count should give me total job counts belonging to each job_id.
    For example, in your first dataset you have a job count for all jobs, i.e. manager has 3 records with 1 job_count each. So I would then like a column to give me a total count of job_count for each job, i.e. manager had 3 total jobs. I have tried to break down the dataset you have shown, along with the extras I am trying to add, to hopefully explain what I am looking for.
    JOB        JOB_COUNT  TOTAL_JOB_COUNT  OVER_1
    MANAGER    1          3                0
    PRESIDENT  1          1                1
    CLERK      1          4                0
    SALESMAN   4          4                0
    ANALYST    2          2                0
    MANAGER    1          3                0
    MANAGER    1          3                0
    CLERK      1          4                0
    CLERK      2          4                0
    So this tells me, from all jobs, which job was dealt with first time; in this case it would be the president, while the rest of the jobs were repeated.
    The total_job_count would be written like: select job, count(*) as TOTAL_JOB_COUNT
    but it's the over_1 column (or a sum maybe, not sure), based on the results within total_job_count, that I need in order to find the values that equal 1. Hence I thought I would have to write a count of a count, which is what I am not clear on.
    Sorry for the inconvenience; I really appreciate your help and time.
    Thanks
    Apologies, I am not sure how to write the result set; as added it appears all over the place.
    Edited by: 973436 on 17-Dec-2012 04:06
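    Since the real tables do not exist yet, here is only a sketch (using the standard emp demo table as a stand-in) of the two usual ways to build on a count: nest the grouped query in an inline view, or carry the per-group total on every row with an analytic count:
    -- "count of a count": how many jobs occur exactly once
    select count(*) as jobs_done_first_time
      from  (select job, count(*) as total_job_count
               from emp
              group by job)
     where total_job_count = 1;
    -- or carry the per-job total (and an over_1 flag) on every row
    select job,
           count(*) over (partition by job) as total_job_count,
           case when count(*) over (partition by job) = 1 then 1 else 0 end as over_1
      from emp;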

  • Count Over (Partition by) equivalent in MDX

    Good Afternoon ,
    I'm trying to replicate the COUNT in MDX. I'm trying to get average values. My query gives me the exact results in SQL Server. I'm trying to get the average work done per employee and the average revenue he/she generated. I have SUM(WorkDone)
    and SUM(Revenue). I want the count of days in a month based on the calendar I have as a lookup. For the month of February 2014 I am getting 23 days, but when I do the count in SSAS I am getting 400.
    Please see the sample data below for Secempid = 1368; at the end of the sample data, please see the code that I'm running to get the average values. The data is collected for 23 days in this case, for 16 hours each day.
    loc ampm workdone Secempid Date Rev
    1 10AM 0 1368 2/1/14 12:00 AM NULL
    1 11AM 0 1368 2/1/14 12:00 AM NULL
    1 12PM 0 1368 2/1/14 12:00 AM NULL
    1 1PM 0 1368 2/1/14 12:00 AM NULL
    1 2PM 0 1368 2/1/14 12:00 AM NULL
    1 3PM 0 1368 2/1/14 12:00 AM NULL
    1 4PM 0 1368 2/1/14 12:00 AM NULL
    1 5AM 0 1368 2/1/14 12:00 AM NULL
    1 5PM 0 1368 2/1/14 12:00 AM NULL
    1 6AM 0 1368 2/1/14 12:00 AM NULL
    1 6PM 0 1368 2/1/14 12:00 AM NULL
    1 7AM 0 1368 2/1/14 12:00 AM NULL
    1 7PM 0 1368 2/1/14 12:00 AM NULL
    1 8AM 0 1368 2/1/14 12:00 AM NULL
    1 8PM 0 1368 2/1/14 12:00 AM NULL
    1 9AM 0 1368 2/1/14 12:00 AM NULL
    1 10AM 0 1368 2/3/14 12:00 AM NULL
    1 11AM 0 1368 2/3/14 12:00 AM NULL
    1 12PM 0 1368 2/3/14 12:00 AM NULL
    1 1PM 0 1368 2/3/14 12:00 AM NULL
    1 2PM 0 1368 2/3/14 12:00 AM NULL
    1 3PM 0 1368 2/3/14 12:00 AM NULL
    1 4PM 0 1368 2/3/14 12:00 AM NULL
    1 5AM 0 1368 2/3/14 12:00 AM NULL
    1 5PM 0 1368 2/3/14 12:00 AM NULL
    1 6AM 0 1368 2/3/14 12:00 AM NULL
    1 6PM 0 1368 2/3/14 12:00 AM NULL
    1 7AM 0 1368 2/3/14 12:00 AM NULL
    1 7PM 0 1368 2/3/14 12:00 AM NULL
    1 8AM 0 1368 2/3/14 12:00 AM NULL
    1 8PM 0 1368 2/3/14 12:00 AM NULL
    1 9AM 0 1368 2/3/14 12:00 AM NULL
    1 10AM 0 1368 2/4/14 12:00 AM NULL
    1 11AM 0 1368 2/4/14 12:00 AM NULL
    1 12PM 0 1368 2/4/14 12:00 AM NULL
    1 1PM 0 1368 2/4/14 12:00 AM NULL
    1 2PM 0 1368 2/4/14 12:00 AM NULL
    1 3PM 0 1368 2/4/14 12:00 AM NULL
    1 4PM 0 1368 2/4/14 12:00 AM NULL
    1 5AM 0 1368 2/4/14 12:00 AM NULL
    1 5PM 0 1368 2/4/14 12:00 AM NULL
    1 6AM 0 1368 2/4/14 12:00 AM NULL
    1 6PM 0 1368 2/4/14 12:00 AM NULL
    1 7AM 0 1368 2/4/14 12:00 AM NULL
    1 7PM 0 1368 2/4/14 12:00 AM NULL
    1 8AM 0 1368 2/4/14 12:00 AM NULL
    1 8PM 0 1368 2/4/14 12:00 AM NULL
    1 9AM 0 1368 2/4/14 12:00 AM NULL
    1 10AM 0 1368 2/5/14 12:00 AM NULL
    1 11AM 0 1368 2/5/14 12:00 AM NULL
    1 12PM 0 1368 2/5/14 12:00 AM NULL
    1 1PM 0 1368 2/5/14 12:00 AM NULL
    1 2PM 0 1368 2/5/14 12:00 AM NULL
    1 3PM 0 1368 2/5/14 12:00 AM NULL
    1 4PM 0 1368 2/5/14 12:00 AM NULL
    1 5AM 0 1368 2/5/14 12:00 AM NULL
    1 5PM 0 1368 2/5/14 12:00 AM NULL
    1 6AM 0 1368 2/5/14 12:00 AM NULL
    1 6PM 0 1368 2/5/14 12:00 AM NULL
    1 7AM 0 1368 2/5/14 12:00 AM NULL
    1 7PM 0 1368 2/5/14 12:00 AM NULL
    1 8AM 0 1368 2/5/14 12:00 AM NULL
    1 8PM 0 1368 2/5/14 12:00 AM NULL
    1 9AM 0 1368 2/5/14 12:00 AM NULL
    1 10AM 0 1368 2/6/14 12:00 AM NULL
    1 11AM 0 1368 2/6/14 12:00 AM NULL
    1 12PM 0 1368 2/6/14 12:00 AM NULL
    1 1PM 0 1368 2/6/14 12:00 AM NULL
    1 2PM 0 1368 2/6/14 12:00 AM NULL
    1 3PM 0 1368 2/6/14 12:00 AM NULL
    1 4PM 0 1368 2/6/14 12:00 AM NULL
    1 5AM 0 1368 2/6/14 12:00 AM NULL
    1 5PM 0 1368 2/6/14 12:00 AM NULL
    1 6AM 0 1368 2/6/14 12:00 AM NULL
    1 6PM 0 1368 2/6/14 12:00 AM NULL
    1 7AM 0 1368 2/6/14 12:00 AM NULL
    1 7PM 0 1368 2/6/14 12:00 AM NULL
    1 8AM 0 1368 2/6/14 12:00 AM NULL
    1 8PM 0 1368 2/6/14 12:00 AM NULL
    1 9AM 0 1368 2/6/14 12:00 AM NULL
    1 10AM 0 1368 2/7/14 12:00 AM 4.55
    1 11AM 0 1368 2/7/14 12:00 AM 4.55
    1 12PM 0 1368 2/7/14 12:00 AM 4.55
    1 1PM 0 1368 2/7/14 12:00 AM 4.55
    1 2PM 41.66666667 1368 2/7/14 12:00 AM 4.55
    1 3PM 111.6666667 1368 2/7/14 12:00 AM 4.55
    1 4PM 100 1368 2/7/14 12:00 AM 4.55
    1 5AM 0 1368 2/7/14 12:00 AM 4.55
    1 5PM 50 1368 2/7/14 12:00 AM 4.55
    1 6AM 0 1368 2/7/14 12:00 AM 4.55
    1 6PM 0 1368 2/7/14 12:00 AM 4.55
    1 7AM 0 1368 2/7/14 12:00 AM 4.55
    1 7PM 0 1368 2/7/14 12:00 AM 4.55
    1 8AM 0 1368 2/7/14 12:00 AM 4.55
    1 8PM 0 1368 2/7/14 12:00 AM 4.55
    1 9AM 0 1368 2/7/14 12:00 AM 4.55
    1 10AM 0 1368 2/8/14 12:00 AM NULL
    1 11AM 0 1368 2/8/14 12:00 AM NULL
    1 12PM 0 1368 2/8/14 12:00 AM NULL
    1 1PM 0 1368 2/8/14 12:00 AM NULL
    1 2PM 0 1368 2/8/14 12:00 AM NULL
    1 3PM 0 1368 2/8/14 12:00 AM NULL
    1 4PM 0 1368 2/8/14 12:00 AM NULL
    1 5AM 0 1368 2/8/14 12:00 AM NULL
    1 5PM 0 1368 2/8/14 12:00 AM NULL
    1 6AM 0 1368 2/8/14 12:00 AM NULL
    1 6PM 0 1368 2/8/14 12:00 AM NULL
    1 7AM 0 1368 2/8/14 12:00 AM NULL
    1 7PM 0 1368 2/8/14 12:00 AM NULL
    1 8AM 0 1368 2/8/14 12:00 AM NULL
    1 8PM 0 1368 2/8/14 12:00 AM NULL
    1 9AM 0 1368 2/8/14 12:00 AM NULL
    1 10AM 0 1368 2/10/14 12:00 AM NULL
    1 11AM 0 1368 2/10/14 12:00 AM NULL
    1 12PM 0 1368 2/10/14 12:00 AM NULL
    1 1PM 0 1368 2/10/14 12:00 AM NULL
    1 2PM 0 1368 2/10/14 12:00 AM NULL
    1 3PM 0 1368 2/10/14 12:00 AM NULL
    1 4PM 0 1368 2/10/14 12:00 AM NULL
    1 5AM 0 1368 2/10/14 12:00 AM NULL
    1 5PM 0 1368 2/10/14 12:00 AM NULL
    1 6AM 0 1368 2/10/14 12:00 AM NULL
    1 6PM 0 1368 2/10/14 12:00 AM NULL
    1 7AM 0 1368 2/10/14 12:00 AM NULL
    1 7PM 0 1368 2/10/14 12:00 AM NULL
    1 8AM 0 1368 2/10/14 12:00 AM NULL
    1 8PM 0 1368 2/10/14 12:00 AM NULL
    1 9AM 0 1368 2/10/14 12:00 AM NULL
    1 10AM 0 1368 2/11/14 12:00 AM NULL
    1 11AM 0 1368 2/11/14 12:00 AM NULL
    1 12PM 0 1368 2/11/14 12:00 AM NULL
    1 1PM 0 1368 2/11/14 12:00 AM NULL
    1 2PM 0 1368 2/11/14 12:00 AM NULL
    1 3PM 0 1368 2/11/14 12:00 AM NULL
    1 4PM 0 1368 2/11/14 12:00 AM NULL
    1 5AM 0 1368 2/11/14 12:00 AM NULL
    1 5PM 0 1368 2/11/14 12:00 AM NULL
    1 6AM 0 1368 2/11/14 12:00 AM NULL
    1 6PM 0 1368 2/11/14 12:00 AM NULL
    1 7AM 0 1368 2/11/14 12:00 AM NULL
    1 7PM 0 1368 2/11/14 12:00 AM NULL
    1 8AM 0 1368 2/11/14 12:00 AM NULL
    1 8PM 0 1368 2/11/14 12:00 AM NULL
    1 9AM 0 1368 2/11/14 12:00 AM NULL
    1 10AM 0 1368 2/12/14 12:00 AM NULL
    1 11AM 0 1368 2/12/14 12:00 AM NULL
    1 12PM 0 1368 2/12/14 12:00 AM NULL
    1 1PM 0 1368 2/12/14 12:00 AM NULL
    1 2PM 0 1368 2/12/14 12:00 AM NULL
    1 3PM 0 1368 2/12/14 12:00 AM NULL
    1 4PM 0 1368 2/12/14 12:00 AM NULL
    1 5AM 0 1368 2/12/14 12:00 AM NULL
    1 5PM 0 1368 2/12/14 12:00 AM NULL
    1 6AM 0 1368 2/12/14 12:00 AM NULL
    1 6PM 0 1368 2/12/14 12:00 AM NULL
    1 7AM 0 1368 2/12/14 12:00 AM NULL
    1 7PM 0 1368 2/12/14 12:00 AM NULL
    1 8AM 0 1368 2/12/14 12:00 AM NULL
    1 8PM 0 1368 2/12/14 12:00 AM NULL
    1 9AM 0 1368 2/12/14 12:00 AM NULL
    1 10AM 0 1368 2/13/14 12:00 AM NULL
    1 11AM 0 1368 2/13/14 12:00 AM NULL
    1 12PM 0 1368 2/13/14 12:00 AM NULL
    1 1PM 0 1368 2/13/14 12:00 AM NULL
    1 2PM 0 1368 2/13/14 12:00 AM NULL
    1 3PM 0 1368 2/13/14 12:00 AM NULL
    1 4PM 0 1368 2/13/14 12:00 AM NULL
    1 5AM 0 1368 2/13/14 12:00 AM NULL
    1 5PM 0 1368 2/13/14 12:00 AM NULL
    1 6AM 0 1368 2/13/14 12:00 AM NULL
    1 6PM 0 1368 2/13/14 12:00 AM NULL
    1 7AM 0 1368 2/13/14 12:00 AM NULL
    1 7PM 0 1368 2/13/14 12:00 AM NULL
    1 8AM 0 1368 2/13/14 12:00 AM NULL
    1 8PM 0 1368 2/13/14 12:00 AM NULL
    1 9AM 0 1368 2/13/14 12:00 AM NULL
    1 10AM 0 1368 2/14/14 12:00 AM 1.45
    1 11AM 0 1368 2/14/14 12:00 AM 1.45
    1 12PM 0 1368 2/14/14 12:00 AM 1.45
    1 1PM 0 1368 2/14/14 12:00 AM 1.45
    1 2PM 0 1368 2/14/14 12:00 AM 1.45
    1 3PM 50 1368 2/14/14 12:00 AM 1.45
    1 4PM 0 1368 2/14/14 12:00 AM 1.45
    1 5AM 0 1368 2/14/14 12:00 AM 1.45
    1 5PM 0 1368 2/14/14 12:00 AM 1.45
    1 6AM 0 1368 2/14/14 12:00 AM 1.45
    1 6PM 0 1368 2/14/14 12:00 AM 1.45
    1 7AM 0 1368 2/14/14 12:00 AM 1.45
    1 7PM 0 1368 2/14/14 12:00 AM 1.45
    1 8AM 0 1368 2/14/14 12:00 AM 1.45
    1 8PM 46.66666667 1368 2/14/14 12:00 AM 1.45
    1 9AM 0 1368 2/14/14 12:00 AM 1.45
    1 10AM 0 1368 2/15/14 12:00 AM 4.35
    1 11AM 0 1368 2/15/14 12:00 AM 4.35
    1 12PM 0 1368 2/15/14 12:00 AM 4.35
    1 1PM 0 1368 2/15/14 12:00 AM 4.35
    1 2PM 0 1368 2/15/14 12:00 AM 4.35
    1 3PM 0 1368 2/15/14 12:00 AM 4.35
    1 4PM 0 1368 2/15/14 12:00 AM 4.35
    1 5AM 0 1368 2/15/14 12:00 AM 4.35
    1 5PM 0 1368 2/15/14 12:00 AM 4.35
    1 6AM 0 1368 2/15/14 12:00 AM 4.35
    1 6PM 88.33333333 1368 2/15/14 12:00 AM 4.35
    1 7AM 0 1368 2/15/14 12:00 AM 4.35
    1 7PM 100 1368 2/15/14 12:00 AM 4.35
    1 8AM 0 1368 2/15/14 12:00 AM 4.35
    1 8PM 100 1368 2/15/14 12:00 AM 4.35
    1 9AM 0 1368 2/15/14 12:00 AM 4.35
    1 10AM 0 1368 2/18/14 12:00 AM NULL
    1 11AM 0 1368 2/18/14 12:00 AM NULL
    1 12PM 0 1368 2/18/14 12:00 AM NULL
    1 1PM 0 1368 2/18/14 12:00 AM NULL
    1 2PM 0 1368 2/18/14 12:00 AM NULL
    1 3PM 0 1368 2/18/14 12:00 AM NULL
    1 4PM 0 1368 2/18/14 12:00 AM NULL
    1 5AM 0 1368 2/18/14 12:00 AM NULL
    1 5PM 0 1368 2/18/14 12:00 AM NULL
    1 6AM 0 1368 2/18/14 12:00 AM NULL
    1 6PM 0 1368 2/18/14 12:00 AM NULL
    1 7AM 0 1368 2/18/14 12:00 AM NULL
    1 7PM 0 1368 2/18/14 12:00 AM NULL
    1 8AM 0 1368 2/18/14 12:00 AM NULL
    1 8PM 0 1368 2/18/14 12:00 AM NULL
    1 9AM 0 1368 2/18/14 12:00 AM NULL
    1 10AM 0 1368 2/19/14 12:00 AM NULL
    1 11AM 0 1368 2/19/14 12:00 AM NULL
    1 12PM 0 1368 2/19/14 12:00 AM NULL
    1 1PM 0 1368 2/19/14 12:00 AM NULL
    1 2PM 0 1368 2/19/14 12:00 AM NULL
    1 3PM 0 1368 2/19/14 12:00 AM NULL
    1 4PM 0 1368 2/19/14 12:00 AM NULL
    1 5AM 0 1368 2/19/14 12:00 AM NULL
    1 5PM 0 1368 2/19/14 12:00 AM NULL
    1 6AM 0 1368 2/19/14 12:00 AM NULL
    1 6PM 0 1368 2/19/14 12:00 AM NULL
    1 7AM 0 1368 2/19/14 12:00 AM NULL
    1 7PM 0 1368 2/19/14 12:00 AM NULL
    1 8AM 0 1368 2/19/14 12:00 AM NULL
    1 8PM 0 1368 2/19/14 12:00 AM NULL
    1 9AM 0 1368 2/19/14 12:00 AM NULL
    1 10AM 0 1368 2/20/14 12:00 AM NULL
    1 11AM 0 1368 2/20/14 12:00 AM NULL
    1 12PM 0 1368 2/20/14 12:00 AM NULL
    1 1PM 0 1368 2/20/14 12:00 AM NULL
    1 2PM 0 1368 2/20/14 12:00 AM NULL
    1 3PM 0 1368 2/20/14 12:00 AM NULL
    1 4PM 0 1368 2/20/14 12:00 AM NULL
    1 5AM 0 1368 2/20/14 12:00 AM NULL
    1 5PM 0 1368 2/20/14 12:00 AM NULL
    1 6AM 0 1368 2/20/14 12:00 AM NULL
    1 6PM 0 1368 2/20/14 12:00 AM NULL
    1 7AM 0 1368 2/20/14 12:00 AM NULL
    1 7PM 0 1368 2/20/14 12:00 AM NULL
    1 8AM 0 1368 2/20/14 12:00 AM NULL
    1 8PM 0 1368 2/20/14 12:00 AM NULL
    1 9AM 0 1368 2/20/14 12:00 AM NULL
    1 10AM 0 1368 2/21/14 12:00 AM 2.95
    1 11AM 0 1368 2/21/14 12:00 AM 2.95
    1 12PM 0 1368 2/21/14 12:00 AM 2.95
    1 1PM 0 1368 2/21/14 12:00 AM 2.95
    1 2PM 0 1368 2/21/14 12:00 AM 2.95
    1 3PM 0 1368 2/21/14 12:00 AM 2.95
    1 4PM 0 1368 2/21/14 12:00 AM 2.95
    1 5AM 0 1368 2/21/14 12:00 AM 2.95
    1 5PM 0 1368 2/21/14 12:00 AM 2.95
    1 6AM 0 1368 2/21/14 12:00 AM 2.95
    1 6PM 0 1368 2/21/14 12:00 AM 2.95
    1 7AM 0 1368 2/21/14 12:00 AM 2.95
    1 7PM 95 1368 2/21/14 12:00 AM 2.95
    1 8AM 0 1368 2/21/14 12:00 AM 2.95
    1 8PM 100 1368 2/21/14 12:00 AM 2.95
    1 9AM 0 1368 2/21/14 12:00 AM 2.95
    1 10AM 0 1368 2/22/14 12:00 AM 3.5
    1 11AM 0 1368 2/22/14 12:00 AM 3.5
    1 12PM 0 1368 2/22/14 12:00 AM 3.5
    1 1PM 0 1368 2/22/14 12:00 AM 3.5
    1 2PM 0 1368 2/22/14 12:00 AM 3.5
    1 3PM 0 1368 2/22/14 12:00 AM 3.5
    1 4PM 0 1368 2/22/14 12:00 AM 3.5
    1 5AM 0 1368 2/22/14 12:00 AM 3.5
    1 5PM 16.66666667 1368 2/22/14 12:00 AM 3.5
    1 6AM 0 1368 2/22/14 12:00 AM 3.5
    1 6PM 21.66666667 1368 2/22/14 12:00 AM 3.5
    1 7AM 0 1368 2/22/14 12:00 AM 3.5
    1 7PM 100 1368 2/22/14 12:00 AM 3.5
    1 8AM 0 1368 2/22/14 12:00 AM 3.5
    1 8PM 95 1368 2/22/14 12:00 AM 3.5
    1 9AM 0 1368 2/22/14 12:00 AM 3.5
    1 10AM 0 1368 2/24/14 12:00 AM 3.95
    1 11AM 0 1368 2/24/14 12:00 AM 3.95
    1 12PM 0 1368 2/24/14 12:00 AM 3.95
    1 1PM 0 1368 2/24/14 12:00 AM 3.95
    1 2PM 0 1368 2/24/14 12:00 AM 3.95
    1 3PM 0 1368 2/24/14 12:00 AM 3.95
    1 4PM 0 1368 2/24/14 12:00 AM 3.95
    1 5AM 0 1368 2/24/14 12:00 AM 3.95
    1 5PM 0 1368 2/24/14 12:00 AM 3.95
    1 6AM 0 1368 2/24/14 12:00 AM 3.95
    1 6PM 63.33333333 1368 2/24/14 12:00 AM 3.95
    1 7AM 0 1368 2/24/14 12:00 AM 3.95
    1 7PM 100 1368 2/24/14 12:00 AM 3.95
    1 8AM 0 1368 2/24/14 12:00 AM 3.95
    1 8PM 100 1368 2/24/14 12:00 AM 3.95
    1 9AM 0 1368 2/24/14 12:00 AM 3.95
    1 10AM 0 1368 2/25/14 12:00 AM 3.25
    1 11AM 0 1368 2/25/14 12:00 AM 3.25
    1 12PM 0 1368 2/25/14 12:00 AM 3.25
    1 1PM 0 1368 2/25/14 12:00 AM 3.25
    1 2PM 0 1368 2/25/14 12:00 AM 3.25
    1 3PM 0 1368 2/25/14 12:00 AM 3.25
    1 4PM 0 1368 2/25/14 12:00 AM 3.25
    1 5AM 0 1368 2/25/14 12:00 AM 3.25
    1 5PM 0 1368 2/25/14 12:00 AM 3.25
    1 6AM 0 1368 2/25/14 12:00 AM 3.25
    1 6PM 76.66666667 1368 2/25/14 12:00 AM 3.25
    1 7AM 0 1368 2/25/14 12:00 AM 3.25
    1 7PM 100 1368 2/25/14 12:00 AM 3.25
    1 8AM 0 1368 2/25/14 12:00 AM 3.25
    1 8PM 40 1368 2/25/14 12:00 AM 3.25
    1 9AM 0 1368 2/25/14 12:00 AM 3.25
    1 10AM 0 1368 2/26/14 12:00 AM 0.6
    1 11AM 0 1368 2/26/14 12:00 AM 0.6
    1 12PM 0 1368 2/26/14 12:00 AM 0.6
    1 1PM 0 1368 2/26/14 12:00 AM 0.6
    1 2PM 0 1368 2/26/14 12:00 AM 0.6
    1 3PM 0 1368 2/26/14 12:00 AM 0.6
    1 4PM 0 1368 2/26/14 12:00 AM 0.6
    1 5AM 0 1368 2/26/14 12:00 AM 0.6
    1 5PM 0 1368 2/26/14 12:00 AM 0.6
    1 6AM 0 1368 2/26/14 12:00 AM 0.6
    1 6PM 0 1368 2/26/14 12:00 AM 0.6
    1 7AM 0 1368 2/26/14 12:00 AM 0.6
    1 7PM 0 1368 2/26/14 12:00 AM 0.6
    1 8AM 0 1368 2/26/14 12:00 AM 0.6
    1 8PM 38.33333333 1368 2/26/14 12:00 AM 0.6
    1 9AM 0 1368 2/26/14 12:00 AM 0.6
    1 10AM 0 1368 2/27/14 12:00 AM 4.9
    1 11AM 0 1368 2/27/14 12:00 AM 4.9
    1 12PM 0 1368 2/27/14 12:00 AM 4.9
    1 1PM 0 1368 2/27/14 12:00 AM 4.9
    1 2PM 0 1368 2/27/14 12:00 AM 4.9
    1 3PM 0 1368 2/27/14 12:00 AM 4.9
    1 4PM 0 1368 2/27/14 12:00 AM 4.9
    1 5AM 0 1368 2/27/14 12:00 AM 4.9
    1 5PM 25 1368 2/27/14 12:00 AM 4.9
    1 6AM 0 1368 2/27/14 12:00 AM 4.9
    1 6PM 100 1368 2/27/14 12:00 AM 4.9
    1 7AM 0 1368 2/27/14 12:00 AM 4.9
    1 7PM 100 1368 2/27/14 12:00 AM 4.9
    1 8AM 0 1368 2/27/14 12:00 AM 4.9
    1 8PM 100 1368 2/27/14 12:00 AM 4.9
    1 9AM 0 1368 2/27/14 12:00 AM 4.9
    1 10AM 0 1368 2/28/14 12:00 AM 7.1
    1 11AM 23.33333333 1368 2/28/14 12:00 AM 7.1
    1 12PM 91.66666667 1368 2/28/14 12:00 AM 7.1
    1 1PM 58.33333333 1368 2/28/14 12:00 AM 7.1
    1 2PM 0 1368 2/28/14 12:00 AM 7.1
    1 3PM 0 1368 2/28/14 12:00 AM 7.1
    1 4PM 76.66666667 1368 2/28/14 12:00 AM 7.1
    1 5AM 0 1368 2/28/14 12:00 AM 7.1
    1 5PM 100 1368 2/28/14 12:00 AM 7.1
    1 6AM 0 1368 2/28/14 12:00 AM 7.1
    1 6PM 100 1368 2/28/14 12:00 AM 7.1
    1 7AM 0 1368 2/28/14 12:00 AM 7.1
    1 7PM 23.33333333 1368 2/28/14 12:00 AM 7.1
    1 8AM 0 1368 2/28/14 12:00 AM 7.1
    1 8PM 0 1368 2/28/14 12:00 AM 7.1
    1 9AM 0 1368 2/28/14 12:00 AM 7.1
    select l.*,(l.totrev/l.cnt) as AvgRev,(l.totworkdone)/l.cnt as AVgworkdone
    from
    (select k.*,SUM(Case when k.rn=k.cnt1 then k.Rev end) over(partition by k.[Secempid]) as totrev
    from
    (SELECT [ampm]
    ,[workdone]
    ,[loc]
    ,[Secempid]
    ,[Date]
    ,[Rev]
    ,COUNT(date) over(partition by [Secempid],AMPM ) as cnt
    ,SUM(workdone) over(partition by [Secempid])as totworkdone
    ,row_number() over(partition by [Secempid],date,rev order by [Secempid],date,rev) as rn
    ,COUNT(rev)over(partition by [Secempid],date,rev ) as cnt1
    FROM MYTABLE
    where [Secempid]='1368'
    and DATE between '2014-02-01' and '2014-02-28'
    group by [ampm]
    ,[workdone]
    ,[loc]
    ,[Secempid]
    ,[Date]
    ,[Rev]
    )k
    )l
    SV

    Hi S,
    There are many ways to solve your challenge. What I have found very efficient, and not too complicated, is to add a measure group which simply has days as facts, with one attached dimension, dates. Your measure could be [Business Days], which has a '1' for
    every business day. Then, wherever you are in the cube, you can get Measures.[Business Days] for the number of business days. It can be the denominator in your average calc.
    Of course, you can also dynamically count a filtered set of days in your date dimension (filtered by business day). That would also work, but its performance would not be as good as a physical measure.
    Hope that helps,
    Richard
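    On the relational side, the 23 you expect is a count of distinct calendar days, while the count of 400 is presumably counting fact rows (one per hourly slot). A sketch using your own table and column names from the query above:
    select [Secempid],
           count(distinct cast([Date] as date)) as days_in_month,
           count(*) as fact_rows
    from MYTABLE
    where [Secempid] = '1368'
      and [Date] between '2014-02-01' and '2014-02-28'
    group by [Secempid]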
