Proper use of GROUP BY

I wrote the view below to query orders that have shipped quantity from selected warehouse subinventories. For a given date I need to compute the count of unique orders along with the sum of their extended price.
I have tried many SELECTs from SCO_REQSGROSS_V including
select TRXN_DATE
, count(*)
, SUM(EXT_PRC)
from SCO_REQSGROSS_V
where trxn_date = to_date('30-AUG-2010')
group by TRXN_DATE, order_num
I need a count of all the unique order numbers and their ext_prc for each date in a given date range. I keep getting back the line counts, so there is something wrong with my approach to the GROUP BY clause. Do I need to use a partition clause, and how would this be written?
create or replace view SCO_REQSGROSS_V (
TRXN_DATE
, order_num
, EXT_PRC)
AS
select TRUNC(transaction_date)
, segment1
, ext_price
from (
-- lines
SELECT DISTINCT prh.segment1
, prl.line_num
, ool.ordered_item item_number
, mmt.trx_source_line_id
, mmt.transaction_reference
, ooh.order_number
, prl.quantity
, ROUND(ool.shipped_quantity * prl.unit_price,2) ext_price
, mmt.transaction_date
, mmt.subinventory_code
FROM apps.po_requisition_headers_all prh
, apps.po_requisition_lines_all prl
, apps.oe_order_headers_all ooh
, apps.oe_order_lines_all ool
, apps.mtl_material_transactions mmt
WHERE prh.requisition_header_id = prl.requisition_header_id
AND ool.header_id = ooh.header_id
AND ooh.source_document_id = prh.requisition_header_id
AND ool.source_document_line_id = prl.requisition_line_id ----(+)
AND ool.line_id = mmt.trx_source_line_id
AND mmt.transaction_type_id = 53 -- Internal Order Pick
AND prl.source_type_code = 'INVENTORY'
AND prh.authorization_status = 'APPROVED'
AND prl.item_id = ool.inventory_item_id
AND ool.shipped_quantity IS NOT NULL
AND ool.subinventory = mmt.subinventory_code
AND mmt.subinventory_code IN ('APPLIANCE', 'ELECTRICAL', 'ELEVATOR', 'FURNITURE', 'HARDWARE', 'HEAT'
, 'JANITORIAL', 'JANIT 1', 'OFFICE', 'PAINT', 'PAPER RM', 'PLUMBING', 'SKIL', 'SURPLUS', 'TOOL', 'UNIFORM')
)

I ran this version of your code:
select distinct trxn_date,
count(order_num) over (partition by trunc(trxn_date)) cnt_orders,
sum(ext_prc) over (partition by trunc(trxn_date)) total_per_txn_day -- for total_prc per day
from SCO_REQSGROSS_V
where trunc(trxn_date) = to_date('30-AUG-2010');
It returned one row, and total_per_txn_day was correct at 51044.21. However, cnt_orders was 303; that is nominally correct but not what I am after, because what I need is a count of unique order numbers.
The logic
SELECT *
FROM SCO_REQSGROSS_V
where trunc(trxn_date) = to_date('30-AUG-2010')
order by order_num;
returns some duplicate order numbers. I want to count the unique order numbers but still include the ext_prc for all of the orders even the duplicates.
Here are the first several rows of output from my query above. The desired cnt_orders is 3 and the desired total_per_txn_day is 1528.98:
Date          Order#     Ext_Prc
30-AUG-10     399884     90
30-AUG-10     399893     108
30-AUG-10     403664     114.48
30-AUG-10     403664     68.7
30-AUG-10     403664     100.02
30-AUG-10     403664     114.48
30-AUG-10     403664     139.04
30-AUG-10     403664     43.04
30-AUG-10     403664     49.8
30-AUG-10     403664     120
30-AUG-10     403664     55
30-AUG-10     403664     198.41
30-AUG-10     403664     129.6
30-AUG-10     403664     198.41
Thank you.
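
A sketch of what should return the desired numbers, assuming the view compiles as above: COUNT(DISTINCT order_num) counts each order number once per day, while SUM(ext_prc) still sums every line, duplicates included. A plain GROUP BY on the date is enough; no partition clause is required:

select trxn_date
, count(distinct order_num) cnt_orders   -- each order number counted once
, sum(ext_prc) total_per_txn_day         -- every line summed, duplicates included
from SCO_REQSGROSS_V
where trxn_date = to_date('30-AUG-2010', 'DD-MON-YYYY')  -- widen this predicate for a date range
group by trxn_date;

If you prefer the analytic form you already ran, Oracle also accepts DISTINCT inside an analytic COUNT as long as the OVER clause has no ORDER BY:

select distinct trxn_date
, count(distinct order_num) over (partition by trxn_date) cnt_orders
, sum(ext_prc) over (partition by trxn_date) total_per_txn_day
from SCO_REQSGROSS_V
where trxn_date = to_date('30-AUG-2010', 'DD-MON-YYYY');

Against the sample rows above, both forms should return cnt_orders = 3 and total_per_txn_day = 1528.98.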

Similar Messages

  • ACS 5.3 Authorization problem with using Identity Groups in Access Policy Rule

Hello guys, I have found a problem which I can't solve regarding authorization using Identity Groups in an Access Policy rule.
    ACS version: 5.3.0.40.6 (internal build B.839)
I have a very simple RADIUS authorization rule which authorizes users based on the right Identity Group.
The requested Identity Group exists.
The testing user is created in Internal Users and has the requested Identity Group assigned.
    Radius Access Policy: 
    Authentication against Identity Store Sequence, where authorization server is external RSA SecurID device and additional attributes retrieval is configured from Internal Users.
Authorization is very simple: one rule with only one condition, which is: Identity Group - in - Requested_Testing_Rule. The Default rule is set to Deny.
When I try to log in with my testing user, authentication against RSA SecurID is OK, but authorization is denied by the Default rule; it looks like my rule with the Identity Group is totally ignored.
    I am managing several other ACS servers (version 5.3 but with older patches) where similar rules are working without problem.
What I have tested:
    Remove testing user and create his account again.
    Rename Identity Group
    Use another Identity Group
    Remove Access Policy rule and create it again
    Use Compound Condition: System:Identity Group
    Use Compound Condition: System:UserID instead of Identity Group in Rule (it is working without problem)
Do you have any idea where the problem could be?

OK guys, it started working yesterday without any configuration change. Maybe it was some database inconsistency which was resolved by ACS itself.

  • Single step workflow with multiple approvers (without using a group)

    Hi,
Is it possible to have a single-step workflow with multiple approvers without using a group? This is for a contract document.
I want to add a number of users based on particular logic. The approvers are random and do not belong to any particular group.
    DO let me know if it is possible or if any of you have done that.
    thanks in advance.
    regards,
    rubio

    Hi Rubio,
I believe the behavior would be: if individual users are added as approvers, then the system requires each approver to approve the document. However, if you use a user group, you can set the role so that it is either ALL or ANY.
    Regards,
    Vikram

• Overhead calculation not happening after using origin group.

Overhead calculation is not happening after using an origin group.
There was a runtime error earlier related to the “define credit” IMG node under the costing sheet component; we applied SAP Note 769946 and that error went away.
    We want to apply/add Overhead to SFG/FG materials.
We are using a PP order with a PCC (product cost collector) as the cost object, i.e. costing by period (system ECC 6).
But our problem is with the material standard cost estimate process.
We have assigned overhead keys to the percentage rates in the costing sheet for material standard costing and assigned the origin groups to the credits of the costing sheet. But after running the cost estimate, overhead is not taken into account in the standard cost calculation.
In the define credit entry table, the key field is strangely the valid-to date; the system should actually allow more than one entry with the same valid-to date and the same secondary cost element (type 41) for different cost centers.
But if we do not use an overhead key and origin group, the entire cost in that supporting cost center goes to all materials (SFG/FG) and we cannot distinguish between the different product materials (SFG/FG).
We have checked all the things mentioned below.
Firstly, that the correct costing sheet is assigned to the valuation variant.
That the costing sheet is entered for the appropriate material type: finished and semi-finished, or material components.
All of the above can be checked and verified via transaction OKKN.
In addition, make sure that the base value maintained is present in the costing; for example, the base may include an origin group. Is that origin group part of the materials being costed?
Similarly, if the base is found and values exist, how is the overhead rate of the costing sheet set up; is it valid, etc.
And finally, do a similar check for the credit.
We suspect this is a program error...
So we request all experts to give us their feedback.

    Dear,
Check that your origin group and the material unit of measurement are the same.
Sometimes in the costing sheet the origin group is maintained in one unit while for the material it is maintained in another unit of measurement.
You can see the unit of measure for a material in Additional data - Units of measure.
Check the BOM component material unit as well.
Check that the same unit of measure is maintained in KZS2.
I hope the above is useful.
    GOPAN

  • What are the key fields used to group double orders in t-code SDD1?

    What are the key fields used to group double orders in t-code SDD1?
    Cheers,
    VT

    Hi,
You can group the duplicate sales documents with the help of the following fields:
sold-to party, document type, date, and sales area details.
    Regards,
    Gopal.

  • How to create groups in AD using AD group process definition

    Hello,
I want to create a group in AD when I create a group in OIM. I had achieved this using JNDI and was able to create groups in AD successfully; now I want to create groups in AD using the AD Group process definition, which contains the Create AD Group task.
    Can anyone suggest what flow should be followed to achieve this ?
    Thanks & Regards,
    Rahul Shah

I think you can do some R&D to achieve this via an Event Handler:
1. Apply an Event Handler on the Group Object Form.
In the Event Handler class:
2. Get the parameters from the AD IT resource.
3. Make the JNDI call.
4. Create the same group name in AD.
There may be an easier approach.
Which version of OIM are you using?
    Thanks,
    Kuldeep

  • What is the advantages of using Flexconnect groups

What are the advantages of using FlexConnect groups in a WLC?
    Reg,
    Ezra.

Please refer to this document for more detail about these features:
    http://www.cisco.com/c/en/us/td/docs/solutions/Enterprise/Mobility/emob73dg/emob73/ch7_HREA.html#wp1091114
FlexConnect is one mode an AP can operate in, typically deployed in a branch setup where you do not have a controller at the branch site. Those APs can register to a controller at your HQ or main site, so traffic terminates at your branch switch instead of being tunneled back to the HQ WLC.
If you want roaming among your branch FlexConnect APs, you have to put those APs into a FlexConnect group; only then is key information shared among those APs to facilitate fast roaming.
    Pls do not forget to rate our responses if you find them useful.
    HTH
    Rasika

  • Aggregate functions cannot be used in group expressions

Hi, I have a report showing sales by vendor. I need to list all the vendors with Monthly Total > 5000 and combine the rest as "OTHER VENDORS".
    Vendor is a Group in my report, so I tried to put an expression as a Group on:
    =IIF(Sum(Fields!Mth_1_Sales.Value)>5000,Fields!Vendor_No.Value,"OTHER VENDORS")
    I've got an error: "aggregate functions cannot be used in group expressions"
How do I get vendors with sales < 5000 into "OTHER VENDORS"?

    Hi,
You need to group by month in the group expression,
and you can then use the same expression in the report column:
    =IIF(Sum(Fields!Mth_1_Sales.Value)>5000,Fields!Vendor_No.Value,"OTHER VENDORS")
    Many Thanks
Please mark the post as answered if it helps to solve your issue.

  • Use of GROUP BY in SELECT

I am considering using GROUP BY in several SELECTs where I will access the very big tables BSIK, BSID, BSAD, BSAK, etc.
    My two concerns are performance and high data volume.
    GROUP BY should reduce data arriving from DB but will it hurt performance?
    What are your considerations?
    Any ideas are welcome. I cannot seem to find anything specific for or against GROUP BY.
    thanks,
    Phillip

Hello Phillip,
    I would advise that you do not use the Group By clause in the select statements for these tables. As you have rightly said, the data volume in these tables might be a major cause for concern.
Using the GROUP BY clause will heavily load the database server. While your query might run okay in the development environment, in the productive environment it might crash; the reason: the cursor to the database would have timed out.
The GROUP BY operation, as you know, can be simulated after you get the data into the application server. The application server processing can take some time, but you can usually optimize it.
    Regards,
    Anand Mandalika.
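
To illustrate the trade-off in generic SQL (the thread concerns ABAP Open SQL, but the principle is the same; the BSIK field names below are used only as an assumed example):

-- Aggregation pushed to the database: one row per company code/vendor
-- arrives at the application server.
select bukrs, lifnr, sum(dmbtr) as total_amount
from bsik
group by bukrs, lifnr;

-- Without GROUP BY, every open item travels over the network and the
-- totals must then be computed (e.g. with COLLECT) in the application layer.
select bukrs, lifnr, dmbtr
from bsik;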

  • Photoshop CC 2014 hangs for nearly 15 seconds when using Layer Group Layers or Layer Palette Group from Layers commands.

Photoshop hangs for nearly 15 seconds when using the Layer > Group Layers or Layer Palette > Group from Layers commands, but NOT when creating a new group, creating layers, and drag-and-dropping layers into the group manually in the Layers palette. I've got a UI design file with nearly 800 layers and a lot of groups. This doesn't occur in smaller files. Using OS X Mavericks, Photoshop CC 2014.2.2, Mid 2014 15-inch Retina MacBook Pro.
    I understand that the solution would be just to cut the file into smaller files and design each UI view in a separate file but that would be painful. And it does not seem to be a very hard task to just group layers.

    Does the document also have a lot of layer comps?
    And can you post that document or send me a copy of it so we can see exactly why it is hanging?

  • Reg. can we display alv grid using field groups (extracts)

    Hi,
Can we display an ALV grid using field groups (extracts)? Is this possible? I have to develop a blocked ALV.
    tnks
    Yerukala Setty

    No, you will need the data in an internal table to use ALV.
    Cheers
    Allan

  • Percentage based on Status using per group

    Hi Gurus,
Can you help me with this? Do you have any approach for getting the percentage based on status per group?
Currently I have the code below; it produces no result but also no error. I am trying to get the % Incomplete.
    <?xdofx:(sum(current-group()/NUMINCOMPLETE)/(sum(NUMSUBMITTED) + sum(NUMCOMPLETED) + sum(NUMINPROGRESS) + sum(NUMINCOMPLETE)))*100?>
    Thanks Much,
    JP
    Edited by: BIPnewbie on Feb 6, 2012 3:05 AM

    Use this:
<?xdoxslt:div(sum(current-group()/NUMINCOMPLETE), sum(NUMSUBMITTED) + sum(NUMCOMPLETED) + sum(NUMINPROGRESS) + sum(NUMINCOMPLETE)) * 100?>
    Thanks,
    Bipuser

  • How to handle "The specified resource does not exist" exception while using entity group transactions to purge WADLogs table

    Hi,
    We have a requirement to purge the Azure WADLogs table on a periodic basis. We are achieving this by using Entity group transactions to delete the
    records older than 15 days. The logic is like this.
bool recordDoesNotExistExceptionOccured = false;
CloudTable wadLogsTable = tableClient.GetTableReference(WADLogsTableName);
partitionKey = "0" + DateTime.UtcNow.AddDays(noOfDays).Ticks;
TableQuery<WadLogsEntity> buildQuery = new TableQuery<WadLogsEntity>().Where(
    TableQuery.GenerateFilterCondition("PartitionKey",
        QueryComparisons.LessThanOrEqual, partitionKey));

while (!recordDoesNotExistExceptionOccured)
{
    IEnumerable<WadLogsEntity> result = wadLogsTable.ExecuteQuery(buildQuery).Take(1000);

    // Batch entity delete.
    if (result != null && result.Count() > 0)
    {
        Dictionary<string, TableBatchOperation> batches = new Dictionary<string, TableBatchOperation>();
        foreach (var entity in result)
        {
            TableOperation tableOperation = TableOperation.Delete(entity);
            if (!batches.ContainsKey(entity.PartitionKey))
            {
                batches.Add(entity.PartitionKey, new TableBatchOperation());
            }

            // A batch operation allows a maximum of 100 entities, which must share the same PartitionKey.
            if (batches[entity.PartitionKey].Count < 100)
            {
                batches[entity.PartitionKey].Add(tableOperation);
            }
        }

        // Execute batches.
        foreach (var batch in batches.Values)
        {
            try
            {
                await wadLogsTable.ExecuteBatchAsync(batch);
            }
            catch (Exception exception)
            {
                // Log exception here.
                // Set flag.
                if (exception.Message.Contains(ResourceDoesNotExist))
                {
                    recordDoesNotExistExceptionOccured = true;
                    break;
                }
            }
        }
    }
    else
    {
        break;
    }
}
    My questions are:
    Is this an efficient way to purge the WADLogs table? If not, what can make this better?
    Is this the correct way to handle the "Specified resource does not exist exception"? If not, how can I make this better?
    Would this logic fail in any particular case?
    How would this approach change if this code is in a worker which has multiple instances deployed?
    I have come up with this code by referencing the solution given
    here by Keith Murray.

    Hi Nikhil,
    Thanks for your posting!
I tested your and Keith's code on my side, and everything worked fine. When result is null or result.Count() is 0, the while() loop breaks. I found your code has some logic to handle the "ResourceDoesNotExist" error.
It seems the code works fine. If you keep getting this error, I suggest debugging your code to find which line throws the exception.
    >> Is this an efficient way to purge the WADLogs table? If not, what can make this better?
Based on my experience, we can use code (like the logic above) or use a third-party tool to delete the entities manually. In my opinion the code is efficient: it can run automatically and save us the manual work.
     >>Is this the correct way to handle the "Specified resource does not exist exception"? If not, how can I make this better?
In your code, you used recordDoesNotExistExceptionOccured as a flag to check whether the entity is null; it is a good choice. When I tried deleting the log table entities, I instead used the flag to check the result count.
For example, if I expect the query result count to be 100 and the number comes back lower than 100, I set the flag and break the while loop.
    >>Would this logic fail in any particular case?
I think it shouldn't fail. But if the result is 0, your while loop will always run and never stop. I think you could add "recordDoesNotExistExceptionOccured = true;" into your "else" block.
    >>How would this approach change if this code is in a worker which has multiple instances deployed?
You don't need to change anything except the "else" block; it would work fine in a worker role.
If you have any questions about this issue, please feel free to let me know.
    Regards,
    Will

  • Proper Use Of Sessions

    Hi everyone,
    My fellow developers and I were having a discussion as to the proper use of sessions in a web application built with Servlets. The situation was we were building a method to get search results from a database and we wanted to have the ability to limit the number of results displayed per page.
    One method of doing this was to get the entire search result, load that into a hashtable or some other data structure, and put that into a session variable to preserve the hashtable.
    The other method was to re-query the database every time the person clicked the 'view next' button and get a new result set every time.
    I tend to favor the use of session variables to maintain the ResultSet...as I was taught that database connections are expensive and should be limited. However, according to one of the other developers, storing objects (especially when they are potentially large) in session variables is not recommended.
    So what do you guys think? Should be store large objects in session variables, or should we re-query the database every time the person loads the page?
As a side note, the servers we use are quite robust, but we know that is not an excuse to write inefficient code.

    Its a trade off between memory and database access.
    Storing stuff in session is expensive in terms of memory.
    The more stuff you store in session, the more overhead there is with each user of the system.
    Querying the database each time saves you memory on the app server, but means more work for the database.
    Which is better? As always it depends.
    - How many users are projected for the system? More users means you want to keep the session as light as possible.
    - Is the query is hugely expensive to run? A long running query that only returns a few records - you would want to cache that in session.
- Are you able to limit the result-set data returned via the database? Some databases support this, some don't; i.e. if you did a database query each time, would you have to step through 5 pages of data to get to display the 6th page, or could you just fetch the data for page 6 using row numbers (see the sketch below)?
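
For example, a minimal sketch of row-number paging in Oracle-style SQL, assuming a hypothetical search_results table, a relevance ordering, and 20 rows per page (all names illustrative):

-- Fetch only page 6 (rows 101-120) instead of caching the whole result set in session.
select *
from (
    select r.*, row_number() over (order by relevance desc) rn
    from search_results r
)
where rn between 101 and 120;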
    Database connections themselves aren't that expensive to acquire if you are using connection pooling, but they are still a limited resource.
    Hope that helps some,
    evnafets

• IMPOSSIBLE TO USE A GROUPED MAIL IN A SCHEDULING ACTION

    Hello Everybody,
    I am currently facing a problem while trying to schedule a report.
In fact, when I use a simple address like '[email protected]', the report is sent correctly.
However, when I use a grouped email like '[email protected]', the people don't receive the email. In BO all seems OK: there are no errors in the schedule history of the report and all services are OK.
When I use any other kind of software (Outlook, Talend, etc.) to send data through this mail list, it works every time.
    There is no particular configuration in the SMTP server.
Does anyone here know why it doesn't work with BO?
    I work with BO XI 3.1.
    Thanks for your help!

Are the users in the CMC? If they are, follow the steps below.
    Create users and while creating the users add the email address of respective user.
    Add the users in a group.
    Go to the report and right click>Schedule.
    Add the group in 'Schedule For'
    Under Schedule> Destinations select Email.
    Update the information and in 'TO' specify %SI_EMAIL_ADDRESS%
    Schedule the report.
