Group by in query

Hi!
We have a flight route table that stores the individual flight legs that make up a route:
CREATE TABLE "TEST"."ROUTE"
   (     "ROUTE_ID" NUMBER NOT NULL ENABLE,
     "ORIGIN" VARCHAR2(3 BYTE) NOT NULL ENABLE,
     "DESTINATION" VARCHAR2(3 BYTE) NOT NULL ENABLE,
     "FLIGHT_LEG_ID" NUMBER NOT NULL ENABLE,
     "SEGMENT_ID" NUMBER NOT NULL ENABLE,
      CONSTRAINT "ROUTE_PK" PRIMARY KEY ("ROUTE_ID", "FLIGHT_LEG_ID", "SEGMENT_ID")
  USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
  TABLESPACE "USERS"  ENABLE
   ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
  STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
  PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
  TABLESPACE "USERS" ;
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (1,'FCO','ZRH',9,1);
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (1,'ZRH','JFK',10,2);
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (2,'JFK','ZRH',11,1);
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (2,'ZRH','FCO',12,2);
The flight legs must be grouped by route_id, i.e. a user wants to go from Rome to New York, so the results must return both legs:
FCO  ZRH
ZRH  JFK
Thanks for your feedback!

This one, too. More test data:
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'STG','FRA',19,1);
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'FRA','HKG',20,2);
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'HKG','MAN',21,3);
Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'MAN','PMO',22,4);
And the query:
SQL>SELECT route_id, origin, destination, route
  2    FROM (SELECT route_id, origin, destination, MAX(route) OVER(PARTITION BY route_id) AS route
  3            FROM (SELECT     route_id, origin, destination, flight_leg_id, segment_id,
  4                             LTRIM(SYS_CONNECT_BY_PATH(origin, '-'), '-') || '-' || destination AS route
  5                        FROM route
  6                  CONNECT BY route_id = PRIOR route_id AND origin = PRIOR destination
  7                  START WITH origin = :origin))
  8   WHERE route LIKE :origin || '%' AND route LIKE '%' || :destination;
  ROUTE_ID ORIGI DESTI ROUTE
         3 STG   FRA   STG-FRA-HKG-MAN-PMO
         3 FRA   HKG   STG-FRA-HKG-MAN-PMO
         3 HKG   MAN   STG-FRA-HKG-MAN-PMO
         3 MAN   PMO   STG-FRA-HKG-MAN-PMO
Urs

Similar Messages

  • Creating Target group with Bi Query

    Hi,
    I am trying to create a target group with a BI query. For that, I created one query on a real-time InfoCube with 0BPARTNER and 0BP_GUID and selected all navigational attributes. In the query InfoObject properties (Advanced settings) I selected "Values from Master Data Table" and Access Type "Master Data"; in the query properties I selected "Allow External Access to Query".
    In CRM, while creating the DataSource, I can see the InfoObjects selected in the query; I selected the business partner InfoObject 0BP_GUID. In the attribute list I selected just one item, industry code, against the above DataSource. When running the Segment Builder and selecting the industry code, it gives me 0 business partners, whereas I have 20 BPs with that industry code.
    I appreciate your help ASAP.
    Regards,
    Rajiv Jain

    Hi Rajiv
    I have the same issue with my BI query. I hope you have resolved it by now; if you have, please do let me know.
    What we have done is: we created a BI query and used it in a CRM DataSource, then created some filters, where I can see the objects of the query while creating the filters in the attribute list. But when we use these filters to create a target group, the query does not return any results, i.e. it does not bring any values into the segmentation area. This issue occurs only in Production; we have already tested in Test and it passed without any problems. I am not sure why this is happening. I spoke to the BW team about it and they are not sure either. If you know, could you please let me know?
    Thank you in advance
    Regards
    shankar

  • How many ways we can create authorization for user groups in sap query reports

    Hi Gurus, I am getting a problem when assigning users to a user group in SAP query reports. Users other than those created in the user group are also able to add and change the users. Please suggest how I can restrict users outside of the user group.
    Please reply if you have any suggestions or useful threads.
    Thank You,
    Suneel Kumar.

    I don't think it can be done. According to the link below 'Users who have authorization for the authorization object S_QUERY with both the values Change and Maintain, can access all queries of all user groups without being explicitly entered in each user group.'
    http://help.sap.com/saphelp_46c/helpdata/en/d2/cb3f89455611d189710000e8322d00/content.htm
    Although I think you can add code to your infoset and maybe restrict according to authority group, i.e.:
    Use AUTHORITY-CHECK to restrict access to the database based on user.
    Press F1 on AUTHORITY-CHECK to find out how to use it in the code

  • Order by ignored if report has 2 groups in 1 query

    My report has ORDER BY &p_q1_order, which allows the user to select the order of the report at runtime, either by business name or by applicator name. The ORDER BY was working fine with only one group in the query. I have since had to add new fields (applicator type) to the original query and have created a second group below the first in the data model. Now when the report runs it ignores the ORDER BY clause and orders the report by whichever field is first in the data model's first group. If I move the fields around in the data model, it will sort by whichever one I have at the top.
    My query is below. All fields except ps.status, ps.description, and ps.requalify are in the first group; those 3 fields are in the second group. If I move t.BUSINESS_NAME to the top, it orders by business name. If I move t.NAME to the top, it orders by the applicator's name and ignores the user's choice in the parameter form.
    My query:
    SELECT distinct
    t.NAME,
    t.BUSINESS_NAME,
    t.ID,
    ps.status,
    ps.description,
    ps.requalify,
    t.license_cd,
    t.ISSUE_DATE,
    t.EXPIRES,
    t.county,
    t.enf_dist,
    t.addr_ln_1||' '||t.addr_ln_2 address,
    t.city||' '||t.state||' '||t.zip ctystzip,
    t.phone
    FROM lic_com_appl_dler_addr t, pesticide_com_license_sumlic ps
    where t.id = ps.id(+)
    and ps.lic_subtype_cd not in('11','12','13','14','15')
    and t.EXPIRES like :p_current
    &p_where
    order by &p_q1_order

    Hi Terri
    Please let us know the Reports version you are using. Also, try without the bind variable in the query: does the ORDER BY work then? There were bugs like this in earlier releases of Reports, but they are all fixed now.
    Thanks
    The Oracle Reports Team

  • Error message:FRM-12001: Cannot Create the record group(check your query)

    Requirement: Need to get employee name and number in the LOV in search criteria.
    So I created the LOV "full_name" and a record group query under the Employee Name property palette with:
    select papf.title||' '||papf.last_name||', '||papf.first_name||' '||papf.middle_names emp_full_name
    ,papf.employee_number
    from apps.per_all_people_f papf, apps.per_person_types ppt
    where sysdate between papf.effective_start_date and papf.effective_end_date AND papf.person_type_id=ppt.person_type_id AND ppt.system_person_type IN ('EMP', 'OTHER', 'CWK','EMP_APL')
    AND PPT.default_flag='Y' and papf.BUSINESS_GROUP_ID=1
    order by papf.full_name
    I was unable to save, and I am getting the error message "FRM-12001: Cannot Create the record group(check your query)".
    I can't use PER_ALL_PEOPLE_F.FULL_NAME, since the full name there is last_name||title||middle_names||first_name.
    But my requirement is papf.title||' '||papf.last_name||', '||papf.first_name||' '||papf.middle_names emp_full_name.
    Can any one of you help me?

    First, Magoo wrote:
    <pre><font face = "Lucida Console, Courier New, Courier, Fixed" size = "1" color = "navy">create or replace function emp_full_name ( p_title in varchar2,
    p_last_name in varchar2,
    p_first_name in varchar2,
    p_mid_names in varchar2 ) return varchar2 is
    begin
    for l_rec in ( select decode ( p_title, null, null, p_title || ' ' ) ||
    p_last_name || ', ' || p_first_name ||
    decode ( p_mid_names, null, null, ' ' || p_mid_names ) full_name
    from dual ) loop
    return ( l_rec.full_name );
    end loop;
    end;</font></pre>
    Magoo, you don't ever need to use Select from Dual. And the loop is completely unnecessary, since Dual always returns only one record. This would be much simpler:
    <pre><font face = "Lucida Console, Courier New, Courier, Fixed" size = "1" color = "navy">create or replace function emp_full_name
    ( p_title in varchar2,
    p_last_name in varchar2,
    p_first_name in varchar2,
    p_mid_names in varchar2 ) return varchar2 is
    begin
    Return ( Ltrim( Rtrim ( p_title
    ||' ' ||p_last_name
    ||', '||p_first_name
    ||' ' ||p_mid_names )));
    end;</font></pre>
    And second:
    user606106, you did not mention how you got your record group working. However, you DO have an issue with spaces. If you change this:
    <pre><font face = "Lucida Console, Courier New, Courier, Fixed" size = "1" color = "navy">select papf.title||' '||papf.last_name||', '||papf.first_name||' '||papf.middle_names emp_full_name
    ,papf.employee_number </font></pre>
    to this:
    <pre><font face = "Lucida Console, Courier New, Courier, Fixed" size = "1" color = "navy">select Ltrim(Rtrim(papf.title||' '||papf.last_name||', '
    ||papf.first_name||' '||papf.middle_names)) AS emp_full_name,
    papf.employee_number</font></pre>
    it should work. The Ltrim(Rtrim()) removes leading and trailing spaces from the resulting full name.

  • FRM-12001:  Cannot create the record group (check your query).

    I want to add a record group in a pre-built form. The query is very simple, like:
    SELECT item_code
    FROM items
    WHERE active = 'Y'
    AND item_code like 'FAJ%'
    But the system shows the error message:
    Cannot create the record group (check your query).

    Make sure the user connected to the database from Forms Builder has the privilege to select from the table, or that there is a synonym.
    Try your query from SQL*Plus, connected as the same user.
    Tony

  • Displaying a radio group in SQL QUERY report region

    Good morning everyone,
    I have a report in which a column - ORDER STATUS, will come in with a value of 1, 2 or 3...being order unfilled, order partially filled, or order filled, respectively.
    I would like to display the order status as a radio group on the report so that it will be easy to run down the column of radio buttons to see what is filled, etc.
    I've gone to the manual and checked the doc on HTMLDB_ITEM.RADIOGROUP, but the example given there is actually for CHECKBOX (is this an error?).
    I went to the forums and found nothing suitable.
    My region is an SQL QUERY. Can I display the STATUS as a radio group in the SELECT ?
    This is the question.
    Thank you in anticipation. TC. 23/11/2004

    Tony,
    There may be better solutions, but here's what I was thinking:
        create table orders (id number, status number, customer varchar2(30));
        insert into orders (id,status,customer) values(1,1,'ACME');
        insert into orders (id,status,customer) values(2,2,'BENSON');
        insert into orders (id,status,customer) values(3,3,'CLARKE');
        commit;
        Query Source
        select
          id "ORDER NUMBER",
          decode(status,
            1,htmldb_item.RADIOGROUP(1,status,'1','unfilled')||htmldb_item.RADIOGROUP(2,status,'2','partial','"disabled=true"')||htmldb_item.RADIOGROUP(3,status,'3','filled','"disabled=true"'),
            2,htmldb_item.RADIOGROUP(1,status,'1','unfilled','"disabled=true"')||htmldb_item.RADIOGROUP(2,status,'2','partial')||htmldb_item.RADIOGROUP(3,status,'3','filled','"disabled=true"'),
            3,htmldb_item.RADIOGROUP(1,status,'1','unfilled','"disabled=true"')||htmldb_item.RADIOGROUP(2,status,'2','partial','"disabled=true"')||htmldb_item.RADIOGROUP(3,status,'3','filled'))
          "STATUS",
          customer "Customer Name"
        from orders;
    Scott

  • GROUP and MAX Query

    Hi
    I have two tables that store following information
    CREATE TABLE T_FEED (FEED_ID NUMBER, GRP_NUM NUMBER);
    CREATE TABLE T_FEED_RCV (FEED_ID NUMBER, RCV_DT DATE);
    INSERT INTO T_FEED VALUES (1, 1);
    INSERT INTO T_FEED VALUES (2, 1);
    INSERT INTO T_FEED VALUES (3, 2);
    INSERT INTO T_FEED VALUES (4, NULL);
    INSERT INTO T_FEED VALUES (5, NULL);
    INSERT INTO T_FEED_RCV VALUES (2, '1-MAY-2009');
    INSERT INTO T_FEED_RCV VALUES (3, '1-FEB-2009');
    INSERT INTO T_FEED_RCV VALUES (4, '12-MAY-2009');
    COMMIT;
    I join these tables using the following query to return all the feeds and check when each feed was received:
    SELECT
    F.FEED_ID,
    F.GRP_NUM,
    FR.RCV_DT
    FROM T_FEED F
    LEFT OUTER JOIN T_FEED_RCV FR
    ON F.FEED_ID = FR.FEED_ID
    ORDER BY GRP_NUM, RCV_DT DESC;
    Output
    FEED_ID     GRP_NUM     RCV_DT
    1     1     
    2     1     5/1/2009
    3     2     2/1/2009
    5          
    4          5/12/2009
    Actually, I want the maximum date of when each feed was received. GRP_NUM tells which feeds are grouped together; a NULL grp_num means they are not grouped, so treat each of them as an individual group. In the example, feeds 1 and 2 are in one group, and any one of those feeds is required. Feeds 3, 4 and 5 are individual groups, and all three are required.
    I need a single query that returns the maximum date for the feeds. For this example the result should be NULL, because: out of feeds 1 and 2 the max date is 5/1/2009; for feed 3 the max date is 2/1/2009; for feed 4 it is 5/12/2009; and for feed 5 it is NULL. Since one of the required feeds is null, the result should be null.
    DELETE FROM T_FEED;
    DELETE FROM T_FEED_RCV;
    COMMIT;
    INSERT INTO T_FEED VALUES (1, 1);
    INSERT INTO T_FEED VALUES (2, 1);
    INSERT INTO T_FEED VALUES (3, NULL);
    INSERT INTO T_FEED VALUES (4, NULL);
    INSERT INTO T_FEED_RCV VALUES (2, '1-MAY-2009');
    INSERT INTO T_FEED_RCV VALUES (3, '1-FEB-2009');
    INSERT INTO T_FEED_RCV VALUES (4, '12-MAY-2009');
    COMMIT;
    For above inserts, the result should be for feed 1 and 2 - 5/1/2009, feed 3 - 2/1/2009 and feed 4 - 5/12/2009. So the max of these dates is 5/12/2009.
    I tried using the MAX function grouped by GRP_NUM and also tried DENSE_RANK, but I was unable to resolve the issue. I am not sure how I can use the same query to return a non-null value for feeds in the same group and null (if any) for those that don't belong to any group. I'd appreciate it if anyone can help me.

    Hi,
    Kuul13 wrote:
    Thanks Frank!
    Appreciate your time and solution. I tweaked your earlier solution, which was cleaner and simpler, and built the following query to resolve the problem.
    SELECT * FROM (
    SELECT NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY')) AS GRP_ID
    ,MAX (FR.RCV_DT) AS MAX_DT
    FROM T_FEED F
    LEFT OUTER JOIN T_FEED_RCV FR ON F.FEED_ID = FR.FEED_ID
    GROUP BY NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY'))
    ORDER BY MAX_DT DESC NULLS FIRST)
    WHERE ROWNUM=1;
    I hope there are no hidden issues with this query compared to the later one you provided.
    Actually, I can see 4 issues with this. I admit that some of them are unlikely, but why take any chances?
    (1) The first argument to NVL is a NUMBER; the second (being the result of ||) is a VARCHAR2. That means one of them will be implicitly converted to the type of the other. This is just the kind of thing that behaves differently in different versions of Oracle, so it may work fine for a year or two and then, when you change to another version, mysteriously quit working. When you have to convert from one type of data to another, always do an explicit conversion, using TO_CHAR (for example).
    (2)
    F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY') will produce a key like '123405202009'. grp_num is a NUMBER with no restriction on the number of digits, so it could conceivably be 123405202009. The made-up grp_ids must never be the same as any real grp_num.
    (3) The combination (carr_id, feed_id, eff_dt) is unique, but using TO_CHAR(EFF_DT, 'MMDDYYYY') assumes that the combination (carr_id, feed_id, TRUNC (eff_dt)) is unique. Even if eff_dt is always entered as (say) midnight (00:00:00) now, you may decide to start using the time of day sometime in the future. What are the chances that you'll remember to change this query when you do? Not very likely. If multiple rows from the same day are relatively rare, this is the kind of error that could go on for months before you even realize that there is an error.
    (4) Say you have this data in t_feed:
    carr_id      feed_id  eff_dt       grp_num
    1        234      20-May-2009  NULL
    12       34       20-May-2009  NULL
    123      4        20-May-2009  NULL
    All of these rows will produce the same grp_id: 123405202009.
    Using NVL, as you are doing, allows you to get by with just one sub-query, which is nice.
    You can do that and still address all the problems above:
    SELECT  *
    FROM     (
         SELECT  NVL2 ( F.GRP_NUM
                   , 'A' || TO_CHAR (f.grp_num)
                   , 'B' || TO_CHAR (f.carr_id) || ':' ||
                            TO_CHAR (f.feed_id) || ':' ||
                            TO_CHAR ( f.eff_dt
                                    , 'MMDDYYYYHH24MISS'
                                    )
                   )  AS grp_id
         ,     MAX (FR.RCV_DT) AS MAX_DT
         FROM             T_FEED      F
         LEFT OUTER JOIN  T_FEED_RCV  FR  ON  F.FEED_ID = FR.FEED_ID
         GROUP BY  NVL2 ( F.GRP_NUM
                     , 'A' || TO_CHAR (f.grp_num)
                     , 'B' || TO_CHAR (f.carr_id) || ':' ||
                              TO_CHAR (f.feed_id) || ':' ||
                              TO_CHAR ( f.eff_dt
                                      , 'MMDDYYYYHH24MISS'
                                      )
                     )
         ORDER BY  MAX_DT   DESC   NULLS FIRST
         )
    WHERE  ROWNUM = 1;
    I would still use two sub-queries, adding one to compute grp_id, so we don't have to repeat the NVL2 expression. I would also use a WITH clause rather than in-line views.
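    A minimal sketch of that WITH-clause version (untested; it assumes the same T_FEED and T_FEED_RCV columns used in the thread, including CARR_ID and EFF_DT):

    ```sql
    -- Sketch only: compute grp_id once in a named sub-query,
    -- then aggregate per group and keep the row whose max date sorts first
    -- (NULLS FIRST, so a missing required feed yields a NULL result).
    WITH got_grp_id AS
    (
        SELECT  f.feed_id
        ,       NVL2 ( f.grp_num
                     , 'A' || TO_CHAR (f.grp_num)
                     , 'B' || TO_CHAR (f.carr_id) || ':' ||
                              TO_CHAR (f.feed_id) || ':' ||
                              TO_CHAR (f.eff_dt, 'MMDDYYYYHH24MISS')
                     )  AS grp_id
        FROM    t_feed  f
    )
    SELECT  *
    FROM    (
            SELECT    g.grp_id
            ,         MAX (fr.rcv_dt)  AS max_dt
            FROM      got_grp_id       g
            LEFT OUTER JOIN  t_feed_rcv  fr  ON  g.feed_id = fr.feed_id
            GROUP BY  g.grp_id
            ORDER BY  max_dt  DESC  NULLS FIRST
            )
    WHERE   ROWNUM = 1;
    ```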
    Do you find it easier to read the query above, or the simpler query you posted in your last message?
    Please make things easy on yourself and the people who want to help you. Always format your code so that the way the code looks on the screen makes it clear what the code is doing.
    In particular, the formatting should make clear
    (a) where each clause (SELECT, FROM, WHERE, ...) of each query begins
    (b) where sub-queries begin and end
    (c) what each argument to functions is
    (d) the scope of parentheses
    When you post formatted text on this site, type these 6 characters: {code}
    before and after the formatted text, to preserve spacing.
    The way you post the DDL (CREATE TABLE ...)  and DML (INSERT ...) statements is great: I wish more people were as helpful as you.
    There's no need to format the DDL and DML. (If you want to, then go ahead: it does help a little.)

  • How to change user group of a query (sq01)

    Hi,
    I'm working with SAP Queries (SQ01) and I don't know how to change the user group assignment of a query. We had only one user group for all users and queries, and now, in order to organize things a bit, we've created three user groups.
    We know how to assign users to the new user groups, but each query of the old group needs to be moved to one of the new user groups. How can we do this?
    thanks in advance

    Hi
    Please go through these links:
    http://help.sap.com/saphelp_46c/helpdata/en/35/26b413afab52b9e10000009b38f974/content.htm
    Step-by-step guide for creating ABAP query
    http://www.sappoint.com/abap/ab4query.pdf
    ABAP query is mostly used by functional consultants.
    SAP Query
    Purpose
    The SAP Query application is used to create lists not already contained in the SAP standard system. It has been designed for users with little or no knowledge of the SAP programming language ABAP. SAP Query offers users a broad range of ways to define reporting programs and create different types of reports such as basic lists, statistics, and ranked lists.
    Features
    SAP Query's range of functions corresponds to the classical reporting functions available in the system. Requirements in this area such as list, statistic, or ranked list creation can be met using queries.
    All the data required by users for their lists can be selected from any SAP table created by the customer.
    for more information please go thru this url:
    http://www.thespot4sap.com/Articles/SAP_ABAP_Queries_Create_The_Query.asp
    http://goldenink.com/abap/sap_query.html
    Please check this PDF document (starting at page 352); perhaps it will help you.
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCSRVQUE/BCSRVQUE.pdf
    The link below will also be helpful for you:
    Tutorial on SQVI
    Once you create a query, the system generates a report starting with AQZZ/SAPQUERY/ABAGENCY2=======; assign this report to a transaction code for the same.

  • Grouping using report query / Case statement in toplink

    I have following tables
    1. Student with columns id, gender
    2. Subject_Score with columns id, student_id, subject_id, score
    To get scores grouped by subject, I am doing
    ExpressionBuilder subjScoreBuilder = new ExpressionBuilder();
    ReportQuery query = new ReportQuery(SubjectScore.class);
    query.addAverage("average-score",subjScoreBuilder.get("score"));
    query.addGrouping(subjScoreBuilder.get("subjectId"));
    Vector responses = (Vector) serverSession.executeQuery(query);
    ReportQueryResult queryResult = (ReportQueryResult) responses.get(0);
    Float score = (Float) queryResult.get("average-score");
    This works fine. It gives avg score per each subject
    Now i want both in one query
    A) avg score per subject
    B) avg score per subject per gender
    I want to achive this in one query
    I am doing like:
    ExpressionBuilder subjScoreBuilder =new ExpressionBuilder(SubjectScore.class);
    ExpressionBuilder studentExpBuilder = new ExpressionBuilder(Student.class);
    Expression expression = subjScoreBuilder.get("studentid").equal(studentExpBuilder.get("id"));
    ReportQuery query = new ReportQuery(SubjectScore.class, expression);
    query.addAverage("average-score", subjScoreBuilder.get("score"));
    query.addGrouping( subjScoreBuilder.get("subjectId"));
    query.addGrouping( studentExpBuilder.get("gender"));
    This gives me the average score per subject per gender, i.e.
    it applies grouping on both subjectId and gender.
    This is fine.
    But I also want the average score per subject (grouped on subject only) in the same query.
    1. How can we achieve it?
    2. Is there something like a CASE statement in TopLink?
    Thanks a lot for any help.

    I believe in SQL you would need two queries to do this directly, so you will need to issue two queries.
    Alternatively, you could select the COUNT and the AVG per gender; that gives you all the data you need to compute the overall average yourself.
    i.e.
    (count(male) * avg(male) + count(female) * avg(female)) / (count(male) + count(female))
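
    The weighted-average formula above can be sketched in a few lines of Python (the function name and the sample counts/averages are illustrative, not from the thread):

    ```python
    # Combine per-gender averages into an overall average using the
    # weighted-average formula quoted above.
    def combined_avg(count_male, avg_male, count_female, avg_female):
        total = count_male + count_female
        return (count_male * avg_male + count_female * avg_female) / total

    # Example: 10 male scores averaging 70, 30 female scores averaging 80.
    # The result is weighted toward the larger group.
    print(combined_avg(10, 70.0, 30, 80.0))  # 77.5
    ```

    A plain AVG of the two per-gender averages (75.0 here) would be wrong whenever the group sizes differ, which is why the counts are needed.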

  • How to use group by in query

    In this query I want to group by item description and sum the quantity:
    SELECT T0.[Dscription], T0.[Quantity], T0.[OpenSum],  T1.[DocNum], T2.[State] FROM DLN1 T0  INNER JOIN ODLN T1 ON T0.DocEntry = T1.DocEntry, CRD1 T2 WHERE T1.[CardCode] =  T2.[CardCode] AND  T1.[DocNum] Like '1_%%'

    Hi YYREDDY,
    When you use the GROUP BY clause, every field in the SELECT clause must either appear in the GROUP BY clause or be part of an aggregate function (AVG, SUM, COUNT, ...).
    SELECT
        T0.Dscription, SUM(T0.Quantity), SUM(T0.OpenSum)
    FROM
        DLN1 T0 INNER JOIN
        ODLN T1 ON T0.DocEntry = T1.DocEntry, CRD1 T2
    WHERE
        T1.CardCode = T2.CardCode AND T1.DocNum Like '1_%%'
    GROUP BY
        T0.Dscription
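
    A minimal sketch of that rule, using an in-memory SQLite table standing in for DLN1 (the column names and sample rows are illustrative only):

    ```python
    import sqlite3

    # Tiny stand-in for the delivery-lines table from the question.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE dln1 (dscription TEXT, quantity INTEGER, opensum REAL)")
    conn.executemany("INSERT INTO dln1 VALUES (?, ?, ?)",
                     [("Widget", 2, 10.0), ("Widget", 3, 15.0), ("Gadget", 1, 7.5)])

    # Every selected column is either in GROUP BY or wrapped in an aggregate,
    # so each distinct description collapses to one row.
    rows = conn.execute("""
        SELECT dscription, SUM(quantity), SUM(opensum)
        FROM dln1
        GROUP BY dscription
        ORDER BY dscription
    """).fetchall()
    print(rows)  # [('Gadget', 1, 7.5), ('Widget', 5, 25.0)]
    ```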
    Regards,
    Vítor Vieira

  • Group By making query very slow

    Hi All,
    I have a query as follows,
    SELECT cc.segment1 company,
    recv.customer_number paying_account_number,
    irec.trx_number,
    irec.trx_date,
    irec.trx_type_name,
    recv.payment_method_dsp pmt_met,
    recv.customer_name paying_customer,
    irec.applied_payment_schedule_id,
    irec.customer_trx_id,
    irec.cash_receipt_id,
    irec.gl_date gl_date,
    irec.apply_date apply_date,
    irec.receipt_number,
    recv.receipt_date,
    'PMT' ps_class,
    SUM (NVL (irec.amount_applied, 0)) amount_applied
    FROM iex_receivable_applications_v irec,
    ra_cust_trx_types trxt,
    gl_code_combinations cc,
    ar_cash_receipts_v recv
    WHERE cc.segment2 != '410400'
    AND recv.receipt_status != 'REV'
    AND recv.cash_receipt_id = irec.cash_receipt_id
    AND cc.code_combination_id = trxt.gl_id_rev
    AND trxt.NAME = irec.trx_type_name
    AND trxt.org_id = irec.org_id
    AND trxt.cust_trx_type_id = irec.cust_trx_type_id
    AND recv.cash_receipt_id = irec.cash_receipt_id
    GROUP BY cc.segment1,
    recv.customer_number,
    recv.customer_name,
    irec.trx_number,
    irec.trx_date,
    irec.trx_type_name,
    recv.payment_method_dsp,
    irec.applied_payment_schedule_id,
    irec.customer_trx_id,
    irec.cash_receipt_id,
    irec.gl_date,
    irec.apply_date,
    irec.receipt_number,
    recv.receipt_date
    When I add the GROUP BY clause this query hangs, but without it the query runs very fast.
    Can anyone help me with this?
    Regards,
    Shruti

    When your query takes too long:
    HOW TO: Post a SQL statement tuning request - template posting
    When your query takes too long ...
    HTH
    Srini

  • Infoset data grouping for SAP Query

    Hi,
    I'm pretty new to SAP R/3 (4.7) and have a question that I think relates to the InfoSet rather than to SAP Query.
    I have two tables to join, but the right-hand table (it's a left outer join) needs to be summarised so that the records on each side of the join are unique. In SQL*Plus I would use a GROUP BY to do this. Is there a simple way to do this when building the InfoSet?
    I am not an ABAP programmer, nor do I have access to ABAP so would prefer not to go down this route unless it's the only way.
    Any assistance would be appreciated.

    Hi Guys,
    Any ideas?

  • No user group created - Adhoc Query transaction error

    Hi there,
    Whenever I try to access the Ad Hoc Query transaction S_PH0_48000513, I get the error "No user group created". The required user groups are present in the system.
    Any idea why am I getting this error?
    Regards,
    Anjali.

    Hi Anjali,
    Please check whether the InfoSet of your query is assigned to the user group via SQ02 -> Role/User Group Assignment.
    Regards,
    Dilek

  • How to group by below query?

    Hi All,
    I have below query,
    SELECT username, firstname, lastname, country
    FROM emp e1
    WHERE EXISTS (
        SELECT 1
        FROM emp
        WHERE e1.lastname = firstname
        AND e1.firstname = lastname )
    OR EXISTS (
        SELECT 1
        FROM emp
        WHERE lastname = e1.lastname
        AND firstname = e1.firstname )
    and the result of the query is
    USERNAME  FIRSTNAME  LASTNAME  COUNTRY
    1         b6         a6        CH
    2         a6         b6        CH
    3         a1         b1        CH
    4         b1         a1        CH
    5         a2         b2        CH
    6         a2         b2        CH
    7         b1         a1        CH
    8         b1         a1        CH
    9         b2         a2        CH
    10        b2         a2        CH
    Now I have to group together all the first-name/last-name pairs that have the values:
    1) a1,b1 and b1,a1
    2) a2,b2 and b2,a2
    3) a6,b6 and b6,a6
    Can you please help me in this?
    Thanks in advance.
    Edited by: user9258447 on Jul 27, 2010 6:16 AM

    Welcome to the forum!
    select least (first_name, last_name)
         , greatest (first_name, last_name)
         , ctry
      from test
    group by least (first_name, last_name)
            , greatest (first_name, last_name)
            , ctry
    as in
    SQL> with test as
      2  (
      3  select 1 username,    'b6' first_name,    'a6' last_name, 'CH' ctry from dual union all
      4  select 2 username,    'a6' first_name,    'b6' last_name,    'CH' ctry from dual union all
      5  select 3 username,    'a1' first_name,    'b1' last_name,    'CH' ctry from dual union all
      6  select 4 username,    'b1' first_name,    'a1' last_name,    'CH' ctry from dual union all
      7  select 5 username,    'a2' first_name,    'b2' last_name,    'CH' ctry from dual union all
      8  select 6 username,    'a2' first_name,    'b2' last_name,    'CH' ctry from dual union all
      9  select 7 username,    'b1' first_name, 'a1' last_name,    'CH' ctry from dual union all
    10  select 8 username,    'b1' first_name,    'a1' last_name,    'CH' ctry from dual union all
    11  select 9 username,    'b2' first_name,    'a2' last_name,    'CH' ctry from dual union all
    12  select 10 username,  'b2' first_name, 'a2' last_name,  'CH' ctry from dual
    13  )
    14  select least (first_name, last_name)
    15       , greatest (first_name, last_name)
    16       , ctry     
    17    from test
    18   group by least (first_name, last_name)
    19          , greatest (first_name, last_name)
    20          , ctry
    21  /
    LE GR CT
    a1 b1 CH
    a2 b2 CH
    a6 b6 CH
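
    The LEAST/GREATEST trick above canonicalises each (first_name, last_name) pair so that, e.g., (a1, b1) and (b1, a1) land in the same group. The same idea sketched in Python, with the sample pairs from the thread:

    ```python
    # Pairs as they appear in the data: some are reversed duplicates.
    rows = [("b6", "a6"), ("a6", "b6"), ("a1", "b1"),
            ("b1", "a1"), ("a2", "b2"), ("b2", "a2")]

    groups = {}
    for first, last in rows:
        # min/max play the role of Oracle's LEAST/GREATEST: the key is
        # order-independent, so reversed pairs share a group.
        key = (min(first, last), max(first, last))
        groups.setdefault(key, []).append((first, last))

    print(sorted(groups))  # [('a1', 'b1'), ('a2', 'b2'), ('a6', 'b6')]
    ```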

  • Multiple passes (groups) on same query data

    I need to run a report which has two sets of results, one for the current month, one for year to date. This would be easy with subreports, but unfortunately it's a very complex report which has a number of subreports already. I'm using CR10 and SQL Server.
    In both cases the main report's query is the same (most of the work's done in the subreports).
    I've done this before using Paradox tables, by adding an extra table with two records (Seq = 1 and 2) which is unconnected to the other tables (ignoring the warning). This results in the query data being repeated for each value in the Seq table. I group on Seq and I have what I want - two groups with the same data.
    This doesn't appear to work with a SQL Server DB though. Any ideas?

    If your report is based on an SQL Command data source, you could always run the query (or stored procedure) into a temp table (say, #temp), then end the SQL Command with
    select 1 as seq, * from #temp
    Union All
    select 2 as seq, * from #temp
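
    The effect of that UNION ALL can be sketched with SQLite (a plain table stands in for the #temp table, and the column names are made up for the example): every source row appears once per seq value, so a report can group on seq and render the same data in two sections.

    ```python
    import sqlite3

    # Stand-in for the report's intermediate result set.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE results (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO results VALUES (?, ?)",
                     [("Acme", 100.0), ("Globex", 250.0)])

    # Duplicate the result set, tagging each copy with a seq value.
    rows = conn.execute("""
        SELECT 1 AS seq, customer, amount FROM results
        UNION ALL
        SELECT 2 AS seq, customer, amount FROM results
        ORDER BY seq, customer
    """).fetchall()
    print(rows)
    # [(1, 'Acme', 100.0), (1, 'Globex', 250.0),
    #  (2, 'Acme', 100.0), (2, 'Globex', 250.0)]
    ```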
    I thought of this, but unfortunately I have no control over how the report is launched, so a temp table's not an option. I tried it with one of the tables involved, but it then runs incredibly slowly, presumably because Crystal is obliged to do the joins rather than the server.
    It does work quite well when I put all the joins in the Command, though that's a pain, particularly for maintenance.
    So thanks for encouraging me to look further.
