Select with aggregate functions? i.e. select max(code)...

The Code field on my user table is too small for the key value I want to use, but since the field is still required, I assume I must generate my own unique value.
The DBDataSource.Query() method doesn't seem appropriate, so should I create a SAPbobsCOM.Recordset and use its DoQuery method to select the maximum value currently in the user table?
I found some mentions of requiring such a select statement on this board, but no examples of the actual code. I'll go ahead and try the DoQuery method unless I hear of something more appropriate.
Also, I saw a comment that a future release of SAP would have a larger Code field for user tables, but obviously that didn't happen in 6.7.
Thanks for any comments.
Bill Faulk

Here's what I came up with...
If you are happy with 99,999,999 keys instead of 4,294,967,295, then I guess you can forgo the hexadecimal bit. I'd welcome any alternatives if someone has one.
With the hex:
private string GetNextKey(string strTable)
{
    // Read the highest existing Code; default to "00000000" when the table is empty.
    SAPbobsCOM.Recordset rs =
        (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
    rs.DoQuery("select isnull(max(code),'00000000') from [" + strTable + "]");
    string strCode = (string)rs.Fields.Item(0).Value;
    // Treat the code as hexadecimal: parse base-16, increment, re-pad to eight digits.
    int intKey = Convert.ToInt32(strCode, 16) + 1;
    return intKey.ToString("X8");
}
No Hex:
private string GetNextKey(string strTable)
{
    SAPbobsCOM.Recordset rs =
        (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
    rs.DoQuery("select isnull(max(code),'00000000') from [" + strTable + "]");
    string strCode = (string)rs.Fields.Item(0).Value;
    // Plain decimal increment, zero-padded back to eight digits.
    int intKey = Convert.ToInt32(strCode) + 1;
    return intKey.ToString("00000000");
}

Similar Messages

  • Problem with aggregate function

    Hello,
    the prerequisite is to have a table with the following columns (#id, a_date).
    Now the task is to select the maximum date from a range of rows of this table
    select max(t.a_date)
    from the_table t
    where <condition to get only some rows of the_table>
    and to get the id of the record which holds the maximum date. That's where the problem is located. The code below shows what I want to do (yes, I know it cannot work, because grouping by the id makes the aggregate senseless); it is just to make the goal clear:
    select max(t.a_date), t.id
    from the_table t
    where <condition to get only some rows of the_table>
    group by t.id
    Does anybody have any idea how to solve the problem?
    With regards,
    Cosima
    Message was edited by:
    Cosima Laube

    Hi all,
    as the condition in the where-clause is very complex, I decided to use the solution with the FIRST function. After googling and reading a bit, I have just two more questions for my deeper understanding:
    1) am I right that both snippets below do exactly the same?
    SELECT MAX (t.a_date)
         , MAX (t.ID)KEEP (DENSE_RANK FIRST ORDER BY t.a_date DESC)
      FROM the_table t
    WHERE <condition to get only some rows of the_table>
    SELECT MAX (t.a_date)
         , MAX (t.ID)KEEP (DENSE_RANK LAST ORDER BY t.a_date ASC)
      FROM the_table t
    WHERE <condition to get only some rows of the_table>2) can I use als the min function for t.ID as there the aggregate function does not have any effect (but is necessary in order to use KEEP and FIRST or KEEP and LAST)?
    With regards,
    Cosima
    Message was edited by:
    Cosima Laube
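    For what it's worth, a minimal sketch answering question 2 (assuming the same the_table(id, a_date) as above): MIN works just as well, because KEEP (DENSE_RANK FIRST ...) restricts the aggregate to the rows holding the maximum date, so any aggregate over t.ID sees only those rows.
    SELECT MAX(t.a_date)
         , MIN(t.ID) KEEP (DENSE_RANK FIRST ORDER BY t.a_date DESC) AS id_of_max_date
      FROM the_table t
     WHERE <condition to get only some rows of the_table>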

  • Join tables with aggregate function

    Hello
    I have 4 views on which I need to perform an aggregate function (COUNT) and then join the results for query output.
    Basically each view has a column with a score and a subcontractor name. One subcontractor may have more than one score in each view.
    Here is the sql that I'm working with thus far:
    select e.sub_name, (count(e.score) - 1), (count(v.score) - 1), (count(s.score) - 1), (count(u.score) - 1)
    from speed.v_ratings_ex e, speed.v_ratings_vg v, speed.v_ratings_sat s, speed.v_ratings_uns u
    where v.sub_name = e.sub_name
    and s.sub_name = e.sub_name
    and u.sub_name = e.sub_name
    group by e.sub_name
    order by e.sub_name asc;
    This results in each column returning the same value b/c the join is performed before the aggregate function.
    Can anyone offer some help so that I may get the desired results?

    You need to use in-line views to perform the aggregates, then join the in-line views.
    Something like:
    SELECT e.sub_name, e.score - 1 escore, v.score - 1 vscore,
           s.score - 1 sscore, u.score - 1 uscore
    FROM (SELECT sub_name,count(*) score
          FROM v_ratings_ex
          GROUP BY sub_name) e,
         (SELECT sub_name,count(*) score
          FROM v_ratings_vg
          GROUP BY sub_name) v,
         (SELECT sub_name,count(*) score
          FROM v_ratings_sat
          GROUP BY sub_name) s,
         (SELECT sub_name,count(*) score
          FROM v_ratings_uns
          GROUP BY sub_name) u
    WHERE v.sub_name = e.sub_name and
          s.sub_name = e.sub_name and
          u.sub_name = e.sub_name
    ORDER BY e.sub_name asc;
    TTFn
    John

  • Select with aggregate function MAX

    Hi,
    Can I write a query like:
    select field1 field2 MAX( field3 )
    from z_table
    into table itab
    for all entries in itab2
    where field1 = itab2-field1
         and field2 = itab2-field2.
    In the above case all the 3 fields are key fields.
    Thanks in advance.

    Hi,
    I did try it, but got an error like: "The field "z_table~field2" from the select list is missing in the GROUP BY clause". But my aggregate function is used only for field3... So, I wanted to know if anybody has come across such a query...
    Thanks,
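    For reference, the rule behind that error: every non-aggregated column in the select list must appear in the GROUP BY clause. A minimal sketch in plain SQL (the ABAP Open SQL fix is the same idea, i.e. appending GROUP BY field1 field2 to the SELECT above):
    SELECT field1, field2, MAX(field3)
      FROM z_table
     GROUP BY field1, field2;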

  • Update of a table from a select query with aggregate functions.

    Hello All,
    I have problem here:
    I have 2 tables A(a1, a2, a3, a4, a4....... ) and B( a1, a2, b1, b2, b3). I need to calculate the avg(a4-a3), Max(a4-a3) and Min(a4-a3) and insert it into table B. If the foreign keys a1, a2 already exist in table B, I need to do an update of the computed values into column b1, b2 and b3 respectively, for a1, a2.
    Q1. Is it possible to do this with a single query? I would prefer not to join A with B because table A is very large. Also, columns b1, b2 and b3 are non-nullable.
    Q2. Also, if a4 and a3 are timestamps, what is the best way to find the average? A difference of timestamps yields INTERVAL DAY TO SECOND, over which the AVG function doesn't seem to work. The averages, max and min in my case would be less than a day, and hence all I need is to get the data in the hh:mm:ss format.
    As of now I'm using:
    TO_CHAR(TO_DATE(ABS(MOD(TRUNC(AVG(extract(hour FROM (last_modified_date - created_date))*3600 +
    extract( minute FROM (last_modified_date - created_date))*60 +
    extract( second FROM (last_modified_date - created_date)))
    ),86400)),'sssss'),'hh24":"mi":"ss') AS avg_time,
    But this is very long drawn. Something more compact and efficient would be nice.
    Thanks in advance for your inputs.
    Edited by: 847764 on Mar 27, 2011 5:35 PM

    847764 wrote:
    Hi,
    Thanks everyone for such fast replies. Malakshinov's example worked fine for me as far as updating the table goes. As for the timestamp computations, I'm posting additional info:
    Sorry, I don't understand. If Malakshinov's example worked for updating the table, but you still have problems, does that mean you have to do something else besides updating the table? If so, what?
    Oracle version : Oracle Database 11g Enterprise Edition Release 11.2.0.1.0
    Here are the table details :
    DESC Table A
    Name Null Type
    ID               NOT NULL NUMBER
    A1               NOT NULL VARCHAR2(4)
    A2               NOT NULL VARCHAR2(40)
    A3               NOT NULL VARCHAR2(40)
    CREATED_DATE NOT NULL TIMESTAMP(6)
    LAST_MODIFIED_DATE TIMESTAMP(6)
    Describing the tables can help clarify some things, but it's no substitute for posting CREATE TABLE and INSERT statements. With only a description of the table, nobody can re-create the problem or test their ideas. Please post CREATE TABLE and INSERT statements for both tables as they exist before the MERGE. If table b doesn't contain any rows before the MERGE, then just say so, but you still need to post a CREATE TABLE statement for both tables, and INSERT statements for table a.
    The objective is to compute the response times: avg(LAST_MODIFIED_DATE - CREATED_DATE), max(LAST_MODIFIED_DATE - CREATED_DATE) and min(LAST_MODIFIED_DATE - CREATED_DATE), grouped by A1 and A2, and store them in table B under AVG_T, MAX_T and MIN_T. Since AVG_T, MAX_T and MIN_T are only used for reporting purposes, we have kept them as Varchar (though I think keeping them as timestamps would make more sense).
    I think a NUMBER would make more sense (the number of minutes, for example), or perhaps an INTERVAL DAY TO SECOND. If you stored a NUMBER, it would be easy to compute averages.
    In table B the times are stored in the format hh:mm:ss. We don't need millisecond precision.
    If you don't need milliseconds, then you should use DATE instead of TIMESTAMP. The functions for manipulating DATEs are much better.
    Hence I was calculating is as follows:
    -- Avg Time
    TO_CHAR(TO_DATE(ABS(MOD(TRUNC(AVG(extract(hour FROM (last_modified_date - created_date))*3600 +
    extract( minute FROM (last_modified_date - created_date))*60 +
    extract( second FROM (last_modified_date - created_date)))
    ),86400)),'sssss'),'hh24":"mi":"ss') AS avg_time,
    --Max Time
    extract (hour FROM MAX(last_modified_date - created_date))||':'||extract (minute FROM MAX(last_modified_date - created_date))||':'||TRUNC(extract (second FROM MAX(last_modified_date - created_date))) AS max_time,
    --Min Time
    extract (hour FROM MIN(last_modified_date - created_date))||':'||extract (minute FROM MIN(last_modified_date - created_date))||':'||TRUNC(extract (second FROM MIN(last_modified_date - created_date))) AS min_time
    Is this something that has to be done before or after the MERGE?
    Post the complete statement.
    Is this part of a query? Where's the SELECT keyword?
    Is this part of a DML operation? Where's the INSERT, or UPDATE, or MERGE keyword?
    What are the exact results you want from this? Explain how you get those results.
    Is the code above getting the right results? Are you just asking if there's a better way to get the same results?
    You have to explain things very carefully. None of the people who want to help you are familiar with your application, or your needs.
    I just noticed that my reply is horribly formatted - apologies! I'm just getting the hang of it.
    Whenever you post formatted text (such as query results) on this site, type these 6 characters:
    {code}
    (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.
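    A minimal sketch of the single-MERGE approach to Q1, assuming tables a(a1, a2, a3, a4) and b(a1, a2, b1, b2, b3), numeric b1..b3, and DATE rather than TIMESTAMP columns (so a4 - a3 is a plain number of days, which AVG, MAX and MIN handle directly):
    MERGE INTO b
    USING (SELECT a1, a2,
                  AVG(a4 - a3) AS avg_d,
                  MAX(a4 - a3) AS max_d,
                  MIN(a4 - a3) AS min_d
             FROM a
            GROUP BY a1, a2) s
       ON (b.a1 = s.a1 AND b.a2 = s.a2)
    WHEN MATCHED THEN
        UPDATE SET b.b1 = s.avg_d, b.b2 = s.max_d, b.b3 = s.min_d
    WHEN NOT MATCHED THEN
        INSERT (a1, a2, b1, b2, b3)
        VALUES (s.a1, s.a2, s.avg_d, s.max_d, s.min_d);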

  • Help with aggregate function max query

    I have a large database which stores daily oil life, odo readings from thousands of vehicles being tested but only once a month.
    I am trying to take the vehicles whose EOL_READ = 0 in one month and feed them into the previous month's query, where EOL_READ can be anything.
    Here is the original query which grabs everything
    (select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('04/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    order by OIL_LIFE_PREDICTION)
    which gives 5 columns:
    STID, CASE_SAK, EOL_READ, ODO_READ, and OIL_LIFE_PREDICTION
    and gives me about 80000 rows returned for the above query
    I only want one reading per month, but sometimes I get two.
    STID is the unique identifier for a vehicle.
    I tried this query, which nests one request within the other, and SQL times out every time:
    select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('02/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    and vdh.stid in (select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('04/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    and (max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000)))) = 0)
    order by OIL_LIFE_PREDICTION
    so in summary I am trying to select values from the previous month only from those stids where this month's value for EOL_READ = 0
    Any ideas.....please help.

    You can use row_number() within each stid and each month to determine the first read of each month. Then you can use lag() to get the previous month's reading if the current month's reading is zero.
    with all_data as (
    select stid,
           case_sak,
           eol_read,
           timestamp,
           row_number() over (partition by stid, trunc(timestamp,'mm') order by timestamp) AS rn
      from veh_diag_history
    )
    select stid,
           case_sak,
           case
             when eol_read = 0
               then lag(eol_read) over (partition by stid order by timestamp)
             else eol_read
           end as eol_read
      from all_data
     where rn = 1;

  • How to use analytic function with aggregate function

    hello
    Can we use an analytic function and an aggregate function in the same query? I tried to find an example on the Net but did not get any example of how these two functions work together. Any link or example, please share with me.
    Edited by: Oracle Studnet on Nov 15, 2009 10:29 PM

    select
    t1.region_name,
    t2.division_name,
    t3.month,
    t3.amount mthly_sales,
    max(t3.amount) over (partition by t1.region_name, t2.division_name)
    max_mthly_sales
    from
    region t1,
    division t2,
    sales t3
    where
    t1.region_id=t3.region_id
    and
    t2.division_id=t3.division_id
    and
    t3.year=2004
    Source: http://www.orafusion.com/art_anlytc.htm
    Here the MAX (aggregate) and OVER (PARTITION BY ...) (analytic) functions are in the same query. So we can use aggregate and analytic functions in the same query, and also more than one analytic function in the same query.
    Hth
    Girish Sharma

  • Update statement with Aggregate function

    Hi All,
    I would like to update the records with the sum of SAL+COMM+DEPTNO into the SALCOMDEPT column for each row. Can we use the SUM function in an UPDATE statement? Please help me out with this.
    See below:
    Table
    CREATE TABLE EMP
    (
    EMPNO NUMBER(4) NOT NULL,
    ENAME VARCHAR2(10 CHAR),
    JOB VARCHAR2(9 CHAR),
    MGR NUMBER(4),
    HIREDATE DATE,
    SAL NUMBER(7,2),
    COMM NUMBER(7,2),
    DEPTNO NUMBER(2),
    SALCOMDEPT NUMBER
    )
    Used update statement :
    UPDATE emp e1
    SET e1.salcomdept = (select sum(sal+comm+deptno) from emp e2 where e2.empno=e1.empno)
    WHERE e1.deptno = 10;
    commit;
    Thanks,
    User

    Adding these columns makes no sense, so I'll assume this is just an exercise.
    However, storing calculated columns like this is generally not a good idea. If one of the other columns is updated, your calculated column will be out of sync.
    One way around this is to create a simple view
    SQL> CREATE view EMP_v as
      2  select EMPNO
      3        ,ENAME
      4        ,JOB
      5        ,MGR
      6        ,HIREDATE
      7        ,SAL
      8        ,COMM
      9        ,DEPTNO
    10        ,(nvl(sal,0) + nvl(comm,0) + nvl(deptno,0)) SALCOMDEPT
    11  from   emp;
    View created.
    SQL> select sal, comm, deptno, salcomdept from emp_v;
                     SAL                 COMM               DEPTNO           SALCOMDEPT
                     800                                        20                  820
                    1600                  300                   30                 1930
                    1250                  500                   30                 1780
                    2975                                        20                 2995
                    1250                 1400                   30                 2680
                    2850                                        30                 2880
                    2450                                        10                 2460
                    3000                                        20                 3020
                    5000                                        10                 5010
                    1500                    0                   30                 1530
                    1100                                        20                 1120
                     950                                        30                  980
                    3000                                        20                 3020
                    1300                                        10                 1310
    14 rows selected.

  • Answers Reports with Aggregate Function

    Hi,
    I have a Product Table with columns CustomerName, ProductID, ProductName, Purchase Date. I am trying to create a report in the following format:
    CustomerName   Customer-Prod Count   ProductName   Total Prod Count   % of Products
    Mike           10                    CD            2                  20%
                                         DVD           1                  10%
                                         Computer      6                  60%
                                         Laptop        1                  10%
    Dan            5                     Computer      3                  60%
                                         DVD           2                  40%
    Any suggestions on how to accomplish this?
    Thanks,
    rkingmdu

    Thanks, the syntax worked. But I get incorrect results.
    Here's what I queried:
    "PRODUCTS".CUSTOMERNAME,
    COUNT("PRODUCTS"."PRODUCTID" By "PRODUCTS"."CUSTOMERNAME"),
    "PRODUCTS".PRODUCTNAME,
    COUNT("PRODUCTS"."PRODUCTID" By "PRODUCTS"."PRODUCTID", "PRODUCTS"."CUSTOMERNAME")
    The first count returns the correct results, but the second count is wrong. The query looks to be OK; I don't know where I am getting it wrong.
    Any ideas?
    Thanks
    rkingmdu

  • Select into aggregate function

    Hi,
    I have the following PL/SQL block that I am testing. The block reads a list of values from a local file and inserts the values into a select statement that uses an aggregate function. Here is my code:
    DECLARE
    f utl_file.file_type;
    n COLa%TYPE; --holds values from the file
    v COLb%TYPE; --holds value returned by select statement
    BEGIN
    f := utl_file.fopen(dir, 'file.txt', 'R');
    loop -- loop through file
         utl_file.get_line(f,n);
         if length(n) <> 0 then
              SELECT max(A0.COLa) into v
              FROM (SELECT A0B.COLa, A0.COLb
              FROM TAB1 A0, TAB2 A0B
              WHERE (A0.PK=A0B.PK) and A0B.COLa = n) A0;
              IF SQL%rowcount = 0 THEN
                   dbms_output.put_line('no rows');
              else
                   dbms_output.put_line(v);
              end if;                    
         end if;     
    end loop;
         utl_file.fclose(f);     
    end;
    Here is the error I get:
    declare
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at "SYS.UTL_FILE", line 98
    ORA-06512: at "SYS.UTL_FILE", line 656
    ORA-06512: at line 12
    I checked the database for the first couple of values in the list and got rows. I also ran a simple PL/SQL block to print the values in the file successfully. So I don't know why it is returning "no data found". From what I understand, a SELECT INTO statement using aggregate functions will never raise this exception, so why am I getting this error?
    Thanks.

    Hi,
    Actually, the SELECT ... INTO isn't the problem here.
    Look at the error message:
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at "SYS.UTL_FILE", line 98
    ORA-06512: at "SYS.UTL_FILE", line 656
    ORA-06512: at line 12
    The error is occurring inside utl_file, which is being called at line 12 of your code. (That must be
    utl_file.get_line(f,n);)
    Utl_file.get_line raises NO_DATA_FOUND when it reaches the end of the file. (If it didn't, you'd have an infinite loop, since your code doesn't have any exit strategy.)
    You can put the call to get_line in its own BEGIN-EXCEPTION-END block, and trap the NO_DATA_FOUND error.
    For example:
    end_of_file := 0;
    WHILE  end_of_file = 0
    loop -- loop through file
        BEGIN
            utl_file.get_line(f,n);
            if length(n) != 0 then          -- Use !=, because you can't post <> on this site
                SELECT  max(A0B.COLa)       -- No sub-query needed
                into    v
                FROM    TAB1 A0
                ,       TAB2 A0B
                WHERE   (A0.PK   = A0B.PK)
                and     A0B.COLa = n;
                IF SQL%rowcount = 0 THEN    -- Not needed: SELECT MAX ... without GROUP BY always returns 1 row
                    dbms_output.put_line('no rows');
                else
                    dbms_output.put_line(v);
                end if;
            end if;
        EXCEPTION
            WHEN  NO_DATA_FOUND
            THEN
                end_of_file := 1;
        END;
    end loop;
    Edited by: Frank Kulash on Jul 17, 2009 2:51 PM

  • Suggest a less expensive SELECT for an aggregate function.

    Hi,
    I have a SELECT statement with an aggregate function but, as aggregates are expensive, I want to replace it with a less expensive SELECT statement, so please let me know what I can use instead. The code is as below:
    CLEAR my_amount.
      SELECT SUM( wrbtr ) into my_amount from BSID
                              WHERE bukrs = p_bukrs
                              and   kunnr = p_konto
                              and   budat <= sy-datum.
    So, please let me know an alternative (replacement) SELECT which is less expensive performance-wise.
    thank you

    >
    M_S_Raju_0613 wrote:
    > I have a SELECT statement with an aggregate function but, as aggregates are expensive, I want to replace it with a less expensive SELECT statement, so please let me know what I can use instead. The code is as below:
    >
    > CLEAR my_amount.
    >   SELECT SUM( wrbtr ) into my_amount from BSID
    >                           WHERE bukrs = p_bukrs
    >                           and   kunnr = p_konto
    >                           and   budat <= sy-datum.
    >
    > So, please let me know an alternative (replacement) SELECT which is less expensive performance-wise.
    >
    > thank you
    I don't think using SUM is a bad thing to do if you're using an index. The only replacement would be to select all the data into an itab and add it up yourself, and I don't think that doing this would be any quicker - it might even be slower, because you would be bringing back more records rather than restricting the number of records brought back at database level, which is what the SUM would do. The thing I'd be concerned about here is that you're not taking account of the SHKZG debit/credit flag. But if all the lines you're dealing with are credits, or all are debits, that won't matter to you.
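    To illustrate that last point, a minimal sketch in plain SQL (assuming the usual FI convention that SHKZG = 'H' marks credits and 'S' debits; in ABAP you would apply the same sign logic after selecting SHKZG along with WRBTR):
    -- net the credits against the debits instead of summing raw amounts
    select sum( case when shkzg = 'H' then -wrbtr else wrbtr end ) as net_amount
      from bsid
     where bukrs = :p_bukrs
       and kunnr = :p_konto
       and budat <= :p_keydate;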

  • Need complex query with joins and AGGREGATE functions.

    Hello Everyone ;
    Good Morning to all ;
    I have 3 tables with 2 lakh records. I need to check query performance: how does the CBO rewrite my query against a materialized view?
    I want to make a complex join with an AGGREGATE FUNCTION.
    my table details
    SQL> select * from tab;
    TNAME     TABTYPE   CLUSTERID
    DEPT      TABLE
    PAYROLL   TABLE
    EMP       TABLE
    SQL> desc emp
    Name
    EID
    ENAME
    EDOB
    EGENDER
    EQUAL
    EGRADUATION
    EDESIGNATION
    ELEVEL
    EDOMAIN_ID
    EMOB_NO
    SQL> desc dept
    Name
    EID
    DNAME
    DMANAGER
    DCONTACT_NO
    DPROJ_NAME
    SQL> desc payroll
    Name
    EID
    PF_NO
    SAL_ACC_NO
    SALARY
    BONUS
    I want to make a complex query with joins and AGGREGATE functions.
    Dept names are : IT , ITES , Accounts , Mgmt , Hr
    GRADUATIONS are : Engineering , Arts , Accounts , business_applications
    I want to select records for employees working in IT and ITES whose graduation is "Engineering",
    with salary > 20000 and <= 22800 and bonus > 1000 and <= 1999, and counts for males and females separately.
    Please help me to make such a complex query with joins (see the sketch after the reply below).
    Thanks in advance ..
    Edited by: 969352 on May 25, 2013 11:34 AM

    969352 wrote:
    > why do you avoid providing requested & NEEDED details?
    > I do NOT understand what do you expect?
    > My Goal is:
    > 1. When executing my own query I need to check the explain plan.
    Please proceed to do so:
    http://docs.oracle.com/cd/E11882_01/server.112/e26088/statements_9010.htm#SQLRF01601
    > 2. If I enable the query rewrite option, I want to check the explain plan (how the optimizer rewrites my query).
    Please proceed to do so:
    http://docs.oracle.com/cd/E11882_01/server.112/e16638/ex_plan.htm#PFGRF009
    > 3. My only aim is QUERY PERFORMANCE with the QUERY REWRITE clause in a materialized view.
    It is an admirable goal.
    Best Wishes on your quest for performance improvements.
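    A minimal sketch of the requested join with aggregates, assuming DEPT and PAYROLL join to EMP on EID (as the DESC listings above suggest); treat the join keys and literals as assumptions read off the posted requirements, not the poster's tested schema:
    SELECT d.dname,
           COUNT(CASE WHEN e.egender = 'M' THEN 1 END) AS male_count,
           COUNT(CASE WHEN e.egender = 'F' THEN 1 END) AS female_count
      FROM emp e
      JOIN dept d    ON d.eid = e.eid
      JOIN payroll p ON p.eid = e.eid
     WHERE d.dname IN ('IT', 'ITES')
       AND e.egraduation = 'Engineering'
       AND p.salary > 20000 AND p.salary <= 22800
       AND p.bonus  > 1000  AND p.bonus  <= 1999
     GROUP BY d.dname;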

  • Using journalized data in an interface with aggregate function

    Hi
    I am trying to use the journalized data of a source table in one of my interfaces in ODI. The trouble is that one of the mappings on the target columns involves an aggregate function (SUM). When I run the interface I get an error saying "not a group by expression". I checked the code and found that the jrn_subscriber, jrn_flag and jrn_date columns are included in the select statement but not in the group by statement (the group by statement only contains the remaining two columns of the target table).
    Is there a way around this? Do I have to manually modify the km? If so how would I go about doing it?
    Also I am using Oracle GoldenGate JKM (oracle to oracle OGG).
    Thanks, and I really appreciate the help
    Ajay

    'ORA-00979' When Using The ODI CDC (Journalization) Feature With Knowledge Modules Including SQL Aggregate Functions [ID 424344.1]
         Modified 11-MAR-2009 Type PROBLEM Status MODERATED      
    In this Document
    Symptoms
    Cause
    Solution
    Alternatives :
    This document is being delivered to you via Oracle Support's Rapid Visibility (RaV) process, and therefore has not been subject to an independent technical review.
    Applies to:
    Oracle Data Integrator - Version: 3.2.03.01
    This problem can occur on any platform.
    Symptoms
    After having successfully tested an ODI Integration Interface using an aggregate function such as MIN, MAX, SUM, it is necessary to set up Changed Data Capture operations by using Journalized tables.
    However, during execution of the Integration Interface to retrieve only the Journalized records, problems arise at the Load Data step of the Loading Knowledge Module and the following message is displayed in ODI Log:
    ORA-00979: not a GROUP BY expression
    Cause
    Using both CDC - Journalization and aggregate functions gives rise to complex issues.
    Solution
    Technically there is a work around for this problem (see below).
    WARNING : Oracle engineers issue a severe warning that such a type of set up may give results that are not what may be expected. This is related to the way in which ODI Journalization is implemented as specific Journalization tables. In this case, the aggregate function will only operate on the subset which is stored (referenced) in the Journalization table and NOT over the entire Source table.
    We recommend to avoid such types of Integration Interface set ups.
    Alternatives :
    1.The problem is due to the missing JRN_* columns in the generated SQL "Group By" clause.
    The work around is to duplicate the Loading Knowledge Module (LKM), and in the clone, alter the "Load Data" step by editing the "Command on Source" tab and by replacing the following instruction:
    <%=odiRef.getGrpBy()%>
    with
    <%=odiRef.getGrpBy()%>
    <%if ((odiRef.getGrpBy().length() > 0) && (odiRef.getPop("HAS_JRN").equals("1"))) {%>
    ,JRN_FLAG,JRN_SUBSCRIBER,JRN_DATE
    <%}%>
    2. It is possible to develop two alternative solutions:
    (a) Develop two separate and distinct Integration Interfaces:
    * The first Integration Interface loads data into a temporary Table and specify the aggregate functions to be used in this initial Integration Interface.
    * The second Integration Interfaces uses the temporary Table as a Source. Note that if you create the Table in the Interface, it is necessary to drag and drop the Integration Interface into the Source panel.
    (b) Define two connections to the Database so that the Integration Interface references two distinct and separate Data Server Sources (one for the Journal, one for the other Tables). In this case, the aggregate function will be executed on the Source Schema.
    Related
    Products
    * Middleware > Business Intelligence > Oracle Data Integrator (ODI) > Oracle Data Integrator
    Keywords
    ODI; AGGREGATE; ORACLE DATA INTEGRATOR; KNOWLEDGE MODULES; CDC; SUNOPSIS
    Errors
    ORA-979
    Please find above the content from OTN.
    It should show you this if you search this ID in the Search Knowledge Base
    Cheers
    Sachin

  • How to write SQL query and apply aggregate functions on it

    Hello experts,
    I've a task to write a SQL query on three tables and do an inner join on them. I've accomplished this task by using T-CODE SQVI. However, now I need to write a query and apply SQL aggregate functions on it, i.e. Add, Count, Max and Min etc. Please can someone tell me how I can write a SQL query with aggregate functions in SAP?
    Thanks a lot in advance

    HI Mr. Cool
    You can see the below code for using aggregate functions,
    where ARTIST and SEATSOCCU are the field names for which you want to perform these functions.
    DATA: TOTAL_ENTRIES TYPE I,
          TOTAL_ATT TYPE I,
          MAX_ATT TYPE I,
          AVG_ATT TYPE I.
    SELECT COUNT( DISTINCT ARTIST )
           SUM( SEATSOCCU )
           MAX( SEATSOCCU )
           AVG( SEATSOCCU )
      FROM YCONCERT
      INTO (TOTAL_ENTRIES, TOTAL_ATT, MAX_ATT, AVG_ATT).
    Thanks
    Lalit Gupta

  • How to use Aggregate Functions during Top N analysis?

    Say I want to find the top 5 highest salaries and their total and average. In that case, how do I use aggregate functions? Please give me an example of this.
    Regards,
    Renu
    Message was edited by:
    user642387

    Hi,
    Yes, you can do that with aggregate functions.
    First, do a sub-query to retrieve all the salaries (in descending order), then say "WHERE ROWNUM <= 5" in the main query. Use the aggregate SUM and AVG functions in the main query.
    Analytic functions are easier to use for jobs like this, once you get familiar with them. If you're not leaving the field this month, then it's probably worthwhile for you to get familiar with analytic functions.
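    A minimal sketch of that approach (assuming the standard EMP demo table with a SAL column):
    SELECT SUM(sal) AS total_top5,
           AVG(sal) AS avg_top5
      FROM (SELECT sal
              FROM emp
             ORDER BY sal DESC)
     WHERE ROWNUM <= 5;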
