Select with aggregate function MAX

Hi,
Can I write a query like:
select field1 field2 MAX( field3 )
from z_table
into table itab
for all entries in itab2
where field1 = itab2-field1
     and field2 = itab2-field2.
In the above case all the 3 fields are key fields.
Thanks in advance.

Hi,
I did try it, but I got an error: "The field "z_table~field2" from the select list is missing in the GROUP BY clause". My aggregate function is applied only to field3, so I wanted to know if anybody has come across such a query.
Thanks,
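For reference, the error is the standard GROUP BY rule: every column in the select list that is not wrapped in an aggregate function must also appear in a GROUP BY clause, so the database layer expects GROUP BY field1 field2 here. A minimal plain-SQL sketch of the required shape (the :p_* bind names are placeholders standing in for the FOR ALL ENTRIES values; whether GROUP BY can be combined with FOR ALL ENTRIES depends on your ABAP release, so check the syntax check's verdict):
-- Sketch: field1 and field2 are not aggregated, so they have to be grouped.
SELECT field1,
       field2,
       MAX(field3) AS max_field3
FROM   z_table
WHERE  field1 = :p_field1
AND    field2 = :p_field2
GROUP  BY field1, field2;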

Similar Messages

  • Select with aggregate functions? i.e. select max(code)...

    The Code field on my user table is too small for the key field I want to use, but it is still required, so I assume I must generate my own unique value.
    The DBDataSource.Query() method doesn't seem appropriate so should I create a SAPbobsCOM.Recordset and use the DoQuery method to select the maximum value currently in the user table?
    I found some mentions of requiring such a select statement on this board but no examples of the actual code. I'll go ahead and try the DoQuery method unless I hear of something more appropriate.
    Also, I saw a comment that a future release of SAP would have a larger code field for user tables but obviously that didn't happen in 6.7.
    Thanks for any comments.
    Bill Faulk

    Here's what I came up with...
    If you are happy with 99,999,999 records instead of 4,294,967,295 then I guess you can forego the hexadecimal bit. I'd welcome any alternatives if someone has one.
    With the hex:
    private string GetNextKey(string strTable)
    {
        // Read the current maximum Code value (stored as 8 hex digits) and return the next one.
        SAPbobsCOM.Recordset rs =
            (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
        rs.DoQuery("select isnull(max(code),'00000000') from [" + strTable + "]");
        string strCode = (string)rs.Fields.Item(0).Value;
        int intKey = Convert.ToInt32(strCode, 16) + 1;
        return intKey.ToString("X8");
    }
    No Hex:
    private string GetNextKey(string strTable)
    {
        // Decimal variant: the Code value is stored as 8 decimal digits, so keys top out at 99,999,999.
        SAPbobsCOM.Recordset rs =
            (SAPbobsCOM.Recordset)oCompany.GetBusinessObject(BoObjectTypes.BoRecordset);
        rs.DoQuery("select isnull(max(code),'00000000') from [" + strTable + "]");
        string strCode = (string)rs.Fields.Item(0).Value;
        int intKey = Convert.ToInt32(strCode) + 1;
        return intKey.ToString("00000000");
    }

  • Help with aggregate function max query

    I have a large database which stores daily oil life, odo readings from thousands of vehicles being tested but only once a month.
    I am trying to grab data from one month where only the EOL_READ = 0 and put all of those values into the previous month's query where EOL_READ = anything.
    Here is the original query which grabs everything
    (select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('04/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    order by OIL_LIFE_PREDICTION)
    which gives 5 columns:
    STID, CASE_SAK, EOL_READ, ODO_READ, and OIL_LIFE_PREDICTION
    and gives me about 80000 rows returned for the above query
    I only want one reading per month, but sometimes I get two.
    STID is the unique identifier for a vehicle.
    I tried the following query, which nests one request within the other, and SQL times out every time:
    select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('02/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    and vdh.stid in (select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('04/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    and (max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000)))) = 0)
    order by OIL_LIFE_PREDICTION
    So, in summary, I am trying to select values from the previous month only for those STIDs where this month's value for EOL_READ = 0.
    Any ideas? Please help.

    You can use row_number() within each stid and each month to determine the first read of each month. Then you can use lag() to get the previous month's reading if the current month's reading is zero.
    with all_data as (
    select stid,
           case_sak,
           eol_read,
           timestamp,
           row_number() over (partition by stid, trunc(timestamp,'mm') order by timestamp) AS rn
      from veh_diag_history
    )
    select stid,
           case_sak,
           case
             when eol_read = 0
               then lag(eol_read) over (partition by stid order by timestamp)
             else eol_read
           end as eol_read
      from all_data
    where rn = 1;

  • Suggest me less expensive SELECT for aggregate function.

    Hi,
    I have a SELECT statement with an aggregate function, but as aggregates are expensive I want to replace it with a less expensive SELECT statement. Please let me know what I can use instead; the code is below:
    CLEAR my_amount.
      SELECT SUM( wrbtr ) into my_amount from BSID
                              WHERE bukrs = p_bukrs
                              and   kunnr = p_konto
                              and   budat <= sy-datum.
    So please suggest an alternative SELECT which is less expensive performance-wise.
    thank you

    >
    M_S_Raju_0613 wrote:
    > Hi,
    >
    > I have a SELECT statement with an aggregate function, but as aggregates are expensive I want to replace it with a less expensive SELECT statement. Please let me know what I can use instead; the code is below:
    >
    >
    CLEAR my_amount.
    >   SELECT SUM( wrbtr ) into my_amount from BSID
    >                           WHERE bukrs = p_bukrs
    >                           and   kunnr = p_konto
    >                           and   budat <= sy-datum.
    >
    > So please suggest an alternative SELECT which is less expensive performance-wise.
    >
    > thank you
    I don't think using SUM is a bad thing to do if you're using an index. The only alternative would be to select all the data into an internal table and add it up yourself, and I don't think that would be any quicker; it might even be slower, because you would be bringing back more records rather than restricting the number of records brought back at database level, which is what the SUM does. The thing I'd be concerned about here is that you're not taking account of the SHKZG debit/credit flag. But if all the lines you're dealing with are credits, or all are debits, that won't matter to you.
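    If the debit/credit indicator does matter, here is a plain-SQL sketch (not from the thread) of the usual sign adjustment; in BSID, SHKZG = 'H' marks credit lines and 'S' marks debit lines. Whether this can be pushed into the SELECT or has to be done in a loop over an internal table depends on your release, so treat it as an illustration only:
    -- Sketch: subtract credits (SHKZG = 'H'), add debits; the :p_* names are placeholders for the report parameters.
    SELECT SUM( CASE WHEN shkzg = 'H' THEN -wrbtr ELSE wrbtr END ) AS balance
    FROM   bsid
    WHERE  bukrs  = :p_bukrs
    AND    kunnr  = :p_konto
    AND    budat <= :p_keydate;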

  • Problem with aggregate function

    Hello,
    the prerequisite is to have a table with the following columns (#id, a_date).
    Now the task is to select the maximum date from a range of rows of this table
    select max(t.a_date)
    from the_table t
    where <condition to get only some rows of the_table>
    and to get the id of the record which holds the maximum date. That's where the problem is located. It should look like the following code (yes, I know this cannot work because grouping by the id makes the aggregate senseless) just to make clear what I want to do.
    select max(t.a_date), t.id
    from the_table t
    where <condition to get only some rows of the_table>
    group by t.id
    Does anybody have any idea how to solve the problem?
    With regards,
    Cosima
    Message was edited by:
    Cosima Laube

    Hi all,
    as the condition in the where-clause is very complex, I decided to use the solution with the FIRST function. After googling and reading a bit, I have just two more questions for my deeper understanding:
    1) am I right that both snippets below do exactly the same thing?
    SELECT MAX (t.a_date)
         , MAX (t.ID) KEEP (DENSE_RANK FIRST ORDER BY t.a_date DESC)
      FROM the_table t
    WHERE <condition to get only some rows of the_table>
    SELECT MAX (t.a_date)
         , MAX (t.ID) KEEP (DENSE_RANK LAST ORDER BY t.a_date ASC)
      FROM the_table t
    WHERE <condition to get only some rows of the_table>
    2) can I also use the MIN function for t.ID, as there the aggregate function does not have any effect (but is necessary in order to use KEEP and FIRST or KEEP and LAST)?
    With regards,
    Cosima
    Message was edited by:
    Cosima Laube
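    A small self-contained demo (not from the thread) for both questions, run against a throwaway table: FIRST ordered descending and LAST ordered ascending keep the same group of rows, so the two snippets above return the same values, and the outer MAX versus MIN only makes a difference when several rows tie on the maximum date:
    -- Throwaway demo table; the names mirror the pseudo-code above.
    CREATE TABLE the_table (id NUMBER, a_date DATE);
    INSERT INTO the_table VALUES (1, DATE '2009-01-01');
    INSERT INTO the_table VALUES (2, DATE '2009-02-01');
    INSERT INTO the_table VALUES (3, DATE '2009-02-01');   -- tie on the maximum date
    SELECT MAX(a_date)                                          AS max_date,
           MAX(id) KEEP (DENSE_RANK FIRST ORDER BY a_date DESC) AS id_first_desc,   -- 3
           MAX(id) KEEP (DENSE_RANK LAST  ORDER BY a_date ASC)  AS id_last_asc,     -- 3
           MIN(id) KEEP (DENSE_RANK FIRST ORDER BY a_date DESC) AS id_min_variant   -- 2: only the tie-break differs
    FROM   the_table;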

  • Select into aggregate function

    Hi,
    I have the following PL/SQL block that I am testing. The block reads a list of values from a local file and inserts the values into a select statement that uses an aggregate function. Here is my code:
    DECLARE
    f utl_file.file_type;
    n COLa%TYPE; --holds values from the file
    v COLb%TYPE; --holds value returned by select statement
    BEGIN
    f := utl_file.fopen('dir', 'file.txt', 'R');
    loop -- loop through file
         utl_file.get_line(f,n);
         if length(n) <> 0 then
              SELECT max(A0.COLa) into v
          FROM (SELECT A0B.COLa, A0.COLb
              FROM TAB1 A0,TAB2 A0B
              WHERE (A0.PK=A0B.PK) and A0B.COLa = n) A0;          
              IF SQL%rowcount = 0 THEN
               dbms_output.put_line('no rows');
              else
                   dbms_output.put_line(v);
              end if;                    
         end if;     
    end loop;
         utl_file.fclose(f);     
    end;
    Here is the error I get:
    declare
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at "SYS.UTL_FILE", line 98
    ORA-06512: at "SYS.UTL_FILE", line 656
    ORA-06512: at line 12
    I checked the database for the first couple of values in the list and got rows. I also ran a simple PL/SQL script to print the values in the file, successfully. So I don't know why it is returning no data found. From what I understand, a SELECT INTO statement using aggregate functions will never raise this exception, so why am I getting this error?
    Thanks.

    Hi,
    Actually, the SELECT ... INTO isn't the problem here.
    Look at the error message:
    ERROR at line 1:
    ORA-01403: no data found
    ORA-06512: at "SYS.UTL_FILE", line 98
    ORA-06512: at "SYS.UTL_FILE", line 656
    ORA-06512: at line 12
    The error is occurring inside utl_file, which is being called at line 12 of your code. (That must be utl_file.get_line(f,n);)
    Utl_file.get_line raises NO_DATA_FOUND when it reaches the end of the file. (If it didn't you'd have an infinite loop, since your code doesn't have any exit strategy.)
    You can put the call to get_line in its own BEGIN-EXCEPTION-END block, and trap the NO_DATA_FOUND error.
    For example:
    end_of_file := 0;
    WHILE  end_of_file = 0
    loop -- loop through file
        BEGIN
            utl_file.get_line(f,n);
            if length(n) != 0 then          -- Use !=, because you can't post <> on this site
                SELECT  max(A0B.COLa)     -- No sub-query needed
                into    v
                FROM    TAB1 A0
                ,         TAB2 A0B
                WHERE   (A0.PK   = A0B.PK)
                and         A0B.COLa = n;
                IF SQL%rowcount = 0 THEN     -- Not needed: SELECT MAX ... without GROUP BY always returns 1 row
                    dbms_output.put_line('no rows');
                else
                    dbms_output.put_line(v);
                end if;
            end if;
        EXCEPTION
            WHEN  NO_DATA_FOUND
            THEN
                end_of_file := 1;
        END;
    end loop;
    Edited by: Frank Kulash on Jul 17, 2009 2:51 PM

  • Join tables with aggregate function

    Hello
    I have 4 views that I need to perform aggregate function, count, on and then join them for query output.
    Basically each view has a column with a score and a subcontractor name. One subcontractor may have more than one score in each view.
    Here is the sql that I'm working with thus far:
    select e.sub_name, (count(e.score) - 1), (count(v.score) - 1), (count(s.score) - 1), (count(u.score) - 1)
    from speed.v_ratings_ex e, speed.v_ratings_vg v, speed.v_ratings_sat s, speed.v_ratings_uns u
    where v.sub_name = e.sub_name
    and s.sub_name = e.sub_name
    and u.sub_name = e.sub_name
    group by e.sub_name
    order by e.sub_name asc;
    This results in each column returning the same value because the join is performed before the aggregate function.
    Can anyone offer some help so that I may get the desired results?

    You need to use in-line views to perform the aggregates, then join the in-line views.
    Something like:
    SELECT e.sub_name, e.score - 1 escore, v.score - 1 vscore,
           s.score - 1 sscore, u.score - 1 uscore
    FROM (SELECT sub_name,count(*) score
          FROM v_ratings_ex
          GROUP BY sub_name) e,
         (SELECT sub_name,count(*) score
          FROM v_ratings_vg
          GROUP BY sub_name) v,
         (SELECT sub_name,count(*) score
          FROM v_ratings_sat
          GROUP BY sub_name) s,
         (SELECT sub_name,count(*) score
          FROM v_ratings_uns
          GROUP BY sub_name) u
    WHERE v.sub_name = e.sub_name and
          s.sub_name = e.sub_name and
          u.sub_name = e.sub_name
    ORDER BY e.sub_name asc;
    TTFn
    John

  • Group By Select Statement aggregate function error.

    I am using Dreamweaver MX4 and have an ASP web page. I can easily do this in Access 2007 with a simple "group by" sum totals statement.
    Why can I not get this to work on a web page? Are the "#" signs messing this up in the Code2 Select Statement?
    How can I get Code2 to work correctly? Code1 works fine. A user enters the customer and a date range on a form and submits the request. I get an error message in Dreamweaver: "You tried to execute a query where the specified expression field3 is not part of an aggregate function".
    Code1. This works fine SELECT Field3, Field10, SUM(Field16) as SumofField16 FROM CustomerHistory_CP WHERE Field3 LIKE '%Search_Criteria%' AND Field6 >= #1/2/09# AND Field6 <= #1/30/09#  GROUP BY Field3, Field10
    Code2. I get error message SELECT Field3, Field10, SUM(Field16) as SumofField16 FROM CustomerHistory_CP WHERE Field3 LIKE '%Search_Criteria%' AND Field6 >= #Date1# AND Field6 <= #Date2#  GROUP BY Field3, Field10

  • Aggregate function max

    Hi everybody,
    Can we use the MAX aggregate function to pick a field of type CHAR from a database table? If yes, then how?
    Please answer soon.
    Thank you.

    No. MAX is only for numbers (it will not even work for the NUMC type).
    Open SQL statements will not work on ASCII format.
    Cheers
    Kothand

  • Update of a table from a select query with aggregate functions.

    Hello All,
    I have a problem here:
    I have 2 tables, A (a1, a2, a3, a4, ...) and B (a1, a2, b1, b2, b3). I need to calculate avg(a4-a3), max(a4-a3) and min(a4-a3) and insert them into table B. If the foreign keys a1, a2 already exist in table B, I need to update the computed values into columns b1, b2 and b3 respectively, for that a1, a2.
    Q1. Is it possible to do this with a single query ? I would prefer not to join A with B because the table A is very large. Also columns b1, b2 and b3 are non-nullable.
    Q2. Also if a4 and a3 are timestamps what is the best way to find the average? A difference of timestamps yields INTERVAL DAY TO SECOND over which the avg function doesn't seem to work. The averages, max and min in my case would be less than a day and hence all I need is to get the data in the hh:mm:ss format.
    As of now I'm using :
    TO_CHAR(TO_DATE(ABS(MOD(TRUNC(AVG(extract(hour FROM (last_modified_date - created_date))*3600 +
    extract( minute FROM (last_modified_date - created_date))*60 +
    extract( second FROM (last_modified_date - created_date)))
    ),86400)),'sssss'),'hh24":"mi":"ss') AS avg_time,
    But this is very long-winded. Something more compact and efficient would be nice.
    Thanks in advance for your inputs.
    Edited by: 847764 on Mar 27, 2011 5:35 PM
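    For Q1, a hedged sketch (not the example referenced in the replies below) of a single MERGE that aggregates table A and upserts into table B; it assumes the durations are stored as a NUMBER of seconds, so formatting them as hh:mm:ss text would be a separate step. For Q2, note that subtracting two DATEs (or TIMESTAMPs cast to DATE) yields a plain NUMBER of days, which AVG, MAX and MIN handle directly:
    -- Sketch only: keys a1/a2, source columns a3/a4 and targets b1/b2/b3 are taken from the post; the rest is assumed.
    MERGE INTO b
    USING (
        SELECT a1,
               a2,
               AVG((CAST(a4 AS DATE) - CAST(a3 AS DATE)) * 86400) AS avg_secs,
               MAX((CAST(a4 AS DATE) - CAST(a3 AS DATE)) * 86400) AS max_secs,
               MIN((CAST(a4 AS DATE) - CAST(a3 AS DATE)) * 86400) AS min_secs
        FROM   a
        GROUP  BY a1, a2
    ) s
    ON (b.a1 = s.a1 AND b.a2 = s.a2)
    WHEN MATCHED THEN
        UPDATE SET b.b1 = s.avg_secs, b.b2 = s.max_secs, b.b3 = s.min_secs
    WHEN NOT MATCHED THEN
        INSERT (a1, a2, b1, b2, b3)
        VALUES (s.a1, s.a2, s.avg_secs, s.max_secs, s.min_secs);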

    847764 wrote:
    > Hi,
    > Thanks everyone for such fast replies. Malakshinov's example worked fine for me as far as updating the table goes. As for the timestamp computations, I'm posting additional info:
    Sorry, I don't understand. If Malakshinov's example worked for updating the table, but you still have problems, does that mean you have to do something else besides update the table? If so, what?
    > Oracle version: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0
    > Here are the table details:
    > DESC Table A
    > Name                 Null      Type
    > ID                   NOT NULL  NUMBER
    > A1                   NOT NULL  VARCHAR2(4)
    > A2                   NOT NULL  VARCHAR2(40)
    > A3                   NOT NULL  VARCHAR2(40)
    > CREATED_DATE         NOT NULL  TIMESTAMP(6)
    > LAST_MODIFIED_DATE             TIMESTAMP(6)
    DESCribing the tables can help clarify some things, but it's no substitute for posting CREATE TABLE and INSERT statements. With only a description of the table, nobody can re-create the problem or test their ideas. Please post CREATE TABLE and INSERT statements for both tables as they exist before the MERGE. If table b doesn't contain any rows before the MERGE, then just say so, but you still need to post a CREATE TABLE statement for both tables, and INSERT statements for table a.
    > The objective is to compute the response times: avg (LAST_MODIFIED_DATE - CREATED_DATE), max (LAST_MODIFIED_DATE - CREATED_DATE) and min (LAST_MODIFIED_DATE - CREATED_DATE) grouped by A1 and A2 and store it in table B under AVG_T, MAX_T and MIN_T. Since AVG_T, MAX_T and MIN_T are only used for reporting purposes we have kept it as Varchar (though I think keeping it as timestamp would make more sense).
    I think a NUMBER would make more sense (the number of minutes, for example), or perhaps an INTERVAL DAY TO SECOND. If you stored a NUMBER, it would be easy to compute averages.
    > In table B the times are stored in the format: hh:mm:ss. We don't need milliseconds precision.
    If you don't need milliseconds, then you should use DATE instead of TIMESTAMP. The functions for manipulating DATEs are much better.
    > Hence I was calculating it as follows:
    > -- Avg Time
    > TO_CHAR(TO_DATE(ABS(MOD(TRUNC(AVG(extract(hour FROM (last_modified_date - created_date))*3600 +
    > extract( minute FROM (last_modified_date - created_date))*60 +
    > extract( second FROM (last_modified_date - created_date)))
    > ),86400)),'sssss'),'hh24":"mi":"ss') AS avg_time,
    > -- Max Time
    > extract (hour FROM MAX(last_modified_date - created_date))||':'||extract (minute FROM MAX(last_modified_date - created_date))||':'||TRUNC(extract (second FROM MAX(last_modified_date - created_date))) AS max_time,
    > -- Min Time
    > extract (hour FROM MIN(last_modified_date - created_date))||':'||extract (minute FROM MIN(last_modified_date - created_date))||':'||TRUNC(extract (second FROM MIN(last_modified_date - created_date))) AS min_time
    Is this something that has to be done before or after the MERGE?
    Post the complete statement.
    Is this part of a query? Where's the SELECT keyword?
    Is this part of a DML operation? Where's the INSERT, or UPDATE, or MERGE keyword?
    What are the exact results you want from this? Explain how you get those results.
    Is the code above getting the right results? Are you just asking if there's a better way to get the same results?
    You have to explain things very carefully. None of the people who want to help you are familiar with your application, or your needs.
    > I just noticed that my reply is horribly formatted - apologies! I'm just getting the hang of it.
    Whenever you post formatted text (such as query results) on this site, type these 6 characters: {code} (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.

  • How to use analytic function with aggregate function

    Hello,
    Can we use an analytic function and an aggregate function in the same query? I tried to find an example on the net but could not find one showing how these two kinds of function work together. Please share any link or example.
    Edited by: Oracle Studnet on Nov 15, 2009 10:29 PM

    select t1.region_name,
           t2.division_name,
           t3.month,
           t3.amount mthly_sales,
           max(t3.amount) over (partition by t1.region_name, t2.division_name) max_mthly_sales
    from   region t1,
           division t2,
           sales t3
    where  t1.region_id = t3.region_id
    and    t2.division_id = t3.division_id
    and    t3.year = 2004
    Source:http://www.orafusion.com/art_anlytc.htm
    Here MAX (an aggregate function) and OVER (PARTITION BY ...) (analytic) appear in the same query. So it means we can use aggregate and analytic functions in the same query, and also more than one analytic function in the same query.
    Hth
    Girish Sharma
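    A further illustration (not from the thread, using the classic SCOTT.EMP table as an assumption) where a true GROUP BY aggregate and an analytic function appear in the same query; the analytic is evaluated over the grouped rows:
    -- Each department's total salary, plus its share of the grand total computed analytically.
    SELECT deptno,
           SUM(sal) AS dept_sal,
           ROUND(RATIO_TO_REPORT(SUM(sal)) OVER () * 100, 1) AS pct_of_total
    FROM   emp
    GROUP  BY deptno;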

  • Update statement with Aggregate function

    Hi All,
    I would like to update each record with the sum of SAL+COMM+DEPTNO in the SALCOMDEPT column. Can we use the SUM function in an UPDATE statement? Please help me out with this.
    See below:
    Table
    CREATE TABLE EMP (
    EMPNO NUMBER(4) NOT NULL,
    ENAME VARCHAR2(10 CHAR),
    JOB VARCHAR2(9 CHAR),
    MGR NUMBER(4),
    HIREDATE DATE,
    SAL NUMBER(7,2),
    COMM NUMBER(7,2),
    DEPTNO NUMBER(2),
    SALCOMDEPT NUMBER
    );
    Used update statement :
    UPDATE emp e1
    SET e1.salcomdept = (select sum(sal+comm+deptno) from emp e2 where e2.empno=e1.empno)
    WHERE e1.deptno = 10;
    commit;
    Thanks,
    User

    Adding these columns makes no sense, so I'll assume this is just an exercise.
    However, storing calculated columns like this is generally not a good idea. If one of the other columns is updated, your calculated column will be out of sync.
    One way around this is to create a simple view
    SQL> CREATE view EMP_v as
      2  select EMPNO
      3        ,ENAME
      4        ,JOB
      5        ,MGR
      6        ,HIREDATE
      7        ,SAL
      8        ,COMM
      9        ,DEPTNO
    10        ,(nvl(sal,0) + nvl(comm,0) + nvl(deptno,0)) SALCOMDEPT
    11  from   emp;
    View created.
    SQL> select sal, comm, deptno, salcomdept from emp_v;
                     SAL                 COMM               DEPTNO           SALCOMDEPT
                     800                                        20                  820
                    1600                  300                   30                 1930
                    1250                  500                   30                 1780
                    2975                                        20                 2995
                    1250                 1400                   30                 2680
                    2850                                        30                 2880
                    2450                                        10                 2460
                    3000                                        20                 3020
                    5000                                        10                 5010
                    1500                    0                   30                 1530
                    1100                                        20                 1120
                     950                                        30                  980
                    3000                                        20                 3020
                    1300                                        10                 1310
    14 rows selected.

  • Strange behaviour of result area after select with XML-function

    Hello,
    I execute the following statement
    SELECT XMLElement("arow",
    XMLElement("bcol", a.b) ,
    XMLElement("ccol", a.c) d
    ) as res FROM a;
    and afterwards get a completely grey result area.
    SQL Developer says that it took 0.006 seconds and 5 rows were retrieved.
    After clicking in the result I see the column header and the row count
    on the left side, but the rest is grey.
    I am using SQL Developer 1.5.4 on Windows XP Professional Service
    Pack 3 and Oracle 10.2.0.4.
    I also searched Metalink to see whether this behaviour is a bug, but I didn't find anything.
    In this forum I found a solution (getclobval), but I cannot use that solution
    for several reasons.
    How to reproduce:
    <--start-->
    create table a (b number not null primary key, c varchar2(10 char));
    insert into a (b, c) values (1, 'a');
    insert into a (b, c) values (2, 'b');
    insert into a (b, c) values (3, 'c');
    insert into a (b, c) values (4, 'd');
    insert into a (b, c) values (5, 'e');
    commit;
    Execute the statement above.
    <--stop-->
    If you need further information, let me know.
    Thanks for your help in advance.
    Yours,
    Erich

    Erich,
    It looks like you need to provide more details. I can't replicate this on 1.5.5 or the latest 2.1 early adopter, on either a remote 10g database or a local 11g database. It sounds like it may be related to your setup, perhaps a slow network connection? The JDK sounds fine, although I'm testing on 1.6_15 for the 1.5.5 installation. Is it only related to the XML queries - do you see this with other queries?
    Sue

  • Answers Reports with Aggregate Function

    Hi,
    I have a Product Table with columns, CustomerName, ProductID,ProductName,Purchase Date. I am trying to create a report in the flg. format:
    CustomerName   Customer-Prod Count   ProductName   Total Prod Count   % of Products
    Mike           10                    CD            2                  20%
                                         DVD           1                  10%
                                         Computer      6                  60%
                                         Laptop        1                  10%
    Dan            5                     Computer      3                  60%
                                         DVD           2                  40%
    Any suggestions on how to accomplish this?
    Thanks,
    rkingmdu

    Thanks, the syntax worked. But I get incorrect results.
    Here's what I queried:
    "PRODUCTS".CUSTOMERNAME,
    COUNT("PRODUCTS"."PRODUCTID" By "PRODUCTS"."CUSTOMERNAME"),
    "PRODUCTS".PRODUCTNAME,
    COUNT("PRODUCTS"."PRODUCTID" By "PRODUCTS"."PRODUCTID", "PRODUCTS"."CUSTOMERNAME")
    The first count returns the correct results but the second count is wrong. The query looks to be ok, I don't know where I am getting it wrong.
    Any ideas?
    Thanks
    rkingmdu

  • Need complex query  with joins and AGGREGATE  functions.

    Hello Everyone ;
    Good Morning to all ;
    I have 3 tables with 2 lakh records. I need to check query performance: how does the CBO rewrite my query against a materialized view?
    I want to make complex join with AGGREGATE FUNCTION.
    my table details
    SQL> select * from tab;
    TNAME TABTYPE CLUSTERID
    DEPT TABLE
    PAYROLL TABLE
    EMP TABLE
    SQL> desc emp
    Name
    EID
    ENAME
    EDOB
    EGENDER
    EQUAL
    EGRADUATION
    EDESIGNATION
    ELEVEL
    EDOMAIN_ID
    EMOB_NO
    SQL> desc dept
    Name
    EID
    DNAME
    DMANAGER
    DCONTACT_NO
    DPROJ_NAME
    SQL> desc payroll
    Name
    EID
    PF_NO
    SAL_ACC_NO
    SALARY
    BONUS
    I want to make a complex query with joins and AGGREGATE functions.
    Dept names are: IT, ITES, Accounts, Mgmt, Hr
    Graduations are: Engineering, Arts, Accounts, business_applications
    I want to select records for employees who are working in IT and ITES and whose graduation is "Engineering",
    with salary > 20000 and <= 22800 and bonus > 1000 and <= 1999, with counts for males and females separately.
    Please help me to make such a complex query with joins.
    Thanks in advance ..
    Edited by: 969352 on May 25, 2013 11:34 AM
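    As a starting point, here is a hedged sketch (not from the thread) of the join-plus-aggregate query the post describes; the literal values 'IT', 'ITES', 'Engineering' and the gender codes 'M'/'F' are assumptions based on the wording:
    -- Sketch: males and females counted separately per department for the stated ranges.
    SELECT d.dname,
           COUNT(CASE WHEN e.egender = 'M' THEN 1 END) AS male_count,
           COUNT(CASE WHEN e.egender = 'F' THEN 1 END) AS female_count
    FROM   emp e
    JOIN   dept d    ON d.eid = e.eid
    JOIN   payroll p ON p.eid = e.eid
    WHERE  d.dname IN ('IT', 'ITES')
    AND    e.egraduation = 'Engineering'
    AND    p.salary > 20000 AND p.salary <= 22800
    AND    p.bonus  > 1000  AND p.bonus  <= 1999
    GROUP  BY d.dname;
    Wrapping an aggregate like this in a materialized view created with ENABLE QUERY REWRITE is the usual way to experiment with the rewrite behaviour the post asks about.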

    969352 wrote:
    > why do you avoid providing requested & NEEDED details?
    > I do NOT understand what you expect?
    > My Goal is:
    > 1. When executing my own query I need to check the explain plan.
    Please proceed to do so:
    http://docs.oracle.com/cd/E11882_01/server.112/e26088/statements_9010.htm#SQLRF01601
    > 2. If I enable the query rewrite option, I want to check the explain plan (how the optimizer rewrites my query).
    Please proceed to do so:
    http://docs.oracle.com/cd/E11882_01/server.112/e16638/ex_plan.htm#PFGRF009
    > 3. My only aim is QUERY PERFORMANCE with the QUERY REWRITE clause in a materialized view.
    It is an admirable goal.
    Best Wishes on your quest for performance improvements.
