GROUP and MAX Query

Hi,
I have two tables that store the following information:
CREATE TABLE T_FEED (FEED_ID NUMBER, GRP_NUM NUMBER);
CREATE TABLE T_FEED_RCV (FEED_ID NUMBER, RCV_DT DATE);
INSERT INTO T_FEED VALUES (1, 1);
INSERT INTO T_FEED VALUES (2, 1);
INSERT INTO T_FEED VALUES (3, 2);
INSERT INTO T_FEED VALUES (4, NULL);
INSERT INTO T_FEED VALUES (5, NULL);
INSERT INTO T_FEED_RCV VALUES (2, TO_DATE('01-MAY-2009', 'DD-MON-YYYY'));
INSERT INTO T_FEED_RCV VALUES (3, TO_DATE('01-FEB-2009', 'DD-MON-YYYY'));
INSERT INTO T_FEED_RCV VALUES (4, TO_DATE('12-MAY-2009', 'DD-MON-YYYY'));
COMMIT;
I join these tables using the following query to return all the feeds and check when each feed was received:
SELECT
F.FEED_ID,
F.GRP_NUM,
FR.RCV_DT
FROM T_FEED F
LEFT OUTER JOIN T_FEED_RCV FR
ON F.FEED_ID = FR.FEED_ID
ORDER BY GRP_NUM, RCV_DT DESC;
Output
FEED_ID   GRP_NUM   RCV_DT
      1         1
      2         1   5/1/2009
      3         2   2/1/2009
      5
      4             5/12/2009
Actually I want the maximum date of when we received the feed. GRP_NUM tells which feeds are grouped together. A NULL GRP_NUM means the feed is not grouped, so treat it as its own individual group. In the example, feeds 1 and 2 are in one group and any one feed from the group is required. Feeds 3, 4 and 5 are individual groups and all three are required.
I need a single query that returns the maximum date for the feeds. For this example the result should be NULL, because out of feeds 1 and 2 the max date is 5/1/2009, for feed 3 the max date is 2/1/2009, for feed 4 it is 5/12/2009, and for feed 5 it is NULL. Since one of the required feeds was never received, the result should be NULL.
DELETE FROM T_FEED;
DELETE FROM T_FEED_RCV;
COMMIT;
INSERT INTO T_FEED VALUES (1, 1);
INSERT INTO T_FEED VALUES (2, 1);
INSERT INTO T_FEED VALUES (3, NULL);
INSERT INTO T_FEED VALUES (4, NULL);
INSERT INTO T_FEED_RCV VALUES (2, TO_DATE('01-MAY-2009', 'DD-MON-YYYY'));
INSERT INTO T_FEED_RCV VALUES (3, TO_DATE('01-FEB-2009', 'DD-MON-YYYY'));
INSERT INTO T_FEED_RCV VALUES (4, TO_DATE('12-MAY-2009', 'DD-MON-YYYY'));
COMMIT;
For the above inserts, the result should be: for feeds 1 and 2 - 5/1/2009; for feed 3 - 2/1/2009; and for feed 4 - 5/12/2009. So the max of these dates is 5/12/2009.
I tried using the MAX function grouped by GRP_NUM and also tried DENSE_RANK, but was unable to resolve the issue. I am not sure how I can use the same query to return a not-null value for feeds in the same group and NULL (if any is missing) for those that don't belong to any group. Appreciate if anyone can help me.
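The logic being asked for can be sanity-checked outside Oracle with a small Python/sqlite3 sketch (a rough stand-in, not the Oracle answer: ISO date strings replace DATE, COALESCE replaces NVL, and the 'G'/'F' key prefixes are made up here just to keep real group numbers and feed ids from colliding):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE t_feed (feed_id INTEGER, grp_num INTEGER);
CREATE TABLE t_feed_rcv (feed_id INTEGER, rcv_dt TEXT);
INSERT INTO t_feed VALUES (1,1),(2,1),(3,2),(4,NULL),(5,NULL);
INSERT INTO t_feed_rcv VALUES (2,'2009-05-01'),(3,'2009-02-01'),(4,'2009-05-12');
""")

# One row per group; rows with NULL grp_num form their own group keyed by
# feed_id. If any group's MAX(rcv_dt) is NULL, the overall answer is NULL;
# otherwise it is the greatest of the per-group maxima.
row = cur.execute("""
SELECT CASE WHEN SUM(max_dt IS NULL) > 0 THEN NULL ELSE MAX(max_dt) END
FROM (SELECT COALESCE('G' || grp_num, 'F' || feed_id) AS grp_id,
             MAX(rcv_dt) AS max_dt
      FROM t_feed LEFT JOIN t_feed_rcv USING (feed_id)
      GROUP BY grp_id)
""").fetchone()
print(row[0])  # feed 5 was never received, so the overall answer is NULL (None)
```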

Hi,
Kuul13 wrote:
Thanks Frank!
Appreciate your time and solution. I tweaked your earlier solution, which was cleaner and simpler, and built the following query to resolve the problem.
SELECT * FROM (
SELECT NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY')) AS GRP_ID
,MAX (FR.RCV_DT) AS MAX_DT
FROM T_FEED F
LEFT OUTER JOIN T_FEED_RCV FR ON F.FEED_ID = FR.FEED_ID
GROUP BY NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY'))
ORDER BY MAX_DT DESC NULLS FIRST)
WHERE ROWNUM=1;
I hope there are no hidden issues with this query compared with the later one you provided.
Actually, I can see 4 issues with this. I admit that some of them are unlikely, but why take any chances?
(1) The first argument to NVL is a NUMBER, the second (being the result of ||) is a VARCHAR2. That means one of them will be implicitly converted to the type of the other. This is just the kind of thing that behaves differently in different versions of Oracle, so it may work fine for a year or two, and then, when you change to another version, mysteriously quit working. When you have to convert from one type of data to another, always do an explicit conversion, using TO_CHAR (for example).
(2)
F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY') will produce a key like '123405202009'. GRP_NUM is a NUMBER with no restriction on the number of digits, so it could conceivably be 123405202009. The made-up grp_ids must never be the same as any real grp_num.
(3) The combination (carr_id, feed_id, eff_dt) is unique, but using TO_CHAR(EFF_DT, 'MMDDYYYY') assumes that the combination (carr_id, feed_id, TRUNC (eff_dt)) is unique. Even if eff_dt is always entered as (say) midnight (00:00:00) now, you may decide to start using the time of day sometime in the future. What are the chances that you'll remember to change this query when you do? Not very likely. If multiple rows from the same day are relatively rare, this is the kind of error that could go on for months before you even realize that there is an error.
(4) Say you have this data in t_feed:
carr_id      feed_id  eff_dt       grp_num
1        234      20-May-2009  NULL
12       34       20-May-2009  NULL
123      4        20-May-2009  NULL
All of these rows will produce the same grp_id: 123405202009.
Using NVL, as you are doing, allows you to get by with just one sub-query, which is nice.
You can do that and still address all the problems above:
SELECT  *
FROM    (
        SELECT  NVL2 ( f.grp_num
                     , 'A' || TO_CHAR (f.grp_num)
                     , 'B' || TO_CHAR (f.carr_id) || ':' ||
                              TO_CHAR (f.feed_id) || ':' ||
                              TO_CHAR ( f.eff_dt
                                      , 'MMDDYYYYHH24MISS'
                                      )
                     ) AS grp_id
        ,       MAX (fr.rcv_dt) AS max_dt
        FROM             t_feed      f
        LEFT OUTER JOIN  t_feed_rcv  fr  ON  f.feed_id = fr.feed_id
        GROUP BY  NVL2 ( f.grp_num
                       , 'A' || TO_CHAR (f.grp_num)
                       , 'B' || TO_CHAR (f.carr_id) || ':' ||
                                TO_CHAR (f.feed_id) || ':' ||
                                TO_CHAR ( f.eff_dt
                                        , 'MMDDYYYYHH24MISS'
                                        )
                       )
        ORDER BY  max_dt  DESC  NULLS FIRST
        )
WHERE   ROWNUM = 1;
I would still use two sub-queries, adding one to compute grp_id, so we don't have to repeat the NVL2 expression. I would also use a WITH clause rather than in-line views.
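The ORDER BY ... NULLS FIRST plus ROWNUM = 1 trick can be checked quickly with Python's sqlite3 (a sketch under obvious substitutions: ISO date strings, `(max_dt IS NULL) DESC` standing in for NULLS FIRST, LIMIT 1 for ROWNUM = 1, and no carr_id/eff_dt since the posted test tables don't have those columns):

```python
import sqlite3

cur = sqlite3.connect(":memory:").cursor()
cur.executescript("""
CREATE TABLE t_feed (feed_id INTEGER, grp_num INTEGER);
CREATE TABLE t_feed_rcv (feed_id INTEGER, rcv_dt TEXT);
INSERT INTO t_feed VALUES (1,1),(2,1),(3,NULL),(4,NULL);
INSERT INTO t_feed_rcv VALUES (2,'2009-05-01'),(3,'2009-02-01'),(4,'2009-05-12');
""")

# One row per group, NULL groups keyed by feed_id. A group that is still
# missing its feed sorts to the top, so row 1 is NULL exactly when a
# required feed was never received; otherwise it is the overall max date.
row = cur.execute("""
SELECT max_dt
FROM (SELECT COALESCE('A' || grp_num, 'B' || feed_id) AS grp_id,
             MAX(rcv_dt) AS max_dt
      FROM t_feed LEFT JOIN t_feed_rcv USING (feed_id)
      GROUP BY grp_id)
ORDER BY (max_dt IS NULL) DESC, max_dt DESC
LIMIT 1
""").fetchone()
print(row[0])  # → 2009-05-12, matching the expected result for the second data set
```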
Do you find it easier to read the query above, or the simpler query you posted in your last message?
Please make things easy on yourself and the people who want to help you. Always format your code so that the way the code looks on the screen makes it clear what the code is doing.
In particular, the formatting should make clear
(a) where each clause (SELECT, FROM, WHERE, ...) of each query begins
(b) where sub-queries begin and end
(c) what each argument to functions is
(d) the scope of parentheses
When you post formatted text on this site, type these 6 characters:
before and after the formatted text, to preserve spacing.
The way you post the DDL (CREATE TABLE ...)  and DML (INSERT ...) statements is great: I wish more people were as helpful as you.
There's no need to format the DDL and DML. (If you want to, then go ahead: it does help a little.)

Similar Messages

  • A bit tricky Min and Max Query

    Dear Masters
    I have a table
    Student_ID
    Semester_Name
    Registration_Date
    Now I want to retrieve, for each student_id, the semester_name with the lowest registration_date (first semester), and the last semester.
    I do not want to use a sub-query, as it has performance problems.
    Any help?
    Thanks
    Shakeel

    Solution:
    SELECT   student_id,
             MIN(semester_name) KEEP (DENSE_RANK FIRST ORDER BY registration_date) AS first_semester,
             MIN(semester_name) KEEP (DENSE_RANK LAST  ORDER BY registration_date) AS last_semester
    FROM     yourtable
    GROUP BY student_id;
    hth
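For readers without KEEP (DENSE_RANK ...) at hand, the same first/last pick per student can be sketched in plain Python; the sample rows here are made up for illustration:

```python
# (student_id, semester_name, registration_date) -- hypothetical sample rows
rows = [
    ("S1", "Fall-2008",   "2008-09-01"),
    ("S1", "Spring-2009", "2009-02-01"),
    ("S1", "Fall-2009",   "2009-09-01"),
    ("S2", "Spring-2009", "2009-02-15"),
]

by_student = {}
for sid, sem, reg in rows:
    # store (registration_date, semester_name) so min()/max() order by date
    by_student.setdefault(sid, []).append((reg, sem))

# first semester = semester of the earliest registration, last = of the latest
result = {sid: (min(v)[1], max(v)[1]) for sid, v in by_student.items()}
print(result)
```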

  • Help on min and max query

    hi all,
    I have a query that returns the following.
    SELECT id, start_time, end_time
    FROM accounts
    WHERE
    id = 006267
    ORDER BY begin_time
    result is
         ID     BEGIN_TIME     END_TIME
         006267     11-12-2006 15:00:00     11-12-2006 17:30:00
         006267     15-12-2006 17:00:00     15-12-2006 19:30:00
         006267     18-12-2006 08:30:00     18-12-2006 12:50:00
         006267     18-12-2006 13:50:00     18-12-2006 18:20:00
         006267     20-12-2006 10:00:00     20-12-2006 12:30:00
         006267     21-12-2006 16:15:00     21-12-2006 18:45:00
    The output needed is
    006267 11-12-2006 15:00:00 15-12-2006 19:30:00
    006267 18-12-2006 08:30:00 18-12-2006 18:20:00
    006267 20-12-2006 10:00:00 21-12-2006 18:45:00
    Would really appreciate your help on this!
    Thanks in advance.

    To get the next value, you can use LEAD() OVER ().
    So, in your case:
    lead(end_time) over (partition by id order by start_time)
    You need to define the criterion for removing rows. If it's simply the odd-numbered rows, try:
    select id, start_time, next_end_time
       from
       (select id,
               start_time,
               lead(end_time) over (partition by id order by start_time) next_end_time,
               row_number() over (partition by id order by start_time) rn
        from   accounts
        order by id, start_time)
       where mod(rn,2)=1;
    Nicolas.
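The pairing logic (keep the odd-numbered rows, borrow the next row's end_time) can be mimicked in plain Python to confirm the expected output above; the timestamps are rewritten in sortable ISO form:

```python
# (id, begin_time, end_time) in sortable ISO form
rows = sorted([
    ("006267", "2006-12-11 15:00", "2006-12-11 17:30"),
    ("006267", "2006-12-15 17:00", "2006-12-15 19:30"),
    ("006267", "2006-12-18 08:30", "2006-12-18 12:50"),
    ("006267", "2006-12-18 13:50", "2006-12-18 18:20"),
    ("006267", "2006-12-20 10:00", "2006-12-20 12:30"),
    ("006267", "2006-12-21 16:15", "2006-12-21 18:45"),
], key=lambda r: r[1])

# row i (odd rn) keeps its own begin_time and takes lead(end_time) from row i+1
pairs = [(rows[i][0], rows[i][1], rows[i + 1][2])
         for i in range(0, len(rows) - 1, 2)]
for p in pairs:
    print(p)
```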

  • How to retrieve Min(startDate) and Max(endDate) for different groups of data? (sql server 2000)

    My sample dataset (below) contains 3 groups -- 'a', 'b', 'c'.  I need to retrieve the Min(startDate) and Max(EndDate) for each group so that the output looks something like this (date format not an issue):
    fk   minStart       maxEnd
    a    1/13/1985    12/31/2003
    b    2/14/1986    12/31/2003
    c    4/26/1987    12/31/2002
    What is the Tsql to perform this type of operation?  Note:  the actual data resides in a sql server 2000 DB.  If the Tsql is different between version 2000 and the later versions -- I would be grateful for both versions of the Tsql
    --I noticed that multiple lines of Insert values doesn't work in Sql Server 2000 -- this sample is in Sql Server 2008
    create table #tmp2(rowID int Identity(1,1), fk varchar(1), startDate datetime, endDate datetime)
    insert into #tmp2
    values
    ('a', '1/13/1985', '12/31/1999'),
    ('a', '3/17/1992', '12/31/1997'),
    ('a', '4/21/1987', '12/31/2003'),
    ('b', '2/14/1986', '12/31/2003'),
    ('b', '5/30/1993', '12/31/2001'),
    ('b', '6/15/1994', '12/31/2003'),
    ('b', '7/7/2001', '12/31/2003'),
    ('c', '4/26/1987', '12/31/1991'),
    ('c', '8/14/1992', '12/31/1998'),
    ('c', '9/10/1995', '12/31/2002'),
    ('c', '10/9/1996', '12/31/2000')
    Thanks
    Rich P

    Rich,
    It is unclear what you are trying to achieve: you said it is SQL Server 2000, but you provided sample data using SQL Server 2008 syntax.
    Is it possible to use UNION ALL to make your queries into one?
    select * from
    (
    select * from #tmp2 t1 where exists
    (select * from (select top 1 * from #tmp2 t2 where t2.fk = t1.fk order by t2.startdate) x where x.rowID = t1.rowID)
    UNION ALL
    select * from #tmp2 t1 where exists
    (select * from (select top 1 * from #tmp2 t2 where t2.fk = t1.fk order by t2.Enddate desc) x where x.rowID = t1.rowID)
    ) as der order by fk
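The aggregate itself needs nothing version-specific: a plain GROUP BY gives Rich's expected output on SQL Server 2000 and later. A sqlite sketch of that same query (dates normalized to ISO strings so MIN/MAX order correctly):

```python
import sqlite3

cur = sqlite3.connect(":memory:").cursor()
cur.executescript("""
CREATE TABLE tmp2 (fk TEXT, startDate TEXT, endDate TEXT);
INSERT INTO tmp2 VALUES
('a','1985-01-13','1999-12-31'),
('a','1992-03-17','1997-12-31'),
('a','1987-04-21','2003-12-31'),
('b','1986-02-14','2003-12-31'),
('b','1993-05-30','2001-12-31'),
('b','1994-06-15','2003-12-31'),
('b','2001-07-07','2003-12-31'),
('c','1987-04-26','1991-12-31'),
('c','1992-08-14','1998-12-31'),
('c','1995-09-10','2002-12-31'),
('c','1996-10-09','2000-12-31');
""")

# One row per fk: earliest start and latest end within the group
rows = cur.execute(
    "SELECT fk, MIN(startDate), MAX(endDate) FROM tmp2 GROUP BY fk ORDER BY fk"
).fetchall()
for r in rows:
    print(r)
```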
    Best Regards,Uri Dimant SQL Server MVP,
    http://sqlblog.com/blogs/uri_dimant/

  • SQL Query to get All AD Groups and its users in Active Directory

    Hi,
       Is there any query to get all AD groups and its user in an instance of a SQL server?

    Check this blog.
    http://www.mikefal.net/2011/04/18/monday-scripts-%E2%80%93-xp_logininfo/
    It will give you more than what is required. If you dont want the extra information,then you can try this.. I took the query and removed the bits that you might not require.
    declare @winlogins table
    (acct_name sysname,
    acct_type varchar(10),
    act_priv varchar(10),
    login_name sysname,
    perm_path sysname)
    declare @group sysname
    declare recscan cursor for
    select name from sys.server_principals
    where type = 'G' and name not like 'NT%'
    open recscan
    fetch next from recscan into @group
    while @@FETCH_STATUS = 0
    begin
    insert into @winlogins
    exec xp_logininfo @group,'members'
    fetch next from recscan into @group
    end
    close recscan
    deallocate recscan
    select
    u.name,
    u.type_desc,
    wl.login_name,
    wl.acct_type
    from sys.server_principals u
    inner join @winlogins wl on u.name = wl.perm_path
    where u.type = 'G'
    order by u.name,wl.login_name
    Regards, Ashwin Menon. My Blog - http://sqllearnings.com

  • Wsus query needed - get WSUS-Computers, belonging WSUS-Group and Not Installed Count

    Hi,
    I am trying to find a way, using the basic WSUS PowerShell cmdlets in combination with piping in Server 2012 R2, to get all computers registered in WSUS, plus each one's WSUS group and its update "Not Installed Count", as output.
    Is that possible?
    I tried multiple times and ended up using posh; is there really no way based on the standard PowerShell cmdlets?
    Thank you
    Peter

    Hi Michael,
    it seems that you are right :(. I tried out a few things with PowerShell (source:
    http://blogs.technet.com/b/heyscriptingguy/archive/2012/01/19/use-powershell-to-find-missing-updates-on-wsus-client-computers.aspx). The big problem is that I can't actually get the WSUS group a server object belongs to. I am only able to get all WSUS groups,
    but can't find the right syntax to get only the ones a computer belongs to.
    Any ideas?
    Thanks
    Peter
    #Load assemblies
    [void][system.reflection.assembly]::LoadWithPartialName('Microsoft.UpdateServices.Administration')
    #Connect to the WSUS server ($wsus was missing from the original post; assumed local server here)
    $wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer()
    #Create Scope objects
    $computerscope = New-Object Microsoft.UpdateServices.Administration.ComputerTargetScope
    $updatescope = New-Object Microsoft.UpdateServices.Administration.UpdateScope
    #Gather only servers
    $ServersId = @($wsus.GetComputerTargets($computerscope) | Where {
    $_.OSDescription -like "*Server*"
    } | Select -expand Id)
    #Get Update Summary
    $wsus.GetSummariesPerComputerTarget($updatescope,$computerscope) | Where {
    #Filter out non servers
    $ServersId -Contains $_.ComputerTargetID
    } | ForEach {
    New-Object PSObject -Property @{
    ComputerTarget = ($wsus.GetComputerTarget([guid]$_.ComputerTargetId)).FullDomainName
    ComputerTargetGroupIDs = ($wsus.GetComputerTarget([guid]$_.ComputerTargetId)).ComputerTargetGroupIds
    ComputerTargetGroupNames = ($wsus.GetComputerTargetGroups())
    NeededCount = ($_.DownloadedCount + $_.NotInstalledCount)
    #DownloadedCount = $_.DownloadedCount
    NotInstalledCount = $_.NotInstalledCount
    #InstalledCount = $_.InstalledCount
    }
    }

  • Query about min and max and middle row of a table

    Suppose I have a table emp with a field
    v_date DATE;
    which holds timestamp data:
    time
    10:20
    10:25
    10:30
    10:32
    10:33
    10:35
    10:36
    10:38
    I need only the min time and the max time, and two records between the min and max.

    I need only min time and max time and two record between min and max
    Like this?
    SQL> create table t (id number);
    Table created.
    SQL>
    SQL> insert into t values (1020);
    1 row created.
    SQL> insert into t values (1025);
    1 row created.
    SQL> insert into t values (1030);
    1 row created.
    SQL> insert into t values (1032);
    1 row created.
    SQL> insert into t values (1033);
    1 row created.
    SQL> insert into t values (1035);
    1 row created.
    SQL> insert into t values (1036);
    1 row created.
    SQL> insert into t values (1038);
    1 row created.
    SQL>
    SQL> commit;
    Commit complete.
    SQL>
    SQL> select * from t;
         ID
       1020
       1025
       1030
       1032
       1033
       1035
       1036
       1038
    8 rows selected.
    SQL>
    SQL>
    SQL> select decode(rownum, 1, min_val, 4, max_val, next_val) your_data from (
      2  select first_value (id) over (partition by 'a' order by 'a') min_val,
      3         last_value (id) over (partition by 'a' order by 'a')  max_val,
      4         id,
      5         lead(id) over (partition by 'a' order by id) next_val
      6   from t
      7  order by id
      8   )
      9  where min_val <> next_val and max_val <> next_val
    10  and rownum <= 4;
    YOUR_DATA
         1020
         1030
         1032
         1038
    SQL>

  • Help with aggregate function max query

    I have a large database which stores daily oil life, odo readings from thousands of vehicles being tested but only once a month.
    I am trying to grab data from one month where only the EOL_READ = 0 and put all of those values into the previous month's query where EOL_READ = anything.
    Here is the original query which grabs everything
    (select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('04/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    order by OIL_LIFE_PREDICTION)
    which gives 5 columns:
    STID, CASE_SAK, EOL_READ, ODO_READ, and OIL_LIFE_PREDICTION
    and gives me about 80000 rows returned for the above query
    I only want one reading per month, but sometimes I get two.
    STID is the unique identifier for a vehicle.
    I tried this query:
    I tried this query which nests one request within the other and SQL times out every time:
    select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('02/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    and vdh.stid in (select distinct vdh.STID, vdh.CASE_SAK,
    max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000))) EOL_READ,
    max(to_number(decode(COMMAND_ID,'ODO_READ',COMMAND_RESULT,-100000))) ODO_READ,
    max(to_number(decode(COMMAND_ID,'OIL_LIFE_PREDICTION',COMMAND_RESULT,-100000))) OIL_LIFE_PREDICTION
    from veh_diag_history vdh, c2pt_data_history cdh
    where vdh.veh_diag_history_sak = cdh.veh_diag_history_sak
    and cdh.COMMAND_ID in ('EOL_READ','ODO_READ','OIL_LIFE_PREDICTION')
    and vdh.LOGICAL_TRIGGER_SAK = 3
    and cdh.CREATED_TIMESTAMP =vdh.CREATED_TIMESTAMP
    and cdh.CREATED_TIMESTAMP >= to_date('03/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    and cdh.CREATED_TIMESTAMP < to_date('04/01/2007 12:00:00 AM','mm/dd/yyyy HH:MI:SS AM')
    group by vdh.STID, vdh.case_sak
    having count(distinct command_id) = 3
    and (max(to_number(decode(COMMAND_ID,'EOL_READ',COMMAND_RESULT,-100000)))) = 0)
    order by OIL_LIFE_PREDICTION
    so in summary I am trying to select values from the previous month only from those stids where this month's value for EOL_READ = 0
    Any ideas.....please help.

    You can use row_number() within each stid and each month to determine the first read of each month. Then you can use lag() to get the previous month's reading if the current month's reading is zero.
    with all_data as (
        select stid,
               case_sak,
               eol_read,
               timestamp,
               row_number() over (partition by stid, trunc(timestamp,'mm') order by timestamp) as rn
          from veh_diag_history
    )
    select stid,
           case_sak,
           case
             when eol_read = 0
               then lag(eol_read) over (partition by stid order by timestamp)
             else eol_read
           end as eol_read
      from all_data
     where rn = 1;
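The lag() fallback at the heart of this answer can be sketched in plain Python (hypothetical readings; note that, matching the SQL above, the substitute is the previous raw reading, not a previously substituted one):

```python
def carry_back(readings):
    """For each monthly EOL reading, substitute the previous month's
    raw reading whenever the current one is 0 (mirroring lag())."""
    out, prev = [], None
    for r in readings:
        out.append(prev if r == 0 and prev is not None else r)
        prev = r  # lag() always sees the raw previous value
    return out

print(carry_back([87, 0, 54, 0]))  # → [87, 87, 54, 54]
```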

  • CFOUTPUT GROUP ing max results ??

    Hello All,
    Just a question: is there a max number of rows etc. that ColdFusion can handle when outputting by group from a large query?
    I have a query that returns 45,631 rows in about 5 seconds; it has some joined tables and grouping in the query.
    I output the query using two nested groups
    <cfoutput query="get_data" group="suppname">
      Group title
      <cfoutput group="mpart">
        relevant part
        <cfoutput>
          all lines from query for that part
        </cfoutput>
      </cfoutput>
    </cfoutput>
    My page only seems to output 9999 results when I use GROUP in
    the output, but if I just output the query with no GROUPING then I
    get all 45631 results.
    Does CF have a limit on the results when GROUPING in a
    CFOUTPUT?
    Kind Regards Guy

    How are you counting your outputted results?

  • Rewrite the max query...

    The inner SELECT statement runs fast; a few columns are generated based on A.QSN_DESC_X values, and to get a single row per cmpltn_i I do a GROUP BY on cmpltn_i, taking the max of each column. With this GROUP BY the query takes approx 5 minutes. The inner query returns 227,270 records. The final query gives 37,000 records.
    Can someone suggest any better way to write this query?
    select
    CMPLTN_I,
    max(emp_i) as emp_i,
    max(first_name) as first_name,
    max(last_name) as last_name,
    max (vendor) as vendor,
    max (product_type) as product_type,
    max (event_type) as event_type,
    max (dollar_amt) as dollar_amt,
    max (date_received) as date_received,
    max (branch_code) as branch_code
    from (
    select   /*+DRIVING_SITE(A)*/   
    B.CMPLTN_I,
    B.emp_i, 
    E.EMP_1ST_NME_X as first_name,
    E.EMP_LAST_NME_X as last_name, 
    case when substr(A.QSN_DESC_X,1,6) = 'Vendor' then A.CHCE_DESC_X else null end as vendor,
    case when substr(A.QSN_DESC_X,1,12) = 'Product Type' then A.CHCE_DESC_X else null end  as product_type,
    case when substr(A.QSN_DESC_X,1,10) = 'Event Type' then A.CHCE_DESC_X else null end as  event_type,
    case when substr(A.QSN_DESC_X,1,13) = 'Dollar Amount' then A.CHCE_DESC_X else null end as  dollar_amt,
    case when substr(A.QSN_DESC_X,1,13) = 'Date Received' then A.CHCE_DESC_X else null end as date_received,
    case when substr(A.QSN_DESC_X,1,16) = 'Branch Wire Code' then A.CHCE_DESC_X else null end as branch_code        
    from  OAT.FORM_FACT@REMOTE_AB A, OAT.FORM_CMPLTN_FACT@REMOTE_AB B, empl_info_dimn E
    where  A.CMPLTN_I  =  B.CMPLTN_I
    and      B.CMPLTN_C  =  'C'
    and      B.app_i  = '20'
    and      E.emp_i = B.emp_i
    )
    group by
    CMPLTN_I;

    10g Release 2, cost-based optimizer; statistics are good.
    Without the driving-site hint, the response time is bad.
    SQL> select * from table(dbms_xplan.display);
    PLAN_TABLE_OUTPUT
    Plan hash value: 2770348679
    | Id  | Operation              | Name                       | Rows  | Bytes |TempSpc| Cost (%CPU)| Time     | Inst   |IN-OUT|
    |   0 | SELECT STATEMENT REMOTE|                            |   922K|   139M|       |   104K  (1)| 00:20:49 |        |      |
    |   1 |  SORT GROUP BY         |                            |   922K|   139M|   300M|   104K  (1)| 00:20:49 |        |      |
    |*  2 |   HASH JOIN            |                            |   922K|   139M|  4976K| 71818   (2)| 00:14:22 |        |      |
    |   3 |    REMOTE              | EMPL_INFO_DIMN             | 86311 |  3961K|       | 10903   (1)| 00:02:11 |      ! | R->S |
    |*  4 |    HASH JOIN           |                            |   923K|    97M|       | 55274   (2)| 00:11:04 |        |      |
    |*  5 |     TABLE ACCESS FULL  | FORM_RECIPIENT_CMPLTN_FACT | 24223 |   331K|       |   698   (3)| 00:00:09 | OATP0~ |      |
    |   6 |     TABLE ACCESS FULL  | FORM_RESP_FACT             |    10M|  1013M|       | 54439   (2)| 00:10:54 | OATP0~ |      |
    Predicate Information (identified by operation id):
       2 - access("A1"."EMP_I"="A2"."EMP_I")
       4 - access("A3"."CMPLTN_I"="A2"."CMPLTN_I")
       5 - filter("A2"."APP_I"=20 AND "A2"."CMPLTN_I" IS NOT NULL AND "A2"."CMPLTN_C"='C')
    Remote SQL Information (identified by operation id):
       3 - SELECT /*+ USE_HASH ("A1") */ "EMP_I","EMP_LAST_NME_X","EMP_1ST_NME_X" FROM "DWCSADM"."EMPL_INFO_DIMN" "A1"
           (accessing '!' )
    Note
       - fully remote statement
    31 rows selected.
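    The pivot pattern in the query (one attribute per row spread into columns with CASE, then collapsed with MAX ... GROUP BY) relies on MAX ignoring NULLs. A minimal sketch of that mechanic, using SQLite from Python with invented sample data rather than the poster's schema:

```python
import sqlite3

# Minimal stand-in for the attribute/value layout in the thread:
# one row per (completion, question) pair, pivoted to one row per completion.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE form_fact (cmpltn_i INTEGER, qsn_desc_x TEXT, chce_desc_x TEXT);
INSERT INTO form_fact VALUES
  (1, 'Vendor',       'Acme'),
  (1, 'Product Type', 'Widget'),
  (2, 'Vendor',       'Globex');
""")

# MAX() skips NULLs, so per group it keeps the single non-NULL value
# produced by the matching CASE branch: that is why the pivot works.
rows = conn.execute("""
SELECT cmpltn_i,
       MAX(CASE WHEN qsn_desc_x = 'Vendor'       THEN chce_desc_x END) AS vendor,
       MAX(CASE WHEN qsn_desc_x = 'Product Type' THEN chce_desc_x END) AS product_type
FROM form_fact
GROUP BY cmpltn_i
ORDER BY cmpltn_i
""").fetchall()
print(rows)  # [(1, 'Acme', 'Widget'), (2, 'Globex', None)]
```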

  • MIN and MAX datetimes to find a range

    I am using Oracle 11g version
    create table re(Name char(20),Datetime char(45),val1 number);
    insert into re values('abc','10/29/2012 13:00','1.5');
    insert into re values('abc','10/29/2012 13:05','1.5');
    insert into re values('abc','10/29/2012 13:10','1.5');
    insert into re values('abc','10/29/2012 13:15','1.5');
    insert into re values('abc','10/29/2012 13:20','0.00');
    insert into re values('abc','10/29/2012 13:25','0.00');
    insert into re values('abc','10/29/2012 13:30','0.00');
    insert into re values('abc','10/29/2012 13:35','0.00');
    insert into re values('abc','10/29/2012 13:40','2.1');
    insert into re values('abc','10/29/2012 13:45','2.3');
    insert into re values('abc','10/29/2012 13:50','2.1');
    insert into re values('abc','10/29/2012 13:55','2.1');
    insert into re values('abc','10/29/2012 14:00','2.2');
    This is how the data is stored in the database. The needed output is the range of the Datetime column, with min and max times, for stretches where val1 > 0 only.
    The expected result for the above data is:
    Name mintime maxtime
    abc 10/29/2012 13:00 10/29/2012 13:15
    abc 10/29/2012 13:40 10/29/2012 14:00
    For this I tried something like this,
    select name, min(to_date(Datetime ,'mm/dd/yyyy hh24:mi')) start, max(to_date(Datetime ,'mm/dd/yyyy hh24:mi')) end from (
    select name, Datetime ,to_date(Datetime ,'mm/dd/yyyy hh24:mi') - rank() over (partition by loc_name order by t1 asc) Val_col from re where val1 > 0
    ) group by lname, Val_col
    but I am getting output like this for the above query:
    name start end
    abc 10/29/2012 13:00 10/29/2012 13:00
    abc 10/29/2012 13:05 10/29/2012 13:05
    and so on.
    Edited by: 913672 on Apr 3, 2013 3:07 AM

    913672 wrote: (full question quoted above)
    Firstly, and most importantly: do NOT store dates in CHAR columns; that's what the DATE type is for.
    Secondly, I'm not sure how you get your output from the input, particularly as your SQL contains "partition by loc_name", which doesn't even exist in your example table.
    Why has the same name got two rows with those min/max times? Are you partitioning by val1?
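    The poster's idea (subtract a ranking from the timestamp so that consecutive rows fall into the same group) is the standard gaps-and-islands trick, but the column references need fixing first. The islands themselves are easy to see procedurally; a small Python sketch over the thread's sample data, grouping consecutive val1 > 0 rows:

```python
# The thread's sample data: (name, datetime string, val1).
rows = [
    ("abc", "10/29/2012 13:00", 1.5), ("abc", "10/29/2012 13:05", 1.5),
    ("abc", "10/29/2012 13:10", 1.5), ("abc", "10/29/2012 13:15", 1.5),
    ("abc", "10/29/2012 13:20", 0.0), ("abc", "10/29/2012 13:25", 0.0),
    ("abc", "10/29/2012 13:30", 0.0), ("abc", "10/29/2012 13:35", 0.0),
    ("abc", "10/29/2012 13:40", 2.1), ("abc", "10/29/2012 13:45", 2.3),
    ("abc", "10/29/2012 13:50", 2.1), ("abc", "10/29/2012 13:55", 2.1),
    ("abc", "10/29/2012 14:00", 2.2),
]

# Walk the rows in time order; a run of val1 > 0 forms one island,
# closed whenever val1 drops to zero.
islands, current = [], []
for name, dt, val in rows:
    if val > 0:
        current.append((name, dt))
    elif current:
        islands.append(current)
        current = []
if current:
    islands.append(current)

# One (name, mintime, maxtime) row per island.
result = [(isl[0][0], isl[0][1], isl[-1][1]) for isl in islands]
print(result)
```

    In SQL the same grouping key is usually built with a row_number (or date minus rank) difference, then MIN/MAX per key, as the poster attempted.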

  • Group by in query

    Hi!
    We have a flight route table that holds individual flight legs together that build a route:
    CREATE TABLE "TEST"."ROUTE"
       (     "ROUTE_ID" NUMBER NOT NULL ENABLE,
         "ORIGIN" VARCHAR2(3 BYTE) NOT NULL ENABLE,
         "DESTINATION" VARCHAR2(3 BYTE) NOT NULL ENABLE,
         "FLIGHT_LEG_ID" NUMBER NOT NULL ENABLE,
         "SEGMENT_ID" NUMBER NOT NULL ENABLE,
          CONSTRAINT "ROUTE_PK" PRIMARY KEY ("ROUTE_ID", "FLIGHT_LEG_ID", "SEGMENT_ID")
      USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS"  ENABLE
       ) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
      STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
      PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
      TABLESPACE "USERS" ;
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (1,'FCO','ZRH',9,1);
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (1,'ZRH','JFK',10,2);
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (2,'JFK','ZRH',11,1);
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (2,'ZRH','FCO',12,2);
    The flight legs must be grouped by route_id. I.e. a user wants to go from Rome to New York, so the results must return both legs:
    FCO  ZRH
    ZRH  JFK
    Thanks for your feedback!

    This one, too. more test data:
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'STG','FRA',19,1);
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'FRA','HKG',20,2);
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'HKG','MAN',21,3);
    Insert into ROUTE (ROUTE_ID,ORIGIN,DESTINATION,FLIGHT_LEG_ID,SEGMENT_ID) values (3,'MAN','PMO',22,4);
    And the query:
    SQL>SELECT route_id, origin, destination, route
      2    FROM (SELECT route_id, origin, destination, MAX(route) OVER(PARTITION BY route_id) AS route
      3            FROM (SELECT     route_id, origin, destination, flight_leg_id, segment_id,
      4                             LTRIM(SYS_CONNECT_BY_PATH(origin, '-'), '-') || '-' || destination AS route
      5                        FROM route
      6                  CONNECT BY route_id = PRIOR route_id AND origin = PRIOR destination
      7                  START WITH origin = :origin))
      8   WHERE route LIKE :origin || '%' AND route LIKE '%' || :destination;
      ROUTE_ID ORIGI DESTI ROUTE
             3 STG   FRA   STG-FRA-HKG-MAN-PMO
             3 FRA   HKG   STG-FRA-HKG-MAN-PMO
             3 HKG   MAN   STG-FRA-HKG-MAN-PMO
             3 MAN   PMO   STG-FRA-HKG-MAN-PMO
    Urs
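    For comparison, the same start-with / connect-by chaining can be sketched procedurally. A small Python sketch over the thread's sample legs; the endpoint test only approximates the LIKE filters in the query above:

```python
# The thread's sample legs: (route_id, origin, destination, segment_id).
legs = [
    (1, "FCO", "ZRH", 1), (1, "ZRH", "JFK", 2),
    (2, "JFK", "ZRH", 1), (2, "ZRH", "FCO", 2),
    (3, "STG", "FRA", 1), (3, "FRA", "HKG", 2),
    (3, "HKG", "MAN", 3), (3, "MAN", "PMO", 4),
]

def route_path(route_id):
    """Chain one route's legs in segment order, the way the
    SYS_CONNECT_BY_PATH expression builds e.g. 'FCO-ZRH-JFK'."""
    ordered = sorted((l for l in legs if l[0] == route_id), key=lambda l: l[3])
    return "-".join([ordered[0][1]] + [l[2] for l in ordered])

def find_route(origin, destination):
    """Return (route_id, path) for a route running origin -> destination."""
    for rid in sorted({l[0] for l in legs}):
        path = route_path(rid)
        if path.startswith(origin) and path.endswith(destination):
            return rid, path
    return None

print(find_route("FCO", "JFK"))  # (1, 'FCO-ZRH-JFK')
print(find_route("STG", "PMO"))  # (3, 'STG-FRA-HKG-MAN-PMO')
```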

  • Sql grouping and summing impossible?

    I want to create an SQL query to sum up some data, but I'm starting to think it is impossible with SQL alone. The data I have are of the following form:
    TRAN_DT     TRAN_RS     DEBT     CRED
    10-Jan     701     100     0
    20-Jan     701     150     0
    21-Jan     701     250     0
    22-Jan     705     0     500
    23-Jan     571     100     0
    24-Jan     571     50     0
    25-Jan     701     50     0
    26-Jan     701     20     0
    27-Jan     705     0     300
    The data are ordered by TRAN_DT and then by TRAN_RS. The grouping and summing of the data is based on TRAN_RS, but only when it changes. So in the table above I do not want to see all of the first 3 records, but only one record with DEBT equal to the sum of those 3, i.e. 100+150+250=500. The table above, after grouping, would look like the one below:
    TRAN_DT     TRAN_RS     DEBT     CRED
    21-Jan     701     500     0
    22-Jan     705     0     500
    24-Jan     571     150     0
    26-Jan     701     70     0
    27-Jan     705     0     300
    The TRAN_DT is the last value of the summed records. I understand that the TRAN_DT may not be selectable. What I have tried so far is the following query:
    select tran_dt,
             tran_rs,
             sum(debt)over(partition by tran_rs order by tran_dt rows unbounded preceding),
    sum(cred)over(partition by tran_rs order by tran_dt rows unbounded preceding) from that_table
    Is this even possible with SQL alone? Any thoughts?
    I am trying to create the report in BI Publisher. Maybe it is possible to group the data in the template, and I should ask my question there?

    915218 wrote:
    Is this even possible with sql alone, any thoughts?
    It sure is...
    WITH that_table as (select to_date('10/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 100 debt, 0 cred from dual union all
                         select to_date('20/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 150 debt, 0 cred from dual union all
                         select to_date('21/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 250 debt, 0 cred from dual union all
                         select to_date('22/01/2012', 'dd/mm/yyyy') tran_dt, 705 tran_rs, 0 debt, 500 cred from dual union all
                         select to_date('23/01/2012', 'dd/mm/yyyy') tran_dt, 571 tran_rs, 100 debt, 0 cred from dual union all
                         select to_date('24/01/2012', 'dd/mm/yyyy') tran_dt, 571 tran_rs, 50 debt, 0 cred from dual union all
                         select to_date('25/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 50 debt, 0 cred from dual union all
                         select to_date('26/01/2012', 'dd/mm/yyyy') tran_dt, 701 tran_rs, 20 debt, 0 cred from dual union all
                         select to_date('27/01/2012', 'dd/mm/yyyy') tran_dt, 705 tran_rs, 0 debt, 300 cred from dual)
    , brk AS (
    SELECT     tran_dt,
            tran_rs,
         debt,
         cred,
            CASE WHEN Nvl (Lag (tran_rs) OVER (ORDER BY tran_dt, tran_rs), 0) != tran_rs THEN tran_rs || tran_dt END brk_tran_rs
      FROM that_table
    ), grp AS (
    SELECT     tran_dt,
            tran_rs,
             debt,
             cred,
            Last_Value (brk_tran_rs IGNORE NULLS) OVER  (ORDER BY tran_dt, tran_rs) grp_tran_rs
      FROM brk
    )
    SELECT     Max (tran_dt),
            Max (tran_rs),
             Sum (debt),
             Sum (cred)
      FROM grp
    GROUP BY grp_tran_rs
    ORDER BY 1, 2
    Boneist
    MAX(TRAN_    TRAN_RS       DEBT       CRED
    21-JAN-12        701        500          0
    22-JAN-12        705          0        500
    24-JAN-12        571        150          0
    26-JAN-12        701         70          0
    27-JAN-12        705          0        300
    Me
    MAX(TRAN_ MAX(TRAN_RS)  SUM(DEBT)  SUM(CRED)
    21-JAN-12          701        500          0
    22-JAN-12          705          0        500
    24-JAN-12          571        150          0
    26-JAN-12          701         70          0
    27-JAN-12          705          0        300
    Edited by: BrendanP on 17-Feb-2012 04:05
    Test data courtesy of Boneist, and fixed bug.
    Edited by: BrendanP on 17-Feb-2012 04:29
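    The break-on-change logic that the LAG/LAST_VALUE solution implements is easy to check procedurally. A small Python sketch over the thread's sample rows; a new group starts whenever TRAN_RS differs from the previous row:

```python
# The thread's sample rows: (tran_dt, tran_rs, debt, cred), already ordered.
rows = [
    ("10-Jan", 701, 100, 0), ("20-Jan", 701, 150, 0), ("21-Jan", 701, 250, 0),
    ("22-Jan", 705, 0, 500), ("23-Jan", 571, 100, 0), ("24-Jan", 571, 50, 0),
    ("25-Jan", 701, 50, 0), ("26-Jan", 701, 20, 0), ("27-Jan", 705, 0, 300),
]

# Fold each run of identical consecutive TRAN_RS into one record,
# summing DEBT/CRED and keeping the last TRAN_DT of the run.
summary = []
for dt, rs, debt, cred in rows:
    if summary and summary[-1][1] == rs:
        _, _, d, c = summary[-1]
        summary[-1] = (dt, rs, d + debt, c + cred)
    else:
        summary.append((dt, rs, debt, cred))
print(summary)
```

    The result matches the expected table in the question: (21-Jan, 701, 500, 0), (22-Jan, 705, 0, 500), (24-Jan, 571, 150, 0), (26-Jan, 701, 70, 0), (27-Jan, 705, 0, 300).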

  • Grouping and Decimal characters in rtf templates.

    Hi guys and girls,
    I’m really struggling with a problem here regarding the decimal characters for the prices in my output.
    I'm using XML Publisher 5.6.3.
    My goal is to control the grouping and decimal character from my template.
    The numbers in the XML data file can either be 10.000,00 or 10,000.00. The format is handled by the users nls_numeric_characters profile option.
    The output of the template shall be based on the locale and not the data generated by Oracle Reports. For example: Reports to US customers shall show the numbers in the following format 10,000.00. Reports to our European customers shall show the numbers in this format 10.000,00.
    How can I achieve this in my templates? Can it be achieved at all?
    Thank you in advance.
    Kenneth Kristoffersen
    Edited by: Kenneth_ on May 19, 2009 1:30 AM

    Hi,
    Thank you for your reply.
    The problem is that the report is generating the output based on the users profile option nls_numeric_characters.
    I have tried to override the user's profile option in the before-report trigger, without any luck. I can alter the selects so the query gets the numbers in the right format, but then I would have to go through all queries and reports, which seems a bit wrong? Especially for the standard Oracle reports.
    BR Kenneth
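    Decoupling the output format from the NLS-formatted data is the general idea: pick the grouping and decimal characters from the report locale, not from the session. A minimal, hypothetical Python sketch of that character swap (illustrative only, not XML Publisher template syntax; in RTF templates the number format mask plays this role):

```python
def format_amount(value, grouping, decimal, places=2):
    """Render a number with explicit grouping/decimal characters,
    independent of any NLS or locale setting (hypothetical helper,
    not an XML Publisher API)."""
    s = f"{value:,.{places}f}"  # always US-style first: '10,000.00'
    # Swap the US characters for the requested pair via a placeholder.
    return s.replace(",", "\0").replace(".", decimal).replace("\0", grouping)

print(format_amount(10000, ",", "."))  # US customers:       10,000.00
print(format_amount(10000, ".", ","))  # European customers: 10.000,00
```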

  • MIN and MAX Value in a SELECT Statement

    Hi,
    I have a scenario where I am selecting the values BETWEEN the MIN and MAX values:
    SELECT * FROM ABC WHERE CODE BETWEEN MIN(CODE) AND MAX(CODE)
    It fails with the error ORA-00934: group function is not allowed here.
    Any help will be appreciated.
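    The ORA-00934 arises because group functions cannot appear directly in a WHERE clause; wrapping them in scalar subqueries is the usual fix. A runnable sketch of the corrected shape (SQLite via Python, with an invented ABC table):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE abc (code INTEGER);
INSERT INTO abc VALUES (3), (7), (12);
""")

# Group functions cannot sit in WHERE directly (hence ORA-00934);
# scalar subqueries evaluate them first and feed plain values to BETWEEN.
rows = conn.execute("""
SELECT * FROM abc
WHERE code BETWEEN (SELECT MIN(code) FROM abc)
               AND (SELECT MAX(code) FROM abc)
ORDER BY code
""").fetchall()
print(rows)  # [(3,), (7,), (12,)]
```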

    with t as
    (select 'ABC1' no from dual
    union select 'ABC2' from dual
    union select 'ABC3' from dual
    union select 'ABC5' from dual
    union select 'ABC6' from dual
    union select 'ABC8' from dual)
    select substr(no,1,3)||to_char(substr(no,4,1)+1) "first missing number"
    from
    (select no, lead(no,1,0) over (order by no) next_no from t)
    where substr(next_no,4,1) - substr(no,4,1) > 1
    and rownum = 1;
