Need help with outputting query group names

I am trying to come up with a way to output group headers, then all the records under each group header, and so on. It would be easy, except there is a twist with what I want to do.
Normally if I have this set of data (which I ‘borrowed’ from a site that showed the closest to what I’m looking for):
Example Table Setup:
TABLE [Numbers]
(Name, NUMBER)
Dave Bosky       843-444-4444
Dave Bosky       843-555-5555
Matthew Small    843-111-1111
Matthew Small    843-222-2222
Matthew Small    843-333-3333
I could use the following code:
<cfoutput query="somequery" group="name">
        #name#<br>
        <cfoutput>
                #phonenumber#<br>
        </cfoutput>
        <hr>
</cfoutput>
And get this:
Dave Bosky
843-444-4444
843-555-5555
Matthew Small
843-111-1111
843-222-2222
843-333-3333
BUT, my actual tables are not set up like this. Rather than recording each name with each record, I would have an ID that is the foreign key for another table.
Current table set up would look like this:
TABLE [People]
(ID, NAME)
1 Dave Bosky        
2 Matthew Small  
TABLE [Phones]
(PEOPLE_ID, NUMBER)
1              843-444-4444
1              843-555-5555
2              843-111-1111
2              843-222-2222
2              843-333-3333
So this output would actually give me this with my current setup and the query code above:
1
843-444-4444
843-555-5555
2
843-111-1111
843-222-2222
843-333-3333
How do I keep my current setup but create a query that achieves the same result from the top? (Output the names from the People table as the group headers, but the data from the Phones table underneath that)
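For reference, the usual answer is to join the two tables and sort on the grouping column, so that cfoutput's group attribute sees the rows already grouped together. A minimal sketch for the People/Phones tables above (the alias matches the #phonenumber# reference in the cfoutput code):
SELECT People.NAME, Phones.NUMBER AS phonenumber
FROM People
INNER JOIN Phones ON Phones.PEOPLE_ID = People.ID
ORDER BY People.NAME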

Thanks! With a little tweaking I got it done.
In case this might be useful to someone else, this is the code I ended up using as a test run for what I'm trying to do:
<cfquery name="testthis" datasource="IRLE">
SELECT ACCOUNTTYPES.ACCOUNTTYPEID, ACCOUNTTYPES.TYPEDESCRIPTION, ACCOUNTS.ACCOUNTNUMBER, ACCOUNTS.ACCOUNTTYPE, ACCOUNTS.ACCOUNTDESCRIPTION
FROM ACCOUNTS INNER JOIN ACCOUNTTYPES ON ACCOUNTTYPES.ACCOUNTTYPEID = ACCOUNTS.ACCOUNTTYPE
WHERE ACCOUNTUSERID = '#Session.UserID#'
</cfquery>
<table>
<tr><th colspan="3" align="center"><u>ACCOUNTS:</u></th></tr>
<cfoutput query="testthis" group="TYPEDESCRIPTION">
<tr><td></td>
<td colspan="2">#TYPEDESCRIPTION# Accounts:<BR></td>
</tr>
<cfoutput>
<tr>
<td></td>
<td>#ACCOUNTNUMBER#</td>
<td>#ACCOUNTDESCRIPTION#</td>
</tr>
</cfoutput>
</cfoutput>
</table>
I'm using MySQL, so all the table names etc. have to be capitalized.
Table setup:
TABLE [ACCOUNTTYPES]
(ACCOUNTTYPEID, TYPEDESCRIPTION)
1    CASH
2    TRADE
TABLE [ACCOUNTS]
(ACCOUNTNUMBER, ACCOUNTUSERID, ACCOUNTTYPE, ACCOUNTDESCRIPTION)
1    1    1    CASH
2    1    2    BROKERAGE
Note I also threw in a WHERE clause to only show accounts for the logged-in user. (Session.UserID is defined by my code at login.)
All in all the output was:
ACCOUNTS:
                CASH Accounts:
                1              CASH
                TRADE Accounts:
                2              BROKERAGE
Given my data at the moment, that’s what I wanted! Sweet!
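One caveat worth adding: cfoutput's group attribute assumes the records arrive already sorted on the group column, so the query normally gets an explicit ORDER BY (and, as a general precaution, a value such as Session.UserID is usually passed through cfqueryparam rather than interpolated directly). The same test query with the ordering made explicit:
SELECT ACCOUNTTYPES.ACCOUNTTYPEID, ACCOUNTTYPES.TYPEDESCRIPTION, ACCOUNTS.ACCOUNTNUMBER, ACCOUNTS.ACCOUNTTYPE, ACCOUNTS.ACCOUNTDESCRIPTION
FROM ACCOUNTS INNER JOIN ACCOUNTTYPES ON ACCOUNTTYPES.ACCOUNTTYPEID = ACCOUNTS.ACCOUNTTYPE
WHERE ACCOUNTUSERID = '#Session.UserID#'
ORDER BY ACCOUNTTYPES.TYPEDESCRIPTION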

Similar Messages

  • Need help with SQL Query with Inline View + Group by

    Hello Gurus,
    I would really appreciate your time and effort regarding this query. I have the following data set.
    Reference_No  Check_Number  Check_Date  Description                          Invoice_Number  Invoice_Type  Paid_Amount  Vendor_Number
    1234567       11223         7/5/2008    paid for cleaning                    44345563        I             20.00        19
    1234567       11223         7/5/2008    Adjustment for bad quality           44345563        A             10.00        19
    7654321       11223         7/5/2008    Adjustment from last billing cycle   23543556        A             50.00        19
    4653456       11223         7/5/2008    paid for cleaning                    35654765        I             30.00        19
    I am trying to write a query that aggregates paid_amount based on Reference_No, Check_Number, Payment_Date, Invoice_Number, Invoice_Type and Vendor_Number, and displays the Description of the Invoice_Type 'I' row when there are multiple records with the same Reference_No, Check_Number, Payment_Date, Invoice_Number, Invoice_Type and Vendor_Number. When there are no multiple records I want to display the respective Description.
    The query should return the following data set
    Reference_No  Check_Number  Check_Date  Description                          Invoice_Number  Invoice_Type  Paid_Amount  Vendor_Number
    1234567       11223         7/5/2008    paid for cleaning                    44345563        I             10.00        19
    7654321       11223         7/5/2008    Adjustment from last billing cycle   23543556        A             50.00        19
    4653456       11223         7/5/2008    paid for cleaning                    35654765        I             30.00        19
    The following is my query. I am kind of lost.
    select B.Description, A.sequence_id,A.check_date, A.check_number, A.invoice_number, A.amount, A.vendor_number
    from (
    select sequence_id,check_date, check_number, invoice_number, sum(paid_amount) amount, vendor_number
    from INVOICE
    group by sequence_id,check_date, check_number, invoice_number, vendor_number
    ) A, INVOICE B
    where A.sequence_id = B.sequence_id
    Thanks,
    Nick
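    For what it's worth, one hedged sketch of that shape combines GROUP BY with KEEP (DENSE_RANK FIRST) to prefer the 'I' row's description. Column names are taken from the attempted query above, an invoice_type column is assumed, and treating 'A' (adjustment) amounts as negative is a guess based on the sample output:
    select sequence_id, check_date, check_number, invoice_number, vendor_number,
           -- prefer the description of an 'I' row when the group contains one
           max(description) keep (dense_rank first
               order by case when invoice_type = 'I' then 0 else 1 end) as description,
           -- assumption: adjustment rows reduce the paid amount
           sum(case when invoice_type = 'A' then -paid_amount else paid_amount end) as paid_amount
    from   INVOICE
    group by sequence_id, check_date, check_number, invoice_number, vendor_number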

    It looks like it is a duplicate thread; correct me if I'm wrong in this case:
    Need help with SQL Query with Inline View + Group by
    Regards.
    Satyaki De.

  • Please, need help with a query

    Hi !
    Please need help with this query:
    It needs to show (in cases of more than one loan offer) the latest create_date only once.
    Meaning, in cases where the USER_ID, LOAN_ID and CREATE_DATE are the same, I need to show only the latest. Thanks!!!
    select distinct a.id,
    create_date,
    a.loanid,
    a.rate,
    a.pays,
    a.gracetime,
    a.emailtosend,
    d.first_name,
    d.last_name,
    a.user_id
    from CLAL_LOANCALC_DET a,
    loan_Calculator b,
    bv_user_profile c,
    bv_mr_user_profile d
    where b.loanid = a.loanid
    and c.NET_USER_NO = a.resp_id
    and d.user_id = c.user_id
    and a.is_partner is null
    and a.create_date between
    TO_DATE('6/3/2008 01:00:00', 'DD/MM/YY HH24:MI:SS') and
    TO_DATE('27/3/2008 23:59:00', 'DD/MM/YY HH24:MI:SS')
    order by a.create_date

    Take a look on the syntax :
    max(...) keep (dense_rank last order by ...)
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/functions056.htm#i1000901
    Nicolas.
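    To make that pointer concrete, here is a minimal sketch of the keep (dense_rank last ...) syntax against the tables in the question (only a few columns are shown, and the grouping keys are an assumption):
    select   a.user_id,
             a.loanid,
             max(a.create_date)                                                as create_date,
             max(a.rate)        keep (dense_rank last order by a.create_date)  as rate,
             max(a.emailtosend) keep (dense_rank last order by a.create_date)  as emailtosend
    from     CLAL_LOANCALC_DET a
    group by a.user_id, a.loanid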

  • Please need help with this query

    Hi !
    Please need help with this query:
    It needs to show (in cases of more than one loan offer) the latest create_date only once.
    Meaning, in cases where the USER_ID, LOAN_ID and CREATE_DATE are the same, I need to show only the latest. Thanks!!!
    select distinct a.id,
    create_date,
    a.loanid,
    a.rate,
    a.pays,
    a.gracetime,
    a.emailtosend,
    d.first_name,
    d.last_name,
    a.user_id
    from CLAL_LOANCALC_DET a,
    loan_Calculator b,
    bv_user_profile c,
    bv_mr_user_profile d
    where b.loanid = a.loanid
    and c.NET_USER_NO = a.resp_id
    and d.user_id = c.user_id
    and a.is_partner is null
    and a.create_date between
    TO_DATE('6/3/2008 01:00:00', 'DD/MM/YY HH24:MI:SS') and
    TO_DATE('27/3/2008 23:59:00', 'DD/MM/YY HH24:MI:SS')
    order by a.create_date

    Perhaps something like this...
    select id, create_date, loanid, rate, pays, gracetime, emailtosend, first_name, last_name, user_id
    from (
          select distinct a.id,
                          create_date,
                          a.loanid,
                          a.rate,
                          a.pays,
                          a.gracetime,
                          a.emailtosend,
                          d.first_name,
                          d.last_name,
                          a.user_id,
                          max(create_date) over (partition by a.user_id, a.loanid) as max_create_date
          from CLAL_LOANCALC_DET a,
               loan_Calculator b,
               bv_user_profile c,
               bv_mr_user_profile d
          where b.loanid = a.loanid
          and   c.NET_USER_NO = a.resp_id
          and   d.user_id = c.user_id
          and   a.is_partner is null
          and   a.create_date between
                TO_DATE('6/3/2008 01:00:00', 'DD/MM/YY HH24:MI:SS') and
                TO_DATE('27/3/2008 23:59:00', 'DD/MM/YY HH24:MI:SS')
    )
    where create_date = max_create_date
    order by create_date

  • Hi.. need help with a query

    hello guys :)
    I need help with this exercise:
    Find the most common cooking method in recipes that contain tomatoes.
    the 4 tables:
    t_recipes
    --recipe_no
    --recipe_name
    t_products
    --product_no
    -product_name [tomatoes,cucumbers, onions]
    t_integration
    --product_no
    --recipe_no
    t_cooking_mode
    recipe_no
    mode [Frying,baking]
    I am realy lost :S
    thank you

    Hi,
    851072 wrote:
    Thank you for your comment :)
    But... I didn't understand the first part
    Sorry, what part is that? You probably understood "Welcome to the forum!" It looks like you understood most of what I said, perhaps before I said it. You seem to be on the right track.
    Unlike talking to a co-worker in the next cube, exchanging messages with someone on this forum takes a fair amount of time, so it's worth the time it takes to explain things very clearly, more than you would in conversation. Post all the relevant information, and say exactly what the problem is.
    amm i tried to do this:
    select t_recipes.recipe_name, count(t_cooking_mode.cooking_mode)
    from (t_cooking_mode INNER JOIN t_integration ON t_cooking_mode.recipe_no = t_integration.recipe_no )
    INNER JOIN t_products ON t_integration.product_no = t_products.product_no)
    WHERE t_products.product_name = 'tomatoes'
    GROUP BY t_recipes.recipe_name
    help?
    Please help by posting whatever you know about the problem. For example, if there's an error message, post the complete error message, including the line number.
    Help the people who want to help you by formatting your code and making it easy to read and understand. This will help you, too.
    For example, here's exactly what you posted, with only the white-space changed to make parallel items, such as parentheses, line up nicely:
    select        t_recipes.recipe_name
    ,       count(t_cooking_mode.cooking_mode)
    from             (          t_cooking_mode
                INNER JOIN      t_integration      ON t_cooking_mode.recipe_no = t_integration.recipe_no
    INNER JOIN      t_products      ON t_integration.product_no = t_products.product_no
    WHERE       t_products.product_name      = 'tomatoes'
    GROUP BY  t_recipes.recipe_name
    When you format your code like this, it can be very easy to spot errors like unbalanced parentheses. In this case, the ')' right before the WHERE clause has no matching '('. You don't need to use any parentheses at all in the FROM clause. You can simply say:
    FROM     t_cooking_mode
    JOIN      t_integration      ON t_cooking_mode.recipe_no = t_integration.recipe_no
    JOIN      t_products      ON t_integration.product_no = t_products.product_no
    WHERE     ...
    INNER JOIN is the default kind of join, which makes sense, since the majority of all joins are inner joins. I usually say JOIN (instead of INNER JOIN) because it makes the code more compact and easier to read (at least for me), but I won't be maintaining your code, so do what's best for you.
    What are you trying to find in this problem? Is it a recipe name or a cooking mode? If it's a cooking mode, the why do you have t_recipes.reciple_name in the SELECT (and GROUP BY) clause? Shouldn't you be using some other column, from some other table?
    If I understand the problem correctly, the t_recipes table is not needed in this problem. You won't necessarily use every table in every query.
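    Putting those pieces together, one possible shape for the final query is sketched below. It assumes the column is called cooking_mode, as in the attempt above, and an Oracle version that supports FETCH FIRST (otherwise rank the counts in an inline view):
    SELECT    cm.cooking_mode
    ,         COUNT (*)  AS recipe_count
    FROM      t_cooking_mode  cm
    JOIN      t_integration   i  ON  cm.recipe_no  = i.recipe_no
    JOIN      t_products      p  ON  i.product_no  = p.product_no
    WHERE     p.product_name  = 'tomatoes'
    GROUP BY  cm.cooking_mode
    ORDER BY  COUNT (*)  DESC
    FETCH FIRST 1 ROW ONLY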
    When you post formatted text on this site, type these 6 characters: {code} (small letters only, inside curly brackets) before and after each section of formatted text, to preserve spacing.

  • Need help with conditional query

    Guys, this is just an extension of this post that Frank was helping me with. I'm reposting because my requirements have changed slightly and I'm having a hell of a time trying to modify the query.
    here is the previous post.
    need help with query that can look data back please help.
    CREATE TABLE "FGL" (
        "FGL_GRNT_CODE" VARCHAR2(60),
        "FGL_FUND_CODE" VARCHAR2(60),
        "FGL_ACCT_CODE" VARCHAR2(60),
        "FGL_ORGN_CODE" VARCHAR2(60),
        "FGL_PROG_CODE" VARCHAR2(60),
        "FGL_GRNT_YEAR" VARCHAR2(60),
        "FGL_PERIOD"    VARCHAR2(60),
        "FGL_BUDGET"    VARCHAR2(60)
      )
    data:
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7600','4730','02','11','00','400');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7240','4730','02','10','1','100');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7240','4730','02','10','1','0');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7600','4730','02','11','1','400');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('360055','360055','7200','4730','02','10','1','400');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('360055','360055','7600','4730','02','10','1','400');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7240','4730','02','10','14','200');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7600','4730','02','10','14','100');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7240','4730','02','10','14','200');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7240','4730','02','10','2','100');
    Insert into FGL (FGL_GRNT_CODE,FGL_FUND_CODE,FGL_ACCT_CODE,FGL_ORGN_CODE,FGL_PROG_CODE,FGL_GRNT_YEAR,FGL_PERIOD,FGL_BUDGET) values ('240055','240055','7240','4730','02','11','2','600');
    I need to find the greatest grant year for the grant by a period parameter.
    Once I find the greatest year I need to check the value of period 14 for that grant for the previous year and add it to the budget amount for that grant. However, if there is an entry in the greatest year for period 00, then I need to ignore the period 14 of the previous year and do this calculation instead: current period + (current period - greatest year 00).
    Hope that makes sense. So, in other words, with the new data above, if I was querying period two of grant year 11, I would end up with $800,
    because the greatest year is 11 and it contains a period 0 with an amount of $400, so my total should be:
    period 2 amount: $600
    period 2 amount of $600 - period 0 amount of $400 = $200
    600 + 200 = $800
    If I query period 1 of grant 360055 I would just end up with the $800 of grant year 10.
    I have tried to modify the query you supplied to me with no luck. I have tried for several days, but I am embarrassed to say I just can't get it to do what I am trying to do.
    Can you please help me out?
    Here is the query supplied by Frank Kulash, who graciously put this together for me:
    WITH     got_greatest_year     AS
    (
         SELECT     fgl.*     -- or whatever columns are needed
         ,     MAX ( CASE
                     WHEN  fgl_period = :given_period
                     THEN  fgl_grnt_year
                    END
                  ) OVER ()     AS greatest_year
         FROM     fgl
    )
    SELECT     SUM (fgl_budget)     AS total_budget     -- or SELECT *
    FROM     got_greatest_year
    WHERE     (     fgl_grnt_year     = greatest_year
         AND     fgl_period     = :given_period
              )
    OR        (     fgl_grnt_year     = greatest_year - 1
         AND     fgl_period     = 14
              )
    ;
    Miguel

    Hi, Miguel,
    Are you saying that, when the greatest year that has :given_period also has period='00' (or '0', or whatever you want to use), then you want to double the budget from the given_period (as well as subtract the budget from the '00', and not count the previous year's '14')? If so, add another condition to the CASE statement which decides what you're SUMming:
    WITH     got_greatest_year     AS
    (
         SELECT       TO_NUMBER (fgl_grnt_year)     AS grnt_year
         ,       fgl_period
         ,       TO_NUMBER (fgl_budget)     AS budget
         ,       MAX ( CASE
                       WHEN  fgl_period = :given_period
                       THEN  TO_NUMBER (fgl_grnt_year)
                      END
                    ) OVER ()     AS greatest_year
         FROM       fgl
    )
    ,     got_cnt_00     AS
    (
         SELECT     grnt_year
         ,     fgl_period
         ,     budget
         ,     greatest_year
         ,     COUNT ( CASE
                       WHEN  grnt_year     = greatest_year
                       AND       fgl_period     = '00'
                       THEN  1
                         END
                    ) OVER ()          AS cnt_00
         FROM    got_greatest_year
    )
    SELECT       SUM ( CASE
                        WHEN  grnt_year     = greatest_year                    -- New
                  AND       fgl_period     = :given_period                     -- New
                  AND       cnt_00     > 0            THEN  budget * 2         -- New
                        WHEN  grnt_year     = greatest_year
                  AND       fgl_period     = :given_period       THEN  budget
                        WHEN  grnt_year     = greatest_year
                  AND       fgl_period     = '00'            THEN -budget
                        WHEN  grnt_year     = greatest_year - 1
                  AND       fgl_period     = '14'
                  AND       cnt_00     = 0            THEN  budget
                    END
               )          AS total_budget
    FROM       got_cnt_00
    ;
    You'll notice this is the same as the previous query I posted, except for the 3 lines marked "New".

  • Need help with a query

    Hi,
    I need help with the following query. I want the balance (bal) with the latest exchange rate available.
    Sample table & data
    with
    FX_RATE as
    (select 11 as id_date, 1 as id_curr, 47 as EXCH_rate from dual union
     select 12, 1, 48 from dual union
     select 13, 2, 54 from dual union
     select 14, 2, 55 from dual union
     select 15, 3, 56 from dual union
     select 15, 2, 49 from dual),
    TBL_NM as
    (select 13 as p_date, 2 as p_curr, 200 as bal from dual union
     select 14, 2, 200 from dual union
     select 15, 2, 200 from dual union
     select 16, 2, 200 from dual union
     select 17, 2, 200 from dual union
     select 11, 5, 100 from dual)
    select p_date, p_curr, bal * nvl(exch_rate,1) from TBL_NM T LEFT OUTER JOIN FX_RATE F1 on (id_curr = p_curr and F1.id_date = T.p_Date)
    In the above query, for p_date 16 & 17 and p_curr 2 it returns just the balance multiplied by the default exchange rate of 1. But I want the balance to be converted using the latest available exchange rate, which is the one for id_date 15.
    I tried this, but it returns the error ORA-01799: a column may not be outer-joined to a subquery.
    with
    FX_RATE as
    (select 11 as id_date, 1 as id_curr, 47 as EXCH_rate from dual union
     select 12, 1, 48 from dual union
     select 13, 2, 54 from dual union
     select 14, 2, 55 from dual union
     select 15, 3, 56 from dual union
     select 15, 2, 49 from dual),
    TBL_NM as
    (select 13 as p_date, 2 as p_curr, 200 as bal from dual union
     select 14, 2, 200 from dual union
     select 15, 2, 200 from dual union
     select 16, 2, 200 from dual union
     select 17, 2, 200 from dual union
     select 11, 5, 100 from dual)
    select p_date, p_curr, bal * nvl(exch_rate,1) from TBL_NM T LEFT OUTER JOIN FX_RATE F1
    on (id_curr = p_curr and F1.id_date = (select max(F2.id_date) from FX_RATE F2 where F2.id_curr = T.p_curr and F2.id_Date <= T.p_date))
    Please advise on how I can achieve this.
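    One way around ORA-01799 is to keep the "latest rate" lookup out of the join condition's subquery, for example by carrying each rate forward to the next rate date in an inline view. A sketch only, using the FX_RATE and TBL_NM names from the question:
    select t.p_date, t.p_curr, t.bal * nvl(f.exch_rate, 1)
    from   TBL_NM t
    left outer join (
           -- each rate row is valid from its own id_date up to (but not including) the next one
           select f1.id_curr, f1.id_date, f1.exch_rate,
                  lead(f1.id_date) over (partition by f1.id_curr order by f1.id_date) as next_id_date
           from   FX_RATE f1
           ) f
    on     f.id_curr = t.p_curr
    and    t.p_date >= f.id_date
    and    (t.p_date < f.next_id_date or f.next_id_date is null)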

    The entire query would be like this; I have to incorporate the change in here:
    CREATE MATERIALIZED VIEW MV_DUMMY
    BUILD IMMEDIATE
    REFRESH FORCE ON DEMAND
    AS
        SELECT T.ID_TSACTION_RELEASED                                                                                
        BAL.ID_CONTRACT_BALANCE                                                                                                                                                                                                                                                                           AS ID_CONTRACT_BALANCE,   
        T.N_REFERENCE_NUMBER                                                                                          
        T.INSTRUMENT_N_REFERENCE                                                                                      
        T.ITEM_NUMBER                                                                                                   
        T.EXTERNAL_SYSTEM_ID                                                                                            
        T.SEQUENCE_NUMBER                                                                                               
        T.ID_RELEASED_DATE                                                                                              
       ROUND(BAL.LC_AVAILABLE_BALANCE * NVL(FX1.EXCHANGE_RATE,1) / NVL(FX2.EXCHANGE_RATE,1) , 4)                           
        BAL.LIABILITY_BALANCE                                                                                              
        BAL.LIABILITY_BALANCE * NVL(FX3.EXCHANGE_RATE,1)                                                                   
        BAL.LIABILITY_CHANGE_USD                                                                                           
        BAL.MEMO_LIABILITY_BALANCE                                                                                         
        BAL.MEMO_LIABILITY_BALANCE * NVL(FX3.EXCHANGE_RATE,1)                                                              
        BAL.MEMO_LIABILITY_CHANGE_USD                                                                                      
        BAL.ORIGINAL_FACE_AMOUNT                                                                                           
        decode(T.TENOR_CODE,'Time','T','Sight','S','Split Sight Time','SST','Split Multiple Time','SMT',T.TENOR_CODE)
        CASE
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ILC','IB','AIR','STG','NLC','NP')
          THEN T.ID_LIABILITY_CIF
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ELC','EB','XLC','XP','EC','LN','SCF-AR')
          THEN T.Id_Beneficiary
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('TLC','IC','OA','SCF-AP')
          THEN T.ID_Applicant
        END PRIMARY_CUSTOMER_ID,
        CASE
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ILC','IB','AIR','STG','NLC','NP')
          THEN plbcif.EXTERNAL_SYSTEM_ID
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ELC','EB','XLC','XP','EC','LN','SCF-AR')
          THEN PBCIF.EXTERNAL_SYSTEM_ID
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('TLC','IC','OA','SCF-AP')
          THEN pappcif.EXTERNAL_SYSTEM_ID
        END PRIMARY_CUSTOMER_EXT_SYS_ID,
        CASE
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ILC','IB','AIR','STG','NLC','NP')
          THEN plbcif.CIF_NAME
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ELC','EB','XLC','XP','EC','LN','SCF-AR')
          THEN PBCIF.CIF_NAME
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('TLC','IC','OA','SCF-AP')
          THEN pappcif.CIF_NAME
        END PRIMARY_CUSTOMER_NAME,
        CASE
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ILC','IB','AIR','STG','NLC','NP')
          THEN plbbac.BAC
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ELC','EB','XLC','XP','EC','LN','SCF-AR')
          THEN pbbac.BAC
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('TLC','IC','OA','SCF-AP')
          THEN pappbac.BAC
        END PRIMARY_CUST_BAC_CODE,
        CASE
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ILC','IB','AIR','STG','NLC','NP')
          THEN nvl(plbmg.MARKET,'NOT APPLICABLE')
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ELC','EB','XLC','XP','EC','LN','SCF-AR')
          THEN nvl(pbmg.MARKET,'NOT APPLICABLE')
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('TLC','IC','OA','SCF-AP')
          THEN nvl(pappmg.MARKET,'NOT APPLICABLE')
        END PRIMARY_CUST_MARKET,
        CASE
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ILC','IB','AIR','STG','NLC','NP')
          THEN nvl(plbmg.SUB_MARKET,'NOT APPLICABLE')
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('ELC','EB','XLC','XP','EC','LN','SCF-AR')
          THEN nvl(pbmg.SUB_MARKET,'NOT APPLICABLE')
          WHEN GTSPROD.PRODUCT_CATEGORY IN ('TLC','IC','OA','SCF-AP')
          THEN nvl(pappmg.SUB_MARKET,'NOT APPLICABLE')
        END PRIMARY_CUST_SUB_MARKET
      FROM F_TSACTION_RELEASED T
      LEFT OUTER JOIN D_BAC_CODE BAC
      ON (T.BAC_CODE_LIABILITY = BAC.BAC_CODE)
      LEFT OUTER JOIN REF_BAC_SORT_CODE REF_BAC
      ON (T.BAC_CODE_LIABILITY = REF_BAC.BAC)
      LEFT OUTER JOIN F_CONTRACT_BALANCE BAL
      ON (T.ID_TSACTION_RELEASED = BAL.ID_TSACTION_RELEASED)
      LEFT OUTER JOIN D_MARKET_SEGMENT MG
      ON (T.ID_MARKET_SEGMENT = MG.ID_MARKET_SEGMENT)
      LEFT OUTER JOIN D_DATE DT
      ON (DT.ID_DATE = T.ID_RELEASED_DATE)
      LEFT OUTER JOIN D_DATE DB
      ON (DB.ID_DATE = BAL.ID_RELEASED_DATE)
      LEFT OUTER JOIN D_PROCESSING_UNIT PU
      ON (PU.ID_PROCESSING_UNIT = T.ID_PROCESSING_UNIT)
      LEFT OUTER JOIN D_BIR_PRODUCT BIRPROD
      ON (BIRPROD.ID_BIR_PRODUCT=T.ID_BIR_PRODUCT)
      LEFT OUTER JOIN D_GTS_PRODUCT_TYPE GTSPROD
      ON (GTSPROD.ID_GTS_PRODUCT_TYPE= T.ID_GTS_PRODUCT_TYPE)
      LEFT OUTER JOIN D_GTS_TSACTION_TYPE GTST
      ON (GTST.ID_GTS_TSACTION_TYPE = T.ID_GTS_TSACTION_TYPE)
      LEFT OUTER JOIN D_CURRENCY CCYT
      ON (CCYT.ID_CURRENCY = T.ID_TSACTION_CURRENCY)
      LEFT OUTER JOIN d_cif lcif
      ON (lcif.id_cif = T.id_liability_cif)
      LEFT OUTER JOIN d_cif lbcif
      ON (lbcif.id_cif = bal.id_liability_cif)
      LEFT OUTER JOIN d_cif bcif
      ON (bcif.id_cif = T.id_BENEFICIARY)
      LEFT OUTER JOIN d_cif icif
      ON (icif.id_cif = T.id_ISSUING_BANK)
      LEFT OUTER JOIN d_cif acif
      ON (acif.id_cif = T.id_ADVISING_BANK)
      LEFT OUTER JOIN d_cif appcif
      ON (appcif.id_cif = T.id_applicant)
      LEFT OUTER JOIN d_state astate
      ON (astate.id_state = acif.id_state)
      LEFT OUTER JOIN d_state bstate
      ON (bstate.id_state = bcif.id_state)
      LEFT OUTER JOIN d_state lstate
      ON (lstate.id_state = lcif.id_state)
      LEFT OUTER JOIN d_state lbstate
      ON (lbstate.id_state = lbcif.id_state)
      LEFT OUTER JOIN d_state istate
      ON (istate.id_state = icif.id_state)
      LEFT OUTER JOIN d_state appstate
      ON (appstate.id_state = appcif.id_state)
      LEFT OUTER JOIN D_TSACTION_SOURCE TSrc
      ON (T.ID_TSACTION_SOURCE = TSrc.ID_TSACTION_SOURCE)
      LEFT OUTER JOIN D_COUNTRY LCTRY
      ON (LCTRY.ID_COUNTRY = lcif.ID_COUNTRY)
      LEFT OUTER JOIN D_COUNTRY LBCTRY
      ON (LBCTRY.ID_COUNTRY = lbcif.ID_COUNTRY)
      LEFT OUTER JOIN D_COUNTRY BCTRY
      ON (BCTRY.ID_COUNTRY = bcif.ID_COUNTRY)
      LEFT OUTER JOIN D_COUNTRY ICTRY
      ON (ICTRY.ID_COUNTRY = icif.ID_COUNTRY)
      LEFT OUTER JOIN D_COUNTRY ACTRY
      ON (ACTRY.ID_COUNTRY = acif.ID_COUNTRY)
      LEFT OUTER JOIN D_COUNTRY APPCTRY
      ON (APPCTRY.ID_COUNTRY = appcif.ID_COUNTRY)
      LEFT OUTER JOIN D_COUNTRY PCTRY
      ON (PCTRY.ID_COUNTRY = T.ID_PRESENTER_COUNTRY)
      LEFT OUTER JOIN D_LOCATION LOC
      ON (LOC.ID_LOCATION = T.ID_PROCESSING_LOCATION)
      LEFT OUTER JOIN D_CURRENCY BCCYT
      ON (BCCYT.ID_CURRENCY = BAL.ID_LIABILITY_CURRENCY)
      LEFT OUTER JOIN D_CURRENCY BALCYT
      ON (BALCYT.ID_CURRENCY = BAL.ID_BALANCE_CURRENCY)
      LEFT OUTER JOIN d_liability_type li
      ON (li.id_liability_type = BAL.id_liability_type)
      LEFT OUTER JOIN d_cif plbcif
      ON (plbcif.id_cif = T.id_liability_cif)
      LEFT OUTER JOIN REF_BAC_SORT_CODE plbbac
      ON (plbcif.bac_code=plbbac.bac)
      LEFT OUTER JOIN D_MARKET_SEGMENT plbmg
      ON (plbbac.SORT_CODE=plbmg.MARKET_SEGMENT)
      LEFT OUTER JOIN d_cif pbcif
      ON (pbcif.id_cif = T.id_BENEFICIARY)
      LEFT OUTER JOIN REF_BAC_SORT_CODE pbbac
      ON (pbcif.bac_code=pbbac.bac)
      LEFT OUTER JOIN D_MARKET_SEGMENT pbmg
      ON (pbbac.SORT_CODE=pbmg.MARKET_SEGMENT)
      LEFT OUTER JOIN d_cif pappcif
      ON (pappcif.id_cif = T.id_applicant)
      LEFT OUTER JOIN REF_BAC_SORT_CODE pappbac
      ON (pappcif.bac_code=pappbac.bac)
      LEFT OUTER JOIN D_MARKET_SEGMENT pappmg
      ON (pappbac.SORT_CODE=pappmg.MARKET_SEGMENT)
      LEFT OUTER JOIN D_CURRENCY LOCALCYT
      ON (LOCALCYT.alpha_code = PU.local_ccy)
      LEFT OUTER JOIN D_BRANCH Branch              
      ON (T.ID_BRANCH  = Branch.ID_BRANCH )
      LEFT OUTER JOIN F_USD_FX_RATE_HISTORY FX1
      ON (BAL.ID_BALANCE_CURRENCY = FX1.ID_CURRENCY and FX1.ID_DATE = (select max(FX11.ID_DATE) from F_USD_FX_RATE_HISTORY FX11 where BAL.ID_BALANCE_CURRENCY = FX11.ID_CURRENCY and FX11.ID_DATE <= BAL.id_released_date))
      LEFT OUTER JOIN F_USD_FX_RATE_HISTORY FX2
      ON (LOCALCYT.ID_CURRENCY = FX2.ID_CURRENCY and FX2.ID_DATE = (select max(FX22.ID_DATE) from F_USD_FX_RATE_HISTORY FX22 where LOCALCYT.ID_CURRENCY = FX22.ID_CURRENCY and FX22.ID_DATE <= BAL.id_released_date))
      LEFT OUTER JOIN F_USD_FX_RATE_HISTORY FX3
      ON (BAL.ID_LIABILITY_CURRENCY = FX3.ID_CURRENCY and FX3.ID_DATE = (select max(FX33.ID_DATE) from F_USD_FX_RATE_HISTORY FX33 where BAL.ID_LIABILITY_CURRENCY = FX33.ID_CURRENCY and FX33.ID_DATE <= BAL.id_released_date))
    Note the lines:
    ROUND(BAL.MN_AVAILABLE_BALANCE * NVL(FX1.EXCHANGE_RATE,1) / NVL(FX2.EXCHANGE_RATE,1) , 4)                           
    BAL.LIABILITY_BALANCE * NVL(FX3.EXCHANGE_RATE,1)                                                                   
    BAL.MEMO_LIABILITY_BALANCE * NVL(FX3.EXCHANGE_RATE,1)                 
    And
    LEFT OUTER JOIN F_USD_FX_RATE_HISTORY FX1
      ON (BAL.ID_BALANCE_CURRENCY = FX1.ID_CURRENCY and FX1.ID_DATE = (select max(FX11.ID_DATE) from F_USD_FX_RATE_HISTORY FX11 where BAL.ID_BALANCE_CURRENCY = FX11.ID_CURRENCY and FX11.ID_DATE <= BAL.id_released_date))
      LEFT OUTER JOIN F_USD_FX_RATE_HISTORY FX2
      ON (LOCAMNYT.ID_CURRENCY = FX2.ID_CURRENCY and FX2.ID_DATE = (select max(FX22.ID_DATE) from F_USD_FX_RATE_HISTORY FX22 where LOCAMNYT.ID_CURRENCY = FX22.ID_CURRENCY and FX22.ID_DATE <= BAL.id_released_date))
      LEFT OUTER JOIN F_USD_FX_RATE_HISTORY FX3
      ON (BAL.ID_LIABILITY_CURRENCY = FX3.ID_CURRENCY and FX3.ID_DATE = (select max(FX33.ID_DATE) from F_USD_FX_RATE_HISTORY FX33 where BAL.ID_LIABILITY_CURRENCY = FX33.ID_CURRENCY and FX33.ID_DATE <= BAL.id_released_date))
    This is where I need to incorporate the change.
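    A sketch of how the FX1 join might be rewritten in that same carried-forward style (FX2 and FX3 would follow the same pattern; this is an illustration, not tested against the real tables):
    LEFT OUTER JOIN (
           SELECT FXH.ID_CURRENCY, FXH.ID_DATE, FXH.EXCHANGE_RATE,
                  LEAD(FXH.ID_DATE) OVER (PARTITION BY FXH.ID_CURRENCY ORDER BY FXH.ID_DATE) AS NEXT_ID_DATE
           FROM   F_USD_FX_RATE_HISTORY FXH
           ) FX1
      ON (BAL.ID_BALANCE_CURRENCY = FX1.ID_CURRENCY
          AND BAL.ID_RELEASED_DATE >= FX1.ID_DATE
          AND (BAL.ID_RELEASED_DATE < FX1.NEXT_ID_DATE OR FX1.NEXT_ID_DATE IS NULL))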

  • Subquery results need help with output

    Requirements:
    There is a request to dump every day's data from one table into a text file.
    Tables: INSPECTION_RESULTS, NJAS
    Primary Key: RES_SYS_NO
    Parent table: INSPECTION_RESULTS
    Child table: NJAS
    Purpose:
    1. obtain output results for the MIN and MAX RES_SYS_NO from INSPECTION_RESULTS table for yesterday’s date.
    2. Create comma delimited file for following columns using above table data output.
    Script thus far:
    SELECT NJA_RES_SYS_NO||','||NJA_TEST_REC_NO||','||NJA_LIC_ST_ID||','||NJA_ETS_ID||','||NJA_SW_VER||','||NJA_TECH_ID||','||NJA_VIN||','||NJA_LIC_NO||','
    ||NJA_LIC_JUR||','||NJA_LIC_SRC||','||NJA_MODEL_YR||','||NJA_MAKE||','||NJA_MODEL||','||NJA_VHCL_TYPE||','||NJA_BODY_STYLE||','||NJA_CERT_CLASS||','||
    NJA_GVWR||','||NJA_ASM_ETW||','||NJA_ETW_SRC||','||NJA_NO_OF_CYL||','||NJA_ENG_SIZE||','||NJA_TRANS_TYPE||','||NJA_DUAL_EXH||','||NJA_FUEL_CD||','
    || NJA_VID_SYS_NO||','|| NJA_VRT_REC_NO||','|| NJA_ARMSTDS_REC_NO||','|| NJA_RSN_F_N_TESTABLE||','|| NJA_DYNO_TESTABLE||','||
    NJA_CURR_ODO_RDNG||','|| NJA_PREV_ODO_RDNG||','|| NJA_PREV_TEST_DT||','|| NJA_TEST_TYPE||','||
    NJA_TEST_START_DT||','|| NJA_TEST_END_TIME||','|| NJA_EMISS_TEST_TYPE||','|| NJA_TEST_SEQ_NO||','
    ||NJA_LOW_MILE_EXEMP||','||NJA_TIRE_DRY||','|| NJA_REL_HUMID||','|| NJA_AMB_TEMP||','||NJA_BAR_PRESS||','|| NJA_HCF||','||
    NJA_GAS_CAP_ADAP_AVL||','|| NJA_GAS_CAP_REPL||','|| NJA_OVERALL_GAS_CAP_RES
    FROM NJAS
    WHERE NJA_RES_SYS_NO IN (SELECT MIN(NJA_RES_SYS_NO) FROM INSPECTION_RESULTS WHERE NJA_RES_SYS_NO IN
    (SELECT MAX(NJA_RES_SYS_NO) FROM INSPECTION_RESULTS, (SELECT SYSDATE-1 FROM DUAL)))
    It works, but I am not sure if I am getting accurate results. Can anyone help me fine-tune this subquery script? Thanks!
    Scott
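    For what it's worth, the usual shape for "everything between yesterday's first and last RES_SYS_NO" is a BETWEEN on two scalar subqueries. A sketch only; RESULT_DATE is a placeholder, since the actual date column of INSPECTION_RESULTS is not named in the post:
    SELECT N.*   -- or the concatenated column list above
    FROM   NJAS N
    WHERE  N.NJA_RES_SYS_NO BETWEEN
           (SELECT MIN(R.RES_SYS_NO) FROM INSPECTION_RESULTS R
            WHERE  R.RESULT_DATE >= TRUNC(SYSDATE) - 1   -- RESULT_DATE is a hypothetical column name
            AND    R.RESULT_DATE <  TRUNC(SYSDATE))
       AND (SELECT MAX(R.RES_SYS_NO) FROM INSPECTION_RESULTS R
            WHERE  R.RESULT_DATE >= TRUNC(SYSDATE) - 1
            AND    R.RESULT_DATE <  TRUNC(SYSDATE))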

    Duplicate post
    See my answer in your other posting here Re: need help with script

  • Need help with the query -

    We have these two tables and are trying to see if the amount in one table is equal to the sum of the amount fields in another table.
    Can someone please help?
    SQL> desc test;
    Name Null? Type
    KEY NUMBER(9)
    AMOUNT NUMBER(11,2)
    SQL> desc test1;
    Name Null? Type
    KEY NUMBER(9)
    TOTAL_AMOUNT NUMBER(11,2)
    SQL> select * from test;
    KEY AMOUNT
    1 10
    1 20
    1 70
    3 rows selected.
    SQL> select * from test1;
    KEY TOTAL_AMOUNT
    1 100
    1 row selected.
    SQL> select count(*) from test1 a where exists(select 'x' from test b where
    2 a.key=b.key and a.total_amount!=sum(b.amount));
    a.key=b.key and a.total_amount!=sum(b.amount))
    ERROR at line 2:
    ORA-00934: group function is not allowed here

    Hi,
    This would be so easy if there was a table like test that only had one row per key!
    You can make such a table, or you can just make a sub-query that has all the same data, and use that sub-query as if it were a table.
    For example:
    WITH  one_row_per_key_test  AS
    (
        SELECT    key
        ,         SUM (amount)  AS total_amount
        FROM      test
        GROUP BY  key
    )
    SELECT  COUNT (*)
    FROM    test1   a
    WHERE   EXISTS
            (
                SELECT  'x'
                FROM    one_row_per_key_test  b
                WHERE   a.key          = b.key
                AND     a.total_amount = b.total_amount
            );
    There are lots of other ways to solve this particular problem (including HAVING), but this is a very important technique to know.
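    For completeness, the HAVING variant mentioned above might look something like this (same TEST and TEST1 tables, counting the keys whose detail rows sum to the stored total, as in the query above):
    SELECT  COUNT (*)
    FROM    test1  a
    WHERE   EXISTS
            (
                SELECT    b.key
                FROM      test  b
                WHERE     b.key = a.key
                GROUP BY  b.key
                HAVING    SUM (b.amount) = a.total_amount
            );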

  • Need help with XMLTABLE query

    I am new to XML parsing. I am trying to convert xml data into relational data and need some help with this.
    with table1 AS
    (select xmltype(
    '<BuildingData_AllResponse xmlns="http://secure.rutgers.edu/spacemanagement/">
    <BuildingData_AllResult>
    <xs:schema id="NewDataSet" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
    <xs:element name="NewDataSet" msdata:IsDataSet="true" msdata:UseCurrentLocale="true">
    <xs:complexType>
    <xs:choice minOccurs="0" maxOccurs="unbounded">
    <xs:element name="Table">
    <xs:complexType>
    <xs:sequence>
    <xs:element name="BldgNum" type="xs:int" minOccurs="0"/>
    <xs:element name="BldgName" type="xs:string" minOccurs="0"/>
    <xs:element name="Campus" type="xs:string" minOccurs="0"/>
    <xs:element name="StreetNo" type="xs:string" minOccurs="0"/>
    <xs:element name="Address" type="xs:string" minOccurs="0"/>
    <xs:element name="City" type="xs:string" minOccurs="0"/>
    <xs:element name="State" type="xs:string" minOccurs="0"/>
    <xs:element name="Zip" type="xs:string" minOccurs="0"/>
    <xs:element name="Lat" type="xs:double" minOccurs="0"/>
    <xs:element name="Long" type="xs:double" minOccurs="0"/>
    <xs:element name="BldgDesc" type="xs:string" minOccurs="0"/>
    <xs:element name="DeptName" type="xs:string" minOccurs="0"/>
    <xs:element name="OrgID" type="xs:int" minOccurs="0"/>
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:choice>
    </xs:complexType>
    </xs:element>
    </xs:schema>
    <diffgr:diffgram xmlns:msdata="urn:schemas-microsoft-com:xml-msdata" xmlns:diffgr="urn:schemas-microsoft-com:xml-diffgram-v1">
    <NewDataSet xmlns="">
    <Table diffgr:id="Table1" msdata:rowOrder="0">
    <BldgNum>1113</BldgNum>
    <BldgName>Trinity House</BldgName>
    <Campus>College Avenue</Campus>
    <StreetNo>14</StreetNo>
    <Address>Stone Street</Address>
    <City>New Brunswick</City>
    <State>NJ</State>
    <Zip>08901</Zip>
    <Lat>0</Lat>
    <Long>0</Long>
    </Table>
    <Table diffgr:id="Table2" msdata:rowOrder="1">
    <BldgNum>1114</BldgNum>
    <BldgName>Second Reformed Church</BldgName>
    <Campus>College Avenue</Campus>
    <StreetNo>100</StreetNo>
    <Address>College Avenue</Address>
    <City>New Brunswick</City>
    <State>NJ</State>
    <Zip>08901</Zip>
    </Table>
    <Table diffgr:id="Table3" msdata:rowOrder="2">
    <BldgNum>3000</BldgNum>
    <BldgName>Old Queens </BldgName>
    <Campus>College Avenue</Campus>
    <StreetNo>83</StreetNo>
    <Address>SOMERSET STREET</Address>
    <City>New Brunswick</City>
    <State>NJ</State>
    <Zip>08901-1281</Zip>
    <Lat>40.4987594033</Lat>
    <Long>-74.4462681014</Long>
    <BldgDesc>Old Queens, originally called the Queens Building was erected in 1809. It was named after Charlotte Sophia the wife of George III, King of England. Charlotte Sophia was the youngest daughter of Charles Lewis, brother of Frederic, third Duke of Mecklenburg- Strelitz. She married George III on Sept.8, 1761 and was crowned on Sept. 22. Queen Charlotte was buried in St. George&apos;s Chapel, Winsdor. Source: Catalogue of Building and Place Names at Rutgers.</BldgDesc>
    <DeptName>Division of Continuing Studies, VP</DeptName>
    <OrgID>10003</OrgID>
    </Table>
    </NewDataSet>
    </diffgr:diffgram>
    </BuildingData_AllResult>
    </BuildingData_AllResponse>'
    ) ruloc from dual)
    SELECT loc.*
    FROM table1 PO,
    XMLTable('/BuildingData_AllResponse/BuildingData_AllResult/diffgr:diffgram/NewDataSet/Table/BldgNum' PASSING PO.ruloc
    COLUMNS "BldgNum" CHAR(10) PATH 'BldgNum',
    "Name" CHAR(150) PATH 'BldgName',
    "Campus" CHAR(50) PATH 'Campus') AS loc
    In the above example, for simplicity I am pulling only 3 columns, but I am looking to pull all elements within <TABLE> as column values.

    Hi,
    Please always give your database version (select * from v$version), and the error message (if any).
    The XML document you're querying defines multiple namespaces.
    You have to declare those useful to resolve the XPath expression with the XMLNamespaces clause :
    SELECT loc.*
    FROM table1 PO,
         XMLTable(
           XMLNamespaces(
             'urn:schemas-microsoft-com:xml-diffgram-v1' as "diffgr"
            , 'http://secure.rutgers.edu/spacemanagement/' as "ns0"
            )
         , '/ns0:BuildingData_AllResponse/ns0:BuildingData_AllResult/diffgr:diffgram/NewDataSet/Table'
           PASSING PO.ruloc
           COLUMNS
             "BldgNum" CHAR(10) PATH 'BldgNum',
             "Name"    CHAR(150) PATH 'BldgName',
             "Campus"  CHAR(50) PATH 'Campus'
         ) AS loc
    ;

  • Need Help with Simple Query

    Select deptno,sum(sal) from emp group by deptno order by deptno;
    This gives the below output
    Deptno Sum(Sal)
    10 8750
    20 10875
    30 9400
    But I need the below output:
    10 20 30
    8750 10875 9400
    Help needed

    I tried it with an example; have a look and adapt it to yours.
    CREATE TABLE CHIPTRUCK_SALES
    (LOCATION VARCHAR(20),
    DATE DATE,
    HOTDOGS_SOLD INT);
    INSERT INTO CHIPTRUCK_SALES VALUES
    ('Claremont', '1/1/2001', 10),
    ('Brougham', '1/1/2001', 20),
    ('Greenwood', '1/1/2001', 8),
    ('Whitevale', '1/1/2001', 5),
    ('Claremont', '1/2/2001', 11),
    ('Brougham', '1/2/2001', 22),
    ('Greenwood', '1/2/2001', 15),
    ('Whitevale', '1/2/2001', 17),
    ('Claremont', '1/3/2001', 8),
    ('Brougham', '1/3/2001', 19),
    ('Greenwood', '1/3/2001', 12),
    ('Whitevale', '1/3/2001', 7);
    SELECT
    LOCATION,
    MAX(CASE WHEN date = '1/1/2001' THEN hotdogs_sold END) AS "1/1/2001",
    MAX(CASE WHEN date = '1/2/2001' THEN hotdogs_sold END) AS "1/2/2001",
    MAX(CASE WHEN date = '1/3/2001' THEN hotdogs_sold END) AS "1/3/2001"
    FROM CHIPTRUCK_SALES GROUP BY LOCATION ORDER BY LOCATION;
    =>
    LOCATION 1/1/2001 1/2/2001 1/3/2001
    Brougham 20 22 19
    Claremont 10 11 8
    Greenwood 8 15 12
    Whitevale 5 17 7
    4 record(s) selected.
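    Applied back to the EMP table from the question, the same conditional-aggregation idea would look roughly like this (department numbers hard-coded, as in the example above):
    SELECT SUM(CASE WHEN deptno = 10 THEN sal END) AS "10",
           SUM(CASE WHEN deptno = 20 THEN sal END) AS "20",
           SUM(CASE WHEN deptno = 30 THEN sal END) AS "30"
    FROM   emp;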

  • Need help with comparing Query data

    See Attached query
    I need to only show Sales Employee Name that does NOT match Email
    Example: Sales Employee Name KSLM-130
                      E-Mail does NOT start with the same KSLM-130 @ XXX.com
    In other words need to report on unmatched Sales Employee Name to Email Name.
    Thank you!

    Hi!
    Try this. It might work for you.
    SELECT
    T0.[CardCode], T0.[CardName], T2.[SlpCode], T1.[Name], T1.[E_MailL], T2.[SlpName]
    FROM OCRD T0 
    INNER JOIN OCPR T1 ON T0.CardCode = T1.CardCode
    INNER JOIN OSLP T2 ON T0.SlpCode = T2.SlpCode
    WHERE LEFT(T1.[E_MailL],LEN(T2.[SlpName])) <> T2.[SlpName]
    Regards,

  • Need help with SQL Query

    I am trying to build a query that sums up 12 columns depending on some accounting variables.
    example:
    SELECT gbmcu, gbco, gbfy,'',SUM(gban01+gban02+gban03+gban04+gban05+gban06+gban07+gban08+gban09+gban10+gban11+gban12) AS Oil_Volumes
    FROM PRODDTA.F0902
    WHERE GBCO = '00099'
    AND GBFY = 5
    AND GBLT = 'QU' AND GBOBJ = 5015 AND GBSUB = '105'
    GROUP BY gbmcu, gbco, gbfy
    The condition that changes is :AND GBLT = 'QU' AND GBOBJ = 5015 AND GBSUB = '105'
    I need to do this for 17 different conditions - I need to make the query as optimized as possible.
    I tried using a case statement but that takes forever (the table is over 4 million records to scan through).
    Please let me know if anyone has any suggestions on how to create something to perform these calculations.
    thanks,
    Pam

    I think I would tend to write that query as:
    SELECT gbmcu, gbco, gbfy,
           SUM(CASE WHEN gblt = 'QA' AND gbobj = 5015 AND gbsub = '105' THEN
                     gban01+gban02+gban03+gban04+gban05+gban06+gban07+gban08+gban09+gban10+gban11+gban12 END) Oil_Sales,
           SUM(CASE WHEN gblt = 'QU' AND gbobj = 5015 AND gbsub = '105' THEN
                    gban01+gban02+gban03+gban04+gban05+gban06+gban07+gban08+gban09+gban10+gban11+gban12 END) Oil_Volumes
    FROM proddta.f0902
    WHERE gbco = '00099' and
          gbfy = 5 and
          gblt IN ('QA', 'QU') and
          gbobj = 5015 and
          gbsub = '105'
    GROUP BY gbmcu, gbco, gbfy
    SUM the CASE rather than CASE SUM. Also, as written, your query will look at all of the records in f0902 whether or not they meet one of the Case criteria, so I would put the required values in the WHERE clause as well. Your samples only show gblt changing, so you may need to make the gbobj and gbsub predicates IN lists as well.
    If the query runs in 1.5 hours calculating one value, I would expect it to be about the same calculating 17 values, since I would bet that most of the run time is accessing the table rather than doing the math. It would almost certainly be faster than running essentially the same query 17 times.
    Indexes on some or all of the columns in the WHERE clause may help, depending on how selective the columns are. At a guess, I would suggest that gbco and gbfy would be good candidates.
    Finally, are you sure that gbsub is a character field? The only example we have is a number.
    HTH
    John

  • SQL I need help with this query Please help

    List the names of all the products whose weight unit measure is “Gram”.  Order the list by product name.  Do not use JOINS, but use the IN clause with a sub-query.
    select Name
    from UnitMeasure
    where Name= 'Gram'
    order by Name
    I did this, but it seems that the requirement is different.

    As a guess:
    Select Name from Product
    where UnitMeasure in (Select Name from unitmeasure where name = 'Gram')
    Andy Tauber
    Data Architect
    The Vancouver Clinic

  • I need help with a query

    Hello everyone,
    First, some background information.  We have in place a table which records status changes on a work order.  The orders normally go through each status only once, however they do occasionally reuse status indicators.  For example, an order is placed on hold, released from hold, placed back on hold, released again and so on.  The sample data provided is an example of an order with data repeating itself.  I need some help with writing a query on this table.
    LOC_CODE  WO_NO  UPDATETIME           WO_STATUS_OLD  WO_STATUS_NEW
    xxx       12345  05-01-2013 10:24:00  WR             SP
    xxx       12345  05-01-2013 10:39:00  SP             PM
    xxx       12345  05-01-2013 11:52:00  PM             ES
    xxx       12345  05-01-2013 11:58:00  ES             MO
    xxx       12345  05-01-2013 12:03:00  MO             ES
    xxx       12345  05-01-2013 12:38:00  ES             AT
    xxx       12345  05-01-2013 12:48:00  AT             RS
    xxx       12345  05-01-2013 13:01:00  RS             RA
    xxx       12345  05-01-2013 13:26:00  RA             RS
    xxx       12345  05-01-2013 13:36:00  RS             RA
    xxx       12345  05-01-2013 15:35:00  RA             RS
    xxx       12345  05-01-2013 15:42:00  RS             RA
    xxx       12345  05-01-2013 16:04:00  RA             RS
    xxx       12345  05-01-2013 16:42:00  RS             RA
    xxx       12345  05-01-2013 19:28:00  RA             FD
    xxx       12345  05-01-2013 19:28:00  FD             SO
    The query (which will in turn be used for a view) will display the elapsed time between status updates (subtract the updatetime of the preceding record). Only the first record for each order at a location would have no elapsed time. The result should look like this:
    LOC_CODE  WO_NO  UPDATETIME           WO_STATUS_OLD  WO_STATUS_NEW  MINUTES_ELAPSED
    xxx       12345  05-01-2013 10:24:00  WR             SP             {null}
    xxx       12345  05-01-2013 10:39:00  SP             PM             15
    xxx       12345  05-01-2013 11:52:00  PM             ES             73
    xxx       12345  05-01-2013 11:58:00  ES             MO             6
    xxx       12345  05-01-2013 12:03:00  MO             ES             5
    xxx       12345  05-01-2013 12:38:00  ES             AT             35
    xxx       12345  05-01-2013 12:48:00  AT             RS             10
    xxx       12345  05-01-2013 13:01:00  RS             RA             13
    xxx       12345  05-01-2013 13:26:00  RA             RS             25
    xxx       12345  05-01-2013 13:36:00  RS             RA             10
    xxx       12345  05-01-2013 15:35:00  RA             RS             119
    xxx       12345  05-01-2013 15:42:00  RS             RA             7
    xxx       12345  05-01-2013 16:04:00  RA             RS             22
    xxx       12345  05-01-2013 16:42:00  RS             RA             38
    xxx       12345  05-01-2013 19:28:00  RA             FD             166
    xxx       12345  05-01-2013 19:28:00  FD             SO             0
    I have been trying various queries, but no luck as of yet.  I would appreciate your input.
    Thank you,
    Patrick

    Sorry about the late reply.  I had an unexpected meeting to attend.  Here is the requested information
    We are running Oracle Database 11g Release 11.2.0.3.0 - 64bit Production.
    --  DDL for Table WO_STATUS
      CREATE TABLE "WO_STATUS"
       ( "LOC_CODE" VARCHAR2(3 BYTE),
      "WO_NO" NUMBER,
      "UPDATE_DATETIME" DATE,
      "WO_STATUS_OLD" VARCHAR2(2 BYTE),
      "WO_STATUS_NEW" VARCHAR2(2 BYTE)
    INSERT INTO WO_STATUS (LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 10:24:00'},'WR','SP');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 10:39:00'},'SP','PM');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 11:52:00'},'PM','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 11:58:00'},'ES','MO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 12:03:00'},'MO','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 12:38:00'},'ES','AT');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 12:48:00'},'AT','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 13:01:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 13:26:00'},'RA','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 13:36:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 15:35:00'},'RA','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 15:42:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 16:04:00'},'RA','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 16:42:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 19:28:00'},'RA','FD');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12345,{ts '2013-05-01 19:28:00'},'FD','SO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:35:00'},'PM','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:37:00'},'ES','AT');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:45:00'},'AT','RS');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 09:51:00'},'RS','RA');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 10:01:00'},'RA','FD');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 10:02:00'},'FD','SO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12346,{ts '2013-06-18 10:23:00'},'SO','MP');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 08:29:00'},'WR','SP');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 09:07:00'},'SP','PM');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 09:48:00'},'PM','ES');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 09:51:00'},'ES','AT');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 10:19:00'},'AT','FD');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 10:20:00'},'FD','SO');
    INSERT INTO WO_STATUS(LOC_CODE,WO_NO,UPDATE_DATETIME,WO_STATUS_OLD,WO_STATUS_NEW) VALUES ('xxx',12347,{ts '2013-06-18 10:24:00'},'SO','PY');
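    Given that DDL and data, one sketch of the MINUTES_ELAPSED calculation uses the LAG analytic function, partitioned per location and work order (a date subtraction gives days, hence the * 24 * 60):
    SELECT loc_code, wo_no, update_datetime, wo_status_old, wo_status_new,
           ROUND( (update_datetime
                   - LAG(update_datetime) OVER (PARTITION BY loc_code, wo_no
                                                ORDER BY update_datetime)) * 24 * 60 ) AS minutes_elapsed
    FROM   wo_status
    ORDER BY loc_code, wo_no, update_datetime;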
