Group by year query

This is sample data for a particular employee code, 15400. Table name: tab_contr.
EMP_CODE CURR_MONTH DUE_MONTH CONTRIBUTION_AMT
15400 200804 200804 541
15400 200805 200805 541
15400 200806 200806 541
15400 200807 200807 541
15400 200808 200808 541
15400 200809 200809 541
15400 200810 200810 541
15400 200811 200811 541
15400 200812 200812 541
15400 200901 200901 541
15400 200902 200902 541
15400 200903 200903 541
15400 201001 201001 541
Using a GROUP BY query, I need the following output:
Years                 Total_Amt
Mar 2008 - Feb 2009   sum(contribution_amt for that period)
Mar 2009 - Feb 2010   sum(contribution_amt for that period)
Please help.

with t
as
(
select 15400 emp_code, 200804 cur_month, 200804 due_month, 541 amt from dual union all
select 15400, 200805, 200805, 541 from dual union all
select 15400, 200806, 200806, 541 from dual union all
select 15400, 200807, 200807, 541 from dual union all
select 15400, 200808, 200808, 541 from dual union all
select 15400, 200809, 200809, 541 from dual union all
select 15400, 200810, 200810, 541 from dual union all
select 15400, 200811, 200811, 541 from dual union all
select 15400, 200812, 200812, 541 from dual union all
select 15400, 200901, 200901, 541 from dual union all
select 15400, 200902, 200902, 541 from dual union all
select 15400, 200903, 200903, 541 from dual union all
select 15400, 201001, 201001, 541 from dual
)
select emp_code,
       min(cur_year)||'-'||max(cur_year) years,
       sum(amt) amt
  from (
select emp_code,
       extract(year from to_date(cur_month, 'yyyymm')) cur_year,
       extract(month from to_date(cur_month, 'yyyymm')) cur_month,
       amt
  from t)
group by emp_code, cur_year-decode(cur_month, 1, 1, 2, 1, 0)
order by cur_year-decode(cur_month, 1, 1, 2, 1, 0)
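If the report should carry the "Mar 2008 - Feb 2009" style labels asked for above, a sketch of the same idea against the posted tab_contr table: shifting each month back two months makes a March-February year fall entirely inside one calendar year, which can then be grouped on directly.

-- Sketch only; uses the table and column names from the question.
SELECT emp_code,
       'Mar ' || fy || ' - Feb ' || (fy + 1)  AS years,
       SUM(contribution_amt)                  AS total_amt
FROM  (SELECT emp_code,
              contribution_amt,
              -- shift back two months: Mar..Dec stay in the same year, Jan/Feb drop to the prior year
              EXTRACT(YEAR FROM ADD_MONTHS(TO_DATE(curr_month, 'YYYYMM'), -2)) AS fy
       FROM   tab_contr
       WHERE  emp_code = 15400)
GROUP BY emp_code, fy
ORDER BY fy;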

Similar Messages

  • Report not grouping by year

    Hi
    I have a report that is grouping by month but not year and I can't see why.
    I run the below query:
    select count(case when report_base = 2 then 1 end) as 'Base 1',
    count(case when report_base = 4 then 1 end) as 'Base 2',
    count(case when report_base = 5 then 1 end) as 'Base 3',
    month(Report_Date) as Month,
    year(Report_Date) as Year
    from [dbo].[SRF_Data]
    where Report_Date >= dateadd(month, datediff(month, 0, getDate()) -12, 0)
    group by year(Report_Date),month(Report_Date)
    And get the results below:
    When I put this into a report and run it, the report is broken down by month (it also shows a 13?) but not by year: it groups the 2015 results in with 2014. The X axis is set to Scalar because I want to show empty months. My category groups are set to Year and then Month.
    Can someone let me know what I'm doing wrong?
    Thanks.

    Hi rjp1977,
    I tested the issue on my local machine. When the Axis type is set to Category, October is not displayed because it has no data. When the Axis type is set to Scalar, October is displayed in the chart, but the data for 2014 and 2015 is displayed together; this is by design.
    If you need this behavior changed, I recommend submitting the requirement at
    https://connect.microsoft.com/SQLServer/ . If a requirement is raised by many customers, the product team may consider adding the feature in the next SQL Server version. Your feedback is valuable for us to improve our products and increase the level of service provided.
    In addition, the screenshot you provided has a value for February 2015.
    If you have any more questions, please feel free to ask.
    Thanks,
    Wendy Fu
    TechNet Community Support
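    A minimal workaround sketch, assuming the chart only needs one chronological category per month: have the dataset return a month-start date and use that single column as the category group (and scalar X value), so months from 2014 and 2015 can no longer share a category.

    -- Sketch only: same counts as the original query, but the grouping key is a
    -- month-start DATETIME, so each month of each year is a distinct category.
    SELECT  DATEADD(month, DATEDIFF(month, 0, Report_Date), 0) AS MonthStart,
            COUNT(CASE WHEN report_base = 2 THEN 1 END)        AS [Base 1],
            COUNT(CASE WHEN report_base = 4 THEN 1 END)        AS [Base 2],
            COUNT(CASE WHEN report_base = 5 THEN 1 END)        AS [Base 3]
    FROM    [dbo].[SRF_Data]
    WHERE   Report_Date >= DATEADD(month, DATEDIFF(month, 0, GETDATE()) - 12, 0)
    GROUP BY DATEADD(month, DATEDIFF(month, 0, Report_Date), 0)
    ORDER BY MonthStart;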

  • Creating Target group with Bi Query

    Hi,
    I am trying to create a target group with a BI query. I created a query on a real-time InfoCube with 0BPARTNER and 0BP_GUID and selected all navigational attributes. In the query's InfoObject properties (Advanced settings) I selected "Values from Master Data Table" and Access Type "Master Data", and in the query properties I selected "Allow External Access to Query".
    In CRM, while creating the data source, I can see the InfoObjects selected in the query; I selected the business partner InfoObject as 0BP_GUID. In the attribute list I selected just one item, Industry code, against the above data source. When I run the Segment Builder and select the industry code, it gives me 0 business partners, whereas I have 20 BPs with that industry code.
    I appreciate your help ASAP.
    Regards,
    Rajiv Jain

    Hi Rajiv
    I have the same issue with my BI Query. Hope you have resolved by now, if you have please do let me know.
    What we have done is: we created a BI query and used it in a CRM data source, then created some filters (I can see the query's objects when creating filters in the attribute list). But when we use these filters to create a target group, the query returns no results, i.e. it brings no values into the segmentation area. This issue occurs only in Production; we already tested it in Test and it passed without any problems. I am not sure why this is happening; I spoke to the BW team about it and they are not sure either. If you know, could you please let me know?
    Thank you in advance
    Regards
    shankar

  • How many ways we can create authorization for user groups in sap query reports

    Hi Gurus, I am having a problem when assigning users to a user group in SAP Query reports. Users other than those created in the user group are also able to add and change the users, so please suggest how to restrict users outside of the user group.
    Please send me any suggestions and useful threads.
    Thank You,
    Suneel Kumar.

    I don't think it can be done. According to the link below 'Users who have authorization for the authorization object S_QUERY with both the values Change and Maintain, can access all queries of all user groups without being explicitly entered in each user group.'
    http://help.sap.com/saphelp_46c/helpdata/en/d2/cb3f89455611d189710000e8322d00/content.htm
    Although I think you can add code to your infoset and maybe restrict according to authority group, i.e.:
    Use AUTHORITY-CHECK to restrict access to the database based on user.
    Press F1 on AUTHORITY-CHECK to find out how to use it in the code.

  • Order by ignored if report has 2 groups in 1 query

    My report has ORDER BY &p_q1_order, which allows the user to select the order of the report at runtime, either by Business Name or by Applicator Name. The order by was working fine with only one group in the query. I have since had to add new fields (applicator type) to the original query and have created a second group below the first in the data model. Now when the report runs it ignores the order by clause and orders the report by whichever field is first in the data model's first group. If I move the fields around in the data model it sorts by whichever one I have at the top.
    My query is below. All fields except ps.status, ps.description, and ps.requalify are in the first group; those three fields are in the second group. If I move t.BUSINESS_NAME to the top it orders by business name. If I move t.NAME to the top it orders by applicator name and ignores the user's choice in the parameter form.
    My query:
    SELECT distinct
    t.NAME,
    t.BUSINESS_NAME,
    t.ID,
    ps.status,
    ps.description,
    ps.requalify,
    t.license_cd,
    t.ISSUE_DATE,
    t.EXPIRES,
    t.county,
    t.enf_dist,
    t.addr_ln_1||' '||t.addr_ln_2 address,
    t.city||' '||t.state||' '||t.zip ctystzip,
    t.phone
    FROM lic_com_appl_dler_addr t, pesticide_com_license_sumlic ps
    where t.id = ps.id(+)
    and ps.lic_subtype_cd not in('11','12','13','14','15')
    and t.EXPIRES like :p_current
    &p_where
    order by &p_q1_order

    Hi Terri
    Please let us know the Reports version you are using. Also, try without the bind variable in the query: does the order by work then? There were bugs like this in earlier releases of Reports, but they are all fixed now.
    Thanks
    The Oracle Reports Team

  • Error message:FRM-12001: Cannot Create the record group(check your query)

    Requirement: Need to get employee name and number in the LOV in search criteria.
    So I created LOV "full_name" and Record group Query under Employee Name property palette with
    select papf.title||' '||papf.last_name||', '||papf.first_name||' '||papf.middle_names emp_full_name
    ,papf.employee_number
    from apps.per_all_people_f papf, apps.per_person_types ppt
    where sysdate between papf.effective_start_date and papf.effective_end_date AND papf.person_type_id=ppt.person_type_id AND ppt.system_person_type IN ('EMP', 'OTHER', 'CWK','EMP_APL')
    AND PPT.default_flag='Y' and papf.BUSINESS_GROUP_ID=1
    order by papf.full_name
    I was unable to save it and got the error message "FRM-12001: Cannot Create the record group (check your query)".
    I can't use PER_ALL_PEOPLE_F.FULL_NAME since the full name there is last_name||title||middle_names||first_name.
    But my requirement is papf.title||' '||papf.last_name||', '||papf.first_name||' '||papf.middle_names emp_full_name.
    Can any one of you help me.

    First, Magoo wrote:
    create or replace function emp_full_name ( p_title      in varchar2,
                                               p_last_name  in varchar2,
                                               p_first_name in varchar2,
                                               p_mid_names  in varchar2 ) return varchar2 is
    begin
        for l_rec in ( select decode ( p_title, null, null, p_title || ' ' ) ||
                              p_last_name || ', ' || p_first_name ||
                              decode ( p_mid_names, null, null, ' ' || p_mid_names ) full_name
                       from dual ) loop
            return ( l_rec.full_name );
        end loop;
    end;
    Magoo, you don't ever need to use Select from Dual. And the loop is completely unnecessary, since Dual always returns only one record. This would be much simpler:
    create or replace function emp_full_name
        ( p_title      in varchar2,
          p_last_name  in varchar2,
          p_first_name in varchar2,
          p_mid_names  in varchar2 ) return varchar2 is
    begin
        return ( ltrim( rtrim ( p_title
                        ||' ' ||p_last_name
                        ||', '||p_first_name
                        ||' ' ||p_mid_names )));
    end;
    And second:
    user606106, you did not mention how you got your record group working. However, you DO have an issue with spaces. If you change this:
    select papf.title||' '||papf.last_name||', '||papf.first_name||' '||papf.middle_names emp_full_name
          ,papf.employee_number
    to this:
    select Ltrim(Rtrim(papf.title||' '||papf.last_name||', '
           ||papf.first_name||' '||papf.middle_names)) AS emp_full_name,
           papf.employee_number
    it should work. The Ltrim(Rtrim()) removes leading and trailing spaces from the resulting full name.

  • Sort "Group by Year" question

    Hi,
    I use Aperture 3.1.1 and have added a project which contains pictures from 2010 and 2011. I have the option "Group by year" enabled and display "newest first". Usually Aperture sorts a project that spans two years above the older of its years, so it should be sorted like:
    2011
    2010-2011
    2010
    2009
    2008
    2007-2008
    2007
    However Aperture sorts as follows:
    2011
    2010
    2010-2011
    2009
    2008
    2007-2008
    2007
    Did I miss a setting here?
    Any ideas on how to fix it so it gets displayed as in the first example?
    Thanks a lot,
    Joern

    Joern18 wrote:
    Any other ideas?
    An easy one to try:
    Create a new, empty Project. Give it a very simple name. Select all the images in the problem Project and move them to the new Project. Delete the old Project. Go to Project view and try the "group by year", "sort by date" again.
    If it fails, try it again with a subset of the images to see if it is an image that is causing the problem.
    You could also rebuild your Library. This is the last step in the progression "Repair permissions", "Repair Library", "Rebuild Library". For a large Library this will take hours.

  • FRM-12001:  Cannot create the record group (check your query).

    I want to add a record group in the PRI build form. The query is very simple, like:
    SELECT item_code
    FROM items
    WHERE active = 'Y'
    AND item_code like 'FAJ%'
    but the system shows the error message:
    Cannot create the record group (check your query).

    Make sure the user connected to the database from Forms Builder has the SELECT privilege on the table, or that a synonym exists.
    Try your query from SQL*Plus connected as the same user.
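    For example, a minimal sketch (APPOWNER and FORMS_USER are hypothetical names):

    -- Connected as the owner of the ITEMS table (appowner is hypothetical):
    GRANT SELECT ON items TO forms_user;

    -- Connected as forms_user, optionally add a synonym so the Forms query
    -- can say just ITEMS instead of APPOWNER.ITEMS:
    CREATE SYNONYM items FOR appowner.items;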
    Tony

  • Group by Month, Group By Year

    I have a date column in my table and it looks like 2009/06/29. Is there a way I can GROUP BY MONTH and GROUP BY YEAR on this column?
    Or do I have to create a YEAR column to GROUP BY YEAR, and a month column to GROUP BY MONTH?
    Thanks.

    When you insert a group on a date field, the group options include "The section will be printed for each": there you can select each month, week, quarter, or year.
    Regards,
    Raghavendra
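    If the grouping is to be done in SQL rather than in the report, a minimal sketch (my_table and order_date are hypothetical names; the TO_CHAR syntax assumes an Oracle-style database):

    -- Totals per calendar year
    SELECT TO_CHAR(order_date, 'YYYY')    AS yr,
           COUNT(*)                       AS row_count
    FROM   my_table
    GROUP BY TO_CHAR(order_date, 'YYYY')
    ORDER BY yr;

    -- Totals per calendar month
    SELECT TO_CHAR(order_date, 'YYYY-MM') AS yr_month,
           COUNT(*)                       AS row_count
    FROM   my_table
    GROUP BY TO_CHAR(order_date, 'YYYY-MM')
    ORDER BY yr_month;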

  • GROUP and MAX Query

    Hi
    I have two tables that store the following information:
    CREATE TABLE T_FEED (FEED_ID NUMBER, GRP_NUM NUMBER);
    CREATE TABLE T_FEED_RCV (FEED_ID NUMBER, RCV_DT DATE);
    INSERT INTO T_FEED VALUES (1, 1);
    INSERT INTO T_FEED VALUES (2, 1);
    INSERT INTO T_FEED VALUES (3, 2);
    INSERT INTO T_FEED VALUES (4, NULL);
    INSERT INTO T_FEED VALUES (5, NULL);
    INSERT INTO T_FEED_RCV VALUES (2, '1-MAY-2009');
    INSERT INTO T_FEED_RCV VALUES (3, '1-FEB-2009');
    INSERT INTO T_FEED_RCV VALUES (4, '12-MAY-2009');
    COMMIT;
    I join these tables using the following query to return all the feeds and check when each feed was received:
    SELECT
    F.FEED_ID,
    F.GRP_NUM,
    FR.RCV_DT
    FROM T_FEED F
    LEFT OUTER JOIN T_FEED_RCV FR
    ON F.FEED_ID = FR.FEED_ID
    ORDER BY GRP_NUM, RCV_DT DESC;
    Output:
    FEED_ID   GRP_NUM   RCV_DT
    1         1
    2         1         5/1/2009
    3         2         2/1/2009
    5
    4                   5/12/2009
    Actually I want the maximum date of when we received the feed. GRP_NUM tells which feeds are grouped together; a NULL grp_num means the feed is not grouped, so treat it as an individual group. In the example, feeds 1 and 2 are in one group and any one of those feeds is required. Feeds 3, 4 and 5 are individual groups and all three are required.
    I need a single query that returns the maximum date for the feeds. For this example the result should be NULL, because out of feeds 1 and 2 the max date is 5/1/2009, for feed 3 the max date is 2/1/2009, for feed 4 it is 5/12/2009, and for feed 5 it is NULL. Since one of the required feeds is NULL, the result should be NULL.
    DELETE FROM T_FEED;
    DELETE FROM T_FEED_RCV;
    COMMIT;
    INSERT INTO T_FEED VALUES (1, 1);
    INSERT INTO T_FEED VALUES (2, 1);
    INSERT INTO T_FEED VALUES (3, NULL);
    INSERT INTO T_FEED VALUES (4, NULL);
    INSERT INTO T_FEED_RCV VALUES (2, '1-MAY-2009');
    INSERT INTO T_FEED_RCV VALUES (3, '1-FEB-2009');
    INSERT INTO T_FEED_RCV VALUES (4, '12-MAY-2009');
    COMMIT;
    For the above inserts, the result should be 5/1/2009 for feeds 1 and 2, 2/1/2009 for feed 3, and 5/12/2009 for feed 4, so the max of these dates is 5/12/2009.
    I tried using the MAX function grouped by GRP_NUM and also tried DENSE_RANK, but I was unable to resolve the issue. I am not sure how I can use the same query to return a non-null value for each real group and NULL (if any) for feeds that don't belong to any group. I appreciate any help.

    Hi,
    Kuul13 wrote:
    Thanks Frank!
    Appreciate your time and solution. I tweaked your earlier solution, which was cleaner and simpler, and built the following query to resolve the problem.
    SELECT * FROM (
    SELECT NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY')) AS GRP_ID
    ,MAX (FR.RCV_DT) AS MAX_DT
    FROM T_FEED F
    LEFT OUTER JOIN T_FEED_RCV FR ON F.FEED_ID = FR.FEED_ID
    GROUP BY NVL (F.GRP_NUM, F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY'))
    ORDER BY MAX_DT DESC NULLS FIRST)
    WHERE ROWNUM=1;
    I hope there are no hidden issues with this query compared with the later one you provided.
    Actually, I can see 4 issues with this. I admit that some of them are unlikely, but why take any chances?
    (1) The first argument to NVL is a NUMBER; the second (being the result of ||) is a VARCHAR2. That means one of them will be implicitly converted to the type of the other. This is just the kind of thing that behaves differently in different versions of Oracle, so it may work fine for a year or two and then, when you change to another version, mysteriously quit working. When you have to convert from one type of data to another, always do an explicit conversion, using TO_CHAR (for example).
    (2) F.CARR_ID || F.FEED_ID || TO_CHAR(EFF_DT, 'MMDDYYYY') will produce a key like '123405202009'. grp_num is a NUMBER with no restriction on the number of digits, so it could conceivably be 123405202009. The made-up grp_ids must never be the same as any real grp_num.
    (3) The combination (carr_id, feed_id, eff_dt) is unique, but using TO_CHAR(EFF_DT, 'MMDDYYYY') assumes that the combination (carr_id, feed_id, TRUNC(eff_dt)) is unique. Even if eff_dt is always entered as (say) midnight (00:00:00) now, you may decide to start using the time of day sometime in the future. What are the chances that you'll remember to change this query when you do? Not very likely. If multiple rows from the same day are relatively rare, this is the kind of error that could go on for months before you even realize that there is an error.
    (4) Say you have this data in t_feed:
    carr_id   feed_id   eff_dt        grp_num
    1         234       20-May-2009   NULL
    12        34        20-May-2009   NULL
    123       4         20-May-2009   NULL
    All of these rows will produce the same grp_id: 123405202009.
    Using NVL, as you are doing, allows you to get by with just one sub-query, which is nice.
    You can do that and still address all the problems above:
    SELECT  *
    FROM    (
            SELECT    NVL2 ( f.grp_num
                           , 'A' || TO_CHAR (f.grp_num)
                           , 'B' || TO_CHAR (f.carr_id) || ':' ||
                                    TO_CHAR (f.feed_id) || ':' ||
                                    TO_CHAR (f.eff_dt, 'MMDDYYYYHH24MISS')
                           )              AS grp_id
            ,         MAX (fr.rcv_dt)     AS max_dt
            FROM              t_feed      f
            LEFT OUTER JOIN   t_feed_rcv  fr  ON  f.feed_id = fr.feed_id
            GROUP BY  NVL2 ( f.grp_num
                           , 'A' || TO_CHAR (f.grp_num)
                           , 'B' || TO_CHAR (f.carr_id) || ':' ||
                                    TO_CHAR (f.feed_id) || ':' ||
                                    TO_CHAR (f.eff_dt, 'MMDDYYYYHH24MISS')
                           )
            ORDER BY  max_dt  DESC  NULLS FIRST
            )
    WHERE   ROWNUM = 1;
    I would still use two sub-queries, adding one to compute grp_id, so we don't have to repeat the NVL2 expression. I would also use a WITH clause rather than in-line views; a sketch of that version follows at the end of this reply.
    Do you find it easier to read the query above, or the simpler query you posted in your last message?
    Please make things easy on yourself and the people who want to help you. Always format your code so that the way the code looks on the screen makes it clear what the code is doing.
    In particular, the formatting should make clear
    (a) where each clause (SELECT, FROM, WHERE, ...) of each query begins
    (b) where sub-queries begin and end
    (c) what each argument to functions is
    (d) the scope of parentheses
    When you post formatted text on this site, type these 6 characters:
    before and after the formatted text, to preserve spacing.
    The way you post the DDL (CREATE TABLE ...)  and DML (INSERT ...) statements is great: I wish more people were as helpful as you.
    There's no need to format the DDL and DML. (If you want to, then go ahead: it does help a little.)
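    For reference, a sketch of that two-sub-query, WITH-clause version (carr_id and eff_dt come from the poster's real table, not the sample DDL, so this is only an outline):

    WITH got_grp_id AS
    (
        SELECT  f.feed_id,
                NVL2 ( f.grp_num
                     , 'A' || TO_CHAR (f.grp_num)
                     , 'B' || TO_CHAR (f.carr_id) || ':' ||
                              TO_CHAR (f.feed_id) || ':' ||
                              TO_CHAR (f.eff_dt, 'MMDDYYYYHH24MISS')
                     )            AS grp_id
        FROM    t_feed  f
    ),
    got_max_dt AS
    (
        SELECT    g.grp_id,
                  MAX (fr.rcv_dt) AS max_dt
        FROM              got_grp_id  g
        LEFT OUTER JOIN   t_feed_rcv  fr  ON  g.feed_id = fr.feed_id
        GROUP BY  g.grp_id
    )
    SELECT  *
    FROM   (
            SELECT    grp_id, max_dt
            FROM      got_max_dt
            ORDER BY  max_dt  DESC  NULLS FIRST
           )
    WHERE   ROWNUM = 1;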

  • Multiple passes (groups) on same query data

    I need to run a report which has two sets of results, one for the current month, one for year to date. This would be easy with subreports, but unfortunately it's a very complex report which has a number of subreports already. I'm using CR10 and SQL Server.
    In both cases the main report's query is the same (most of the work's done in the subreports).
    I've done this before using Paradox tables, by adding an extra table with two records (Seq = 1 and 2) which is unconnected to the other tables (ignoring the warning). This results in the query data being repeated for each value in the Seq table. I group on Seq and I have what I want - two groups with the same data.
    This doesn't appear to work with a SQL Server DB though. Any ideas?

    If your report is based on an SQL Command data source, you could always run the query (or stored procedure) into a temp table (say, #temp), then end the SQL Command with
    select 1 as seq, * from #temp
    Union All
    select 2 as seq, * from #temp
    I thought of this but unfortunately I have no control over the report launching, so a temp table's not an option. I tried this with one of the tables involved  but it then seems to run incredibly slowly, presumably because Crystal's obliged to do the joins rather than the server.
    It does work quite well when I put all the joins in the Command, which is a pain though, particularly for maintenance.
    So thanks for encouraging me to look further.
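    A sketch of that SQL Command pattern, assuming SQL Server; dbo.Orders and its columns are placeholders for the real report query and joins:

    -- Everything below lives in one Crystal SQL Command.
    SET NOCOUNT ON;  -- often needed so the driver only sees the final result set

    -- Stage the (possibly complex) main query once into a temp table...
    SELECT  o.OrderID, o.OrderDate, o.Amount        -- placeholder columns
    INTO    #temp
    FROM    dbo.Orders o                            -- placeholder for the real joins
    WHERE   o.OrderDate >= DATEADD(year, -1, GETDATE());

    -- ...then return it twice, tagged with seq, and group the report on seq.
    SELECT 1 AS seq, * FROM #temp
    UNION ALL
    SELECT 2 AS seq, * FROM #temp;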

  • Counting table totals grouped by year

    Hi there, I need help.
    I am writing a simple query that gives total counts grouped and sorted by year.
    When I run it as a single query it works:
    select to_char(date, 'yyyy') as year, count(*) as total from table_name group by to_char(date, 'yyyy') order by to_char(date, 'yyyy');
    I need to run the above query for every owner found through dba_tables. I have two thousand owners that all have the same table.
    thanks,

    All,
    so far this is what I have:
    set serveroutput on
    declare
    cursor c1 is select owner from all_tables where table_name ='MESSAGE' order by owner;
    sql_stmt varchar2(4000);
    cnt_val number;
    cnt_year date;
    begin
    for c1_rec in c1 loop
    sql_stmt:= 'select count(*), to_char(date, "yyyy") from '||c1_rec.owner||' .MESSAGE';
    execute immediate sql_stmt into cnt_val, cnt_year;
    dbms_output.put_line (rpad(c1_rec.owner,20) ||' ' ||cnt_val|| ' '||cnt_year);
    end loop;
    end;
    When I run the above block, the errors are:
    ORA-00904: "yyyy": invalid identifier
    ORA-06512: at line 12
    Any ideas? Please help.
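    A corrected sketch of the block, assuming the date column of the MESSAGE tables is really named msg_date (a placeholder; the original uses the reserved word DATE, which would itself need quoting). The ORA-00904 comes from the double quotes around yyyy, which must be doubled single quotes inside a string literal; and since each owner can have many years, the rows are fetched through a ref cursor instead of a single INTO.

    set serveroutput on
    declare
        cursor c1 is
            select owner
            from   all_tables
            where  table_name = 'MESSAGE'
            order  by owner;
        sql_stmt  varchar2(4000);
        rc        sys_refcursor;
        yr        varchar2(4);
        cnt_val   number;
    begin
        for c1_rec in c1 loop
            -- ''yyyy'' (doubled single quotes) instead of "yyyy";
            -- msg_date is a placeholder for the real date column.
            sql_stmt := 'select to_char(msg_date, ''yyyy''), count(*)'
                     || ' from "' || c1_rec.owner || '".MESSAGE'
                     || ' group by to_char(msg_date, ''yyyy'')'
                     || ' order by 1';
            open rc for sql_stmt;
            loop
                fetch rc into yr, cnt_val;
                exit when rc%notfound;
                dbms_output.put_line(rpad(c1_rec.owner, 20) || ' ' || yr || ' ' || cnt_val);
            end loop;
            close rc;
        end loop;
    end;
    /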

  • How to group by year

    Hi,
    I have the following simple query:
    SELECT pubdate, title
    FROM table
    ORDER BY date DESC
    Which returns hundreds of records. Instead of outputting every
    single date within a year for pubdate, I would like to group the
    output based on the year, so the display would look something like:
    2008
    2007
    2006
    1970 etc.
    How can I output this?

    jenn wrote:
    > But this displays all the dates within a year. I just
    want to display the #DatePart("yyyy", date# to get
    > 2008, 2007 etc. and not 5/21/08, 5/21/08....1/1/1979.
    Well then that is what would need to be in your query. Ok you
    could do
    it with code in the CFML, but that would be much more effort.
    SELECT pubdate, title, year(pubDate) AS year
    FROM table
    ORDER BY year(pubdate) DESC, pubdate DESC
    <!--- This is a pseudo-code example. The exact SQL date function will
    vary from one database management system to another. --->
    <cfoutput query="simpleQry" group="year">
    #year#
    <cfoutput>
    #pubDate#: #title#<br/>
    </cfoutput>
    <hr/>
    </cfoutput>

  • Displaying a radio group in SQL QUERY report region

    Good morning everyone,
    I have a report in which a column, ORDER STATUS, comes in with a value of 1, 2 or 3, meaning order unfilled, order partially filled, or order filled, respectively.
    I would like to display the order status as a radio group on the report so that it will be easy to run down the column of radio buttons to see what is filled, etc.
    I've gone to the manual and checked the doco on HTMLDB_ITEM.RADIOGROUP. But the example given there is actually for CHECKBOX (is this an error?!?).
    I went to the forums and found nothing suitable.
    My region is an SQL QUERY. Can I display the STATUS as a radio group in the SELECT ?
    This is the question.
    Thank you in anticipation. TC, 23/11/2004

    Tony,
    There may be better solutions, but here's what I was thinking:

    create table orders (id number, status number, customer varchar(30));
    insert into orders (id,status,customer) values (1,1,'ACME');
    insert into orders (id,status,customer) values (2,2,'BENSON');
    insert into orders (id,status,customer) values (3,3,'CLARKE');
    commit;

    Query Source:

    select
      id "ORDER NUMBER",
      decode(status,
        1, htmldb_item.RADIOGROUP(1,status,'1','unfilled')||htmldb_item.RADIOGROUP(2,status,'2','partial','"disabled=true"')||htmldb_item.RADIOGROUP(3,status,'3','filled','"disabled=true"'),
        2, htmldb_item.RADIOGROUP(1,status,'1','unfilled','"disabled=true"')||htmldb_item.RADIOGROUP(2,status,'2','partial')||htmldb_item.RADIOGROUP(3,status,'3','filled','"disabled=true"'),
        3, htmldb_item.RADIOGROUP(1,status,'1','unfilled','"disabled=true"')||htmldb_item.RADIOGROUP(2,status,'2','partial','"disabled=true"')||htmldb_item.RADIOGROUP(3,status,'3','filled')) "STATUS",
      customer "Customer Name"
    from orders;

    Scott

  • Problem in display of char. and key figure for yearly query

    Hello all,
    I have a query in which I have to restrict on
    Calendar Day/Month in the rows, under Material, which is a characteristic.
    So it should be
                                                               PACKAGE QUANTITY
                                            JAN  FEB  MAR   APRIL.......DEC
    PACK1
    PACK2
    PACK3
    PACK4 for current year
    PACK4 for prior YEar
    Now the months to display are not decided in advance; they can be anything, as per the user's input.
    So I have to restrict on Calendar month for Material twice:
    in the current year, just introducing the interval range variable;
    in the previous year, the same restriction with an offset of -12.
    Now the problem is while displaying it shows
                                           JAN'06 JAN'07  FEB'06 FEB'07  MAR'06 MAR'07........ 
    PACK1
    PACK2
    PACK3
    PACK4 for current year
    PACK4 for prior YEar
    This is not desirable.
    I only want the value for Jan in one column,
    with the previous year and the current year in different rows; that's it.
    If I remove Calendar year/month from the columns, there is no way I can display months.
    How do I achieve this?
    Please suggest.

    Kartikey,
    Rusty's suggestion won't work - using two characteristics for year and month will have the same effect as the position you are in at the moment, i.e.:
    2007     Jan.07     Feb.07    Mar.07   Apr.07
    2006 -
    Jan 06  Feb 06  Mar 06
    If you have the posting period of 01 - 02 - 03  (just the period number only, no year) then your query will display
    2007-01-02-03-04
    2006-01-02-03-04
    Regards
    Gill
