Need a maximum date value using group by
Create table student (dept number(10), dep_name varchar2(10),join_date date,years_attended number(10),end_date date);
insert into student values (1,'I',to_date('3/7/1917','MM/DD/YYYY'),4,to_date('8/26/1987','MM/DD/YYYY'));
insert into student values (1,'I',to_date('1/1/1900','MM/DD/YYYY'),4,to_date('8/26/1932','MM/DD/YYYY'));
insert into student values (1,'D',to_date('1/1/1920','MM/DD/YYYY'),5,to_date('8/26/1994','MM/DD/YYYY'));
insert into student values (1,'C',to_date('1/1/1920','MM/DD/YYYY'),6,to_date('8/26/1945','MM/DD/YYYY'));
insert into student values (2,'I',to_date('7/1/1900','MM/DD/YYYY'),3,to_date('8/26/1932','MM/DD/YYYY'));
insert into student values (2,'I',to_date('8/16/1916','MM/DD/YYYY'),9,to_date('8/26/1923','MM/DD/YYYY'));
insert into student values (2,'D',to_date('8/16/1916','MM/DD/YYYY'),10,to_date('8/26/1987','MM/DD/YYYY'));
insert into student values (3,'I',to_date('3/7/1917','MM/DD/YYYY'),4,to_date('8/26/1987','MM/DD/YYYY'));
insert into student values (3,'D',to_date('7/28/1920','MM/DD/YYYY'),6,to_date('8/26/1945','MM/DD/YYYY'));
insert into student values (3,'I',to_date('7/28/1920','MM/DD/YYYY'),8,to_date('8/26/1965','MM/DD/YYYY'));
insert into student values (4,'I',to_date('12/31/1924','MM/DD/YYYY'),2,to_date('8/26/1998','MM/DD/YYYY'));
insert into student values (4,'I',to_date('6/10/1929','MM/DD/YYYY'),1,to_date('8/26/1943','MM/DD/YYYY'));
insert into student values (4,'C',to_date('1/17/1927','MM/DD/YYYY'),4,to_date('8/26/1955','MM/DD/YYYY'));
insert into student values (4,'C',to_date('6/10/1929','MM/DD/YYYY'),30,to_date('8/26/1967','MM/DD/YYYY'));
insert into student values (5,'D',to_date('2/10/1931','MM/DD/YYYY'),2,to_date('8/26/1943','MM/DD/YYYY'));
insert into student values (5,'I',to_date('2/10/1931','MM/DD/YYYY'),24,to_date('8/26/1962','MM/DD/YYYY'));
commit;

I need the maximum join_date for each department. If max(join_date) matches two records for a dept, then max(end_date) should be used as the tie-breaker. I have used the select query below:
select * from student where join_date in (select max(join_date) from student group by dept);

which gives me the following result:
1 D 1/1/1920 5 8/26/1994
1 C 1/1/1920 6 8/26/1945
2 I 8/16/1916 9 8/26/1923
2 D 8/16/1916 10 8/26/1987
3 D 7/28/1920 6 8/26/1945
3 I 7/28/1920 8 8/26/1965
4 I 6/10/1929 1 8/26/1943
4 C 6/10/1929 30 8/26/1967
5 D 2/10/1931 2 8/26/1943
5 I 2/10/1931 24 8/26/1962

But I am looking for a result that gives me only one row per dept. First it should look for the maximum join_date; if two records have the same join_date, then max(end_date) should be considered. The result should be something like this:
1 D 1/1/1920 5 8/26/1994
2 D 8/16/1916 10 8/26/1987
3 I 7/28/1920 8 8/26/1965
4 C 6/10/1929 30 8/26/1967
5 I 2/10/1931 24 8/26/1962

Can you please tell me how to rewrite the select query to get the above results?
Edited by: user11872870 on Aug 2, 2011 5:29 PM
Edited by: user11872870 on Aug 2, 2011 5:36 PM
Hi,
That's called a Top-N Query , and here's one way to do it:
WITH got_r_num AS
(
    SELECT  student.*
    ,       ROW_NUMBER () OVER ( PARTITION BY  dept
                                 ORDER BY      join_date  DESC
                                 ,             end_date   DESC
                               ) AS r_num
    FROM    student
)
SELECT  dept, dep_name, join_date, years_attended, end_date
FROM    got_r_num
WHERE   r_num = 1
ORDER BY dept
;

Another way is similar to what you originally posted:
SELECT *
FROM student
WHERE (dept, join_date, end_date)
IN (
SELECT dept
, MAX (join_date)
, MAX (end_date) KEEP (DENSE_RANK LAST ORDER BY join_date)
FROM student
GROUP BY dept
);

I suspect the first way (using ROW_NUMBER) will be faster.
Also, the ROW_NUMBER approach is guaranteed to return only 1 row per dept. Using the GROUP BY approach, if there is a tie on join_date and end_date, it will return all contenders in that dept. Using ROW_NUMBER, it's easy to add as many tie-breaking expressions as you want, and, if there is still a tie, it will arbitrarily pick one of the tied rows as #1.
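A third variation (a sketch combining the two ideas above; it applies the same KEEP (DENSE_RANK LAST) device to every non-grouped column) always returns exactly one row per dept, even when both join_date and end_date are tied:

```sql
SELECT  dept
,       MAX (dep_name)       KEEP (DENSE_RANK LAST
                                   ORDER BY join_date, end_date) AS dep_name
,       MAX (join_date)                                          AS join_date
,       MAX (years_attended) KEEP (DENSE_RANK LAST
                                   ORDER BY join_date, end_date) AS years_attended
,       MAX (end_date)       KEEP (DENSE_RANK LAST
                                   ORDER BY join_date, end_date) AS end_date
FROM    student
GROUP BY dept
ORDER BY dept
;
```

One caveat: if two rows tie on both join_date and end_date, each aggregate is resolved independently, so the output row could mix values from different tied rows; the ROW_NUMBER approach does not have that problem.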
Thanks for posting the CREATE TABLE and INSERT statements! That's very helpful.
Edited by: Frank Kulash on Aug 2, 2011 9:00 PM
Added GROUP BY alternative
Similar Messages
-
How to find out Maximum date value in RPD.
Hi All,
I have a date column and I want to get the maximum date value from that column. I am trying the expression MAX("sales"."book"."date") in a new logical column, but I am getting an RPD inconsistency error. The database is SQL Server 2005. Is there any problem in my syntax?
Thanks in advance.

Is that date column part of a time dimension?
What does the error say? The syntax should be OK - maybe something to do with your DB settings in the physical layer.
Can you try to create the same column in Answers? (Select the fx button of the column and then add MAX() around the column.)
Edited by: wildmight on Jun 12, 2009 10:09 AM -
Need to display data from one group while in another repeating group
I have a repeating group in my .rtf file which displays line-level data that has a quote number in it.
At the end of this repeating group I need to display the total for the quote number, but these values are in another group that's within a different group higher up in the XML tree. Both of the groups have the quote number, so there is a link between the two. How do I do this? When I add the "higher up" repeating group within the current group I can't get any data to show. Any help would be appreciated.
Kind Regards

Here's the data. I do know how to move up the tree like a file system, but I think my problem is that I need to move up the tree and repeat based on the value (quote number) of the lower node.
For instance, for each quote number in repeating group G_FORX_OPS_QTE_ORDER_LINES, loop through the occurrences of the same quote number in G_OPPCAT_BREAKDOWN so I can get a total of the fields by quote number.
This is overly simplified but it would be like this
<?for-each-group:G_FORX_OPS_QTE_ORDER_LINES;QUOTE_NUMBER?>
quote number 1428
various line level data....
various line level data....
etc....
-- Open loop to get data from other group (Only get data for quote 1428!!)
<?for-each-group:/FORX_AS_OPPOR/LIST_G_OPPORTUNITY_SHEET/G_OPPORTUNITY_ID/LIST_G_OPPCAT_BREAKDOWN/G_OPPCAT_BREAKDOWN;./QUOTE_NUMBER1?>
Total For Quote 1428
Billing Class
BILLABLE: Billable total in dollars
INTERNAL: Internal total in dollars
<?end for-each-group?>
<!-- Generated by Oracle Reports version 6.0.8.26.0
-->
- <FORX_AS_OPPOR>
- <LIST_G_OPPORTUNITY_SHEET>
- <G_OPPORTUNITY_ID>
<COMMENTS1>Opportunity Includes Quote Number(s): 1428-2 1443-1 1444-1 Order Number(s): 159038-10 159044-3</COMMENTS1>
<CUSTOMER_ID>1183</CUSTOMER_ID>
<UPDATED_BY>CTYNER</UPDATED_BY>
<DEPOSITS>5704.68</DEPOSITS>
<OPPORTUNITY_ID>1000216</OPPORTUNITY_ID>
<OPPORTUNITY_NAME>1000216:UAT DEMO</OPPORTUNITY_NAME>
<CUSTOMER_NUMBER>103736</CUSTOMER_NUMBER>
<ENTITY>FSG</ENTITY>
<SOLD_TO>ALLSTATE INSURANCE COMPANY</SOLD_TO>
<CONTRACT_ID>S620_N</CONTRACT_ID>
<SALESREP_ID>2239</SALESREP_ID>
<SALES_REP_NAME>Conley, Michael</SALES_REP_NAME>
<ORDER_ACCEPTANCE>Skokie</ORDER_ACCEPTANCE>
<BOOKED_COUNT>2</BOOKED_COUNT>
<TERRITORY_NAME>Enterprise - Enterprise</TERRITORY_NAME>
- <LIST_G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1428</QUOTE_NUMBER1> First occurrence of 1428
<INT_ITM_TYPES>HARDWARE</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>BILLABLE</INT_BILLING_CLASS>
<SUM_ITM_COUNT>5</SUM_ITM_COUNT>
<SUM_ITM_PRICE>151500.4</SUM_ITM_PRICE>
<SUM_ITM_COST>1688</SUM_ITM_COST>
<SUM_ITM_GPM>149812.4</SUM_ITM_GPM>
<SUM_ITM_MARKUP>88.7514218009478672985781990521327014218</SUM_ITM_MARKUP>
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1428</QUOTE_NUMBER1> Second occurrence of 1428
<INT_ITM_TYPES>INVOICE_ADJUSTMENT</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>BILLABLE</INT_BILLING_CLASS>
<SUM_ITM_COUNT>4</SUM_ITM_COUNT>
<SUM_ITM_PRICE>29.72</SUM_ITM_PRICE>
<SUM_ITM_COST>0</SUM_ITM_COST>
<SUM_ITM_GPM>29.72</SUM_ITM_GPM>
<SUM_ITM_MARKUP />
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1443</QUOTE_NUMBER1>
<INT_ITM_TYPES>HARDWARE</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>BILLABLE</INT_BILLING_CLASS>
<SUM_ITM_COUNT>5</SUM_ITM_COUNT>
<SUM_ITM_PRICE>2084</SUM_ITM_PRICE>
<SUM_ITM_COST>1748</SUM_ITM_COST>
<SUM_ITM_GPM>336</SUM_ITM_GPM>
<SUM_ITM_MARKUP>.192219679633867276887871853546910755149</SUM_ITM_MARKUP>
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1444</QUOTE_NUMBER1>
<INT_ITM_TYPES>HARDWARE</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>BILLABLE</INT_BILLING_CLASS>
<SUM_ITM_COUNT>4</SUM_ITM_COUNT>
<SUM_ITM_PRICE>1500.4</SUM_ITM_PRICE>
<SUM_ITM_COST>1364</SUM_ITM_COST>
<SUM_ITM_GPM>136.4</SUM_ITM_GPM>
<SUM_ITM_MARKUP>.1</SUM_ITM_MARKUP>
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1444</QUOTE_NUMBER1>
<INT_ITM_TYPES>INVOICE_ADJUSTMENT</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>BILLABLE</INT_BILLING_CLASS>
<SUM_ITM_COUNT>4</SUM_ITM_COUNT>
<SUM_ITM_PRICE>66.28</SUM_ITM_PRICE>
<SUM_ITM_COST>0</SUM_ITM_COST>
<SUM_ITM_GPM>66.28</SUM_ITM_GPM>
<SUM_ITM_MARKUP />
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1428</QUOTE_NUMBER1>
<INT_ITM_TYPES>EXPENSE_ADJUSTMENT</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>INTERNAL</INT_BILLING_CLASS>
<SUM_ITM_COUNT>1</SUM_ITM_COUNT>
<SUM_ITM_PRICE>0</SUM_ITM_PRICE>
<SUM_ITM_COST>22.98</SUM_ITM_COST>
<SUM_ITM_GPM />
<SUM_ITM_MARKUP />
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1444</QUOTE_NUMBER1>
<INT_ITM_TYPES>EXPENSE_ADJUSTMENT</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>INTERNAL</INT_BILLING_CLASS>
<SUM_ITM_COUNT>1</SUM_ITM_COUNT>
<SUM_ITM_PRICE>0</SUM_ITM_PRICE>
<SUM_ITM_COST>22.98</SUM_ITM_COST>
<SUM_ITM_GPM />
<SUM_ITM_MARKUP />
</G_OPPCAT_BREAKDOWN>
- <G_OPPCAT_BREAKDOWN>
<QUOTE_NUMBER1>1428</QUOTE_NUMBER1>
<INT_ITM_TYPES>REVENUE_ADJUSTMENT</INT_ITM_TYPES>
<OPPORTUNITY_ID3>1000216</OPPORTUNITY_ID3>
<INT_BILLING_CLASS>INTERNAL</INT_BILLING_CLASS>
<SUM_ITM_COUNT>1</SUM_ITM_COUNT>
<SUM_ITM_PRICE>20</SUM_ITM_PRICE>
<SUM_ITM_COST>101</SUM_ITM_COST>
<SUM_ITM_GPM />
<SUM_ITM_MARKUP />
</G_OPPCAT_BREAKDOWN>
</LIST_G_OPPCAT_BREAKDOWN>
- <LIST_G_FORX_OPS_QTE_ORDER_LINES>
- <G_FORX_OPS_QTE_ORDER_LINES>
<ORDER_ACCEPTANCE1>SKOKIE</ORDER_ACCEPTANCE1>
<COMMITMENT />
<COMMITMENT_APPLIED_AMOUNT />
<PARTIAL_COMMENTS />
<COMMENT_FOR_INVOICE />
<QUOTE_DATE>07/13/2009 02:00:10PM</QUOTE_DATE>
<LINE_NUMBER>1</LINE_NUMBER>
<EXTD_SELLING_PRICE>880</EXTD_SELLING_PRICE>
<ORDER_PAY_TERM>Automated Term Due in 44 Days</ORDER_PAY_TERM>
<QUOTE_PAY_TERM />
<ITEM_CATEGORIES>HARDWARE.MACHINE.SERVER.</ITEM_CATEGORIES>
<QUOTE_STATUS>Order Submitted</QUOTE_STATUS>
<QUOTE_USER>Tyner, Chris</QUOTE_USER>
<OPPORTUNITY_ID1>1000216</OPPORTUNITY_ID1>
<DATASOURCE>1</DATASOURCE>
<HEADER_ID>398631</HEADER_ID>
<LINE_ID>900897</LINE_ID>
<LINE_TYPE_ID>1181</LINE_TYPE_ID>
<ORDER_NUMBER>159038</ORDER_NUMBER>
<ORDER_VERSION>10</ORDER_VERSION>
<TEAM>IBM</TEAM>
<OPPORTUNITY_NAME1 />
<QUOTE_NAME>1000216:IBM:UAT DEMO</QUOTE_NAME>
<QUOTE_NUMBER>1428</QUOTE_NUMBER>
<QUOTE_VERSION>2</QUOTE_VERSION>
<NEW_PARTIAL>0</NEW_PARTIAL>
<COMMIT_DEL_DATE>13-JUL-09</COMMIT_DEL_DATE>
<ORDERED_QUANTITY>1</ORDERED_QUANTITY>
<PO_NUMBER>098534089</PO_NUMBER>
<GROUP_NUM>1</GROUP_NUM>
<INVENTORY_ITEM_ID>845256</INVENTORY_ITEM_ID>
<ORDERED_ITEM>7978BDU</ORDERED_ITEM>
<PRODUCT>7978BDU</PRODUCT>
<ITEM_DESCRIPTION>X3550, XEON QUAD CORE E5430 80W 2.66GHZ/1333MHZ/12MB L2, 2X1GB CHK, O/BAY 2.5IN HS SAS, SR 8K-I, PCI-E RISER CARD, ULTRABAY ENHANCED DVD-ROM/CD-RW COMBO DRIVE, 670W P/S, RACK</ITEM_DESCRIPTION>
<MANUFACTURER_DESCRIPTION>IBM</MANUFACTURER_DESCRIPTION>
<TYPE />
<EQUIPMENT_CODE>B</EQUIPMENT_CODE>
<TAX_CLASS_CODE>HW</TAX_CLASS_CODE>
<CLS />
<SUBINVENTORY />
<UNIT_LIST_PRICE>0</UNIT_LIST_PRICE>
<ITEM_UNIT_COST>800</ITEM_UNIT_COST>
<VENDOR_INVENTORY_RETURN />
<SHIPPING_TYPE />
<SHIPPING_ORG>DSP</SHIPPING_ORG>
<SOLD_TO1>ALLSTATE INSURANCE COMPANY</SOLD_TO1>
<SOLD_TO_ACCOUNT_NUMBER>103736</SOLD_TO_ACCOUNT_NUMBER>
<INVOICE_TO_LOCATION>10110384</INVOICE_TO_LOCATION>
<INVOICE_TO>ALLSTATE INSURANCE COMPANY</INVOICE_TO>
<INVOICE_TO_ACCOUNT_NUMBER>103736</INVOICE_TO_ACCOUNT_NUMBER>
<INVOICE_TO_ADDRESS1>2775 SANDERS RD</INVOICE_TO_ADDRESS1>
<INVOICE_TO_ADDRESS2 />
<INVOICE_TO_ADDRESS3 />
<INVOICE_TO_ADDRESS4 />
<INVOICE_TO_EMAIL />
<INVOICE_TO_PHONE_NUMBER>847-402-0223</INVOICE_TO_PHONE_NUMBER>
<INVOICE_TO_ADDRESS5>NORTHBROOK, IL, 60062-6110, US</INVOICE_TO_ADDRESS5>
<INVOICE_TO_CONTACT>CRAIG SOCKEL</INVOICE_TO_CONTACT>
<L_INVOICE_TO_LOCATION>10110384</L_INVOICE_TO_LOCATION>
<L_INVOICE_TO_ACCOUNT_NUMBER>103736</L_INVOICE_TO_ACCOUNT_NUMBER>
<L_INVOICE_TO>ALLSTATE INSURANCE COMPANY</L_INVOICE_TO>
<L_INVOICE_TO_ADDRESS1>2775 SANDERS RD</L_INVOICE_TO_ADDRESS1>
<L_INVOICE_TO_ADDRESS2 />
<L_INVOICE_TO_ADDRESS3 />
<L_INVOICE_TO_ADDRESS4 />
<L_INVOICE_TO_EMAIL />
<L_INVOICE_TO_PHONE_NUMBER>847-402-0223</L_INVOICE_TO_PHONE_NUMBER>
<L_INVOICE_TO_ADDRESS5>NORTHBROOK, IL, 60062-6110, US</L_INVOICE_TO_ADDRESS5>
<L_INVOICE_TO_CONTACT>CRAIG SOCKEL</L_INVOICE_TO_CONTACT>
<SHIP_TO_ACCOUNT_NUMBER>103736</SHIP_TO_ACCOUNT_NUMBER>
<SHIP_TO>ALLSTATE INSURANCE COMPANY</SHIP_TO>
<SHIP_TO_LOCATION>10271260</SHIP_TO_LOCATION>
<SHIP_TO_ADDRESS1>3075 SANDERS RD STE 12C</SHIP_TO_ADDRESS1>
<SHIP_TO_ADDRESS2 />
<SHIP_TO_ADDRESS3 />
<SHIP_TO_ADDRESS4 />
<SHIP_TO_ADDRESS>NORTHBROOK, IL, 60062-7119, US</SHIP_TO_ADDRESS>
<SHIP_EMAIL>[email protected]</SHIP_EMAIL>
<SHIP_TO_PHONE_NUMBER>847-402-0223</SHIP_TO_PHONE_NUMBER>
<SHIP_TO_CONTACT>ORLANDO LOPEZ</SHIP_TO_CONTACT>
<L_SHIP_TO_ACCOUNT_NUMBER>103736</L_SHIP_TO_ACCOUNT_NUMBER>
<L_SHIP_TO>ALLSTATE INSURANCE COMPANY</L_SHIP_TO>
<L_SHIP_TO_LOCATION>10271260</L_SHIP_TO_LOCATION>
<L_SHIP_TO_ADDRESS1>3075 SANDERS RD STE 12C</L_SHIP_TO_ADDRESS1>
<L_SHIP_TO_ADDRESS2 />
<L_SHIP_TO_ADDRESS3 />
<L_SHIP_TO_ADDRESS4 />
<L_SHIP_TO_ADDRESS>NORTHBROOK, IL, 60062-7119, US</L_SHIP_TO_ADDRESS>
<L_SHIP_EMAIL>[email protected]</L_SHIP_EMAIL>
<L_SHIP_TO_PHONE_NUMBER>847-402-0223</L_SHIP_TO_PHONE_NUMBER>
<L_SHIP_TO_CONTACT>ORLANDO LOPEZ</L_SHIP_TO_CONTACT>
<DEL_TO_ACCOUNT_NUMBER>103736</DEL_TO_ACCOUNT_NUMBER>
<DEL_TO>ALLSTATE INSURANCE COMPANY</DEL_TO>
<DEL_TO_LOCATION>10110327</DEL_TO_LOCATION>
<DEL_TO_ADDRESS1>3075 SANDERS RD</DEL_TO_ADDRESS1>
<DEL_TO_ADDRESS2 />
<DEL_TO_ADDRESS3 />
<DEL_TO_ADDRESS4 />
<DEL_TO_ADDRESS>NORTHBROOK, IL, 60062-7119, US</DEL_TO_ADDRESS>
<DEL_TO_EMAIL />
<DEL_TO_PHONE_NUMBER>847-402-0223</DEL_TO_PHONE_NUMBER>
<DEL_TO_CONTACT />
<UNIT_SELLING_PRICE>880</UNIT_SELLING_PRICE>
<BILL_TO_LAST_UPDATE>21/04/2009 02:25:18PM</BILL_TO_LAST_UPDATE>
<SHIP_TO_LAST_UPDATE>13/07/2009 01:31:45PM</SHIP_TO_LAST_UPDATE>
<BILL_TO_UPDATED_BY>SLUMPP</BILL_TO_UPDATED_BY>
<SHIP_TO_UPDATED_BY>CTYNER</SHIP_TO_UPDATED_BY>
<ORDER_QUOTE_USER>CTYNER</ORDER_QUOTE_USER>
<PARTICIPANTS>Conley, Michael Order:159044 100% and Conley, Michael Order:159038 100% and Conley, Michael Quote:1443 50% and Jones, Maureen Quote:1443 50%</PARTICIPANTS>
<ORDERED_DATE>07/13/2009 02:23:23PM</ORDERED_DATE>
<BOOKED_DATE>07/13/2009 02:31:15PM</BOOKED_DATE>
<ORDER_QUOTE_STATUS>BOOKED</ORDER_QUOTE_STATUS>
<ORDER_USER_STATUS>RE-BOOK REQD</ORDER_USER_STATUS>
<BUYER>Tyner, Chris</BUYER>
<SERIAL_NUM>45784687</SERIAL_NUM>
<INVOICE_HANDLING_FORMAT_VALUE />
<INVOICE_HANDLING_FORMAT />
<PRINT_TRADE_NAME_VALUE>No</PRINT_TRADE_NAME_VALUE>
<ITEM_EXTD_COST>800</ITEM_EXTD_COST>
<QUOTE_APPROVERS>Tyner, Chris</QUOTE_APPROVERS>
</G_FORX_OPS_QTE_ORDER_LINES> -
Display returns latest data value rather than maximum data value.
I have requested the Maximum value using the Statistics VI. The display returns the latest value rather than the maximum. I think it may have to do with specifying the sample size or data packet size but I don't know how to do that.
You probably are only taking a single reading with the DAQ Assistant and the Statistics function expects an array to determine the max value in that array. There are a couple ways to do what you want. One way is to use the Array Min & Max Single Point function (Signal Processing>Point by Point>Other Functions). This has a sample length input and a reset input. Another way is to create an array with a shift register and use the Statistics function or just the Array Min & Max function. You could limit the size of the array with Array Subset for example.
To attach a file to your post, look below the message body. There is an Attachment field with a browse button.
Message Edited by Dennis Knutson on 06-29-2007 09:41 AM
Attachments:
Max value.PNG 7 KB -
Displaying Date value using ASP
Dear all,
I am currently trying to display the value in an Oracle date
field using ASP. However, I encountered the following error :
Microsoft OLE DB Provider for ODBC Drivers error '80020009'
Multiple-step OLE DB operation generated errors. Check each OLE
DB status value, if available. No work was done.
My code is as follows:
set abc= server.createobject("adodb.recordset")
abc.activeconnection = gDBConnectString
sqlstring = "select c_datetime from counter_tbl where c_variable = 'counter1'"
abc.open sqlstring
response.write abc("c_datetime")
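For reference, a common workaround for this class of provider error (a hedged suggestion, not taken from the thread) is to have Oracle return the DATE already converted to a string, so the ODBC layer never has to map the Oracle DATE type:

```sql
-- Aliasing the TO_CHAR result back to c_datetime means the ASP line
-- response.write abc("c_datetime") can stay exactly as it is.
select to_char(c_datetime, 'DD-MON-YYYY HH24:MI:SS') as c_datetime
from   counter_tbl
where  c_variable = 'counter1'
```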
I have tried using TO_DATE to format the field, but when I try to display it I get the same error again. The SQL statement executes successfully in SQL*Plus.
Please help. Thanks!
Regards,
Dara.

Hi Colin,
I may be misinterpreting your question -- so I don't know how much this will help you, but the Oracle-to-java mapping for database table columns with the DATE datatype changed in the "ojdbc14.jar" driver (from the "classes12.zip" driver), from "java.sql.Timestamp" to "java.sql.Date".
However, since I saw no details in your post regarding your environment/platform, like I said before, this may not help you much!
In any case, the mappings (Oracle-to-java) are detailed in the Oracle documentation -- which is available from the Tahiti Web site (in case you didn't know). Have you checked the documentation?
Good Luck,
Avi. -
Need help with data filtering on groups/application roles
Hello,
I have a situation where I have to apply security on objects (reports, prompts etc) and dimension members (Essbase cube). So the idea is like this:
Report 1: access to three users (U1, U2, U3), but for dimension Company they have separate rights:
U1: Company A, Companies A.1-A.7 (children of A) and Companies A.1.1-A.1.9 (children of A.1);
U2: Company A.1 and Companies A.1.1-A.1.9;
U3: Company A.1.1
The same for Report 2, but the users must have access to different companies, like Company B, B1...
In WebLogic Console I created three groups (G1-G3) and placed each user to a group (U1-> G1, U2 ->G2, U3->G3). Then in WebLogic EM I created three application roles (R1-R3) and added for each, corresponding user (R1-> U1, R2->U2, R3-> U3).
My approach was to use application roles like this:
R1: include User1 and filter data on repository by application role to each generation of the cube ("Data_Source_Name"."Dimension_Name"."Generation2,Dimension"='Company A',"Data_Source_Name"."Dimension_Name"."Generation3,Dimension"='Company A.1', "Data_Source_Name"."Dimension_Name"."Generation4,Dimension"='Company A.1.1')
R2: include User2 and filter data on repository by application role to each generation of the cube ("Data_Source_Name"."Dimension_Name"."Generation3,Dimension"='Company A.1', "Data_Source_Name"."Dimension_Name"."Generation4,Dimension"='Company A.1.1')
R3: include User3 and filter data on repository by application role to each generation of the cube ("Data_Source_Name"."Dimension_Name"."Generation4,Dimension"='Company A.1.1').
I've noticed that, by default, each role inherits the BIConsumer and "localmachineusers" application roles, so in the repository I set both of these roles to filter data like Role 3 (the lowest level of access), so that my roles (Roles 1 to 3) have the highest privileges.
In repository I cannot see any of my users (U1-U3), but just the application roles they are in.
For Report 1 I set the access to Roles 1-3, and when I am logged on as U3 this report should display only the data for Company A.1.1, but it doesn't (it also displays data for Company A and Companies A.1-A.7).
In fact it seems that the data isn't filtered at all, which leads me to the conclusion that my data filter is overridden by another role, maybe?
Could you please give me a clue about what I am missing here ?
Thank you.

Amith,
Please bear with me - see my comments below (search for petresion_Comments):
So, we have three users who have access to a report called Report1. But the data that they see in the report needs to be different. The report has a dimension company, and each user needs to see different companies data. So the filtering needs to be done on company dimension.
petresion_Comment: That's my case to solve.
Now the groups in WebLogic have no purpose in OBIEE 11g unless you are using an LDAP authenticator that has groups defined in Active Directory. By this I mean the network people are maintaining the users-and-groups relation necessary for OBIEE. So keeping the WebLogic groups apart for a minute, let's deal with users and roles only.
The three users are assigned to three different roles R1, R2 and R3. By default, all the roles inherit the BIConsumer role, and the localmachineusers role you mentioned is not an OTB role. This is something that is probably causing the data filtering to fail. Do a test: create a user in WebLogic, assign him only to the localmachineusers role, then go to Analytics and check your roles and groups under My Account. Make sure this role is not inheriting any other roles like BIAdministrator, BIAuthor etc. So, in conclusion, when one of your users logs in, they should inherit only their custom role (R1 for instance), BIConsumer, Authenticated User, and your custom role localmachineusers.
petresion_Comment: That is what I checked on the first time (few days ago) and is exactly as you say (BIConsumer, localmachinerole and Role1).
Do not apply any data filters on the BIConsumer role. This is not a good practice because the filters get applied to every single user that logs into the system.
petresion_Comment: I know that, but by applying filters on the BIConsumer role I tried to make sure that its privileges don't override any of my roles (1, 2 or 3). I will remove the filter on BIConsumer.
Now create the data filters on your custom roles (R1, R2, R3). Save the RPD. Deploy the Rpd through Enterprise Manager.
petresion_Comment: The only difference in my case is that I stopped the BI services, applied the changes to the RPD in offline mode and then restarted the BI services. But I also tried it as you mentioned (by the book, in fact), with the same result. The problem is the same: my roles (1, 2, 3) don't filter the companies at all.
Once you are done with all the work above, you should login into analytics as user1. After logging in go to my account, roles and groups, and make sure you see the R1 in the list of groups. Now run the report, and your filters should get applied no matter what. If they are still not getting applied, grab the physical sql and see if the filters are existing in the where condition.
petresion_Comment: Where can I capture the physical SQL (probably an MDX sent to the Essbase cube ?) ?
One other reason could be that one of the roles assigned to user1 by default is overriding the filters. For example, if a user is assigned to the BIAdmin role, then no matter whether you also assign him to a different role that has hundreds of filters, he will still see all of the data.
petresion_Comment: As I said before, each of my users is a member of their role, BIConsumer and localmachinerole, so there are no other privileges (no BIAdmin role).
Thank you for the patience.
John -
Help needed with Export Data Pump using API
Hi All,
I am trying to use the export Data Pump feature via the API.
While the export as well as the import works fine from the command line, it fails with the API.
This is the command line program:
expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
Could you help me achieve the same as the above using the Oracle Data Pump API?
DECLARE
   h1 NUMBER;
BEGIN
   h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
   dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
   dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
   dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
   dbms_datapump.start_job(h1);
   dbms_datapump.detach(h1);
END;
/
Also, in the API, I want to know how to export and import multiple tables (selective tables only) using one single criterion like "WHERE TIME_NUM > 1204884480100".

Yes, I have read the Oracle doc.
I was able to proceed as below: but it gives error.
============================================================
SQL> SET SERVEROUTPUT ON SIZE 1000000
SQL> DECLARE
2 l_dp_handle NUMBER;
3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
4 l_job_state VARCHAR2(30) := 'UNDEFINED';
5 l_sts KU$_STATUS;
6 BEGIN
7 l_dp_handle := DBMS_DATAPUMP.open(
8 operation => 'EXPORT',
9 job_mode => 'TABLE',
10 remote_link => NULL,
11 job_name => '1835_XP_EXPORT',
12 version => 'LATEST');
13
14 DBMS_DATAPUMP.add_file(
15 handle => l_dp_handle,
16 filename => 'x1835_XP_EXPORT.dmp',
17 directory => 'DATA_PUMP_DIR');
18
19 DBMS_DATAPUMP.add_file(
20 handle => l_dp_handle,
21 filename => 'x1835_XP_EXPORT.log',
22 directory => 'DATA_PUMP_DIR',
23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
24
25 DBMS_DATAPUMP.data_filter(
26 handle => l_dp_handle,
27 name => 'SUBQUERY',
28 value => '(where "XP_TIME_NUM > 1204884480100")',
29 table_name => 'ldev_perf_data',
30 schema_name => 'XPSLPERF'
31 );
32
33 DBMS_DATAPUMP.start_job(l_dp_handle);
34
35 DBMS_DATAPUMP.detach(l_dp_handle);
36 END;
37 /
DECLARE
ERROR at line 1:
ORA-39001: invalid argument value
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
ORA-06512: at line 25
============================================================
I have a table called LDEV_PERF_DATA and it's in schema XPSLPERF.
value => '(where "XP_TIME_NUM > 1204884480100")',

The above is the condition on which I want to filter the data.
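For what it's worth, a likely cause (a hedged guess, not confirmed in this thread) is that the SUBQUERY value wraps the whole predicate in double quotes, which makes Oracle treat it as a quoted identifier, and that table_name is passed in lowercase while the dictionary stores LDEV_PERF_DATA in uppercase. A sketch of the corrected calls, which also shows one answer to the multiple-tables question: restrict the job with a NAME_LIST metadata filter and leave table_name NULL on the data filter so it applies to every table in the job:

```sql
-- Restrict the TABLE-mode job to the wanted tables
-- (names are stored in uppercase in the data dictionary).
DBMS_DATAPUMP.metadata_filter(
   handle => l_dp_handle,
   name   => 'NAME_LIST',
   value  => '(''LDEV_PERF_DATA'')');

-- A SUBQUERY data filter is appended to the SELECT for each table,
-- so pass it as a plain WHERE clause.  With table_name and
-- schema_name left NULL it applies to all tables in the job.
DBMS_DATAPUMP.data_filter(
   handle => l_dp_handle,
   name   => 'SUBQUERY',
   value  => 'WHERE xp_time_num > 1204884480100');
```

To export several selective tables with the same criterion, list them all in the NAME_LIST value (e.g. '(''T1'',''T2'')') and keep the single unqualified data filter, provided the filtering column exists in each table.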
However, the below snippet works fine.
============================================================
SET SERVEROUTPUT ON SIZE 1000000
DECLARE
l_dp_handle NUMBER;
l_last_job_state VARCHAR2(30) := 'UNDEFINED';
l_job_state VARCHAR2(30) := 'UNDEFINED';
l_sts KU$_STATUS;
BEGIN
l_dp_handle := DBMS_DATAPUMP.open(
operation => 'EXPORT',
job_mode => 'SCHEMA',
remote_link => NULL,
job_name => 'ldev_may20',
version => 'LATEST');
DBMS_DATAPUMP.add_file(
handle => l_dp_handle,
filename => 'ldev_may20.dmp',
directory => 'DATA_PUMP_DIR');
DBMS_DATAPUMP.add_file(
handle => l_dp_handle,
filename => 'ldev_may20.log',
directory => 'DATA_PUMP_DIR',
filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.start_job(l_dp_handle);
DBMS_DATAPUMP.detach(l_dp_handle);
END;
============================================================
I don't want to export all contents as above; I want to export data based on some conditions and only from selective tables.
Any help is highly appreciated. -
Data Modelling - Complex issue need to get DATE values
Gurus and Experts
I have a situation where I have an infoobject A which is char of length 10.
I always get the values for this infoobject as 0001032008 from the source system,
meaning 0001 is constant
and 032008 is month and year.
Now I want to use this in a dynamic query: when the user enters 0001032008, it should get values for T+23 months,
where T = 0001032008 and T+23 would be 0001022010 (or February 2010), in short.
I can only make this work if I map this infoobject to 0CALMONTH and, in the query, use a variable for offsets with the range T to T+23.
Steps would be
1) Create infoobject B, referencing it to 0CALMONTH
2) In the transfer structure, always take the last 6 characters, so we get the date as 032008
Now, 0CALMONTH always works with a value such as 200803 and not 032008, so how would this work? Will this solution work, and how can I achieve this complex logic?
Or is there any other, simpler alternative solution?
please help
thanks

Hi
Sometimes what seems complex can be solved by going back to the root.
Let me explain:
Your source system sends you information in the format 0001MMYYYY. If I understand you correctly, you created an IO to get the information as master data. That's fine.
But your aim is to interpret your source data as 0CALMONTH. So I suggest that you add 0CALMONTH to your transfer rules and feed 0CALMONTH by ABAP code in the transfer rule, extracting the information from your source system.
If you want, you can keep the original IO, but I do not know what you can do with it....
If you took the BW310 course, your problem looks like the first exercise, where you play with the cost center coming from another source system.
Maybe I did not catch everything, but it is getting late for me.
Cheers
PYG -
Finding variable data value using fms
hi experts
I want to find a variable field value using FMS. Please help me.
The exact scenario is:
I want to fetch the tax amount of the respective tax code from the Define Tax Amount Distribution form.
E.g. suppose I have tax code BED+VAT; then I want to get the calculated values of excise, cess and VAT. I get the row tax amount. When I click the Tax Amount field I get the Tax Amount Distribution form, and from this form I want that particular value.
What is the exact SQL for this?

Hi Sachin,
No need to go to FMS. Open any transaction screen. For example, open a Sales Order, go to Form Settings -> Table Format and check the field column "Tax Only".
Then open any SO and check out this link; you will get the full information, tax breakdown, etc...
Thanks
SAGAR -
How to insert data values using Poplist to both block items....
Hi,
I have created a poplist which should return a sequence (which is stored in a db table) and a description.
The sequence (stored in the table) is of NUMBER datatype and the description is VARCHAR2.
I have created the required record group as:
rg_id := Create_Group_From_Query('TEXNIKOS_GROUP', 'select eponymo , to_char(seq_code_ergazomenoy)
from ref_ergazomenos,ref_eidikothta
where ref_ergazomenos.code_eidikothtas_type_id=ref_eidikothta.seq_code_eidikothtas
order by 1');
status := Populate_Group( rg_id );
if (status = 0)
then
POPULATE_LIST('MOD2_KLISI_VLAVIS.TEXNIKOS_FNAME','TEXNIKOS_GROUP');
end if;

The field 'MOD2_KLISI_VLAVIS.TEXNIKOS_FNAME' is the description I described above... and although this block item is filled from the selected poplist, the sequence - the code from the db table - is not.
Is it possible to do so?
NOTE: i use Dev10g.
Many thanks,
Simon

I have two block items:
seq_code_ergazomenoy: number datatype , db item , invisible
eponymo:varchar2 datatype , non db item , visible
How do I fill both of these block items using the record group written above?
Right now, only the "eponymo" block item is filled, but not the required "seq_code_ergazomenoy".
In other words, is there any way to map the two selected columns (in the dynamically created record group) to the two block items?
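For reference, one common Forms pattern (a sketch; the trigger body assumes SEQ_CODE_ERGAZOMENOY lives in the same block, which the post does not state): in a two-column record group the first column is the displayed label and the second column is the list element's value, so the poplist item already carries to_char(seq_code_ergazomenoy) as its value; copy it into the database item whenever the user picks an entry:

```sql
-- WHEN-LIST-CHANGED trigger on MOD2_KLISI_VLAVIS.TEXNIKOS_FNAME.
-- The item's value is the second record-group column,
-- to_char(seq_code_ergazomenoy), so convert it back to a number.
BEGIN
   :MOD2_KLISI_VLAVIS.SEQ_CODE_ERGAZOMENOY :=
      TO_NUMBER (:MOD2_KLISI_VLAVIS.TEXNIKOS_FNAME);
END;
```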
Thanks,
Simon
Message was edited by:
sgalaxy -
Minimum & Maximum dates in a group of records
I have a set of records containing the following fields. I have to present a report showing
Product, Quality produced on which machine, Start Date and End Date. How can this task be performed?
A sample report is at the end for reference.
Please help.
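One common approach to this kind of start/end-date grouping (a sketch, not part of the original thread; production_log is a hypothetical name for the table holding the rows shown below) is the row-number-difference, or Tabibitosan, method: for daily rows, tran_date minus ROW_NUMBER is constant within each unbroken run of the same machine/product/quality:

```sql
WITH grouped AS
(
    SELECT  machine_code, prod_code, quality, tran_date
    ,       tran_date - ROW_NUMBER () OVER ( PARTITION BY machine_code
                                                        , prod_code
                                                        , quality
                                             ORDER BY     tran_date
                                           ) AS grp   -- constant per run
    FROM    production_log
)
SELECT  prod_code       AS product
,       quality
,       machine_code    AS machine
,       MIN (tran_date) AS start_date
,       MAX (tran_date) AS end_date
FROM    grouped
GROUP BY machine_code, prod_code, quality, grp
ORDER BY machine, start_date
;
```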
MACHINE PROD QUALITY TRAN_DATE
CODE CODE
1002 101 2 15/11/2011
1002 101 2 16/11/2011
1002 101 2 17/11/2011
1002 101 2 18/11/2011
1002 101 2 19/11/2011
1002 101 2 20/11/2011
1002 101 1 21/11/2011
1002 101 1 22/11/2011
1002 101 1 23/11/2011
1002 101 1 24/11/2011
1002 101 1 25/11/2011
1002 101 2 26/11/2011
1002 101 2 27/11/2011
1002 101 2 28/11/2011
1002 101 2 29/11/2011
1002 101 2 30/11/2011
Product Quality Machine Start Date End Date
101 2 1002 15/11/2011 20/11/2011
101 1 1002 21/11/2011 25/11/2011
101 2 1002 26/11/2011 25/12/2011

Sir,
Thanks for the suggestion, but when I run this query for multiple machine codes it is not generating the desired output. What changes do I have to make to get the desired output? I have arranged the data in the format desired by you.
with t_data as (
select 1001 machine_code, 105 prod_code, 2 quality, to_date('01/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('02/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('03/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('04/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('05/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('06/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('07/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('08/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('09/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('10/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('11/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('12/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('13/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('14/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('15/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('16/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('17/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('18/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('19/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('20/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('21/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('22/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('23/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('24/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1001, 105, 2, to_date('25/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('01/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('02/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('03/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('04/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('05/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('06/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('07/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('08/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('09/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('10/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('11/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('12/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('13/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('14/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('15/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('16/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('17/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('18/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('19/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('20/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('21/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('22/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 109, 1, to_date('23/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('24/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1002, 101, 2, to_date('25/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('01/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('02/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('03/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('04/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('05/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('06/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('07/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('08/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('09/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('10/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('11/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('12/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('13/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('14/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('15/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('16/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('17/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('18/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('19/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('20/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('21/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('22/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('23/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('24/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1003, 101, 1, to_date('25/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('01/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('02/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('03/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('04/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('05/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('06/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('07/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('08/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('09/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('10/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('11/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('12/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('13/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('14/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('15/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('16/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('17/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('18/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('19/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('20/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('21/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('22/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('23/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('24/12/2011','DD/MM/YYYY') tran_date from dual union all
select 1004, 101, 2, to_date('25/12/2011','DD/MM/YYYY') tran_date from dual
)
SELECT
PROD_CODE,
QUALITY,
MACHINE_CODE,
MIN(TRAN_DATE),
MAX(TRAN_DATE)
FROM
(SELECT
t.*,
row_number() over(order by TRAN_DATE) -
row_number() over(partition BY MACHINE_CODE,PROD_CODE,QUALITY order by TRAN_DATE) AS dt
FROM
t_data t)
GROUP BY
MACHINE_CODE,PROD_CODE,QUALITY,dt; -
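A likely reason this misbehaves with several machines is that the first row_number() runs over all machines at once; partitioning it by MACHINE_CODE as well (i.e. `row_number() over(partition by MACHINE_CODE order by TRAN_DATE)`) keeps each machine's numbering independent. That is a guess at the intent rather than a tested fix. The underlying "tabibitosan" grouping idea can be sketched in plain Python (data and names are illustrative):

```python
from itertools import groupby

# (machine, prod, quality, day) rows, already sorted by machine then date
rows = [
    (1002, 101, 2, 15), (1002, 101, 2, 16), (1002, 101, 2, 17),
    (1002, 101, 1, 18), (1002, 101, 1, 19),
    (1002, 101, 2, 20), (1002, 101, 2, 21),
]

def islands(rows):
    """Collapse consecutive rows with the same (machine, prod, quality) into
    (machine, prod, quality, start_day, end_day) islands -- the same result
    the row_number()-difference trick produces in SQL."""
    out = []
    for key, grp in groupby(rows, key=lambda r: r[:3]):
        days = [r[3] for r in grp]
        out.append((*key, days[0], days[-1]))
    return out

print(islands(rows))
# -> [(1002, 101, 2, 15, 17), (1002, 101, 1, 18, 19), (1002, 101, 2, 20, 21)]
```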
Get maximum Date valued records
Hi,
I have a table like this
Table1 with columns SY,MID, EndDate, x,y,z
Here SY+MID+EndDate combination is Primary key.
I need to get all the latest records (i.e. those with max(EndDate)) for the given SY and MID.
Now I am writing like this:
Select t.SY,t.MID,t.Enddate,t.x,t.y,t.z
From Table t
Where Enddate=(select max(enddate) from table where sy=Paramsy and mid=paramMID)
Is there any way to get the desired result without writing the subquery to get the maximum EndDate?
Thanks
EXPLAIN PLAN FOR
SELECT empno, ename, deptno
FROM emp
WHERE hiredate = (SELECT MAX(hiredate) FROM emp);
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 21 | 6 (0)| 00:00:01 |
|* 1 | TABLE ACCESS FULL | EMP | 1 | 21 | 3 (0)| 00:00:01 |
| 2 | SORT AGGREGATE | | 1 | 8 | | |
| 3 | TABLE ACCESS FULL| EMP | 14 | 112 | 3 (0)| 00:00:01 |
EXPLAIN PLAN FOR
select empno,ename,deptno,hiredate from
(select empno,ename,deptno,hiredate,dense_rank() over(order by hiredate desc) rn from emp)
where rn = 1;
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 14 | 770 | 4 (25)| 00:00:01 |
|* 1 | VIEW | | 14 | 770 | 4 (25)| 00:00:01 |
|* 2 | WINDOW SORT PUSHED RANK| | 14 | 294 | 4 (25)| 00:00:01 |
| 3 | TABLE ACCESS FULL | EMP | 14 | 294 | 3 (0)| 00:00:01 |
CREATE INDEX ix_emp_hiredate
ON emp(hiredate)
TABLESPACE example;
exec dbms_stats.gather_schema_stats(USER, CASCADE=>TRUE);
EXPLAIN PLAN FOR
SELECT empno, ename, deptno
FROM emp
WHERE hiredate = (SELECT MAX(hiredate) FROM emp);
| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
| 0 | SELECT STATEMENT | | 1 | 21 | 5 (0)| 00:00:01 |
| 1 | TABLE ACCESS BY INDEX ROWID | EMP | 1 | 21 | 2 (0)| 00:00:01 |
|* 2 | INDEX RANGE SCAN | IX_EMP_HIREDATE | 1 | | 1 (0)| 00:00:01 |
| 3 | SORT AGGREGATE | | 1 | 8 | | |
| 4 | INDEX FULL SCAN (MIN/MAX)| IX_EMP_HIREDATE | 14 | 112 | | |
------------------------------------------------------------------------------------------------
Just for comparison. -
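For intuition, the dense_rank-style filter above ("keep every row tied for the latest date") can be sketched in plain Python (the rows are illustrative, loosely modelled on EMP):

```python
# Rows of (empno, ename, deptno, hiredate); hiredate as an ISO string,
# so string comparison matches date order.
rows = [
    (7369, "SMITH",  20, "1980-12-17"),
    (7839, "KING",   10, "1981-11-17"),
    (7934, "MILLER", 10, "1982-01-23"),
    (7876, "ADAMS",  20, "1982-01-23"),
]

def latest(rows):
    """Return every row whose hiredate equals the maximum hiredate,
    i.e. the SQL filter dense_rank() over (order by hiredate desc) = 1."""
    top = max(r[3] for r in rows)
    return [r for r in rows if r[3] == top]

print(latest(rows))
# -> [(7934, 'MILLER', 10, '1982-01-23'), (7876, 'ADAMS', 20, '1982-01-23')]
```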
Help Needed in Checking data values.
There is a table that holds the name, address, and phone number of each organisational division. These details are fetched from a few other tables and stored in one table.
Could you suggest how to check the details (name, address, ph_number) stored in this table and create/update a flag field, alongside the other division details, with the following values:
‘2’ – Creation of a new division
‘3’ – Amendment – i.e. Changed Division (either in the Address, Name or Ph_number)
‘4’ – Not Changed
blank = undetermined
Could you please help with this?
Thanks a lot.
Hallo,
with a trigger. Something like this:
create or replace trigger scott.trg_org
before update or insert on scott.tbl_org
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
begin
if inserting THEN
:new.flag := '2';
else
-- note: "!=" treats a NULL-to-value change as "no change"; wrap in NVL/DECODE for a NULL-safe comparison
if :old.adress != :new.adress
or :old.name != :new.name
or :old.phoneNr != :new.phoneNr
then
:new.flag := '3';
else
:new.flag := '4';
end if;
end if;
end;
Regards
Dmytro Dekhtyaryuk -
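The trigger's decision logic, including the NULL-safe comparison that plain `!=` does not give you, can be sketched in Python (field names are illustrative):

```python
def change_flag(old, new):
    """Return '2' for a brand-new row, '3' if address/name/phone changed
    (treating None/NULL safely), and '4' if nothing changed --
    mirroring the trigger's flag values."""
    if old is None:  # corresponds to the INSERTING branch
        return '2'
    fields = ("address", "name", "phone_nr")
    changed = any(old.get(f) != new.get(f) for f in fields)
    return '3' if changed else '4'

print(change_flag(None, {"name": "Sales"}))  # -> 2
print(change_flag({"address": "A", "name": "Sales", "phone_nr": "1"},
                  {"address": "B", "name": "Sales", "phone_nr": "1"}))  # -> 3
```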
Reg: fetch the data by using item_id which is returned by an inline view query
Hi all,
create table xxc_transactions(type_id number,trx_line_id number ,item_id number,org_id number);
insert into xxc_transactions values(null,null,null,null);
create table xxc_items1(item_id number,org_id number,item_no varchar2(10));
insert into xxc_items1 values(123,12,'book');
create table xxc_headers(header_id number,order_id number);
insert into xxc_headers values(null,null);
create table xxc_lines(header_id number,item_id number,line_id number);
insert into xxc_lines values(null,null,null);
create table xxc_types_tl(transaction_id number,NAME varchar2(10));
insert into xxc_types_tl values(106,'abc');
create table xxc_quantity(item_id number);
insert into xxc_quantity values (123);
create table xxc_quantity_1(item_id number);
insert into xxc_quantity_1 values (123);
SELECT union_id.item_id,
b.org_id,
e.name,
fun1(union_id.item_id) item_no
FROM xxc_transactions a,
xxc_items1 b,
xxc_headers c,
xxc_lines d,
xxc_types_tl e,
(SELECT item_id
FROM xxc_quantity
WHERE item_id = 123
UNION
SELECT item_id
FROM xxc_quantity_1
WHERE item_id = 123
UNION
SELECT item_id
FROM xxc_transactions
WHERE item_id = 123) union_id
WHERE a.type_id = 6
AND a.item_id = b.item_id
AND union_id.item_id = b.item_id
AND a.org_id = b.org_id
AND c.header_id = d.header_id
AND d.line_id = a.trx_line_id
AND d.item_id = b.item_id
AND c.order_id = e.transaction_id
AND b.org_id = 12
GROUP BY union_id.item_id,
b.org_id,
e.name
ORDER BY union_id.item_id;
create or replace function fun1(v_item in number)
return varchar2
is
v_item_no varchar2(10);
Begin
select item_no into v_item_no from xxc_items1
where item_id = v_item;
return v_item_no;
Exception
When Others Then  -- note: WHEN OTHERS hides real errors; catching NO_DATA_FOUND would be more precise
v_item_no := null;
return v_item_no;
END fun1;
I need to fetch the data using the item_id which is returned by the inline view query (UNION).
item_id org_id name item_no
123 12 abc book
Version: 11.1.0.7.0 and 11.2.0.1.0
Message was edited by: Rajesh123 Added test cases script
Message was edited by: Rajesh123 changed Question to "fetch the data by using item_id which is returned by an inline view query (UNION)"
Hi Master, sorry for the late reply. Can you please help with this?
create table xxc_transactions(type_id number,trx_line_id number ,item_id number,org_id number);
insert into xxc_transactions values(null,null,null,null);
create table xxc_items(item_id number,org_id number,item_no varchar2(10));
insert into xxc_items values(123,12,'book');
create table xxc_headers(header_id number,order_id number);
insert into xxc_headers values(null,null);
create table xxc_lines(header_id number,item_id number,line_id number);
insert into xxc_lines values(null,null,null);
create table xxc_types_tl(transaction_id number,NAME varchar2(10));
insert into xxc_types_tl values(106,'abc');
create table xxc_uinon_table(item_id number);
insert into xxc_uinon_table values(123);
SELECT union_id.item_id,
b.org_id ,
e.name ,
fun1(union_id.item_id) item_no --> to get item_no
FROM xxc_transactions a,
xxc_items b,
xxc_headers c,
xxc_lines d,
xxc_types_tl e,
( SELECT item_id
FROM xxc_uinon_table ) union_id
WHERE a.type_id= 6
AND a.item_id = b.item_id
AND union_id.item_id = b.item_id
AND a.org_id = b.org_id
AND c.header_id = d.header_id
AND d.line_id= a.trx_line_id
AND d.item_id= b.item_id
AND c.order_id= e.transaction_id ---106
AND b.org_id = 12
GROUP BY union_id.item_id,
b.org_id ,
e.name
ORDER BY union_id.item_id;
Note: xxc_uinon_table is a combination of UNIONs, e.g.:
select 1 from dual
union
select 1 from dual
union
select 3 from dual where 1 = 0; -- this branch returns no rows
I will get 1 from the above query.
Thank you in advance. -
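As a side note, the per-row PL/SQL lookup that fun1 performs can usually be replaced by a join against the inline view. A minimal sqlite3 sketch of the idea (table and column names shortened, data illustrative, not the original schema):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE items(item_id INTEGER, org_id INTEGER, item_no TEXT);
    INSERT INTO items VALUES (123, 12, 'book');
    CREATE TABLE qty_a(item_id INTEGER);
    INSERT INTO qty_a VALUES (123);
    CREATE TABLE qty_b(item_id INTEGER);
    INSERT INTO qty_b VALUES (123);
""")

# UNION deduplicates, so 123 appears once; the join then picks up item_no
# directly instead of calling a lookup function for every row.
rows = con.execute("""
    SELECT u.item_id, i.org_id, i.item_no
    FROM (SELECT item_id FROM qty_a
          UNION
          SELECT item_id FROM qty_b) u
    JOIN items i ON i.item_id = u.item_id
""").fetchall()
print(rows)  # -> [(123, 12, 'book')]
```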
Hi everyone,
I need help with inserting values using merge.
* I need to check all the units in a parent category. For example, NF_ARTICLECATEGORYID = 7642 is a parent category.
* I am going to compare all the units in the parent category (7642) to the units in a subcategory (8053).
* If a unit in the parent category (7642) is not present in the subcategory (8053), then that unit will be inserted into the same table.
table structure:
Table name : ARTICLECATEGORYACCESS
Fields: IP_ARTICLECATEGORYACCESSID
NF_ARTICLECATEGORYID
NF_UNITID
NF_USERID
N_VIEW
N_EDIT
Sample data:
CREATE TABLE articlecategoryaccess (
IP_ARTICLECATEGORYACCESSID NUMBER(5),
NF_ARTICLECATEGORYID NUMBER (10),
NF_UNITID NUMBER (10),
NF_USERID NUMBER (10),
N_VIEW INT,
N_EDIT INT);
INSERT INTO articlecategoryaccess VALUES (255583, 7642, 29727, NULL, 1 ,1);
INSERT INTO articlecategoryaccess VALUES (243977,7642,29728, NULL, 1 ,1);
INSERT INTO articlecategoryaccess VALUES (240770,7642,29843, NULL, 1 ,1);
INSERT INTO articlecategoryaccess VALUES (243413,7642,29844, NULL, 1 ,1);
INSERT INTO articlecategoryaccess VALUES (274828,7642,44849, NULL, 1 ,1);
INSERT INTO articlecategoryaccess VALUES (274828,8053,44849, NULL, 1 ,1);
Unit IDs 29727, 29728, 29843, 29844, and 44849 have access to parent category 7642.
Unit IDs 29727, 29728, 29843, and 29844 don't have access to subcategory 8053.
29727, 29728, 29843, and 29844 should be inserted into the same table and will then have access to 8053.
After they are inserted, it should look like this
IP_ARTICLECATEGORYACCESSID NF_ARTICLECATEGORYID NF_UNITID NF_USERID N_VIEW N_EDIT
255583 7642 29727 null 1 1
243977 7642 29728 null 1 1
240770 7642 29843 null 1 1
243413 7642 29844 null 1 1
274828 7642 44849 null 1 1
new value 8053 44849 null 1 1
new value 8053 29843 null 1 1
new value 8053 29844 null 1 1
new value 8053 29728 null 1 1
new value 8053 29727 null 1 1
NOTE: IP_ARTICLECATEGORYACCESSID is a sequence and it should be unique
DECLARE
BEGIN
MERGE INTO articlecategoryaccess b
USING (SELECT *
FROM articlecategoryaccess c
WHERE nf_articlecategoryid = 7642
MINUS
SELECT *
FROM articlecategoryaccess c
WHERE nf_articlecategoryid = 8053) e
ON (1 = 2)
WHEN NOT MATCHED THEN
INSERT (b.ip_articlecategoryaccessid, b.nf_articlecategoryid, b.nf_unitid, b.NF_USERID, b.N_VIEW, b.N_EDIT)
VALUES (articlecategoryaccessid_seq.nextval, 8053, e.nf_unitid, null, 1, 1);
END;
I got an error after running the script:
*Cause: An UPDATE or INSERT statement attempted to insert a duplicate key.
For Trusted Oracle configured in DBMS MAC mode, you may see
this message if a duplicate entry exists at a different level.
*Action: Either remove the unique restriction or do not insert the key.
Why would it be duplicated? It's a sequence and it's unique. I don't know; maybe there is something wrong with my script.
Any help is appreciated.
Ed
Ed,
1. What is the current value of the Sequence? Does the current value of sequence exist in the table? If yes, then increment the sequence to a value that is not present in the Table.
2. Do you have any unique constraint on any of the columns that you are inserting?
I have to ask you again: why are you insisting on a Merge statement when a simple Insert can do the job for you? Don't you feel the Merge statement above makes things look more complicated than they actually are?
Think on it and then proceed. I hope these pointers help you to resolve the issue.
Regards,
P.
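For what it's worth, the "insert parent-category units missing from the subcategory" step can be expressed as a plain INSERT ... SELECT that compares only the unit id (the MINUS of SELECT * above also compares the unique id column, so parent and subcategory rows can never match). A minimal sqlite3 sketch with simplified columns and illustrative data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE access(cat_id INTEGER, unit_id INTEGER);
    INSERT INTO access VALUES (7642, 29727), (7642, 29728), (7642, 44849),
                              (8053, 44849);
""")

# Insert, under subcategory 8053, every unit that has parent access
# but no subcategory row yet -- only the unit id is compared.
con.execute("""
    INSERT INTO access(cat_id, unit_id)
    SELECT 8053, unit_id
    FROM access
    WHERE cat_id = 7642
      AND unit_id NOT IN (SELECT unit_id FROM access WHERE cat_id = 8053)
""")

rows = sorted(con.execute("SELECT unit_id FROM access WHERE cat_id = 8053"))
print(rows)  # -> [(29727,), (29728,), (44849,)]
```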