Hide duplicate values
Hi all
In one of my reports I have duplicate values: Measure2 = Measure1 summed over 2008+2009+2010.
Year-------Measure1------Measure2
2008------ 10--------------60
2009------ 20--------------60
2010------ 30--------------60
Is it possible to display 60 only for 2008?
Regards
Hi.
It's not clear what your criteria are for suppressing Measure2.
So I tried something similar in Answers.
Group by:
TIMES.CALENDAR_MONTH_DESC
Measure1:
SALES.QUANTITY_SOLD
Measure2: Quantity sold calculated on year level
SUM(SALES.QUANTITY_SOLD by TIMES.CALENDAR_YEAR)
Measure2 suppressed: Quantity sold calculated on year level shown only in the first month of each year
CASE
WHEN RCOUNT(SALES.QUANTITY_SOLD BY TIMES.CALENDAR_YEAR)=1 THEN SUM(SALES.QUANTITY_SOLD BY TIMES.CALENDAR_YEAR)
ELSE NULL
END
So I would expect results:
Month Quantity sold Quantity sold year Quantity sold year (suppressed)
1999-01 10081747.00 10991779.00 *10991779.00*
1999-02 96488.00 10991779.00
1999-03 78508.00 10991779.00
1999-04 68016.00 10991779.00
1999-05 76852.00 10991779.00
2000-01 88540.00 930584.00 *930584.00*
2000-02 82436.00 930584.00
2000-03 77812.00 930584.00
2000-04 69924.00 930584.00
2000-05 80184.00 930584.00
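The first-month suppression above can also be sketched outside Answers; a minimal Python illustration (with made-up quantities) of the same idea, showing a group total only on the first row of each group, as the RCOUNT(...) = 1 test does:

```python
from itertools import groupby

def suppress_group_total(rows):
    """rows: (year, month, qty) tuples sorted by year, month.
    Returns (month, qty, total) where the year total appears only on the
    first month of each year and is None elsewhere, mimicking
    CASE WHEN RCOUNT(... BY year) = 1 THEN SUM(... BY year) END."""
    out = []
    for year, grp in groupby(rows, key=lambda r: r[0]):
        grp = list(grp)
        total = sum(q for _, _, q in grp)  # SUM(... BY year)
        for i, (_, month, qty) in enumerate(grp):
            out.append((month, qty, total if i == 0 else None))
    return out

rows = [(2008, "2008-01", 10), (2008, "2008-02", 20), (2009, "2009-01", 30)]
print(suppress_group_total(rows))
# → [('2008-01', 10, 30), ('2008-02', 20, None), ('2009-01', 30, 30)]
```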
Maybe this helps.
Regards,
Goran
http://108obiee.blogspot.com
Similar Messages
-
Query produces duplicate values in successive rows. Can I null them out?
I've had very good success (thanks, Andy) in getting detailed responses to my posted questions. I'm wondering: in a report region whose join query produces successive rows with duplicate values, is there a way to suppress (blank out) the printing of the duplicate values in those rows? In case I've managed to twist the question into an unintelligible mess (one of my specialties), let me provide an example:
We're trying to list the undergraduate institutions that an applicant has attended, and display information about dates of attendance, GPA, and major(s)/minor(s) taken. The rub is that there can be multiple major(s)/minor(s) for a given undergraduate institution, so the following is produced by the query:
University of Hard Knox 01/02 01/06 4.00 Knitting
University of Hard Knox 01/02 01/06 4.00 Cloth Repair
Advanced University 02/06 01/08 3.75 Clothing Design
Really Advanced U 02/08 01/09 4.00 Sports Clothing
Really Advanced U 02/08 01/09 4.00 Tennis Shoe Burlap
I want it to look like this:
University of Hard Knox 01/02 01/06 4.00 Knitting
Cloth Repair
Advanced University 02/06 01/08 3.75 Clothing Design
Really Advanced U 02/08 01/09 4.00 Sports Clothing
Tennis Shoe Burlap
* (edit) Please note that the cloth repair and tennis shoe repair rows would be correctly positioned in a table, but unfortunately got space-suppressed here for some reason. *
Under Andy's tutelage, I'd say the answer is probably JavaScript looping through the DOM looking for the innerHTML of specific TDs in the table, but I'd like to confirm that, or, does APEX provide a checkbox that I can check that will produce the same results? Thanks in advance guys, and sorry for all the questions. We've been tasked to use APEX for our next project and to learn it by using it, since the training budget is non-existent this year. I love unfunded mandates ;)
Phil
Edited by: Phil McDermott on Aug 13, 2009 9:34 AM
Hi Phil,
Javascript is useful, as is the column break functionality within Report Attributes (which would be my first choice if poss).
If you need to go past 3 columns, I would suggest doing something in the SQL statement itself. This will mean that the sorting would probably have to be controlled, but it is doable.
Here's a fairly old thread on the subject: Re: Grouping on reports (not interactive) - with an example here: [http://htmldb.oracle.com/pls/otn/f?p=33642:112] (I've even used a custom template for the report, but you don't need to go that far!)
This uses the LAG functionality within SQL to compare the values on one row to the values on the preceding row - being able to compare these values allows you to determine which ones to display or hide.
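The LAG comparison can be illustrated with a small sketch; a hypothetical Python version of the same idea, blanking a column whenever it repeats the value on the preceding row:

```python
def blank_repeats(rows, col):
    """Blank out rows[i][col] whenever it equals the value on the
    preceding row, like comparing col to LAG(col) OVER (ORDER BY ...)."""
    out, prev = [], object()  # sentinel: equal to nothing on the first row
    for row in rows:
        row = list(row)
        if row[col] == prev:
            row[col] = ""      # duplicate of preceding row: hide it
        else:
            prev = row[col]    # new value: remember and display it
        out.append(tuple(row))
    return out

rows = [("University of Hard Knox", "Knitting"),
        ("University of Hard Knox", "Cloth Repair"),
        ("Advanced University", "Clothing Design")]
print(blank_repeats(rows, 0))
```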
Andy -
SSRS hide Duplicate property Issue
In a report grouped by property name, floor code, and unit code, I have the same square-feet value in multiple rows. I clicked "hide duplicates" so it only displays once on the report, but the square-feet total is still summed from every row, including the hidden ones. Is there a way to sum only the displayed row (or the top row in a group) for square feet?
As the sample, I had the data as below:
property leasename floorcode unit squarefeet chargecode amount
property1 xyz f1 500 4000 rent 5000
property1 xyz f1 500 4000 est 5000
property1 xyz f1 500 4000 eso 5000
property1 xyz f1 500 4000 cleaning 4000
Now I can show it like this with the Hide Duplicates property:
property leasename floorcode unit squarefeet chargecode amount
property1 xyz f1 500 4000 rent 5000
est 5000
eso 5000
cleaning 4000
16000 19000
But I'd like the report to show this:
property leasename floorcode unit squarefeet chargecode amount
property1 xyz f1 500 4000 rent 5000
est 5000
eso 5000
cleaning 4000
4000 19000
Can anyone help me solve this?
Thanks!
property leasename floorcode unit squarefeet chargecode amount
property1 xyz f1 500 4000 rent 5000
property1 xyz f1 500 4000 est 5000
property1 xyz f1 500 4000 eso 5000
property1 xyz f1 500 4000 cleaning 4000
400 5000 rent
1000
400 5000 est
2000
400 5000 eso
3000
Now I can show it like this with the Hide Duplicates property:
property leasename floorcode unit squarefeet chargecode amount
property1 xyz f1 500 4000 rent 5000
est 5000
eso 5000
cleaning 4000
400 5000 rent
1000
est 2000
eso 3000
21000 19000
But I'd like the report to show this:
property leasename floorcode unit squarefeet chargecode amount
property1 xyz f1 500 4000 rent 5000
est 5000
eso 5000
cleaning 4000
400 5000 rent
1000
est 2000
eso 3000
9000 19000
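"Hide duplicates" only changes what is displayed, so the aggregate still sums the hidden rows; the behavior wanted here (count the square feet once per group while summing every amount) can be sketched in Python, using hypothetical (group_key, squarefeet, amount) rows:

```python
def group_totals(rows):
    """rows: (group_key, squarefeet, amount). Square feet repeats on every
    detail row of a group, so add it once per group (the displayed row);
    amount is summed over all rows."""
    seen, sqft_total, amount_total = set(), 0, 0
    for key, sqft, amount in rows:
        if key not in seen:
            seen.add(key)
            sqft_total += sqft  # first (displayed) row of the group only
        amount_total += amount
    return sqft_total, amount_total

rows = [("f1/500", 4000, 5000), ("f1/500", 4000, 5000),
        ("f1/500", 4000, 5000), ("f1/500", 4000, 4000)]
print(group_totals(rows))  # → (4000, 19000)
```

In SSRS itself the usual fix is along these lines: aggregate at group scope (for example, summing a group-level First() of the square feet) instead of summing the detail rows.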
-
Hello,
Is it possible to suppress duplicate values in Oracle Reports, like in SQL*Plus?
BREAK ON DEPARTMENT_ID; would hide all duplicate values, as mentioned in this doc:
http://download.oracle.com/docs/cd/B19306_01/server.102/b14357/ch6.htm#sthref1249
Any such capability in Oracle Reports?
Thanks
Create a group with the fields you want to break on. In the data model, drag that column from the rectangular group box onto the line which connects the rounded query box and the rectangular group box.
-
Avoiding null and duplicate values using model clause
Hi,
I am trying to use the model clause to get a comma-separated list of data; following is the scenario:
testuser>select * from test1;
ID VALUE
1 Value1
2 Value2
3 Value3
4 Value4
5 Value4
6
7 value5
8
8 rows selected.
the query I have is:
testuser>with src as (
2 select distinct id,value
3 from test1
4 ),
5 t as (
6 select distinct substr(value,2) value
7 from src
8 model
9 ignore nav
10 dimension by (id)
11 measures (cast(value as varchar2(100)) value)
12 rules
13 (
14 value[any] order by id =
15 value[cv()-1] || ',' || value[cv()]
16 )
17 )
18 select max(value) oneline
19 from t;
ONELINE
Value1,Value2,Value3,Value4,Value4,,value5,
What I find is that this query produces a duplicate value and a null (',,') in the output, because the data contains a null and a duplicate value. Is there a way I can avoid the null and the duplicate values in the query output?
thanks,
Edited by: orausern on Feb 19, 2010 5:05 AM
Hi,
Try this code.
with
t as ( select substr(value,2) value, ind
       from test1
       model
       ignore nav
       dimension by (id)
       measures (cast(value as varchar2(100)) value, 0 ind)
       rules
       ( ind[any] = instr(value[cv()-1], value[cv()]),
         value[any] order by id = value[cv()-1] ||
           CASE WHEN value[cv()] IS NOT NULL AND ind[cv()] = 0
                THEN ',' || value[cv()]
           END
       )
     )
select max(value) oneline
from t;
SQL> select * from test1;
ID VALUE
1 Value1
2 Value2
3 Value3
4 Value4
5 Value4
6
7 value5
8
8 rows selected.
SQL> with
2 t as ( select substr(value,2)value,ind
3 from test1
4 model
5 ignore nav
6 dimension by (id)
7 measures (cast(value as varchar2(100)) value, 0 ind)
8 rules
9 ( ind[any]= instr(value[cv()-1],value[cv()]),
10 value[any] order by id = value[cv()-1] || CASE WHEN value[cv()] IS NOT NULL
11 and ind[cv()]=0 THEN ',' || value[cv()] END
12 )
13 )
14 select max(value) oneline
15 from t;
ONELINE
Value1,Value2,Value3,Value4,value5
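The extra ind rule skips a value that is already present in the accumulated string; the same dedupe-and-skip-nulls idea can be sketched in Python (using exact membership rather than instr's substring test):

```python
def comma_list(values):
    """Build a comma-separated list in input order, skipping NULLs (None)
    and values already emitted, like the model clause's CASE/instr guard."""
    out = []
    for v in values:
        if v is not None and v not in out:
            out.append(v)
    return ",".join(out)

data = ["Value1", "Value2", "Value3", "Value4", "Value4", None, "value5", None]
print(comma_list(data))  # → Value1,Value2,Value3,Value4,value5
```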
SQL> -
Unable to Enforce Unique Values, Duplicate Values Exist
I have a list in SP 2010 containing roughly 1000 items. I would like to enforce unique values on the title field. I started by cleaning up the list, ensuring that all items already had a unique value. To help with this, I used the export-to-Excel action, then highlighted duplicates within Excel. So as far as I can tell, there are no duplicates within that list column.
However, when I try to enable the option to Enforce Unique Values, I receive the error that duplicate values exist within the field and must be removed.
Steps I've taken so far to identify / resolve duplicate values:
- Multiple exports to Excel from an unfiltered list view, then using highlight duplicates feature > no duplicates found
- deleted ALL versions of every item from the list (except current), ensured they were completely removed by deleting from both site and site collection recycle bins
- Using the SP Powershell console, grabbed all list items and exported all of the "Title" type fields (Item object Title, LinkTitle, LinkTitleNoMenu, etc) to a csv and ran that through excel duplicate checking as well.
Unless there's some ridiculous hidden field value that MS expects anyone attempting to enforce unique values on a list to know about (which would be simple enough for anyone to figure out - if it didn't just throw an error), I've exhausted everything I can think of that might cause the list to report duplicate values for that field.
While I wait to see if someone else has an idea, I'm also going to see what happens if I wipe the Crawl Index and start it from scratch.
- Jon
First, I created an index for the column in list settings; that works fine whether or not duplicate values exist.
Then I set enforce unique values on the field; after clicking OK, I get the duplicate-values error message.
With SQL Server profiler, I find the call to proc_CheckIfExistingFieldHasDuplicateValues and the parameters. After reviewing this stored procedure in content database,
I create the following script in SQL Server management studio:
declare @siteid uniqueidentifier
declare @webid uniqueidentifier
declare @listid uniqueidentifier
declare @fieldid uniqueidentifier
set @siteid='F7C40DC9-E5D3-42D7-BE60-09B94FD67BEF'
set @webid='17F02240-CE04-4487-B961-0482B30DDA84'
set @listid='B349AF8D-7238-419D-B6C4-D88194A57EA7'
set @fieldid='195A78AC-FC52-4212-A72B-D03144DC1E24'
SELECT
* FROM TVF_UserData_List(@ListId)
AS U1 INNER
MERGE JOIN
NameValuePair_Latin1_General_CI_AS
AS NVP1 WITH (INDEX=NameValuePair_Latin1_General_CI_AS_MatchUserData)
ON NVP1.ListId
= @ListId AND NVP1.ItemId
= U1.tp_Id
AND ((NVP1.Level
= 1 AND U1.tp_DraftOwnerId
IS NULL)
OR NVP1.Level
= 2)
AND NOT((DATALENGTH(ISNULL(NVP1.Value, N'')) = 0))
AND U1.tp_Level
= NVP1.Level
AND U1.tp_IsCurrentVersion
= CONVERT(bit, 1)
AND U1.tp_CalculatedVersion
= 0 AND U1.tp_RowOrdinal
= 0 INNER
MERGE JOIN
NameValuePair_Latin1_General_CI_AS
AS NVP2 WITH (INDEX=NameValuePair_Latin1_General_CI_AS_CI)
ON NVP2.SiteId
= @SiteId AND NVP2.ListId
= @ListId AND NVP2.FieldId
= @FieldId AND NVP2.Value
= NVP1.Value
AND NVP2.ItemId <> NVP1.ItemId
CROSS APPLY TVF_UserData_ListItemLevelRow(NVP2.ListId, NVP2.ItemId,
NVP2.Level, 0)
AS U2 WHERE ((NVP2.Level
= 1 AND U2.tp_DraftOwnerId
IS NULL)
OR NVP2.Level
= 2)
AND NOT((DATALENGTH(ISNULL(NVP2.Value, N'')) = 0))
I can find the duplicate list items based on the result returned by the query above.
Note that you need to change the parameter values accordingly, and change the name of NameValuePair_Latin1_General1_CI_AS table based on the last parameter of the
proc_CheckIfExistingFieldHasDuplicateValues stored procedure. You can review the code of this stored procedure by yourself.
Note that direct operation on the content database in production environment is not supported, please do all these in test environment. -
Need help-SQL with result format suppressing duplicate values
I am trying to write a SQL query which selects data from the tables below (in reality I have many other tables as well), but I want to know how to get the data in the format given below.
The scenario: a training plan can have N OBJECTIVES, and each objective has N activities.
Insert into TEST_TRAINING_PLAN
(TPLAN_ID, TPLAN_NAME, TPLAN_DESC, T_PERSON_ID)
Values
('111', 'test_name', 'test_name_desc', '****');
Objectives table:-
Insert into TEST_TRAINING_OBJECTIVE
(T_OBJECTIVE_ID, T_OBJECTIVE_NAME,T_owner)
Values
('10', 'objective1', '1862188559');
Objective and Training Plan relationship table where TPLAN_ID is the foreign key.
Insert into TEST_TP_OBJECTIVE
(TPLAN_TOBJ_ID, TPLAN_ID, T_OBJECTIVE_ID,
REQUIRED_CREDITS)
Values
('1', '111', '10', 5);
Objective and Activity relationship table where T_OBJECTIVE_ID is the foreign key from the TEST_TRAINING_OBJECTIVE table.
Insert into TEST_TRAIN_OBJ_ACTIVITY
(TOBJ_TRAIN_ACTIVITY, T_OBJECTIVE_ID, ACTIVITY_ID, IS_REQUIRED, STATUS,
ACTIVITY_TYPE, ITEM_ORDER, IS_PREFERRED)
Values
('1000', '10', 'selfstudy event', SS1, NULL,
'Event', 0, 0);
Insert into TEST_TRAIN_OBJ_ACTIVITY
(TOBJ_TRAIN_ACTIVITY, T_OBJECTIVE_ID, ACTIVITY_ID, IS_REQUIRED, STATUS,
ACTIVITY_TYPE, ITEM_ORDER, IS_PREFERRED)
Values
('1001', '10', 'SQLcourse', 1, NULL,
'Course', 1, 0);
Insert into TEST_TRAIN_OBJ_ACTIVITY
(TOBJ_TRAIN_ACTIVITY, T_OBJECTIVE_ID, ACTIVITY_ID, IS_REQUIRED, STATUS,
ACTIVITY_TYPE, ITEM_ORDER, IS_PREFERRED)
Values
('1002', '10', 'testSQL', 1, NULL,
'test', 2, 0);
COMMIT;
firstname emplid Tplan name Number of activities/credits completed(for TP) Objective Name Number of required activities/Credits (for objective) Number of activities/credits completed(for objective) activity name activity completion status
U1 U1 TP1 5
OBJ1 4 3 C1 PASSED
C2 PASSED
C3 WAIVED
T1 ENROLLED
T2 ENROLLED
OBJ2 3 2
S1 ENROLLED
S2 PASSED
T3 WAIVED
U1 U1 TP2 C4 INPROGRESS
50 OBJ11 50 30 C11 PASSED
**The second row shows another training_plan record and, accordingly, all its objectives.** Similarly, I need to display many training_plan records in this tabular format. Please help with the SQL query to select and display data in the above format.
If you want to suppress duplicate values in some of your result columns:
I am using TOAD 9.1 with Oracle 10g Release 2.
Hi,
You can use the BREAK command to suppress duplicate values.
http://download.oracle.com/docs/cd/B19306_01/server.102/b14357/ch12009.htm#SQPUG030
(scroll down for an example)
It's a SQL*Plus-ism; I'm not sure whether TOAD can handle it.
Simple example:
HR%xe> break on department_name
HR%xe> select l.department_name
2 , e.last_name
3 , e.first_name
4 from departments l
5 , employees e
6 where e.department_id = l.department_id;
DEPARTMENT_NAME LAST_NAME FIRST_NAME
Executive King Steven
Kochhar Neena
De Haan Lex
IT Hunold Alexander
Ernst Bruce
Austin David
Pataballa Valli
Lorentz Diana
Finance Greenberg Nancy
Faviet Daniel
Chen John
Sciarra Ismael
Urman Jose Manuel
Popp Luis
Purchasing Raphaely Den
Khoo Alexander
Baida Shelli
Tobias Sigal
Himuro Guy
Colmenares Karen
Shipping Weiss Matthew
Fripp Adam
Kaufling Payam
Vollman Shanta
Mourgos Kevin
Nayer Julia
Mikkilineni Irene
Landry James
Public Relations Baer Hermann
Accounting Higgins Shelley
Gietz William
106 rows selected. -
I have one database table called "sms1" that is updated daily. It has the following fields:
SQL> desc sms1;
Name Null? Type
MOBILE NUMBER
RCSTCNATCNATCNATCNAWTHER VARCHAR2(39 CHAR)
SNO NUMBER
INDATE DATE
This table has one column, RCSTCNATCNATCNATCNAWTHER VARCHAR2(39 CHAR), which I am splitting into different columns like:
SQL> desc smssplit;
Name Null? Type
R VARCHAR2(2 CHAR)
C VARCHAR2(2 CHAR)
S VARCHAR2(1 CHAR)
TC VARCHAR2(3 CHAR)
NA VARCHAR2(3 CHAR)
TC2 VARCHAR2(3 CHAR)
NA2 VARCHAR2(3 CHAR)
TC3 VARCHAR2(3 CHAR)
NA3 VARCHAR2(3 CHAR)
TC4 VARCHAR2(3 CHAR)
NA4 VARCHAR2(3 CHAR)
WTHER VARCHAR2(10 CHAR)
SNO NUMBER
INSERTDATA VARCHAR2(25 CHAR)
I have written a procedure to insert the data from the "sms1" table into the "smssplit" table:
CREATE OR REPLACE PROCEDURE SPLITSMS
AS
BEGIN
INSERT INTO scott.SMSSPLIT ( R,C,S,TC,NA,TC2,NA2,TC3,NA3,TC4,NA4,WTHER,SNO)
SELECT SUBSTR(RCSTCNATCNATCNATCNAWTHER,1,2) R,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,3,2) C,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,5,1) S,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,6,3) TC,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,9,3) NA,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,12,3) TC2,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,15,3) NA2,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,18,3) TC3,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,21,3) NA3,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,24,3) TC4,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,27,3) NA4,
SUBSTR(RCSTCNATCNATCNATCNAWTHER,30,10) WTHER, SNO
FROM scott.SMS1 where SNO=(select MAX (sno) from SMS1);
END;
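The SUBSTR offsets implement a fixed-width split of the 39-character string; the same layout can be sketched in Python (field names and widths taken from the procedure; the sample value is made up to match the posted data):

```python
# Field layout from the procedure's SUBSTR calls: (name, 1-based start, length)
LAYOUT = [("R", 1, 2), ("C", 3, 2), ("S", 5, 1), ("TC", 6, 3), ("NA", 9, 3),
          ("TC2", 12, 3), ("NA2", 15, 3), ("TC3", 18, 3), ("NA3", 21, 3),
          ("TC4", 24, 3), ("NA4", 27, 3), ("WTHER", 30, 10)]

def split_sms(raw):
    """Split RCSTCNATCNATCNATCNAWTHER the way the SUBSTR calls do
    (Oracle SUBSTR is 1-based, hence start - 1 for Python slicing)."""
    return {name: raw[start - 1:start - 1 + length]
            for name, start, length in LAYOUT}

fields = split_sms("33352123456789543241643243135RRRRRR")
print(fields["R"], fields["TC"], fields["WTHER"])  # → 33 123 RRRRRR
```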
Now, in order to update the second table with data from the first table on a regular basis, I have scheduled a job (I am using Oracle version 9.0):
DECLARE
X NUMBER;
JobNumber NUMBER;
BEGIN
SYS.DBMS_JOB.SUBMIT
( job => X
, what => 'scott.SPLITSMS;'
, next_date => SYSDATE + 1/1440
, interval => 'SYSDATE + 1/1440'
, no_parse => FALSE
);
JobNumber := X;
COMMIT;
END;
Now this job is working properly, updating the data every minute, but it is inserting duplicate values as well, for example:
R C S TC NA TC2 NA2 TC3 NA3 TC4 NA4 WTHER SNO
INSERTDATA
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:49:16
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:49:16
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:50:17
R C S TC NA TC2 NA2 TC3 NA3 TC4 NA4 WTHER SNO
INSERTDATA
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:50:17
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:51:19
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:51:19
R C S TC NA TC2 NA2 TC3 NA3 TC4 NA4 WTHER SNO
INSERTDATA
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:52:20
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:52:20
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:53:22
R C S TC NA TC2 NA2 TC3 NA3 TC4 NA4 WTHER SNO
INSERTDATA
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:53:22
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:54:45
33 35 2 123 456 789 543 241 643 243 135 RRRRRR 55
06-SEP-2012 03:54:45
Now I do not want the duplicate values to be inserted; I want them to be ignored. Please, I need help with this query: how do I avoid the duplicate values?
Look at the posts closely: might not be needed if formatted ;)
create or replace procedure splitsms as
begin
insert into scott.smssplit (r,c,s,tc,na,tc2,na2,tc3,na3,tc4,na4,wther,sno)
select substr(rcstcnatcnatcnatcnawther,1,2) r,
substr(rcstcnatcnatcnatcnawther,3,2) c,
substr(rcstcnatcnatcnatcnawther,5,1) s,
substr(rcstcnatcnatcnatcnawther,6,3) tc,
substr(rcstcnatcnatcnatcnawther,9,3) na,
substr(rcstcnatcnatcnatcnawther,12,3) tc2,
substr(rcstcnatcnatcnatcnawther,15,3) na2,
substr(rcstcnatcnatcnatcnawther,18,3) tc3,
substr(rcstcnatcnatcnatcnawther,21,3) na3,
substr(rcstcnatcnatcnatcnawther,24,3) tc4,
substr(rcstcnatcnatcnatcnawther,27,3) na4,
substr(rcstcnatcnatcnatcnawther,30,10) wther,
sno
from scott.sms1 a
where sno = (select max(sno)
from sms1
where sno != a.sno
); ---------------> added where clause with table alias.
end;
Regards
Etbin -
Removing duplicate values from selectOneChoice bound to List Iterator
I'm trying to remove duplicate values from a selectOneChoice that i have. The component binds back to a List Iterator on the pageDefinition.
I have a table on a JSF page with 5 columns; the table is bound to a method iterator on the pageDef. Above the table there are 5 separate selectOneChoice components, each of which is bound to the result set of the table's iterator. So each selectOneChoice only contains values corresponding to the table column it represents.
The selectOneChoice components are part of a search facility and allow the user to select values from them and restrict the results that are returned. The concept is fine and it works. However, if I have repeating values in the selectOneChoice (which is inevitable given it's bound to the table column result set), then I need to remove them. I can remove null values or empty strings using expression language in the rendered attribute as shown:
<af:forEach var="item"
items="#{bindings.XXXX.items}">
<af:selectItem label="#{item.label}" value="#{item.label}"
rendered="#{item.label != ''}"/>
</af:forEach>
But I don't know how I can remove duplicate values easily. I know I can do it programmatically in a backing bean etc., but I want to know if there is perhaps some EL that might do it, or another setting in ADF which can overcome this.
Any help would be appreciated.
Kind Regards
Hi,
It'll be a little difficult to remove duplicates while keeping the context as-is with existing standard functions. Removing duplicates irrespective of context changes can be done with available functions. Please try this UDF code, which may help you:
source --> sort --> UDF --> target
The execution type of the UDF is "All Values of a Context".
public void UDF(String[] var1, ResultList result, Container container) throws StreamTransformationException {
    ArrayList aList = new ArrayList();
    aList.add(var1[0]);
    result.addValue(var1[0]);
    for (int i = 1; i < var1.length; i++) {
        if (aList.contains(var1[i]))
            continue;
        else {
            aList.add(var1[i]);
            result.addValue(var1[i]);
        }
    }
}
Regards,
Priyanka -
Exclude duplicate values on SQL where clause statement
Hi!
Is there any possibility to exclude duplicate values without using SQL aggregate functions in the main select statement?
Preview SQL statement:
SELECT * FROM
( select id, hin_id, name, code, valid_date_from, valid_date_to
  from diaries
) QRSLT
WHERE (hin_id = (SELECT NVL(historic_id,id)FROM tutions where id=/*???*/ 59615))
AND NVL(valid_date_to,to_date('22.12.2999','dd.mm.yyyy')) <= (SELECT NVL(valid_date_to,to_date('22.12.2999','dd.mm.yyyy'))FROM tutions where id= /*???*/ 59615)
AND trunc(valid_date_from) >=(SELECT trunc(valid_date_from)FROM tutions where id= /*???*/ 59615)
The result
ID HIN_ID NAME CODE VALID_DATE FROM VALID_DATE_TO
50512
59564
RE TU
01
07.06.2013 16:32:15
07.06.2013 16:33:28
50513
59564
TT2
02
07.06.2013 16:33:23
07.06.2013 16:33:28
50515
59564
TT2
02
07.06.2013 16:33:28
07.06.2013 16:34:42
50516
59564
ROD
03
07.06.2013 16:34:37
07.06.2013 16:34:42
VALID_DATE_TO and VALID_DATE_FROM in tutions:
07.06.2013 16:34:42
15.07.2013 10:33:23
In this case I get a duplicate of entry TT2 (id 50513). I can't use aggregate functions in the main select statement. Is it even possible to exclude this value from the result by modifying only the QRSLT WHERE clause? (The TRUNC needs to be here.)
THANKS FOR ANY TIP!
ID.
Hi, OK, this is working in this case:
SELECT * FROM
( select id, hin_id, name, code, valid_date_from, valid_date_to
  from diaries ahs
) QRSLT
WHERE (hin_id = (SELECT NVL(historic_id,id)FROM aip_healthcare_tutions where id=/*???*/ 59615))
AND NVL(valid_date_to,to_date('22.12.2999','dd.mm.yyyy')) <= (SELECT NVL(valid_date_to,to_date('22.12.2999','dd.mm.yyyy'))FROM tutions where id= /*???*/ 59615)
AND trunc(valid_date_from) >=(SELECT trunc(valid_date_from)FROM tutions where id= /*???*/ 59615)
AND NOT EXISTS
(SELECT null FROM diaries ahs WHERE ahs.valid_date_from < QRSLT.valid_date_from
AND QRSLT.hin_id=ahs.hin_id
AND QRSLT.code=ahs.code);
Result
50512
59564
RE TU
01
07.06.2013 16:32:15
07.06.2013 16:33:28
50513
59564
TT2
02
07.06.2013 16:33:23
07.06.2013 16:33:28
50516
59564
ROD
03
07.06.2013 16:34:37
07.06.2013 16:34:42
But if the data in the tutions row is as below (valid_date_to is null), then NO ROWS are returned. That is logical, because in the full result list the valid_date_from column is logically incorrect.
valid_date_from valid_date_to
15.07.2013 10:33:23
NULL
ID HIN_ID NAME CODE VALID_DATE FROM VALID_DATE_TO
50510
59564
RE TU
01
07.06.2013 16:33:28
50511
59564
TT2
02
07.06.2013 16:34:41
50514
59564
ROD
03
07.06.2013 16:34:41
50520
59564
Params
04
03.07.2013 21:01:30
50512
59564
RE TU
01
07.06.2013 16:32:15
07.06.2013 16:33:28
50513
59564
TT2
02
07.06.2013 16:33:23
07.06.2013 16:33:28
50515
59564
TT2
02
07.06.2013 16:33:28
07.06.2013 16:34:42
50516
59564
ROD
03
07.06.2013 16:34:37
07.06.2013 16:34:42
Is it possible to modify the WHERE clause so that, if valid_date_to in tutions is null, the diary records where valid_date_to is null are also treated as correct, while keeping the previous logic?
D HIN_ID NAME CODE VALID_DATE FROM VALID_DATE_TO
50510
59564
RE TU
01
07.06.2013 16:33:28
null
50511
59564
TT2
02
07.06.2013 16:34:41
null
50514
59564
ROD
03
07.06.2013 16:34:41
null
50520
59564
Params
04
03.07.2013 21:01:30
null
Thanks !
ID. -
Hi,
Please see the table structure below.
-- Create table
create table RN_RPT_FIG
(
  DT         DATE not null,
  RPT_ID     NUMBER(8) not null,
  RPT_ROW_ID NUMBER(4) not null,
  RPT_COL_ID NUMBER(4) not null,
  FIG        NUMBER(20,2)
)
tablespace TS_IRS
  pctfree 10
  initrans 1
  maxtrans 255
  storage
  (
    initial 64K
    minextents 1
    maxextents unlimited
  );
-- Create/Recreate primary, unique and foreign key constraints
alter table RN_RPT_FIG
  add constraint PK_RN_RPT_FIG primary key (DT, RPT_ID, RPT_ROW_ID, RPT_COL_ID)
  using index
  tablespace TS_IRS
  pctfree 10
  initrans 2
  maxtrans 255
  storage
  (
    initial 128K
    minextents 1
    maxextents unlimited
  );
I would like to find any duplicate values that have been entered in this table. How can I write this query, keeping in mind the constraint PK_RN_RPT_FIG? I am actually trying to find out which rows have duplicate values.
below is the data in that table
DT RPT_ID RPT_ROW_ID RPT_COL_ID FIG
1 31-Mar-11 2000101 1 2 8157500.00
2 31-Mar-11 2000101 1 1 17.00
3 31-Mar-11 2000101 1 4 530000.00
4 31-Mar-11 2000101 1 3 5.00
5 31-Mar-11 2000101 2 2 96500.00
6 31-Mar-11 2000101 2 1 6.00
7 31-Mar-11 2000101 3 2 8301000.00
8 31-Mar-11 2000101 3 1 7.00
9 31-Mar-11 2000101 3 4 669000.00
10 31-Mar-11 2000101 3 3 6.00
11 31-Mar-11 2000101 5 2 25184500.00
12 31-Mar-11 2000101 5 1 61.00
13 31-Mar-11 2000101 5 4 22609000.00
14 31-Mar-11 2000101 5 3 19.00
15 31-Mar-11 2000101 6 2 14103500.00
16 31-Mar-11 2000101 6 1 24.00
17 31-Mar-11 2000101 7 5 9153500.00
18 31-Mar-11 2000102 1 1 6.00
19 31-Mar-11 2000102 1 2 664000.00
20 31-Mar-11 2000102 1 3 1.00
21 31-Mar-11 2000102 1 4 18500.00
22 31-Mar-11 2000102 2 1 4.00
23 31-Mar-11 2000102 2 2 4800000.00
24 31-Mar-11 2000102 2 3 1.00
25 31-Mar-11 2000102 2 4 98000.00
26 31-Mar-11 2000102 3 3 1.00
27 31-Mar-11 2000102 3 4 98000.00
28 31-Mar-11 2000102 4 1 18.00
29 31-Mar-11 2000102 4 2 16476000.00
30 31-Mar-11 2000102 4 3 10.00
31 31-Mar-11 2000102 4 4 21257000.00
32 31-Mar-11 2000102 5 1 9.00
33 31-Mar-11 2000102 5 2 6069000.00
34 31-Mar-11 2000102 5 3 6.00
35 31-Mar-11 2000102 5 4 79000.00
36 31-Mar-11 2000102 6 1 26.00
37 31-Mar-11 2000102 6 2 2938000.00
38 31-Mar-11 2000102 6 3 2.00
39 31-Mar-11 2000102 6 4 1167500.00
40 31-Mar-11 2000102 7 1 2.00
41 31-Mar-11 2000102 7 2 257000.00
42 31-Mar-11 2000103 1 1 3.00
43 31-Mar-11 2000103 1 2 169500.00
44 31-Mar-11 2000103 2 3 3.00
45 31-Mar-11 2000103 2 4 2266500.00
46 31-Mar-11 2000103 3 1 4.00
47 31-Mar-11 2000103 3 2 6400000.00
48 31-Mar-11 2000103 4 1 17.00
49 31-Mar-11 2000103 4 2 9369500.00
50 31-Mar-11 2000103 4 3 8.00
51 31-Mar-11 2000103 4 4 20149000.00
52 31-Mar-11 2000103 5 1 25.00
53 31-Mar-11 2000103 5 2 7307000.00
54 31-Mar-11 2000103 5 3 3.00
55 31-Mar-11 2000103 5 4 125500.00
56 31-Mar-11 2000103 6 1 7.00
57 31-Mar-11 2000103 6 2 194500.00
58 31-Mar-11 2000103 6 3 5.00
59 31-Mar-11 2000103 6 4 68000.00
60 31-Mar-11 2000103 7 1 2.00
61 31-Mar-11 2000103 7 2 247000.00
62 31-Mar-11 2000103 8 1 2.00
63 31-Mar-11 2000103 8 2 1375000.00
64 31-Mar-11 2000103 11 1 1.00
65 31-Mar-11 2000103 11 2 122000.00
66 31-Mar-11 2000104 3 4 432500.00
67 31-Mar-11 2000104 3 3 27.00
68 31-Mar-11 2000104 3 6 115000.00
69 31-Mar-11 2000104 3 5 8.00
70 31-Mar-11 2000104 4 4 172000.00
71 31-Mar-11 2000104 4 3 4.00
72 31-Mar-11 2000104 4 6 294000.00
73 31-Mar-11 2000104 4 5 3.00
74 31-Mar-11 2000104 5 4 7030000.00
75 31-Mar-11 2000104 5 3 23.00
76 31-Mar-11 2000104 5 6 1150000.00
77 31-Mar-11 2000104 5 5 1.00
78 31-Mar-11 2000104 6 4 17550000.00
79 31-Mar-11 2000104 6 3 7.00
80 31-Mar-11 2000104 6 6 21050000.00
81 31-Mar-11 2000104 6 5 7.00
82 31-Mar-11 2000105 5 4 664000.00
83 31-Mar-11 2000105 5 3 6.00
84 31-Mar-11 2000105 5 6 18500.00
85 31-Mar-11 2000105 5 5 1.00
86 31-Mar-11 2000105 6 4 4800000.00
87 31-Mar-11 2000105 6 3 4.00
88 31-Mar-11 2000105 7 6 98000.00
89 31-Mar-11 2000105 7 5 1.00
90 31-Mar-11 2000105 8 4 16525500.00
91 31-Mar-11 2000105 8 3 23.00
92 31-Mar-11 2000105 8 6 21325000.00
93 31-Mar-11 2000105 8 5 15.00
94 31-Mar-11 2000105 9 4 2938000.00
95 31-Mar-11 2000105 9 3 26.00
96 31-Mar-11 2000105 9 6 1167500.00
97 31-Mar-11 2000105 9 5 2.00
98 31-Mar-11 2000105 10 4 257000.00
99 31-Mar-11 2000105 10 3 2.00
100 31-Mar-11 4000100 3 3 0.00
101 31-Mar-11 4000100 3 4 18.00
102 31-Mar-11 4000100 3 5 18.00
103 31-Mar-11 4000100 4 3 0.00
104 31-Mar-11 4000100 4 4 7.00
105 31-Mar-11 4000100 4 5 7.00
106 31-Mar-11 4000100 5 3 0.00
107 31-Mar-11 4000100 5 4 10.00
108 31-Mar-11 4000100 5 5 10.00
109 31-Mar-11 4000100 6 4 8.00
110 31-Mar-11 4000100 6 5 8.00
111 31-Mar-11 4000100 7 4 27.00
112 31-Mar-11 4000100 7 5 27.00
113 31-Mar-11 4000100 8 4 41.00
114 31-Mar-11 4000100 8 5 41.00
115 31-Mar-11 4000100 9 3 0.00
116 31-Mar-11 4000100 9 4 4.00
117 31-Mar-11 4000100 9 5 4.00
118 31-Mar-11 4000100 10 3 0.00
119 31-Mar-11 4000100 10 4 2.00
120 31-Mar-11 4000100 10 5 2.00
121 31-Mar-11 4000100 11 3 0.00
122 31-Mar-11 4000100 11 4 0.00
123 31-Mar-11 4000100 11 5 0.00
124 31-Mar-11 4000100 12 3 0.00
125 31-Mar-11 4000100 12 4 4.00
126 31-Mar-11 4000100 12 5 4.00
127 31-Mar-11 4000100 13 3 0.00
128 31-Mar-11 4000100 13 4 2.00
129 31-Mar-11 4000100 13 5 2.00
130 31-Mar-11 4000100 14 3 0.00
131 31-Mar-11 4000100 14 4 0.00
132 31-Mar-11 4000100 14 5 0.00
133 31-Mar-11 4000100 15 3 0.00
134 31-Mar-11 4000100 15 4 6.00
135 31-Mar-11 4000100 15 5 6.00
136 31-Mar-11 4000100 16 5 0.00
137 31-Mar-11 4000100 17 5 0.00
138 31-Mar-11 4000100 18 5 0.00
139 31-Mar-11 4000100 19 5 212.81
140 31-Mar-11 4000100 20 5 0.00
141 31-Mar-11 4000100 21 5 39.20
142 31-Mar-11 4000100 22 5 0.00
143 31-Mar-11 4000100 23 5 0.00
144 31-Mar-11 4000100 26 5 0.00
145 31-Mar-11 4000100 27 5 92.96
146 31-Mar-11 4000100 28 5 21.07
147 31-Mar-11 4000100 29 5 0.00
148 31-Mar-11 4000100 30 5 44.24
149 31-Mar-11 4000100 31 5 10.09
150 31-Mar-11 5000100 3 3 57700.00
151 31-Mar-11 5000100 4 3 137900.00
152 31-Mar-11 5000100 5 3 41700.00
153 31-Mar-11 5000100 6 3 20900.00
154 31-Mar-11 5000100 7 3 32800.00
155 31-Mar-11 5000100 8 3 188100.00
156 31-Mar-11 5000100 9 3 372730.00
157 31-Mar-11 5000100 10 3 63100.00
158 31-Mar-11 5000100 11 3 126200.00
159 31-Mar-11 5000100 12 3 65100.00
160 31-Mar-11 5000100 13 3 157200.00
161 31-Mar-11 5000100 15 3 61100.00
162 31-Mar-11 5000100 16 3 38900.00
163 31-Mar-11 5000100 17 3 208500.00
164 31-Mar-11 5000100 18 3 1167700.00
165 31-Mar-11 5000100 19 3 67100.00
166 31-Mar-11 5000100 20 3 68100.00
167 31-Mar-11 5000100 21 3 81100.00
168 31-Mar-11 5000100 22 3 82100.00
169 31-Mar-11 5000100 24 3 90500.00
170 31-Mar-11 5000100 25 3 20800.00
171 31-Aug-11 3000107 3 4 1000.00
172 31-Aug-11 3000107 3 3 1.00
173 31-Aug-11 3000107 3 6 1000.00
174 31-Aug-11 3000107 3 5 1.00
175 31-Aug-11 3000108 8 3 4.00
176 31-Aug-11 3000108 8 4 45100.00
177 31-Aug-11 3000108 15 3 4.00
178 31-Aug-11 3000108 15 4 60000.00
179 31-Aug-11 3000108 16 3 3.00
180 31-Aug-11 3000108 16 4 71020.56
181 31-Aug-11 3000108 17 3 4.00
182 31-Aug-11 3000108 17 4 97038.72
183 31-Aug-11 3000108 18 3 4.00
184 31-Aug-11 3000108 18 4 17096.91
185 31-Aug-11 3000108 19 3 8.00
186 31-Aug-11 3000108 19 4 106569.85
187 31-Aug-11 3000108 20 3 3.00
188 31-Aug-11 3000108 20 4 30456.12
189 31-Aug-11 3000108 21 3 4.00
190 31-Aug-11 3000108 21 4 63805.83
191 31-Aug-11 3000108 22 3 4.00
192 31-Aug-11 3000108 22 4 60000.00
193 31-Aug-11 3000108 23 3 4.00
194 31-Aug-11 3000108 23 4 16208.52
195 31-Aug-11 3000108 24 3 4.00
196 31-Aug-11 3000108 24 4 57784.26
197 31-Aug-11 3000108 25 3 3.00
198 31-Aug-11 3000108 25 4 45000.00
199 31-Aug-11 3000108 26 3 4.00
200 31-Aug-11 3000108 26 4 17212.24
201 31-Aug-11 3000108 28 3 4.00
202 31-Aug-11 3000108 28 4 30567.03
203 31-Aug-11 3000108 33 3 3.00
204 31-Aug-11 3000108 33 4 30130.34
205 31-Aug-11 4000100 3 3 0.00
206 31-Aug-11 4000100 3 4 18.00
207 31-Aug-11 4000100 3 5 18.00
208 31-Aug-11 4000100 4 3 0.00
209 31-Aug-11 4000100 4 4 7.00
210 31-Aug-11 4000100 4 5 7.00
211 31-Aug-11 4000100 5 3 0.00
212 31-Aug-11 4000100 5 4 10.00
213 31-Aug-11 4000100 5 5 10.00
214 31-Aug-11 4000100 6 5 8.00
215 31-Aug-11 4000100 7 5 27.00
216 31-Aug-11 4000100 8 5 41.00
217 31-Aug-11 4000100 9 3 0.00
218 31-Aug-11 4000100 9 4 4.00
219 31-Aug-11 4000100 9 5 4.00
220 31-Aug-11 4000100 10 3 0.00
221 31-Aug-11 4000100 10 4 2.00
222 31-Aug-11 4000100 10 5 2.00
223 31-Aug-11 4000100 11 3 0.00
224 31-Aug-11 4000100 11 4 0.00
225 31-Aug-11 4000100 11 5 0.00
226 31-Aug-11 4000100 12 3 0.00
227 31-Aug-11 4000100 12 4 4.00
228 31-Aug-11 4000100 12 5 4.00
229 31-Aug-11 4000100 13 3 0.00
230 31-Aug-11 4000100 13 4 2.00
231 31-Aug-11 4000100 13 5 2.00
232 31-Aug-11 4000100 14 3 0.00
233 31-Aug-11 4000100 14 4 0.00
234 31-Aug-11 4000100 14 5 0.00
235 31-Aug-11 4000100 15 3 0.00
236 31-Aug-11 4000100 15 4 6.00
237 31-Aug-11 4000100 15 5 6.00
238 31-Aug-11 4000100 16 5 0.00
239 31-Aug-11 4000100 17 5 0.00
240 31-Aug-11 4000100 18 5 0.00
241 31-Aug-11 4000100 19 5 212.81
242 31-Aug-11 4000100 20 5 0.00
243 31-Aug-11 4000100 21 5 39.20
244 31-Aug-11 4000100 22 5 0.00
245 31-Aug-11 4000100 23 5 0.00
246 31-Aug-11 4000100 26 5 0.00
247 31-Aug-11 4000100 27 5 92.96
248 31-Aug-11 4000100 28 5 21.07
249 31-Aug-11 4000100 29 5 0.00
250 31-Aug-11 4000100 30 5 44.24
251 31-Aug-11 4000100 31 5 10.09
252 31-Aug-11 5000100 3 3 97700.00
253 31-Aug-11 5000100 4 3 100600.00
254 31-Aug-11 5000100 5 3 82700.00
255 31-Aug-11 5000100 6 3 33900.00
256 31-Aug-11 5000100 7 3 59800.00
257 31-Aug-11 5000100 8 3 196400.00
258 31-Aug-11 5000100 9 3 287600.00
259 31-Aug-11 5000100 10 3 77100.00
260 31-Aug-11 5000100 11 3 154200.00
261 31-Aug-11 5000100 12 3 79100.00
262 30-Nov-07 4000100 6 4 8.00
263 30-Nov-07 4000100 7 4 27.00
264 30-Nov-07 4000100 8 4 41.00
265 30-Nov-07 4000100 16 4 0.00
266 30-Nov-07 4000100 18 4 0.00
267 30-Nov-07 4000100 19 4 0.03
268 30-Nov-07 4000100 20 4 0.00
269 30-Nov-07 4000100 21 4 0.01
270 30-Nov-07 4000100 26 4 0.00
271 30-Nov-07 4000100 27 4 0.00
272 30-Nov-07 4000100 28 4 0.00
273 30-Nov-07 4000100 29 4 0.00
274 30-Nov-07 4000100 30 4 0.00
275 30-Nov-07 4000100 31 4 0.00
276 31-Aug-11 1000101 6 2 1000.00
277 31-Aug-11 1000101 6 1 5.00
278 31-Aug-11 1000101 6 3 50800.00
279 31-Aug-11 1000101 7 2 4000.00
280 31-Aug-11 1000101 7 1 11.00
281 31-Aug-11 1000101 7 3 129828.26
282 31-Aug-11 1000101 12 2 14000.00
283 31-Aug-11 1000101 12 1 27.00
284 31-Aug-11 1000101 12 3 244433.51
285 31-Aug-11 1000101 30 2 0.00
286 31-Aug-11 1000101 30 1 6.00
287 31-Aug-11 1000101 30 3 1415254.00
288 31-Aug-11 1000101 39 2 0.00
289 31-Aug-11 1000101 39 1 8.00
290 31-Aug-11 1000101 39 3 5300.00
291 31-Aug-11 3000103 6 6 46.00
292 31-Aug-11 3000103 6 7 585509.54
293 31-Aug-11 3000104 3 13 7.00
294 31-Aug-11 3000104 3 14 47721.81
295 31-Aug-11 3000104 4 13 6.00
296 31-Aug-11 3000104 4 14 80805.83
297 31-Aug-11 3000104 5 13 6.00
298 31-Aug-11 3000104 5 14 46569.72
299 31-Aug-11 3000104 6 13 7.00
300 31-Aug-11 3000104 6 14 112907.50
301 31-Aug-11 3000104 7 13 6.00
302 31-Aug-11 3000104 7 14 75500.00
303 31-Aug-11 3000104 8 13 13.00
304 31-Aug-11 3000104 8 14 123471.04
305 31-Aug-11 3000104 9 13 6.00
306 31-Aug-11 3000104 9 14 101401.56
307 31-Aug-11 3000104 20 13 6.00
308 31-Aug-11 3000104 20 14 60489.10
309 31-Aug-11 3000104 22 13 6.00
310 31-Aug-11 3000104 22 14 75567.03
311 31-Aug-11 3000104 23 13 6.00
312 31-Aug-11 3000104 23 14 61339.10
313 31-Aug-11 3000104 24 13 20.00
314 31-Aug-11 3000104 24 14 94688.52
315 31-Aug-11 3000104 25 13 45.00
316 31-Aug-11 3000104 25 14 5772805.47
317 31-Aug-11 3000104 26 13 6.00
318 31-Aug-11 3000104 26 14 76462.24
319 31-Aug-11 3000104 27 13 16.00
320 31-Aug-11 3000104 27 14 63742.32
321 31-Aug-11 3000104 31 13 6.00
322 31-Aug-11 3000104 31 14 60141.20
323 31-Aug-11 3000105 7 3 14.00
324 31-Aug-11 3000105 7 4 162632.63
325 31-Aug-11 3000105 7 5 11.00
326 31-Aug-11 3000105 7 6 92950.93
327 31-Aug-11 3000105 8 3 15.00
328 31-Aug-11 3000105 8 4 128938.59
329 31-Aug-11 3000105 8 5 14.00
330 31-Aug-11 3000105 8 6 1491571.98
331 31-Aug-11 3000105 9 3 14.00
332 31-Aug-11 3000105 9 4 1666338.36
333 31-Aug-11 3000105 9 5 13.00
334 31-Aug-11 3000105 9 6 1377521.70
335 31-Aug-11 5000100 13 3 91100.00
336 31-Aug-11 5000100 15 3 75100.00
337 31-Aug-11 5000100 16 3 60100.00
338 31-Aug-11 5000100 17 3 448700.00
339 31-Aug-11 5000100 18 3 1117400.00
340 31-Aug-11 5000100 19 3 81100.00
341 31-Aug-11 5000100 20 3 82100.00
342 31-Aug-11 5000100 24 3 157500.00
343 31-Aug-11 5000100 25 3 48800.00
344 31-Aug-11 7000001 19 1
345 31-Aug-11 7000001 32 1
346 31-Aug-11 7000001 40 1
347 31-Aug-11 7000001 42 1
348 31-Aug-11 7000001 43 1
349 31-Aug-11 7000001 45 1
350 31-Aug-11 7000001 48 1 0.15
351 31-Aug-11 7000001 50 1 0.15
352 31-Aug-11 7000001 51 1
353 31-Aug-11 7000001 54 1
354 31-Aug-11 7000001 55 1
355 31-Aug-11 7000001 66 1 4417.45
356 31-Aug-11 7000001 66 2
357 31-Aug-11 7000002 12 2
358 31-Aug-11 7000002 12 3
359 31-Aug-11 7000002 13 2
360 31-Aug-11 7000002 13 3
361 31-Aug-11 7000002 16 2
362 31-Aug-11 7000002 16 3
363 31-Aug-11 7000002 17 2
364 31-Aug-11 7000002 17 3
365 31-Aug-11 7000002 18 2
366 31-Aug-11 7000002 18 3
367 31-Aug-11 7000002 19 2
368 31-Aug-11 7000002 19 3
369 31-Aug-11 7000002 20 2
370 31-Aug-11 7000002 20 3
371 31-Aug-11 7000002 22 2
372 31-Aug-11 7000002 23 2
373 31-Aug-11 7000002 23 3 14900.00
374 31-Aug-11 7000002 27 2
375 31-Aug-11 7000002 27 3
376 31-Aug-11 7000002 28 2
377 31-Aug-11 7000002 29 2
378 31-Aug-11 7000002 30 2
379 31-Aug-11 7000002 30 3 59100.00
380 31-Aug-11 7000002 32 2
381 31-Aug-11 7000002 32 3 56100.00
382 31-Aug-11 7000002 34 2
383 31-Aug-11 7000002 34 3 373880.00
384 31-Aug-11 7000002 35 2
385 31-Aug-11 7000002 35 3 119363875.00
386 31-Aug-11 7000002 36 2
387 31-Aug-11 7000002 36 3 9629230.00
388 31-Aug-11 7000002 38 2
389 31-Aug-11 7000002 38 3 2287214.00
390 31-Aug-11 7000002 40 2
391 31-Aug-11 7000002 40 3 1935671.00
392 31-Aug-11 7000002 41 2
393 31-Aug-11 7000002 41 3 2515010.00
394 31-Aug-11 7000002 42 2
395 31-Aug-11 7000002 42 3 36348309.00
396 31-Aug-11 7000002 43 2
397 31-Aug-11 7000002 43 3 3212922.00 -
Dear All
I have received a common error when creating a relationship:
The relationship cannot be created because each column contains duplicate values. Select at least one column that contains only unique values.
The issue is strange because I am sure that there are no duplicate values in my table! I checked it.
What I did was create a single "Append column" by appending a few columns of articles from different databases using Power Query. After that I removed duplicates.
The final column with unique values was added to the PowerPivot model as a new table. There are no duplicates or blank values.
Then I tried to create the relationship between the appended column and the other tables which I had used to build it.
So it looks like this:
Details of error:
Error Message:
============================
The relationship cannot be created because each column contains duplicate values. Select at least one column that contains only unique values.
============================
Call Stack:
============================
at Microsoft.AnalysisServices.Common.RelationshipController.CreateRelationship(DataModelingColumn sourceColumn, DataModelingColumn relatedColumn)
at Microsoft.AnalysisServices.Common.RelationshipController.CreateRelationship(String fkTableName, String fkColumnName, String pkTableName, String pkColumnName)
at Microsoft.AnalysisServices.Common.SandboxEditor.erDiagram_CreateRelationshipCallBack(Object sender, ERDiagramCreateRelationshipEventArgs e)
at Microsoft.AnalysisServices.Common.ERDiagram.ERDiagramActionCreateRelationship.Do(IDiagramActionInstance actionInstance)
============================
What could be the reason for this issue?
Thanks recio, but your solution does not work.
I have found the issue but not a solution.
The problem is that there is a space at the end of some article names.
In my Database 1 I have an "a" article but
in my Database 2 I have an "a_"
where "_" is an invisible space character.
So Power Query and Excel's "Remove Duplicates" function did not find the duplicates, but PowerPivot sees them.
My data model looks like the example below.
So now my question is: how do I automatically get rid of the space character so that I can create the relationship?
As far as I can see, Power Query does not have such an option. Maybe a lookup formula in PowerPivot? But what should it look like?
This is what my "Append column" should look like: -
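The mismatch is easy to reproduce outside PowerPivot. A minimal Python sketch (the values "a", "b", etc. are stand-ins for the article names) shows why a value with a trailing space survives deduplication and breaks the unique-key requirement, and why trimming first fixes it:

```python
db1 = ["a", "b", "c"]      # article names from Database 1
db2 = ["a ", "b", "d"]     # "a " carries an invisible trailing space

# Deduplicate without trimming: "a" and "a " are different strings,
# so the "unique" key column still contains two versions of the article
naive_key = sorted(set(db1 + db2))

# Trim before deduplicating: only one "a" survives, and the key is truly unique
trimmed_key = sorted(set(name.strip() for name in db1 + db2))
```

In the model itself the equivalent fix is a trim step before Remove Duplicates: Power Query does have one (Text.Trim, surfaced as Transform > Format > Trim), and DAX has a TRIM() function usable in a calculated column. Either way, it should be applied to both sides of the relationship.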
Retrieving duplicate values from a table
I have a table that has 80,000 rows. One column contains text identifiers as varchar2. They are almost all unique: a select count(distinct id) statement returns 79,980. Is there a way to return only the remaining 20?
To find those values you can do this. It would not return the full rows, only the 20 or fewer values that occur more than once.
SELECT textColumn, COUNT(*) AS cnt_occurences
FROM yourTable
GROUP BY textColumn
HAVING COUNT(*) > 1; -
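The same GROUP BY / HAVING pattern can be tried at small scale with SQLite from Python (table and column names follow the answer above; the sample data is made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE yourTable (textColumn TEXT)")
conn.executemany(
    "INSERT INTO yourTable VALUES (?)",
    [("x",), ("y",), ("y",), ("z",), ("z",), ("z",)],
)

# Only the values that occur more than once come back, with their counts
dupes = conn.execute(
    """
    SELECT textColumn, COUNT(*) AS cnt_occurences
    FROM yourTable
    GROUP BY textColumn
    HAVING COUNT(*) > 1
    """
).fetchall()
```

To get the full duplicated rows rather than just the values, the HAVING query can be wrapped in an IN subquery, or (in Oracle) a filter on the analytic COUNT(*) OVER (PARTITION BY textColumn) can be used.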
Hi all,
I have created a form with one data block MATURED_FD_DTL which looks like below:
ACCT_FD_NO
CUST_CODE
FD_AMT
FD_INT_BAL
TDS
CHQ_NO
ADD_FD_AMT
P_SAP_CODE
P_TYPE
CREATE TABLE MATURED_FD_DTL (
ACCT_FD_NO VARCHAR2(17 BYTE) NOT NULL,
CUST_CODE NUMBER(9),
FD_AMT NUMBER(15),
FD_INT_BAL NUMBER(15),
TDS NUMBER(15),
CHQ_NO NUMBER(10),
ADD_FD_AMT NUMBER(15),
P_SAP_CODE NUMBER(10),
P_TYPE VARCHAR2(1 BYTE)
);
For MATURED_FD_DTL.ACCT_FD_NO, Trigger: KEY-NEXT-ITEM, I have written the following code:
DECLARE
V1 VARCHAR2(17);
BEGIN
V1 := :MATURED_FD_DTL.ACCT_FD_NO;
MESSAGE('V1: '||V1);
MESSAGE(' ');
GO_ITEM('MATURED_FD_DTL.CUST_CODE');
END;
This is just a dummy code.
Whenever I enter a value in the ACCT_FD_NO field, it pops up a message saying "V1: <value of ACCT_FD_NO>".
So I want to store that value,
compare it with the next record, and if they are equal, pop up a message saying a duplicate value was entered and must not be allowed, at form level.
So how can I compare the ACCT_FD_NO value between FIRST_RECORD and NEXT_RECORD?
Help me.
Thank you.
Oracle Forms 6i.
Oracle 9i.
Thank you HamidHelal.
Actually, before posting the code, I went through the first link you mentioned. I tried it too; it is quite a tedious procedure, and I didn't get the proper result, maybe I made some mistake. Anyway, can you tell me how to check in the database table whether a value exists, and if it is found, show a message and raise form trigger failure, something like below.
IF :MATURED_FD_DTL.ACCT_FD_NO EXISTS
THEN MESSAGE('YOU HAVE ENTERED AN EXISTING OR DUPLICATE VALUE');
MESSAGE(' ');
RAISE FORM_TRIGGER_FAILURE;
END IF;
OR
IF :MATURED_FD_DTL.ACCT_FD_NO FOUND
THEN MESSAGE('YOU HAVE ENTERED AN EXISTING OR DUPLICATE VALUE');
MESSAGE(' ');
RAISE FORM_TRIGGER_FAILURE;
END IF;
Is there any equivalent in Oracle Forms 6i? Please let me know how I can do this. -
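Forms has no EXISTS/FOUND syntax like that; the usual pattern is a WHEN-VALIDATE-ITEM trigger that does a SELECT COUNT(*) INTO a local variable and raises FORM_TRIGGER_FAILURE when the count is non-zero. The control flow, sketched in Python (the record list stands in for the data block; names and sample values are illustrative):

```python
entered_fd_nos = []   # stands in for the ACCT_FD_NO values already in the block

def validate_fd_no(acct_fd_no):
    """Mimic a WHEN-VALIDATE-ITEM duplicate check: reject a repeated value."""
    if acct_fd_no in entered_fd_nos:
        # In Forms: MESSAGE(...); RAISE FORM_TRIGGER_FAILURE;
        raise ValueError("YOU HAVE ENTERED AN EXISTING OR DUPLICATE VALUE")
    entered_fd_nos.append(acct_fd_no)

validate_fd_no("FD0001")
validate_fd_no("FD0002")
try:
    validate_fd_no("FD0001")   # duplicate entry is rejected
    duplicate_rejected = False
except ValueError:
    duplicate_rejected = True
```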
Colour duplicate values in html output
Hello,
I am looking to colour duplicate values in an html output file.
I have written a folder comparison ps script which shows output as below:
Name Length Version
Name, Length and Version are the column names.
There are many duplicate entries in the Name column which I need to highlight with a single colour. I have sorted the output by Name in alphabetical order.
I just need to highlight all duplicate values with a particular colour.
Thanks in advance.
Posting my script here:
# Get Start Time
$startDTM = (Get-Date)
$a = "<style>"
$a = $a + "BODY{background-color:peachpuff;}"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
$a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:thistle}"
$a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:PaleGoldenrod}"
$a = $a + "</style>"
$folderReference = Read-Host "Enter Reference Folder Name"
$folderDifference = Read-Host "Enter Difference Folder Name"
$FolderReferenceContents = Get-ChildItem $folderReference -Recurse |
where-object {-not $_.PSIsContainer}
$FolderDifferenceContents = Get-ChildItem $folderDifference -Recurse |
where-object {-not $_.PSIsContainer}
#get only files that are in the reference folder but not the difference folder
$one = Compare-Object -ReferenceObject $FolderReferenceContents `
-DifferenceObject $FolderDifferenceContents -Property Name, Length -PassThru |
where-object { $_.SideIndicator -eq '<='} | select Name, Length, LastWriteTime, VersionInfo
#get only files that are in the difference folder but not the reference folder
$two = Compare-Object -ReferenceObject $FolderReferenceContents `
-DifferenceObject $FolderDifferenceContents -Property Name, Length -PassThru |
where-object { $_.SideIndicator -eq '=>'} | select Name, Length, LastWriteTime, VersionInfo
$three = $one + $two | sort-object -property Name | ConvertTo-HTML -Fragment
# Get End Time
$endDTM = (Get-Date)
# Echo Time elapsed
$four = "Elapsed Time: $(($endDTM-$startDTM).totalseconds) seconds"
ConvertTo-HTML -Head $a -Body "$one $two $three $four" -Title "Output" | Out-File C:\output.html
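One way to do the highlighting is to count the Name occurrences first and then style the duplicated cells while building the table rows yourself, since ConvertTo-HTML has no per-cell styling. A Python sketch of that idea (column names match the script; the colour and sample rows are made up):

```python
from collections import Counter
from html import escape

def rows_to_html(rows, dup_color="#FFB6C1"):
    """Render (name, length, version) rows as an HTML table fragment,
    colouring the Name cell of every name that appears more than once."""
    counts = Counter(name for name, _, _ in rows)
    lines = ["<table>",
             "<tr><th>Name</th><th>Length</th><th>Version</th></tr>"]
    for name, length, version in sorted(rows):   # alphabetical by Name
        style = f' style="background-color:{dup_color}"' if counts[name] > 1 else ""
        lines.append(f"<tr><td{style}>{escape(name)}</td>"
                     f"<td>{length}</td><td>{escape(version)}</td></tr>")
    lines.append("</table>")
    return "\n".join(lines)

html = rows_to_html([("app.dll", 100, "1.0"),
                     ("app.dll", 120, "1.1"),
                     ("readme.txt", 5, "")])
```

In PowerShell the same approach would mean grouping the combined results on Name (Group-Object Name) and emitting the td cells with a style attribute where a group has more than one member.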