Remove duplicate record with a concurrent program
Hi
I have two nearly similar records and want to remove one of them. Is there a concurrent program to do this?
Any suggestions?
Thanks
Hi,
It should be a concurrent program for deleting (removing) duplicate records; currently I am searching for this program.
What is the point of removing this duplicate record using a concurrent program (not manually or via an API)? I believe there is no standard concurrent program to achieve this.
Thanks,
Hussein
Similar Messages
-
Locate and remove duplicate records in an InfoCube.
Hi!!
we have found that InfoCube 0PUR_C01 contains duplicate records for April 2008; approx. 1.5 lakh (150,000) records were extracted into this InfoCube, and similar situations may occur in subsequent months.
How do I locate these records and remove them from the InfoCube?
How do I ensure that duplicate records are not extracted into the InfoCube?
All answers/ links are welcome!!
Yours Truly
K Sengupta
First:
1. How do I locate duplicate records in an InfoCube, other than downloading all the records to an Excel file and using Excel functionality to locate duplicates?
A duplicate record as such would not exist: records are sent to a cube with + and - signs that summarize the data accordingly, so your search for duplicate data becomes that much more troublesome.
If you have a DSO to load from, delete the data for that month and reload if possible; this is quicker and cleaner than removing duplicate records.
If you had
ABC|100 in your DSO and it got doubled
it would be
ABC|+100
ABC|+100
against different requests in the cube - and added to this will be your correct deltas as well. -
How to remove duplicate records from output?
How to remove duplicate records from output? I used DELETE ADJACENT DUPLICATES but the duplicate records come back again. Please suggest.
hi shruthi,
thanks for your answer, but the duplicate records are coming again.
Here is my code, please check it:
*& Report ZCRM_TROUBLE_TICKET
REPORT zcrm_trouble_ticket.
TYPES : BEGIN OF ty_qmih,
qmnum TYPE qmnum,
equnr TYPE equnr,
iloan TYPE iloan,
ausvn TYPE ausvn,
ausbs TYPE ausbs,
auztv TYPE auztv,
auztb TYPE auztb,
iwerk TYPE iwerk,
END OF ty_qmih,
BEGIN OF ty_qmel,
qmnum TYPE qmnum,
qmtxt TYPE qmtxt,
indtx TYPE indltx,
priok TYPE priok,
strmn TYPE strmn,
strur TYPE strur,
ltrmn TYPE ltrmn,
ltrur TYPE ltrur,
objnr TYPE qmobjnr,
arbpl TYPE lgwid,
vkorg TYPE vkorg,
vtweg TYPE vtweg,
spart TYPE spart,
END OF ty_qmel,
BEGIN OF ty_ihpa,
parnr TYPE i_parnr,
parvw TYPE parvw,
objnr TYPE qmobjnr,
END OF ty_ihpa,
BEGIN OF ty_crhd,
arbpl TYPE arbpl,
objid TYPE cr_objid,
END OF ty_crhd,
BEGIN OF ty_crtx,
ktext TYPE cr_ktext,
objid TYPE cr_objid,
END OF ty_crtx,
BEGIN OF ty_qmfe,
fecod TYPE fecod,
fegrp TYPE fegrp,
qmnum TYPE qmnum,
END OF ty_qmfe,
BEGIN OF ty_qmur,
urcod TYPE urcod,
urgrp TYPE urgrp,
urtxt TYPE urstx,
qmnum TYPE qmnum,
END OF ty_qmur,
BEGIN OF ty_iloa,
tplnr TYPE tplnr,
iloan TYPE iloan,
END OF ty_iloa,
BEGIN OF ty_output,
qmnum TYPE qmnum,
equnr TYPE equnr,
iloan TYPE iloan,
ausvn TYPE ausvn,
ausbs TYPE ausbs,
auztv TYPE auztv,
auztb TYPE auztb,
iwerk TYPE iwerk,
qmtxt TYPE qmtxt,
indtx TYPE indltx,
priok TYPE priok,
strmn TYPE strmn,
strur TYPE strur,
ltrmn TYPE ltrmn,
ltrur TYPE ltrur,
objnr TYPE qmobjnr,
arbpl TYPE lgwid,
vkorg TYPE vkorg,
vtweg TYPE vtweg,
spart TYPE spart,
parnr TYPE i_parnr,
parvw TYPE parvw,
objid TYPE cr_objid,
arbpl1 TYPE arbpl,  " work-center name; a second ARBPL component would not compile
ktext TYPE cr_ktext,
fecod TYPE fecod,
fegrp TYPE fegrp,
urcod TYPE urcod,
urgrp TYPE urgrp,
urtxt TYPE urstx,
tplnr TYPE tplnr,
END OF ty_output.
DATA : it_qmih TYPE STANDARD TABLE OF ty_qmih,
it_qmel TYPE STANDARD TABLE OF ty_qmel,
it_ihpa TYPE STANDARD TABLE OF ty_ihpa,
it_crhd TYPE STANDARD TABLE OF ty_crhd,
it_crtx TYPE STANDARD TABLE OF ty_crtx,
it_qmfe TYPE STANDARD TABLE OF ty_qmfe,
it_qmur TYPE STANDARD TABLE OF ty_qmur,
it_iloa TYPE STANDARD TABLE OF ty_iloa,
it_output TYPE STANDARD TABLE OF ty_output,
wa_qmih TYPE ty_qmih,
wa_qmel TYPE ty_qmel,
wa_ihpa TYPE ty_ihpa,
wa_crhd TYPE ty_crhd,
wa_crtx TYPE ty_crtx,
wa_qmfe TYPE ty_qmfe,
wa_qmur TYPE ty_qmur,
wa_iloa TYPE ty_iloa,
wa_output TYPE ty_output.
INITIALIZATION.
REFRESH : it_qmih,
it_qmel,
it_ihpa,
it_crhd,
it_crtx,
it_qmfe,
it_qmur,
it_iloa,
it_output.
CLEAR: wa_qmih,
wa_qmel,
wa_ihpa,
wa_crhd,
wa_crtx,
wa_qmfe,
wa_qmur,
wa_iloa,
wa_output.
start-of-selection.
SELECT qmnum
equnr
iloan
ausvn
ausbs
auztv
auztb
iwerk
FROM qmih
INTO TABLE it_qmih.
* DELETE ADJACENT DUPLICATES only removes neighbouring rows,
* so sort by the full comparison key first.
SORT it_qmih BY qmnum equnr iloan ausvn ausbs auztv auztb iwerk.
DELETE ADJACENT DUPLICATES FROM it_qmih COMPARING qmnum equnr iloan ausvn ausbs auztv auztb iwerk.
SELECT qmnum
qmtxt
indtx
priok
strmn
strur
ltrmn
ltrur
objnr
arbpl
vkorg
vtweg
spart
FROM qmel
INTO TABLE it_qmel
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
SORT it_qmel BY qmnum.
* QMNUM is the primary key of QMEL, so comparing it alone is sufficient.
DELETE ADJACENT DUPLICATES FROM it_qmel COMPARING qmnum.
IF it_qmel IS NOT INITIAL.
SELECT parnr
parvw
objnr
FROM ihpa
INTO TABLE it_ihpa
FOR ALL ENTRIES IN it_qmel
WHERE objnr = it_qmel-objnr.
ENDIF.
SORT it_ihpa BY objnr parvw parnr.
DELETE ADJACENT DUPLICATES FROM it_ihpa COMPARING objnr parvw parnr.
IF it_qmel IS NOT INITIAL.
SELECT arbpl
objid
FROM crhd
INTO TABLE it_crhd
FOR ALL ENTRIES IN it_qmel
WHERE objid = it_qmel-arbpl.
ENDIF.
SORT it_crhd BY arbpl objid.
DELETE ADJACENT DUPLICATES FROM it_crhd COMPARING arbpl objid.
IF it_qmel IS NOT INITIAL.
SELECT ktext
objid
FROM crtx
INTO TABLE it_crtx
FOR ALL ENTRIES IN it_crhd
WHERE objid = it_crhd-objid.
ENDIF.
SORT it_crtx BY objid ktext.
DELETE ADJACENT DUPLICATES FROM it_crtx COMPARING objid ktext.
IF it_qmih IS NOT INITIAL.
SELECT fecod
fegrp
qmnum
FROM qmfe
INTO TABLE it_qmfe
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
ENDIF.
SORT it_qmfe BY qmnum fecod fegrp.
DELETE ADJACENT DUPLICATES FROM it_qmfe COMPARING qmnum fecod fegrp.
IF it_qmih IS NOT INITIAL.
SELECT urcod
urgrp
urtxt
qmnum
FROM qmur
INTO TABLE it_qmur
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
ENDIF.
SORT it_qmur BY qmnum urcod urgrp urtxt.
DELETE ADJACENT DUPLICATES FROM it_qmur COMPARING qmnum urcod urgrp urtxt.
IF it_qmih IS NOT INITIAL.
SELECT tplnr
iloan
FROM iloa
INTO TABLE it_iloa
FOR ALL ENTRIES IN it_qmih
WHERE iloan = it_qmih-iloan.
ENDIF.
SORT it_iloa BY iloan tplnr.
DELETE ADJACENT DUPLICATES FROM it_iloa COMPARING iloan tplnr.
LOOP AT it_qmih INTO wa_qmih.
* Build ONE output row per notification. Appending a partial row after
* every READ (as before) is what produced the apparent duplicates.
  CLEAR: wa_output, wa_qmel, wa_ihpa, wa_crhd, wa_crtx, wa_qmfe, wa_qmur, wa_iloa.
  wa_output-qmnum = wa_qmih-qmnum.
  wa_output-equnr = wa_qmih-equnr.
  wa_output-iloan = wa_qmih-iloan.
  wa_output-ausvn = wa_qmih-ausvn.
  wa_output-ausbs = wa_qmih-ausbs.
  wa_output-auztv = wa_qmih-auztv.
  wa_output-auztb = wa_qmih-auztb.
  wa_output-iwerk = wa_qmih-iwerk.
  READ TABLE it_qmel INTO wa_qmel WITH KEY qmnum = wa_qmih-qmnum.
  IF sy-subrc = 0.
    wa_output-qmtxt = wa_qmel-qmtxt.
    wa_output-indtx = wa_qmel-indtx.
    wa_output-priok = wa_qmel-priok.
    wa_output-strmn = wa_qmel-strmn.
    wa_output-strur = wa_qmel-strur.
    wa_output-ltrmn = wa_qmel-ltrmn.
    wa_output-ltrur = wa_qmel-ltrur.
    wa_output-objnr = wa_qmel-objnr.
    wa_output-arbpl = wa_qmel-arbpl.
    wa_output-vkorg = wa_qmel-vkorg.
    wa_output-vtweg = wa_qmel-vtweg.
    wa_output-spart = wa_qmel-spart.
  ENDIF.
  READ TABLE it_ihpa INTO wa_ihpa WITH KEY objnr = wa_qmel-objnr.
  IF sy-subrc = 0.
    wa_output-parnr = wa_ihpa-parnr.
    wa_output-parvw = wa_ihpa-parvw.
  ENDIF.
  READ TABLE it_crhd INTO wa_crhd WITH KEY objid = wa_qmel-arbpl.
  IF sy-subrc = 0.
    wa_output-arbpl1 = wa_crhd-arbpl.
  ENDIF.
  READ TABLE it_crtx INTO wa_crtx WITH KEY objid = wa_crhd-objid.
  IF sy-subrc = 0.
    wa_output-ktext = wa_crtx-ktext.
  ENDIF.
  READ TABLE it_qmfe INTO wa_qmfe WITH KEY qmnum = wa_qmih-qmnum.
  IF sy-subrc = 0.
    wa_output-fecod = wa_qmfe-fecod.
    wa_output-fegrp = wa_qmfe-fegrp.
  ENDIF.
  READ TABLE it_qmur INTO wa_qmur WITH KEY qmnum = wa_qmih-qmnum.
  IF sy-subrc = 0.
    wa_output-urcod = wa_qmur-urcod.
    wa_output-urgrp = wa_qmur-urgrp.
    wa_output-urtxt = wa_qmur-urtxt.
  ENDIF.
  READ TABLE it_iloa INTO wa_iloa WITH KEY iloan = wa_qmih-iloan.
  IF sy-subrc = 0.
    wa_output-tplnr = wa_iloa-tplnr.
  ENDIF.
  APPEND wa_output TO it_output.
ENDLOOP.
* Sort the whole line so identical rows become adjacent before deleting.
SORT it_output.
DELETE ADJACENT DUPLICATES FROM it_output COMPARING ALL FIELDS.
* Pattern-generated template; commented out in full because its
* parameters were never filled in:
*CALL FUNCTION 'STATUS_TEXT_EDIT'
*  EXPORTING
*    client            = sy-mandt
*    flg_user_stat     = ' '
*    objnr             =
*    only_active       = 'X'
*    spras             = sy-langu
*    bypass_buffer     = ' '
*  IMPORTING
*    anw_stat_existing =
*    e_stsma           =
*    line              =
*    user_line         =
*    stonr             =
*  EXCEPTIONS
*    object_not_found  = 1
*    OTHERS            = 2.
*IF sy-subrc <> 0.
*  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
*          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
*ENDIF.
* Likewise a template, commented out in full:
*CALL FUNCTION 'READ_TEXT'
*  EXPORTING
*    client                  = sy-mandt
*    id                      =
*    language                =
*    name                    =
*    object                  =
*    archive_handle          = 0
*    local_cat               = ' '
*  IMPORTING
*    header                  =
*  TABLES
*    lines                   =
*  EXCEPTIONS
*    id                      = 1
*    language                = 2
*    name                    = 3
*    not_found               = 4
*    object                  = 5
*    reference_check         = 6
*    wrong_access_to_archive = 7
*    OTHERS                  = 8.
*IF sy-subrc <> 0.
*  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
*          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
*ENDIF.
* List output kept for reference, commented out in favour of GUI_DOWNLOAD:
*LOOP AT it_output INTO wa_output.
*  WRITE : / wa_output-qmnum,
*            wa_output-equnr,
*            wa_output-iloan,
*            wa_output-ausvn,
*            wa_output-ausbs,
*            wa_output-auztv,
*            wa_output-auztb,
*            wa_output-qmtxt,
*            wa_output-indtx,
*            wa_output-strmn,
*            wa_output-strur,
*            wa_output-ltrmn,
*            wa_output-ltrur,
*            wa_output-objnr,
*            wa_output-arbpl,
*            wa_output-parnr,
*            wa_output-parvw,
*            wa_output-objid,
*            wa_output-ktext,
*            wa_output-fecod,
*            wa_output-fegrp,
*            wa_output-urcod,
*            wa_output-urgrp,
*            wa_output-urtxt,
*            wa_output-tplnr.
*ENDLOOP.
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
*   bin_filesize              =
    filename                  = 'E:\CRM1.TXT'
    filetype                  = 'ASC'
*   append                    = ' '
    write_field_separator     = 'X'  " flag only: GUI_DOWNLOAD separates fields with tabs
*   header                    = '00'
*   trunc_trailing_blanks     = ' '
*   write_lf                  = 'X'
*   col_select                = ' '
*   col_select_mask           = ' '
*   dat_mode                  = ' '
*   confirm_overwrite         = ' '
*   no_auth_check             = ' '
*   codepage                  = ' '
*   ignore_cerr               = abap_true
*   replacement               = '#'
*   write_bom                 = ' '
*   trunc_trailing_blanks_eol = 'X'
*   wk1_n_format              = ' '
*   wk1_n_size                = ' '
*   wk1_t_format              = ' '
*   wk1_t_size                = ' '
*   write_lf_after_last_line  = abap_true
* IMPORTING
*   filelength                =
  TABLES
    data_tab                  = it_output
*   fieldnames                =
  EXCEPTIONS
    file_write_error          = 1
    no_batch                  = 2
    gui_refuse_filetransfer   = 3
    invalid_type              = 4
    no_authority              = 5
    unknown_error             = 6
    header_not_allowed        = 7
    separator_not_allowed     = 8
    filesize_not_allowed      = 9
    header_too_long           = 10
    dp_error_create           = 11
    dp_error_send             = 12
    dp_error_write            = 13
    unknown_dp_error          = 14
    access_denied             = 15
    dp_out_of_memory          = 16
    disk_full                 = 17
    dp_timeout                = 18
    file_not_found            = 19
    dataprovider_exception    = 20
    control_flush_error       = 21
    OTHERS                    = 22.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF. -
Duplicate record with same primary key in Fact table
Hi all,
Can the fact table have duplicate records with the same primary key? When I checked a cube I could see records with the same primary-key combination but different key figure values. My cube has 6 dimensions (including Time, Unit and DP) and 2 key figures, so 6 fields combine to form the composite primary key of the fact table. When I checked the records in SE16 I could see duplicate records with the same primary key. There are no parallel loads happening for the cube.
BW system version is 3.1
Database: Oracle 10.2
I am not sure how is this possible.
Regards,
PM
Hi Krish,
I checked the data packet dimension as well. Both records have the same dimension ID (141). Except for the key figure value there is no other difference in the fact-table records. I know this is against the basic DBMS primary-key rule, but I do have records like this in the cube.
Can this situation arise when the same record is in different data packets of the same request?
Thx,
PM
-
Removing Duplicate Records and Merging with the previous record
Hi
Can any one help me in doing this..
I want to delete a duplicate record from the current table. If there is another record of the same kind, compare the two records, fill in the columns that are empty in one from the other, merge them, and remove the duplicate record. Before deleting the record, move it into a backup table.
with regards
srikanth
I understand conceptually what you're after. At a detailed level, though, you're going to have to enunciate some specific rules to enforce.
1) What makes row A a duplicate of row B? Is it just the Name column? Some combination of columns? Something else?
2) What is the exact rule for merging rows? Among other questions
- If you have multiple non-NULL values for a particular column, which value wins? If you have one row that says Manchester, NH and one that says Nashua, NH, which would you keep?
- Does the merge logic depend on the particular column (i.e. perhaps you want the city & state to come from one row whereas first and last name can come from different rows)
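Once those rules are pinned down, the merge can be sketched in SQL. A minimal sketch, assuming a hypothetical PEOPLE(name, city, state) table where NAME alone defines a duplicate, "any non-NULL value wins" is the merge rule, and a backup table PEOPLE_BACKUP with the same columns already exists:

```sql
-- 1) Back up every row that belongs to a duplicate group.
INSERT INTO people_backup
SELECT p.*
FROM   people p
WHERE  p.name IN (SELECT name FROM people
                  GROUP BY name HAVING COUNT(*) > 1);

-- 2) Collapse each group to one merged row. MAX() ignores NULLs, so a
--    non-NULL city/state survives; with two different non-NULL values it
--    keeps the alphabetically larger one -- an arbitrary tie-break rule
--    that you would replace with whatever rule you settle on.
CREATE TABLE people_merged AS
SELECT name,
       MAX(city)  AS city,
       MAX(state) AS state
FROM   people
GROUP BY name;
```

The choice of MAX() here is exactly the kind of rule Justin is asking about: it is one possible answer, not the only one.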
Justin -
Remove duplicate records in Live Office, caused by CR Groups
hello all
I have a CR with groups. All works well until I use the report in Live Office, where it duplicates the group data for each of the detail records.
I have removed the details from the CR report, leaving only the group data, but it still happens.
anyone have a work around ?
thanks
g
Hi,
First, select the report name from the left panel and check whether the option appears or not; or try right-clicking on any report cell, then go to Live Office and Object Properties.
Second, are you getting duplicate records in this particular report or in all reports? And how many Highlighting Experts are you using in this report?
Thanks,
Amit -
Removing duplicate records from a column
I ran this query to populate a field with random numbers, but it keeps populating some duplicate records. Any idea how I can remove the duplicates?
UPDATE APRFIL
SET ALTATH = CONVERT(int, RAND(CHECKSUM(NEWID())) * 10000);
Prashanth,
You are correct that the update does create non-duplicate records; it just doesn't insert them into the ALTATH field. I verified by running SELECT ALTATH FROM APRFIL, and the results are not the same records displayed after the update. I hope I am clear enough; thanks for your efforts.
Can you give an example of a case where it doesn't work? It may be that the values were not actually duplicates, due to the presence of some unprintable characters such as spaces.
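If the real goal is values that can never collide, one option (a sketch for the poster's APRFIL/ALTATH setup, not a tested fix) is to number the rows in a random order instead of drawing independent random numbers:

```sql
-- ROW_NUMBER() over a random ordering gives each row a distinct integer;
-- the ordering is random but the assigned values are guaranteed unique.
WITH numbered AS (
    SELECT ALTATH,
           ROW_NUMBER() OVER (ORDER BY NEWID()) AS rn
    FROM   APRFIL
)
UPDATE numbered
SET    ALTATH = rn;
```

The trade-off is that the values run 1..N rather than being spread over 0-9999; scale or offset rn if the magnitude matters.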
Please Mark This As Answer if it solved your issue
Please Vote This As Helpful if it helps to solve your issue
Visakh
My Wiki User Page
My MSDN Page
My Personal Blog
My Facebook Page -
Can we use same data definition with multiple concurrent programs?
Hi,
My requirement is as below:
I have two concurrent programs (say CP1 and CP2); both need to use the same data definition (the same data-template XML file) and the same RTF layout template. However, when registering a data definition in E-Biz, the data definition code must match the concurrent program, so I have to create two definitions for the same data-template XML file; and because the layout template is attached to the data definition, I have to duplicate the layout definition as well.
In summary, I have to create two (duplicate) data definitions and layout templates for the same dataTemplate.xml and Layout.rtf files. Is there any way to avoid this duplication?
The only issue with the duplication is that any change to the XML or RTF needs to be applied in all four definitions (2 data definitions and 2 layout definitions).
Thanks
Bhavik
I found the resolution and thought of sharing it.
you can see this post
*[How to use same data definition/template between multiple concurrent programs? |http://techatwork.wordpress.com/2009/08/06/how-to-use-same-data-definitiontemplate-between-multiple-concurrent-programs/]*
Thanks
Bhavik -
Issue with iStore Concurrent Program
Hi All,
In local instance(11.5.10), I am not able to find 'iStore Concurrent Program' responsibility on it.
Please let me know how can I find it?
Thanks
Hi,
From the same window, search for IBECONC under "Responsibility Key", does this return any records?
What is your iStore patchset level?
Note: 139684.1 - Oracle Applications Current Patchset Comparison Utility - patchsets.sh
https://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=139684.1
Regards,
Hussein -
Dear All,
I have oracle 10g R2 On windows.
I have table structure like below...
ASSIGNED_TO
USER_ZONE
CREATED
MASTER_FOLIO_NUMBER
NAME
A_B_BROKER_CODE
INTERACTION_ID
INTERACTION_CREATED
INTERACTION_STATE
USER_TEAM_BRANCH
A4_IN_CALL_TYPE
A5_IN_CALL_SUBTYPE
DNT_AGING_IN_DAYS
DNT_PENDING_WITH
DNT_ESCALATION_STAGE_2
DT_UPDATE
I use SQL*Loader to load the data from a .csv file into the Oracle table, and I assign DT_UPDATE the value SYSDATE. Every time I execute the SQL*Loader control file, DT_UPDATE is set to SYSDATE.
Sometimes a problem occurs while inserting data through SQL*Loader and only half the rows get inserted. After solving the problem I execute SQL*Loader again, and hence duplicate records get inserted.
Now I want to remove all the duplicate records where DT_UPDATE is the same.
Please help me solve this problem.
Regards,
Chanchal Wankhade.
Galbarad wrote:
Hi
I think you have two ways:
first - if this was the first import into your table, you can delete all records from the table and run the import one more time;
second - you can delete all the duplicate records without re-running the import.
Try this script:
<pre>
delete from YOUR_TABLE
where rowid in (select min(rowid)
from YOUR_TABLE
group by ASSIGNED_TO,
USER_ZONE,
CREATED,
MASTER_FOLIO_NUMBER,
NAME,
A_B_BROKER_CODE,
INTERACTION_ID,
INTERACTION_CREATED,
INTERACTION_STATE,
USER_TEAM_BRANCH,
A4_IN_CALL_TYPE,
A5_IN_CALL_SUBTYPE,
DNT_AGING_IN_DAYS,
DNT_PENDING_WITH,
DNT_ESCALATION_STAGE_2,
DT_UPDATE)
</pre>
Have you ever tried that script for deleting duplicates? I think not. If you had, you'd find it deletes non-duplicates too. You'd also find that it only deletes the first duplicate where there are duplicates.
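For the record, the corrected form flips the condition: keep MIN(rowid) in every group and delete only the extras. A sketch against the dt_test_dup demo table used below (for the original table, GROUP BY would list all sixteen columns):

```sql
-- NOT IN keeps the first row of every group -- including singletons,
-- whose only rowid is their MIN(rowid) -- and removes just the extras.
DELETE FROM dt_test_dup
WHERE  rowid NOT IN (SELECT MIN(rowid)
                     FROM   dt_test_dup
                     GROUP  BY id);
```

Compare this with the demonstration that follows, which shows what the MIN(rowid) IN version actually deletes.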
XXXX> CREATE TABLE dt_test_dup
2 AS
3 SELECT
4 mod(rownum,3) id
5 FROM
6 dual
7 CONNECT BY
8 level <= 9
9 UNION ALL
10 SELECT
11 rownum + 3 id
12 FROM
13 dual
14 CONNECT BY
15 level <= 3
16 /
Table created.
Elapsed: 00:00:00.10
XXXX> select * from dt_test_dup;
ID
1
2
0
1
2
0
1
2
0
4
5
6
12 rows selected.
Elapsed: 00:00:00.18
XXXX> delete
2 from
3 dt_test_dup
4 where
5 rowid IN ( SELECT
6 MIN(rowid)
7 FROM
8 dt_test_dup
9 GROUP BY
10 id
11 )
12 /
6 rows deleted.
Elapsed: 00:00:00.51
XXXX> select * from dt_test_dup;
ID
1
2
0
1
2
0
6 rows selected.
Elapsed: 00:00:00.00 -
How do I remove duplicate records?
I have a table in which almost all the records have been duplicated. I want to remove the duplicates now. How do I remove them?
Hi,
Here is a statement that will remove the duplicate rows from the table:
delete from table_name a
where rowid not in (select max(rowid) from table_name b where a.rep_col = b.rep_col);
Hope this Helps.
Regards,
Ganesh R -
Query - Remove duplicate records based on value of one field
Hi,
Pleae see the data below,
how to remove records where the count is 0
AND the same name also appears with a count > 0
existing data
name loc count
aaa a1 10
aaa a1 0
bbb b1 0
ccc c1 0
dcc d1 11
dcc d1 0
required output
name loc count
aaa a1 10
bbb b1 0
ccc c1 0
dcc d1 11
remove these records -
aaa a1 0
dcc d1 0
Thanks.
I assume that loc always corresponds to name, so finding the rows to keep is just a simple GROUP BY:
with data as(
select 'aaa' name,'a1' loc,10 count from dual union all
select 'aaa','a1',0 from dual union all
select 'bbb','b1',0 from dual union all
select 'ccc','c1',0 from dual union all
select 'dcc','d1',11 from dual union all
select 'dcc','d1',0 from dual )
select
name
, loc
, max(count) cnt
from data
group by
name
, loc
order by
name
, loc
NAME LOC CNT
aaa a1 10
bbb b1 0
ccc c1 0
dcc d1 11
Finding the others is just a MINUS:
with data as(
select 'aaa' name,'a1' loc,10 count from dual union all
select 'aaa','a1',0 from dual union all
select 'bbb','b1',0 from dual union all
select 'ccc','c1',0 from dual union all
select 'dcc','d1',11 from dual union all
select 'dcc','d1',0 from dual )
select name,loc,count from data
minus
select
name
, loc
, max(count) cnt
from data
group by
name
, loc
order by
name
, loc
NAME LOC COUNT
aaa a1 0
dcc d1 0
So a delete would be:
delete from data
where
(name,loc,count)
in
(select name,loc,count from data
minus ..
regards
Prevent Duplicate Record with Validation
Hello,
Can someone suggest to me how I can check a form such that it does not have a duplicate record, when a user either saves or creates the record?
In particular, I am looking at 2 fields in a record as the criterai for determine duplicity:
select PEOPLE_ID, DEPT_ID
from AN_PEOPLEDEPT
where PEOPLE_ID = :P12_PEOPLE_ID and
DEPT_ID = :P12_DEPT_ID
It would be nice to know how to make this work with only one validation in a page definition.
Thank you for your comments.
Regards,
Colin.
1. Create a validation.
2. Choose "Page Level"
3. Choose "SQL"
4. Choose "Exists"
5. Give it a name and choose "Inline in notification"
6. Enter your query, similar to the one I posted before, but using your table name and your items.
That's it. Essentially, you can create validations that display the error next to the item (won't work for you as it's just one item), display the error on an error page forcing the user to go back a page to edit the form (not ideal), or in the Notification area at the top of the page (as determined by your page template). The latter is the one your want. The validation will query your table based on the values of your two items. If there is a row that matches the values of the 2 items, then it will show the form again with an error message at the top. Just try following the steps I detailed above. After you create a validation or two, this should make sense.
Please listen to my advice about adding a unique index on the table as well. An ORA-XXXX error is ugly, which is why I'm glad you're doing it with an HTML DB validation too, but the unique index will ensure that you NEVER have dupes in your table. I can't stress this enough. Think about the check-engine light on a modern car: it gives you a "pretty" reminder that something is wrong (low oil, too hot), but the car will typically shut the engine down before any real damage is done. We want the friendly reminder so we can get it to a mechanic, but the real safeguard is under the covers and prevents any real damage from occurring, even if people ignore the warning light (no "wife ignoring the low oil light" jokes here please).
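That unique index would look something like this, taking the table and column names from the validation query above (the index name is made up):

```sql
-- Hard guarantee at the database level: a second (PEOPLE_ID, DEPT_ID)
-- pair can never be inserted, no matter what the application does.
CREATE UNIQUE INDEX an_peopledept_uk
    ON an_peopledept (people_id, dept_id);
```

The validation gives the friendly message; the index is the safeguard underneath it.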
Tyler -
TS3276 How to remove duplicate emails in Apple Mail program
How do I remove duplicate emails from Apple Mail? I have run the "remove duplicates" script several times and it finds nothing, yet I see clearly that I am receiving many duplicates.
Check out the tip by Bradley Taylor here > How to delete duplicate mail messages using Mail Script
-
Remove duplicate records in the Footer
Hello all:
I only want to print the final result (distinct records) in the footer section, therefore the header and details sections are suppressed. However, I'm getting duplicate records in the footer section. How do I suppress them?
Please help... Thank you so much in advance.
Here are my formulas:
Header
WhilePrintingRecords;
StringVar ConCat:=""
Details
WhilePrintingRecords;
StringVar Concat;
ConCat := Concat + Trim(ToText({MEDICATE.STARTDATE},"MM/dd/yyyy") + chr(7) + {MEDICATE.DESCRIPTION}) + chr(13)
Footer
WhilePrintingRecords;
StringVar ConCat;
Here's my desired output:
HUMALOG PEN 100 UNIT/ML SOLN
LANTUS 100 U/ML SOLN
METFORMIN HCL 1000 MG TABS
Below is an example my current output:
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
Sorry, I forgot to mention I'm already grouping by Patient Name, and that's where I have created a formula called:
MEDSHEAD - which I've modified and added the concat line.
WhilePrintingRecords;
//StringVar ConCat:="";
StringVar ConCat;
ConCat := Concat + Trim(ToText({MEDICATE.STARTDATE},"MM/dd/yyyy") + chr(7) + {MEDICATE.DESCRIPTION}) + chr(13);
In the detail section : I'm displaying the the patient's name, DOB and Gender and the hemoglabin lab result and date of the lab result.
In the patient's footer section is where I have the formula called: medsfooter (WhilePrintingRecords; StringVar Concat; Concat;)
The changes that I made eliminate some of the duplicate records; however, the variable is no longer being re-initialized, so I am getting the previous patient's medication information as well:
See output below:
Patient One 4/25/1958 F
6/26/09 12.2
9/2/2008-Glipizide XL 5 MG TB24
4/2/2009-Novolog 100 unit/ml soln
Patient Two 12/11/45 F
9/2/2008-Glipizide XL 5 MG TB24
4/2/2009-Novolog 100 unit/ml soln
11/24/2009-Humalog Pen 100 Unit/ML soln