Deleting the duplicate records
Hi all,
I have some duplicate records in the PSA, and I want to delete them before the data is loaded into the data target. Can anybody tell me how to achieve this?
I need to achieve this using ABAP code.
Thanx & Regards,
RaviChandra
Edited by: Ravichandra.bi on Mar 5, 2012 3:37 PM
Hi Ravi,
If it's a full load to the PSA and that is why you are getting the duplicate records you want to eliminate,
you can do what Anshul and Durgesh suggested.
Write the code in the start routine of the transformation to your target:
SORT SOURCE_PACKAGE BY KEY1 KEY2 KEY3. "the keys of your target
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
  COMPARING KEY1 KEY2 KEY3.
Hope this helps
Regards,
Joe
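The sort-then-drop-adjacent-duplicates pattern in Joe's routine is language-agnostic. As a rough sketch of the same logic in Python (the function, record layout, and field names are invented for illustration, not part of any SAP API):

```python
# Mimics SORT ... BY keys + DELETE ADJACENT DUPLICATES ... COMPARING keys.
# Records are dicts; `keys` plays the role of KEY1 KEY2 KEY3.
def dedupe_package(package, keys):
    package.sort(key=lambda r: tuple(r[k] for k in keys))
    result = []
    for rec in package:
        # keep a record only if its key differs from the previous record's key
        if result and all(rec[k] == result[-1][k] for k in keys):
            continue
        result.append(rec)
    return result
```

As in ABAP, the sort matters: only *adjacent* duplicates are dropped, so unsorted input would leave repeats behind.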
Similar Messages
-
How to delete duplicate records in a table without a primary key
I have a table that contains around 1 million records and there is no primary key or auto-number column. I need to delete the duplicate records from this table. What is a simple, effective way to do this?
Please see this link:
Remove duplicate records ...
sqldevelop.wordpress.com -
How to delete the duplicate data from PSA Table
Dear All,
How do I delete the duplicate data from the PSA table? I have the purchase cube and I am getting the data from the item DataSource.
In the PSA table I found some cancellation records: for a particular record the quantity is negative while the value is positive.
Because of this the quantity is updated correctly to the target, but the values are summarized, so I get the summed value of all normal and cancellation records.
Please let me know how to delete this data while updating to the target.
Thanks
Regards,
Sai
Hi,
Deleting records directly in the PSA table is difficult, and it depends on how many you have to delete.
You can achieve this in different ways:
1. Create a DSO and maintain suitable key fields; it will overwrite records based on those key fields.
2. Write ABAP logic to delete the duplicate records at InfoPackage level; check with your ABAPer.
3. Restrict the cancellation records at query level.
Thanks,
Phani. -
Delete overlapping/duplicate records from cube
Hi All,
Kindly let me know how to delete overlapping requests from a cube. The cube is loaded from various InfoSources, but some records get duplicated and are not wanted. How can I delete the duplicate records from the cube?
Regards,
dola
I think Arun is perfectly right:
use a DSO for consolidation of the various requests from the different InfoSources,
then load from the DSO to the cube. It is very much possible, though it will require a little work.
The "delete duplicate records" option is usually used for master data; with transaction data I don't think it's advisable.
Regards,
RK -
Delete the duplicate and keep the max records.....
I would like to remove the duplicate records based on columns ID and VAL but keep the record with the max SAL. ID + VAL is the key in the table; delete the duplicate records, keeping the max SAL.
Note: even though there may be two records with the same max SAL, keep just one.
eg
SQL> select * from temp_fa;
ID VAL SAL
1 100 10
1 100 20
1 100 20
2 200 10
3 300 10
3 300 30
4 400 10
4 400 10
5 500 10
5 500 20
5 500 20
After deleting the table should looks like
SQL> select * from temp_fa;
ID VAL SAL
1 100 20
2 200 10
3 300 30
4 400 10
5 500 20
Hi,
In this script I included SAL in the key because it is safer for you.
--1. Preserve one copy of each duplicated row
--   (a '|' separator keeps the concatenated key unambiguous)
create table first_duplicate as
select distinct id, val, sal
  from temp_fa
 where to_char(id)||'|'||to_char(val)||'|'||to_char(sal) in
       (select to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          from temp_fa
         group by to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
        having count(*) > 1);
--2. Delete all duplicated rows
delete from temp_fa
 where to_char(id)||'|'||to_char(val)||'|'||to_char(sal) in
       (select to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          from temp_fa
         group by to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
        having count(*) > 1);
--3. Re-insert the preserved single copies
insert into temp_fa (id, val, sal)
select id, val, sal from first_duplicate;
--4. Delete the rows that don't have the max salary for their (id, val)
delete from temp_fa
 where to_char(id)||'|'||to_char(val)||'|'||to_char(sal) in
       (select to_char(id)||'|'||to_char(val)||'|'||to_char(sal)
          from temp_fa
        minus
        select to_char(x.id)||'|'||to_char(x.val)||'|'||to_char(x.sal)
          from temp_fa x,
               (select id, val, max(sal) max_sal
                  from temp_fa
                 group by id, val) y
         where x.id = y.id
           and x.val = y.val
           and x.sal = y.max_sal);
HR: XE > select * from temp_fa order by id;
ID VAL SAL
1 100 20
2 200 10
3 300 30
4 400 10
5 500 20
Regards,
Ion
Edited by: user111444777 on Sep 25, 2009 10:42 PM -
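Ion's four SQL steps boil down to "keep exactly one row with the maximum SAL per (ID, VAL)". Here is a small, language-agnostic sketch of that rule in Python (the function name is invented), run against the sample data from the thread:

```python
# Keep exactly one row per (id, val), namely one with the maximum sal.
def keep_max_sal(rows):
    best = {}
    for id_, val, sal in rows:
        key = (id_, val)
        # remember the largest sal seen so far for this (id, val)
        if key not in best or sal > best[key]:
            best[key] = sal
    return sorted((i, v, s) for (i, v), s in best.items())

temp_fa = [(1, 100, 10), (1, 100, 20), (1, 100, 20),
           (2, 200, 10),
           (3, 300, 10), (3, 300, 30),
           (4, 400, 10), (4, 400, 10),
           (5, 500, 10), (5, 500, 20), (5, 500, 20)]
result = keep_max_sal(temp_fa)  # matches the expected table in the thread
```

Because the dict keeps only one sal per key, the "two records with the same max SAL" case collapses to a single row automatically.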
How to delete duplicate records connected to one or more tables in SQL Server 2008?
Hi
Can anyone please help me with a SQL query? I have a table called People with columns personno, lastname, firstname, and so on. The personno column has duplicate records, so I have prefixed all the duplicates with "double". I tried deleting these double records, but they are linked to one or more other tables. I have to find all the tables blocking the deletion of a double person, and then create SELECT statements that generate UPDATE statements to replace the current id of the double person with a substitute id. (The personno is used as an id in the database.)
Thanks
You should not prepend "double" to the personno; once you change it you can no longer join or relate to the other tables. Keep the id as it is and use another field (STATUS) to mark the row as a duplicate. You will also need another field (PRIMARYID) on
those duplicate rows, i.e. the main or primary personno.
SELECT * FROM OtherTable a INNER JOIN
(SELECT personno, status, primaryid FROM PEOPLE WHERE status = 'Duplicate') b
ON a.personno = b.personno;
UPDATE a SET personno = b.primaryid
FROM OtherTable a INNER JOIN
(SELECT personno, status, primaryid FROM PEOPLE WHERE status = 'Duplicate') b
ON a.personno = b.personno;
NOTE: Please take backup before applying the query. This is not tested.
Regards, RSingh -
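RSingh's approach — mark duplicates and repoint references instead of deleting — can be sketched outside SQL as well. A hypothetical Python version of the remapping (column names follow the thread; the function itself is illustrative):

```python
# people: rows of (personno, status, primaryid);
# other_table: rows that reference personno.
def remap_duplicates(other_table, people):
    # map each duplicate personno to its surviving primary id
    mapping = {personno: primaryid
               for personno, status, primaryid in people
               if status == "Duplicate"}
    # repoint every reference at the primary id, leaving others untouched
    return [dict(row, personno=mapping.get(row["personno"], row["personno"]))
            for row in other_table]
```

As with the SQL version, take a backup first and verify the mapping before applying it.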
How to delete the duplicate requests in a cube after compression.
Hi experts,
1. How can I delete the duplicate requests in a cube after compression?
2. How can I show a characteristic and a key figure side by side in a BEx query output?
Regards,
Nishuv.
Hi,
You cannot delete the request once it is compressed, as all the data has been moved to the E table.
If you have double records you may use selective deletion.
Check this thread:
How to delete duplicate data from compressed requests?
Regards,
shikha -
To find the duplicate record in internal table
Hi,
I have a requirement to find duplicate records based on 3 fields.
I am getting a flat file with 15 fields.
I need to check 3 of the fields for duplicates; if a second record with the same 3 field values appears, that record should go to another internal table.
for ex :
1. aaa bbb ccc ddd eee fff ggg hhh
2. aaa bbb ccf dde edd ffg ggh hhj
3. aaa bbb cce ddd ees ffh ggu hhk
In this example the 1st and 3rd records are the same (aaa bbb ddd);
I need to find the 3rd record.
please help me
regards
srinivasu
Hi,
sort itab1 by f1 f2 f3.
loop at itab1 into ws_itab1.
  if sy-tabix > 1 and
     ws_itab1-f1 = ws_prev-f1 and
     ws_itab1-f2 = ws_prev-f2 and
     ws_itab1-f3 = ws_prev-f3.
    append ws_itab1 to itab3. "third internal table collects the repeats
  endif.
  ws_prev = ws_itab1.
endloop.
ITAB3 will have all the duplicate records.
Regards,
Subramanian -
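Srinivasu's requirement — the first occurrence of each 3-field key stays put, later repeats go to a separate table — can also be expressed with a seen-set instead of a second sorted copy. A Python sketch (the function name and the choice of key positions are assumptions for illustration; his example matches on fields 1, 2, and 4):

```python
# Split rows: the first occurrence of each key stays in `kept`,
# repeats go to `itab3` (name borrowed from the thread).
def split_duplicates(itab1, key_fields=(0, 1, 3)):
    seen, kept, itab3 = set(), [], []
    for row in itab1:
        key = tuple(row[i] for i in key_fields)
        if key in seen:
            itab3.append(row)      # duplicate of an earlier row
        else:
            seen.add(key)
            kept.append(row)
    return kept, itab3
```

Unlike the sort-based approach, this preserves the original order of the file and never touches the non-duplicate rows.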
DELETE THE MATCHED RECORDS IN DB2 TABLE
SQL Server table sqlserver_emp(c1, c2, c3, c4)
records: 1 2 3 4
DB2 table db2_emp(c1 key, c2, c3, c4)
records: 1 2 5 6 7 8
Both tables have the same structure.
Matched records: 1 2
1. Delete the matched records (1, 2) in the DB2 table (without using the truncate option for DB2 anywhere).
2. Finally load all the SQL Server records into DB2 (the matched records in DB2 were already removed, so no duplicates occur).
3. The final output needed in the DB2 table: 5 6 7 8 1 2 3 4
Note:
1. DB2 truncate must not be used.
2. Staging area: Oracle.
3. The SQL Server and DB2 environments must not be used as the staging area.
How many interfaces and procedures should be created, and what are they? How do I reach my requirement?
Any answer is much appreciated.
Thanks in advance.
Edited by: krishna on Nov 9, 2011 8:40 PM
1st option:
In this scenario you can use two interfaces and one procedure step:
1st interface --> procedure --> 2nd interface.
The 1st interface brings the data to the Oracle staging area, then the procedure deletes the matched records from the target using the keys, and the 2nd interface simply inserts the data from the Oracle staging area.
2nd option:
Instead of deleting the matched target records, just update the matched target records with the latest records in your Oracle staging area. You can use the IKM Merge.
Thanks -
Start routine to filter the duplicate records
Dear Experts
I have two questions regarding the start routine.
1) I have a characteristic InfoObject with transactional InfoSource. Often the 'duplicate records' error happens during the data loading. I'm trying to put a start routine in the update rule to filter out the duplicate records.
After searching the SDN forum and SAPHelp, I use the code as:
DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE COMPARING KEY1 KEY2 KEY3.
In my case, the InfoObject has 3 keys: SOURSYSTEM, /BIC/InfoObjectname, OBJVERS. My code is:
DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE COMPARING SOURSYSTEM /BIC/InfoObjectname OBJVERS.
When checking the code I got message: 'E:No component exists with the name "OBJVERS".' So I only included the first 2 keys. But the routine does not work. The duplicate error is still happening. What is missing in this start routine?
2) Generally, for a start routine, do I really need to include the data declaration, ITAB or WA, SELECT statement etc.?
Do I have to use the statement below or just simply one line?
LOOP AT DATA_PACKAGE.
IF DATA_PACKAGE.....
ENDIF.
ENDLOOP.
Thanks for your help in advance, Jessica
Hello Jessica,
if it won't be possible for you to get unique data from the very beginning, there is still another way to manage this problem in a start routine.
Sort ... and delete adjacent ... must remain. Further on build up an internal table of type data_package, but defined with STATICS instead of DATA. This i-tab stays alive for all data-packages of one load. Fill it with the data of the transferred data-packages, and delete from every new data-package all records which already are in the statics i-tab. Alternatively you could do the same with a Z-(or Y-)database-table instead of the statics i-tab.
It will probably cost some performance, but better slow than wrong data.
Regards,
Ernst -
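Ernst's STATICS idea — state that outlives a single data package, so duplicates spanning packages are caught too — might be sketched like this in Python (the class and its names are invented for illustration; in ABAP a STATICS internal table plays the role of `self.seen`):

```python
# A filter whose seen-set survives across packages, like a STATICS i-tab
# that stays alive for all data packages of one load.
class CrossPackageFilter:
    def __init__(self, key_fields):
        self.key_fields = key_fields
        self.seen = set()
    def filter(self, package):
        out = []
        for rec in package:
            key = tuple(rec[f] for f in self.key_fields)
            if key not in self.seen:       # first time this key appears
                self.seen.add(key)
                out.append(rec)
        return out
```

A per-package sort-and-dedupe alone would miss the duplicate that arrives in a later package; the persistent set (or, as Ernst suggests, a Z-table) is what closes that gap.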
How to Delete the condition record in CRM
HI,
Can you please help me how to delete the condition record from condition table in CRM.
Please explain the usage of FM CRMXIF_CONDITION_SEL_DELETE with examples.
I have also read the documention of the function module. How to use this FM for custom defined condition table.
(this is the code given in Documentation)
DATA-OBJECT_REPRESENTATION = 'E'
DATA-SEL_OPT-CT_APPLICATION = 'CRM'
DATA-SEL_OPT-OBJECT_TASK = 'D'
DATA-SEL_OPT-RANGE-FIELDNAME = 'PRODUCT_ID'
DATA-SEL_OPT-RANGE-R_SIGN = 'I' (Including)
DATA-SEL_OPT-RANGE-R_OPTION = 'EQ'
DATA-SEL_OPT-RANGE-R_VALUE_LOW = 'PROD_1'
Thanks
Shankar
Hi Shankar,
I am using the same CRMXIF_CONDITION_SEL_DELETE function module to delete condition record present in CRM.
But it is giving me below error in the return table of the FM after i run the program. Can you please correct me if I am doing any thing wrong?
Error in lt_return: SMW3 CND_MAST_SEL_DEL_EXT_VALIDATE CND_M_SD
code:
ls_range-fieldname = 'PRODUCT_ID'.
ls_range-r_sign = 'I'.
ls_range-r_option = 'EQ'.
ls_range-r_value_low = '123456'.
APPEND ls_range TO lt_range.
MOVE lt_range TO ls_data-sel_opt-range.
ls_data-sel_opt-object_task = 'D'.
ls_data-sel_opt-ct_application = 'CRM'.
ls_data-object_representation = 'E'.
CALL FUNCTION 'CRMXIF_CONDITION_SEL_DELETE'
  EXPORTING
    data   = ls_data
  IMPORTING
    return = lt_return.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  IMPORTING
    return = lt_ret.
Edited by: Saravanaprasad Nadar on Jul 7, 2010 1:27 AM -
When I ran iTunes Match, there were some duplicated albums and songs in my library. I went back into my library from my computer and deleted the duplicates. When I did so, iTunes did not ask me if I wanted to delete them from the cloud. They are still showing up on my iPhone as duplicates in the cloud. What can I do to stop them from showing up on my iPhone?
They are not on the iPhone in the first place. It is just a setting that lets you see the songs that are available in iCloud but not on your iPhone. If you turn the setting off, you will no longer see the songs that are in iCloud but NOT on your iPhone.
-
How to delete the duplicate email address in BP master data
Hi,
When you get email ids from a third-party vendor and you are loading them into CRM BP master data, how do you delete a duplicate email address that already exists in the system? In CRM you can create the same BP with a different id. I would like to know how to delete duplicate email addresses while importing them from the third-party tool.
During a campaign you send email to all your customers; when a customer wants to unsubscribe from your list, how do you unsubscribe the email address and update the BP master data?
If you send email to a customer as HTML or simple text, and the customer wants only HTML or only simple text, how do you specify that in the system?
thanks,
arul
Hello Arul,
welcome to the SDN CRM Development forum.
1. I think you should clear the data with duplicate e-mail addresses in the external tool.
2. Unsubscription can be done with a marketing attribute, which can be set using a target group created by campaign automation. Have a look at this topic; there is also a Best Practice available at http://help.sap.com/bp_crmv340/CRM_DE/index.htm.
3. HTML or simple text can also be maintained in a marketing attribute. You have to use different mail forms, which are sent to different target groups.
Regards
Gregor -
Library Duplicated
I just updated to the latest version of iTunes and it duplicated virtually every track in my library. I need a quick way to delete the duplicates. Sorting by "Date Added" will not work, because every track is listed as added on 12/12/2011 even though this happened today, 12/19/2011.
I've written a script called DeDuper which can help remove unwanted duplicates. See this thread for background.
tt2 -
How to delete the material records related to storage type
Dear SAP guru's,
I have a situation where a material "XXXXXX" exists in two different storage types (say 001 and 002) and the material needs to be removed from one of them (say 002) to avoid picking and putaway there in the future.
In order to accomplish the above requirement we tried the cases below.
Case 1: We flagged the material "XXXXXX" for deletion under storage type 002 in transaction MM06, but this did not stop the system from allocating and removing stock to/from storage type 002.
Case 2: After flagging the material for deletion under storage type 002, we used transaction MM71 to archive and delete the material record associated with storage type 002 completely.
I would like to know if there is any other method or procedure available to delete the storage-type-specific material records in SAP.
Please guide me with the other options. Thanks.
Regards,
John.
Hi John,
It's very simple: remove the storage type indicators from the material master (Warehouse View 1, stock placement and stock removal sections); with this it won't pick or place the materials in the concerned storage type.
If you have extended this material to more than one storage type, you need to do the same for all the storage types.
Reward if it is helpful.