Duplicate records in master data: failure in chain, successful when run manually
Hi all,
A daily load updates 0BPARTNER_ATTR from CRM.
Recently this load started to fail, with the following error message:
xx duplicate records found. xxxx recordings used in table /BI0/XBPARTNER
The data loads through the PSA, but there are no red records in the PSA.
After copying the PSA data to Excel for further analysis, I concluded there are actually no duplicates in the PSA.
Also, the option 'Ignore double data records' in the infopackage is checked.
When I manually update from PSA the load is successful.
This started to happen about two weeks ago, and I couldn't find an event that could have caused it.
Now it happens every two or three days, and each time the manual update is successful.
Any suggestions, anyone?
Thanks, Jan.
Hi Jan,
Possibly you have two requests in the PSA and both are being used to update the data target.
Delete all the requests in the PSA, schedule the load once again to bring the data into the PSA, and then update it into the data target.
Thank you,
Arvind
Similar Messages
-
Data Load Fails due to duplicate records from the PSA
Hi,
I have loaded the master data twice into the PSA. Then I created the DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
Is there any setting I can configure so that, even though I have duplicate records in the PSA, I can successfully load only one set of data (without duplicates) into the InfoProvider?
How can I set up the process chains to do so?
Your answer to the above two questions is appreciated.
Thanks,
Hi Sesh,
There are 2 places where the DTP checks for duplicates.
In the first, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, the error is raised at this stage. In this case you first have to clean up the previous error stack.
The second stage cleans up duplicates across data packages, provided the option is set in your DataSource. Note, however, that this will not solve the problem if you have duplicates within the same data package. In that case you can do the filtering yourself in the start routine of your transformation (see the sketch below).
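A minimal version of such a start routine filter could look like the following sketch. It is only an illustration and assumes the source structure has a single key field BPARTNER; replace that with the actual key field(s) of your DataSource.
* Sketch only: body of METHOD start_routine in a BW 7.x transformation.
* Assumption: BPARTNER is the only key field of the source structure.
SORT source_package BY bpartner.
* Within this data package, keep only the first record per key.
DELETE ADJACENT DUPLICATES FROM source_package COMPARING bpartner.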
Hope this helps,
Pieter -
Error due to duplicate records
Hello friends,
I have done a full upload to a particular characteristics InfoObject using direct update (PSA and directly to data target). I used the 'PSA and subsequently into data target' option.
When I load the data into the object through a process chain, I get an error that duplicate records exist and the request turns red in the PSA.
But no duplicate records exist in the data package, and when we try to manually load the records from the PSA to the data target it works fine.
Can anyone throw some light on this error?
Regards
Sre....
Hello Roberto and Paolo,
There was an OSS note saying we should not use that option, but only 'PSA with delete duplicate records' and then update into the data target.
I don't know the exact reason.
Can you shed some light on why it is like that?
Thanks for the reply, Paolo and Roberto.
Regards
Sri -
Hi Friends,
We are getting the error "Duplicate Record Found" while loading master data.
When we check the PSA there are no duplicates in it.
When we deleted the request and reloaded from the PSA it completed successfully.
We face this error daily, and after deletion and reload from the PSA it finishes successfully.
What could be the reason? Is there any solution for this?
Regards
SSS
Hi,
In the InfoPackage maintenance screen, under the update options, select the checkbox 'Ignore Double Data Records'. This may solve your problem.
Hope this helps.
Assigning points is the way of saying Thanks in SDN
Regards
Ramakrishna Kamurthy -
Master Data Load Failure- duplicate records
Hi Gurus,
I am a new member in SDN.
I now work on BW 3.5. I got a data load failure today. The error message says that there are 5 duplicate records. The processing is into the PSA and then to the InfoObject. I checked the PSA and the data is available there. How can I avoid these duplicate records?
Please help me, I want to fix this issue immediately.
regards
Milu
Hi Milu,
If it is a direct update, you won't have any request for it.
The data goes directly to the master data tables, so there is no Manage tab for that InfoObject where you could see the request.
Whereas in the case of a flexible update you have update rules from your InfoSource to the InfoObject, so you can delete the request in that case.
Check this link for flexible update of master data
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/37dda990-0201-0010-198f-9fdfefc02412 -
Duplicate records found while loading master data(very urgent)
Hi all,
One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found. 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY table.
Can anyone give me the solution? It's very urgent.
Thanks & Regards,
Manjula
Hi,
You can see the checkbox on the Processing tab page. Tick the 'Ignore Duplicate Data Records' indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
Help says that:
To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
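Purely as an illustration of that 'last record wins' behaviour (this is not SAP's own code), the equivalent deduplication in ABAP could look like this, with a hypothetical key field PARTNER and a sequence number reflecting the order of the records in the request:
TYPES: BEGIN OF ty_rec,
         partner TYPE c LENGTH 10,  " hypothetical master data key
         attr    TYPE c LENGTH 20,  " hypothetical attribute
         seqno   TYPE i,            " position of the record in the request
       END OF ty_rec.
DATA lt_recs TYPE STANDARD TABLE OF ty_rec.
* ... lt_recs is filled in the order the records arrive ...
* The last record per key wins: sort so the newest occurrence of a key
* comes first, then drop the remaining occurrences of that key.
SORT lt_recs BY partner ASCENDING seqno DESCENDING.
DELETE ADJACENT DUPLICATES FROM lt_recs COMPARING partner.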
Hope this clears your doubt; otherwise let me know.
Regards
Kiran -
Delta in Duplicate records.
Hi Gurus,
We upload CRM data daily through a process chain.
Three or four times a week the chain fails due to duplicate records (24 or 34, something like that).
We then delete the red request in the target and load again.
Why are we getting duplicate records in a delta load?
What is the reason?
Your help is appreciated.
Thanks
Ramu
Message was edited by:
Ramu T
Hi Ramu,
Try it this way: check the keys of the table on which the DataSource is based, and add the corresponding InfoObjects for those key fields in the Compounding tab of the master data characteristic.
Then it checks for uniqueness.
I have done this and it worked for me.
Hope this helps
Regards
karthik -
USE of PREVIOUS command to eliminate duplicate records in counter formula
I'm trying to create a counter formula to count the number of documents paid over 30 days. To do this I have to subtract the InvDate from the PayDate and then create a counter based on this value: if {days to pay} is greater than 30 then 1 else 0.
Then I sum the {days to pay} field for each group. The groups are company, month, and supplier.
Because invoices can have multiple payments and payments can have multiple invoices, there is no way around having duplicate records for the field.
So my counter is distorted by the duplicate records, and my percentage of payments over 30 days formula will not be accurate due to these duplicates.
I've tried a Distinct Count based on this formula: if {days to pay} is greater than 30 then . It works except that it counts 0.00 as a distinct record, so my total is off by 1 for summaries with a record where {days to pay} is less than or equal to 30.
If I subtract 1 from the formula, it will be inaccurate for summaries with no records over 30 days.
So I've come to this:
if Previous() does not equal
then
  if {days to pay} is greater than 30
  then 1
  else 0.00
else 0.00
But it doesn't work. I've sorted the detail section by
Does anyone have any knowledge of or success using the PREVIOUS function in a report?
Edited by: Fred Ebbett on Feb 11, 2010 5:41 PM
So, you have to include all data and not just use the selection criterion 'PayDate-InvDate>30'?
You will need to create a running total on the RPDOC ID, one for each section you need to show a count for, evaluating for your >30 day formula.
I don't understand why you're telling the formula to return 0.00 in your if statement.
In order to get percentages you'll need to use the distinct count (possibly running totals again but this time no formula). Then in each section you'd need a formula that divides the two running totals.
I may not have my head around the concept, since you stated "invoices can have multiple payments and payments can have multiple invoices". So invoice A can have payments 1, 2 and 3, and payment 4 can be associated with invoices B and C? Ugh. Still, though, you're evaluating every row of data. If your focus is the invoices that took longer than 30 days to be paid, I'd group on the invoice number, put the "if 'PayDate-InvDate>30' then 1 else 0" formula in the detail, sum it in the group footer, and base my running total on the sum being >0 to do a distinct count of invoices.
Hope this points you in the right direction.
Eric -
Hi All,
I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully. But when I try to import the flat file into my application, I get a lot of duplicate record errors, and these are my questions:
1. Will we get duplicate records in transaction files?
2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than flag them as errors and reject the records?
3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it applies only to account, category and entity.)
5. In my case I see identical values in all my dimensions and the $value is the only difference. Why is it not summing up?
Your quickest reply is much appreciated.
Thanks,
Alex.
Hi,
I have the same problem.
In my case the file that I want to upload has different rows that differ only in the nature column. In the conversion file I map the different natures to one internal nature.
E.g.: cost1 --> cost
cost2 --> cost
cost3 --> cost
My intention was that in BPC the nature cost would take the value cost = cost1 + cost2 + cost3.
The result is that only the first record is uploaded and all the other records are rejected as duplicates.
Any suggestion? -
Hi gurus
We created a text datasource in R/3 and replicated it into BW 7.0
An InfoPackage (loading to PSA) and a Data Transfer Process were created and included in a process chain.
The job failed because of duplicate records.
We now discovered that the setting 'Delivery of Duplicate Records' for this DataSource in BW is set to 'Undefined'.
When creating the DataSource in R/3, there were no settings for the 'Delivery of Duplicate Records'.
In BW, I've tried to change the setting 'Delivery of Duplicate Data Records' to NONE, but when I go into change mode, the 'Delivery of Duplicate' field is not changeable.
Does anyone have any suggestion on how to solve this problem?
Thanks,
@nne Therese
Hi Muraly,
I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP in delta mode with the option 'valid records update, no reporting (request red)'.
It was running fine for the last few weeks; the transferred and added records matched the PSA request every day.
Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried loading with full update, since it is texts, and it failed again. When I analysed the error it said duplicate records. So I changed the DTP, checking the option 'Handle Duplicate Records', and loaded with full update. It worked fine: more than 50,000 records were transferred and the added records matched the exact number in the PSA request.
I reset the DTP back to delta and loaded today, but the transferred records are 14,000 and the added records (3,000) are the same as the PSA request. Until now, if you look at the history of loads, the number of records transferred and added to the InfoObject and the number of records in the PSA request were the same every day.
Why is there this difference now? In production I have no issues. Since I changed the DTP, will it make any difference if I transport it to production? This is my first time doing BI 7.0.
Please suggest me and explain me if I am wrong.
Thanks,
Sudha.. -
Import Transaction Data - Duplicate records
Hi,
I need to upload a file of transaction data into BPC using a Data Manager package. I've created the transformation and conversion files, which validate successfully on a small data set. When I try to upload the real-life data file, it fails due to duplicate records. This happens because multiple external IDs map to one internal ID. Therefore, whilst there are no duplicates in the actual file produced by the client, the resulting data produced after conversion does contain duplicates and will therefore not upload.
Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to allow the duplicates and simply sum them up?
Regards
Sue
Hi,
Try adding the delivered package /CPMP/APPEND and run it. This should solve your problem.
Thanks,
Sreeni -
Duplicate records in TABLE CONTROL
Hi folks,
I am doing a module pool where my internal table (itab) data comes into a table control (ctrl). I then need to select one record in the table control and press a REFRESH push button.
After pressing the refresh button, some new records come into that same internal table, and I then need to display the modified internal table (with the new records added) in the table control.
The modified internal table data comes into the table control, but at the end of the table control some records are repeated.
Before it reaches the table control, I checked the modified itab and it contains the correct data, i.e. 15 records (previously I had 5 records; after the REFRESH button 10 more records are added). But when this table comes into the table control, it contains some 100 records. I should get only 15 records.
Why are these records repeating, and how do I delete the duplicate records from the table control?
Please suggest where I am making a mistake.
A correct answer will be rewarded.
Thanks & Regards
Hi,
Thanks for your help, but I should not refresh the internal table as some records are already present. After pressing the REFRESH button, some new records are appended to the existing table, and then I display the previous records as well as the new records.
I checked the internal table after modification and it contains the actual number of records, but after it comes into the table control, more records appear.
Is this a problem with scrolling or what?
Please suggest where I am making a mistake. I am giving my code below.
PROCESS BEFORE OUTPUT.
MODULE STATUS_0200.
module tc_shelf_change_tc_attr.
loop at object_tab1
with control tablctrl
cursor tablctrl-current_line.
module tc_shelf_get_lines.
endloop.
PROCESS AFTER INPUT.
module set_exit AT EXIT-COMMAND.
loop at object_tab1.
chain.
field: object_tab1-prueflos,
object_tab1-matnr.
module shelf_modify on chain-request.
endchain.
field object_tab1-idx
module shelf_mark on request.
endloop.
module shelf_user_command.
module user_command_0200.
***INCLUDE Y_RQEEAL10_STATUS_0200O01 .
*& Module STATUS_0200 OUTPUT
*       text
MODULE STATUS_0200 OUTPUT.
SET PF-STATUS 'MAIN'.
SET TITLEBAR 'xxx'.
ENDMODULE. " STATUS_0200 OUTPUT
*& Module tc_shelf_change_tc_attr OUTPUT
*       text
MODULE tc_shelf_change_tc_attr OUTPUT.
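* Drop adjacent duplicate keys from the display table and refresh the table control line count.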
delete adjacent duplicates from object_tab1 comparing prueflos matnr.
describe table object_tab1 lines tablctrl-lines.
ENDMODULE. " tc_shelf_change_tc_attr OUTPUT
*& Module tc_shelf_get_lines OUTPUT
*       text
MODULE tc_shelf_get_lines OUTPUT.
data: g_tc_shelf_lines like sy-loopc.
if tablctrl-current_line > tablctrl-lines.
stop.
endif.
g_tc_tablctrl_lines = sy-loopc.
*refresh control tablctrl from screen 0200.
ENDMODULE. " tc_shelf_get_lines OUTPUT
***INCLUDE Y_RQEEAL10_SHELF_MODIFYI01 .
*& Module shelf_modify INPUT
*       text
MODULE shelf_modify INPUT.
modify object_tab1
index tablctrl-current_line.
ENDMODULE. " shelf_modify INPUT
*& Module set_exit INPUT
*       text
module set_exit INPUT.
leave program.
endmodule. " set_exit INPUT
*& Module shelf_mark INPUT
*       text
MODULE shelf_mark INPUT.
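* In single-selection mode, clear the flag on any previously selected row before updating the current row from the screen fields.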
data: g_shelf_wa2 like line of object_tab1.
if tablctrl-line_sel_mode = 1
and object_tab1-idx = 'X'.
loop at object_tab1 into g_shelf_wa2
where idx = 'X'.
g_shelf_wa2-idx = ''.
modify object_tab1
from g_shelf_wa2
transporting idx.
endloop.
endif.
modify object_tab1
index tablctrl-current_line
transporting idx plnty plnnr plnal.
ENDMODULE. " shelf_mark INPUT
*& Module shelf_user_command INPUT
*       text
MODULE shelf_user_command INPUT.
ok_code = sy-ucomm.
perform user_ok_tc using 'TABLCTRL'
'OBJECT_TAB1'
changing ok_code.
sy-ucomm = ok_code.
ENDMODULE. " shelf_user_command INPUT
*& Module user_command_0100 INPUT
*       text
MODULE user_command_0200 INPUT.
data:v_line(3).
case OK_CODE.
when 'LAST'.
read table object_tab1 with key idx = 'X'.
if sy-subrc = 0.
select * from qals
where enstehdat <= object_tab1-enstehdat
and plnty ne space
and plnnr ne space
and plnal ne space.
if sy-dbcnt > 0.
if qals-enstehdat = object_tab1-enstehdat.
check qals-entstezeit < object_tab1-entstezeit.
move-corresponding qals to object_tab2.
append object_tab2.
else.
move-corresponding qals to object_tab2.
append object_tab2.
endif.
endif.
endselect.
sort object_tab2 by enstehdat entstezeit descending.
loop at object_tab2 to 25.
if not object_tab2-prueflos is initial.
append object_tab2 to object_tab1.
endif.
clear object_tab2.
endloop.
endif.
when 'SAVE'.
loop at object_tab1 where idx = 'X'.
if ( not object_tab1-plnty is initial and
not object_tab1-plnnr is initial and
not object_tab1-plnal is initial ).
select single * from qals into corresponding fields of wa_qals
where prueflos = object_tab1-prueflos.
if sy-subrc = 0.
wa_qals-plnty = object_tab1-plnty.
wa_qals-plnnr = object_tab1-plnnr.
wa_qals-plnal = object_tab1-plnal.
update qals from wa_qals.
if sy-subrc <> 0.
Message E001 with 'plan is not assigned to lot in sap(updation)'.
else.
v_line = tablctrl-current_line - ( tablctrl-current_line - 1 ).
delete object_tab1.
endif.
endif.
endif.
endloop.
when 'BACK'.
leave program.
when 'NEXT'.
call screen 300.
ENDCASE.
***INCLUDE Y_RQEEAL10_USER_OK_TCF01 .
*&      Form  user_ok_tc
*       text
*      -->P_0078     text
*      -->P_0079     text
*      <--P_OK_CODE  text
form user_ok_tc using p_tc_name type dynfnam
p_table_name
changing p_ok_code like sy-ucomm.
data: l_ok type sy-ucomm,
l_offset type i.
search p_ok_code for p_tc_name.
if sy-subrc <> 0.
exit.
endif.
l_offset = strlen( p_tc_name ) + 1.
l_ok = p_ok_code+l_offset.
case l_ok.
when 'P--' or "top of list
'P-' or "previous page
'P+' or "next page
'P++'. "bottom of list
perform compute_scrolling_in_tc using p_tc_name
l_ok.
clear p_ok_code.
endcase.
endform. " user_ok_tc
*&      Form  compute_scrolling_in_tc
*       text
*      -->P_P_TC_NAME  text
*      -->P_L_OK       text
form compute_scrolling_in_tc using p_tc_name
p_ok_code.
data l_tc_new_top_line type i.
data l_tc_name like feld-name.
data l_tc_lines_name like feld-name.
data l_tc_field_name like feld-name.
field-symbols <tc> type cxtab_control.
field-symbols <lines> type i.
assign (p_tc_name) to <tc>.
concatenate 'G_' p_tc_name '_LINES' into l_tc_lines_name.
assign (l_tc_lines_name) to <lines>.
if <tc>-lines = 0.
l_tc_new_top_line = 1.
else.
call function 'SCROLLING_IN_TABLE'
exporting
entry_act = <tc>-top_line
entry_from = 1
entry_to = <tc>-lines
last_page_full = 'X'
loops = <lines>
ok_code = p_ok_code
overlapping = 'X'
importing
entry_new = l_tc_new_top_line
exceptions
others = 0.
endif.
get cursor field l_tc_field_name
area l_tc_name.
if syst-subrc = 0.
if l_tc_name = p_tc_name.
set cursor field l_tc_field_name line 1.
endif.
endif.
<tc>-top_line = l_tc_new_top_line.
endform. " COMPUTE_SCROLLING_IN_TC
Thanks -
Duplicate records in Infoobject
Hi gurus,
While loading into an InfoObject it throws an error: 150 duplicate records found (shown in the Details tab). My question is about a temporary correction, i.e. in the InfoPackage we can either increase the error handling number and load, or choose the third option (valid records update) and update to the InfoObject.
My question is this: when the error occurs, I can see that some records appear under 'Added'. After the error occurred I forced the request to red (delta), deleted it, chose the third option (valid records update) and updated to the InfoObject. The load was successful and all the records were transferred, but under 'Added' the number of records is 0. I also read in an article that the error records will be stored as a separate request in the PSA, but I cannot find that either. I want to know when or how I will get the new records added.
Hi Jitu,
If it is only an InfoObject, then under the Processing tab select 'Only PSA' and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. Then follow the steps below:
1. Go to transaction RSRV, expand Master Data and double-click the second option.
2. Click the option on the right side of the panel. A pop-up window will appear; enter the name of the InfoObject that failed.
3. Select that particular InfoObject and click the Delete button; if necessary, save it and click Execute (do this only if necessary). Usually the problem is resolved by deleting. After clicking the Delete button, go back to the monitor screen and process the data packet manually by clicking the wheel button.
4. Select the request and click the 'read manually' button (the wheel button).
Don't forget to save the InfoPackage back, selecting only the InfoObject.
Hope this works. Let me know if you have doubts.
Assign points if you are able to find the solution.
Regards,
Manjula -
Guys,
My DTP failed 4 days back, but the full load from Source system to PSA is executing successfully.
To correct the DTP load, I deleted the last DTP request and also all the PSA requests except the last one, and when I re-executed the DTP I got duplicate records. Also, in the failed DTP header, I am seeing all the deleted PSA requests.
How do I resolve this issue? I cannot check 'Handle Duplicate Records' since it is time-dependent data.
Thanks,
Kumar
Hi Kumar,
1. Deleting from the PSA and updating is not a permanent solution.
First check which records are creating the duplicates in the PSA.
The issue with the duplicates may not come from the PSA; there may already be records in the object itself,
so deleting from the PSA will not solve the issue.
Check the data in the source tables to see why the wrong data is coming into the PSA like that.
Then you can correct the records in the PSA and update the data into the target using the DTP.
You can create an error DTP for it, which makes it easy to trace the duplicates.
2. You have the option 'Handle Duplicate Records' in the DTP.
Check the box and try to load the data again.
If this is time-dependent master data, then also include 'Valid to' as a key along with the other objects in the Semantic Groups option of the DTP.
Check this and then try to load the data.
http://help.sap.com/saphelp_nw70/helpdata/EN/42/fbd598481e1a61e10000000a422035/content.htm
Regards
Sudheer -
Hi,
I am providing support to one of our clients, where we have jobs scheduled to load data from tables in the source database to the destination database via SSIS packages. The first load is a full load where we truncate all the tables in the destination and load them from the source tables. From the next day onwards, we perform an incremental load from source to destination, i.e., only modified records, fetched using the change tracking concept, are loaded to the destination. After the full load, if we run the incremental load, the job fails with an error on one of the packages: "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object '<tablename>'. The duplicate key value is <1234>", even though there are no duplicate records. When we try debugging and running the failing package, it runs successfully. We are not able to figure out why the package fails and why, when we run it the next day, it runs successfully. Request you to help me in this regard.
Thank you,
Bala Murali Krishna Medipally.
Hi,
I suspect you are trying to insert modified records instead of updating.