Duplicate records in InfoObject
Hi gurus,
While loading into an InfoObject, it throws an error: 150 duplicate records found (shown in the Details tab). My doubt is about the temporary correction: in the InfoPackage we can either increase the error-handling number and load, or give the third option (valid records update) and update the InfoObject.
My question: when the error occurs, I can see that some records appear under Added. After the error occurred, I forced the request to red (delta), deleted it, chose the third option (valid records update), and updated the InfoObject. The load was successful and all records were transferred, but the number of records under Added is 0. I also read in an article that the error records are stored as a separate request in the PSA, but I cannot find it. I want to know when or how the new records will show as added.
Hi Jitu,
If it is only an InfoObject, then under the Processing tab select Only PSA and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. After this, follow the steps below:
1. Go to transaction RSRV, expand Master Data, and double-click the second option.
2. Click the option on the right side of the panel. A pop-up window appears; enter the name of the InfoObject that failed.
3. Select that particular InfoObject and click the Delete button;
if necessary, save it and click Execute (do this only if necessary). Usually deleting resolves the problem. So, after clicking the Delete button, go back to the monitor screen and process the data packet manually by clicking the wheel button.
4. Select the request and click the manual update button (the wheel button).
Don't forget to save the InfoPackage back, selecting only the InfoObject.
Hope this works. Let me know if you have doubts.
Assign points if this helps you find the solution.
Regards,
Manjula
Similar Messages
-
Allow duplicate key in InfoObject/ compounding length problem
Hi all.
I am trying to create an InfoObject with a compound key, consisting of the IObj itself plus 4 compound attributes, with a total length of 63 characters (the maximum allowed is 60). Obviously, the system does not allow me to activate the InfoObject.
I am considering moving the compounds into the attributes area, but I don't know whether it is possible to allow duplicate records in an InfoObject.
Or maybe you can help me with some other advice on how to manage this compound-key limitation?
additional info:
The InfoObject is a "copy" of a source system master data table.
Hi,
OK, now I see. There is a restriction that the key fields, as in your example, must not be longer than 60 characters in total. So you need to think about using another object as the compound. Isn't that possible? Another option: post the data first to an ODS/DSO that has your object as well as the compounding attributes as key fields (as already recommended), and then post the value to a new InfoObject that gets its unique value from a number range object defined specially for this purpose. Add the key fields of that ODS/DSO as navigational attributes to that new object.
regards
Siggi -
hi all,
How to delete duplicate records in a master data InfoObject which has no requests, because it is a direct update?
Hi,
Right-click on the InfoObject and select Maintain.
This opens the master data table;
select the record there and delete it.
hope this solves your query.
reward points if useful
regards,
ANJI -
While loading master data to infoobject Load failed due to Duplicate record
Hi Experts,
While loading master data to the InfoObject, the load failed.
The error shown is: 24 duplicate records found. 23 records used in the table.
Pls help me to solve this issue
Thanks in Advance.
Regards,
Gopal.
In the InfoPackage settings you will find a checkbox 'Delete duplicate records'.
I think it appears beside the radio button 'Only PSA'; also tick the checkbox 'Update subsequently in data targets'.
This will remove the duplicate records (if any) from the PSA before they are processed further by the transfer and update rules.
Use this and reload the master data.
cheers,
Vishvesh -
Master data infoobject can't handle duplicate records after SP10
Hi
I am trying to load master data which happens to contain duplicate records from the source system. In the DTP of the master data InfoObject, I ticked the 'Handle Duplicate Record Keys' checkbox. After executing this DTP, the duplicate master data records were trapped in the error stack; I was expecting the duplicate master data to be overwritten instead. I understand this error was fixed in Note 954661 - Updating master data texts when error in data package, which is from SP9. After applying Support Pack 10, the master data InfoObject still can't handle records with duplicate keys.
Please let me know if you manage to fix this problem.
Many thanks,
Anthony
Found a fix for this problem: just applied OSS Note 986196 - Error during duplicate record handling of master data texts.
-
How to find out duplicate record contained in a flat file
Hi Experts,
For my project I have written a program for flat file upload.
Requirement 1
In the flat file there may be some duplicate record like:
Field1 Field2
11 test1
11 test2
12 test3
13 test4
Field1 is primary key.
Can you please let me know how I can find out the duplicate record.
Requirement 2
The flat file contains the header row as shown above
Field1 Field2
How can our program skip this record and start reading/inserting records from row 2, i.e.
11 test1
onwards.
Thanks
S
FORM upload1.
DATA : wf_title TYPE string,
lt_filetab TYPE filetable,
l_separator TYPE char01,
l_action TYPE i,
l_count TYPE i,
ls_filetab TYPE file_table,
wf_delemt TYPE rollname,
wa_fieldcat TYPE lvc_s_fcat,
tb_fieldcat TYPE lvc_t_fcat,
rows_read TYPE i,
p_error TYPE char01,
l_file TYPE string.
DATA: wf_object(30) TYPE c,
wf_tablnm TYPE rsdchkview.
wf_object = 'myprogram'.
DATA i TYPE i.
DATA:
lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
lt_idocstate TYPE rsarr_t_idocstate,
lv_subrc TYPE sysubrc.
TYPES : BEGIN OF test_struc,
/bic/myprogram TYPE /bic/oimyprogram,
txtmd TYPE rstxtmd,
END OF test_struc.
DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
DATA: wa_ztext TYPE /bic/tmyprogram,
myprogram_temp TYPE ziott_assum,
wa_myprogram TYPE /bic/pmyprogram.
DATA : test_upload TYPE STANDARD TABLE OF test_struc,
wa2 TYPE test_struc.
DATA : wa_test_upload TYPE test_struc,
ztable_data TYPE TABLE OF /bic/pmyprogram,
ztable_text TYPE TABLE OF /bic/tmyprogram,
wa_upld_text TYPE /bic/tmyprogram,
wa_upld_data TYPE /bic/pmyprogram,
t_assum TYPE ziott_assum.
DATA : wa1 LIKE test_upload.
wf_title = text-026.
CALL METHOD cl_gui_frontend_services=>file_open_dialog
EXPORTING
window_title = wf_title
default_extension = 'txt'
file_filter = 'Tab delimited Text Files (*.txt)'
CHANGING
file_table = lt_filetab
rc = l_count
user_action = l_action
EXCEPTIONS
file_open_dialog_failed = 1
cntl_error = 2
OTHERS = 3. "#EC NOTEXT
IF sy-subrc <> 0.
EXIT.
ENDIF.
LOOP AT lt_filetab INTO ls_filetab.
l_file = ls_filetab.
ENDLOOP.
CHECK l_action = 0.
IF l_file IS INITIAL.
EXIT.
ENDIF.
l_separator = 'X'.
wa_fieldcat-fieldname = 'test'.
wa_fieldcat-dd_roll = wf_delemt.
APPEND wa_fieldcat TO tb_fieldcat.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
CLEAR wa_test_upload.
* Upload file from front-end (PC)
* File format is tab-delimited ASCII
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = l_file
has_field_separator = l_separator
TABLES
data_tab = test_upload
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
EXIT.
ELSE.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
IF test_upload IS NOT INITIAL.
DESCRIBE TABLE test_upload LINES rows_read.
CLEAR : wa_test_upload,wa_upld_data.
LOOP AT test_upload INTO wa_test_upload.
CLEAR : p_error.
rows_read = sy-tabix.
IF wa_test_upload-/bic/myprogram IS INITIAL.
p_error = 'X'.
MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
CONTINUE.
ELSE.
TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
wa_upld_text-txtmd = wa_test_upload-txtmd.
wa_upld_text-txtsh = wa_test_upload-txtmd.
wa_upld_text-langu = sy-langu.
wa_upld_data-chrt_accts = 'xyz1'.
wa_upld_data-co_area = '12'.
wa_upld_data-/bic/zxyzbcsg = 'Iy'.
wa_upld_data-objvers = 'A'.
wa_upld_data-changed = 'I'.
wa_upld_data-/bic/zass_mdl = 'rrr'.
wa_upld_data-/bic/zass_typ = 'I'.
wa_upld_data-/bic/zdriver = 'yyy'.
wa_upld_text-langu = sy-langu.
MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
APPEND wa_upld_data TO ztable_data.
APPEND wa_upld_text TO ztable_text.
ENDIF.
ENDLOOP.
* Sort first: DELETE ADJACENT DUPLICATES only removes neighbouring rows.
SORT ztable_data.
DELETE ADJACENT DUPLICATES FROM ztable_data.
SORT ztable_text.
DELETE ADJACENT DUPLICATES FROM ztable_text.
IF ztable_data IS NOT INITIAL.
CALL METHOD cl_rsdmd_mdmt=>factory
EXPORTING
i_chabasnm = 'myprogram'
IMPORTING
e_r_mdmt = lr_mdmt
EXCEPTIONS
invalid_iobjnm = 1
OTHERS = 2.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
**Lock the Infoobject to update
CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
EXPORTING
i_objnm = wf_object
i_scope = '1'
i_msgty = rs_c_error
EXCEPTIONS
foreign_lock = 1
sys_failure = 2.
IF sy-subrc = 1.
MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
EXIT.
ELSEIF sy-subrc = 2.
MESSAGE i108(zddd_rr) WITH wf_object.
EXIT.
ENDIF.
*****Update Master Table
IF ztable_data IS NOT INITIAL.
CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
EXPORTING
i_iobjnm = 'myprogram'
i_tabclass = 'M'
I_T_ATTR = lt_attr
TABLES
i_t_table = ztable_data
EXCEPTIONS
attribute_name_error = 1
iobj_not_found = 2
generate_program_error = 3
OTHERS = 4.
IF sy-subrc <> 0.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'E'
txtnr = '054'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
MESSAGE e054(zddd_rr) WITH 'myprogram'.
ELSE.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'S'
txtnr = '053'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
ENDIF.
*endif.
*****update Text Table
IF ztable_text IS NOT INITIAL.
CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
EXPORTING
i_iobjnm = 'myprogram'
i_tabclass = 'T'
TABLES
i_t_table = ztable_text
EXCEPTIONS
attribute_name_error = 1
iobj_not_found = 2
generate_program_error = 3
OTHERS = 4.
IF sy-subrc <> 0.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'E'
txtnr = '055'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
ENDIF.
ENDIF.
ELSE.
MESSAGE s178(zddd_rr).
ENDIF.
ENDIF.
COMMIT WORK.
CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
EXPORTING
i_chabasnm = 'myprogram'
IMPORTING
e_chktab = wf_tablnm
EXCEPTIONS
name_error = 1.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
****Release locks on Infoobject
CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
EXPORTING
i_objnm = 'myprogram'
i_scope = '1'.
ENDIF.
ENDIF.
PERFORM data_selection .
PERFORM update_alv_grid_display.
CALL FUNCTION 'MESSAGES_SHOW'.
ENDFORM.
Can you please let me know how I can find out the duplicate record.
You need to split each record from the flat file structure into your internal table and then use DELETE ADJACENT DUPLICATES, comparing the key fields:
SPLIT flat_str AT tab_space INTO wa_f1 wa_f2 wa_f3. -
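Both requirements above can be sketched in a few lines. This is only an illustrative fragment, not code from the original program: the row type and field names are my own, assuming Field1 is the primary key and row 1 of the file is the header.

```abap
* Illustrative row type matching the flat file described above.
TYPES: BEGIN OF ty_row,
         field1 TYPE c LENGTH 2,   " primary key
         field2 TYPE c LENGTH 10,
       END OF ty_row.

DATA lt_rows TYPE STANDARD TABLE OF ty_row.

* ... lt_rows filled via GUI_UPLOAD ...

* Requirement 2: skip the header row by dropping row 1 of the table.
DELETE lt_rows INDEX 1.

* Requirement 1: DELETE ADJACENT DUPLICATES only compares neighbouring
* rows, so sort on the key first to catch all duplicates.
SORT lt_rows BY field1.
DELETE ADJACENT DUPLICATES FROM lt_rows COMPARING field1.
```

After this, `lt_rows` contains one row per Field1 value (the first occurrence after sorting), which can then be passed on to the update logic.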
36 duplicate record found. -- error while loading master data
Hello BW Experts,
Error while loading master data
(green light) Update PSA (50000 records posted): no errors
(green light) Transfer rules (50000 -> 50000 records): no errors
(green light) Update rules (50000 -> 50000 records): no errors
(green light) Update (0 new / 50000 changed): no errors
(red light) Processing end: errors occurred
Processing 2 finished
> 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
> 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
This error repeats with all the data-packets.
what could be the reason of the error. how to correct the error.
Any suggestions appreciated.
Thanks,
BWer
BWer,
We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
The workaround we have been using is a 'manual push' from the Details tab of the monitor. Oddly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
Did you figure out a solution for your issue, if so please let me know and likewise I will.
-Venkat -
Duplicate records found while loading master data(very urgent)
Hi all,
One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found; 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY table.
can anyone give me the solution...its very urgent...
Thanks & Regards,
Manjula
Hi,
You can see the checkbox on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
Help says that:
To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
Hope it clears your doubt; otherwise let me know.
Regards
Kiran -
Ignore duplicate records for master data attributes
dear experts ,
How and where can I enable "ignore duplicate records" when running my DTP to load data
to master data attributes?
Hi Raj,
Suppose you are loading master data to an InfoObject, and in the PSA you have more than one record per key.
Let's assume you are loading some attributes of Document Number (0DOC_NUMBER), and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key, which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
This issue can easily be avoided by selecting "Handle Duplicate Record Keys" in the DTP. You will find this option under the Update tab of the DTP.
Regards
Anindya -
Hi gurus
We created a text datasource in R/3 and replicated it into BW 7.0
An infopackage (loading to PSA) and DataTransferProcess was created and included in a process chain.
The job failed because of duplicate records.
We now discovered that the setting of the "Delivery of Duplicate Records" for this DataSource in BW is set to "Undefined".
When creating the DataSource in R/3, there were no settings for the "Delivery of duplicate records".
In BW, I've tried to change the setting of "Delivery of Duplicate Data Records" to NONE, but when I go into change mode, the "Delivery of duplicate" field is not changeable.
Does anyone have any suggestion on how to solve this problem?
Thanks,
@nne Therese
Hi Muraly,
I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta, with the option "valid records update, no reporting (request record)".
It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried a full-update load, as it is texts; it failed again. When I analysed the error, it said duplicate records. So I changed the DTP by checking the option Handle Duplicate Records and loaded with full update. It worked fine: more than 50000 records were transferred, and the added records matched the PSA request exactly.
I reset the DTP back to delta and loaded today, but the transferred records are 14000 and the added records (3000) are the same as the PSA request. Looking at the load history, the numbers of transferred and added records in the InfoObject used to match the PSA request every day.
Why is there a difference now? In production I have no issues. Since I changed the DTP, will transporting it to production make any difference? This is my first time working with BI 7.0.
Please advise, and correct me if I am wrong.
Thanks,
Sudha.. -
Duplicate records In Master Data
Hi,
I don't understand why we get duplicate records in master data even though it has overwrite functionality.
Any idea will be appreciated.
Hi,
Solution: if the load to master data fails due to duplicate records,
go to the Monitor screen --> Details tab --> under Processing, find the duplicate record --> in the context menu of the error record, select 'Manual update'.
After the above step is done, trigger the attribute change run for that InfoObject.
This should solve your problem.
if there is any problem in the reporting, select the data using filter option on the master data.
Regards,
Vijay. -
Delta in Duplicate records.
Hi Gurus,
Daily we are uploading CRM data through process chain.
Three or four times a week the chain fails due to duplicate records (24, 34, or similar).
We then delete the red request in the target and load again.
Why are we getting duplicate records in delta loading?
What is the reason?
Your help is appreciated.
Thanks
Ramu
Message was edited by:
Ramu T
Hi Ramu,
Try this: check the keys of the table from which the DataSource is built, and add the corresponding InfoObjects for those key fields in the Compounding tab of the master data characteristic.
Then it checks the uniqueness.
I have done this and it worked for me.
Hope this helps
Regards
karthik -
Start routine to filter the duplicate records
Dear Experts
I have two questions regarding the start routine.
1) I have a characteristic InfoObject with transactional InfoSource. Often the 'duplicate records' error happens during the data loading. I'm trying to put a start routine in the update rule to filter out the duplicate records.
After searching the SDN forum and SAPHelp, I use the code as:
DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE COMPARING KEY1 KEY2 KEY3.
In my case, the InfoObject has 3 keys: SOURSYSTEM, /BIC/InfoObjectname, OBJVERS. My code is:
DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE COMPARING SOURSYSTEM /BIC/InfoObjectname OBJVERS.
When checking the code I got the message: E: No component exists with the name "OBJVERS". So I only included the first two keys, but the routine does not work; the duplicate error still happens. What is missing in this start routine?
2) Generally, for a start routine, do I really need to include the data declaration, ITAB or WA, SELECT statement etc.?
Do I have to use the statement below or just simply one line?
LOOP AT DATA_PACKAGE.
IF DATA_PACKAGE.....
ENDIF.
ENDLOOP.
Thanks for your help in advance, Jessica
Hello Jessica,
if it won't be possible for you to get unique data from the very beginning, there is still another way to manage this problem in a start routine.
SORT ... and DELETE ADJACENT ... must remain. In addition, build an internal table of the DATA_PACKAGE line type, but defined with STATICS instead of DATA. This itab stays alive across all data packages of one load. Fill it with the data of the transferred data packages, and delete from every new data package all records that are already in the STATICS itab. Alternatively, you could do the same with a Z- (or Y-) database table instead of the STATICS itab.
It will probably cost some performance, but better slow than wrong data.
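A rough sketch of this STATICS approach, assuming classic 3.x update-rule syntax with a header-line DATA_PACKAGE and the two key fields mentioned in the question (the field names are placeholders from the thread, not verified against a real system):

```abap
* STATICS survives across all data packages of one load, unlike DATA.
STATICS st_seen LIKE TABLE OF data_package.

* Remove duplicates within the current package.
SORT data_package BY soursystem /bic/infoobjectname.
DELETE ADJACENT DUPLICATES FROM data_package
       COMPARING soursystem /bic/infoobjectname.

* Remove records already seen in earlier packages of this load.
LOOP AT data_package.
  READ TABLE st_seen
       WITH KEY soursystem = data_package-soursystem
                /bic/infoobjectname = data_package-/bic/infoobjectname
       TRANSPORTING NO FIELDS.
  IF sy-subrc = 0.
    DELETE data_package.          " duplicate from a previous package
  ELSE.
    APPEND data_package TO st_seen.
  ENDIF.
ENDLOOP.
```

As Ernst notes, the linear READ per record costs performance; sorting st_seen and using a sorted/hashed table would speed this up, but slow is better than wrong data.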
Regards,
Ernst -
Dear All
A job loads from a BW ODS to a BW master data InfoObject and an ODS. This job always fails with the same message: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL (Z_DEAL is the InfoObject).
When I rerun it, the job finishes with no error.
Please help me solved this problem.
thanks
Phet
Hi,
What is the info object name.
Regards,
Goutam -
Hi,
I am using an ODS as the source to update the master data InfoObject with flexible update. The issue: in spite of using the option Only PSA (update subsequently in data targets) in the InfoPackage with error handling enabled, I am getting a duplicate record error. This happens only when updating through the process chain; if I run it manually, the error does not occur. Please let me know the reason.
Thanks
Hi Maneesh,
As we are loading from an ODS to an InfoObject, we don't get the option "do not update duplicate records if they exist".
First, check whether any duplicate records exist in the PSA; if so, delete them from the PSA.
Or enable error handling in the InfoPackage.
Or
check for inconsistencies for the InfoObject in RSRV; if found, repair them and load again. Check the inconsistencies for the P, X and Y tables and for the complete object as well.
Assign points if helpful.
KS