Duplicate records update
Hi,
We have a primary key on the source table and key columns without a constraint on the target table. We apply a transformation that converts the schema name on the source side during capture.
We have set allow_duplicate_rows=Y on the apply side. Inserting duplicate rows works fine, but when an update comes through it fails with the following errors:
ORA-01422: exact fetch returns more than requested number of rows
ORA-01403: no data found
Could you please shed some light on how to fix this issue?
As an alternate solution, removing the duplicates and executing the Streams error transaction works, but then the allow_duplicate_rows functionality is not fulfilled.
Thanks
Bala
Source 10.2.0.4.3
Target 10.2.0.3
We are creating a reporting instance where duplicates are allowed. Inserts of duplicate records succeed, but updates fail with the following error:
----Error in Message: 1
----Error Number: 1422
----Message Text: ORA-01422: exact fetch returns more than requested number of rows
ORA-01403: no data found
--message: 1
type name: SYS.LCR$_ROW_RECORD
source database: PROD2
owner: REPORT_TWO
object: DEVICE
is tag null: Y
command_type: UPDATE
Supplemental logging is enabled for both source and target database.
Thanks
Bala
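For readers who hit the same pair of errors: ORA-01422 is what a single-row (SELECT ... INTO style) lookup raises when the key matches several rows, which is exactly the situation allow_duplicate_rows creates for updates. A minimal sketch of that failure mode, using Python and sqlite3 with hypothetical table and column names (not the Streams internals):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE device (id INTEGER, name TEXT)")  # no key constraint
conn.executemany("INSERT INTO device VALUES (?, ?)",
                 [(1, "router"), (1, "router"), (2, "switch")])  # duplicates allowed

def fetch_exactly_one(key):
    """Mimic an exact fetch: demand exactly one row for the key."""
    rows = conn.execute("SELECT name FROM device WHERE id = ?", (key,)).fetchall()
    if not rows:
        raise LookupError("no data found")  # cf. ORA-01403
    if len(rows) > 1:
        # cf. ORA-01422
        raise LookupError("exact fetch returns more than requested number of rows")
    return rows[0][0]

print(fetch_exactly_one(2))  # unique key: works
try:
    fetch_exactly_one(1)     # duplicated key: fails like the apply process
except LookupError as exc:
    print(exc)
```

An insert never needs such a lookup, which is why duplicate inserts succeed while updates error out.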
Similar Messages
-
Importing and Updating Non-Duplicate Records from 2 Tables
I need some help with the code to import data from one table
into another if it is not a duplicate or if a record has changed.
I have 2 tables, Members and NetNews. I want to check NetNews
and import non-duplicate records from Members into NetNews and
update an email address in NetNews if it has changed in Members. I
figured it could be as simple as checking Members.MemberNumber and
Members.Email against the existence of NetNews.Email and
NetNews.MemberNumber: if a record in NetNews does not exist,
create it, and if the email address in Members.Email has changed,
update it in NetNews.Email.
Here is what I have from all of the suggestions received from
another category last year. It is not complete, but I am stuck on
the solution. Can someone please help me get this code working?
Thanks!
<cfquery datasource="#application.dsrepl#" name="qryMember">
SELECT DISTINCT Email, FirstName, LastName, MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email <> ' '
</cfquery>
<cfquery datasource="#application.ds#" name="newsMember">
SELECT DISTINCT MemberNumber
FROM NetNews
</cfquery>
<cfloop query="qryMember">
<cfif not (listFindNoCase(valueList(newsMember.MemberNumber), qryMember.MemberNumber)
AND isNumeric(qryMember.MemberNumber))>
<cfquery datasource="#application.ds#">
INSERT INTO NetNews (Email_address, First_Name, Last_Name, MemberNumber)
VALUES ('#trim(qryMember.Email)#', '#trim(qryMember.FirstName)#',
'#trim(qryMember.LastName)#', '#trim(qryMember.MemberNumber)#')
</cfquery>
</cfif>
</cfloop>
------------------
Dan,
My DBA doesn't have the experience to help with a VIEW. Did I
mention that these are 2 separate databases on different servers?
This project is over a year old now and it really needs to get
finished so I thought the import would be the easiest way to go.
Thanks to your help, it is almost working.
I added some additional code to check for a changed email
address and update the NetNews database. It runs without error, but
I don't have a way to test it right now. Can you please look at the
code and see if it looks OK?
I am also still getting an error on line 10 after the routine
runs. The line has this code: "and membernumber not in
(<cfqueryparam list="yes"
value="#valuelist(newsmember.membernumber)#"
cfsqltype="cf_sql_integer">)", even with the cfif that Phil
suggested.
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber, Email_Address
FROM NetNewsTest
</cfquery>
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and membernumber not in (<cfqueryparam list="yes"
value="#valuelist(newsmember.membernumber)#"
cfsqltype="cf_sql_integer">)
</cfquery>
<CFIF qryMember.recordcount NEQ 0>
<cfloop query ="qryMember">
<cfquery datasource="#application.ds#"
name="newsMember">
insert into NetNewsTest (Email_address, First_Name,
Last_Name, MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')
</cfquery>
</cfloop>
</cfif>
<cfquery datasource="#application.dsrepl#" name="qryEmail">
SELECT DISTINCT Email, MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email <> ' '
</cfquery>
<CFIF qryEmail.recordcount NEQ 0>
<cfloop query="qryEmail">
<cfquery datasource="#application.ds#">
UPDATE NetNewsTest
SET Email_address = '#trim(qryEmail.Email)#'
WHERE MemberNumber = '#trim(qryEmail.MemberNumber)#'
AND Email_address <> '#trim(qryEmail.Email)#'
</cfquery>
</cfloop>
</cfif>
Thank you again for the help. -
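Setting the ColdFusion specifics aside, the sync being attempted here (insert members missing from NetNews, update the email where it changed) reduces to two set comparisons. A Python sketch with made-up member numbers and addresses:

```python
members = {                     # MemberNumber -> email from the Members source
    "1001": "a@example.com",
    "1002": "b-new@example.com",
    "1003": "c@example.com",
}
netnews = {                     # MemberNumber -> email currently in NetNews
    "1001": "a@example.com",
    "1002": "b-old@example.com",
}

# Members not yet in NetNews must be inserted.
to_insert = {m: e for m, e in members.items() if m not in netnews}
# Members whose email differs must be updated.
to_update = {m: e for m, e in members.items()
             if m in netnews and netnews[m] != e}

print(to_insert)   # {'1003': 'c@example.com'}
print(to_update)   # {'1002': 'b-new@example.com'}
```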
Duplicate record identifier and update
My records look like
Name City Duplicateindicator
SAM NYC 0
SAM NYC1 0
SAM ORD 0
TAM NYC 0
TAM NYC1 0
DAM NYC 0
For some reason numeric characters were inserted into City, which duplicated my records.
I need to:
Check for duplicate records by Name (if a name repeats), then check the City; cities with the same base name (NYC and NYC1) are considered the same city here. I am OK with doing this one city at a time.
SAM has duplicate records for NYC and NYC1; the record SAM NYC1 0 must be updated to SAM NYC1 1.
Good day tatva,
Since the city names are not exactly the same, you will need to parse the text to clean the numbers from the name. This is best done with SQLCLR using a regular expression (if this fits your need, I can post the CLR code for you).
In this case you use a simple regular-expression replace function.
On the result of the function you use a simple query with ROW_NUMBER() OVER (PARTITION BY RegularExpressionReplace(ColumnName, '[0-9]') ORDER BY ColumnName).
In that result, every row with ROW_NUMBER greater than 1 is a duplicate.
I hope this is useful :-)
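Ronen's approach, sketched in Python for anyone without SQLCLR at hand: strip the digits to get the base city (the regex-replace step), partition by name plus base city, and flag every row after the first in each partition (the ROW_NUMBER > 1 test). The sample rows are the ones from the question:

```python
import re
from itertools import groupby

rows = [("SAM", "NYC"), ("SAM", "NYC1"), ("SAM", "ORD"),
        ("TAM", "NYC"), ("TAM", "NYC1"), ("DAM", "NYC")]

def base_city(city):
    return re.sub(r"[0-9]", "", city)  # regex-replace: NYC1 -> NYC

def partition_key(row):
    return (row[0], base_city(row[1]))  # PARTITION BY name, cleaned city

flagged = []
for _, group in groupby(sorted(rows, key=partition_key), key=partition_key):
    # ORDER BY the raw city name inside each partition,
    # then mark ROW_NUMBER > 1 as a duplicate.
    for rownum, (name, city) in enumerate(sorted(group, key=lambda r: r[1]), 1):
        flagged.append((name, city, 1 if rownum > 1 else 0))

for rec in sorted(flagged):
    print(rec)
```

Exactly SAM NYC1 and TAM NYC1 come out flagged, matching the expected update in the question.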
Ronen Ariely
[Personal Site] [Blog] [Facebook] -
No 'Handle Duplicate records' in update tab
Hi,
I've a DTP from ODS/DSO to ODS/DSO and I got a duplicate record error, which I find rather strange for a standard ODS/DSO. I read in the help and within these forums that it can be fixed with the 'Handle Duplicate Records' checkbox on the update tab. Trouble is that there isn't such a checkbox in our BI7 SP15 installation.
Any suggestion on the reason for that, how to fix it, and more importantly how to get rid of the duplicate record error (and why it's occurring)?
Many thanks in advance
Eddy
Hi Eddy,
I am confused :-)
Have you tried by checking or by unchecking?
My suggestion is to try selecting the setting 'unique data records'...
Cheers
Siva -
How to create duplicate records in end routines
Hi
Key fields in DSO are:
Plant
Storage Location
MRP Area
Material
Changed Date
Data Fields:
Safety Stock
Service Level
MRP Type
Counter_1 (In flow Key figure)
Counter_2 (Out flow Key Figure)
n_ctr (Non Cumulative Key Figure)
For every record that comes in, we need to create a duplicate record. For the original record, we set Counter_1 to 1 and Counter_2 to 0. For the duplicate record, we update Changed_Date to today's date, keep the rest of the values as-is, and set Counter_1 to 0 and Counter_2 to -1. Where is the best place to write this code in the DSO? Is it the end
routine?
Please give me a basic idea of the code.
Hi Uday,
I have same situation like Suneel and have written your logic in End routine DSO as follows:
DATA: l_t_duplicate_records TYPE TABLE OF TYS_TG_1,
l_w_duplicate_record TYPE TYS_TG_1.
LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
<result_fields>-/BIC/ZPP_ICNT = 1.
<result_fields>-/BIC/ZPP_OCNT = 0.
l_w_duplicate_record-CH_ON = sy-datum.
l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
APPEND l_w_duplicate_record TO l_t_duplicate_records.
ENDLOOP.
APPEND LINES OF l_t_duplicate_records TO RESULT_PACKAGE.
I am getting the below error:
Duplicate data record detected (DS ZPP_O01 , data package: 000001 , data record: 4 ) RSODSO_UPDATE 19
I have a different requirement for the date. My requirement is to populate the CH_ON date as mentioned below:
sort the records based on the key, get the latest CH_ON value per unique Plant, Storage Location, Material combination, and populate
that CH_ON value for the duplicate record.
Please help me to resolve this issue.
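Ganga's variant of the requirement (take the latest CH_ON per Plant/Storage Location/Material and stamp it on the generated duplicate) can be sketched language-independently. This Python version assumes a simplified record layout and only shows the two passes, not the ABAP:

```python
from datetime import date

# (plant, storage_location, material, ch_on) per incoming record
records = [
    ("P1", "S1", "M1", date(2024, 1, 5)),
    ("P1", "S1", "M1", date(2024, 2, 9)),
    ("P2", "S1", "M2", date(2024, 3, 1)),
]

# Pass 1: latest CH_ON per (plant, storage location, material).
latest = {}
for plant, sloc, mat, ch_on in records:
    key = (plant, sloc, mat)
    if key not in latest or ch_on > latest[key]:
        latest[key] = ch_on

# Pass 2: emit the original (counter_1=1, counter_2=0) and the duplicate
# carrying the latest CH_ON for its key (counter_1=0, counter_2=-1).
result = []
for plant, sloc, mat, ch_on in records:
    result.append((plant, sloc, mat, ch_on, 1, 0))
    result.append((plant, sloc, mat, latest[(plant, sloc, mat)], 0, -1))
```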
Thanks,
Ganga -
How to find out duplicate record contained in a flat file
Hi Experts,
For my project I have written a program for flat file upload.
Requirement 1
In the flat file there may be some duplicate records like:
Field1 Field2
11 test1
11 test2
12 test3
13 test4
Field1 is primary key.
Can you please let me know how I can find out the duplicate records?
Requirement 2
The flat file contains a header row as shown above:
Field1 Field2
How can our program skip this record and start reading / inserting records from row no. 2, i.e.
11 test1
onwards?
Thanks
S
FORM upload1.
DATA : wf_title TYPE string,
lt_filetab TYPE filetable,
l_separator TYPE char01,
l_action TYPE i,
l_count TYPE i,
ls_filetab TYPE file_table,
wf_delemt TYPE rollname,
wa_fieldcat TYPE lvc_s_fcat,
tb_fieldcat TYPE lvc_t_fcat,
rows_read TYPE i,
p_error TYPE char01,
l_file TYPE string.
DATA: wf_object(30) TYPE c,
wf_tablnm TYPE rsdchkview.
wf_object = 'myprogram'.
DATA i TYPE i.
DATA:
lr_mdmt TYPE REF TO cl_rsdmd_mdmt,
lr_mdmtr TYPE REF TO cl_rsdmd_mdmtr,
lt_idocstate TYPE rsarr_t_idocstate,
lv_subrc TYPE sysubrc.
TYPES : BEGIN OF test_struc,
/bic/myprogram TYPE /bic/oimyprogram,
txtmd TYPE rstxtmd,
END OF test_struc.
DATA : tb_assum TYPE TABLE OF /bic/pmyprogram.
DATA: wa_ztext TYPE /bic/tmyprogram,
myprogram_temp TYPE ziott_assum,
wa_myprogram TYPE /bic/pmyprogram.
DATA : test_upload TYPE STANDARD TABLE OF test_struc,
wa2 TYPE test_struc.
DATA : wa_test_upload TYPE test_struc,
ztable_data TYPE TABLE OF /bic/pmyprogram,
ztable_text TYPE TABLE OF /bic/tmyprogram,
wa_upld_text TYPE /bic/tmyprogram,
wa_upld_data TYPE /bic/pmyprogram,
t_assum TYPE ziott_assum.
DATA : wa1 LIKE test_upload.
wf_title = text-026.
CALL METHOD cl_gui_frontend_services=>file_open_dialog
EXPORTING
window_title = wf_title
default_extension = 'txt'
file_filter = 'Tab delimited Text Files (*.txt)'
CHANGING
file_table = lt_filetab
rc = l_count
user_action = l_action
EXCEPTIONS
file_open_dialog_failed = 1
cntl_error = 2
OTHERS = 3. "#EC NOTEXT
IF sy-subrc <> 0.
EXIT.
ENDIF.
LOOP AT lt_filetab INTO ls_filetab.
l_file = ls_filetab.
ENDLOOP.
CHECK l_action = 0.
IF l_file IS INITIAL.
EXIT.
ENDIF.
l_separator = 'X'.
wa_fieldcat-fieldname = 'test'.
wa_fieldcat-dd_roll = wf_delemt.
APPEND wa_fieldcat TO tb_fieldcat.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
CLEAR wa_test_upload.
* Upload file from front-end (PC)
* File format is tab-delimited ASCII
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = l_file
has_field_separator = l_separator
TABLES
data_tab = test_upload
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
EXIT.
ELSE.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
IF test_upload IS NOT INITIAL.
DESCRIBE TABLE test_upload LINES rows_read.
CLEAR : wa_test_upload,wa_upld_data.
LOOP AT test_upload INTO wa_test_upload.
CLEAR : p_error.
rows_read = sy-tabix.
IF wa_test_upload-/bic/myprogram IS INITIAL.
p_error = 'X'.
MESSAGE s153 WITH wa_test_upload-/bic/myprogram sy-tabix.
CONTINUE.
ELSE.
TRANSLATE wa_test_upload-/bic/myprogram TO UPPER CASE.
wa_upld_text-txtmd = wa_test_upload-txtmd.
wa_upld_text-txtsh = wa_test_upload-txtmd.
wa_upld_text-langu = sy-langu.
wa_upld_data-chrt_accts = 'xyz1'.
wa_upld_data-co_area = '12'.
wa_upld_data-/bic/zxyzbcsg = 'Iy'.
wa_upld_data-objvers = 'A'.
wa_upld_data-changed = 'I'.
wa_upld_data-/bic/zass_mdl = 'rrr'.
wa_upld_data-/bic/zass_typ = 'I'.
wa_upld_data-/bic/zdriver = 'yyy'.
wa_upld_text-langu = sy-langu.
MOVE-CORRESPONDING wa_test_upload TO wa_upld_data.
MOVE-CORRESPONDING wa_test_upload TO wa_upld_text.
APPEND wa_upld_data TO ztable_data.
APPEND wa_upld_text TO ztable_text.
ENDIF.
ENDLOOP.
DELETE ADJACENT DUPLICATES FROM ztable_data.
DELETE ADJACENT DUPLICATES FROM ztable_text.
IF ztable_data IS NOT INITIAL.
CALL METHOD cl_rsdmd_mdmt=>factory
EXPORTING
i_chabasnm = 'myprogram'
IMPORTING
e_r_mdmt = lr_mdmt
EXCEPTIONS
invalid_iobjnm = 1
OTHERS = 2.
CALL FUNCTION 'MESSAGES_INITIALIZE'.
**Lock the Infoobject to update
CALL FUNCTION 'RSDG_IOBJ_ENQUEUE'
EXPORTING
i_objnm = wf_object
i_scope = '1'
i_msgty = rs_c_error
EXCEPTIONS
foreign_lock = 1
sys_failure = 2.
IF sy-subrc = 1.
MESSAGE i107(zddd_rr) WITH wf_object sy-msgv2.
EXIT.
ELSEIF sy-subrc = 2.
MESSAGE i108(zddd_rr) WITH wf_object.
EXIT.
ENDIF.
*****Update Master Table
IF ztable_data IS NOT INITIAL.
CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
EXPORTING
i_iobjnm = 'myprogram'
i_tabclass = 'M'
I_T_ATTR = lt_attr
TABLES
i_t_table = ztable_data
EXCEPTIONS
attribute_name_error = 1
iobj_not_found = 2
generate_program_error = 3
OTHERS = 4.
IF sy-subrc <> 0.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'E'
txtnr = '054'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
MESSAGE e054(zddd_rr) WITH 'myprogram'.
ELSE.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'S'
txtnr = '053'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
ENDIF.
*endif.
*****update Text Table
IF ztable_text IS NOT INITIAL.
CALL FUNCTION 'RSDMD_WRITE_ATTRIBUTES_TEXTS'
EXPORTING
i_iobjnm = 'myprogram'
i_tabclass = 'T'
TABLES
i_t_table = ztable_text
EXCEPTIONS
attribute_name_error = 1
iobj_not_found = 2
generate_program_error = 3
OTHERS = 4.
IF sy-subrc <> 0.
CALL FUNCTION 'MESSAGE_STORE'
EXPORTING
arbgb = 'zddd_rr'
msgty = 'E'
txtnr = '055'
msgv1 = text-033
EXCEPTIONS
OTHERS = 3.
ENDIF.
ENDIF.
ELSE.
MESSAGE s178(zddd_rr).
ENDIF.
ENDIF.
COMMIT WORK.
CALL FUNCTION 'RSD_CHKTAB_GET_FOR_CHA_BAS'
EXPORTING
i_chabasnm = 'myprogram'
IMPORTING
e_chktab = wf_tablnm
EXCEPTIONS
name_error = 1.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
****Release locks on Infoobject
CALL FUNCTION 'RSDG_IOBJ_DEQUEUE'
EXPORTING
i_objnm = 'myprogram'
i_scope = '1'.
ENDIF.
ENDIF.
PERFORM data_selection .
PERFORM update_alv_grid_display.
CALL FUNCTION 'MESSAGES_SHOW'.
ENDFORM.
Can you please let me know how I can find out the duplicate records?
You need to split the records from the flat file structure into your internal table and use DELETE ADJACENT DUPLICATES COMPARING the key fields:
SPLIT flat_str AT tab_space INTO wa_f1 wa_f2 wa_f3. -
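Both requirements (find duplicates on the Field1 key, and skip the header row) are easy to see in sketch form; this Python version mirrors the sample file from the question:

```python
lines = [
    "Field1\tField2",   # header row
    "11\ttest1",
    "11\ttest2",
    "12\ttest3",
    "13\ttest4",
]

seen, duplicates, clean = set(), [], []
for row in lines[1:]:                  # requirement 2: skip the header
    field1, field2 = row.split("\t")
    if field1 in seen:                 # requirement 1: Field1 is the key
        duplicates.append((field1, field2))
    else:
        seen.add(field1)
        clean.append((field1, field2))

print(duplicates)   # [('11', 'test2')]
```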
36 duplicate record found. -- error while loading master data
Hello BW Experts,
Error while loading master data
( green light )Update PSA ( 50000 Records posted ) No errors
( green light )Transfer rules ( 50000 -> 50000 Records ): No errors
( green light )Update rules ( 50000 -> 50000 Records ): No errors
( green light ) Update ( 0 new / 50000 changed ): No errors
( red light )Processing end: errors occurred
Processing 2 finished
- 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
- 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
This error repeats with all the data-packets.
What could be the reason for the error, and how can it be corrected?
Any suggestions appreciated.
Thanks,
BWer
BWer,
We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
The workaround we have been using is a 'Manual push' from the details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in Production it doesn't occur some days. We have all the necessary settings enabled in the infopackage.
Did you figure out a solution for your issue, if so please let me know and likewise I will.
-Venkat -
Duplicate records found while loading master data(very urgent)
Hi all,
One infopackage in the process chain failed while loading master data (full update). It's showing the following error: duplicate record found. 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY tables.
Can anyone give me the solution? It's very urgent.
Thanks & Regards,
Manjula
Hi,
You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
Help says that:
To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
Hope it clears your doubt; otherwise let me know.
Regards
Kiran -
Data loader : Import -- creating duplicate records ?
Hi all,
has anyone else encountered the behaviour with Oracle Data Loader where duplicate records are created (even with the option duplicatecheckoption=externalid set)? When I check the "import request queue - view", the request parameters of the job look fine ->
Duplicate Checking Method == External Unique ID
Action Taken if Duplicate Found == Overwrite Existing Records
but Data Loader has created new records where the "External Unique ID" already exists.
Very strange: when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
I know the Data Loader has two methods, one for update and the other for import; however, I do not expect the import to create duplicates if the record already exists, rather than doing nothing!
Anyone else experiencing the same? I hope that this is not expected behaviour! By the way, the "Update" method works fine.
thanks in advance, Juergen
Edited by: 791265 on 27.08.2010 07:25
Edited by: 791265 on 27.08.2010 07:26
Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
Sorry also to inform you that this is expected behavior --- Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
You should review all documentation on Oracle Data Loader On Demand before using it.
These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
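Given that Data Loader only duplicate-checks on update, a practical guard is to filter the import file against the external IDs that were already loaded before running the insert. A sketch of that pre-filter (the field name ExternalUniqueId and the sample rows are hypothetical):

```python
def filter_new_records(import_rows, known_ids, id_field="ExternalUniqueId"):
    """Keep only rows whose external ID has not been loaded before."""
    return [row for row in import_rows if row[id_field] not in known_ids]

known_ids = {"EXT-1", "EXT-2"}          # IDs already present in the target system
import_rows = [
    {"ExternalUniqueId": "EXT-2", "Name": "Acme"},    # already loaded: drop
    {"ExternalUniqueId": "EXT-3", "Name": "Globex"},  # genuinely new: keep
]
print(filter_new_records(import_rows, known_ids))
```

Running this step before the scheduled insert also covers the stale-file case described in a later thread on this page, since a re-delivered file yields an empty insert set.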
Pete -
Duplicate records in Fact Tables
Hi,
We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and Fac2 tables. We faced a similar issue before and the solution was to reboot the server and clean up the additional data created. I don't think it is an issue with the script logic files we have. We had the issue across all applications. Data is fine now after the server reboot and running the same logic files. I want to know if anyone has faced this issue and if there is any solution other than a reboot. I appreciate your help.
Thanks
Raj
Hi Sorin,
I know this thread is rather old, but I have a problem which is pretty much related to it and would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
It is a heavily customized system with many batch files running daily to update dimensions, copy data and sort. And yes, we do use custom packages that incorporate stored procedures.
Recently, with no change in the environment, our FactWB ballooned up out of nowhere. The fact table only contains less than 1 GB of data, but FactWB has 200 GB and has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
We are not able to find out what caused this, or whether the 200 GB of records in WB are even valid records that are duplicated. Is there a way to troubleshoot this? -
Duplicate records in Cube Level ( BI 7.0 )
Dear All
I am working on BI 7.0. I have an issue: I am loading data from a flat file to an ODS and from the ODS to a cube. In the ODS we selected the Overwrite option; at cube level we have a Summation option. While loading from the flat file to the ODS the records are fine, and the load from ODS to cube also runs fine, but in the cube I am getting duplicate records.
What are the best options to go ahead with in such a situation?
Regards
KK
I am sharing a case that occurred for me. Please see if it's applicable to you.
Sometimes, in the step loading to the cube, when any type of problem occurs, we restart the load. If the cube load prompts 'the last load was unsuccessful.... reload?', this problem may occur: it will load the records from the previous load as well.
Verify what has been duplicated from the ODS changelog table and the cube load record count. Check whether the number of records updated is the total of the records in the different ODSR requests (in the change log table). If so, delete the previous load in the cube (provided no other side effect is produced, e.g. from a start routine etc.).
Cheers. -
Data Loader inserting duplicate records
Hi,
There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning. The script should perform an insert operation.
Every morning a file with new insert data is available in the same location (generated by someone else) and with the same name. The Data Loader script must insert all records in it.
One morning, there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I came to know that the option works in the case of update operations only.
How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow specify the field as 'unique' in the UI so that there is an error if a duplicate record is inserted? Please suggest.
Regards,
Hi,
You can use something like this:
CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
Now you can insert all the records present in this cursor.
Assumption: you do not have duplicate entries in the dept table initially.
Cheers
Sudhir -
Ignore duplicate records for master data attributes
dear experts ,
how & where can I enable "ignore duplicate records" when I am running my DTP to load data
to master data attributes?
Hi Raj,
Suppose you are loading master data to an InfoObject and in the PSA you have more than one record per key.
Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key, which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
This issue can easily be avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option under the Update tab of the DTP.
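The last-record-wins behaviour this option implements can be pictured as a keyed overwrite; a small Python sketch with a hypothetical attribute:

```python
# Records as they arrive in one request; doc_number is the key.
request = [
    {"doc_number": "100", "status": "open"},
    {"doc_number": "200", "status": "open"},
    {"doc_number": "100", "status": "closed"},  # same key, later in the request
]

# 'Handle Duplicate Records': the last record per key is the one kept.
by_key = {}
for rec in request:
    by_key[rec["doc_number"]] = rec  # later records overwrite earlier ones

print(by_key["100"])  # {'doc_number': '100', 'status': 'closed'}
```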
Regards
Anindya -
Master data failed with error 'too many duplicate records'
Dear all
Below is the error message:
Data records for package 1 selected in PSA -
error 4 in the update
The long text is given below:
Error 4 in the update
Message no. RSAR119
Diagnosis
The update delivered the error code 4.
Procedure
You can find further information on this error in the error message of the update.
Working on BI 7.0.
Any solutions?
Thanks
satish .a
Hi,
Go through these threads, they have same issue:
Master data load: Duplicate Records
Re: Master data info object - duplicate records
Re: duplicate records in master data info object
Regards
Raj Rai -
Hi All,
I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully. But when I try to import the flat file into my application, I am getting a lot of duplicate record errors, and these are my questions:
1. Will we get duplicate records in transaction files?
2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than flagging an error and rejecting the records?
3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it is only for account, category and entity.)
5. In my case I see identical values in all my dimensions and the $value is the only difference. Why is it not summing up?
Your quickest reply is much appreciated.
Thanks,
Alex.
Hi,
I have the same problem.
In my case the file that I want to upload has different rows that differ in the nature column. In the conversion file I map different natures to one internal nature.
E.g.: cost1 --> cost
cost2 --> cost
cost3 --> cost
My desire was that in BPC the nature cost would assume the result cost = cost1 + cost2 + cost3.
The result is that only the first record is uploaded and all other records are rejected as duplicates.
Any suggestion?
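What the poster expects (cost = cost1 + cost2 + cost3 once the conversion file collapses the natures) amounts to pre-aggregating the file before import. A Python sketch of that workaround, using the mapping from the post and made-up values:

```python
from collections import defaultdict

conversion = {"cost1": "cost", "cost2": "cost", "cost3": "cost"}
rows = [("cost1", 100.0), ("cost2", 50.0), ("cost3", 25.0)]

# Sum per internal nature so the import sees one record per key
# instead of three "duplicates".
totals = defaultdict(float)
for nature, value in rows:
    totals[conversion.get(nature, nature)] += value

print(dict(totals))  # {'cost': 175.0}
```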