Duplicate data record
Hi, can anybody help me out? Can you tell me in what scenario we will get the duplicate data record issue, and what the solution would be?
I recently joined the company and I am new to support work.
Please provide step-by-step guidance; any help is appreciated.
Hi sk ss,
It depends; try searching our forum, you will find any number of postings on this particular issue.
Anyway:
In general it comes up in master data loads. To avoid it, we flag an option that is available at InfoPackage level, such as 'Ignore double data records'.
Sometimes it comes up at the data level as well; in that case we just rerun the attribute change run and then restart the InfoPackage.
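For illustration only (characteristic and values assumed), a 'duplicate data record' in a master data load simply means the same key arrives more than once in one data package, for example:
0COSTCENTER   DATEFROM     DATETO       attribute
0000001000    01.01.2009   31.12.9999   A
0000001000    01.01.2009   31.12.9999   B
With 'Ignore double data records' set in the InfoPackage (or 'Handle duplicate record keys' in a DTP, see further down), the load no longer terminates on such records; without it, the request fails.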
My suggestion: please do a search in our portal to get a clear picture of this issue; there are a huge number of postings on it.
Hope it's at least a little clearer!
Thanks & Regards
R M K
Similar Messages
-
DTP Error: Duplicate data record detected
Hi experts,
I have a problem loading data from a DataSource into a standard DSO.
In the DataSource there are master data attributes which have a key containing id_field.
In the end routine I perform some operations that multiply the lines in the result package and fill a new date field, which is defined in the DSO (and also in the result_package definition).
For example:
Result_package before the end routine:
Id_field   attr_a   attr_b   ...   attr_x   date_field
1          a1       b1       ...   x1
2          a2       b2       ...   x2
Result_package after the end routine:
Id_field   attr_a   attr_b   ...   attr_x   date_field
1          a1       b1       ...   x1       d1
2          a1       b1       ...   x1       d2
3          a2       b2       ...   x2       d1
4          a2       b2       ...   x2       d2
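For illustration, the kind of end routine described above might look roughly like this; it is a sketch only, not the poster's actual code, and the field name date_field and the lookup table lt_dates are assumptions:
" sketch only: expand each incoming record into one row per date from lt_dates
DATA: lt_expanded LIKE RESULT_PACKAGE,
      ls_result   LIKE LINE OF RESULT_PACKAGE,
      lt_dates    TYPE STANDARD TABLE OF d,
      lv_date     TYPE d.
" ... fill lt_dates here (assumed) ...
LOOP AT RESULT_PACKAGE INTO ls_result.
  LOOP AT lt_dates INTO lv_date.
    ls_result-date_field = lv_date.
    APPEND ls_result TO lt_expanded.
  ENDLOOP.
ENDLOOP.
RESULT_PACKAGE = lt_expanded.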
The date_field (type date) is one of the key fields in the DSO.
When I execute the DTP I get an error in the section 'Update to DataStore Object': "Duplicate data record detected"
"During loading, there was a key violation. You tried to save more than one data record with the same semantic key."
As far as I know, the result_package key contains all fields except those of type i, p and f.
In simulate mode (debugging) everything is correct and the status is green.
In the DSO the 'Unique Data Records' checkbox is unchecked.
Any ideas?
Thanks in advance.
MG
Hi,
In the end routine, try adding
DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING XXX YYY.
Here XXX and YYY are the key fields, so that the extra duplicate records are eliminated.
Or you can try
SORT itab_XXX BY field1 field2 field3 ASCENDING.
DELETE ADJACENT DUPLICATES FROM itab_XXX COMPARING field1 field2 field3.
This can be placed before you loop over your internal table (in case you are using an internal table and loops); itab_XXX is the internal table.
field1, field2 and field3 may vary depending on your requirement.
By using the above lines, you can get rid of the duplicates coming through the end routine.
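Note that DELETE ADJACENT DUPLICATES only removes records that sit next to each other, so the table should be sorted by the comparison fields first. A minimal sketch for the case above; the field names id_field and date_field are taken from the example, everything else is an assumption:
" place at the very end of the end routine, once RESULT_PACKAGE is final
SORT RESULT_PACKAGE BY id_field date_field.
DELETE ADJACENT DUPLICATES FROM RESULT_PACKAGE COMPARING id_field date_field.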
Regards
Sunil
Edited by: Sunny84 on Aug 7, 2009 1:13 PM -
Getting Duplicate data Records error while loading the Master data.
Hi All,
We are getting a 'Duplicate data records' error while loading the profit centre master data. The master data contains time-dependent attributes.
The load is a direct update, so I set the request to red and tried to reload from the PSA, but it throws the same error.
I checked in the PSA; it shows in red which records have the same profit centre.
Could anyone give us any suggestions to resolve the issue, please?
Thanks & Regards,
Raju
Hi Raju,
I assume there are no routines written in the update rules and that you are loading the data directly from R/3 (not from an ODS). If that is the case, it could be that the data maintained in R/3 has overlapping time intervals (since time dependency of attributes is involved). Check your PSA to see whether the same profit center has time intervals that overlap. In that case, you need to get this fixed in R/3. If there are no overlapping time intervals, you can simply increase the error tolerance limit in your InfoPackage and repeat the load.
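For illustration only (values assumed), overlapping validity intervals for one profit centre in the source would look like this; the overlap is what produces the duplicate record error for time-dependent attributes:
Profit centre   DATEFROM     DATETO
0000001000      01.01.2009   31.12.2009
0000001000      01.06.2009   31.12.9999   <- overlaps the first interval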
Hope this helps you.
Thanks & Regards,
Nithin Reddy. -
How to avoid 'duplicate data record' error message when loading master data
Dear Experts
We have a custom extractor on table CSKS called ZCOSTCENTER_ATTR. The settings of this DataSource are the same as those of 0COSTCENTER_ATTR. The problem is that when loading to BW, the validity (DATEFROM and DATETO) does not seem to be taken into account. If a cost center has several entries with different validity, I get this duplicate data record error. There is no error when loading 0COSTCENTER_ATTR.
Enhancing 0COSTCENTER_ATTR so that we have one DataSource instead of two is not an option.
I know that you can set 'Ignore duplicates' in the InfoPackage, but that is not a nice solution; 0COSTCENTER_ATTR can run without it!
Is there a trick to tell the system that the date fields are also part of the key?
Thank you for your help
Peter
Alessandro - ZCOSTCENTER_ATTR is loading 0COSTCENTER, just like 0COSTCENTER_ATTR.
Siggi - I don't have the error message described in the note.
"There are duplicates of the data record 2 & with the key 'NO010000122077 &' for characteristic 0COSTCENTER &."
In the PSA the records are marked red with the same message (message no. 191).
As you can see, the key does not contain the date from which the record is valid. How do I add it? How does it work for 0COSTCENTER_ATTR with the same records? Is this done on the R/3 side or the BW side?
Thanks
Peter -
How to rectify the error message "duplicate data records found"
Hi,
How do I rectify the error "duplicate data records found" when there is no PSA?
Also, please give me a brief description of RSRV.
Thanks in advance,
Ravi Alakunlta
Hi Ravi,
In the InfoPackage screen, on the Processing tab, check the option 'Do not allow duplicate records'.
RSRV is used for repair and analysis purposes.
If you find duplicate records in the F fact table, compress the cube; the duplicate records will then be summarized in the cube.
Hope this helps. -
Duplicate data records occurred while loading
Hi Experts,
While loading, duplicate data records occurred. Earlier there were fewer records and I used to delete the duplicates in the PSA in the background. Now there are more records and I am not sure which to delete and which to keep. It is a flat file load with delta update mode. In the InfoPackage, 'update subsequently in data targets' is shown but hidden. I went through the process chain, displayed the variant, set the status to red, and selected 'update subsequently in data targets' and 'ignore duplicate data records'; now I want to trigger the subsequent process. From the process monitor I can use RSPC_PROCESS_FINISH. If I go via the display variant, what is the procedure? Is there a function module for this?
Select the checkbox 'Handle duplicate record keys' in the DTP on the Update tab.
Thanks....
Shambhu -
Hello
An InfoObject is updated from an ODS. The system generates an error: duplicate data records.
This is because the keys of the ODS do not match the keys of the InfoObject.
I could build another ODS to aggregate the data before loading it into the InfoObject.
Are there any ways to do this in a start/end routine?
Thanks
There is a 'Handle Duplicate Record Keys' option in the DTP:
Indicator: Handling Duplicate Data Records
Use
If this indicator is set, duplicate data records (that is, records with the same key) are handled during an update in the order in which they occur in a data package.
For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
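A small worked example of that rule (characteristic and values assumed): if one data package contains
0CUSTOMER    CITY
0000100001   LONDON
0000100001   PARIS
then with the indicator set, the second record wins and CITY is updated to PARIS for that customer; without the indicator, the request fails with the duplicate data record error.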
Issue solved
Thanks a lot ! -
Duplicate Data Records indicator / Handle Duplicate Record Keys option
Hi All,
I am getting the data twice, in two requests. How can I delete the extra data using the 'Duplicate Data Records' indicator?
I am not able to see this option in the PSA, nor the 'Handle duplicate record keys' option in the DTP.
Can you help me find the option in the PSA/DTP?
Regards
Amit Srivastava
What Arvind said is correct.
But you can try this out in an end routine; that may work, though I am not sure, because there you will be dealing with the entire result_package.
Also, if the target you are talking about is a DSO, you can DELETE ADJACENT DUPLICATES in the start routine while updating it into your next target, which can be a cube, for example.
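For illustration, a minimal start routine sketch; SOURCE_PACKAGE is the standard changing parameter of a transformation start routine, and the key field names are assumptions:
" keep only one record per assumed semantic key before further processing
SORT SOURCE_PACKAGE BY keyfield1 keyfield2.
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE COMPARING keyfield1 keyfield2.
-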
Duplicate data records through DTP
Hi Guys,
I am loading duplicate data records to customer master data.
The data is correct up to PSA level.
Now, when I load it from the PSA to the customer master through a DTP and, in the DTP on the Update tab, select the checkbox for duplicate data records, a message appears at the bottom saying ENTER VALID VALUE.
After this message I am unable to click any function, and the same message repeats again and again.
So please give me a solution so that this message no longer appears and I can execute the DTP.
Thanks.
Saurabh Jain
Hi,
If you get duplicate data for your customer, there might be something wrong with your DataSource or with the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit or create the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
regards
Siggi -
Duplicate data records through DTP for attribute
Hi Guys,
I am loading data to customer master data, but it contains a large volume of duplicate data.
I have to load both attribute and text data.
The data is correct up to PSA level, and the text data loads successfully.
When I load the attribute data to the customer master, it fails due to duplicate data records.
So in the DTP, on the Update tab, I select the checkbox for duplicate data records.
As soon as I select this checkbox, a message appears at the bottom saying ENTER VALID VALUE.
After this message I am unable to click any function, and the same message repeats again and again.
So I am unable to execute the DTP.
So please give me a solution so that this message no longer appears and I can execute the DTP.
Thanks.
Saurabh Jain
Hi,
If you get duplicate data for your customer, there might be something wrong with your DataSource or with the data in the PSA. But anyway, leave the DTP by restarting RSA1. Edit or create the DTP again and press Save immediately after entering edit mode. Leave the DTP again and start editing it. That should do the trick.
regards
Siggi -
Getting duplicate data records for master data
Hi All,
When running the process chain for the master data, I am getting duplicate data records. So at InfoPackage level, under Processing, I selected the option 'Update PSA and subsequently into data targets' and, alternatively, the option 'Ignore double data records'. But the load still failed with the error message 'Duplicate data records'. After rescheduling the InfoPackage I did not get the error message the next time.
Can anyone help to resolve this issue?
Regards
KK
Yes, for the first option you can write a routine. What is your data target? If it is a cube, there may be a chance of duplicate records because of its additive nature; if it is an ODS, you can avoid this, because only the delta is updated.
Regarding the time-dependent attributes, this is based on the date field; there are four types of slowly changing dimensions.
Check the following links:
http://help.sap.com/bp_biv135/documentation/Multi-dimensional_modeling_EN.doc
http://www.intelligententerprise.com/info_centers/data_warehousing/showArticle.jhtml?articleID=59301280&pgno=1
http://help.sap.com/saphelp_nw04/helpdata/en/dd/f470375fbf307ee10000009b38f8cf/frameset.htm -
Loading ODS - Data record exists in duplicate within loaded data
BI Experts,
I am attempting to load an ODS with the 'Unique Data Records' flag turned on. The flat file I am loading is a crosswalk with four fields; the first three fields are being used as key fields in order to make the records unique. I have had this issue before, but gave up in frustration and added an ascending record-count field simply to create a unique key. This time I would like to solve the problem if possible.
The errors come back referring to two data rows that are duplicates:
Data record 1 - Request / Data package / Data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 339
Data record 2 - Request / data package / data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 338
Here are the two records that the error message refers to:
3 338 3902301480 19C* * J1JD
3 339 3902301510 19C* * J1Q5
As you can see, the combinations of my three key fields, (3902301480, 19C*, *) and (3902301510, 19C*, *), should not be creating a duplicate.
Is there something off with the numbering of the data records? Am I looking in the wrong place? I have examined my flat file and cannot find duplicates, and the records that BW says are duplicates are not. I am really having a hard time with this; any and all help greatly appreciated!
Thank you for the response, Sabuj.
I was about to answer your questions but I wanted to try one more thing, and it actually worked. I simply moved the MOST unique Key Field to the TOP of my Key Field list. It was at the bottom before.
FYI for other people with this issue -
Apparently the ORDER of your key fields is important when trying to avoid creating duplicate records.
I am using four data fields, and was using three of them as key fields. No combination of all three would have a duplicate; however, when BW finds that the first two key fields match, it sometimes apparently does not consider the third one, which would make the row unique. By simply changing the order of my key fields I was able to stop getting the duplicate row errors.
Lesson - If you KNOW that your records are unique, and you are STILL getting errors for duplicates, try changing the ORDER of your key fields. -
Identifying duplicate master data records using the MDM Import Manager
Hi all,
I read the topic "How to identify duplicate master data records using the MDM Import Manager".
I tried to create import maps and to set rules, but when I import them, a new vendor record is created for each rule with the rest of the fields blank.
When I import vendor data, all three fields, i.e. match rate, match type and match group, are blank.
My question is:
I am getting vendor data from SAP R/3.
In which source (the lookup XML file or the data XML file) do I have to include these three fields, and how will all the rules be reflected in the repository?
Hi Sheetal,
Here we go. When you import any data (vendor master), please follow these steps:
1. First of all, apply the map to the source data.
2. In the Match Records tab there are three possibilities:
   a. [Remote Key]: checks the current source record against the repository along with all the fields; this is the default.
   b. Remove [Remote Key] by double-clicking it, and choose any single field such as vendor number or name; the current record will then be matched against the repository based on that field.
   c. Instead of a single field you can also choose a combination.
3. Based on the match results, the match class will be set automatically:
   a. None
   b. Single
   c. Multiple
4. Then the match type:
   a. Exact: all the individual value matches are Equal.
   b. Partial: at least one value match is Equal and at least one is Undefined; no value matches are Not Equal.
   c. Conflict: at least one value match is Equal and at least one value match is Not Equal.
5. Then check the import status and execute the import.
Hope this helps you.
cheers
Alexander
-
Check duplicate data entry in a multi-record block on a mandatory field
Dear all,
I have a situation where I have to check for duplicate data entry (on a particular field which is mandatory, i.e. it cannot be skipped by the user without entering a value) while keying data into a multi-record block.
The logic I have used for reference is:
1. In a When-Validate-Record trigger of that block I assign the value of the current item to a table-type (collection) variable. This trigger fires every time I leave a record, so it keeps collecting the current values.
2. In a When-Validate-Item trigger on the corresponding item (i.e. at item level) I compare the value of the current item with the values stored in the table-type variable from the When-Validate-Record trigger. If the current item value matches any stored value, I show a 'Duplicate Record' message followed by RAISE FORM_TRIGGER_FAILURE.
This code works fine for checking duplicate values of that multi-record field.
The problem is that if the user enters a value in the field, goes to the next field, enters a value there and then presses the 'Enter Query' icon, both validate triggers fire. As a result the When-Validate-Record trigger fires first and stores the value, and then the When-Validate-Item trigger fires, so it shows the duplicate record message.
Please give me a meaningful logic or code for solving this problem.
Any other logic to solve this problem is also welcome.
@Ammad Ahmed,
First of all, thanks. Your logic worked, but I still have a small problem.
Now the requirement is a master-detail form where both master and detail are multi-record, and the detail cannot have duplicate records, such as:
MASTER:
A code
A1
A2
DETAIL:
D code
d1
d2 <- valid: for master A1, the detail values d1 and d2 are not duplicates
d2 <- invalid: for master A1, the detail values d2 and d2 are duplicates
Validation rule: the A code / D code combination is unique. The system must stop users from entering a duplicate D code for an A code, and an appropriate error message must be displayed.
Actually I am facing a typical problem. I applied the same logic in the detail section, and it works fine when I am inserting new records. The problem starts when I query: after the query, say two previously saved records are displayed in the block. Now if I insert a new record with exactly the same value as one already present on the screen (i.e. a value populated by the query), it is not reported as a duplicate. Could you tell me the reason and help me out? It is urgent, please.
Edited by: sushovan on Nov 22, 2010 4:34 AM
Edited by: sushovan on Nov 22, 2010 4:36 AM
Edited by: sushovan on Nov 22, 2010 8:58 AM -
Check duplicate data during data key-in in a multi-record block
Dear all,
I have a situation where I have to check for duplicate data entry (on a particular field which is mandatory, i.e. it cannot be skipped by the user without entering a value) while keying data into a multi-record block.
The logic I have used for reference is:
1. In a When-Validate-Record trigger of that block I assign the value of the current item to a table-type (collection) variable. This trigger fires every time I leave a record, so it keeps collecting the current values.
2. In a When-Validate-Item trigger on the corresponding item (i.e. at item level) I compare the value of the current item with the values stored in the table-type variable from the When-Validate-Record trigger. If the current item value matches any stored value, I show a 'Duplicate Record' message followed by RAISE FORM_TRIGGER_FAILURE.
This code works fine for checking duplicate values of that multi-record field.
The problem here is that if the user gets the 'Duplicate Record' message and then, without saving the values, tries to query on that block, the When-Validate-Item trigger fires again, whereas I am expecting the default Oracle alert ('Do you want to save?'). I want to prevent this When-Validate-Item trigger from firing while the user is trying to query.
Please give me a meaningful logic or code for solving this problem.
Any other logic to solve this problem is also welcome.
> When-Validate-Record trigger
> When-Validate-Item trigger
That smells like Oracle Forms...
And the Oracle Forms forum is over here: Forms