Duplicate records to DSO
Hello Friends,
We have an issue with duplicate records in a DSO. Let me explain the scenario:
the header and item data are loaded to separate DSOs,
and the data from these 2 DSOs should get merged in a third one.
The key fields are:
DSO 1 : DWRECID, 0AC_DOC_NO
DSO 2 : DWRECID , DWPOSNR
DSO 3 will fetch data from the above 2 DSOs.
Its key fields are:
DWTSLO,
DWRECID,
DWEDAT,
0AC_DOC_NO,
DWPOSNR,
0CUSTOMER
Now the data should be merged into a single record in the 3rd DSO.
DSO 1 does not have the DWPOSNR object, not even in its data fields.
We also have a start routine in the load from DSO 1 to populate some values in the result fields from DSO 2.
Please share any inputs you have on merging the data record-wise,
and also give me all the possibilities or options we have to overwrite the data, apart from mappings.
Hi,
You should go for creating an InfoSet instead of creating a third DSO.
In the InfoSet, provide the keys of the DSOs, and the common records with those keys will be merged in the InfoSet.
Hope It Helps.
Regards
Praeon
Similar Messages
-
Write Optimized DSO Duplicate records
Hi,
We are facing a problem while doing a delta load to a write-optimized DataStore object.
It gives the error "Duplicate data record detected (DS <ODS name>, data package: 000001, data record: 294)".
But it cannot have a duplicate record, since the data comes from a DSO, and
we have also checked the particular record in the PSA and couldn't find a duplicate there.
There is no very complex routine either.
Has anyone ever faced this issue and found a solution? Please let me know.
Thanks
VJ
Ravi,
We have checked that there is no duplicate records in PSA.
Also, the source ODS has two keys and the target ODS has three keys.
Also, the records it mentioned have record mode "N" (new).
It seems to be an issue with the write-optimized DSO.
Regards
VJ -
DB Connect DataSource PSA records and DSO records are not matching...
Dear All,
I'm working with SAP NetWeaver BW 7.3 and, for the first time, I have loaded from a DB Connect source system. I created a DataSource, pulled all records, and found 8,136,559 records in the PSA. When I designed and created a DSO with key fields 0CALDAY, Item No and Company Code, it transferred about 8,136,559 records but added only about 12,534. Similarly, the following InfoCube has about 12,534 records in its fact table. When I tried to reconcile the data/records with the source DBMS for a month, the records/data could not be matched.
1. What could be the reason behind the issue? why I am unable to load the records/data correctly?
2. Have I not mentioned the Key Fields of DSO in a correct manner?
3. Is it possible to load the records/data into DSO without giving any field as Key Fields?
4. How should I resolve this issue?
5. Could it be due to the DSO overwrite and summation functions? If yes, how do I utilize them?
Many thanks,
Tariq Ashraf
Dear Tariq,
1. What could be the reason behind the issue? why I am unable to load the records/data correctly?
Ans: Check the transformation once. Is there any start routine you have used, or direct assignments? What DTP settings have you made?
Check the messages in the DTP monitor; you will surely find some clue. Also check whether any duplicate records are being detected and whether you are using semantic keys in your DTP.
2. Have I not mentioned the Key Fields of DSO in a correct manner?
Ans: Are the transformation key and the DSO key the same in your case?
What kind of DSO is it? For a sales order DSO, for example, you take the order number as a key field. So you have to define the key fields according to the business semantics, I suppose. Do you agree?
3. Is it possible to load the records/data into DSO without giving any field as Key Fields?
Ans: I don't think so, as the keys you define are what keep the data records unique, isn't it?
4. How should I resolve this issue?
Ans: Please check the points in Ans 1 and share your observations.
5. Could it be due to the DSO overwrite and summation functions? If yes, how do I utilize them?
Ans: DSO overwriting of key figures is useful when you have full loads in the picture. Are you always going to perform full loads?
For reference, would you like to check this thread: Data fields and key fields in DSO
Let's see what inputs the experts give.
Thank You... -
How to create duplicate records in end routines
Hi
Key fields in DSO are:
Plant
Storage Location
MRP Area
Material
Changed Date
Data Fields:
Safety Stock
Service Level
MRP Type
Counter_1 (In flow Key figure)
Counter_2 (Out flow Key Figure)
n_ctr (Non Cumulative Key Figure)
For every record that comes in, we need to create a duplicate record. For the original record, we need to set Counter_1 to 1 and Counter_2 to 0. For the duplicate record, we need to update Changed Date to today's date, set Counter_1 to 0 and Counter_2 to -1, and leave the rest of the values as they are. Where is the best place to write this code in the DSO? Is it the end
routine?
Please let me know some basic idea of the code.
Hi Uday,
I have the same situation as Suneel and have written your logic in the DSO end routine as follows:
DATA: l_t_duplicate_records TYPE TABLE OF TYS_TG_1,
      l_w_duplicate_record  TYPE TYS_TG_1.

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  " Copy the row before marking the original as the inflow record
  MOVE-CORRESPONDING <result_fields> TO l_w_duplicate_record.
  " Original record: inflow counter = 1, outflow counter = 0
  <result_fields>-/BIC/ZPP_ICNT = 1.
  <result_fields>-/BIC/ZPP_OCNT = 0.
  " Duplicate record: today's date, inflow = 0, outflow = -1
  l_w_duplicate_record-CH_ON = sy-datum.
  l_w_duplicate_record-/BIC/ZPP_ICNT = 0.
  l_w_duplicate_record-/BIC/ZPP_OCNT = -1.
  APPEND l_w_duplicate_record TO l_t_duplicate_records.
ENDLOOP.

APPEND LINES OF l_t_duplicate_records TO RESULT_PACKAGE.
I am getting below error:
Duplicate data record detected (DS ZPP_O01 , data package: 000001 , data record: 4 ) RSODSO_UPDATE 19
I have a different requirement for the date. My requirement is to populate the CH_ON date as follows:
sort the records based on the key, get the latest CH_ON value for each unique Plant/SLoc/Material combination, and populate
that CH_ON value in the duplicate record.
Please help me to resolve this issue.
Thanks,
Ganga -
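For Ganga's variant of the requirement (latest CH_ON per Plant/Storage Location/Material), one possible end-routine sketch building on the code above is given here. This is only a sketch: the characteristic field names PLANT, STOR_LOC and MATERIAL in TYS_TG_1 are assumptions based on the thread, not confirmed.

```abap
* Sketch only: stamp each generated duplicate row with the latest CH_ON
* found for its Plant / Storage Location / Material combination.
* PLANT, STOR_LOC, MATERIAL are assumed field names.
DATA: l_t_latest TYPE STANDARD TABLE OF tys_tg_1,
      l_w_latest TYPE tys_tg_1.
FIELD-SYMBOLS: <l_w_dup> TYPE tys_tg_1.

* Reduce the package to one row per combination, keeping the newest CH_ON
l_t_latest[] = RESULT_PACKAGE[].
SORT l_t_latest BY plant stor_loc material ch_on DESCENDING.
DELETE ADJACENT DUPLICATES FROM l_t_latest
       COMPARING plant stor_loc material.

* Apply that latest date to the duplicate rows collected earlier
LOOP AT l_t_duplicate_records ASSIGNING <l_w_dup>.
  READ TABLE l_t_latest INTO l_w_latest
       WITH KEY plant    = <l_w_dup>-plant
                stor_loc = <l_w_dup>-stor_loc
                material = <l_w_dup>-material.
  IF sy-subrc = 0.
    <l_w_dup>-ch_on = l_w_latest-ch_on.
  ENDIF.
ENDLOOP.
```

Note that if the stamped CH_ON collides with an existing key combination in the package, the RSODSO_UPDATE 19 duplicate error can still occur, so the duplicate rows may additionally need a key field that distinguishes them.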
Delete overlapping/duplicate records from cube
Hi All,
Kindly let me know how to delete overlapping requests from a cube. The cube is loaded from various InfoSources, but some records get duplicated and are not wanted. How do we delete the duplicate records from the cube?
Regards,
dola
I think what Arun said is perfectly right:
use a DSO for consolidation of the various requests from the different InfoSources,
then load from the DSO to the cube. It is very much possible, though it will require a little work.
The 'delete duplicate records' option is usually used for master data. With transaction data, I don't think it's advisable.
Regards,
RK -
Locate and remove duplicate records in an InfoCube
Hi!!
We have found that the InfoCube 0PUR_C01 contains duplicate records for April 2008; approx. 1.5 lakh records were extracted to this InfoCube. Similar situations may be occurring in the subsequent months.
How do I locate these records and remove them from the InfoCube?
How do I ensure that duplicate records are not extracted into the InfoCube?
All answers/ links are welcome!!
Yours Truly
K Sengupta
First:
1. How do I locate duplicate records in an InfoCube, other than downloading all the records to an Excel file and using Excel functionality to locate them?
This is not possible, since a duplicate record as such would not exist: records are sent to a cube with + and - signs that summarize the data accordingly.
Your search for duplicate data becomes that much more troublesome.
If you have a DSO to load from, delete the data for that month and reload if possible; this is quicker and cleaner than removing duplicate records.
If you had
ABC|100 in your DSO and it got doubled,
in the cube it would be
ABC|+100
ABC|+100
against different requests - and added to this will be your correct deltas also. -
Duplicate records in BW Data Loads
In my project I am facing duplicate records in data loads when I compare the PSA and the DSO. How do I check whether those are duplicates, and is there any mechanism, through an Excel sheet or otherwise? Please help me out. Thanks in advance for your quick response.
Edited by: svadupu on Jul 6, 2011 3:09 AM
Hi,
Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all the records come directly from the source.
In the case of a standard DSO, records are always overwritten, so you would not get any duplicates.
In case you are getting duplicate records in the PSA and need to find them:
Go to PSA -> Manage -> PSA maintenance, and change the number of records from 1000 to the actual number of records that have come in. In the menu, go to List -> Save -> File, change the path from the SAP directory to some other path, and save the file.
Open the file, take the columns forming the DSO keys together, and sort ascending. You will find the duplicate records in the PSA. -
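As an alternative to the Excel sort, the same key-duplicate check can be sketched in ABAP. This is a minimal illustration; the structure TY_REC and its key fields DOC_NUMBER and ITEM are placeholders, not taken from the thread:

```abap
* Sketch: count duplicate DSO-key combinations in a loaded package.
* TY_REC and its key fields are illustrative placeholders.
TYPES: BEGIN OF ty_rec,
         doc_number TYPE c LENGTH 10,
         item       TYPE n LENGTH 6,
         amount     TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_rec.

DATA: lt_data      TYPE STANDARD TABLE OF ty_rec,
      lt_keys      TYPE STANDARD TABLE OF ty_rec,
      lv_total     TYPE i,
      lv_unique    TYPE i,
      lv_dup_count TYPE i.

* ... lt_data filled from the package / file ...

lt_keys = lt_data.
SORT lt_keys BY doc_number item.
DELETE ADJACENT DUPLICATES FROM lt_keys COMPARING doc_number item.

DESCRIBE TABLE lt_data LINES lv_total.
DESCRIBE TABLE lt_keys LINES lv_unique.
lv_dup_count = lv_total - lv_unique.  " > 0 means duplicate keys exist
```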
Duplicate Records in Transactional Load
Dear All,
I have an issue where data is loaded from one write-optimized DSO to another write-optimized DSO, and the DTP fails because of duplicate records. It is a transactional load.
I would be grateful if you could help me understand how to handle this situation and reload the DTP.
I have tried searching the forum, but I mostly find threads about master data loading, where one can select 'Handling Duplicate Records'.
Thanks in Advance...
Regards,
Syed
Hi Ravi,
Thanks for your reply.
If we uncheck the option, it would accept the duplicate records, right?
In my scenario, data comes from one write-optimized DSO to another. In the first DSO data uniqueness is not checked, and in the second DSO uniqueness is checked, so it gives me the duplicate error message.
I saw around 28 records in the error stack. So please let me know how I can process these error (duplicate) records as well.
Many Thanks...
Regards,
Syed -
hello guys,
He was asking: I have duplicate records in the report; how do we rectify them?
Why and how do duplicate records appear in reporting? How is it possible?
Please explain to me how this is possible.
thanks & regards
Hi,
It may be that your data target is reading data from a DSO (for example).
If this DSO has account as a key field but not center, then accounts with different centers but the same amount can accumulate into apparently duplicate data.
This case may occur with a flat file load, and the records need to be corrected in that case. The flat file can also work directly when we have both account and center as key fields for that particular DSO.
This is a scenario that can happen, apart from the above.
Best Regards,
Arpit -
Getting duplicate records in cube from each data packet.
Hi Guys,
I am using BI version 3.x. I am getting duplicate records in the cube. To delete these duplicate records I have written code, but it still gives the same result. Specifically, I have written a start routine for deleting duplicate records.
The duplication occurs depending on the number of packets.
E.g.: if the number of packets is 2, it gives me 2 duplicate records.
If the number of packets is 7, it gives me 7 duplicate records.
How can I modify my code so that it fetches only one record, eliminating the duplicates? Any other solution is also welcome.
Thanks in advance.
Hi Andreas, Mayank.
Thanks for your reply.
I created my own DSO, but it's giving an error. I tried with the standard DSO too; it still gives the same 'could not activate' error.
The error gives the name of function module RSB1_OLTPSOURCE_GENERATE.
I searched in R/3 but could not find it.
I even tried creating DSOs on a trial basis; they give the same problem.
I think it's a problem on the Basis side.
Please help if you have any idea.
Thanks. -
Duplicate records in PSA and Error in Delta DTP
Hi Experts,
I have enabled delta on the CALDAY field for a generic DataSource in the source system.
1) Initially I created one record in the source system and ran the delta InfoPackage, so the PSA contains one record.
2) I ran the delta DTP; now that one record has moved to the DSO.
3) Again I created one more record in the source system and ran the delta InfoPackage. Now I have 2 records in the PSA: one record created in this step and the other carried over from step 1.
4) If I try to run the delta DTP, I get an error saying that duplicate records are present.
So would you please tell me how to rectify this issue?
My requirement is that I need to run the load multiple times a day. (Note: the DataSource is delta-enabled.)
Thanks and Regards,
K.Krishna Chaitanya.you can use the this feature only when loading master data, as it will ignore lines. if you would have for example a changed record with a before and after image, one of the two lines would be ignored.
you can keep searching and asking, but if you want to load a delta more then once a day, you can't use a calendar day as a generic delta field. this will not work.
the best thing is to have a timestamp.
if in the datasource you have a date and time, you can create a timestamp yourself.
you need in this case to use a function module.
search the forum with the terms 'generic delta function module' and you'll for sure find code that you can reuse
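Inside such a function module, building the timestamp from separate date and time fields might look like the sketch below. The field names and values are examples only, not taken from the thread:

```abap
* Sketch: combine separate date and time fields into a UTC timestamp
* that can serve as the generic delta field. The values are examples.
DATA: lv_date TYPE d VALUE '20110706',
      lv_time TYPE t VALUE '150900',
      lv_ts   TYPE timestamp.

CONVERT DATE lv_date TIME lv_time
        INTO TIME STAMP lv_ts TIME ZONE 'UTC'.
* lv_ts rises strictly with each change, so several delta loads per day
* no longer collide on the same delta value the way CALDAY does.
```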
M. -
Hide Duplicate records of Key Fields (SAP BEX 7)
Need Advice.
In BEx 7, I want to hide repeated/duplicate records in a key figure object (formula) in the output when it is converted to a spreadsheet. What I did was modify the query properties and uncheck 'Hide Repeated Key Values' under the display options. But the fields affected by this change are only the characteristic fields; duplicate records in the key figures are still visible. I can't find any other way of hiding those duplicate records. A sample output is shown below:
Defect:
Plan Owner | Plan ID | Status | Plan Comment | Total Plan Count
A | ID001 | Active | Test A | 15
A | ID001 | Active | Test B | 15
A | ID001 | Active | Test C | 15
Modified:
Plan Owner | Plan ID | Status | Plan Comment | Total Plan Count
A | ID001 | Active | Test A | 15
_ | _____| _____ | Test B | 15
_ | _____| _____ | Test C | 15
Only the records in the characteristic fields were hidden;
the value 15 still exists in all rows after exporting to a spreadsheet.
Question: Is it possible to hide those duplicate records in the key figures as well?
Thanks in advance!
Hi,
I also faced a similar issue where I was getting duplicate key figure values. It usually happens when you build a query on an InfoSet or do a lookup on another DSO to get the value.
You may try code that deletes adjacent duplicate records, keeping a check on your characteristic.
Hope it helps.
Regards,
AL
Edited by: AL1112 on Sep 8, 2011 12:36 PM -
Duplicate Data Records indicator / the handle duplicate records
Hi All,
I am getting double data in two requests. How can I delete the extra data using the 'Duplicate Data Records' indicator?
I am not able to see this option in the PSA or in the DTP for handling duplicate records.
Can you help me find the option in the PSA/DTP?
Regards
Amit Srivastava
What Arvind said is correct.
But if you try this out in an end routine, it may work; not sure though.
Because then you will be dealing with the entire result_package.
Also, if the target you are talking about is a DSO, then you can delete adjacent duplicates in the start routine while updating into your next target. That can be a cube, for example. -
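The delete-adjacent-duplicates approach mentioned above can be sketched as a start routine. This is only a sketch; the key fields DOC_NUMBER and S_ORD_ITEM are placeholders and must be replaced by the real DSO keys:

```abap
* Sketch: remove records whose full DSO key repeats within the package,
* so the uniqueness check in the target DSO no longer fails.
* DOC_NUMBER / S_ORD_ITEM are placeholder key fields.
SORT SOURCE_PACKAGE BY doc_number s_ord_item.
DELETE ADJACENT DUPLICATES FROM SOURCE_PACKAGE
       COMPARING doc_number s_ord_item.
```

Note that this only removes duplicates within one data package; duplicates spread across packages would additionally need a matching semantic group in the DTP.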
Unable to pull duplicate record in BW
I am working on the E-Recruitment module.
I have to get candidate education details into BW, but those are not available in the standard extractors (0CANDAIDATE_ATTR & 0CAND_TD_ATTR). For that I have enhanced the DataSource with the fields INSTITUTE, START_DATE & END_DATE. My extractor is working fine and I can see all the records in RSA3.
I am getting these records up to the PSA, but when I try to update further to the InfoObject Candidate, it gives a 'duplicate records' error.
The problem occurs when extracting details such as Institute (from table HRP5104), as each candidate has more than one institute and BW rejects the duplicate records.
E.g.:
0OBJID    INSTITUTE  START_DATE  END_DATE
50038860  ABC        10.06.1965  20.05.1973
50038860  XYZ        20.05.1976  15.05.1978
50038860  PQR        30.05.1978  12.05.1980
Alternatively, I have compounded the InfoObject, but it still does not give the correct result.
Can anybody give an idea to solve this?
Thanks in advance.
Try creating time-dependent hierarchies; I don't think compounding or time-dependent attributes alone will help you.
Loading it to a DSO is not a bad option.
Nagesh Ganisetti. -
No 'Handle Duplicate records' in update tab
Hi,
I have a DTP from one ODS/DSO to another ODS/DSO and I got a duplicate record error, which I find rather strange for a standard ODS/DSO. I read in the help and in these forums that it can be fixed with the 'Handle Duplicate Records' checkbox on the update tab. The trouble is that there is no such checkbox in our BI 7 SP15 installation.
Any suggestions on the reason for that, how to fix it and, more importantly, how to get rid of the duplicate record error (and why it is occurring)?
Many thanks in advance
Eddy
Hi Eddy,
I am confused -:)
Have you tried checking it, or unchecking it?
My suggestion is to try selecting the 'unique data records' setting.
Cheers
Siva