Duplicate records in BW Data Loads
In my project I am seeing duplicate records in data loads when I compare the PSA with the DSO. How can I check which records are duplicates, and is there any mechanism to do this, e.g. through an Excel sheet? Please help me out. Thanks in advance for your quick response.
Edited by: svadupu on Jul 6, 2011 3:09 AM
Hi ,
Getting duplicate records in the PSA is normal, because the PSA has no semantic key and all records come straight from the source.
In a standard DSO, records with the same key are always overwritten, so you would not get any duplicates there.
If you are getting duplicate records in the PSA and need to find them:
Go to PSA -> Manage -> PSA maintenance -> change the number of records displayed from 1,000 to the actual number of records loaded -> in the menu, choose List -> Save -> File -> change the path from the SAP directory to some other path and save the file.
Open the file, put the columns that form the DSO key next to each other and sort ascending; the duplicate records in the PSA will then be easy to spot.
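The same check can be scripted instead of done by hand in a spreadsheet. A minimal Python sketch, assuming the exported PSA file is tab-separated and you pass in the column positions that form the DSO key (both assumptions; adjust to your actual export layout):

```python
import csv
from collections import Counter

def find_duplicates(path, key_cols):
    """Return all rows whose key-column combination occurs more than once."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f, delimiter="\t"))
    # Count how often each key combination appears
    counts = Counter(tuple(r[i] for i in key_cols) for r in rows)
    # Keep only rows belonging to a key that appears at least twice
    return [r for r in rows if counts[tuple(r[i] for i in key_cols)] > 1]
```

Sorting the result by the key columns then groups the duplicates together, exactly as the manual sort-ascending step does.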
Similar Messages
-
Duplicate records during master data loading.
hello guys,
I am reading a blog where the author describes various issues in a BW production project. I came across one issue which I could not understand:
Data loading failed due to duplicate records during master data loading.
Why does this error occur, and how can we rectify it in a production environment?
Thanks and Regards,
Hi SChandx200,
May I ask where you found "Various issues in a BW production project"?
Many Thanks, -
Duplicate records In Master Data
Hi,
I don't understand why we get duplicate records in master data even though it has overwrite functionality.
Any ideas will be appreciated.
Hi,
<u>Solution:</u> if the load to master data fails due to duplicate records,
go to the Monitor screen --> Details tab --> under Processing, find the duplicate record --> in the context menu of the error record, select 'Manual update'.
Once the above step is done, trigger the attribute change run for that InfoObject.
This should solve your problem.
If there is any problem in reporting, select the data using the filter option on the master data.
Regards,
Vijay. -
hi all,
how can I delete duplicate records in a master data InfoObject which has no requests because it is a direct update?
Hi,
Right-click on the InfoObject and select Maintain.
This opens the master data table; select the record there and delete it.
Hope this solves your query.
reward points if useful
regards,
ANJI -
Deleting duplicate records from different data packets in BI data source.
Hi,
I am getting the same (duplicate) records from different data packets in a BI datasource, after completion of extraction.
I tried to store the key fields of the first data packet in an internal table, but that internal table no longer contains the previous data when the second data packet is extracted.
Is there any other way to remove duplicate records after completion of extraction?
Thanks in advance.
I have not worked extensively on BI routines, but I reckon there is a routine that gets executed before the data mapping part: a start routine, in which you can validate the existence of the data before it is passed from the datasource to the cube.
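The reason the internal table loses its contents is that each data package is processed in its own call, so any state that should survive between packages has to live somewhere persistent (in ABAP, typically a global variable of the function group, or a lookup against the already-loaded target table). The idea can be sketched in Python, with a class instance standing in for that persistent state (all names are illustrative, not SAP objects):

```python
class Deduplicator:
    """Remembers keys seen across *all* packets,
    unlike a local table that is reset for every packet."""

    def __init__(self):
        self.seen = set()

    def filter_packet(self, packet, key_fields):
        """Drop records whose key was already seen in this or any earlier packet."""
        out = []
        for rec in packet:
            key = tuple(rec[f] for f in key_fields)
            if key not in self.seen:
                self.seen.add(key)
                out.append(rec)
        return out

# Two packets arriving one after the other; "M1" repeats across packets
packet1 = [{"matnr": "M1", "qty": 5}, {"matnr": "M2", "qty": 3}]
packet2 = [{"matnr": "M1", "qty": 5}, {"matnr": "M3", "qty": 7}]
```

Because the `seen` set belongs to the object and not to a single `filter_packet` call, the duplicate in the second packet is caught, which a per-packet table cannot do.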
Hope this helps,
Regards,
Murthy. -
Ignore duplicate records for master data attributes
dear experts ,
how and where can I enable "ignore duplicate records" when I am running my DTP to load data to master data attributes?
Hi Raj,
Suppose you are loading master data to an InfoObject, and in the PSA you have more than one record for a key.
Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and the PSA contains multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key that is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
This issue can easily be avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option on the "Update" tab of the DTP.
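The effect of that flag can be pictured as collapsing the PSA records down to one record per key. I believe the DTP keeps the last record in the package; this Python sketch assumes that behaviour, and the field names are invented for illustration:

```python
def handle_duplicates(records, key):
    """Keep only the last record per key value, preserving key order of first appearance."""
    latest = {}
    for rec in records:
        latest[rec[key]] = rec  # a later record overwrites an earlier one
    return list(latest.values())

# Two PSA records for document 100, one for document 200
psa = [
    {"doc_number": "100", "status": "open"},
    {"doc_number": "100", "status": "closed"},  # duplicate key
    {"doc_number": "200", "status": "open"},
]
```

After deduplication only one record per `doc_number` remains, so the insert into the attribute table no longer violates the primary key.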
Regards
Anindya -
Duplicate Records, Import transaction Data
Hi Everybody
I'm using BPC 7.5 NW and I get a warning that says there are duplicate records when I run the "Load Transaction Data" package. The txt file that I'm using does not have duplicate records. I have the following data in my flat file:
ACCOUNT INTCO AMOUNT
61012 I_65 10
61012 I_66 12
61012 I_67 13
I'm using a conversion file for INTCO as:
EXTERNAL INTERNAL
I_65 I_99
I_66 I_99
I_67 I_99
When I run the package, it says that there are duplicate records; the records are:
ACCOUNT INTCO AMOUNT
61012 I_99 10
61012 I_99 12
My question is: is it not possible to use this package when I use conversion files? If I use the APPEND package it works fine, but why doesn't it work with Import Transaction Data?
As far as I remember, in the MS version it is possible to do that.
Thanks in advance.
Regards
Hi,
Originally, you had the following records:
ACCOUNT INTCO AMOUNT
61012 I_65 10
61012 I_66 12
61012 I_67 13
However, after applying the conversion file, the records become:
ACCOUNT INTCO AMOUNT
61012 I_99 10
61012 I_99 12
61012 I_99 13
So after conversion there are 3 records with the same key.
The import package will not accept the 2nd and the 3rd record, because they are duplicates of the 1st record. The append package, however, appends the 2nd and the 3rd records to the 1st one.
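The difference between the two packages can be sketched like this (pure illustration of the logic, not BPC code): after the conversion file is applied, all three rows share the key (ACCOUNT, INTCO), so an append-style load accumulates the amounts, while an import-style load that treats the key as unique must reject the extras.

```python
from collections import defaultdict

conversion = {"I_65": "I_99", "I_66": "I_99", "I_67": "I_99"}
rows = [("61012", "I_65", 10), ("61012", "I_66", 12), ("61012", "I_67", 13)]

# Apply the conversion file: all three INTCO values collapse to I_99
converted = [(acct, conversion.get(intco, intco), amt) for acct, intco, amt in rows]

# Append-style behaviour: amounts for the same key accumulate
totals = defaultdict(int)
for acct, intco, amt in converted:
    totals[(acct, intco)] += amt

# Import-style behaviour: only the first row per key is accepted
seen, accepted, rejected = set(), [], []
for acct, intco, amt in converted:
    key = (acct, intco)
    if key in seen:
        rejected.append((acct, intco, amt))
    else:
        seen.add(key)
        accepted.append((acct, intco, amt))
```

The append path ends up with a single row of 35 for (61012, I_99), while the import path keeps the first row and flags the other two as duplicates, which matches the warning described above.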
Hope you got the idea. -
Duplicate records in master data info object
dear friends,
I have a standard InfoObject called 'A' which has attributes B, C, D, E. For it, a standard datasource with the fields A, B, C, D, E exists, and I loaded data from it. The ABAPers created a Z table with the fields P, Q, R, S, X, Y, Z; 'P' holds the same records as the 'A' InfoObject. My requirement is to create a report on the following fields:
P,Q,R,S,B,C,D,E
What I did: I created a generic datasource for the fields P, Q, R, S, and I added the attributes Q, R, S to the standard InfoObject 'A'.
I created and scheduled the InfoPackage for the standard datasource (A, B, C, D, E) to the standard InfoObject (A, B, C, D, E, Q, R, S). Next I created and scheduled another InfoPackage from the generic datasource (P, Q, R, S) to the standard InfoObject (A, B, C, D, E, P, Q, R, S) with transfer rules P->A, Q->Q, R->R, S->S. After loading the data I am getting duplicate records. This table shows how my master data looks:
A B C D E P Q R S
1 2 3 4 5
2 3 4 5 6
3 4 5 6 7
1 6 7 8 9
2 7 8 9 3
3 4 6 2 1
This is how my master data looks, but I need it in the following format:
A B C D E P Q R S
1 2 3 4 5 6 7 8 9
2 3 4 5 6 7 8 9 3
3 4 5 6 7 4 6 2 1
Please let me know.
Thanks & regards,
Hari
Hari,
why don't you enhance the master data InfoObject? You should see overwritten records, since InfoObject A is the primary key of the table.
Try to enhance the master data datasource; you will get the required output. Alternatively, create a generic master data datasource.
All the best.
any questions let us know.
Nagesh. -
Duplicate records in exported data
I'm trying to export the inventory data with a wildcard (%) filter on the
Workstation Name.
If I run the same filter in a query from ConsoleOne, I don't see any
duplicate records.
If I run the data export, the exported data for some workstations will have
a duplicate DN. Just the DN is duplicated, all the other fields are either
empty or have some default value.
I have also run the manual duplicate removal process and have tried deleting the records altogether using the InventoryRemoval service.
Any other ideas?
Dlee,
It appears that in the past few days you have not received a response to your
posting. That concerns us, and has triggered this automated reply.
Has your problem been resolved? If not, you might try one of the following options:
- Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
- Check all of the other support tools and options available at
http://support.novell.com.
- You could also try posting your message again. Make sure it is posted in the
correct newsgroup. (http://support.novell.com/forums)
Be sure to read the forum FAQ about what to expect in the way of responses:
http://support.novell.com/forums/faq_general.html
If this is a reply to a duplicate posting, please ignore and accept our apologies
and rest assured we will issue a stern reprimand to our posting bot.
Good luck!
Your Novell Product Support Forums Team
http://support.novell.com/forums/ -
Duplicate records in generic data source
Hello,
We have created a generic datasource using a database view joining the tables MARA and MBEW.
When we run the view on our DEV server, we get perfectly fine data. But when we run the same view in QA, we get duplicate records.
Does it have anything to do with the client? In QA we have 2 clients with the same data.
MARA MANDT = MBEW MANDT
MARA MATNR = MBEW MATNR
This is what I mention in JOIN Conditions.
Hope I could explain my issue properly. Please help!
Abhishek
Please check the possibility of multiple records for a given material in MBEW, as the same material can exist in multiple valuation areas.
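That explanation is easy to reproduce: MBEW is keyed on material *and* valuation area, so a join only on client and material multiplies each MARA row by the number of valuation records for that material. A small sketch of the join (table contents invented for illustration):

```python
# Simplified stand-ins for MARA (material master) and MBEW (material valuation)
mara = [{"matnr": "M1"}, {"matnr": "M2"}]
mbew = [
    {"matnr": "M1", "bwkey": "1000"},
    {"matnr": "M1", "bwkey": "2000"},  # M1 is valuated in two valuation areas
    {"matnr": "M2", "bwkey": "1000"},
]

# Inner join on MATNR only, like the view's join condition
joined = [
    {**a, **b}
    for a in mara
    for b in mbew
    if a["matnr"] == b["matnr"]
]
```

M1 appears twice in the result, one row per valuation area, even though MARA holds it only once. If DEV has just one valuation record per material, the same view definition looks correct there.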
Moreover, you will be executing the extraction in one client, so it is very unlikely that you see data from the other client.
In DEV we normally do not have good data to test with, so the design may only appear to be correct in DEV. -
Duplicate records on InfoObject data load
Hi,
I have a problem when loading data to the 0UCINSTALLA InfoObject.
It goes to a red flag and reports duplicate records in the /BI0/QUCINSTALLA and /BI0/YUCINSTALLA tables.
I checked the InfoPackage: the "PSA only" checkbox is selected, and the "Continuing..." and "Ignore duplicate records" checkboxes are selected, too.
If "Ignore duplicate records" is selected, why is the error still reported?
I don't know what to do with this problem.
any ideas?
thanks for the help.
Mauricio.
In the transfer structure, write a start routine that deletes the duplicate records, like this:
sort DATAPAK by /BIC/field1 descending /BIC/field2 /BIC/field3.
delete adjacent duplicates from DATAPAK comparing /BIC/field1 /BIC/field2.
Hope it helps.
Regards -
Records missing in data load.
Hi All,
we have one load (DSO --> CUBE).
Example: in the DSO there are 20 entries for customer number 1.
After loading the data into the cube, only 2 or 3 records arrive for this customer.
Even after applying the delete-source-package conditions in the DSO --> CUBE transformation, it should still load 10 entries,
but only 2 or 3 records are updated.
When I debug the DTP load, after the end routine it shows the correct number of records, i.e. 10 records for that customer,
but in the cube only 3 records appear.
What might be the reason?
Is there any chance that records are dropped after the end routine?
please share your ideas..
Thanks
Krishna.
Hi Asish,
The key figure totals do not match either.
I tried it for that one customer as well,
but it behaves the same (in debugging it shows the correct number of records, but not in the cube),
and there are no filter conditions used in the DTP.
Thanks
Krishna
Edited by: krishnamurthy g on Oct 17, 2008 6:52 PM
Edited by: krishnamurthy g on Oct 20, 2008 8:30 PM -
ODI CDC - Getting Duplicate Records in Journal Data
Dear Gurus
I am getting the following issues in CDC using Oracle Logminer
01) All operations on Source Data (Insert, Update) are shown as only Insert in Model -> CDC -> Journal Data
02) The records in Model -> CDC -> Journal Data are duplicated
03) These records are not travelling to the destination table; I want to load the destination table
04) Is it possible to have the last value and the new value both available on the same screen as output in ODI? I want to see what data changed before actually populating the tables.
Hi Andreas, Mayank.
Thanks for your reply.
I created my own DSO, but it gives an error. I also tried with the standard DSO; it still gives the same error, 'could not activate'.
The error mentions the name of a function module, RSB1_OLTPSOURCE_GENERATE.
I searched in R/3 but could not find it.
I even tried creating a DSO on a trial basis; it gives the same problem.
I think it is a problem on the BASIS side.
Please help if you have any idea.
Thanks. -
BI 7.0 - Duplicate Record Error while loading master data
I am working on BI 7.0 and I am trying to load master data to an info object.
I created an Infopackage and loaded into PSA.
I created transformation and DTP and I get an error after I execute the DTP about duplicate records.
I have read all the previous threads about the duplicate record error while loading master data, and most of them suggest checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to the PSA with an InfoPackage, and it doesn't have any option for me to ignore duplicate records.
My data is getting loaded to PSA fine and I get this error while loading to info object using DTP.
I would appreciate your help to resolve this issue.
Regards,
Ram.
Hi,
Refer:
http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
With rgds,
Anil Kumar Sharma .P -
Master data failed with error 'too many duplicate records'
Dear all
below is the error message:
Data records for package 1 selected in PSA -
error 4 in the update
The long text is as follows:
Error 4 in the update
Message no. RSAR119
Diagnosis
The update delivered the error code 4.
Procedure
You can find further information on this error in the error message of the update.
Working on BI 7.0.
Any solutions?
Thanks
satish.a
Hi,
Go through these threads, they have same issue:
Master data load: Duplicate Records
Re: Master data info object - duplicate records
Re: duplicate records in master data info object
Regards
Raj Rai