Duplicate records in exported data
I'm trying to export the inventory data with a wildcard (%) filter on the
Workstation Name.
If I run the same filter in a query from ConsoleOne, I don't see any
duplicate records.
If I run the data export, the exported data for some workstations will have
a duplicate DN. Just the DN is duplicated, all the other fields are either
empty or have some default value.
I have also run the manual duplicate-removal process and have tried
deleting the records altogether using the InventoryRemoval service.
Any other ideas?
Dlee,
It appears that in the past few days you have not received a response to your
posting. That concerns us, and has triggered this automated reply.
Has your problem been resolved? If not, you might try one of the following options:
- Do a search of our knowledgebase at http://support.novell.com/search/kb_index.jsp
- Check all of the other support tools and options available at
http://support.novell.com.
- You could also try posting your message again. Make sure it is posted in the
correct newsgroup. (http://support.novell.com/forums)
Be sure to read the forum FAQ about what to expect in the way of responses:
http://support.novell.com/forums/faq_general.html
If this is a reply to a duplicate posting, please ignore and accept our apologies
and rest assured we will issue a stern reprimand to our posting bot.
Good luck!
Your Novell Product Support Forums Team
http://support.novell.com/forums/
Similar Messages
-
Duplicate records In Master Data
Hi,
I don't understand why we get duplicate records in master data even though master data loads are supposed to overwrite existing records.
Any ideas will be appreciated.
Hi,
<u>Solution:</u> if the load to master data fails due to duplicate records,
go to the Monitor screen --> Details tab --> under Processing, find the duplicate record --> on the context menu of the error record, select 'Manual update'.
After the above step is done, trigger the attribute change run for that InfoObject.
This should solve your problem.
If there is any problem in reporting, select the data using the filter option on the master data.
Regards,
Vijay. -
hi all,
how do I delete duplicate records in a master data InfoObject that has no requests, because it is a direct update?
Hi,
Right-click on the InfoObject and select Maintain.
This opens the master data table; select the record there and delete it.
Hope this solves your query.
Reward points if useful.
regards,
ANJI -
Deleting duplicate records from different data packets in BI data source.
Hi,
I am getting the same (duplicate) records from different data packets in a BI DataSource after the extraction completes.
I tried to store the key fields of the first data packet in an internal table, but the internal table does not retain that data by the time the second data packet is extracted.
Is there any other way to remove duplicate records after the extraction completes?
Thanks in advance.
I have not worked extensively with BI routines, but I reckon there is a routine that gets executed before the data-mapping part: a start routine, in which you can check whether a record has already been seen before the data is passed from the DataSource to the cube.
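The start-routine idea above can be sketched outside ABAP. Here is a minimal Python illustration (field names `matnr`/`plant` are invented) of the core trick: keep a set of already-seen keys that survives across data packets, and drop any record whose key appeared in an earlier packet.

```python
# Sketch of the start-routine idea in Python (the real fix would be ABAP):
# a seen-keys set that persists across packet calls, used to drop repeats.
seen_keys = set()

def filter_packet(packet):
    """Drop records whose (matnr, plant) key appeared in an earlier packet."""
    result = []
    for record in packet:
        key = (record["matnr"], record["plant"])
        if key not in seen_keys:
            seen_keys.add(key)
            result.append(record)
    return result

packet1 = [{"matnr": "M1", "plant": "P1"}, {"matnr": "M2", "plant": "P1"}]
packet2 = [{"matnr": "M2", "plant": "P1"}, {"matnr": "M3", "plant": "P2"}]

print(filter_packet(packet1))  # both records kept
print(filter_packet(packet2))  # duplicate M2/P1 dropped
```

In an actual BW start routine, a similar effect would need state that persists across packet calls, e.g. a global internal table in the routine's program.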
Hope this helps,
Regards,
Murthy. -
Duplicate records during master data loading.
hello guys,
I am reading a blog where the blogger wrote about 'Various issues in a BW Production project'. I came across one issue that I could not understand:
data loading failed due to duplicate records during master data loading.
Why does this error occur? How can we rectify it in a production environment?
Thanks and Regards,
S
Hi SChandx200,
May I ask where you found 'Various issues in a BW production project'?
Many Thanks, -
Ignore duplicate records for master data attributes
dear experts ,
how & where can I enable 'ignore duplicate records' when I am running my DTP to load data to master data attributes?
Hi Raj,
Suppose you are loading master data to an InfoObject and the PSA contains more than one record for the same key.
Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and the PSA holds multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key that is different for every record. But it is a problem for the InfoObject attribute table, because more than one record for the same primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
This issue can easily be avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option under the *Update* tab of the DTP.
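As an illustration of what handling duplicate keys amounts to, here is a Python sketch (not the actual DTP implementation; field names are invented): collapse the records so each key occurs only once before they reach the attribute table, with a last-record-wins rule.

```python
# Python sketch of duplicate-key handling: when several PSA records share
# the same attribute key, keep only one (here the last one wins), so the
# primary-key constraint on the attribute table is never violated.
def dedupe_last_wins(records, key_field):
    """Collapse records so each key appears once; later records overwrite earlier ones."""
    by_key = {}
    for rec in records:
        by_key[rec[key_field]] = rec  # a later duplicate overwrites the earlier one
    return list(by_key.values())

psa = [
    {"doc_number": "4711", "status": "open"},
    {"doc_number": "4712", "status": "open"},
    {"doc_number": "4711", "status": "closed"},  # duplicate key
]
print(dedupe_last_wins(psa, "doc_number"))
# each doc_number now appears exactly once
```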
Regards
Anindya -
Duplicate Records, Import transaction Data
Hi Everybody
I'm using BPC 7.5 NW and I get a warning saying that there are duplicate records when I run the package "Load Transaction Data". The txt file that I'm using does not have duplicate records. I have the following data in my flat file:
ACCOUNT INTCO AMOUNT
61012 I_65 10
61012 I_66 12
61012 I_67 13
I'm using a conversion file for INTCO as:
EXTERNAL INTERNAL
I_65 I_99
I_66 I_99
I_67 I_99
When I ran the package, it says that there are duplicate records, the records are:
ACCOUNT INTCO AMOUNT
61012 I_99 10
61012 I_99 12
My question is: is it not possible to use this package together with conversion files? If I use the APPEND package it works fine, but why doesn't it work with Import Transaction Data?
As far as I remember, this is possible in the MS version.
Thanks in advance.
Regards
Hi,
Originally, you had the following records:
ACCOUNT INTCO AMOUNT
61012 I_65 10
61012 I_66 12
61012 I_67 13
However, after applying the conversion file, the records become:
ACCOUNT INTCO AMOUNT
61012 I_99 10
61012 I_99 12
61012 I_99 13
So, there are 3 records that share the same key, i.e. duplicates.
The Import package will not accept the 2nd and 3rd records, because they are duplicates of the 1st record. The Append package, however, will append the 2nd and 3rd records to the 1st one.
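The collapse can be reproduced in a few lines. Here is a Python sketch using the values from this thread; the summing at the end is shown only as one reading of what "append" does with the extra rows, so treat that part as an assumption rather than documented BPC behavior.

```python
# Why the conversion file produces duplicates: mapping several external
# INTCO values to one internal value collapses distinct (ACCOUNT, INTCO)
# keys into the same key, so the import package sees duplicates.
conversion = {"I_65": "I_99", "I_66": "I_99", "I_67": "I_99"}

rows = [
    ("61012", "I_65", 10),
    ("61012", "I_66", 12),
    ("61012", "I_67", 13),
]

converted = [(acct, conversion[intco], amt) for acct, intco, amt in rows]

keys = [(acct, intco) for acct, intco, _ in converted]
print(len(keys) - len(set(keys)), "duplicate keys after conversion")

# An append-style load that accumulates the duplicate rows instead of
# rejecting them (an assumption about what APPEND effectively does):
totals = {}
for acct, intco, amt in converted:
    totals[(acct, intco)] = totals.get((acct, intco), 0) + amt
print(totals)  # {('61012', 'I_99'): 35}
```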
Hope you got the idea. -
Duplicate records in generic data source
Hello,
We have created a generic DataSource using a database view joining the two tables MARA and MBEW.
When we run the view on our DEV server, we get perfectly fine data. When we run the same view in QA, we get duplicate records.
Does it have anything to do with the CLIENT, since in QA we have 2 clients with the same data?
MARA MANDT = MBEW MANDT
MARA MATNR = MBEW MATNR
This is what I mention in JOIN Conditions.
Hope I could explain my issue properly. Please HELP!
Abhishek
Please check the possibility of multiple records for a given material in MBEW, as the same material can exist in multiple valuation areas.
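The fan-out described here is ordinary join behavior. A small Python sketch with invented data shows how one MARA material joined to two MBEW valuation-area rows yields two output rows, which look like duplicates on the MARA side:

```python
# Join fan-out sketch: MBEW holds one row per valuation area per material,
# so joining MARA to MBEW on (MANDT, MATNR) multiplies the MARA rows.
mara = [("100", "MAT-1")]                 # (MANDT, MATNR)
mbew = [("100", "MAT-1", "PLANT-A"),      # (MANDT, MATNR, valuation area)
        ("100", "MAT-1", "PLANT-B")]      # second valuation area

joined = [(m_mandt, m_matnr, bwkey)
          for (m_mandt, m_matnr) in mara
          for (b_mandt, b_matnr, bwkey) in mbew
          if (m_mandt, m_matnr) == (b_mandt, b_matnr)]

print(joined)  # two rows for the single MARA material
```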
Moreover, you will be executing the extraction in one client, so it is very unlikely that you are seeing data from the other client.
In DEV we normally do not have good data to test with, so it may only seem as if the design is correct in DEV. -
Duplicate records in BW Data Loads
In my project I am facing duplicate records in data loads when I compare the PSA with the DSO. How can I check which records are duplicates, and is there any mechanism for this, e.g. via an Excel sheet? Please help me out. Thanks in advance for your quick response.
Edited by: svadupu on Jul 6, 2011 3:09 AM
Hi,
Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all records come directly from the source.
In the case of a standard DSO, records are always overwritten, so you would not get any duplicates.
If you are getting duplicate records in the PSA and need to find them:
go to PSA -> Manage -> PSA maintenance -> change the number of records from 1000 to the actual number of records loaded -> in the menu, go to List -> Save -> File -> change the path from the SAP directory to some other path and save the file.
Open the file, take the columns that together form the DSO key, and sort them ascending; the duplicate records in the PSA will then be adjacent.
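The "sort by the DSO key columns" check can be automated. Here is a Python sketch with invented field names that sorts exported rows by the key columns and reports any row whose key equals that of its predecessor as a duplicate:

```python
# Find duplicates in an exported PSA file: sort by the columns that form
# the DSO key, then adjacent rows with equal keys are duplicates.
rows = [
    {"matnr": "M2", "plant": "P1", "qty": 5},
    {"matnr": "M1", "plant": "P1", "qty": 3},
    {"matnr": "M2", "plant": "P1", "qty": 7},  # duplicate key M2/P1
]
key_cols = ("matnr", "plant")

rows.sort(key=lambda r: tuple(r[c] for c in key_cols))

duplicates = [
    rows[i] for i in range(1, len(rows))
    if tuple(rows[i][c] for c in key_cols) == tuple(rows[i - 1][c] for c in key_cols)
]
print(duplicates)  # the second M2/P1 row
```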
Duplicate records in master data info object
dear friends,
I have a standard InfoObject 'A' with attributes B, C, D, E. A standard DataSource with the fields A, B, C, D, E exists for it, and I loaded data from it. The ABAPers created a Z table with the fields P, Q, R, S, X, Y, Z; 'P' holds the same records as the 'A' InfoObject. My requirement is to create a report on the following fields:
P, Q, R, S, B, C, D, E
What I did: I created a generic DataSource for the fields P, Q, R, S and added the fields Q, R, S to the standard InfoObject 'A'.
I created and scheduled an InfoPackage from the standard DataSource (A, B, C, D, E) to the standard InfoObject (A, B, C, D, E, Q, R, S). Next I created another InfoPackage and scheduled it from the generic DataSource (P, Q, R, S) to the standard InfoObject (A, B, C, D, E, P, Q, R, S) with transfer rules P->A, Q->Q, R->R, S->S. After loading the data I am getting duplicate records. This is how my master data looks:
A B C D E P Q R S
1 2 3 4 5
2 3 4 5 6
3 4 5 6 7
1 6 7 8 9
2 7 8 9 3
3 4 6 2 1
This is how my master data looks, but I need it in the following format:
A B C D E P Q R S
1 2 3 4 5 6 7 8 9
2 3 4 5 6 7 8 9 3
3 4 5 6 7 4 6 2 1
Please let me know.
Thanks & regards,
HARI
Hari,
why don't you enhance the master data InfoObject? You are supposed to see overwritten records; InfoObject A is the primary key of the table.
Try enhancing the master data DataSource; you should get the required output. Alternatively, create a generic master data DataSource.
All the best.
Any questions, let us know.
Nagesh. -
ODI CDC - Getting Duplicate Records in Journal Data
Dear Gurus
I am getting the following issues in CDC using Oracle Logminer
01) All operations on the source data (Insert, Update) are shown only as Insert in Model -> CDC -> Journal Data.
02) The records in Model -> CDC -> Journal Data are doubled.
03) The records are not travelling to the destination table, and I want to load the destination table.
04) Is it possible to have both the old value and the new value available on the same screen as output in ODI? I want to see what data changed before actually populating the tables.
Hi Andreas, Mayank.
Thanks for your reply.
I created my own DSO, but it gives an error. I tried with the standard DSO too, and it still gives the same error: could not activate.
The error mentions the function module RSB1_OLTPSOURCE_GENERATE.
I searched in R3 but could not find it.
Even the DSOs I created on a trial basis give the same problem.
I think it is a problem on the BASIS side.
Please help if you have any idea.
Thanks. -
BI 7.0 - Duplicate Record Error while loading master data
I am working on BI 7.0 and I am trying to load master data to an info object.
I created an Infopackage and loaded into PSA.
I created transformation and DTP and I get an error after I execute the DTP about duplicate records.
I have read all the previous threads about the duplicate record error while loading master data, and most of them suggest checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to the PSA with an InfoPackage, and it has no option to ignore duplicate records.
My data is getting loaded to PSA fine and I get this error while loading to info object using DTP.
I would appreciate your help to resolve this issue.
Regards,
Ram.Hi,
Refer:
http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
With rgds,
Anil Kumar Sharma .P -
Master data failed with error 'too many duplicate records'
Dear all,
below is the error message:
Data records for package 1 selected in PSA -
error 4 in the update
The long text is given below:
Error 4 in the update
Message no. RSAR119
Diagnosis
The update delivered the error code 4.
Procedure
You can find further information on this error in the error message of the update.
Working on BI 7.0.
Any solutions?
Thanks,
satish.a
Hi,
Go through these threads, they have same issue:
Master data load: Duplicate Records
Re: Master data info object - duplicate records
Re: duplicate records in master data info object
Regards
Raj Rai -
Hi everyone,
I am writing a statement to delete duplicate records in my data.
For example:
select identnr, datum, klokin, klokuit, count(datum)
from fus_timesheets
group by identnr, datum, klokin, klokuit
having count(datum) > 1;
The above query gives the following result (KLOKIN and KLOKUIT are empty, i.e. NULL, in these rows):
IDENTNR  DATUM       KLOKIN  KLOKUIT  COUNT(DATUM)
10376    14/09/2009                   2
10376    16/09/2009                   3
10376    15/09/2009                   16
What I want to do is delete the duplicate records, so that count(datum) ends up as 1 for every group.
I have written following code to delete duplicate records :
declare
type tst_type is record
(identnr number
,datum varchar2(15)
,klokin varchar2(10)
,klokuit varchar2(10)
,aantal number(3));
type type_coll_tst
is table of tst_type;
t_tst type_coll_tst;
begin
select identnr,datum,klokin,klokuit,count(datum)
bulk collect into t_tst
from fus_timesheets
group by identnr,datum,klokin,klokuit
having count(datum) > 1;
for i in 1..t_tst.count loop
dbms_output.put_line(t_tst(i).identnr ||' '|| t_tst(i).datum||' '||t_tst(i).aantal);
end loop;
for i in 1..t_tst.count loop
delete from fus_timesheets
where identnr = t_tst(i).identnr
and klokin = t_tst(i).klokin
and klokuit = t_tst(i).klokuit
and datum = t_tst(i).datum
and rownum < t_tst(i).aantal;
end loop;
end;My delete statement is not working good.
Can someone please help me with the delete part?
Thanks,
Diana
Handle null values correctly (a NULL klokin or klokuit never matches with '=', so those rows were never deleted):
for i in 1..t_tst.count loop
  delete from fus_timesheets
  where identnr = t_tst(i).identnr
    and (klokin = t_tst(i).klokin or (t_tst(i).klokin is null and klokin is null))
    and (klokuit = t_tst(i).klokuit or (t_tst(i).klokuit is null and klokuit is null))
    and datum = t_tst(i).datum
    and rownum < t_tst(i).aantal;
end loop;
Max -
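For comparison, here is a runnable sketch of the same null-safe deduplication in Python with sqlite3 (not Oracle; the table contents are invented to mirror the post). It keeps one row per (identnr, datum, klokin, klokuit) group by keeping the smallest rowid, and GROUP BY treats NULLs in klokin/klokuit as equal, which sidesteps the `=` comparison problem entirely:

```python
import sqlite3

# In-memory table shaped like fus_timesheets, with NULL clock-in/out rows.
conn = sqlite3.connect(":memory:")
conn.execute("""create table fus_timesheets
                (identnr integer, datum text, klokin text, klokuit text)""")
rows = [
    (10376, "14/09/2009", None, None),
    (10376, "14/09/2009", None, None),        # duplicate (NULL columns)
    (10376, "16/09/2009", "08:00", "17:00"),
    (10376, "16/09/2009", "08:00", "17:00"),  # duplicate
]
conn.executemany("insert into fus_timesheets values (?,?,?,?)", rows)

# Keep the row with the smallest rowid in each key group, delete the rest.
conn.execute("""
    delete from fus_timesheets
    where rowid not in (
        select min(rowid) from fus_timesheets
        group by identnr, datum, klokin, klokuit)
""")

remaining = conn.execute("select count(*) from fus_timesheets").fetchone()[0]
print(remaining)  # 2 -- one row left per duplicate group
```

Oracle also has a rowid pseudocolumn, so the analogous one-statement idiom there is `delete from fus_timesheets where rowid not in (select min(rowid) from fus_timesheets group by identnr, datum, klokin, klokuit)`, with no PL/SQL loop needed.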
Duplicate Records generating for infocube.
hi all,
when I load the data from the DataSource to the InfoCube I get duplicate records. The data is loaded into the DataSource from a flat file. The first time I executed the load it worked, but after I changed the flat-file structure I do not get the modified content in the InfoCube; instead it shows the duplicates. The DataSource 'preview data' option shows the required data (i.e. the modified flat file), but the InfoCube does not, even though I made all the necessary changes in the DataSource, InfoCube, InfoPackage, and DTP. I even deleted the data in the InfoCube and still get the duplicates. What is the ideal solution for this problem? One way is to create a new DataSource with the modified flat file, but I don't think that is ideal. What is a possible solution without creating the DataSource again?
Edited by: dharmatejandt on Oct 14, 2010 1:46 PM
Finally I got it. I deleted the request IDs in the InfoPackage (right-click the InfoPackage and go to Manage), then executed the transformation and the DTP, and finally got the required output without duplicates.
Edited by: dharmatejandt on Oct 14, 2010 4:05 PM