Duplicate records generated on same Key in Layout !!
Dear Experts,
We are facing a serious problem in our forecasting application.
It is generating duplicate records on the same key, which results in wrong output and negative numbers.
Since this does not happen consistently but only at certain times, it is difficult to reproduce the error and find a solution.
If anyone has come across a similar problem and found a solution, please reply.
Any help or suggestions are appreciated.
Thanks..
Dear All,
No, we haven't compressed the cube, and we are using the same hardware and system as before. A lot of forecasting data was entered in UAT and in production, but we never came across this issue. Now that this is a problem in production, it is very serious.
As suggested by SAP, we implemented note 1179076 (Incorrect data in planning), but it is quite general and we are not sure our problem has been resolved by it, since the problem is not consistent; it occurs randomly.
We are on ABAP stack 18 and Java stack 16.
Thanks a lot, Krishna, for the note, but since we are not using BIA it will not help.
Please suggest any other ways to debug or solve this strange problem.
Regards,
Jasmi
Similar Messages
-
Duplicate records in the same table
Hello.
I want to duplicate records in the same table, changing just two date fields and id. Id is changed with sequence.
Example:
id name start_date end_date
1 record_1 01.01.04 31.12.04
2 record_2 01.01.03 31.12.03
I want to duplicate record with the start_date year 04 and the duplicated record would change that start_date into year 05.
After duplicate it should look like:
1 record_1 01.01.04 31.12.04
2 record_2 01.01.03 31.12.03
3 record_1 01.01.05 31.12.05
What should my INSERT look like?
Thanks

create sequence a_seq
start with 3
nocache;

insert into tableA
(id, name, start_date, end_date)
select
a_seq.nextval
,name
,add_months(start_date, 12)
,add_months(end_date, 12)
from
tableA
where
start_date >= to_date('01.01.2004', 'dd.mm.yyyy')
and start_date <= to_date('31.12.2004', 'dd.mm.yyyy'); -
Reducing time while Filtering Out New Records with the Same Key
Hi Experts,
I have an issue. I am trying to load data into 0MATERIAL. While loading via DTP, the step "Filter Out New Records with the Same Key" takes a long time: around 30 minutes per data package, while for other master data it hardly takes 5 minutes.
Any pointers on how to reduce the time...
Thanks in advance
Sam

Hello,
No, there is no need to make that change then.
Can you tell me, are you doing a full load daily to the 0MATERIAL object?
Why don't you use delta?
Also, if the DTP is of type FULL, are you deleting the previous PSA requests?
Maybe you can ask the Basis team to run a trace to see what is really happening on the DB side during these 30 minutes; then maybe we can find the needed fix.
Regards,
Shashank -
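Conceptually, the "Filter Out New Records with the Same Key" DTP step keeps a single record per semantic key. A minimal Java sketch of that idea, for illustration only (the record fields here are hypothetical, and actual BW behavior depends on the DTP's duplicate-handling settings):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SemanticKeyFilter {
    static class Rec {
        final String material, plant;  // hypothetical semantic key fields
        final int qty;
        Rec(String m, String p, int q) { material = m; plant = p; qty = q; }
    }

    // Keep only the last record seen for each semantic key.
    static List<Rec> dedup(List<Rec> in) {
        Map<String, Rec> byKey = new LinkedHashMap<>();
        for (Rec r : in) byKey.put(r.material + "|" + r.plant, r);
        return new ArrayList<>(byKey.values());
    }

    public static void main(String[] args) {
        List<Rec> out = dedup(List.of(
            new Rec("M1", "P1", 10),
            new Rec("M1", "P1", 20),  // same key: replaces the first record
            new Rec("M2", "P1", 5)));
        System.out.println(out.size());
        System.out.println(out.get(0).qty);
    }
}
```

The cost of this step grows with the number of records per package that share a key, which is one reason a full load of a large master data object can be much slower than a delta.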
Record filtered in advance as error records with the same key exist
Dear Masters
I am getting the error below for master data (0MATERIAL_ATTR) while loading data from PSA to the InfoObject through a DTP (I have also set 'Handle Duplicate Record Keys'). When I checked the error data in the error stack, I got the error below for all 100 records:
"Record filtered in advance as error records with the same key exist"
Could you please help me resolve this issue?
Thanks in Advance
Raja

Hi
Thanks for the reply.
I loaded the delta data into PSA successfully; the failure occurred only when I loaded from PSA to the master data object.
Thanks in advance
Raja -
Can we generate the same key with the same password???
Hi everyone,
Is it at all possible to generate the same key? I have a program that takes a user password and creates a key, and this is done in a loop five times. I wanted to see if the keys generated from the same password would be identical, but the generated keys are different.
Does anyone have any idea why this would happen?
My Environment
jdk1.2.2
jce1.2.1
My class is very simple, as follows:

import java.security.*;

public class SecureStuff {

    static {
        Security.addProvider(new com.sun.crypto.provider.SunJCE());
    }

    public SecureStuff() {
        try {
            Key secretKey = null;
            for (int i = 0; i < 6; i++) {
                javax.crypto.spec.DESedeKeySpec spec =
                    new javax.crypto.spec.DESedeKeySpec("PASSWORDPASSWORDPASSWORD".getBytes());
                javax.crypto.SecretKeyFactory desFactory =
                    javax.crypto.SecretKeyFactory.getInstance("DESede");
                secretKey = desFactory.generateSecret(spec);
                System.out.println("key generated:" + secretKey.getEncoded());
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        SecureStuff securestuff = new SecureStuff();
    }
}
The Output
C:\> java SecureStuff
key generated:[B@eb7989c5
key generated:[B@e96189c5
key generated:[B@e99589c5
key generated:[B@e8d989c5
key generated:[B@ef4189c5
key generated:[B@eff589c5
Please can anyone help me..
Thank you very very much.
...geetha

Hi,
I was wondering if you have found a solution to your question. I just started working with JCE and have to implement the same functionality: basically, I am trying to encrypt passwords stored in the DB, and then when a user logs in, I encrypt his password and compare it with the stored password in the DB. For this to work, I have to be sure I can get the same key generated for each encryption.
Also, do you have any examples I can look at, or any recommendations on books that explain this subject (JCE) well?
Thanks
Chi -
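A note on the output in the thread above: System.out.println(secretKey.getEncoded()) prints the byte array's identity hash (the "[B@..." strings), not the key bytes, so identical keys can look different. A minimal sketch comparing the actual encoded bytes (standard JCE on a modern JDK, no extra provider registration needed):

```java
import java.util.Arrays;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.DESedeKeySpec;

public class SameKeyCheck {
    public static void main(String[] args) throws Exception {
        byte[] pw = "PASSWORDPASSWORDPASSWORD".getBytes("US-ASCII");
        SecretKeyFactory factory = SecretKeyFactory.getInstance("DESede");
        // Derive the key twice from the same password bytes.
        byte[] k1 = factory.generateSecret(new DESedeKeySpec(pw)).getEncoded();
        byte[] k2 = factory.generateSecret(new DESedeKeySpec(pw)).getEncoded();
        // Compare contents, not array references; println(byte[]) would only
        // show the identity hash, which differs for every array object.
        System.out.println("equal: " + Arrays.equals(k1, k2));
        System.out.println("length: " + k1.length);
    }
}
```

DESedeKeySpec simply copies (and parity-adjusts) the first 24 password bytes, so the derivation is deterministic; for real password-based keys, a salted PBE algorithm is the usual choice.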
How to get rid of duplicate records generated from a hierarchical cube in SQL?
Hi All,
database version 10gR2.
I am trying to aggregate data for two hierarchical dimensions, specifically organization and product.
I am using one ROLLUP per dimension (so two ROLLUPs in the GROUP BY clause) to aggregate every level of organization and product included in the hierarchy.
The troubling part is that products that have data in the corresponding fact table are not always located at the lowest level (which is 6) of the product hierarchy.
e.g.
product_id level
0/01/0101/010102/01010201 5 --> 01010201, at level 5, has data in fact table
0/01/0101/010103 4 --> 010103, at level 4, has data in fact table as well
0/02/0201/020102/02010203/0201020304/020102030405 6 --> 020102030405, at level 6 (the lowest level), has data in fact table

We have a flat product hierarchy stored in a table as below:
prod_id up_code_1 up_code_2 up_code_3 up_code_4 up_code_5 up_code_6
01010201 0 01 0101 010102 01010201 NULL
010103 0 01 0101 010103 NULL NULL

Due to the NULL at level 6 for 01010201, when I run the query below, one duplicate record is generated.
For 010103 there will be 2 duplicate records, and for 020102030405 there will be none.
We encounter the same issue with the organization dimension.
Currently I am using DISTINCT to get rid of the duplicate records, but it doesn't feel right to do it this way.
So, I wonder if there is a more formal and standard way to do this?
select distinct ORG_ID, DAY_ID, TRADE_TYPE_ID, cust_id, PRODUCT_ID, QUANTITY_UNIT, COST_UNIT, SOURCE_ID,
CONTRACT_AMOUNT, CONTRACT_COST, SALE_AMOUNT,SALE_COST, ACTUAL_AMOUNT, ACTUAL_COST, TRADE_COUNT
from (
select coalesce(UP_ORG_ID_6, UP_ORG_ID_5, UP_ORG_ID_4, UP_ORG_ID_3, UP_ORG_ID_2, UP_ORG_ID_1) as ORG_ID,
a.day_id as day_id,
a.TRADE_TYPE_ID as TRADE_TYPE_ID,
a.CUST_ID,
coalesce(UP_CODE_6, UP_CODE_5, UP_CODE_4, UP_CODE_3, UP_CODE_2, UP_CODE_1) as product_id,
QUANTITY_UNIT,
COST_UNIT,
A.SOURCE_ID as SOURCE_ID,
SUM(CONTRACT_AMOUNT) as CONTRACT_AMOUNT,
SUM(CONTRACT_COST) as CONTRACT_COST,
SUM(SALE_AMOUNT) as SALE_AMOUNT,
SUM(SALE_COST) as SALE_COST,
SUM(ACTUAL_AMOUNT) as ACTUAL_AMOUNT,
SUM(ACTUAL_COST) as ACTUAL_COST,
SUM(TRADE_COUNT) as TRADE_COUNT
from DM_F_LO_SALE_DAY a, DM_D_ALL_ORG_FLAT B, DM_D_ALL_PROD_FLAT D --, DM_D_LO_CUST E
where a.ORG_ID=B.ORG_ID
and a.PRODUCT_ID=D.CODE
group by rollup(UP_ORG_ID_1, UP_ORG_ID_2, UP_ORG_ID_3, UP_ORG_ID_4, UP_ORG_ID_5, UP_ORG_ID_6),
a.TRADE_TYPE_ID,
a.day_id,
A.CUST_ID,
rollup(UP_CODE_1, UP_CODE_2, UP_CODE_3, UP_CODE_4, UP_CODE_5, UP_CODE_6),
a.QUANTITY_UNIT,
a.COST_UNIT,
a.SOURCE_ID );

Note: GROUPING_ID does not seem to help; at least I didn't find it useful in this scenario.
any recommendation, links or ideas would be highly appreciated as always.
Thanks

Has anyone ever encountered this kind of problem?
Any thoughts would be appreciated.
Thanks -
DTP error: No Message: Filter Out New Records with the Same Key
Hello SDN,
I have looked up solutions for this problem but didn't find anything suitable. I am getting this error in multiple places: when I load pure master data into InfoObjects like 0VENDOR and 0PLANT, or when I load DSOs like 0FIAP_O03.
When I checked on SDN, one of the solutions was to apply note 1489178, but in that case the user had nothing selected in his 'semantic key' list, whereas in my case 0PLANT is checked even though the other fields are disabled. So I think my DTP is accurate, since plant is the only key field here. But it still fails, and so does the 0VENDOR load. Moreover, the DSO load is crucial for our reports in production, and its failure is a bigger issue for us.
Has anyone seen this problem? Is there another solution other than the above note?
Regards.

Hello Arvind,
If I select that option, will it delete one record if it finds two records with the same semantic key in one data packet? Also, will it delete a record if it finds that a record with the same semantic key already exists in the data target?
I think both of the above scenarios would result in incorrect data. From my understanding, 'Handle Duplicate Record Keys' is mostly used to filter out master data, since in some instances you might get duplicate entries.
But in my case the same error is happening for the data load from 0FI_AP_4 to 0FIAP_O03. What happens is that the overall status of the DTP becomes green, but when you look at the details section, the process stops at this message, which stays yellow. Because of this, the DTP status in the process chain remains yellow and finally turns red, and the process chain turns red.
From what I can tell, it looks more like a bug. I was hoping to find an SAP note with manual instructions to resolve the issue, since the note above asks for a support package upgrade, which would require a whole lot of testing.
Regards. -
Why two records for the same key???
Hi,
I'm finding that our database contains "duplicate" records even though it is set to disallow duplicates. In particular, from DbDump, here are two keys from our database:
737201037870770c0001000000a6005a001471c678
737201037870770c0001000000a6005a0014711678
You can probably really think of them as
737201037870770c 0001000000a6005a00147 1c678
737201037870770c 0001000000a6005a00147 11678
where only the middle part is "our key data". We use Externalizable and have these methods for the key:

    public void readExternal(ObjectInput in) throws IOException, ClassNotFoundException {
        short oldVersion = in.readShort();
        this.primaryKey = in.readInt();
        this.orderPurpose = in.readChar();
        this.foreignKey = in.readInt();
    }

    public void writeExternal(ObjectOutput out) throws IOException {
        out.writeShort(externalizationVersion);
        out.writeInt(this.primaryKey);
        out.writeChar(this.orderPurpose);
        out.writeInt(this.foreignKey);
    }

    public int compare(Object o1, Object o2) {
        byte[] b1 = (byte[]) o1;
        byte[] b2 = (byte[]) o2;
        final int compareLen = b1.length;
        for (int i = 0; i < compareLen; i++) {
            if (b1[i] != b2[i]) {
                return (b1[i] < b2[i] ? -1 : 1);
            }
        }
        return 0;
    }
My question is basically this: what are the trailing bytes in the key for, and why do they differ in my case? BTW, we're using an old version, 2.0.90, and we recently switched from Java 5 to Java 6... (just in the interest of full disclosure!)
Thanks very much in advance,
Dan

Yes, you're right. That hint really helped me find my problem!!

Glad to hear that.

I still don't know what the leading and trailing bytes are used for, but at this point it doesn't really matter anymore.

They're output by Java serialization when JE calls ObjectOutputStream.writeObject, so they should be defined here:
http://java.sun.com/j2se/1.5.0/docs/guide/serialization/spec/protocol.html
Thanks for your help,

You're welcome,
Mark -
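An aside on the compare method quoted in this thread: Java bytes are signed, so (b1[i] < b2[i]) does not give the byte-wise order a B-tree usually expects, and keys of different lengths sharing a prefix need handling too. A sketch of an unsigned, length-aware comparator (this is an illustration, not JE's actual implementation):

```java
public class ByteKeyComparator implements java.util.Comparator<byte[]> {
    // Lexicographic, unsigned comparison; on a shared prefix, the shorter
    // array sorts first.
    public int compare(byte[] b1, byte[] b2) {
        int n = Math.min(b1.length, b2.length);
        for (int i = 0; i < n; i++) {
            int x = b1[i] & 0xFF;  // mask to compare as unsigned bytes
            int y = b2[i] & 0xFF;
            if (x != y) return x < y ? -1 : 1;
        }
        return Integer.compare(b1.length, b2.length);
    }

    public static void main(String[] args) {
        ByteKeyComparator c = new ByteKeyComparator();
        System.out.println(c.compare(new byte[]{1, 2}, new byte[]{1, 3}) < 0);
        // 0x80 is negative as a signed byte but large as an unsigned one:
        System.out.println(c.compare(new byte[]{(byte) 0x80}, new byte[]{0x01}) > 0);
    }
}
```

With a signed comparison, the second check would order the keys the other way round, which is exactly the kind of subtle mismatch that can make "duplicate" keys appear distinct to a store.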
Duplicate Records generating for infocube.
hi all,
When I load data from the DataSource to the InfoCube, I am getting duplicate records. The data is loaded into the DataSource from a flat file. The first execution worked, but after I changed the flat file structure I do not get the modified content in the InfoCube; instead it shows duplicates. The DataSource 'preview data' option shows the required data (i.e., the modified flat file). I made all the necessary changes in the DataSource, InfoCube, InfoPackage, and DTP, but I still get duplicates, even after deleting the data in the InfoCube. What is the ideal solution for this problem? One way is to create a new DataSource with the modified flat file, but I don't think that is ideal. What is the possible solution without creating the DataSource again?
Edited by: dharmatejandt on Oct 14, 2010 1:46 PM
Edited by: dharmatejandt on Oct 14, 2010 1:52 PM
Edited by: dharmatejandt on Oct 14, 2010 1:59 PM

Finally I got it. I deleted the request IDs in the InfoPackage (right-click the InfoPackage and choose Manage), then executed the transformation and the DTP, and finally got the required output without duplicates.
Edited by: dharmatejandt on Oct 14, 2010 4:05 PM -
Duplicate records for the same computer
I have recently deployed SCCM 2007 SP1 with R2 for a customer. We are currently in the process of deploying imaging and are having an issue with duplicate computers showing up. The duplicates appear to be for the same computer. I have read everything I can on duplicate computers showing up, but none of it has helped. Basically, we have our base image, which is just a basic install of XP SP2 with all updates. We lay down the image on a clean computer using an OSD task sequence, and when the machine shows up in the all systems collection, there are two records for it. The current one is called SCCMTEST01.
The first record shows the following:
Name: SCCMTEST01
Resource Type: System
Domain: OURDOMAIN
Site: 001
Client: Yes
Approved: Approved
Assigned: Yes
Blocked: No
Client Type: Advanced
Obsolete: No
Active: Yes
Properties:
Agent Name[0]: MP_ClientRegistration
Agent Name[1]: Heartbeat Discovery
The second record shows the following:
Name: SCCMTEST01
Resource Type: System
Domain: OURDOMAIN
Site: 001
Client: No
Approved: N/A
Assigned: Yes
Blocked: No
Client Type:
Obsolete:
Active:
Properties:
Agent Name[0]: SMS_AD_SYSTEM_DISCOVERY_AGENT
Agent Name[1]: SMS_AD_SYSTEM_GROUP_DISCOVERY_AGENT
The first one appears to be the active one, since it includes more information and the client is listed as installed. Does anyone have any suggestions on what might be misconfigured?
thanks,
Justin

I'm experiencing the same behaviour as described above. Since I am not using any OS distribution method other than scripted installation from RIS, explanations based on OS deployment behaviour obviously do not apply in my case.
The thing is that an object is discovered by Heartbeat, MP_Registration, SMS_AD_SYSTEM and SMS_AD_SYSTEM_GROUP... until suddenly the SMS_AD_SYSTEM_GROUP discovery agent starts generating a new object and updates that object from then on. Heartbeat from the client still updates the original object. This happens for about five objects a day (among 2600 altogether), and it's not the same computers each time (apart from some troublesome clients...).
When I find such a duplicate object, I have kept the one generated by SMS_AD_SYSTEM... and then reinitiated a Heartbeat Discovery on that object. Doing this makes Heartbeat update the new object (now that the original is gone), and everything seems to work for a while, although I have to manually approve the object again.
I cannot work out what makes SMS_AD_SYSTEM... generate a new object.
Does anyone have an idea?
/Jakob -
Duplicate records exists in same request of a PSA
Dear SAP Professionals,
Please look into below issue I am facing.
1) I have only 1 request (green) in PSA.
2) When I am looking into that request, I found there are 4 Packages.
3) In Packages 3 and 4, I found that the same records have been extracted twice.
I am not able to understand why this is happening.
I ran the extractor on the R/3 side and found that only 7 records exist for that particular condition.
But when I look at the PSA maintenance screen, there are 14 records for the same condition (7 records from Package 3 + 7 records from Package 4).
Request you to provide the necessary guidance.
Many thanks in advance.
Best Regards,
Ankur Goyal

Hello Ankur,
You didn't mention whether you are loading master data or transaction data.
If you are loading master data, it will not create any problem, because the records will be overwritten: Package 3 is loaded first, then its data is overwritten by Package 4, so there will be only 7 records in the data target.
But if you are loading transaction data, pick the data once again from R/3 and check; for this you have to delete the previous request.
Thanks
Abha
Edited by: Abha Sahu on Jan 29, 2010 3:50 PM -
Can anyone help me out with a master data load failure.
My request turns red and the messages read something like "Record filtered in advance as error records with the same key exist" and "Filter out new records with same key". I have already checked the 'Handle Duplicate Record Keys' setting in the DTP, so why am I still getting this error?
Would appreciate it if someone could suggest something.

Hello Shridevi M,
When I execute my DTP, I get a master data load failure: "Record filtered in advance as error records with the same key exist". It's the same one you got. Please tell me how you solved this problem.
Thanks in advance,
Abdellatif -
How to Prevent the Duplicate record insertion by JDBC Receiver?
HI Experts,
I have File Sender to JDBC Receiver scenario.I am using SAP PI 7.1 system.
The purpose of this scenario is to read data from a specified file and INSERT or UPDATE the same records in an Oracle database.
Before inserting the data, I want to check the table in order to prevent a runtime error (duplication of key fields).
So if records with the same key fields already exist in the table, I want to UPDATE the record instead of INSERTing it.
Please suggest a possible way to do this.
Thanks & Regards
Jagesh

Hi Jagesh,
The JDBC receiver format has the following structure
The JDBC receiver format has the following structure:

<root>
  <StatementName1>
    <dbTableName action="UPDATE" | "UPDATE_INSERT">
      <table>Table1</table>
      <access>
        <col1>val1</col1>
        <col2>val2new</col2>
      </access>
      <key>
        <col2>val2old</col2>
        <col4>val4</col4>
      </key>
    </dbTableName>
  </StatementName1>
</root>

This structure creates a query in this way:
UPDATE Table1 SET col1=val1, col2=val2new WHERE col2=val2old AND col4=val4 (with UPDATE_INSERT, an INSERT is issued instead when no row matches the key).
The key field works like a Where clause. So for the receiver format when you specify the key, it checks the DB with that field as the where criteria. Hence for a UPDATE_INSERT query, when you specify the key field, it checks whether that value specified within the key field exists in the DB or not. Then it proceeds to execute the statement for only those values.
Hence it makes sense to provide a primary key attribute in the key field of the JDBC receiver format.
Regards,
Kshitij -
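The rule described above, access columns become the SET list and key columns become the WHERE clause, can be sketched as a small statement builder. The table and column names are taken from the example in this thread; this only mirrors the idea, not the PI adapter's actual code:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class UpdateInsertSketch {
    // Builds the UPDATE that corresponds to an UPDATE_INSERT message:
    // <access> columns form the SET list, <key> columns the WHERE clause.
    static String buildUpdate(String table,
                              Map<String, String> access,
                              Map<String, String> key) {
        String set = access.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(", "));
        String where = key.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(" AND "));
        return "UPDATE " + table + " SET " + set + " WHERE " + where;
    }

    public static void main(String[] args) {
        Map<String, String> access = new LinkedHashMap<>();
        access.put("col1", "val1");
        access.put("col2", "val2new");
        Map<String, String> key = new LinkedHashMap<>();
        key.put("col2", "val2old");
        key.put("col4", "val4");
        System.out.println(buildUpdate("Table1", access, key));
    }
}
```

With UPDATE_INSERT, the adapter falls back to an INSERT when the WHERE clause matches no rows, which is why putting the primary key in <key> gives upsert behavior.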
How to avoid retrieve duplicate records from SalesLogix
I wanted to know if you could assist me. I am now responsible for reporting our inside sales activities, which each month include outbound calls made, opportunities created, opportunity dollars won, etc. We use SalesLogix as our tool. I have been working with Business Objects, exporting this information from SalesLogix, and have pretty much created the report I need. The only problem is that it pulls in duplicate records with the same opportunity ID number, because my query is based on "campaign codes" attached to SLX opportunities. When an opportunity is created in SLX, it automatically assigns an opportunity ID (ex: OQF8AA008YQB), which is distinctive. However, when we attach more than one "campaign code" to the opportunity, the opportunity ID is pulled in that many more times.
Is there a way to filter, or to retrieve only one ID record regardless of how many campaign codes are attached? All the information attached to the opportunity is the same except the "campaign code", which makes it two records since I pull by "campaign code".
My greatest appreciation!

Hi,
If you have CAMPAIGN CODE in your query and display it in your report, the report will definitely show multiple rows per OPPORTUNITY ID, one for each CAMPAIGN CODE it has.
If you would like to have just one row for OPPORTUNITY ID, then you will need to remove CAMPAIGN CODE from your report. -
Duplicate record with same primary key in Fact table
Hi all,
Can the fact table have duplicate records with the same primary key? When I checked a cube, I could see records with the same primary key combination but different key figure values. My cube has 6 dimensions (including Time, Unit and DP) and 2 key figures, so 6 fields combine to form the composite primary key of the fact table. When I checked the records in SE16, I could see duplicate records with the same primary key. There is no parallel loading happening for the cube.
BW system version is 3.1
Data base is : Oracle 10.2
I am not sure how is this possible.
Regards,
PMHi Krish,
I checked the data packet dimension also. Both records have the same dimension ID (141). Except for the key figure values, there is no other difference in the fact table records. I know this is against the basic DBMS primary key rule, but I have records like this in the cube.
Can this situation arise when the same record is in different data packets of the same request?
Thx,
PM