Duplicate records exist in the same request of a PSA

Dear SAP Professionals,
Please look into the issue below that I am facing.
1) I have only 1 request (green) in the PSA.
2) When I look into that request, I find there are 4 packages.
3) In Packages 3 & 4, I found that the same records have been extracted twice.
I am not able to understand why this is happening.
I tried to run the extractor on the R/3 side and found that only 7 records exist for that particular condition.
But when I look into the PSA maintenance screen, 14 records exist for the same condition (7 records of Pkg 3 + 7 records of Pkg 4).
Request you to provide the necessary guidance.
Many thanks in advance.
Best Regards,
Ankur Goyal

Hello Ankur,
You didn't mention whether you are loading master data or transaction data.
  If you are loading master data, it will not cause a problem, because the records will be overwritten: Package 3 is loaded first, and then the data of Package 4 overwrites it with the same keys, so there will be only 7 records in the data target.
    But if you are loading transaction data, delete the previous request, pick the data from R/3 once again, and check the result.
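To illustrate the overwrite behaviour described above, here is a small sketch (plain Python standing in for the BW master data load; the material keys and values are invented):

```python
# Master-data attribute loads overwrite on key: a later package with the
# same keys replaces the earlier one instead of adding rows.
package_3 = [("MAT%03d" % i, "from package 3") for i in range(7)]
package_4 = [("MAT%03d" % i, "from package 4") for i in range(7)]

master_table = {}
for key, value in package_3 + package_4:  # load packages in arrival order
    master_table[key] = value             # same key -> overwrite, not append

print(len(master_table))  # 7 records in the target, not 14
```

This is why the 14 PSA rows collapse to 7 in a master data target, while a transaction data target would add them up.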
Thanks
Abha
Edited by: Abha Sahu on Jan 29, 2010 3:50 PM

Similar Messages

  • Duplicate records in the same table

    Hello.
    I want to duplicate records in the same table, changing just two date fields and id. Id is changed with sequence.
    Example:
    id name start_date end_date
    1 record_1 01.01.04 31.12.04
    2 record_2 01.01.03 31.12.03
    I want to duplicate the records whose start_date is in year 04; the duplicated record should have its start_date changed to year 05.
    After duplicate it should look like:
    1 record_1 01.01.04 31.12.04
    2 record_2 01.01.03 31.12.03
    3 record_1 01.01.05 31.12.05
    What should my insert statement look like?
    Thanks

    create sequence A_SEQ
       start with 3
       nocache;

    insert into tableA
            (ID, name, start_date, end_date)
       select
               A_SEQ.nextval
              ,name
              ,add_months(start_date, 12)  -- shift both dates one year forward
              ,add_months(end_date, 12)
         from
               tableA
        where
               start_date >= to_date('01.01.2004','dd.mm.yyyy')
          and  start_date <  to_date('01.01.2005','dd.mm.yyyy');
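For readers who want to trace the logic outside the database, here is a rough Python equivalent of the insert-select above (the `plus_12_months` helper is an invented stand-in for Oracle's ADD_MONTHS, valid for these year-aligned dates):

```python
import datetime

rows = [
    (1, "record_1", datetime.date(2004, 1, 1), datetime.date(2004, 12, 31)),
    (2, "record_2", datetime.date(2003, 1, 1), datetime.date(2003, 12, 31)),
]
next_id = 3  # mirrors the sequence's "start with 3"

def plus_12_months(d):
    # stand-in for add_months(d, 12) on year-aligned dates
    return d.replace(year=d.year + 1)

new_rows = []
for _id, name, start, end in rows:
    if start.year == 2004:  # same filter as the WHERE clause
        new_rows.append((next_id, name, plus_12_months(start),
                         plus_12_months(end)))
        next_id += 1

rows += new_rows
print(rows[-1][0], rows[-1][1])  # 3 record_1
```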

  • Duplicate records generated on same Key in Layout !!

    Dear Experts,
    We are facing a serious problem in our Forecasting Application.
    It is generating duplicate records with the same key, which leads to wrong results and negative numbers.
    Also, since this does not happen constantly but only at certain times, it is difficult to reproduce the error and find a solution.
    If anyone have came across similar problem and found the solution please reply.
    Appreciate any help and suggestion on this....
    Thanks..

    Dear All,
    No, we haven't compressed the cube, and we are using the same h/w and system as earlier. Actually a lot of forecasting data was entered in UAT and in production, but we never came across this issue. Now that this is a problem in production, it is very serious.
    As suggested by SAP we implemented note 1179076 (Incorrect data in planning), but it is too general and we are not sure whether our problem has been resolved by it, as the problem is not consistent; it occurs randomly.
    we are on ABAP stack 18 and Java stack 16.
    Thanks a lot Krishna for the note, but since we are not using BIA it will not help.
    Please suggest any more ways to debug / solve this strange problem.
    Regards,
    Jasmi

  • Duplicate e-mails for same request

    Hi all,
    The user is getting almost 30 repeated e-mails for the same access approval in the GRC landscape.
    I am not sure how the e-mail notifications work in the GRC landscape. Could someone shed more light on this and let me know how I can stop these repeated e-mails from being generated for the same request?
    Regards
    Bharathwaj V

    Hello Bharathwaj,
    This issue existed in some of the earlier SPs of AE 5.2 but was resolved in SP 08. Also, the original issue occurred for customers using a clustered environment, so I doubt that this is an issue on the AE side here. Try the following recommendations, and if the issue is not resolved, raise an OSS message:
    1) Check the VIRS_AE_EMLLOG table and see if you have multiple entries for the notifications and reminders for requests. If feasible delete the OPEN entries. Do take assistance from your DBA on this.
    2) In the stage level settings, uncheck the Other Approvers option in the Notification configuration.
    3) Restart the system and see if you still face this issue.
    4) Run the email dispatcher jobs in immediate mode and see if you still receive multiple emails.
    Harleen
    SAP GRC RIG

  • Duplicate record found short dump, if routed through PSA

    Hi Experts,
    I am getting these errors extracting the master data for 0ACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
    If I extract via PSA with the option to ignore duplicate records, I get a short dump:
    ShrtText                                            
        An SQL error occurred when accessing a table.   
    How to correct the error                                                             
         Database error text........: "ORA-14400: inserted partition key does not map to  
          any partition"                                                                  
    What is causing the errors in my extraction?
    thanks
    D Bret

    Go to RSRV, then All Elementary Tests --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information, run the test, and correct the error.

  • Duplicate records: Process : Delete Overlapping Requests from InfoCube

    Hi Experts,
    We are loading data into a standard costing cube with the standard Full upload option. In our process chain we have included the process type "Delete Overlapping Requests from InfoCube". In our scenario we always load yesterday's and today's data. In this case, after loading yesterday's data we need to check and delete the overlapping requests, and then upload today's data.
    Many times this deletion process fails with the message "Couldn't lock cube" because the cube is already locked by user ALEREMOTE. This causes the system to duplicate the records in the cube.
    How we can avoid this?
    Alok

    I tried running again and it again failed. Checked in SM12 and found this entry
    800     ALEREMOTE     08/14/2007     E     RSENQ_PROT_ENQ     CREA_INDX      ZCCA_C11                      DATATARGET     CREA_INDX                     ######################################     0     1
    This lock has not been released since the 14th. Is there a way to remove the lock using some process?

  • Duplicate records for the same computer

    I have recently deployed SCCM 2007 SP1 with R2 for a customer.  We are currently in the process of deploying imaging and are having an issue with duplicate computers showing up.  The duplicates appear to be for the same computer.  I have read everything I can on duplicate computers showing up, but none of it has helped.  Basically, we have our base image, which is just a basic install of XP SP2 with all updates.  We lay down the image on a clean computer using an OSD task sequence, and when the machine shows up in the all systems collection, there are two records for it.  The current one is called SCCMTEST01.
    The first record shows the following:
    Name: SCCMTEST01
    Resource Type: System
    Domain: OURDOMAIN
    Site: 001
    Client: Yes
    Approved: Approved
    Assigned: Yes
    Blocked: No
    Client Type: Advanced
    Obsolete: No
    Active: Yes
    Properties:
    Agent Name[0]: MP_ClientRegistration
    Agent Name[1]: Heartbeat Discovery
    The second record shows the following:
    Name: SCCMTEST01
    Resource Type: System
    Domain: OURDOMAIN
    Site: 001
    Client: No
    Approved: N/A
    Assigned: Yes
    Blocked: No
    Client Type:
    Obsolete:
    Active:
    Properties:
    Agent Name[0]: SMS_AD_SYSTEM_DISCOVERY_AGENT
    Agent Name[1]: SMS_AD_SYSTEM_GROUP_DISCOVERY_AGENT
    The first one appears to be the active one, since it includes more information and the client is listed as installed.  Does anyone have any suggestions on what might be misconfigured?
    thanks,
    Justin

    I'm experiencing the same behaviour as described above. Since I am not using any OS distribution method other than scripted installation from RIS, explanations based on OS Deployment behaviour are obviously not applicable in my case.
    The thing is that an object is discovered by Heartbeat, MP_Registration, SMS_AD_SYSTEM and SMS_AD_SYSTEM_GROUP... until suddenly the SMS_AD_SYSTEM_GROUP discovery agent starts generating a new object and updates that object from that moment on. Heartbeat from the client still updates the original object. This happens for about five objects a day (among 2,600 altogether), and it is not the same computers each time (apart from a few troublesome clients...).
    When I find such a duplicate object, I keep the one generated by SMS_AD_SYSTEM... and then reinitiate a Heartbeat Discovery on that object. This makes Heartbeat update the new object (now that the original is gone), and everything seems to work for a while, although I have to manually approve that object again.
    I cannot work out what makes SMS_AD_SYSTEM... generate a new object.
    Does anyone have an idea?
    /Jakob

  • Join creation between 2 tables while duplicate records exist

    Hi all ,
    I have the following 2 tables. I need to join both tables on the SAM_CUST_ID column in ABC (the detail table) and DE_CUST_ID (UK) in XYZ (the master table). But both columns contain duplicate customer values, because we load these tables from another database. Is there any solution for how I can join these tables using DE_CUST_ID and SAM_CUST_ID? If I create ENABLE NOVALIDATE constraints on these columns, will they work? Any suggestion would be highly appreciated.
    Regards
    REM TEST XYZ
      CREATE TABLE "TEST"."XYZ"
       (     "DE_BANK_ID" VARCHAR2(255 BYTE),
         "DE_CUST_ID" NUMBER(*,0),
         "XYZ_NO" NUMBER(*,0),
         "DE_HOLD_OUTPUT" VARCHAR2(255 BYTE),
         "DE_SHORT_NAME" VARCHAR2(255 BYTE),
         "DE_NAME1" VARCHAR2(255 BYTE),
         "DE_NAME2" VARCHAR2(255 BYTE),
         "DE_STREET_ADDR" VARCHAR2(255 BYTE),
         "DE_TOWN_COUNTRY" VARCHAR2(255 BYTE),
         "DE_POST_CODE" NUMBER(*,0),
         "DE_COUNTRY" VARCHAR2(255 BYTE),
         "DE_COPIES" NUMBER(*,0),
         "DE_EMAIL" VARCHAR2(255 BYTE),
         "DE_LOCAL_A1" VARCHAR2(255 BYTE),
         "DE_LOCAL_A2" VARCHAR2(255 BYTE),
         "DE_LOCAL_A3" VARCHAR2(255 BYTE),
         "DE_LOCAL_N1" FLOAT(126),
         "DE_LOCAL_N2" FLOAT(126),
         "DE_LOCAL_N3" FLOAT(126),
          CONSTRAINT "XYZ_UNIQUE" UNIQUE ("DE_CUST_ID", "XYZ_NO") ENABLE
       );
    REM TEST XYZ_IDX1
      CREATE UNIQUE INDEX "WH1"."XYZ_IDX1" ON "WH1"."XYZ" ("DE_CUST_ID", "XYZ_NO");
    REM TEST ABC
      CREATE TABLE "TEST"."ABC"
       (     "ABC_BANK_ID" VARCHAR2(255 BYTE),
         "ABC_CUST_ID" NUMBER(22,0),
         "ABC_ABC_ID" VARCHAR2(255 BYTE),
         "ABC_PORT_NAME" VARCHAR2(255 BYTE),
         "ABC_CCY" VARCHAR2(255 BYTE),
         "ABC_INDICATOR" NUMBER(*,0),
         "ABC_MANAGED_ACCOUNT" VARCHAR2(255 BYTE),
         "ABC_CLOSED_DATE" DATE,
         "ABC_INVESTMENT_PGM" VARCHAR2(255 BYTE),
         "ABC_ACC_OFF" NUMBER(*,0),
         "ABC_FREQUENCY" VARCHAR2(255 BYTE),
          CONSTRAINT "ABC_UNIQUE" UNIQUE ("ABC_ABC_ID") ENABLE,
          CONSTRAINT "ABC_DE_ADD_CUSID_FK" FOREIGN KEY ("ABC_CUST_ID")
           REFERENCES "WH1"."DE_ADDR" ("DE_CUST_ID") ENABLE NOVALIDATE
       );
    REM WH1 ABC_UNIQUE
      CREATE UNIQUE INDEX "WH1"."ABC_UNIQUE" ON "WH1"."ABC" ("ABC_ABC_ID");
     

    Would you like to explain with an example? First of all, why not remove the duplicates?
    If you need to keep duplicates, your entire design looks wrong.
    You have to follow normalization rules and always have an LPK (logical primary key) and a PK for transactional tables.
    A simple example:
    A1 and B2 are duplicates.
    After data is loaded it should look like this
    Table 1
    id val
    1 A1
    2 A1
    3 B2
    4 B2
    Table 2
    id val
    1 A1
    2 A1
    3 B2
    4 B2
    (id - a unique sequence)
    Of course there are more possibilities, but
    nobody can tell you how to do this (or what to do) based on what you have shown.
    Good luck
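    To make the row-multiplication concrete, here is a small sketch in Python (the customer IDs and names are invented; this is not the poster's schema):

    ```python
    # Joining on a non-unique key multiplies rows. De-duplicating the master
    # side first gives one match per detail row.
    master = [(100, "ACME"), (100, "ACME"), (200, "GLOBEX")]  # key 100 twice
    detail = [(100, "P-1"), (200, "P-2")]

    naive_join = [(d, m) for d in detail for m in master if d[0] == m[0]]
    print(len(naive_join))  # 3 -- customer 100 matched twice

    dedup = dict(master)    # keeps one row per key
    clean_join = [(d, (d[0], dedup[d[0]])) for d in detail if d[0] in dedup]
    print(len(clean_join))  # 2 -- one match per detail row
    ```

    In SQL the equivalent of `dict(master)` would be selecting a de-duplicated view of the master table (e.g. one row per DE_CUST_ID) before joining.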

  • Error due to duplicate records

    Hello friends,
    I have done a full upload to a particular characteristics InfoObject using direct update (PSA and directly to the data target). I used the "PSA and subsequently into data target" option.
    When I load the data into the object through a process chain, I get an error that duplicate records exist, and the request turns red in the PSA.
    But no duplicate records exist in the data package, and when we try to manually load the records from PSA to the data target it works fine.
    Can any one try to throw some lights on this error?
    Regards
    Sre....

    Hello Roberto and Paolo
    There was an OSS note saying that we should not use that option, only "PSA with delete duplicate records and update into data target".
    I don't know the exact reason.
    Can you throw some lights on this, Why its like that?
    Thanks for the reply paolo and roberto
    Regards
    Sri

  • How to avoid retrieve duplicate records from SalesLogix

    I wanted to know if you could assist me. I am now responsible for reporting our inside sales activities, which includes (each month) outbound calls made, opportunities created, opportunities won $, etc. We use SalesLogix as our tool. I have been working with Business Objects, exporting this information from SalesLogix, and have pretty much created the report I need. The only problem I have is that it pulls in duplicate records with the same opportunity ID number, because my query is based on "campaign codes" attached to SLX opportunities. When an opportunity is created in SLX, it automatically assigns an opportunity ID (ex: OQF8AA008YQB), which is distinctive. However, when we attach more than one "campaign code" to this opportunity, it pulls in the opportunity ID that many more times.
    Is there a way to filter or retrieve only one ID record number regardless of how many campaign codes are attached? All the information attached to the opportunity is the same, with the exception that the "campaign code" is different, which makes it two records since I pull by "campaign code".
    My greatest appreciation!

    Hi,
    If you have CAMPAIGN CODE in your query and display it in your report, then it will display multiple rows for the OPPORTUNITY ID, one for each CAMPAIGN CODE it has.
    If you would like to have just one row per OPPORTUNITY ID, then you will need to remove CAMPAIGN CODE from your report.
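    The same idea, sketched in Python (the second opportunity ID and the amounts are invented; only the first ID comes from the post):

    ```python
    # Each opportunity appears once per attached campaign code; keeping the
    # first row per opportunity ID collapses the duplicates.
    rows = [
        ("OQF8AA008YQB", "CAMP-A", 5000),
        ("OQF8AA008YQB", "CAMP-B", 5000),  # same opportunity, 2nd campaign
        ("OQF8AA009AAA", "CAMP-A", 1200),
    ]

    seen = {}
    for opp_id, campaign, amount in rows:
        seen.setdefault(opp_id, (opp_id, amount))  # first occurrence wins

    unique_rows = list(seen.values())
    print(len(unique_rows))  # 2 opportunities instead of 3 rows
    ```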

  • Master Data Load Failure- duplicate records

    Hi Gurus,
    I am a new member in SDN.
    Now I work in BW 3.5. I got a data load failure today. The error message says that there are 5 duplicate records. The processing is into the PSA and then to the InfoObject. I checked the PSA, and the data is available in the PSA. How can I avoid these duplicate records?
    Please help me, I want to fix this issue immediately.
    regards
    Milu

    Hi Milu,
    If it is a direct update, you won't have any request for it.
    The data goes directly to the master data tables, so there is no Manage tab for that InfoObject in which to see the request.
    Whereas in the case of flexible update, you have update rules from your InfoSource to the InfoObject, so you can delete the request in that case.
    Check this link for flexible update of master data
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/37dda990-0201-0010-198f-9fdfefc02412

  • Duplicate record with same primary key in Fact table

    Hi all,
       Can the fact table have duplicate records with the same primary key? When I checked a cube I could see records with the same primary key combination, but the key figure values are different. My cube has 6 dimensions (including Time, Unit and DP) and 2 key figures, so 6 fields combine to form the composite primary key of the fact table. When I checked the records in SE16 I could see duplicate records with the same primary key. There is no parallel loading happening for the cube.
    BW system version is 3.1
    Data base is : Oracle 10.2
    I am not sure how is this possible.
    Regards,
    PM

    Hi Krish,
       I checked the data packet dimension also. Both records have the same dimension ID (141). Except for the key figure value there is no other difference in the fact table records. I know this is against the basic DBMS primary key rule, but I have records like this in the cube.
    Can this situation arise when the same record is in different data packets of the same request?
    Thx,
    PM

  • Loading ODS - Data record exists in duplicate within loaded data

    BI Experts,
    I am attempting to load an ODS with the Unique Data Records flag turned ON. The flat file I am loading is a crosswalk with four fields, of which the first 3 are being used as Key Fields in order to make the records unique. I have had this issue before, but gave up in frustration and added an ascending record count field to simply create a unique key. This time I would like to solve the problem if possible.
    The errors come back referring to two data rows that are duplicate:
    Data record 1 - Request / Data package / Data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 339
    Data record 2 - Request / data package / data record: REQU_4CNUD93Q3RCC80XFBG2CZXJ0T/000003/ 338
    And below here are the two records that the error message refers to:
    3     338     3902301480     19C*     *     J1JD     
    3     339     3902301510     19C*     *     J1Q5     
    As you can see, the combination of my three Key Fields should not be creating a duplicate: (3902301480, 19C*, *) and (3902301510, 19C*, *).
    Is there something off with the numbering of the data records? Am I looking in the wrong place? I have examined my flat file and cannot find duplicates, and the records that BW says are duplicates are not. I am really having a hard time with this; any and all help greatly appreciated!

    Thank you for the response Sabuj....
    I was about to answer your questions, but I wanted to try one more thing, and it actually worked. I simply moved the MOST unique Key Field to the TOP of my Key Field list. It was at the bottom before.
    FYI for other people with this issue -
    Apparently the ORDER of your Key Fields is important when trying to avoid creating duplicate records.
    I am using four data fields, and was using three of them as the Key Fields. Any combination of all three would NOT have a duplicate; however, when BW finds that the first two key fields match, it apparently sometimes doesn't consider the third one, which would make the row unique. By simply changing the order of my Key Fields I was able to stop getting the duplicate row errors.
    Lesson: if you KNOW that your records are unique, and you are STILL getting errors for duplicates, try changing the ORDER of your key fields.
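    The uniqueness claim above can be checked outside BW with a quick sketch (plain Python; the two rows are the example records from the post). It also shows why comparing only a prefix of the key fields, as the system apparently did, reports false duplicates:

    ```python
    # Uniqueness of a composite key depends on the full tuple, not on any
    # subset of it.
    rows = [
        ("3902301480", "19C*", "*", "J1JD"),
        ("3902301510", "19C*", "*", "J1Q5"),
    ]

    keys = [(r[0], r[1], r[2]) for r in rows]   # all three key fields
    print(len(set(keys)) == len(keys))          # True: no duplicates

    prefix = [(r[1], r[2]) for r in rows]       # only the two shared fields
    print(len(set(prefix)) == len(prefix))      # False: prefix collides
    ```

    Putting the most selective field first means any prefix comparison already distinguishes the rows.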

  • Record filtered in advance as error records with the same key exist

    Dear Masters
       I am getting the error below for master data (0MATERIAL_ATTR) while loading data from the PSA to the InfoObject through a DTP (I have also set the Handle Duplicate Record Keys option). When I check the error data in the Error Stack, I get the error below for all records (100):
    "Record filtered in advance as error records with the same key exist"
    Could you please help me to resolve this issue?
    Thanks in Advance
    Raja

    Hi
    Thanks for reply
    I have loaded the delta data into the PSA. In the PSA the data was loaded successfully. I got the failure only when I loaded from the PSA to the master data.
    Thanks in advance
    Raja

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour with Oracle Data Loader where duplicate records are created (even if I set the option duplicatecheckoption=externalid)? When I check the "Import Request Queue" view, the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records where the "External Unique ID" already exists.
    Very strange: when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
    I know the Data Loader has 2 methods, one for update and the other for import; however, I do not expect the import to create duplicates if the record already exists. It should rather do nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen
    Edited by: 791265 on 27.08.2010 07:25
    Edited by: 791265 on 27.08.2010 07:26

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior --- Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete
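    Given that insert mode performs no duplicate check, one practical workaround is to pre-filter the import file against the external IDs already in the system. A sketch in Python, under the assumption that you can export the list of existing external IDs first (the IDs and field names here are invented):

    ```python
    # Split an import file into rows safe to insert and rows that must be
    # sent through the update (overwrite) path instead.
    existing_ids = {"EXT-001", "EXT-003"}  # exported from the target system

    import_rows = [
        {"external_id": "EXT-001", "name": "Acme"},     # would duplicate
        {"external_id": "EXT-002", "name": "Globex"},   # safe to insert
        {"external_id": "EXT-003", "name": "Initech"},  # would duplicate
    ]

    to_insert = [r for r in import_rows
                 if r["external_id"] not in existing_ids]
    to_update = [r for r in import_rows
                 if r["external_id"] in existing_ids]
    print(len(to_insert), len(to_update))  # 1 2
    ```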
