Error RSMPTEXTS~: Duplicate record during EHP5 upgrade in phase SHADOW_IMPORT_INC

Hi experts,
I am getting this error during an EHP5 upgrade in phase SHADOW_IMPORT_INC:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SHADOW IMPORT ERRORS and RETURN CODE in SAPK-701DOINSAPBASIS.ERD
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2EETW000 Table RSMPTEXTS~: Duplicate record during array insert occured.
2EETW000 Table RSMPTEXTS~: Duplicate record during array insert occured.
1 ETP111 exit code           : "8"
Here is also the last part of the log SAPK-701DOINSAPBASIS.ERD:
4 ETW000 Totally 4 tabentries imported.
4 ETW000 953984 bytes modified in database.
4 ETW000  [     dev trc,00000]  Thu Aug 11 16:58:45 2011                                             7954092  8.712985
4 ETW000  [     dev trc,00000]  Disconnecting from ALL connections:                                       28  8.713013
4 ETW000  [     dev trc,00000]  Disconnecting from connection 0 ...                                       38  8.713051
4 ETW000  [     dev trc,00000]  Closing user session (con=0, svc=0000000005C317C8, usr=0000000005C409A0)
4 ETW000                                                                                8382  8.721433
4 ETW000  [     dev trc,00000]  Detaching from DB Server (con=0,svchp=0000000005C317C8,srvhp=0000000005C32048)
4 ETW000                                                                                7275  8.728708
4 ETW000  [     dev trc,00000]  Now I'm disconnected from ORACLE                                        8648  8.737356
4 ETW000  [     dev trc,00000]  Disconnected from connection 0                                            84  8.737440
4 ETW000  [     dev trc,00000]  statistics db_con_commit (com_total=13, com_tx=13)                        18  8.737458
4 ETW000  [     dev trc,00000]  statistics db_con_rollback (roll_total=0, roll_tx=0)                      14  8.737472
4 ETW000 Disconnected from database.
4 ETW000 End of Transport (0008).
4 ETW000 date&time: 11.08.2011 - 16:58:45
4 ETW000 1 warning occured.
4 ETW000 1 error occured.
1 ETP187 R3TRANS SHADOW IMPORT
1 ETP110 end date and time   : "20110811165845"
1 ETP111 exit code           : "8"
1 ETP199 ######################################
4 EPU202XEND OF SECTION BEING ANALYZED IN PHASE SHADOW_IMPORT_INC
I have already tried using the latest version of R3trans.
Can you help me?
Thanks a lot,
Franci

Hello Francesca,
I am also facing the same error during an EHP5 upgrade. If you know the steps to solve it, please share them.
Thanks,
Venkat
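The shadow import aborts because the transport attempts an array insert into the shadow table RSMPTEXTS~ while a record with the same key already exists there. A first diagnostic step is to check the shadow table for duplicate key combinations before repeating the phase. Below is a minimal sketch of that check, using sqlite3 as a stand-in for the actual Oracle database; the key columns shown are illustrative assumptions only, so look up the real key of RSMPTEXTS in SE11 before running a comparable query against the live system.

```python
import sqlite3

# Stand-in for the shadow table RSMPTEXTS~. The key columns (object, langu)
# are illustrative assumptions -- verify the real table key in SE11 first.
con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "RSMPTEXTS~" (object TEXT, langu TEXT, text TEXT)')
con.executemany('INSERT INTO "RSMPTEXTS~" VALUES (?, ?, ?)', [
    ("MENU1", "E", "Sales"),
    ("MENU1", "E", "Sales (duplicate)"),  # the kind of row that breaks an array insert
    ("MENU2", "E", "Purchasing"),
])

# List key combinations that occur more than once
dupes = con.execute(
    'SELECT object, langu, COUNT(*) FROM "RSMPTEXTS~" '
    'GROUP BY object, langu HAVING COUNT(*) > 1'
).fetchall()
print(dupes)  # -> [('MENU1', 'E', 2)]
```

Any key reported by such a query is a candidate for cleanup before the phase is restarted.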

Similar Messages

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 Duplicate records found". I checked the PSA data, but there were no duplicate records in it.
    Once the data upload fails, I manually update the failed packet and it goes through fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope this clears your doubt; otherwise let me know.
    You can see the same in my previous thread:
    Re: duplicate records found while loading master data (very urgent)
    Regards
    Kiran
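    The behaviour Kiran describes (the last record in the request wins, earlier records with the same key are silently dropped) can be sketched as follows. The record layout and field names are hypothetical illustrations, not the actual BI structures.

```python
def dedup_last_wins(records, key_fields):
    """Keep only the last record per key, mimicking the
    'Ignore Duplicate Data Records' indicator."""
    latest = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        latest[key] = rec  # later records overwrite earlier ones
    return list(latest.values())

request = [
    {"MATERIAL": "M1", "TEXT": "old text"},
    {"MATERIAL": "M2", "TEXT": "other"},
    {"MATERIAL": "M1", "TEXT": "new text"},  # same key -> this one is kept
]
# M1 survives with "new text"; the earlier M1 record is ignored
print(dedup_last_wins(request, ["MATERIAL"]))
```

    This is why serial updating through the PSA matters: the notion of "last record in the request" is only well defined when the request is processed in order.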

  • Error:1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

    Hi everyone,
    I get a duplicate data records error when trying to load master data to an InfoObject.
    I have tried loading to PSA and then to the InfoObject. The PSA load does not show any errors, but the load to the InfoObject fails again.
    I have also tried loading to PSA with the options 'Ignore double data records' and 'Update subsequently in data targets'. I still get the error.
    Any suggestions?
    Thanks in advance,
    Maarja

    Take a look at these links below:
    http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    (Use option 1 C under activities)

  • Error in duplicate record validation in EntityObject using unique key constraint

    I have implemented unique key validation in an Entity Object by creating an alternate key in the Entity Object.
    The problem is that whenever a duplicate record is found, the duplicate record error is shown, but the page becomes blank and no error appears in the log.
    I would like to know the possible cause of this.
    I am using JDeveloper 11.1.2.4.

    After duplication, clear the PK item, then populate from the sequence in a PRE-INSERT block-level trigger.
    Francois

  • Error due to duplicate records

    Hello friends,
    I have done a full upload to a particular characteristics InfoObject using direct update (PSA and directly to data target). I used the PSA and Subsequently into Data Target option.
    When I load the data into the object through a process chain, I get an error that duplicate records exist and the request turns red in PSA.
    But no duplicate records exist in the data package, and when we try to manually load the records from PSA to the data target it works fine.
    Can anyone throw some light on this error?
    Regards
    Sre....

    Hello Roberto and Paolo
    There was an OSS note saying that we should not use that option, only PSA with Delete Duplicate Records and update into data target.
    I don't know the exact reason.
    Can you throw some light on why that is?
    Thanks for the reply paolo and roberto
    Regards
    Sri

  • Setting change to correct duplicate records ..

    Hi Experts,
    I tried loading a flat file to an ODS in the quality system and got some duplicate rows, which I had to correct manually in the PSA.
    When I tried loading the same flat file to the same ODS in production, I did not get the 'duplicate records' error message.
    Can you please tell me which setting has to be changed in quality to allow duplicate records, or to correct this?
    Help would definitely be rewarded with points.
    Thanks ,
    Santosh ....

    Hi Santhosh,
    Remove check for unique data records.
    <i><b>Unique Data Records</b></i>
    With the Unique Data Records indicator, you determine whether only unique data records are to be updated to the ODS object. This means that you cannot load a data record into the ODS object the key combination for which already exists in the system – otherwise a termination occurs. Only use this setting when you are sure that only unique data records are to be loaded into the ODS object (for example, single documents). A typical application of this is in the loading of mass data. It improves the load performance.
    You can also deselect this indicator again (even if data has already been loaded into the ODS object). This can be necessary if you want to re-post deleted data records using a repair request (see: Tab Page: Updating). In this case, you need to deselect the Unique Data Records indicator before posting the repair request, following which you can then reset the Unique Data Records indicator once more. The regeneration of metadata of the Export DataSource, which takes place when the ODS object is reactivated, has no effect on the existing data mart delta method.
    More info @ <a href="http://help.sap.com/saphelp_nw04/helpdata/en/a6/1205406640c442e10000000a1550b0/content.htm">ODS Object Settings</a>
    Hope it Helps
    Srini
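    The effect of the Unique Data Records indicator that Srini quotes can be sketched as follows: with the indicator set, a load terminates as soon as a key combination already exists in the ODS object, while with it deselected the new record simply overwrites the existing one. The record layout here is a hypothetical illustration, not the actual ODS implementation.

```python
class DuplicateKeyError(Exception):
    pass

def load_to_ods(ods, records, key_field, unique_data_records):
    """Sketch of posting records to an ODS object.

    With unique_data_records=True a duplicate key terminates the load;
    with False the new record overwrites the old one."""
    for rec in records:
        key = rec[key_field]
        if unique_data_records and key in ods:
            raise DuplicateKeyError(f"Key {key!r} already exists")
        ods[key] = rec

ods = {}
load_to_ods(ods, [{"DOC": "1", "VAL": 10}], "DOC", unique_data_records=True)
try:
    # Same key again with the indicator set: the load terminates
    load_to_ods(ods, [{"DOC": "1", "VAL": 20}], "DOC", unique_data_records=True)
except DuplicateKeyError as e:
    print("terminated:", e)
# Indicator deselected: the record overwrites the existing one
load_to_ods(ods, [{"DOC": "1", "VAL": 20}], "DOC", unique_data_records=False)
print(ods["1"]["VAL"])  # -> 20
```

    This mirrors why the indicator must be deselected before posting a repair request for records that already exist, and can be set again afterwards.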

  • Master data infoobject can't handle duplicate records after SP10

    Hi
    I am trying to load master data which happened to contain duplicate records from the source system.  In the DTP of the master data infoobject, I have ticked the 'Handle Duplicate Record Keys' checkbox.  After executing this DTP, the duplicate master data records were trapped in the Error Stack.  I am expecting overwriting of the duplicate master data to take place instead.  I understand that this error was fixed in Note 954661 - Updating master data texts when error in data package which is from SP9.  After applying Support Pack 10, the master data infoobject just can't handle records with duplicate keys.
    Please let me know if you manage to fix this problem.
    Many thanks,
    Anthony

    Found a fix for this problem.  Just applied this OSS note  Note 986196 - Error during duplicate record handling of master data texts.

  • Master data failing with Duplicate records

    Dear All,
    Daily the ODS XD0SDO08 is getting loaded with four requests.
    While loading a master data delta from ODS XD0SDO08 to 0Doc_Number and cube XD_SDC08, it regularly fails with the error "54 duplicate record found. 1130 recordings used in table /BI0/XDOC_NUMBER". But after deleting the request from 0Doc_Number and the cube XD_SDC08 and reconstructing from PSA, it is successful.
    If I check the PSA, there are a few records in which each sales document number has two after-image records and two before-image records. If I count these records, they are almost equal to the number of duplicate records reported in the error message.
    As we are loading to cube XD_SDC08 and 0Doc_Number, we don't have the Ignore Duplicate Records option in the InfoPackage.
    Please suggest a solution, as I currently have to delete the request manually and reconstruct it daily.
    Regards,
    Ugendhar

    Hi Ugendhar,
    Since the load is successful in the cube but not in the InfoObject, check whether the InfoObject has records in both M version and A version; loads sometimes fail with duplicate data records because of this. Run an attribute change run for that InfoObject and check again.
    If that is also not successful, and the load works after reconstruction, there may be a problem in the update rules; check them once.
    Another thing: you will not get the Ignore Duplicate Data Records option here because the InfoObject is marked as a data target.
    If you think my suggestion is helpful, assign points to this thread.
    Regards,
    Gurudatt Bellary

  • Unable to pull duplicate record in BW

    I am working on the E-Recruitment module.
    I have to get candidate education details into BW, but they are not available in the standard extractors (0CANDAIDATE_ATTR & 0CAND_TD_ATTR). For that I have enhanced the DataSource with the fields INSTITUTE, START_DATE and END_DATE. My extractor is working fine and I am able to see all the records in RSA3.
    I am getting these records up to PSA, but when I try to update further to the InfoObject Candidate, I get a duplicate records error.
    The problem occurs in extracting details such as Institute (from table HRP5104), as each candidate has more than one institute and BW rejects the duplicate records.
    E.g.:
    0OBJID      INSTITUTE   START_DATE   END_DATE
    50038860    ABC         10.06.1965   20.05.1973
    50038860    XYZ         20.05.1976   15.05.1978
    50038860    PQR         30.05.1978   12.05.1980
    As an alternative I have compounded the InfoObject, but it still does not give the correct result.
    Can anybody give an idea to solve this?
    Thanks in advance.

    Try creating time-dependent hierarchies; I don't think compounding or time-dependency alone will help you.
    Loading it to a DSO is not a bad option either.
    Nagesh Ganisetti.
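    The root of the rejection above is that the InfoObject key is only 0OBJID, so all three institute rows for candidate 50038860 collide on the same key. Making the validity dates part of the key (time-dependent attributes) makes each row distinct. A sketch of both keyings on the example data, as a pure illustration rather than the actual BW implementation:

```python
# The example rows from the post: (0OBJID, INSTITUTE, START_DATE, END_DATE)
rows = [
    ("50038860", "ABC", "10.06.1965", "20.05.1973"),
    ("50038860", "XYZ", "20.05.1976", "15.05.1978"),
    ("50038860", "PQR", "30.05.1978", "12.05.1980"),
]

# Key = 0OBJID only: all three rows share one key -> duplicates rejected
keys_objid = {objid for objid, *_ in rows}
print(len(keys_objid))  # -> 1 distinct key for 3 rows

# Key = (0OBJID, START_DATE): every row is unique -> nothing rejected
keys_time_dep = {(objid, start) for objid, _, start, _ in rows}
print(len(keys_time_dep))  # -> 3 distinct keys
```

    This is also why loading to a DSO with a suitable key, as suggested above, avoids the rejections.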

  • BI 7.0 - Duplicate Record Error while loading master data

    I am working on BI 7.0 and I am trying to load master data to an info object.
    I created an Infopackage and loaded into PSA.
    I created transformation and DTP and I get an error after I execute the DTP about duplicate records.
    I have read all previous threads about duplicate record errors while loading master data, and most of them suggested checking the 'Ignore duplicate records' option in the InfoPackage. But in 7.0 I can only load to PSA with an InfoPackage, and it doesn't have any option to ignore duplicate records.
    My data is getting loaded to PSA fine and I get this error while loading to info object using DTP.
    I would appreciate your help to resolve this issue.
    Regards,
    Ram.

    Hi,
    Refer:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/45/2a8430131e03c3e10000000a1553f6/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    With rgds,
    Anil Kumar Sharma .P

  • 36 duplicate record  found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    ( green light ) Update PSA ( 50000 records posted ): No errors
    ( green light ) Transfer rules ( 50000 → 50000 records ): No errors
    ( green light ) Update rules ( 50000 → 50000 records ): No errors
    ( green light ) Update ( 0 new / 50000 changed ): No errors
    ( red light ) Processing end: errors occurred
         Processing 2 finished
         • 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
         • 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data-packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the infoobject 0PM_ORDER, the datasource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'Manual push' from the Details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue, if so please let me know and likewise I will.
    -Venkat

  • How to rectify the error message " duplicate data records found"

    Hi,
    How can I rectify the error "duplicate data records found" when there is no PSA?
    Also, please give me a brief description of RSRV.
    Thanks in advance,
    Ravi Alakunlta

    Hi Ravi,
    On the InfoPackage screen, Processing tab, check the option Do Not Allow Duplicate Records.
    RSRV is used for repair and analysis purposes.
    If you find duplicate records in the F fact table, compress it; the duplicate records will then be summarized in the cube.
    Hope this helps.

  • Master data failed with  error `too many duplicate records'

    Dear all
    Below is the error message:
        Data records for package 1 selected in PSA - error 4 in the update
    The long text is as follows:
        Error 4 in the update
        Message no. RSAR119
        Diagnosis
        The update delivered the error code 4.
        Procedure
        You can find further information on this error in the error message of the update.
    Working on BI 7.0.
    Any solutions?
    Thanks
    satish .a

    Hi,
    Go through these threads, they have same issue:
    Master data load: Duplicate Records
    Re: Master data info object - duplicate records
    Re: duplicate records in master data info object
    Regards
    Raj Rai

  • Duplicate Records error when processing transaction file....BPC 7.0

    Hi All,
    I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully. But when I try to import the flat file into my application, I get a lot of duplicate record errors, and these are my questions:
    1. Can we get duplicate records in transaction files?
    2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than flagging an error and rejecting the records?
    3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it applies only to account, category and entity.)
    4. In my case I see identical values in all my dimensions, and the $value is the only difference. Why is it not summing up?
    Your quickest reply is much appreciated.
    Thanks,
    Alex.

    Hi,
    I have the same problem.
    In my case, the file that I want to upload has different rows that differ only in the nature column. In the conversion file I map the different natures to one internal nature.
    E.g.: cost1 → cost
          cost2 → cost
          cost3 → cost
    My expectation was that in BPC the nature cost would hold the result cost = cost1 + cost2 + cost3.
    The result is that only the first record is uploaded and all other records are rejected as duplicates.
    Any suggestion?
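    What both posters expect, summing the $value of rows that become identical after the conversion file maps cost1/cost2/cost3 to cost, can be sketched like this. The dimension and column names are assumptions based on the example above, not the actual BPC structures.

```python
from collections import defaultdict

# Rows after the conversion file has mapped cost1/cost2/cost3 -> cost:
# all dimension members are now identical, only the signed value differs.
rows = [
    {"ENTITY": "E1", "NATURE": "cost", "VALUE": 100.0},  # was cost1
    {"ENTITY": "E1", "NATURE": "cost", "VALUE": 250.0},  # was cost2
    {"ENTITY": "E1", "NATURE": "cost", "VALUE": 50.0},   # was cost3
]

# Summing instead of rejecting: aggregate by the full dimension key
totals = defaultdict(float)
for row in rows:
    key = (row["ENTITY"], row["NATURE"])
    totals[key] += row["VALUE"]

print(dict(totals))  # -> {('E1', 'cost'): 400.0}
```

    The import in question evidently deduplicates on the full dimension key instead of aggregating, which is why only the first record survives.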

  • Calendar and Adressbook error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini, and everything was working well.
    This morning one of my users is unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In Server App this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code, count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code
    having count(code) > 1
    What is the result?
    Thanks,
    Gordon
