Duplicate record found; short dump if routed through PSA

Hi Experts,
I am getting these errors when extracting the master data for 0ACTIVITY:
1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
If I extract via PSA with the option to ignore duplicate records, I get a short dump:
ShrtText
    An SQL error occurred when accessing a table.
How to correct the error
    Database error text........: "ORA-14400: inserted partition key does not map to
    any partition"
What is causing the errors in my extraction?
thanks
D Bret

Go to RSRV, then All Elementary Tests --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information. Run the test and correct the error.
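
To see the mismatch on the database side as well, you can list the partitions Oracle actually knows for the PSA table and compare them with what the RSRV test reports. A minimal sketch in ABAP native SQL, assuming an Oracle database (which the ORA- message indicates); the table name '/BIC/B0000123000' and the report name are placeholders, not taken from this thread:

    REPORT z_list_psa_partitions.
    " Hypothetical sketch: list the Oracle partitions of a PSA table so
    " they can be compared with the SAP administration information.
    " '/BIC/B0000123000' is a placeholder for the real PSA table name.
    DATA lv_part TYPE c LENGTH 30.

    EXEC SQL.
      OPEN c1 FOR
        SELECT partition_name
          FROM user_tab_partitions
          WHERE table_name = '/BIC/B0000123000'
    ENDEXEC.
    DO.
      EXEC SQL.
        FETCH NEXT c1 INTO :lv_part
      ENDEXEC.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      WRITE: / lv_part.
    ENDDO.
    EXEC SQL.
      CLOSE c1
    ENDEXEC.

If the partition a request wants to insert into is missing from this list, the insert fails with ORA-14400; the RSRV repair brings the partitions and the administration information back in sync.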

Similar Messages

  • 36 duplicate record found. -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    (green light) Update PSA (50000 records posted): no errors
    (green light) Transfer rules (50000 -> 50000 records): no errors
    (green light) Update rules (50000 -> 50000 records): no errors
    (green light) Update (0 new / 50000 changed): no errors
    (red light) Processing end: errors occurred
         Processing 2 finished
    - 36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
    - 36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the InfoObject 0PM_ORDER; the DataSource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'manual push' from the Details tab of the monitor. Oddly, we don't have this issue in our test and dev systems, and even in production it doesn't occur every day. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and likewise I will.
    -Venkat

  • Duplicate records found while loading master data (very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found, 1 record used in /BI0/PTCTQUERY, and the same record occurs again in the /BI0/PTCTQUERY table.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page: tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope this clears your doubt; otherwise let me know.
    Regards
    Kiran
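
    A rough ABAP sketch of this last-wins rule (hypothetical field names; it only illustrates the semantics, not the actual update code): number the records in arrival order, sort so the newest record per key comes first, and drop the rest.

        TYPES: BEGIN OF ty_rec,
                 matnr TYPE c LENGTH 18, " characteristic key (placeholder)
                 attr  TYPE c LENGTH 10, " some attribute (placeholder)
                 seqno TYPE i,           " arrival order within the request
               END OF ty_rec.
        DATA: lt_data TYPE STANDARD TABLE OF ty_rec,
              ls_data TYPE ty_rec.

        " remember each record's position in the request
        LOOP AT lt_data INTO ls_data.
          ls_data-seqno = sy-tabix.
          MODIFY lt_data FROM ls_data.
        ENDLOOP.

        " newest record per key comes first, so the duplicate deletion keeps it
        SORT lt_data BY matnr ASCENDING seqno DESCENDING.
        DELETE ADJACENT DUPLICATES FROM lt_data COMPARING matnr.

    Afterwards lt_data holds exactly one record per key: the last one that arrived, which is what the indicator makes BI keep.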

  • Calendar and Address Book error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini, and everything was working well.
    This morning one of my users is unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In Server App this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code, count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code
    having count(code) > 1
    What is the result?
    Thanks,
    Gordon

  • Help....Duplicate records found in loading customer attributes!!!!

    Hi ALL,
    we have a full-update master data with attributes job that failed with this error message (note: the processing is PSA and then data targets):
    1 duplicate record found. 66 recordings used in table /BIC/PZMKE_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/XZTSA_CUST RSDMD 199     
    1 duplicate record found. 66 recordings used in table /BIC/XZMKE_CUST RSDMD 199     
    Our DataSource is 0CUSTOMER_ATTR. I tried transaction RSO2 to get more information about this DataSource, so I could find where the original data lives in R/3, but when I executed it I got this message:
    DataSource 0CUSTOMER_ATTR is extracted using function module MDEX_CUSTOMER_MD
    Can you please help me correct and reload, or find the duplicate data so I can ask the person on the R/3 system to delete the duplicates?
    Thanks
    Bilal

    Hi Bilal,
    Could you try the following, please:
    Choose Only PSA and Update Subsequently into Data Targets on the Processing tab.
    On the Update tab, use Error Handling and mark 'Valid Records Update, Reporting Possible (request green)'.
    If my assumptions are right, this should collect the duplicate records in another request, from which you can get the data that needs to be corrected.
    To resolve the issue itself, just load using Only PSA and Update Subsequently into Data Targets with Ignore Duplicate Records checked.
    Cheers,
    Praveen.
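
    If you just want to see which keys are duplicated before contacting the R/3 team, you can also count the keys directly in the PSA table. A minimal sketch with placeholder names: '/BIC/B0000123000' stands for the technical PSA table of 0CUSTOMER_ATTR (look it up in the PSA administration), and CUSTOMER for its key field; neither name comes from this thread.

        DATA lv_psa TYPE tabname VALUE '/BIC/B0000123000'. " placeholder
        DATA: BEGIN OF ls_dup,
                customer TYPE c LENGTH 10,
                cnt      TYPE i,
              END OF ls_dup,
              lt_dup LIKE STANDARD TABLE OF ls_dup.

        " keys that occur more than once in the PSA
        SELECT customer COUNT( * )
          INTO TABLE lt_dup
          FROM (lv_psa)
          GROUP BY customer
          HAVING COUNT( * ) > 1.

    Each row of lt_dup then names one duplicated key and how often it occurs, which is exactly the list you can hand to the R/3 colleagues.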

  • Duplicate Records Found?

    Hi Experts,
    Is there a permanent solution for 'Duplicate Records Found'?
    InfoPackage --> Only PSA, Update Subsequently into Data Targets, with Ignore Duplicate Records.
    Can you explain this issue clearly so it can be closed permanently?
    Note: points will be assigned.
    With Regards,
    Kiran

    k

  • Job Fail: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
    A job loads from a BW ODS to a BW master data InfoObject and ODS. This job always fails with the same message: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes without error.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • Intellisync error "Duplicate records found"

    I am trying to sync my BB 8300 with Outlook 2007. I am using BlackBerry Desktop Manager 4.2.2.14.
    I was able to sync before, but now I get the error "Duplicate records found in the device" and it won't sync.
    Any ideas to fix this problem?
    I tried to uninstall and re-install DM.
    I also tried to delete the Intellisync folder.
    Thanks! 

    Hi JJKD;
    What is your email application?
    Backup your device.
    Review the BB KB Here...
    There are a couple of approaches to this. If you don't need a two-way sync, you can clear the calendar on the BB handheld by going to BACKUP / ADVANCED and clearing the calendar on the right side of the Advanced GUI. Then just try a regular sync.
    If you do need a two-way sync, you can BACKUP the device, clear the calendar, then RESTORE just the calendar in BACKUP / ADVANCED by opening the menu on the backup you created and restoring only the calendar. Backing up, clearing, and then restoring clears any issues with the calendar DB and reindexes the data. Then try a regular sync and see if the symptom is taken care of.
    Good luck, and let us know how you're doing...
    Message Edited by hmeister on 10-10-2008 09:48 PM
    +++++++++++++++++++++++++++++++++++++++++++++++++
    If successful in helping you, please give me kudos in my post.
    Please mark the post that solved it for you!
    Thank you!
    Best Regards
    hmeister

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 duplicate records found". I checked the PSA data, but there were no duplicate records in it.
    Once the data upload fails, I manually update the failed packet and it goes fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page: tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope this clears your doubt; otherwise let me know.
    You can see the same in my previous thread:
    Re: Duplicate records found while loading master data (very urgent)
    Regards
    Kiran

  • Duplicate Record Found

    Hi Friends,
    We are getting the error "Duplicate Record Found" while loading master data.
    When we checked the PSA, there were no duplicates in it.
    When we deleted the request and reloaded from the PSA, it completed successfully.
    We face this error daily, and after deleting and reloading from the PSA it finishes successfully.
    What could be the reason? Is there any solution for this?
    Regards
    SSS

    Hi,
    In the InfoPackage maintenance screen, under the update options, select the check box Ignore Double Data Records. This may solve your problem.
    Hope this helps.
    Assigning points is the way of saying Thanks in SDN
    Regards
    Ramakrishna Kamurthy

  • JCO RFC provider: Server function not found, short dump

    Hi all,
    I'm trying to use the JCO RFC provider service of NW04s (SP15) together with an ABAP 4.6C system. I've followed all the documentation that I could find, but couldn't get it to work yet. This is what I've done so far:
    Using SM59 I've had a destination APP_JK1 created for me on the R/3 system. AFAIK it's set up correctly and marked as a "registered server", connection tests were successful.
    On the portal, I've created a RFC portal destination using my R/3 credentials and successfully tested it. Then I created an entry in the RFC provider service, using the correct values for system, id & gateway, and let it point to my RFC destination for the repository connection. The program ID is also APP_JK1.
    Next, I created a stateless session bean which is part of an EAR, gave it a JNDI name of "RFCTEST" and added a method like this:
    /**
     * @ejb.interface-method view-type="both"
     * @param function the called function
     */
    public void processFunction(com.sap.mw.jco.JCO.Function function) {
       // echo the import parameter back as the export parameter
       JCO.ParameterList input  = function.getImportParameterList();
       String query = input.getString("I_STRING_SEARCH");
       JCO.ParameterList output = function.getExportParameterList();
       output.setValue(query, "ECHOTEXT");
    }
    (The project uses xdoclet for the creation of J2EE stuff)
    In application-j2ee-engine.xml located in the META-INF directory of my EAR, I've added:
    <reference reference-type="hard">
         <reference-target provider-name="sap.com" target-type="library">com.sap.mw.jco</reference-target>
    </reference>
    The application deploys without errors, and apart from this RFC connection everything works as expected. In the JNDI registry view of Visual Admin I see the corresponding entry: "rfcaccessejb/RFCTEST" is a local reference to my stateless session bean.
    On ABAP, I have a tiny little program that calls my function, mainly looking like this:
    CALL FUNCTION 'RFCTEST' DESTINATION 'APP_JK1'
      EXPORTING
        I_STRING_SEARCH = query
      IMPORTING
        ECHOTEXT = t1.
    Unfortunately, it short dumps immediately. The error message is:
    JCO.Server could not find server function 'RFCTEST'
    I'm lost. What could be wrong? Any help is greatly appreciated. Kind regards,
    Christian Aust

    Hello Perry song,
    You got the short dump because there is no subroutine with the name you provided in the called program.
    For example, consider two programs.
    Program 1:
    REPORT zpgm1.
    " Call the subroutine SNAME1, whose code is written in ZPGM2.
    PERFORM sname1 IN PROGRAM zpgm2.
    Program 2:
    REPORT zpgm2.
    PERFORM sname1.
    FORM sname1.
      WRITE: / 'Text in form SNAME1'.
    ENDFORM.                    " SNAME1
    (Here SNAME1 is found, and the call works perfectly.)
    Change in Program 1:
    REPORT zpgm1.
    " Call the subroutine DELTA.
    PERFORM delta IN PROGRAM zpgm2 IF FOUND.
    Now the PERFORM statement searches for a subroutine DELTA in ZPGM2, but there is no subroutine by that name, so we need the addition IF FOUND: with IF FOUND, if the subroutine DELTA is not found, the call is simply skipped instead of dumping.
    Without IF FOUND, the statement
    PERFORM delta IN PROGRAM zpgm2.
    gives a dump.
    Just add IF FOUND when you call a subroutine from another program; I guess this will solve the problem.
    Hope this is helpful,
    Regards ,
    Aby
    Edited by: abhi on Nov 6, 2008 10:14 AM

  • Error: 1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

    Hi everyone,
    I get a duplicate data records error when trying to load master data to an InfoObject.
    I've tried loading to the PSA and then to the InfoObject. The PSA shows no errors, but loading to the InfoObject fails again.
    I've also tried loading to the PSA with the options 'Ignore double data records' and 'Update subsequently in data targets'. I still get the error.
    Any suggestions.
    Thanks in advance,
    Maarja

    Take a look at these links below....
    http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    (Use option 1 C under Activities)

  • Duplicate records exists in same request of a PSA

    Dear SAP Professionals,
    Please look into the issue below.
    1) I have only 1 request (green) in the PSA.
    2) When I look into that request, I find there are 4 packages.
    3) In packages 3 and 4, I find the same records have been extracted twice.
    I am not able to understand why this is happening.
    I ran the extractor on the R/3 side and found that only 7 records exist for that particular condition.
    But when I look into the PSA maintenance screen, there are 14 records for the same condition (7 records from package 3 + 7 records from package 4).
    Request you to provide the necessary guidance.
    Many thanks in advance.
    Best Regards,
    Ankur Goyal

    Hello Ankur,
    You didn't mention whether you are loading master data or transaction data.
    If you are loading master data, this will not cause a problem, because the records are overwritten: Package 3 is loaded first, and then Package 4 overwrites the same keys, so there will be only 7 records in the data target.
    But if you are loading transaction data, delete the previous request and pick the data from R/3 once again, then check it.
    Thanks
    Abha
    Edited by: Abha Sahu on Jan 29, 2010 3:50 PM
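
    The overwrite behaviour described above can be pictured with a small ABAP sketch (hypothetical types, not the actual load code): writing by unique key means a re-extracted record replaces the earlier one instead of duplicating it.

        TYPES: BEGIN OF ty_rec,
                 partner TYPE c LENGTH 10, " key (placeholder)
                 attr    TYPE c LENGTH 20, " attribute (placeholder)
               END OF ty_rec.
        DATA: lt_target  TYPE SORTED TABLE OF ty_rec WITH UNIQUE KEY partner,
              lt_package TYPE STANDARD TABLE OF ty_rec,
              ls_rec     TYPE ty_rec.

        LOOP AT lt_package INTO ls_rec.
          " overwrite the row with the same key, or insert it if it is new:
          " loading the same 7 records from package 3 and package 4 still
          " leaves exactly 7 rows
          MODIFY TABLE lt_target FROM ls_rec.
          IF sy-subrc <> 0.
            INSERT ls_rec INTO TABLE lt_target.
          ENDIF.
        ENDLOOP.

    Transaction data, by contrast, is usually updated additively, so the same duplication would falsify the key figures; that is why the request has to be deleted and reloaded there.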

  • Duplicate record error

    Hi,
    I am using an ODS as the source to update a master data InfoObject with flexible update. The issue is that, in spite of using the option Only PSA (update subsequently in data target) in the InfoPackage with error handling enabled, I am getting a duplicate record error. This happens only when I am updating through a process chain; if I run it manually, the error doesn't occur. Please let me know the reason.
    Thanks

    Hi Maneesh,
    As we are loading from an ODS to an InfoObject, we don't get the option "do not update duplicate records if they exist".
    First, did you check whether any duplicate records were found in the PSA? If so, delete them from the PSA.
    Another option is to enable error handling in the InfoPackage.
    Or check for inconsistencies for the InfoObject in RSRV; if any are found, repair them and load again. Check the inconsistencies for the P, X, and Y tables and for the complete object as well.
    Assign points if helpful.
    KS

  • Duplicate record in master data: fails in the chain, succeeds when run manually

    Hi all,
    A daily load updates 0BPARTNER_ATTR from CRM.
    Recently this load started to fail, with the following error message:
    xx duplicate records found. xxxx recordings used in table /BI0/XBPARTNER
    The data loads through the PSA, but there is no red record in the PSA.
    After copying the PSA data to Excel for further analysis, I concluded there are actually no duplicates in the PSA.
    Also, the option 'Ignore double data records' in the InfoPackage is checked.
    When I manually update from PSA the load is successful.
    This started happening about two weeks ago, and I couldn't find an event that might have caused it.
    Now it happens every two or three days, and each time the manual update is successful.
    Any suggestions, anyone?
    Thanks, Jan.

    Hi Jan,
    Possibly you have two requests in the PSA while trying to update the data target.
    Delete all the requests in the PSA, schedule again to bring the data into the PSA, and then update it into the data target.
    Thank you,
    Arvind
