Error: 1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

Hi everyone,
I get a duplicate data records error when trying to load master data to an InfoObject.
I've tried loading to the PSA and then to the InfoObject. The PSA load doesn't show any errors, but when loading to the InfoObject I get the error again.
I've also tried loading to the PSA with the options 'Ignore double data records' and 'Update subsequently in data targets'. I still get the error.
Any suggestions?
Thanks in advance,
Maarja

Take a look at the links below:
http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
(Use option 1 C under Activities)
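If you want to see which keys actually arrive more than once, a rough check on the PSA table can help. This is only a sketch: the PSA table name below is a made-up placeholder (real PSA names are system-generated), the key column is assumed to be BPARTNER, and the request ID is a dummy, so substitute your own values:
-- Hypothetical names only: "/BIC/B0000123000" stands for the generated PSA table of the
-- business partner attribute DataSource, BPARTNER for its key field, and the REQUEST
-- value for the failed load. Lists keys that were delivered more than once.
SELECT bpartner, COUNT(*) AS occurrences
FROM   "/BIC/B0000123000"
WHERE  request = 'REQU_XXXXXXXXXXXXXXXXXXXXXXXXX'
GROUP  BY bpartner
HAVING COUNT(*) > 1;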

Similar Messages

  • Intellisync error "Duplicate records found"

    I am trying to sync my BB 8300 with Outlook 2007. I am using BlackBerry Desktop Manager 4.2.2.14.
    I was able to sync before, but now I get the error "Duplicate records found in the device" and it won't sync.
    Any ideas to fix this problem?
    I tried to uninstall and reinstall DM.
    I also tried to delete the Intellisync folder.
    Thanks!

    Hi JJKD;
    What is your email application?
    Back up your device.
    Review the BB KB Here...
    There are a couple of approaches to this. If you don't need to complete a two-way sync, you can clear the calendar on the BB handheld by going to BACKUP / ADVANCED and clearing the calendar on the right side of the Advanced GUI. Then just try a regular sync.
    This can also work by clearing
    If you do need to complete a two-way sync, you can BACKUP the device and clear the calendar (edit), then RESTORE just the calendar in BACKUP / ADVANCED by menu'ing the backup you created and completing a restore of the calendar. Backing up, then clearing, then restoring will clear any issues with the calendar DB and reindex the data. Then try a regular sync and see if the symptom is taken care of.
    Good luck and let us know how you're doing...
    +++++++++++++++++++++++++++++++++++++++++++++++++
    If successful in helping you, please give me kudos in my post.
    Please mark the post that solved it for you!
    Thank you!
    Best Regards
    hmeister

  • Calendar and Address Book error: Duplicate records found for GUID

    Hi all,
    I have a Mountain Lion Server running on a Mac mini and everything was working well.
    This morning one of my users is unable to connect to his calendar and address book. I found this error in the log files:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate records found for GUID ****USER_GUID****:
    2013-06-30 15:19:50+0200 [-] [caldav-1]  [-] [twistedcaldav.directory.appleopendirectory.OpenDirectoryService#error] Duplicate: ***USER_Shortname***
    Apparently there is a duplicate match in the database. How can I fix this issue?
    In Server App this user is only listed once.
    Mail and other services for this user are working correctly.
    Thanks for any advice!

    Hi Samuel,
    You may try:
    select code,count(code)
    from [dbo].[@XTSD_XA_CMD]
    group by code having count(code) > 1
    What is the result?
    Thanks,
    Gordon

  • Duplicate record found short dump, if routed through PSA

    Hi Experts,
    I am getting these errors extracting the master data for 0ACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
    If I extract via the PSA with the option to ignore duplicate records, I get a short dump:
    ShrtText                                            
        An SQL error occurred when accessing a table.   
    How to correct the error                                                             
         Database error text........: "ORA-14400: inserted partition key does not map to  
          any partition"                                                                  
    What is causing the errors in my extraction?
    thanks
    D Bret

    Go to RSRV, then to All Elementary Tests --> PSA Tables --> Consistency Between PSA Partitions and SAP Administration Information, and correct the error.
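    If the ORA-14400 persists, here is a minimal database-side sketch (assuming an Oracle database; the PSA table name below is a made-up placeholder) to list the partitions the insert is failing against:
    -- Hypothetical PSA table name; replace it with the technical name of your PSA table.
    -- Shows the existing partitions and their upper bounds, which can be compared with
    -- the partition key value the failed request is trying to insert.
    SELECT partition_name, high_value
    FROM   user_tab_partitions
    WHERE  table_name = '/BIC/B0001234000'
    ORDER  BY partition_position;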

  • Duplicate records found while loading master data(very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found. 1 record used in /BI0/PTCTQUERY, and the same record occurred in the /BI0/PTCTQUERY tables.
    Can anyone give me the solution? It's very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
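    To picture that "last record with the same key wins" behaviour, here is a rough SQL illustration (the staging table and column names are made up for the example, not a real BI table):
    -- Illustration only: STAGING_RECORDS, KEY_FIELD, ATTRIBUTE_VALUE and RECORD_NO are
    -- invented names. Keeping only the last-arriving row per key mirrors what
    -- "Ignore Duplicate Data Records" does within a request.
    SELECT key_field, attribute_value
    FROM  (SELECT key_field,
                  attribute_value,
                  ROW_NUMBER() OVER (PARTITION BY key_field
                                     ORDER BY record_no DESC) AS rn
           FROM   staging_records)
    WHERE  rn = 1;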
    Hope that clears your doubt; otherwise let me know.
    Regards
    Kiran

  • Duplicate Record Found

    Hi Friends,
    We are getting the error "Duplicate Record Found" while loading master data.
    When we checked the PSA, there are no duplicates in the PSA.
    When we deleted the request and reloaded from the PSA, it completed successfully.
    We face this error daily, and after deletion and reload from the PSA it finishes successfully.
    What could be the reason? Is there any solution for this?
    Regards
    SSS

    Hi,
    In the InfoPackage maintenance screen, under the update options, select the check box Ignore Double Data Records. This may solve your problem.
    Hope this helps you a lot.
    Assigning points is the way of saying Thanks in SDN
    Regards
    Ramakrishna Kamurthy

  • Job Fail: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
    A job loads from a BW ODS to a BW master data InfoObject and an ODS. This job always fails with the same message: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes with no error.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • 36 duplicate record found -- error while loading master data

    Hello BW Experts,
    Error while loading master data
    (green light) Update PSA (50000 records posted): no errors
    (green light) Transfer rules (50000 → 50000 records): no errors
    (green light) Update rules (50000 → 50000 records): no errors
    (green light) Update (0 new / 50000 changed): no errors
    (red light) Processing end: errors occurred
         Processing 2 finished
    36 duplicate record found. 15718 recordings used in table /BIC/PZMATNUM
    36 duplicate record found. 15718 recordings used in table /BIC/XZMATNUM
    This error repeats with all the data packets.
    What could be the reason for the error, and how can it be corrected?
    Any suggestions appreciated.
    Thanks,
    BWer

    BWer,
    We have exactly the same issue when loading the InfoObject 0PM_ORDER; the DataSource is 0PM_ORDER_ATTR.
    The workaround we have been using is a 'Manual push' from the Details tab of the monitor. Weirdly, we don't have this issue in our test and dev systems, and even in production it doesn't occur some days. We have all the necessary settings enabled in the InfoPackage.
    Did you figure out a solution for your issue? If so, please let me know, and likewise I will.
    -Venkat

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 Duplicate records found". I checked the PSA data, but there were no duplicate records in it.
    Once the data upload fails I manually update the failed packet and it goes fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope that clears your doubt; otherwise let me know.
    You can see the same in my previous thread:
    Re: duplicate records found while loading master data(very urgent)
    Regards
    Kiran

  • Help....Duplicate records found in loading customer attributes!!!!

    Hi ALL,
    We have a full-update master data with attributes job that failed with this error message (note: the processing is PSA and then data targets):
    1 duplicate record found. 66 recordings used in table /BIC/PZMKE_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/PZTSA_CUST RSDMD 199     
    1 duplicate record found. 0 recordings used in table /BIC/XZTSA_CUST RSDMD 199     
    1 duplicate record found. 66 recordings used in table /BIC/XZMKE_CUST RSDMD 199     
    Our DataSource is 0CUSTOMER_ATTR. I tried to use transaction RSO2 to get more information about this DataSource, to find where the original data comes from in R/3, but when I execute it I get this message:
    DataSource 0CUSTOMER_ATTR  is extracted using functional module MDEX_CUSTOMER_MD
    Can you please help me? What should I do to correct the data and reload, or to find the duplicate data so I can tell the person on the R/3 system to delete the duplicates?
    Thanks
    Bilal

    Hi Bilal,
    Could you please try the following:
    Select Only PSA and Update Subsequently in Data Targets in the Processing tab.
    In the Update tab, under error handling, mark "Valid Records Update, Reporting Possible (Request Green)".
    If my assumptions are right, this should collect the duplicate records in another request, from where you could get the data that needs to be corrected.
    To resolve this issue, just load using Only PSA and Update Subsequently in Data Targets with Ignore Duplicate Records checked.
    Cheers,
    Praveen.

  • APO BAPI - ERROR - No entry found in transportation indicator table for following objects

    Hi
    Experts, I need help with an APO BAPI.
    I am building a BAPI call for the transportation lane, to add new material through the BAPI BAPI_TRLSRVAPS_SAVEMULTI.
    I am passing the following to the BAPI:
        Logical system
        model
    and in tables:
        TRANSPORT_LANE
        TRANSPORT_LANEX
        PROD_PROCUREMENT
        PROD_PROCUREMENTX
        LOCATION_FROM    = '1010'.
        LOCTYPE_LOC_FROM = '1001'.   
        LOCATION_TO =   '1101'.
        LOCTYPE_LOC_TO =  '1001'.
    The above data is common to all four BAPI tables, and in addition I am passing:
    prod-valfr =  Converted value by passing to FM - IB_CONVERT_INTO_TIMESTAMP.
    prod-valto =  Converted value by passing to FM - IB_CONVERT_INTO_TIMESTAMP.
    prod-product = material number.
    After executing the BAPI I am getting an error:
    No entry found in transportation indicator table for following objects
    The above error occurred when I passed the given data to the BAPI table PROD_PROCUREMENT.
    If we pass the data to PROD_PROCUREMENT and PROD_PROCUREMENTX, then there is no error in the RETURN table of the BAPI, but the data is not uploaded to the transportation lane.
    I would really appreciate it if someone could guide me on where I am wrong, or suggest another solution.
    Regards .
    Thanks .

    Hi,
    I am writing BDC code for uploading and changing inspection plan data using QP02. I saw your post on SDN. I think you have a solution; can you tell me the solution?
    Regards
    Nandan.

  • Duplicate Records Found?

    Hi Experts,
    Is there any permanent solution for "Duplicate Records Found"?
    InfoPackage --> Only PSA, Update Subsequently in Data Targets --> Ignore Duplicate Records.
    Can you explain clearly how to close this issue permanently?
    Note: Points will be assigned.
    With Regards,
    Kiran

    k

  • Error: Variable not found in variable substitution table

    Hi,
    I am getting the following error in a file adapter receiver communication channel.
    error:Could not process due to error: com.sap.aii.adapter.file.configuration.DynamicConfigurationException: Error during variable substitution: java.text.ParseException: Variable 'file' not found in variable substitution table
    Variable substitution values:
    Variable name: File
    Reference: payload:MT_DC_Recr,1,Target,1,Filename,1
    File name scheme: %File%.txt
    Input file:
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_DC_Send xmlns:ns0="http://urn:psr/ff/DC">
       <Source>
          <Filename>dat1</Filename>
          <Record>
             <Name1>123</Name1>
             <Name2>abc</Name2>
             <Name3>XYZ</Name3>
          </Record>
       </Source>
    </ns0:MT_DC_Send>
    Regards,
    Srini

    Hi Varun,
    This is the error I am getting now:
    Could not process due to error: com.sap.aii.adapter.file.configuration.DynamicConfigurationException: Error during variable substitution: com.sap.aii.adapter.file.varsubst.VariableDataSourceException: The following variable was not found in the message payload: File
    What I am trying to do is read the file name from the message payload.
    Reg,
    Srini

  • Duplicate Records in Out-Of-Line Tables

    Hi,
    I am using the following XML Schema:
    <?xml version="1.0"?>
    <xs:schema
    xmlns:xs="http://www.w3.org/2001/XMLSchema"
    elementFormDefault="qualified"
    xmlns:t="http://www.informatik.hu-berlin.de/~vitt/stud/studienarbeit/test-duplicates"
    xmlns:xdb="http://xmlns.oracle.com/xdb"
    targetNamespace="http://www.informatik.hu-berlin.de/~vitt/stud/studienarbeit/test-duplicates"
    >
    <xs:complexType name="eltype" xdb:SQLType="ELTYPE">
    <xs:simpleContent>
    <xs:extension base="xs:string">
    <xs:attribute name="nid" xdb:SQLName="NID" type="xs:ID"/>
    </xs:extension>
    </xs:simpleContent>
    </xs:complexType>
    <xs:element name="outerElement" xdb:defaultTable="OUTER_ELEMENT">
    <xs:complexType>
    <xs:sequence>
    <xs:element name="innerElement" type="t:eltype" maxOccurs="unbounded"
    xdb:SQLInline="false" xdb:defaultTable="INNER_ELEMENT"/>
    </xs:sequence>
    </xs:complexType>
    </xs:element>
    </xs:schema>
    When inserting the XML document:
    <?xml version="1.0"?>
    <t:outerElement
    xmlns:t="http://www.informatik.hu-berlin.de/~vitt/stud/studienarbeit/test-duplicates"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.informatik.hu-berlin.de/~vitt/stud/studienarbeit/test-duplicates
    http://www.informatik.hu-berlin.de/~vitt/stud/studienarbeit/schemas/test-duplicates.xsd">
    <t:innerElement nid="one">Eins</t:innerElement>
    <t:innerElement nid="two">Zwei</t:innerElement>
    </t:outerElement>
    once into the previously empty OUTER_TABLE table, the innerElements appear twice:
    SQL> select * from inner_element;
    SYS_NC_ROWINFO$
    <t:innerElement nid="one">Eins</t:innerElement>
    <t:innerElement nid="two">Zwei</t:innerElement>
    <t:innerElement nid="one">Eins</t:innerElement>
    <t:innerElement nid="two">Zwei</t:innerElement>
    SQL>
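    To quantify the duplication rather than eyeballing the output, one hedged check is sketched below; it uses the default table and the namespace declared in the schema above, and assumes EXTRACTVALUE with a namespace argument is available in this release:
    -- Counts how many stored copies exist per nid value in the out-of-line table.
    SELECT nid, COUNT(*) AS copies
    FROM  (SELECT extractValue(value(t), '/t:innerElement/@nid',
                    'xmlns:t="http://www.informatik.hu-berlin.de/~vitt/stud/studienarbeit/test-duplicates"') AS nid
           FROM   inner_element t)
    GROUP  BY nid
    HAVING COUNT(*) > 1;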
    Is there a way to avoid this? Or is this an issue which has been fixed in an Oracle newer than my 9.2.0.1?
    Kind Regards,
    Thorsten

    Mark,
    thanks for your reply.
    "You must upgrade"
    This is in progress, I hope ...
    "BTW Did you export and import the content of the table"
    I cannot get importing to work. Apparently, the INNER_ELEMENT's root element name, innerElement, does not make it to the import utility:
    IMP-00017: following statement failed with ORACLE error 1741:
    "CREATE TABLE "VITT"."INNER_ELEMENT" OF "SYS"."XMLTYPE" XMLSCHEMA "http://w"
    "ww.informatik.hu-berlin.de/~vitt/stud/studienarbeit/schemas/test-duplicates"
    ".xsd" ELEMENT "" PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS L"
    "OGGING STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 214748364"
    "5 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPA"
    "CE "VITT" "
    IMP-00003: ORACLE error 1741 encountered
    ORA-01741: illegal zero-length identifier
    grep finds it in the dump file, though.
    "and does the table use nested table storage."
    No. But the problem persists if I do so.
    Kind regards,
    Thorsten

  • Remove all duplicate records and load into temp table

    Hi
    I have a table that contains data like this:
    Emp No Designation location
    1111 SE CA
    1111 DE CT
    3456 WE NJ
    4523 TY GH
    We found that there are two duplicate records for emp no 1111. I want to delete all duplicate records (in this case the two records for emp no 1111) and load the data into the temp table.
    Please advise me how to do it.

    Oh look, you can search the forums...
    http://forums.oracle.com/forums/search.jspa?threadID=&q=delete+duplicates&objID=f75&dateRange=all&userID=&numResults=30
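    For what it's worth, a minimal sketch of one reading of the requirement is below. The table and column names (EMP, EMP_NO, DESIGNATION, LOCATION) are assumptions based on the sample data, and the "temp table" is just an ordinary table created from the query:
    -- Copy only employees whose emp_no is NOT duplicated into the temp table.
    CREATE TABLE emp_temp AS
    SELECT emp_no, designation, location
    FROM   emp
    WHERE  emp_no IN (SELECT emp_no
                      FROM   emp
                      GROUP  BY emp_no
                      HAVING COUNT(*) = 1);
    -- And, if the duplicated rows should then be removed from the source table:
    DELETE FROM emp
    WHERE  emp_no IN (SELECT emp_no
                      FROM   emp
                      GROUP  BY emp_no
                      HAVING COUNT(*) > 1);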
