Master data load failing with duplicate records

Dear All,
The ODS XD0SDO08 is loaded daily with four requests.
While loading the master data delta from ODS XD0SDO08 to 0Doc_Number and cube XD_SDC08, the load regularly fails with the error "54 duplicate record found. 1130 recordings used in table /BI0/XDOC_NUMBER". But after deleting the request from 0Doc_Number and the cube XD_SDC08 and reconstructing it from PSA, the load is successful.
If I check the PSA, there are a few records in which each sales document number has two after-image records and two before-image records. If I count these records, the total is almost equal to the number of duplicate records reported in the error message.
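For reference, this can be cross-checked offline: the before/after images for the same key collapse into a single master data record, and the surplus rows are what the load reports as duplicates. A minimal Python sketch, assuming a CSV export of the PSA with a DOC_NUMBER column (adjust names to whatever the extract actually contains):

import csv
from collections import Counter

# Count how many rows each sales document contributes in a PSA extract.
# Assumed: a CSV export of the PSA with a DOC_NUMBER column.
with open("psa_extract.csv", newline="", encoding="utf-8") as f:
    per_key = Counter(row["DOC_NUMBER"] for row in csv.DictReader(f))

# The master data of 0Doc_Number can keep only one record per key, so every
# row beyond the first one per document is a candidate for the duplicate count.
surplus = {key: n - 1 for key, n in per_key.items() if n > 1}
print("documents with more than one image:", len(surplus))
print("surplus rows (compare with the error message):", sum(surplus.values()))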
As we are loading to both the cube XD_SDC08 and 0Doc_Number, we don't have the "Ignore Duplicate Records" option in the InfoPackage.
Please suggest a solution, as I currently have to delete the request manually and reconstruct it every day.
Regards,
Ugendhar

Hi Ugendhar,
Since you say the data is successful in the cube but not in the InfoObject, check whether the InfoObject has records in both the M (modified) and A (active) versions. Loads sometimes fail with duplicate data records because of this. Run an attribute change run for that InfoObject and check again.
If that is still not successful, and given that the load succeeds after reconstruction, there may be a problem in the update rules; please check them once.
Another point: you will not get the "Ignore Duplicate Data Records" option here because the InfoObject is marked as a data target.
If you think my suggestion is helpful, please assign points to this thread.
regards,
Gurudatt Bellary

Similar Messages

  • Master data failed with error 'too many duplicate records'

    Dear all
    Below is the error message:
    Data records for package 1 selected in PSA - error 4 in the update
    The long text is:
    Error 4 in the update
    Message no. RSAR119
    Diagnosis
    The update delivered the error code 4.
    Procedure
    You can find further information on this error in the error message of the update.
    Working on BI 7.0.
    Any solutions?
    Thanks
    satish .a

    Hi,
    Go through these threads; they have the same issue:
    Master data load: Duplicate Records
    Re: Master data info object - duplicate records
    Re: duplicate records in master data info object
    Regards
    Raj Rai

  • Master data info object - duplicate records

    Hi All,
    I have a flat file with department data that looks like this:
    0     OTHER
    0      00 Other
    1      01 Language Arts
    2      02 Mathematics
    3      03 Reading
    4      04 Social Studies
    5      05 Science
    6      06 Student Success
    7     07 Write Traits
    1     READING
    2     LANGUAGE ARTS
    3     MATHEMATICS
    4     SCIENCE
    5     SOCIAL STUDIES
    50     50 Profesional Development
    6     BUSINESS EDUCATION
    65     65 FRABOOMERANG
    7     FOREIGN LANGUAGE
    75     75 KA Software Product
    8     Legacy = 8: No Description
    80     80 CCI
    89     89 Money Only EDC Wrkshps
    9     GENERIC
    99     99 Money Only Miscellaneou
    I was asked to create a master data InfoObject for this. I created it, but when I load it I get an error that there are duplicate records.
    What should be done? Given the duplicate records, should such data still be modeled as master data, or rather as an ODS?
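    For illustration only: if the business answer is that the last description per key should win, deduplicating the file before loading avoids the error. A minimal Python sketch (the file name and tab delimiter are assumptions):

    import csv

    # Keep only one row per department key; later rows overwrite earlier ones.
    # Assumed: tab-delimited file, key in the first column, description in the rest.
    latest = {}
    with open("departments.txt", newline="", encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            if row:
                latest[row[0].strip()] = row[1:]

    with open("departments_dedup.txt", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f, delimiter="\t")
        for key, rest in sorted(latest.items()):
            writer.writerow([key, *rest])

    Whether "last wins" is the right rule is a business decision; if both descriptions must be kept, the data belongs in an ODS or needs a compounded key.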
    Thanks

    Hi Eugene and Bhanu
    I have the same issue. I have a master data flat file with 2 fields that I am trying to load. The data looks like this:
    0----
    1----
    001 Work'g Words Spelling     
    2----
    002 Vocab for Achievmt           
    3----
    003 ETS95 Packets Program     
    4----
    004 Every Day Counts            
    5----
    005 Phonics We Use            
    6----
    006 US94 History of US         
    7----
    007 Masterng SAT 1/PSAT        
    I tried Bhanu's suggestion of ignoring duplicate records. I did not get any error, but I lost some data. So I tried Eugene's suggestion of shifting the fields: I moved the second field to the first place, saved, and loaded it. I got this error:
    "There are duplicates of the data record 2. with the key '3.' for characteristic 1.."
    Any suggestions?
    Thanks

  • Master data tables with unwanted records from transaction data upload

    Hi Friends,
      I have a master data table for infoobject 'C' with compounding characteristics 'A' & 'B'.  I upload this master data with values given below:
    A    B    C      Short text    Long text
    P    10   BBB    Apple         Big Apples
    Q    20   XYZ    Tomatoes      Red Tomatoes
    When I load data into the ODS from a source system, I may not necessarily have data for all three of these fields in the transaction records. Example:
    A    B    C      D     E
    P    -1   FFF    20    30
    Q    10   GGG    10    40
    The problem is that when I upload the above transaction data, it also populates the master data table with the two new records (1, -1, FFF) and (2, 10, GGG), which I would like to avoid.
       Is there any way?
       Will assign full points to anyone who helps me here.
       Thanks,
       JB

    Hi JB,
    If you want to load transactional data and still prevent the population of the master data table, I don't think it is possible, as it goes against data consistency in the warehouse.
    However, if you can afford not to load transactional data in such cases, you can activate the referential integrity check for the InfoObject C. Then neither transactional data nor master data enters the data warehouse until you maintain the master data for InfoObject C yourself.
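    If it helps to see in advance which transaction rows would create new master data entries (and would be held back by the referential integrity check), a quick offline comparison is possible. An illustrative Python sketch, assuming CSV extracts of the master data and the transaction file with columns literally named A, B and C:

    import csv

    def compound_keys(path):
        # Build the set of compounded keys (A, B, C) found in a CSV extract.
        with open(path, newline="", encoding="utf-8") as f:
            return {(r["A"], r["B"], r["C"]) for r in csv.DictReader(f)}

    master = compound_keys("master_data_C.csv")
    transactions = compound_keys("transaction_data.csv")

    # Keys not yet in master data would be created during the transaction load
    # (or rejected, with the referential integrity check switched on).
    for key in sorted(transactions - master):
        print("missing in master data:", key)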
    Hope this helps.

  • Master Data Load Failure- duplicate records

    Hi Gurus,
    I am a new member in SDN.
    I now work in BW 3.5. I got a data load failure today; the error message says there are 5 duplicate records. The processing is into PSA and then to the InfoObject. I checked the PSA and the data is available there. How can I avoid these duplicate records?
    Please help me, I want to fix this issue immediately.
    regards
    Milu

    Hi Milu,
    If it is a direct update, you won't have any request for it.
    The data goes directly to the master data tables, so there is no Manage tab for that InfoObject where you could see the request.
    Whereas in the case of a flexible update, you have update rules from your InfoSource to the InfoObject, so you can delete the request.
    Check this link for flexible update of master data
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/37dda990-0201-0010-198f-9fdfefc02412

  • Payment terms in Vendor Master data and purchasing info record

    Hi all, I have run into a problem. When I create a purchase order, the payment terms in the header are automatically proposed from the data maintained in the vendor master data or the purchasing info record. What happens if the payment terms differ between the vendor master data and the purchasing info record?
    Sometimes, when you create a purchase order, the payment terms are not proposed automatically even though they are maintained in the vendor master data or purchasing info record. Why? Does this mean I can manually enter payment terms when creating each purchase order, even if they differ from the vendor master data or purchasing info record?
    Thank you very much in advance,

    Hi,
    The payment terms are usually defaulted from either the vendor master or a purchasing master data record such as a contract/outline agreement. Unless the payment terms are set as mandatory in configuration, failing to maintain them won't stop you from posting your PO, as this information only becomes relevant during invoice receipt maintenance. Having said that, if you are setting up your system with a discount condition type as part of your PO pricing schema, then the payment terms must be maintained during PO creation.
    Cheers,
    HT

  • Migration of Equipment Master Data together with PRT data

    Hello Experts,
    we need to migrate equipment master data that has the PRT data view activated. Most of the fields in this view are stored in table CRFH. Is there a way to migrate the equipment master data together with the PRT data of the equipment? None of the BAPIs I found have an import structure for those fields. Is batch input the only way?
    Does anybody know a mass change transaction for equipment?
    Thank you and Regards
    Christian

    Hi Krishna,
    You can use the standard batch input program RFBIDE00 in LSMW to update the customer master data. However, there might be scenarios where you need to update fields that the standard batch input program does not cover. If any such fields are defined as mandatory, you won't be able to use this method directly.
    In such cases you will have to either create a recording to populate the fields that the standard program does not update and use both the recording and the standard batch input program, or create a recording for customer master creation and use that in LSMW.
    Best regards,
    Harsh

  • SAP BPC 7.5 SP 7 - Master Data Load Detected duplicate member ID

    Hi Gurus, I have a requirement.
    I am loading the master data for Cost Center. Initially I did not load the hierarchy; now I have started to load the master data with the hierarchy, but whenever I try to validate the transformation file it throws "Detected duplicate member".
    Here is what I wrote in the transformation file:
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    *MAPPING
    ID=ID
    *CONVERSION
    ID=Master_Data_Conversion.xls
    The conversion file Master_Data_Conversion.xls contains the following in the EXTERNAL/INTERNAL columns to remove the spaces:
    *js:%external%.toString().replace(/\s+/g,"")
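    Just to illustrate what that conversion does: it strips every whitespace character from the external value before it is mapped to the internal ID. A quick Python equivalent of the regex (the sample value is arbitrary):

    import re

    # Same effect as the *js: conversion above: remove all whitespace.
    print(re.sub(r"\s+", "", " 1000 123 "))  # prints "1000123"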
    Here is how I selected the data type: Master Data/Text from NW BW InfoObject.
    Selection of InfoObject: 0costcenter
    Format: External Format
    Set Selection, 1st tab (Attribute): I only want Controlling Area 1000, so Controlling Area = 1000
    2nd tab (Hierarchy): Import text node - yes; Hierarchy node - xxxxxxx; Version - empty; Member ID - first member; Level - blank
    3rd tab (Language): English
    4th tab (Attribute list): only Controlling Area is selected
    Note: when I load the master data without the hierarchy in the Set Selection, the load is successful, but when I include the hierarchy as described in the 2nd tab I get the following error:
    Master data (dealt by table level) has errors
    Detected duplicate member ID '201100'
    Also, the Cost Center master data in BW is time-dependent, so it has Valid From and Valid To fields, which have no counterpart to be dealt with in BPC.
    Please Help

    @Vinay: when the BW master data is time-dependent, you will get these duplicate members, because the same cost center IDs occur repeatedly, compounded with the validity period. BW does not show an error because of the compounding, but BPC does not have that feature.
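    To make that concrete: in BW the same cost center appears once per validity interval, and once the validity dates are dropped for BPC those rows become duplicates of each other. An illustrative Python sketch (the file names and the 0COSTCENTER/DATETO column names are assumptions) that keeps only the interval with the latest "valid to" date per ID:

    import csv

    # Collapse time-dependent master data to one row per cost center,
    # keeping the interval with the latest "valid to" date (format YYYYMMDD).
    best = {}
    with open("costcenter_timedep.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            cc = row["0COSTCENTER"]
            if cc not in best or row["DATETO"] > best[cc]["DATETO"]:
                best[cc] = row

    rows = list(best.values())
    if rows:
        with open("costcenter_for_bpc.csv", "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)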
    This was raised with SAP and they resolved the issue.
    SAP Note 1641529 - Loading Master Data gets duplicate error in special case
    When running the Data Manager package 'Loading Master Data from BW InfoObjects', the error 'Duplicate members are found' may be reported in the following case:
    o In the source BW InfoObject there are master data members whose IDs have different lengths and consist only of numeric characters, and
    o sorting the members by length first and then by ID gives a different order than sorting them directly by ID. Take members '122' and '1102' for instance: in BW they are sorted as [122, 1102]; sorted directly by ID, the order is [1102, 122]. And
    o when running the package, the members are in both the 'Attribute' and 'Hierarchy' selection, and the option 'Filter members by Attributes or Hierarchies' is used. And
    o 'External Format' is selected when running the package.
    Other terms
    DM, Loading Master Data from BW, Duplicate Members
    Reason and Prerequisites
    It's a program error.
    Solution
    Please apply this note or upgrade to SP11.
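    The ordering difference the note describes is easy to reproduce outside BPC. A small Python illustration (not BPC or BW code):

    members = ["122", "1102"]

    # Ordering by length first and then by value, as described for the BW side.
    by_length_then_id = sorted(members, key=lambda m: (len(m), m))   # ['122', '1102']

    # Plain string sort by ID, as used when the members are compared directly.
    by_id = sorted(members)                                          # ['1102', '122']

    # A comparison that assumes both lists share the same order can then pair up
    # the wrong entries and report members as duplicates even though they are not.
    print(by_length_then_id, by_id, by_length_then_id == by_id)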
    I hope this helps; otherwise let me know the full requirements so that I can provide further assistance.
    Also check how the master data looks in your BW, and how the hierarchy nodes and cost center nodes are set up.
    Good Luck
    Vijay Sumith

  • How to load unicode data files with fixed records lengths?

    Hi!
    To load Unicode data files with fixed record lengths (in terms of characters, not bytes!) using SQL*Loader manually, I found two ways:
    Alternative 1: one record per row
    SQL*Loader control file example (without POSITION, since POSITION always refers to bytes!):
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode.dat
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001111112234444
    01NormalDExZWEI
    02ÄÜÖßêÊûÛxöööö
    03ÄÜÖßêÊûÛxöööö
    04üüüüüüÖÄxµôÔµ
    Alternative 2: variable length records
    LOAD DATA
    CHARACTERSET UTF8
    LENGTH SEMANTICS CHAR
    INFILE unicode_var.dat "VAR 4"
    INTO TABLE STG_UNICODE
    TRUNCATE
    (
    A CHAR(2) ,
    B CHAR(6) ,
    C CHAR(2) ,
    D CHAR(1) ,
    E CHAR(4)
    )
    Datafile:
    001501NormalDExZWEI002702ÄÜÖßêÊûÛxöööö002604üuüüüüÖÄxµôÔµ
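    For what it's worth, the 4-character prefix in the VAR format counts bytes of the encoded record, not characters. A small Python sketch (outside OWB, using the sample records from Alternative 1) that writes such a file:

    # Write a "VAR 4" style file: each record is preceded by a 4-character field
    # holding the record length in BYTES of the UTF-8 encoded payload.
    records = ["01NormalDExZWEI", "02ÄÜÖßêÊûÛxöööö", "04üüüüüüÖÄxµôÔµ"]

    with open("unicode_var.dat", "wb") as out:
        for rec in records:
            payload = rec.encode("utf-8")
            out.write(str(len(payload)).zfill(4).encode("ascii"))
            out.write(payload)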
    Problems
    Implementing these two alternatives in OWB, I encounter the following problems:
    * How to specify LENGTH SEMANTICS CHAR?
    * How to suppress the POSITION definition?
    * How to define a flat file with variable length and how to specify the number of bytes containing the length definition?
    Or is there another way that can be implemented using OWB?
    Any help is appreciated!
    Thanks,
    Carsten.

    Hi Carsten
    If you need to support the LENGTH SEMANTICS CHAR clause in an external table then one option is to use the unbound external table and capture the access parameters manually. To create an unbound external table you can skip the selection of a base file in the external table wizard. Then when the external table is edited you will get an Access Parameters tab where you can define the parameters. In 11gR2 the File to Oracle external table can also add this clause via an option.
    Cheers
    David

  • Add new bank account to customer master data with IDoc DEBMAS

    Hi friends,
    I have an issue.
    I am trying to add a new bank account to the customer master data with IDoc DEBMAS.
    But when I submit it, the existing bank data is simply overwritten instead of a new account being added.
    I tried playing with the MSGFN code, using value '009' or '004', but nothing changed.
    Has anyone come across this issue?
    Thanks for your answers.

    Thanks,
    But what do you mean? Where can I find this path, in the IMG?
    Fields->Set Qualified Update ->Append Option
    I think we have to use another IDoc:
    BUPA_C_BANKDETAIL_ADD01 SAP BP,  BAPI: Add Bank Details
    I'll try...

  • Master data attributes with direct update... it's very urgent

    Hi all,
    Could anyone tell me how to load master data attributes with direct update in the InfoPackage?
    Please provide the steps to create master data attributes and how to load them.
    Thanks,
    Manjula

    Hi Manjula,
    Flexible Uploading
    Transaction code RSA1 takes you to the Modeling view.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can enter the technical names of the InfoObjects in the template, so you do not have to map them during the transformation; the system will map them automatically. If you do not map them in this Fields tab, you have to map them manually during the transformation in the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load data to the PSA (make sure the flat file is closed during loading).
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. An ODS has two tables, the new table and the active table; to move data from the new table to the active table you have to activate the loaded request. Alternatively, the monitor icon can be used.
    honor with points if this helps,
    Sudhakar

  • Job Fail:9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All
    A job loads from a BW ODS into a BW master data InfoObject and an ODS. This job always fails with the same message: "9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL" (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes without error.
    Please help me solve this problem.
    thanks
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • Load for Profit center Master data Fails?

    When I try to load profit center master data, the load fails with these error messages:
    ZCONTENT : Data record 1 ('00100000000666 '): Version 'Arcade ' is not valid     
    ZCONTENT : Data record 59 ('00100000001055 '): Version 'Simulation ' is not valid
    YYBRAND : Data record 87 ('00100000001084 '): Version 'Alter Echo ' is not valid
    YYDEVMENT : Data record 103 ('00100000001100 '): Version 'Externally developed ' is not valid     
    Many more data records failed with errors like these. Please suggest how to resolve this issue.

    Hi Naga,
    Looks like the input data needs to be corrected. See if the messages are coming from:
       a) Update/Transfer rules?
    If you are loading from a flat file, preview and check all the field values in the relevant records (1, 59, 87, etc.).
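    Since "Version ... is not valid" in a master data load usually points to characters outside the permitted character set (lowercase letters being the classic case), a quick offline scan of the flat file can narrow it down. An illustrative Python sketch; the permitted set and file name below are assumptions and should be replaced with whatever is maintained in transaction RSKC:

    import csv

    # Assumed default permitted characters for BW characteristic values;
    # adjust to match the settings maintained in transaction RSKC.
    PERMITTED = set(' !"%&\'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ')

    with open("profit_center_attributes.csv", newline="", encoding="utf-8") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            for value in row:
                bad = sorted(set(value) - PERMITTED)
                if bad:
                    print(f"record {lineno}: {value!r} contains {bad}")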
    Good luck, BB

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning and performs an insert operation.
    Every morning a file with new insert data is made available (generated by someone else) in the same location and with the same name. The Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted its records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I learned that this option only works for update operations.
    How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to mark the field as 'unique' in the UI so that an error is raised when a duplicate record is inserted? Please suggest.
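    Until a cleaner option is available, one pragmatic workaround is to refuse to re-process a file whose records have already been loaded. A minimal Python sketch (the file names and the EXTERNAL_ID column are assumptions) that compares today's external IDs against a log of IDs already inserted:

    import csv, os

    LOADED_LOG = "loaded_external_ids.txt"          # IDs inserted on previous runs
    already = set()
    if os.path.exists(LOADED_LOG):
        with open(LOADED_LOG, encoding="utf-8") as f:
            already = {line.strip() for line in f if line.strip()}

    with open("daily_insert.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    new_rows = [r for r in rows if r["EXTERNAL_ID"] not in already]
    if not new_rows:
        raise SystemExit("No new external IDs found - looks like yesterday's file; skipping load.")

    # ... hand new_rows to the Data Loader here, then record what was inserted ...
    with open(LOADED_LOG, "a", encoding="utf-8") as f:
        for r in new_rows:
            f.write(r["EXTERNAL_ID"] + "\n")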
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: You do not have duplicate entry in the dept table initially.
    Cheers
    Sudhir

  • Customer Master data upload with Ref Customer Master

    Hi all,
    I want to extend all the customer master records of a particular sales area to another, new sales area (copy from one sales area to the other). What is the easiest way to do this?
    I am trying to copy by giving a reference customer, but I still had to enter all the relevant data, which I assume should not be necessary. So please guide me to the best way to copy the master data of all the customers.
    Thanks in advance.
    Cheers,
    Anil.

    We have had a similar issue. I simply downloaded the customer master sales area data, replaced the distribution channel (and then the division) with the new one in Excel, and recorded an LSMW for it. The only thing you have to be careful about is that, while extending, you have to use the current customer number as the reference, and this has to be part of the recording. The entire process up to execution took about 30 minutes plus upload time.
    It works perfect.
