Data Loader inserting duplicate records

Hi,
There is an import that we need to run every day to load data from another system into CRM On Demand. I have set up a Data Loader script, scheduled to run every morning, which performs an insert operation.
Every morning a file with the new insert data is made available in the same location (generated by someone else) under the same name, and the Data Loader script must insert all the records in it.
One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I have since learned that this option applies to update operations only.
How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to mark the field as 'unique' in the UI so that inserting a duplicate record raises an error? Please suggest.
Regards,
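
For comparison, a unique key in a plain Oracle schema would reject the re-insert outright. A minimal sketch, assuming a hypothetical staging table (whether CRM On Demand exposes an equivalent uniqueness setting is a separate question for support):

-- Hypothetical staging table, for illustration only: the unique
-- constraint makes the database itself refuse a second insert
-- of the same external ID.
CREATE TABLE import_staging (
  external_id VARCHAR2(30) CONSTRAINT uq_import_ext_id UNIQUE,
  payload     VARCHAR2(4000)
);

INSERT INTO import_staging VALUES ('A-100', 'first load');   -- succeeds
INSERT INTO import_staging VALUES ('A-100', 'second load');  -- fails with ORA-00001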

Hi
You can use something like this:
CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
Now you can insert all the records present in this cursor.
Assumption: you do not have duplicate entries in the dept table initially.
Cheers
Sudhir
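
Spelled out as a single set-based statement rather than a row-by-row cursor, the idea looks like this. A sketch only: dept_target is a hypothetical destination table, and the NOT EXISTS clause adds the pre-insert duplicate check on the key column that the original poster asked about:

-- Insert only distinct rows, skipping any deptno that already
-- exists in the (hypothetical) target table.
INSERT INTO dept_target (deptno, dname, loc)
SELECT DISTINCT d.deptno, d.dname, d.loc
FROM   dept d
WHERE  NOT EXISTS (SELECT 1
                   FROM   dept_target t
                   WHERE  t.deptno = d.deptno);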

Similar Messages

  • Master Data Load Failure - duplicate records

    Hi Gurus,
    I am a new member in SDN.
    I now work on BW 3.5. I got a data load failure today; the error message says there are 5 duplicate records. The processing goes into the PSA and then to the InfoObject. I checked the PSA and the data is available there. How can I avoid these duplicate records?
    Please help me; I want to fix this issue immediately.
    regards
    Milu

    Hi Milu,
    If it is a direct update, you won't have any request for it.
    The data goes directly to the master data tables, so there is no Manage tab for that InfoObject in which to see the request.
    In the case of a flexible update, on the other hand, you have update rules from your InfoSource to the InfoObject, so there you can delete the request.
    Check this link for flexible update of master data
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/37dda990-0201-0010-198f-9fdfefc02412

  • Master data failing with Duplicate records

    Dear All,
    Daily, the ODS XD0SDO08 is loaded with four requests.
    While loading the master data delta from ODS XD0SDO08 to 0Doc_Number and cube XD_SDC08, it regularly fails with the error "54 duplicate record found. 1130 recordings used in table /BI0/XDOC_NUMBER". But after deleting the request from 0Doc_Number and the cube XD_SDC08 and reconstructing from the PSA, it is successful.
    If I check the PSA, there are a few records in which each sales document number has two after-image records and two before-image records. If I count these records, they are almost equal to the number of duplicate records reported in the error message.
    As we are loading to both cube XD_SDC08 and 0Doc_Number, we don't have the option of 'Ignore duplicate records' in the InfoPackage.
    Please suggest a solution, as I currently have to delete the request manually and reconstruct it every day.
    Regards,
    Ugendhar

    Hi Ugendhar,
    As you say the data loads successfully into the cube but not into the InfoObject: check whether the records in the InfoObject exist in both an M (modified) version and an A (active) version. Loads sometimes fail with duplicate data records because of this; run an attribute change run for that InfoObject and check again.
    If that is also unsuccessful, and the load succeeds after reconstruction as you describe, there may be a problem in the update rules; check them once more.
    You do not get the 'ignore duplicate data records' option here because the InfoObject is marked as a data target.
    If you think this suggestion is helpful, assign points to this thread.
    regards,
    Gurudatt Bellary

  • Master data info object - duplicate records

    Hi All,
    I have a flat file which has department data that looks like this
    0     OTHER
    0      00 Other
    1      01 Language Arts
    2      02 Mathematics
    3      03 Reading
    4      04 Social Studies
    5      05 Science
    6      06 Student Success
    7     07 Write Traits
    1     READING
    2     LANGUAGE ARTS
    3     MATHEMATICS
    4     SCIENCE
    5     SOCIAL STUDIES
    50     50 Profesional Development
    6     BUSINESS EDUCATION
    65     65 FRABOOMERANG
    7     FOREIGN LANGUAGE
    75     75 KA Software Product
    8     Legacy = 8: No Description
    80     80 CCI
    89     89 Money Only EDC Wrkshps
    9     GENERIC
    99     99 Money Only Miscellaneou
    I was asked to create a master data InfoObject for this. I created it, but when I load it I get an error that there are duplicate records.
    What should be done? Can such data still be modeled as master data, or should it be an ODS, given the duplicate records?
    Thanks
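
    One way to see exactly which keys collide before deciding between master data and an ODS is to stage the file (for example via an external table) and group by the key. A sketch with hypothetical table and column names:

    -- DEPT_STAGE, DEPT_KEY and DEPT_TEXT are hypothetical names for the
    -- staged flat file; this lists every key that arrives with more
    -- than one distinct description.
    SELECT   dept_key,
             COUNT(DISTINCT dept_text) AS descriptions
    FROM     dept_stage
    GROUP BY dept_key
    HAVING   COUNT(DISTINCT dept_text) > 1
    ORDER BY dept_key;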

    Hi Eugene and Bhanu
    I have the same issue: a master data flat file with 2 fields that I am trying to load. The data looks like this:
    0----
    1----
    001 Work'g Words Spelling     
    2----
    002 Vocab for Achievmt           
    3----
    003 ETS95 Packets Program     
    4----
    004 Every Day Counts            
    5----
    005 Phonics We Use            
    6----
    006 US94 History of US         
    7----
    007 Masterng SAT 1/PSAT        
    I tried Bhanu's suggestion of ignoring duplicate records. I did not get an error, but I lost some data. So I tried Eugene's suggestion of shifting the fields: I moved the second field to the first position, saved, and loaded. I got this error:
    "There are duplicates of the data record 2. with the key '3.' for characteristic 1.."
    Any suggestions?
    Thanks

  • Data Load : Number of records count

    Hi Experts,
    I want to document the number of records transferred to BW during an InfoPackage execution.
    I want to automate the process by running a report in the background which fetches, from SAP system tables, the number of records transferred by each of my InfoPackages.
    I would like to know how to proceed, and which system tables contain the same data that transaction RSMO displays.
    Kindly help with valuable replies.

    Hi,
    In order to get the record counts you need to create a report based on the tables below:
    rsseldone, rsreqdone, rsldpiot, rsmonfact
    Check the link below, which explains this in detail and includes the report code as well.
    [Data load Quick Stats|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/90215bba-9a46-2a10-07a7-c14e97bdb764]
    This doc also explains how to trigger a mail with the details to all.
    Regards
    KP

  • Purchase Order Import inserts duplicate records in po_line_locations

    Hi,
    I'm running the standard Purchase Order Import program to import a few POs. We have only one shipment for each item, so there should be only one record per line in po_line_locations. But after running the import, there is a duplicate record with the same quantity in po_line_locations; essentially the same item is inserted twice into po_line_locations_all and the quantity is doubled at the line level. I searched Metalink but found no hits for this so far.
    This is in R12 (12.0.6).
    Did anyone encounter this problem earlier? Any hints or comments would help.
    Thanks in advance.
    Edited by: user2343071 on Sep 2, 2009 3:54 PM
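
    A diagnostic query along these lines, run against the R12 base table named above (the grouping columns are chosen for this particular symptom), would confirm the doubled shipments before raising a service request:

    -- Find PO lines that ended up with more than one shipment row
    -- carrying the same quantity after the import.
    SELECT   po_line_id,
             quantity,
             COUNT(*) AS shipment_rows
    FROM     po_line_locations_all
    GROUP BY po_line_id, quantity
    HAVING   COUNT(*) > 1;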

    Hi,
    Please debug the particular program with the help of a developer; that may resolve your issue. Thank you.

  • SAP BPC 7.5 SP 7 - Master Data Load Detected duplicate member ID

    Hi Gurus, I have a requirement.
    I am loading the master data for Cost Center. Initially I did not load the hierarchy. Now I have started to load the master data with the hierarchy, but whenever I try to validate the transformation file it throws "Detected duplicate member ID".
    Here is the transformation file I wrote:
    *OPTIONS
    FORMAT = DELIMITED
    HEADER = YES
    DELIMITER = TAB
    *MAPPING
    ID=ID
    *CONVERSION
    ID=Master_Data_Conversion.xls
    The conversion file Master_Data_Conversion.xls contains the following formula (EXTERNAL to INTERNAL) to remove spaces:
    *js:%external%.toString().replace(/\s+/g,"")
    This is how I selected the data type, Master Data/Text from NW BW InfoObject:
    Selection of InfoObject: 0costcenter
    Format: External Format
    Set Selection, 1st tab (Attribute): I only wanted Controlling Area 1000, so Controlling Area = 1000
    2nd tab (Hierarchy): Import text node - yes; Hierarchy node - xxxxxxx; Version - empty; Member ID - first member; Level - blank
    3rd tab (Language): English
    4th tab (Attribute list): only Controlling Area is selected
    Note: when I load the master data without the hierarchy in the Set Selection, the load is successful, but when I include the hierarchy as described in the 2nd tab I get the following error:
    Master data (dealt by table level) has errors
    Detected duplicate member ID '201100'
    Also, the master data for Cost Center in BW is time-dependent, so it has Valid From and Valid To fields, which BPC does not handle.
    Please Help

    @Vinay, when the BW master data is time-dependent you will get these duplicate members, because the duplicate cost center IDs are compounded with the validity period. BW does not raise an error because of the compounding, but BPC has no such feature.
    This was raised with SAP and they resolved the issue:
    SAP Note 1641529 - Loading Master Data gets duplicate error in special case
    When running the Data Manager package 'Loading Master Data from BW InfoObjects', a 'Duplicate Members are found' error may be reported in the following case:
    o In the source BW InfoObject there are master data records of different lengths whose IDs are all numeric characters. And
    o sorting the members by length first and then by ID gives a different order than sorting directly by ID. Take members '122' and '1102' for instance: in BW they are sorted as [122, 1102]; sorted directly by ID, the order is [1102, 122]. And
    o when running the package, the members are in both the 'Attribute' and 'Hierarchy' selection, and the option 'Filter members by Attributes or Hierarchies' is used. And
    o 'External Format' is selected when running the package.
    Other terms: DM, Loading Master Data from BW, Duplicate Members
    Reason and Prerequisites: It's a program error.
    Solution: Please apply this note or upgrade to SP11.
    I hope this helps; otherwise let me know the full requirement so that I can provide further assistance.
    Do check how the master data looks in your BW, and how the hierarchy nodes and cost center nodes are arranged.
    Good Luck
    Vijay Sumith

  • Data Loading Error: Too many error records

    Hi All,
    I got a data loading error when loading from a flat file.
    The error message is as follows:
    Too many error records - update terminated
    Error 18 in the update
    No SID found for value '00111805' of characteristic ZUNIQUEID (message no. 70)
    Can anybody help in resolving the issue?
    Regards,
    Chakravarthy

    Hi,
    Check the format of your characteristics and key figures.
    Check that you have put the data separators in your flat file appropriately.
    For the particular characteristic ZUNIQUEID, ensure the data is consistent; check the related tables.
    Assign points if useful.
    Regards
    N Ganesh

  • Data Load for 20M records from PSA

    Hi Team,
    We need to reload a huge volume of billing data (2LIS_13_VDITM), around 20 million records, from the PSA to the first-level DSO and then to the higher-level targets.
    If we run the entire load as one full request from PSA to DSO for 20M records, will there be any performance issue?
    Would it be a good approach to split the load by 'Billing Document Number'?
    If we do split the load by 'Billing Document Number', will receiving the data in multiple requests create any performance issue from the reporting perspective? Most reports are run by Date rather than by 'Billing Document Number'.
    Thanks
    San

    Hi,
    A better solution is to filter based on the year or fiscal year.
    Check how many years of data there are; based on that, you can set the filter.
    Thanks,
    Phani.

  • To insert duplicate records in VO

    Hi,
    I have a 'Duplicate Record' button on a page.
    All the existing details on the page are read-only except a checkbox on each row.
    When I select the checkbox and click the 'Duplicate Record' button, an editable row with the same data as the selected row should be created on the page. The row is created, but the row it was duplicated from also becomes editable, and changes made to the new row are reflected in the old row as well.
    Any solutions to keep the old row read-only and only the new row editable?

    OK, what I understand is as follows:
    In the VO, I create a transient attribute 'RowRef' and select it in my VO query.
    In my results table, I create a form value "evtSrcRowRef" with view attribute 'RowRef'.
    In the CO, I write:
    String rowReference = pageContext.getParameter("evtSrcRowRef");
    Please correct me if I am wrong or missing something, and please explain how I use this row reference to make my original row read-only.

  • Data Load with 0 records

    Hi,
    How should the system react in the following cases?
    1. A full load bringing in 0 records
    2. An init load bringing in 0 records
    3. A delta load bringing in 0 records
    Note: by 0 records I mean the load genuinely contains no records.
    For each of the above cases, will the load turn green, or remain yellow and then time out?
    I keep getting different reactions from the system in these cases. I would appreciate views from the experts.
    Thank you,
    sam

    Roberto, the setting you mention does exist, and I have it marked as green.
    I did an init load which pulled in 0 records, which is correct. Even though 'green for 0 records' is checked in the RSMO settings, the load errored out after the timeout set in the InfoPackage,
    and the main traffic light is still running, with "No errors could be found. The current process has probably not finished yet."
    Any tips?

  • Tabular Form - submit custom data and insert/update records

    I have a tabular form with 2 columns representing table data and 5 more custom columns:

    Task ID | Task Name | 10/7/2013 to 10/13/2013 | 10/14/2013 to 10/20/2013 | 10/21/2013 to 10/27/2013 | 10/28/2013 to 11/3/2013 | 11/4/2013 to 11/10/2013
    1       | TASK1     |                         |                          |                          |                         |
    2       | TASK2     |                         |                          |                          |                         |
    3       | TASK3     |                         |                          |                          |                         |
    I use a SQL query which returns null values for columns 3 to 7, then use the HTML formatting option of APEX plus jQuery to change the headers to week ranges and add checkboxes.
    My SQL: select task_id, task_name, null week1, null week2, null week3, null week4, null week5 from <table name>
    My table has the columns task_id, task_name, start_date and end_date.
    When the user clicks the submit button, I need to send the checkbox data to my stored procedure and insert records into the corresponding table. I cannot work out how to send a mapping of the task_id, the date headers of each column, and the checkbox state to the APEX back end and process them there. For example, if the first checkbox in the first row is checked, I should insert a row with the values "1, TASK1, 10/7/2013, 10/13/2013". I also have to read the data back from the table and display it in the same format. Please let me know how to accomplish this in APEX.

    Instead of using null, you can use the apex_item API to create the checkboxes;
    read http://docs.oracle.com/cd/E37097_01/doc/doc.42/e35127/apex_item.htm#CHDDCHAF
    You can set the value of the checkboxes using the apex_item API, and that value can then be captured in your page process.
    Check this: Martin Giffy D'Souza on Oracle APEX: APEX Report with checkboxes (advanced).
    Let me know if this answers your query in the current thread.
    Regards,
    Vishal
    Oracle APEX 4.2 Reporting | Packt Publishing
    Vishal's blog
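
    A sketch of what Vishal describes, for the first week column only ('tasks' stands in for the elided table name; the p_idx value 10 is arbitrary). Encoding the task ID and the column's date range into the checkbox value gives the page process everything it needs for the insert:

    -- Checked values arrive in the on-submit page process as
    -- apex_application.g_f10, one 'task_id:start:end' string per box.
    SELECT task_id,
           task_name,
           apex_item.checkbox2(
             p_idx   => 10,
             p_value => task_id || ':10/07/2013:10/13/2013'
           ) AS week1
    FROM   tasks;

    Repeat with a different p_idx for each of the remaining week columns.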

  • Data loading: few records could not be loaded

    Hi Sapiens,
    I am loading data from one source into BW. All records have been loaded except for 10 records which are missing. How can we load only the 10 missing records?
    I don't want to load all the records again.
    Please guide.
    Regs
    Sanju

    Hi Sanju,
    You can load those records with a repair full request (or a full update) by giving selections unique to those 10 records in your InfoPackage in BW.
    In detail: go to your InfoPackage, choose Full update (and tick the Repair Full Request indicator), then go to the Selection tab and enter the document numbers or some other selection criterion by which you can extract only those 10 records.
    Let us know if you face any problems.
    Thanks
    CK

  • Data Load - Added & Transferred Record Mismatch

    Hi to all,
    I am loading data from R/3 to an InfoCube in 3.x, and I can see a large difference between transferred and added records, e.g. transferred records 5,032,704 vs. added records 3,505,696.
    I have checked that no routine has been written.
    In the selection I have given a range for Object No.
    For Object No 10000 to 99999, the transferred and added record counts are the same,
    but for KSM100 to KSM999 I get a difference between added and transferred records.
    Please can anyone help me.
    Thanks in advance
    shalini

    Records do not always all get added to your fact table. You may have written an update rule "not to update certain records"; in such cases the transferred and added record counts differ. Another instance is when you split an incoming record into 2 records in the update rules; there the number of added records is greater than the number transferred.
    Re: difference i transfered records and added records
    Re: manage infocube

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    Has anyone else encountered the behaviour with Oracle Data Loader where duplicate records are created (even with the option duplicatecheckoption=externalid set)? When I check the "import request queue - view", the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but Data Loader has created new records where the "External Unique ID" already exists.
    Strangely, when I create the import manually (using the Import Wizard), exactly the same import works correctly: there the duplicate checking method works and the record is updated.
    I know the Data Loader has 2 methods, one for update and one for import; however, I did not expect the import to create duplicates when the record already exists, rather than doing nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the 'Update' method works fine.
    thanks in advance, Juergen
    Edited by: 791265 on 27.08.2010 07:25
    Edited by: 791265 on 27.08.2010 07:26

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load first, before a full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (aka importing). It only checks for duplicates when updating (aka overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and in the Data Import Options Overview document.
    You should review all documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete
