Duplicated records on InfoObject data load

Hi,
I have a problem when loading data to the 0UCINSTALLA InfoObject.
The request goes to red and reports duplicated records in the /BI0/QUCINSTALLA and /BI0/YUCINSTALLA tables.
I checked the InfoPackage: the "PSA only" checkbox is selected, and the "Continuing..." and "Ignore duplicate records" checkboxes are selected, too.
If "Ignore duplicate records" is selected, why is the load still reporting this error?
I don't know what to do with this problem.
Any ideas?
thanks for the help.
Mauricio.

In the transfer rules, write a start routine that deletes the duplicate records, like this (the /BIC/FIELDn names are placeholders for your own transfer structure fields; see the sketch below for where the statements sit in the routine):

SORT datapak BY /bic/field1 DESCENDING
                /bic/field2
                /bic/field3.
DELETE ADJACENT DUPLICATES FROM datapak
  COMPARING /bic/field1 /bic/field2.
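For context, here is a minimal sketch of the surrounding BW 3.x transfer rule start routine. The FORM signature below follows the generated routine as far as I recall, so verify it against your own generated code; the field names are placeholders. Note that DELETE ADJACENT DUPLICATES keeps the first row of each group, so the SORT order decides which duplicate survives:

FORM startroutine
  USING    g_s_minfo    TYPE rssm_s_minfo
  CHANGING datapak      TYPE tab_transtru
           g_t_errorlog TYPE rssm_t_errorlog_int
           abort        LIKE sy-subrc.

* Sort so that rows with the same key fields are adjacent; within a
* group, the row sorted first is the one that survives.
  SORT datapak BY /bic/field1 DESCENDING
                  /bic/field2
                  /bic/field3.
  DELETE ADJACENT DUPLICATES FROM datapak
    COMPARING /bic/field1 /bic/field2.

* ABORT <> 0 would cancel the whole data package.
  abort = 0.
ENDFORM.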
Hope it helps.
Regards

Similar Messages

  • Duplicate records during master data loading.

    hello guys,
    I am reading a blog where the blogger wrote about 'Various issues in a BW Production project'. I came across one issue which I could not understand:
    data loading failed due to duplicate records during master data loading.
    Why does this error occur? How can we rectify this in a production environment?
    Thanks and Regards,
    S

    Hi SChandx200,
    May I ask where you got "Various issues in a BW production project"?
    Many Thanks,

  • Duplicate records in BW Data Loads

    In my project I am facing duplicate records in data loads when I compare the PSA and the DSO. How can I check which records are duplicates, and is there any mechanism for this (through an Excel sheet or otherwise)? Please help me out. Thanks in advance for your quick response.
    Edited by: svadupu on Jul 6, 2011 3:09 AM

    Hi,
    Getting duplicate records in the PSA is fine, because no keys are set in the PSA and all the records come directly from the source.
    In the case of a standard DSO, records are always overwritten, so you would not get any duplicates.
    If you are getting duplicate records in the PSA and need to find them:
    Go to PSA -> Manage -> PSA maintenance -> change the number of records from 1000 to the actual number of records that came in. In the menu, go to List -> Save -> File -> change the path from the SAP directory to some other path and save the file.
    Open the file, take the columns forming the DSO keys together, and sort ascending. You will find the duplicate records in the PSA. Alternatively, see the ABAP sketch below.
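    As an alternative to the file export, a quick check can be run in SE38. This is a minimal sketch: both the PSA table name (/BIC/B0001234000) and the key field (DOC_NUMBER) are hypothetical, so look up the real PSA table via the PSA tree or table RSTSODS and use your own DSO key fields.

    REPORT zpsa_dup_check.

    * Placeholder structure: one DSO key field plus an occurrence count.
    TYPES: BEGIN OF ty_dup,
             doc_number TYPE char10, " hypothetical key field
             cnt        TYPE i,
           END OF ty_dup.

    DATA: lt_dup TYPE STANDARD TABLE OF ty_dup,
          ls_dup TYPE ty_dup.

    * Count how often each key value occurs in the PSA table;
    * anything with cnt > 1 is a duplicate.
    SELECT doc_number COUNT( * )
      INTO TABLE lt_dup
      FROM /bic/b0001234000
      GROUP BY doc_number
      HAVING COUNT( * ) > 1.

    LOOP AT lt_dup INTO ls_dup.
      WRITE: / ls_dup-doc_number, ls_dup-cnt.
    ENDLOOP.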

  • Duplicated records in Master Data Attribute Load

    Hi All,
    I'm getting the following error during a master data attribute load:
       "1 duplicate record found. 8 recordings used in table /BIC/QEMP_IPF"
    From my experience, when this error happened to other loads, I could make it go away by changing the InfoPackage setting to update to the PSA and then to the InfoObject (package by package). However, for this load I have already selected this InfoPackage setting and the error still occurs.
    Can anyone advise me on how to troubleshoot this further? Thanks.
    Regards,
    KM

    You selected option 1 in the InfoPackage; select option 3 on the Processing tab instead:
    InfoPackage -> Processing -> "Only PSA" + "Update Subsequently in Data Targets", and select "Ignore Double Data Records".
    Hope it will work fine.

  • Records missing in data load.

    Hi All,
    we have one load (DSO --> cube).
    Example: in the DSO there are 20 entries for customer number 1,
    but after uploading the data into the cube, only 2 or 3 records are loaded for this customer.
    Even after applying the delete-source-package conditions in the DSO --> cube transformation, it should still load 10 entries,
    but only 2 or 3 records are updated.
    When I debug the DTP load, the end routine shows the correct number of records, i.e. 10 records for that customer,
    but in the cube only 3 records appear.
    What might be the reason? Is there any chance that records are dropped after the end routine?
    Please share your ideas.
    Thanks
    Krishna.

    Hi Asish,
    The key figure summation does not match either.
    I tried it for that one customer as well,
    but it behaves the same (in debugging it shows the correct number of records, but not in the cube),
    and there are no filter conditions used in the DTP.
    Thanks
    Krishna
    Edited by: krishnamurthy g on Oct 17, 2008 6:52 PM
    Edited by: krishnamurthy g on Oct 20, 2008 8:30 PM

  • Rate data loader

    Hey, Gurus!
    Does anyone know how many records Oracle Data Loader On Demand sends per package?
    I didn't find anything in the documentation (Data Loader FAQ, Data Loader Overview for R17, Data Loader User Guide).
    thanks in advance
    Rafael Feldberg

    Rafael, there is no upper limit on the number of records that the Data Loader can import. However, after doing a test import using the Import Wizard, I would recommend keeping the number of records at a reasonable level.

  • The load from ODS - 0FIAP_O03 to cube 0FIAP_C03  is duplicating records

    Hello Everyone
    I need some help/advice for the following issue
    SAP BW version 3.5
    The delta load for ODS 0FIAP_O03 works correctly.
    The load from ODS 0FIAP_O03 to cube 0FIAP_C03 duplicates records.
    NB: I noticed one other forum user who raised the same issue, but the question was not answered.
    My questions are:
    1. Is this a known problem?
    2. Is there a fix available from SAP?
    3. If anyone has had this issue and fixed it, could you please share how you achieved it?
    I have possible solutions, but need to know if there is a standard solution.
    Thank you
    Pushpa

    Hello Pushpa,
    I assume that you are using a delta load to the initial ODS and then to the cube as well.
    If delta is used in both places, there should not be any issue when sending the data to the cube.
    If you are using a full load, the data will normally get aggregated in the cube, because key figures are set to Addition mode in the cube; for example, loading the same full data set twice doubles the key figure values.
    Can you post the exact error you are facing? This could also be a design issue.
    Murali

  • Data Load : Number of records count

    Hi Experts,
    I want to document the number of records transferred to BW during an InfoPackage execution.
    I want to automate the process by running a report in the background which will fetch data from SAP tables about the number of records transferred by all my InfoPackages.
    I would like to know how I should proceed.
    I want to know the system tables which contain the same data that transaction RSMO displays.
    Kindly help with valuable replies.

    Hi,
    In order to get the record counts you need to create a report based on the tables below:
    RSSELDONE, RSREQDONE, RSLDPIOT, RSMONFACT
    Check the link below, which explains this in detail, including the report code.
    [Data load Quick Stats|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/90215bba-9a46-2a10-07a7-c14e97bdb764]
    This doc also explains how to trigger a mail with the details to all. As a starting point, see the sketch below.
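    Here is a minimal sketch of such a report, assuming RSSELDONE carries the request number (RNR) and the InfoPackage ID (LOGDPID). Those field names, and which of the four tables holds the record counts, can vary by release, so verify them in SE11 before building on this:

    REPORT zip_load_stats.

    * Field names are assumptions -- check RSSELDONE in SE11 first.
    TYPES: BEGIN OF ty_req,
             rnr     TYPE rsseldone-rnr,     " request number
             logdpid TYPE rsseldone-logdpid, " InfoPackage ID
           END OF ty_req.

    DATA: lt_req TYPE STANDARD TABLE OF ty_req,
          ls_req TYPE ty_req.

    * One row per executed data request; join RSREQDONE / RSMONFACT
    * on RNR to pick up statuses and per-packet record counts.
    SELECT rnr logdpid
      INTO TABLE lt_req
      FROM rsseldone.

    LOOP AT lt_req INTO ls_req.
      WRITE: / ls_req-rnr, ls_req-logdpid.
    ENDLOOP.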
    Regards
    KP

  • Data loader : Import -- creating duplicate records ?

    Hi all,
    has anyone else encountered the behaviour where Oracle Data Loader creates duplicate records (even with the option duplicatecheckoption=externalid set)? When I check the "import request queue - view", the request parameters of the job look fine:
    Duplicate Checking Method == External Unique ID
    Action Taken if Duplicate Found == Overwrite Existing Records
    but the data loader has created new records where the "External Unique ID" already exists.
    Very strangely, when I create the import manually (using the Import Wizard), exactly the same import works correctly! There the duplicate checking method works and the record is updated.
    I know the data loader has two methods, one for update and the other for import; however, I did not expect the import to create duplicates if the record already exists, rather than doing nothing!
    Is anyone else experiencing the same? I hope this is not expected behaviour! By the way, the "Update" method works fine.
    thanks in advance, Juergen
    Edited by: 791265 on 27.08.2010 07:25
    Edited by: 791265 on 27.08.2010 07:26

    Sorry to hear about your duplicate records, Juergen. Hopefully you performed a small test load before the full load, which is a best practice for data import that we recommend in our documentation and courses.
    Sorry also to inform you that this is expected behavior: Data Loader does not check for duplicates when inserting (importing). It only checks for duplicates when updating (overwriting). This is extensively documented in the Data Loader User Guide, the Data Loader FAQ, and the Data Import Options Overview document.
    You should review all the documentation on Oracle Data Loader On Demand before using it.
    These resources (and a recommended learning path for Data Loader) can all be found on the Data Import Resources page of the Training and Support Center. At the top right of the CRM On Demand application, click Training and Support, and search for "*data import resources*". This should bring you to the page.
    Pete

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a data loader script which is scheduled to run every morning. The script performs an insert operation.
    Every morning a file with new insert data is available in the same location (generated by someone else) and with the same name. The data loader script must insert all the records in it.
    One morning there was a problem with the other job and a new file was not produced. When the data loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I came to know that the option works only for update operations.
    How can a situation like this be handled in the future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the data loader side, is it possible to somehow mark the field as 'unique' in the UI so that there is an error when a duplicate record is inserted? Please suggest.
    Regards,

    Hi,
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
    Cheers
    Sudhir

  • SPM Data Loads : Less number of records getting loaded in the Invoice Inbound DSO

    Dear Experts,
    We are working on a project where data from different non-SAP source systems is loaded into SPM via flat file loads. We came across a very weird situation.
    For the other master and transaction data objects it worked fine, but when we loaded the invoice file, fewer records were loaded into the inbound DSO. The invoice file contained 80,000 records, but the inbound DSO has only 78,500 records. We are losing 1,500 records.
    We are unable to figure out which 1,500 records we are missing. We couldn't find any logs in the inbound invoice DSO, and we cannot tell whether the records are erroneous or something else is wrong. Is there a way to analyze the situation / the inbound invoice DSO?
    If there is an issue with the outbound DSO or the cube, we know it is possible to check the data load request, but for the inbound DSO we do not know how to analyze the issue and why it is taking fewer records.
    Regards
    Pankaj

    Hi,
    Yes, this can happen in a DSO: the data records share the same semantic keys, so with your key field selection records with identical keys overwrite each other and you end up with fewer records.
    If you have any routines, check the code for conditions that filter out records.
    Regards.

  • Data Loader - Only imports first record; remaining records fail

    I'm trying to use Data Loader to import a group of opportunities. Every time I run the Data Loader, it imports only the first record. All the other records fail with the message "An unexpected error occurred during the import of the following row: 'External Unique Id: xxxxxxx'". After running the Data Loader, I can modify the data file and remove the first record that was imported. When I run the Data Loader again, the first row (previously the second row) imports successfully.
    Any idea what could be causing this behavior?

    We need a LOT more information, starting with the OS and the version of ID, including any applied patches.
    Next we need to know if you are doing a single record per page or multiple records, whether the placeholders are on the master page, how many pages are in the document, and if they all have fields on them (some screen captures might be useful; embed them using the camera icon on the editing toolbar on the webpage rather than attaching, if it works [there seem to be some issues at the moment, though only for some people]).
    What else is on the page? Are you really telling it to merge all the records, or just one?
    You get the idea: a full description of what you have, what you are doing, and what you get instead of what you expect.

  • Number of records in a full load doesn't square with the number in an init with data transfer

    Hello to all BI/FI SDNers,
    FI AP and AR extractors like 0FI_AR_3 and 0FI_AP_3 pull 1785 records to the PSA when loaded full, but an init with data transfer for the same pulls only 1740 records to the PSA.
    I am very skeptical that even a delta after a repair full will bring in all records.
    Which update methodologies are qualified for AP and AR?
    The OSS notes I found really don't answer my concern here; please comment, SDNers!
    Message was edited by:
            Jr Roberto

    Somehow it worked after redoing it!

  • Deleting duplicated records by date

    Hi,
    What can I do to delete duplicated records from the test1 table?
    The test1 table has the following columns:
    code
    operator
    phone
    init_date
    end_date
    The records are duplicated by code, operator, phone, init_date,
    and I need to delete the records with min(end_date).
    Thanks in advance...

    /* Formatted on 1/12/2012 7:28:44 AM (QP5 v5.149.1003.31008) */
    CREATE TABLE data
    AS
       (SELECT 'A' code,
               'Bob' operator,
               '111-2222' phone,
               ADD_MONTHS (SYSDATE, -1) init_date,
               SYSDATE end_date
          FROM DUAL
        UNION ALL
        SELECT 'A',
               'Bob',
               '111-2222',
               ADD_MONTHS (SYSDATE, -1) init_date,
               SYSDATE + 1 end_date
          FROM DUAL);
    DELETE FROM data
          WHERE (code, operator, phone, init_date, end_date) IN
                   (SELECT code,
                           operator,
                           phone,
                           init_date,
                           end_date
                      FROM (SELECT data.*,
                                    COUNT (*)
                                    OVER (
                                      PARTITION BY code,
                                                   operator,
                                                   phone,
                                                   init_date)
                                      cnt,
                                   ROW_NUMBER ()
                                   OVER (
                                      PARTITION BY code,
                                                   operator,
                                                   phone,
                                                   init_date
                                      ORDER BY end_date)
                                      rn
                              FROM data)
                     WHERE cnt > 1 AND rn = 1);
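    Note: as written, the DELETE removes only the single earliest row (rn = 1) in each duplicate group. If a key combination can occur more than twice and only the row with the latest end_date should survive, one variant would be to change the filter to "WHERE rn < cnt", which deletes everything except the last row of each group.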

  • Zero Record Data Load Problem

    Hi,
    Please give your suggestions for the following problem.
    We are loading data from an ETL tool (DataStage, flat files) into SAP BW 3.1.
    The data may contain zero records. When we try to push the data into BW, the ETL side shows a successful data transfer, but the BW side stays in the "processing" state (yellow light) and all BW resources hang.
    When we then try to send another data load from the ETL side, we cannot push the data, as the BW resources are still held by the previous process.
    Whenever we get this kind of problem, we kill the process and continue with another data reload. But this is not a permanent solution, and it is happening more and more often.
    What is the solution for this problem?
    One of my colleagues made the following suggestion. Should I consider it?
    Summary: when loading empty files, data may remain in the processing state in BW.
    Details: When a user loads an empty file (it must be truly empty and cannot contain any line returns; the user can check the data file in binary mode), the data is loaded into BW with 0 records. BW will be in the yellow (processing) state with 0 records showing, and in the PSA inside BW one data packet will appear with nothing inside. Depending on how the system is configured, the BW server can either accept the 0-record packet or deny it. When the BW server is configured to accept it, the load request changes to the green (finished) state. When it is configured to deny it, the load request stays in the yellow state.
    Please give me your suggestions.
    Thanks in advance.
    Regards,
    VPR

    hi VPR,
    have you tried setting the traffic light 'judgement'?
    Go to the monitor of one request and choose menu Settings -> Evaluation of requests (traffic light). In the next screen, 'Evaluation of requests', for 'If no data is available in the system, the request' choose the option 'is judged to be successful' (green).
    Also set the delta load to complete when there is no delta data.
    hope this helps.
