Oracle 10 - Avoiding Duplicate Records During Import Process

I have two databases on different servers (DB1 and DB2) and a database link connecting the two. In DB2, I have 100 million records in table 'DB2Target'.
I tried to load 100 million more records from table DB1Source into DB2Target on top of the existing 400 million records, but after an hour I got a network error from DB2.
The load failed after inserting about 70% of the records. Now I have three tasks. First, I have to find the duplicate records between DB1 and DB2. Second, I have to find the remaining 30% of records missing from DB2Target.
Third, I have to re-load the remaining 30% of records. What is the best solution?
To find duplicates:
SELECT COUNT(*), A, B FROM DB2TARGET
GROUP BY A, B
HAVING COUNT(*) > 1

For re-loading:
MERGE INTO DB2TARGET tgt
USING DB1SOURCE src
ON (tgt.A = src.A)
WHEN NOT MATCHED THEN
  INSERT (tgt.A, tgt.B)
  VALUES (src.A, src.B);

Thanks for any guidance.

When I execute this, I get the following error message:
SQL Error: ORA-02064: distributed operation not supported
02064. 00000 - "distributed operation not supported"
*Cause:    One of the following unsupported operations was attempted
1. array execute of a remote update with a subquery that references
a dblink, or
2. an update of a long column with bind variable and an update of
a second column with a subquery that both references a dblink
and a bind variable, or
3. a commit is issued in a coordinated session from an RPC procedure
call with OUT parameters or function call.
*Action:   simplify remote update statement
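
One possible workaround (a sketch only, not a confirmed fix): ORA-02064 often goes away if the statement is run on the database that owns the target table, so that only the source is read across the database link, or if the remote rows are staged locally first. Assuming you can connect directly to DB2 and reach DB1SOURCE through a database link named, hypothetically, DB1LINK:

-- Run while connected to DB2, so the MERGE target is local
-- and only the source is read across the (hypothetical) DB1LINK.
MERGE INTO DB2TARGET tgt
USING (SELECT A, B FROM DB1SOURCE@DB1LINK) src
ON (tgt.A = src.A)
WHEN NOT MATCHED THEN
  INSERT (tgt.A, tgt.B)
  VALUES (src.A, src.B);

-- If the distributed MERGE still raises ORA-02064, copy the remote
-- rows into a local staging table and merge from the local copy.
CREATE TABLE DB1SOURCE_STAGE AS
  SELECT A, B FROM DB1SOURCE@DB1LINK;

MERGE INTO DB2TARGET tgt
USING DB1SOURCE_STAGE src
ON (tgt.A = src.A)
WHEN NOT MATCHED THEN
  INSERT (tgt.A, tgt.B)
  VALUES (src.A, src.B);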

Similar Messages

  • Avoid duplicate records

    hi guys
    Could you please let me know where the option for avoiding duplicate records is:
    1. in the case of an InfoPackage?
    2. in the case of a DTP?

    Hi,
    In the case of an InfoPackage in 3.5: Processing tab -> select "Only PSA, update subsequent data targets, ignore double data records".
    In 7.0, on the Processing tab, the selection is "Only PSA" by default.
    In the case of a DTP: Update tab -> select "Handle duplicate data records".

  • How to avoid Duplicate Records while joining two tables

    Hi,
    I am trying to join three tables; basically two of the tables are the same, one is a history table, so I wrote a query like
    select e.id,
           e.seqNo,
           e.name,
           d.resDate,
           d.details
      from employees e
      join ((select * from dept) union (select * from dept_hist)) d
        on d.id = e.id and e.seqno = d.seqno
    but this is returning duplicate records.
    Could anyone please tell me how to avoid duplicate records in this query?

    Actually, once a record is processed it is moved to the history table, so both tables should not contain the same records. I need the records from both tables, so I did a union of the two, and d holds the combined result.
    But I am still getting duplicate records, even when I use DISTINCT.
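
    A sketch of one way to pin this down (the column names below are taken from the query above and are otherwise assumptions): UNION already removes rows that are identical across both branches, so remaining duplicates usually mean the same (id, seqno) exists in both dept and dept_hist with different non-key columns. De-duplicating the combined set by key before joining makes this explicit:

    -- Keep only one dept/dept_hist row per (id, seqno),
    -- preferring the current dept row over the history row.
    select e.id,
           e.seqNo,
           e.name,
           d.resDate,
           d.details
      from employees e
      join (select id, seqno, resDate, details,
                   row_number() over (partition by id, seqno
                                      order by src) rn
              from (select d1.*, 1 as src from dept d1
                    union all
                    select d2.*, 2 as src from dept_hist d2)
           ) d
        on d.id = e.id
       and d.seqno = e.seqno
     where d.rn = 1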

  • Duplicate records during master data loading.

    hello guys,
    I am reading a blog where the blogger wrote about "Various issues in a BW Production project". I came across one issue which I could not understand:
    data loading failed due to duplicate records during master data loading.
    Why does this error occur? How can we rectify this in a production environment?
    Thanks and Regards,
    S

    Hi SChandx200,
    May I ask where you got "Various issues in a BW production project"?
    Many Thanks,

  • Change/new/delete highlighting during import process

    Hi everyone,
    I intend to use SAP NW MDM to consolidate master data from external sources.
    During the import process, I would like to see a highlighting of all new, all deleted and all changed records. The reason for this is the following: before I actually load the imported data into my MDM repository, I want to check the data, just in case some corrupt data was loaded.
    Can I do this in the MDM Import Manager?
    Best regards, Daniel

    hi,
    Whenever an imported record is new or changed, a timestamp could be set (which is most likely set already anyway). After importing the data I could run a routine which shows me all records that have not been updated. I could then delete these records or set them as "inactive".
    What do you think about this workaround? Is it feasible in MDM?
    Yes, this is feasible in MDM. Whenever a record is changed or updated by a user, the record is updated along with the timestamp.
    You can view this in the Data Manager.
    If you want to select some records and mark them as inactive, you have to create a field for that.
    Hope this helps,
    Regards,
    Srinivas

  • Avoiding duplicate records while inserting into the table

    Hi
    I tried the following insert statement, where I want to avoid duplicate records during the insert itself,
    but it gives me an "invalid identifier" error, even though the column exists in the table.
    Please let me know where I am making a mistake.
    INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate
         FROM S_KEY sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm1
                           WHERE tm1.o_id = tm.o_id
                             AND tm1.sn_id = tm.sn_id
                             AND tm1.txt = tm.txt
                             AND tm1.typ = tm.typ
                             AND tm1.sn_time = tm.sn_time)

    Then,
    you reference the alias tm inside the subquery, but that alias belongs to the outer INSERT target and is not visible there. Do you want something like this?
    INSERT INTO t_map (sn_id, o_id, txt, typ, sn_time)
       SELECT 100,
              sk.obj_id,
              sk.key_txt,
              sk.obj_typ,
              sysdate
         FROM S_KEY sk
        WHERE sk.obj_typ = 'AY'
          AND SYSDATE BETWEEN sk.start_date AND sk.end_date
          AND sk.obj_id IN (100170, 1001054)
          AND NOT EXISTS (SELECT 1
                            FROM t_map tm
                           WHERE sk.obj_id = tm.o_id
                             AND 100 = tm.sn_id
                             AND sk.key_txt = tm.txt
                             AND sk.obj_typ = tm.typ
                             AND sysdate = tm.sn_time)
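
    A different sketch of the same idea (using the same hypothetical t_map and S_KEY columns as above) is to let MERGE perform the duplicate check instead of NOT EXISTS:

    -- Sketch only: the ON columns are an example; list whatever
    -- combination of columns defines a duplicate for you.
    MERGE INTO t_map tm
    USING (SELECT 100 AS sn_id,
                  sk.obj_id,
                  sk.key_txt,
                  sk.obj_typ,
                  SYSDATE AS sn_time
             FROM S_KEY sk
            WHERE sk.obj_typ = 'AY'
              AND SYSDATE BETWEEN sk.start_date AND sk.end_date
              AND sk.obj_id IN (100170, 1001054)) src
    ON (tm.o_id = src.obj_id AND tm.sn_id = src.sn_id)
    WHEN NOT MATCHED THEN
      INSERT (tm.sn_id, tm.o_id, tm.txt, tm.typ, tm.sn_time)
      VALUES (src.sn_id, src.obj_id, src.key_txt, src.obj_typ, src.sn_time);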

  • Sqlloader controlfileparam's to avoid duplicate records loading

    Hi All,
    I am looking for an option in the control file which would restrict SQL*Loader from loading duplicate records. I know that if we apply a constraint on the table itself, it will not allow duplicate data to be loaded and will reject it into the reject file, but I have to do this in the control file so that SQL*Loader itself avoids loading duplicate records. Can you please suggest which control file option enables this?
    Can you please also tell me the exact difference between the bad file and the reject file? In which scenarios does it write to the reject file versus the bad file?
    Regards

    Hey,
    I don't think there is any option to avoid loading duplicate records using a parameter in the control file.
    On the difference between the bad and reject files, try this link:
    http://www.exforsys.com/content/view/1587/240/
    Regards,
    Sushant
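
    Since the control file itself cannot de-duplicate, a common pattern (a sketch only; stg_data and target_table are hypothetical names) is to load the flat file into a staging table with SQL*Loader and then move only the rows that are not already present into the real table:

    -- After sqlldr has loaded the file into STG_DATA:
    INSERT INTO target_table (key_col, val_col)
      SELECT DISTINCT s.key_col, s.val_col
        FROM stg_data s
       WHERE NOT EXISTS (SELECT 1
                           FROM target_table t
                          WHERE t.key_col = s.key_col);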

  • Duplicate Records error when processing transaction file....BPC 7.0

    Hi All,
    I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully. But when I try to import the flat file into my application, I get a lot of duplicate record errors, and these are my questions:
    1. Can we get duplicate records in transaction files?
    2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than report an error and reject the records?
    3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it applies only to account, category and entity.)
    4. In my case I see identical values in all my dimensions and the $value is the only difference. Why is it not summing up?
    Your quickest reply is much appreciated.
    Thanks,
    Alex.

    Hi,
    I have the same problem.
    In my case, the file that I want to upload has different rows that differ only in the nature column. In the conversion file I map the different natures to one internal nature.
    E.g.: cost1 --> cost
          cost2 --> cost
          cost3 --> cost
    What I wanted was that in BPC the nature cost would take the value cost = cost1 + cost2 + cost3.
    The result is that only the first record is uploaded and all the other records are rejected as duplicates.
    Any suggestions?

  • How to avoid duplicate record in a file to file

    Hi Guys,
    Could you please provide a solution
    to avoid duplicate entries in a flat file based on a key field?
    I am asking in terms of standard functions,
    either at message mapping level or by configuring the file adapter.
    warm regards
    mahesh.

    hi mahesh,
    Write a module processor to check for duplicate records in the file adapter,
    or
    eliminate the duplicate records with a Java/ABAP mapping.
    Also check these links:
    Re: How to Handle this "Duplicate Records"
    Duplicate records
    Ignoring Duplicate Records--urgent
    Re: Duplicate records frequently occurred
    Re: Reg ODS JUNK DATA
    http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    regards
    srinivas

  • Deleting Duplicate Records - EIM Import Account & Contact

    Hi,
    Can anyone give me the query to delete duplicate records, both from the legacy tables as well as from the imported tables?

    Try this:
    DELETE
      FROM table
     WHERE ROWID NOT IN (SELECT MAX(ROWID)
                           FROM table
                          GROUP BY column)
    Put the column that has duplicate rows in the GROUP BY clause; for each group, every record except the one with the maximum ROWID will be deleted.
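
    On large tables, an analytic-function variant of the same idea (a sketch; my_table and key_col are hypothetical names) can be easier to control, because you can run the inner query first to inspect the duplicates before deleting them:

    -- Delete every duplicate, keeping one row per key value.
    DELETE FROM my_table
     WHERE ROWID IN (SELECT rid
                       FROM (SELECT ROWID AS rid,
                                    ROW_NUMBER() OVER (PARTITION BY key_col
                                                       ORDER BY ROWID) rn
                               FROM my_table)
                      WHERE rn > 1);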

  • Scenario - Webservice - XI - BW. How to Avoid duplicate records?

    Hi all,
    Webservice --> XI -->BW .
    BPM has been used to send to send the response back.
    BPM :
    start ->Receive(Request)> Transformation(Responsemap)>Send(SendtoBW)->Send(Send Response) ---> stop.
    We are making use of MSGID to maintain the uniqueness of each message which is coming from Webservice. Uniqueness is maintained for the combination of sale-num:trans-num:sap-saletype:sale-type like below. One msgID will be registered in XI for each Unique message.
    ex:   sale-num:trans-num:sap-saletype:sale-type
           1983:5837:E:NEW
    If a duplicate message is received again, XI sends the response back to the webservice as "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a".
    It is working correctly. The only problem is when XI is down or a communication failure happens in the middle of processing, as in the example below.
    A sample case which failed recently: a webservice call failed three times, for the following reasons.
    First time :
    It got the error as ""FAILED TO INVOKE WEB SERVICE OPERATION OS_CWUSales
    Error receiving Web Service Response: Fatal Error: csnet read operation failed (No such file or directory) (11773)" .
    Second time:
    MessageExpiredException: Message c9237200-0c69-2a80-dd11-79d5b47b213a(OUTBOUND) expired.
    Third Time :
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a" ""
    If you observe, when the call was made the second time, the MSGID was registered, but due to the server being down or some other reason it could not process any further. So the MSGID got registered, but the message itself was never processed. When they retried a third time and sent the same call again, they got the "DUPLICATE GUID" error.
    "DUPLICATE GUID" implies that the message has been processed and the records have been updated in the backend system, which has not actually happened here.
    Final result:
    The status in the webservice shows as "it has been updated in the receiving system", since it is indicated as a duplicate GUID.
    - But it has not been updated in the backend system, which is the problem.
    Firstly, are there any suggestions on how to solve this problem?
    Is there a better way to handle this duplicate check instead of the MSGID?
    Please help me in solving this.
    Thanks & Regards
    Deepthi.
    Edited by: deepthi reddy on Jan 7, 2009 2:07 AM

    >> My suggestion: You can have a Webservice - BW synch-synch synario without BPM. Sender SOAP adapter sending a synch req msg, getting it mapped to BW req and getting its response back from BW, then map it to webservice response and send to webservice without any receiver SOAP adapter.
    Thanks for the suggestion . Looks like a good Idea.
    >> Regarding the problem of duplicate check: see when your BW system gets req msg, it processes it and then sends a response msg.........in this response message have a STATUS field with values S for success and E for error - send this response to webservice and store it in database for that req MSGID ........now in web service application, check the response code for a MSGID and if it is E for error, then only resend the msg to BW.
    Initially they have planned the same way. But sometimes the response is getting very late back from BW. So now once the request reached to XI then immediately they are sending back the response from XI itself withhardcoded "OK" status. They are not waiting for BAPI response from BW.
    So If message is succesfull the Status will go as "OK"
         If message is not succesfull the status will go with blank. and this will indicate that there is some problem in the other side. Then they will check the SOAP Fault message which will have the errors like 
    "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a".
    "FAILED TO INVOKE WEB SERVICE OPERATION "
    Right now the issues are only with duplicate data and with the response time back to the webservice. By making use of the MSGID they solved the duplicate issue, but because of this we see many messages failing with this DUPLICATE error in production every day, which is hampering performance.
    So we are thinking of getting rid of this MSGID method and of BPM. From your first answer I think we can achieve it without BPM, and after that I need to check how fast the response goes back to the webservice.
    -Deepthi.

  • Delete records during import new records. Filtering / matching?

    During SRM MDM Catalogue import we are making use of the Import Server component.
    We defined one Port to import all supplier catalogues. In the port, we enter the Import map which is to be used for all suppliers. All suppliers deliver their catalogue in a predefined and fixed format.
    The functionality we want to achieve is more or less described at help.sap.com:
    http://help.sap.com/saphelp_srmmdm20/helpdata/en/30/96cbb9f3cd40b5bfc4cef178880e66/content.htm
    At the moment we have not succeeded in getting this done. The following example illustrates the issue:
    Current catalogue
    Supplier        Supplier part.no.
    1000            8000
    1000            8001
    1000            8002
    1001            8001
    1001            8001
    1001            8002
    Supplier 1000 comes with a new file containing the current assortment.
    1000     8000 has not changed
    1000     8001 is changed (update)
    1000     8002 no longer exists and is not part of the new file.
    During record matching we have the combined matching fields Supplier and Supplier part.no.
    A filter on all suppliers has the side effect that the other 4 products are deleted from the catalogue (so also 3 products from supplier 1001).
    A filter on only supplier 1000 is not wanted, because we would then need an individual import map for all suppliers. The result of this would also be: as many Ports, Import Server Ready folders and Import maps as we have suppliers in the catalogue.
    Wanted situation: how can we determine that entry 1000 - 8002 needs to be deleted? (Without the help of multiple Ports, Ready folders and Import maps.)
    We look forward to receiving your reply.

    The answer is given in Patch 4.
    Note 1292931. https://service.sap.com/sap/support/notes/1292931
    After this patch dynamic filtering can be done. See copy text from note below.
    For more info look at article on SDN Using Dynamic Record Filtering in MDM Import Manager.  https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d0245870-0adf-2b10-46a8-ca9dcf8e1a4d
    We have not yet implemented it, but it seems to solve the problem.
    Kind Regards,
    Job Jansen
    Extract Note 1292931:
    MDM Import Manager:
    Enhanced: Support for dynamic source/destination filter. Added a virtual value "[Mapped Values]" into the matching fields filter. When this value is selected, ImportManager and MDIS will substitute it with all the map values of the lookup field during record matching. This creates a generic map that supports source/destination filters that are based on map values instead of fixed values.
    Edited by: Job Jansen on Mar 19, 2009 3:36 PM

  • ITunes locks up frequently during importing process

    Problem started after successfully ripping about five CDs. Purged the start up sequence of programs that may have been intruding on the iTunes signal and that worked for a good spell. After ripping about 1300 songs, the problem started again and I'm getting tired of rebooting. I've tried the uninstall-reinstall option with no luck.

    I just downloaded iTunes 7.4.1.2 and have been debugging freeze-ups (hard ones that require a full system reboot) as well.
    Old or new CDs don't seem to make a difference.
    Error correction on or off: no diff.
    EZDC Creator 7.5 removed: no diff.
    All extraneous programs stopped (including AV): no diff.
    New memory in my PC: no diff.
    All other ripping software works OK (EAC, WMP, Creative Labs).
    Then I changed from the AAC 256 kbps encoding method to MP3 (320 kbps) and so far, it hasn't frozen.
    BTW, here's my CD diagnostic output.
    Microsoft Windows XP Home Edition Service Pack 2 (Build 2600)
    NVIDIA AWRDACPI
    iTunes 7.4.1.2
    CD Driver 2.0.6.1
    CD Driver DLL 2.0.6.2
    LowerFilters: PxHelp20 (2.0.0.0), drvmcdb (1.0.0.1), Cdr4_xp (7.5.0.47)
    UpperFilters: Cdralw2k (7.5.0.47), pwd_2k (7.5.0.47), GEARAspiWDM (2.0.6.1)
    Found aspi32 running.
    Current user is an administrator.
    Video Display Information:
    RADEON 9600 SERIES
    RADEON 9600 SERIES - Secondary
    Connected Device Information:
    Disk drive, MAXTOR ATLAS10K5_147WLS SCSI Disk Device, Bus Type SCSI, Bus Address [0,0]
    Disk drive, QUANTUM DS10W SCSI Disk Device, Bus Type SCSI, Bus Address [1,0]
    Disk drive, Apple iPod USB Device, Bus Type USB
    Disk drive, Generic USB CF Reader USB Device, Bus Type USB
    Disk drive, Generic USB MS Reader USB Device, Bus Type USB
    Disk drive, Generic USB SD Reader USB Device, Bus Type USB
    Disk drive, Generic USB SM Reader USB Device, Bus Type USB
    Disk drive, Maxtor 3200 USB Device, Bus Type USB
    CD-ROM Drive, PLEXTOR DVDR PX-708A, Bus Type ATA, Bus Address [0,0]
    Some computers need an update to the ATA or IDE bus driver, or Intel chipset. If iTunes has problems recognizing CDs or hanging or crashing while importing or burning CDs, check the support site for the manufacturer of your computer or motherboard.
    D: PLEXTOR DVDR PX-708A, Rev 1.12
    Audio CD in drive.
    Found 12 songs on CD, playing time 51:16 on CDROM media.
    Track 1, start time 00:02:00
    Track 2, start time 03:09:29
    Track 3, start time 07:48:07
    Track 4, start time 13:46:36
    Track 5, start time 17:10:22
    Track 6, start time 21:26:26
    Track 7, start time 27:08:69
    Track 8, start time 30:28:54
    Track 9, start time 35:02:43
    Track 10, start time 39:13:30
    Track 11, start time 43:41:16
    Track 12, start time 47:17:14
    Audio CD reading succeeded.
    Get drive speed succeeded.
    The drive CDR speeds are: 4 8 16 32 40.
    The drive CDRW speeds are: 4 8.
    The drive DVDR speeds are: 4 8 16.
    The drive DVDRW speeds are: 4 8 16.
    Writing CD text is turned on in the preferences. If you're having problems burning CDs, try turning this preference off.
    --Dale

  • Avoiding duplicate records in report

    Hi All,
                 I have a scenario where
    Delivery document gets created in R/3 say on 7/1 with Act GI date "#" and KFs are all "0". This gets loaded into BI.
    On "7/5" this is PGId and the status in R/3 changesto ACT GI date "7/5" and Qty of "100" . This when loaded into BI is getting published as dupicate records i.e.
    Del doc     Created date     Act GI     Del. Ind     Qty
    12345     1-Jul                         #        #     0
    12345     1-Jul                         5-Jul        #     100
    Please note that the data is getting loaded from DSO into Infocube and DSO is in overwrite mode.
    Any suggestions to overcome this problem.

    Is ACT GI date a key field in the DSO?
    If yes, the data will not be overwritten and two records will be loaded into the cube.
    Make ACT GI date a data field, which will result in only one record (12345  1-Jul  5-Jul  #  100) since the key field values are the same.
    First, make sure this is right for all business scenarios.

  • Restrict duplicate records during data entry in multi record detail block

    I have three fields (empno, edate, deptno) in a block OVERTIME_D. I want to prevent duplicate entries. Can you please guide me?

    hi,
    This should be helpful to you:
    http://sheikyerbouti.developpez.com/duplicates/duplicates.htm
    Mark helpful/correct.
    kanish
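
    Besides the Forms-level check described in that link, a simple database-side safety net (a sketch, assuming the block is based on a table also called OVERTIME_D) is a unique constraint on the three columns, so duplicates are rejected at commit even if the form logic misses them:

    -- Hypothetical table name; adjust to the block's base table.
    ALTER TABLE overtime_d
      ADD CONSTRAINT overtime_d_uk UNIQUE (empno, edate, deptno);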
