Fail an Interface If Duplicate Records Are Present

Hi All,
I am using ODI 11g.
I have a flat file to table interface using the IKM SQL Control Append, with the Distinct option checked, the Insert, Commit, and Truncate options set to TRUE, and Flow Control set to FALSE.
Suppose I have 10 records, of which 2 are duplicates; my interface currently passes 8 records on to the next level.
But I want the interface to fail if duplicate records are present.
How can I do this?
Thanks,
Lony

Keep Flow_Control set to False and define a constraint on that column in the target; the interface should then fail.
Setting Flow_Control to True would instead exclude these rows from the data flow, so the interface would not fail; those records would go into the error table.
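For example, a minimal sketch (target_table and cust_id are hypothetical names; note also that with the Distinct option checked, exact duplicate rows are collapsed before the insert, so a constraint like this catches duplicates on the key column only):

-- Hypothetical target table/column; adapt to your model.
ALTER TABLE target_table
  ADD CONSTRAINT uq_target_cust_id UNIQUE (cust_id);

-- With Flow_Control set to False, the IKM SQL Control Append insert
-- step raises ORA-00001 on duplicate keys and the interface fails.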
Regards,
Rickson Lewis

Similar Messages

  • Job fails: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL

    Dear All,
    A job loads from a BW ODS to a BW master data InfoObject and ODS. It always fails with the same message: 9 duplicate record found. 73 recordings used in table /BIC/XZ_DEAL (Z_DEAL is the InfoObject).
    When I rerun it, the job finishes with no error.
    Please help me solve this problem.
    Thanks,
    Phet

    Hi,
    What is the InfoObject name?
    Regards,
    Goutam

  • While loading master data to an InfoObject, the load failed due to duplicate records

    Hi Experts,
    While loading master data to the InfoObject, the load failed.
    The error it shows is: 24 Duplicate record found. 23 recordings used in table.
    Please help me solve this issue.
    Thanks in advance.
    Regards,
    Gopal.

    In the InfoPackage settings you will find a checkbox for 'delete duplicate records'.
    I think it appears beside the radio button for 'To PSA'; also tick the checkbox for 'subsequent update to data targets'.
    This will remove any duplicate records from the PSA before they are processed further by the transfer and update rules.
    Use this and reload the master data.
    cheers,
    Vishvesh

  • Data Load Fails due to duplicate records from the PSA

    Hi,
    I have loaded the master data twice into the PSA. Then I created a DTP to load the data from the PSA to the InfoProvider. The data load is failing with the error "duplicate key/records found".
    Is there any setting I can configure so that, even though I have duplicate records in the PSA, I can successfully load only one set of data (without duplicates) into the InfoProvider?
    How can I set up the process chains to do so?
    Your answer to the above two questions is appreciated.
    Thanks,

    Hi Sesh,
    There are two places where the DTP checks for duplicates.
    First, it checks previous error stacks. If the records you are loading are still contained in the error stack of a previous DTP run, it will throw the error at this stage. In this case you first have to clean up the previous error stack.
    Second, it cleans up duplicates across data packages, provided the option is set in your DataSource. Note, however, that this does not solve the problem if you have duplicates within the same data package. In that case you can do the filtering yourself in the start routine of your transformation.
    Hope this helps,
    Pieter

  • Master data failing with Duplicate records

    Dear All,
    Daily, the ODS XD0SDO08 is loaded with four requests.
    While loading the master data delta from ODS XD0SDO08 to 0Doc_Number and cube XD_SDC08, it regularly fails with the error "54 duplicate record found. 1130 recordings used in table /BI0/XDOC_NUMBER". But after deleting the request from 0Doc_Number and the cube XD_SDC08 and reconstructing from the PSA, it is successful.
    If I check the PSA, there are a few records in which each sales document number has two after-image records and two before-image records. If I count these records, they are almost equal to the number of duplicate records reported in the error message.
    As we are loading to cube XD_SDC08 and 0Doc_Number, we do not have the option of Ignore Duplicate Records in the InfoPackage.
    Please suggest a solution, as I currently have to delete the request manually and reconstruct it daily.
    Regards,
    Ugendhar

    Hi Ugendhar,
    As you say the data loads successfully into the cube but not into the InfoObject, check whether the InfoObject has records in both the M (modified) version and the A (active) version; sometimes a load fails with duplicate data records because of this. Run an attribute change run for that InfoObject and check again.
    If that is not successful either, and as per your update the load works after reconstruction, there may be a problem in the update rule; check it once.
    Another thing: you will not get the Ignore Duplicate Data Records option here because the InfoObject is marked as a data target.
    If you think my suggestion is helpful, assign points to this thread.
    regards,
    Gurudatt Bellary

  • Master data failed with error 'too many duplicate records'

    Dear all,
    Below is the error message:
    Data records for package 1 selected in PSA - error 4 in the update
    The long text is as follows:
    Error 4 in the update
    Message no. RSAR119
    Diagnosis
    The update delivered the error code 4.
    Procedure
    You can find further information on this error in the error message of the update.
    We are working on BI 7.0.
    Any solutions?
    Thanks,
    Satish A.

    Hi,
    Go through these threads; they cover the same issue:
    Master data load: Duplicate Records
    Re: Master data info object - duplicate records
    Re: duplicate records in master data info object
    Regards
    Raj Rai

  • Master data load failed due to duplicate records

    Hello friends,
    I need some help.
    I am loading master data from the source system, and it is throwing an error about 56 duplicate records.
    I repeated the step, but got the same error again.
    I could not find the duplicate records, as there are more than 24,000 records, of which these 56 are duplicates, and the duplicates look the same as the originals.
    When I click on the error records, it shows me the procedure below:
    maintain the attribute in the PSA screen.
    I could not find the duplicate records; could you please let me know how I can maintain this?
    Regards

    Hi,
    Reload the master data with the Ignore Duplicate Records checkbox checked. Since master data has overwrite capability, the duplicate records will be overwritten.
    cheers,
    Swapna.G

  • Master Data load fails because of Duplicate Records

    Hi BW Experts,
    I am loading historical data for an InfoObject using flexible update. First I tried to delete the data, but that was not possible as it is being used in InfoCubes and an ODS. As I am reworking those cubes and the ODS, I have to reload all the data again. Anyway, without deleting, I tried loading the data into the InfoObject, but it threw an error that duplicate records were found. I tried again, and it then threw the error that ALEREMOTE has locked the object (lock not set for the object).
    Please suggest what to do in this scenario.
    Please consider it urgent.
    Thanks in advance.
    Sunil Morwal

    Sunil,
    First, unlock the objects: go to SM12, give the ALEREMOTE user name, then List; select the request and delete it.
    Load the data from the PSA...
    Or reload; remember you have the option Ignore Duplicate Records on the Processing tab at the InfoPackage level.
    Let me know the status.
    Thanks
    Ram
    "BW is Everywhere"

  • Duplicate records in Infoobject

    Hi gurus,
    While loading into an InfoObject, it throws an error that 150 duplicate records were found (shown in the Details tab). My doubt is about the temporary correction: in the InfoPackage we can either increase the error handling number and load, or choose the third option (valid records update) and update to the InfoObject.
    My question is this: when the error occurred, I could see some records under "added". After the error occurred I forced the request to red (delta), deleted it, chose the third option (valid records update), and updated to the InfoObject. The load was successful and all the records were transferred, but the number of records added is 0. I also read in an article that the error records are stored as a separate request in the PSA, but I cannot find that either. I want to know when and how the new records will be added.

    Hi Jitu,
    If it is only an InfoObject, then under the Processing tab select only PSA and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. Then follow the steps below:
    1. Go to transaction RSRV, expand Master Data, and double-click the second option.
    2. Click the option on the right side of the panel. A pop-up window appears; give the name of the InfoObject that failed.
    3. Select that particular InfoObject and click the Delete button; if necessary, save and click Execute (do this only if necessary). Usually deleting resolves the problem. After clicking Delete, go back to the monitor screen and process the data packet manually by clicking the wheel button.
    4. Select the request and click the Read Manually button (the wheel button).
    Don't forget to save the InfoPackage back, selecting only the InfoObject.
    Hope this works. Let me know if you have doubts.
    Assign points if you are able to find the solution.
    Regards,
    Manjula

  • Query to find duplicate records (Urgent!!!!)

    Hi,
    I have to load data from a staging table to a base table, but I don't want to load data already present in the base table. The criterion for identifying duplicate data is two fields, say v_id and status_flag. If these two are the same in both the staging and base tables, then that record must be rejected as a duplicate.
    Kindly help me with the SQL I need to use in a procedure.
    Thanks

    Hello
    Another alternative would be to use MINUS if the table structures match:
    -- Source rows; the first 5 are also in the destination table
    SQL> select * from dt_test_src;
    OBJECT_ID OBJECT_NAME
       101081 /1005bd30_LnkdConstant
        90723 /10076b23_OraCustomDatumClosur
        97393 /103a2e73_DefaultEditorKitEndP
       106075 /1048734f_DefaultFolder
        93337 /10501902_BasicFileChooserUINe
        93013 /106faabc_BasicTreeUIKeyHandle
        94929 /10744837_ObjectStreamClass2
       100681 /1079c94d_NumberConstantData
        90909 /10804ae7_Constants
       102543 /108343f6_MultiColorChooserUI
        92413 /10845320_TypeMapImpl
        89593 /10948dc3_PermissionImpl
       102545 /1095ce9b_MultiComboBoxUI
        98065 /109cbb8e_SpanShapeRendererSim
       103855 /10a45bfe_ProfilePrinterErrors
       102145 /10a793fd_LocaleElements_iw
        98955 /10b74838_SecurityManagerImpl
       103841 /10c906a0_ProfilePrinterErrors
        90259 /10dcd7b1_ProducerConsumerProd
       100671 /10e48aa3_StringExpressionCons
    20 rows selected.
    Elapsed: 00:00:00.00
    --Destination table contents
    SQL> select * from dt_test_dest
      2  /
    OBJECT_ID OBJECT_NAME
        101081 /1005bd30_LnkdConstant
         90723 /10076b23_OraCustomDatumClosur
         97393 /103a2e73_DefaultEditorKitEndP
        106075 /1048734f_DefaultFolder
         93337 /10501902_BasicFileChooserUINe
    Elapsed: 00:00:00.00
    -- Try inserting everything; this fails because of the duplicates
    SQL> insert into dt_test_dest select * from dt_test_src;
    insert into dt_test_dest select * from dt_test_src
    ERROR at line 1:
    ORA-00001: unique constraint (CHIPSDEVDL1.DT_TEST_PK) violated
    Elapsed: 00:00:00.00
    --now use the minus operator to "subtract" rows from the source set that are already in the destination set
    SQL> insert into dt_test_dest select * from dt_test_src MINUS select * from dt_test_dest;
    15 rows created.
    Elapsed: 00:00:00.00
    SQL> select * from dt_test_dest;
    OBJECT_ID OBJECT_NAME
        101081 /1005bd30_LnkdConstant
         90723 /10076b23_OraCustomDatumClosur
         97393 /103a2e73_DefaultEditorKitEndP
        106075 /1048734f_DefaultFolder
         93337 /10501902_BasicFileChooserUINe
         89593 /10948dc3_PermissionImpl
         90259 /10dcd7b1_ProducerConsumerProd
         90909 /10804ae7_Constants
         92413 /10845320_TypeMapImpl
         93013 /106faabc_BasicTreeUIKeyHandle
         94929 /10744837_ObjectStreamClass2
         98065 /109cbb8e_SpanShapeRendererSim
         98955 /10b74838_SecurityManagerImpl
        100671 /10e48aa3_StringExpressionCons
        100681 /1079c94d_NumberConstantData
        102145 /10a793fd_LocaleElements_iw
        102543 /108343f6_MultiColorChooserUI
        102545 /1095ce9b_MultiComboBoxUI
        103841 /10c906a0_ProfilePrinterErrors
        103855 /10a45bfe_ProfilePrinterErrors
    20 rows selected.
    You could use that in conjunction with a MERGE statement to exclude all truly duplicated rows and then update any rows that match on the ID but have different statuses:
    MERGE INTO dest_table dst
    USING (SELECT v_id,
                  col1,
                  col2  -- etc.
           FROM   staging_table
           MINUS
           SELECT v_id,
                  col1,
                  col2  -- etc.
           FROM   dest_table) stg
    ON (dst.v_id = stg.v_id)
    WHEN MATCHED THEN
      UPDATE SET dst.col1 = stg.col1,
                 dst.col2 = stg.col2  -- etc.
    WHEN NOT MATCHED THEN
      INSERT (v_id, col1, col2)
      VALUES (stg.v_id, stg.col1, stg.col2);
    HTH
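    An alternative sketch, using the table and column names described in the question (col1 stands in for the remaining columns), is an anti-join insert that only loads staging rows whose (v_id, status_flag) pair is not already in the base table:
    -- Load only rows not already present, per the question's duplicate criteria.
    INSERT INTO base_table (v_id, status_flag, col1)
    SELECT s.v_id, s.status_flag, s.col1
    FROM   staging_table s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   base_table b
                       WHERE  b.v_id = s.v_id
                       AND    b.status_flag = s.status_flag);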

  • Duplicate records found while loading master data (very urgent)

    Hi all,
    One InfoPackage in the process chain failed while loading the master data (full update). It shows the following error: duplicate record found. 1 record used in /BI0/PTCTQUERY, and the same record occurs in the /BI0/PTCTQUERY table.
    Can anyone give me the solution? It is very urgent.
    Thanks & Regards,
    Manjula

    Hi
    You can see the checkbox on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records with the same key are transferred, the last data record in the request is updated to BI; any other data records in the request with the same key are ignored.
    The help says:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA and then, after it has been successfully written to the PSA, updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records, or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope that clears your doubt; otherwise let me know.
    Regards
    Kiran

  • Data Loader inserting duplicate records

    Hi,
    There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning. The script performs an insert operation.
    Every morning a file with new insert data is made available in the same location (generated by someone else) with the same name. The Data Loader script must insert all the records in it.
    One morning there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external ID, since the records come from another system, but I have learned that the option works only for update operations.
    How can a situation like this be handled in future? The external ID should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow specify the field as 'unique' in the UI so that there is an error if a duplicate record is inserted? Please suggest.
    Regards,

    Hi
    You can use something like this:
    CURSOR crs IS SELECT DISTINCT deptno, dname, loc FROM dept;
    Now you can insert all the records present in this cursor.
    Assumption: you do not have duplicate entries in the dept table initially.
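    A minimal PL/SQL sketch of that idea (dept_stage is a hypothetical staging table, and deptno is assumed to be the duplicate-check key):
    -- Insert distinct staging rows, skipping any deptno already in dept.
    BEGIN
      FOR r IN (SELECT DISTINCT deptno, dname, loc FROM dept_stage) LOOP
        INSERT INTO dept (deptno, dname, loc)
        SELECT r.deptno, r.dname, r.loc
        FROM   dual
        WHERE  NOT EXISTS (SELECT 1 FROM dept d WHERE d.deptno = r.deptno);
      END LOOP;
      COMMIT;
    END;
    /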
    Cheers
    Sudhir

  • SQL*Loader loads duplicate records even when there is a PK defined

    Hi,
    I have created a table with a PK on one of the columns and loaded it using SQL*Loader. The flat file has duplicate records. It loaded all the records without any error, but now the index has become unusable.
    The requirement is to fail the process if there are any duplicates. Please help me understand why this is happening.
    Below is the ctl file:
    OPTIONS (DIRECT=TRUE, ERRORS=0)
    UNRECOVERABLE
    LOAD DATA
    INFILE 'test.txt'
    INTO TABLE abcdedfg
    REPLACE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (col1,
     col2)
    The PK is defined on col1.

    Check out:
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#sthref1457
    It states:
    During a direct path load, some integrity constraints are automatically disabled. Others are not. For a description of the constraints, see the information about maintaining data integrity in the Oracle Database Application Developer's Guide - Fundamentals.
    Enabled Constraints
    The constraints that remain in force are:
    NOT NULL
    UNIQUE
    PRIMARY KEY (unique constraints on not-null columns)
    Since the OP had the primary key in place before starting the DIRECT path load, this is contradictory to what the documentation says, or probably a bug?
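    The behaviour itself is expected, though: during a direct path load, unique and primary key constraints are verified only when the index is rebuilt at the end of the load, so the offending rows are loaded anyway and the index is left unusable. If the requirement is that any duplicate should fail the load, one option (a sketch reusing the poster's control file) is a conventional path load, where each row is checked against the enabled primary key as it is inserted and, with ERRORS=0, the first ORA-00001 aborts the run:
    -- Conventional path (no DIRECT=TRUE): the PK is checked per row.
    OPTIONS (ERRORS=0)
    LOAD DATA
    INFILE 'test.txt'
    INTO TABLE abcdedfg
    REPLACE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (col1,
     col2)
    -- Note: rows committed before the failing record remain loaded;
    -- load into a staging table first if the run must be all-or-nothing.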

  • Duplicate records - problems

    Hi gurus
    We created a text DataSource in R/3 and replicated it into BW 7.0.
    An InfoPackage (loading to PSA) and a Data Transfer Process were created and included in a process chain.
    The job failed because of duplicate records.
    We have now discovered that the "Delivery of Duplicate Records" setting for this DataSource in BW is set to "Undefined".
    When creating the DataSource in R/3, there were no settings for the "Delivery of Duplicate Records".
    In BW, I've tried to change the "Delivery of Duplicate Data Records" setting to NONE, but when I go into change mode, the "Delivery of Duplicate" setting is not changeable.
    Does anyone have any suggestion on how to solve this problem?
    Thanks,
    @nne Therese

    Hi Muraly,
    I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta, with the option "valid records update, no reporting (request red)".
    It was running fine for the last few weeks: the transferred and added records matched the PSA request every day.
    Suddenly the load to the InfoObject failed. I deleted the request from the InfoObject and reloaded using the DTP; it failed again. I tried loading with full update, as it is texts, and it failed again. When I analysed the error, it said duplicate records. So I changed the DTP by checking the option Handle Duplicate Records and loaded with full update. It worked fine: it transferred more than 50,000 records, and the added records exactly matched the PSA request.
    I reset the DTP back to delta and loaded today, but the transferred records are 14,000 and the added records (3,000) match the PSA request. Normally, if you look at the load history, the numbers of transferred and added records in the InfoObject match the number of records in the PSA request every day.
    Why this difference now? In production I have no issues. Since I changed the DTP, will transporting it to production make any difference? This is my first time on BI 7.0.
    Please advise, and correct me if I am wrong.
    Thanks,
    Sudha..

  • Import Transaction Data - Duplicate records

    Hi,
    I need to upload a file of transaction data into BPC using a Data Manager package. I've created the transformation and conversion files, which validate successfully on a small data set. When I try to upload the real-life data file, it fails due to duplicate records. This happens because multiple external IDs map to one internal ID. So, while there are no duplicates in the actual file produced by the client, the data produced after conversion does contain duplicates and therefore will not upload.
    Apart from asking the client to perform the aggregation before sending me the file, is there any way to get BPC to allow the duplicates and simply sum them up?
    Regards
    Sue

    Hi,
    Try adding the delivered package /CPMP/APPEND and run it. This should solve your problem.
    Thanks,
    Sreeni
