Error in duplicate record validation in an EntityObject using a unique key constraint

I have implemented unique key validation in an Entity Object by creating an alternate key on it.
The problem is that whenever a duplicate record is found, the duplicate record error is shown, but the page becomes blank and no error appears in the log.
I wanted to know the possible causes of this.
I am using JDeveloper 11.1.2.4.

After duplication, clear the PK item, then populate from the sequence in a PRE-INSERT block-level trigger.
Francois

Similar Messages

  • Master Delta Load - Error "38 Duplicate records found"

    Hi Guys,
    In one of our delta uploads for master data we are getting the error "38 Duplicate records found". I checked the PSA data but there were no duplicate records in it.
    Once the data upload fails I manually update the failed packet and it goes fine.
    Does anyone have a solution for this?

    Hi
    You can see the check box on the Processing tab page. Tick the Ignore Duplicate Data Records indicator. When multiple data records that have the same key are transferred, the last data record in the request is updated to BI. Any other data records in the request with the same key are ignored.
    Help says that:
    To maintain consistency, ignoring duplicate data records is only possible if the data is updated serially. A serial update is when data is first updated into the PSA then, after it has been successfully written to the PSA, it is updated into the master data or text tables of the InfoObject.
    If a DataSource transfers potentially duplicate data records or if you manually set the Ignore Duplicate Data Records indicator, the PSA Only update type is automatically selected in the scheduler.
    Hope it clears your doubt; otherwise let me know.
    You can see the same in my previous thread:
    Re: duplicate records found while loading master data (very urgent)
    Regards
    Kiran

  • Error:1 duplicate record found. 0 recordings used in table /BI0/XBPARTNER

    Hi everyone,
    I get a duplicate data records error when trying to load master data to an InfoObject.
    I've tried loading to the PSA and then to the InfoObject. The PSA doesn't show any errors, but loading to the InfoObject again raises the error.
    I've also tried loading to the PSA with the options 'Ignore double data records' and 'Update subsequently in data targets'. I still get the error.
    Any suggestions?
    Thanks in advance,
    Maarja

    Take a look at these links below....
    http://help.sap.com/saphelp_nw70/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
    http://help.sap.com/saphelp_nw70/helpdata/en/05/bb223905b61b0ae10000000a11402f/frameset.htm
    (Use option 1 C under Activities)

  • Error RSMPTEXTS~: Duplicate record during EHP5 upgrade in phase SHADOW_IMPORT_INC

    Hi experts,
    I found this error during an EHP5 upgrade in phase SHADOW_IMPORT_INC:
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    SHADOW IMPORT ERRORS and RETURN CODE in SAPK-701DOINSAPBASIS.ERD
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    2EETW000 Table RSMPTEXTS~: Duplicate record during array insert occured.
    2EETW000 Table RSMPTEXTS~: Duplicate record during array insert occured.
    1 ETP111 exit code           : "8"
    here also the last part of log SAPK-701DOINSAPBASIS.ERD
    4 ETW000 Totally 4 tabentries imported.
    4 ETW000 953984 bytes modified in database.
    4 ETW000  [     dev trc,00000]  Thu Aug 11 16:58:45 2011                                             7954092  8.712985
    4 ETW000  [     dev trc,00000]  Disconnecting from ALL connections:                                       28  8.713013
    4 ETW000  [     dev trc,00000]  Disconnecting from connection 0 ...                                       38  8.713051
    4 ETW000  [     dev trc,00000]  Closing user session (con=0, svc=0000000005C317C8, usr=0000000005C409A0)
    4 ETW000                                                                                8382  8.721433
    4 ETW000  [     dev trc,00000]  Detaching from DB Server (con=0,svchp=0000000005C317C8,srvhp=0000000005C32048)
    4 ETW000                                                                                7275  8.728708
    4 ETW000  [     dev trc,00000]  Now I'm disconnected from ORACLE                                        8648  8.737356
    4 ETW000  [     dev trc,00000]  Disconnected from connection 0                                            84  8.737440
    4 ETW000  [     dev trc,00000]  statistics db_con_commit (com_total=13, com_tx=13)                        18  8.737458
    4 ETW000  [     dev trc,00000]  statistics db_con_rollback (roll_total=0, roll_tx=0)                      14  8.737472
    4 ETW000 Disconnected from database.
    4 ETW000 End of Transport (0008).
    4 ETW000 date&time: 11.08.2011 - 16:58:45
    4 ETW000 1 warning occured.
    4 ETW000 1 error occured.
    1 ETP187 R3TRANS SHADOW IMPORT
    1 ETP110 end date and time   : "20110811165845"
    1 ETP111 exit code           : "8"
    1 ETP199 ######################################
    4 EPU202XEND OF SECTION BEING ANALYZED IN PHASE SHADOW_IMPORT_INC
    I've already tried using the latest version of R3trans.
    Can you help me?
    Thanks a lot,
    Franci

    Hello Francesca,
    I am also facing the same error while doing the EHP5 upgrade. If you know the steps to solve it, please tell me.
    Thanks,
    Venkat

  • How to delete duplicate records in a table without a primary key

    I have a table that contains around 1 million records, and there is no primary key or autonumber column. I need to delete the duplicate records from this table. What is a simple, effective way to do this?

    Please see this link:
    Remove duplicate records ...
    sqldevelop.wordpress.com
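    Not from the linked article, but the usual pattern is worth sketching: with no primary key you can fall back on the database's physical row identifier. A minimal sketch using Python's sqlite3, where the table and column names are invented and SQLite's implicit rowid stands in for Oracle's ROWID:

```python
import sqlite3

# Hypothetical table with no primary key; names are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (cust TEXT, item TEXT, qty INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [("A", "pen", 1), ("A", "pen", 1), ("B", "ink", 2)])

# Number the rows inside each group of identical records,
# then delete everything after the first copy.
con.execute("""
    DELETE FROM orders WHERE rowid IN (
        SELECT rowid FROM (
            SELECT rowid,
                   ROW_NUMBER() OVER (PARTITION BY cust, item, qty
                                      ORDER BY rowid) AS rn
            FROM orders)
        WHERE rn > 1)
""")
remaining = con.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

    The same ROW_NUMBER idea works on most databases that have window functions; on Oracle you would partition by the full column list and keep `rn = 1`.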

  • How to use Unique key constraint in EJB

    Hi All,
    I am using an entity bean to create my table in MySQL 4.1.13.
    I have a table called user which has userid, username, and password as its columns.
    The userid is the primary key here.
    But I want the username to be unique:
    i.e. if a user with the name 'Adrian' is present in the database, then another user with this username should not be created.
    I have been told that you can configure a unique key in jbosscmp-jdbc.xml using insert-after-ejb-post-create.
    But I don't know how to use it.
    Can anyone give me sample code for this?
    Any help would be appreciated.
    Thanks
    P2

    Please explain your problem better; I really can't figure out what you are trying to do.
    Are you choosing which SQL statement to use at runtime,
    or creating a dynamic SQL statement at runtime?
    Which?
    Regards
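    Independently of the JBoss CMP configuration (which varies by version), what the question needs at the database level is a UNIQUE constraint on username: the database itself then rejects a second 'Adrian'. A plain-SQL sketch via Python's sqlite3, mirroring the table from the question:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# The table from the question: userid is the primary key,
# username carries its own UNIQUE constraint.
con.execute("""
    CREATE TABLE user (
        userid   INTEGER PRIMARY KEY,
        username TEXT NOT NULL UNIQUE,
        password TEXT NOT NULL)
""")
con.execute("INSERT INTO user VALUES (1, 'Adrian', 'secret')")
try:
    # Second 'Adrian' with a different userid: the UNIQUE constraint fires.
    con.execute("INSERT INTO user VALUES (2, 'Adrian', 'other')")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    # A CMP container would surface this as a creation failure
    # (e.g. a DuplicateKeyException) rather than a raw SQL error.
    duplicate_rejected = True
```

    With the constraint in place in MySQL (`UNIQUE KEY (username)`), the container-level configuration only decides how the violation is reported, not whether duplicates are possible.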

  • Find matching records from two tables using three key fields, then replace two fields in table1 from table2

    I have two tables - table1 and table2 - that have the exact same schema. There are three fields that can be used to compare the data of the two tables, field1, field2, and field3. When there are matching rows in the two tables (table1.field1/2/3 = table2.field1/2/3)
    I want to replace table1.field4 with table2.field4 and replace table1.field5 with table2.field5.
    I have worked on the query but have come up with gobbledygook. I would appreciate any help. Thanks.

    If your field1, field2, and field3 combinations in these tables are unique, you
    can do a join on them.
    Select t1.field4, t2.field4, t1.field5, t2.field5
    from table1 t1 inner join table2 t2 on t1.field1 = t2.field1 and t1.field2 = t2.field2 and t1.field3 = t2.field3
    -- You can update your table1 with the following code:
    Merge table1 t1
    using table2 t2
    on t1.field1 = t2.field1 and t1.field2 = t2.field2 and t1.field3 = t2.field3
    When matched then
    Update Set
    t1.field4= t2.field4
    ,t1.field5 = t2.field5 ;
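    For databases without MERGE, the same matched-row update can be written as a correlated UPDATE. A minimal sketch using Python's sqlite3, with invented sample data:

```python
import sqlite3

con = sqlite3.connect(":memory:")
for t in ("table1", "table2"):
    con.execute(f"CREATE TABLE {t} (field1, field2, field3, field4, field5)")
con.execute("INSERT INTO table1 VALUES ('a', 'b', 'c', 'old4', 'old5')")
con.execute("INSERT INTO table1 VALUES ('x', 'y', 'z', 'keep4', 'keep5')")
con.execute("INSERT INTO table2 VALUES ('a', 'b', 'c', 'new4', 'new5')")

# For each table1 row with a (field1, field2, field3) match in table2,
# pull field4 and field5 across; unmatched rows stay untouched.
con.execute("""
    UPDATE table1 SET (field4, field5) =
        (SELECT t2.field4, t2.field5 FROM table2 t2
         WHERE t2.field1 = table1.field1 AND t2.field2 = table1.field2
           AND t2.field3 = table1.field3)
    WHERE EXISTS
        (SELECT 1 FROM table2 t2
         WHERE t2.field1 = table1.field1 AND t2.field2 = table1.field2
           AND t2.field3 = table1.field3)
""")
rows = con.execute("SELECT field4, field5 FROM table1 ORDER BY field1").fetchall()
```

    The WHERE EXISTS clause matters: without it, rows with no match would have their fields set to NULL by the subquery.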

  • Merge records from resultset by its unique key

    Java 1.6 - can someone tell me how to efficiently traverse a result set and merge the records by unique ID (coming from one-to-many related tables) without using a GROUP BY clause?

    Order the results by the relevant value so you can work with all the results for a given ID at the same time, then move on to the next ID.
    If you can't do that, then you have to keep all objects in memory, which, depending on the number of records, may cause memory problems.
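    The streaming approach described above can be sketched in Python with sqlite3 and itertools.groupby (the table and column names are invented): ordering by the key makes all rows for one ID adjacent, so each group is merged in a single pass without buffering the whole result set.

```python
import sqlite3
from itertools import groupby

con = sqlite3.connect(":memory:")
# Invented one-to-many sample: several child rows per parent_id.
con.execute("CREATE TABLE child (parent_id INTEGER, value TEXT)")
con.executemany("INSERT INTO child VALUES (?, ?)",
                [(1, "a"), (2, "c"), (1, "b"), (2, "d")])

# ORDER BY makes every row for one ID adjacent, so the result set
# can be merged in one streaming pass instead of holding all IDs.
cur = con.execute("SELECT parent_id, value FROM child ORDER BY parent_id, value")
merged = {pid: [v for _, v in rows]
          for pid, rows in groupby(cur, key=lambda row: row[0])}
```

    In Java the equivalent is a loop over the ordered ResultSet that starts a new merged object whenever the ID changes from the previous row.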

  • Error while deleting Records using ROWID

    Hi all,
    I have a small PL/SQL block which accepts a table name as user input and
    deletes duplicate records from that table using ROWID.
    SET SERVEROUTPUT ON
    SET VERIFY OFF
    PROMPT Script to Delete Duplicate Records from a Table using ROWID
    ACCEPT TAB_NAME PROMPT 'Enter the Table Name : '
    DECLARE
    v_tab_name VARCHAR2(50):='&TAB_NAME';
    BEGIN
    EXECUTE IMMEDIATE 'DELETE FROM v_tab_name AA WHERE AA.ROWID <> (SELECT MIN(B.ROWID) FROM v_tab_name B WHERE AA.ID=B.ID)';
    DBMS_OUTPUT.PUT_LINE('Duplicate Records Deleted');
    END;
    When I execute this script it errors out saying 'table or view does not exist'.
    Anybody's help is kindly appreciated.
    Regards
    Nakul.V

    Dear Nakul.V!
    Please change your EXECUTE IMMEDIATE statement as follows (the table name must be concatenated into the dynamic SQL string, not written inside it as a literal):
    EXECUTE IMMEDIATE 'DELETE FROM ' || v_tab_name || ' AA
                       WHERE AA.ROWID <> (SELECT MIN(B.ROWID)
                                          FROM ' || v_tab_name || ' B
                                          WHERE AA.ID = B.ID)';
    Yours sincerely
    Florian W.

  • How to rectify the error message " duplicate data records found"

    Hi,
    How do I rectify the error "duplicate data records found" when there is no PSA?
    Also, please give me a brief description of RSRV.
    Thanks in advance,
    Ravi Alakunlta

    Hi Ravi,
    In the InfoPackage screen, on the Processing tab, check the option Do Not Allow Duplicate Records.
    RSRV is used for repair and analysis purposes.
    If you find duplicate records in the F fact table, compress it; the duplicate records will then be summarized in the cube.
    Hope this helps.

  • Duplicate records error?

    Hello all,
    While extracting master data I am getting a duplicate records error.
    How do I rectify this?
    In the InfoPackage screen, on the Processing tab, will I get the option "Ignore double data records"?
    When will this option be enabled?
    Regards

    Hello
    This option is available only for master data, not for transactional data. You can control duplicate records for transactional data in the ODS; there is an option in the ODS settings.
    ***F1 Help
    Flag: Handling of duplicate data records
    From BW 3.0 you can determine for DataSources for master data attributes and texts whether the extractor transfers more than one data record in a request for a value belonging to time-independent master data.
    Independently of the extractor settings (the extractor potentially delivers duplicate data records) you can use this indicator to tell the BW whether or not you want it to handle any duplicate records.
    This is useful if the setting telling the extractor how to handle duplicate records is not active, but the system is told from another party that duplicate records are being transferred (for example, when data is loaded from flat files).
    Sankar

  • Error due to duplicate records

    Hello friends,
    I have done a full upload to a particular characteristic InfoObject using direct update (PSA and directly to data target). I used the option PSA and Subsequently into Data Target.
    When I load the data into the object through a process chain I get an error that duplicate records exist, and the request becomes red in the PSA.
    But no duplicate records exist in the data package, and when we try to manually load the records from the PSA to the data target it works fine.
    Can anyone throw some light on this error?
    Regards
    Sre....

    Hello Roberto and Paolo
    There was an OSS note saying we should not use that option, only PSA with Delete Duplicate Records and Update into Data Target.
    I don't know the exact reason.
    Can you throw some light on this, why is it like that?
    Thanks for the reply paolo and roberto
    Regards
    Sri

  • Error:Info record Validity date is Past in PO

    Hello,
    The PO has eight line items, but when I want to change the delivery date for the 5th and 6th line items the system gives the error message "Info record validity date is in the past", and because of this I am unable to change the delivery date in the PO.
    Please kindly suggest where I should check the settings in the info record, and why this error occurs.
    Please advise, since this is a matter of urgency.
    Regards
    sis

    Look up the error message number, then change the message from an error to a warning; this will solve your problem.
    Follow this path to change the message from E to W: SPRO > MM > Purchasing > Environment Data > Define Attributes of System Messages; click Execute, then look up your message.

  • Duplicate record issue in IP file

    Hello All,
    I have a requirement to handle duplicate records in an IP flat file.
    My interface is very simple: it just loads the data through an IP interface. There is no IP input query, just a flat-file load.
    My requirement is to apply a validation check: if the file has two similar records, the file should be rejected.
    For example:
    Field1   Field2   Amount
    XXX      ABC      100
    XXX      ABC      100
    This file should be rejected. As per the standard functionality, the data is summed up in the cube.
    Is there any way to handle that?
    Thanks
    Samit

    I don't think you can do it; this is standard behaviour. Maybe you can write your own class to check for duplicate records, use that class in a custom planning function, and throw an error message there.
    Best is to make sure the end users take responsibility for the data.
    Arun
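    The duplicate check suggested above is straightforward outside BI-IP: count whole records and reject the file when any record repeats. A Python sketch; the file layout is an assumption based on the example in the question:

```python
from collections import Counter

# Invented file content mirroring the example in the question;
# the real IP flat-file layout may differ.
records = [
    "XXX  ABC  100",
    "XXX  ABC  100",
    "YYY  DEF  200",
]

# Count whole records; any count above 1 means the file carries a
# duplicate and should be rejected before the cube sums the amounts.
counts = Counter(records)
duplicates = [rec for rec, n in counts.items() if n > 1]
file_rejected = bool(duplicates)
```

    A custom planning-function class could apply the same counting logic to the incoming data package and raise an error message listing the offending records.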

  • Setting change to correct duplicate records ..

    Hi Experts,
    > I tried loading a flat file to an ODS in the quality system, and I got some duplicate rows which I had to correct manually in the PSA.
    > When I tried loading the same flat file to the same ODS in production, I did not get the 'duplicate records' error message.
    Can you please tell me what setting has to be changed in quality to allow duplicate records, or to correct them.
    Help would definitely be rewarded with points.
    Thanks ,
    Santosh ....

    Hi Santhosh,
    Remove the check for unique data records.
    <i><b>Unique Data Records</b></i>
    With the Unique Data Records indicator, you determine whether only unique data records are to be updated to the ODS object. This means that you cannot load a data record into the ODS object the key combination for which already exists in the system – otherwise a termination occurs. Only use this setting when you are sure that only unique data records are to be loaded into the ODS object (for example, single documents). A typical application of this is in the loading of mass data. It improves the load performance.
    You can also deselect this indicator again (even if data has already been loaded into the ODS object). This can be necessary if you want to re-post deleted data records using a repair request (see: Tab Page: Updating). In this case, you need to deselect the Unique Data Records indicator before posting the repair request, following which you can then reset the Unique Data Records indicator once more. The regeneration of metadata of the Export DataSource, which takes place when the ODS object is reactivated, has no effect on the existing data mart delta method.
    More info @ <a href="http://help.sap.com/saphelp_nw04/helpdata/en/a6/1205406640c442e10000000a1550b0/content.htm">ODS Object Settings</a>
    Hope it Helps
    Srini
