BPC Database Duplicate Records

Hi All,
Currently I am facing a problem where the BPC database appears to duplicate each record 3 times compared with the script log.
In the script log the figure looks like this:
Forecast      EntityA      ContractA      2010.Jan     Revenue        USD     1000
In the database it becomes:
Forecast      EntityA      ContractA      2010.Jan     Revenue        USD     1000
Forecast      EntityA      ContractA      2010.Jan     Revenue        USD     1000
Forecast      EntityA      ContractA      2010.Jan     Revenue        USD     1000
The application has 13 dimensions, and one of the dimensions has about 7,000 members.
I have tried to optimise the application; however, this did not solve the problem.
Have you come across similar problems before? If so, I would appreciate it if you could suggest what the cause could be and how to resolve it. We are using BPC MS version 7.0.
Thanks!

Hi All,
Following is the script logic. The objective is to determine which end date the user is using: END3a or END2a.
Member GCHURN is set to 2 if the user is using END2a, and to 3 if the user is using END3a.
*WHEN CONTRACTD.END3a
*IS ""
    *WHEN TIME.TIMEID
    *IS >= CONTRACTD.END2a
        // Set GCHURN to 2 -> an indicator that the user is using END2a instead of END3a
        *REC(ACCOUNT="GCHURN",NOADD,EXPRESSION=2)
    *ENDWHEN  // end WHEN TIME.TIMEID (END2a)
*ELSE  // CONTRACTD.END3a <> blank
    *WHEN TIME.TIMEID
    *IS >= CONTRACTD.END3a
        *REC(ACCOUNT="GCHURN",NOADD,EXPRESSION=3)
    *ENDWHEN  // end WHEN TIME.TIMEID (END3a)
*ENDWHEN  // end WHEN CONTRACTD.END3a
I have already run the above script in the debugger and against the real database. The log from both shows the record being generated once, but in the database I see 3 records generated instead, as shown below:
Script log and debugger:
GCHURN,BUDGET_V5,ContractA,E2100,AUD,2010.FEB,2
The database shows the following results:
ACCOUNT   CATEGORY    ENTITY   TIMEID     SIGNEDDATA   SOURCE   CONTRACTD   INPUTCURRENCY
==========================================================================================
GCHURN    BUDGET_V5   E2100    20100200   2            0        ContractA   AUD
GCHURN    BUDGET_V5   E2100    20100200   2            0        ContractA   AUD
GCHURN    BUDGET_V5   E2100    20100200   2            0        ContractA   AUD
FYI, this script has been in use since October 2009; we only started facing this problem this month.
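A quick way to confirm what is actually stored is to count identical key combinations directly in the write-back table. A minimal T-SQL sketch for BPC 7.0 MS, assuming the write-back table follows the usual tblFactWB<AppName> naming and listing only the columns shown above (a 13-dimension application will have more):

SELECT [ACCOUNT], CATEGORY, ENTITY, TIMEID, CONTRACTD, INPUTCURRENCY,
       COUNT(*) AS copies, SUM(SIGNEDDATA) AS total
FROM   tblFactWBMyApp   -- assumption: replace MyApp with the application name
GROUP  BY [ACCOUNT], CATEGORY, ENTITY, TIMEID, CONTRACTD, INPUTCURRENCY
HAVING COUNT(*) > 1     -- key combinations stored more than once

If a key sent once by the script shows copies = 3 here, the duplication happened at write time rather than in reporting.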

Similar Messages

  • Duplicate records in Fact Tables

    Hi,
We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and FAC2 tables. We faced a similar issue before, and the solution was to reboot the server and clean up the additional data created. I don't think it is an issue with the script logic files we have, as we had the issue across all applications. The data is fine now after the server reboot and running the same logic files. I want to know if anyone else has faced this issue and if there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

Hi Sorin,
I know this thread is rather old, but I have a problem which is closely related to it and would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
It is a heavily customized system with many batch files running daily to update dimensions, copy data and so on. And yes, we do use custom packages that incorporate stored procedures.
Recently, with no change in the environment, our FACTWB table ballooned out of nowhere. The FACT table contains less than 1 GB of data, but FACTWB has 200 GB and has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
We have not been able to find out what caused this, or whether the 200 GB of records in WB are even valid records that are duplicated. Is there a way to troubleshoot this?
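One way to start troubleshooting is to measure the write-back table and compare its raw row count with the number of distinct dimension-key combinations; a large gap suggests the growth is duplicated rows rather than valid data. A minimal T-SQL sketch, assuming the usual tblFactWB<AppName> naming and illustrative dimension columns (adjust both to your application):

EXEC sp_spaceused 'tblFactWBMyApp';   -- actual size of the write-back table

SELECT COUNT(*) AS total_rows,
       (SELECT COUNT(*)
        FROM (SELECT DISTINCT [ACCOUNT], CATEGORY, ENTITY, TIMEID, INPUTCURRENCY
              FROM tblFactWBMyApp) d) AS distinct_keys
FROM   tblFactWBMyApp;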

  • Check for duplicate record in SQL database before doing INSERT

    Hey guys,
This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL insert itself. I need to check the database PRIOR to doing the insert for duplicate records, and if a record already exists it needs to be overwritten. I'm not sure how to accomplish this task. My back end is a SQL 2000 Server. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if a file dumped into the directory starts with 'I' it gets written to the SQL database; otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in, haha.
    Any help is appreciated.
    Thanks in Advance
    Rich T.
#### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
$cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
$bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
$filter = '*.tif'
$cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
$bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }

#### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
    $name = $Event.SourceEventArgs.Name
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated

    #### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
    $test = $name.StartsWith("I")
    if ($test -eq $true) {
        # file names like I<item>L<lot>.tif: split item and lot out of the name
        $pos = $name.IndexOf(".")
        $left = $name.Substring(0, $pos)
        $pos = $left.IndexOf("L")
        $tempItem = $left.Substring(0, $pos)
        $lot = $left.Substring($pos + 1)
        $item = $tempItem.Substring(1)
        Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timeStamp" -fore green
        Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timeStamp"
        Start-Sleep -s 5
        $conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
        $conn.Open()
        $insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timeStamp')"
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = $insert_stmt
        $cmd.ExecuteNonQuery()
        $conn.Close()
    }
    elseif ($test -eq $false) {
        #### PACKAGE SHIPPER PROCESS BEGINS ####
        # non-'I' files: strip the leading character, split ship id and order at 'O'
        $pos = $name.IndexOf(".")
        $left = $name.Substring(0, $pos)
        $pos = $left.IndexOf("O")
        $tempItem = $left.Substring(0, $pos)
        $order = $left.Substring($pos + 1)
        $shipid = $tempItem.Substring(1)
        Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timeStamp" -fore green
        Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timeStamp"
        # (the Access-table write for this branch was not included in the original post)
    }
}
    Rich Thompson

    Hi
Since SQL Server 2000 is out of support, I recommend you upgrade SQL Server 2000 to a higher version, such as SQL Server 2005 or SQL Server 2008.
According to your description, you can try the following methods to check for duplicate records in SQL Server.
1. You can use RAISERROR: if the record exists, RAISERROR; otherwise insert accordingly. A code block is given below:
IF EXISTS (SELECT 1 FROM TableName AS t
           WHERE t.Column1 = @Column1
             AND t.Column2 = @Column2)
BEGIN
    RAISERROR('Duplicate records', 18, 1)
END
ELSE
BEGIN
    INSERT INTO TableName (Column1, Column2, Column3)
    SELECT @Column1, @Column2, @Column3
END
2. Alternatively, you can create a UNIQUE INDEX or UNIQUE CONSTRAINT on the column(s) of the table; when you try to INSERT a value that conflicts with the index/constraint, an exception will be thrown.
    Add the unique index:
    CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)
    Add the unique constraint:
    ALTER TABLE TableName
ADD CONSTRAINT Unique_Constraint_Name
    UNIQUE (ColumnName)
    Thanks
    Lydia Zhang
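Note that the original requirement was to overwrite an existing record rather than reject it. A minimal sketch of that variant, which also works on SQL Server 2000 (table and column names are placeholders):

IF EXISTS (SELECT 1 FROM TableName AS t
           WHERE t.Column1 = @Column1
             AND t.Column2 = @Column2)
BEGIN
    -- duplicate found: overwrite the existing row instead of raising an error
    UPDATE TableName
    SET    Column3 = @Column3
    WHERE  Column1 = @Column1
      AND  Column2 = @Column2
END
ELSE
BEGIN
    INSERT INTO TableName (Column1, Column2, Column3)
    VALUES (@Column1, @Column2, @Column3)
END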

  • How to delete duplicate records in all tables of the database

    I would like to be able to delete all duplicate records in all the tables of the database. Many of the tables have LONG columns.
    Thanks.

    Hello
    To delete duplicates from an individual table you can use a construct like:
    DELETE FROM
        table_a del_tab
    WHERE
        del_tab.ROWID <> (SELECT
                                MAX(dups_tab.ROWID)
                          FROM
                                table_a dups_tab
                          WHERE
                                dups_tab.col1 = del_tab.col1
                          AND
                                dups_tab.col2 = del_tab.col2
                          )
You can then apply this to any table you want. The only differences will be the columns that you join on in the subquery. If you want to look for duplicated data in the LONG columns themselves, I'm pretty sure you're going to need to do some PL/SQL coding, or maybe convert them to BLOBs or something.
    HTH
    David
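For what it's worth, the same deduplication can be written with an analytic function, which some find easier to extend when the duplicate key spans many columns. A sketch of that alternative, using the same table and columns as David's example:

DELETE FROM table_a
WHERE ROWID IN (
    SELECT rid
    FROM (SELECT ROWID AS rid,
                 ROW_NUMBER() OVER (PARTITION BY col1, col2
                                    ORDER BY ROWID) AS rn
          FROM table_a)
    WHERE rn > 1   -- every copy after the first is deleted
);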

  • The ABAP/4 Open SQL array insert results in duplicate Record in database

    Hi All,
I am trying to transfer 4 plants from R/3 to APO. The integration model contains only these 4 plants. However, a queue gets generated in APO saying 'The ABAP/4 Open SQL array insert results in duplicate record in database'. I checked tables /SAPAPO/LOC, /SAPAPO/LOCMAP & /SAPAPO/LOCT for duplicate entries, but no duplicates were found.
Can anybody guide me on how to resolve this issue?
    Thanks in advance
    Sandeep Patil

Hi Sandeep,
Now try to delete your locations before activating the integration model again.
Use the program /SAPAPO/DELETE_LOCATIONS to delete locations.
Note:
1. Set the deletion flag (in /SAPAPO/LOC: Location -> Deletion Flag).
2. Remove all the dependencies (like transportation lanes, models, ...).
Check now and let me know.
Regards,
Siva.

  • Duplicate Records error when processing transaction file....BPC 7.0

    Hi All,
I have a situation. I am using BPC NW 7.0 and I have updated my dimension files. When I try to validate my transaction file, every single record is validated successfully. But when I try to import the flat file into my application, I get a lot of duplicate-record errors, and these are my questions:
1. Can we get duplicate records in transaction files?
2. Even if there are duplicates, since it is a cube, shouldn't it summarize them rather than flagging an error and rejecting the records?
3. Is there something I can do to accept duplicates? (I have checked the Replace option in the data package to overwrite similar records, but it is only for account, category and entity.)
4. In my case I see identical values in all my dimensions and the $ value is the only difference. Why is it not summing up?
Your quickest reply is much appreciated.
Thanks,
Alex.

Hi,
I have the same problem.
In my case the file that I want to upload has different rows that differ only in the nature column. In the conversion file I map the different natures to one internal nature.
E.g.: cost1 --> cost
      cost2 --> cost
      cost3 --> cost
My hope was that in BPC the nature cost would assume the result cost = cost1 + cost2 + cost3.
The result is that only the first record is uploaded and all the other records are rejected as duplicates.
Any suggestion?
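Since the standard import rejects rows that collapse to the same key after conversion, one workaround is to aggregate the file before importing it. A sketch of the idea in SQL, assuming the flat file has been loaded into a hypothetical staging table stg_upload (the aggregation happens on the file or in a staging layer, outside BPC):

-- collapse rows that become identical once the natures are mapped to
-- one internal nature, so that cost = cost1 + cost2 + cost3
SELECT entity, category, time_id, 'cost' AS nature, SUM(amount) AS amount
FROM   stg_upload
WHERE  nature IN ('cost1', 'cost2', 'cost3')
GROUP  BY entity, category, time_id;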

  • Duplicate records in database view for ANLA and ANLC tables

Hi all,
Can anyone please suggest how to remove duplicate records from the ANLA and ANLC tables when creating a database view?
Thanks in advance,
Ben.

Hi,
Suppose we have two tables, one with one key field and another with two:
TAB1 - key field KEY1
TAB2 - key fields KEY1 & KEY2.
Now if we create a database view on these two tables, we can do so by joining them on key field KEY1.
Suppose that in the View tab we have included TAB1-KEY1.
Now let's say the following three entries are in table TAB1: (AAA), (BBB), (CCC),
and the following entries are in table TAB2: (AAA, 1), (AAA, 2), (BBB, 3), (BBB, 5), (DDD, 3).
The database view will show the following entries:
AAA,
AAA,
BBB,
BBB,
These entries are duplicated in the output because TAB2 has multiple entries for the same key value of TAB1.
Now if we want to remove the multiple entries from the output, we need to include an entry in the selection conditions like TAB2-KEY2 = '1'.
    Regards,
    Pranav.
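Expressed as plain SQL for illustration (an ABAP Dictionary view is defined in SE11 rather than written by hand, so this only shows the equivalent join logic):

-- the view join: one TAB1 row appears once per matching TAB2 row
SELECT t1.KEY1
FROM   TAB1 t1
       INNER JOIN TAB2 t2 ON t2.KEY1 = t1.KEY1
WHERE  t2.KEY2 = '1';   -- selection condition that removes the duplicates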

  • Duplicate records problem

    Hi everyone,
I'm having a little difficulty resolving a problem with a repeating field causing duplication of data in a report I'm working on, and was hoping someone on here can suggest something to help!
My report is designed to detail library issues during a particular period, categorised by the language of the item issued. My problem is that in the SQL database that our library management system uses, it is possible for an item to have more than one language listed against it (some books will be in more than one language). When I list the loan records excluding the language data field, I get a list of distinct loan records. Bringing the language data into the report causes the loan record to repeat for each language associated with it, so if a book is in both English and French, the loan record appears like this:
    LOAN RECORD NO.     LANGUAGE CODE
      123456                             ENG
      123456                             FRE
So, although the loan only occurred once, I have two instances of it in my report.
I am only interested in the language that appears first, and I can exclude duplicated records from the report page. I can also count only the distinct records to get an accurate overall total. My problem is that when I group the loan records by language code (I really need to do this, as there are millions of loan records held in the database), the distinct count stops being a solution: placed at that group level, it only excludes duplicates within the respective group. So my report would display something like this:
    ENG     1
    FRE      1
A distinct count over the whole report would give the correct total of 1, but a cumulative total of the figures calculated at the language-code group level would total 2, which is incorrect. I've encountered similar results when using Running Totals evaluating on a formula that excludes repeated loan record numbers from the count, but again, when I group on the language code this goes out of the window.
    I need to find a way of grouping the loan records by language with a total count of loan records alongside each grouping that accurately reflects how many loans of that language took place.
    Is this possible using a calculation formula when there are repeating fields, or do I need to find a way of merging the repeating language fields into one field so that the report would appear like:
    LOAN RECORD     LANGUAGE CODE
      123456                      ENG, FRE
    Any suggestions would be greatly appreciated, as aside from this repeating language data there are quite a few other repeating database fields on the system that it would be nice to report on!
    Thanks!

If you create a group on loan, then a group on language, and place the values in the group (loan ID in the loan group header), you should only see each loan ID once. Place the language in the language group and you should likewise see it one time; a group header returns the first value of a unique ID.
Then, in order to calculate while avoiding the duplicates, use manual running totals. Create a set for each summary you want, and make sure each set has a different variable name.
MANUAL RUNNING TOTALS
RESET
The reset formula is placed in a group header (or report header) to reset the summary to zero for each unique record it groups by:
    whileprintingrecords;
    Numbervar  X := 0;
CALCULATION
The calculation is placed adjacent to the field or formula that is being calculated. (If there are duplicate values, create a group on the field being calculated on; if there are no duplicate records, the detail section is used.)
whileprintingrecords;
Numbervar X := X + 1;   // or X + {the field or formula being summed}
DISPLAY
The display is the sum of what is being calculated. This is placed in a group, page or report footer (generally the group footer of the group whose header holds the reset):
    whileprintingrecords;
    Numbervar  X;
    X

  • How to get rid of duplicate records generated from a hierarchical cube in SQL?

    Hi All,
Database version: 10gR2.
I am trying to aggregate data over two hierarchical dimensions, organization and products.
I am using one ROLLUP for each dimension - two ROLLUPs in the GROUP BY clause - to do the aggregation for every level of organization and product included in the hierarchy.
The troubling part is that products that have data in the corresponding fact table are not always located at the lowest level (which is 6) of the product hierarchy.
e.g.
product_id                                              level
0/01/0101/010102/01010201                               5    --> 01010201, at level 5, has data in the fact table
0/01/0101/010103                                        4    --> 010103, at level 4, has data in the fact table as well
0/02/0201/020102/02010203/0201020304/020102030405       6    --> 020102030405, at level 6 (the lowest level), has data in the fact table
We have a flat product hierarchy stored in a table as below:
prod_id       up_code_1   up_code_2   up_code_3   up_code_4   up_code_5   up_code_6
01010201      0           01          0101        010102      01010201    NULL
010103        0           01          0101        010103      NULL        NULL
Due to the NULLs in the level-5 and level-6 columns, when I run the query below one duplicate record is generated for 01010201, two duplicate records for 010103, and none for 020102030405.
I encounter the same issue with the organizational dimension.
Currently I am using DISTINCT to get rid of the duplicate records, but it doesn't feel right to do it this way.
So, I wonder if there is a more formal and standard way to do this?
    select distinct ORG_ID, DAY_ID,  TRADE_TYPE_ID, cust_id, PRODUCT_ID, QUANTITY_UNIT, COST_UNIT, SOURCE_ID,
          CONTRACT_AMOUNT, CONTRACT_COST, SALE_AMOUNT,SALE_COST, ACTUAL_AMOUNT, ACTUAL_COST, TRADE_COUNT
    from (     
    select  coalesce(UP_ORG_ID_6, UP_ORG_ID_5, UP_ORG_ID_4, UP_ORG_ID_3, UP_ORG_ID_2, UP_ORG_ID_1) as ORG_ID,
          a.day_id as day_id,        
          a.TRADE_TYPE_ID as TRADE_TYPE_ID,
          a.CUST_ID,
          coalesce(UP_CODE_6, UP_CODE_5, UP_CODE_4, UP_CODE_3, UP_CODE_2, UP_CODE_1) as product_id,
          QUANTITY_UNIT,
          COST_UNIT,
          A.SOURCE_ID as SOURCE_ID,
          SUM(CONTRACT_AMOUNT) as CONTRACT_AMOUNT,
          SUM(CONTRACT_COST) as CONTRACT_COST,
          SUM(SALE_AMOUNT) as SALE_AMOUNT,
          SUM(SALE_COST) as SALE_COST,
          SUM(ACTUAL_AMOUNT) as ACTUAL_AMOUNT,
          SUM(ACTUAL_COST) as ACTUAL_COST,
          SUM(TRADE_COUNT) as TRADE_COUNT     
    from DM_F_LO_SALE_DAY a, DM_D_ALL_ORG_FLAT B, DM_D_ALL_PROD_FLAT D --, DM_D_LO_CUST E
    where a.ORG_ID=B.ORG_ID
          and a.PRODUCT_ID=D.CODE
    group by rollup(UP_ORG_ID_1, UP_ORG_ID_2, UP_ORG_ID_3, UP_ORG_ID_4, UP_ORG_ID_5, UP_ORG_ID_6),
          a.TRADE_TYPE_ID,
          a.day_id,
          A.CUST_ID,
          rollup(UP_CODE_1, UP_CODE_2, UP_CODE_3, UP_CODE_4, UP_CODE_5, UP_CODE_6),
          a.QUANTITY_UNIT,
          a.COST_UNIT,
      a.SOURCE_ID );
Note: GROUPING_ID does not seem to help; at least I didn't find it useful in this scenario.
Any recommendations, links or ideas would be highly appreciated, as always.
    Thanks

Has anyone ever encountered this kind of problem?
Any thoughts would be appreciated.
Thanks
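One standard way to handle such ragged hierarchies is to filter with GROUPING(), which distinguishes a NULL produced by ROLLUP from a NULL stored in the data: a row is redundant exactly when its deepest retained level column is NULL in the data, because it then coincides with the row where that column is rolled up. A sketch of the idea on a simplified three-level hierarchy (table and column names are illustrative, not from the query above):

SELECT COALESCE(up_code_3, up_code_2, up_code_1) AS product_id,
       SUM(amount) AS amount
FROM   fact_table
GROUP  BY ROLLUP(up_code_1, up_code_2, up_code_3)
-- keep a row only if each retained level was either rolled up by ROLLUP
-- or actually present in the data; this drops the rows that a data NULL
-- would otherwise duplicate against the rolled-up row
HAVING (GROUPING(up_code_3) = 1 OR up_code_3 IS NOT NULL)
   AND (GROUPING(up_code_2) = 1 OR up_code_2 IS NOT NULL);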

  • Check duplicate records

Right now I'm doing a migration project. I need to import data from Excel and text files into the Oracle database. My question is how to do the duplicate-data checking and the validation of identical attributes and data types, to make sure I import the data correctly and accurately. Does anyone have any suggestions? Your ideas and comments are highly appreciated. Thanks in advance.

Hi,
I'm new to this forum, so my answer is a little bit late...
Export the data from all documents in an identical format, for example |name|adress|and|so|on|, and merge all files into one big one. Then eliminate duplicate records with cat file | sort -u and you'll have unique rows. Next, you have to check whether different keys mean different attribute values: truncate the key values from the file and do a unique sort again, then diff -c both files and you get a list of the duplicated keys. The last step is to decide which are the correct ones - I'm afraid this will be the hardest work.
After that you can import all the data without any constraint violation.
Best regards
Andreas
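If you would rather do the checking inside Oracle than with shell tools, a common pattern is to load everything into an unconstrained staging table first and deduplicate there. A minimal sketch with hypothetical table and column names:

-- inspect keys whose attribute values conflict, before deciding which rows win
SELECT name, COUNT(DISTINCT adress) AS variants
FROM   stg_import
GROUP  BY name
HAVING COUNT(DISTINCT adress) > 1;

-- then move one row per key into the constrained target table
INSERT INTO target_table (name, adress)
SELECT name, MAX(adress)   -- pick a survivor rule that fits your data
FROM   stg_import
GROUP  BY name;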

  • Duplicate Records & CFTRANSACTION

Maybe I'm missing the obvious on this, as I've never had a problem with this before, but recently I developed a custom tag that logs and blocks the IP addresses of all these recent DECLARE SQL injection attempts.
The issue, however, is that my blocked IP addresses seem to be getting duplicated here and there. My "datetime" field in the database shows the duplicates are all added to the database in the exact same second. What gives?
Shouldn't CFTRANSACTION be preventing such a thing, even if multiple injection attempts come at the same time?

I've always coded my applications so that my primary key is my database's autonumber field, and instead ensured my coding is solid enough to prevent duplicate records from appearing. Oddly enough, it's worked flawlessly until now.
Would ColdFusion not throw errors if I made the "ip_address" field my primary key and my code allowed a duplicate record to be entered? Am I interpreting the CFTRANSACTION code to do something it doesn't do?
Also, the duplicates aren't causing problems, so a DISTINCT select isn't necessary. The IP address is blocked whether one record or fifty exist in my blocked_ip_addresses table. My goal is just not to waste database space.
    Any further help you can provide is MUCH appreciated!
    Thanks!
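On the primary-key question: a unique index (or primary key) on the IP column makes the database itself reject the second insert, no matter how closely two requests race, which a CFTRANSACTION around one request's statements typically cannot guarantee on its own. A minimal sketch against the table named above:

-- let the database enforce uniqueness of blocked IPs; a racing second
-- INSERT now fails with a duplicate-key error instead of creating a copy
CREATE UNIQUE INDEX ux_blocked_ip
    ON blocked_ip_addresses (ip_address);

The duplicate-key error raised by the second insert can then be caught and ignored in the custom tag.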

  • Problem with duplicate records..

Hi. I am trying to insert records into the database on a button click from a Swing frame, and I have succeeded in that. But when I try to insert a duplicate record it shows: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
My code is:
import java.awt.event.*;
import java.sql.*;

public class BarcodeReader extends javax.swing.JFrame implements ActionListener {
    static Connection con;
    static String url = "jdbc:mysql://localhost:3306/";
    static String db = "mynewdatabase";
    static String driver = "com.mysql.jdbc.Driver";
    static String user = "uname";
    static String pass = "pwd";
    static Statement stmt;

    /** Creates new form BarcodeReader */
    // initComponents(), nb, newtxt and NewJFrame come from the (omitted) form designer code
    public BarcodeReader() {
        initComponents();
        nb.addActionListener(this);
    }

    public static Connection getConnection() {
        try {
            Class.forName(driver);
            con = DriverManager.getConnection(url + db, user, pass);
            stmt = con.createStatement();
            System.out.println("jdbc driver for mysql : " + driver);
            System.out.println("Connection url : " + url + db);
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        return con;
    }

    private void ActionPerformed(java.awt.event.ActionEvent evt) {
        Connection con = getConnection();
        String t = newtxt.getText();
        int t1 = Integer.parseInt(t);
        try {
            PreparedStatement p = con.prepareStatement("INSERT INTO machine(barcode) values(?)");
            ResultSet rs = stmt.executeQuery("SELECT * FROM MACHINE");
            while (rs.next()) {
                String s = rs.getString("barcode");
                if (s.equals(t)) {
                    System.out.println("this id exists....");
                    rs.last();
                }
            }
            // note: the insert below runs even when the barcode was found above,
            // which is why the duplicate-entry exception is thrown
            p.setInt(1, t1);
            p.executeUpdate();
            new NewJFrame().setVisible(true);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String args[]) {
        java.awt.EventQueue.invokeLater(new Runnable() {
            public void run() {
                new BarcodeReader().setVisible(true);
            }
        });
    }
}

When I insert a new record it is OK, but when I insert the same record again it shows the following exception:
    jdbc driver for mysql : com.mysql.jdbc.Driver
    Connection url : jdbc:mysql://localhost:3306/mynewdatabase
    this id exists....
    com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
            at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
            at com.mysql.jdbc.Util.handleNewInstance(Util.java:406)
            at com.mysql.jdbc.Util.getInstance(Util.java:381)
            at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1015)
            at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
            at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3536)
            at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3468)
            at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1957)
            at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2107)
            at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2648)
            at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2086)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2371)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2289)
            at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2274)
            at project.BarcodeReader.ActionPerformed(BarcodeReader.java:276)
            at project.BarcodeReader.access$000(BarcodeReader.java:17)
            at project.BarcodeReader$1.actionPerformed(BarcodeReader.java:138)
            at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1995)
            at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2318)
            at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:387)
            at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:242)
            at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:236)
            at java.awt.Component.processMouseEvent(Component.java:6263)
            at javax.swing.JComponent.processMouseEvent(JComponent.java:3267)
            at java.awt.Component.processEvent(Component.java:6028)
            at java.awt.Container.processEvent(Container.java:2041)
            at java.awt.Component.dispatchEventImpl(Component.java:4630)
            at java.awt.Container.dispatchEventImpl(Container.java:2099)
            at java.awt.Component.dispatchEvent(Component.java:4460)
            at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4574)
            at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4238)
            at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4168)
            at java.awt.Container.dispatchEventImpl(Container.java:2085)
            at java.awt.Window.dispatchEventImpl(Window.java:2475)
            at java.awt.Component.dispatchEvent(Component.java:4460)
            at java.awt.EventQueue.dispatchEvent(EventQueue.java:599)
            at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:269)
            at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:184)
            at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:174)
            at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:169)
            at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:161)
        at java.awt.EventDispatchThread.run(EventDispatchThread.java:122)
Thanks.

shelton141 wrote:
Hi. I am trying to insert records into the database on a button click from a Swing frame... But when I try to insert a duplicate record it shows com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '1' for key 1
Of course it does. That is what it is supposed to do.
Try inserting the same record twice, manually, and see what happens.
There is, however, if I remember right, an IGNORE clause/option that you can use on INSERT statements in MySQL. Check the manual, if that is what you really want.
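For reference, the clause mentioned is real MySQL syntax: INSERT IGNORE silently skips rows that would violate a unique or primary key. Applied to the statement from the post:

-- the duplicate barcode is skipped instead of raising
-- MySQLIntegrityConstraintViolationException
INSERT IGNORE INTO machine (barcode) VALUES (?);

MySQL also offers INSERT ... ON DUPLICATE KEY UPDATE for the case where the existing row should be updated rather than skipped.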

  • Ignore duplicate records for master data attributes

Dear experts,
How & where can I enable "ignore duplicate records" when I am running my DTP to load data to master data attributes?

Hi Raj,
Suppose you are loading master data to an InfoObject and in the PSA you have more than one record for a key.
Let's assume you are loading some attributes of Document Number (0DOC_NUMBER) and in the PSA you have multiple records for the same document number. In the PSA this is not a problem, as the key of the PSA table is a technical key, which is different for every record. But it is a problem for the InfoObject attribute table, as more than one record for the primary key (here 0DOC_NUMBER) would violate the primary key constraint in the database.
This issue can easily be avoided by selecting "Handle Duplicate Records" in the DTP. You will find this option under the "Update" tab of the DTP.
Regards
Anindya

  • Importing and Updating Non-Duplicate Records from 2 Tables

I need some help with the code to import data from one table into another if it is not a duplicate, or if a record has changed.
I have 2 tables, Members and NetNews. I want to check NetNews, import non-duplicate records from Members into NetNews, and update an email address in NetNews if it has changed in Members. I figured it could be as simple as checking Members.MemberNumber and Members.Email against the existence of NetNews.MemberNumber and NetNews.Email: if a record does not exist in NetNews, create it, and if the email address in Members.Email has changed, update it in NetNews.Email.
Here is what I have from all of the suggestions received in another category last year. It is not complete, but I am stuck on the solution. Can someone please help me get this code working?
Thanks!
    <cfquery datasource="#application.dsrepl#"
    name="qryMember">
    SELECT distinct Email,FirstName,LastName,MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    </cfquery>
    <cfquery datasource="#application.ds#"
    name="newsMember">
    SELECT distinct MemberNumber
    FROM NetNews
    </cfquery>
    <cfif
    not(listfindnocase(valuelist(newsMember.MemberNumber),qryMember.MemberNumber)
    AND isnumeric(qryMember.MemberNumber))>
    insert into NetNews (Email_address, First_Name, Last_Name,
    MemberNumber)
    values ('#trim(qryMember.Email)#',
    '#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
    trim(qryMember.MemberNumber)#')-
    </cfif>
    </cfloop>
    </cfquery>
    ------------------

Dan,
My DBA doesn't have the experience to help with a VIEW. Did I mention that these are 2 separate databases on different servers? This project is over a year old now and it really needs to get finished, so I thought the import would be the easiest way to go. Thanks to your help, it is almost working.
I added some additional code to check for a changed email address and update the NetNews database. It runs without error, but I don't have a way to test it right now. Can you please look at the code and see if it looks OK?
I am also still getting an error on line 10 after the routine runs - the line that has this code: "and membernumber not in (<cfqueryparam list="yes" value="#valuelist(newsmember.membernumber)#" cfsqltype="cf_sql_integer">)" - even with the cfif that Phil suggested.
    <cfquery datasource="#application.ds#"
    name="newsMember">
    SELECT distinct MemberNumber, Email_Address
    FROM NetNewsTest
    </cfquery>
    <cfquery datasource="#application.dsrepl#"
    name="qryMember">
    SELECT distinct Email,FirstName,LastName,MemberNumber
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    and membernumber not in (<cfqueryparam list="yes"
    value="#valuelist(newsmember.membernumber)#"
    cfsqltype="cf_sql_integer">)
    </cfquery>
    <CFIF qryMember.recordcount NEQ 0>
    <cfloop query ="qryMember">
    <cfquery datasource="#application.ds#"
    name="newsMember">
    insert into NetNewsTest (Email_address, First_Name,
    Last_Name, MemberNumber)
    values ('#trim(qryMember.Email)#',
    '#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
    trim(qryMember.MemberNumber)#')
    </cfquery>
    </cfloop>
    </cfif>
    <cfquery datasource="#application.dsrepl#"
    name="qryEmail">
    SELECT distinct Email
    FROM members
    WHERE memberstanding <= 2 AND email IS NOT NULL AND email
    <> ' '
    and qryMember.email NEQ newsMember.email
    </cfquery>
    <CFIF qryEmail.recordcount NEQ 0>
    <cfloop query ="qryEmail">
    <cfquery datasource="#application.ds#"
    name="newsMember">
    update NetNewsTest (Email_address)
    values ('#trim(qryMember.Email)#')
    where email_address = #qryEmail.email#
    </cfquery>
    </cfloop>
    </cfif>
    Thank you again for the help.

  • SQL Loader loads duplicate records even when there is PK defined

Hi,
I have created a table with a PK on one of the columns and loaded it using SQL*Loader. The flat file has duplicate records. The load completed without any error, but now the index has become unusable.
The requirement is to fail the process if there are any duplicates. Please help me understand why this is happening.
Below is the ctl file:
OPTIONS(DIRECT=TRUE, ERRORS=0)
UNRECOVERABLE
load data
infile 'test.txt'
into table abcdedfg
replace
fields terminated by ',' optionally enclosed by '"'
(
col1,
col2
)
I defined the PK on col1.

Check out:
http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_modes.htm#sthref1457
It states:
"During a direct path load, some integrity constraints are automatically disabled. Others are not. For a description of the constraints, see the information about maintaining data integrity in the Oracle Database Application Developer's Guide - Fundamentals.
Enabled Constraints
The constraints that remain in force are:
NOT NULL
UNIQUE
PRIMARY KEY (unique-constraints on not-null columns)"
Since the OP had the primary key in place before starting the DIRECT path load, this is contradictory to what the documentation says - or probably a bug?
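In practice, a direct path load verifies UNIQUE/PRIMARY KEY constraints only when the index is rebuilt at the end of the load: the duplicate rows are loaded and the index is left UNUSABLE, rather than the rows being rejected one by one. One way to make the process fail explicitly is to test for that state after the load (using the table and column from the ctl file above):

-- did the direct path load leave the PK index unusable?
SELECT index_name, status
FROM   user_indexes
WHERE  table_name = 'ABCDEDFG';   -- STATUS = 'UNUSABLE' means duplicates got in

-- list the duplicate keys that caused it
SELECT col1, COUNT(*) AS copies
FROM   abcdedfg
GROUP  BY col1
HAVING COUNT(*) > 1;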

Maybe you are looking for

  • How to update a CLOB column

    here is my problem. I have a table called t_clob as follows: SQL> desc t_clob Name Null? Type C CLOB I need to update an existing row with new data (overwrite). The new data is a big java script string with line break, spaces, alignment etc. as follo

  • SPMenu Field for sharepoint document library

    I have written code where it will show the Sharepoint document library in sp grid view.It is showing the grid but i want the default spmenu field for the type icon. please see the attached image. Below is the code of mine private void CreateBoundFiel

  • I bought an iPhone 4 from a guy that can't remember his iCloud password. How can I remove his old account?

    I bought an iPhone from a guy that says he can't remember the iCloud password. How can I remove his old account and establish my own?

  • Add on connection error

    hello experts, i have some problem in my client pc .my client pc face some problem at the time of add a add on .when he add the document a error occur "u r not connect to the company" while i m doing well every thing on my pc i can add the document o

  • Unable to install Flash PLayer 10.1

    Windows 7 64 bit operating system When i enter the downloading page the option of ACTIVE X CONTROLS does not appear. It just brings up the installer, but it does not install, instead it asks for a proxy server