Duplicated records - continued

Hi All,
I am using the following script to find records in which two columns carry duplicated values. It works OK, but the columns used to find the duplicates are not keys and therefore can't precisely identify the rows, so I also need to print the column WORKORDER_NUMBER to identify each record.
Environment: Oracle 10g.
SELECT compid, compjobid, count(*)
FROM Workorder
WHERE deptid = 221000001 AND wostatus NOT IN (7)
GROUP BY compid, compjobid
HAVING count(*) > 1
ORDER BY compid, compjobid
Please advise a solution. Many thanks in advance.
Aleks

You want something like:
SQL> ed
Wrote file afiedt.buf
  1  WITH t AS (select 1 AS PID, 123 as PAKey, 1 as SG from dual union all
  2             select 2, 123, 1 from dual union all
  3             select 3, 123, 1 from dual union all
  4             select 4, 123, 2 from dual union all
  5             select 5, 234, 1 from dual union all
  6             select 6, 234, 2 from dual union all
  7             select 7, 234, 2 from dual)
  8  -- END OF TEST DATA
  9  select t.pid, t.pakey, t.sg
10  from t, (select pakey, sg
11           from t
12           group by pakey, sg
13           having count(*) > 1
14          ) tx
15  where t.pakey = tx.pakey
16* and   t.sg = tx.sg
SQL> /
       PID      PAKEY         SG
         1        123          1
         2        123          1
         3        123          1
         6        234          2
         7        234          2
SQL>
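Applied to the original Workorder query, the same pattern looks like this (a sketch only, assuming WORKORDER_NUMBER is a column of the Workorder table, as the question implies):
SELECT w.workorder_number, w.compid, w.compjobid
FROM   Workorder w,
       (SELECT compid, compjobid
        FROM   Workorder
        WHERE  deptid = 221000001 AND wostatus NOT IN (7)
        GROUP  BY compid, compjobid
        HAVING COUNT(*) > 1) wx
WHERE  w.deptid = 221000001 AND w.wostatus NOT IN (7)
AND    w.compid = wx.compid
AND    w.compjobid = wx.compjobid
ORDER  BY w.compid, w.compjobid;
Each row of a duplicated (compid, compjobid) group comes back with its own WORKORDER_NUMBER.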

Similar Messages

  • Duplicated records on infoobject data load

    Hi,
    I have a problem when loading data to the 0UCINSTALLA infoobject.
    It goes to red flag and reports duplicated records in the /BI0/QUCINSTALLA and /BI0/YUCINSTALLA tables.
    I checked the infopackage: the "PSA only" checkbox is selected, and the "Continuing..." and "Ignore dup records" checkboxes are selected, too.
    If "Ignore duplicated records" is selected, why is it reporting the error?
    I don't know what to do with this problem.
    Any ideas?
    thanks for the help.
    Mauricio.

    In the transfer structure, write a start routine that deletes the duplicate records, like this:
    SORT DATAPAK BY /BIC/field1 DESCENDING /BIC/field2 /BIC/field3.
    DELETE ADJACENT DUPLICATES FROM DATAPAK COMPARING /BIC/field1 /BIC/field2.
    Hope it helps.
    Regards

  • How to delete the duplicated records, not just suppress the records?

    I am new to CR. Right now I am doing a project in which CR gets its query from Oracle. I have the query working, but there are records with duplicated fields. For example (the following shows only some of the fields):
    ID    body_code
    1     10
    2     10
    3     15
    4     15
    5     15
    6     16
    I need to select only records like the following (not suppress them, because I will do some calculations later):
    ID    body_code
    1     10
    2     15
    3     16
    I tried to create a selection formula in the Formula Workshop, as follows:
    onlastrecord;
    <>next
    but CR said next cannot be evaluated. I think it must have something to do with print time. So what should I do to delete the duplicated records? Thank you very much.

    Ting,
    Try this:
    Insert a group on body_code.  Then create a running total called Distinct Count.  Field to summarize -> ID, Evaluate on change of group Group # 1 Body Code, and Never Reset.
    Then insert a chart in the report header:
    In the advanced layout, select body_code on change of and select Distinct Count running total in the Show values.
    I hope I understood what you're looking to accomplish.
    Z
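    Alternatively, since the data already comes from Oracle, the duplicates can be removed in the query itself before Crystal Reports ever sees them. A sketch, assuming a source table named T with the ID and body_code columns shown above (renumbering ID as in the desired output would be a separate step):
    SELECT id, body_code
    FROM  (SELECT t.*,
                  ROW_NUMBER() OVER (PARTITION BY body_code ORDER BY id) rn
           FROM   t)
    WHERE  rn = 1;  -- keeps the first ID of each body_code group: 1, 3, 6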

  • Deleting duplicated records by date

    Hi,
    What could I do to delete duplicated records from the test1 table?
    The test1 table has the following columns:
    code
    operator
    phone
    init_date
    end_date
    The records are duplicated by code, operator, phone, init_date,
    and I need to delete the records with min(end_date).
    Thanks in advance...

    /* Formatted on 1/12/2012 7:28:44 AM (QP5 v5.149.1003.31008) */
    CREATE TABLE data
    AS
       (SELECT 'A' code,
               'Bob' operator,
               '111-2222' phone,
               ADD_MONTHS (SYSDATE, -1) init_date,
               SYSDATE end_date
          FROM DUAL
        UNION ALL
        SELECT 'A',
               'Bob',
               '111-2222',
               ADD_MONTHS (SYSDATE, -1) init_date,
               SYSDATE + 1 end_date
          FROM DUAL);
    DELETE FROM data
          WHERE (code, operator, phone, init_date, end_date) IN
                   (SELECT code,
                           operator,
                           phone,
                           init_date,
                           end_date
                      FROM (SELECT data.*,
                                    COUNT (*)
                                    OVER (
                                      PARTITION BY code,
                                                   operator,
                                                   phone,
                                                   init_date)
                                      cnt,
                                   ROW_NUMBER ()
                                   OVER (
                                      PARTITION BY code,
                                                   operator,
                                                   phone,
                                                   init_date
                                      ORDER BY end_date)
                                      rn
                              FROM data)
                     WHERE cnt > 1 AND rn = 1);
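    A shorter variant with slightly different semantics deletes every row except the one with the latest end_date in its group (if you want to remove only the single min(end_date) row per group, keep the ROW_NUMBER version above):
    DELETE FROM data d
     WHERE d.end_date < (SELECT MAX (d2.end_date)
                           FROM data d2
                          WHERE d2.code      = d.code
                            AND d2.operator  = d.operator
                            AND d2.phone     = d.phone
                            AND d2.init_date = d.init_date);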

  • Duplicated records

    Dear friends,
    One question for you from a PL/SQL beginner:
    I have to identify duplicated records (and flag them as erroneous).
    With the code below I retrieve only the extra records, but I need ALL the duplicated records (for example, if I have 2 equal records, I need both records, not only one):
    SELECT
    ID_AMBS
    , ASL
    , CD_PRESIDIO
    , GGMM_CONTATTO
    , NR_RICETTA
    , CD_CONT_PRESCR
    , NR_PROG_INT
    FROM NOC_AMBS_WORK
    WHERE ID_AMBS IN
        (SELECT
        min(ID_AMBS)
        FROM
        NOC_AMBS_WORK
        GROUP BY
        ASL, CD_PRESIDIO, GGMM_CONTATTO, NR_RICETTA, CD_CONT_PRESCR, NR_PROG_INT
        HAVING
        COUNT (*) > 1
        )
    With the code below I retrieve all the records of the table (not just the duplicates):
           SELECT *
            FROM noc_ambs_work a,
                 noc_ambs_work b
           WHERE NVL(a.asl,'x') = NVL(b.asl,'x')
             AND NVL(a.cd_presidio,'x') = NVL(b.cd_presidio,'x')
             AND NVL(a.ggmm_contatto,'x') = NVL(b.ggmm_contatto,'x')
             AND NVL(a.nr_ricetta,'x') = NVL(b.nr_ricetta,'x')
             AND NVL(a.cd_cont_prescr,'x') = NVL(b.cd_cont_prescr,'x')
             AND NVL(a.nr_prog_int,'x') = NVL(b.nr_prog_int,'x')
    Can you help me?
    Tks
    leo

    Since you have not mentioned which columns are getting duplicated, what the table structure is, or which column is the PK, I think you can take help from the sample below:
    SQL> ed
    Wrote file afiedt.buf
      1  with c as
      2  (
      3  select 1 id, 'A' n from dual union all
      4  select 1,'A' from dual union all
      5  select 1,'A' from dual union all
      6  select 2,'e' from dual union all
      7  select 3,'r' from dual)
      8  select *
      9  from c
    10  where id in (select t1.id
    11         from c t1
    12         group by id
    13*        having count(*) >1)
    SQL> /
            ID N
             1 A
             1 A
             1 A
    SQL>
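    To get all rows of every duplicate group in one pass (and without the NVL calls, since PARTITION BY groups NULLs together), an analytic count works as well. A sketch using the columns from your question:
    SELECT *
      FROM (SELECT w.*,
                   COUNT (*) OVER (PARTITION BY asl, cd_presidio, ggmm_contatto,
                                                nr_ricetta, cd_cont_prescr, nr_prog_int) cnt
              FROM noc_ambs_work w)
     WHERE cnt > 1;  -- returns every member of every duplicate group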

  • When using the record function for narration in Keynote 09 version 5.1.1., the recording continually gets off sync with the slides even if nothing has been altered on any of the slides.

    I am trying to record narration for my slideshow in Keynote 09 version 5.1.1. I put the transitions on automatic and hit the record button, then read the slides as they advanced. The program allows me to record my voice, however, when I play the slideshow back the recording continually becomes more off sync with the slides with each play. Is there a way to fix this?

     Hi,
    One of my ex-colleagues installed NI-DAQ 6.5 on our system. [And I do not see any other National Instruments card in the computer; maybe he removed it.] I deleted his account and all his files on the system. When I try to install version 8.0, it does not install and gives me a message that I should uninstall the previous version via Add/Remove Programs in the Control Panel.
    I tried doing that, but the "Change/Remove" button does not seem to work... [There is no response, so I am unable to install the new version...]
    Any idea how this problem can be solved?
    It is a Windows XP operating system with SP2, on a machine with a P4 processor.
    Thanks

  • LSO: Duplicated Records Created For Each Person During Booking For Course

    Dear All,
    For the past 2 months, I've been hitting the following issue.
    When I book an employee onto a course through ESS, duplicated course participation records are created in the system.
    The duplicated records have different booking priorities.
    And I am aware that this happens for courses which have a workflow tagged to them.
    Sometimes, instead of duplicated records being created, it hits a dump with the following message:
    "The ABAP/4 Open SQL array insert results in duplicate database records."
    Does anyone know what problem is this?
    Thanks.

    Did you solve this problem? I'm facing the same problem.
    My HRP1001 table keeps growing.

  • How to connect COEP-ESLH tables - duplicated records

    Dear All,
    I'd like to ask you to please give a hand in this issue:
    I connected the COEP and ESLH tables in SQVI. The tables are connected through fields EBELN & EBELP. The problem is that I get duplicated records, and so far I haven't been able to figure out what the reason might be...
    How can I solve this? Please...
    Thanks,
    Csaba

    Hi Christine,
    Thanks for your help.
    To tell the truth, I've been given the task of checking why someone else's query doesn't work... after a while I found out that the problem is this table connection...
    I'm not (fully) aware of the business purpose of this query; I work in the logistics field...
    Example of duplicated records:
    ESLH
    packno_ebeln_ebelp
    000001_0011_0010
    000002_0011_0010
    COEP
    belnr_buzei_ebeln_ebelp
    000A_0001_0011_0010
    000A_0002_0011_0010
    As a result of the COEP-ESLH query I get four records instead of 2 (I guess the first record in COEP belongs to the first record in ESLH, etc., but of course that's my assumption and SAP cannot infer it...)
    BR
    Csaba
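    The four rows are simply the cross product within the shared (ebeln, ebelp) key: 2 ESLH rows x 2 COEP rows. A minimal sketch reproducing the effect with the sample keys above:
    WITH eslh AS (SELECT '000001' packno, '0011' ebeln, '0010' ebelp FROM dual UNION ALL
                  SELECT '000002', '0011', '0010' FROM dual),
         coep AS (SELECT '000A' belnr, '0001' buzei, '0011' ebeln, '0010' ebelp FROM dual UNION ALL
                  SELECT '000A', '0002', '0011', '0010' FROM dual)
    SELECT e.packno, c.belnr, c.buzei, e.ebeln, e.ebelp
      FROM eslh e, coep c
     WHERE c.ebeln = e.ebeln
       AND c.ebelp = e.ebelp;
    -- returns 4 rows; joining on a key that is unique on at least one side (e.g. including packno, if COEP carried it) would avoid the fan-out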

  • Select duplicated records using SE16 ?

    Is it possible to discover the duplicated records using SE16?
    Thanks!

    Yes it is!
    You display the whole table (all lines and all fields), then you compare it line by line. Some sorting might be useful.
    On the other hand, if there is at least one key field in the table (which is always the case), there should not be many duplicate entries...

  • Outbound: Duplicated records, Scripting tool, Campaign name

    Hi All;
    Is there an option to remove duplicated records in the outbound campaign? Duplicates can happen when importing two files that share some numbers; we need the new record to be used and the old one to be removed (the option is called "remove the duplication" or something like that).
    Also, I am looking for a scripting tool, so that when the agent is talking with the customer there is a script that helps the agent know how to deal with the customer; this script needs to be created per campaign. Is there something like this?
    Another thing: when the call reaches the agent, is it possible for the name of the campaign to be shown in the CTI client?
    Regards
    Bilal

    Dear Pallavi,
    Very useful post!
    I am looking for similar accelerators for
    Software Inventory Accelerator
    Hardware Inventory Accelerator
    Interfaces Inventory
    Customization Assessment Accelerator
    Sizing Tool
    These would help us come up with the relevant Bill of Materials for every area mentioned above, and the ones which I don't know...
    Request help on such accelerators... Any clues?
    Any reply, help is highly appreciated.
    Regards
    Manish Madhav

  • Duplicated records for DTP

    Hello Gurus,
    I have a failed process chain. The data flow is from an SAP datasource to a write-optimized DSO. The DTP failed due to duplicated records in the datasource. How can I resolve this issue, in detailed steps?
    Many thanks

    Hi,
    There is a setting in the DTP that handles duplicate keys ("Handle Duplicate Record Keys" on the Update tab). Just check that and try the load again.
    It will resolve the issue.
    Cheers,
    Reena

  • Handling Duplicated Records in DTP

    Dear Experts,
    I am trying to load to the master data of 0PLANT using datasource 0BBP_PLANT_LOCMAP_ATTR (Location/Plant Mapping) using DTP.
    This standard datasource is neither language- nor time-dependent. Also, in the source system, it is not marked to handle duplicate records.
    I have also referred to OSS Note 1147563 - Consulting note: Indicator "Duplicate records". One of the key highlight is "If you use a DTP to transfer master data or texts, you can set the indicator 'Duplicate records' in the DTP tab page 'Data transfer'." I would suppose that this means the "Handle Duplicated Record Keys" option under tab "Update" of the respective DTP.
    In this OSS Note, it was also mentioned that
    "You must not set the indicator if the following prerequisites apply:
    The indicator 'DataSource delivers duplicate records' is not set in the DataSource."
    >> which is currently the case of datasource 0BBP_PLANT_LOCMAP_ATTR
    Checked in SAP Help [link|http://help.sap.com/saphelp_nw04s/helpdata/en/42/fbd598481e1a61e10000000a422035/frameset.htm]:
    You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).
    My question is: I can't load the master data, mainly because of these duplicated record key errors, and when I checked the indicator to handle duplicated record keys, I was given the error message "Enter a valid value". Thereafter, I couldn't do anything at all - activate the DTP, click on other tabs - it just got stuck at that point, and I could only exit the transaction.
    Can anyone advise if I have basically missed anything?
    Thank you in advance.
    Regards,
    Adelynn

    Hi,
    Handling Duplicate Data Records 
    Use
    DataSources for texts or attributes can transfer data records with the same key into BI in one request. Whether the DataSource transfers multiple data records with the same key in one request is a property of the DataSource. There may be cases in which you want to transfer multiple data records with the same key (referred to as duplicate data records below) to BI more than once within a request; this is not always an error. BI provides functions to handle duplicate data records so that you can accommodate this.
    Features
    In a dataflow that is modeled using a transformation, you can work with duplicate data records for time-dependent and time-independent attributes and texts.
    If you are updating attributes or texts from a DataSource to an InfoObject using a data transfer process (DTP), you can specify the number of data records with the same record key within a request that the system can process. In DTP maintenance on the Update tab page, you set the Handle Duplicate Record Keys indicator to specify the number of data records.
    This indicator is not set by default.
    If you set the indicator, duplicate data records (multiple records with identical key values) are handled as follows:
    ●      Time-independent data:
    If data records have the same key, the last data record in the data package is interpreted as being valid and is updated to the target.
    ●      Time-dependent data:
    If data records have the same key, the system calculates new time intervals for the data record values. The system calculates new time intervals on the basis of the intersecting time intervals and the sequence of the data records.
    Data record 1 is valid from 01.01.2006 to 31.12.2006
    Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
    The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid.
    If you set the indicator for time-dependent data, note the following:
    You cannot include the data source field that contains the DATETO information in the semantic key of the DTP. This may cause duplicate data records to be sorted incorrectly and time intervals to be incorrectly calculated.
    The semantic key specifies the structure of the data packages that are read from the source.
    Example
    You have two data records with the same key within one data package.
    In the following graphic, DATETO is not an element of the key:
    In the data package, the data records are in sequence DS2, DS1. In this case, the time interval for data record 1 is corrected:
    Data record 1 is valid from 1.1.2002 to 31.12.2006.
    Data record 2 is valid from 1.1.2000 to 31.12.2001.
    In the following graphic, DATETO is an element of the key:
    If DATETO is an element of the key, the records are sorted by DATETO. In this case, the data record with the earliest date is put before the data record with the most recent date. In the data package, the data records are in sequence DS2, DS1. In this case, the time interval for data record 2 is corrected:
    Data record 2 is valid from 1.1.2000 to 31.12.2000.
    Data record 1 is valid from 1.1.2001 to 31.12.2006.
    If you do not set this indicator, data records that have the same key are written to the error stack of the DTP.
    Note
    You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).
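    For time-independent data, the "last record in the package wins" rule is essentially the following deduplication pattern, sketched here in SQL over a hypothetical staging table STG with key fields K1 and K2, an attribute ATTR, and a package sequence number RECORD_NO (all names invented for illustration):
    SELECT k1, k2, attr
      FROM (SELECT s.*,
                   ROW_NUMBER () OVER (PARTITION BY k1, k2
                                       ORDER BY record_no DESC) rn
              FROM stg s)
     WHERE rn = 1;  -- keep only the last record per duplicate key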

  • The load from ODS - 0FIAP_O03 to cube 0FIAP_C03 is duplicating records

    Hello Everyone
    I need some help/advice for the following issue
    SAP BW version 3.5
    The Delta load for ODS - 0FIAP_O03 works correctly
    The load from ODS - 0FIAP_O03 to cube 0FIAP_C03 is duplicating records.
    NB: I noticed one other forum user who has raised the same issue, but that question is not answered.
    My questions are:
    1. Is this a known problem?
    2. Is there a fix available from SAP?
    3. If anyone has had this issue and fixed it, could you please share how you achieved this
    I have possible solutions but need to know if there is a standard solution.
    Thank you
    Pushpa

    Hello Pushpa,
    I assume that you are using the Delta load to the initial ODS and then to the CUBE as well.
    If the delta is in place in both spots, then there should not be any issue when sending the data to the cube.
    If you are using a FULL load, then the data will normally get aggregated in the cube, as the key figures are set to Addition mode in the cube.
    Can you post the exact error that you are facing, as this can also be a design issue?
    Murali

  • Master Data load duplicated records - 0ART_SALES_ATTR

    Hi All,
    I'm trying to load master data in the test system and I'm getting an error that says "There are duplicates of the data record 70114 with the key '10 30 000000000001034170 ' for characteristic 0MAT_SALES".
    This is wrong; no data is duplicated. I already checked the PSA and it's fine.
    The dataflow is 3.5-style.
    The Infopackage has been set up with Update Initialization with Data Transfer.
    The PSA and the Infoobjects are empty, they have never been loaded.
    The Processing in the Infopackage has been set as PSA and Infoobject (Package by package).
    I can't modify the Infopackage as I am in a Test System.
    Any Idea on how to solve this?

    The problem was that the person who designed the dataflow made a mistake and swapped two fields.

  • Multi-record continuous acquisition with PCI-5112

    Hello!
    I need to know if it is possible to obtain continuous acquisition in multi-record mode.
    In the "niScope EX Multi Record Fetch Forever.vi" shipping example (LV 5.1), it seems to stop when "loop index" reaches the "number of records - 1" value.
    I don't have the PCI-5112 card available at the moment, so I cannot try it, but I need confirmation.
    If I've understood correctly, I could modify the loop condition by erasing the check on the loop index and by wiring "loop index" mod N to the "fetch record number" attribute (let's define N as "number of records"). That way I could continuously fetch records 0, 1, ..., N-1 in the first cycle, again 0, ..., N-1 in the second cycle, and so on.
    OK! I think that the contents of record number K (with 0 <= K <= N-1) are different if acquired in different cycles! In this way I can obtain a "forever" multi-record acquisition. Is that right?
    Thanks for your answers.

    NatRob,
    It sounds like you are on the right track. The only thing keeping this from running continuously is the comparison of "loop iteration + 1" to the "number of records." Other than that, the loop should only stop when an error occurs or the Stop button is pressed. Just erase the greater-than comparison and the boolean used to stop the loop, and it should work.
    Regards,
    Chris Drymalla
    NI Applications Engineer
