Causing duplicate records

Hi,
generally we use the following API to insert records, but it is causing duplicate records. Please give me some guidelines.
/* Formatted on 2006/04/21 15:31 (Formatter Plus v4.5.2) */
CREATE OR REPLACE PACKAGE BODY cec_ipc_transactions_api
AS
PROCEDURE LOG (
p_transaction_type VARCHAR2,
p_order_number NUMBER,
p_header_id NUMBER,
p_order_id VARCHAR2,
p_original_order_id VARCHAR2,
p_version NUMBER,
p_cancel_flag VARCHAR2,
p_success OUT BOOLEAN,
p_error_message OUT VARCHAR2,
p_order_source IN VARCHAR2
DEFAULT 'MarketPlace Router Store',
p_business_entity IN VARCHAR2 DEFAULT 'UNK')
IS
x_header_id NUMBER;
x_order_number NUMBER;
x_statement_id NUMBER;
x_task_id NUMBER;
x_org_id NUMBER;
x_transaction_type VARCHAR2 (60);
x_transaction_type_old VARCHAR2 (60) := NULL;
x_error_message VARCHAR2 (2000);
x_coe_flag VARCHAR (1) := NULL;
-- Added on 10 FEB 2005
x_errcode VARCHAR2 (240);
x_errmsg VARCHAR2 (240);
v_procedure_name VARCHAR2 (80) := '<11i Procedure Name>';
v_tran_control_rec xxcca_iei_ne_util.cca_iei_tran_control_rec;
-- 5/16/99 -- add open_flag = 'Y' to prevent DTS errors when cancelling orders
-- Everest Migration 10.7 to 11i ---Change Date '14 JUL 2004'
-- Get header id of respective order for update with nowait option from oe_order_headers_all
CURSOR lock_header (p_header_id NUMBER)
IS
SELECT ooha.header_id header_id
FROM oe_order_headers_all ooha,
xxcca_oe_order_headers_ext xoohe
WHERE ooha.header_id = p_header_id
AND ooha.header_id = xoohe.header_id
AND NVL (xoohe.cms_customer_version, -999) < p_version
AND ooha.open_flag
|| '' = 'Y'
FOR UPDATE OF xoohe.cms_customer_version NOWAIT;
BEGIN
cec_debug_pk.DEBUG ('Begin CEC_IPC_TRANSACTIONS_API.log');
CEC_DEBUG_PK.debug('EAH ord: ' || p_order_number
|| ' hdrid: ' || p_header_id
|| ' ordid: ' || p_order_id
|| ' origid: ' || p_original_order_id
|| ' vers: ' || p_version
|| ' src: ' || p_order_source
|| ' ent: ' || p_business_entity);
IF p_header_id IS NULL
THEN
BEGIN
x_statement_id := 10;
-- Everest Migration 10.7 to 11i ---Change Date '14 JUL 2004'
-- Get the header id, org id, order number of the respective order from oe_order_headers_all
SELECT header_id, org_id, order_number
INTO x_header_id, x_org_id, x_order_number
FROM oe_order_headers_all
WHERE order_number = p_order_number;
EXCEPTION
WHEN NO_DATA_FOUND
THEN
NULL;
END;
ELSE
-- Everest Migration 10.7 to 11i ---Change Date '14 JUL 2004'
-- Get the org id, order number of the respective order from oe_order_headers_all
BEGIN
SELECT org_id, order_number
INTO x_org_id, x_order_number
FROM oe_order_headers_all
WHERE header_id = p_header_id;
EXCEPTION
--TD #38404 - Added exception handler
WHEN NO_DATA_FOUND
THEN
NULL;
END;
x_header_id := p_header_id;
END IF;
IF x_org_id IS NULL AND NVL (p_version, 1) = 1
THEN
BEGIN
SELECT org_id
INTO x_org_id
FROM cec_ipc_transactions
WHERE original_order_id = p_original_order_id
AND NVL (version, 1) = p_version
AND org_id > -1000
AND ROWNUM = 1;
EXCEPTION
WHEN NO_DATA_FOUND
THEN
NULL;
END;
END IF;
IF x_org_id IS NULL AND NVL (p_version, 1) = 1
THEN
BEGIN
SELECT TO_NUMBER (meaning)
INTO x_org_id
FROM cec_lookups
WHERE lookup_type = 'CEC_ORG_ID'
AND lookup_code = p_business_entity
AND ROWNUM = 1;
EXCEPTION
WHEN NO_DATA_FOUND
THEN
x_org_id := -1000;
END;
END IF;
x_transaction_type := p_transaction_type;
/* If the transaction type is a terminal transaction type,
then we try to update the version in xxcca_oe_order_headers_ext.
If the record is locked, set the transaction type to
COE_LOCKED. */
IF x_transaction_type IN
(cec_globals_pk.coe_manual_rlse, cec_globals_pk.coe_ec_applied)
THEN
BEGIN
FOR c IN lock_header (x_header_id)
LOOP
-- Everest Migration 10.7 to 11i ---Change Date '14 JUL 2004'
-- Update the cms customer version of the respective order in xxcca_oe_order_headers_ext table
UPDATE xxcca_oe_order_headers_ext
SET cms_customer_version = p_version
WHERE header_id = c.header_id;
END LOOP;
EXCEPTION
WHEN OTHERS
THEN
-- Handle locked record
IF SQLCODE = -54
THEN
x_transaction_type := cec_globals_pk.coe_locked;
-- if the record is locked, we want to save the
-- old trans type so we can later restore it
x_transaction_type_old := p_transaction_type;
END IF;
END;
END IF;
-- If the change is "done" either by automatic application or by
-- release of all IC change holds, we want to send out the
-- order acknowledgments
-- This is done by inserting into CCA_NOTIFY_CHANGES table
IF x_transaction_type IN (cec_globals_pk.coe_manual_rlse,
cec_globals_pk.coe_ec_applied)
AND NVL (p_version, 1) > 1
THEN
-- Commented the insert and changed as follows on 10 FEB 2005
/*INSERT INTO cca_iei_tran_control
(source_system,
source_id,
control_id,
request_start_time,
tran_type,
tran_status,
tran_ack_q,
tran_code)
VALUES ('IC_CHANGE',
cca_iei_tran_control_s1.NEXTVAL,
x_order_number,
SYSDATE,
'ORDER',
'NEW',
999,
'IC_CHANGE');*/
-- Added on 10 FEB 2005
SELECT cca_iei_tran_control_s1.NEXTVAL
INTO v_tran_control_rec.DATA.source_id
FROM DUAL;
v_tran_control_rec.DATA.control_id := x_order_number;
v_tran_control_rec.DATA.header_id := x_header_id;
v_tran_control_rec.DATA.tran_ack_q := 999;
v_tran_control_rec.DATA.tran_code := 'IC_CHANGE';
v_tran_control_rec.DATA.tran_status := 'NEW';
v_tran_control_rec.DATA.tran_type := 'ORDER';
v_tran_control_rec.DATA.request_start_time := SYSDATE;
v_tran_control_rec.DATA.source_system := 'IC_CHANGE';
v_tran_control_rec.DATA.attribute5 := v_procedure_name;
xxcca_iei_ne_util.insert_cca_iei_tran_control (
v_tran_control_rec,
x_errcode,
x_errmsg);
END IF;
IF x_transaction_type = cec_globals_pk.coe_print_and_tear
AND (NVL (p_version, 1) > 1 AND x_header_id IS NULL)
THEN
x_transaction_type := cec_globals_pk.coe_ec_applied;
END IF;
/* Try to update existing record in CEC_IPC_TRANSACTIONS for this
header_id / version combination.
If none is found, then create a new one.
Usually, the Query will create the record and others will update
It is possible that another transaction type will result in the
creation of the CEC_IPC_TRANSACTIONS record if the order was not
imported yet or if ERP was down at the time of Query. */
x_statement_id := 20;
x_coe_flag := NULL;
IF x_transaction_type = cec_globals_pk.coe_submitted
THEN
BEGIN
SELECT 'X'
INTO x_coe_flag
FROM cec_ipc_transactions
WHERE status_code
|| '' NOT IN (cec_globals_pk.coe_aborted,
cec_globals_pk.coe_queried,
cec_globals_pk.coe_expired,
cec_globals_pk.coe_locked,
cec_globals_pk.coe_wait,
cec_globals_pk.coe_wait_for_version,
cec_globals_pk.coe_wait_for_cust_data,
cec_globals_pk.coe_importing_cust_data)
AND ( erp_header_id = x_header_id
OR ( original_order_id = p_original_order_id
AND order_source <> 'ALL'))
AND NVL (version, 1) = NVL (p_version, 1);
EXCEPTION
WHEN OTHERS
THEN
x_coe_flag := NULL;
END;
END IF;
IF NVL (x_coe_flag, 'YY') != 'X'
THEN
UPDATE cec_ipc_transactions
SET status_code = x_transaction_type,
status_code_old = x_transaction_type_old,
operation_code = DECODE (
operation_code,
'CANCEL', operation_code,
DECODE (
p_version,
NULL, 'INSERT',
1, 'INSERT',
DECODE (
p_cancel_flag,
'Y', 'CANCEL',
'UPDATE'))),
change_query_date = DECODE (
x_transaction_type,
cec_globals_pk.coe_queried, SYSDATE,
change_query_date),
change_submit_date =
NVL (
change_submit_date,
DECODE (
x_transaction_type,
cec_globals_pk.coe_submitted, SYSDATE,
cec_globals_pk.coe_print_and_tear, SYSDATE,
change_submit_date)),
change_complete_date =
DECODE (
x_transaction_type,
cec_globals_pk.coe_manual_rlse, SYSDATE,
cec_globals_pk.coe_aborted, SYSDATE,
cec_globals_pk.coe_expired, SYSDATE,
cec_globals_pk.coe_ec_applied, SYSDATE,
cec_globals_pk.coe_print_and_tear, NULL,
cec_globals_pk.coe_submitted, NULL,
change_complete_date)
WHERE ( ( original_order_id = p_original_order_id
AND order_source <> 'ALL')
OR erp_header_id = x_header_id)
AND NVL (version, 1) = NVL (p_version, 1);
-- If update failed, then insert
IF SQL%ROWCOUNT = 0
THEN
x_statement_id := 30;
INSERT INTO cec_ipc_transactions
(ec_header_id, version, status_code, operation_code,
load_code, change_query_date, change_submit_date,
change_complete_date, erp_header_id, error_message,
order_id, original_order_id, org_id, order_source,
creation_date)
SELECT cec_headers_s.NEXTVAL, p_version, x_transaction_type,
DECODE (
p_version,
NULL, 'INSERT',
1, 'INSERT',
DECODE (p_cancel_flag, 'Y', 'CANCEL', 'UPDATE')),
'ELECTRONIC_COMMERCE',
DECODE (
x_transaction_type,
cec_globals_pk.coe_queried, SYSDATE,
NULL),
DECODE (
x_transaction_type,
cec_globals_pk.coe_submitted, SYSDATE,
cec_globals_pk.coe_print_and_tear, SYSDATE,
NULL),
DECODE (
x_transaction_type,
cec_globals_pk.coe_manual_rlse, SYSDATE,
cec_globals_pk.coe_aborted, SYSDATE,
cec_globals_pk.coe_expired, SYSDATE,
cec_globals_pk.coe_ec_applied, SYSDATE,
NULL),
x_header_id, x_error_message, p_order_id,
p_original_order_id, x_org_id, p_order_source, SYSDATE
FROM DUAL;
-- Since we bump the version of the change hold, we want to send
-- all previous print and tear versions to manually released as well
ELSIF x_transaction_type = cec_globals_pk.coe_manual_rlse
THEN
UPDATE cec_ipc_transactions
SET status_code = x_transaction_type,
change_complete_date = SYSDATE
WHERE status_code
|| '' IN (cec_globals_pk.coe_print_and_tear,
cec_globals_pk.coe_queried)
AND ( erp_header_id = x_header_id
OR original_order_id = p_original_order_id)
AND NVL (version, 1) < NVL (p_version, 1);
-- Fix incident 3241
ELSIF x_transaction_type = cec_globals_pk.coe_aborted
THEN
UPDATE cec_ipc_transactions
SET status_code = x_transaction_type,
change_complete_date = SYSDATE
WHERE status_code
|| '' IN (cec_globals_pk.coe_queried)
AND ( erp_header_id = x_header_id
OR ( original_order_id = p_original_order_id
AND order_source <> 'ALL'))
AND NVL (version, 1) < NVL (p_version, 1);
-- Fixed incident 2771
ELSIF x_transaction_type = cec_globals_pk.coe_submitted
THEN
UPDATE cec_ipc_transactions
SET status_code = cec_globals_pk.coe_expired,
change_complete_date = SYSDATE
WHERE status_code
|| '' IN (cec_globals_pk.coe_queried)
AND ( erp_header_id = x_header_id
OR ( original_order_id = p_original_order_id
AND order_source <> 'ALL'))
AND NVL (version, 1) < NVL (p_version, 1);
END IF;
END IF;
p_success := TRUE;
p_error_message := x_error_message;
cec_debug_pk.DEBUG ('End CEC_IPC_TRANSACTIONS_API.log');
EXCEPTION
WHEN OTHERS
THEN
p_success := FALSE;
cec_debug_pk.get_error (
'GENERIC_SQL_ERROR',
x_error_message,
x_task_id);
cec_debug_pk.set_token (
x_error_message,
'PROCEDURE_NAME',
'CEC_IPC_TRANSACTIONS_API.log');
cec_debug_pk.set_token (
x_error_message,
'STATEMENT_ID',
TO_CHAR (x_statement_id));
cec_debug_pk.set_token (x_error_message, 'SQLCODE', SQLCODE);
cec_debug_pk.set_token (x_error_message, 'SQLERRM', SQLERRM);
p_error_message := x_error_message;
END LOG;
END cec_ipc_transactions_api;
/

  CURSOR lock_header (p_header_id NUMBER) IS
    SELECT ooha.header_id header_id
      FROM oe_order_headers_all ooha, xxcca_oe_order_headers_ext xoohe
     WHERE ooha.header_id = p_header_id
       AND ooha.header_id = xoohe.header_id
       AND NVL (xoohe.cms_customer_version, -999) < p_version
       AND ooha.open_flag || '' = 'Y'
    FOR UPDATE OF xoohe.cms_customer_version NOWAIT;

I'm not sure why you have this predicate in the WHERE clause of your cursor:

           AND ooha.open_flag || '' = 'Y'

when you could simply write

           AND ooha.open_flag = 'Y'

or

           AND NVL (ooha.open_flag, 'X') = 'Y'

Also, you only have one loop structure, and it updates rather than inserts. Could it be that the procedure LOG is being called from other PL/SQL within its loop structure, causing the duplicate inserts?
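A further thought on the duplicates themselves: the UPDATE-then-INSERT-when-SQL%ROWCOUNT-is-zero pattern in LOG is not atomic. Two sessions calling LOG for the same order/version at the same time can both see zero rows updated and both insert, and nothing in the posted code (no unique constraint on the match columns) stops the second row. A minimal sketch of a race-safer alternative using MERGE; the table and column names are taken from the posted code, but the statement is illustrative, not a drop-in fix:

```sql
-- Sketch only: MERGE matches-and-updates, or inserts, in one statement.
-- Combined with a unique constraint on (erp_header_id, version), two
-- concurrent sessions cannot both take the "not matched" branch for the
-- same key: the slower one gets a duplicate-key error instead of a
-- silent duplicate row.
MERGE INTO cec_ipc_transactions t
USING (SELECT :header_id AS erp_header_id,
              NVL (:version, 1) AS version
         FROM DUAL) s
   ON (    t.erp_header_id = s.erp_header_id
       AND NVL (t.version, 1) = s.version)
 WHEN MATCHED THEN
   UPDATE SET status_code = :transaction_type
 WHEN NOT MATCHED THEN
   INSERT (ec_header_id, erp_header_id, version, status_code, creation_date)
   VALUES (cec_headers_s.NEXTVAL, s.erp_header_id, s.version,
           :transaction_type, SYSDATE);
```

Even without rewriting LOG, adding a unique constraint on the columns the UPDATE matches on would turn the second concurrent insert into an ORA-00001 error you can trap, rather than a duplicate record.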

Similar Messages

  • Muliple package calls on aftercommit causing duplicate records

    I have a jsp page inserting into table A. The app module has an aftercommit method that calls a package using createCallableStatement. This package call inserts into table B.
    We found duplicate records were being created in table B. The create time is the same in the 2 duplicate records. To try to prevent the duplicates I put a check in the package before the insert to see if the record already exists then don't insert. This did not stop the duplicates. This implies that there are 2 instances of the package executing in parallel in different sessions. They can't see each others uncommitted inserts and so go ahead and insert.
    Is this possible? Could it be creating multiple sessions?

    Hi,
    to work with multiple sessions, there have to be multiple database sessions. While e.g. AM pooling could have additional connections to the database open, I don't see how a commit in one session invokes the same in another.
    Frank
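    As Frank's point implies, the existence check inside the package cannot work across sessions: each session's uncommitted insert is invisible to the other. The standard fix is to let the database enforce uniqueness and trap the violation. A hedged sketch; table_b and source_row_id are made-up names standing in for the real table and its natural key:

```sql
-- Illustrative only: with the constraint in place, the second concurrent
-- insert raises ORA-00001 instead of silently creating a duplicate row.
ALTER TABLE table_b
  ADD CONSTRAINT table_b_uk UNIQUE (source_row_id);

BEGIN
  INSERT INTO table_b (source_row_id, created_on)
  VALUES (:source_row_id, SYSDATE);
EXCEPTION
  WHEN DUP_VAL_ON_INDEX THEN
    NULL;  -- row already exists; ignore the duplicate attempt
END;
```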

  • JCA DB Adapter process the same file twice in a clustered envrionment causing duplicate records

    We have 2 Nodes in a OSB Cluster and for one of the webservices we use DB Adapter to process the records and send it to external system.
    Whenever we hit the exception below, and the same records are reprocessed/retried, the message is processed/sent by both nodes, which causes duplicate records.
    'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <1e3df38069b38bbe:3fdbec90:14c7cca0f5f:-8000-0000000000000006> <1428935826370> <BEA-000000> <<oracle.tip.adapter.db.InboundWork handleException> BINDING.JCA-11624
    DBActivationSpec Polling Exception.
    Query name: [XYZDBAdapterSelect], Descriptor name: [XYZDBAdapter.ABCEnrollment]. Polling the database for events failed on this iteration.
    Caused by JTS transaction was marked rollback only.
    Please advise!
    Thanks!

    You have to use the MOVE operation: if a remote exception occurs, move the file to another folder, and then move it back to the pickup folder.
    That way, the same file will be picked up the next time polling happens.
    It is considered good etiquette to reward answerers with points (as "helpful" - 5 pts - or "correct" - 10pts).
    Thanks,
    Vijay

  • HTTP Protocol caused Duplicate Records

    Hello,
    Our company has just developed a J2ME program for sales personnel to place orders from a mobile phone. The program was developed using NetBeans 6. The connection to the server uses the HTTP protocol via GPRS or 3G.
    There is a problem we face: sometimes there are duplicate entries from a single mobile request. For example, a salesperson places an order and presses the send button. The server's incoming log sometimes shows multiple requests from this one request.
    public void send() throws IOException {
        //mainS = null;
        Days = "";
        if (mainS == null) {
            getAlert1().setString("Invalid Input lala");
            throw new IOException("Invalid Input lala");
        }
        String onum = getonum();
        StringBuffer b = new StringBuffer();
        String url = server + "?act=i&unm=" + uname + "&upw=" + upw
                   + "&icode=" + Icode + "&onum=" + onum
                   + "&dd=" + getDates() + "&num=" + mainS;
        System.out.println(url);
        try {
            //Connect to the server
            HttpConnection hc = (HttpConnection) Connector.open(url);
            //Authentication
            System.out.println(hc.getResponseCode());
            if (hc.getResponseCode() == HttpConnection.HTTP_OK) {
                InputStream is = hc.openDataInputStream();
                int ch;
                while ((ch = is.read()) != -1) {
                    b.append((char) ch);
                }
                System.out.println(b.toString().trim());
                returnS = b.toString().trim();
                is.close();
            } else {
                getAlert1().setString("No respon from server.");
                hc.close();
                throw new IOException("No respon from server.");
            }
            hc.close();
        } catch (Exception ex) {
            getAlert1().setString(ex.getMessage());
            throw new IOException(ex.getMessage());
        }
    }
    Please help.

    yhkok wrote:
    Hi,
    Thank you for your reply. I don't think the problem is the user sending multiple times, because when a user presses "SEND" to submit the request, they can't press anything while waiting for confirmation from the server. Perhaps the second issue causes the problem. Can you tell me in more detail how I can solve this, or how I can check whether the program sends multiple times? This happens especially during heavy transactions, or when a lot of users send in together in a short period. Could it be caused by the user end timing out while waiting for the server to reply, and sending the form again? We are using Nokia 2630 mobile phones loaded with the S40 5th edition.
    Regards,
    Kok

    I carefully said the user OR the user's browser (whatever that may be) is re-submitting the form.
    Who cares why? It doesn't matter why. It (or someone) just is.

  • Remove duplicate records in Live Office, caused by CR Groups

    hello all
    I have a CR with groups. All works well until I use the report in Live Office, where it duplicates the group data for each of the detail records.
    I have removed the details from the CR report, leaving only the group data, but it still happens.
    Anyone have a workaround?
    thanks
    g

    Hi,
    First, select the report name from the left panel and check whether the option appears; or try right-clicking on any report cell, then go to Live Office and Object Properties.
    Second, are you getting duplicate records in this particular report or in all reports? And how many highlight experts are you using in this report?
    Thanks,
    Amit

  • Duplicate records problem

    Hi everyone,
    I'm having a little difficulty resolving a problem with a repeating field causing duplication of data in a report I'm working on, and was hoping someone on here could suggest something to help!
    My report is designed to detail library issues during a particular period, categorised by the language of the item issued. My problem is that on the SQL database that our library management system uses, it is possible for an item to have more than one language listed against it (some books will be in more than one language). When I list the loan records excluding the language data field, I get a list of distinct loan records. Bringing the language data into the report causes the loan record to repeat for each language associated with it, so if a book is in both English and French, the loan record will appear like this:
    LOAN RECORD NO.     LANGUAGE CODE
      123456                             ENG
      123456                             FRE
    So, although the loan only occurred once I have two instances of it in my report.
    I am only interested in the language that appears first and I can exclude duplicated records from the report page. I can also count only the distinct records to get an accurate overall total. My problem is that when I group the loan records by language code (I really need to do this as there are millions of loan records held in the database) the distinct count stops being a solution, as when placed at this group level it only excludes duplicates in the respective group level it's placed in. So my report would display something like this:
    ENG     1
    FRE      1
    A distinct count of the whole report would give the correct total of 1, but a cumulative total of the figures calculated at the language code group level would total 2, and be incorrect. I've encountered similar results when using Running Totals evaluating on a formula that excludes repeated loan record no.s from the count, but again when I group on the language code this goes out of the window.
    I need to find a way of grouping the loan records by language with a total count of loan records alongside each grouping that accurately reflects how many loans of that language took place.
    Is this possible using a calculation formula when there are repeating fields, or do I need to find a way of merging the repeating language fields into one field so that the report would appear like:
    LOAN RECORD     LANGUAGE CODE
      123456                      ENG, FRE
    Any suggestions would be greatly appreciated, as aside from this repeating language data there are quite a few other repeating database fields on the system that it would be nice to report on!
    Thanks!

    If you create a group by loan,
    then create a group by language,
    and place the values in the group (loan id in the loan header),
    you should only see the loan id once.
    Place the language in the language group and you should only see it one time.
    A group header returns the first value of a unique id.
    Then, in order to calculate while avoiding the duplicates,
    use manual running totals.
    Create a set for each summary you want; make sure each set has a different variable name.
    MANUAL RUNNING TOTALS
    RESET
    The reset formula is placed in a group header (or report header) to reset the summary to zero for each unique record it groups by.
    whileprintingrecords;
    Numbervar  X := 0;
    CALCULATION
    The calculation is placed adjacent to the field or formula that is being calculated.
    (If there are duplicate values, create a group on the field that is being calculated on. If there are no duplicate records, the detail section is used.)
    whileprintingrecords;
    Numbervar X := X + 1; // or the field/formula being calculated
    DISPLAY
    The display is the sum of what is being calculated. This is placed in a group, page or report footer. (generally placed in the group footer of the group header where the reset is placed.)
    whileprintingrecords;
    Numbervar  X;
    X

  • Duplicate records in Fact Tables

    Hi,
    We are using BPC 7.0 MS SP7. BPC created duplicate records in the WB and FAC2 tables. We faced a similar issue before, and the solution was to reboot the server and clean up the additional data created. I don't think it is an issue with the script logic files we have. We had the issue across all applications. Data is fine now after the server reboot and running the same logic files. I want to know if anyone has faced this issue, and whether there is any solution other than a reboot. I appreciate your help.
    Thanks
    Raj

    Hi Sorin,
    I know this thread is rather old, but I have a problem which is closely related to it, and I would appreciate your assistance. I have a client running on 7.0 MS who has been using it for the past 3 years.
    It is a heavily customized system, with many batch files running daily to update dimensions, copy data, and sort. And yes, we do use custom packages that incorporate stored procedures.
    Recently, with no change in the environment, our FACTWB table ballooned out of nowhere. The fact table contains less than 1 GB of data, but FACTWB has 200 GB, which has practically paralyzed the system. There is also an equivalent 300 GB increase in the log files.
    We are not able to find out what caused this, or whether the 200 GB of records in WB are even valid records that are duplicated. Is there a way to troubleshoot this?

  • Duplicate Records & CFTRANSACTION

    Maybe I'm missing the obvious on this, as I've never had a
    problem with this before, but recently I developed a custom tag
    that logs and blocks the IP addresses of all these recent DECLARE
    SQL injection attempts.
    The issue is however that my blocked IP address seems to be
    getting duplicates here and there. My "datetime" field in the
    database show the duplicates are all added to the database the
    exact same second. What gives?
    Shouldn't CFTRANSACTION be preventing such a thing, even if
    multiple injection attempts come at the same time?

    I've always coded my applications so that my primary key is my
    database's autonumber field, and instead ensure my coding is solid
    enough to prevent duplicate records from appearing. Oddly enough
    it's worked flawlessly until now.
    Would ColdFusion not throw errors if I made the "ip_address"
    field my primary key and my code allowed a duplicate record to
    be entered? Am I interpreting the CFTRANSACTION code to do
    something it doesn't do?
    Also, the duplicates aren't causing problems, so a DISTINCT
    select isn't necessary. The IP address is blocked whether one
    record or fifty exist in my blocked_ip_addresses table. My goal is
    just not to waste database space.
    Any further help you can provide is MUCH appreciated!
    Thanks!
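    For what it's worth, CFTRANSACTION only groups the queries within one request into a single database transaction; it does not serialize two separate requests that arrive in the same second. A unique index on the IP column is what actually blocks the second insert. A sketch, with table and column names guessed from the post:

```sql
-- Sketch: once this index exists, a second insert of the same IP fails
-- with a duplicate-key error the application can catch and ignore.
CREATE UNIQUE INDEX blocked_ip_uk
    ON blocked_ip_addresses (ip_address);

-- Alternatively, insert only when the address is not already present
-- (on Oracle, add FROM DUAL to the outer SELECT):
INSERT INTO blocked_ip_addresses (ip_address, date_added)
SELECT :ip, CURRENT_TIMESTAMP
 WHERE NOT EXISTS (SELECT 1
                     FROM blocked_ip_addresses
                    WHERE ip_address = :ip);
```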

  • Duplicate Records in Details for ECC data source. Help.

    Hello. First post on SDN. I have been searching prior posts, but have come up empty. I am in the middle of creating a report linking directly into 4 tables in ECC 6.0. I am having trouble either getting the table links set up correctly, or filtering out the duplicate record sets being reported in the details section of my report. It appears that I have 119 records being displayed, when the parameter values should only yield 7. The details section is repeating the 7 records 17 times (there are 17 matching records for the parameter choices in one of the other tables, which I think is the cause).
    I think this is due to the other table links for my parameter values. But, I need to keep the links the way they are for other aspects of the report (header information). The tables in question are using an Inner Join, Enforced Both, =. I tried the other link options, with no luck.
    I am unable to use the "Select Distinct Records" option in the Database menu, since this is not supported when connecting to ECC.
    Any ideas would be greatly appreciated.
    Thanks,
    Barret
    PS. I come from more of a Functional background, so development is sort of new to me. Take it easy on the newbie.

    If you can't establish links to bring back unique data, then use a group to display the data.
    Group the report by a field which is the lowest common denominator.
    Move all fields into the group footer and suppress the group header and details.
    You will not be able to use normal summaries, as they will count/sum all the duplicated data; use Running Totals instead, and select "evaluate on change of" the introduced group.
    Ian

  • Duplicate Records in DTP, but not in PSA

    Hi,
    I'm facing strange behavior of the DTP while trying to load master data: it detects duplicate records where there are none.
    For example:
    ID 'cours000000000001000'
    In the source system: 1 record
    In the PSA: 1 record
    In the DTP Temporary Storage, 2 identical lines are identified.
    In fact, in this Temporary Storage, all the PSA records are duplicated... but only 101 are displayed as erroneous in the DTP...
    Here is my question: How to get rid of this duplication in the temporary storage?
    Thanks for your help
    Sylvain

    Semantic key selection could cause the duplicate issue in master data: if similar values in the keys are found, they will be treated as duplicates.
    In the second tab of the DTP you have the "handle duplicate records" option; choose that and load.
    Ramesh

  • Fact Table allows duplicate records

    Is Fact Table allows duplicate records?

    What do you mean by duplicate records? It could be that what appears duplicate to you has some other technical key info that differs in the fact table.
    At the technical level, there wouldn't be duplicate records in a fact table [in SAP's BW/BI there are two fact tables for each cube - which can itself cause some confusion]

  • Oracle 10 -   Avoiding Duplicate Records During Import Process

    I have two databases on different servers(DB1&DB2) and a dblink connecting the two. In DB2, I have 100 million records in table 'DB2Target'.
    I tried to load 100 million more records from tables DB1Source to DB2Target on top of existing 400 million records but after an hour I got a network error from DB2.
    The load failed after inserting 70% of the records. Now I have three tasks. First, I have to find the duplicate records between DB1 and DB2. Second, I have to find the remaining 30% of records missing from DB2Target.
    Third I have to re-load the remaining 30% records. What is the best solution?
    SELECT COUNT(*), A, B FROM DB2TARGET
    GROUP BY A, B
    HAVING COUNT(*) > 1
    re-loading
    MERGE INTO DB2TARGET tgt
    USING DB1SOURCE src
    ON (tgt.A = src.A)
    WHEN NOT MATCHED THEN
    INSERT (tgt.A, tgt.B)
    VALUES (src.A, src.B)
    Thanks for any guidance.

    When I execute this, I get the following error message:
    SQL Error: ORA-02064: distributed operation not supported
    02064. 00000 - "distributed operation not supported"
    *Cause:    One of the following unsupported operations was attempted
    1. array execute of a remote update with a subquery that references
    a dblink, or
    2. an update of a long column with bind variable and an update of
    a second column with a subquery that both references a dblink
    and a bind variable, or
    3. a commit is issued in a coordinated session from an RPC procedure
    call with OUT parameters or function call.
    *Action:   simplify remote update statement
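    On the second task (finding the 30% of rows missing from DB2Target), a set difference over the database link is often simpler than the grouped count, and running it as a plain query avoids the distributed-update restrictions behind ORA-02064. A sketch assuming the same A/B columns as the posted MERGE; db1_link is a placeholder for the real database link name:

```sql
-- Rows present in the source but not yet loaded into the target,
-- run from DB2 with a database link back to DB1.
SELECT a, b
  FROM db1source@db1_link
MINUS
SELECT a, b
  FROM db2target;
```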

  • Duplicate record found short dump, if routed through PSA

    Hi Experts,
    I am getting these errors extracting the master data for 0ACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/XACTIVITY
    1 duplicate record found. 1991 recordings used in table /BI0/PACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/XACTIVITY
    2 duplicate record found. 2100 recordings used in table /BI0/PACTIVITY.
    If I extract via PSA with the option to ingnore duplicate records, I am getting a short dump
    ShrtText                                            
        An SQL error occurred when accessing a table.   
    How to correct the error                                                             
         Database error text........: "ORA-14400: inserted partition key does not map to  
          any partition"                                                                  
    What is  causing errors in my extraction ?
    thanks
    D Bret

    Go to RSRV .Go to All Elementary Tests -->PSA Tables -->Consistency Between PSA Partitions and SAP Administration Information## ..and correct the error...

  • How to handel duplicate record by bcp command

    Hi All,
    I'm using BCP to import ASCII text data into a table that already has many records. BCP failed because of a 'Duplicate primary key' error.
    Now, is there any way using BCP to know precisely which record's primary key caused that 'violation of inserting duplicate key'?
    I already used the -O option to output errors to an error.log, but it doesn't help much, because that error log contains the same error message mentioned above without telling me exactly which record, so that I could pull that duplicate record out of my import data file.
    TIA, and have a great day.

    The only way of figuring out what PKs conflicted I know of is to load the data to a different table and then run an INNER JOIN select between the two.
    BCP.exe is not part of SSIS technically, don't know why you are posting in this section of the forum.
    Arthur My Blog
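    The staging-table approach can be sketched like this (staging_table, target_table, and pk_col are hypothetical names; bcp loads the file into the empty staging table first):

```sql
-- Which incoming rows conflict with an existing primary key?
SELECT s.*
  FROM staging_table s
       INNER JOIN target_table t
          ON t.pk_col = s.pk_col;

-- Insert only the non-conflicting rows:
INSERT INTO target_table
SELECT s.*
  FROM staging_table s
 WHERE NOT EXISTS (SELECT 1
                     FROM target_table t
                    WHERE t.pk_col = s.pk_col);
```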

  • Error in Duplicate record validation in EntityObject using unique key contraint

    I have implemented unique key validation in an Entity Object by creating an alternate key in the Entity Object.
    The problem is that whenever a duplicate record is found, the duplicate-record error is shown, but the page becomes blank, and no error is shown in the log.
    I wanted to know what may be the possible cause of it.
    I am using Jdev 11.1.2.4.

    After duplication, clear the PK item, then populate from the sequence in a PRE-INSERT block-level trigger.
    Francois
