Duplicate records getting inserted instead of updated.
I have an entity which has multiple children, all with a one-to-many relation to the parent. When I update the number of children and merge using EntityManager, it updates the parent but inserts all the children records again, which creates duplicates. What could the possible error be?
Where did you read the parent object and children from originally (was it serialized?), and how are you merging it? How have you defined your id in your child objects? What version of TopLink are you using?
Perhaps include some sample code. It most likely has something to do with how you are merging the objects; if you just read the parent and access the children, are they inserted?
-- James : http://www.eclipselink.org
Similar Messages
-
Avoiding duplicate records while inserting into the table
Hi,
I tried the following INSERT statement, where I want to avoid duplicate records during the insert itself, but it gives me an error like "invalid identifier", though the column exists in the table.
Please let me know where I'm making the mistake.
INSERT INTO t_map tm(sn_id,o_id,txt,typ,sn_time)
SELECT 100,
sk.obj_id,
sk.key_txt,
sk.obj_typ,
sysdate,
FROM S_KEY sk
WHERE sk.obj_typ = 'AY'
AND SYSDATE BETWEEN sk.start_date AND sk.end_date
AND sk.obj_id IN (100170,1001054)
and not exists (select 1
FROM t_map tm1 where tm1.O_ID=tm.o_id
and tm1.sn_id=tm.sn_id
and tm1.txt=tm.txt
and tm1.typ=tm.typ
and tm1.sn_time=tm.sn_time )
Then you have to join the table with alias tm1; where is that? Do you want it like this?
INSERT INTO t_map tm(sn_id,o_id,txt,typ,sn_time)
SELECT 100,
sk.obj_id,
sk.key_txt,
sk.obj_typ,
sysdate
FROM S_KEY sk
WHERE sk.obj_typ = 'AY'
AND SYSDATE BETWEEN sk.start_date AND sk.end_date
AND sk.obj_id IN (100170,1001054)
and not exists (select 1
FROM t_map tm where sk.obj_ID=tm.o_id
and 100=tm.sn_id
and sk.key_txt=tm.txt
and sk.obj_typ=tm.typ
and sysdate=tm.sn_time ) -
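For readers outside Oracle, the corrected pattern above (the NOT EXISTS subquery correlates on the source alias sk, so every referenced column is in scope, and the insert becomes safely re-runnable) can be sketched against any SQL engine. A minimal, hypothetical demo in Python's sqlite3, with simplified columns and made-up key text:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE s_key (obj_id INTEGER, key_txt TEXT, obj_typ TEXT);
    CREATE TABLE t_map (sn_id INTEGER, o_id INTEGER, txt TEXT, typ TEXT);
    INSERT INTO s_key VALUES (100170, 'k1', 'AY'), (1001054, 'k2', 'AY');
""")

def load():
    # NOT EXISTS is correlated against the SELECT's source alias (sk),
    # not the INSERT target, so every column reference resolves.
    conn.execute("""
        INSERT INTO t_map (sn_id, o_id, txt, typ)
        SELECT 100, sk.obj_id, sk.key_txt, sk.obj_typ
        FROM s_key sk
        WHERE sk.obj_typ = 'AY'
          AND NOT EXISTS (SELECT 1 FROM t_map tm
                          WHERE tm.o_id = sk.obj_id
                            AND tm.sn_id = 100
                            AND tm.txt = sk.key_txt
                            AND tm.typ = sk.obj_typ)
    """)

load()
load()  # running it a second time inserts nothing new
print(conn.execute("SELECT COUNT(*) FROM t_map").fetchone()[0])  # 2
```

Running the load twice leaves exactly one copy of each source row, which is the whole point of the NOT EXISTS guard.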
Child Records getting inserted before Parent
Hi All,
I have an entity object (Vendors) that has a defined relationship to payments (12 months). When I create a new vendor record I create the 12 child records, and that all works fine, but when I try to issue a commit, the child records get inserted first, creating a constraint error. If I turn the constraint off and use "jbo.debugoutput=console", I can see the 12 months being inserted followed by the vendor record. The payment records' vendor_id is being set to the correct parent, but the insert happens in the wrong order. Does anybody know how to control the order of the inserts? There is an Association set between the Vendor entity and the Payment entity. If I move the code to a view object it seems to work fine, but in this case we have multiple view objects on these entity objects and I'm trying to avoid writing the same code in every view object.
Here's the code for the Create on the Vendors EntityImpl
protected void create(AttributeList attributeList) {
    super.create(attributeList);
    // Set the default values and the sequence number for this new record
    SequenceImpl seqNoImpl = new SequenceImpl("VENDORS_SEQ", getDBTransaction());
    this.setVendorId(seqNoImpl.getSequenceNumber());
    Number zeroAmt = new Number(0);
    RowIterator paymentsIter = this.getVendorPayments();
    // When creating a new Vendor, create the 12 months of payment records
    for (int i = 1; i < 13; i++) {
        Row newRow = paymentsIter.createRow();
        newRow.setAttribute("Month", i);
        newRow.setAttribute("Amount", zeroAmt);
        paymentsIter.insertRow(newRow);
    }
}
Does anybody have any ideas?
Pieter,
Thanks for the information!!
Using it I was able to get the process working. There were some interesting side effects. If I just turned on the "Composition Association", then no payment records got inserted into the database even though they were in the entity and view objects. I took the second recommendation of adding code to the PostChanges event and was able to get the parent and children inserted in the correct order.
Thanks again for the idea and information -
Two records getting inserted with the same timestamp...
hi all,
I am trying to submit a form. Whenever I click submit, before I insert the data sent in that form I make a check (a SELECT statement to see if that record can be inserted; a few business validations), and if the check is successful I then proceed with the insert into a table whose primary key is a running Oracle sequence.
But if I click the Submit button twice in close succession, maybe 1 in 1000 attempts, I am able to submit two records with the same timestamp for the date of insertion. We are using Oracle 8 with WebLogic 5.1, and I don't think Oracle's date precision goes beyond seconds.
So, any suggestion on the best way to handle this? One option is to place the same business validation check just before the ending brace of the method; another is to submit the form through JavaScript and not submit it twice even if the user clicks the submit button twice. Any suggestions you can give are welcome.
Thanks in advance
sajan
Is the pkey a timestamp or an Oracle sequence? The latter will always work,
since no two requests to a sequence can get the same value (rollover
excluded). If you must use a timestamp, then you must auto-retry the insert
if the first attempt fails. Oracle does have higher precision than seconds,
but I can't remember the exact precision; I am pretty sure it works out to
at least two or three digits though.
Peace,
Cameron Purdy
Tangosol, Inc.
http://www.tangosol.com
Tangosol Server: Enabling enterprise application customization
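Cameron's point above is easy to demonstrate: a sequence-style key can never collide, while a seconds-precision timestamp can. A small sketch in Python's sqlite3, with an AUTOINCREMENT column standing in for the Oracle sequence (table and column names are invented):

```python
import sqlite3
import datetime

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE submission (
    id INTEGER PRIMARY KEY AUTOINCREMENT,  -- plays the role of the Oracle sequence
    inserted_at TEXT)""")

def submit():
    # seconds precision, like the DATE column in the original post
    ts = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    conn.execute("INSERT INTO submission (inserted_at) VALUES (?)", (ts,))

submit()
submit()  # a double-click: almost certainly the same second, same timestamp

rows = conn.execute("SELECT id, inserted_at FROM submission").fetchall()
ids = [r[0] for r in rows]
stamps = [r[1] for r in rows]
assert len(set(ids)) == 2  # the sequence-style key never collides
# the two timestamps may well be identical, which is why a timestamp
# makes a poor primary key for rapid duplicate submits
```

The sequence key stays unique no matter how fast the two inserts arrive; only the timestamp column can collide.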
"Sajan Parihar" <[email protected]> wrote in message
news:[email protected]...
-
Dear All,
I have oracle 10g R2 On windows.
I have table structure like below...
ASSIGNED_TO
USER_ZONE
CREATED
MASTER_FOLIO_NUMBER
NAME
A_B_BROKER_CODE
INTERACTION_ID
INTERACTION_CREATED
INTERACTION_STATE
USER_TEAM_BRANCH
A4_IN_CALL_TYPE
A5_IN_CALL_SUBTYPE
DNT_AGING_IN_DAYS
DNT_PENDING_WITH
DNT_ESCALATION_STAGE_2
DT_UPDATE
I use SQL*Loader to load the data from a .csv file into the Oracle table, and I assign the value sysdate to dt_update. Every time I execute the SQL*Loader control file, dt_update is set to sysdate.
Sometimes a problem occurs while inserting data through SQL*Loader and only half the rows get inserted. After solving the problem I execute SQL*Loader again, and hence these duplicate records get inserted.
Now I want to remove all the duplicate records whose dt_update is the same.
Please help me solve the problem.
Regards,
Chanchal Wankhade.
Galbarad wrote:
Hi
I think you have two options:
first, if this is the first import into your table, you can delete all records from the table and run the import one more time;
second, you can delete only the duplicate records and not rerun the import.
try this script
<pre>
delete from YOUR_TABLE
where rowid in (select min(rowid)
from YOUR_TABLE
group by ASSIGNED_TO,
USER_ZONE,
CREATED,
MASTER_FOLIO_NUMBER,
NAME,
A_B_BROKER_CODE,
INTERACTION_ID,
INTERACTION_CREATED,
INTERACTION_STATE,
USER_TEAM_BRANCH,
A4_IN_CALL_TYPE,
A5_IN_CALL_SUBTYPE,
DNT_AGING_IN_DAYS,
DNT_PENDING_WITH,
DNT_ESCALATION_STAGE_2,
DT_UPDATE)
</pre>
Have you ever tried that script for deleting duplicates? I think not. If you did you'd find it deleted non-duplicates too. You'd also find that it only deletes the first duplicate where there are duplicates.
XXXX> CREATE TABLE dt_test_dup
2 AS
3 SELECT
4 mod(rownum,3) id
5 FROM
6 dual
7 CONNECT BY
8 level <= 9
9 UNION ALL
10 SELECT
11 rownum + 3 id
12 FROM
13 dual
14 CONNECT BY
15 level <= 3
16 /
Table created.
Elapsed: 00:00:00.10
XXXX> select * from dt_test_dup;
ID
1
2
0
1
2
0
1
2
0
4
5
6
12 rows selected.
Elapsed: 00:00:00.18
XXXX> delete
2 from
3 dt_test_dup
4 where
5 rowid IN ( SELECT
6 MIN(rowid)
7 FROM
8 dt_test_dup
9 GROUP BY
10 id
11 )
12 /
6 rows deleted.
Elapsed: 00:00:00.51
XXXX> select * from dt_test_dup;
ID
1
2
0
1
2
0
6 rows selected.
Elapsed: 00:00:00.00 -
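As the session above shows, deleting rowid IN (SELECT MIN(rowid) ...) removes exactly the wrong rows. Inverting the predicate to NOT IN keeps one row per key and deletes only the surplus. A quick sketch of the corrected pattern on the same test data, using sqlite's implicit rowid from Python rather than Oracle:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
# same data as the demo above: 1, 2, 0 repeated three times, plus unique 4, 5, 6
conn.executemany("INSERT INTO t VALUES (?)",
                 [(1,), (2,), (0,)] * 3 + [(4,), (5,), (6,)])

# Keep the first physical row of each group, delete the rest.
# NOT IN (instead of IN) is what turns this into a duplicate-removal script.
conn.execute("""
    DELETE FROM t
    WHERE rowid NOT IN (SELECT MIN(rowid) FROM t GROUP BY id)
""")

remaining = sorted(r[0] for r in conn.execute("SELECT id FROM t"))
print(remaining)  # [0, 1, 2, 4, 5, 6]
```

Six duplicate rows are deleted and one copy of every value, including the never-duplicated 4, 5 and 6, survives.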
Hi Everyone,
A Very Very Happy, Fun-filled, Awesome New Year to You All.
Now coming to the discussion of my problem in Oracle Forms 6i:
I have created a form in which the data is entered & saved in the database.
CREATE TABLE MATURED_FD_DTL
(
ACCT_FD_NO VARCHAR2(17 BYTE) NOT NULL,
CUST_CODE NUMBER(9),
FD_AMT NUMBER(15),
FD_INT_BAL NUMBER(15),
TDS NUMBER(15),
CHQ_NO NUMBER(10),
CREATED_DATE DATE,
CREATED_BY VARCHAR2(15 BYTE),
PREV_YR_TDS NUMBER(15),
ADD_FD_AMT NUMBER(15),
DESCRIPTION VARCHAR2(100 BYTE),
P_SAP_CODE NUMBER(10),
P_TYPE VARCHAR2(1 BYTE)
);
The form looks like below:
ENTER_QUERY EXECUTE_QUERY SAVE CLEAR EXIT
ACCT_FD_NO
CUST_CODE
FD_AMT
FD_INT_BAL
PREV_YR_TDS
TDS
ADD_FD_AMT
P_SAP_CODE
P_TYPE
CHQ_NO
DESCRIPTION
R
W
P
List Item
There are 5 push buttons namely ENTER_QUERY, EXECUTE_QUERY, SAVE, CLEAR, EXIT.
The table above is same as in the form. All the fields are text_item, except the P_TYPE which is a List_Item ( Elements in List Item are R, W & P).
The user will enter the data & save it.
So all this will get updated in the table MATURED_FD_DTL .
I am updating one column in another table named KEC_FDACCT_MSTR, and I want these details to be inserted into another table named KEC_FDACCT_DTL only if P_TYPE='P'.
CREATE TABLE KEC_FDACCT_DTL
(
FD_SR_NO NUMBER(8) NOT NULL,
FD_DTL_SL_NO NUMBER(5),
ACCT_FD_NO VARCHAR2(17 BYTE) NOT NULL,
FD_AMT NUMBER(15,2),
INT_RATE NUMBER(15,2),
SAP_GLCODE NUMBER(10),
CATOGY_NAME VARCHAR2(30 BYTE),
PROCESS_YR_MON NUMBER(6),
INT_AMT NUMBER(16,2),
QUTERLY_FD_AMT NUMBER(16,2),
ITAX NUMBER(9,2),
MATURITY_DT DATE,
FDR_STAUS VARCHAR2(2 BYTE),
PAY_ACC_CODE VARCHAR2(85 BYTE),
BANK_CODE VARCHAR2(150 BYTE),
NET_AMOUNT_PAYABLE NUMBER,
QUATERLY_PAY_DT DATE,
CHEQUE_ON VARCHAR2(150 BYTE),
CHEQUE_NUMBER VARCHAR2(10 BYTE),
CHEQUE_DATE DATE,
MICR_NUMBER VARCHAR2(10 BYTE),
PAY_TYPE VARCHAR2(3 BYTE),
ADD_INT_AMT NUMBER(16,2),
ADD_QUTERLY_FD_AMT NUMBER(16,2),
ADD_ITAX NUMBER(16,2),
ECS_ADD_INT_AMT NUMBER(16),
ECS_ADD_QUTERLY_FD_AMT NUMBER(16),
ECS_ADD_ITAX NUMBER(16)
);
So for the push button 'Save', I have put the following code in the WHEN-BUTTON-PRESSED trigger:
BEGIN
Commit_form;
UPDATE KEC_FDACCT_MSTR SET PAY_STATUS='P' WHERE ACCT_FD_NO IN (SELECT ACCT_FD_NO FROM MATURED_FD_DTL);
UPDATE MATURED_FD_DTL SET CREATED_DATE=sysdate, CREATED_BY = :GLOBAL.USER_ID WHERE ACCT_FD_NO = :acct_fd_NO;
IF :P_TYPE='P' THEN
INSERT INTO KEC_FDACCT_DTL
SELECT FD_SR_NO, NULL, MATURED_FD_DTL.ACCT_FD_NO, FD_AMT, INT_RATE, P_SAP_CODE,
GROUP_TYPE, (TO_CHAR(SYSDATE, 'YYYYMM'))PROCESS_YR_MON,
FD_INT_BAL, (FD_INT_BAL-MATURED_FD_DTL.TDS)QUTERLY_FD_AMT , MATURED_FD_DTL.TDS,
MATURITY_DATE, P_TYPE, NULL, NULL, (FD_INT_BAL-MATURED_FD_DTL.TDS)NET_AMOUNT_PAYABLE,
NULL, NULL, CHQ_NO, SYSDATE, NULL, 'CHQ', NULL, NULL, NULL, NULL, NULL, NULL
FROM MATURED_FD_DTL, KEC_FDACCT_MSTR
WHERE KEC_FDACCT_MSTR.ACCT_FD_NO=MATURED_FD_DTL.ACCT_FD_NO;
END IF;
COMMIT;
MESSAGE('RECORD HAS BEEN UPDATED AS PAID');
MESSAGE(' ',no_acknowledge);
END;
If P_TYPE='P' , then the data must get saved in KEC_FDACCT_DTL table.
The problem is this:
If I enter the details with all the records as 'P', the records get inserted into the table KEC_FDACCT_DTL.
If I enter the details with records of 'P' and 'R', then nothing gets inserted into the table KEC_FDACCT_DTL; even the records with 'P' are not inserted.
I want the records with 'P' to be inserted into table KEC_FDACCT_DTL even when records of all P_TYPE values (R, W and P) are entered together.
So, can you please help me with this?
Thank you.
Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
Oracle Forms Builder 6i.
It's not working properly.
At Form_level_Trigger: POST_INSERT, I have put in the following code.
IF :P_TYPE='P' THEN
INSERT INTO KEC_FDACCT_DTL
SELECT FD_SR_NO, NULL, MATURED_FD_DTL.ACCT_FD_NO, FD_AMT, INT_RATE, P_SAP_CODE,
GROUP_TYPE, (TO_CHAR(SYSDATE, 'YYYYMM'))PROCESS_YR_MON,
FD_INT_BAL, (FD_INT_BAL-MATURED_FD_DTL.TDS)QUTERLY_FD_AMT , MATURED_FD_DTL.TDS,
MATURITY_DATE, P_TYPE, NULL, NULL, (FD_INT_BAL-MATURED_FD_DTL.TDS)NET_AMOUNT_PAYABLE,
NULL, NULL, CHQ_NO, SYSDATE, NULL, 'CHQ', NULL, NULL, NULL, NULL, NULL, NULL
FROM MATURED_FD_DTL, KEC_FDACCT_MSTR
WHERE KEC_FDACCT_MSTR.ACCT_FD_NO=MATURED_FD_DTL.ACCT_FD_NO;
END IF;
MESSAGE('RECORD HAS BEEN UPDATED AS PAID');
MESSAGE(' ',no_acknowledge);
It worked properly the first time I executed it, but the second time, duplicate values were stored in the database.
Example: First I entered the following in the form and saved it.
ACCT_FD_NO  CUST_CODE  FD_AMT  FD_INT_BAL  PREV_YR_TDS  TDS   ADD_FD_AMT  P_SAP_CODE  P_TYPE  CHQ_NO  DESCRIPTION
250398      52         50000   6000        0            600   0           45415       P       5678    int1
320107      56         100000  22478       3456         2247  0           45215       R       456
320108      87         50000   6500        0            650   0           21545       W       0
In the database, in table KEC_FDACCT_DTL, the ACCT_FD_NO 250398 record with P_TYPE='P' was inserted.
ACCT_FD_NO  P_TYPE
250398      P
But the second time, when I entered the following in the form and saved:
ACCT_FD_NO  CUST_CODE  FD_AMT  FD_INT_BAL  PREV_YR_TDS  TDS   ADD_FD_AMT  P_SAP_CODE  P_TYPE  CHQ_NO  DESCRIPTION
260189      82         50000   6000        0            600   0           45415       P       5678    interest567
120011      46         200000  44478       0            4447  0           45215       R       456
30191       86         50000   6500        0            650   0           21545       W       56
In the database, in the table KEC_FDACCT_DTL, the following rows were inserted:
ACCT_FD_NO  P_TYPE
250398      P
250398      P
260189      P
320107      R
320108      W
There was a duplicate of 250398, which I didn't enter in the form the second time.
All the other P_TYPE rows were also inserted, but I want only the rows with P_TYPE='P' to be inserted into the database, and duplicate rows must not be entered.
How do I do this? -
Data Loader inserting duplicate records
Hi,
There is an import that we need to run every day in order to load data from another system into CRM On Demand. I have set up a Data Loader script which is scheduled to run every morning. The script should perform the insert operation.
Every morning a file with new insert data is available in the same location (generated by someone else) with the same name. The Data Loader script must insert all the records in it.
One morning, there was a problem in the other job and a new file was not produced. When the Data Loader script ran, it found the old file and re-inserted the records (there were 3 in the file). I had specified the -duplicatecheckoption parameter as the external id, since the records come from another system, but I came to know that the option works only for update operations.
How can a situation like this be handled in future? The external id should be checked for duplicates before the insert operation is performed. If we can't check on the Data Loader side, is it possible to somehow mark the field as 'unique' in the UI so that there is an error if a duplicate record is inserted? Please suggest.
Regards,
Hi
You can use something like this:
cursor crs is select distinct deptno, dname, loc from dept;
Now you can insert all the records present in this cursor.
Assumption: You do not have duplicate entry in the dept table initially.
Cheers
Sudhir -
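For the Data Loader question above, the most robust guard is a uniqueness constraint on the external id at the database level, so re-running a stale file is skipped rather than duplicated. A hedged sketch of the idea in Python's sqlite3, with invented table names and id values:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A UNIQUE constraint on the external id makes the load idempotent.
conn.execute("""CREATE TABLE account (
    external_id TEXT UNIQUE,
    name TEXT)""")

def load_file(rows):
    # OR IGNORE skips rows whose external_id already exists,
    # so re-processing yesterday's file inserts nothing new.
    conn.executemany(
        "INSERT OR IGNORE INTO account (external_id, name) VALUES (?, ?)", rows)

batch = [("EXT-1", "Acme"), ("EXT-2", "Globex"), ("EXT-3", "Initech")]
load_file(batch)
load_file(batch)  # the stale file is run again by mistake
print(conn.execute("SELECT COUNT(*) FROM account").fetchone()[0])  # 3
```

The second run is a no-op instead of a batch of duplicates; with a plain INSERT and no constraint, the table would end up with 6 rows.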
ADF: controlling the order in which VO records are inserted
All,
in an ADF BC project, we have some complex master-detail insert pages that add records to master and detail tables at the same time. The application sits on top of CDM RuleFrame, which exposes the data model as DB views with instead-of triggers and adds (via Designer) a few nifty things like auto-populated PK fields. Sadly, we can't use these auto-generated PKs, as the 'refresh after insert' settings in an EO work with a RETURNING clause, which is not supported for instead-of triggers. This means we have to fetch the PK sequences ourselves from the EOs and populate these fields ourselves before committing to the database. So far so good, but things go wrong when we want to insert master and detail at the same time. The PK and FK of the child record are already filled in at insert time (which is nice), but for some reason the child records get inserted before the parents, resulting in an error on the FK as the parent record does not yet exist (which is not nice).
So, here's the question: is there a way to control the order in which new VO records are inserted? On what is the default order based (maybe a little renaming could help us get the alphabet our way)?
We know defining the FKs as 'initially deferred' will make sure they are only checked at commit time, but our DBA prefers not to use that solution unless strictly necessary. (Is there anything bad about deferred keys why we should believe him? ;o) )
And additionally, is there a way around all this manual PK fetching? We've tried using the refresh(int refreshMode) method of EntityImpl, but it didn't seem to work.
Many thanks in advance for your tips, suggestions and solutions!
Best regards,
Benjamin De Boe
Benjamin,
To handle the problem with the child/parent records being inserted in the wrong order - have a read of the Oracle ADF Developer's Guide for Forms/4GL developers, section 26.7. I use that technique with great success.
John -
Duplicate records in input structure of model node
Hi,
Following is the way, I am assigning data to a model node:
//Clearing the model input node
for (int i = wdContext.nodeInsppointdata().size(); i > 0; i--)
wdContext.nodeInsppointdata().removeElement(wdContext.nodeInsppointdata().getElementAt(i - 1));
//Creating element of the input model node
IPrivateResultsView.IInsppointdataElement eleInspPointData;
//START A
Bapi2045L4 objBapi2045L4_1 = new Bapi2045L4(); //Instance of the input structure type
//Populating data
eleInspPointData = wdContext.nodeInsppointdata().createInsppointdataElement(objBapi2045L4_1);
wdContext.nodeInsppointdata().addElement(eleInspPointData);
eleInspPointData.setInsplot(wdContext.currentContextElement().getInspectionLotNumber());
eleInspPointData.setInspoper("0101");
//Inspection_Validate_Input is the model node. Adding instance to main node
wdContext.currentInspection_Validate_InputElement().modelObject().addInsppointdata(objBapi2045L4_1);
//STOP A
//Now executing the RFC
The above code seems to be fine, and works very well the first time. But when the user clicks the same button a second time, I can see duplicate records getting passed to the RFC (debugged using an external breakpoint). When I send 4 records, I can see there are a total of 6 records. The number keeps increasing with each click of the button.
I am adding multiple records to input model node using the code from START A to STOP A. Does the code look fine? Why do I see multiple records?
Thanks,
Sham
Issue solved.
After executing RFC, I used following code to clear the input model node:
try {
    wdContext.current<yourBAPI>_InputElement().modelObject().get<yourinputnode>().clear();
} catch (Exception e1) {
    // ignore
}
Hi gurus
We created a text datasource in R/3 and replicated it into BW 7.0
An infopackage (loading to PSA) and DataTransferProcess was created and included in a process chain.
The job failed because of duplicate records.
We now discovered that the setting of the "Delivery of Duplicate Records" for this DataSource in BW is set to "Undefined".
When creating the DataSource in R/3, there were no settings for the "Delivery of duplicate records".
In BW, I've tried to change the setting of "Delivery of Duplicate Data Records" to NONE, but when I go into change mode, the "Delivery of duplicate" setting is not changeable.
Does anyone have any suggestion on how to solve this problem?
Thanks,
@nne Therese
Hi Muraly,
I have the same issue. I am loading texts from R/3 to the PSA using an InfoPackage with full update. From the PSA I am using a DTP with delta, with the option "valid records update, no reporting (request record)".
It was running fine for the last few weeks: the transferred records and added records were the same as in the PSA request every day.
Suddenly the load failed to the InfoObject. I deleted the request from the InfoObject and reloaded using the DTP, and it failed again. I tried loading with full update, as they are texts, and it failed again. When I analysed the error, it said duplicate records. So I changed the DTP by checking the option 'Handling Duplicate Records' and loaded with full update. It worked fine: it transferred more than 50000 records and the added records were the exact number of the PSA request.
I reset the DTP back to delta and loaded today, but the transferred records are 14000 and the added records (3000) are the same as the PSA request. Normally, looking at the history of loads, the number of records transferred and added in the InfoObject and the number of records in the PSA request are the same every day.
Why is there a difference now? In Production I have no issues. Since I changed the DTP, if I transport it to Production, does it make any difference? This is my first time doing BI 7.0.
Please advise, and explain if I am wrong.
Thanks,
Sudha.. -
Hi all,
how to identify & eliminate the duplicate records in the PSA?
Hi,
Here is the F1 help for the 'Handle Duplicate Record Keys' option in the Update tab:
"Indicator: Handling Duplicate Data Records
If this indicator is set, duplicate data records are handled during an update in the order in which they occur in a data package.
For time-independent attributes of a characteristic, the last data record with the corresponding key within a data package defines the valid attribute value for the update for a given data record key.
For time-dependent attributes, the validity ranges of the data record values are calculated according to their order (see example).
If during your data quality measures you want to make sure that the data packages delivered by the DTP are not modified by the master data update, you must not set this indicator!
Use:
Note that for time-dependent master data, the semantic key of the DTP may not contain the field of the data source containing the DATETO information. When you set this indicator, error handling must be activated for the DTP because correcting duplicate data records is an error correction. The error correction must be "Update valid records, no reporting" or "Update valid records, reporting possible".
Example:
Handling of time-dependent data records
- Data record 1 is valid from 01.01.2006 to 31.12.2006
- Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007
- The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid."
By flagging this option in the DTP, you are allowing it to take the latest value.
There is further information at this SAP Help Portal link:
http://help.sap.com/saphelp_nw04s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/content.htm
Rgds,
Colum -
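The "last data record wins" rule in the help text above can be sketched directly: within one data package, a later record with the same key simply overwrites the earlier value. A small Python illustration with an invented record layout:

```python
# Records arrive in package order as (key, attribute_value) pairs.
package = [
    ("MAT-1", "red"),
    ("MAT-2", "blue"),
    ("MAT-1", "green"),  # duplicate key: the later record wins
]

def dedupe_last_wins(records):
    # A dict insert overwrites earlier values for the same key,
    # which is exactly the "last data record defines the value" rule.
    result = {}
    for key, value in records:
        result[key] = value
    return result

print(dedupe_last_wins(package))  # {'MAT-1': 'green', 'MAT-2': 'blue'}
```

This is why the help text warns that enabling the option modifies the delivered package: the first MAT-1 record is silently superseded.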
Importing and Updating Non-Duplicate Records from 2 Tables
I need some help with the code to import data from one table into another if it is not a duplicate, or if a record has changed.
I have 2 tables, Members and NetNews. I want to check NetNews and import non-duplicate records from Members into NetNews, and update an email address in NetNews if it has changed in Members. I figured it could be as simple as checking Members.MemberNumber and Members.Email against the existence of NetNews.MemberNumber and NetNews.Email: if a record in NetNews does not exist, create it, and if the email address in Members.Email has changed, update it in NetNews.Email.
Here is what I have from all of the suggestions received in another category last year. It is not complete, but I am stuck on the solution. Can someone please help me get this code working?
Thanks!
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
</cfquery>
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber
FROM NetNews
</cfquery>
<cfif
not(listfindnocase(valuelist(newsMember.MemberNumber),qryMember.MemberNumber)
AND isnumeric(qryMember.MemberNumber))>
insert into NetNews (Email_address, First_Name, Last_Name,
MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')-
</cfif>
</cfloop>
</cfquery>
Dan,
My DBA doesn't have the experience to help with a VIEW. Did I mention that these are 2 separate databases on different servers? This project is over a year old now and it really needs to get finished, so I thought the import would be the easiest way to go. Thanks to your help, it is almost working.
I added some additional code to check for a changed email address and update the NetNews database. It runs without error, but I don't have a way to test it right now. Can you please look at the code and see if it looks OK?
I am also still getting an error on line 10 after the routine runs, on the line that has this code: and membernumber not in (<cfqueryparam list="yes" value="#valuelist(newsmember.membernumber)#" cfsqltype="cf_sql_integer">), even with the cfif that Phil suggested.
<cfquery datasource="#application.ds#"
name="newsMember">
SELECT distinct MemberNumber, Email_Address
FROM NetNewsTest
</cfquery>
<cfquery datasource="#application.dsrepl#"
name="qryMember">
SELECT distinct Email,FirstName,LastName,MemberNumber
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and membernumber not in (<cfqueryparam list="yes"
value="#valuelist(newsmember.membernumber)#"
cfsqltype="cf_sql_integer">)
</cfquery>
<CFIF qryMember.recordcount NEQ 0>
<cfloop query ="qryMember">
<cfquery datasource="#application.ds#"
name="newsMember">
insert into NetNewsTest (Email_address, First_Name,
Last_Name, MemberNumber)
values ('#trim(qryMember.Email)#',
'#trim(qryMember.FirstName)#', '#trim(qryMember.LastName)#', '#
trim(qryMember.MemberNumber)#')
</cfquery>
</cfloop>
</cfif>
<cfquery datasource="#application.dsrepl#"
name="qryEmail">
SELECT distinct Email
FROM members
WHERE memberstanding <= 2 AND email IS NOT NULL AND email
<> ' '
and qryMember.email NEQ newsMember.email
</cfquery>
<CFIF qryEmail.recordcount NEQ 0>
<cfloop query ="qryEmail">
<cfquery datasource="#application.ds#"
name="newsMember">
update NetNewsTest (Email_address)
values ('#trim(qryMember.Email)#')
where email_address = #qryEmail.email#
</cfquery>
</cfloop>
</cfif>
Thank you again for the help. -
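One note on the ColdFusion thread above: the final query uses INSERT-style syntax (update ... (col) values (...)), whereas an UPDATE needs SET ... WHERE. The intent of the whole routine, insert members missing from NetNews and then update e-mail addresses that changed, can also be expressed as two set-based statements. A simplified sketch in Python's sqlite3, with made-up column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members (member_number INTEGER, email TEXT);
    CREATE TABLE netnews (member_number INTEGER, email_address TEXT);
    INSERT INTO members VALUES (1, 'a@x.com'), (2, 'b@x.com'), (3, 'c@x.com');
    INSERT INTO netnews VALUES (1, 'a@x.com'), (2, 'old@x.com');
""")

# 1) insert members that do not exist in netnews yet
conn.execute("""
    INSERT INTO netnews (member_number, email_address)
    SELECT m.member_number, m.email FROM members m
    WHERE m.member_number NOT IN (SELECT member_number FROM netnews)
""")

# 2) update addresses that changed, correlating on the member number
conn.execute("""
    UPDATE netnews
    SET email_address = (SELECT m.email FROM members m
                         WHERE m.member_number = netnews.member_number)
    WHERE email_address <> (SELECT m.email FROM members m
                            WHERE m.member_number = netnews.member_number)
""")

rows = sorted(conn.execute("SELECT member_number, email_address FROM netnews"))
print(rows)  # [(1, 'a@x.com'), (2, 'b@x.com'), (3, 'c@x.com')]
```

Member 3 is inserted and member 2's stale address is corrected, with no per-row looping in the application code.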
Update statement inserts a new row instead of updating
My code should just update a single specific row in the table. I used a simple procedure to update a record elsewhere in the current application as well, and it runs without any problem except in this case. Every time the query runs, it actually inserts a new row instead of just updating an existing one:
So here is my C# code:
try
{
    command.Parameters.Clear();
    command.Parameters.AddRange(vars);
    command.CommandText = "Update" + tableName;
    conn.Open();
    command.ExecuteNonQuery();
    conn.Close();
    this.GetData(tableName);
}
catch
{
    throw;
}
And here is my SQL code (please ignore the 'ALTER PROCEDURE' statement; I just wanted to show the core script of the procedure):
ALTER procedure [dbo].[UpdateExpenditureItems]
(@ID int,
@Name nvarchar(50),
@IsGroup bit,
@Parent nvarchar(50),
@Description nvarchar(50)
)
AS
Begin
--a new inserted row seems to be a result of running IF statement rather than ELSE
if @Parent is not null
begin
declare @ParentID int
set @ParentID = (select ID from ExpenditureItems where Name=@Parent);
UPDATE ExpenditureItems SET Name =@Name, Parent =@ParentID, [Description] =@Description, IsGroup = @IsGroup WHERE ID=@ID;
end
else
begin
UPDATE ExpenditureItems SET Name =@Name, [Description] =@Description WHERE ID=@ID
end
end
I ran the query itself from SQL Server Management Studio and it works fine. The problem seems to be in the C# code, but there's nothing unusual there. The same code for operations with the database works fine in other parts of the software... I appreciate your help!
Thanks in advance!
Please can you post your full C# code for this function? Also, the code you sent is not calling the SP, so why do we need to validate against the SP?
Fouad Roumieh
So here is my code:
//Update is called on OK button click
private void okBut_Click(object sender, EventArgs e)
{
    //edit code runs here
    SqlParameter[] pars = new SqlParameter[5];
    pars[0] = new SqlParameter("@ID", int.Parse(idTB.Text));
    pars[0].SqlDbType = SqlDbType.Int;
    pars[1] = new SqlParameter("@Name", itemTB.Text);
    pars[1].SqlDbType = SqlDbType.NVarChar;
    pars[2] = new SqlParameter("@IsGroup", true);
    pars[2].SqlDbType = SqlDbType.Bit;
    pars[3] = new SqlParameter("@Parent", DBNull.Value);
    pars[3].SqlDbType = SqlDbType.Int;
    pars[4] = new SqlParameter("@Description", descTB.Text);
    pars[4].SqlDbType = SqlDbType.NVarChar;
    try
    {
        DAL dal = new DAL();
        dal.UpdateData("ExpenditureItems", pars); //Update is called here
    }
    catch (Exception ex)
    {
        StaticValues.WriteEventLogXML(ex, this.Text);
        switch (StaticValues.user.Language)
        {
            case "English":
                MessageBox.Show("Database error", "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
                break;
            default:
                break;
        }
    }
    finally
    {
        this.DialogResult = System.Windows.Forms.DialogResult.OK;
        StaticValues.currentNode = itemTB.Text;
    }
}

public void UpdateData(string tableNameString, SqlParameter[] vars)
{
    string tableName = tableNameString.Replace(" ", string.Empty);
    try
    {
        command.Parameters.Clear();
        command.Parameters.AddRange(vars);
        command.CommandText = "Update" + tableName;
        conn.Open();
        command.ExecuteNonQuery();
        conn.Close();
        this.GetData(tableName);
    }
    catch
    {
        throw;
    }
}
And here is GetData procedure:
public void GetData(string tableNameString)
{
    string tableName = tableNameString.Replace(" ", string.Empty);
    command.CommandText = "Get" + tableName;
    command.Parameters.Clear();
    if (!StaticValues.dataSet.Tables.Contains(tableName))
        StaticValues.dataSet.Tables.Add(tableName);
    else
        StaticValues.dataSet.Tables[tableName].Clear();
    try
    {
        adapter.SelectCommand = command;
        conn.ConnectionString = DAL.ConnectionString;
        command.Connection = conn;
        conn.Open();
        adapter.Fill(StaticValues.dataSet.Tables[tableName]);
        conn.Close();
    }
    catch
    {
        throw;
    }
}
And this is my complete code for sql procedure:
USE [ISWM project]
GO
/****** Object: StoredProcedure [dbo].[UpdateExpenditureItems] Script Date: 13.04.2015 9:09:36 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER procedure [dbo].[UpdateExpenditureItems]
(@ID int,
@Name nvarchar(50),
@Parent nvarchar(50),
@Description nvarchar(50),
@IsGroup bit
)
AS
Begin
if @Parent IS NOT null
begin
declare @ParentID int
set @ParentID = (select ID from ExpenditureItems where Name=@Parent);
UPDATE ExpenditureItems SET Name =@Name, Parent =@ParentID, [Description] =@Description, IsGroup = @IsGroup WHERE ID=@ID;
end
else
begin
UPDATE ExpenditureItems SET Name =@Name, [Description] =@Description WHERE ID=@ID
end
end
The code posted above works great in other similar parts of the application where select, insert, delete and update are required.
Concerning my current problem: I ran the procedure itself in SQL Server Management Studio, and it works well. I would say that there's something wrong with my C# code, but it's quite simple and works elsewhere in my application... I really don't get what's missing.
Hi,
I am providing support to one of our clients, where we have jobs scheduled to load the data from the tables in the source database to the destination database via SSIS packages. The first load is a full load, where we truncate all the tables in the destination and load them from the source tables. From the next day onward, we perform an incremental load from source to destination, i.e., only modified records fetched using the change tracking concept are loaded to the destination. After the full load, if we run the incremental load, the job fails with an error on one of the packages: "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object '<tablename>'. The duplicate key value is <1234>.", even though there are no duplicate records. When we try debugging and running the failing package, it runs successfully. We are not able to figure out why the package fails one day and runs successfully the next. Request you to help me in this regard.
Thank you,
Bala Murali Krishna Medipally.
I suspect you are trying to insert modified records instead of updating. -
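If that is the case, the usual fix is to update when the key already exists and insert only when it does not. Here is a hedged sketch of that upsert pattern in Python/sqlite3 (the table `target` and its columns are illustrative, not from the original packages):

```python
import sqlite3

def upsert(conn, key, value):
    """Update the row if the primary key already exists; insert only when it
    does not. Re-running the load then overwrites rows instead of failing
    with a duplicate-key violation."""
    cur = conn.cursor()
    cur.execute("UPDATE target SET value = ? WHERE id = ?", (value, key))
    if cur.rowcount == 0:  # no existing row matched -> safe to insert
        cur.execute("INSERT INTO target (id, value) VALUES (?, ?)", (key, value))
    conn.commit()
```

In SSIS this typically means routing changed rows through an update path (e.g. a staging table plus an UPDATE/MERGE) rather than straight into an insert destination.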
Check for duplicate record in SQL database before doing INSERT
Hey guys,
This is part of a PowerShell app doing a SQL insert, but my question really relates to the SQL insert. I need to check the database PRIOR to doing the insert for duplicate records, and if one exists, that record needs
to be overwritten. I'm not sure how to accomplish this task. My back end is a SQL Server 2000. I'm piping the data into my insert statement from a PowerShell FileSystemWatcher app. In my scenario, if the file dumped into a directory starts with I, it gets
written to a SQL database; otherwise it gets written to an Access table. I know, silly, but that's the environment I'm in. haha.
Any help is appreciated.
Thanks in Advance
Rich T.
#### DEFINE WATCH FOLDERS AND DEFAULT FILE EXTENSION TO WATCH FOR ####
$cofa_folder = '\\cpsfs001\Data_pvs\TestCofA'
$bulk_folder = '\\cpsfs001\PVS\Subsidiary\Nolwood\McWood\POD'
$filter = '*.tif'
$cofa = New-Object IO.FileSystemWatcher $cofa_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
$bulk = New-Object IO.FileSystemWatcher $bulk_folder, $filter -Property @{ IncludeSubdirectories = $false; EnableRaisingEvents= $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite' }
#### CERTIFICATE OF ANALYSIS AND PACKAGE SHIPPER PROCESSING ####
Register-ObjectEvent $cofa Created -SourceIdentifier COFA/PACKAGE -Action {
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
#### CERTIFICATE OF ANALYSIS PROCESS BEGINS ####
$test=$name.StartsWith("I")
if ($test -eq $true) {
$pos = $name.IndexOf(".")
$left=$name.substring(0,$pos)
$pos = $left.IndexOf("L")
$tempItem=$left.substring(0,$pos)
$lot = $left.Substring($pos + 1)
$item=$tempItem.Substring(1)
Write-Host "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp" -fore green
Out-File -FilePath c:\OutputLogs\CofA.csv -Append -InputObject "in_item_key $item in_lot_key $lot imgfilename $name in_cofa_crtdt $timestamp"
start-sleep -s 5
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=PVSNTDB33; Initial Catalog=adagecopy_daily; Integrated Security=TRUE")
$conn.Open()
$insert_stmt = "INSERT INTO in_cofa_pvs (in_item_key, in_lot_key, imgfileName, in_cofa_crtdt) VALUES ('$item','$lot','$name','$timestamp')"
$cmd = $conn.CreateCommand()
$cmd.CommandText = $insert_stmt
$cmd.ExecuteNonQuery()
$conn.Close()
}
#### PACKAGE SHIPPER PROCESS BEGINS ####
elseif ($test -eq $false) {
$pos = $name.IndexOf(".")
$left=$name.substring(0,$pos)
$pos = $left.IndexOf("O")
$tempItem=$left.substring(0,$pos)
$order = $left.Substring($pos + 1)
$shipid=$tempItem.Substring(1)
Write-Host "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp" -fore green
Out-File -FilePath c:\OutputLogs\PackageShipper.csv -Append -InputObject "so_hdr_key $order so_ship_key $shipid imgfilename $name in_cofa_crtdt $timestamp"
}
}
Rich Thompson
Hi,
Since SQL Server 2000 is out of support, I recommend upgrading to a higher version, such as SQL Server 2005 or SQL Server 2008.
According to your description, you can try the following methods to check for duplicate records in SQL Server.
1. You can use RAISERROR to check for the duplicate record: if it exists, RAISERROR; otherwise, insert accordingly. A code block is given below:
IF EXISTS (SELECT 1 FROM TableName AS t
WHERE t.Column1 = @Column1
AND t.Column2 = @Column2)
BEGIN
RAISERROR('Duplicate records', 18, 1)
END
ELSE
BEGIN
INSERT INTO TableName (Column1, Column2, Column3)
SELECT @Column1, @Column2, @Column3
END
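The same check-then-insert pattern can be sketched in Python/sqlite3 (table and column names are the placeholders from the T-SQL above; note that on a busy server the check and the insert should run in one transaction to avoid a race between them):

```python
import sqlite3

def insert_if_absent(conn, col1, col2, col3):
    """Check-then-insert: raise on a duplicate (the RAISERROR branch),
    otherwise insert the new row."""
    cur = conn.cursor()
    dup = cur.execute(
        "SELECT 1 FROM TableName WHERE Column1 = ? AND Column2 = ?",
        (col1, col2)).fetchone()
    if dup is not None:
        raise ValueError("Duplicate records")
    cur.execute(
        "INSERT INTO TableName (Column1, Column2, Column3) VALUES (?, ?, ?)",
        (col1, col2, col3))
    conn.commit()
```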
2. Also you can create UNIQUE INDEX or UNIQUE CONSTRAINT on the column of a table, when you try to INSERT a value that conflicts with the INDEX/CONSTRAINT, an exception will be thrown.
Add the unique index:
CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)
Add the unique constraint:
ALTER TABLE TableName
ADD CONSTRAINT Unique_Contraint_Name
UNIQUE (ColumnName)
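With the unique index or constraint in place, a duplicate surfaces as an exception the caller can catch instead of a silent duplicate row. A minimal Python/sqlite3 illustration of the idea (sqlite3.IntegrityError stands in for SQL Server's constraint-violation error; names match the DDL above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TableName (ColumnName TEXT)")
conn.execute("CREATE UNIQUE INDEX Unique_Index_name ON TableName(ColumnName)")

conn.execute("INSERT INTO TableName (ColumnName) VALUES ('A')")  # first insert succeeds
try:
    conn.execute("INSERT INTO TableName (ColumnName) VALUES ('A')")  # duplicate value
    duplicate_inserted = True
except sqlite3.IntegrityError:  # the unique index rejects the duplicate
    duplicate_inserted = False
```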
Thanks
Lydia Zhang