Missing records in table
Hi All,
I am using the XML Publisher APIs to generate reports. For a particular table the report shows a single record, but when I run the query (used in the data template) in SQL*Plus it retrieves 3 records.
The RTF template has a FOR-EACH statement and there is no other logic.
In the RTF template I introduced the <?/*?> tag, which should print all the data. The output still did not contain the other two records.
I will mail the data template and the RTF in case you can help.
Thanks in advance,
Sandesh
Hi Vetsrini,
I am including the structure of the data template and the RTF template; please take a look.
Data Template
<root>
<G1>
<HEADING>HEADING 1</HEADING>
<G11>
<E>ONE</E>
</G11>
<G11>
<E>TWO</E>
</G11>
</G1>
<G1>
<HEADING>HEADING 2</HEADING>
<G11>
<E>THREE</E>
</G11>
<G11>
<E>FOUR</E>
</G11>
</G1>
</root>
RTF Template
<?for-each:G1?>
<?HEADING?>
<?for-each:G11?>
<?E?>
<?end for-each?>
<?end for-each?>
Output
HEADING 1
ONE
TWO
HEADING 2
FOUR
If I change the ORDER BY in the query, the output changes to
Output
HEADING 2
THREE
FOUR
HEADING 1
TWO
(In both cases, for the group that comes last only its last row is printed... it looks like the fields are overlapping.)
The correct output should be
HEADING 1
ONE
TWO
HEADING 2
THREE
FOUR
Regards,
Sandesh
Similar Messages
-
How to get missing records from one table
I have one table with many records. Each time a record is entered, the date it was entered is also saved in the table.
I need a query that will find all the missing records in the table.
so if I have in my table:
ID Date Location
1 4/1/2015 bld1
2 4/2/2015 bld1
3 4/4/2015 bld1
I want to run a query like
Select Date, Location FROM [table] WHERE (Date Between '4/1/2015' and '4/4/2015') and (Location = bld1)
WHERE Date not in
(Select Date, Location FROM [table])
and the results would be:
4/3/2015 bld1
Thank you
Do you have a table with all possible dates in it? You can do a left join from that to your above-mentioned table where the right side of the join is null. If you don't have a table with all possible dates, you could use a numbers table.
Below is one way to achieve what you want with a numbers table...
DECLARE @Table table (ID Int, DateField Date, Location VarChar(4))
DECLARE @RunDate datetime
SET @RunDate = GETDATE()
IF OBJECT_ID('dbo.Numbers') IS NOT NULL
    DROP TABLE dbo.Numbers
SELECT TOP 10000 IDENTITY(int,1,1) AS Number
INTO dbo.Numbers
FROM sys.objects s1
CROSS JOIN sys.objects s2
ALTER TABLE dbo.Numbers ADD CONSTRAINT PK_Numbers PRIMARY KEY CLUSTERED (Number)
INSERT INTO @Table (ID, DateField, Location)
VALUES ('1','20150401','bld1')
      ,('1','20150402','bld1')
      ,('1','20150404','bld1');
WITH AllDates AS
(
    SELECT DATEADD(dd, N.Number, D.StartDate) AS Dates
    FROM dbo.Numbers N
    CROSS APPLY (SELECT CAST('20150101' AS Date) AS StartDate) AS D
)
SELECT *
FROM AllDates AD
LEFT JOIN @Table T ON AD.Dates = T.DateField
WHERE AD.Dates BETWEEN '20150401' AND '20150404'
  AND T.ID IS NULL
LucasF -
How to identify missing records in a single-column table?
The column consists of numbers in an ordered manner, but some numbers have been deleted at random, and I need to identify the missing ones.
Something like:
WITH t AS (
  SELECT 1 ID FROM DUAL UNION ALL
  SELECT 2 ID FROM DUAL UNION ALL
  SELECT 3 ID FROM DUAL UNION ALL
  SELECT 5 ID FROM DUAL UNION ALL
  SELECT 8 ID FROM DUAL UNION ALL
  SELECT 10 ID FROM DUAL UNION ALL
  SELECT 11 ID FROM DUAL
  )
-- end of on-the-fly data sample
SELECT '[' || (id + 1) || ' - ' || (next_id - 1) || ']' gap
  FROM (
        SELECT id,
               LEAD(id, 1, id + 1) OVER (ORDER BY id) next_id
          FROM t
       )
 WHERE id != next_id - 1
/
GAP
[4 - 4]
[6 - 7]
[9 - 9]
SQL>
SY.
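If the bounding values themselves can be missing, one hedged alternative (the bounds 1 and 11 are assumptions taken from the sample data above) is to generate the full expected range and subtract what exists:

```sql
-- Sketch: list every missing ID between assumed bounds 1 and 11
SELECT level AS id
  FROM dual
CONNECT BY level <= 11
MINUS
SELECT id FROM t;
```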
P.S. I assume sequence lower and upper limits are always present, otherwise query needs a little adjustment. -
Installed Cisco Unity 7 on Windows Server 2000; the Unity server is on the domain, and the Exchange we are using is Exchange 2007. When we reach the Message Store wizard, it shows an error at the end: "Failed reaching for Unity in Active Directory Global Catalog. The SRV record may be missing in the DNS table". It appears after we select Exchange in the process and the selected storage group; it takes some time and then shows the error. Please try to resolve this issue soon; we need a solution urgently. We have already spent several days on different issues regarding Unity.
If you need to talk regarding this matter, please call +919780660619; the contact person's name is Manav.
Thanks in Advance
Rosy
Rosy,
We cannot call you from this forum. If you truly need a call, you can open a TAC case. The info I can tell you is that you need to fix this in your dns. A simple google search of "no SRV record Windows 2000 server" comes up with numerous resources that can help you fix this. Here is just one example.
http://support.microsoft.com/kb/241505
and
http://www.petri.co.il/active_directory_srv_records.htm
I would also advise you, since you are doing a new install with Unity 7 and that version of Exchange, to use Windows 2003 as the OS. Even MS doesn't support Windows 2000 Server any more, so if you end up having an OS issue on your Unity server, we will not be able to get MS to help you. From a support perspective, I would advise you to use Windows Server 2003 in this implementation.
Thanks!
Tray -
How to find out the missing records
Dear all,
I feel that there are some inconsistent records between R/3 and the records extracted to BI for data source 2LIS_03_BF; some of the records may not have been updated through the delta. Is this possible? We have created Z key figures to capture the quantities in different units of measure from table MSEGO2, updated to the IC_C03 cube.
Can anyone tell me how to handle the missed records: where to find them and how to extract them?
Points will be assigned.
Regards
venu
Hi,
The inventory scenario is difficult to handle, as it involves non-cumulative key figures which can only be viewed in the reports.
Extreme care is required during initial loads and their compression.
Run the report (or ask the user to run it) and sort out the missing records.
Check out the document below and verify that the procedure you followed is correct.
Inventory management:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
Ramesh -
Dimension key 16 missing in dimension table /BIC/DZPP_CP1P
Hi all,
I have a problem with InfoCube ZPP_CP1: I am not able to delete or load any data. It was working fine until some time back.
Below is the outcome of running the RSRV check on this cube. I tried to run the error correction in RSRV, but with no luck.
Please help.
Dimension key 16 missing in dimension table /BIC/DZPP_CP1P
Message no. RSRV018
Diagnosis
The dimension key 16 that appears as field KEY_ZPP_CP1P in the fact table, does not appear as a value of the DIMID field in the dimensions table /BIC/DZPP_CP1P.
There are 17580 fact records that use the dimension key 16.
The facts belonging to dimension key 16 are therefore no longer connected to the master data of the characteristic in dimension.
Note that errors reported for the package dimension are not serious (they are shown as warnings (yellow) rather than errors (red)). When deleting transaction data requests, it can happen that the associated entries in the package dimension have already been deleted. As a result, the system terminates when deleting what can be a very large number of fact records. At the moment, we are working on a correction which will delete such data remaining after deletion of the request. Under no circumstances should you do this manually. Also note that data for request 0 cannot generally be deleted.
The test investigates whether all the facts are zero. If this is the case, the system is able to remove the inconsistency by deleting these fact records. If the error cannot be removed, the only way to re-establish a consistent status is to reconstruct the InfoCube. It may be possible for SAP to correct the inconsistency, for which you should create an error message.
Procedure
This inconsistency can occur if you use methods other than those found in BW to delete data from the SAP BW tables (for example, maintaining tables manually, using your own coding, or database tools).
Hi Ansel,
There have been no changes to the cube. I am getting this problem in my QA server, so I retransported the cube from Dev to QA again, but it did not help.
Any other ideas??
Regards,
Adarsh -
Insert missing records dynamically in SQL Server 2008 R2
Hi, I am working on a requirement involving % calculations for the 2 states where we do business. The task is that every product must have a row for both states; in the following example ProductC has no sales in OR, so I need to insert a record for ProductC - OR with 0. This is an intermediate result set of the calculations I am working with.
Example Dataset:
CREATE TABLE [dbo].[Product](
[Product] [varchar](18) NOT NULL,
[State] [varchar](5) NOT NULL,
[Area1] [int] NOT NULL,
[Area2] [int] NOT NULL,
[Area3] [int] NOT NULL,
[Area4] [int] NOT NULL,
[Area5] [int] NOT NULL,
[Area6] [int] NOT NULL,
[Area7] [int] NOT NULL,
[Area8] [int] NOT NULL,
[Area9] [int] NOT NULL,
[Area10] [int] NOT NULL,
[Total] [int] NULL
)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductA','OR',0,0,2,0,2,0,0,0,0,0,4)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductA','WA',0,0,1,0,0,0,0,0,0,0,1)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductA','Total',0,0,3,0,2,0,0,0,0,0,5)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductB','OR',0,0,5,0,0,0,0,0,0,0,5)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductB','WA',0,0,2,0,0,1,0,0,0,0,3)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductB','Total',0,0,7,0,0,1,0,0,0,0,8)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductC','WA',0,0,0,0,0,0,0,0,0,1,1)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductC','Total',0,0,0,0,0,0,0,0,0,1,1)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductD','OR',5,2,451,154,43,1,0,0,0,0,656)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductD','WA',0,20,102,182,58,36,0,1,0,0,399)
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductD','Total',5,22,553,336,101,37,0,1,0,0,1055)
How can I accomplish this in SQL Server 2008 R2?
Insert missing record:
INSERT INTO [Product] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductC','OR',0,0,0,0,0,0,0,0,0,0,0)
Thanks in advance.
Ione
CREATE TABLE [dbo].[Products](
[Product] [varchar](18) NOT NULL,
[State] [varchar](5) NOT NULL,
[Area1] [int] NOT NULL,
[Area2] [int] NOT NULL,
[Area3] [int] NOT NULL,
[Area4] [int] NOT NULL,
[Area5] [int] NOT NULL,
[Area6] [int] NOT NULL,
[Area7] [int] NOT NULL,
[Area8] [int] NOT NULL,
[Area9] [int] NOT NULL,
[Area10] [int] NOT NULL,
[Total] [int] NULL
)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductA','OR',0,0,2,0,2,0,0,0,0,0,4)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductA','WA',0,0,1,0,0,0,0,0,0,0,1)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductA','Total',0,0,3,0,2,0,0,0,0,0,5)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductB','OR',0,0,5,0,0,0,0,0,0,0,5)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductB','WA',0,0,2,0,0,1,0,0,0,0,3)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductB','Total',0,0,7,0,0,1,0,0,0,0,8)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductC','WA',0,0,0,0,0,0,0,0,0,1,1)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductC','Total',0,0,0,0,0,0,0,0,0,1,1)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductD','OR',5,2,451,154,43,1,0,0,0,0,656)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductD','WA',0,20,102,182,58,36,0,1,0,0,399)
INSERT INTO [Products] ([Product],[State],[Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])VALUES('ProductD','Total',5,22,553,336,101,37,0,1,0,0,1055)
;WITH mycte AS (
    SELECT p.Product, s.State
    FROM (SELECT DISTINCT Product FROM Products) p
    CROSS JOIN (SELECT DISTINCT State FROM Products) s
)
INSERT INTO Products (Product, State, [Area1],[Area2],[Area3],[Area4],[Area5],[Area6],[Area7],[Area8],[Area9],[Area10],[Total])
SELECT m.Product, m.State
      ,0,0,0,0,0,0,0,0,0,0,0
FROM mycte m
LEFT JOIN Products p ON m.Product = p.Product AND m.State = p.State
WHERE p.Product IS NULL AND p.State IS NULL

SELECT * FROM Products
ORDER BY Product
        ,CASE WHEN State = 'Total' THEN 2 ELSE 1 END
Drop table Products -
PR00 is missing in the table PRCC_COND_CT
Hi Experts
I have successfully replicated all condition objects (DNL_CUST_CNDALL, DNL_CUST_CND, DNL_CUST_PRC) from ECC to CRM 2007.
When I try to create a sales order in CRM, the condition types are missing; I can see only zero values in the Conditions tab, even though I have maintained the condition maintenance group for PRODUCTCRM in order to see condition types in CRM product conditions, as per Building Block C03.
What I observed is that condition types such as PR00, K004, K005 and K007 are missing from the table PRCC_COND_CT, i.e. they were not properly replicated into the CRM system, even though I have run the initial replication for all condition objects many times.
Hence I request you all: could you please tell me whether there is any way to get all condition types into CRM, or whether it is possible to add condition types manually in the CRM system?
I would appreciate your help.
Thanking you in advance
Regards
Rao
Hi Bhanu,
I have checked in SLG1; there are the following errors and warnings.
Errors: Table PRCC_COND_PPD has been posted in the database with errors
Message no. CND_MAP155
Table PRCC_COND_CT has been posted in the database with errors
Message no. CND_MAP155
1 .Error converting field FIELD_TIMESTAMP into /SAPCND/T685 for condition type BGEW.
Message no. CND_MAP181
Diagnosis
Data records from the tables mentioned are required for converting field contents. These tables show inconsistencies that could have the following causes:
A required entry is not available in one of the tables.
Field contents for table entries are not compatible with each other.
System Response
Only those data records are modified in the table for which consistent table entries exist, and for which conversion of field entry is possible.
Procedure
With the help of OSS Note 314542 and instructions contained therein, analyze causes, and make necessary corrections.
2. Error converting field FIELD_TIMESTAMP into /SAPCND/T685 for condition type BI00.
Message no. CND_MAP181
(Same diagnosis, system response and procedure as above.)
3. Error converting field FIELD_TIMESTAMP into /SAPCND/T685 for condition type PR00.
Message no. CND_MAP181
(Same diagnosis, system response and procedure as above.)
There are the following warnings:
1. Double entry for table PRCC_COND_PPD. For details, see long text.
Message no. CND_MAP815
Diagnosis
An error occurred during the update. You tried to update a data record with the key KAPPL = CRM and KALSM = 18CBCL in table PRCC_COND_PPD. A data record with the above key already exists in table PRCC_COND_PPD.
System Response
Data records with identical keys conflict with data integrity. The above data record was not updated for this reason.
Procedure
Use SAP Note 0314315 to eliminate the cause of the error.
2. Double entry for table PRCC_COND_CT. For details, see long text.
Message no. CND_MAP812
Diagnosis
An error occurred during the update. You tried to update a data record with the key KAPPL = CRM and KSCHL = 18CL in table PRCC_COND_CT. A data record with the above key already exists in table PRCC_COND_CT.
System Response
Data records with identical keys conflict with data integrity. The above data record was not updated for this reason.
Procedure
Use SAP Note 0314315 to eliminate the cause of the error.
Edited by: padmarao mota on Mar 13, 2009 9:47 PM -
FK01: Vendor Master Record Validation Table is not maintained
Hi Experts,
Appreciate your help on this matter: I have created a new vendor account group and tried to use it. However, upon saving I encountered a problem: the error message "Vendor Master Record Validation Table is not maintained". Did I miss some configuration?
Thank you very much!
Hi,
STEP 3 : In Financial accounting > A/R A/P > Business Transactions > Outgoing payments > Manual Outgoing payments >
Define Tolerance (Vendors)
STEP 4 : Create a G/L account "Creditors" in FS00 as a balance sheet account; in the control tab give the reconciliation type as Vendors
and Field Status Group G067. Assign this account in the company code data in FK02.
Regards,
Sadashivan -
Hi friends,
We load sales orders daily in our quality system using a process chain, but lately we are facing a strange issue: the records in BW are not matching R/3.
For example, on 17.10.2007 there are around 6,500 orders in the VBAK table (by Created On). The same data is loaded to our BW ODS without any transformation or restrictions, so we should get the same number of records; instead we got only 100 records.
The missing records do not occur every day. It happened in the first week of October and also on 17th October; on other days it is OK.
I wanted to know if any of you have faced a similar issue. If yes, how did you solve it? (Filling the setup tables for the missing records is not the solution.) What I am looking for is: are there any programs which delete records from either LBWQ or RSA7?
Your help is very much needed.
Thanks in advance.
BN
Hi Manoj,
Thanks a lot for your quick reply. We are loading data for header, item and conditions in BW; we are not doing any transformations anywhere from R/3 to BW, it is all 1-to-1 mapping.
We have been loading this for the last 2 years and it suddenly started misbehaving. I suspect there must be some job deleting the records from RSA7, but I am not sure if there is any program or job which does this.
Thanks
BN -
hello all,
To the point: I am trying to mirror 2 databases using archive logs.
So I ran a test, inserting a large amount of data (1200 records/s) into 4 tables for some interval (about 10 minutes).
Each time the database creates an archive log, my daemon on the master copies the archive log to the slave, and a daemon on the slave extracts it using dbms_logmnr.
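For context, a minimal sketch of that slave-side extraction flow (the file name and schema owner are placeholders, not taken from the actual setup):

```sql
-- Register one copied archive log and mine it (names are placeholders)
BEGIN
  DBMS_LOGMNR.ADD_LOGFILE(logfilename => '/arch/1_123_456.arc',
                          options     => DBMS_LOGMNR.NEW);
  DBMS_LOGMNR.START_LOGMNR(options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG
                                    + DBMS_LOGMNR.COMMITTED_DATA_ONLY);
END;
/
-- Redo for the application schema. Note: with COMMITTED_DATA_ONLY, a
-- transaction's rows are returned only once the log containing its COMMIT
-- has been mined, so a transaction spanning an archive-log boundary can
-- make a single log appear to be "missing" rows.
SELECT sql_redo FROM v$logmnr_contents WHERE seg_owner = 'APP';
EXEC DBMS_LOGMNR.END_LOGMNR;
```

If each archive is mined in isolation, transactions whose commits land in a later archive would be dropped, which could match the symptom described here; this is a hypothesis, not a confirmed diagnosis.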
The problem is that the number of rows in the slave tables is less than the number of rows in the master tables.
Even if I switch the log file manually, there are still missing rows on the slave.
i have try some alternative like,
- increasing db_writer_processes (from 1 to 2 to 4)
- increasing redo log size (from 50 MB to 100 MB)
- increasing redo log group (from 3 to 6)
but still there are missing records.
Where do those missing records go?
master db: 10.2.0.1.0 + ASM
master os: SunOS z8 5.10 Generic_137138-09 i86pc
slave db: 10.2.0.1.0
slave os: GNU/Linux 2.6.18-8.el5 x86_64 (RHEL)
Thank you for your reply, sir.
The problem with CDC (and/or a Streams implementation) is that they use CPU resources within Oracle itself. So if at peak time Oracle on the master is already at 80-90%, adding Streams will push it above 90%. Don't imagine 1,000,000 transactions per day here; imagine 1000 transactions per second. That is what those who already use Streams say.
I am not reinventing the wheel here; I am seeking the best alternative. In fact, I am looking for the answer to "where do those missing records go?" -
Missing record in customizing tabele after client copy
Hi,
A box with Solaris 10 X64, SAP ECC 6.03, SAP AFS 6.03 and MaxDB 7.6.
In a freshly installed system we have created two clients as copies of client 000... in these two clients some customizing tables have missing records. In client 000 everything is OK.
Do you have any idea?
Regards.
Ganimede Dignan.
Hi,
You said customizing tables: are they the default SAP ones or built by you? If they are delivered by SAP, please note that certain table content is not copied even by a client copy (SAP_ALL profile); you have to make the entries manually.
Since the client copy ended with no errors and you selected the SAP_ALL profile but still don't see the data in the tables, that means the data cannot be moved by client copy and you have to enter it manually.
Regards,
Raju. -
Delete records from tableA which are not in tableB
Table A contains millions of records. Which is the best way to delete the records from tableA which are not in tableB:
delete from tableA where empno not in (select empno from tableb)
or
delete from tableA a where not exists
  (select 1 from tableb b where b.empno = a.empno)
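One point worth adding (general SQL semantics, not from this thread): the two forms are not interchangeable when empno is nullable. NOT IN yields no rows at all if the subquery returns even one NULL, while NOT EXISTS is unaffected:

```sql
-- If tableb contains a row with empno NULL, this deletes nothing:
delete from tableA where empno not in (select empno from tableb);

-- The NOT EXISTS form still deletes the unmatched rows:
delete from tableA a
 where not exists (select 1 from tableb b where b.empno = a.empno);
```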
Any help?
Hi
If you can, do it this way:
create table tableC
as select a.*
from tableA a,
(select empno from tableA
minus
select empno from tableB) b
where a.empno = b.empno;
drop table tableA;
rename tableC to tableA;
Ott Karesz
http://www.trendo-kft.hu -
PDE-PLI031 Unable to fetch record from table tool_module
Dear ALL
I am creating PL/SQL libraries in Report Builder,
but when I try to save a library to the database, I get the error
PDE-PLI031 Unable to fetch record from table tool_module.
Would you please tell me how to solve this problem and why it occurs?
Thank you very much,
pritam singh
Hi,
Saving a library (.pll) to the database stores the object in specific tables that must be created first.
If you are using 6i, you should find the toolbild & toolgrnt.sql scripts, which you have to run in the order specified. These scripts create the necessary tables, and thereafter you won't get those errors while saving.
Hope this helps.
Thanks,
Vinod. -
How to insert a new record to table with foreign key
I have 3 tables like this :
CREATE TABLE PERSON (
PK INTEGER NOT NULL,
NAME VARCHAR(10),
SSNUM INTEGER,
MGR INTEGER);
ALTER TABLE PERSON ADD CONSTRAINT PK_PERSON PRIMARY KEY (PK);
ALTER TABLE PERSON ADD CONSTRAINT FK_PERSON FOREIGN KEY (MGR) REFERENCES
PERSON (PK);
/* Tables
CREATE TABLE PROJECT (
PK INTEGER NOT NULL,
CODE_NAME INTEGER);
ALTER TABLE PROJECT ADD CONSTRAINT PK_PROJECT PRIMARY KEY (PK);
/* Tables
CREATE TABLE XREF (
PERSON INTEGER NOT NULL,
PROJECT INTEGER NOT NULL);
ALTER TABLE XREF ADD CONSTRAINT PK_XREF PRIMARY KEY (PERSON, PROJECT);
ALTER TABLE XREF ADD CONSTRAINT FK_XREF1 FOREIGN KEY (PERSON) REFERENCES
PERSON (PK);
ALTER TABLE XREF ADD CONSTRAINT FK_XREF2 FOREIGN KEY (PROJECT) REFERENCES
PROJECT (PK);
I followed the approach of the "ReverseTutorial", and the .jdo file is here:
<?xml version="1.0" encoding="UTF-8"?>
<jdo>
<package name="reversetutorial">
<class name="Person" objectid-class="PersonId">
<extension vendor-name="kodo" key="class-column" value="none"/>
<extension vendor-name="kodo" key="lock-column" value="none"/>
<extension vendor-name="kodo" key="table" value="PERSON"/>
<field name="name">
<extension vendor-name="kodo" key="data-column"
value="NAME"/>
</field>
<field name="person">
<extension vendor-name="kodo" key="pk-data-column"
value="MGR"/>
</field>
<field name="persons">
<collection element-type="Person"/>
<extension vendor-name="kodo" key="inverse"
value="person"/>
<extension vendor-name="kodo" key="inverse-owner"
value="person"/>
</field>
<field name="pk" primary-key="true">
<extension vendor-name="kodo" key="data-column"
value="PK"/>
</field>
<field name="ssnum">
<extension vendor-name="kodo" key="data-column"
value="SSNUM"/>
</field>
<field name="xrefs">
<collection element-type="Xref"/>
<extension vendor-name="kodo" key="inverse"
value="person"/>
<extension vendor-name="kodo" key="inverse-owner"
value="person"/>
</field>
</class>
<class name="Project" objectid-class="ProjectId">
<extension vendor-name="kodo" key="class-column" value="none"/>
<extension vendor-name="kodo" key="lock-column" value="none"/>
<extension vendor-name="kodo" key="table" value="PROJECT"/>
<field name="codeName">
<extension vendor-name="kodo" key="data-column"
value="CODE_NAME"/>
</field>
<field name="pk" primary-key="true">
<extension vendor-name="kodo" key="data-column"
value="PK"/>
</field>
<field name="xrefs">
<collection element-type="Xref"/>
<extension vendor-name="kodo" key="inverse"
value="project"/>
<extension vendor-name="kodo" key="inverse-owner"
value="project"/>
</field>
</class>
<class name="Xref" objectid-class="XrefId">
<extension vendor-name="kodo" key="class-column" value="none"/>
<extension vendor-name="kodo" key="lock-column" value="none"/>
<extension vendor-name="kodo" key="table" value="XREF"/>
<field name="person">
<extension vendor-name="kodo" key="pk-data-column"
value="PERSON"/>
</field>
<field name="person2" primary-key="true">
<extension vendor-name="kodo" key="data-column"
value="PERSON"/>
</field>
<field name="project">
<extension vendor-name="kodo" key="pk-data-column"
value="PROJECT"/>
</field>
<field name="project2" primary-key="true">
<extension vendor-name="kodo" key="data-column"
value="PROJECT"/>
</field>
</class>
</package>
</jdo>
Data of those tables are :
PERSON :
| PK | NAME | SSNUM | MGR |
| 1 | ABC | 1 | 1 |
| 2 | DEF | 5 | 1 |
PROJECT
| PK | CODE_NAME |
| 1 | 12 |
| 2 | 13 |
And now I want to add a new record into table XREF: insert into XREF values (1,1);
public void createData() {
    Xref xref = new Xref();
    Person person = new Person(1);
    Project project = new Project(1);
    xref.setPerson(person);
    xref.setProject(project);
    person.getXrefs().add(xref);
    project.getXrefs().add(xref);
    pm.currentTransaction().begin();
    pm.makePersistent(xref);
    pm.currentTransaction().commit();
}
I don't know why Kodo automatically inserts a new record into table PERSON, causing a primary key conflict. The errors are:
0 [main] INFO kodo.Runtime - Starting Kodo JDO version 2.4.1
(kodojdo-2.4.1-20030126-1556) with capabilities: [Enterprise Edition
Features, Standard Edition Features, Lite Edition Features, Evaluation
License, Query Extensions, Datacache Plug-in, Statement Batching, Global
Transactions, Developer Tools, Custom Database Dictionaries, Enterprise
Databases, Custom ClassMappings, Custom ResultObjectProviders]
41 [main] WARN kodo.Runtime - WARNING: Kodo JDO Evaluation expires in 29
days. Please contact [email protected] for information on extending
your evaluation period or purchasing a license.
1627 [main] INFO kodo.MetaData -
com.solarmetric.kodo.meta.JDOMetaDataParser@e28b9: parsing source:
file:/D:/AN/Test/classes/reversetutorial/reversetutorial.jdo
3092 [main] INFO jdbc.JDBC - [ C:23387093; T:19356985; D:10268916 ] open:
jdbc:firebirdsql:localhost/3050:D:/An/test/temp.gdb (sysdba)
3325 [main] INFO jdbc.JDBC - [ C:23387093; T:19356985; D:10268916 ]
close:
com.solarmetric.datasource.PoolConnection@164dbd5[[requests=0;size=0;max=70;hits=0;created=0;redundant=0;overflow=0;new=0;leaked=0;unavailable=0]]
3335 [main] INFO jdbc.JDBC - [ C:23387093; T:19356985; D:10268916 ] close
connection
3648 [main] INFO jdbc.JDBC - Using dictionary class
"com.solarmetric.kodo.impl.jdbc.schema.dict.InterbaseDictionary" to
connect to "Firebird" (version "__WI-V6.2.972 Firebird 1.0.3)WI-V6.2.972
Firebird 1.0.3/tcp (annm)/P10") with JDBC driver "firebirdsql jca/jdbc
resource adapter" (version "0.1")
4032 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] open:
jdbc:firebirdsql:localhost/3050:D:/An/test/temp.gdb (sysdba)
4143 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ]
preparing statement <3098834>: INSERT INTO XREF(PERSON, PROJECT) VALUES
4224 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ]
executing statement <3098834>: [reused=1;params={(int)1,(int)1}]
4244 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ]
preparing statement <9090824>: INSERT INTO PERSON(MGR, NAME, PK, SSNUM)
VALUES (?, ?, ?, ?)
4315 [main] INFO jdbc.SQL - [ C:25657668; T:19356985; D:10268916 ]
executing statement <9090824>: [reused=1;params={null,null,(int)1,(int)0}]
4598 [main] WARN jdbc.JDBC - java.sql.SQLWarning: java.sql.SQLWarning:
resultSetType or resultSetConcurrency changed
4598 [main] WARN jdbc.JDBC - java.sql.SQLWarning: java.sql.SQLWarning:
resultSetType or resultSetConcurrency changed
4598 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] begin
rollback
4608 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] end
rollback 10ms
4628 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ]
close:
com.solarmetric.datasource.PoolConnection@1878144[[requests=2;size=2;max=70;hits=0;created=2;redundant=0;overflow=0;new=2;leaked=0;unavailable=0]]
4628 [main] INFO jdbc.JDBC - [ C:25657668; T:19356985; D:10268916 ] close
connection
javax.jdo.JDOFatalDataStoreException:
com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (null, null, 1, 0)]
[PRE=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (?, ?, ?, ?)]
GDS Exception. violation of PRIMARY or UNIQUE KEY constraint "PK_PERSON"
on table "PERSON" [code=335544665;state=null]
NestedThrowables:
com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (null, null, 1, 0)]
[PRE=INSERT INTO PERSON(MGR, NAME, PK, SSNUM) VALUES (?, ?, ?, ?)]
GDS Exception. violation of PRIMARY or UNIQUE KEY constraint "PK_PERSON"
on table "PERSON"
at
com.solarmetric.kodo.impl.jdbc.runtime.SQLExceptions.throwFatal(SQLExceptions.java:17)
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:416)
at
com.solarmetric.kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:575)
at
com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:438)
at reversetutorial.Finder.createData(Finder.java:74)
at reversetutorial.Finder.main(Finder.java:141)
NestedThrowablesStackTrace:
org.firebirdsql.jdbc.FBSQLException: GDS Exception. violation of PRIMARY
or UNIQUE KEY constraint "PK_PERSON" on table "PERSON"
at
org.firebirdsql.jdbc.FBPreparedStatement.internalExecute(FBPreparedStatement.java:425)
at
org.firebirdsql.jdbc.FBPreparedStatement.executeUpdate(FBPreparedStatement.java:136)
at
com.solarmetric.datasource.PreparedStatementWrapper.executeUpdate(PreparedStatementWrapper.java:111)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatementNonBatch(SQLExecutionManagerImpl.java:542)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatement(SQLExecutionManagerImpl.java:511)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeInternal(SQLExecutionManagerImpl.java:405)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.flush(SQLExecutionManagerImpl.java:272)
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:411)
at
com.solarmetric.kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:575)
at
com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:438)
at reversetutorial.Finder.createData(Finder.java:74)
at reversetutorial.Finder.main(Finder.java:141)
at org.firebirdsql.gds.GDSException: violation of PRIMARY or UNIQUE KEY
constraint "PK_PERSON" on table "PERSON"
at org.firebirdsql.jgds.GDS_Impl.readStatusVector(GDS_Impl.java:1683)
at org.firebirdsql.jgds.GDS_Impl.receiveResponse(GDS_Impl.java:1636)
at org.firebirdsql.jgds.GDS_Impl.isc_dsql_execute2(GDS_Impl.java:865)
at
org.firebirdsql.jca.FBManagedConnection.executeStatement(FBManagedConnection.java:782)
at
org.firebirdsql.jdbc.FBConnection.executeStatement(FBConnection.java:1072)
at
org.firebirdsql.jdbc.FBPreparedStatement.internalExecute(FBPreparedStatement.java:420)
at
org.firebirdsql.jdbc.FBPreparedStatement.executeUpdate(FBPreparedStatement.java:136)
at
com.solarmetric.datasource.PreparedStatementWrapper.executeUpdate(PreparedStatementWrapper.java:111)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatementNonBatch(SQLExecutionManagerImpl.java:542)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedStatement(SQLExecutionManagerImpl.java:511)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeInternal(SQLExecutionManagerImpl.java:405)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.flush(SQLExecutionManagerImpl.java:272)
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:411)
at
com.solarmetric.kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:575)
at
com.solarmetric.kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:438)
at reversetutorial.Finder.createData(Finder.java:74)
at reversetutorial.Finder.main(Finder.java:141)
Exception in thread "main"

First off, use the '-primaryKeyOnJoin true' flag when running the reverse
mapping tool so that you can get rid of that useless Xref class and have
a direct relation between Person and Project. See the documentation on
reverse mapping tool options here:
http://www.solarmetric.com/Software/Documentation/latest/docs/ref_guide_pc_reverse.html
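As a rough sketch of how that flag is passed (the script name, properties file, and schema file below are placeholders; the exact invocation varies by Kodo version, so treat the linked reference guide as authoritative):

```shell
# Hypothetical invocation -- only the -primaryKeyOnJoin flag comes from
# this thread; everything else is a placeholder for your environment.
reversemappingtool -properties kodo.properties \
                   -primaryKeyOnJoin true \
                   -package reversetutorial \
                   schema.xml
```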
But your real problem is that you are creating new objects, assigning
primary key values, and expecting them to represent existing rows.
That's not how JDO works. If you want to set relations to existing
objects, use the PersistenceManager to look them up. If you create new
objects instead, JDO assumes you want to insert new records into the
DB, and you get PK conflicts like the one here.
There are several good books out on JDO; if you're just starting out,
they might save you a lot of time and help you master JDO quickly.
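To make the failure mode concrete, here is a small plain-Java sketch with no JDO dependency (the Person class and the "PK_PERSON" wording only mirror the schema in the trace above): a store treats any brand-new instance as an INSERT, so reusing an existing key violates the PK constraint, while looking the object up first yields a managed instance that gets UPDATEd instead.

```java
import java.util.HashMap;
import java.util.Map;

public class PkLookupSketch {
    static class Person {
        final int pk;
        String name;
        Person(int pk, String name) { this.pk = pk; this.name = name; }
    }

    // Stand-in for the PERSON table, keyed by primary key.
    static final Map<Integer, Person> table = new HashMap<>();

    // INSERT semantics: a new object with an existing key is rejected,
    // mirroring the "violation of PRIMARY or UNIQUE KEY constraint" above.
    static void insert(Person p) {
        if (table.containsKey(p.pk))
            throw new IllegalStateException("PK violation on PK_PERSON: " + p.pk);
        table.put(p.pk, p);
    }

    // Lookup semantics: the rough analogue of PersistenceManager.getObjectById.
    static Person lookup(int pk) {
        Person p = table.get(pk);
        if (p == null) throw new IllegalStateException("no such row: " + pk);
        return p;
    }

    public static void main(String[] args) {
        insert(new Person(1, "Alice"));          // first insert succeeds
        Person existing = lookup(1);             // fetch, don't re-create
        existing.name = "Alice Smith";           // update via the managed copy
        System.out.println(table.get(1).name);   // prints Alice Smith
        try {
            insert(new Person(1, "Bob"));        // duplicate PK: rejected
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

The same pattern applies in real JDO: call the PersistenceManager to fetch the existing Person, set the relation on that instance, and let commit issue an UPDATE.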