JPA mapping

How should I map the Oracle DATE type to java.util.Date without losing hours, minutes and seconds? I mustn't change the Oracle DATE type to TIMESTAMP.
I'll be grateful for suggestions

Hi kamik!
I know this is kind of an old post, and you've probably figured it out on your own, but it might help others having the same problem.
You can map a DATE column to a java.util.Date field using @Temporal(TemporalType.TIMESTAMP); you don't have to change anything in the db.
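For example, a minimal sketch (the entity and column names here are illustrative, not from your schema):

@Entity
public class Event {
    @Id
    private Long id;

    // Oracle DATE column mapped to java.util.Date without losing the time part
    @Temporal(TemporalType.TIMESTAMP)
    @Column(name = "CREATED_AT")
    private Date createdAt;
}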
Zsom

Similar Messages

  • WS 3.2, Kodo 4, JPA mapping of CLOB

    The problem that I have is related to JPA CLOB mapping. The underlying Oracle table has a CLOB field. DbXplorer correctly recognizes the type of the column as CLOB, but when I generate the JPA mapping for the table, the CLOB column is mapped to Object without the @Lob annotation.
    private Object payload;

    @Basic()
    @Column(name="PAYLOAD", length=4000)
    public Object getPayload() {
        return this.payload;
    }

    public void setPayload(Object payload) {
        this.payload = payload;
    }
    Such a class is compiled but not ENHANCED (I opened the .class file in an editor and it did not implement kodo.enhance.PersistenceCapable). No errors at all. When I manually added the @Lob() annotation and changed Object to String, the class was enhanced, but when I read the data, the CLOB field was 'null' and the rest of the columns were populated with correct data. I updated such an object with new data and it was written to the database. The next read was the same: it gave me a 'null' value for the CLOB field.
    Am I missing something? Please help.

    Downloaded the latest 10g Oracle JDBC driver (it supports the 9.2 database); mapping still DOES NOT work (it generates a java.sql.Clob object instead of the @Lob annotation), but running is OK.
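    For reference, a sketch of the hand-corrected mapping described above, in standard JPA form (whether Kodo's enhancer and the driver then return the CLOB data is the separate problem discussed in the question):
     private String payload;

     @Lob
     @Basic()
     @Column(name="PAYLOAD")
     public String getPayload() {
         return this.payload;
     }

     public void setPayload(String payload) {
         this.payload = payload;
     }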

  • JPA: mapping circular references

    Hi all,
    I'm trying to map a simple parent-to-children-knows-parent scenario, which contains circular references, using JPA annotations. Here is the current code:
    @Entity
    public class Assortment implements Identifiable {
         @Id
         @GeneratedValue
         private long id;
         @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "parent")
         private Set<Assortment> subAssortments = new HashSet<Assortment>();
         @ManyToOne(fetch = FetchType.LAZY)
         private Assortment parent = null;
     }
    The DDL for the table is the following:
    CREATE TABLE ASSORTMENT (ID BIGINT PRIMARY KEY, CODE VARCHAR(5) NOT NULL, NAME VARCHAR(50) NOT NULL, STATUS TINYINT NOT NULL, PARENT_ID BIGINT, CONSTRAINT FK_ASMT_PARENT FOREIGN KEY (PARENT_ID) REFERENCES ASSORTMENT(ID));
    When creating a basic tree (one root to which two children were added) and persisting this object using a simple entityManager.persist(root) call, I get the following (shortened) exception:
    org.springframework.dao.InvalidDataAccessApiUsageException: org.hibernate.exception.ConstraintViolationException: could not insert: [core.model.goods.Assortment]; nested exception is javax.persistence.EntityExistsException: org.hibernate.exception.ConstraintViolationException: could not insert: [core.model.goods.Assortment]
    Caused by: javax.persistence.EntityExistsException: org.hibernate.exception.ConstraintViolationException: could not insert: [core.model.goods.Assortment]
         at org.hibernate.ejb.AbstractEntityManagerImpl.throwPersistenceException(AbstractEntityManagerImpl.java:555)
    Caused by: org.hibernate.exception.ConstraintViolationException: could not insert: [core.model.goods.Assortment]
         at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:71)
    Caused by: java.sql.SQLException: Attempt to insert null into a non-nullable column: column: ID table: ASSORTMENT in statement [insert into Assortment (id, code, name, status, parent_id) values (null, ?, ?, ?, ?)]
         at org.hsqldb.jdbc.Util.throwError(Unknown Source)
         ...
    It looks like the Hibernate EntityManager (the JPA implementation) is trying to insert a null value into the table's primary key (id). This is understandable, since saving an object requires first knowing what the parent's ID is - although here the root has no parent and hence the parent's ID must be null.
    How must I adapt my code to handle this particular scenario? Are my annotations correct? Should I change the data structure definition instead?
    Thanks a lot,
    GB

    Two things:
    First, the Assortment.parent attribute needed to be annotated with @JoinColumn(name="parent_id").
    Second, the SQL query sent by Hibernate used an HSQLDB specificity, the identity column, to generate the primary key. By setting hibernate.hbm2ddl.auto=true in my persistence.xml file and setting the log level to debug, I could see that the table was generated using this (approximate) statement:
    CREATE TABLE ASSORTMENT (ID BIGINT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY, CODE VARCHAR(5), NAME VARCHAR(50), STATUS TINYINT NOT NULL, PARENT_ID BIGINT, CONSTRAINT FK_ASMT_PARENT FOREIGN KEY (PARENT_ID) REFERENCES ASSORTMENT(ID))
    The "GENERATED BY DEFAULT AS IDENTITY" seems to be the key here.
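    One way to line the mapping up with that DDL might be the following sketch (the IDENTITY strategy here is my assumption, not something stated above; the other fields of the original entity are omitted):
     @Entity
     public class Assortment implements Identifiable {
          @Id
          @GeneratedValue(strategy = GenerationType.IDENTITY)
          private long id;
          @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, mappedBy = "parent")
          private Set<Assortment> subAssortments = new HashSet<Assortment>();
          @ManyToOne(fetch = FetchType.LAZY)
          @JoinColumn(name = "PARENT_ID")
          private Assortment parent = null;
     }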
    Hope this helps someone out there.
    Cheers,
    GB

  • Proper JPA mapping for MySQL SET type?

    If I have a MySQL ENUM type, it seems straightforward to map it to "int" in Java. However, what about a MySQL SET type? I don't see an obvious way that it should be mapped to Java. I looked for examples of this, but I couldn't find any.
    If there's a better place to ask questions about this, please direct me.
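    One possible approach, sketched here as an assumption rather than an established recipe: MySQL's JDBC driver returns a SET value as a comma-separated string, so the column can be mapped as a String and exposed as a Set<String> (the "flags" column name is hypothetical):
     @Column(name = "flags")
     private String flags;

     @Transient
     public java.util.Set<String> getFlagSet() {
         if (flags == null || flags.length() == 0) {
             return java.util.Collections.emptySet();
         }
         return new java.util.HashSet<String>(java.util.Arrays.asList(flags.split(",")));
     }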

    Hi Vinod,
    Due to a performance improvement when using forward-only cursors, we changed the default resultset type from TYPE_SCROLL_SENSITIVE to TYPE_FORWARD_ONLY starting with JDBC driver version 7.6. So I guess the exception comes from a statement where you didn't set the resultset type while creating it. Please check that all of the statements you want to be scrollable have the correct resultset type set.
    Regards,
    Marco

  • Local Cache with write-behind backing map

    Hi there,
    I am a Coherence newb, so I hope my question isn't too naive. I have been experimenting with Coherence using a write-behind JPA backing map, but I have only been able to make it work with a distributed cache. Because of my specific database RAC architecture, I need to ensure that entries written to the database from a given WLS node are restricted to a specific RAC node. In my view, using a local cache rather than a distributed cache should solve my problem, although if there is a way of configuring my cache so this works I'd appreciate the info.
    So, the short form of the question: can I back a local cache with a write-behind JPA map?
    Cheers,
    Ron

    Hi Ron,
    The configuration for <local-scheme> allows you to add a cache store but you cannot use write-behind, only write-through.
    Presumably you do not want the data to be shared by the different WLS nodes, i.e. if one node puts data in the cache and that is eventually written to a RAC node, that data cannot be seen in the cache by other WLS nodes. Or do you want all the WLS nodes to share the data but just write it to different RAC nodes?
    If you use a local-scheme then the data will only be local to that WLS node and not shared.
    I can think of a possible way to do what you want but it depends on the answer to the above question.
    JK

  • [Java and all else] How do you document DB design?

    Hello,
    Although most of the technical choices are explicit in SQL (UniqueKey & ForeignKey constraints, indexes,...), the semantics of the data columns and their constraints seem better served as plain human-readable comments ("ID column: identifies the plane copy; 6 figures are enough as we don't expect to sell more than 1 million planes in a foreseeable future...").
    In my current system, (EJB-based, but the DB schema is not created by the JPA-compliant ORM, but via SQL scripts), I see the following ways to write and maintain this documentation:
    - SQL comments in the schema creation and patch scripts
    - Javadoc comments in the Java source of the Entity classes.
    - UML notes in UML diagrams(1)
    - external "Architecture and design" document.
    All 4 of them are used, sometimes inconsistently, for various parts of the design choices; I mean, the docs do not contradict themselves (not yet, but it's a mere question of time), but some tables are commented in SQL scripts, others are in design docs, and the details for some columns are in entity javadocs. Although each developer may find it handier to "write" the doc via his preferred medium, it becomes increasingly difficult to "maintain" the overall documentation.
    Our current situation is that most of the project team members are Java developers, so it would probably be better accepted and served if we standardized that we document DB tables and columns in Java source, but I am worried that:
    - someone else poring over our DB (ex: a DBA helping us to optimize things, or another team developing a data-mining tool to leverage the historical data in the DB) may not be as comfortable with Java
    - this may not cover all design choices of the DB schema:
    - - - - first, although that is the case currently, in the future there might not be a 100% 1-1 mapping between e.g. each entity class and a DB table. Maybe some columns will not need to be mapped anymore,...
    - - - - second, I fear some DB constructs are not amenable to Java counterparts; no accurate idea there, I'm not an SQL nor JPA expert, but I presume Indexes, table partitioning, tablespaces, for example (yes, the DBMS is Oracle :o) are not taken into account in JPA mapping.
    And if we document those choices at the SQL level (my preferred idea so far), this will gradually make the javadocs in the entities obsolete, or removed altogether, and that will make future maintenance of the Java source risky.
    How do you document your DB design in general?
    Do you have specific advice for my case?
    Thanks in advance,
    J.
    (1) Just for the record, here are a few posts that refer to modelling the DB in UML:
    [A post on this forum highlighting that UML 1.4 standardizes a notation for RDB modelling|http://forums.sun.com/thread.jspa?messageID=1383724#1383724]
    [A DB modelling tutorial|http://www.tomjewett.com/dbdesign/dbdesign.php] (not sure whether it leverages the standard mentioned above :o)
    P.S.: "Java and all else" as in, damn, I'd love to use the familiar JDC forums and people to discuss not only Java-related issues, but also [all other things that surround Java|http://forums.sun.com/thread.jspa?threadID=5422264&tstart=0] (other technologies, processes, people) and enable to make workable systems out of Java code.

    jduprez wrote:
    Thank you again.
    A couple more questions:
    2. Table and proc dictionary maintained as one or more text files. Those files and the schema are all in source control.
    Do you mean a proprietary text format (a la tabledesign.txt), or +.sql+ source files? Again, you seem to suggest the schema info is present in two locations.
    I didn't say two locations unless you meant files. Then yes.
    Yes, I meant two files. I trust the team to get the doc file lagging a few revisions behind the SQL file, and to correct that I would have to include one more step in the review process (e.g. "review CM actions to check that the doc file is updated consistently with the SQL source").
    The tool I wrote would throw errors if the comment file didn't match the schema. And if comments were not provided. That of course doesn't stop someone from documenting a table with "a table".
    You can of course keep the additional info as special comments in the SQL, but when I did this, and even in retrospect, it seems better and perhaps easier to keep them separate. I have done special comments in SQL before and it seems a bit of a kludge, but you do have the single source. But in that case I was the only one maintaining it too.
    What do you mean by "special"?
    For code generation I have a schema file with the following (pseudo SQL):
        create table mytable (
           mytable_id int,
           column_foo varchar(10),
           column_fum varchar(20)
            -- Query: for_a_query (column_foo, column_fum)
        )
    The generator consumes the schema and generates the standard CRUD, which would include a query based on the primary key.
    The above comment is used to provide an additional query where the proc is named 'for_a_query' (munged with table name) and takes two parameters (column_foo, column_fum) whose type/name matches those of the table.
    If we don't try to generate browsable HTML out of the schema (although the idea is appealing, I don't have the resources to make such a tool), do you see anything kludgy in maintaining SQL comments interspersed within DDL code?
    Of course that is doable.

  • Jpql, join on?

    a simplified example:
    I have two tables, dog and cat; there is no foreign key reference between them. I use OpenJPA to implement the JPA mapping, and two entities are generated: Catty and Doggy. There is no reference between these two objects either.
    But in my app, I want to implement a join query, something like the following in native SQL:
    select * from dog d join cat c on d.age = c.age
    It works on my DB2.
    So I use JPQL like the following:
    SELECT d FROM Doggy d JOIN Catty c ON d.age = c.age
    Unfortunately, the JPQL throws an exception similar to the following:
    org.apache.openjpa.persistence.ArgumentException: Encountered "JOIN Catty c" at character xx, but expected: [".", "FETCH", "INNER", "JOIN", "LEFT", <IDENTIFIER>].
    Questions:
    1. What is the problem with my JPQL?
    2. When I use JOIN in JPQL, is it mandatory to define a reference attribute to the other object in the first object?
    3. Does JPQL support the "ON" keyword? I have checked the JPQL manual and it seems it is not supported.
    Thanks

    ON is not supported. You should have posted your relationship definitions as well. Do you have a Collection<Catty> in the Doggy entity?
    i.e. INNER JOIN is done on relationships. Otherwise you are stuck with the theta join:
    SELECT d FROM Doggy d, Catty c WHERE d.age = c.age
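    A minimal sketch of running that theta join (assuming an EntityManager named em):
     javax.persistence.Query q = em.createQuery(
         "SELECT d FROM Doggy d, Catty c WHERE d.age = c.age");
     java.util.List<?> dogs = q.getResultList();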

  • Kodo extensions in orm.xml?

    I'm evaluating Kodo for use with JPA, but I would like to use externalized mapping files. These work fine for standard "to the spec" mappings, but I'd like to be able to use some of the Kodo-specific extensions (such as custom field handlers). Is this possible within an externalized JPA mapping file (e.g. orm.xml)? All of the docs that I have found for Kodo JPA use annotations.

    "Amol" <[email protected]> writes:
    I have been going through mails in the newsgroup for help in my work, and I
    found a lot of extensions used (are they only Kodo-specific?). I have gone
    through all of the documentation that comes with the 2.2 release but haven't found
    documentation for these extensions. Where can I find them?
    The JDO standard defines a syntax for adding vendor-specific metadata to
    the JDO metadata files. So yes, the extensions that you are referring to
    are Kodo-specific.
    Most of our vendor extensions are used for controlling the
    object-relational mapping that Kodo performs.
    Look in the metadata and existing schema sections of the 2.2.2
    documentation for more information on possible extensions. The 2.2.3
    documentation is formatted slightly differently; it contains an appendix
    with a list of all extensions supported by Kodo.
    -Patrick
    Patrick Linskey [email protected]
    SolarMetric Inc. http://www.solarmetric.com

  • NoSuchMethodError due to EclipseLink internal weaving?

    Hello everyone,
    We are using EclipseLink 2.6.0 and are seeing a weird error that appears to be resolved by setting the value for the PersistenceUnitProperties.WEAVING_INTERNAL property to false in the EntityManagerFactory definition.
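    For reference, a minimal sketch of setting that property through the plain Persistence bootstrap (our real setup builds the EntityManagerFactory via Spring Java configuration; the unit name below is illustrative):
     Map<String, Object> props = new HashMap<String, Object>();
     // Disable only EclipseLink's internal weaving; other weaving settings stay untouched.
     props.put(org.eclipse.persistence.config.PersistenceUnitProperties.WEAVING_INTERNAL, "false");
     EntityManagerFactory emf = javax.persistence.Persistence.createEntityManagerFactory("myUnit", props);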
    A little background on the JPA-mapped entities in play here. We have a concrete class called CareTeam that extends a MappedSuperclass called AuditedEntity, and AuditedEntity extends another MappedSuperclass called IdentifiedEntity. We have plenty of classes that extend either IdentifiedEntity or AuditedEntity without issue thus far in our three or so months of using EclipseLink 2.6.0, so this doesn't appear to be due to AuditedEntity or IdentifiedEntity. The CareTeam class is a simple concrete Entity that extends AuditedEntity, and it has a default public no-args constructor, as does the AuditedEntity class, contrary to what the stack trace below might lead you to believe.
    If we do not set this property, we see a stack trace of the following form at runtime, at the time a request is being serviced, not at initialization time:
    2015-08-03 22:16:04.334 RI: dd39a3d4-7e5a-42bb-beb9-09249df66031 ERROR http-bio-8780-exec-1 BaseExceptionMapper - Service invocation failure :
    java.lang.NoSuchMethodError: com.company.db.jpa.AuditedEntity.<init>(Lorg/eclipse/persistence/internal/descriptors/PersistenceObject;)V
    at com.company.careteamservice.beans.CareTeam.<init>(CareTeam.java)
    at com.company.careteamservice.beans.CareTeam._persistence_new(CareTeam.java)
    at org.eclipse.persistence.internal.descriptors.PersistenceObjectInstantiationPolicy.buildNewInstance(PersistenceObjectInstantiationPolicy.java:33)
    at org.eclipse.persistence.descriptors.ClassDescriptor.selfValidationAfterInitialization(ClassDescriptor.java:4230)
    at org.eclipse.persistence.descriptors.ClassDescriptor.validateAfterInitialization(ClassDescriptor.java:6099)
    at org.eclipse.persistence.descriptors.ClassDescriptor.postInitialize(ClassDescriptor.java:3915)
    at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.initializeDescriptors(DatabaseSessionImpl.java:692)
    at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.initializeDescriptors(DatabaseSessionImpl.java:637)
    at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.initializeDescriptors(DatabaseSessionImpl.java:568)
    at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.postConnectDatasource(DatabaseSessionImpl.java:804)
    at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.login(DatabaseSessionImpl.java:761)
    at org.eclipse.persistence.internal.jpa.EntityManagerFactoryProvider.login(EntityManagerFactoryProvider.java:255)
    at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:728)
    at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.getAbstractSession(EntityManagerFactoryDelegate.java:205)
    at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.createEntityManagerImpl(EntityManagerFactoryDelegate.java:305)
    at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManagerImpl(EntityManagerFactoryImpl.java:337)
    at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManager(EntityManagerFactoryImpl.java:303)
    at org.springframework.orm.jpa.JpaTransactionManager.createEntityManagerForTransaction(JpaTransactionManager.java:449)
    at org.springframework.orm.jpa.JpaTransactionManager.doBegin(JpaTransactionManager.java:369)
    at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:373)
    at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:463)
    at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:276)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
    at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:653)
    I have a few questions to pose to the forum :
    1) Is this a known issue with EclipseLink 2.6.0?
    2) We wish to use load time weaving, so is it possible that this change will compromise the behavior of load time weaving in some way?
    Thanks, Doug

    Hi Lukas,
    Thanks for the response. We don't have a means to extract the code that is causing this behavior into a simple test case, but we would have been happy to do so otherwise. We are using Tomcat and Spring, with no persistence.xml file; Spring creates the EntityManagerFactory from Java configuration.
    We wish to use load time weaving, so is it possible that setting internal weaving to false will compromise JPA entity behavior or overall load time weaving in some way?
    Thanks, Doug

  • Xml mapping with JPA error

    I am İshak Teyran, the data manager of a team participating in the IBM XML Challenge Turkey. We are trying to build a dynamic web project in Rational Software Architect 7.5, WAS 7.0, DB2 9.5 EE.
    the article of Vitor Rodrigues - http://www.ibm.com/developerworks/data/library/techarticle/dm-0901rodrigues/index.html - describes how to map xml to java using pureQuery however we need to do the same using JPA.. is there any difference?
    we are trying to achieve the same using JPA. but unfortunately we failed..
    for simplifying lets assume we have two tables , STUDENT ( id int, name char) and COURSES ( studentID int , and lessons xml) ... these two tables are related to each other with studentID foreign key... and we want to have student and courses java files where courses.java includes proper mapping to xml column lessons..
    i tried to make an .xsd file for xml column lessons.. and then i created its java file.. then i used JPA for creating student and courses java files but it always gives me a null pointer exception.. i dont know why.. and as a result, only student.java is being created..
    i have tried following steps.
    My database connection and the tables are already present.
    1- i have created a new dynamic web site project, namely dene..
    from project facets i enabled dynamic web module, faces support (base, enhanced components), Java Persistence, Javascript Toolkit, JavaServer Faces, JSTL, Web 2.0, Websphere Web (Co-existence), Websphere Web (Extended), Default Style Sheet facets.. and these facets must be enabled for our project.
    2- i have created the CourseList.xsd and using -JAXB- i have created lessons.java
    3- i right clicked project and from the menu .. JPA tools->Generate Entities .. i choose connection and tables to generate entities but unfortunately no entity gets created ... when i only choose student table to generate the entity of, Student.java is being created however Courses.java always returns nullpointerexception..
    the stack trace of the exception is in the picture i attached.. here is its link.... [http://www.ibm.com/developerworks/forums/servlet/JiveServlet/download/430-250134-14212507-336353/hata.JPG]
    i have thought it might be because i use the openjpa and jaxb libraries of RSA.. and then i tried to load the latest versions of JAXB and openjpa.. i added all of their .jar files to the application classpath.. (if it is sufficient to add only openjpa-1.2.0.jar and jaxb-api.jar files, please let me know).. but this did not work either..
    the only exception i get is :
    An internal error occurred during: "Generating Entities".
    java.lang.NullPointerException
    while creating lessons.java and the entity java files i am making sure that i give the same package name for both ... but i never succeeded creating courses.java..
    in RSA when i right click the project and go to the JPA Tools menu there is a menu item "Configure Project for JDBC deployment".. what is this for.. i have also tried using this and then trying to create entities but this failed too..
    then when they did not work ;
    i have decided to code the classes myself but in order to get things a little easier, i opened my DB and converted the XML column to a SMALLINT column .. just not an XML column... and then using JPA in RSA - i did not include openJPA jars in my project externally- i created the classes .. both Course.java and Student.java have been created well, because the column that was actually xml was converted to small integer..
    and if i add the related persistence and strategy lines :
    @Persistent
    @Strategy("org.apache.openjpa.xmlmapping.XmlValueHandler")
    to their proper places, edit the function and attribute data types and return types, and then re-convert the column to the XML data type in the database, do you think it will work properly?
    so i really need a suggestion.. any help will be appreciated..
    what would you suggest?
    how can we achieve this goal for those two tables i mentioned above in RSA and using JPA ?

    Hi Tushar
    You will need to be more specific. Do you want an XML layout to use for importing or exporting SAP Business One objects? Maybe give us details of the specific object. Most DI API objects have a write to XML which will give you the layout for that object. You can also refer to the REFDI.CHM file under the SDK Help folder.
    Kind regards
    Peter Juby

  • Native mapping with JPA doesn't work by default - not marked cascade persist

    I started to use our old TopLink native mappings, but via JPA instead of the EclipseLink native API.
    But at first execution it is already complaining about my mappings. When using the native mappings, shouldn't EclipseLink interpret them to keep the old behavior?
    What are the other expected changes of behavior so that I can fix them proactively?
    What is the workaround for this specific problem?
    The error is:
    java.lang.IllegalStateException: During synchronization a new object was found through a relationship that was not marked cascade PERSIST: Id: 0
    DateTime: Mon Jun 08 02:00:00 EDT 2009
    Entry Type: 10009
    Entry Type In/Out: 10014
    calendarDate: Mon Jun 08 00:00:00 EDT 2009
    businessDate: Mon Jun 08 00:00:00 EDT 2009
    origin: timecard
    Inserted: true
    Changed: false
    Deleted: false
    Mon Jun 08 02:00:00 EDT 2009.
         at org.eclipse.persistence.internal.sessions.UnitOfWorkImpl.discoverAndPersistUnregisteredNewObjects(UnitOfWorkImpl.java:4016)
         at org.eclipse.persistence.internal.sessions.RepeatableWriteUnitOfWork.discoverUnregisteredNewObjects(RepeatableWriteUnitOfWork.java:182)
    persistence.xml is:
    <persistence xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_1_0.xsd" version="1.0">
         <persistence-unit name="AllInOne" transaction-type="JTA" >
         <description>This represent all CITYADM schema mappings. It does include pmsi.</description>
              <exclude-unlisted-classes>false</exclude-unlisted-classes>
              <properties>
                   <property name="eclipselink.session-name" value="AllInOne"/>
                   <property name="eclipselink.sessions-xml" value="sessions.xml"/>
              </properties>
         </persistence-unit>
         <persistence-unit name="DataServices" transaction-type="JTA">
         <description>This represent all REPORTADM schema mappings</description>
              <exclude-unlisted-classes>false</exclude-unlisted-classes>
              <properties>
                   <property name="eclipselink.session-name" value="DataServices"/>
                   <property name="eclipselink.sessions-xml" value="sessionsDataServices.xml"/>
              </properties>
         </persistence-unit>
    </persistence>

    Hello Sebastien,
    You are using em.persist instead of uow.registerObject, which has different behaviour as mandated by the JPA specification. EclipseLink/TopLink automatically registered referenced objects through registerObject, but the JPA specification states that persist will only cascade if the mapping is marked cascade persist; otherwise it is required to throw the exception you see.
    If you wish to use JPA with a project defined through the native API, you will need to modify your mappings to match how you want to use the API. In this case, to get the same effect as you would from registerObject, you will need to mark your mappings as cascade persist. Depending on how you merge your entities (shallow, deep, etc.) you might also need to evaluate how you plan to use the JPA merge method and mark mappings as cascade merge appropriately.
    See setCascadeMerge on ForeignReferenceMapping.
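    For illustration, a rough sketch of such an amendment on the native side (the "entryType" attribute name is made up for the example):
     import org.eclipse.persistence.descriptors.ClassDescriptor;
     import org.eclipse.persistence.mappings.ForeignReferenceMapping;

     public class CascadeAmendment {
         public static void amend(ClassDescriptor descriptor) {
             ForeignReferenceMapping mapping =
                 (ForeignReferenceMapping) descriptor.getMappingForAttributeName("entryType");
             mapping.setCascadePersist(true); // mirrors JPA cascade=PERSIST
             mapping.setCascadeMerge(true);   // only if merge() should cascade as well
         }
     }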
    Best Regards,
    Chris

  • Interface Mapping not supported in the JPA specification?

    Are there any plans to add Interface support in the JPA specification? It is not supported by JPA annotations, which seems quite disruptive to proper object oriented design. However, individual implementations of JPA seem to support this:
    http://docs.jboss.org/hibernate/stable/core/reference/en/html_single/#inheritance-tableperclass
    http://wiki.eclipse.org/Using_EclipseLink_JPA_Extensions_%28ELUG%29#How_to_Use_the_.40VariableOneToOne_Annotation
    There is visible interest in getting this implemented at an Annotation level for Hibernate also but the developers for Hibernate point out that this isn't even in the JPA specifications.
    http://opensource.atlassian.com/projects/hibernate/browse/ANN-9
    https://forum.hibernate.org/viewtopic.php?f=9&t=941363&sid=4abdbc72cbf04380f4a8e2cadd7dfada&start=15
    Is this being talked about/in the works for JPA? Why not include @VariableOneToOne in the spec? It would seem to be quite an essential feature for wide adoption.

    Hi,
    You can only choose the Interface mapping for the Enhanced receiver determination in the extended tab of Receiver determination, and I don't see the Interface mapping in the select list.
    Where do I have to check for the proper outbound message???
    Regards

  • Toplink JPA and Java type mapping

    Hi there,
    I'm struggling with mapping an existing database using TopLink Essentials.
    Specifically: I have an INT SQL field which I want to map as boolean/Boolean in my entity class.
    How to do this? Either in standard JPA or with some Toplink extensions?
    Markus

    Hi Markus,
    The JPA spec doesn't cover conversion/transformation but you can do it with TopLink Essentials using a Converter. I have an example pasted below that uses a converter to map between a database VARCHAR value and a pre-Java 5 enum (i.e., a hand coded enum-type class that runs in Java 1.4).
    I use a descriptor customizer to get hold of the Basic mapping (in TopLink a "DirectToFieldMapping") and then plug in a converter.
    --Shaun
    Persistence.xml:
    <property
       name="toplink.descriptor.customizer.TypeWriter"
       value="oracle.toplink.essentials.emf.examples.library.orm.TypeWriterCustomizer"/>The customizer:
    public class TypeWriterCustomizer implements DescriptorCustomizer {
          /** Add a converter to translate between the enum and its database value. */
          public void customize(ClassDescriptor descriptor) throws Exception {
               DirectToFieldMapping typeMapping = (DirectToFieldMapping) descriptor.getMappingForAttributeName("type");
               Converter typeWriterEnumConverter = TypeWriterEnumConverter.getInstance();
               typeMapping.setConverter(typeWriterEnumConverter);
          }
    }
    The actual converter class:
    /** TypeWriterEnumConverter is a singleton since it has no state. */
    public class TypeWriterEnumConverter implements Converter {
         protected static TypeWriterEnumConverter instance = new TypeWriterEnumConverter();
         private TypeWriterEnumConverter() {}
         public static TypeWriterEnumConverter getInstance() {
              return instance;
         }
         public Object convertDataValueToObjectValue(Object data, Session session) {
              String typeName = (String) data;
              return TWriterType.get(typeName);
         }
         public Object convertObjectValueToDataValue(Object object, Session session) {
              TWriterType type = (TWriterType) object;
              if (type != null) {
                   return type.getName();
              } else {
                   return null;
              }
         }
         public void initialize(DatabaseMapping arg0, Session arg1) {}
         public boolean isMutable() {
              return false;
         }
    }
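    Following the same pattern, a sketch of a converter for the INT-to-Boolean case asked about above (this assumes the column stores 0/1):
     public class IntBooleanConverter implements Converter {
          public Object convertDataValueToObjectValue(Object data, Session session) {
               if (data == null) return null;
               return ((Number) data).intValue() != 0 ? Boolean.TRUE : Boolean.FALSE;
          }
          public Object convertObjectValueToDataValue(Object object, Session session) {
               if (object == null) return null;
               return Boolean.TRUE.equals(object) ? Integer.valueOf(1) : Integer.valueOf(0);
          }
          public void initialize(DatabaseMapping mapping, Session session) {}
          public boolean isMutable() {
               return false;
          }
     }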

  • JPA: Attr ... mapped to a primary key column in the DB. Update not allowed.

    Let me just say that I posted a bug report for this here:
    https://glassfish.dev.java.net/issues/show_bug.cgi?id=3937
    But I'm also posting the info here, so that people who search on this forum may get some help:
    TopLink (both Essentials and 11g) has a problem with flushing entities
    containing a self-reference relationship. When flushing, the following exception
    occurs:
    Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.1 (Build b14-fcs
    (12/17/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is
    mapped to a primary key column in the database. Updates are not allowed.
    No manual updates have been made to ANY primary key.
    What I'm doing is:
    1. I instantiate a new entity.
    2. I start a transaction
    3. I persist the new entity.
    4. I read an existing entity from the DB.
    5. I let the existing entity point to the new entity via the self-reference
    relationship.
    6. I flush the persistence context.
    7. I issue commit(), and the exception occurs. (I have provided the stack traces for various versions of TopLink below.)
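    In code, the scenario looks roughly like this (a sketch; Person and its mgr field match the FINEST log below, the setter name is assumed):
     EntityManager em = emf.createEntityManager();
     Person boss = new Person();                  // 1. instantiate a new entity
     em.getTransaction().begin();                 // 2. start a transaction
     em.persist(boss);                            // 3. persist the new entity
     Person existing = em.find(Person.class, 1L); // 4. read an existing entity
     existing.setMgr(boss);                       // 5. self-reference now points to the new entity
     em.flush();                                  // 6. flush the persistence context
     em.getTransaction().commit();                // 7. commit, and the exception occurs here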
    This is a clear bug.
    Here are some additional observations:
    1. Reproduced on the following versions of TopLink:
    1.1. Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))
    1.2. Oracle TopLink Essentials - 2.1 (Build b14-fcs (12/17/2007))
    1.3. Oracle TopLink - 11g Technology Preview 3 (11.1.1.0.0) (Build 071214)
    2. Reproducible both on Java SE and Java EE. (I tested on Oracle Application Server)
    3. Reproducible with and without class weaving
    4. Reproducible regardless of whether the JPA annotations are on fields or on
    methods
    5. Reproducible regardless of whether "cascade={CascadeType.PERSIST}" is used or
    not.
    6. Reproducible regardless of the fetch type of the self-reference relationship
    (EAGER or LAZY).
    Also:
    1. Without flushing, the bug doesn't occur. That is, if I commit without
    flushing, it works.
    2. Without setting the self-reference relationship, the bug doesn't occur.
    This is an issue that appears when using BOTH self-reference relationship AND
    flushing.
    Best regards,
    Bisser
    The message was edited by bisser:
    Added info that the exception occurs "when I issue commit()" on step 7.

    I'm extremely surprised that you couldn't reproduce the error. It's reproduced every time I run the Test Scenario that I described above.
    You could download a sample Eclipse project that reproduces the error from here: https://glassfish.dev.java.net/issues/show_bug.cgi?id=3937
    For the log below I used TopLink, version: Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007)).
    Could you, please, tell me what version you use and I will try the Test Case on it.
    Here's the FINEST log:
    [TopLink Finest]: 2008.01.09 07:35:58.094--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.weaving; value=false
    [TopLink Finest]: 2008.01.09 07:35:59.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.orm.throw.exceptions; default value=true
    [TopLink Finer]: 2008.01.09 07:35:59.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--Searching for default mapping file in file:/D:/dev/bull/jpa_pk_bug/bin/
    [TopLink Config]: 2008.01.09 07:35:59.547--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The alias name for the entity class [class test.jpa.entities.Person] is being defaulted to: Person.
    [TopLink Config]: 2008.01.09 07:35:59.594--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The column name for element [private java.lang.Long test.jpa.entities.Person.id] is being defaulted to: ID.
    [TopLink Config]: 2008.01.09 07:35:59.609--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The column name for element [private java.lang.String test.jpa.entities.Person.name] is being defaulted to: NAME.
    [TopLink Config]: 2008.01.09 07:35:59.641--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The target entity (reference) class for the many to one mapping element [test.jpa.entities.Person test.jpa.entities.Person.mgr] is being defaulted to: class test.jpa.entities.Person.
    [TopLink Config]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The primary key column name for the mapping element [test.jpa.entities.Person test.jpa.entities.Person.mgr] is being defaulted to: ID.
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end predeploying Persistence Unit Test; state Predeployed; factoryCount 0
    [TopLink Finer]: 2008.01.09 07:35:59.703--Thread(Thread[Main Thread,5,main])--cmp_init_transformer_is_null
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin predeploying Persistence Unit Test; state Predeployed; factoryCount 0
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end predeploying Persistence Unit Test; state Predeployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:35:59.719--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin deploying Persistence Unit Test; state Predeployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:35:59.734--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.logging.level; value=FINEST; translated value=FINEST
    [TopLink Finest]: 2008.01.09 07:35:59.734--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.logging.level; value=FINEST; translated value=FINEST
    [TopLink Finest]: 2008.01.09 07:35:59.750--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.user; value=rms
    [TopLink Finest]: 2008.01.09 07:35:59.750--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.password; value=xxxxxx
    [TopLink Finest]: 2008.01.09 07:36:00.766--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.target-database; value=Oracle; translated value=oracle.toplink.essentials.platform.database.oracle.OraclePlatform
    [TopLink Finest]: 2008.01.09 07:36:00.781--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.driver; value=oracle.jdbc.OracleDriver
    [TopLink Finest]: 2008.01.09 07:36:00.781--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.url; value=jdbc:oracle:thin:@//10.20.6.126:1521/region2
    [TopLink Info]: 2008.01.09 07:36:00.797--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--TopLink, version: Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))
    [TopLink Config]: 2008.01.09 07:36:00.812--ServerSession(1968077)--Connection(5182312)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.797--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.797--ServerSession(1968077)--Connection(5744890)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.875--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.891--ServerSession(1968077)--Connection(5760373)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.969--ServerSession(1968077)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.969--ServerSession(1968077)--Connection(5497095)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.047--ServerSession(1968077)--Connection(5500006)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.047--ServerSession(1968077)--Connection(5512041)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.125--ServerSession(1968077)--Connection(5514977)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.125--ServerSession(1968077)--Connection(5527528)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.203--ServerSession(1968077)--Connection(5530440)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.203--ServerSession(1968077)--Connection(5542993)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.281--ServerSession(1968077)--Connection(5545904)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Finest]: 2008.01.09 07:36:02.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing connected, state is Preallocation_NoTransaction_State
    [TopLink Info]: 2008.01.09 07:36:02.484--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--file:/D:/dev/bull/jpa_pk_bug/bin/-Test login successful
    [TopLink Finest]: 2008.01.09 07:36:02.484--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end deploying Persistence Unit Test; state Deployed; factoryCount 1
    [TopLink Finer]: 2008.01.09 07:36:02.516--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--client acquired
    [TopLink Finest]: 2008.01.09 07:36:02.531--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query DoesExistQuery()
    [TopLink Finest]: 2008.01.09 07:36:02.547--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--PERSIST operation called on: test.jpa.entities.Person@563c8c.
    [TopLink Finest]: 2008.01.09 07:36:02.562--ClientSession(5666151)--Thread(Thread[Main Thread,5,main])--Execute query ValueReadQuery()
    [TopLink Fine]: 2008.01.09 07:36:02.594--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--SELECT PERSONS_ID_SEQ.NEXTVAL FROM DUAL
    [TopLink Finest]: 2008.01.09 07:36:03.297--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing preallocation for PERSONS_ID_SEQ: objects: 1 , first: 5, last: 5
    [TopLink Finest]: 2008.01.09 07:36:03.312--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--assign sequence to the object (5 -> test.jpa.entities.Person@563c8c)
    [TopLink Finest]: 2008.01.09 07:36:03.328--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query ReadObjectQuery(test.jpa.entities.Person)
    [TopLink Fine]: 2008.01.09 07:36:03.438--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--SELECT ID, NAME, MGR_ID FROM Persons WHERE (ID = ?)
         bind => [1]
    [TopLink Finest]: 2008.01.09 07:36:03.531--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Register the existing object test.jpa.entities.Person@3a4484
    [TopLink Finer]: 2008.01.09 07:36:03.625--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--begin transaction
    [TopLink Finest]: 2008.01.09 07:36:03.625--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query UpdateObjectQuery(test.jpa.entities.Person@3a57fa)
    [TopLink Finest]: 2008.01.09 07:36:03.641--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query WriteObjectQuery(test.jpa.entities.Person@563c8c)
    [TopLink Fine]: 2008.01.09 07:36:03.656--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--INSERT INTO Persons (ID, NAME, MGR_ID) VALUES (?, ?, ?)
         bind => [5, Boss, null]
    [TopLink Fine]: 2008.01.09 07:36:03.688--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--UPDATE Persons SET MGR_ID = ? WHERE (ID = ?)
         bind => [5, 1]
    [TopLink Finer]: 2008.01.09 07:36:03.703--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--begin unit of work commit
    [TopLink Finest]: 2008.01.09 07:36:03.703--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query UpdateObjectQuery(test.jpa.entities.Person@563c8c)
    [TopLink Warning]: 2008.01.09 07:36:03.812--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Local Exception Stack:
    Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.exceptions.ValidationException.primaryKeyUpdateDisallowed(ValidationException.java:2222)
         at oracle.toplink.essentials.mappings.foundation.AbstractDirectMapping.writeFromObjectIntoRowWithChangeRecord(AbstractDirectMapping.java:750)
         at oracle.toplink.essentials.internal.descriptors.ObjectBuilder.buildRowForUpdateWithChangeSet(ObjectBuilder.java:948)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.updateObjectForWriteWithChangeSet(DatabaseQueryMechanism.java:1263)
         at oracle.toplink.essentials.queryframework.UpdateObjectQuery.executeCommitWithChangeSet(UpdateObjectQuery.java:91)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:390)
         at oracle.toplink.essentials.queryframework.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:109)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.execute(DatabaseQuery.java:628)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:555)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:138)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:110)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2233)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:952)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:909)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitChangedObjectsForClassWithChangeSet(CommitManager.java:309)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:195)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:2657)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1044)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitToDatabase(RepeatableWriteUnitOfWork.java:403)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithChangeSet(UnitOfWorkImpl.java:1126)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:107)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:856)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:102)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:60)
         at test.jpa.TestPkBug.runTest(TestPkBug.java:53)
         at test.jpa.TestPkBug.main(TestPkBug.java:95)
    [TopLink Finer]: 2008.01.09 07:36:03.828--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--rollback transaction
    [TopLink Finer]: 2008.01.09 07:36:03.844--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--release unit of work
    [TopLink Finer]: 2008.01.09 07:36:03.844--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--initialize identitymaps
    [TopLink Finer]: 2008.01.09 07:36:03.844--ClientSession(5666151)--Thread(Thread[Main Thread,5,main])--client released
    [TopLink Finest]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin undeploying Persistence Unit Test; state Deployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing disconnected
    [TopLink Config]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Finer]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--initialize identitymaps
    [TopLink Info]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--file:/D:/dev/bull/jpa_pk_bug/bin/-Test logout successful
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5182312)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5500006)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5514977)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5530440)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5545904)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.891--ServerSession(1968077)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Finest]: 2008.01.09 07:36:03.891--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end undeploying Persistence Unit Test; state Undeployed; factoryCount 0
    Exception in thread "Main Thread" javax.persistence.RollbackException: Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:120)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:60)
         at test.jpa.TestPkBug.runTest(TestPkBug.java:53)
         at test.jpa.TestPkBug.main(TestPkBug.java:95)
    Caused by: Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.exceptions.ValidationException.primaryKeyUpdateDisallowed(ValidationException.java:2222)
         at oracle.toplink.essentials.mappings.foundation.AbstractDirectMapping.writeFromObjectIntoRowWithChangeRecord(AbstractDirectMapping.java:750)
         at oracle.toplink.essentials.internal.descriptors.ObjectBuilder.buildRowForUpdateWithChangeSet(ObjectBuilder.java:948)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.updateObjectForWriteWithChangeSet(DatabaseQueryMechanism.java:1263)
         at oracle.toplink.essentials.queryframework.UpdateObjectQuery.executeCommitWithChangeSet(UpdateObjectQuery.java:91)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:390)
         at oracle.toplink.essentials.queryframework.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:109)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.execute(DatabaseQuery.java:628)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:555)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:138)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:110)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2233)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:952)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:909)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitChangedObjectsForClassWithChangeSet(CommitManager.java:309)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:195)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:2657)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1044)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitToDatabase(RepeatableWriteUnitOfWork.java:403)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithChangeSet(UnitOfWorkImpl.java:1126)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:107)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:856)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:102)
         ... 3 more
    EDIT: Are you using the EXACT Test Case as I have described it in the previous posts? It's important that you commit(), and not rollback(), the transaction after the flush.
    EDIT: Updated the log because I found out that I had made a small change to the original Test Case while I was trying to find a workaround. The current log is produced by the EXACT Test Case I described in my previous posts.
    Message was edited by:
    bisser

  • JPA: @ManyToOne legacy mapping using @JoinTable

    Dear JEE experts,
    I have a tough legacy mapping problem. There are two entities, Pac and BasePac, where each Pac has a BasePac field which is to be queried from the other entity's table. The association is defined in a third table, pac_component, which has a Pac and a BasePac field among several others. I think the schema is a bit weird and I would have defined it differently, but I cannot change it because other applications using the database must not be changed.
    My code looks like this:
    @javax.persistence.Entity(name="Pacs")
    @javax.persistence.Table(name="packet")
    @javax.persistence.SequenceGenerator(name="PacsSeqGen", sequenceName="packet_packet_id_seq")
    public class Pac
         implements java.io.Serializable
        // virtual attribute basePac
        @javax.persistence.ManyToOne(fetch=EAGER, optional=true) // optional should be default anyway
        @javax.persistence.JoinTable(
             name="packet_component",
             [email protected](name="packet_id"),
             [email protected](name="basepacket_id") )
        private BasePac basePac;
        public BasePac getBasePac() { return basePac; }
        public void setBasePac( BasePac basePac ) { this.basePac = basePac; }
    @javax.persistence.Entity(name="BasePacs")
    @javax.persistence.Table(name="basepacket")
    @javax.persistence.SequenceGenerator(name="BasePacsSeqGen", sequenceName="basepacket_basepacket_id_seq")
    public class BasePac
         implements java.io.Serializable
    { ... }The Entity for pac_component does not appear so far and afaik it does not matter.
    When I now create a Pac instance and persist it, JPA (with Hibernate) always wants to create a link object:
    insert into pac_component (basepacket_id, packet_id) values (?, ?)
    where the ? for basepacket_id is null. But this is not a valid row for pac_component, thus I will get a ConstraintViolationException.
    My question is: Why does it create this row at all? The association is marked optional!
    My solution might be to make the basePac field within Pac transient and access it only through a pacComponents field, which is a @OneToMany, but every associated PacComponent entity refers to the same BasePac. Anyway, I wonder why JPA or Hibernate wants to create such a row at all.
    ... Michael

    I wouldn't focus too much on wanting to solve this through JPA. What you have here is a 'problem' which you will run into in many forms - your business requirements do not map directly to the data layer. This simply means that you need some business logic to make the translation. For example if this were for a web layer I would implement a specialized bean which can take different entities and then provide an alternative view on the data, optionally by generating it.
    If 'calculated' data is closely tied to the database layer and less to the business layer then you could of course choose to fix it through the database itself - by creating a view and mapping an entity to that. That is especially useful if you need the same data in multiple aspects of the application framework and not only in the Java code (think of reporting and analysis for example), but it has other considerations like performance.
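    For what it's worth, the view idea might be sketched like this (the view name and columns are hypothetical):
     @Entity
     @Table(name = "packet_with_base_v") // a database view joining packet and packet_component
     public class PacWithBase implements java.io.Serializable {
         @Id
         @Column(name = "packet_id")
         private long packetId;

         @Column(name = "basepacket_id", insertable = false, updatable = false)
         private Long basePacketId;
         // read-only projection; writes still go through the regular entities
     }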
