Reverse Mapping: letting Kodo manage pk-column

Hi again,
I have a database with many predefined tables, and I am not allowed to change the db
schema. The generated Java classes look fine, but I want to let Kodo manage the
pk columns (like the JDOIDX column in generated tables) so that these purely
technical pks don't end up in my business classes. Is that possible?
Any help is welcome!

Abe White wrote:
Adding a data store identity option to the reverse mapping tool is relatively
high on our to-do list, but it's not implemented yet. For now, you can
follow the steps in the documentation for reverse-mapping your classes, then
switch over to datastore identity manually by changing the class definitions
and metadata.

Ok, this solution is workable. I will try it, thanks!
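For anyone taking the same route, the manual switch Abe describes amounts to removing the technical pk field from the generated class and marking the class with datastore identity in its .jdo metadata. A minimal sketch, with placeholder package and class names (the mapping of the existing pk column still has to be set up per the Kodo documentation):

<?xml version="1.0"?>
<jdo>
  <package name="db">
    <!-- identity-type="datastore" lets Kodo manage the key internally,
         so no pk field is needed in the business class -->
    <class name="Customer" identity-type="datastore"/>
  </package>
</jdo>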

Similar Messages

  • Reverse mapping tool in 3.0.1 ignores "-schemas" option

    I believe I have discovered a bug in the 3.0.1 version of the reverse
    mapping tool.
    Here is a script of the commands that worked fine in 3.0.0:
    Script started on Mon Jan 12 11:02:19 2004
    1$ which schemagen
    /opt/kodo-jdo-3.0.0/bin/schemagen
    2$ echo $PATH
    /opt/kodo-jdo-3.0.0/bin:/sw/db/oracle/oracle817/bin:/sw/gen/sparc-sun-solaris2.9/acroread/5.06/bin:/sw/gen/sparc-sun-solaris2.9/cvs/1.11.5/bin:/sw/gen/sparc-sun-solaris2.9/esound/0.2.29/bin:/sw/gen/sparc-sun-solaris2.9/mpg123/0.59r/bin:/usr/bin:/sw/gen/sparc-sun-solaris2.9/gnupg/1.2.1/bin:/sw/gen/sparc-sun-solaris2.9/mozilla/1.3/bin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/sbin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/bin:/sw/pd/workman-1.3.4/bin:/usr/openwin/bin:/usr/bin:/sbin:/bin:/usr/sbin:/usr/ccs/bin:/usr/ucb:/opt/local/bin:/sw/modules/bin:/sw/com/bin:/sw/pd/bin:/sw/pd/office52/program:/sw/pd/RealPlayer8:/users/n9208/bin:/opt/openssh/bin:/usr/dt/bin:/usr/dt/bin:/usr/openwin/bin:/sw/db/tools/bin:/sw/db/iss/bin:/usr/local/bin:/usr/local/scripts
    3$ echo $CLASSPATH
    :/opt/oracle/oracle9.0.1.4.zip:/opt/kodo-jdo-3.0.0:/opt/kodo-jdo-3.0.0/lib/kodo-jdo-runtime.jar:/opt/kodo-jdo-3.0.0/lib/kodo-jdo.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-collections-2.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-lang-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-logging-1.0.3.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-pool-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-regexp-1.1.jar:/opt/kodo-jdo-3.0.0/lib/jca1.0.jar:/opt/kodo-jdo-3.0.0/lib/jdbc-hsql-1_7_0.jar:/opt/kodo-jdo-3.0.0/lib/jdbc2_0-stdext.jar:/opt/kodo-jdo-3.0.0/lib/jdo-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jndi.jar:/opt/kodo-jdo-3.0.0/lib/jta-spec1_0_1.jar:/opt/kodo-jdo-3.0.0/lib/log4j-1.2.6.jar:/opt/kodo-jdo-3.0.0/lib/xalan.jar:/opt/kodo-jdo-3.0.0/lib/xercesImpl.jar:/opt/kodo-jdo-3.0.0/lib/xml-apis.jar:/opt/kodo-jdo-3.0.0/lib/jfreechart-0.9.13.jar:/opt/kodo-jdo-3.0.0/lib/jcommon-0.8.8.jar
    4$ schemagen -p kodo.properties -f schema.xml -schemas PRODTRDTA.F0101
    0 INFO [main] kodo.Tool - Schema generator running on schemas
    "PRODTRDTA.F0101". This process may take some time. Enable the
    kodo.jdbc.Schema logging category to see messages about the collection of
    schema data.
    136 INFO [main] jdbc.Schema - Reading table information for schema name
    "PRODTRDTA", table name "F0101".
    672 INFO [main] jdbc.Schema - Reading column information for table
    "PRODTRDTA.F0101".
    727 INFO [main] jdbc.Schema - Reading primary keys for schema name
    "PRODTRDTA", table name "F0101".
    2187 INFO [main] jdbc.Schema - Reading indexes for schema name
    "PRODTRDTA", table name "F0101".
    2432 INFO [main] jdbc.Schema - Reading foreign keys for schema name
    "PRODTRDTA", table name "F0101".
    2632 INFO [main] kodo.Tool - Writing XML schema.
    5$
    script done on Mon Jan 12 11:03:14 2004
    Note the first line of logging output: both the schema name and table name
    are properly recognized.
    Here is the scripted output of the same commands in 3.0.1:
    Script started on Mon Jan 12 10:29:03 2004
    1$ which schemagen
    /opt/kodo-jdo-3.0.1/bin/schemagen
    2$ echo $PATH
    /opt/kodo-jdo-3.0.1/bin:/sw/db/oracle/oracle817/bin:/sw/gen/sparc-sun-solaris2.9/acroread/5.06/bin:/sw/gen/sparc-sun-solaris2.9/cvs/1.11.5/bin:/sw/gen/sparc-sun-solaris2.9/esound/0.2.29/bin:/sw/gen/sparc-sun-solaris2.9/mpg123/0.59r/bin:/usr/bin:/sw/gen/sparc-sun-solaris2.9/gnupg/1.2.1/bin:/sw/gen/sparc-sun-solaris2.9/mozilla/1.3/bin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/sbin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/bin:/sw/pd/workman-1.3.4/bin:/usr/openwin/bin:/usr/bin:/sbin:/bin:/usr/sbin:/usr/ccs/bin:/usr/ucb:/opt/local/bin:/sw/modules/bin:/sw/com/bin:/sw/pd/bin:/sw/pd/office52/program:/sw/pd/RealPlayer8:/users/n9208/bin:/opt/openssh/bin:/usr/dt/bin:/usr/dt/bin:/usr/openwin/bin:/sw/db/tools/bin:/sw/db/iss/bin:/usr/local/bin:/usr/local/scripts
    3$ echo $CLASSPATH
    :/opt/oracle/oracle9.0.1.4.zip:/opt/kodo-jdo-3.0.1:/opt/kodo-jdo-3.0.1/lib/kodo-jdo-runtime.jar:/opt/kodo-jdo-3.0.1/lib/kodo-jdo.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-collections-2.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-lang-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-logging-1.0.3.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-pool-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-regexp-1.1.jar:/opt/kodo-jdo-3.0.1/lib/jca1.0.jar:/opt/kodo-jdo-3.0.1/lib/jdbc-hsql-1_7_0.jar:/opt/kodo-jdo-3.0.1/lib/jdbc2_0-stdext.jar:/opt/kodo-jdo-3.0.1/lib/jdo-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jndi.jar:/opt/kodo-jdo-3.0.1/lib/jta-spec1_0_1.jar:/opt/kodo-jdo-3.0.1/lib/log4j-1.2.6.jar:/opt/kodo-jdo-3.0.1/lib/xalan.jar:/opt/kodo-jdo-3.0.1/lib/xercesImpl.jar:/opt/kodo-jdo-3.0.1/lib/xml-apis.jar:/opt/kodo-jdo-3.0.1/lib/jfreechart-0.9.13.jar:/opt/kodo-jdo-3.0.1/lib/jcommon-0.8.8.jar:/opt/kodo-jdo-3.0.1/lib/jline.jar:/opt/kodo-jdo-3.0.1/lib/sqlline.jar
    4$ schemagen -p kodo.properties -f schema.xml -schemas PRODTRDTA.F0101
    1 INFO [main] kodo.Tool - Schema generator running on schemas "all".
    This process may take some time. Enable the kodo.jdbc.Schema logging
    category to see messages about the collection of schema data.
    103 INFO [main] jdbc.Schema - Reading table information for schema name
    "null", table name "null".
    Exception in thread "main" java.lang.OutOfMemoryError
    5$
    script done on Mon Jan 12 11:01:45 2004
    Note the first line of logging output here: the schema is listed as "all"
    instead of the limited scope I had specified.
This run eventually crashes because the account under which I am running the
mapping tool has access to thousands of tables, so the JVM eventually runs
out of available heap.
    My workaround is to fall back to 3.0.0.

    Thanks for the report. We noticed this ourselves a short while ago.
    The bug will be fixed in 3.0.2.

  • Managed metadata columns in document information panel with multiple content types

    Hi everyone,
The problem I have is that for custom content types, not all managed metadata columns are displayed in the Document Information Panel (DIP) for the document in the Office client application.
However, everything works fine with one specific content type, even though the others use exactly the same site columns. The content types are deployed using Visual Studio to the content type hub, and after this the content types are correctly published to the site collections; there are no publishing issues here.
When I create a document based on the second content type in the same library, all fields are shown in the Document Information Panel except the managed metadata columns.
    Detailed explanation:
    Library: procedures
    Content types:
    - simple procedure (with 4 managed metadata fields and some other text fields)
    - procedure with approval (with the same 4 managed metadata fields and some other text fields)
    Scenario 1: I add the 'simple procedure' content type to the procedures library as only content type. Everything works fine, and all fields show correctly in the document information panel in Word.
    Scenario 2: I add the 'procedure with approval' content type to the procedures library as only content type. Everything works fine, and all fields show correctly in the document information panel in Word.
Scenario 3: I add both the 'simple procedure' and 'procedure with approval' content types to the procedures library (simple procedure added first). When I create a new document based on the 'simple procedure' content type, everything works fine and all metadata fields show. When I add a new document based on the 'procedure with approval' content type, the document information panel displays correctly except for the managed metadata fields, which are not visible at all, even though they worked perfectly in scenarios 1 and 2.
    Is this a known issue or is there a workaround for this? 
    Thanks in advance! 
    Kind regards, Davy

Yes!
This problem is solved now.
My issue was that I'm using custom content types deployed by Visual Studio in the content type hub. To create a managed metadata site column in Visual Studio, you need not only the managed metadata field itself, but also an accompanying hidden field that makes the actual mapping, like the example below:
    <Field ID="{B654D984-187A-471B-8738-F08F3356CFDA}"
    Type="TaxonomyFieldType"
    DisplayName="Countries"
    ShowField="Term1033"
    EnforceUniqueValues="FALSE"
    Group="Demo"
    StaticName="Countries"
    Name="Countries">
    <Customization>
    <ArrayOfProperty>
    <Property>
    <Name>TextField</Name>;
    <Value xmlns:q6="http://www.w3.org/2001/XMLSchema" p4:type="q6:string" xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">{67308AC2-9556-456B-BF9E-43E8F23EBEE6}</Value>
    </Property>
    </ArrayOfProperty>
    </Customization>
    </Field>
    <Field Type="Note"
    DisplayName="Countries_0"
    StaticName="CountriesTaxHTField0"
    Name="CountriesTaxHTField0"
    ID="{67308AC2-9556-456B-BF9E-43E8F23EBEE6}"
    ShowInViewForms="FALSE"
    Required="FALSE"
    Hidden="TRUE"
    CanToggleHidden="TRUE"
    Group="Demo"
    RowOrdinal="0"
    />
    </Elements>
VERY important here: when you create your content type using Visual Studio, you not only have to add the managed metadata site column to your content type XML (which already makes the content type work), but you must also add the hidden field to the content type XML! This way SharePoint knows that, when you have multiple content types with the same site columns in the same library, the second content type also needs to get the hidden field for those site columns, like in the example below:
    <?xml version="1.0" encoding="utf-8"?>
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
    <!-- Parent ContentType: Document (0x0101) -->;
    <ContentType ID="0x010100571ebc0f478a49d5a775039347ee1535"
    Name="Document Location"
    Group="Demo"
    Description="A content type containing Managed Metadata Column."
    Inherits="TRUE"
    Version="0">
    <FieldRefs>
    <FieldRef ID="{B654D984-187A-471B-8738-F08F3356CFDA}" Name="Countries"/>
    <FieldRef ID="{67308AC2-9556-456B-BF9E-43E8F23EBEE6}" Name="CountriesTaxHTField0"/>
    <FieldRef ID="{f3b0adf9-c1a2-4b02-920d-943fba4b3611}" Name="TaxCatchAll"/>
    <FieldRef ID="{8f6b6dd8-9357-4019-8172-966fcd502ed2}" Name="TaxCatchAllLabel"/>
    </FieldRefs>
    </ContentType>
    </Elements>
I'm very happy I found this solution, because this pattern is used a lot in the project I'm implementing!
Special thanks to the blog of @cann0nf0dder (http://cann0nf0dder.wordpress.com/2013/04/01/creating-a-site-column-with-managed-metadata), which got me thinking about this!
This ticket is answered now! :-)
    Kind regards,
    Davy

  • 3.1.4 reverse mapping tool issue

    (Sorry for the duplicate posting...I meant to start a new thread with
    this but accidentally posted it as a reply to a 6-month old thread)
    Hello,
    I was running Kodo 3.0.2 when Abe and I had the exchange reproduced
    below back in January to deal with Oracle tables with "$" in the column
    names (which I subsequently updated to 3.0.3). The original subject of
this discussion was "3.0.2 reverse mapping tool generates invalid
.mapping file".
    I was able to get this working by running the following commands to
    implement Abe's suggestion:
    reversemappingtool -p kodo.properties -package db \
    -cp custom.properties -ds false schema.xml
    sed -e 's/\$/__DOLLAR__/' db/package.mapping > db/package.mapping.new
    mv db/package.mapping.new db/package.mapping
    javac db/*.java
    mappingtool -p kodo.properties -a import db/package.mapping
    sed -e 's/__DOLLAR__/\$/' db/package.jdo > db/package.jdo.new
    mv db/package.jdo.new db/package.jdo
    In my custom.properties file, I had lines like these to put useful names
    on my class's fields:
    db.TransactionDetailHistory.y$an8.rename : addressNumber
    As I said, in 3.0.3, this worked perfectly.
    I picked this code back up for the first time since getting it working 6
    months ago, and decided to update it to 3.1.4 (since I'm already using
    that on other projects). Problem is, the reverse mapping tool has
    changed and the code it generates no longer works as it once did. I
    tried running the 3.1.2 and 3.1.0 reverse mapping tool, and it failed
    the same way, so it looks like this change happened in the 3.0.x to
    3.1.x version change.
    What happens is this: In the generated Java source, my fields used to
    end up with names as per my specification (e.g., the Oracle column named
    "y$an8" showed up as "addressNumber" in the java source).
    However, it looks like the "$" became special somehow in 3.1.0 - the
    "y$an8" column now shows up as "yAn8" in the generated Java. I tried
    changing my custom.properties file accordingly, but it still shows up as
    yAn8 even after changing my mapping to look like this:
    db.TransactionDetailHistory.yAn8.rename : addressNumber
    What do you make of this?
    Thanks,
    Bill
    Abe White wrote:
> Hmmm... this is a problem. '$' is not legal in XML names, and there
> is no standard way to escape it.
>
> Your best bet is probably to do the following:
> 1. In the generated .mapping file, replace all '$' characters with
> another token, such as '--DOLLAR--'.
> 2. Switch your properties to use the metadata mapping factory:
> kodo.jdbc.MappingFactory: metadata
> 3. Import your mappings into the metadata mapping factory:
> mappingtool -a import package.mapping
> 4. Delete the mapping file.
> 5. In your .jdo file, replace '--DOLLAR--' with '$' again.
>
> The metadata mapping factory doesn't put column names in its XML
> attribute names, so you should be able to use it safely.

William-
> However, it looks like the "$" became special somehow in 3.1.0 - the
> "y$an8" column now shows up as "yAn8" in the generated Java. I tried
> changing my custom.properties file accordingly, but it still shows up as
> yAn8 even after changing my mapping to look like this:
> db.TransactionDetailHistory.yAn8.rename : addressNumber
Well, the reverse mapping tool makes some assumptions based on common
naming strategies for relational databases and Java naming: columns like
"FIRST_NAME" will be renamed to "firstName". The reverse mapping tool is
seeing the "$", treating it as a non-alphanumeric delimiter, and so is
"fixing" it.
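To illustrate the kind of delimiter-based renaming described above (an illustrative sketch only, not Kodo's actual implementation):

// Illustration of delimiter-based camel-casing of column names; '_' and '$'
// both act as word delimiters, which is why "y$an8" comes out as "yAn8".
public class ColumnNameExample {
    static String toFieldName(String column) {
        StringBuilder out = new StringBuilder();
        boolean upperNext = false;
        for (char c : column.toCharArray()) {
            if (!Character.isLetterOrDigit(c)) {
                upperNext = true;                      // delimiter: capitalize the next character
            } else if (out.length() == 0) {
                out.append(Character.toLowerCase(c));  // first character is always lower case
            } else {
                out.append(upperNext ? Character.toUpperCase(c) : Character.toLowerCase(c));
                upperNext = false;
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(toFieldName("FIRST_NAME")); // firstName
        System.out.println(toFieldName("y$an8"));      // yAn8
    }
}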
    Can you try a couple of additional properties:
    db.TransactionDetailHistory.y$An8.rename: addressNumber
    db.TransactionDetailHistory.y$an8.rename: addressNumber
    Also, are other rename properties working for you, or is that the only
    field or class you attempt to rename? It might just be the case that
    you aren't correctly specifying the properties file or something.
    Finally, bear in mind that you can always implement your own
    kodo.jdbc.meta.ReverseCustomizer and just use that; not the easiest
    solution, but it can certainly be used to have very fine-grained control
    over the exact names that are generated.
    Marc Prud'hommeaux
    SolarMetric Inc.

  • Problem with reverse mapping

    Hi!
I am having a problem with reverse mapping. Here's what I do (copying the
generated files to the correct directory is omitted):
    % rd-schemagen -properties jdo.properties -file schema.xml
    % rd-reversemappingtool -properties jdo.properties -package testi
    schema.xml
    % javac -d build/classes src/testi/*.java
    % rd-importtool -properties jdo.properties src/testi/testi.mapping
    Here's a part of the output:
    <clip>
2958 INFO [main] jdbc.Schema - Found existing table "Kirja" for schema "null".
3002 INFO [main] jdbc.Schema - Found existing table "Kustantaja" for schema "null".
3047 INFO [main] jdbc.SQL - [C: 5948361; T: 15336018]close
3125 INFO [main] jdbc.SQL - [C: 2478770; T: 15336018]open: jdbc:mysql://localhost/kirjakauppa (root)
3129 INFO [main] jdbc.Schema - Found existing table "Kirjailija" for schema "null".
3140 INFO [main] jdbc.SQL - [C: 2478770; T: 15336018]close
3187 INFO [main] jdbc.SQL - [C: 7529545; T: 15336018]open: jdbc:mysql://localhost/kirjakauppa (root)
3193 INFO [main] jdbc.Schema - Found existing table "Kirjoittaja" for schema "null".
3225 INFO [main] jdbc.SQL - [C: 7529545; T: 15336018]close
Exception in thread "main" javax.jdo.JDOFatalInternalException:
java.lang.IllegalArgumentException: You are attempting to link to a primary key column in table "Kirja" in a foreign key that is already linked to primary key columns in table "Kirjailija".
NestedThrowables:
java.lang.IllegalArgumentException: You are attempting to link to a primary key column in table "Kirja" in a foreign key that is already linked to primary key columns in table "Kirjailija".
at com.solarmetric.rd.kodo.impl.jdbc.meta.Mappings.createClassMapping(Mappings.java:160)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:279)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
NestedThrowablesStackTrace:
java.lang.IllegalArgumentException: You are attempting to link to a primary key column in table "Kirja" in a foreign key that is already linked to primary key columns in table "Kirjailija".
at com.solarmetric.rd.kodo.impl.jdbc.schema.ForeignKey.join(ForeignKey.java:238)
at com.solarmetric.rd.kodo.impl.jdbc.schema.SchemaGenerator.generateForeignKeys(SchemaGenerator.java:625)
at com.solarmetric.rd.kodo.impl.jdbc.schema.DynamicSchemaFactory.findTable(DynamicSchemaFactory.java:111)
at com.solarmetric.rd.kodo.impl.jdbc.meta.map.BaseClassMapping.fromMappingInfo(BaseClassMapping.java:113)
at com.solarmetric.rd.kodo.impl.jdbc.meta.Mappings.createClassMapping(Mappings.java:144)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:279)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
    </clip>
Here's what MySQLCC gives as the creation statements for the tables:
    <clip>
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Asiakas'
    # CREATE TABLE `Asiakas` (
    `Asiakas_id` int(11) NOT NULL auto_increment,
    `Nimi1` varchar(50) default NULL,
    `Nimi2` varchar(50) default NULL,
    `KatuOsoite` varchar(50) default NULL,
    `Postiosoite` varchar(50) default NULL,
    `Email` varchar(50) default NULL,
    `Puhelin` varchar(50) default NULL,
    `Fax` varchar(50) default NULL,
    `Salasana` varchar(50) default NULL,
    `ExtranetTunnus` varchar(50) default NULL,
    PRIMARY KEY (`Asiakas_id`),
    KEY `Asiakas_id` (`Asiakas_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Kirja'
    # CREATE TABLE `Kirja` (
    `Kirja_id` int(11) NOT NULL auto_increment,
    `Kustantaja_id` int(11) default NULL,
    `Nimi` varchar(60) default NULL,
    `Nimi2` varchar(60) default NULL,
    `ISBN` varchar(50) default NULL,
    `Kieli` varchar(50) default NULL,
    `Kansi_URL` varchar(50) default NULL,
    `Sisalto_URL` varchar(50) default NULL,
    `Tukkuhinta` decimal(10,2) default NULL,
    `Kuluttajahinta` decimal(10,2) default NULL,
    `Varastokpl` int(11) default NULL,
    PRIMARY KEY (`Kirja_id`),
    KEY `Kirja_id` (`Kirja_id`),
    KEY `Kustantaja_id` (`Kustantaja_id`),
    FOREIGN KEY (`Kustantaja_id`) REFERENCES `kirjakauppa.Kustantaja`
    (`Kustantaja_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Kirjailija'
    # CREATE TABLE `Kirjailija` (
    `Kirjailija_id` int(11) NOT NULL auto_increment,
    `Sukunimi` varchar(50) default NULL,
    `Etunimi` varchar(50) default NULL,
    `Maa` varchar(50) default NULL,
    `Kirjailija_URL` varchar(50) default NULL,
    PRIMARY KEY (`Kirjailija_id`),
    KEY `Kirjailija_id` (`Kirjailija_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Kirjoittaja'
    # CREATE TABLE `Kirjoittaja` (
    `Kirjoittaja_id` int(11) NOT NULL auto_increment,
    `Kirjailija_id` int(11) NOT NULL default '0',
    `Kirja_id` int(11) NOT NULL default '0',
    PRIMARY KEY (`Kirjoittaja_id`),
    KEY `Kirjailija_id` (`Kirjailija_id`),
    KEY `Kirja_id` (`Kirja_id`),
    FOREIGN KEY (`Kirjailija_id`) REFERENCES `kirjakauppa.Kirjailija`
    (`Kirjailija_id`),
    FOREIGN KEY (`Kirja_id`) REFERENCES `kirjakauppa.Kirja` (`Kirja_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Koodi'
    # CREATE TABLE `Koodi` (
    `Koodi_id` int(11) NOT NULL auto_increment,
    `Koodi` varchar(50) default NULL,
    `Tyyppi` varchar(50) default NULL,
    `Arvo` varchar(50) default NULL,
    PRIMARY KEY (`Koodi_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Kustantaja'
    # CREATE TABLE `Kustantaja` (
    `Kustantaja_id` int(11) NOT NULL auto_increment,
    `Nimi` varchar(80) default NULL,
    `Maa` varchar(50) default NULL,
    `Kustantaja_URL` varchar(50) default NULL,
    `KirjaLkm` int(11) default NULL,
    PRIMARY KEY (`Kustantaja_id`),
    KEY `Kustantaja_id` (`Kustantaja_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Luokittelu'
    # CREATE TABLE `Luokittelu` (
    `Luokittelu_id` int(11) NOT NULL auto_increment,
    `Luokka_id` int(11) NOT NULL default '0',
    `Kirja_id` int(11) NOT NULL default '0',
    PRIMARY KEY (`Luokittelu_id`),
    KEY `Luokka_id` (`Luokka_id`),
    KEY `Kirja_id` (`Kirja_id`),
    FOREIGN KEY (`Luokka_id`) REFERENCES `kirjakauppa.Luokka` (`Luokka_id`),
    FOREIGN KEY (`Kirja_id`) REFERENCES `kirjakauppa.Kirja` (`Kirja_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Luokka'
    # CREATE TABLE `Luokka` (
    `Luokka_id` int(11) NOT NULL auto_increment,
    `Luokka` varchar(50) default NULL,
    PRIMARY KEY (`Luokka_id`),
    KEY `Luokka_id` (`Luokka_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Myyja'
    # CREATE TABLE `Myyja` (
    `Myyja_id` int(11) NOT NULL auto_increment,
    `Myyja` varchar(50) default NULL,
    `Myyja_URL` varchar(50) default NULL,
    PRIMARY KEY (`Myyja_id`),
    KEY `Myyja_id` (`Myyja_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Tilaus'
    # CREATE TABLE `Tilaus` (
    `Tilaus_id` int(11) NOT NULL auto_increment,
    `Asiakas_id` int(11) NOT NULL default '0',
    `Myyja_id` int(11) default NULL,
    `TilausPvm` timestamp(14) NOT NULL,
    `EnsimmToimitusPvm` timestamp(14) NOT NULL,
    `ViimToimitusPvm` timestamp(14) NOT NULL,
    `Tila` int(11) NOT NULL default '0',
    `Mk` decimal(10,2) default NULL,
    PRIMARY KEY (`Tilaus_id`),
    KEY `Asiakas_id` (`Asiakas_id`),
    KEY `Myyja_id` (`Myyja_id`),
    KEY `Tilaus_id` (`Tilaus_id`),
    FOREIGN KEY (`Asiakas_id`) REFERENCES `kirjakauppa.Asiakas`
    (`Asiakas_id`),
    FOREIGN KEY (`Myyja_id`) REFERENCES `kirjakauppa.Myyja` (`Myyja_id`)
    ) TYPE=InnoDB;
    # Host: localhost
    # Database: kirjakauppa
    # Table: 'Tilausrivi'
    # CREATE TABLE `Tilausrivi` (
    `TilausRivi_id` int(11) NOT NULL auto_increment,
    `Tilaus_id` int(11) NOT NULL default '0',
    `Kirja_id` int(11) NOT NULL default '0',
    `TilausLkm` int(11) default NULL,
    `Ahinta` decimal(10,2) default NULL,
    `Alepros` float default NULL,
    `Mk` decimal(10,2) default NULL,
    `ToimitettuLkm` int(11) default NULL,
    `ToimitusPvm` timestamp(14) NOT NULL,
    `ViimToimitusPvm` timestamp(14) NOT NULL,
    `Tila` int(11) NOT NULL default '0',
    PRIMARY KEY (`TilausRivi_id`),
    KEY `Tilaus_id` (`Tilaus_id`),
    KEY `Kirja_id` (`Kirja_id`),
    FOREIGN KEY (`Tilaus_id`) REFERENCES `kirjakauppa.Tilaus` (`Tilaus_id`),
    FOREIGN KEY (`Kirja_id`) REFERENCES `kirjakauppa.Kirja` (`Kirja_id`)
    ) TYPE=InnoDB;
    </clip>
I can find the original creation script if necessary.
My guess was that I needed to define the foreign keys myself in the generated
schema.xml, as stated in the manual. However, this did not help, although it
changed the stack trace a little (it complains about different classes than
before):
    <clip>
    Exception in thread "main" javax.jdo.JDOFatalInternalException:
    java.lang.IllegalArgumentException: You are attempting to link to a primary
    key column in table "Myyja" in a foreign key that is already linked to
    primary key columns in table "Asiakas".
    NestedThrowables:
    java.lang.IllegalArgumentException: You are attempting to link to a primary
    key column in table "Myyja" in a foreign key that is already linked to
    primary key columns in table "Asiakas".
at com.solarmetric.rd.kodo.impl.jdbc.meta.Mappings.createFieldMapping(Mappings.java:208)
    </clip>
I don't think I fully understand the error message. What exactly is wrong
here, and how can I fix it?
    Here's a sample of the changes I made to schema.xml:
- added the name attribute to schema (it was missing)
    <schema name="kirjakauppa">
    - added the foreign key elements according to the table creation statements
    given above
    <fk name="Kustantaja_id" to-table="Kustantaja" column="Kustantaja_id"/>
         etc...
    -Antti

    On Mon, 16 Jun 2003 17:55:35 -0500, Abe White <[email protected]>
    wrote:
>> It seems the last three options are being ignored - I still get a
>> mapping file with schema names in front of tables (e.g.
>> kirjakauppa.Asiakas, not Asiakas),
> That, unfortunately, is impossible to turn off. The -useSchemaName
> option controls whether the schema name is included as part of the
> generated class name; it doesn't affect the mapping data that is
> generated. What problems does including the schema name in the mapping
> data cause?
rd-importtool -properties jdo.properties gensrc/testi/testi.mapping
0 INFO [main] kodo.MetaData - Parsing metadata resource
"file:/home/akaranta/work/kurssit/jdo/Harjoituskoodi/kirjakauppa/gensrc/testi/testi.mapping".
Exception in thread "main" com.solarmetric.rd.kodo.meta.JDOMetaDataNotFoundException: No JDO metadata was found for type "class testi.Asiakas".
FailedObject:class testi.Asiakas
at com.solarmetric.rd.kodo.meta.JDOMetaDataRepositoryImpl.getMetaData(JDOMetaDataRepositoryImpl.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:184)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:197)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:128)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:60)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:400)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:377)
This exception goes away if I edit the schema name out of the mapping
file from all classes.
>> separate classes are being generated for join tables with primary keys
> Do these join tables have an extra primary key column?
Yes, they do. Ok, now I know where the problem is.
> The -primaryKeyOnJoin flag tells Kodo to ignore a join table with a
> primary key on the join columns. But Kodo can't handle join tables with
> extra column(s) just for a primary key identifier. This isn't a
> limitation of the reverse mapping tool, it's a limitation of Kodo. Kodo
> wouldn't know what to insert in those extra primary key column(s) when
> adding members to the join table.
Why not? If it can handle single numeric pk columns when making the
generated classes use data store identity, it has to generate something
for those columns. I can't see why this is different.
That is simply out of curiosity - the next thing fixed my problem:
> Of course, if the primary key is an auto-increment or something where
> Kodo can ignore it for inserts, you can just remove the <column>
> elements and the <pk> element from your .schema file and the reverse
> mapping tool will map it as a join table appropriately.
It is auto-increment, so I did this and it worked (a sketch of this edit
follows below). Thanks.
>> , and application id is used for all classes.
> Are your primary keys on single, numeric columns? Kodo uses Java longs
Yes (int in MySQL), so that should not be a problem. They are also
auto-incremented. This seems to be the only real problem remaining with
this schema.
    -Antti
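For reference, the .schema edit Abe describes above might look roughly like the following for the Kirjoittaja join table. This is only a sketch: the element names (<table>, <column>, <pk>, <fk>) come from the discussion and the fk example above, but the exact attribute spellings in a generated schema.xml may differ.

<!-- Before: the join table carries its own surrogate key column -->
<table name="Kirjoittaja">
  <column name="Kirjoittaja_id" type="integer" not-null="true"/>
  <column name="Kirjailija_id" type="integer" not-null="true"/>
  <column name="Kirja_id" type="integer" not-null="true"/>
  <pk column="Kirjoittaja_id"/>
  <fk name="Kirjailija_id" to-table="Kirjailija" column="Kirjailija_id"/>
  <fk name="Kirja_id" to-table="Kirja" column="Kirja_id"/>
</table>

<!-- After: the surrogate key <column> and the <pk> element are removed,
     so the reverse mapping tool maps Kirjoittaja as a plain join table -->
<table name="Kirjoittaja">
  <column name="Kirjailija_id" type="integer" not-null="true"/>
  <column name="Kirja_id" type="integer" not-null="true"/>
  <fk name="Kirjailija_id" to-table="Kirjailija" column="Kirjailija_id"/>
  <fk name="Kirja_id" to-table="Kirja" column="Kirja_id"/>
</table>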

  • Dynamical reverse mapping

    Hi,
I didn't manage to do the reverse mapping at runtime.
I have two components. Their contexts are described below:
There is only one node in component 1, called "idioms".
In component 2 (the used component, called POPUP) there is a node called "columns".
I want to map idioms to columns at runtime; it is a reverse mapping.
I have the code below:
    data: lo_node_info type ref to if_wd_context_node_info,
            lo_dyn_node_info type ref to if_wd_context_node_info,
            stru_mapping_info type wdr_context_mapping_info,
            tab_mapping_path type wdr_ctx_element_path_segments,
            wa_path type wdr_ctx_element_name.
      wa_path = 'COMPONENTCONTROLLER.COLUMNS'.
      insert wa_path into table tab_mapping_path.
      stru_mapping_info-component_usage = 'POPUP'.
      stru_mapping_info-controller = 'COMPONENTCONTROLLER'.
      stru_mapping_info-path = tab_mapping_path.
    lo_node_info = wd_context->get_node_info( ).
    * Map the context node dynamically
      call method lo_node_info->add_new_mapped_child_node
        exporting
          child_name   = 'IDIOMS'
          mapping_info = stru_mapping_info
        receiving
          child_node_info = lo_dyn_node_info.
    The mapping hasn't been done. Why?
    Thanks a lot.
    Regards,
    Jorge Luiz

Hi David,
Thanks for your help.
I have done what you said, but I am receiving the error below:
    Adapter error in &VIEW_ELEMENT_TYPE& "TABS" of view "YSWD_HR_TABLE_POPUP.MAIN": Context binding of property DATA_SOURCE cannot be resolved: Subnode MAIN.IDIOMA_POPUP does not exist
    I am passing the node IDIOMA_POPUP as a parameter of the set_data( ) method of the used component.
    TABS is the table that I want to populate.
    Regards,
    Jorge Luiz

  • Unrecognized types in Reverse Mapping

In our database schema there is a user-defined database type. The reverse
mapping tool cannot recognize this type, so it automatically classifies it as
a blob in the mapping file and as a generic Object in the Java sources. I
would like to map this user-defined type to a String in Java, because
otherwise Kodo blows up when I try to retrieve the field. I extended
PropertiesReverseCustomizer and was able to get the Java sources to output
String instead, but I couldn't find an easy way of getting the mapping file
to use "value" instead of "blob". Right now I am doing a query-replace on the
mapping file, but I would like to know if there's a way of getting the
reverse mapping tool to do this for you?
Toby

    You can probably just add the fields for UDT types manually in your
    customizer:
import java.sql.*;
import kodo.meta.*;
import kodo.jdbc.meta.*;
import kodo.jdbc.schema.*;

// Extends the PropertiesReverseCustomizer you are already using; braces and
// the cols[i]/field references are filled in so the snippet compiles.
public class UDTReverseCustomizer extends PropertiesReverseCustomizer {

    private ReverseMappingTool tool; // set in setTool ()

    public boolean customize (ClassMapping cls) {
        Column[] cols = cls.getTable ().getColumns ();
        for (int i = 0; i < cols.length; i++) {
            // columns the tool classified as BLOBs become String fields
            if (cols[i].isCompatible (Types.BLOB, 0))
                addStringField (cls, cols[i]);
        }
        return super.customize (cls);
    }

    private void addStringField (ClassMapping cls, Column col) {
        String name = tool.getFieldName (col.getName (), cls);
        FieldMetaData fmd = tool.newFieldMetaData (name, String.class, cls);
        ValueFieldMapping field = new ValueFieldMapping (fmd);
        field.setColumn (col);
        tool.addFieldMapping (field, cls);
    }
}

  • Reverse Mapping Tutorial - Finder.java queries the wrong table?!

    I have been almost successful in running the Reverse Mapping Tutorial, by
    creating Java Classes from the hsqldb sample database, and running the JDO
    Enhancer on them.
However, I cannot get the Finder.java to work. It seems to look in the wrong
table: MAGAZINEX instead of MAGAZINE?
Did anyone have trouble with this step, or run it successfully?
    Liviu
    PS: here is the trace:
    0 [main] INFO kodo.Runtime - Starting Kodo JDO version 2.4.2
    (kodojdo-2.4.2-20030326-1841) with capabilities: [Enterprise Edition
    Features, Standard Edition Features, Lite Edition Features, Evaluation
    License, Query Extensions, Datacache Plug-in, Statement Batching, Global
    Transactions, Developer Tools, Custom Database Dictionaries, Enterprise
    Databases]
    70 [main] WARN kodo.Runtime - WARNING: Kodo JDO Evaluation expires in 25
    days. Please contact [email protected] for information on extending your
    evaluation period or purchasing a license.
68398 [main] INFO kodo.MetaData - com.solarmetric.kodo.meta.JDOMetaDataParser@19eda2c: parsing source: file:/C:/Documents%20and%20Settings/default/jbproject/JDO/classes/reversetutorial.jdo
74577 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
75689 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@17918f0[[requests=0;size=0;max=70;hits=0;created=0;redundant=0;overflow=0;new=0;leaked=0;unavailable=0]]
75699 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close connection
77331 [main] INFO jdbc.JDBC - Using dictionary class "com.solarmetric.kodo.impl.jdbc.schema.dict.HSQLDictionary" to connect to "HSQL Database Engine" (version "1.7.0") with JDBC driver "HSQL Database Engine Driver" (version "1.7.0")
1163173 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
1163293 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] preparing statement <17940412>: SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX
1163313 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] executing statement <17940412>: [reused=1;params={}]
1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@2f356f[[requests=1;size=0;max=70;hits=0;created=1;redundant=0;overflow=0;new=1;leaked=0;unavailable=0]]
1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close connection
Hit uncaught exception javax.jdo.JDOFatalDataStoreException
javax.jdo.JDOFatalDataStoreException: com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
[PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX] [code=-22;state=S0002]
NestedThrowables:
com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
[PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
at com.solarmetric.kodo.impl.jdbc.runtime.SQLExceptions.throwFatal(SQLExceptions.java:17)
at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:283)
at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
at reversetutorial.Finder.main(Finder.java:32)
NestedThrowablesStackTrace:
java.sql.SQLException: Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
at org.hsqldb.Trace.getError(Trace.java:226)
at org.hsqldb.jdbcResultSet.<init>(jdbcResultSet.java:6595)
at org.hsqldb.jdbcConnection.executeStandalone(jdbcConnection.java:2951)
at org.hsqldb.jdbcConnection.execute(jdbcConnection.java:2540)
at org.hsqldb.jdbcStatement.fetchResult(jdbcStatement.java:1804)
at org.hsqldb.jdbcStatement.executeQuery(jdbcStatement.java:199)
at org.hsqldb.jdbcPreparedStatement.executeQuery(jdbcPreparedStatement.java:391)
at com.solarmetric.datasource.PreparedStatementWrapper.executeQuery(PreparedStatementWrapper.java:93)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedQueryInternal(SQLExecutionManagerImpl.java:771)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQueryInternal(SQLExecutionManagerImpl.java:691)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:372)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:356)
at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:246)
at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
at reversetutorial.Finder.main(Finder.java:32)

The reason I did not run importtool is because ... I actually ran it, but it
was not successful. **!
    I now tried the solutions directory, from the kodo distribution, and that
    failed as well. Here is what I did:
    - I went to reversetutorial/solutions, and compiled all the classes, and
    then placed them into a reversetutorial folder (to match the package)
    - ran "rd-importtool reversetutorial.mapping" (the mapping file from the
    solutions directory), which failed as below:
0 [main] INFO kodo.MetaData - Parsing metadata resource "file:/C:/kodo/reversetutorial/solutions/reversetutorial.mapping".
Exception in thread "main" com.solarmetric.rd.kodo.meta.JDOMetaDataNotFoundException: No JDO metadata was found for type "class reversetutorial.Article".
FailedObject:class reversetutorial.Article
at com.solarmetric.rd.kodo.meta.JDOMetaDataRepositoryImpl.getMetaData(JDOMetaDataRepositoryImpl.java:148)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
    Any idea why? The solutions directory should work, right? I even tried
    specifying a kodo.properties file, but it did not seem to help.
    Liviu
    "Abe White" <[email protected]> wrote in message
    news:[email protected]...
    Running the reversemappingtool creates classes, metadata files, and a
    .mapping file. That .mapping file contains all the O/R mapping
    information for how the generated classes map to your existing database
    tables. What the importtool does is just transfer that mapping
    information to the metadata files, in the form of <extension> elements.
    The reason this is a separate step will be clear once Kodo 3.0 comes out.
    So in sum, the importtool does not affect the database in any way. It
    just moves information from one format (.mapping file) to another
    (<extension> elements in the .jdo file).

  • JPA: Attr ... mapped to a primary key column in the DB. Update not allowed.

    Let me just say that I posted a bug report for this here:
    https://glassfish.dev.java.net/issues/show_bug.cgi?id=3937
    But I'm also posting the info here, so that people who search on this forum may get some help:
    TopLink (both Essentials and 11g) has a problem with flushing entities
    containing a self-reference relationship. When flushing, the following exception
    occurs:
    Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.1 (Build b14-fcs
    (12/17/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is
    mapped to a primary key column in the database. Updates are not allowed.
    No manual updates have been made to ANY primary key.
    What I'm doing is:
    1. I instantiate a new entity.
    2. I start a transaction
    3. I persist the new entity.
    4. I read an existing entity from the DB.
    5. I let the existing entity point to the new entity via the self-reference
    relationship.
    6. I flush the persistence context.
    7. I issue commit(), and the exception occurs. (I have provided the stack traces for various versions of TopLink below.)
    This is a clear bug.
    Here are some additional observations:
    1. Reproduced on the following versions of TopLink:
    1.1. Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))
    1.2. Oracle TopLink Essentials - 2.1 (Build b14-fcs (12/17/2007))
    1.3. Oracle TopLink - 11g Technology Preview 3 (11.1.1.0.0) (Build 071214)
    2. Reproducible both on Java SE and Java EE. (I tested on Oracle Application Server)
    3. Reproducible with and without class weaving
    4. Reproducible regardless of whether the JPA annotations are on fields or on
    methods
    5. Reproducible regardless of whether "cascade={CascadeType.PERSIST}" is used or
    not.
    6. Reproducible regardless of the fetch type of the self-reference relationship
    (EAGER or LAZY).
    Also:
    1. Without flushing, the bug doesn't occur. That is, if I commit without
    flushing, it works.
    2. Without setting the self-reference relationship, the bug doesn't occur.
    This is an issue that appears when using BOTH self-reference relationship AND
    flushing.
    Best regards,
    Bisser
    The message was edited by bisser:
    Added info that the exception occurs "when I issue commit()" on step 7.
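For readers trying to reproduce this, here is a minimal sketch of the scenario described above. It assumes a Person entity with a generated id, a name, and a self-referencing mgr relationship (the entity, persistence unit, and column names are taken from the log in the follow-up below; the actual test project is attached to the GlassFish issue):

import javax.persistence.*;

public class SelfReferenceFlushTest {
    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("Test");
        EntityManager em = emf.createEntityManager();

        Person newPerson = new Person();                 // 1. instantiate a new entity
        em.getTransaction().begin();                     // 2. start a transaction
        em.persist(newPerson);                           // 3. persist the new entity
        Person existing = em.find(Person.class, 1L);     // 4. read an existing entity
        existing.setMgr(newPerson);                      // 5. point it at the new entity
        em.flush();                                      // 6. flush the persistence context
        em.getTransaction().commit();                    // 7. commit -> TOPLINK-7251 thrown here
        em.close();
        emf.close();
    }
}

@Entity
@Table(name = "Persons")
class Person {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    private String name;

    @ManyToOne
    @JoinColumn(name = "MGR_ID")
    private Person mgr;                                  // the self-reference relationship

    public void setMgr(Person mgr) { this.mgr = mgr; }
}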

I'm extremely surprised that you couldn't reproduce the error. It reproduces every time I run the test scenario that I described above.
    You could download a sample Eclipse project that reproduces the error from here: https://glassfish.dev.java.net/issues/show_bug.cgi?id=3937
    For the log below I used TopLink, version: Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007)).
    Could you, please, tell me what version you use and I will try the Test Case on it.
    Here's the FINEST log:
    [TopLink Finest]: 2008.01.09 07:35:58.094--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.weaving; value=false
    [TopLink Finest]: 2008.01.09 07:35:59.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.orm.throw.exceptions; default value=true
    [TopLink Finer]: 2008.01.09 07:35:59.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--Searching for default mapping file in file:/D:/dev/bull/jpa_pk_bug/bin/
    [TopLink Config]: 2008.01.09 07:35:59.547--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The alias name for the entity class [class test.jpa.entities.Person] is being defaulted to: Person.
    [TopLink Config]: 2008.01.09 07:35:59.594--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The column name for element [private java.lang.Long test.jpa.entities.Person.id] is being defaulted to: ID.
    [TopLink Config]: 2008.01.09 07:35:59.609--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The column name for element [private java.lang.String test.jpa.entities.Person.name] is being defaulted to: NAME.
    [TopLink Config]: 2008.01.09 07:35:59.641--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The target entity (reference) class for the many to one mapping element [test.jpa.entities.Person test.jpa.entities.Person.mgr] is being defaulted to: class test.jpa.entities.Person.
    [TopLink Config]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--The primary key column name for the mapping element [test.jpa.entities.Person test.jpa.entities.Person.mgr] is being defaulted to: ID.
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end predeploying Persistence Unit Test; state Predeployed; factoryCount 0
    [TopLink Finer]: 2008.01.09 07:35:59.703--Thread(Thread[Main Thread,5,main])--cmp_init_transformer_is_null
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin predeploying Persistence Unit Test; state Predeployed; factoryCount 0
    [TopLink Finest]: 2008.01.09 07:35:59.703--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end predeploying Persistence Unit Test; state Predeployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:35:59.719--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin deploying Persistence Unit Test; state Predeployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:35:59.734--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.logging.level; value=FINEST; translated value=FINEST
    [TopLink Finest]: 2008.01.09 07:35:59.734--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.logging.level; value=FINEST; translated value=FINEST
    [TopLink Finest]: 2008.01.09 07:35:59.750--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.user; value=rms
    [TopLink Finest]: 2008.01.09 07:35:59.750--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.password; value=xxxxxx
    [TopLink Finest]: 2008.01.09 07:36:00.766--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.target-database; value=Oracle; translated value=oracle.toplink.essentials.platform.database.oracle.OraclePlatform
    [TopLink Finest]: 2008.01.09 07:36:00.781--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.driver; value=oracle.jdbc.OracleDriver
    [TopLink Finest]: 2008.01.09 07:36:00.781--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--property=toplink.jdbc.url; value=jdbc:oracle:thin:@//10.20.6.126:1521/region2
    [TopLink Info]: 2008.01.09 07:36:00.797--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--TopLink, version: Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))
    [TopLink Config]: 2008.01.09 07:36:00.812--ServerSession(1968077)--Connection(5182312)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.797--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.797--ServerSession(1968077)--Connection(5744890)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.875--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.891--ServerSession(1968077)--Connection(5760373)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:01.969--ServerSession(1968077)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:01.969--ServerSession(1968077)--Connection(5497095)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.047--ServerSession(1968077)--Connection(5500006)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.047--ServerSession(1968077)--Connection(5512041)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.125--ServerSession(1968077)--Connection(5514977)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.125--ServerSession(1968077)--Connection(5527528)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.203--ServerSession(1968077)--Connection(5530440)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Config]: 2008.01.09 07:36:02.203--ServerSession(1968077)--Connection(5542993)--Thread(Thread[Main Thread,5,main])--connecting(DatabaseLogin(
         platform=>OraclePlatform
         user name=> "rms"
         datasource URL=> "jdbc:oracle:thin:@//10.20.6.126:1521/region2"
    [TopLink Config]: 2008.01.09 07:36:02.281--ServerSession(1968077)--Connection(5545904)--Thread(Thread[Main Thread,5,main])--Connected: jdbc:oracle:thin:@//10.20.6.126:1521/region2
         User: RMS
         Database: Oracle  Version: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64bit Production
    With the Partitioning, OLAP and Data Mining options
         Driver: Oracle JDBC driver  Version: 10.2.0.1.0
    [TopLink Finest]: 2008.01.09 07:36:02.312--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing connected, state is Preallocation_NoTransaction_State
    [TopLink Info]: 2008.01.09 07:36:02.484--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--file:/D:/dev/bull/jpa_pk_bug/bin/-Test login successful
    [TopLink Finest]: 2008.01.09 07:36:02.484--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end deploying Persistence Unit Test; state Deployed; factoryCount 1
    [TopLink Finer]: 2008.01.09 07:36:02.516--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--client acquired
    [TopLink Finest]: 2008.01.09 07:36:02.531--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query DoesExistQuery()
    [TopLink Finest]: 2008.01.09 07:36:02.547--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--PERSIST operation called on: test.jpa.entities.Person@563c8c.
    [TopLink Finest]: 2008.01.09 07:36:02.562--ClientSession(5666151)--Thread(Thread[Main Thread,5,main])--Execute query ValueReadQuery()
    [TopLink Fine]: 2008.01.09 07:36:02.594--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--SELECT PERSONS_ID_SEQ.NEXTVAL FROM DUAL
    [TopLink Finest]: 2008.01.09 07:36:03.297--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing preallocation for PERSONS_ID_SEQ: objects: 1 , first: 5, last: 5
    [TopLink Finest]: 2008.01.09 07:36:03.312--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--assign sequence to the object (5 -> test.jpa.entities.Person@563c8c)
    [TopLink Finest]: 2008.01.09 07:36:03.328--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query ReadObjectQuery(test.jpa.entities.Person)
    [TopLink Fine]: 2008.01.09 07:36:03.438--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--SELECT ID, NAME, MGR_ID FROM Persons WHERE (ID = ?)
         bind => [1]
    [TopLink Finest]: 2008.01.09 07:36:03.531--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Register the existing object test.jpa.entities.Person@3a4484
    [TopLink Finer]: 2008.01.09 07:36:03.625--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--begin transaction
    [TopLink Finest]: 2008.01.09 07:36:03.625--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query UpdateObjectQuery(test.jpa.entities.Person@3a57fa)
    [TopLink Finest]: 2008.01.09 07:36:03.641--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query WriteObjectQuery(test.jpa.entities.Person@563c8c)
    [TopLink Fine]: 2008.01.09 07:36:03.656--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--INSERT INTO Persons (ID, NAME, MGR_ID) VALUES (?, ?, ?)
         bind => [5, Boss, null]
    [TopLink Fine]: 2008.01.09 07:36:03.688--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--UPDATE Persons SET MGR_ID = ? WHERE (ID = ?)
         bind => [5, 1]
    [TopLink Finer]: 2008.01.09 07:36:03.703--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--begin unit of work commit
    [TopLink Finest]: 2008.01.09 07:36:03.703--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Execute query UpdateObjectQuery(test.jpa.entities.Person@563c8c)
    [TopLink Warning]: 2008.01.09 07:36:03.812--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--Local Exception Stack:
    Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.exceptions.ValidationException.primaryKeyUpdateDisallowed(ValidationException.java:2222)
         at oracle.toplink.essentials.mappings.foundation.AbstractDirectMapping.writeFromObjectIntoRowWithChangeRecord(AbstractDirectMapping.java:750)
         at oracle.toplink.essentials.internal.descriptors.ObjectBuilder.buildRowForUpdateWithChangeSet(ObjectBuilder.java:948)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.updateObjectForWriteWithChangeSet(DatabaseQueryMechanism.java:1263)
         at oracle.toplink.essentials.queryframework.UpdateObjectQuery.executeCommitWithChangeSet(UpdateObjectQuery.java:91)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:390)
         at oracle.toplink.essentials.queryframework.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:109)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.execute(DatabaseQuery.java:628)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:555)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:138)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:110)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2233)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:952)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:909)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitChangedObjectsForClassWithChangeSet(CommitManager.java:309)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:195)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:2657)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1044)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitToDatabase(RepeatableWriteUnitOfWork.java:403)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithChangeSet(UnitOfWorkImpl.java:1126)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:107)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:856)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:102)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:60)
         at test.jpa.TestPkBug.runTest(TestPkBug.java:53)
         at test.jpa.TestPkBug.main(TestPkBug.java:95)
    [TopLink Finer]: 2008.01.09 07:36:03.828--ClientSession(5666151)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--rollback transaction
    [TopLink Finer]: 2008.01.09 07:36:03.844--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--release unit of work
    [TopLink Finer]: 2008.01.09 07:36:03.844--UnitOfWork(5663897)--Thread(Thread[Main Thread,5,main])--initialize identitymaps
    [TopLink Finer]: 2008.01.09 07:36:03.844--ClientSession(5666151)--Thread(Thread[Main Thread,5,main])--client released
    [TopLink Finest]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--begin undeploying Persistence Unit Test; state Deployed; factoryCount 1
    [TopLink Finest]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--sequencing disconnected
    [TopLink Config]: 2008.01.09 07:36:03.844--ServerSession(1968077)--Connection(4252099)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Finer]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--initialize identitymaps
    [TopLink Info]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--file:/D:/dev/bull/jpa_pk_bug/bin/-Test logout successful
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5747801)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5182312)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.859--ServerSession(1968077)--Connection(5500006)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5514977)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5530440)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.875--ServerSession(1968077)--Connection(5545904)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Config]: 2008.01.09 07:36:03.891--ServerSession(1968077)--Connection(5763284)--Thread(Thread[Main Thread,5,main])--disconnect
    [TopLink Finest]: 2008.01.09 07:36:03.891--ServerSession(1968077)--Thread(Thread[Main Thread,5,main])--end undeploying Persistence Unit Test; state Undeployed; factoryCount 0
    Exception in thread "Main Thread" javax.persistence.RollbackException: Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:120)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.EntityTransactionImpl.commit(EntityTransactionImpl.java:60)
         at test.jpa.TestPkBug.runTest(TestPkBug.java:53)
         at test.jpa.TestPkBug.main(TestPkBug.java:95)
    Caused by: Exception [TOPLINK-7251] (Oracle TopLink Essentials - 2.0 (Build b58g-fcs (09/07/2007))): oracle.toplink.essentials.exceptions.ValidationException
    Exception Description: The attribute [id] of class [test.jpa.entities.Person] is mapped to a primary key column in the database. Updates are not allowed.
         at oracle.toplink.essentials.exceptions.ValidationException.primaryKeyUpdateDisallowed(ValidationException.java:2222)
         at oracle.toplink.essentials.mappings.foundation.AbstractDirectMapping.writeFromObjectIntoRowWithChangeRecord(AbstractDirectMapping.java:750)
         at oracle.toplink.essentials.internal.descriptors.ObjectBuilder.buildRowForUpdateWithChangeSet(ObjectBuilder.java:948)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.updateObjectForWriteWithChangeSet(DatabaseQueryMechanism.java:1263)
         at oracle.toplink.essentials.queryframework.UpdateObjectQuery.executeCommitWithChangeSet(UpdateObjectQuery.java:91)
         at oracle.toplink.essentials.internal.queryframework.DatabaseQueryMechanism.executeWriteWithChangeSet(DatabaseQueryMechanism.java:390)
         at oracle.toplink.essentials.queryframework.WriteObjectQuery.executeDatabaseQuery(WriteObjectQuery.java:109)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.execute(DatabaseQuery.java:628)
         at oracle.toplink.essentials.queryframework.DatabaseQuery.executeInUnitOfWork(DatabaseQuery.java:555)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWorkObjectLevelModifyQuery(ObjectLevelModifyQuery.java:138)
         at oracle.toplink.essentials.queryframework.ObjectLevelModifyQuery.executeInUnitOfWork(ObjectLevelModifyQuery.java:110)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.internalExecuteQuery(UnitOfWorkImpl.java:2233)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:952)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.executeQuery(AbstractSession.java:909)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitChangedObjectsForClassWithChangeSet(CommitManager.java:309)
         at oracle.toplink.essentials.internal.sessions.CommitManager.commitAllObjectsWithChangeSet(CommitManager.java:195)
         at oracle.toplink.essentials.internal.sessions.AbstractSession.writeAllObjectsWithChangeSet(AbstractSession.java:2657)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabase(UnitOfWorkImpl.java:1044)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitToDatabase(RepeatableWriteUnitOfWork.java:403)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitToDatabaseWithChangeSet(UnitOfWorkImpl.java:1126)
         at oracle.toplink.essentials.internal.ejb.cmp3.base.RepeatableWriteUnitOfWork.commitRootUnitOfWork(RepeatableWriteUnitOfWork.java:107)
         at oracle.toplink.essentials.internal.sessions.UnitOfWorkImpl.commitAndResume(UnitOfWorkImpl.java:856)
         at oracle.toplink.essentials.internal.ejb.cmp3.transaction.base.EntityTransactionImpl.commit(EntityTransactionImpl.java:102)
          ... 3 more
     EDIT: Are you using the EXACT Test Case as I have described it in the previous posts? It's important that you commit(), and not rollback(), the transaction after the flush.
    EDIT: Updated the log because I found out that I had made a small change to the original Test Case while I was trying to find a workaround. The current log is produced by the EXACT Test Case I described in my previous posts.
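     For readers trying to reproduce this, here is a rough sketch of the mapping and test flow that the log above implies. The table, column and sequence names (Persons, ID, NAME, MGR_ID, PERSONS_ID_SEQ) and the persistence unit name "Test" are taken from the log itself; everything else, including the class shape and the runTest() steps, is my reconstruction and may differ from the original Test Case.
     import javax.persistence.*;

     @Entity
     @Table(name = "Persons")
     public class Person {
         @Id
         @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "personSeq")
         @SequenceGenerator(name = "personSeq", sequenceName = "PERSONS_ID_SEQ", allocationSize = 1)
         @Column(name = "ID")
         private long id;

         @Column(name = "NAME")
         private String name;

         @ManyToOne
         @JoinColumn(name = "MGR_ID")
         private Person manager;

         public void setName(String name) { this.name = name; }
         public void setManager(Person manager) { this.manager = manager; }
     }

     // Approximate flow of runTest(), reconstructed from the SQL in the log
     // (Person above and this class would live in separate source files):
     public class TestPkBugSketch {
         public static void main(String[] args) {
             EntityManagerFactory emf = Persistence.createEntityManagerFactory("Test");
             EntityManager em = emf.createEntityManager();
             em.getTransaction().begin();
             Person boss = new Person();
             boss.setName("Boss");
             em.persist(boss);                             // PERSIST + sequence preallocation
             Person existing = em.find(Person.class, 1L);  // ReadObjectQuery with ID = 1
             existing.setManager(boss);
             em.flush();                                   // INSERT Boss, then UPDATE Persons SET MGR_ID = ?
             em.getTransaction().commit();                 // TOPLINK-7251 is thrown here
             em.close();
             emf.close();
         }
     }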

  • Refiners for managed metadata column

    Hi,
     I have followed the procedure outlined here to set up the search schema for a metadata column that I want to show as a refiner:
     http://blogs.technet.com/b/sharepoint_made_easy/archive/2013/03/19/step-by-step-configuration-to-add-custom-refiners-in-the-refinement-panel-of-search-results-page-for-sharepoint-online.aspx
     My managed property is showing in the available refiners; however, no values are returned for it. Please note that I am mapping "RefinableString00" to a managed metadata column's crawled property. What am I doing wrong?

     Resolved it. I just had to initiate a crawl.

  • How to create a managed metadata column that allows terms from several term sets or groups?

     My requirement is to create a Managed Metadata column that allows terms that come from several term sets (or possibly from one group containing several term sets).
    Would it be possible ?
    Thanks.

    Hi,
     I don't know if this is possible; however, your managed metadata term sets should be designed so that the same type of value does not come from two different sets (for example, departments in your organization, or offices of your company). Let me know if you find a solution; it would be helpful to others.

  • Map the hierarchical queries pseudo column level

     Is it possible to map the hierarchical query pseudo-column LEVEL in the Workbench?
     I have an object for which I use CONNECT BY queries, and I would like to map the LEVEL pseudo-column. Is this possible? If yes, how?
     If the mapping is not possible, how do I get LEVEL using a report query?
     Is there any other way to retrieve LEVEL?
    Thanks

    Thanks James, appreciate your feedback.
    I tried report.addItem("level", builder.getFunction("LEVEL"));
    but the query TopLink generates has LEVEL() and it errors out.
    Code -
    Expression startExpr = null;
    Expression connectBy = builder.get("manager");
    Vector<Expression> order = new Vector<Expression>();
    order.addElement(builder.get("name"));
    report.setHierarchicalQueryClause(startExpr, connectBy, order);
    report.addAttribute("name");
    report.addItem("level", builder.getFunction("LEVEL"));
     Query generated by TopLink - SELECT NAME, ID, LEVEL() FROM EMPLOYEE WHERE (EMPLOYEE_TYPE = ?) CONNECT BY PRIOR EMPLOYEE.MANAGER_ID = EMPLOYEE.ID ORDER SIBLINGS BY NAME
    bind => [M]
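     A possible fallback that is not shown in this thread: if the Expression framework keeps rendering LEVEL as a function call, the CONNECT BY statement can be issued as raw SQL through a DataReadQuery, so LEVEL is emitted without parentheses. The table and column names below are taken from the generated SQL above; treat the rest as a sketch to adapt, not a confirmed TopLink recipe.
     import java.util.List;
     import oracle.toplink.queryframework.DataReadQuery;
     import oracle.toplink.sessions.Session;

     public class LevelQuerySketch {
         // Returns one record per row, including the LEVEL pseudo-column.
         public static List readWithLevel(Session session) {
             DataReadQuery query = new DataReadQuery();
             // Raw SQL keeps LEVEL exactly as Oracle expects it.
             query.setSQLString(
                 "SELECT NAME, ID, LEVEL FROM EMPLOYEE "
               + "CONNECT BY PRIOR EMPLOYEE.MANAGER_ID = EMPLOYEE.ID "
               + "ORDER SIBLINGS BY NAME");
             return (List) session.executeQuery(query);
         }
     }
     If you need to stay inside the Expression framework, it may be worth checking whether your TopLink version lets you add a literal expression instead of getFunction(), which always appends parentheses.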
    Edited by: amehta5 on May 6, 2010 6:11 AM

  • Reverse Map Schema

    While trying to run the reverse mapping tool on an existing Oracle
    Database(Version 9.2.0.1), I am getting an error "An Error occurred while
    accessing the Schema". Do we need some special settings in the Kodo
    Workbench other than the connection settings? Am I missing something?
    Here is my config settings...
    For the driver class : oracle.jdbc.driver.OracleDriver
    For the Connection URL :jdbc:oracle:thin:@IPAddress:1521:DBNAME

    By the way, if you launch the tool from the start menu, then it will
    probably use javaw and there will be no stack trace.
    "Marc Prud'hommeaux" <[email protected]> wrote in message
    news:[email protected]..
    Vijay-
    Can you post the complete stack trace of the exception?
    In article <cpkg6d$q7v$[email protected]>, Vijay Kumar Narayanan
    wrote:
    >>
    While trying to run the reverse mapping tool on an existing Oracle
     Database(Version 9.2.0.1), I am getting an error "An Error occurred while
    accessing the Schema". Do we need some special settings in the Kodo
    Workbench other than the connection settings? Am I missing something?
    Here is my config settings...
    For the driver class : oracle.jdbc.driver.OracleDriver
    For the Connection URL :jdbc:oracle:thin:@IPAddress:1521:DBNAME
    Marc Prud'hommeaux
    SolarMetric Inc.
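     When the Workbench only reports "An Error occurred while accessing the Schema", it can also help to rule out plain connectivity problems first. The snippet below is not from this thread; it is a minimal standalone check using the same driver class and URL style quoted above, with placeholder host, SID and credentials that you would replace with your own.
     import java.sql.Connection;
     import java.sql.DriverManager;

     public class OracleConnectionCheck {
         public static void main(String[] args) throws Exception {
             // Same driver class as in the Kodo connection settings.
             Class.forName("oracle.jdbc.driver.OracleDriver");
             // Placeholder URL: replace IPAddress and DBNAME with your values.
             String url = "jdbc:oracle:thin:@IPAddress:1521:DBNAME";
             Connection conn = DriverManager.getConnection(url, "user", "password");
             System.out.println("Connected to: "
                 + conn.getMetaData().getDatabaseProductVersion());
             conn.close();
         }
     }
     If this connects but the reverse mapping tool still fails, the stack trace (obtained by running from a console rather than the start menu, as suggested above) is the next thing to look at.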

  • No values for refiners for managed properties based managed metadata columns in SharePoint 2013 on premise

     On a SharePoint Server 2013 setup (on premise) we want to use a number of managed metadata columns as search refiners. The managed metadata column is called "Document Language", so it automatically creates a managed property called owstaxIdDocumentx0020Language,
     which I can use in a keyword query by typing owstaxIdDocumentx0020Language:"NL" in the search box, and it returns results. I also marked this managed property as refinable (in the Search service application) and it appears in the
     search refiners selection box, but it states that there are no values for this refiner.
     So I decided to create a new managed search property "DocumentLanguage" and mapped it to "ows_Document_x0020_Language". I marked it as queryable, searchable, refinable, retrievable and sortable, but this managed search property
     is not being filled.
    I already executed a number of full crawls but this does not seem to work. Anyone know what to do next?
    Rgds,
    Joris [http://jopx.blogspot.com]

     It has happened to me many times that the automatically generated managed property is not properly mapped to the crawled property. To find the correct crawled property, get the document item:
     $web = Get-SPWeb http://webUrl                  # site containing the library
     $list = $web.Lists["documentLibrary"]           # the document library to inspect
     $list.Items[0].Xml >> item.xml                  # dump the first item's field XML
     View item.xml and check whether ows_Document_x0020_Language contains data.
     Hope that helps | Amr Fouad | MCTS, MCPD SharePoint 2010

  • Create key mapping using import manager for lookup table FROM EXCEL file

    hello,
     I would like to create key mapping while importing the values via an Excel file.
     The source file contains the key, but how do I map it to the lookup table?
     The properties of the table have key mapping enabled, but during the mapping in Import Manager I can't find any way to map the key.
    eg
    lookup table contains:
    Material Group
    Code
    excel file contain
    MatGroup1  Code   System
    Thanks!
    Shanti

    Hi Shanti,
     Assuming you have already taken care of the points listed below:
     1) Set Key Mapping to "Yes" for your lookup table in MDM Console
     2) Created a new Remote System in MDM Console
     3) Ensured your account has the rights to update remote key values in Data Manager through Import Manager
     Your sample file can contain Material Group and Code alone, which can be exported from Data Manager via File -> Export To -> Excel if you already have data in Data Manager.
     Open your sample file in Import Manager, selecting the remote system for which you want to import the key mapping
     (do not select MDM as the remote system, which does not allow you to maintain key mapping values) and Excel as the file type.
     Now select your source and destination tables; under the destination fields you will see a new field called [Remote Key].
     Map your source and destination fields correspondingly, then clone your source field Code (right-click on Code in the source hierarchy) and map it to [Remote Key] if you want the code to be part of the remote key values.
     In the matching criteria, select the destination field Code as a matching field, and change the default import action to Update NULL Fields or Update Mapped Fields as required.
     After a successful import you can check the remote key values in Data Manager.
    Hope this helps
    Thanks
    Sowseel
