Dynamic reverse mapping
Hi,
I didn't manage to do the reverse mapping at runtime.
I have 2 components.
Their contexts are described below:
There is only one node in component 1, called "idioms".
In component 2 (the used component, called POPUP) there is a node called "columns".
I want to map idioms to columns at runtime, i.e. a reverse mapping.
I have the code below:
DATA: lo_node_info      TYPE REF TO if_wd_context_node_info,
      lo_dyn_node_info  TYPE REF TO if_wd_context_node_info,
      stru_mapping_info TYPE wdr_context_mapping_info,
      tab_mapping_path  TYPE wdr_ctx_element_path_segments,
      wa_path           TYPE wdr_ctx_element_name.

wa_path = 'COMPONENTCONTROLLER.COLUMNS'.
INSERT wa_path INTO TABLE tab_mapping_path.

stru_mapping_info-component_usage = 'POPUP'.
stru_mapping_info-controller      = 'COMPONENTCONTROLLER'.
stru_mapping_info-path            = tab_mapping_path.

lo_node_info = wd_context->get_node_info( ).

* Map the context node dynamically
CALL METHOD lo_node_info->add_new_mapped_child_node
  EXPORTING
    child_name      = 'IDIOMS'
    mapping_info    = stru_mapping_info
  RECEIVING
    child_node_info = lo_dyn_node_info.
The mapping does not get created. Why?
Thanks a lot.
Regards,
Jorge Luiz
Hi David,
Thanks for your help.
I have done what you said, but I am receiving the error below:
Adapter error in &VIEW_ELEMENT_TYPE& "TABS" of view "YSWD_HR_TABLE_POPUP.MAIN": Context binding of property DATA_SOURCE cannot be resolved: Subnode MAIN.IDIOMA_POPUP does not exist
I am passing the node IDIOMA_POPUP as a parameter of the set_data( ) method of the used component.
TABS is the table that I want to populate.
Regards,
Jorge Luiz
Similar Messages
-
Dynamic reverse context node mapping at runtime
Hi!
I have an application which uses an embedded component containing an ALV table that shows the data of a transparent table.
Everything works fine if I'm working with a static table with static attributes and map the context node of the application to the node in the embedded component. The context node in the embedded component is reverse-mapped and gets its data from the application's context node.
Now I want to do it dynamically. Inside the application the user should choose the name of a table, and I'm using RTTI functions to get its technical information. At runtime I'm dynamically adding attributes (i.e. field name, type, length) to a static context node called 'DATA'.
The ALV component of the embedded component is mapped to an interface context node named 'DATA' which uses reverse mapping to receive data from the application's context node 'DATA'.
What I need to know is how to create the mapping between the application context node and the interface context node of the embedded component at runtime, after I have created the attributes of the node dynamically. Is that possible at all?
I've already tried to map both nodes statically to see if the dynamically created attributes are then mapped automatically. The mapping seems to be fine at node level, but doesn't work for the later dynamically added attributes.
Could someone point me to information on whether dynamically mapping context nodes is possible at all, and how to do it?
Regards
Ralf-J.
Hi Lars!
Thanks for the link! The problem isn't filling the context node dynamically; that's already working using RTTI etc.
It seems to me that the interface context node of the ALV in the embedded component doesn't recognize the dynamically added attributes. As I wrote, the binding at node level (DATA->DATA), which I'm doing at design time, works as expected, but the added attributes only show up in the context node of the using component, not in the embedded one.
I'm still thinking there must be a way to do a binding at runtime. On the other hand, Glenn's comment in the thread you pointed me to seems to suggest that the ALV itself might have problems with dynamically appended attributes.
Regards
Ralf-J. -
Dynamic reverse context mapping
hi,
can someone explain how to use dynamic reverse context mapping in my Web Dynpro application?
I have read in other SDN threads that the methods ADD_NEW_MAPPED_CHILD_NODE and SET_MAPPING_COMPLETE of IF_WD_CONTEXT_NODE_INFO might be the solution.
Does anyone know where and how to use these methods? Please explain the principle and which parameters I should use, for example CONTEXT_PATH and MAPPED_CONTROLLER with the SET_MAPPING_COMPLETE method.
Hi Thorsten,
I don't think you can achieve dynamic reverse context mapping using the above-mentioned methods.
Please find below a scenario in which we can achieve dynamic reverse context mapping programmatically.
The situation is: within my WD component I created a context node in the component controller, dynamically or statically. Now I want to map this node to a node of another component at runtime.
Here my WD component is the parent component and I am using a child component, say SALV_WD_TABLE, which has a context node DATA. In order to populate data in the ALV, the DATA node must be supplied with the necessary contents either at runtime or at design time.
The runtime mapping is as follows.
DATA: lo_componentusage      TYPE REF TO if_wd_component_usage,
      lo_interfacecontroller TYPE REF TO iwci_salv_wd_table.

lo_componentusage = wd_this->wd_cpuse_ref_salv( ).
IF lo_componentusage->has_active_component( ) IS INITIAL.
  lo_componentusage->create_component( ).
ENDIF.

lo_interfacecontroller = wd_this->wd_cpifc_ref_salv( ).
CALL METHOD lo_interfacecontroller->set_data
  EXPORTING
    r_node_data = lo_nd_flight_salv.
lo_nd_flight_salv is a reference to the node in your WD component which contains the data to be displayed.
Hope I could give you a clue...
Cheers,
Sankar -
Reverse Mapping Tutorial - Finder.java queries the wrong table?!
I have been almost successful in running the Reverse Mapping Tutorial, by
creating Java Classes from the hsqldb sample database, and running the JDO
Enhancer on them.
However, I cannot get the Finder.java to work. It seems to look in the wrong table: MAGAZINEX instead of MAGAZINE?
Did anyone have trouble with this step, or run it successfully?
Liviu
PS: here is the trace:
0 [main] INFO kodo.Runtime - Starting Kodo JDO version 2.4.2 (kodojdo-2.4.2-20030326-1841) with capabilities: [Enterprise Edition Features, Standard Edition Features, Lite Edition Features, Evaluation License, Query Extensions, Datacache Plug-in, Statement Batching, Global Transactions, Developer Tools, Custom Database Dictionaries, Enterprise Databases]
70 [main] WARN kodo.Runtime - WARNING: Kodo JDO Evaluation expires in 25 days. Please contact [email protected] for information on extending your evaluation period or purchasing a license.
68398 [main] INFO kodo.MetaData - com.solarmetric.kodo.meta.JDOMetaDataParser@19eda2c: parsing source: file:/C:/Documents%20and%20Settings/default/jbproject/JDO/classes/reversetutorial.jdo
74577 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
75689 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@17918f0[[requests=0;size=0;max=70;hits=0;created=0;redundant=0;overflow=0;new=0;leaked=0;unavailable=0]]
75699 [main] INFO jdbc.JDBC - [ C:24713456; T:31737213; D:22310332 ] close connection
77331 [main] INFO jdbc.JDBC - Using dictionary class "com.solarmetric.kodo.impl.jdbc.schema.dict.HSQLDictionary" to connect to "HSQL Database Engine" (version "1.7.0") with JDBC driver "HSQL Database Engine Driver" (version "1.7.0")
1163173 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] open: jdbc:hsqldb:hsql_sample_database (sa)
1163293 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] preparing statement <17940412>: SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX
1163313 [main] INFO jdbc.SQL - [ C:3093871; T:31737213; D:22310332 ] executing statement <17940412>: [reused=1;params={}]
1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close: com.solarmetric.datasource.PoolConnection@2f356f[[requests=1;size=0;max=70;hits=0;created=1;redundant=0;overflow=0;new=1;leaked=0;unavailable=0]]
1163443 [main] INFO jdbc.JDBC - [ C:3093871; T:31737213; D:22310332 ] close connection
Hit uncaught exception javax.jdo.JDOFatalDataStoreException
javax.jdo.JDOFatalDataStoreException: com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
[PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX] [code=-22;state=S0002]
NestedThrowables:
com.solarmetric.kodo.impl.jdbc.sql.SQLExceptionWrapper:
[SQL=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
[PRE=SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
at com.solarmetric.kodo.impl.jdbc.runtime.SQLExceptions.throwFatal(SQLExceptions.java:17)
at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:283)
at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
at reversetutorial.Finder.main(Finder.java:32)
NestedThrowablesStackTrace:
java.sql.SQLException: Table not found: S0002 Table not found: MAGAZINEX in statement [SELECT DISTINCT MAGAZINEX.JDOCLASSX FROM MAGAZINEX]
at org.hsqldb.Trace.getError(Trace.java:226)
at org.hsqldb.jdbcResultSet.<init>(jdbcResultSet.java:6595)
at org.hsqldb.jdbcConnection.executeStandalone(jdbcConnection.java:2951)
at org.hsqldb.jdbcConnection.execute(jdbcConnection.java:2540)
at org.hsqldb.jdbcStatement.fetchResult(jdbcStatement.java:1804)
at org.hsqldb.jdbcStatement.executeQuery(jdbcStatement.java:199)
at org.hsqldb.jdbcPreparedStatement.executeQuery(jdbcPreparedStatement.java:391)
at com.solarmetric.datasource.PreparedStatementWrapper.executeQuery(PreparedStatementWrapper.java:93)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executePreparedQueryInternal(SQLExecutionManagerImpl.java:771)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQueryInternal(SQLExecutionManagerImpl.java:691)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:372)
at com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.executeQuery(SQLExecutionManagerImpl.java:356)
at com.solarmetric.kodo.impl.jdbc.ormapping.SubclassProviderImpl.getSubclasses(SubclassProviderImpl.java:246)
at com.solarmetric.kodo.impl.jdbc.ormapping.ClassMapping.getPrimaryMappingFields(ClassMapping.java:1093)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.executeQuery(JDBCStoreManager.java:704)
at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.executeQuery(JDBCQuery.java:93)
at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:792)
at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:595)
at reversetutorial.Finder.main(Finder.java:32)
The reason I did not run importtool is because ... I actually ran it, but it was not successful.
I now tried the solutions directory, from the kodo distribution, and that
failed as well. Here is what I did:
- I went to reversetutorial/solutions, compiled all the classes, and then placed them into a reversetutorial folder (to match the package)
- ran "rd-importtool reversetutorial.mapping" (the mapping file from the solutions directory), which failed as below:
0 [main] INFO kodo.MetaData - Parsing metadata resource "file:/C:/kodo/reversetutorial/solutions/reversetutorial.mapping".
Exception in thread "main" com.solarmetric.rd.kodo.meta.JDOMetaDataNotFoundException: No JDO metadata was found for type "class reversetutorial.Article".
FailedObject:class reversetutorial.Article
at com.solarmetric.rd.kodo.meta.JDOMetaDataRepositoryImpl.getMetaData(JDOMetaDataRepositoryImpl.java:148)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
Any idea why? The solutions directory should work, right? I even tried
specifying a kodo.properties file, but it did not seem to help.
Liviu
"Abe White" <[email protected]> wrote in message
news:[email protected]...
Running the reversemappingtool creates classes, metadata files, and a
.mapping file. That .mapping file contains all the O/R mapping
information for how the generated classes map to your existing database
tables. What the importtool does is just transfer that mapping
information to the metadata files, in the form of <extension> elements.
The reason this is a separate step will be clear once Kodo 3.0 comes out.
So in sum, the importtool does not affect the database in any way. It
just moves information from one format (.mapping file) to another
(<extension> elements in the .jdo file). -
hi,
can you please explain Dynamic Value Mapping and where it is actually used?
thanks
guna
Hi,
Dynamic value mapping is nothing but FixValues and ValueMapping under Conversion Functions.
Eg:
Suppose you have a requirement where the values in the source are mapped to some other value in the target, as below:
1 --> Mr
2 --> MS
3 --> MRS
FixValues is used when you know the entire set of key-value pairs at design time. You give the key and the value in FixValues, and the mapping checks and maps the values to the target.
In the case of Value Mapping, you maintain this key-value pair in the Integration Directory, which lets you make changes easily, and you can also use it in the mapping in the IR.
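In code terms, the FixValues case above is just a fixed lookup table applied during the mapping. A minimal sketch follows; the class and method names here are illustrative only and are not part of the XI/PI API:

```java
import java.util.HashMap;
import java.util.Map;

public class FixValuesDemo {
    // The fixed key-value pairs from the example above, known at design time.
    private static final Map<String, String> TITLES = new HashMap<>();
    static {
        TITLES.put("1", "Mr");
        TITLES.put("2", "MS");
        TITLES.put("3", "MRS");
    }

    // Map a source value to its target value; unknown keys pass through unchanged.
    public static String fixValues(String source) {
        return TITLES.getOrDefault(source, source);
    }
}
```

With Value Mapping, the same lookup table would instead be maintained in the Integration Directory, so the pairs can be changed without touching the mapping itself.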
Refer to these blogs:
ValueMapping using the Graphical Mapping Tool
Value Mapping replication
Accessing Value Mapping defined in Directory using Java functions
Dynamic Date Conversion in Message Mapping
Dynamic Configuration of Some Communication Channel Parameters using Message Mapping
Dynamic file name (XSLT Mapping with Java Enhancement) using XI 3.0 SP12 Part II
and also
Refer this link
http://help.sap.com/saphelp_nw2004s/helpdata/en/d7/e551cf896c3a49bb87bb4ce38c99c8/frameset.htm (external context mapping)
Regards,
Suryanarayana -
Dynamic Tempo Mapping In Audition CC
Hello Adobe,
I've seen various forum postings about this in the past, but nothing very recently. I just downloaded the latest version of Audition CC. It looks great, a lot of cool new features, but I wanted to reiterate on past postings that dynamic tempo mapping functionality would be extremely useful. It doesn't need to be anything too fancy. Just need to be able to tell Audition that after 30 bars of 4/4 at 120 bpm, I'd like the metronome and bars/beats display to change to 3/4 at 130 bpm etc.
Just want to bump this back onto your radar. Thanks.
-Brendan
I've used the Ozone with Audition in the past (we're talking about the keyboard/audio combo device, right?) and it's been fine. If you're seeing the playhead move and the level meters update when attempting to play through it, that means the communication should be there.
Any chance you can show us screenshots of the "Audio Hardware" and "Audio Channel Mapping" panels in the preferences dialog? That might help get started. -
Reverse mapping tool in 3.0.1 ignores "-schemas" option
I believe I have discovered a bug in the 3.0.1 version of the reverse
mapping tool.
Here is a script of the commands that worked fine in 3.0.0:
Script started on Mon Jan 12 11:02:19 2004
1$ which schemagen
/opt/kodo-jdo-3.0.0/bin/schemagen
2$ echo $PATH
/opt/kodo-jdo-3.0.0/bin:/sw/db/oracle/oracle817/bin:/sw/gen/sparc-sun-solaris2.9/acroread/5.06/bin:/sw/gen/sparc-sun-solaris2.9/cvs/1.11.5/bin:/sw/gen/sparc-sun-solaris2.9/esound/0.2.29/bin:/sw/gen/sparc-sun-solaris2.9/mpg123/0.59r/bin:/usr/bin:/sw/gen/sparc-sun-solaris2.9/gnupg/1.2.1/bin:/sw/gen/sparc-sun-solaris2.9/mozilla/1.3/bin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/sbin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/bin:/sw/pd/workman-1.3.4/bin:/usr/openwin/bin:/usr/bin:/sbin:/bin:/usr/sbin:/usr/ccs/bin:/usr/ucb:/opt/local/bin:/sw/modules/bin:/sw/com/bin:/sw/pd/bin:/sw/pd/office52/program:/sw/pd/RealPlayer8:/users/n9208/bin:/opt/openssh/bin:/usr/dt/bin:/usr/dt/bin:/usr/openwin/bin:/sw/db/tools/bin:/sw/db/iss/bin:/usr/local/bin:/usr/local/scripts
3$ echo $CLASSPATH
:/opt/oracle/oracle9.0.1.4.zip:/opt/kodo-jdo-3.0.0:/opt/kodo-jdo-3.0.0/lib/kodo-jdo-runtime.jar:/opt/kodo-jdo-3.0.0/lib/kodo-jdo.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-collections-2.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-lang-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-logging-1.0.3.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-commons-pool-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jakarta-regexp-1.1.jar:/opt/kodo-jdo-3.0.0/lib/jca1.0.jar:/opt/kodo-jdo-3.0.0/lib/jdbc-hsql-1_7_0.jar:/opt/kodo-jdo-3.0.0/lib/jdbc2_0-stdext.jar:/opt/kodo-jdo-3.0.0/lib/jdo-1.0.1.jar:/opt/kodo-jdo-3.0.0/lib/jndi.jar:/opt/kodo-jdo-3.0.0/lib/jta-spec1_0_1.jar:/opt/kodo-jdo-3.0.0/lib/log4j-1.2.6.jar:/opt/kodo-jdo-3.0.0/lib/xalan.jar:/opt/kodo-jdo-3.0.0/lib/xercesImpl.jar:/opt/kodo-jdo-3.0.0/lib/xml-apis.jar:/opt/kodo-jdo-3.0.0/lib/jfreechart-0.9.13.jar:/opt/kodo-jdo-3.0.0/lib/jcommon-0.8.8.jar
4$ schemagen -p kodo.properties -f schema.xml -schemas PRODTRDTA.F0101
0 INFO [main] kodo.Tool - Schema generator running on schemas "PRODTRDTA.F0101". This process may take some time. Enable the kodo.jdbc.Schema logging category to see messages about the collection of schema data.
136 INFO [main] jdbc.Schema - Reading table information for schema name "PRODTRDTA", table name "F0101".
672 INFO [main] jdbc.Schema - Reading column information for table "PRODTRDTA.F0101".
727 INFO [main] jdbc.Schema - Reading primary keys for schema name "PRODTRDTA", table name "F0101".
2187 INFO [main] jdbc.Schema - Reading indexes for schema name "PRODTRDTA", table name "F0101".
2432 INFO [main] jdbc.Schema - Reading foreign keys for schema name "PRODTRDTA", table name "F0101".
2632 INFO [main] kodo.Tool - Writing XML schema.
5$
script done on Mon Jan 12 11:03:14 2004
Note the first line of logging output: both the schema name and table name
are properly recognized.
Here is the scripted output of the same commands in 3.0.1:
Script started on Mon Jan 12 10:29:03 2004
1$ which schemagen
/opt/kodo-jdo-3.0.1/bin/schemagen
2$ echo $PATH
/opt/kodo-jdo-3.0.1/bin:/sw/db/oracle/oracle817/bin:/sw/gen/sparc-sun-solaris2.9/acroread/5.06/bin:/sw/gen/sparc-sun-solaris2.9/cvs/1.11.5/bin:/sw/gen/sparc-sun-solaris2.9/esound/0.2.29/bin:/sw/gen/sparc-sun-solaris2.9/mpg123/0.59r/bin:/usr/bin:/sw/gen/sparc-sun-solaris2.9/gnupg/1.2.1/bin:/sw/gen/sparc-sun-solaris2.9/mozilla/1.3/bin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/sbin:/sw/gen/sparc-sun-solaris2.9/openssh/3.7.1p2/bin:/sw/pd/workman-1.3.4/bin:/usr/openwin/bin:/usr/bin:/sbin:/bin:/usr/sbin:/usr/ccs/bin:/usr/ucb:/opt/local/bin:/sw/modules/bin:/sw/com/bin:/sw/pd/bin:/sw/pd/office52/program:/sw/pd/RealPlayer8:/users/n9208/bin:/opt/openssh/bin:/usr/dt/bin:/usr/dt/bin:/usr/openwin/bin:/sw/db/tools/bin:/sw/db/iss/bin:/usr/local/bin:/usr/local/scripts
3$ echo $CLASSPATH
:/opt/oracle/oracle9.0.1.4.zip:/opt/kodo-jdo-3.0.1:/opt/kodo-jdo-3.0.1/lib/kodo-jdo-runtime.jar:/opt/kodo-jdo-3.0.1/lib/kodo-jdo.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-collections-2.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-lang-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-logging-1.0.3.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-commons-pool-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jakarta-regexp-1.1.jar:/opt/kodo-jdo-3.0.1/lib/jca1.0.jar:/opt/kodo-jdo-3.0.1/lib/jdbc-hsql-1_7_0.jar:/opt/kodo-jdo-3.0.1/lib/jdbc2_0-stdext.jar:/opt/kodo-jdo-3.0.1/lib/jdo-1.0.1.jar:/opt/kodo-jdo-3.0.1/lib/jndi.jar:/opt/kodo-jdo-3.0.1/lib/jta-spec1_0_1.jar:/opt/kodo-jdo-3.0.1/lib/log4j-1.2.6.jar:/opt/kodo-jdo-3.0.1/lib/xalan.jar:/opt/kodo-jdo-3.0.1/lib/xercesImpl.jar:/opt/kodo-jdo-3.0.1/lib/xml-apis.jar:/opt/kodo-jdo-3.0.1/lib/jfreechart-0.9.13.jar:/opt/kodo-jdo-3.0.1/lib/jcommon-0.8.8.jar:/opt/kodo-jdo-3.0.1/lib/jline.jar:/opt/kodo-jdo-3.0.1/lib/sqlline.jar
4$ schemagen -p kodo.properties -f schema.xml -schemas PRODTRDTA.F0101
1 INFO [main] kodo.Tool - Schema generator running on schemas "all". This process may take some time. Enable the kodo.jdbc.Schema logging category to see messages about the collection of schema data.
103 INFO [main] jdbc.Schema - Reading table information for schema name "null", table name "null".
Exception in thread "main" java.lang.OutOfMemoryError
5$
script done on Mon Jan 12 11:01:45 2004
Note the first line of logging output here: the schema is listed as "all" instead of the limited scope I had specified.
This run eventually crashes because the account in which I am running the mapping tool has access to thousands of tables, so the JVM eventually runs out of available heap.
My workaround is to fall back to 3.0.0.
Thanks for the report. We noticed this ourselves a short while ago.
The bug will be fixed in 3.0.2. -
Hi,
Gurus,
I'm working on an IDoc-to-file scenario and I have to implement MDM dynamic value mapping for one of the IDoc fields. I have no idea about MDM dynamic value mapping; can anybody help me with a step-by-step procedure to implement it? I'm not a Java guy...
Thanks!!
Waiting for reply...
Hi,
This might help you,
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/62770ffa-0301-0010-a0b2-c77294a3902e
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5a9c405f-0a01-0010-0980-fa5082e517e6
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/603ca425-0c01-0010-cdb2-c10d13c43631
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/4ce0d691-0b01-0010-f4aa-c938d438ceb2
Setting up algorithm on MDM Client
Regards
Agasthuri Doss -
Value Mapping and Dynamic Value Mapping
Hi Experts,
Could you please give a example for Value Mapping and Dynamic Value Mapping?
Regards
Sara
Sara,
I am assuming that you are going to use the receiver JDBC adapter to select the data from the database.
In this case, take a look at this blog of mine to understand how the datatypes should be created for the request and response of the JDBC SELECT query.
/people/bhavesh.kantilal/blog/2006/07/03/jdbc-receiver-adapter--synchronous-select-150-step-by-step
The only difference is that in the mapping, when you create the request message for the JDBC adapter, the columns you create are determined from the source; i.e., if value = 1, only the required columns should be mapped as blank constants, and likewise for the other requirement.
The columns you do not want to select should not be created in the output of your request mapping.
Regards
Bhavesh -
3.1.4 reverse mapping tool issue
(Sorry for the duplicate posting...I meant to start a new thread with
this but accidentally posted it as a reply to a 6-month old thread)
Hello,
I was running Kodo 3.0.2 when Abe and I had the exchange reproduced
below back in January to deal with Oracle tables with "$" in the column
names (which I subsequently updated to 3.0.3). The original subject of
this discussion was "3.0.2 reverse mapping tool generates invalid
..mapping file".
I was able to get this working by running the following commands to
implement Abe's suggestion:
reversemappingtool -p kodo.properties -package db \
-cp custom.properties -ds false schema.xml
sed -e 's/\$/__DOLLAR__/' db/package.mapping > db/package.mapping.new
mv db/package.mapping.new db/package.mapping
javac db/*.java
mappingtool -p kodo.properties -a import db/package.mapping
sed -e 's/__DOLLAR__/\$/' db/package.jdo > db/package.jdo.new
mv db/package.jdo.new db/package.jdo
In my custom.properties file, I had lines like these to put useful names
on my class's fields:
db.TransactionDetailHistory.y$an8.rename : addressNumber
As I said, in 3.0.3, this worked perfectly.
I picked this code back up for the first time since getting it working 6
months ago, and decided to update it to 3.1.4 (since I'm already using
that on other projects). Problem is, the reverse mapping tool has
changed and the code it generates no longer works as it once did. I
tried running the 3.1.2 and 3.1.0 reverse mapping tool, and it failed
the same way, so it looks like this change happened in the 3.0.x to
3.1.x version change.
What happens is this: In the generated Java source, my fields used to
end up with names as per my specification (e.g., the Oracle column named
"y$an8" showed up as "addressNumber" in the java source).
However, it looks like the "$" became special somehow in 3.1.0 - the
"y$an8" column now shows up as "yAn8" in the generated Java. I tried
changing my custom.properties file accordingly, but it still shows up as
yAn8 even after changing my mapping to look like this:
db.TransactionDetailHistory.yAn8.rename : addressNumber
What do you make of this?
Thanks,
Bill
Abe White wrote:
> Hmmm... this is a problem. '$' is not legal in XML names, and there
is no standard way to escape it.
>
> Your best bet is probably to do the following:
> 1. In the generated .mapping file, replace all '$' characters with
another token, such as '--DOLLAR--'.
> 2. Switch your properties to use the metadata mapping factory:
> kodo.jdbc.MappingFactory: metadata
> 3. Import your mappings into the metadata mapping factory:
> mappingtool -a import package.mapping
> 4. Delete the mapping file.
> 5. In your .jdo file, replace '--DOLLAR--' with '$' again.
>
> The metadata mapping factory doesn't put column names in its XML attribute names, so you should be able to use it safely.

William-
However, it looks like the "$" became special somehow in 3.1.0 - the
"y$an8" column now shows up as "yAn8" in the generated Java. I tried
changing my custom.properties file accordingly, but it still shows up as
yAn8 even after changing my mapping to look like this:
db.TransactionDetailHistory.yAn8.rename : addressNumber
Well, the reverse mapping tool makes some assumptions based on common
naming strategies for relational databases and Java naming: columns like
"FIRST_NAME" will be renamed to "firstName". The Reverse Mapping tool is
seeing the "$" and treating it as a non-alphanumeric delimiter, so is
fixing it.
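The renaming heuristic described above can be sketched roughly as follows; this is an illustration of the general strategy (non-alphanumeric characters treated as delimiters), not Kodo's actual implementation:

```java
public class ColumnNamer {
    // Turn a column name like "FIRST_NAME" or "y$an8" into a Java-style field
    // name. Any non-alphanumeric character (underscore, '$', ...) is treated
    // as a delimiter: it is dropped and the character after it is capitalized.
    public static String toFieldName(String column) {
        StringBuilder sb = new StringBuilder();
        boolean upperNext = false;
        for (char c : column.toLowerCase().toCharArray()) {
            if (!Character.isLetterOrDigit(c)) {
                upperNext = true;                    // delimiter: drop it
            } else if (upperNext && sb.length() > 0) {
                sb.append(Character.toUpperCase(c)); // capitalize after delimiter
                upperNext = false;
            } else {
                sb.append(c);
                upperNext = false;
            }
        }
        return sb.toString();
    }
}
```

Under a heuristic like this, "y$an8" becomes "yAn8" before any rename is applied, which may be why a custom.properties key written against the raw column name stops matching.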
Can you try a couple of additional properties:
db.TransactionDetailHistory.y$An8.rename: addressNumber
db.TransactionDetailHistory.y$an8.rename: addressNumber
Also, are other rename properties working for you, or is that the only
field or class you attempt to rename? It might just be the case that
you aren't correctly specifying the properties file or something.
Finally, bear in mind that you can always implement your own
kodo.jdbc.meta.ReverseCustomizer and just use that; not the easiest
solution, but it can certainly be used to have very fine-grained control
over the exact names that are generated.
In article <[email protected]>, William Korb wrote:
--
Marc Prud'hommeaux
SolarMetric Inc. -
Ssh has stopped working - reverse mapping causes segmentation fault
This was working on Friday, believe me. I haven't done anything that I'm aware of (apart from reboot the machine) to change things, except in trying to fix it.
Briefly, ssh crashes out with a segmentation fault and a crash log (below). Poking around with verbosity gives (real ip obscured):
% ssh ip4 -vvvv
OpenSSH_3.8.1p1, OpenSSL 0.9.7i 14 Oct 2005
debug1: Reading configuration data /etc/ssh_config
debug2: ssh_connect: needpriv 0
debug1: Connecting to ip4 [ip4] port 22.
debug1: Connection established.
debug1: identity file /Users/rpg/.ssh/identity type -1
debug1: identity file /Users/rpg/.ssh/id_rsa type -1
debug1: identity file /Users/rpg/.ssh/id_dsa type -1
debug1: Remote protocol version 1.99, remote software version OpenSSH_3.8.1p1
debug1: match: OpenSSH_3.8.1p1 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_3.8.1p1
debug3: Trying to reverse map address ip4.
Segmentation fault
I get similar reports for %ssh FQDN and %ssh $USER@[FQDN|ip4].
Although I'm trying to ssh to a machine on another continent, trying to ssh into my own machine (from a Terminal window on my own machine) also does not work. Setting UseDNS no in /etc/sshd_config on my machine does not help. Oddly, trying to ssh to my own machine by
%ssh 127.0.0.1 gives
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_3.8.1p1
debug3: Trying to reverse map address 127.0.0.1.
debug1: An invalid name was supplied
Configuration file does not specify default realm
debug1: An invalid name was supplied
A parameter was malformed
Validation error
debug1: An invalid name was supplied
Configuration file does not specify default realm
debug1: An invalid name was supplied
A parameter was malformed
Validation error
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
and then works (after a lot more info).
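For context on the log line itself: the "Trying to reverse map address" step is just a reverse DNS (PTR) lookup, mapping an IP back to a hostname. A sketch of the equivalent call using only the Python standard library (the crash reported above actually happens in Kerberos code invoked around the same stage, not in the lookup itself):

```python
# Reverse-map an IP address to a hostname via a PTR lookup, the same
# operation ssh's "Trying to reverse map address" debug line refers to.
import socket

def reverse_map(ip):
    """Return the PTR hostname for ip, or None if there is no record."""
    try:
        hostname, _aliases, _addrs = socket.gethostbyaddr(ip)
        return hostname
    except (socket.herror, socket.gaierror):
        return None  # no PTR record / resolver failure; ssh logs this and moves on
```

A clean failure just returns no name; it should never crash the client.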
I can ssh into this machine from elsewhere, I just can't ssh out. Below the crash log is a report from running a server with
% sudo sshd -D -ddd -e -p 10000
and connecting with
% ssh -vvv -p 10000 $USER@FQDN
ssh crash log:
Date/Time: 2006-05-29 15:42:09.284 +1000
OS Version: 10.4.6 (Build 8I1119)
Report Version: 4
Command: ssh
Path: /usr/bin/ssh
Parent: bash [1020] (note: also fails under tcsh)
Version: ??? (???)
PID: 1021
Thread: 0
Exception: EXC_BAD_ACCESS (0x0001)
Codes: KERN_INVALID_ADDRESS (0x0001) at 0xb1d255e4
Thread 0 Crashed:
0 libstdc++.6.dylib 0x90b37e3a __cxa_get_globals + 324
1 libstdc++.6.dylib 0x90b3853a __gxx_personality_v0 + 658
2 libgcc_s.1.dylib 0x90bcabf7 _Unwind_RaiseException + 147
3 libstdc++.6.dylib 0x90b38857 __cxa_throw + 87
4 edu.mit.Kerberos 0x94c4a238 CCIContextDataMachIPCStub::OpenCCache(std::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) + 314
5 edu.mit.Kerberos 0x94c49fde CCEContext::OpenCCache(cc_context_d*, char const*, cc_ccache_d**) + 160
6 edu.mit.Kerberos 0x94c49d5e cc_open + 64
7 edu.mit.Kerberos 0x94c49bf6 krb5_stdcc_resolve + 182
8 edu.mit.Kerberos 0x94c4f1a1 __KLGetCCacheByName + 254
9 edu.mit.Kerberos 0x94c4ee8a __KLAcquireInitialTicketsForCache + 179
10 edu.mit.Kerberos 0x94c4ed7f krb5int_cc_default + 85
11 edu.mit.Kerberos 0x94c40215 krb5_gss_acquire_cred + 2409
12 edu.mit.Kerberos 0x94c4ed11 kg_get_defcred + 73
13 edu.mit.Kerberos 0x94c4da14 krb5_gss_init_sec_context + 208
14 ssh 0x00024305 0x1000 + 144133
15 ssh 0x000246f4 0x1000 + 145140
16 ssh 0x000247fb 0x1000 + 145403
17 ssh 0x0000c462 0x1000 + 46178
18 ssh 0x0000a251 0x1000 + 37457
19 ssh 0x000042c7 0x1000 + 12999
20 ssh 0x000025f2 0x1000 + 5618
21 ssh 0x0000250d 0x1000 + 5389
Thread 0 crashed with i386 Thread State:
eax: 0x00000000 ebx: 0x90b3880d ecx:0xbfffda7c edx: 0xa4c425a0
edi: 0xb1d255e4 esi: 0xa4c425a0 ebp:0xbfffd9e8 esp: 0xbfffd9b0
ss: 0x0000002f efl: 0x00010246 eip:0x90b37e3a cs: 0x00000027
ds: 0x0000002f es: 0x0000002f fs:0x00000000 gs: 0x00000037
sudo sshd -D -ddd -e -p 10000:
debug2: read_server_config: filename /etc/sshd_config
debug1: sshd version OpenSSH_3.8.1p1
debug1: private host key: #0 type 0 RSA1
debug3: Not a RSA1 key file /etc/ssh_host_rsa_key.
debug1: read PEM private key done: type RSA
debug1: private host key: #1 type 1 RSA
debug3: Not a RSA1 key file /etc/ssh_host_dsa_key.
debug1: read PEM private key done: type DSA
debug1: private host key: #2 type 2 DSA
debug1: Bind to port 10000 on ::.
Server listening on :: port 10000.
debug1: Bind to port 10000 on 0.0.0.0.
Server listening on 0.0.0.0 port 10000.
Generating 768 bit RSA key.
RSA key generation complete. <- pause here
debug1: Server will not fork when running in debugging mode.
Connection from ip4 port 50148
debug1: Current Session ID is 00B16810 / Session Attributes are 00008030
debug1: Creating new security session...
debug1: New Session ID is 0F7C2940 / Session Attributes are 00009020
debug1: Client protocol version 2.0; client software version OpenSSH_3.8.1p1
debug1: match: OpenSSH_3.8.1p1 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-1.99-OpenSSH_3.8.1p1
debug2: Network child is on pid 1101
debug3: preauth child monitor started
debug3: mm_request_receive entering
debug3: privsep user:group 75:75
debug1: permanently_set_uid: 75/75
debug1: list_hostkey_types: ssh-rsa,ssh-dss
debug3: mm_request_send entering: type 40
debug3: mm_request_receive_expect entering: type 41
debug3: mm_request_receive entering
debug3: monitor_read: checking request 40
debug1: Miscellaneous failure
No such file or directory
debug3: mm_request_send entering: type 41
debug3: mm_request_receive entering
debug1: no credentials for GSSAPI mechanism Kerberos
debug1: SSH2_MSG_KEXINIT sent
Connection closed by ip4
debug1: do_cleanup
debug1: PAM: cleanup
debug3: PAM: sshpam_thread_cleanup entering
debug1: do_cleanup
debug1: PAM: cleanup
debug3: PAM: sshpam_thread_cleanup entering
and
% ssh -vvv -p 10000 $USER@FQDN :
OpenSSH_3.8.1p1, OpenSSL 0.9.7i 14 Oct 2005
debug1: Reading configuration data /etc/ssh_config
debug2: ssh_connect: needpriv 0
debug1: Connecting to FQDN [ip4] port 10000.
debug1: Connection established.
debug1: identity file /Users/rpg/.ssh/identity type -1
debug1: identity file /Users/rpg/.ssh/id_rsa type -1
debug1: identity file /Users/rpg/.ssh/id_dsa type -1
debug1: Remote protocol version 1.99, remote software version OpenSSH_3.8.1p1
debug1: match: OpenSSH_3.8.1p1 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_3.8.1p1
debug3: Trying to reverse map address ip4.
Segmentation fault
10.4.7 fixed this.
But broke iCal. . .
Actually, I never had any problems with 10.4.6, but ssh on my nat'ed Intel Macbook now segfaults when doing reverse mapping after upgrading to 10.4.7.
OpenSSH_3.8.1p1, OpenSSL 0.9.7i 14 Oct 2005
debug1: Reading configuration data /etc/ssh_config
debug1: Applying options for *
debug2: ssh_connect: needpriv 0
debug1: Connecting to amrshampine [10.4.51.45] port 22.
debug1: Connection established.
debug1: identity file /Users/lindkvis/.ssh/identity type -1
debug1: identity file /Users/lindkvis/.ssh/id_rsa type -1
debug1: identity file /Users/lindkvis/.ssh/id_dsa type -1
debug1: Remote protocol version 1.99, remote software version OpenSSH_3.6.1p2
debug1: match: OpenSSH_3.6.1p2 pat OpenSSH*
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_3.8.1p1
debug3: Trying to reverse map address 10.4.51.45.
Macbook White 13" 1.83GHz 1GB Mac OS X (10.4.6) -
Hi!
I am having a problem with reverse mapping. Here's what I do (copying the
generated files to a correct directory omitted):
% rd-schemagen -properties jdo.properties -file schema.xml
% rd-reversemappingtool -properties jdo.properties -package testi schema.xml
% javac -d build/classes src/testi/*.java
% rd-importtool -properties jdo.properties src/testi/testi.mapping
Here's a part of the output:
<clip>
2958 INFO [main] jdbc.Schema - Found existing table "Kirja" for schema "null".
3002 INFO [main] jdbc.Schema - Found existing table "Kustantaja" for schema "null".
3047 INFO [main] jdbc.SQL - [C: 5948361; T: 15336018]close
3125 INFO [main] jdbc.SQL - [C: 2478770; T: 15336018]open: jdbc:mysql://localhost/kirjakauppa (root)
3129 INFO [main] jdbc.Schema - Found existing table "Kirjailija" for schema "null".
3140 INFO [main] jdbc.SQL - [C: 2478770; T: 15336018]close
3187 INFO [main] jdbc.SQL - [C: 7529545; T: 15336018]open: jdbc:mysql://localhost/kirjakauppa (root)
3193 INFO [main] jdbc.Schema - Found existing table "Kirjoittaja" for schema "null".
3225 INFO [main] jdbc.SQL - [C: 7529545; T: 15336018]close
Exception in thread "main" javax.jdo.JDOFatalInternalException:
java.lang.IllegalArgumentException: You are attempting to link to a primary
key column in table "Kirja" in a foreign key that is already linked to
primary key columns in table "Kirjailija".
NestedThrowables:
java.lang.IllegalArgumentException: You are attempting to link to a primary
key column in table "Kirja" in a foreign key that is already linked to
primary key columns in table "Kirjailija".
at com.solarmetric.rd.kodo.impl.jdbc.meta.Mappings.createClassMapping(Mappings.java:160)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:279)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
NestedThrowablesStackTrace:
java.lang.IllegalArgumentException: You are attempting to link to a primary
key column in table "Kirja" in a foreign key that is already linked to
primary key columns in table "Kirjailija".
at com.solarmetric.rd.kodo.impl.jdbc.schema.ForeignKey.join(ForeignKey.java:238)
at com.solarmetric.rd.kodo.impl.jdbc.schema.SchemaGenerator.generateForeignKeys(SchemaGenerator.java:625)
at com.solarmetric.rd.kodo.impl.jdbc.schema.DynamicSchemaFactory.findTable(DynamicSchemaFactory.java:111)
at com.solarmetric.rd.kodo.impl.jdbc.meta.map.BaseClassMapping.fromMappingInfo(BaseClassMapping.java:113)
at com.solarmetric.rd.kodo.impl.jdbc.meta.Mappings.createClassMapping(Mappings.java:144)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:279)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:147)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:158)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:57)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:408)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:385)
</clip>
Here's what MySQLCC gives for creation statement of the tables:
<clip>
# Host: localhost
# Database: kirjakauppa
# Table: 'Asiakas'
CREATE TABLE `Asiakas` (
`Asiakas_id` int(11) NOT NULL auto_increment,
`Nimi1` varchar(50) default NULL,
`Nimi2` varchar(50) default NULL,
`KatuOsoite` varchar(50) default NULL,
`Postiosoite` varchar(50) default NULL,
`Email` varchar(50) default NULL,
`Puhelin` varchar(50) default NULL,
`Fax` varchar(50) default NULL,
`Salasana` varchar(50) default NULL,
`ExtranetTunnus` varchar(50) default NULL,
PRIMARY KEY (`Asiakas_id`),
KEY `Asiakas_id` (`Asiakas_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Kirja'
CREATE TABLE `Kirja` (
`Kirja_id` int(11) NOT NULL auto_increment,
`Kustantaja_id` int(11) default NULL,
`Nimi` varchar(60) default NULL,
`Nimi2` varchar(60) default NULL,
`ISBN` varchar(50) default NULL,
`Kieli` varchar(50) default NULL,
`Kansi_URL` varchar(50) default NULL,
`Sisalto_URL` varchar(50) default NULL,
`Tukkuhinta` decimal(10,2) default NULL,
`Kuluttajahinta` decimal(10,2) default NULL,
`Varastokpl` int(11) default NULL,
PRIMARY KEY (`Kirja_id`),
KEY `Kirja_id` (`Kirja_id`),
KEY `Kustantaja_id` (`Kustantaja_id`),
FOREIGN KEY (`Kustantaja_id`) REFERENCES `kirjakauppa.Kustantaja`
(`Kustantaja_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Kirjailija'
CREATE TABLE `Kirjailija` (
`Kirjailija_id` int(11) NOT NULL auto_increment,
`Sukunimi` varchar(50) default NULL,
`Etunimi` varchar(50) default NULL,
`Maa` varchar(50) default NULL,
`Kirjailija_URL` varchar(50) default NULL,
PRIMARY KEY (`Kirjailija_id`),
KEY `Kirjailija_id` (`Kirjailija_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Kirjoittaja'
CREATE TABLE `Kirjoittaja` (
`Kirjoittaja_id` int(11) NOT NULL auto_increment,
`Kirjailija_id` int(11) NOT NULL default '0',
`Kirja_id` int(11) NOT NULL default '0',
PRIMARY KEY (`Kirjoittaja_id`),
KEY `Kirjailija_id` (`Kirjailija_id`),
KEY `Kirja_id` (`Kirja_id`),
FOREIGN KEY (`Kirjailija_id`) REFERENCES `kirjakauppa.Kirjailija`
(`Kirjailija_id`),
FOREIGN KEY (`Kirja_id`) REFERENCES `kirjakauppa.Kirja` (`Kirja_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Koodi'
CREATE TABLE `Koodi` (
`Koodi_id` int(11) NOT NULL auto_increment,
`Koodi` varchar(50) default NULL,
`Tyyppi` varchar(50) default NULL,
`Arvo` varchar(50) default NULL,
PRIMARY KEY (`Koodi_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Kustantaja'
CREATE TABLE `Kustantaja` (
`Kustantaja_id` int(11) NOT NULL auto_increment,
`Nimi` varchar(80) default NULL,
`Maa` varchar(50) default NULL,
`Kustantaja_URL` varchar(50) default NULL,
`KirjaLkm` int(11) default NULL,
PRIMARY KEY (`Kustantaja_id`),
KEY `Kustantaja_id` (`Kustantaja_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Luokittelu'
CREATE TABLE `Luokittelu` (
`Luokittelu_id` int(11) NOT NULL auto_increment,
`Luokka_id` int(11) NOT NULL default '0',
`Kirja_id` int(11) NOT NULL default '0',
PRIMARY KEY (`Luokittelu_id`),
KEY `Luokka_id` (`Luokka_id`),
KEY `Kirja_id` (`Kirja_id`),
FOREIGN KEY (`Luokka_id`) REFERENCES `kirjakauppa.Luokka` (`Luokka_id`),
FOREIGN KEY (`Kirja_id`) REFERENCES `kirjakauppa.Kirja` (`Kirja_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Luokka'
CREATE TABLE `Luokka` (
`Luokka_id` int(11) NOT NULL auto_increment,
`Luokka` varchar(50) default NULL,
PRIMARY KEY (`Luokka_id`),
KEY `Luokka_id` (`Luokka_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Myyja'
CREATE TABLE `Myyja` (
`Myyja_id` int(11) NOT NULL auto_increment,
`Myyja` varchar(50) default NULL,
`Myyja_URL` varchar(50) default NULL,
PRIMARY KEY (`Myyja_id`),
KEY `Myyja_id` (`Myyja_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Tilaus'
CREATE TABLE `Tilaus` (
`Tilaus_id` int(11) NOT NULL auto_increment,
`Asiakas_id` int(11) NOT NULL default '0',
`Myyja_id` int(11) default NULL,
`TilausPvm` timestamp(14) NOT NULL,
`EnsimmToimitusPvm` timestamp(14) NOT NULL,
`ViimToimitusPvm` timestamp(14) NOT NULL,
`Tila` int(11) NOT NULL default '0',
`Mk` decimal(10,2) default NULL,
PRIMARY KEY (`Tilaus_id`),
KEY `Asiakas_id` (`Asiakas_id`),
KEY `Myyja_id` (`Myyja_id`),
KEY `Tilaus_id` (`Tilaus_id`),
FOREIGN KEY (`Asiakas_id`) REFERENCES `kirjakauppa.Asiakas`
(`Asiakas_id`),
FOREIGN KEY (`Myyja_id`) REFERENCES `kirjakauppa.Myyja` (`Myyja_id`)
) TYPE=InnoDB;
# Host: localhost
# Database: kirjakauppa
# Table: 'Tilausrivi'
CREATE TABLE `Tilausrivi` (
`TilausRivi_id` int(11) NOT NULL auto_increment,
`Tilaus_id` int(11) NOT NULL default '0',
`Kirja_id` int(11) NOT NULL default '0',
`TilausLkm` int(11) default NULL,
`Ahinta` decimal(10,2) default NULL,
`Alepros` float default NULL,
`Mk` decimal(10,2) default NULL,
`ToimitettuLkm` int(11) default NULL,
`ToimitusPvm` timestamp(14) NOT NULL,
`ViimToimitusPvm` timestamp(14) NOT NULL,
`Tila` int(11) NOT NULL default '0',
PRIMARY KEY (`TilausRivi_id`),
KEY `Tilaus_id` (`Tilaus_id`),
KEY `Kirja_id` (`Kirja_id`),
FOREIGN KEY (`Tilaus_id`) REFERENCES `kirjakauppa.Tilaus` (`Tilaus_id`),
FOREIGN KEY (`Kirja_id`) REFERENCES `kirjakauppa.Kirja` (`Kirja_id`)
) TYPE=InnoDB;
</clip>
I can find the original creation script if it is necessary.
My guess was that I need to define the foreign keys myself into the
generated schema.xml. This is stated in the manual. However, this did not
help, although it changed the stack trace a little (it complains about
different classes than before):
<clip>
Exception in thread "main" javax.jdo.JDOFatalInternalException:
java.lang.IllegalArgumentException: You are attempting to link to a primary
key column in table "Myyja" in a foreign key that is already linked to
primary key columns in table "Asiakas".
NestedThrowables:
java.lang.IllegalArgumentException: You are attempting to link to a primary
key column in table "Myyja" in a foreign key that is already linked to
primary key columns in table "Asiakas".
at com.solarmetric.rd.kodo.impl.jdbc.meta.Mappings.createFieldMapping(Mappings.java:208)
</clip>
I don't think I fully understand the error message, what exactly is wrong
here? How can I fix it?
Here's a sample of the changes I made to schema.xml:
- added the name attribute to schema (it was missing)
<schema name="kirjakauppa">
- added the foreign key elements according to the table creation statements
given above
<fk name="Kustantaja_id" to-table="Kustantaja" column="Kustantaja_id"/>
etc...
-Antti
On Mon, 16 Jun 2003 17:55:35 -0500, Abe White <[email protected]> wrote:
> It seems the last three options are being ignored - I still get a mapping
> file with schema names in front of tables (e.g. kirjakauppa.Asiakas, not
> Asiakas).
That, unfortunately, is impossible to turn off. The -useSchemaName
option controls whether the schema name is included as part of the
generated class name; it doesn't affect the mapping data that is
generated. What problems does including the schema name in the mapping
data cause?
% rd-importtool -properties jdo.properties gensrc/testi/testi.mapping
0 INFO [main] kodo.MetaData - Parsing metadata resource
"file:/home/akaranta/work/kurssit/jdo/Harjoituskoodi/kirjakauppa/gensrc/testi/testi.mapping".
Exception in thread "main"
com.solarmetric.rd.kodo.meta.JDOMetaDataNotFoundException: No JDO metadata
was found for type "class testi.Asiakas".
FailedObject: class testi.Asiakas
at com.solarmetric.rd.kodo.meta.JDOMetaDataRepositoryImpl.getMetaData(JDOMetaDataRepositoryImpl.java:126)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMetaData(MappingRepository.java:184)
at com.solarmetric.rd.kodo.impl.jdbc.meta.MappingRepository.getMapping(MappingRepository.java:197)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.getMapping(ImportTool.java:128)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.importMappings(ImportTool.java:60)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.run(ImportTool.java:400)
at com.solarmetric.rd.kodo.impl.jdbc.meta.compat.ImportTool.main(ImportTool.java:377)
This exception goes away if I edit the schema name out of the mapping
file from all classes.
>> separate classes are being generated for join tables with primary keys
> Do these join tables have an extra primary key column? The
> -primaryKeyOnJoin flag tells Kodo to ignore a join table with a primary
> key on the join columns. But Kodo can't handle join tables with extra
> column(s) just for a primary key identifier. This isn't a limitation of
> the reverse mapping tool, it's a limitation of Kodo. Kodo wouldn't know
> what to insert in those extra primary key column(s) when adding members
Yes, they do. Ok, now I know where the problem is.
Why not? If it can handle single numeric pk columns when making the
generated classes use data store identity, it has to generate something for
those columns. I can't see why this is different.
That is simply out of curiosity - the next thing fixed my problem:
> to the join table. Of course, if the primary key is an auto-increment or
> something where Kodo can ignore it for inserts, you can just remove the
> <column> elements and the <pk> element from your .schema file and the
> reverse mapping tool will map it as a join table appropriately.
It is auto-increment, so I did this and it worked. Thanks.
>> , and application id is used for all classes.
> Are your primary keys on single, numeric columns? Kodo uses Java longs
Yes (int in MySQL), so that should not be a problem. They are also
auto-incremented. This seems to be the only real problem remaining with this
schema.
-Antti -
function displayMap(e) {
    var title = e.data.title,
        latlng = e.data.lat + ',' + e.data.lng;
    if (typeof device !== 'undefined' && device.platform.toLowerCase() === 'android') {
        window.location = 'http://maps.google.com/maps?z=16&q=' + encodeURIComponent(title) + '@' + latlng;
    } else {
        $('#map h1').text(title);
        $('#map div[data-role=content]').html('<img src="http://maps.google.com/maps/api/staticmap?center=' + latlng + '&zoom=16&size=320x420&markers=' + latlng + '&sensor=false">');
        $.mobile.changePage('#map', 'fade', false, true);
    }
}
My PhoneGap book (Adobe Press, Powers, jQuery with DW 5.5, (c) 2010-11) says the following about the code above. Is this still valid today? Is it the right approach to use, or can a dynamic embedded Google Map be used on iOS?
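For reference, the else-branch's static-map URL can also be assembled with proper percent-encoding rather than raw string concatenation. A sketch in Python (parameter names center, zoom, size, markers, and sensor come from the snippet above; the coordinates are made-up examples):

```python
# Build the Google Static Maps URL from the question's snippet, letting
# urlencode handle percent-encoding of the comma in "lat,lng".
from urllib.parse import urlencode

def static_map_url(lat, lng, zoom=16, size="320x420"):
    latlng = "%s,%s" % (lat, lng)
    query = urlencode({
        "center": latlng,
        "zoom": zoom,
        "size": size,
        "markers": latlng,
        "sensor": "false",
    })
    return "http://maps.google.com/maps/api/staticmap?" + query

url = static_map_url(51.5, -0.12)  # hypothetical coordinates
```

Encoding the query this way avoids the stray space and misplaced quote that break the image URL in hand-concatenated versions.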
On iOS, calling window.location loads the map directly
into the app. That’s great until you realize that iOS devices
don’t have a Back button, so there’s no way to exit the
map. To get round this problem, I loaded a static map as
an image in the map page block. It’s not interactive, but at
least you can continue using the Travel Notes app after
viewing the map by clicking the Back button generated by
jQuery Mobile.
Well, this took me a while to get solved, but it is indeed solved.
I tried USB Overdrive and it could, and perhaps should work, but apparently it will not. When adding a device, it seems that USB Overdrive is not set up to handle any input device that does not register itself as either a Mouse or a Joystick. The VEC USB Footpedal that I'm using is "Device type: Other".
So, I went for Quickeys. And Quickeys can do it all. It did recognize the device, I was able to assign it to the scope of the particular audio playback app I wanted to use (Amazing Slow Downer OS X - which is truly amazing. Any musicians reading this who are looking for a way to learn pieces by ear, this does it better than anything else I've seen yet).
I created a shortcut in Quickeys for the ASD app; added the middle button of the foot pedal as the trigger; set one step, entering 'space bar' as the step (which toggles playback, similar to many audio players).
It all worked.
Quickeys is very confusing and seemingly featured with an endless array of options. Enter at your own risk. Ask me for help. This was the only way to get it done that I could find. I did write to the author of USB Overdrive asking him to please support additional devices as I did find some traction from gamers who like to use a foot pedal in addition to other input devices. There was a Windows-only management utility for the foot pedal that was intended for custom input, assigning the buttons to any keyboard input or mouse click event. It would be nice to have a simple and easy to use utility like this. But, Quickeys did do the job.
Thanks for your help, you guys!!! -
Unrecognized types in Reverse Mapping
In our database schema there is use of a user-defined database type. The
reverse mapping tool cannot recogize this type and automatically
classifies it as a blob in the mapping file and the generic Object for the
java sources. I would like to cast this user-defined type to a String in
java, because otherwise kodo blows up when I try to retrieve the field. I
extended PropertiesReverseCustomizer and was able to get the java sources
to output String instead. But I couldn't find an easy way of getting the
mapping file to use "value" instead of "blob". Right now I am having to do
a query replace on the mapping file, but I would like to know if there's
way of getting the Reverse mapping tool to do this for you?
Toby
You can probably just add the fields for UDT types manually in your
customizer:
import java.sql.*;
import kodo.meta.*;
import kodo.jdbc.meta.*;
import kodo.jdbc.schema.*;

private ReverseMappingTool tool; // set in setTool ()

public boolean customize (ClassMapping cls)
{
    Column[] cols = cls.getTable ().getColumns ();
    for (int i = 0; i < cols.length; i++)
    {
        if (cols[i].isCompatible (Types.BLOB, 0))
            addStringField (cls, cols[i]);
    }
    return super.customize (cls);
}

private void addStringField (ClassMapping cls, Column col)
{
    String name = tool.getFieldName (col.getName (), cls);
    FieldMetaData fmd = tool.newFieldMetaData (name, String.class, cls);
    ValueFieldMapping field = new ValueFieldMapping (fmd);
    field.setColumn (col);
    tool.addFieldMapping (field, cls);
}
Reverse Mapping: letting Kodo manage pk-column
Hi again,
i have a db with many predefined tables. Its not allowed to change the db
schema. The generated Java-classes are looking fine but i want to let kodo
manage the pk columns (like JDOIDX in generated tables). I dont want the
more technical pks in my business classes. Is it possible?
Any help is welcome!Abe White wrote:
Adding a data store identity option to the reverse mapping tool is relatively
high on our to-do list, but it's not implemented yet. For now, you can
follow the steps in the documentation for reverse-mapping your classes, then
switch over to datastore identity manually by changing the class definitions
and metadata.Ok, this solution is workable. Will try it, thanks!
Maybe you are looking for
-
How do I use my redeemed gift card to purchase a book on iBooks when it will only give the option of "Mastercard, Visa, Amex or 'None' (Billing Address)"?? Desperately need help so I can purchase books for my study course!!
-
Hi, i want to implement a function in ODI
hi, i am new to odi. i have created a project in odi which was very simple and only transfers data from one table to another. but now i want to implement a function in odi. i have the code for that function which is written in pl/sql. my function acc
-
I have music on my Ipod touch, how can I get it on to my new macbook?
Hello! My old macbook died, I want to get my music from my ipod touch to my new macbook. I know this requires a third party program, are there any recommendations? Thanks
-
Hi, I was wondering if someone could help me, I recently restored my itouch 4th Gen and was then told to upgrade to itunes 10.6 now everytime I try and install itunes I get the following message, help please! Error writing to file: C:\Program\Files\i
-
Template for United States Statistics
I would like to know if there is a template I can use as a starting point to show statistics across the country, using states of different colors? This is for a documentary film. I'd like to not start this from scratch because I know this is done on