Lookup on Ztable
Hi,
My scenario is File to IDoc. I have to perform a lookup against a custom Z table: I pass one value as an input parameter and get three output parameters back, which I then map to the target IDoc.
I have created the Z table on the ABAP stack, and with a function module I can retrieve the values. How should I proceed from there: should I use a JCo call inside a UDF, or do I have to create a communication channel for this?
If you have Java code for this, could you please share it?
Thanks for your help,
ytey
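A common answer here is to avoid raw JCo calls in a UDF and use the mapping lookup API instead (an RFC receiver communication channel plus LookupService.getRfcAccessor, as shown in the helper class further down this page). As a minimal sketch of just the payload side - the function module name Z_GET_VALUES and the parameter names IV_KEY / EV_FIELD1..EV_FIELD3 are placeholders for illustration, not from this thread:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

class ZTableLookupSketch {

    // Build the RFC request payload for a hypothetical function module
    // Z_GET_VALUES with a single import parameter IV_KEY (names are assumptions).
    static String buildRequest(String key) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
             + "<rfc:Z_GET_VALUES xmlns:rfc=\"urn:sap-com:document:sap:rfc:functions\">"
             + "<IV_KEY>" + key + "</IV_KEY>"
             + "</rfc:Z_GET_VALUES>";
    }

    // Extract the three hypothetical export parameters from the response payload.
    static String[] parseResponse(String responseXml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(responseXml.getBytes("UTF-8")));
        String[] out = new String[3];
        for (int i = 0; i < 3; i++) {
            out[i] = doc.getElementsByTagName("EV_FIELD" + (i + 1)).item(0).getTextContent();
        }
        return out;
    }
}
```

In the UDF you would pass the result of buildRequest(...) to the accessor obtained from LookupService.getRfcAccessor(channel) and feed the response stream into parseResponse(...).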
Similar Messages
-
SAP R3 custom extract - trigger XI to map and send to Mainframe
Hello, I'm looking for the best way to occasionally send table data (as decided by a business user via a custom transaction) from an R3 business system to XI and eventually on to the mainframe. I'm OK with the latter portion: the XI mapping, JMS/MQ configuration, and then on to the mainframe.
Here is my question. Our ABAP developer can create a BAPI to look up the Z table extract data for us in the R3 system. I would like to have the data pushed from SAP rather than setting a polling interval, because the extract will happen on an irregular basis. But to my understanding a BAPI is called (pulled) rather than pushed. How can we best accomplish a push from R3? We are accustomed to using IDocs, but in this case it is completely custom.
I appreciate your experiences on what is the best method to use in this scenario.
Thanks in advance,
Aaron

Hi, thank you for the ideas.
Our ABAP programmer is willing to try it using the RFC approach but we are unclear on how to do this. I would be grateful for some more information such as:
Does anyone have a simple example?
When we use the RFC channel as a sender, do the RFC server parameters pertain to the ERP system or the XI system?
How does the ABAP program on ERP system actually send data to XI?
Do we need to set up any special ports to do this?
Thank you for your help,
Aaron -
How to check whether a Record Exists or not in Ztable
Hi all,
I have a requirement like this:
In the Z table I have 2 fields: Legacy System and Legacy Material No.
The Environment value comes in one of the fields of the IDoc.
The logic is: if the Environment value is UK and a record exists in the Z table for Legacy System = Leg1, I have to send the Legacy Material No; else send 01.
Please suggest a process for this.
Regards
Vamsi
Edited by: Vamsi Krishna on May 19, 2009 5:31 PM

Hi Michal / Aamir,
Thanks for your replies.
The Z table is on the R/3 side only. I am using an RFC lookup for this, but I am a bit confused about how to construct the logic to check whether a record exists in the Z table.
The logic should be: if the Environment value is UK and a record exists in the Z table for Legacy System "ABC", then send the Legacy Material No; else send 01.
In the table we have the fields Legacy System and Legacy Material No; the Environment value comes from the IDoc itself (e.g. UK or US).
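One way to express that decision, assuming the lookup is done with RFC_READ_TABLE and the matching rows come back as DATA/item/WA elements (the standard RFC lookup response shape used elsewhere on this page). A hedged sketch - the assumption that the WA work area contains only the Legacy Material No field is for illustration:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

class RecordExistsSketch {

    // Returns the Legacy Material No from the first matching row, or "01"
    // when the environment is not UK or no record exists in the Z table.
    static String legacyMaterialOr01(String environment, String responseXml) throws Exception {
        if (!"UK".equals(environment)) {
            return "01";
        }
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(responseXml.getBytes("UTF-8")));
        NodeList rows = doc.getElementsByTagName("WA");
        if (rows.getLength() == 0) {
            return "01"; // no record in the Z table
        }
        // WA carries the whole work area; here the row is assumed to hold
        // only the Legacy Material No field.
        return rows.item(0).getTextContent().trim();
    }
}
```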
Regards -
Java Error in RFC Lookup in XSLT Mapping using Java helper class
Hi All,
I am doing an RFC lookup in an XSLT mapping using a Java helper class.
The lookup works fine when one RFC is called at a time; however, my requirement is to do 2 lookups.
Both lookups work when done individually, but when I call both lookups in one mapping I get the following error: "javax.xml.transform.TransformerException: DOMSource whose Node is null."
Following is the code I have written in XSLT for the lookup:
<xsl:template name="Lookup_1">
<xsl:param name="STDPN"/>
<xsl:variable name="request">
<rfc:RFC_READ_TABLE>
<QUERY_TABLE>KNA1</QUERY_TABLE>
<OPTIONS><item><TEXT>
<xsl:value-of select="$STDPN"/>
</TEXT></item>
</OPTIONS>
<FIELDS>
<item>
<FIELDNAME>KUNNR</FIELDNAME>
</item>
</FIELDS>
</rfc:RFC_READ_TABLE>
</xsl:variable>
<xsl:variable name="response" xmlns:lookup="java:urn.mt.pi" select="lookup:execute($request, 'BS_D', 'cc_RfcLookup', $inputparam)"/>
<xsl:element name="STDPN">
<xsl:value-of select="$response//DATA/item/WA"/>
</xsl:element>
</xsl:template>
<xsl:template name="Lookup_2">
<xsl:param name="BELNR"/>
<xsl:variable name="Query">AGMNT = '<xsl:value-of select="$BELNR"/>'</xsl:variable>
<xsl:variable name="request1">
<rfc:RFC_READ_TABLE>
<QUERY_TABLE>ZTABLE</QUERY_TABLE>
<OPTIONS><item><TEXT>
<xsl:value-of select="$Query"/>
</TEXT></item>
</OPTIONS>
<FIELDS>
<item>
<FIELDNAME>KUNAG</FIELDNAME>
</item>
</FIELDS>
</rfc:RFC_READ_TABLE>
</xsl:variable>
<xsl:variable name="response1" xmlns:lookup="java:urn.mt.pi" select="lookup:execute($request1, 'BS_D','cc_RfcLookup', $inputparam)"/>
<xsl:element name="BELNR">
<xsl:value-of select="$response1//DATA/item/WA"/>
</xsl:element>
</xsl:template>
My question: am I doing anything wrong? Or is it possible to call multiple lookups in one XSLT?
Thanks and Regards,
Atul

Hi Atul,
I had the same problem like you had.
The main problem is that with the example code the request variable is created as a NodeList object. In XSLT a variable is a kind of constant and cannot be changed. As the request object is empty after the first request, the program fails at the following line:
Source source = new DOMSource(request.item(0));
So I've created a workaround for this problem.
In the call of the template I've passed the request as a parameter:
<xsl:with-param name="req">
<rfc:PLM_EXPLORE_BILL_OF_MATERIAL xmlns:rfc="urn:sap-com:document:sap:rfc:functions">
<APPLICATION>Z001</APPLICATION>
<FLAG_NEW_EXPLOSION>X</FLAG_NEW_EXPLOSION>
<MATERIALNUMBER><xsl:value-of select="value"/></MATERIALNUMBER>
<PLANT>FSD0</PLANT>
<VALIDFROM><xsl:value-of select="//Recordset/Row[name='DTM-031']/value"/></VALIDFROM>
<BOMITEM_DATA/>
</rfc:PLM_EXPLORE_BILL_OF_MATERIAL>
</xsl:with-param>
With this change the request is provided as a String object and not as a NodeList object.
Afterwards, RfcLookup.java has to be changed to the following:
package com.franke.mappings;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.util.Map;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.Source;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import com.sap.aii.mapping.lookup.Channel;
import com.sap.aii.mapping.api.StreamTransformationConstants;
import com.sap.aii.mapping.api.AbstractTrace;
import com.sap.aii.mapping.lookup.RfcAccessor;
import com.sap.aii.mapping.lookup.LookupService;
import com.sap.aii.mapping.lookup.XmlPayload;
/**
 * @author Thorsten Nordholm Søbirk, AppliCon A/S
 *
 * Helper class for using the XI Lookup API with XSLT mappings for calling RFCs.
 * The class is generic in that it can be used to call any remote-enabled
 * function module in R/3. Generation of the XML request document and parsing of
 * the XML response is left to the stylesheet, where this can be done in a very
 * natural manner.
 *
 * TD: Changed the class so that the request is sent as a String, because of an
 * IndexOutOfBound exception when sending multiple requests in one XSLT mapping.
 */
public class RfcLookup {

    /**
     * Execute RFC lookup.
     * @param request RFC request - TD: changed to String
     * @param service name of service
     * @param channelName name of communication channel
     * @param inputParam mapping parameters
     * @return Node containing RFC response
     */
    public static Node execute(String request, String service, String channelName, Map inputParam) {
        AbstractTrace trace = (AbstractTrace) inputParam.get(StreamTransformationConstants.MAPPING_TRACE);
        Node responseNode = null;
        try {
            // Get channel and accessor
            Channel channel = LookupService.getChannel(service, channelName);
            RfcAccessor accessor = LookupService.getRfcAccessor(channel);

            // Serialise request NodeList - TD: not needed any more as request is a String
            /*TransformerFactory factory = TransformerFactory.newInstance();
            Transformer transformer = factory.newTransformer();
            Source source = new DOMSource(request.item(0));
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            StreamResult streamResult = new StreamResult(baos);
            transformer.transform(source, streamResult);*/

            // TD: Add XML header and remove line feeds from the request string
            request = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" + request.replaceAll("[\r\n]+", "");
            // TD: Get byte array from request String to send afterwards
            byte[] requestBytes = request.getBytes();
            // TD: Not used any more as request is a String
            //byte[] requestBytes = baos.toByteArray();
            trace.addDebugMessage("RFC Request: " + new String(requestBytes));

            // Create input stream representing the function module request message
            InputStream inputStream = new ByteArrayInputStream(requestBytes);
            // Create XmlPayload
            XmlPayload requestPayload = LookupService.getXmlPayload(inputStream);
            // Execute lookup
            XmlPayload responsePayload = accessor.call(requestPayload);
            InputStream responseStream = responsePayload.getContent();
            TeeInputStream tee = new TeeInputStream(responseStream);

            // Create DOM tree for response
            DocumentBuilder docBuilder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
            Document document = docBuilder.parse(tee);
            trace.addDebugMessage("RFC Response: " + tee.getStringContent());
            responseNode = document.getFirstChild();
        } catch (Throwable t) {
            StringWriter sw = new StringWriter();
            t.printStackTrace(new PrintWriter(sw));
            trace.addWarning(sw.toString());
        }
        return responseNode;
    }

    /**
     * Helper class which collects stream input while reading.
     */
    static class TeeInputStream extends InputStream {
        private ByteArrayOutputStream baos;
        private InputStream wrappedInputStream;

        TeeInputStream(InputStream inputStream) {
            baos = new ByteArrayOutputStream();
            wrappedInputStream = inputStream;
        }

        /**
         * @return stream content as String
         */
        String getStringContent() {
            return baos.toString();
        }

        /* (non-Javadoc)
         * @see java.io.InputStream#read()
         */
        public int read() throws IOException {
            int r = wrappedInputStream.read();
            baos.write(r);
            return r;
        }
    }
}
Then you need to compile and upload this class, and it should work.
I hope that this helps you.
Best regards
Till -
Mapping JDBC lookup: Passing values to multiple fields at target
Hi all
In my scenario I am using a mapping JDBC lookup, and it is working successfully.
But I need to select 4 fields in the lookup's select query and pass them to different target fields in the target MT.
If I use global container object methods, I can store only 1 field in a global container variable
and return it to the target field.
But how do I pass the 4 fields resulting from the select query to 4 different fields in the target MT?
Query:
"Select BPNO, emp, BENR, bacepack from Ztable where BPNO='" + BPNO[0] + "'";
Regards
AjayP

Hi,
I have to fetch 3 values and populate them into 3 fields in the target. The UDF I am using is attached below; it currently fetches one value. Kindly tell me what changes I have to make to fetch 3 values instead of 1.
String Query = "";
Channel channel = null;
DataBaseAccessor accessor = null;
DataBaseResult resultSet = null;
// Query to retrieve the PROP value for the particular source value passed.
Query = "Select PROP from TANKS where ID='" + ID[0] + "'";
try {
    // Determine a channel, as created in the Configuration
    channel = LookupService.getChannel("<Business Service>", "<Communication Channel>");
    // Get a system accessor for the channel. As the call is made to a DB, a DataBaseAccessor is obtained.
    accessor = LookupService.getDataBaseAccessor(channel);
    // Execute the query and get the values in a result set
    resultSet = accessor.execute(Query);
    for (Iterator rows = resultSet.getRows(); rows.hasNext();) {
        Map rowMap = (Map) rows.next();
        result.addValue((String) rowMap.get("PROP"));
    }
} catch (Exception ex) {
    result.addValue(ex.getMessage());
} finally {
    try {
        if (accessor != null) accessor.close();
    } catch (Exception e) {
        result.addValue(e.getMessage());
    }
} -
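For the multi-column question in the thread above: one pattern is to run the select once, iterate the rows a single time, and collect each column into its own list, which the UDF can then hand to separate result queues or stash in the global container. A minimal, container-free sketch of that splitting step, using plain collections in place of the DataBaseAccessor row maps (column names are illustrative):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

class ColumnSplitSketch {

    // rows: the shape DatabaseAccessor.execute(...).getRows() yields, one Map per row.
    // Returns one list per requested column, in column order.
    static Map<String, List<String>> splitColumns(List<Map<String, String>> rows, String... columns) {
        Map<String, List<String>> out = new LinkedHashMap<>();
        for (String c : columns) {
            out.put(c, new ArrayList<String>());
        }
        for (Map<String, String> row : rows) {
            for (String c : columns) {
                out.get(c).add(row.get(c)); // null if the column is absent in the row
            }
        }
        return out;
    }
}
```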
Hi experts,
In my File to IDoc Scenario , How can I create RFC lookups using the Ztables of SAP R/3 in XI.
Please give me any suggestions.
Regards,
Sri

Sree,
Can you please brief about your requirement.
If I am not wrong, you need a lookup before posting the IDoc to the R/3 system. I think it is better to write code in a user exit on the R/3 side rather than calling an R/3 RFC function module from SAP XI; the RFC call will slow down your process.
It is better if you can find a user exit before the IDoc is posted and write the lookup code there.
Let me know if you need more details.
Nilesh -
RFC Lookup Transport to QA and PROD
hi,
I have used Sravya's code to look up a Z table within SRM, with some customizations. It's a nice blog, very helpful.
The question is that all the connection parameters to the SRM box are hard-coded in the UDF.
Now, if I have to transport this to QA and PROD, do I have to change the mapping in QA and PROD, or is there another alternative?
Thanks,
Tiru

Hi,
there is another blog that explains how to use properties files. This could be an option: put your parameters into a properties file, so you only have to adjust the properties file and not the mapping.
/people/sap.user72/blog/2006/06/07/property-file-a-smart-use-in-xi-context
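To sketch the properties-file idea: keep the connection parameters outside the UDF and read them at runtime with java.util.Properties. The key names and defaults below are assumptions for illustration; in XI the file would typically be loaded from the mapping's classpath (e.g. via getResourceAsStream), so that only the file differs between DEV, QA and PROD:

```java
import java.io.IOException;
import java.io.Reader;
import java.util.Properties;

class LookupConfigSketch {

    // Load the parameters from any character source (file, classpath resource, ...).
    static Properties load(Reader source) throws IOException {
        Properties p = new Properties();
        p.load(source);
        return p;
    }

    // Resolve the lookup target; the key names and defaults are illustrative only.
    static String[] lookupTarget(Properties p) {
        String service = p.getProperty("lookup.businessService", "BS_SRM");
        String channel = p.getProperty("lookup.channel", "cc_RfcLookup");
        return new String[] { service, channel };
    }
}
```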
An alternative is to use the SAP Mapping Lookup API: write an RFC that encapsulates the SQL call to the table and call the RFC from within the mapping. In the mapping you just need to specify an RFC communication channel, which you can name the same way in each system, so you do not need to make any adjustments at all.
/people/alessandro.guarneri/blog/2006/03/27/sap-xi-lookup-api-the-killer
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/xi/xi-code-samples/xi mapping lookups rfc api.pdf
Regards
Christine -
Importing lookup values for Z tables thru FTP connections running MDMGX
When executing MDMGX and importing into lookup tables of MDM through an FTP connection on the MDM server, it works fine with the defined tables.
But when we want to get the data into Z tables through an FTP connection on the MDM server executing MDMGX, how is this possible?
I have added the Z tables extraction in the MDMGX txt file.

Hi Shifali,
If I have understood your question correctly:
The MDM Generic Extractor program (MDMGX) is used to extract the ECC check-table data and place it either into the MDM FTP-enabled Ready folder or onto the local desktop, from where it can be imported into the MDM repository.
If you want to send data out from MDM to the ECC system, you need to syndicate the master data from MDM using the MDM Syndicator in XML format, using the standard XSD schema.
This XML data is then converted into an IDoc by PI/XI and posted to the ECC tables.
You can similarly use MDM_CLNT_EXTR, i.e. the MDM client extractor program, to work with the main-table data instead of check tables, along similar lines.
These two programs work with the standard ECC table extraction.
You can refer the below links on the same:
MDMGX:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f0740b31-a934-2b10-2e95-af2252991baa (MDMGX)
MDM_CLNT_EXTR:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/50260ae5-5589-2910-4791-fc091d9bf797 (MDM_CLNT_EXTR)
Hope It Helped
Thanks & Regards
Simona Pinto -
Hi,
I have a scenario where I need to check one Z table for a particular field, whether it is present or not, and if it is present then I have to raise an exception.
I know how to raise the exception, but I don't know how to check the Z table from an SAP XI mapping.
Regards

You can use a lookup, as mentioned below:
The com.sap.aii.mapping.lookup API -
https://media.sdn.sap.com/javadocs/NW04/SPS15/pi/index.html
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/xi/xi-code-samples/xi%20mapping%20lookups%20rfc%20api.pdf -
F4 help for month and year field in a Z table, please
I have created a ztable.
The first field in my Z table is a primary key field, and it holds MM/YYYY (e.g. 08/2011).
However, my requirement is that whenever we want to see the output, the user wants F4 help on that field, and when he selects a range, the Z table should display the values in that range.
For eg.
1) If the user selects 01/2011 to 08/2011 using F4 help,
2) then the Z table should display all the records in that range.
Thanks in advance for your help.
Best regards, Sam
Moderator message : Spec dumping / Outsourcing is not allowed. Thread locked.
Edited by: Vinod Kumar on Aug 8, 2011 11:17 AM

Hi Sam,
Create an F4 help using this FM ("l_t_date" holds the month and date values):

l_f_programm = sy-repid.
l_f_dynnr = sy-dynnr.

CALL FUNCTION 'F4IF_INT_TABLE_VALUE_REQUEST'
  EXPORTING
    retfield         = 'Date'        " Field name of the column of value_tab
    dynpprog         = l_f_programm
    dynpnr           = l_f_dynnr
    dynprofield      = '<screen field name>'
    value_org        = 'S'
    callback_program = l_f_programm
    callback_form    = ''
  TABLES
    value_tab        = l_t_date      " Value table for date and month
*   field_tab        = l_t_return
    return_tab       = l_t_return_tab
  EXCEPTIONS
    parameter_error  = 1
    no_values_found  = 2
    OTHERS           = 3.

READ TABLE l_t_return_tab INDEX 1.
IF sy-subrc = 0.
  <fields> = l_t_return_tab-fieldval.
ENDIF.
Regards,
Amit
Then filter the Z table accordingly. -
JDBC Lookup - Import table data from a different schema in same DB
Hi XI Experts,
We are facing an issue while importing a Database table into the external definition in PI 7.1.
The details are as below:
I have configured user 'A' in the PI communication channel to access the database. But the table that I want to access is present in schema "B". Because of this, I am unable to view the table that I have to import in the available list.
In other words, I am trying to access a table present in a different schema in the same database. Please note that my user has been given all the required permissions to access the other schema. Even then, I am unable to access the table in the other schema.
Kindly provide your valuable suggestions as to how I can import table which is present in another schema but in the same Database.
Regards,
Subbu

If you are using PI 7.1, then you can do a JDBC lookup to import JDBC metadata (table structures from the DB). Configure a JDBC receiver communication channel where you specify a username and password that has permission to access schema A and schema B of the database. Specify the database name in the connection string. Then you should be able to import from both schemas.
Please refer these links
SAP PI 7.1 Mapping Enhancements Series: Graphical Support for JDBC and RFC Lookups
How to use JDBC Lookup in PI 7.1 ? -
Error while configuring kodo to lookup a datasource
We had a working application using EEPersistenceManagerFactory
I changed kodo.properties to look up a non-XA JDBC datasource.
After that the application is working fine (it creates, updates, deletes and finds records in the DB),
but SystemOut.log has the following error for every operation.
We are using Kodo 2.5.0, WebSphere 5.0 and Oracle 8.
How can we avoid getting this error?
We tried to find a property on the WebSphere datasource that could be altered to avoid this error, but no luck.
Thanks
Paresh
[10/7/03 15:30:45:467 IST] 3d8b2d1a MCWrapper E J2CA0081E: Method
destroy failed while trying to execute method destroy on ManagedConnection
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl@437f6d23 from resource
<null>. Caught exception: com.ibm.ws.exception.WsException: DSRA0080E: An
exception was received by the Data Store Adapter. See original exception
message: Cannot call 'cleanup' on a ManagedConnection while it is still in
a transaction..
at
com.ibm.ws.rsadapter.exceptions.DataStoreAdapterException.<init>(DataStoreAdapterException.java:222)
at
com.ibm.ws.rsadapter.exceptions.DataStoreAdapterException.<init>(DataStoreAdapterException.java:172)
at
com.ibm.ws.rsadapter.AdapterUtil.createDataStoreAdapterException(AdapterUtil.java:182)
at
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl.cleanupTransactions(WSRdbManagedConnectionImpl.java:1826)
at
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl.destroy(WSRdbManagedConnectionImpl.java:1389)
at com.ibm.ejs.j2c.MCWrapper.destroy(MCWrapper.java:1032)
at
com.ibm.ejs.j2c.poolmanager.FreePool.returnToFreePool(FreePool.java:259)
at com.ibm.ejs.j2c.poolmanager.PoolManager.release(PoolManager.java:777)
at com.ibm.ejs.j2c.MCWrapper.releaseToPoolManager(MCWrapper.java:1304)
at
com.ibm.ejs.j2c.ConnectionEventListener.connectionClosed(ConnectionEventListener.java:195)
at
com.ibm.ws.rsadapter.spi.WSRdbManagedConnectionImpl.processConnectionClosedEvent(WSRdbManagedConnectionImpl.java:843)
at
com.ibm.ws.rsadapter.jdbc.WSJdbcConnection.closeWrapper(WSJdbcConnection.java:569)
at com.ibm.ws.rsadapter.jdbc.WSJdbcObject.close(WSJdbcObject.java:132)
at
com.solarmetric.kodo.impl.jdbc.SQLExecutionManagerImpl.close(SQLExecutionManagerImpl.java:814)
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.release(JDBCStoreManager.java(Inlined
Compiled Code))
at
com.solarmetric.kodo.impl.jdbc.runtime.JDBCStoreManager.load(JDBCStoreManager.java(Compiled
Code))
at
com.solarmetric.kodo.runtime.StateManagerImpl.loadFields(StateManagerImpl.java(Compiled
Code))
at
com.solarmetric.kodo.runtime.StateManagerImpl.preSerialize(StateManagerImpl.java:784)
at com.paresh.core.vo.Release.jdoPreSerialize(Release.java)
at com.paresh.core.vo.Release.writeObject(Release.java)
at java.lang.reflect.Method.invoke(Native Method)
at
com.ibm.rmi.io.IIOPOutputStream.invokeObjectWriter(IIOPOutputStream.java:703)
at com.ibm.rmi.io.IIOPOutputStream.outputObject(IIOPOutputStream.java:671)
at
com.ibm.rmi.io.IIOPOutputStream.simpleWriteObject(IIOPOutputStream.java:146)
at
com.ibm.rmi.io.ValueHandlerImpl.writeValueInternal(ValueHandlerImpl.java:217)
at com.ibm.rmi.io.ValueHandlerImpl.writeValue(ValueHandlerImpl.java:144)
at com.ibm.rmi.iiop.CDROutputStream.write_value(CDROutputStream.java:1590)
at com.ibm.rmi.iiop.CDROutputStream.write_value(CDROutputStream.java:1107)
at
com.paresh.core.interfaces._EJSRemoteStatelessValidation_da16513c_Tie.findCorrectionAction(_EJSRemoteStatelessValidation_da16513c_Tie.java:309)
at
com.paresh.core.interfaces._EJSRemoteStatelessValidation_da16513c_Tie._invoke(_EJSRemoteStatelessValidation_da16513c_Tie.java:104)
at
com.ibm.CORBA.iiop.ServerDelegate.dispatchInvokeHandler(ServerDelegate.java:582)
at com.ibm.CORBA.iiop.ServerDelegate.dispatch(ServerDelegate.java:437)
at com.ibm.rmi.iiop.ORB.process(ORB.java:320)
at com.ibm.CORBA.iiop.ORB.process(ORB.java:1544)
at com.ibm.rmi.iiop.Connection.doWork(Connection.java:2063)
at com.ibm.rmi.iiop.WorkUnitImpl.doWork(WorkUnitImpl.java:63)
at com.ibm.ejs.oa.pool.PooledThread.run(ThreadPool.java:95)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:592)
kodo.properties
com.solarmetric.kodo.LicenseKey=
#com.solarmetric.kodo.ee.ManagedRuntimeProperties=TransactionManagerName=java:/TransactionManager
com.solarmetric.kodo.ee.ManagedRuntimeProperties=TransactionManagerName=TransactionFactory
TransactionManagerMethod=com.ibm.ejs.jts.jta.TransactionManagerFactory.getTransactionManager
#com.solarmetric.kodo.ee.ManagedRuntimeClass=com.solarmetric.kodo.ee.InvocationManagedRuntime
com.solarmetric.kodo.ee.ManagedRuntimeClass=com.solarmetric.kodo.ee.AutomaticManagedRuntime
#javax.jdo.PersistenceManagerFactoryClass=com.solarmetric.kodo.impl.jdbc.JDBCPersistenceManagerFactory
javax.jdo.PersistenceManagerFactoryClass=com.solarmetric.kodo.impl.jdbc.ee.EEPersistenceManagerFactory
javax.jdo.option.ConnectionFactoryName=ds/kodo/DataSource1
javax.jdo.option.Optimistic=true
javax.jdo.option.RetainValues=true
javax.jdo.option.NontransactionalRead=true
#com.solarmetric.kodo.DataCacheClass=com.solarmetric.kodo.runtime.datacache.plugins.CacheImpl
# Changing these to a non-zero value will dramatically increase
# performance, but will cause in-memory databases such as Hypersonic
# SQL to never exit when your main() method exits, as the pooled
# connections in the in-memory database will cause a daemon thread to
# remain running.
javax.jdo.option.MinPool=5
javax.jdo.option.MaxPool=10

We do have a makeTransientAll() before the object returns from the Session Bean.
We also tried the JCA path.
After installing the JCA RAR and doing a lookup for the PersistenceManagerFactory, the same code does not throw any exception.
The exception is thrown only if the datasource is used.
Thanks
Paresh
Marc Prud'hommeaux wrote:
Paresh-
It looks like you are returning a collection of instances from an EJB,
which will cause them to be serialized. The serialization is happening
outside the context of a transaction, and Kodo needs to obtain a
connection. Websphere seems to not like that.
You have a few options:
1. Call makeTransientAll() on all the instances before you return them
from your bean methods
2. Manually instantiate all the fields yourself before sending them
back. You could use a bogus ObjectOutputStream to do this.
3. In 3.0, you can use the new detach() API to detach the instances
before sending them back to the client.
In article <[email protected]>, Paresh wrote:
[original message, stack trace and kodo.properties quoted in full above]
Marc Prud'hommeaux [email protected]
SolarMetric Inc. http://www.solarmetric.com -
Error while deleting a Lookup Taxonomy field
Hi Experts,
I have created a main table Products, and there is a field Manufacturer which is a lookup taxonomy field. When I try to delete the field, an error appears stating 'The field cannot be deleted until references to it in Family Hierarchy are removed'. But in Data Manager there exists no data for that table. I have also checked in the other mode whether there exist any relationships. There seems to be nothing, yet I still cannot delete that field. Please help!!!
Thanks in Advance,
Thamizharasi N

Hi Thamizharasi,
I feel you have defined a Family table in your repository, and this family table is referring to the taxonomy field. To resolve this, try the steps below:
1. Open MDM Console and login to the repository.
2. Once connected to the repository, click on the repository name, search for the "Families" table in the Tables sub-window on the right-hand side, and check the family field.
3. If the family field is the Manufacturer lookup taxonomy field, then delete the family table.
4. Delete the lookup taxonomy field from the main table.
Hope this will solve your issue. Revert with the result.
- Shiv -
Searching Records Following the Creation of a Lookup Table
Hello,
Please excuse my ignorance, but I have just created a lookup table using the wizard in SQL Workshop. All is well: the text has been replaced by numbers. My issue is that when I now do a search using a drop-down list based on a query, it returns records based on other numbers as well. That is, if '2' relates to an item, the search picks up anything containing a 2: 2, 21, 22, etc. How can I get the query to return rows for the number selected, and not anything containing that number?
and (
instr(upper("MODEL_ID"),upper(nvl(:P40_REPORT_SEARCH,"MODEL_ID"))) > 0
Kind Regards,
Swelch
Edited by: Steve Welch on Mar 9, 2012 12:03 PM

Here are two potential solutions:
and (instr(upper('~'||"MODEL_ID"||'~'),upper(nvl('~'||:P40_REPORT_SEARCH||'~',"MODEL_ID"))) > 0)
or
AND model_id = NVL(:P40_REPORT_SEARCH, model_id)
The first is often used for components such as shuttles. I think I've added the special characters in the right place - not tested/verified against my previous example.
The second may be more appropriate to your scenario, depending on your data.
Scott -
Hi all,
During mapping I have to use entries of another node of the same payload as a lookup table. Is this possible with graphical message mapping, maybe with advanced functions, or should I use XSLT?
Regards
Mathias

Hi,
sure, it's possible:
you can use global variables (global arrays, for example) - the easiest way:
/people/sap.user72/blog/2005/10/01/xi-new-features-in-sp14
You can also use containers, but go for global variables.
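Sketching the global-variable idea: read the key and value queues of the lookup node once, cache them in a map (in a message mapping this map would live in a global variable or the GlobalContainer), and then resolve every target field against it. A plain-Java sketch of that caching step:

```java
import java.util.HashMap;
import java.util.Map;

class PayloadLookupSketch {

    // Build the lookup table once from the two parallel queues of the lookup node.
    static Map<String, String> buildTable(String[] keys, String[] values) {
        Map<String, String> table = new HashMap<>();
        int n = Math.min(keys.length, values.length);
        for (int i = 0; i < n; i++) {
            table.put(keys[i], values[i]);
        }
        return table;
    }

    // Resolve a key, falling back to a default when the payload has no entry.
    static String resolve(Map<String, String> table, String key, String fallback) {
        String v = table.get(key);
        return v != null ? v : fallback;
    }
}
```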
Regards,
michal