POF serialization of BigInteger
The doc for PofWriter states that writeBigInteger throws an IllegalStateException if the BigInteger is > 128 bits, and indeed it does in practice. I have some objects containing BigIntegers which are larger than this. I can work around the problem for my own objects (serialize and deserialize the BigInteger as a byte[]), but in this case the key is also a BigInteger, which may be > 128 bits.
I tried to add a serializer for BigInteger to the POF config, but it doesn't seem to override the default behaviour. Is this possible?
Obviously I can write a class to wrap this BigInteger key (though that's a bit of a drag), but I really think that Coherence should not refuse to serialize a perfectly valid Java object.
Alasdair wrote:
The doc for PofWriter states that writeBigInteger throws an IllegalStateException if the BigInteger is > 128 bits, and indeed it does in practice. I have some objects containing BigIntegers which are larger than this. I can work around the problem for my own objects (serialize and deserialize the BigInteger as a byte[]), but in this case the key is also a BigInteger, which may be > 128 bits.
I tried to add a serializer for BigInteger to the POF config, but it doesn't seem to override the default behaviour. Is this possible?
Obviously I can write a class to wrap this BigInteger key (though that's a bit of a drag), but I really think that Coherence should not refuse to serialize a perfectly valid Java object.
Hi Alasdair,
BigInteger and BigDecimal are a bit unfairly treated if you look at it from the Java side. On the other hand, the POF format is not Java-oriented; it is a platform-independent specification, implemented among others in Java, which dictates not Java BigInteger/BigDecimal objects but 128-bit integer/floating-point values, which happen to be represented in Java as BigInteger/BigDecimal objects. I agree that the method names writeBigDecimal and writeBigInteger are misleading.
On the other hand, it should definitely be possible to register a custom PofSerializer for BigInteger and BigDecimal, and that is captured as feature request COH-5308. I don't know when it will be released and which versions will be patched with it. Coding-wise it is a fairly simple change; on the other hand, it may have some performance implications for serialization (not for deserialization), so it may be delayed a bit.
Look for COH-5308 in patch release notes and/or ask your Oracle contact to find out when it is scheduled for release.
Best regards,
Robert
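The byte[] workaround Alasdair describes relies on BigInteger's own two's-complement encoding, which has no size limit. A minimal sketch of the conversion (the class and method names are illustrative, not part of the Coherence API); a custom PofSerializer or a wrapper key class could call these from writeExternal/readExternal and transport the result with writeByteArray/readByteArray:

```java
import java.math.BigInteger;

class BigIntegerCodec {

    // BigInteger.toByteArray() yields the minimal two's-complement
    // representation of the value, with no 128-bit restriction.
    static byte[] encode(BigInteger value) {
        return value.toByteArray();
    }

    // The byte[] constructor restores the exact original value,
    // including the sign.
    static BigInteger decode(byte[] bytes) {
        return new BigInteger(bytes);
    }
}
```

This round-trips values well beyond 128 bits, at the cost of losing POF's portable number encoding for that field (other platforms would see an opaque byte array).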
Similar Messages
-
I am running into the following exception even after following all the guidelines to implement POF. The main objective is to perform distributed bulk cache loading.
Oracle Coherence GE 3.7.1.10 <Error> (thread=Invocation:InvocationService, member=1): Failure to deserialize an Invocable object: java.io.StreamCorruptedException: unknown user type: 1001
java.io.StreamCorruptedException: unknown user type: 1001
at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3312)
at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:371)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.InvocationService$InvocationRequest.read(InvocationService.CDB:8)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:19)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:31)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
at java.lang.Thread.run(Thread.java:662)
Following is the pof-config.xml
<?xml version="1.0" encoding="UTF-8" ?>
<pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config http://xmlns.oracle.com/coherence/coherence-pof-config/1.1/coherence-pof-config.xsd">
<user-type-list>
<include>coherence-pof-config.xml</include>
<user-type>
<type-id>1001</type-id>
<class-name>com.westgroup.coherence.bermuda.loader.DistributedLoaderAgent</class-name>
<serializer>
<class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>
<init-params>
<init-param>
<param-type>int</param-type>
<param-value>{type-id}</param-value>
</init-param>
<init-param>
<param-type>java.lang.Class</param-type>
<param-value>{class}</param-value>
</init-param>
<init-param>
<param-type>boolean</param-type>
<param-value>true</param-value>
</init-param>
</init-params>
</serializer>
</user-type>
<user-type>
<type-id>1002</type-id>
<class-name>com.westgroup.coherence.bermuda.profile.lpa.LPACacheProfile</class-name>
<serializer>
<class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>
<init-params>
<init-param>
<param-type>int</param-type>
<param-value>{type-id}</param-value>
</init-param>
<init-param>
<param-type>java.lang.Class</param-type>
<param-value>{class}</param-value>
</init-param>
<init-param>
<param-type>boolean</param-type>
<param-value>true</param-value>
</init-param>
</init-params>
</serializer>
</user-type>
<user-type>
<type-id>1003</type-id>
<class-name>com.westgroup.coherence.bermuda.profile.lpa.Address</class-name>
<serializer>
<class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>
<init-params>
<init-param>
<param-type>int</param-type>
<param-value>{type-id}</param-value>
</init-param>
<init-param>
<param-type>java.lang.Class</param-type>
<param-value>{class}</param-value>
</init-param>
<init-param>
<param-type>boolean</param-type>
<param-value>true</param-value>
</init-param>
</init-params>
</serializer>
</user-type>
<user-type>
<type-id>1004</type-id>
<class-name>com.westgroup.coherence.bermuda.profile.lpa.Discipline</class-name>
<serializer>
<class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>
<init-params>
<init-param>
<param-type>int</param-type>
<param-value>{type-id}</param-value>
</init-param>
<init-param>
<param-type>java.lang.Class</param-type>
<param-value>{class}</param-value>
</init-param>
<init-param>
<param-type>boolean</param-type>
<param-value>true</param-value>
</init-param>
</init-params>
</serializer>
</user-type>
<user-type>
<type-id>1005</type-id>
<class-name>com.westgroup.coherence.bermuda.profile.lpa.Employment</class-name>
<serializer>
<class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>
<init-params>
<init-param>
<param-type>int</param-type>
<param-value>{type-id}</param-value>
</init-param>
<init-param>
<param-type>java.lang.Class</param-type>
<param-value>{class}</param-value>
</init-param>
<init-param>
<param-type>boolean</param-type>
<param-value>true</param-value>
</init-param>
</init-params>
</serializer>
</user-type>
</user-type-list>
<allow-interfaces>true</allow-interfaces>
<allow-subclasses>true</allow-subclasses>
</pof-config>
cache-config.xml
<cache-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://xmlns.oracle.com/coherence/coherence-cache-config"
xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-cache-config http://xmlns.oracle.com/coherence/coherence-cache-config/1.1/coherence-cache-config.xsd">
<defaults>
<serializer>pof</serializer>
</defaults>
<caching-scheme-mapping>
<cache-mapping>
<cache-name>DistributedLPACache</cache-name>
<scheme-name>LPANewCache</scheme-name>
<init-params>
<init-param>
<param-name>back-size-limit</param-name>
<param-value>250MB</param-value>
</init-param>
</init-params>
</cache-mapping>
</caching-scheme-mapping>
<caching-schemes>
<!-- Distributed caching scheme. -->
<distributed-scheme>
<scheme-name>LPANewCache</scheme-name>
<service-name>HBaseLPACache</service-name>
<serializer>
<instance>
<class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
<init-params>
<init-param>
<param-type>java.lang.String</param-type>
<param-value>pof-config.xml</param-value>
</init-param>
</init-params>
</instance>
</serializer>
<backing-map-scheme>
<read-write-backing-map-scheme>
<internal-cache-scheme>
<class-scheme>
<class-name>com.tangosol.util.ObservableHashMap</class-name>
</class-scheme>
</internal-cache-scheme>
<cachestore-scheme>
<class-scheme>
<class-name>com.westgroup.coherence.bermuda.profile.lpa.LPACacheProfile</class-name>
</class-scheme>
</cachestore-scheme>
<read-only>false</read-only>
<write-delay-seconds>0</write-delay-seconds>
</read-write-backing-map-scheme>
</backing-map-scheme>
<autostart>true</autostart>
</distributed-scheme>
<invocation-scheme>
<scheme-name>InvocationService</scheme-name>
<service-name>InvocationService</service-name>
<thread-count>5</thread-count>
<autostart>true</autostart>
</invocation-scheme>
</caching-schemes>
</cache-config>
DistributedLoaderAgent (user type 1001)
import java.io.IOException;
import java.io.Serializable;
import java.lang.annotation.Annotation;
import org.apache.log4j.Logger;
import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;
import com.tangosol.io.pof.PortableObject;
import com.tangosol.io.pof.annotation.Portable;
import com.tangosol.io.pof.annotation.PortableProperty;
import com.tangosol.net.AbstractInvocable;
import com.tangosol.net.InvocationService;
@Portable
public class DistributedLoaderAgent extends AbstractInvocable implements PortableObject {

    private static final long serialVersionUID = 10L;
    private static Logger m_logger = Logger.getLogger(DistributedLoaderAgent.class);

    @PortableProperty(0)
    public String partDumpFileName = null;

    public String getPartDumpFileName() {
        return partDumpFileName;
    }

    public void setPartDumpFileName(String partDumpFileName) {
        this.partDumpFileName = partDumpFileName;
    }

    public DistributedLoaderAgent() {
        super();
        m_logger.debug("Configuring this loader");
    }

    public DistributedLoaderAgent(String partDumpFile) {
        super();
        m_logger.debug("Configuring this loader to load dump file " + partDumpFile);
        partDumpFileName = partDumpFile;
    }

    @Override
    public void init(InvocationService service) {
        super.init(service);
    }

    @Override
    public void run() {
        try {
            m_logger.debug("Invoked DistributedLoaderAgent");
            MetadataTranslatorService service = new MetadataTranslatorService(false, "LPA");
            m_logger.debug("Invoking service.loadLPACache");
            service.loadLPACache(partDumpFileName);
        } catch (Exception e) {
            m_logger.debug("Exception in DistributedLoaderAgent " + e.getMessage());
        }
    }

    @Override
    public void readExternal(PofReader arg0) throws IOException {
        setPartDumpFileName(arg0.readString(0));
    }

    @Override
    public void writeExternal(PofWriter arg0) throws IOException {
        arg0.writeString(0, getPartDumpFileName());
    }
}
Please assist.
OK, I have two suggestions.
1. Always create and flush the ObjectOutputStream before creating the ObjectInputStream.
2. Always close the output before you close the input. Actually, once you close the output stream, both the input stream and the socket are closed anyway, so you can economize on this code. In the above you have out.writeObject() followed by input.close() followed by out.close(). Change this to out.writeObject() followed by out.close(). It may be that something needed flushing and the input.close() prevented the flush from happening. -
hi
I would like to write an integration test for POF serialization.
Is it possible to make Coherence serialize all objects I put into a named cache?
Or do I need to start two nodes (in two classloaders) in my test?
If you want to just test the POF serialization of your classes you can just serialize, deserialize and check the results,
e.g.
MyPofClass original = new MyPofClass();
// populate fields of original
// Serialize to Binary
ConfigurablePofContext pofContext = new ConfigurablePofContext("name of pof config file .xml");
Binary binary = ExternalizableHelper.toBinary(original, pofContext);
// Deserialize back to original class
MyPofClass deserialized = ExternalizableHelper.fromBinary(binary, pofContext);
// now do some asserts to check that the fields in the original match the fields in the deserialized instance
I usually do that test for all but the most simple POF classes I write.
JK -
I have a strange issue using a POF serializer where one of the fields of the class is not populated when the object is retrieved from the cache.
The class looks like:
public class RegionSerializer extends AbstractPersistentEntitySerializer<RegionImpl> {
    private static final int REGION_CODE = 11;
    private static final int NAME = 12;
    private static final int ID_REGION = 13;
    private static final int ID_CCY_RPT = 14;

    @Override
    protected RegionImpl createInstance() {
        return new RegionImpl();
    }

    public void serialize(PofWriter out, Object o) throws IOException {
        RegionImpl obj = (RegionImpl) o;
        super.serialize(out, obj);
        out.writeObject(REGION_CODE, obj.getRegionCode());
        out.writeString(NAME, obj.getName());
        out.writeInt(ID_REGION, obj.getIdRegion());
        out.writeObject(ID_CCY_RPT, obj.getIdCcyRpt());
    }

    public Object deserialize(PofReader in) throws IOException {
        RegionImpl obj = (RegionImpl) super.deserialize(in);
        obj.setRegionCode((RegionCode) in.readObject(REGION_CODE));
        obj.setName(in.readString(NAME));
        obj.setIdRegion(in.readInt(ID_REGION));
        obj.setIdCcyRpt((CurrencyCode) in.readObject(ID_CCY_RPT));
        return obj;
    }
}
and the RegionCodeSerializer...
public class RegionCodeSerializer implements PofSerializer {
    private static final int CODE = 11;

    public void serialize(PofWriter out, Object o) throws IOException {
        RegionCode obj = (RegionCode) o;
        out.writeString(CODE, obj.getCode());
    }

    public Object deserialize(PofReader in) throws IOException {
        RegionCode obj = new RegionCode();
        obj.setCode(in.readString(CODE));
        return obj;
    }
}
the output from the log after inserting and retrieving from the cache is
06-Oct-2010 10:11:28,277 BST DEBUG refdata.RefDataServiceImpl main - Region count:4
06-Oct-2010 10:11:28,277 BST INFO cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=5],regionCode=EMEA,name=LONDON]
06-Oct-2010 10:11:28,277 BST INFO cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=6],regionCode=US,name=NEW YORK]
06-Oct-2010 10:11:28,277 BST INFO cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=7],regionCode=APAC,name=TOKYO]
06-Oct-2010 10:11:28,277 BST INFO cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=8],regionCode=NON_PYRAMID_DESKS,name=NON PYRAMID DESKS]
06-Oct-2010 10:11:28,293 BST INFO cache.TestCacheStartUp main - Is Cache empty?: false
06-Oct-2010 10:11:28,293 BST INFO cache.TestCacheStartUp main - Cache Size is: 4
06-Oct-2010 10:11:28,324 BST INFO cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=6],regionCode=US,name=<null>]
06-Oct-2010 10:11:28,324 BST INFO cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=7],regionCode=APAC,name=<null>]
06-Oct-2010 10:11:28,324 BST INFO cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=8],regionCode=NON_PYRAMID_DESKS,name=<null>]
06-Oct-2010 10:11:28,324 BST INFO cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=5],regionCode=EMEA,name=<null>]
As can be seen from the output, the name field is null after retrieval. It seems that the 3 remaining fields after regionCode are being ignored by the deserialize (or serialize) method in the serializer class above, but I can't see why. Any ideas?
Hi,
You need to call the readRemainder()/writeRemainder() method at the end of deserialization/serialization to properly terminate reading/writing of a user type.
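A toy illustration of why the remainder matters, using plain data streams rather than the POF API (all names here are illustrative): if a reader stops before consuming the unread tail of record 1, record 2 is read from the wrong offset and comes back garbled.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

class RemainderDemo {

    // Each record is a name (UTF) followed by one extra int that an
    // older reader does not know about -- its "remainder".
    static byte[] writeTwoRecords() throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeUTF("first");
        out.writeInt(42);       // record 1's remainder
        out.writeUTF("second");
        out.writeInt(43);       // record 2's remainder
        out.close();
        return buf.toByteArray();
    }

    // Correct reader: read the field it knows, then consume the
    // remainder, so record 2 starts at the right offset.
    static String readSecondName(byte[] data) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
        in.readUTF();           // record 1's known field
        in.readInt();           // consume record 1's remainder
        return in.readUTF();    // now aligned on record 2
    }
}
```

Dropping the in.readInt() call above would leave four stray bytes in the stream, which is the same class of misalignment a POF serializer causes when it omits readRemainder()/writeRemainder().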
Thanks,
Wei -
Overriding pof serialization in subclasses?
When subclassing a class that support POF-serialization how would I know what property index I can pass in for the additional elements that are added by the subclass assuming that I do not have the source code of the superclass (or the information exists in the javadoc of the superclass)?
Is it for instance possible to read the last property index used from the stream (PofWriter) somehow?
I tried to find anything in the PofContext and PofWriter interface but did not see anything that looked relevant...
Examples of situations were this would apply is for instance when creating a custom filter using a Coherence built in filter as base-class.
Or am I missing something obvious about how to handle subclassing and POF-serialization that makes this a non-issue?
Best Regards
Magnus
MagnusE wrote:
Thanks for the VERY quick answer Robert - I hardly posted before you answered :-)
I think it sounds a bit fragile to rely on documentation for this (on the other hand, most cases of extending classes you don't have the source to are more or less fragile!).
Since it does not sound too complicated to provide a method that returns the highest property index used so far in the stream (or is there some inherent problem with that solution?), I would prefer that over simply documenting it.
Hi Magnus,
On the contrary.
IMHO, documentation of the first property index a subclass may use is the only correct way to do this.
This is because it is perfectly valid to encode null values by not writing anything to the PofWriter. Therefore it is possible that the property index that was last written is not the greatest one which can ever be used, in which case you can't just continue from the value following the one that was last written, as that depends on the superclass state, even though the mapping of property indexes to attributes must not depend on state. At least the intent of the specification and the direction of ongoing development is that the mapping is constant, and dependencies will most probably be introduced on this in upcoming versions (sorry, can't tell anything more specific about this yet).
As for also providing a method which returns this CONSTANT value, that can also be requested, but the last written property index is not a good candidate for your purposes.
For Evolvable implementors, this is even trickier, as all properties from later versions must follow all properties from earlier versions, otherwise they wouldn't end up in the remainder and Evolvable couldn't function as it should. This also means that all properties of all subclasses from later implementation versions must have larger property indexes than all properties from preceding implementation versions.
Therefore for Evolvable implementors, the first property index a subclass may use for each implementation version must be documented separately.
Actually, this also poses a theoretical problem, as this also means that the superclass must contain information derived from all subclasses, which is impossible to do correctly when the subclass is from another vendor and not from the superclass vendor. The superclass vendor may allocate a large gap for subclass properties, but when another attribute is added to a later version of the superclass, it is theoretically possible that the gap is not enough.
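The documented-first-index contract described above can be made mechanical by having the superclass publish its first free index as a constant that subclasses start from. A sketch (all class and constant names are illustrative, not Coherence API):

```java
// The superclass documents, as part of its serialization contract, the
// first POF property index that subclasses may use, leaving a gap for
// properties added by its own later versions.
class BasePofType {
    public static final int PROP_NAME  = 0;
    public static final int PROP_VALUE = 1;
    // Contract: subclasses must start at index 10 or above.
    public static final int FIRST_SUBCLASS_INDEX = 10;
}

// The subclass allocates its indexes starting at the documented value,
// so they can never collide with the superclass's indexes.
class DerivedPofType extends BasePofType {
    public static final int PROP_EXTRA = FIRST_SUBCLASS_INDEX;
    public static final int PROP_OTHER = FIRST_SUBCLASS_INDEX + 1;
}
```

A subclass serializer would then write its own properties starting at PROP_EXTRA after delegating the lower indexes to the superclass logic; the gap between 1 and 10 absorbs properties added by later superclass versions, subject to the caveat above.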
Best regards,
Robert -
Hi, I am getting the following error while doing POF serialization. Please help.
java.lang.StackOverflowError
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
... (the same frame repeats until the stack overflows)
saiyansharwan wrote:
Hi, I am getting the following error while doing POF serialization. Please help.
java.lang.StackOverflowError
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.isEvolvable(PofBufferWriter.java:2753)
Hi saiyansharwan,
which Coherence version is this?
Best regards,
Robert -
Default to Java Serialization in case Pof Serialization not defined
Is this possible to do, i.e. if POF serialization is not defined for a certain class, use Java serialization instead?
Or to turn it around, is it possible to define POF serialization only for certain classes in a distributed cache and use Java serialization for the rest?
Hi,
the problem for this is that Java serialization is not aware of POF (or for that matter even ExternalizableLite), so if you have a Java-serialized class which has a member which is supposed to be POF-serializable, it in fact will not be serialized with POF, because Java serialization will not delegate to POF.
So it is very hard to mix the two together. You can do it for top-level objects by providing a special PofSerializer instance for the non-POF class which serializes to byte array and you write the byte array as a POF attribute, but it is not possible for POF-aware objects contained within a non-POF aware object to be POF serialized.
Also, if you attempt to do this, then you can kiss goodbye to platform independence. You must use Java on both ends and have all the libraries which the classes used in the state want to pull in.
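The top-level byte-array approach mentioned above amounts to plain Java serialization into a buffer, which a custom PofSerializer could then write as a single POF attribute via writeByteArray. A sketch of the conversion step (the class name is illustrative; as noted, this assumes Java on both ends):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class JavaSerializationShim {

    // Java-serialize the non-POF object into a byte[]; a PofSerializer
    // would write this with PofWriter.writeByteArray(index, bytes).
    static byte[] toBytes(Serializable o) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(o);
        }
        return buf.toByteArray();
    }

    // Reverse step for deserialize(): read the byte[] back and
    // Java-deserialize it to recover the original object.
    static Object fromBytes(byte[] b) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(b))) {
            return in.readObject();
        }
    }
}
```

Note that any POF-capable objects nested inside the wrapped object still go through Java serialization here, which is exactly the limitation described above.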
Best regards,
Robert -
Unknown user type with POF serialization
Hi all,
I'm using 3.6 and am just starting to implement POF. In general it has been pretty easy but I seem to have a problem with my near scheme and POF. Things work ok in my unit tests, but it doesn't work when I deploy to a single instance of WebLogic 12 on my laptop. Here is an example scheme:
<near-scheme>
<scheme-name>prod-near</scheme-name>
<autostart>true</autostart>
<front-scheme>
<local-scheme>
<high-units>{high-units 2000}</high-units>
<expiry-delay>{expiry-delay 2h}</expiry-delay>
</local-scheme>
</front-scheme>
<back-scheme>
<distributed-scheme>
<backing-map-scheme>
<local-scheme>
<high-units>{high-units 10000}</high-units>
<expiry-delay>{expiry-delay 2h}</expiry-delay>
</local-scheme>
</backing-map-scheme>
<serializer>
<instance>
<class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
<init-params>
<init-param>
<param-type>String</param-type>
<param-value>/Bus/pof-config.xml</param-value>
</init-param>
</init-params>
</instance>
</serializer>
</distributed-scheme>
</back-scheme>
</near-scheme>
I don't know if it matters, but some of my caches use another scheme that references this one as a parent:
<near-scheme>
<scheme-name>daily-near</scheme-name>
<scheme-ref>prod-near</scheme-ref>
<autostart>true</autostart>
<back-scheme>
<distributed-scheme>
<backing-map-scheme>
<local-scheme>
<high-units system-property="daily-near-high-units">{high-units 10000}</high-units>
<expiry-delay>{expiry-delay 1d}</expiry-delay>
</local-scheme>
</backing-map-scheme>
<serializer>
<instance>
<class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
<init-params>
<init-param>
<param-type>String</param-type>
<param-value>/Bus/pof-config.xml</param-value>
</init-param>
</init-params>
</instance>
</serializer>
</distributed-scheme>
</back-scheme>
</near-scheme>
Those schemes have existed for years. I'm only now adding the serializers. I use this same cache config file in my unit tests, as well as the same pof config file. My unit tests do ExternalizableHelper.toBinary(o, pofContext) and ExternalizableHelper.fromBinary(b, pofContext). I create the test pof context by doing new ConfigurablePofContext("/Bus/pof-config.xml"). I've also tried actually putting and getting an object to and from a cache in my unit tests. Everything works as expected.
My type definition looks like this:
<user-type>
<type-id>1016</type-id>
<class-name>com.mycompany.mydepartment.bus.service.role.RoleResource</class-name>
</user-type>
I'm not using the tangosol.pof.enabled system property because I don't think it's necessary with the explicit serializers.
Here is part of a stack trace:
(Wrapped) java.io.IOException: unknown user type: com.mycompany.mydepartment.bus.service.role.RoleResource
at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:214)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ConverterValueToBinary.convert(PartitionedCache.CDB:3)
at com.tangosol.util.ConverterCollections$ConverterCacheMap.put(ConverterCollections.java:2486)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.put(PartitionedCache.CDB:1)
at com.tangosol.coherence.component.util.SafeNamedCache.put(SafeNamedCache.CDB:1)
at com.tangosol.net.cache.CachingMap.put(CachingMap.java:943)
at com.tangosol.net.cache.CachingMap.put(CachingMap.java:902)
at com.tangosol.net.cache.CachingMap.put(CachingMap.java:814)
Any idea what I'm missing?
Thanks
John
SR-APX wrote:
Aleks
thanks for your response.
However, the include element needs to be present inside the <user-type-list> tag:
<user-type-list><include>coherence-pof-config.xml</include></user-type-list>
For all other interested users: the following property, tangosol.pof.enabled=true, must also be set for POF serialization to work correctly.
Thanks again...
Shamsur
Hi Shamsur,
it is not mandatory to use tangosol.pof.enabled=true; you can alternatively specify the serializer for the clustered services to be configured for POF (not necessarily all of them) on a service-by-service basis in the cache configuration file, explicitly, with the following element:
<serializer>com.tangosol.io.pof.ConfigurablePofContext</serializer>
Best regards,
Robert -
Can I enable pof serialization for one cache and other JAVA serialization
I have a Coherence cluster with a few caches. Is there any way I can enable POF serialization for one cache and have the others use normal Java serialization?
839051 wrote:
I have a Coherence cluster with a few caches. Is there any way I can enable POF serialization for one cache and have the others use normal Java serialization?
Hi,
you can control serialization on a service-by-service basis. You can specify which serializer to use for a service with the <serializer> element in the service-scheme element corresponding to that service in the cache configuration file.
Be aware, though, that if you use Coherence*Extend, and the service serializer configuration for the proxy service does not match the serializer configuration of the service which you are proxying to the extend client then the proxy node has to de- and reserialize the data which it moves between the service and the client.
Best regards,
Robert -
Error while creating the custom POF serializer class
The error which I am getting is:
2011-07-19 15:16:38.767/4.840 Oracle Coherence GE 3.7.0.0 <Error> (thread=main, member=1): Error while starting service "AspNetSessionCache": (Wrapped) (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Unable to load class for user type (Config=pof-config.xml, Type-Id=1001, Class-Name=examples.testJavaClass)) (Wrapped) java.lang.ClassNotFoundException: examples.testJavaClass
at com.tangosol.coherence.component.util.Daemon.start(Daemon.CDB:52)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.start(Service.CDB:7)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.start(Grid.CDB:6)
at com.tangosol.coherence.component.util.SafeService.startService(SafeService.CDB:39)
at com.tangosol.coherence.component.util.safeService.SafeCacheService.startService(SafeCacheService.CDB:5)
at com.tangosol.coherence.component.util.SafeService.ensureRunningService(SafeService.CDB:27)
at com.tangosol.coherence.component.util.SafeService.start(SafeService.CDB:14)
at com.tangosol.net.DefaultConfigurableCacheFactory.ensureServiceInternal(DefaultConfigurableCacheFactory.java:1102)
at com.tangosol.net.DefaultConfigurableCacheFactory.ensureService(DefaultConfigurableCacheFactory.java:934)
at com.tangosol.net.DefaultCacheServer.startServices(DefaultCacheServer.java:81)
at com.tangosol.net.DefaultCacheServer.intialStartServices(DefaultCacheServer.java:250)
at com.tangosol.net.DefaultCacheServer.startAndMonitor(DefaultCacheServer.java:55)
at com.tangosol.net.DefaultCacheServer.main(DefaultCacheServer.java:197)
Caused by: (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Unable to load class for user type (Config=pof-config.xml, Type-Id=1001, Class-Name=examples.testJavaClass)) (Wrapped) java.lang.ClassNotFoundException: examples.testJavaClass
at com.tangosol.io.ConfigurableSerializerFactory.createSerializer(ConfigurableSerializerFactory.java:46)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.instantiateSerializer(Service.CDB:1
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:32)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.ensureSerializer(Service.CDB:4)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onEnter(Grid.CDB:26)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onEnter(PartitionedService.CDB:19)
at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:14)
at java.lang.Thread.run(Thread.java:619)
Caused by: (Wrapped: Unable to load class for user type (Config=pof-config.xml, Type-Id=1001, Class-Name=examples.testJavaClass)) (Wrapped) java.lang.ClassNotFoundException: examples.testJavaClass
at com.tangosol.util.Base.ensureRuntimeException(Base.java:288)
at com.tangosol.io.pof.ConfigurablePofContext.report(ConfigurablePofContext.java:1254)
at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:956)
at com.tangosol.io.pof.ConfigurablePofContext.initialize(ConfigurablePofContext.java:775)
at com.tangosol.io.pof.ConfigurablePofContext.setContextClassLoader(ConfigurablePofContext.java:319)
at com.tangosol.io.ConfigurableSerializerFactory.createSerializer(ConfigurableSerializerFactory.java:42)
... 7 more
Caused by: (Wrapped) java.lang.ClassNotFoundException: examples.testJavaClass
at com.tangosol.util.Base.ensureRuntimeException(Base.java:288)
at com.tangosol.util.Base.ensureRuntimeException(Base.java:269)
at com.tangosol.io.pof.ConfigurablePofContext.loadClass(ConfigurablePofContext.java:1198)
at com.tangosol.io.pof.ConfigurablePofContext.createPofConfig(ConfigurablePofContext.java:952)
... 10 more
Caused by: java.lang.ClassNotFoundException: examples.testJavaClass
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at com.tangosol.util.ExternalizableHelper.loadClass(ExternalizableHelper.java:3056)
at com.tangosol.io.pof.ConfigurablePofContext.loadClass(ConfigurablePofContext.java:1194)
... 11 more
Exception in thread "main" (Wrapped) (Wrapped: error creating class "com.tangosol.io.pof.ConfigurablePofContext") (Wrapped: Unable to load class for user type (Config=pof-config.xml, Type-Id=1001, Class-Name=examples.testJavaClass)) (Wrapped) java.lang.ClassNotFoundException: examples.testJavaClass
2011-07-19 15:16:38.825/4.898 Oracle Coherence GE 3.7.0.0 <D4> (thread=ShutdownHook, member=1): ShutdownHook: stopping cluster node
2011-07-19 15:16:38.826/4.899 Oracle Coherence GE 3.7.0.0 <D5> (thread=Cluster, member=1): Service Cluster left the cluster
Press any key to continue . . .
coherence-pof-config.xml is
<?xml version="1.0"?>
<!DOCTYPE pof-config SYSTEM "pof-config.dtd">
<pof-config>
<user-type-list>
<user-type>
<type-id>1001</type-id>
<class-name>examples.testJavaClass</class-name>
<serializer>
<class-name>com.tangosol.io.pof.PortableObjectSerializer</class-name>
<init-params>
<init-param>
<param-type>string</param-type>
<param-value>1</param-value>
</init-param>
</init-params>
</serializer>
</user-type>
</user-type-list>
</pof-config>
The testJavaClass.java file is
package examples;
import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;
import com.tangosol.io.pof.PortableObject;
import com.tangosol.util.Base;
import java.io.IOException;
public class testJavaClass implements PortableObject {
    private String MembershipId;
    private String m_sStreet;
    private String m_sCity;

    public testJavaClass() {
    }

    public testJavaClass(String sName, String sStreet, String sCity) {
        setName(sName);
        setStreet(sStreet);
        setCity(sCity);
    }

    public void readExternal(PofReader reader)
            throws IOException {
        setName(reader.readString(0));
        setStreet(reader.readString(1));
        setCity(reader.readString(2));
    }

    public void writeExternal(PofWriter writer)
            throws IOException {
        writer.writeString(0, getName());
        writer.writeString(1, getStreet());
        writer.writeString(2, getCity());
    }

    // accessor methods omitted for brevity
}
Thanks.
Hi Wijk,
I have created the Java class using the NetBeans IDE, and I am running it with the .NET client; I kept it in the folder (C:\Program Files\Oracle\Coherence for .NET\examples\ContactCache.Java\src\examples).
Thanks.... -
Pof Serialization Error leads to partial cache updates in XA Transaction
I am using coherence jca adapter to enlist in XA transactions with the database operations. The data is being stored in distributed caches with the cache member running on weblogic server with "Storage disabled". POF is being used for serialization. As a part of a single transaction, multiple caches, which are obtained from the CacheAdapter, are being updated. The application code does explicit updates to the cache and the database within the same transaction with the write to the cache happening after the write the database has been executed.
It is being observed that when an exception happens during serialization of an object, the cache updates made prior to the error are not rolled back. Namely,
I have CacheA for Object A, Cache B for Object B and Cache C for Object C.
I am updating A, B, C with the same transaction and in the same order as the objects are listed. So database for A is updated followed by cache update for A, DataBase B is updated followed by Cache update for B and similarly for C.
If there is an error while serializing C, all the database updates are rolled back, however updates to A and B are committed to the cache.
Why aren't all the cache updates being rolled back? Has this been fixed in Coherence 3.6?
Thanks,
Shamsur
Application Server: Weblogic 10.3
jdbc driver : XA thin driver
coherence : 3.5.0
Caused by: java.lang.IllegalStateException: decimal value exceeds IEEE754r 128-bit range: 7777777788888888888899999999999900000000000000044444444444447777777777777.00
at com.tangosol.io.pof.PofHelper.calcDecimalSize(PofHelper.java:1517)
at com.tangosol.io.pof.PofBufferWriter.writeBigDecimal(PofBufferWriter.java:562)
at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1325)
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.writeObject(PofBufferWriter.java:2092)
at com.apx.core.datalayer.data.basictypes.BigDecimalMoneyImpl.writeExternal(BigDecimalMoneyImpl.java:127)
at com.tangosol.io.pof.PortableObjectSerializer.serialize(PortableObjectSerializer.java:88)
at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1439)
at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.writeObject(PofBufferWriter.java:2092)
at com.apx.instrument.datalayer.data.domain.impl.DOtcInstrumentImpl.writeExternal(DOtcInstrumentImpl.java:200)
at com.apx.core.datalayer.data.impl.AbstractDomainObject.writeExternal(AbstractDomainObject.java:109)
... 185 more
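The IllegalStateException in the trace above comes from Coherence rejecting a BigDecimal whose unscaled value needs more than 127 magnitude bits, since the POF decimal format is capped at a signed 128-bit unscaled integer. As a rough pre-flight guard in plain Java, a sketch like the following can flag such values before they reach the PofWriter (fitsPof128 is an illustrative name, not Coherence API; the real check in PofHelper.calcDecimalSize may also constrain the scale):

```java
import java.math.BigDecimal;

public class PofDecimalCheck {
    // POF encodes a decimal as a signed 128-bit unscaled value plus a scale,
    // so the unscaled magnitude must fit in at most 127 bits.
    public static boolean fitsPof128(BigDecimal dec) {
        return dec.unscaledValue().bitLength() <= 127;
    }

    public static void main(String[] args) {
        // A small value fits comfortably.
        System.out.println(fitsPof128(new BigDecimal("123.45")));
        // The 73-digit value from the stack trace above does not.
        System.out.println(fitsPof128(new BigDecimal(
                "7777777788888888888899999999999900000000000000044444444444447777777777777.00")));
    }
}
```

Values that fail this check would have to be carried as a byte array (e.g. BigDecimal.unscaledValue().toByteArray() plus the scale) rather than through writeBigDecimal.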
at com.tangosol.coherence.ra.component.connector.resourceAdapter.cciAdapter.CacheAdapter$ManagedConnection$LocalTransaction.commit(CacheAdapter.CDB:37)
at weblogic.connector.security.layer.AdapterLayer.commit(AdapterLayer.java:570)
at weblogic.connector.transaction.outbound.NonXAWrapper.commit(NonXAWrapper.java:84)
at weblogic.transaction.internal.NonXAServerResourceInfo.commit(NonXAServerResourceInfo.java:330)
at weblogic.transaction.internal.ServerTransactionImpl.globalPrepare(ServerTransactionImpl.java:2251)
at weblogic.transaction.internal.ServerTransactionImpl.internalCommit(ServerTransactionImpl.java:270)
at weblogic.transaction.internal.ServerTransactionImpl.commit(ServerTransactionImpl.java:230)
at weblogic.transaction.internal.TransactionManagerImpl.commit(TransactionManagerImpl.java:283)
at $Proxy150.create(Unknown Source)
javax.resource.spi.LocalTransactionException: CoherenceRA: Commit failed:
java.lang.RuntimeException: error with the class: com.apx.instrument.datalayer.data.domain.impl.DOtcInstrumentImpl
at com.apx.core.datalayer.data.impl.AbstractDomainObject.writeExternal(AbstractDomainObject.java:111)
at com.tangosol.io.pof.PortableObjectSerializer.serialize(PortableObjectSerializer.java:88)
at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1439)
at com.tangosol.io.pof.ConfigurablePofContext.serialize(ConfigurablePofContext.java:338)
at com.tangosol.util.ExternalizableHelper.serializeInternal(ExternalizableHelper.java:2508)
at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:205)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ConverterValueToBinary.convert(DistributedCache.CDB:3)
at com.tangosol.util.ConverterCollections$AbstractConverterEntry.getValue(ConverterCollections.java:3333)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$BinaryMap.putAll(DistributedCache.CDB:19)
at com.tangosol.util.ConverterCollections$ConverterMap.putAll(ConverterCollections.java:1570)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ViewMap.putAll(DistributedCache.CDB:1)
at com.tangosol.coherence.component.util.SafeNamedCache.putAll(SafeNamedCache.CDB:1)
at com.tangosol.coherence.component.util.collections.WrapperMap.putAll(WrapperMap.CDB:1)
at com.tangosol.coherence.component.util.DeltaMap.resolve(DeltaMap.CDB:9)
at com.tangosol.coherence.component.util.deltaMap.TransactionMap.commit(TransactionMap.CDB:1)
at com.tangosol.coherence.component.util.TransactionCache.commit(TransactionCache.CDB:14)
at com.tangosol.coherence.component.util.transactionCache.Local.commit(Local.CDB:1)
at com.tangosol.coherence.ra.component.connector.resourceAdapter.cciAdapter.CacheAdapter$ManagedConnection$LocalTransaction.commit(CacheAdapter.CDB:25)
at weblogic.connector.security.layer.AdapterLayer.commit(AdapterLayer.java:570)
at weblogic.connector.transaction.outbound.NonXAWrapper.commit(NonXAWrapper.java:84)
at weblogic.transaction.internal.NonXAServerResourceInfo.commit(NonXAServerResourceInfo.java:330)
at weblogic.transaction.internal.ServerTransactionImpl.globalPrepare(ServerTransactionImpl.java:2251)
at weblogic.transaction.internal.ServerTransactionImpl.internalCommit(ServerTransactionImpl.java:270)
at weblogic.transaction.internal.ServerTransactionImpl.commit(ServerTransactionImpl.java:230)
at weblogic.transaction.internal.TransactionManagerImpl.commit(TransactionManagerImpl.java:283)
at org.springframework.transaction.jta.JtaTransactionManager.doCommit(JtaTransactionManager.java:1009)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.processCommit(AbstractPlatformTransactionManager.java:754)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.commit(AbstractPlatformTransactionManager.java:723)
at org.springframework.transaction.interceptor.TransactionAspectSupport.commitTransactionAfterReturning(TransactionAspectSupport.java:374)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:120)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
Hi SR-APX,
The problem is that, even though you are using the JCA adaptor, Coherence (pre-3.6) is not really transactional. Once you commit, all that data is still being pushed out to the distributed cluster members, which all work independently. An error on one member will not stop data being written successfully to others.
In Coherence 3.6 there are real transactions but you would need to see if the limitations on them fit your use-cases.
JK -
Hi guys,
I get an "unknown user type" issue for only one class defined in my pof-config file. The rest are OK, which is the part I can't get my head round.
Log (showing config load)
Everything seems to load ok. In fact, just to confirm, I messed up some tags in the xml file and I got problems. So it's finding the file ok.
2010-06-30 11:35:11.965/1.212 Oracle Coherence GE 3.5.3/465 <D5> (thread=Cluster, member=n/a): Member 1 joined Service InvocationService with senior member 1
2010-06-30 11:35:12.007/1.255 Oracle Coherence GE 3.5.3/465 <D5> (thread=Invocation:Management, member=2): Service Management joined the cluster with senior service member 1
2010-06-30 11:35:12.267/1.514 Oracle Coherence GE 3.5.3/465 <Info> (thread=Cluster, member=2): Loading POF configuration from resource "file:/mnt/linux-share/zeus/Zeus-core/config/zeus-pof-config.xml"
2010-06-30 11:35:12.268/1.515 Oracle Coherence GE 3.5.3/465 <Info> (thread=Cluster, member=2): Loading POF configuration from resource "jar:file:/mnt/linux-share/repository/com/tangsol/coherence/3.5.3/coherence-3.5.3.jar!/coherence-pof-config.xml"
2010-06-30 11:35:12.283/1.530 Oracle Coherence GE 3.5.3/465 <D5> (thread=DistributedCache, member=2): Service DistributedCache joined the cluster with senior service member 1
2010-06-30 11:35:12.338/1.585 Oracle Coherence GE 3.5.3/465 <D5> (thread=DistributedCache, member=2): Service DistributedCache: received ServiceConfigSync containing 2040 entries
Exception
Exception in thread "main" java.lang.IllegalArgumentException: unknown user type: org.zeus.query.QueryInvocable
at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:400)
at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:389)
at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1432)
at com.tangosol.io.pof.ConfigurablePofContext.serialize(ConfigurablePofContext.java:338)
at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.writeObject(Service.CDB:4)
at com.tangosol.coherence.component.net.Message.writeObject(Message.CDB:1)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.InvocationService$InvocationRequest.write(InvocationService.CDB:3)
at com.tangosol.coherence.component.util.daemon.queueProcessor.packetProcessor.PacketPublisher.packetizeMessage(PacketPublisher.CDB:137)
at com.tangosol.coherence.component.util.daemon.queueProcessor.packetProcessor.PacketPublisher$InQueue.add(PacketPublisher.CDB:8)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.dispatchMessage(Grid.CDB:50)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.post(Grid.CDB:35)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.send(Grid.CDB:1)
at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.InvocationService.execute(InvocationService.CDB:31)
at com.tangosol.coherence.component.util.safeService.SafeInvocationService.execute(SafeInvocationService.CDB:1)
at org.ebtic.bpm.zeus.query.DistributedQueryProcessor.invokeQuery(DistributedQueryProcessor.java:133)
at org.ebtic.bpm.zeus.query.DistributedQueryProcessor.query(DistributedQueryProcessor.java:95)
at org.ebtic.bpm.zeus.query.DistributedQueryProcessor.main(DistributedQueryProcessor.java:56)
My pof-config file looks like this:
<!DOCTYPE pof-config SYSTEM "pof-config.dtd">
<pof-config>
<user-type-list>
<include>coherence-pof-config.xml</include>
<user-type>
<type-id>10001</type-id>
<class-name>org.util.ZeusKey</class-name>
</user-type>
<user-type>
<type-id>10002</type-id>
<clas-name>org.query.QueryInvocable</clas-name>
</user-type>
<user-type>
<type-id>10003</type-id>
<class-name>org.sequencegenerator.ZeusSequenceGenerator$State</class-name>
</user-type>
</user-type-list>
<allow-interfaces>true</allow-interfaces>
<allow-subclasses>true</allow-subclasses>
</pof-config>
Only the class QueryInvocable is causing a problem. ZeusKey and SequenceGenerator are working perfectly well.
QueryInvocable looks like the following:
public class QueryInvocable
        extends AbstractInvocable implements ExternalizableLite, PortableObject {
    private String m_sCacheName;
    private PartitionSet partitionSet;

    public QueryInvocable() {
    }

    public QueryInvocable(String sCacheName, PartitionSet partitions) {
        m_sCacheName = sCacheName;
        partitionSet = partitions;
    }

    public PartitionSet getPartitionSet() {
        return partitionSet;
    }

    public String getCacheName() {
        return m_sCacheName;
    }

    public void setPartitionSet(PartitionSet pSet) {
        this.partitionSet = pSet;
    }

    public void setCacheName(String name) {
        this.m_sCacheName = name;
    }

    /**
     * {@inheritDoc}
     */
    public void run() {
        try {
            ZeusQueryProcessor client = new ZeusQueryProcessor("Distributed Query Processor", partitionSet);
            client.process();
        }
        catch (Exception e) {
            System.err.println("Exception creating ZeusQueryProcessor.");
            e.printStackTrace();
        }
    }

    public void readExternal(PofReader reader) throws IOException {
        System.out.println("Reading in....");
        m_sCacheName = reader.readString(0);
        partitionSet = (PartitionSet) reader.readObject(1);
    }

    public void writeExternal(PofWriter writer) throws IOException {
        System.out.println("Writing out....");
        writer.writeString(0, m_sCacheName);
        writer.writeObject(1, partitionSet);
    }

    public void readExternal(DataInput in) throws IOException {
        System.out.println("Reading in....");
        m_sCacheName = (String) ExternalizableHelper.readObject(in);
        partitionSet = (PartitionSet) ExternalizableHelper.readObject(in);
    }

    public void writeExternal(DataOutput out) throws IOException {
        System.out.println("Writing out....");
        ExternalizableHelper.writeObject(out, m_sCacheName);
        ExternalizableHelper.writeObject(out, partitionSet);
    }
}
Edited by: user11218537 on 30-Jun-2010 00:54
user11218537 wrote:
Sorry, this was the result of my not-very-efficient attempt to anonymize my code posting. The package and class names do actually match.
I may have solved the problem "by accident", but I am not sure, since I still have to test properly. It appears that if I define the POF serialization in pof-context.xml, which is then referenced from my coherence-cache-config.xml, I should not set the tangosol.pof.enabled=true flag.
I was basically trying to turn off POF so that I could test something else in my code, but it appears that just by removing that flag from the cache-server.sh script (which I had added earlier), the problem disappeared and Coherence is still using POF.
What am I missing? Is this behaviour correct?
If you remove the tangosol.pof.enabled flag from the Java properties, then POF by default will not be used.
On the other hand if you still have POF configured for certain services with the <serializer> element referring to a PofContext implementation e.g. ConfigurablePofContext, then those services (and only those services) will use POF.
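For reference, a per-service serializer is declared with a <serializer> element inside the scheme in the cache configuration; a minimal sketch (scheme, service, and config file names here are illustrative, not taken from this thread) might look like:

```xml
<distributed-scheme>
  <scheme-name>example-pof-scheme</scheme-name>
  <service-name>ExamplePofService</service-name>
  <serializer>
    <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
    <init-params>
      <init-param>
        <param-type>string</param-type>
        <param-value>pof-context.xml</param-value>
      </init-param>
    </init-params>
  </serializer>
  <backing-map-scheme>
    <local-scheme/>
  </backing-map-scheme>
  <autostart>true</autostart>
</distributed-scheme>
```

Only services whose schemes carry such a <serializer> element will use POF; others fall back to the default serialization.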
Best regards,
Robert -
POF Serialization Issue (HashMap)
Hi.
It looks like the following usage produced an EOFException during Java deserialization.
I couldn't find any clear documentation on whether readMap (Java) should be paired with ReadDictionary (.NET) for POF, but by using the type pair Dictionary<String, Double> on the .NET side and HashMap<String, Double> in Java we started getting the error below.
Is it a wrong use? Should Hashtable be used instead? Or some other collection? Please advise.
Error: An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id={CUT}, Open=true, Member(Id=0, Timestamp=3981-11-15 06:40:05.166, Address=10.111.12.147:0, MachineId=0, Location={CUT}, Role=.NET RTC client), {CUT}): java.io.EOFException
at com.tangosol.io.nio.ByteBufferReadBuffer$ByteBufferInput.readByte(ByteBufferReadBuffer.java:340)
at com.tangosol.io.AbstractReadBuffer$AbstractBufferInput.readUnsignedByte(AbstractReadBuffer.java:435)
at com.tangosol.io.AbstractReadBuffer$AbstractBufferInput.readPackedInt(AbstractReadBuffer.java:560)
at com.tangosol.io.MultiBufferReadBuffer$MultiBufferInput.readPackedInt(MultiBufferReadBuffer.java:683)
at com.tangosol.io.pof.PofBufferReader.readAsUniformObject(PofBufferReader.java:3344)
at com.tangosol.io.pof.PofBufferReader.readMap(PofBufferReader.java:2537)
Java Pof where it occurs:
writer.writeMap(40, getMyDict());
setMyDict((HashMap<String, Double>)reader.readMap(40, new HashMap<String, Double>()));
public HashMap<String, Double> getMyDict() {
    return myDict;
}
public void setMyDict(HashMap<String, Double> MyDict) {
    this.myDict = MyDict;
}
.NET Pof:
writer.WriteDictionary<String, Double>(40, MyDict);
MyDict = ((Dictionary<String, Double>)reader.ReadDictionary<String, Double>(40, new Dictionary<String, Double>()));
public Dictionary<String, Double> MyDict
{
    get { return myDict; }
    set { myDict = value; }
}
Notes: If it helps, 40 is the last POF index on that object. The error appears sometimes, not constantly, depending on the data.
AntonZ wrote:
Hi.
It looks like the following use, produced EOFException error in java serialization.
I couldn't find any clear documentation regarding whether readMap (java) should be paired to readDictionary (.net) for pof, but currently by using a type pair of Dictionary<String, Double> in .NET side and HashMap<String, Double> in java we started getting the error below.
Is it a wrong use? Should Hashtable be used instead? Or some other collection? Please advise.
Error: An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id={CUT}, Open=true, Member(Id=0, Timestamp=3981-11-15 06:40:05.166, Address=10.111.12.147:0, MachineId=0, Location={CUT}, Role=.NET RTC client), {CUT}): java.io.EOFException
at com.tangosol.io.nio.ByteBufferReadBuffer$ByteBufferInput.readByte(ByteBufferReadBuffer.java:340)
at com.tangosol.io.AbstractReadBuffer$AbstractBufferInput.readUnsignedByte(AbstractReadBuffer.java:435)
at com.tangosol.io.AbstractReadBuffer$AbstractBufferInput.readPackedInt(AbstractReadBuffer.java:560)
at com.tangosol.io.MultiBufferReadBuffer$MultiBufferInput.readPackedInt(MultiBufferReadBuffer.java:683)
at com.tangosol.io.pof.PofBufferReader.readAsUniformObject(PofBufferReader.java:3344)
at com.tangosol.io.pof.PofBufferReader.readMap(PofBufferReader.java:2537)
Java Pof where it occurs:
writer.writeMap(40, getMyDict());
setMyDict((HashMap<String, Double>)reader.readMap(40, new HashMap<String, Double>()));
public HashMap<String, Double> getMyDict() {
return myDict;
public void setMyDict(HashMap<String, Double> MyDict) {
this.myDict = MyDict;
.NET Pof:
writer.WriteDictionary<String, Double>(40, MyDict);
MyDict = ((Dictionary<String, Double>)reader.ReadDictionary<String, Double>(40, new Dictionary<String, Double>()));
public Dictionary<String, Double> MyDict
get { return myDict; }
set { myDict = value; }
Notes: If it helps, 40 is the last pof index on that object. The error appears sometime, not constantly based on data.
How is the class containing the map serialized? Is it a PortableObject or do you use a PofSerializer for serializing it? If you use a PofSerializer, please verify that you did not forget the writeRemainder / readRemainder calls at the end of the serialize/deserialize methods.
Best regards,
Robert -
POF Serialization for Exceptions thrown
If any Java exceptions are thrown, they need to be shown to clients in some proper way.
How can these Java exceptions be made POF-enabled?
Is com.tangosol.io.pof.ThrowablePofSerializer meant for this?
I can have my custom exception class implement the PortableObject interface, but I wanted to know if there is some way to make existing Java exceptions POF-enabled, e.g. java.util.ConcurrentModificationException.
Edited by: Khangharoth on Jul 19, 2009 10:40 PM
Is com.tangosol.io.pof.ThrowablePofSerializer meant for this? Yes, and then we can have any Java Throwable POF-enabled.
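For completeness, registering a JDK exception with ThrowablePofSerializer in the pof-config would be a sketch along these lines (the type-id 2000 is arbitrary here and must not clash with your existing ids):

```xml
<user-type>
  <type-id>2000</type-id>
  <class-name>java.util.ConcurrentModificationException</class-name>
  <serializer>
    <class-name>com.tangosol.io.pof.ThrowablePofSerializer</class-name>
  </serializer>
</user-type>
```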
But there is a catch: "Any deserialized exception will lose type information, and simply be represented as a PortableException". -
POF serialization with replicated cache?
Sorry again for the newbie question.
Can you use POF serialized objects in a replicated cache?
All of the examples show POF serialized objects being used with a partitioned cache.
If you can do this, are there any caveats involved with the "replication" cache? I assume it would have to be started using the same configuration as the "master" cache.
Thanks Rob.
So, you just start up the Coherence instance on the (or at the) replication site, using the same configuration as the "master"? (Of course with appropriate classpaths and such set correctly)