Issue with POF serialization - Failure to deserialize an Invocable object: java.io.StreamCorruptedException

I am running into the following exception even after following all the guidelines for implementing POF. The main objective is to perform distributed bulk cache loading.
Oracle Coherence GE 3.7.1.10 <Error> (thread=Invocation:InvocationService, member=1): Failure to deserialize an Invocable object: java.io.StreamCorruptedException: unknown user type: 1001
java.io.StreamCorruptedException: unknown user type: 1001
  at com.tangosol.io.pof.PofBufferReader.readAsObject(PofBufferReader.java:3312)
  at com.tangosol.io.pof.PofBufferReader.readObject(PofBufferReader.java:2604)
  at com.tangosol.io.pof.ConfigurablePofContext.deserialize(ConfigurablePofContext.java:371)
  at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
  at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
  at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.InvocationService$InvocationRequest.read(InvocationService.CDB:8)
  at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:19)
  at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:31)
  at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
  at java.lang.Thread.run(Thread.java:662)
Following is the pof-config.xml
<?xml version="1.0" encoding="UTF-8" ?>
<pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
            xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config/1.1/coherence-pof-config.xsd">
  <user-type-list>
    <include>coherence-pof-config.xml</include>
    <user-type>
      <type-id>1001</type-id>
      <class-name>com.westgroup.coherence.bermuda.loader.DistributedLoaderAgent</class-name>
      <serializer>
        <class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>     
        <init-params>
          <init-param>
            <param-type>int</param-type>
            <param-value>{type-id}</param-value>
          </init-param>
          <init-param>
            <param-type>java.lang.Class</param-type>
            <param-value>{class}</param-value>
          </init-param>
          <init-param>
            <param-type>boolean</param-type>
            <param-value>true</param-value>
          </init-param>
        </init-params>
      </serializer>
    </user-type>
     <user-type>
      <type-id>1002</type-id>
      <class-name>com.westgroup.coherence.bermuda.profile.lpa.LPACacheProfile</class-name>
      <serializer>
        <class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>     
        <init-params>
          <init-param>
            <param-type>int</param-type>
            <param-value>{type-id}</param-value>
          </init-param>
          <init-param>
            <param-type>java.lang.Class</param-type>
            <param-value>{class}</param-value>
          </init-param>
          <init-param>
            <param-type>boolean</param-type>
            <param-value>true</param-value>
          </init-param>
        </init-params>
      </serializer>
    </user-type>
     <user-type>
      <type-id>1003</type-id>
      <class-name>com.westgroup.coherence.bermuda.profile.lpa.Address</class-name>
      <serializer>
        <class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>     
        <init-params>
          <init-param>
            <param-type>int</param-type>
            <param-value>{type-id}</param-value>
          </init-param>
          <init-param>
            <param-type>java.lang.Class</param-type>
            <param-value>{class}</param-value>
          </init-param>
          <init-param>
            <param-type>boolean</param-type>
            <param-value>true</param-value>
          </init-param>
        </init-params>
      </serializer>
    </user-type>
     <user-type>
      <type-id>1004</type-id>
      <class-name>com.westgroup.coherence.bermuda.profile.lpa.Discipline</class-name>
      <serializer>
        <class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>     
        <init-params>
          <init-param>
            <param-type>int</param-type>
            <param-value>{type-id}</param-value>
          </init-param>
          <init-param>
            <param-type>java.lang.Class</param-type>
            <param-value>{class}</param-value>
          </init-param>
          <init-param>
            <param-type>boolean</param-type>
            <param-value>true</param-value>
          </init-param>
        </init-params>
      </serializer>
    </user-type>
     <user-type>
      <type-id>1005</type-id>
      <class-name>com.westgroup.coherence.bermuda.profile.lpa.Employment</class-name>
      <serializer>
        <class-name>com.tangosol.io.pof.PofAnnotationSerializer</class-name>     
        <init-params>
          <init-param>
            <param-type>int</param-type>
            <param-value>{type-id}</param-value>
          </init-param>
          <init-param>
            <param-type>java.lang.Class</param-type>
            <param-value>{class}</param-value>
          </init-param>
          <init-param>
            <param-type>boolean</param-type>
            <param-value>true</param-value>
          </init-param>
        </init-params>
      </serializer>
    </user-type>
  </user-type-list>
  <allow-interfaces>true</allow-interfaces>
  <allow-subclasses>true</allow-subclasses>
</pof-config>
cache-config.xml
<cache-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xmlns="http://xmlns.oracle.com/coherence/coherence-cache-config"
  xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-cache-config http://xmlns.oracle.com/coherence/coherence-cache-config/1.1/coherence-cache-config.xsd">
  <defaults>
    <serializer>pof</serializer>
  </defaults>
  <caching-scheme-mapping>
    <cache-mapping>
      <cache-name>DistributedLPACache</cache-name>
      <scheme-name>LPANewCache</scheme-name>
      <init-params>
        <init-param>
          <param-name>back-size-limit</param-name>
          <param-value>250MB</param-value>
        </init-param>
      </init-params>
    </cache-mapping>
  </caching-scheme-mapping>
  <caching-schemes>
    <!-- Distributed caching scheme. -->
    <distributed-scheme>
      <scheme-name>LPANewCache</scheme-name>
      <service-name>HBaseLPACache</service-name>
      <serializer>
        <instance>
          <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
          <init-params>
            <init-param>
              <param-type>java.lang.String</param-type>
              <param-value>pof-config.xml</param-value>
            </init-param>
          </init-params>
        </instance>
      </serializer>
      <backing-map-scheme>
        <read-write-backing-map-scheme>
          <internal-cache-scheme>
            <class-scheme>
              <class-name>com.tangosol.util.ObservableHashMap</class-name>
            </class-scheme>
          </internal-cache-scheme>
          <cachestore-scheme>
            <class-scheme>
              <class-name>com.westgroup.coherence.bermuda.profile.lpa.LPACacheProfile</class-name>
            </class-scheme>
          </cachestore-scheme>
          <read-only>false</read-only>
          <write-delay-seconds>0</write-delay-seconds>
        </read-write-backing-map-scheme>
      </backing-map-scheme>
      <autostart>true</autostart>
    </distributed-scheme>
    <invocation-scheme>
      <scheme-name>InvocationService</scheme-name>
      <service-name>InvocationService</service-name>
      <thread-count>5</thread-count>
      <autostart>true</autostart>
    </invocation-scheme>
  </caching-schemes>
</cache-config>
DistributedLoaderAgent (user type 1001)
import java.io.IOException;
import java.io.Serializable;
import java.lang.annotation.Annotation;

import org.apache.log4j.Logger;

import com.tangosol.io.pof.PofReader;
import com.tangosol.io.pof.PofWriter;
import com.tangosol.io.pof.PortableObject;
import com.tangosol.io.pof.annotation.Portable;
import com.tangosol.io.pof.annotation.PortableProperty;
import com.tangosol.net.AbstractInvocable;
import com.tangosol.net.InvocationService;

@Portable
public class DistributedLoaderAgent extends AbstractInvocable implements PortableObject {

    private static final long serialVersionUID = 10L;

    private static Logger m_logger = Logger.getLogger(DistributedLoaderAgent.class);

    @PortableProperty(0)
    public String partDumpFileName = null;

    public String getPartDumpFileName() {
        return partDumpFileName;
    }

    public void setPartDumpFileName(String partDumpFileName) {
        this.partDumpFileName = partDumpFileName;
    }

    public DistributedLoaderAgent() {
        super();
        m_logger.debug("Configuring this loader ");
    }

    public DistributedLoaderAgent(String partDumpFile) {
        super();
        m_logger.debug("Configuring this loader to load dump file " + partDumpFile);
        partDumpFileName = partDumpFile;
    }

    @Override
    public void init(InvocationService service) {
        super.init(service);
    }

    @Override
    public void run() {
        try {
            m_logger.debug("Invoked DistributedLoaderAgent");
            MetadataTranslatorService service = new MetadataTranslatorService(false, "LPA");
            m_logger.debug("Invoking service.loadLPACache");
            service.loadLPACache(partDumpFileName);
        } catch (Exception e) {
            m_logger.debug("Exception in DistributedLoaderAgent " + e.getMessage());
        }
    }

    @Override
    public void readExternal(PofReader arg0) throws IOException {
        setPartDumpFileName(arg0.readString(0));
    }

    @Override
    public void writeExternal(PofWriter arg0) throws IOException {
        arg0.writeString(0, getPartDumpFileName());
    }
}
Please assist.

OK I have two suggestions.
1. Always create and flush the ObjectOutputStream before creating the ObjectInputStream.
2. Always close the output before you close the input. Actually, once you close the output stream, both the input stream and the socket are closed anyway, so you can economize on this code. In the above you have out.writeObject() followed by input.close() followed by out.close(). Change this to out.writeObject() followed by out.close(). It may be that something needed flushing and the input.close() prevented the flush from happening.
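For example, here is a minimal sketch of that ordering over a plain java.io socket round trip (the class, method, and variable names are illustrative, not taken from the original code):

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.net.Socket;

public class StreamOrderingSketch {

    // Hypothetical request/response exchange over an already-connected socket.
    static Object sendAndReceive(Socket socket, Object request) throws IOException, ClassNotFoundException {
        // 1. Create and flush the ObjectOutputStream first, so its stream header
        //    is on the wire before the peer constructs its ObjectInputStream.
        ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream());
        out.flush();

        // 2. Only then create the ObjectInputStream.
        ObjectInputStream in = new ObjectInputStream(socket.getInputStream());

        out.writeObject(request);
        out.flush();
        Object reply = in.readObject();

        // 3. Close the output last (and only the output): closing it also closes
        //    the underlying socket and therefore the input stream.
        out.close();
        return reply;
    }
}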

Similar Messages

  • Strange issue with POF: byte array with the value 94

    This is a somewhat strange issue we’ve managed to reduce to this test case. We’ve also seen similar issues with chars and shorts as well. It’s only a problem if the byte value inside the byte array is equal to 94! A value of 93, 95, etc, seems to be ok.
    Given the below class, the byte values both in the array and the single byte value are wrong when deserializing. The value inside the byte array isn’t what we put in (get [75] instead of [94]) and the single byte value is null (not 114).
    Pof object code:
    package com.test;
    import java.io.IOException;
    import java.util.Arrays;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;
     public class PofObject1 implements PortableObject {

          private byte[] byteArray;
          private byte byteValue;

          public void setValues() {
               byteArray = new byte[] {94};
               byteValue = 114;
          }

          @Override
          public void readExternal(PofReader reader) throws IOException {
               Object byteArray = reader.readObjectArray(0, null);
               Object byteValue = reader.readObject(1);
               System.out.println(Arrays.toString((Object[]) byteArray));
               System.out.println(byteValue);
               if (byteValue == null) throw new IOException("byteValue is null!");
          }

          @Override
          public void writeExternal(PofWriter writer) throws IOException {
               writer.writeObject(0, byteArray);
               writer.writeObject(1, byteValue);
          }
     }
    Using writer.writeObjectArray(0, byteArray); instead of writer.writeObject(0, byteArray); doesn't help. In this case byteArray would be of type Object[] (as accessed through reflection).
    This is simply put in to a distributed cache and then fetched back. No EPs, listeners or stuff like that involved:
          public static void main(String... args) throws Exception {
               NamedCache cache = CacheFactory.getCache("my-cache");
               PofObject1 o = new PofObject1();
               o.setValues();
               cache.put("key1", o);
               cache.get("key1");
          }
    Only tried it with Coherence 3.7.1.3.
    Cache config file:
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
    <cache-config>
         <caching-scheme-mapping>
              <cache-mapping>
                   <cache-name>my-cache</cache-name>
                   <scheme-name>my-cache</scheme-name>
              </cache-mapping>
         </caching-scheme-mapping>
         <caching-schemes>
              <distributed-scheme>
                   <scheme-name>my-cache</scheme-name>
                   <service-name>my-cache</service-name>
                   <serializer>
                        <class-name>
                             com.tangosol.io.pof.ConfigurablePofContext
                        </class-name>
                        <init-params>
                             <init-param>
                                  <param-type>string</param-type>
                                  <param-value>pof-config.xml</param-value>
                             </init-param>
                        </init-params>
                   </serializer>
                   <lease-granularity>thread</lease-granularity>
                   <thread-count>10</thread-count>
                   <backing-map-scheme>
                        <local-scheme>
                        </local-scheme>
                   </backing-map-scheme>
                   <autostart>true</autostart>
              </distributed-scheme>
         </caching-schemes>
    </cache-config>
    POF config file:
    <?xml version="1.0"?>
    <pof-config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xmlns="http://xmlns.oracle.com/coherence/coherence-pof-config"
         xsi:schemaLocation="http://xmlns.oracle.com/coherence/coherence-pof-config coherence-pof-config.xsd">
         <user-type-list>
              <!-- coherence POF user types -->
              <include>coherence-pof-config.xml</include>
              <user-type>
                   <type-id>1460</type-id>
                   <class-name>com.test.PofObject1</class-name>
              </user-type>
         </user-type-list>
    </pof-config>

    Hi,
     POF uses certain byte values as an optimization to represent well-known values of certain object types - e.g. boolean true and false, some very small numbers, null, etc. When you do readObject/writeObject instead of using the correct type-specific method, I suspect POF gets confused over the type and value that the field should be.
     There are a number of cases where POF does not know what the type is. Numbers are one of these: for example, if I stored a long with the value 10, on deserialization POF would not know whether it was an int, long, double, etc., so you have to use the correct method to get it back. Collections are another: if you serialize a Set, all POF knows is that you have serialized some sort of Collection, so unless you are specific when deserializing you will get back a List.
    JK
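     To make that concrete, here is a minimal sketch of the fix JK describes, applied to the PofObject1 fields above (the class name PofObject1Fixed is illustrative): use the type-specific PofReader/PofWriter methods instead of readObject/writeObject for primitives and primitive arrays.

     import java.io.IOException;

     import com.tangosol.io.pof.PofReader;
     import com.tangosol.io.pof.PofWriter;
     import com.tangosol.io.pof.PortableObject;

     public class PofObject1Fixed implements PortableObject {

          private byte[] byteArray;
          private byte byteValue;

          @Override
          public void readExternal(PofReader reader) throws IOException {
               byteArray = reader.readByteArray(0);   // instead of readObjectArray(0, null)
               byteValue = reader.readByte(1);        // instead of readObject(1)
          }

          @Override
          public void writeExternal(PofWriter writer) throws IOException {
               writer.writeByteArray(0, byteArray);   // instead of writeObject(0, byteArray)
               writer.writeByte(1, byteValue);        // instead of writeObject(1, byteValue)
          }
     }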

  • Unknown user type with POF serialization

    Hi all,
    I'm using 3.6 and am just starting to implement POF.  In general it has been pretty easy but I seem to have a problem with my near scheme and POF.  Things work ok in my unit tests, but it doesn't work when I deploy to a single instance of WebLogic 12 on my laptop.  Here is an example scheme:
    <near-scheme>
      <scheme-name>prod-near</scheme-name>
      <autostart>true</autostart>
      <front-scheme>
        <local-scheme>
          <high-units>{high-units 2000}</high-units>
          <expiry-delay>{expiry-delay 2h}</expiry-delay>
        </local-scheme>
      </front-scheme>
      <back-scheme>
        <distributed-scheme>
          <backing-map-scheme>
            <local-scheme>
              <high-units>{high-units 10000}</high-units>
              <expiry-delay>{expiry-delay 2h}</expiry-delay>
            </local-scheme>
          </backing-map-scheme>
          <serializer>
            <instance>
              <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
              <init-params>
                <init-param>
                  <param-value>/Bus/pof-config.xml</param-value>
                  <param-value>String</param-value>
                </init-param>
              </init-params>
            </instance>
          </serializer>
        </distributed-scheme>
      </back-scheme>
    </near-scheme>
    I don't know if it matter, but some of my caches use another scheme that references this one as a parent:
    <near-scheme>
      <scheme-name>daily-near</scheme-name>
      <scheme-ref>prod-near</scheme-ref>
      <autostart>true</autostart>
      <back-scheme>
        <distributed-scheme>
          <backing-map-scheme>
            <local-scheme>
              <high-units system-property="daily-near-high-units">{high-units 10000}</high-units>
              <expiry-delay>{expiry-delay 1d}</expiry-delay>
            </local-scheme>
          </backing-map-scheme>
          <serializer>
            <instance>
              <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
              <init-params>
                <init-param>
                  <param-value>/Bus/pof-config.xml</param-value>
                  <param-value>String</param-value>
                </init-param>
              </init-params>
            </instance>
          </serializer>
        </distributed-scheme>
      </back-scheme>
    </near-scheme>
    Those schemes have existed for years.  I'm only now adding the serializers.  I use this same cache config file in my unit tests, as well as the same pof config file.  My unit tests do ExternalizableHelper.toBinary(o, pofContext) and ExternalizableHelper.fromBinary(b, pofContext).  I create the test pof context by doing new ConfigurablePofContext("/Bus/pof-config.xml").  I've also tried actually putting and getting an object to and from a cache in my unit tests.  Everything works as expected.
    My type definition looks like this:
    <user-type>
      <type-id>1016</type-id>
      <class-name>com.mycompany.mydepartment.bus.service.role.RoleResource</class-name>
    </user-type>
    I'm not using the tangosol.pof.enabled system property because I don't think it's necessary with the explicit serializers.
    Here is part of a stack trace:
    (Wrapped) java.io.IOException: unknown user type: com.mycompany.mydepartment.bus.service.role.RoleResource
        at com.tangosol.util.ExternalizableHelper.toBinary(ExternalizableHelper.java:214)
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ConverterValueToBinary.convert(PartitionedCache.CDB:3)
        at com.tangosol.util.ConverterCollections$ConverterCacheMap.put(ConverterCollections.java:2486)
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.put(PartitionedCache.CDB:1)
        at com.tangosol.coherence.component.util.SafeNamedCache.put(SafeNamedCache.CDB:1)
        at com.tangosol.net.cache.CachingMap.put(CachingMap.java:943)
        at com.tangosol.net.cache.CachingMap.put(CachingMap.java:902)
        at com.tangosol.net.cache.CachingMap.put(CachingMap.java:814)
    Any idea what I'm missing?
    Thanks
    John

    SR-APX wrote:
    Aleks
    thanks for your response.
     However the include property needs to be present inside the <user-type-list> tag:
     <user-type-list> <include>coherence-pof-config.xml</include> </user-type-list>
     For all other interested users, the following property, tangosol.pof.enabled=true, must also be set for POF serialization to work correctly.
     Thanks again...
     Shamsur

     Hi Shamsur,
     it is not mandatory to use tangosol.pof.enabled=true; you can alternatively specify the serializer for the clustered services to be configured for POF (not necessarily all of them) on a service-by-service basis in the cache configuration file explicitly with the following element:
     <serializer>com.tangosol.io.pof.ConfigurablePofContext</serializer>
     Best regards,
     Robert

  • Issue with ICE Online Protocol for Content Push- KM objects

    Hi All,
    I am trying to Push the content from one Portal to another Portal using ICE Online Protocol.
    While pushing I want to transfer each Object (Image, XML file, Documents,etc..etc..)with the same GUID as it is in Souce to Target Portal.
    Kindly help me how to maintain the same GUID in Target portal using ICE Online method, with the value in Source Portal.
    Thanks in Advance..
    Regards||
    Ashok M.

    We can't maintain the same GUID's when we use ICE Online method.

  • Issues with using ConfigurablePofContext Serializer

    I am trying to use the ConfigurablePofContext serializer but am getting an exception:
    2010-02-02 16:34:33.861/3324.349 Oracle Coherence GE 3.5.2/463 <Error> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id=null, Open=true, LocalAddress=10.153.233.224:9099, RemoteAddress=10.153.233.224:1822): java.lang.ClassCastException: com.tangosol.io.pof.PortableException cannot be cast to com.tangosol.util.UUID
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer$MessageFactory$OpenConnectionRequest.readExternal(Peer.CDB:6)
        at com.tangosol.coherence.component.net.extend.Codec.decode(Codec.CDB:29)
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.decodeMessage(Peer.CDB:25)
        at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:47)
        at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
        at java.lang.Thread.run(Thread.java:619)
    Server coherence-cache-config:
     <cache-config>
       <caching-scheme-mapping>
         <cache-mapping>
           <cache-name>dist-*</cache-name>
           <scheme-name>dist-default</scheme-name>
         </cache-mapping>
       </caching-scheme-mapping>
       <caching-schemes>
         <distributed-scheme>
           <scheme-name>dist-default</scheme-name>
           <serializer>
             <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
           </serializer>
           <lease-granularity>member</lease-granularity>
           <backing-map-scheme>
             <local-scheme/>
           </backing-map-scheme>
           <autostart>true</autostart>
         </distributed-scheme>
         <proxy-scheme>
           <service-name>ExtendTcpProxyService</service-name>
           <thread-count>5</thread-count>
           <acceptor-config>
             <tcp-acceptor>
               <local-address>
                 <address>localhost</address>
                 <port>9099</port>
               </local-address>
             </tcp-acceptor>
             <serializer>
               <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
             </serializer>
           </acceptor-config>
           <autostart>true</autostart>
         </proxy-scheme>
       </caching-schemes>
     </cache-config>
    Client Coherence cache config:
     <?xml version="1.0"?>
     <!DOCTYPE cache-config SYSTEM "cache-config.dtd">
     <cache-config>
       <caching-scheme-mapping>
         <cache-mapping>
           <cache-name>dist-extend</cache-name>
           <scheme-name>extend-dist</scheme-name>
         </cache-mapping>
         <cache-mapping>
           <cache-name>dist-extend-near</cache-name>
           <scheme-name>extend-near</scheme-name>
         </cache-mapping>
       </caching-scheme-mapping>
       <caching-schemes>
         <near-scheme>
           <scheme-name>extend-near</scheme-name>
           <front-scheme>
             <local-scheme>
               <high-units>1000</high-units>
             </local-scheme>
           </front-scheme>
           <back-scheme>
             <remote-cache-scheme>
               <scheme-ref>extend-dist</scheme-ref>
             </remote-cache-scheme>
           </back-scheme>
           <invalidation-strategy>all</invalidation-strategy>
         </near-scheme>
         <remote-cache-scheme>
           <scheme-name>extend-dist</scheme-name>
           <service-name>ExtendTcpCacheService</service-name>
           <initiator-config>
             <tcp-initiator>
               <remote-addresses>
                 <socket-address>
                   <address>localhost</address>
                   <port>9099</port>
                 </socket-address>
               </remote-addresses>
               <connect-timeout>10s</connect-timeout>
             </tcp-initiator>
             <outgoing-message-handler>
               <request-timeout>5s</request-timeout>
             </outgoing-message-handler>
           </initiator-config>
         </remote-cache-scheme>
         <remote-invocation-scheme>
           <scheme-name>extend-invocation</scheme-name>
           <service-name>ExtendTcpInvocationService</service-name>
           <initiator-config>
             <tcp-initiator>
               <remote-addresses>
                 <socket-address>
                   <address>localhost</address>
                   <port>9099</port>
                 </socket-address>
               </remote-addresses>
               <connect-timeout>10s</connect-timeout>
             </tcp-initiator>
             <outgoing-message-handler>
               <request-timeout>5s</request-timeout>
             </outgoing-message-handler>
             <serializer>
               <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
             </serializer>
           </initiator-config>
         </remote-invocation-scheme>
       </caching-schemes>
     </cache-config>
    client side code to put value in cache:
     NamedCache cache = CacheFactory.getCache("dist-extend");
     String testVal = (String) cache.get("test");
     System.out.println("Value in coherence cache::" + testVal);
     cache.put("test", "coherece");
     System.out.println("Getting cache value from coherence");
    any idea?

    Thanks for your response Bob.
     I added the POF serializer to ExtendTcpCacheService and now I am not getting an error when using put/get methods. I am getting an exception when I try to execute the InvocationService code:
     InvocationService service = (InvocationService) CacheFactory.getConfigurableCacheFactory().ensureService("ExtendTcpInvocationService");
     Map map = service.query(new AbstractInvocable() {
          public void run() {
               setResult(CacheFactory.getCache("dist-extend").get("test"));
          }
     }, null);
     testVal = (String) map.get(service.getCluster().getLocalMember());
     System.out.println("Value in coherence cache::" + testVal);
     Exception:
     2010-02-05 16:39:11.021/460.104 Oracle Coherence GE 3.5.2/463 <Error> (thread=[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)', member=n/a): An exception occurred while encoding a InvocationRequest for Service=ExtendTcpInvocationService:TcpInitiator: java.lang.IllegalArgumentException: unknown user type: portlets.announcements.AnnouncementsController$1
         at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:400)
         at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:389)
         at com.tangosol.coherence.component.net.extend.Channel.getUserTypeIdentifier(Channel.CDB:7)
         at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1432)
         at com.tangosol.io.pof.PofBufferWriter$UserTypeWriter.writeObject(PofBufferWriter.java:2092)
         at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.writeExternal(InvocationServiceFactory.CDB:3)
    Do I have to do some POF Serialization related coding for calling InvocationService? I am just setting/getting String objects. My understanding is we need additional coding only for custom objects.
    Sorry if these questions are very basic. I didn't find any document on using InvocationService with POF Serialization.
    It will be great if someone can provide link to any documentation related to the same.

  • Issues with CS6 and Adobe Application Manager

    When trying to open an Adobe CS6 application, a crashing loop begins. The AAM opens and closes once per second in the Dock, along with the application I was attempting to open.  Because of the constant open and closing of these apps, it's almost impossible to get the mouse or keyboard to focus for enough time to do a force quit on either application, and a hard reboot is usually required to break out of the cycle. I had the entire suite uninstalled and reinstalled, but several weeks later, the issue resurfaced.

    There was an issue with the serialization file created.
    Creating a new serialization file via AAMEE and running it resolved the issue.
    Create a serialization file via AAMEE: http://wwwimages.adobe.com/www.adobe.com/content/dam/Adobe/en/devnet/creativesuite/pdfs/AdobeApplicationManagerEnterpriseEditionDeploymentGuide_v_3_1.pdf
    And then run it via command line.
    Here are the steps:
    AdobeSerialization (the serialization file created using AAMEE 3.1) is not a double-clickable file. You need to run it via the command line to serialize the products. The command line is:
    AdobeSerialization --tool=VolumeSerialize --stream --provfile=”Absolute_Path_of_prov.xml”
    where
    --tool=VolumeSerialize:  specifies that the tool needs to process the prov.xml file to serialize.
    --provfile: specifies the absolute file path to the prov.xml file.
    By default, the tool looks for the prov.xml file in the same directory as the executable file and so
    typically you do not need to specify this option. (Specify this option only if the prov.xml file is in some
    other location.)
    Make sure you invoke command prompt with Admin privileges. Return code of zero means success.
    Tiffany can you please confirm the same?
    -Pragya

  • Issues with IFrame and calling an application outside the IFrame

    Hi
    I have an issue which is really turning my head 360 degrees in a "dead loop":)
    I have a WebDynpro application, one component and three views in a viewset.
    The first view will always be placed in column one and depending on some action the two other views should be loaded one by one. So in the main view the users will select three parameters from three dropdownboxes and then a call with the given parameters to an html file in KM - which then again is passing the parameters to a swf file which is presenting an Business Objects dashboard. The BO Dashboard is presented in an iFrame.
    The users selects an element in the Dashboard and is then again calling the webdynpro application with four parameters, the three first from the dropdownboxes, but also a new one which is used to call the third view. I am using NavigationTarget in the URL to load the application at the given position in the portal. And the logic is working, BUT the portal is presented in the IFram of the first View.
    So my question is - How to navigate from the IFrame and "back into the portal"????
    Regards
    Kay-Arne

  • K8N neo2 raid0 issues... random failure

    I'm running 2x80 hitachi sata II in raid0 on sata 3 and 4.
    Although everything works fine most of the time, sometimes (rarely) when I turn the pc on only one disk is recognized by the bios, causing a blue screen in windows boot (of course). I push the reset button and all becomes fine then. All the data is still there.
    I've checked both disks for surface errors, they're both ok.
    Nvraid bios is 4.81.
    Does anybody else have this issue with occasional raid0 failure? This is really strange... I don't believe it's due to the disks.

    I hope you like my signature now, lol... :-)
    Well, the PSU is a quality one, I guess, and the two hitachi drives aren't 1 month old yet. I already scanned them several times and found no problems at all.
    Also, all my disks have a HDD cooler.
    It could really be a power cable issue, since there's really a lot of cables inside... but I only have two sata power cables, and they're both powering the two disks... :-/ Besides, the SATA power cables are independent of all the other cables (that is, nothing else is connected to the wires where they are connected).
    The only thing I did today was adding an HDD cooler for my Seagate IDE drive (the only one that didn't have one). After powering the pc up, one of the Hitachi drives wasn't detected; I just reset the pc (didn't turn it off) and it was fine... Maybe my system is soooo lightning fast that the electricity wasn't fast enough to reach the 2nd drive on time... bahaha.

  • Weblogic 10.3.0 issues with remote object calls.

    All:
    I was wondering if anyone has experienced any issues with Weblogic 10.3.0 dropping initial remote object calls over AMF Secure Channel. Here are the issues we are experiencing.
    1.     FLEX applications fail consistently on the first remote object call made across the AMF Secure Channel, resulting in the request not returning from the application server, which has had varying effects on the different applications, including missing data, application freezes and a general degrading of the user experience.
    2.     FLEX applications require a browser/application refresh once the application has been inactive for a certain period of time. In our experiences the behavior occurs after 30 minutes of inactivity.
    I've deployed this same code to Weblogic 10.3.3 and the behaviors go away. Are there any patches to 10.3.0 that might take care of this issue that we are not aware of?
    Thanks for you help,
    Mike

    Hello,
    I found the problem. But I needed to change the target of all my datasources until I discovered that one of my datasources didn't answer and no error was triggered.
    My server was waiting for this datasource and did not get started.

  • URL issue with Windows 7 Internet Explorer 11 and SAP NW PI 7.

    Hello SAP community,
    I am facing a strange issue with Windows 7 - Internet Explorer 11 and SAP NW java services (I hope I am in the correct discussion) ...
    When I try to open URL http://sapserver.hosters-name:port it is working so far with Windows 7 and Internet Explorer 11.
    But when I then click on NWA (for example), I am getting "http 500 server error".
    I don't have the issue, when I start-up a VMware Workstation with Windows XP and Internet Explorer 8.
    But now the strange part: In our network DNS (Domain Name Service), we can also open the URL by http://sapserver.our-dns:port.
    Then it is working without problems with Windows 7 and Internet Explorer.
    I would agree with the comment that it is just a Windows 7/Internet Explorer 11 issue if it didn't work for both URLs (http://sapserver.hosters-name:port = problem URL; http://sapserver.our-dns:port = working).
    We are using a SAP NW PI 7.x system. Some URLs for PI are using a http://sapserver.hosters-name:port/java service.
    Do you have any hints on what is causing the issue? I suspect something with Internet Explorer and Firefox (because with Google Chrome at least the URL for NWA opens, apart from the issue that Google Chrome can't display the NWA content).
    Thanks for your help.
    Best regards
    Carlos

    Hi all,
    I think I found the solution.
    The reason seems to be that the domain name for http://sapserver.our-dns:port, "our-dns", was already added to the compatibility view settings of Microsoft Internet Explorer.
    That seemed to be the reason why it was working with http://sapserver.our-dns:port.
    When I added the domain "hosters-name" for http://sapserver.hosters-name:port to the compatibility view settings, it started working as well.
    Internet Explorer Options => Settings for compatibility view => Add domain
    Best regards
    Carlos

  • CVC creation - Strange issue with Master data table of 9AMATNR

    Hi Experts,
    We have encountered a strange issue with Master data table (/BI0/9APMATNR) of info object 9AMATNR.
    We have a BADI implemented for checking the valid characteristic before creation of the CVC using transaction /SAPAPO/MC62. This BADI puts a select on the master data table of material /BI0/9APMATNR and returns no value. But the material actually exists in the table (checked through SE16).
    Now we go inside the info object 9AMATNR and go to the Master data Tab. There we go inside the master table
    /BI0/9APMATNR and activate that. After activating the table it is read by the select statement inside BADI (Strange) and allows the CVC to be created.
    Ideally it should not allow us to activate the SAP standard table /BI0/9APMATNR. I observed that in technical settings of this table it has single record buffering as switched on. (But as per my knowledge buffer gets refreshed every 2 to 4 mins and not in 2 days or something).
    Your expert comment is valuable to us. Thanks.
    Best Regards,
    Chandan Dubey

    Hi Chandan,
                 Try to use a WAIT statement of 5 seconds before your select statement.
    I'm not sure whether this will work. Anyway check it and let me know the result.
    Regards,
    Siva.

  • SSL communication issue with JDK 1.6.0_19

    Hi,
    I am facing an issue with JDK 1.6.0_19. I have a Java client which communicates with the server over SSL. It is able to communicate properly with JDK <= 1.6.0_18, but with JDK 1.6.0_19 I get the exception javax.net.ssl.SSLException: HelloRequest followed by an unexpected handshake message when the client tries to communicate with the server.
    We are using mutual authentication. The client and the server both have signed certificates. The client certificate has to be validated by the server to establish the connection.
    I have seen in the forum that it is a renegotiation issue. So, if I enable the renegotiation flag with -Dsun.security.ssl.allowUnsafeRenegotiation=true it works fine. But enabling renegotiation itself is a vulnerability, so I can't enable renegotiation.
    I am using httpclient 4.0 and JSSE on the client side and IIS on the server side for this SSL connection.
    I am not sure which side, client or server, is initiating the renegotiation.
    Please help me out.
    I have tried the OpenSSL command from the console.
    The command is: openssl s_client -connect X.X.X:443 -CAfile "xxxxx" -cert "xxxxxxxx" -key "xxxxxxxxxx" -state -verify 20
    Here is the output:
    Loading 'screen' into random state - done
    CONNECTED(00000748)
    SSL_connect:before/connect initialization
    SSL_connect:SSLv2/v3 write client hello A
    SSL_connect:SSLv3 read server hello A
    xxxxxxxxxxx.................
    verify return:1
    xxxxxxxxxxx.................
    verify return:1
    SSL_connect:SSLv3 read server certificate A
    SSL_connect:SSLv3 read server done A
    SSL_connect:SSLv3 write client key exchange A
    SSL_connect:SSLv3 write change cipher spec A
    SSL_connect:SSLv3 write finished A
    SSL_connect:SSLv3 flush data
    SSL_connect:SSLv3 read finished A
    Certificate chain
    xxxxxxxxxxx.................
    Server certificate
    -----BEGIN CERTIFICATE-----
    xxxxxxxxxxx.................
    -----END CERTIFICATE-----
    xxxxxxxxxxx.................
    No client certificate CA names sent
    SSL handshake has read 1839 bytes and written 392 bytes
    New, TLSv1/SSLv3, Cipher is RC4-MD5
    Server public key is 1024 bit
    Secure Renegotiation IS NOT supported
    Compression: NONE
    Expansion: NONE
    SSL-Session:
        Protocol  : TLSv1
        Cipher    : RC4-MD5
        Session-ID: xxxxxxxxxxx
        Session-ID-ctx:
        Master-Key: xxxxxxxxxxx
        Key-Arg   : None
        PSK identity: None
        PSK identity hint: None
        Start Time: 1275564626
        Timeout   : 300 (sec)
        Verify return code: 0 (ok)
    read:errno=10054
    If you look at the console output, you can see that two statements are missing:
    SSL_connect:SSLv3 read server certificate request A
    SSL_connect:SSLv3 write client certificate A
    So, I would like to know whether this is any clue as to which side is asking for renegotiation.

    Thank you for your response.
    Yes, I have set the property SSLAlwaysNegoClientCert to True and it is able to establish the SSL connection without initiating renegotiation from the IIS server side. The property has to be set in the metabase.xml file.
    Thank you very much once again.
    Edited by: arpitak on Jun 23, 2010 2:10 AM

  • Issues with games, video card issue?

    Hi,
    I have been having issues with games lately. Mostly they are hidden object games from Big Fish. They start up, then minimize; it doesn't crash completely, it just minimizes. I click on it again and it works for a few seconds, then minimizes itself again and again. I am unable to play the games because of the constant minimizing. Could this be an issue with my video card? I have attempted updating it but it doesn't seem to help. Any ideas?

    Ok OK, Update...
    The games work perfectly on my father's HP which runs Windows 7... so it definitely is NOT because of outdated stuff.... I really don't know what else to do at this point.... no suggestions from anyone?!?!?!

  • Pof Serialization issue

    Hi guys,
    I get an "unknown user type issue" for only one class defined in my pof-config-file. The rest are ok which is the part I can't get my head round.
    Log (showing config load)
    Everything seems to load ok. In fact, just to confirm, I messed up some tags in the xml file and I got problems. So it's finding the file ok.
    2010-06-30 11:35:11.965/1.212 Oracle Coherence GE 3.5.3/465 <D5> (thread=Cluster, member=n/a): Member 1 joined Service InvocationService with senior member 1
    2010-06-30 11:35:12.007/1.255 Oracle Coherence GE 3.5.3/465 <D5> (thread=Invocation:Management, member=2): Service Management joined the cluster with senior service member 1
    2010-06-30 11:35:12.267/1.514 Oracle Coherence GE 3.5.3/465 <Info> (thread=Cluster, member=2): Loading POF configuration from resource "file:/mnt/linux-share/zeus/Zeus-core/config/zeus-pof-config.xml"
    2010-06-30 11:35:12.268/1.515 Oracle Coherence GE 3.5.3/465 <Info> (thread=Cluster, member=2): Loading POF configuration from resource "jar:file:/mnt/linux-share/repository/com/tangsol/coherence/3.5.3/coherence-3.5.3.jar!/coherence-pof-config.xml"
    2010-06-30 11:35:12.283/1.530 Oracle Coherence GE 3.5.3/465 <D5> (thread=DistributedCache, member=2): Service DistributedCache joined the cluster with senior service member 1
    2010-06-30 11:35:12.338/1.585 Oracle Coherence GE 3.5.3/465 <D5> (thread=DistributedCache, member=2): Service DistributedCache: received ServiceConfigSync containing 2040 entries
    Exception
    Exception in thread "main" java.lang.IllegalArgumentException: unknown user type: org.zeus.query.QueryInvocable
    at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:400)
    at com.tangosol.io.pof.ConfigurablePofContext.getUserTypeIdentifier(ConfigurablePofContext.java:389)
    at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1432)
    at com.tangosol.io.pof.ConfigurablePofContext.serialize(ConfigurablePofContext.java:338)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.writeObject(Service.CDB:4)
    at com.tangosol.coherence.component.net.Message.writeObject(Message.CDB:1)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.InvocationService$InvocationRequest.write(InvocationService.CDB:3)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.packetProcessor.PacketPublisher.packetizeMessage(PacketPublisher.CDB:137)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.packetProcessor.PacketPublisher$InQueue.add(PacketPublisher.CDB:8)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.dispatchMessage(Grid.CDB:50)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.post(Grid.CDB:35)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.send(Grid.CDB:1)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.InvocationService.execute(InvocationService.CDB:31)
    at com.tangosol.coherence.component.util.safeService.SafeInvocationService.execute(SafeInvocationService.CDB:1)
    at org.ebtic.bpm.zeus.query.DistributedQueryProcessor.invokeQuery(DistributedQueryProcessor.java:133)
    at org.ebtic.bpm.zeus.query.DistributedQueryProcessor.query(DistributedQueryProcessor.java:95)
    at org.ebtic.bpm.zeus.query.DistributedQueryProcessor.main(DistributedQueryProcessor.java:56)
    My pof-config file looks like this:
     <!DOCTYPE pof-config SYSTEM "pof-config.dtd">
     <pof-config>
       <user-type-list>
         <include>coherence-pof-config.xml</include>
         <user-type>
           <type-id>10001</type-id>
           <class-name>org.util.ZeusKey</class-name>
         </user-type>
         <user-type>
           <type-id>10002</type-id>
           <clas-name>org.query.QueryInvocable</clas-name>
         </user-type>
         <user-type>
           <type-id>10003</type-id>
           <class-name>org.sequencegenerator.ZeusSequenceGenerator$State</class-name>
         </user-type>
       </user-type-list>
       <allow-interfaces>true</allow-interfaces>
       <allow-subclasses>true</allow-subclasses>
     </pof-config>
    Only the class QueryInvocable is causing a problem. ZeusKey and SequenceGenerator are working perfectly well.
    QueryInvocable looks like the following:
     public class QueryInvocable extends AbstractInvocable
               implements ExternalizableLite, PortableObject {

          private String m_sCacheName;
          private PartitionSet partitionSet;

          public QueryInvocable() {
          }

          public QueryInvocable(String sCacheName, PartitionSet partitions) {
               m_sCacheName = sCacheName;
               partitionSet = partitions;
          }

          public PartitionSet getPartitionSet() {
               return partitionSet;
          }

          public String getCacheName() {
               return m_sCacheName;
          }

          public void setPartitionSet(PartitionSet pSet) {
               this.partitionSet = pSet;
          }

          public void setCacheName(String name) {
               this.m_sCacheName = name;
          }

          /**
           * {@inheritDoc}
           */
          public void run() {
               try {
                    ZeusQueryProcessor client = new ZeusQueryProcessor("Distributed Query Processor", partitionSet);
                    client.process();
               }
               catch (Exception e) {
                    System.err.println("Exception creating ZeusQueryProcessor.");
                    e.printStackTrace();
               }
          }

          public void readExternal(PofReader reader) throws IOException {
               System.out.println("Reading in....");
               m_sCacheName = reader.readString(0);
               partitionSet = (PartitionSet) reader.readObject(1);
          }

          public void writeExternal(PofWriter writer) throws IOException {
               System.out.println("Writing out....");
               writer.writeString(0, m_sCacheName);
               writer.writeObject(1, partitionSet);
          }

          public void readExternal(DataInput in) throws IOException {
               System.out.println("Reading in....");
               m_sCacheName = (String) ExternalizableHelper.readObject(in);
               partitionSet = (PartitionSet) ExternalizableHelper.readObject(in);
          }

          public void writeExternal(DataOutput out) throws IOException {
               System.out.println("Writing out....");
               ExternalizableHelper.writeObject(out, m_sCacheName);
               ExternalizableHelper.writeObject(out, partitionSet);
          }
     }
    Edited by: user11218537 on 30-Jun-2010 00:54

    user11218537 wrote:
    Sorry, this was the result of my not-very-efficient attempt to anonymize my code posting. The package and class names do actually match.
    I may have "by accident" solved the problem but I am not sure since I have to test properly. It appears that if I define the PofSerialization stuff in the pof-context.xml, which is then referenced from my coherence-cache-config.xml, I should not set the tangosol.pof.enabled=true flag.
    I was basically trying to turn of pof so that I could test something else in my code but it appears that just by removing that flag in the cache-server.sh script (which I had added earlier), the problem disappeared and coherence is still using pof.
    What am I missing? Is this behaviour correct?
    If you remove the tangosol.pof.enabled flag from the Java properties, then POF by default will not be used.
    On the other hand if you still have POF configured for certain services with the <serializer> element referring to a PofContext implementation e.g. ConfigurablePofContext, then those services (and only those services) will use POF.
    Best regards,
    Robert
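     A quick way to see Robert's point in practice (a minimal sketch; the cache name "dist-extend" is only an example, and it assumes a Coherence version where com.tangosol.net.Service exposes getSerializer()) is to ask a running service which Serializer it is actually using - only the services configured with a <serializer> element, or started with tangosol.pof.enabled=true, will report a ConfigurablePofContext:

     import com.tangosol.io.Serializer;
     import com.tangosol.net.CacheFactory;
     import com.tangosol.net.NamedCache;

     public class SerializerCheck {
          public static void main(String[] args) {
               NamedCache cache = CacheFactory.getCache("dist-extend");
               // The serializer this cache's service was configured with; for a
               // POF-enabled service this is a ConfigurablePofContext instance.
               Serializer serializer = cache.getCacheService().getSerializer();
               System.out.println("Service " + cache.getCacheService().getInfo().getServiceName()
                         + " uses serializer: " + serializer);
          }
     }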

  • POF Serializer issue

    I have a strange issue using a POF serializer where one of the fields of the class is not populated when the object is retrieved from the cache.
    The class looks like:
     public class RegionSerializer extends AbstractPersistentEntitySerializer<RegionImpl> {

          private static final int REGION_CODE = 11;
          private static final int NAME = 12;
          private static final int ID_REGION = 13;
          private static final int ID_CCY_RPT = 14;

          @Override
          protected RegionImpl createInstance() {
               return new RegionImpl();
          }

          public void serialize(PofWriter out, Object o) throws IOException {
               RegionImpl obj = (RegionImpl) o;
               super.serialize(out, obj);
               out.writeObject(REGION_CODE, obj.getRegionCode());
               out.writeString(NAME, obj.getName());
               out.writeInt(ID_REGION, obj.getIdRegion());
               out.writeObject(ID_CCY_RPT, obj.getIdCcyRpt());
          }

          public Object deserialize(PofReader in) throws IOException {
               RegionImpl obj = (RegionImpl) super.deserialize(in);
               obj.setRegionCode((RegionCode) in.readObject(REGION_CODE));
               obj.setName(in.readString(NAME));
               obj.setIdRegion(in.readInt(ID_REGION));
               obj.setIdCcyRpt((CurrencyCode) in.readObject(ID_CCY_RPT));
               return obj;
          }
     }
     and the RegionCodeSerializer...
     public class RegionCodeSerializer implements PofSerializer {

          private static final int CODE = 11;

          public void serialize(PofWriter out, Object o) throws IOException {
               RegionCode obj = (RegionCode) o;
               out.writeString(CODE, obj.getCode());
          }

          public Object deserialize(PofReader in) throws IOException {
               RegionCode obj = new RegionCode();
               obj.setCode(in.readString(CODE));
               return obj;
          }
     }
     The output from the log after inserting and retrieving from the cache is:
    06-Oct-2010 10:11:28,277 BST DEBUG refdata.RefDataServiceImpl main - Region count:4
    06-Oct-2010 10:11:28,277 BST INFO  cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=5],regionCode=EMEA,name=LONDON]
    06-Oct-2010 10:11:28,277 BST INFO  cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=6],regionCode=US,name=NEW YORK]
    06-Oct-2010 10:11:28,277 BST INFO  cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=7],regionCode=APAC,name=TOKYO]
    06-Oct-2010 10:11:28,277 BST INFO  cache.TestCacheStartUp main - REGION FROM DAO: RegionImpl[RegionImpl[objectId=8],regionCode=NON_PYRAMID_DESKS,name=NON PYRAMID DESKS]
    06-Oct-2010 10:11:28,293 BST INFO  cache.TestCacheStartUp main - Is Cache empty?: false
    06-Oct-2010 10:11:28,293 BST INFO  cache.TestCacheStartUp main - Cache Size is: 4
    06-Oct-2010 10:11:28,324 BST INFO  cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=6],regionCode=US,name=<null>]
    06-Oct-2010 10:11:28,324 BST INFO  cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=7],regionCode=APAC,name=<null>]
    06-Oct-2010 10:11:28,324 BST INFO  cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=8],regionCode=NON_PYRAMID_DESKS,name=<null>]
     06-Oct-2010 10:11:28,324 BST INFO  cache.TestCacheStartUp main - REGION FROM CACHE: RegionImpl[RegionImpl[objectId=5],regionCode=EMEA,name=<null>]
     As can be seen from the output, the name field is null after retrieving. It seems that the 3 remaining fields after regionCode are being ignored by the deserialize (or serialize) method in the serializer class above, but I can't see why. Any ideas?

    Hi,
     You need to call the readRemainder()/writeRemainder() methods at the end of deserialization/serialization to properly terminate reading/writing of a user type.
    Thanks,
    Wei
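     Following Wei's advice, here is a minimal sketch of the corrected RegionCodeSerializer from the question (assuming the RegionCode class shown above): each serialize/deserialize ends with writeRemainder/readRemainder so the user type is properly terminated.

     import java.io.IOException;

     import com.tangosol.io.pof.PofReader;
     import com.tangosol.io.pof.PofSerializer;
     import com.tangosol.io.pof.PofWriter;

     public class RegionCodeSerializer implements PofSerializer {

          private static final int CODE = 11;

          public void serialize(PofWriter out, Object o) throws IOException {
               RegionCode obj = (RegionCode) o;
               out.writeString(CODE, obj.getCode());
               // Terminate the user type; without this, the remaining properties of
               // the enclosing object can be lost on deserialization.
               out.writeRemainder(null);
          }

          public Object deserialize(PofReader in) throws IOException {
               RegionCode obj = new RegionCode();
               obj.setCode(in.readString(CODE));
               // Read (and discard) any remaining properties to terminate the user type.
               in.readRemainder();
               return obj;
          }
     }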
