Cache entrySet EqualsFilter ClassNotFoundException

Hello,
While using namedCache.entrySet(new EqualsFilter("attributeAccessor", value), null) I get a ClassNotFoundException.
Notes:
- I have created my cache with the ClassLoader of the class I am trying to find instances for.
- Able to namedCache.put() with no issues
- Able to deserialize the same container object when I use namedCache.get()
- Tried both single and multiple attributeAccessor navigations: attr1 and attr1.attr2
- In the case of attr1 I get the ClassNotFoundException for the container object
- In the case of attr1.attr2 I get the ClassNotFoundException for the contained object
- I use a fluent API, which means my getters don't start with "get"; attribute1 is accessed via the method attribute1() (see the extractor sketch after this list)
- I also tried adding getAttribute1() and getAttribute2() methods and querying with "getAttribute1.getAttribute2", just in case
- It is only a problem when using entrySet() with a Filter
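For reference, here is a minimal sketch of how the filter can be built with explicit extractors rather than the string form; the accessor names tenant() and id() match the sandbox classes posted below, and this is only an illustration, not the exact code in use:

import com.tangosol.util.Filter;
import com.tangosol.util.ValueExtractor;
import com.tangosol.util.extractor.ChainedExtractor;
import com.tangosol.util.extractor.ReflectionExtractor;
import com.tangosol.util.filter.EqualsFilter;

// ReflectionExtractor invokes the method name exactly as given, so fluent
// accessors such as tenant() and id() can be chained without a "get" prefix
static Filter tenantFilter(Tenant aTenant) {
    return new EqualsFilter(
        new ChainedExtractor(new ValueExtractor[] {
            new ReflectionExtractor("tenant"),
            new ReflectionExtractor("id")
        }),
        aTenant.id());
}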
Stack trace:
(Wrapped: Failed request execution for DistributedCache service on Member(Id=1, Timestamp=2011-04-13 15:34:03.656, Address=XXX:8088, MachineId=61697, Location=machine:MACHNAME,process:3508, Role=CoherenceServer)) java.io.IOException: readObject failed: java.lang.ClassNotFoundException: packagename.ContainedClassName
     at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
     at java.security.AccessController.doPrivileged(Native Method)
     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
     at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
     at java.lang.Class.forName0(Native Method)
     at java.lang.Class.forName(Class.java:247)
     at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:604)
     at com.tangosol.io.ResolvingObjectInputStream.resolveClass(ResolvingObjectInputStream.java:68)
     at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1575)
     at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1496)
     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1732)
     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
     at com.tangosol.util.ExternalizableHelper.readSerializable(ExternalizableHelper.java:2180)
     at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2311)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2233)
     at com.tangosol.util.filter.ComparisonFilter.readExternal(ComparisonFilter.java:204)
     at com.tangosol.util.ExternalizableHelper.readExternalizableLite(ExternalizableHelper.java:2004)
     at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2308)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
     at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
     at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
     at com.tangosol.coherence.component.net.message.requestMessage.distributedCacheRequest.partialRequest.FilterRequest.read(FilterRequest.CDB:7)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$QueryRequest.read(PartitionedCache.CDB:1)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:42)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:31)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
     at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
     at java.lang.Thread.run(Thread.java:619)
ClassLoader: null
     at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.tagException(Grid.CDB:36)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onQueryRequest(PartitionedCache.CDB:72)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$QueryRequest.run(PartitionedCache.CDB:1)
     at com.tangosol.coherence.component.net.message.requestMessage.DistributedCacheRequest.onReceived(DistributedCacheRequest.CDB:12)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onMessage(Grid.CDB:11)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:33)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
     at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
     at java.lang.Thread.run(Thread.java:619)
Caused by: java.io.IOException: readObject failed: java.lang.ClassNotFoundException: packagename.ContainedClassName
     at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
     at java.security.AccessController.doPrivileged(Native Method)
     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
     at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
     at java.lang.Class.forName0(Native Method)
     at java.lang.Class.forName(Class.java:247)
     at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:604)
     at com.tangosol.io.ResolvingObjectInputStream.resolveClass(ResolvingObjectInputStream.java:68)
     at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1575)
     at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1496)
     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1732)
     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:351)
     at com.tangosol.util.ExternalizableHelper.readSerializable(ExternalizableHelper.java:2180)
     at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2311)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2233)
     at com.tangosol.util.filter.ComparisonFilter.readExternal(ComparisonFilter.java:204)
     at com.tangosol.util.ExternalizableHelper.readExternalizableLite(ExternalizableHelper.java:2004)
     at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2308)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
     at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
     at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
     at com.tangosol.coherence.component.net.message.requestMessage.distributedCacheRequest.partialRequest.FilterRequest.read(FilterRequest.CDB:7)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$QueryRequest.read(PartitionedCache.CDB:1)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:42)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:31)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService.onNotify(PartitionedService.CDB:3)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache.onNotify(PartitionedCache.CDB:3)
     at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
     at java.lang.Thread.run(Thread.java:619)
ClassLoader: null
     at com.tangosol.util.ExternalizableHelper.readSerializable(ExternalizableHelper.java:2188)
     at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2311)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2233)
     at com.tangosol.util.filter.ComparisonFilter.readExternal(ComparisonFilter.java:204)
     at com.tangosol.util.ExternalizableHelper.readExternalizableLite(ExternalizableHelper.java:2004)
     at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2308)
     at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2254)
     at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.readObject(Service.CDB:1)
     at com.tangosol.coherence.component.net.Message.readObject(Message.CDB:1)
     at com.tangosol.coherence.component.net.message.requestMessage.distributedCacheRequest.partialRequest.FilterRequest.read(FilterRequest.CDB:7)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$QueryRequest.read(PartitionedCache.CDB:1)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.deserializeMessage(Grid.CDB:42)
     at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Grid.onNotify(Grid.CDB:31)
     ... 4 more
Thanks for your help!
Kind regards,
Vaughn

I am using the out-of-the-box configuration; I have not changed a thing about it. I have whittled the code down to a sandbox with a unit test, using default Java serialization. The point was to get the simplest thing working.
In the code below, see CoherenceProductRepository#allProductsOfTenant(). If you comment out the statement with EqualsFilter and uncomment the other statement, the test works. If you use EqualsFilter it throws ClassNotFoundException.
The code is wired a bit differently than the actual code because I have removed a bunch of dependencies. Still, it produces the same error.
Vaughn
Product.java
==========
package sandbox;

import java.io.Serializable;

public class Product implements Serializable {

    private static final long serialVersionUID = 1L;

    private String description;
    private String name;
    private Tenant tenant;

    public Product(Tenant aTenant, String aName, String aDescription) {
        this();
        this.setDescription(aDescription);
        this.setName(aName);
        this.setTenant(aTenant);
        this.initialize();
    }

    public void rename(String aName) {
        this.setName(aName);
    }

    public String description() {
        return this.description;
    }

    public String name() {
        return this.name;
    }

    public Tenant tenant() {
        return this.tenant;
    }

    @Override
    public boolean equals(Object anObject) {
        boolean equalObjects = false;
        if (anObject != null && this.getClass() == anObject.getClass()) {
            Product typedObject = (Product) anObject;
            equalObjects =
                this.name().equals(typedObject.name()) &&
                this.description().equals(typedObject.description());
        }
        return equalObjects;
    }

    @Override
    public int hashCode() {
        int hashCodeValue =
            + (169853 * 229)
            + this.name().hashCode()
            + this.description().hashCode();
        return hashCodeValue;
    }

    @Override
    public String toString() {
        return
            "Product"
            + " description = " + this.description()
            + " name = " + this.name();
    }

    protected Product() {
        super();
    }

    protected void setDescription(String aDescription) {
        if (aDescription == null || aDescription.length() == 0) {
            throw new IllegalArgumentException("Description is required.");
        }
        if (aDescription.length() > 500) {
            throw new IllegalArgumentException("Description must be 500 characters or less.");
        }
        this.description = aDescription;
    }

    protected void setName(String aName) {
        if (aName == null || aName.length() == 0) {
            throw new IllegalArgumentException("Name is required.");
        }
        if (aName.length() > 100) {
            throw new IllegalArgumentException("Name must be 100 characters or less.");
        }
        this.name = aName;
    }

    protected void setTenant(Tenant aTenant) {
        if (aTenant == null) {
            throw new IllegalArgumentException("The tenant is required.");
        }
        this.tenant = aTenant;
    }

    private void initialize() {
        // begin-user-code: initialize()
        // end-user-code: initialize()
    }
}
Tenant.java
=========
package sandbox;

import java.io.Serializable;

public class Tenant implements Serializable {

    private static final long serialVersionUID = 1L;

    private String id;

    public Tenant(String anId) {
        super();
        this.setId(anId);
        this.initialize();
    }

    public Tenant(Tenant aTenant) {
        this(aTenant.id());
    }

    public String id() {
        return this.id;
    }

    @Override
    public boolean equals(Object anObject) {
        boolean equalObjects = false;
        if (anObject != null && this.getClass() == anObject.getClass()) {
            Tenant typedObject = (Tenant) anObject;
            equalObjects = this.id().equals(typedObject.id());
        }
        return equalObjects;
    }

    @Override
    public int hashCode() {
        int hashCodeValue =
            + (175819 * 311)
            + this.id().hashCode();
        return hashCodeValue;
    }

    @Override
    public String toString() {
        return "Tenant" + " id = " + this.id();
    }

    protected Tenant() {
        super();
    }

    private void initialize() {
        // begin-user-code: initialize()
        // end-user-code: initialize()
    }

    private void setId(String anId) {
        if (anId == null || anId.length() == 0) {
            throw new IllegalArgumentException("The tenant identity is required.");
        }
        if (anId.length() > 36) {
            throw new IllegalArgumentException("The tenant identity must be 36 characters or less.");
        }
        this.id = anId;
    }
}
CoherenceProductRepository.java
======================
package sandbox;

import java.util.Collection;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
import com.tangosol.util.filter.EqualsFilter;

public class CoherenceProductRepository {

    public CoherenceProductRepository() {
        super();
    }

    public void add(Product aProduct) {
        this.cache().put(this.idOf(aProduct), aProduct);
    }

    public void addAll(Collection<Product> aProductCollection) {
        if (!aProductCollection.isEmpty()) {
            Map<String, Product> productsMap =
                new HashMap<String, Product>(aProductCollection.size());
            for (Product product : aProductCollection) {
                productsMap.put(this.idOf(product), product);
            }
            this.cache().putAll(productsMap);
        }
    }

    @SuppressWarnings("unchecked")
    public Collection<Product> allProductsOfTenant(Tenant aTenant) {
        Set<Map.Entry<String, Product>> entries =
            this.cache().entrySet(new EqualsFilter("tenant.id", aTenant.id()), null);
        // Set<Map.Entry<String, Product>> entries =
        //     this.cache().entrySet();
        Collection<Product> products =
            new HashSet<Product>(entries.size());
        for (Map.Entry<String, Product> entry : entries) {
            products.add(entry.getValue());
        }
        return products;
    }

    public Product productOfName(String aName) {
        return (Product) this.cache().get(aName);
    }

    public void remove(Product aProduct) {
        this.cache().remove(this.idOf(aProduct));
    }

    public void removeAll(Collection<Product> aProductCollection) {
        for (Product product : aProductCollection) {
            this.remove(product);
        }
    }

    private NamedCache cache() {
        NamedCache cache = CacheFactory.getCache("coherence.sandbox.product");
        return cache;
    }

    private String idOf(Product aProduct) {
        return aProduct.name();
    }
}
CoherenceProductRepositoryTest.java
========================
package sandbox;

import java.util.Arrays;
import java.util.Collection;

import junit.framework.TestCase;

public class CoherenceProductRepositoryTest extends TestCase {

    private CoherenceProductRepository productRepository;
    private Tenant tenant;

    public CoherenceProductRepositoryTest() {
        super();
    }

    public void testSaveProduct() throws Exception {
        Tenant tenant = new Tenant("01234567");
        Product product =
            new Product(
                tenant,
                "My Product",
                "This is the description of my product.");
        this.productRepository().add(product);
        Product readProduct =
            this.productRepository().productOfName(product.name());
        assertNotNull(readProduct);
        assertEquals(readProduct.tenant(), tenant);
        assertEquals(readProduct.name(), product.name());
        assertEquals(readProduct.description(), product.description());
    }

    public void testFindMultipleProducts() throws Exception {
        Product product1 =
            new Product(
                tenant,
                "My Product 1",
                "This is the description of my first product.");
        Product product2 =
            new Product(
                tenant,
                "My Product 2",
                "This is the description of my second product.");
        Product product3 =
            new Product(
                tenant,
                "My Product 3",
                "This is the description of my third product.");
        this.productRepository().addAll(Arrays.asList(product1, product2, product3));
        assertNotNull(this.productRepository().productOfName(product1.name()));
        assertNotNull(this.productRepository().productOfName(product2.name()));
        assertNotNull(this.productRepository().productOfName(product3.name()));
        Collection<Product> allProducts =
            this.productRepository().allProductsOfTenant(tenant);
        assertEquals(allProducts.size(), 3);
    }

    @Override
    protected void setUp() throws Exception {
        this.setProductRepository(new CoherenceProductRepository());
        this.tenant = new Tenant("01234567");
        super.setUp();
    }

    @Override
    protected void tearDown() throws Exception {
        Collection<Product> products =
            this.productRepository().allProductsOfTenant(tenant);
        this.productRepository().removeAll(products);
    }

    protected CoherenceProductRepository productRepository() {
        return this.productRepository;
    }

    protected void setProductRepository(CoherenceProductRepository aProductRepository) {
        this.productRepository = aProductRepository;
    }
}

Similar Messages

  • Cache query results in too much garbage collection activity

    Oracle Coherence Version 3.6.1.3 Enterprise Edition: Production mode
    JRE 6 Update 21
    Linux OS 64 bit
    The application uses an object Customer with the following structure:
    Customer(CustID, FirstName, LastName, CCNumber, OCNumber)
    Each property of Customer is an inner class with a getValue method returning a value. The getValue methods of CCNumber and OCNumber return a Long. There are 150m instances of Customer in the cache. To hold this much data we are running several nodes on 2 machines.
    The following code is used to create indexes on CCNumber and OCNumber:
         ValueExtractor[] valExt = new ValueExtractor[]{
              new ReflectionExtractor("getCCNumber"), new ReflectionExtractor("getValue")};
         ChainedExtractor chExt = new ChainedExtractor(valExt);
         Long value = new Long(0);
         Filter f = new NotEqualsFilter(chExt, value);
         ValueExtractor condExtractor = new ConditionalExtractor(f, chExt, true);
         cache.addIndex(condExtractor, false, null);
    The client code queries the cache with the following code:
         ValueExtractor[] valExt1 = new ValueExtractor[]{
              new ReflectionExtractor("getCCNumber"), new ReflectionExtractor("getValue")};
         ChainedExtractor chExt1 = new ChainedExtractor(valExt1);
         EqualsFilter filter1 = new EqualsFilter(chExt1, ccnumber);
         ValueExtractor[] valExt2 = new ValueExtractor[]{
              new ReflectionExtractor("getOCNumber"), new ReflectionExtractor("getValue")};
            ChainedExtractor chExt2 = new ChainedExtractor(valExt2);
         EqualsFilter filter2 = new EqualsFilter(chExt2, ocnumber);
         AnyFilter anyFilter = new AnyFilter(new Filter[]{filter1, filter2});
         cache.entrySet(anyFilter);
    The observation is that for 20 client threads the application performs well (avg response time = 200ms), but as the number of client threads increases the performance degrades disproportionately (the query takes anywhere between 1000ms and 8000ms for 60 threads). I think this is because the eden space fills up very quickly as the number of client threads goes up. The number of collections per second grows with the number of client threads: there are almost 2-3 ParNew collections every second with 60 client threads, whereas there is only 1 collection per second with 20 client threads. Even a 100-200ms pause degrades the overall query performance.
    My question is: why is Coherence creating so many objects that fill up eden so fast? Is there anything I need to do in my code?

    Hi Coh,
    The reason for so much garbage is that you are using ReflectionExtractors in your filters, and I assume you do not have any indexes on your caches either. This means that each time you execute a query Coherence has to scan the cache for matches to the filter - like a full table scan in a DB. For each entry in the cache Coherence has to deserialize that entry into a real object and then call the methods in the filters via reflection. Once the query is finished, all these deserialized objects are garbage that needs to be collected. For a big cache this can be a lot of garbage.
    You can switch to POF extractors to skip the deserialization step, which should reduce the garbage quite a bit, although not eliminate it. You could also use indexes, which should eliminate pretty much all of the garbage you are seeing during queries.
    JK
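    For illustration, a minimal sketch of both suggestions, assuming the cached values are POF-serialized; the POF indexes used for the nested CCNumber/getValue path are hypothetical placeholders:

    import com.tangosol.io.pof.reflect.SimplePofPath;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.ValueExtractor;
    import com.tangosol.util.extractor.ChainedExtractor;
    import com.tangosol.util.extractor.PofExtractor;
    import com.tangosol.util.extractor.ReflectionExtractor;

    public class CustomerIndexes {
        // hypothetical POF indexes for Customer.getCCNumber() and CCNumber.getValue()
        private static final int CC_NUMBER = 1;
        private static final int VALUE = 0;

        static void addIndexes(NamedCache cache) {
            // reflection-based index: queries consult the index instead of deserializing every entry
            ValueExtractor ccReflection = new ChainedExtractor(new ValueExtractor[] {
                new ReflectionExtractor("getCCNumber"), new ReflectionExtractor("getValue")});
            cache.addIndex(ccReflection, false, null);

            // alternatively, a POF-based extractor reads the field straight from the binary
            // form, so neither indexing nor querying needs to deserialize the Customer objects
            ValueExtractor ccPof = new PofExtractor(
                Long.class, new SimplePofPath(new int[] { CC_NUMBER, VALUE }));
            cache.addIndex(ccPof, false, null);
        }
    }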

  • Getting All Entries from a cache

    Hi Folks,
         Just a small interesting observation. In an attempt to get back all the data from my partitioned cache I tried the following approaches:
         //EntrySet
         NamedCache cache = CacheFactory.getCache("MyCache");
         Iterator<Entry<MyKeyObj, MyObj>> iter = cache.entrySet().iterator();
         //iterator over objects and get values
         //KeySet & getAll
         NamedCache cache = CacheFactory.getCache("MyCache");
         Map results = cache.getAll(cache.keySet());
         Iterator<Entry<MyKeyObj, MyObj>> iter = results.entrySet().iterator();
         //iterate over objects and get values
         Retrieving ~47k objects from 4 nodes takes 21 seconds using the entrySet approach and 10 seconds for the keySet/getAll approach.
         Does that sound right to you? It implies that the entrySet iterator is lazily loaded using get(key) for each entry.
         Regards,
         Max

    Hi Gene,
         I actually posted the question because we are currently performance-tuning our application, and there are scenarios where (due to a large amount of badly organized legacy code with the bottom layers ported to Coherence) there are lots of invocations getting all the entries from some caches, sometimes even hundreds of times during the processing of a single HTTP request.
         In some cases (typically with caches having a low cache-size) we found that the entrySet-AlwaysFilter solution was way faster than the keySet-getAll solution, which was about as fast as the solution iterating over the cache (new HashMap(cache)).
         I just wanted to ask if there are some rules of thumb on how far it is efficient to use the AlwaysFilter on distributed caches, and where it starts to be better to use the keySet-getAll approach (from a naive test case, keySet-getAll seemed to be better upwards of a couple of thousand entries).
         Also, we are considering moving some of the caches (static data mostly, usually with fewer than 1000 entries, sometimes as few as a dozen entries in a named cache, and in very few cases as many as 40000 entries) to a replicated topology, which is why I asked about the effect of using replicated caches...
         I expect the entrySet-AlwaysFilter to be slower than the iterating solution, since it effectively does the same, and also has some additional filter evaluation to be done.
         The keySet-getall will be something similar to the iterating solution, I guess.
         What is to be known about the implementation of the values() method?
         Can it be worth using in some cases? Does it give us an instant snapshot in case of replicated caches? Is it faster than the entrySet(void) in case of replicated caches?
         Thanks and best regards,
         Robert
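         For reference, the three access patterns being compared look roughly like this (a minimal sketch; which one wins depends on cache size and topology, as discussed above):

         import java.util.HashMap;
         import java.util.Map;
         import java.util.Set;
         import com.tangosol.net.NamedCache;
         import com.tangosol.util.filter.AlwaysFilter;

         static void fetchAllThreeWays(NamedCache cache) {
             // 1. entrySet with AlwaysFilter: the query is evaluated on the storage-enabled members
             Set entries = cache.entrySet(AlwaysFilter.INSTANCE);

             // 2. keySet + getAll: bulk fetch of all values by key
             Map all = cache.getAll(cache.keySet());

             // 3. copy the cache into a local map (the "iterating" snapshot approach)
             Map snapshot = new HashMap(cache);
         }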

  • Retrieving contents of cache

    Hi,
    What is the most efficient way to retrieve a static view of the full contents of the cache? I've been using the following:
    Set fullSet = new HashSet(cache.entrySet());
    Is there a better way to do this?
    Regards,
    Steven

    Steven,
    Assuming the entire cache content could fit into your process memory (which is a very big assumption!), the easiest way is to write:
    Map mapSnapshot = new HashMap(cache);
    Regards,
    Gene

  • How to improve ordering performance in coherence cache?

    Hi,
    We have a distributed cache with a local-scheme backing map and about 60K target objects in the cache.
    Our application displays these target objects, and they should be sorted on one field.
    We are using the following scenario:
    1) added an ordered index to the cache
    2) added 60K objects to the cache
    3) create a condition filter (will accept only 16K objects)
    4) create a limit filter, new LimitFilter(conditionFilter, 100)
    5) create a comparator
    6) call cache.entrySet(limitFilter, comparator) -- standard java.util.Date comparator -- execution time is 8 sec, returns 100 sorted objects
    7) call cache.entrySet(limitFilter) -- execution time is 0.5 sec, returns 100 unsorted objects
    8) call cache.entrySet(conditionFilter, comparator) -- execution time is 8.5 sec, returns 16K sorted objects
    9) call cache.entrySet(conditionFilter) -- execution time is 0.7 sec, returns 16K unsorted objects
    Thus, even if we want to access only 100 records, the execution time is almost the same as when fetching all 16K objects.
    Our goal is to return the first 100 objects as soon as possible; 8 seconds is far too slow.
    Could you please give us recommendations for improving sorted queries against the cache?
    Thank you,
    Alexander Gunko.

    Hi Alexander
    Have you tried combining the conditional filter with a limit filter? Something like:
    LimitFilter filter = new LimitFilter(condFilter, 100);
    Set set = cache.entrySet(filter, comp);
    filter.extractPage(set.toArray());
    Thanks
    /Charlie
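    To expand on that, a minimal paging sketch (reusing the conditionFilter and comparator from the original post); each call returns at most one page of 100 entries and nextPage() advances the window:

    import java.util.Comparator;
    import java.util.Set;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.Filter;
    import com.tangosol.util.filter.LimitFilter;

    static void processInPages(NamedCache cache, Filter conditionFilter, Comparator comparator) {
        LimitFilter limitFilter = new LimitFilter(conditionFilter, 100);
        Set page = cache.entrySet(limitFilter, comparator);
        while (!page.isEmpty()) {
            // process the current page of (at most) 100 sorted entries ...
            limitFilter.nextPage();
            page = cache.entrySet(limitFilter, comparator);
        }
    }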

  • How to transfer the Coherence cache to a different network/environment?

    Hi,
    I have a requirement wherein I need to export/import a cache from one network into a different network/environment altogether, keeping the cache data intact. How do I achieve this from the Coherence side? I am using a distributed cache scheme.
    Regards,
    Radhika

    You could serialize the content of the cache to a file and then read it back at the other end.  The cache dump usually does not take much time, even for GB-sized caches; the import usually takes more time.
    Here is sample code to serialize the content of a cache.  Ideally you should use POF so the data is already compact.
    Export:
         public void exportCache(String cacheName, File file) throws Exception {
              WrapperBufferOutput wrappedBufferOutput = null;
              try {
                   NamedCache cache = CacheFactory.getCache(cacheName);
                   FileOutputStream fileOutputStream = new FileOutputStream(file);
                   BufferedOutputStream bufferedOutputStream = new BufferedOutputStream(fileOutputStream, 1024 * 1024);
                   DataOutputStream dataOutputStream = new DataOutputStream(bufferedOutputStream);
                   wrappedBufferOutput = new WrapperBufferOutput(dataOutputStream);
                   ConfigurablePofContext pofContext = (ConfigurablePofContext) cache.getCacheService().getSerializer();
                   for (Object o : cache.entrySet()) {
                        pofContext.serialize(wrappedBufferOutput, ((Map.Entry) o).getValue());
                   }
              } finally {
                   if (wrappedBufferOutput != null) {
                        wrappedBufferOutput.close();
                   }
              }
         }
    Here is sample code to deserialize the content of a cache stored in a file:
    Import:
         public void importCache(String cacheName, File file) throws Exception {
              WrapperBufferInput wrappedBufferInput = null;
              try {
                   NamedCache cache = CacheFactory.getCache(cacheName);
                   FileInputStream fileInputStream = new FileInputStream(file);
                   BufferedInputStream bufferedInputStream = new BufferedInputStream(fileInputStream, 1024 * 1024);
                   DataInputStream dataInputStream = new DataInputStream(bufferedInputStream);
                   wrappedBufferInput = new WrapperBufferInput(dataInputStream);
                   ConfigurablePofContext pofContext = (ConfigurablePofContext) cache.getCacheService().getSerializer();
                   while (wrappedBufferInput.available() > 0) {
                        ImportableObject o = (ImportableObject) pofContext.deserialize(wrappedBufferInput);
                        cache.put(o.getObjectKey(), o);
                   }
              } finally {
                   if (wrappedBufferInput != null) {
                        wrappedBufferInput.close();
                   }
              }
         }
    Here we assume that cache entries implement an ImportableObject interface which has the getObjectKey() method.  This makes it easy to figure out the key of an entry without knowing its real type.

  • Can not filter the data with the extended class

    Hi,
    I have a quick question about the PortableObject format. I have created a class which implements the PortableObject interface along with the serializer methods, and I have registered it in the pof-config.xml file as well. If I insert objects of this type into the cache, they get inserted properly and I can filter the values based on the getters defined in the class. Everything works fine here.
    Now, I am trying to extend an existing class that I have. We have a custom API built for our domain objects, and I need to store those objects in the cache, so naturally I need to implement the PortableObject interface to do that. So, instead of creating a new class with a new set of getters, setters and local fields, I am extending our domain class to create a new class which implements the PortableObject interface, reusing the fields and accessors provided by the existing class. Now, I can insert objects of the new class into the cache, but I cannot filter the values for objects of this new class.
    Let me show you what exactly I am trying to achieve by giving a small example:
    Domain Class:
    class Person {
        private String person_name;
        public String getPerson_name() { return person_name; }
        public void setPerson_name(String person_name) { this.person_name = person_name; }
    }
    The new class implementing the PortableObject interface:
    class ExtPerson extends Person implements PortableObject {
        public static final int PERSON_NAME = 0;
        public void readExternal(PofReader reader) throws IOException {
            setPerson_name(reader.readString(PERSON_NAME));
        }
        public void writeExternal(PofWriter writer) throws IOException {
            writer.writeString(PERSON_NAME, getPerson_name());
        }
        // And hashCode, equals and toString methods, all implemented using the getter from the Person class
    }
    So, if I create a new class ExtPerson without extending the Person class and write all the methods, store the objects in the cache and perform the following query, I get the size printed
    System.out.println((cache.entrySet(new EqualsFilter("getPerson_name","ABC"))).size());
    But if I use the extended class, insert the values into the cache and run the same query to filter, I get 0 printed on the console.
    System.out.println((cache.entrySet(new EqualsFilter("getPerson_name","ABC"))).size());
    So, can anyone tell what exactly is causing this?
    Thanks!

    Well, just a quick follow-up question. It seems that I cannot get ContainsAnyFilter or ContainsAllFilter working.
    EqualsFilter is actually working properly.
    I am preparing a Set of Strings and passing it to ContainsAnyFilter or ContainsAllFilter, and it returns 0 records.
    E.g.:
    Set<String> setStr = new HashSet<String>();
    setStr.add("ABC");
    setStr.add("DEF");
    System.out.println((cache2.entrySet(new ContainsAnyFilter("getPerson_name", setStr))).size());
    I get 0 in my output
    If I try this:
    System.out.println((cache.entrySet(new EqualsFilter("getPerson_name","ABC"))).size());
    System.out.println((cache.entrySet(new EqualsFilter("getPerson_name","DEF"))).size());
    I get 1 for each of the query.
    If I club all these EqualsFilter in a Filter[] array and create an AnyFilter or AllFilter and pass it to the query, it works fine.
    List<Object> lst = new ArrayList<Object>();
    lst.add("ABC");
    lst.add("DEF");
    Filter[] filter = new Filter[lst.size()];
    for (int i = 0; i < lst.size(); i++)
         filter[i] = new EqualsFilter("getPerson_name", lst.get(i).toString());
    AnyFilter fil = new AnyFilter(filter);
    System.out.println((cache4.entrySet(fil)).size());
    I get the desired result here, which is 2.
    Am I missing something here?

  • Why use KeyExtractor here? How about remove it?

    Please see the following script:
    setResults = cache.entrySet(new AndFilter(
        new LikeFilter(new KeyExtractor("getLastName"), "S%",
            (char) 0, false),
        new EqualsFilter("getHomeAddress.getState", "MA")));
    Why use KeyExtractor here? How about removing it and making it like:
    setResults = cache.entrySet(new AndFilter(
        new LikeFilter("getLastName", "S%"),
        new EqualsFilter("getHomeAddress.getState", "MA")));
    Is this script right?
    Edited by: jetq on Oct 23, 2009 2:08 PM

    I Googled for KeyExtractor and the likeliest-looking thing was javadocs for com.tangosol.util.extractor.KeyExtractor on an Oracle site.
    Why don't you check around the Oracle developer site for a forum specifically about whatever product this is?

  • Coherence - query language

    Hi All ,
    On which part of the cache entry does the 'Filter' act?
    Say I have created a filter like the one below:
    Filter filter = new EqualsFilter("getName", "BobSmith");
    Set entrySet = cache.entrySet(filter);
    Will this filter act on the 'key' of the cache entries or on the 'value' of the cache entries?

    Hi
    Using EqualsFilter will implicitly look for the method 'getName' on the value part of the entry. If you want the Filter to look for it on the key part you need to do the following:
    Filter filter = new EqualsFilter(new KeyExtractor("getName"), "BobSmith");
    /Charlie
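    To make the difference concrete, a minimal sketch assuming both the key class and the value class expose a getName() accessor (the cache name "people" is just an example):

    import java.util.Set;
    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.Filter;
    import com.tangosol.util.extractor.KeyExtractor;
    import com.tangosol.util.filter.EqualsFilter;

    static void queryKeyAndValue() {
        NamedCache cache = CacheFactory.getCache("people");
        // acts on the value part of each entry
        Filter valueFilter = new EqualsFilter("getName", "BobSmith");
        Set valueMatches = cache.entrySet(valueFilter);
        // acts on the key part of each entry
        Filter keyFilter = new EqualsFilter(new KeyExtractor("getName"), "BobSmith");
        Set keyMatches = cache.entrySet(keyFilter);
    }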

  • How can I delete and update records using where conditions?

    I want to delete and update Coherence records based on conditions; described in SQL it would be:
    delete from "contacts" where getStreet() = "dsada";
    update contacts set getStreet() = "dddd" where getCity() = "ssss";
    Can I use a filter query to achieve this requirement, as follows:
    ValueExtractor::View vHomeStateExtractor = ChainedExtractor::create(
    ChainedExtractor::createExtractors("getHomeAddress.getState"));
    Object::View voStateName = String::create("MA");
    Set::View setResults = hCache->entrySet(
    EqualsFilter::create(vHomeStateExtractor, voStateName));
    I know I can use get and put to achieve this, but that requires two interactions between the client and the Coherence server. Is there another way?
    Thanks very much, and please forgive my English, it is not very good.

    Hi,
    You have a couple of options for updating or deleting using a Filter.
    For deleting you can use an Entry Processor and the cache invokeAll method. Using "out of the box" Coherence you can use the ConditionalRemove entry processor. I'm a Java person so the C++ below might not be exactly right but you should get the idea.
    ValueExtractor::View vHomeStateExtractor = ChainedExtractor::create(
    ChainedExtractor::createExtractors("getHomeAddress.getState"));
    Object::View voStateName = String::create("MA");
    hCache->invokeAll(EqualsFilter::create(vHomeStateExtractor, voStateName),
        ConditionalRemove::create(AlwaysFilter::getInstance()));
    For updates you would either need to write custom Entry Processor implementations that perform the updates you require, or you can use the out-of-the-box POF or Reflection ValueUpdaters that update specific fields of the entries in the cache. These ValueUpdaters would be wrapped in an UpdaterProcessor, so the call would be very similar to the code above.
    JK
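    For reference, in Java the same two operations look roughly like this (a minimal sketch; it assumes the cached Contact class has getStreet()/getCity() accessors and a setStreet(String) mutator, and the C++ API mirrors these calls):

    import com.tangosol.net.CacheFactory;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.extractor.ReflectionUpdater;
    import com.tangosol.util.filter.AlwaysFilter;
    import com.tangosol.util.filter.EqualsFilter;
    import com.tangosol.util.processor.ConditionalRemove;
    import com.tangosol.util.processor.UpdaterProcessor;

    public class ContactBulkOps {
        public static void main(String[] args) {
            NamedCache contacts = CacheFactory.getCache("contacts");

            // delete from "contacts" where getStreet() = "dsada"
            contacts.invokeAll(
                new EqualsFilter("getStreet", "dsada"),
                new ConditionalRemove(AlwaysFilter.INSTANCE));

            // update contacts set street = "dddd" where getCity() = "ssss"
            contacts.invokeAll(
                new EqualsFilter("getCity", "ssss"),
                new UpdaterProcessor(new ReflectionUpdater("setStreet"), "dddd"));
        }
    }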

  • Error exporting NamedCache to file via PofWriter

    I'd like to dump some NamedCaches with a large number of entries (> 200,000) to files, and then be able to re-load the NamedCaches from those files. I realize that there may be a feature in an upcoming version of Coherence to do this sort of thing, but I'm using 3.5 for now and need to roll my own for the time being.
    I already have working code that takes the entire NamedCache and writes it as one single Map to a file using the writeMap method of a PofBufferWriter (a variation of the code shown below).
    The problem is that attempting to load a pof file containing one single Map with 200,000+ entries usually causes an out of memory error. This is because it is not possible to know ahead of time the number of elements in the Map in the pof file, which means it isn't possible to set the initial capacity of the Map that the contents of the file are being read into via the PofReader.readMap method. The consequence is that the Map's growth algorithm kicks in, doubling the size of the map over and over, and pretty soon the map is giant and there is insufficient heap. It's not that the objects in the file wouldn't fit in memory; the problem is that the Map growth algorithm just creates a Map too big for the heap.
    So, I've been thinking about a variation on the theme that avoids the Map growth problem. A variety of potential solutions come to mind, and I've tried many of them.
    First, I could write an int in position 0 of the PofWriter stream to say how many elements the Map in position 1 contains. That would allow me to compute an initial capacity for the Map that the file is loaded into (via PofReader) that avoids the Map growth algorithm problems.
    Second, I can iterate the elements of the NamedCache and build a number of smaller sub-Maps of a known size, and then write those sub-Maps in consecutive positions of the PofWriter stream, probably with an int in position 0 that says how many Maps follow.
    A third alternative is to iterate the elements of the NamedCache and write each key/value pair to the Nth position in the PofWriter stream (again, with an int in position zero saying how many key/value pairs follow).
    The code shown below is basically strategy #3, but whether I try strategy #1 or #2 or #3, I always get the following exception upon the second invocation of a "write" method on the PofWriter:
    java.lang.IllegalArgumentException: not in a complex type
    at com.tangosol.io.pof.PofBufferWriter.beginProperty(PofBufferWriter.java:1940)
    at com.tangosol.io.pof.PofBufferWriter.writeObject(PofBufferWriter.java:1426)
    at com.mycompany.util.coherence.NamedCacheExporter.exportCache(NamedCacheExporter.java:182)
    Just to be clear, if I only write one entry to the PofWriter (e.g., the entire NamedCache), it does not throw this exception.
    I am confident that all of the objects being written implement PortableObject (correctly) and are registered in the pof config file. The NamedCacheEntry object used below is a simple key/value wrapper that implements PortableObject and has only two Object-valued instance variables named "key" and "value".
    public static void exportCacheNamedTo(String cacheName, File file)
    throws Exception
    {
        WrapperBufferOutput wrappedBufferOutput = null;
        try
        {
            NamedCache cache = CacheFactory.getCache( cacheName );
            FileOutputStream fileOutputStream = new FileOutputStream( file );
            BufferedOutputStream bufferedOutputStream =
                new BufferedOutputStream( fileOutputStream, 1024 * 1024 );
            DataOutputStream dataOutputStream = new DataOutputStream( bufferedOutputStream );
            wrappedBufferOutput = new WrapperBufferOutput( dataOutputStream );
            ConfigurablePofContext pofContext =
                (ConfigurablePofContext) cache.getCacheService().getSerializer();
            PofWriter pofWriter = new PofBufferWriter( wrappedBufferOutput, pofContext );

            //this works fine
            //pofWriter.writeMap( 0, cache.getAll( cache.keySet() ) );

            //this fails with the above error
            //pofWriter.writeInt( 0, cache.size() );
            //pofWriter.writeMap( 1, cache.getAll( cache.keySet() ) );

            //this fails with the above error
            int pofIndex = 0;
            //index 0 contains an int that says how many more POF indexes to expect
            pofWriter.writeInt( pofIndex++, cache.size() );
            //index N contains a NamedCacheEntry (a simple key/value wrapper)
            for ( Object o : cache.entrySet() )
            {
                NamedCacheEntry nce = new NamedCacheEntry( (Entry) o );
                pofWriter.writeObject( pofIndex++, nce );
            }
        }
        finally
        {
            if ( wrappedBufferOutput != null )
            {
                wrappedBufferOutput.close();
            }
        }
    }
    Anybody see what I'm doing wrong or have any idea what the cause of the exception above is?
    Thanks in advance
    Edited by: dm197 on Feb 20, 2010 3:07 PM
    Edited by: dm197 on Feb 21, 2010 9:49 AM
    Edited by: dm197 on Feb 21, 2010 9:51 AM

    Hi DM
    Can't you just use the ConfigurablePofContext you already have to serialize the entries to the WrapperBufferOutput?
    public static void exportCacheNamedTo(String cacheName, File file)
    throws Exception
    {
        WrapperBufferOutput wrappedBufferOutput = null;
        try
        {
            NamedCache cache = CacheFactory.getCache( cacheName );
            FileOutputStream fileOutputStream = new FileOutputStream( file );
            BufferedOutputStream bufferedOutputStream =
                new BufferedOutputStream( fileOutputStream, 1024 * 1024 );
            DataOutputStream dataOutputStream = new DataOutputStream( bufferedOutputStream );
            wrappedBufferOutput = new WrapperBufferOutput( dataOutputStream );
            ConfigurablePofContext pofContext =
                (ConfigurablePofContext) cache.getCacheService().getSerializer();
            for ( Object o : cache.entrySet() )
            {
                NamedCacheEntry nce = new NamedCacheEntry( (Map.Entry) o );
                pofContext.serialize(wrappedBufferOutput, nce);
            }
        }
        finally
        {
            if ( wrappedBufferOutput != null )
                wrappedBufferOutput.close();
        }
    }
    And then read them back like this:
    public static void importCacheNamedTo(String cacheName, File file)
    throws Exception
    {
        WrapperBufferInput wrappedBufferInput = null;
        try
        {
            NamedCache cache = CacheFactory.getCache( cacheName );
            FileInputStream fileInputStream = new FileInputStream( file );
            BufferedInputStream bufferedInputStream =
                new BufferedInputStream( fileInputStream, 1024 * 1024 );
            DataInputStream dataInputStream = new DataInputStream( bufferedInputStream );
            wrappedBufferInput = new WrapperBufferInput( dataInputStream );
            ConfigurablePofContext pofContext =
                (ConfigurablePofContext) cache.getCacheService().getSerializer();
            while (wrappedBufferInput.available() > 0)
            {
                NamedCacheEntry nce = (NamedCacheEntry) pofContext.deserialize(wrappedBufferInput);
                // Add the entry back into the cache
                cache.put(nce.????, nce.????);
            }
        }
        finally
        {
            if ( wrappedBufferInput != null )
                wrappedBufferInput.close();
        }
    }
    I have not tested this so I cannot guarantee it works, especially the wrappedBufferInput.available() > 0 condition in the while loop. I am not sure if that is the best way to check for the end of the stream.
    JK

  • Coherence+JPA query issues

    Hello everyone,
    I've setup Coherence 3.6 to use JPA according to the generic instructions found here [http://coherence.oracle.com/display/COH35UG/Configuring+Coherence+for+JPA]
    This works fine and I can actually get data out of the DB-backed cache. What I've found, however, is that CohQL queries do not return anything unless the entry is already in Coherence. I'm using this piece of code to test querying Coherence:
    public <T> List<T> getAll(Class<T> type) {
            final List<T> data = new ArrayList<T>();
            Filter filter = QueryHelper.createFilter("value().class=?1", new Object[] { type });
            Set<Map.Entry<?, T>> filteredSet = cache.entrySet(filter);
            for (Map.Entry<?, T> t : filteredSet) {
                data.add(t.getValue());
            }
            return Collections.unmodifiableList(data);
    }
    Should I be expecting Coherence to query the DB at this point, or is CohQL confined to whatever's already in the cache?
    Thanks
    Edited by: 876420 on 01-Aug-2011 07:07

    Section 20 of the dev guide:
    "It should be noted that queries apply only to currently cached data (and will not use the CacheLoader interface to retrieve additional data that may satisfy the query). Thus, the data set should be loaded entirely into cache before queries are performed. In cases where the data set is too large to fit into available memory, it may be possible to restrict the cache contents along a specific dimension (for example, "date") and manually switch between cache queries and database queries based on the structure of the query. For maintainability, this is usually best implemented inside a cache-aware data access object (DAO)."
    Cheers,
    Steve
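    In other words, the working set has to be pushed into the cache before the filter is evaluated. A minimal sketch of the cache-aware DAO idea, assuming a hypothetical bulk loader supplies the rows to pre-load:

    import java.util.Map;
    import java.util.Set;
    import com.tangosol.net.NamedCache;
    import com.tangosol.util.QueryHelper;

    public class CacheAwareDao {
        private final NamedCache cache;

        public CacheAwareDao(NamedCache cache) {
            this.cache = cache;
        }

        // warm the cache first; queries only see entries that are already cached
        public void preload(Map allRowsFromDatabase) {   // hypothetical bulk-load input
            cache.putAll(allRowsFromDatabase);
        }

        // same CohQL-style filter as above; it evaluates against cached entries only
        public Set query(Class type) {
            return cache.entrySet(
                QueryHelper.createFilter("value().class=?1", new Object[] { type }));
        }
    }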

  • Does Tangosol provide a setting to restrict the response data size of a query?

    We are using Tangosol caching in our application; we have around 10-15 GB of data.
    My cache cluster consists of 7 nodes (7 Linux boxes, each box running 10 JVMs).
    Some client applications connect to our cache as remote clients and query data (using filters).
    In some cases the result of a query is very large, and this creates trouble on the cache server: in the Coherence log we can see frequent GCs, communication delays reported, removal of some members from the cache cluster, etc.
    I know the client can use a LimitFilter and fetch data in batches instead of fetching all the data in one go, but I do not have control over the client applications; I want to restrict the data size from the server end.
    So what I want is to restrict the size of the response data.
    I am wondering if Tangosol provides a setting (maybe in the cache configuration file) to restrict the response data size (like response data < 1 GB, or response data set < 10000 records).

    Hi user 644246,
    In Coherence 3.4 (which is currently in pre-release - http://www.oracle.com/technology/products/coherence/index.html) we introduced a new PartitionedFilter that allows you, for example, to do the following:
    Run a parallel query partition-by-partition:
    void executeByPartitions(NamedCache cache, Filter filter)
       {
       DistributedCacheService service =
           (DistributedCacheService) cache.getCacheService();
       int cPartitions = service.getPartitionCount();
       PartitionSet parts = new PartitionSet(cPartitions);
       for (int iPartition = 0; iPartition < cPartitions; iPartition++)
           {
           parts.add(iPartition);
           Filter filterPart = new PartitionedFilter(filter, parts);
           Set setEntriesPart = cache.entrySet(filterPart);
           // process the entries ...
           parts.remove(iPartition);
           }
       }
    Run a parallel query member-by-member:
    void executeByMembers(NamedCache cache, Filter filter)
       {
       DistributedCacheService service =
           (DistributedCacheService) cache.getCacheService();
       int cPartitions = service.getPartitionCount();
       PartitionSet partsProcessed = new PartitionSet(cPartitions);
       for (Iterator iter = service.getStorageEnabledMembers().iterator();
               iter.hasNext();)
           {
           Member member = (Member) iter.next();
           PartitionSet partsMember = service.getPartitionSet(member);
           // due to a redistribution some partitions may have already been processed
           partsMember.remove(partsProcessed);
           Filter filterPart = new PartitionedFilter(filter, partsMember);
           Set setEntriesPart = cache.entrySet(filterPart);
           // process the entries ...
           partsProcessed.add(partsMember);
           }
       // due to a possible redistribution, some partitions may have been skipped
       if (!partsProcessed.isFull())
           {
           partsProcessed.invert();
           Filter filterRemaining = new PartitionedFilter(filter, partsProcessed);
           // process the remaining entries ...
           }
       }
    Regards,
    Gene

  • How to query involving Multi-Value Attributes objects

    Hello.
         I have one question regarding Coherence. We are looking for the best and fastest way to query a multi-value attribute of objects. The Coherence User Guide has an example that shows how this can be done with java.lang.String objects:
         Set searchTerms = new HashSet();
         searchTerms.add("java");
         searchTerms.add("clustering");
         searchTerms.add("books");
         // The cache contains instances of a class "Document" which has a method
         // "getWords" which returns a Collection<String> containing the set of
         // words that appear in the document.
         Filter filter = new ContainsAllFilter("getWords", searchTerms);
         Set entrySet = cache.entrySet(filter);
         // iterate through the search results
         But I would like to know how this can be done with some other object type. For example, I could have an object MyCoherenceObject with an attribute HashSet<Identifier> idHashSet representing a Person object that has a composite key comprised of name and lastname. The Identifier object would contain two attributes of type String: name and value.
         So basically I could have two identifiers in the set:
         public class MyCoherenceObject {
              public HashSet idHashSet = new HashSet();
              public MyCoherenceObject() {
                   Identifier id1 = new Identifier("name", "John");
                   Identifier id2 = new Identifier("surname", "Smith");
                   idHashSet.add(id1);
                   idHashSet.add(id2);
              }
              public HashSet getIdentifiers() {
                   return idHashSet;
              }
         }
         This object would later be inserted into the Coherence cache. When a query over the cache is performed, I would like all objects where the multi-value parameter with name="name" equals "John" and the parameter with name="lastname" equals "Smith". My code would look something like this:
         Set searchTerms = new HashSet();
         searchTerms.add("John");
         searchTerms.add("Smith");
         // The cache contains instances of a class "Document" which has a method
         // "getWords" which returns a Collection<String> containing the set of
         // words that appear in the document.
         Filter filter = new ContainsAllFilter("getIdentifiers", searchTerms);
         Set entrySet = cache.entrySet(filter);
         // iterate through the search results
         How can this be done? Basically I don't know how to search in a multi-value attribute if each value is an arbitrary object, nor how to tell Coherence which getter function must be used for the comparison. In my case getValue() must be used and not getName().
         Second problem:
         Coherence must not return a Person object with name="Smith" and lastname="John". The filter above would be satisfied by it, but the problem is that I am looking for a person with lastname "Smith" and name "John".
         Domain description:
         I will have different objects of the same type in the Coherence cache with multi-value attribute lists of different lengths. For example, some objects will have only one identifier object in the list (e.g. Phone: "phoneNumber"), some two (e.g. Person: "name", "lastname") and other objects will have maybe three identifier objects (e.g. City: "country", "area", "state").
         If there is a faster way to do this in Coherence (I saw examples with getters that take attributes, for example), please give me some directions.
         Thank you very much for your help.

    When filtering based on the getIdentifiers, you should add Identifier instances into the searchTerms collection, and it will satisfy all your expectations. If you add simple Strings into that searchTerms collection, then none of your queries will return anything, because the String will never equal() the Identifier instances.
         You can also add an index on the getIdentifiers() method. However, the Identifier class should properly implement the equals() and hashCode() methods, and in order to be able to use an ordered index (you don't need that for the current requirements you listed), it should also implement Comparable.
         BR,
         Robert
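         To illustrate the suggestion, a minimal sketch (assuming Identifier implements equals() and hashCode() as noted above; the cache name "people" is just an example):

         import java.util.HashSet;
         import java.util.Set;
         import com.tangosol.net.CacheFactory;
         import com.tangosol.net.NamedCache;
         import com.tangosol.util.Filter;
         import com.tangosol.util.extractor.ReflectionExtractor;
         import com.tangosol.util.filter.ContainsAllFilter;

         static void findJohnSmith() {
             NamedCache cache = CacheFactory.getCache("people");

             // optional: index the multi-value attribute (an unordered index is enough here)
             cache.addIndex(new ReflectionExtractor("getIdentifiers"), false, null);

             // the search terms are Identifier instances, not Strings, so equals() on
             // Identifier decides the match - both name and value are compared, which also
             // rules out the swapped name="Smith"/surname="John" case
             Set searchTerms = new HashSet();
             searchTerms.add(new Identifier("name", "John"));
             searchTerms.add(new Identifier("surname", "Smith"));

             Filter filter = new ContainsAllFilter("getIdentifiers", searchTerms);
             Set entrySet = cache.entrySet(filter);
         }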

  • ClassCastException when querying an entry set w/ Comparator

    Hi,
    I have come across a problem trying to retrieve a sorted entry set from a distributed cache. I have a cache containing objects of a single class (OrderHistory) and am using the NamedCache.entrySet(filter, comparator) method. The results are fine when only a filter is used, and also when 'null' is passed as the comparator (correctly sorting OrderHistories in natural order). However, a ClassCastException occurs when I specify my own OrderSorter as a comparator. This occurs even when the OrderSorter class is stripped of all mention of the OrderHistory class and the compare method is reduced to:
         public int compare(Object o1, Object o2) {
              return 1;
         }
    OrderSorter implements both Comparator and ExternalizableLite. It appears that Tangosol is attempting to perform the cast internally. The stack trace is as follows:
    Failed request execution for DistributedCache service on Member(Id=1, Timestamp=2007-05-14 15:19:34.9, Address=10.142.194.36:8088, MachineId=17956))
    java.lang.ClassCastException: com.[...].OrderHistory
         at com.tangosol.util.comparator.EntryComparator.compare(EntryComparator.java:103)
         at com.tangosol.util.comparator.SafeComparator.compareEntries(SafeComparator.java:106)
         at com.tangosol.util.comparator.EntryComparator.compare(EntryComparator.java:114)
         at java.util.Arrays.mergeSort(Arrays.java:1284)
         at java.util.Arrays.sort(Arrays.java:1223)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.DistributedCache$Storage.extractBinaryEntries(DistributedCache.CDB:9)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.DistributedCache$Storage.query(DistributedCache.CDB:209)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.DistributedCache.onQueryRequest(DistributedCache.CDB:23)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.DistributedCache$QueryRequest.run(DistributedCache.CDB:1)
         at com.tangosol.coherence.component.net.message.requestMessage.DistributedCacheRequest.onReceived(DistributedCacheRequest.CDB:12)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.onMessage(Service.CDB:9)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.Service.onNotify(Service.CDB:122)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.DistributedCache.onNotify(DistributedCache.CDB:3)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:35)
         at java.lang.Thread.run(Thread.java:595)
    Any ideas?
    Thanks in advance,
    James

    Err, seems that bug is not fixed yet, at least version 3.5/459 still has it. Too bad.
    Here is my test case & stack trace:
    log("building cache instance");
    NamedCache cache = CacheFactory.getCache("mycache");
    log("put some objects into the cache");
    cache.put("#TEST#10#001","#1111");
    cache.put("#TEST#10#012","#2222");
    cache.put("#TEST#10#003","#3333");
    cache.put("#TEST#20#1","#4444");
    log("building filter object");
    Filter filter = new LikeFilter(new KeyExtractor(), "#TEST#10%", '\\', true);
    testComparator comparator = new testComparator();
    for (Iterator it = cache.entrySet(filter, new EntryComparator(comparator, EntryComparator.CMP_KEY)).iterator(); it.hasNext();) {
        Map.Entry curr_entry = (Map.Entry) it.next();
        System.out.println("curr_key=" + curr_entry.getKey().toString() +
            " curr_value=" + curr_entry.getValue().toString());
    }
    Exception in thread "main" java.lang.ClassCastException: java.lang.String
         at com.tangosol.util.comparator.EntryComparator.compare(EntryComparator.java:99)
         at com.tangosol.util.comparator.SafeComparator.compareSafe(SafeComparator.java:166)
         at com.tangosol.util.comparator.SafeComparator.compare(SafeComparator.java:84)
         at com.tangosol.util.comparator.EntryComparator.compare(EntryComparator.java:115)
         at java.util.Arrays.mergeSort(Arrays.java:1284)
         at java.util.Arrays.sort(Arrays.java:1223)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.DistributedCache$ViewMap.entrySet(DistributedCache.CDB:105)
         at com.tangosol.coherence.component.util.SafeNamedCache.entrySet(SafeNamedCache.CDB:1)
