PIF-POF Standard?

Is PIF-POF an open standard? I.e. does it have a documented wire format? Or is it closed/proprietary?
     We are currently looking at the hardware capability to set up any-to-any (e.g. XML to arbitrary binary) data format transformation on the wire. We are planning to expose XML-based/SOAP-based web services on the outside and use the hardware capability to transform the web service call into a binary-format service call on the inside, so our Java processes are spared the necessity of parsing XML.
     We are evaluating the various documented binary data formats for object/invocation representation and PIF-POF looks appealing due to its built-in class evolution support.
     If we had documentation as to how the PIF-POF message should look on the wire, we could transform our external XML calls directly into Coherence*Extend-style internal binary calls.
     Do you think this is something that can be accomplished or is it a little far-fetched for now?
     Thank you,
     Denis.

     > Is PIF-POF an open standard? I.e. does it have a
     > documented wire format? Or is it closed/proprietary?
     It has a documented wire format and a documented API (Java, C# and soon C++), but neither is "open" in the sense of open standards (such as HTTP).
     > We are currently looking at the hardware capability
     > to set up any-to-any (e.g. XML to arbitrary binary)
     > data format transformation on the wire. We are
     > planning to expose XML-based/SOAP-based web services
     > on the outside and use the hardware capability to
     > transform the web service call into a binary-format
     > service call on the inside, so our Java processes are
     > spared the necessity of parsing XML.
     >
     > We are evaluating the various documented binary data
     > formats for object/invocation representation and
     > PIF-POF looks appealing due to its built-in class
     > evolution support.
     >
     > If we had documentation as to how should the PIF-POF
     > message look on the wire, we could transform our
     > external XML calls directly into
     > Coherence*Extend-style internal binary calls.
     >
     > Do you think this is something that can be
     > accomplished or is it a little far-fetched for now?
     It is something that could be accomplished. Note however that PIF/POF is at its best when it is being bound to something, e.g. a Java class. Otherwise, you would want to evaluate the various "binary XML" standards.
     To learn more, ask your account manager for a copy of the PIF/POF specification.
     Peace,
     Cameron Purdy
     Tangosol Coherence: The Java Data Grid
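     For readers who want a concrete picture of what "binding PIF/POF to a Java class" means in practice, here is a minimal sketch; the Trade class and its field indexes below are illustrative assumptions, not part of the thread:

     import java.io.IOException;
     import com.tangosol.io.pof.PofReader;
     import com.tangosol.io.pof.PofWriter;
     import com.tangosol.io.pof.PortableObject;

     // Hypothetical value object bound to POF by implementing PortableObject.
     public class Trade implements PortableObject {
          private String symbol;
          private int quantity;

          public Trade() {
               // POF deserialization requires an accessible no-arg constructor
          }

          @Override
          public void readExternal(PofReader in) throws IOException {
               // property indexes must match writeExternal and stay stable across versions
               symbol = in.readString(0);
               quantity = in.readInt(1);
          }

          @Override
          public void writeExternal(PofWriter out) throws IOException {
               out.writeString(0, symbol);
               out.writeInt(1, quantity);
          }
     }

     The class would then be mapped to a numeric user type id in a POF configuration file, which is how the same logical type can be shared across the Java, C# and C++ bindings mentioned above.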

Similar Messages

  • Needing JARs on storage nodes

    Hi Guys,
    With the advent of PIF/POF, is there still a need for the classes you intend to process with entry processors to exist on the storage nodes?
    I read one message somewhere (someone was getting a classpath error) that implied that despite using 3.2/376 he/she still needed the classes on the classpath.
    Kind Regards,
    Max

    Hi Robert,
    PIF/POF does not necessarily help you anywhere with getting rid of any classes... yet ;-)
    > - it is not currently used at serialization/deserialization, yet
    Correct. This is a work in progress, and this specific part of it will be in the next release.
    > according to the Changelog and the forums, it is only used by Coherence*Extend
    Correct. It will be used more widely over time.
    > - XMLBeans are not PIF/POF-based yet
    Correct. This too will change.
    > - PIF/POF still refers to class names, so it is currently something like a more versatile
    > ExternalizableLite, but they still need the referred classes and all their runtime
    > dependency to exist in the storage-enabled node classloaders
    PIF/POF itself does not refer to class names, but a platform-specific binding does. In Java, we encode the binding in an XML file, and use it as the basis to configure a POF context.
    > You will need the following things in a
    > storage-enabled node:
    > - all Invocable/EntryProcessor/EntryAggregator
    > classes
    > - all cache-store/cache-loader classes
    > - all the runtime dependency of the previous
    > - all other classes referred in the cache
    > configuration files (custom eviction policies,
    > KeyAssociator implementors, etc.)
    This is true at the current time (release 3.2). Some of it will always be true. As the tools advance, Coherence will do more for you, yet (as always) existing applications will continue to work without changes.
    For example, it will be possible to define classes of objects (what we call a schema) for Coherence to manage without actually writing the code for them, and Coherence will provide the libraries (e.g. documentation, source code, binaries, etc.) necessary for expressing those classes in Java, .NET, etc. That means that Coherence will do more work for you automatically and dynamically in the future, but in the mean time you still have to spell some things out.
    > If you use indexes, or queries, you will also need
    > all the classes of entries stored in the caches (an
    > index uses it when the data is put in the backing
    > map; querying uses it, if an index does not exist, or
    > does not fully resolve a query).
    See above ;-)
    Peace,
    Cameron Purdy
    Tangosol Coherence: The Java Data Grid
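    To make the "an XML binding drives a POF context" point above concrete, here is a small sketch, assuming a config file named my-pof-config.xml that maps the hypothetical Trade class from the earlier sketch to a user type id; none of these names come from the thread:

    import com.tangosol.io.pof.ConfigurablePofContext;
    import com.tangosol.util.Binary;
    import com.tangosol.util.ExternalizableHelper;

    public class PofContextExample {
        public static void main(String[] args) {
            // build a POF context from the XML binding file
            ConfigurablePofContext ctx = new ConfigurablePofContext("my-pof-config.xml");

            Trade trade = new Trade();
            // serialize and deserialize through the binding; the Trade class itself
            // only knows its property indexes, not its POF type id
            Binary bin = ExternalizableHelper.toBinary(trade, ctx);
            Trade copy = (Trade) ExternalizableHelper.fromBinary(bin, ctx);
            System.out.println(copy);
        }
    }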

  • Performance comparisons between POF & open source serialization mechanism?

    I'm curious whether anyone has done any comparisons of performance and serialized object sizes between POF and open source mechanisms such as Google Protocol Buffers and Thrift, both of which seem to be becoming quite popular. Personally, I dislike having to write a separate schema and then generate classes from it, which Protocol Buffers and Thrift require you to do, and I vastly prefer POF's mechanism of keeping everything in the code (although I wish the POF annotation framework was officially supported). But aside from that, I'd prefer to use Coherence for many of the purposes that some of my co-workers are currently using other solutions for, and this would be useful information to have in making the case.
    FWIW, I hope someone at Oracle is seriously considering open-sourcing POF. I don't think that anyone who would've bought a Coherence license would decide not to because they could get POF for free. They'd just go and use something else, like the aforementioned Protocol Buffers and Thrift. Not only are many companies adopting these as standards, but as has been mentioned in other threads on this forum, that's exactly what even some Coherence users are doing:
    Re: POF compatibility across Coherence versions
    I really wish I could encourage developers that I work with to give POF a look as an alternative to those two (both of which we're currently using), regardless of whether or not they plan on using Coherence in the immediate future. As things stand right now, I can't use Coherence for code that needs to be shared with people in other groups who haven't adopted Coherence yet. But if I could use POF outside of Coherence, it would probably be acceptable to those folks as a generic serialization mechanism, and it would make migrating such code to Coherence at some point down the road that much easier. If, on the other hand, I have to write that code around, say, Protocol Buffers, then it becomes much harder to later justify creating and maintaining POF as a second serialization mechanism for the same set of objects, which means it's much harder to justify using Coherence for those objects.
    In short, making POF usable outside of Coherence, and who knows, maybe even getting it supported in popular open source projects such as Cassandra (which, as I understand it, uses Thrift) would make it easier to adopt Coherence in environments where objects are already persisted in other systems.
    That's my two cents.

    Hi,
    Thank you for the links. It is very interesting.
    I have implemented a POF serialization plugin for this benchmark: http://wiki.github.com/eishay/jvm-serializers/
    You can get the code, run the benchmark for yourself and compare the results.
    Handmade POF serialization http://gridkit.googlecode.com/svn/wiki/snippets/CoherencePofSerializer.java
    Reflection POF serialization http://gridkit.googlecode.com/svn/wiki/snippets/CoherencePofReflection.java
    Also, you should add two lines to BenchmarkRunner.java; all other instructions are on the jvm-serializers project page.
              Protobuf.register(groups);
              Thrift.register(groups);
              ActiveMQProtobuf.register(groups);
              Protostuff.register(groups);
              Kryo.register(groups);
              AvroSpecific.register(groups);
              AvroGeneric.register(groups);
    // register POF tests here
              CoherencePofSerializer.register(groups);
              CoherencePofReflection.register(groups);
              CksBinary.register(groups);
              Hessian.register(groups);
              JavaBuiltIn.register(groups);
              JavaManual.register(groups);
              Scala.register(groups);
    A few comments on the results:
    * A micro benchmark is a micro benchmark; I saw quite different results when comparing Java vs POF vs POF reflection on my own domain objects.
    * POF scores very well compared to protocols like Protobuf or Thrift, especially on deserialization.
    * The Kryo project is quite interesting; I'm going to give it a try in a future project for sure.
    Again, thanks a lot for the link.
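    For readers who do not want to follow the links, here is a rough sketch of what a hand-written ("handmade") PofSerializer generally looks like; the Media class and property indexes are assumptions for illustration, and the real benchmark serializers live at the gridkit URLs above:

    import java.io.IOException;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofSerializer;
    import com.tangosol.io.pof.PofWriter;

    // "Handmade" POF serialization keeps the domain class free of Coherence imports
    // and puts all of the wire-format knowledge into an external PofSerializer.
    public class MediaPofSerializer implements PofSerializer {

        // hypothetical domain object, standing in for the jvm-serializers media classes
        public static class Media {
            public String uri;
            public int bitrate;
        }

        @Override
        public void serialize(PofWriter out, Object o) throws IOException {
            Media media = (Media) o;
            out.writeString(0, media.uri);
            out.writeInt(1, media.bitrate);
            out.writeRemainder(null);   // terminates the user type, as the POF contract requires
        }

        @Override
        public Object deserialize(PofReader in) throws IOException {
            Media media = new Media();
            media.uri = in.readString(0);
            media.bitrate = in.readInt(1);
            in.readRemainder();         // consume any properties written by a newer version of the type
            return media;
        }
    }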

  • Getting "invalid type: 169" errors when using POF with Push Replication

    I'm trying to get Push Replication - latest version - running on Coherence 3.6.1. I can get it working fine if I don't use POF with my objects, but when trying to use POF format for my objects I get this:
    2011-02-11 13:06:00.993/2.297 Oracle Coherence GE 3.6.1.1 <D5> (thread=Invocation:Management, member=1): Service Management joined the cluster with senior service member 1
    2011-02-11 13:06:01.149/2.453 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded POF configuration from "file:/C:/wsgpc/GlobalPositionsCache/resource/coherence/pof-config.xml"
    2011-02-11 13:06:01.149/2.453 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6/coherence/lib/coherence.jar!/coherence-pof-config.xml"
    2011-02-11 13:06:01.149/2.453 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-common-1.7.3.20019.jar!/coherence-common-pof-config.xml"
    2011-02-11 13:06:01.165/2.469 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-messagingpattern-2.7.4.21016.jar!/coherence-messagingpattern-pof-config.xml"
    2011-02-11 13:06:01.165/2.469 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-pushreplicationpattern-3.0.3.20019.jar!/coherence-pushreplicationpattern-pof-config.xml"
    2011-02-11 13:06:01.243/2.547 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Service DistributedCacheForSequenceGenerators joined the cluster with senior service member 1
    2011-02-11 13:06:01.258/2.562 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache:DistributedCacheForLiveObjects, member=1): Service DistributedCacheForLiveObjects joined the cluster with senior service member 1
    2011-02-11 13:06:01.274/2.578 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache:DistributedCacheForSubscriptions, member=1): Service DistributedCacheForSubscriptions joined the cluster with senior service member 1
    2011-02-11 13:06:01.290/2.594 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache:DistributedCacheForMessages, member=1): Service DistributedCacheForMessages joined the cluster with senior service member 1
    2011-02-11 13:06:01.305/2.609 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache:DistributedCacheForDestinations, member=1): Service DistributedCacheForDestinations joined the cluster with senior service member 1
    2011-02-11 13:06:01.305/2.609 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache:DistributedCacheWithPublishingCacheStore, member=1): Service DistributedCacheWithPublishingCacheStore joined the cluster with senior service member 1
    2011-02-11 13:06:01.321/2.625 Oracle Coherence GE 3.6.1.1 <D5> (thread=DistributedCache, member=1): Service DistributedCache joined the cluster with senior service member 1
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 <Info> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): TcpAcceptor now listening for connections on 166.15.224.91:20002
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 <D5> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Started: TcpAcceptor{Name=Proxy:ExtendTcpProxyService:TcpAcceptor, State=(SERVICE_STARTED), ThreadCount=0, Codec=Codec(Format=POF), Serializer=com.tangosol.io.DefaultSerializer, PingInterval=0, PingTimeout=0, RequestTimeout=0, SocketProvider=SystemSocketProvider, LocalAddress=[/166.15.224.91:20002], SocketOptions{LingerTimeout=0, KeepAliveEnabled=true, TcpDelayEnabled=false}, ListenBacklog=0, BufferPoolIn=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited), BufferPoolOut=BufferPool(BufferSize=2KB, BufferType=DIRECT, Capacity=Unlimited)}
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 <D5> (thread=Proxy:ExtendTcpProxyService, member=1): Service ExtendTcpProxyService joined the cluster with senior service member 1
    2011-02-11 13:06:01.461/2.765 Oracle Coherence GE 3.6.1.1 <Info> (thread=main, member=1):
    Services
    ClusterService{Name=Cluster, State=(SERVICE_STARTED, STATE_JOINED), Id=0, Version=3.6, OldestMemberId=1}
    InvocationService{Name=Management, State=(SERVICE_STARTED), Id=1, Version=3.1, OldestMemberId=1}
    PartitionedCache{Name=DistributedCacheForSequenceGenerators, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForLiveObjects, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForSubscriptions, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForMessages, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheForDestinations, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCacheWithPublishingCacheStore, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    PartitionedCache{Name=DistributedCache, State=(SERVICE_STARTED), LocalStorage=enabled, PartitionCount=257, BackupCount=1, AssignedPartitions=257, BackupPartitions=0}
    ProxyService{Name=ExtendTcpProxyService, State=(SERVICE_STARTED), Id=9, Version=3.2, OldestMemberId=1}
    Started DefaultCacheServer...
    2011-02-11 13:08:27.894/149.198 Oracle Coherence GE 3.6.1.1 <Error> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): Failed to publish EntryOperation{siteName=csfb.cs-group.com, clusterName=SPTestCluster, cacheName=source-cache, operation=Insert, publishableEntry=PublishableEntry{key=Binary(length=32, value=0x15A90F00004E07424F4F4B303038014E08494E535430393834024E0345535040), value=Binary(length=147, value=0x1281A30115AA0F0000A90F00004E07424F4F4B303038014E08494E535430393834024E03455350400248ADEEF99607060348858197BF22060448B4D8E9BE02060548A0D2CDC70E060648B0E9A2C4030607488DBCD6E50D060848B18FC1882006094E03303038402B155B014E0524737263244E1F637366622E63732D67726F75702E636F6D2D535054657374436C7573746572), originalValue=Binary(length=0, value=0x)}} to Cache passive-cache because of
    (Wrapped) java.io.StreamCorruptedException: invalid type: 169 Class:com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher
    2011-02-11 13:08:27.894/149.198 Oracle Coherence GE 3.6.1.1 <D5> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=1): An exception occurred while processing a InvocationRequest for Service=Proxy:ExtendTcpProxyService:TcpAcceptor: (Wrapped: Failed to publish a batch with the publisher [Active Publisher] on cache [source-cache]) java.lang.IllegalStateException: Attempted to publish to cache passive-cache
         at com.tangosol.util.Base.ensureRuntimeException(Base.java:293)
         at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:348)
         at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.query(InvocationServiceProxy.CDB:6)
         at com.tangosol.coherence.component.net.extend.messageFactory.InvocationServiceFactory$InvocationRequest.onRun(InvocationServiceFactory.CDB:12)
         at com.tangosol.coherence.component.net.extend.message.Request.run(Request.CDB:4)
         at com.tangosol.coherence.component.net.extend.proxy.serviceProxy.InvocationServiceProxy.onMessage(InvocationServiceProxy.CDB:9)
         at com.tangosol.coherence.component.net.extend.Channel.execute(Channel.CDB:39)
         at com.tangosol.coherence.component.net.extend.Channel.receive(Channel.CDB:26)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:103)
         at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
         at java.lang.Thread.run(Thread.java:662)
    Caused by: java.lang.IllegalStateException: Attempted to publish to cache passive-cache
         at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:163)
         at com.oracle.coherence.patterns.pushreplication.publishers.RemoteClusterPublisher$RemotePublishingAgent.run(RemoteClusterPublisher.java:343)
         ... 9 more
    Caused by: (Wrapped) java.io.StreamCorruptedException: invalid type: 169
         at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:265)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.PartitionedService$ConverterKeyToBinary.convert(PartitionedService.CDB:16)
         at com.tangosol.util.ConverterCollections$ConverterInvocableMap.invoke(ConverterCollections.java:2156)
         at com.tangosol.util.ConverterCollections$ConverterNamedCache.invoke(ConverterCollections.java:2622)
         at com.tangosol.coherence.component.util.daemon.queueProcessor.service.grid.partitionedService.PartitionedCache$ViewMap.invoke(PartitionedCache.CDB:11)
         at com.tangosol.coherence.component.util.SafeNamedCache.invoke(SafeNamedCache.CDB:1)
         at com.oracle.coherence.patterns.pushreplication.publishers.cache.AbstractCachePublisher.publishBatch(AbstractCachePublisher.java:142)
         ... 10 more
    Caused by: java.io.StreamCorruptedException: invalid type: 169
         at com.tangosol.util.ExternalizableHelper.readObjectInternal(ExternalizableHelper.java:2265)
         at com.tangosol.util.ExternalizableHelper.readObject(ExternalizableHelper.java:2253)
         at com.tangosol.io.DefaultSerializer.deserialize(DefaultSerializer.java:74)
         at com.tangosol.util.ExternalizableHelper.deserializeInternal(ExternalizableHelper.java:2703)
         at com.tangosol.util.ExternalizableHelper.fromBinary(ExternalizableHelper.java:261)
         ... 16 more
    It seems to be loading my POF configuration file - which also includes the standard Coherence ones as well as those required for PR - just fine, as you can see at the top of the trace.
    Any ideas why POF format for my objects is giving this error? (NB. I've tested the POF stuff outside of PR and it all works fine.)
    EDIT: I've tried switching the "publisher" to the "file" publisher in PR. And that works fine. I see my POF format cached data extracted and published to the directory I specify. So the "publish" part seems to work when I use a file-publisher.
    Cheers,
    Steve

    Hi Neville,
    I don't pass any POF config parameters on the command-line. My POF file is called "pof-config.xml" so seems to be picked up by default. The trace I showed in my post shows the file being picked up.
    My POF config file content is as follows:
    <pof-config>
         <user-type-list>
              <!-- Standard Coherence POF types -->
              <include>coherence-pof-config.xml</include>
              <!-- Coherence Push Replication Required POF types -->
              <include>coherence-common-pof-config.xml</include>
              <include>coherence-messagingpattern-pof-config.xml</include>
              <include>coherence-pushreplicationpattern-pof-config.xml</include>
              <!-- User POF types (must be above 1000) -->
              <user-type>
                   <type-id>1001</type-id>
                   <class-name>com.csg.gpc.domain.model.position.trading.TradingPositionKey</class-name>
                   <serializer>
                        <class-name>com.csg.gpc.coherence.pof.position.trading.TradingPositionKeySerializer</class-name>
                   </serializer>
              </user-type>
              <user-type>
                   <type-id>1002</type-id>
                   <class-name>com.csg.gpc.domain.model.position.trading.TradingPosition</class-name>
                   <serializer>
                        <class-name>com.csg.gpc.coherence.pof.position.trading.TradingPositionSerializer</class-name>
                   </serializer>
              </user-type>
              <user-type>
                   <type-id>1003</type-id>
                   <class-name>com.csg.gpc.domain.model.position.simple.SimplePosition</class-name>
                   <serializer>
                        <class-name>com.csg.gpc.coherence.pof.position.simple.SimplePositionSerializer</class-name>
                   </serializer>
              </user-type>
              <user-type>
                   <type-id>1004</type-id>
                   <class-name>com.csg.gpc.coherence.processor.TradingPositionUpdateProcessor</class-name>
              </user-type>
         </user-type-list>
    </pof-config>
    EDIT: I'm running both clusters here from within Eclipse. Here's the POF bits from the startup of the receiving cluster:
    2011-02-11 15:05:22.607/2.328 Oracle Coherence GE 3.6.1.1 <D5> (thread=Invocation:Management, member=1): Service Management joined the cluster with senior service member 1
    2011-02-11 15:05:22.779/2.500 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded POF configuration from "file:/C:/wsgpc/GlobalPositionsCache/resource/coherence/pof-config.xml"
    2011-02-11 15:05:22.779/2.500 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6/coherence/lib/coherence.jar!/coherence-pof-config.xml"
    2011-02-11 15:05:22.779/2.500 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-common-1.7.3.20019.jar!/coherence-common-pof-config.xml"
    2011-02-11 15:05:22.779/2.500 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-messagingpattern-2.7.4.21016.jar!/coherence-messagingpattern-pof-config.xml"
    2011-02-11 15:05:22.779/2.500 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-pushreplicationpattern-3.0.3.20019.jar!/coherence-pushreplicationpattern-pof-config.xml"
    And here's the start-up POF bits from the sending cluster:
    2011-02-11 15:07:09.744/2.343 Oracle Coherence GE 3.6.1.1 <D5> (thread=Invocation:Management, member=1): Service Management joined the cluster with senior service member 1
    2011-02-11 15:07:09.916/2.515 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded POF configuration from "file:/C:/wsgpc/GlobalPositionsCache/resource/coherence/pof-config.xml"
    2011-02-11 15:07:09.916/2.515 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6/coherence/lib/coherence.jar!/coherence-pof-config.xml"
    2011-02-11 15:07:09.916/2.515 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-common-1.7.3.20019.jar!/coherence-common-pof-config.xml"
    2011-02-11 15:07:09.916/2.515 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-messagingpattern-2.7.4.21016.jar!/coherence-messagingpattern-pof-config.xml"
    2011-02-11 15:07:09.916/2.515 Oracle Coherence GE 3.6.1.1 <Info> (thread=DistributedCache:DistributedCacheForSequenceGenerators, member=1): Loaded included POF configuration from "jar:file:/C:/coherence3.6-pushreplication/coherence-3.6-pushreplicationpattern-3.0.3.20019.jar!/coherence-pushreplicationpattern-pof-config.xml"
    They both seem to be reading my pof-config.xml file.
    I have the following in my sending cluster cache config:
              <sync:provider pof-enabled="true">
                   <sync:coherence-provider />
              </sync:provider>
    And this in the receiving cache config:
     <introduce:config file="coherence-pushreplicationpattern-pof-cache-config.xml" />
    Cheers,
    Steve
    Edited by: stevephe on 11-Feb-2011 07:05

  • Oracle Coherence - POF (Portable Object Format)

    Hi,
    I am new to Oracle Coherence. I read that POF is language independent. I am trying to create a portable object in Java and access that object in .NET. So I created a portable object by implementing the PortableObject interface in Java. My portable object is as follows:
    Pof.java:
    package com;
    import java.io.IOException;
    import java.io.Serializable;
    import javax.persistence.*;
    import oracle.eclipselink.coherence.integrated.config.CoherenceReadWriteCustomizer;
    import org.eclipse.persistence.annotations.Customizer;
    import com.tangosol.io.pof.PofReader;
    import com.tangosol.io.pof.PofWriter;
    import com.tangosol.io.pof.PortableObject;
     /**
      * The persistent class for the POF database table.
      */
     @Entity
     @Customizer(CoherenceReadWriteCustomizer.class)
     public class Pof implements Serializable, PortableObject {
          private static final long serialVersionUID = 1L;

          @Id
          private long id;
          private String city;
          private String name;
          @Column(name="\"STATE\"")
          private String state;
          private String street;
          private String zip;

          public Pof() {
          }

          public Pof(long id, String city, String name, String state, String street, String zip) {
               this.id = id;
               this.city = city;
               this.name = name;
               this.state = state;
               this.street = street;
               this.zip = zip;
          }

          public long getId() {
               return this.id;
          }

          public void setId(long id) {
               this.id = id;
          }

          public String getCity() {
               return this.city;
          }

          public void setCity(String city) {
               this.city = city;
          }

          public String getName() {
               return this.name;
          }

          public void setName(String name) {
               this.name = name;
          }

          public String getState() {
               return this.state;
          }

          public void setState(String state) {
               this.state = state;
          }

          public String getStreet() {
               return this.street;
          }

          public void setStreet(String street) {
               this.street = street;
          }

          public String getZip() {
               return this.zip;
          }

          public void setZip(String zip) {
               this.zip = zip;
          }

          @Override
          public void readExternal(PofReader arg0) throws IOException {
               id = arg0.readLong(0);
               city = arg0.readString(1);
               name = arg0.readString(2);
               state = arg0.readString(3);
               street = arg0.readString(4);
               zip = arg0.readString(5);
          }

          @Override
          public void writeExternal(PofWriter arg0) throws IOException {
               arg0.writeLong(0, id);
               arg0.writeString(1, city);
               arg0.writeString(2, name);
               arg0.writeString(3, state);
               arg0.writeString(4, street);
               arg0.writeString(5, zip);
          }

          @Override
          public String toString() {
               return "cityname=" + city;
          }
     }
    coherence-cache-config.xml
     <?xml version="1.0"?>
     <!--<!DOCTYPE cache-config SYSTEM "cache-config.dtd">-->
     <cache-config>
        <caching-scheme-mapping>
           <cache-mapping>
              <cache-name>Pof</cache-name>
              <scheme-name>disributedschema</scheme-name>
           </cache-mapping>
        </caching-scheme-mapping>
        <caching-schemes>
           <distributed-scheme>
              <scheme-name>disributedschema</scheme-name>
              <service-name>distributedschemaservice</service-name>
              <serializer>
                 <class-name>com.tangosol.io.pof.ConfigurablePofContext</class-name>
                 <init-params>
                    <init-param>
                       <param-value system-property="pof.config">file:///C:/Users/Praveen/workspace/newappworkspace/POF/src/pof-config.xml</param-value>
                       <param-type>String</param-type>
                    </init-param>
                 </init-params>
              </serializer>
              <backing-map-scheme>
                 <read-write-backing-map-scheme>
                    <internal-cache-scheme>
                       <local-scheme/>
                    </internal-cache-scheme>
                    <cachestore-scheme>
                       <class-scheme>
                          <class-name>oracle.eclipselink.coherence.integrated.EclipseLinkJPACacheStore</class-name>
                          <init-params>
                             <init-param>
                                <param-type>java.lang.String</param-type>
                                <param-value>{cache-name}</param-value>
                             </init-param>
                             <init-param>
                                <param-type>java.lang.String</param-type>
                                <param-value>POF</param-value>
                             </init-param>
                          </init-params>
                       </class-scheme>
                    </cachestore-scheme>
                 </read-write-backing-map-scheme>
              </backing-map-scheme>
              <autostart>true</autostart>
           </distributed-scheme>
        </caching-schemes>
     </cache-config>
    pof-config.xml
    <?xml version="1.0"?>
    <!DOCTYPE pof-config SYSTEM "pof-config.dtd">
    <pof-config>
         <user-type-list>
         <!-- include all "standard" Coherence POF user types -->
         <include>coherence-pof-config.xml</include>
         <user-type>
              <type-id>1001</type-id>
              <class-name>com.Pof</class-name>
         </user-type>
         </user-type-list>
         <allow-interfaces>true</allow-interfaces>
          <allow-subclasses>true</allow-subclasses>
    </pof-config>
     Now the data is being inserted into the database and the cache. Now, how do I access this portable object in .NET?
    Any suggestions would be great.
    Regards,
    Praveen

    Hopefully, Coherence Developer's Guide 3.6.1 "Using POF" will help:
    http://download.oracle.com/docs/cd/E15357_01/coh.360/e15723/api_pof.htm#BABEJCFF
    /Mark J
    Oracle Coherence
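     As a small addition (not from the thread): on the Java side, once the cache and POF configuration above are in place, storing and reading the object is a normal NamedCache call; the .NET side would typically implement the corresponding IPortableObject with the same type id (1001) registered in a .NET POF configuration. The key and sample field values below are arbitrary examples:

     import com.Pof;
     import com.tangosol.net.CacheFactory;
     import com.tangosol.net.NamedCache;

     public class PofCacheExample {
          public static void main(String[] args) {
               // "Pof" is mapped to the distributed scheme in coherence-cache-config.xml above
               NamedCache cache = CacheFactory.getCache("Pof");

               // sample values are placeholders, not from the thread
               Pof pof = new Pof(1L, "SomeCity", "SomeName", "SomeState", "SomeStreet", "00000");
               cache.put(Long.valueOf(pof.getId()), pof);   // serialized via the ConfigurablePofContext

               Pof copy = (Pof) cache.get(Long.valueOf(1L));
               System.out.println(copy);

               CacheFactory.shutdown();
          }
     }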

  • Populating our log message along with standard sap log in ck11n.

    Hi all,
    I have developed a user exit which is used in the costing of a material using CK11N.
    Here I have to show our custom log message along with the standard log shown by the standard SAP system after the costing run is complete.
    I found one FM, CM_F_MESSAGE, which is used by SAP. But I want our message to appear along with the SAP messages and not separately.
    Can you help me out with this? It's very urgent.
    Thanks in advance.

    Hi
    I'm not sure because I don't know that transaction, but it seems the function group of that function manages a log, so you can try.
    This is an extract of the ABAP code of SAPLCKDI where that FM is used:
    CALL FUNCTION 'CM_F_MESSAGE'
       EXPORTING
         ARBGB = Y_CMF-CK
         MSGNR = '327'
         MSGTY = Y_CMF-W
         MSGV1 = SICHT
         MSGV2 = KLVAR.
    So I suppose you should call it this way:
    CALL FUNCTION 'CM_F_MESSAGE'
       EXPORTING
         ARBGB = <your message class>
         MSGNR = <message number>
         MSGTY = <message type>
         MSGV1 = <text 1>
         MSGV2 = <text 2>
         MSGV3 = <text 3>
    I think the MSGV* parameters are optional.
    Max

  • Standard report for Open Orders & JIT lines

    Hi,
    I need a standard report that will list out all open purchase orders and all open JIT lines for a supplier for a particular plant.
    Is there any standard report that can pull this data?
    Regards
    Sandeep

    Use the ME2M / ME2N and ME2L reports with selection parameter WE101 and scope of list EINT.

  • FI Invoices Outbound - Any Standard Program ?

    Hi,
    Is there any standard program to send SAP FI open, changed and closed invoices to a file (outbound)?
    How do I track whether an invoice is new, changed or closed?
    I appreciate any help. Thank you.

    I am away from my system... Just check the BKPF / BSEG tables... You can get the field names there.
    Award points if useful...

  • Standard GR/IR report with the followingfields?

    Hi All,
    Transaction FBL3N gives a list of open GR/IR items at a key date. However, I'm looking for the following fields to appear in the output:
    Purchasing doc#, line item, Vendor number, vendor name, Material, Quantity, Accounting document no., Document date and Amount.
    I've tried adding certain fields using the Layout option but still haven't got all these fields together. Is there any standard SAP report which would give me this output, or would I need to develop a custom report?
    I also tried an ABAP query, but there's no link between an accounting document and a purchase order apart from the "Assignment" field in the BSEG table, which has the purchase order no. and line item combined into a single text field.
    Thanks
    Tejas

    The link between the PO and the accounting document is not a simple link, so you will not find it on the same line in ANY standard report.
    You will either have to write a custom report or use two reports or similar.
    Are you also aware of transaction MR11? It also does not display the data in the format you want, but it is a relevant transaction for open GR/IRs.
    I would be interested to know what you want to use this report for and why you need the accounting document numbers to be displayed like this? It might help us to help you if we know what you are trying to achieve.
    Steve B

  • XML Publisher Report in EBS without Standard Oracle Report

    Hi folks ,
    i have some questions.
    Can I create an XML Publisher report for EBS without a standard Oracle Report in EBS?
    So that I can build up the files with the Desktop Publisher, create the Data Definition / Template with an upload, create the executable and the concurrent program, and then just start the new program in EBS?
    I have the situation that I can start my program with the template in the background, but it is searching for the report on the file system.
    Thanks in advance for the feedback.
    regards
    Kay

    Hi Ravi ,
    Can I do it only with XML Publisher? When I tried it in the past and got all the stuff like the Template / DD / CP up and running and started the CR, I got an error from the system saying that it was missing a report directly on the file system... so it searched for the report itself on the system like a standard 6i report. But the template and the DD are stored in the DB. So my question again: can I use XML Publisher without a standard report, or can I use a dummy file only for checking, after which it uses my template / DD?
    regards
    Kay

  • How to get week of a year in American Standard

    Hi,
    I know it might be a repetitive question, but I did not find a convincing solution to it in any of the previous threads. I am looking for a function (user defined, if someone has already written it) to return the week of the year, in the American standard, i.e., the week should start on Sunday and end on Saturday (and NOT run from Monday to Sunday).
    01/03/2010 - Should be the start of 1st week of 2010...... 01/10/2010 will be the start of second week and so on
    01/04/2009 - Should be the start of 1st week of 2009...... 01/11/2009 will be the start of second week and so on.
    Does any one have a function that takes a date as input and returns back the week of the year in this above format? Any help is greatly appreciated.
    Thanks

    It's a bit trickier because the ISO rule has some fine print.
    Take a look at the example below:
    SQL> with t as (
      2  select to_date('01/02/2010', 'MM/DD/YYYY') dt from dual union all
      3  select to_date('01/03/2010', 'MM/DD/YYYY') dt from dual union all
      4  select to_date('01/03/2009', 'MM/DD/YYYY') dt from dual union all
      5  select to_date('01/04/2009', 'MM/DD/YYYY') from dual)
      6  --
      7  select dt,
      8         to_char(dt+1, 'iw') tweaked_week,
      9         to_char(dt, 'ww') nls_week
    10    from t
    11   order by dt;
    DT          TWEAKED_WEEK NLS_WEEK
    3/1/2009    01           01
    4/1/2009    02           01
    2/1/2010    53           01
    3/1/2010    01           01
    SQL>
    You may notice that 02/01/2010 is week 53 and that, at the same time, 03/01/2009 is already week 1, with 04/01/2009 being week 2.
    The reason is in the docs:
    An ISO week always starts on a Monday and ends on a Sunday.
    * If January 1 falls on a Friday, Saturday, or Sunday, then the ISO week that includes January 1 is the last week of the previous year, because most of the days in the week belong to the previous year.
    * If January 1 falls on a Monday, Tuesday, Wednesday, or Thursday, then the ISO week is the first week of the new year, because most of the days in the week belong to the new year.
    It depends on what your requirement asks.
    If you need to prevent any January 1st that falls on Monday to Thursday from becoming your week 1, you'd probably have to adjust that implementation and fine-tune it further.
    Let us know if that will do or what your rule would be otherwise.
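    As an aside (not part of the thread, and in Java rather than SQL): the same Sunday-to-Saturday rule can be sketched with java.util.Calendar by making week 1 the first full week that starts on a Sunday. This is only an illustration of the rule, not a replacement for the SQL above:

    import java.util.Calendar;
    import java.util.GregorianCalendar;
    import java.util.Locale;

    public class AmericanWeek {

        // returns the week-of-year where weeks run Sunday..Saturday and
        // week 1 begins on the first Sunday of the year
        public static int weekOfYear(int year, int month, int day) {
            Calendar cal = new GregorianCalendar(Locale.US);
            cal.setFirstDayOfWeek(Calendar.SUNDAY);   // weeks start on Sunday
            cal.setMinimalDaysInFirstWeek(7);         // week 1 is the first full Sun..Sat week
            cal.set(year, month - 1, day);            // Calendar months are 0-based
            // Note: days before the first Sunday (e.g. Jan 1-2, 2010) report the last
            // week number of the previous year, an edge case similar in spirit to the
            // ISO fine print quoted above.
            return cal.get(Calendar.WEEK_OF_YEAR);
        }

        public static void main(String[] args) {
            System.out.println(weekOfYear(2010, 1, 3));   // 1  (first Sunday of 2010)
            System.out.println(weekOfYear(2010, 1, 10));  // 2
            System.out.println(weekOfYear(2009, 1, 4));   // 1  (first Sunday of 2009)
        }
    }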

  • Standard def STB repeatedly blacks out for a second or 2

    A site search on "black out" does not turn up a similar issue; of course, it may be that my search skills are lacking.
    New subscriber since Sep 2010. I have 3 boxes, 2 standard QIP2500 phase 3 (basement/MBR) and 1 hi-def (LR). Hi-def TV in the MBR, standard in the other 2. Both standards would on occasion 'black out' for just a second or 2, ever since the original install. No problem with the hi-def. One standard is wired with coax, one with S-Video, the hi-def with HDMI.
    When the blackout occurs, the screen will freeze for an instant, then the screen goes black, then it comes back. The information on the box does not black out, and the TVs don't power off and restart. It's just that the screen goes blank and then comes back. It happens without any seeming warning, and will happen at any time of day. Sometimes it will happen repeatedly, sometimes it won't happen for a long time. Both boxes have been repeatedly reset and rebooted with no improvement.
    During one tech call the tech had me relocate one of the standard boxes to the hi-def location to isolate that it wasn't a location (in-house cable) problem. I thought it curious since it was happening at 2 different locations, but I went through the drill anyway. The relocated standard worked for a few minutes, then went black for a second as the problem does. Plugging the hi-def back in resolved the issue at that location. The tech said he would send 2 new standard boxes. The replacement boxes have not resolved the issue. Any suggestions?

    If it's doing that on two different boxes using both coax and S-Video, then I think that leaves the possibility of the coax. That would be an unusual symptom of a bad coax/splitter, but try this.
    Wait for it to start acting up - when it starts acting up, stay tuned to that station, hit your Menu > Customer Support > In Home Agent > Network Diagnostics and click OK to start.
    Check the video signal using that menu selection. After it's done running it will give you a general message like "your video signals are not in the optimal range". After you see that, hit Info and then look at the SNR dB; it should give you a number.
    32-36 is optimal; below that is questionable.
    If it is below that, then it's probably a problem with the splitter at your house, and you would either replace it yourself or call Verizon to replace it. Be sure to tell them the dB reading.

  • I would like to add another browser so I can get google email standard, how do I do this?

    I am having a problem with my Gmail account: I can only load the HTML version and not the standard version. They suggested I upgrade my browser, which I have done several times, and I have the latest Firefox. But I still cannot load the standard version of the Gmail account. Months ago I could; then it just stopped, and since then I cannot get the standard version. So I thought I would download their browser as a second browser to be used just for Google, and I can't download it. What do I do now?

    You really want to use Firefox, don't you?
    Create a bookmark and you can invoke it by its keyword shortcut:
    : '''name:''' gmail: (standard) Gmail - Inbox (white) Standard
    : '''Location:''' javascript:void(window.open(location.href="https://mail.google.com/mail/?ui=html&zy=s"))
    : '''Keyword:''' gmail:
    : '''Description:''' https://mail.google.com/mail/?ui=html&zy=s
    Your actual problem would be resolved by deleting your Google.com cookies pertaining to gmail. Do not delete all of your cookies.
    * https://support.mozilla.com/kb/Deleting cookies
    As far as downloads go, Firefox now closes the download window immediately when the download completes; you can bring the window back up with "Ctrl+J".
    ''More complete answer for the GMail cookies''
    * Can't open GMail in Firefox but can in I.E. - any suggestions? | Firefox Support Forum | Firefox Help<br>https://support.mozilla.com/en-US/questions/831534

  • Error message with a Serial Number after purchasing CS6 Design Standard

    Hello everyone,
    I have an error message "we are unable to validate this serial number for CS6 Design Standard" when using my official serial number.
    I got the Serial Number from my Adobe ID account and e-mail purchase validation.
    And there is no phone number to contact. Does anyone know what I could do?

    You should contact Customer Care via chat; phone support is only available Mon-Fri during US daytime.

  • Error in printing a standard smartform (LE_SHP_DELNOTE)

    Hi,
    I'm supposed to copy a standard form for the delivery note (LE_SHP_DELNOTE). The problem is, when I try to print preview the standard form it results in an error:
    Exception:       FORMATTING_ERROR
    Message ID:      SSFCOMPOSER
    Message:
    Error in address output (name not filled).
    I tried many printers and different systems and I'm still getting the same result.
    Could anyone help in resolving this issue please?
    Thank you.
    F.A.D

    Check if your data is ok, meaning that you indeed have an address maintained for the ship-to of this delivery.
    Hope it helps,
    Leonardo De Araujo
