Gzip 1.3.3

I just want to suggest an upgrade of gzip from the current 1.2.x to 1.3.3.  There is a bug with large compressed files such that zcat fails with an error like:
/path/file.gz: Value too large for defined data type
(That message is the strerror text for EOVERFLOW, which shows up when a program built without large-file support opens a file over 2 GB.) The current 1.3.3 version fixes this bug. (I have downloaded the source and compiled it, and it works for me, but I don't have the time or know-how to create a new package.)
This problem affects anyone dd'ing hard drives to compressed image files, or anyone else with large compressed files they need to uncompress.

rayzorblayde wrote:
I just want to suggest an upgrade of gzip from the current 1.2.x to 1.3.3. [...]
Just go to the home page, find the gzip package, and click "flag this package out of date"; the maintainer will update it.

Similar Messages

  • What is the difference between C# Gzip and Java GZIPOutputStream?

    Hi All,
    I have a Java Swing tool where I can compress file inputs, and we have a comparable C# tool.
    In Java I am using GZIPOutputStream to compress the stream.
    I found a difference between the C# and Java gzip compression while compressing a file (temp.gif):
    after compressing temp.gif in C#, the compressed file size actually increased,
    while in Java I got about 2% compression of the data.
    Could you please tell me whether I can achieve the same output in Java as in C# using GZIPOutputStream?
    Thanks a lot in advance.

    797957 wrote:
    Does java provide better compression than C#?
    No idea, I don't do C# programming. And your question is most likely really: "does Java default to a higher compression level than C#?"
    Btw, what is faster compression vs. better compression?
    Meaning: does the code spend more time/effort trying to compress the data (slower but better compression) or less time/effort (faster but worse compression)? Most compression algorithms let you control this tradeoff, depending on whether you care more about CPU time or disk/memory space.
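    As a concrete illustration of that tradeoff in Java: GZIPOutputStream does not take a compression level in its constructor, but it inherits a protected Deflater that a small subclass can configure. A minimal sketch (class names are mine, not from this thread; note that already-compressed input such as a GIF will tend to grow slightly at any level):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.Deflater;
    import java.util.zip.GZIPOutputStream;

    public class GzipLevelDemo {

        // GZIPOutputStream with a configurable level, set on the protected
        // Deflater ("def") inherited from DeflaterOutputStream.
        static class LeveledGzipStream extends GZIPOutputStream {
            LeveledGzipStream(OutputStream out, int level) throws IOException {
                super(out);
                def.setLevel(level);
            }
        }

        static int compressedSize(byte[] data, int level) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new LeveledGzipStream(bos, level)) {
                gz.write(data);
            }
            return bos.size();
        }

        public static void main(String[] args) throws IOException {
            byte[] input = "some repetitive payload ".repeat(2000)
                    .getBytes(StandardCharsets.UTF_8);
            // Faster vs. better compression, side by side:
            System.out.println("BEST_SPEED:       " + compressedSize(input, Deflater.BEST_SPEED));
            System.out.println("BEST_COMPRESSION: " + compressedSize(input, Deflater.BEST_COMPRESSION));
        }
    }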

  • Loading a text file from a gzip or zip archive into a String using an applet

    How do I load a text file from a gzip or zip archive into a String (not a byte array) using an applet? Please give both gzip and zip examples.

    This doesn't work as posted; the compile errors (missing braces) and the broken read loop are fixed and flagged in comments below:
          try {
               java.net.URL url = new java.net.URL(getCodeBase() + filename);
               java.io.InputStream inputStream = new java.io.BufferedInputStream(url.openStream());
               if (filename.toLowerCase().endsWith(".txt.gz")) {
                    inputStream = new java.util.zip.GZIPInputStream(inputStream);
               } else if (filename.toLowerCase().endsWith(".zip")) {
                    java.util.zip.ZipInputStream zipInputStream = new java.util.zip.ZipInputStream(inputStream);
                    // advance to the first .txt entry in the archive
                    java.util.zip.ZipEntry zipEntry = zipInputStream.getNextEntry();
                    while (zipEntry != null && (zipEntry.isDirectory() || !zipEntry.getName().toLowerCase().endsWith(".txt"))) {
                         zipEntry = zipInputStream.getNextEntry();
                    }
                    if (zipEntry == null) {
                         zipInputStream.close(); // also closes the wrapped stream
                         return "";
                    }
                    inputStream = zipInputStream;
               }
               byte[] bytes = new byte[10000000];
               int s; // read() returns an int: one byte, or -1 at end of stream (it is never null)
               int i = 0;
               while ((s = inputStream.read()) != -1) {
                    bytes[i++] = (byte) s;
               }
               inputStream.close();
               return new String(bytes, 0, i, "UTF-8"); // the missing step: bytes -> String
          } catch (Exception e) {
               return "";
          }

  • Error while transferring data to a Unix file using the FILTER 'gzip'

    I have to use the 'gzip' filter in particular to compress files that are placed in a Unix directory through ABAP code. This filter was working fine initially and I was able to get the files saved correctly; however, lately I am getting a short dump at the TRANSFER command with the runtime error DATASET_PIPE_CLOSED. Kindly guide me as to how I can avoid this.

    There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
    Please check the objects in BW, their lengths and types, and check whether your flat file has the same types.
    Now check the sequence of the objects in the transfer rules and activate them.
    There you go.

  • What is the best gzip ratio when using a pipe to export

    Hello,
    I am trying to use Data Pump to export a core application schema (10g DB, 10.2.0.4).
    I used the 'estimate' parameter of expdp before the real export, and the estimated dump file size
    will be 235 GB.
    The only issue is that my largest file system is 34 GB, so it seems I have to use a
    pipe to zip while exporting...
    The action plan is as below:
    1. Make a Unix pipe:
    $ mknod /faj29/appl/oracle/exports/scott1 p
    2. Direct the pipe to .dmp:
    nohup gzip -5c < scott1 > /faj29/appl/oracle/exports/ODS.dmp.gz & ===> question is here
    3. Create a par file:
    userid=system/password@dbname
    SCHEMAS=xxx
    DUMPFILE=scott1
    LOGFILE=dpump_emrgods:exp_xxx.log
    COMPRESSION=NONE
    CONTENT=ALL
    ATTACH=xxx.EXPORT.job
    nohup expdp PARFILE=exp_xxx.par &
    My question, based on the information I provided (34 GB filesystem vs. 235 GB dumpfile):
    how much compression should I use,
    gzip -5c or gzip -9c?
    -9c seems very time-consuming, which is why I don't like to use it, but I don't know
    the compression ratio, so does anyone know it?
    Or do we have a better way to do this task?
    (It seems I cannot use a parallel export when using a pipe; or can I? If yes, how?)
    Thanks a lot
    Jerry

    Hello,
    I was wrong; the pipe won't work with expdp at all.
    This is unlike the older EXP tool, which you could tell to write to a named pipe so that the data written to the pipe could be compressed, all in one step.
    So can I use the 'compression' parameter?
    It seems to only compress metadata, not the dump file itself...?
    (But I want to compress my expected 235 GB dump file to fit one 34 GB filesystem.)
    Any idea?
    Regards,
    Jerry

  • Oracle 9.0.1 Linux Download Disk 1 Is NOT in gzip Format

    Hi,
    I tried twice to download Oracle 9.0.1 for Linux to my Windows machine first, then transfer it to my Linux machine. All the other disks (.gz files) transferred and uncompressed into cpio files except Disk 1, Linux9i_Disk1.cpio.gz.
    I checked Linux9i_Disk1.cpio.gz first using WinZip on Windows, then gunzip on Linux. The problem is the same: Linux9i_Disk1.cpio.gz is not in gzip format.
    Since all the other disks (.cpio.gz files) work fine, my suspicion is that Linux9i_Disk1.cpio.gz is corrupted on the server. Has anyone at Oracle actually tested the download recently?
    Of course, next Monday I will try a direct download to my Linux machine at work.
    Thanks in advance!
    Dong Liu
    So that we may better diagnose DOWNLOAD problems, please provide the following information.
    - Server name
    - Filename
    - Date/Time
    - Browser + Version
    - O/S + Version
    - Error Msg

    Hi,
    I reported that the Oracle 9.0.1 Linux Download Disk 1 does not work if I first download it to Windows. My colleague downloaded it to a Linux machine, and there it can be unzipped.
    So that one disk can only be downloaded directly to Linux.

  • GZIP Compression Issue in Weblogic 8.1

    Has anyone experienced issues with gzip compression filtering in 8.1?
    After about 2 months of dev. and testing with the filter, a bug has been found
    in our portal, and I have traced it back to the filter that I downloaded from
    dev2dev code and utilities.
    It is strange. The only time I see an issue is when a user clicks the "back button"
    in the browser itself to go back to a previous page, AND then clicks a link (for
    example) in a portlet that is tied to an action in a jpf for that portlet. The
    error only happens for actions in a portlet that are local to that portlet (no
    forwarding to other portlets or anything fancy)...just pure "local internal forwards."
    If you do this, USING THE BACK BUTTON to get back to this page with the portlet,
    when you click the action, you are actually taken to the page where you were when
    you clicked the back button.
    I have re-created this in simple portals that have no content, so I know it is
    not something else I introduced.
    Any ideas? Maybe it is the filter-mapping pattern I am using??? I am currently
    mapping on *.portal, but this is the only situation that seems to break.

    For those interested, I have some updates on this subject and someone out there sent me a mail asking if I ever found anything out.
    Basically, when I encountered this issue the first time, I was working on an SP2 portal. We were having numerous small bugs like this one with SP2 that centered around the back button or refresh button. BEA gave us a "super patch" back in July or so that, when applied to our SP2 project, fixed our issue. I have since moved on to SP3, and I have not seen the compression issue there (so I think SP3 must have the patches correctly set up... as I would assume it would).

  • How to index a gzipped pdf blob

    Hi,
    I already have a TAR logged on this but if anyone knows how to do this, I would greatly appreciate any tips.
    I have our PDF documents stored in BLOBs in our DB, but they are stored in gzipped format for significant space savings. I have developed two Oracle functions, "gzip" and "gunzip", which are wrappers around Java classes loaded into the DB and can do the gzip compression and decompression on the fly. These functions accept BLOBs and return BLOBs. I use these functions to send the documents to client browsers, etc.
    Now, I want to index these PDF documents but have hit problems.
    E.g., suppose the table structure is STREAM(STREAM_ID NUMBER, PDF_GZIPPED BLOB). I need to build an Oracle Text index on STREAM(gunzip(PDF_GZIPPED)), where the gunzip function accepts a BLOB and returns a BLOB (the real PDF document).
    My original approach was to use the multicolumn datastore, because context indexes using these datastores can be based on functions of the table columns rather than just the columns themselves. Oracle support informed me that multicolumn datastores only work for VARCHAR2 types (not BLOBs), so I can't use that.
    If anyone has experience indexing functions of table columns for BLOBs, I would be interested to hear how to go about it. Please remember our PDF BLOBs are gzipped, not just PDF documents dumped into BLOBs.
    Harris.

    Sorry, the "unzip" function should read "gunzip" in the previous post.
    Anyway, for those interested, I have found a temporary workaround but boy is it cludgy;
    1. Had to setup a USER_FILTER
    2. Wrote a shell script wrapper to ctxhx
    3. But ctxhx on 8.1.7.2 always failed (unsure at this stage)
    4. So, it calls ctxhx under a previous version of Oracle;
    Here is the code;
    exec ctx_ddl.drop_preference('unzip_filter');
    exec ctx_ddl.create_preference('unzip_filter','USER_FILTER');
    exec ctx_ddl.set_attribute('unzip_filter', 'COMMAND', 'unzip_inso');
    exec ctx_ddl.drop_preference('pdf_lexer');
    exec ctx_ddl.create_preference('pdf_lexer', 'BASIC_LEXER');
    exec ctx_ddl.set_attribute('pdf_lexer', 'printjoins', '_-');
    drop index cas_stream_blob_ctx_i;
    create index cas_stream_blob_ctx_i
    on cas.stream_blob(pdf_gzipped)
    indextype is ctxsys.context
    parameters('filter UNZIP_FILTER lexer PDF_LEXER');
    exec ctx_ddl.sync_index('cas_stream_blob_ctx_i');
    Here is the source of unzip_inso;
    #!/bin/ksh
    # Usage: unzip_inso <gzipped input> <output file> [ctxhx args...]
    test $# -lt 2 && exit 1
    TMP_DIR=/tmp
    TMP_FILE=${TMP_DIR}/ora_gunzip_$$
    # gunzip the stored BLOB into a temp file; bail out if it fails
    /usr/bin/gunzip -cf "$1" > ${TMP_FILE} || exit 1
    OUTFILE=$2
    shift 2
    #$ORACLE_HOME/ctx/bin/ctxhx ${TMP_FILE} ${OUTFILE} "$@"
    # 8.1.6 workaround: run ctxhx from the older Oracle home
    export ORACLE_HOME=/product/oracle/8.1.6
    export LD_LIBRARY_PATH=$ORACLE_HOME/ctx/lib:$LD_LIBRARY_PATH
    export LD_LIBRARY_PATH_64=$ORACLE_HOME/ctx/lib64:$LD_LIBRARY_PATH_64
    ${ORACLE_HOME}/ctx/bin/ctxhx ${TMP_FILE} ${OUTFILE} "$@"
    rm -f ${TMP_FILE}
    It works, but it is very ugly.
    I am still seeking a better solution, since I already have the gzip utilities loaded in the DB.

  • WebService: error retrieving a big gzip-encoded result

    Hi, I have a strange error with results from a SOAP web service when the data is encoded in gzip or deflate mode.
    I have a web service that returns, in non-encoded mode, a resultset of 95542 bytes.
    The same resultset compressed with gzip is 8251 bytes.
    There is no problem when there is no encoding between the server and Flash Player (9,0,47 and 9,0,115 tested).
    If the Accept-Encoding header is set to gzip, deflate, then the server sends the resultset encoded.
    The browser receives this resultset (traced with Wireshark), but Flash Player doesn't load the result and the web service call times out.
    If I limit the data returned by this web service by limiting the number of rows returned, Flash is able to handle the result. For example: uncompressed data of 84070 bytes gives an encoded resultset of 7506 bytes, and that data is handled fine by Flash Player.
    I don't understand where the problem is.
    Does Flash Player have a gzip decompression limitation?
    Please help
    Thanks


  • How to handle gzip on Adapter side

    Hello everyone,
    I have a problem with handling gzip content on SAP PI 7.0. I'd like to unzip a gzip file (*.gz) using an adapter module, e.g. PayloadZipBean. Is it possible to unzip gzip files this way, or is a custom adapter bean necessary?
    I have seen a lot of other entries preferring the ABAP method for handling gzip files. It would be nice if it could also be handled via an adapter bean.
    Has someone done this via an adapter module and can maybe provide a solution?
    Thanks for helping.
    Best Regards

    You can try to use the module I have developed to zip and unzip messages using gzip:
    http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/04/29/module-payloadgzipbean-zip-and-unzip-payloads-using-gzip
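    Whatever module you end up with, the unzip step itself is plain java.util.zip. A minimal, framework-agnostic sketch of gunzipping a payload held in memory (the class name is mine, and the SAP PI Module API plumbing around it is deliberately omitted):

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPInputStream;

    public class GunzipHelper {

        // Decompress a gzip-compressed payload back into its original bytes.
        public static byte[] gunzip(byte[] compressed) throws IOException {
            try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) != -1; ) {
                    out.write(buf, 0, n);
                }
                return out.toByteArray();
            }
        }
    }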

  • Setting up gzip compression to optimize text throughput

    Hello,
    I struggled to set up my dev cluster with gzip, but it's running okay now. However, I cannot connect to it from the .NET client. The server complains with "Not in GZIP format".
    What am I missing?
    Thanks,
    Michal
    BTW: The .NET doc for this is not great (since when does one put documentation in a dtd file?!? see [Network Filters|http://coherence.oracle.com/display/COH35UG/Network+Filters#]...).
    On client I have
    app.config
    <coherence>
    <coherence-config>Config\coherence.xml</coherence-config>
    <cache-config>Config\cache-config.xml</cache-config>
    <pof-config>Config\pof-config.xml</pof-config>
    </coherence>
    cache-config.xml
    <cache-config xmlns="http://schemas.tangosol.com/cache">
    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>dist-contact-cache</cache-name>
    <scheme-name>extend-direct</scheme-name>
    </cache-mapping>
    </caching-scheme-mapping>
    <caching-schemes>
    <remote-cache-scheme>
    <scheme-name>extend-direct</scheme-name>
    <service-name>ExtendTcpCacheService</service-name>
    <initiator-config>
    <tcp-initiator>
    <remote-addresses>
    <socket-address>
    <address>[name of my machine]</address>
    <port>9099</port>
    </socket-address>
    </remote-addresses>
    </tcp-initiator>
    <outgoing-message-handler>
    <request-timeout>30s</request-timeout>
    </outgoing-message-handler>
    *<use-filters><filter-name>gzip</filter-name></use-filters>*
    </initiator-config>
    </remote-cache-scheme>
    </caching-schemes>
    </cache-config>
    msg from the client:
    Oracle Coherence for .NET Version 3.7.0.0 Build 23256
    RTC Release Build
    Copyright (c) 2000, 2011, Oracle and/or its affiliates. All rights reserved.
    2011-08-19 13:39:26.714 <D5> (thread=System.Threading.Thread): Loaded operational configuration from "FileResource(Uri = file://Config\coherence.xml, AbsolutePath = O:\bin\Debug\Config\coherence.xml)"
    2011-08-19 13:39:26.714 <D5> (thread=System.Threading.Thread): Loaded cache configuration from "FileResource(Uri = file://Config\cache-config.xml, AbsolutePath = O:\bin\Debug\Config\cache-config.xml)"
    2011-08-19 13:39:26.855 <D5> (thread=ExtendTcpCacheService:TcpInitiator): Started: TcpInitiator{Name=ExtendTcpCacheService:TcpInitiator, State=(Started), Codec=Tangosol.Net.Messaging.Impl.Codec, PingInterval=0, PingTimeout=30000, RequestTimeout=30000, ConnectTimeout=30000, RemoteAddresses=[10.166.110.67:9099], KeepAliveEnabled=True, TcpDelayEnabled=False, ReceiveBufferSize=0, SendBufferSize=0, LingerTimeout=-1}
    2011-08-19 13:39:26.870 <D5> (thread=System.Threading.Thread): Connecting Socket to 10.166.110.67:9099
    2011-08-19 13:39:26.870 <Info> (thread=System.Threading.Thread): Connected TcpClient to 10.166.110.67:9099
    2011-08-19 13:39:26.917 <D5> (thread=ExtendTcpCacheService:TcpInitiator): Loaded POF configuration from "FileResource(Uri = file://Config\pof-config.xml, AbsolutePath = O:\bin\Debug\Config\pof-config.xml)"
    2011-08-19 13:39:26.933 <D5> (thread=ExtendTcpCacheService:TcpInitiator): Loaded included POF configuration from "EmbeddedResource(Uri = assembly://Coherence/Tangosol.Config/coherence-pof-config.xml, AbsolutePath = assembly://Coherence/Tangosol.Config/coherence-pof-config.xml)"
    2011-08-19 13:39:27.011 <Info> (thread=System.Threading.Thread): Error establishing a connection with 10.166.110.67:9099: Tangosol.Net.Messaging.ConnectionException: TcpConnection(Id=, Open=True, LocalAddress=0.0.0.0:3551, RemoteAddress=10.166.110.67:9099)
    2011-08-19 13:39:27.011 <Error> (thread=System.Threading.Thread): Error while starting service "ExtendTcpCacheService": Tangosol.Net.Messaging.ConnectionException: could not establish a connection to one of the following addresses: [10.166.110.67:9099]; make sure the "remote-addresses" configuration element contains an address and port of a running TcpAcceptor
    2011-08-19 13:39:27.026 <D5> (thread=ExtendTcpCacheService:TcpInitiator): Stopped: TcpInitiator{Name=ExtendTcpCacheService:TcpInitiator, State=(Stopped), Codec=Tangosol.Net.Messaging.Impl.Codec, PingInterval=0, PingTimeout=30000, RequestTimeout=30000, ConnectTimeout=30000, RemoteAddresses=[10.166.110.67:9099], KeepAliveEnabled=True, TcpDelayEnabled=False, ReceiveBufferSize=0, SendBufferSize=0, LingerTimeout=-1}
    Unhandled Exception: Tangosol.Net.Messaging.ConnectionException: could not establish a connection to one of the following addresses: [10.166.110.67:9099]; make sure the "remote-addresses" configuration element contains an address and port of a running TcpAcceptor
    My server config is
    config\tangosol-coherence-override.xml
    <?xml version="1.0"?>
    <!DOCTYPE coherence SYSTEM "coherence.dtd">
    <coherence>
    <cluster-config>
    *<filter>*
    *<filter-name>gzip</filter-name>*
    *<filter-class>com.tangosol.net.CompressionFilter</filter-class>*
    *<init-params>*
    *<init-param>*
    *<param-name>strategy</param-name>*
    *<param-value>gzip</param-value>*
    *</init-param>*
    *<init-param>*
    *<param-name>level</param-name>*
    *<param-value>speed</param-value>*
    *</init-param>*
    *</init-params>*
    *</filter>*
    </cluster-config>
    </coherence>
    and config/contact-cache-config.xml
    <cache-config>
    <defaults>
    <serializer>pof</serializer>
    </defaults>
    <caching-scheme-mapping>
    <cache-mapping>
    <cache-name>dist-*</cache-name>
    <scheme-name>dist-default</scheme-name>
    </cache-mapping>
    <cache-mapping>
    <cache-name>repl-*</cache-name>
    <scheme-name>repl-default</scheme-name>
    </cache-mapping>
    <cache-mapping>
    <cache-name>aspnet-session-storage</cache-name>
    <scheme-name>aspnet-session-scheme</scheme-name>
    </cache-mapping>
    <cache-mapping>
    <cache-name>aspnet-session-overflow</cache-name>
    <scheme-name>aspnet-session-overflow-scheme</scheme-name>
    </cache-mapping>
    </caching-scheme-mapping>
    <caching-schemes>
    <distributed-scheme>
    <scheme-name>aspnet-session-scheme</scheme-name>
    <scheme-ref>dist-default</scheme-ref>
    <service-name>AspNetSessionCache</service-name>
    <backing-map-scheme>
    <local-scheme>
    <class-name>com.tangosol.net.cache.LocalCache</class-name>
    <listener>
    <class-scheme>
    <class-name>
    com.tangosol.net.internal.AspNetSessionStoreProvider$SessionCleanupListener
    </class-name>
    <init-params>
    <init-param>
    <param-type>com.tangosol.net.BackingMapManagerContext</param-type>
    <param-value>{manager-context}</param-value>
    </init-param>
    </init-params>
    </class-scheme>
    </listener>
    </local-scheme>
    </backing-map-scheme>
    <autostart>true</autostart>
    </distributed-scheme>
    <distributed-scheme>
    <scheme-name>aspnet-session-overflow-scheme</scheme-name>
    <scheme-ref>dist-default</scheme-ref>
    <service-name>AspNetSessionCache</service-name>
    <autostart>true</autostart>
    </distributed-scheme>
    <distributed-scheme>
    <scheme-name>dist-default</scheme-name>
    <backing-map-scheme>
    <local-scheme/>
    </backing-map-scheme>
    <autostart>true</autostart>
    </distributed-scheme>
    <replicated-scheme>
    <scheme-name>repl-default</scheme-name>
    <backing-map-scheme>
    <local-scheme/>
    </backing-map-scheme>
    <autostart>true</autostart>
    </replicated-scheme>
    <proxy-scheme>
    <service-name>ExtendTcpProxyService</service-name>
    <thread-count>5</thread-count>
    <acceptor-config>
    <tcp-acceptor>
    <local-address>
    <address>localhost</address>
    <port>9099</port>
    </local-address>
    </tcp-acceptor>
         *<use-filters><filter-name>gzip</filter-name></use-filters>*
    </acceptor-config>
    <autostart>true</autostart>
    </proxy-scheme>
    </caching-schemes>
    </cache-config>
    And I get
    2011-08-19 13:39:27.008/7012.815 Oracle Coherence GE 3.7.0.0 <Error> (thread=Proxy:ExtendTcpProxyService:TcpAcceptor, member=2): An exception occurred while decoding a Message for Service=Proxy:ExtendTcpProxyService:TcpAcceptor received from: TcpConnection(Id=null, Open=true, LocalAddress=10.166.110.67:9099, RemoteAddress=10.166.110.54:3551): java.io.IOException: Not in GZIP format
    at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:143)
    at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:58)
    at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:67)
    at com.tangosol.net.CompressionFilter.getInputStream(CompressionFilter.java:57)
    at com.tangosol.coherence.component.util.daemon.queueProcessor.service.Peer.onNotify(Peer.CDB:54)
    at com.tangosol.coherence.component.util.Daemon.run(Daemon.CDB:42)
    at java.lang.Thread.run(Thread.java:662)

    In .NET I have an app.config file which specifies which .xml files are used; at least that's how the Coherence.NET example is configured. Basically, I would like to know the schema for Config\coherence.xml referenced in app.config.
    As the documentation http://download.oracle.com/docs/cd/E15357_01/coh.360/e15726/net_netfilters.htm#sthref213 states:
    |"There are two steps to configuring a filter. The first is to declare the filter in the <filters> XML element of the cache factory configuration file. This is illustrated in Example 23-2:"
    |
    |Example 23-2 Configuring a Filter
    |
    |<coherence>
    | <cluster-config>
    | <filters>
    | <filter>
    | <filter-name>gzip</filter-name>
    | <filter-class>Tangosol.Net.CompressionFilter, Coherence</filter-class>
    | </filter>
    | </filters>
    | </cluster-config>
    |...
    |</coherence>
    1) It is really unclear whether this should be done on the client or on the proxy!!! 2) Just pasting this into coherence.xml does not work, with the client complaining "Could not find schema information for the element 'coherence'". This example is clearly wrong in not specifying the name of the XML schema...
    |The second step is to attach the filter to one or more specific services. To specify the filter for a specific service, for example the ExtendTcpCacheService service, add a <filter-name> element to the <use-filters> element of the service declaration in the cache configuration file.
    |
    |Example 23-3 Attaching a Filter to a Service
    |
    |<remote-cache-scheme>
    | <scheme-name>extend-direct</scheme-name>
    | <service-name>ExtendTcpCacheService</service-name>
    | <initiator-config>
    | ...
    | <use-filters>
    | <filter-name>gzip</filter-name>
    | </use-filters>
    |
    | ...
    |</remote-cache-scheme>
    I am pretty sure it's on the client, as that's where my <tcp-initiator> and the <address> of my cluster proxy are. Theoretically these "two steps" should be easy...

  • Gzip-1.5-1 error uncompressing temporary file in lynx

    Today I updated gzip to 1.5-1-i686. Thereafter, in lynx 2.8.7-5, no websites would display; the message is "error uncompressing temporary file." I reverted to gzip 1.4-4 and lynx works as expected.

    There is a bugtracker...

  • GZIPped XML-encoded HttpRequest post parameter!

    Hi All,
    I'm developing a standalone Java application that builds certain kinds of objects and needs to send them to a servlet as XML (via HttpURLConnection with the "post" method).
    My problem is that these objects are now too big, so I thought: no problem, let's gzip them...!
    Hence I wrote a method for gzipping the XML-encoded objects. But now, using the same request method that worked until yesterday, just replacing the old XML string with the new gzipped string URL-encoded (UTF-8), it doesn't work, and I can't guess why......#][#@!
    Can someone help me please?
    Many thanks in advance.

    You're turning the output from GZIP into a String? That's not a good idea. If you really want to turn it into a String, I'd suggest that you use hex encoding. The servlet would in turn have to decode it into a byte array, which you could then run through GZIP again (probably via a ByteArrayInputStream) to uncompress it. If you want to save some time, there is a Jakarta Commons project that has hex encoding capabilities. I think the project is actually called Codec, not 100% sure though :)
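    A rough sketch of that suggestion: gzip first, then hex-encode the bytes so they survive the trip as a request parameter. The class and method names are mine, and the hex step is hand-rolled here to keep the example dependency-free (Commons Codec's Hex class would do the same job):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    public class GzipHexParam {

        // Compress the XML-encoded object to raw gzip bytes.
        static byte[] gzip(String xml) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                gz.write(xml.getBytes(StandardCharsets.UTF_8));
            }
            return bos.toByteArray();
        }

        // Hex-encode so the binary data can ride safely inside a String parameter.
        static String toHex(byte[] data) {
            StringBuilder sb = new StringBuilder(data.length * 2);
            for (byte b : data) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        }

        public static void main(String[] args) throws IOException {
            String param = toHex(gzip("<object><id>42</id></object>"));
            System.out.println(param); // safe to URL-encode and POST
        }
    }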

  • Problem with Compression (Deflater & GZip)

    Hi All,
    I have large data in a String which I need to save in Oracle in a VARCHAR2 column. As VARCHAR2 allows a maximum of 4000 characters, I want to compress this String before saving it in the DB.
    I tried to compress the String using Deflater and GZip. In both methods I used the stream classes (DeflaterOutputStream, GZIPOutputStream), and both approaches can return the compressed data in byte[] or String form.
    When I returned it in String form, that same String gave an error while decompressing: "java.util.zip.ZipException: incorrect data check". How do I solve this problem?
    When I tried to save the compressed String in the DB (Oracle), I initially got the error "java.sql.SQLException: Malformed SQL92 string at position: 1109",
    and after I tried strCompressed.replace("'","''") (i.e., replacing all single quotes with two single quotes) the error message became "java.sql.SQLException: ORA-00911: invalid character".
    Is there any character I should replace in the compressed String, and how do I solve the decompression problem?
    Please help me with this.
    Thanks in advance.
    Regards
    Pavan Pinnu.

    both Classes have the option to return the Compressed data in byte[] and String format.
    Don't do that. String is not a container for binary data; you can't use it for compressed data. Use the byte[]: send the byte[] to the database, get it back from the database, uncompress it, and then turn the result back into a String.
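    A minimal round trip illustrating that advice; the database step is assumed rather than shown (the byte[] would go into a RAW or BLOB column via PreparedStatement.setBytes, never concatenated into a SQL string, which is what produced the ORA-00911 above). Class and method names are mine:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPInputStream;
    import java.util.zip.GZIPOutputStream;

    public class GzipStringRoundTrip {

        // String -> compressed byte[] (store this in a RAW/BLOB column).
        static byte[] compress(String s) throws IOException {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
                gz.write(s.getBytes(StandardCharsets.UTF_8));
            }
            return bos.toByteArray();
        }

        // compressed byte[] (read back from the column) -> original String.
        static String decompress(byte[] data) throws IOException {
            try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(data));
                 ByteArrayOutputStream out = new ByteArrayOutputStream()) {
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) != -1; ) {
                    out.write(buf, 0, n);
                }
                return new String(out.toByteArray(), StandardCharsets.UTF_8);
            }
        }

        public static void main(String[] args) throws IOException {
            String original = "large data destined for a VARCHAR2... but stored compressed";
            byte[] stored = compress(original);
            System.out.println(decompress(stored).equals(original)); // true
        }
    }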

  • Gzip encoded XML data in HTTP adapter

    Hi,
    I'm involved in building a synchronous interface to an external credit agency. According to their documentation, their XML response is encoded as a gzip XML-data stream. As I understand it, this implies that the XML data stream is compressed. Does the XI HTTP adapter (Web AS ICM/ICF framework) support this encoding?
    Johan Göthberg
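    For reference, this is what consuming such a response looks like in plain Java, outside XI: the response body must pass through a gzip decompressor before the XML is parseable. The endpoint URL below is made up, and whether ICM handles this transparently is exactly the open question here:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPInputStream;

    public class GzipXmlResponse {
        public static void main(String[] args) throws IOException {
            // Hypothetical endpoint standing in for the credit agency's service.
            HttpURLConnection con = (HttpURLConnection)
                    new URL("https://example.com/credit-check").openConnection();
            con.setRequestProperty("Accept-Encoding", "gzip");

            InputStream body = con.getInputStream();
            // Only unwrap if the server actually answered with gzip.
            if ("gzip".equalsIgnoreCase(con.getContentEncoding())) {
                body = new GZIPInputStream(body);
            }
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(body, StandardCharsets.UTF_8))) {
                for (String line; (line = r.readLine()) != null; ) {
                    System.out.println(line);
                }
            }
        }
    }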

    From Adobe Support...
    "When you use an XML-based data provider with a tree you must
    specify the label field, even if it is "label". The XML object
    includes the root, so you must set showRoot="false". Remember that
    the Tree will not, by default, reflect dynamic changes to the XML
    object."
    So all I had to do was change the component tag from this:
    <mx:Tree id="checkTree"
    itemRenderer="util.CheckTreeRenderer" labelField="label"
    width="100%" height="100%" />
    to this:
    <mx:Tree id="checkTree"
    itemRenderer="util.CheckTreeRenderer" showRoot="false"
    labelField="@label" width="100%" height="100%" />
    Just as an FYI... The Adobe support is worth the cost if you
    are fairly new to Flex. I have been flexing for about 9 months now
    and find their service invaluable!!!
    Have an Ordinary Day...
    KomputerMan ~|:-)

  • Can I build a simple encoder, like a gzip, png, jpeg or bmp encoder?

    Hello dear people,
    may I ask about unknown photo and texture file types?
    Will a TGA file display in the Flash Player?
    I know of some encoders. Examples:
    For jpeg and png:
    https://github.com/mikechambers/as3corelib/blob/master/src/com/adobe/images/PNGEncoder.as
    For gzip or zip:
    http://code.google.com/p/ascompress/source/browse/trunk/src/com/probertson/utils/GZIPEncoder.as?r=9
    Any suggestions for encoders for unknown file types?
    What about opening or viewing files, the way OpenOffice Writer works as a PDF viewer?
    Can I build a TGA encoder for tga, tif and psd file types?
    How do I find out the hex or binary layout of a given file type?
    I know the gzip-style compression encoder works nicely.
    How do I handle a multi-file tree when one or more files were compressed? (I had found a site for this, but it is down.)
    Question:
    Why do we need an "encoder" for Flash Player?
    Do you think the encoder support is complete?
    Thanks

    Nancy O. wrote:
    An image is just an image.  You need HTML or JavaScript code to make it into a hyperlink. 
         <a href="http://example.com"><img src="your_image.jpg"></a>
    Unless the site you're uploading to is willing to wrap your image inside HTML code like the example above, or JavaScript, there is no way to make the image file link to anything on its own.
    Nancy O.
    But how is it that a PDF file is able to contain a hyperlink within the image, and it takes me to my website? Where on the image.gif, etc., do I attach the code you've posted? The website I'm wanting to upload the image to only allows the file extensions I've already indicated in my OP. Thanks for your reply and help. -KD
