Gzip and compress slow on T6320

So far, testing has shown that gzip and compress run 2-3 times slower on the new blades than on the older V440s.
mkfile, tar, unzip, rm -rf, and cp all run at about the same speed.
I have eliminated LDOMs, RAM, swap, and the local hard drives as causes (by putting swap and the target directory on the SAN, and by running in single-user mode).
Dusting off my performance tuning hat - been a long time - any hints as to where to look would be greatly appreciated.

Further research found that the problem is the Niagara chip: for single-threaded processes it is "less efficient" than the older SPARC chips (it blows). For most applications, which use a large number of small threads, these chips rock (pun intended) - BUT for big single-threaded jobs (like gzip or compress) the processor is just not as fast. Sun recommends checking Oracle jobs for large sequential queries and breaking these into many smaller queries.
Experience so far has been that 4 threads on the T6320 can match 2 CPUs on an F280 for our Oracle instances; the only problem has been with the backups, where a large number of files are gzip'd. With 64 threads available we will run more gzip processes in parallel during the backup window - or just be more patient.
The gzip version was ruled out as a cause, since the performance issue shows up with compress as well.
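
For what it's worth, here is a minimal sketch of the "more gzip processes during the backup window" idea, assuming a GNU xargs with -P support is available on the box (the path, file pattern, and count of 16 are placeholders, not our actual backup script):

$ cd /backup/staging
$ ls *.dbf | xargs -P 16 -n 1 gzip

Each individual gzip is still bound by single-thread speed on the Niagara, but running 16 of them side by side lets the whole batch finish in a fraction of the wall-clock time.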

Similar Messages

  • Gzip is very slow on T5220

    I have one Sun server used as an Oracle 9i server.
    Server model: T5220, 32 virtual CPUs and 16 GB memory. I am very disappointed with its I/O speed using internal disk. When I import data into the DB server it takes 16 hours, but it took only 8 hours on a Sun V480.
    I then used the "gzip" command to compress one file and found it is also very slow: it took 1 hour to gzip a 10 GB file. While gzip was running I checked with iostat; the I/O speed was only 3 MB/s.
    What is wrong? How can I improve the I/O speed?

    Both commands (the import into the DB server and gzip) are typical single-threaded tasks. CMT-based servers (e.g. the T5220) don't perform very well on such tasks; they were designed to perform very well with many, many tasks or jobs which can be parallelized. Use a different tool in each case:
    Import into DB: try to parallelize it, e.g. since Oracle 10 you can use Data Pump (instead of imp). Data Pump is typical parallel processing.
    gzip: use pbzip2, which is a parallel implementation of bzip2. An example of its use:
    http://przemol.blogspot.com/2009/01/parallel-bzip-in-cmt-multicore.html
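    For instance (a sketch only; -p sets the number of threads and the filename is a placeholder):
    $ pbzip2 -p32 bigfile.dmp
    $ pbzip2 -d -p32 bigfile.dmp.bz2
    The first command produces bigfile.dmp.bz2 using 32 threads; the second decompresses it in parallel (parallel decompression applies to files pbzip2 itself created).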

  • HTTP Headers - enabling caching and compression with the portal?

    Has anyone configured their web server (IIS or Apache), or used a commercial product, to flawlessly cache and compress all content generated by the portal?
    Compression and caching are critical for making our portal-based applications work for overseas users. It should be doable, just taking advantage of standard HTTP protocols, but implementing this in a complex system like the portal is tricky; we seem to be generating different values in the HTTP headers for the same types of files (such as CSS).
    We are running Apache, so we can't take advantage of the built-in compression capabilities of the .NET portal. We are running the Java version: 6.1 MP1, SQL Server 2000 (portal, search, collab, publisher, studio, analytics, custom .NET and Java portlets on a remote server).
    Basically our strategy is to compress all outgoing static and dynamic text content (HTML, CSS, JavaScript), and to cache all static files (CSS, JavaScript, images) for 6 months to a year depending on file type.
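    As a rough sketch of what that strategy can look like in Apache config (assuming mod_deflate and mod_expires are loaded; the types and lifetimes here are illustrative, not our production settings):
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
    ExpiresActive On
    ExpiresByType text/css "access plus 6 months"
    ExpiresByType application/x-javascript "access plus 6 months"
    ExpiresByType image/gif "access plus 6 months"
    ExpiresByType image/png "access plus 6 months"
    The hard part with the portal is making sure dynamically generated responses carry consistent headers so directives like these actually apply.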
    Here are some links on the subjects of caching and compression that I have compiled:
    Caching & Compression info and tools
    http://www.webreference.com/internet/software/servers/http/compression/
    http://www.ibm.com/developerworks/web/library/wa-httpcomp/
    http://www.mnot.net/cache_docs/
    http://www.codeproject.com/aspnet/HttpCompressionQnD.asp?df=100&forumid=322472&exp=0&select=1722189#xx1722189xx
    http://en.wikipedia.org/wiki/Http_compression
    http://perl.apache.org/docs/tutorials/client/compression/compression.html
    https://secure.xcache.com/Page.aspx?c=60&p=590
    http://www.codinghorror.com/blog/archives/000807.html
    http://www.howtoforge.com/apache2_mod_deflate
    http://www.ircache.net/cgi-bin/cacheability.py
    http://betterexplained.com/articles/how-to-optimize-your-site-with-http-caching/
    http://betterexplained.com/articles/speed-up-your-javascript-load-time/
    http://www.rubyrobot.org/article/5-tips-for-faster-loading-web-sites
    http://betterexplained.com/articles/how-to-optimize-your-site-with-gzip-compression/
    http://www.gidnetwork.com/tools/gzip-test.php
    http://www.pipeboost.com/
    http://www.schroepl.net/cgi-bin/http_trace.pl
    http://leknor.com/code/gziped.php?url=http%3A%2F%2Fwww.google.com
    http://www.port80software.com/surveys/top1000compression/
    http://www.rexswain.com/httpview.html
    http://www.15seconds.com/issue/020314.htm
    http://www.devwebpro.com/devwebpro-39-20041117DevelopingYourSiteforPerformanceCompressionandOtherServerSideEnhancements.html
    http://www.webpronews.com/topnews/2004/11/17/developing-your-site-for-performance-optimal-cache-control
    http://www.sitepoint.com/print/effective-website-acceleration
    http://nazish.blog.com/1007523/
    http://msdn.microsoft.com/library/default.asp?url=/library/en-us/IETechCol/dnwebgen/IE_Fiddler2.asp?frame=true
    http://www.fiddlertool.com/fiddler/version.asp
    http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html
    http://www.web-caching.com/cacheability.html
    http://www.edginet.org/techie/website/http.html
    http://www.cmlenz.net/blog/2005/05/on_http_lastmod.html
    http://www.websiteoptimization.com/speed/tweak/cache/
    http://www.webperformance.org/caching//caching_for_performance.html
    http://betterexplained.com/articles/how-to-debug-web-applications-with-firefox/
    Edited by tkoenings at 06/18/2007 6:26 AM

    Hi Scott,
    Does WebLogic Platform 8.1 support Netscape? We have developed a portal which works perfectly on IE but dies in Netscape. Are netUI tags not supported in Netscape?
    Please reply,
    manju
    Scott Dunbar <[email protected]> wrote:
    From a pure HTML perspective, Portal does its rendering with nested tables. Netscape 4.x and below have terrible performance with nested tables. The problem is not the Portal server but rather Netscape on the client machine. If IE and/or a recent version of Netscape/Mozilla is not possible, then there are really only two options:
    1) Faster client hardware - not likely to be an acceptable solution.
    2) Minimize the number of portlets and the complexity within the portlets.
    Neither of these solutions is a great answer, but the 4.7 series of Netscape is getting pretty old. Having said that, we've got customers who want to continue to use IE 4 :)
    Again, though, this problem is, I'm afraid, out of our hands. It is the client rendering time that is the issue.
    cg wrote:
    Does anyone know of any known reasons why the 7.0 (it did it also with 4.0) portal pages can take up to almost 30 seconds to load in Netscape 4.7? I know it is a very generic question, but our customer still uses 4.7 and will not use the portal b/c it takes so long to load some of the webapps. What the pages will do when loading is that the headers will come up, and when it gets to the body of the page it seems to stall and then comes up all of a sudden. For some of the pages it takes 6 seconds and for others it takes about 24-27 seconds.
    We have suggested using IE only, but that is not an option with all of the customers, and getting a newer version of Netscape is also out of the question.
    Any suggestions would be greatly appreciated.
    --
    scott dunbar
    bea systems, inc.
    boulder, co, usa

  • Split and Compress files

    Is it possible to compress and split files via Terminal without third-party software?
    I know I can compress, but I can't find anywhere how to split the file into parts of a certain size.
    Furthermore, is there any app to help users with Terminal commands?

    There are plenty of compression tools on the Mac: gzip and bzip2, for example. Lately I like bzip2. This works a little more easily on Linux, especially if the original file is split into a lot of smaller ones.
    $ gzip TestFile
    $ ll
    total 120944
    -rw-r--r--@ 1 andya  501  61921000 May 31 22:01 TestFile.gz
    $ split -b 10m TestFile.gz
    $ openssl dgst -sha256 TestFile.gz xa*
    SHA256(TestFile.gz)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
    SHA256(xaa)= a3d803049aee16cbbfd679668164707eb9053488fb2ec5720f282a711ee8c451
    SHA256(xab)= 0a79e26c77cb47ec09f5cf68cfa45ea8f52f5157cad07c0ac187eaf0ae59ff79
    SHA256(xac)= 0f556e8e93dcb41cb3ab20454ab46c016d6596316d75316d810f45e7c2b3682e
    SHA256(xad)= abc3db83737346a8af6ac7ba9552c4b71cf45865f7b9faded54f1683b2afd077
    SHA256(xae)= 3afbad7b68a1d1c703865422e40cbd68ca512a652f985a0714258b7d936ad0f6
    SHA256(xaf)= 11879853fcfbe6df6fb718e1166d4dcae7e0e6ebd92be6c32c104c0a28f0439a
    Keep the hashes of the smaller files, in case you get an error on the far end of the transfer.
    That way you only need to resend the small file that's corrupt.
    Put the TestFile back together:
    $ cat xa* > ScratchFile
    $ openssl dgst -sha256 TestFile.gz ScratchFile
    SHA256(TestFile.gz)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
    SHA256(ScratchFile)= cd041d79b4af1a54b524602363a18e67201c4acb03675dfebcae9109d8707367
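    Following the same idea, you can record a digest per piece on the sending side and re-check the pieces on the far end (a sketch; the file names assume the split above):
    $ openssl dgst -sha256 xa* > pieces.sha256
    $ openssl dgst -sha256 xa* | diff - pieces.sha256
    Any line diff prints identifies a corrupt piece, so that is the only one you need to resend.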

  • Help to improve expdp performance and compression

    Hi,
    We have an Oracle Standard Edition database, which does not support the parallel and compression features of expdp.
    The expdp dump file is around 90 GB and takes 45 minutes to create on the production server. The script then compresses the file with the gzip utility, and compression takes 80 minutes.
    Copying the compressed file from prod to the staging server takes another 47 minutes.
    We have automated the process, but expdp + compression + copy takes a long time (around 3 hours). On the staging server it takes more than 4 hours to create the staging DB.
    Is there any way I can improve the performance of these 3 operations?
    Can I compress while the file is being exported? I tried using pipes in Unix, and that doesn't work for expdp.
    We don't want to use a network link.
    Does expdp write the files sequentially? If so, can I start gzipping in parallel as the files are exported?
    I also tried compressing with the gzip -1 option, but it increased the file size by 30% and consequently increased the copy time to the staging server.
    Please help
    Thanks,
    Bharani J
    Edited by: 973089 on Nov 27, 2012 9:40 AM
    Edited by: 973089 on Nov 27, 2012 9:41 AM

    Hi,
    Why 'do not support parallel'?
    I understand you don't want to use a database link; I had this problem here (I used expdp).
    This is what I've done - a script that does:
    a full logical backup using expdp,
    a bzip2 to compress it,
    and a transfer to the destination machine.
    It would have been far easier if I could have used the database link, but I couldn't. However, I did use the PARALLEL option in the expdp command.
    Hope you find a good solution.
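    One rough sketch of splitting the work, assuming Data Pump's FILESIZE/%U splitting (which works without the PARALLEL option; the directory object, path, and sizes are placeholders):
    $ expdp system/*** full=y directory=DMP_DIR dumpfile=exp_%U.dmp filesize=5g
    $ cd /dmp
    $ for f in exp_*.dmp; do gzip "$f" & done; wait
    expdp fills the pieces more or less sequentially, so earlier pieces can in principle be compressed from a second session before the export finishes; the simple version above just waits for expdp to complete and then compresses all the pieces in parallel. Smaller compressed pieces also let the copy to the staging server start before the last gzip is done.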

  • Serialization and compression

    I have 3 big HashMaps in a class.
    I want to serialize/deserialize this class using Externalizable and compression.
    Does anyone have a good example to share?

    You must remove the zos.close() line.
    As I said, you are engaged in a space-time tradeoff: if you want speed, don't compress; if you want to save space, compress. As this is the RMI group I assume this is going over a network, so saving space will also save you transmission time, but it will cost you compression and decompression time. Whether it is all really worth it is a moot point: you will have to measure and see. It is sounding to me as though maybe it isn't worth it.
    If you really think the use of the Externalizable interface is slower than the default Serializable interface (I don't), you could also try making the class Serializable with the Map fields marked as transient, as follows:
    private void readObject(ObjectInputStream objectInput)
            throws IOException, ClassNotFoundException {
        objectInput.defaultReadObject();
        // Layer a GZIP decompressor over the stream, then read the maps
        // back in the same order they were written.
        GZIPInputStream zis = new GZIPInputStream(objectInput);
        ObjectInputStream ois = new ObjectInputStream(zis);
        map1 = (Map) ois.readObject();
        map2 = (Map) ois.readObject();
        map3 = (Map) ois.readObject();
    }

    private void writeObject(ObjectOutputStream objectOutput)
            throws IOException {
        objectOutput.defaultWriteObject();
        // Layer a GZIP compressor over the stream and write the maps through it.
        GZIPOutputStream zos = new GZIPOutputStream(objectOutput);
        ObjectOutputStream oos = new ObjectOutputStream(zos);
        oos.writeObject(map1);
        oos.writeObject(map2);
        oos.writeObject(map3);
        oos.flush();   // push buffered object data into the GZIP stream
        zos.finish();  // finish the GZIP trailer without closing the underlying stream
    }
    (This assumes imports of java.io.* and java.util.zip.* in the enclosing class.)
    Alternatively, get rid of all this and construct your outer ObjectOutputStream around a GZIPOutputStream, and your outer ObjectInputStream around a GZIPInputStream. If you're doing RMI this is not possible: there is a solution involving a client socket factory, but it's pretty clumsy.

  • Gzip and Flex

    From what I understand, gzip (and other compression tools) compress content at the server level, and the browser itself is able to decompress it.
    On my Flex project I am receiving a null object error, and when I use Firebug to evaluate the network results, the content-encoding and accept-encoding sections both list gzip.
    Shouldn't the browser decompress the gzip files before they are sent to the Flash player, or am I supposed to do something specific in Flex for this to happen?
    Here are the results from Firebug:

    Hello bahar08,
    It's been a long while since you posted your topic, but I've just found it after getting stuck with this exact problem. There is a gzip decompressor by Danny Patterson that might help. I've added the header in as well, but the responses don't seem to be coming down in a compressed format.
    It would be cool to chat with you about this.

  • Gzip HTTP Compression

    Hello,
    Is it possible to retrieve gzipped http compressed data in a mobile flex application? I have tried using URLLoader with no luck. I have also tried using HTTPService with no luck.
    In both requests I have added the following but it doesn't seem to make a difference.
    HTTPService:
    httpService.headers = new URLRequestHeader('Accept-Encoding','gzip');
    URLLoader:
    request.requestHeaders.push(new URLRequestHeader('Accept-Encoding','gzip'));
    loader.dataFormat = URLLoaderDataFormat.BINARY;
    In the progress handler of URLLoader, event.bytesTotal is reported correctly, but the data does not arrive compressed. For example, the compressed data is 8.1K as reported by event.bytesTotal, but URLLoader downloads 165K as shown by event.bytesLoaded. This makes a huge impact on application speed, i.e. the time taken to download data over a 3G network, especially for a data-hungry app. Any help on this matter would be appreciated.

    Hi,
    Yes, as each request passes through IIS, everything will be compressed. It also depends on CPU utilization and the settings chosen for compression; per the article below, if CPU utilization is 90% or above, compression is not performed.
    https://www.nothingbutsharepoint.com/sites/itpro/Pages/SharePoint-Compression-in-IIS.aspx
    Please mark it answered if your problem is resolved or this was helpful.

  • Why does my iPhoto open and run slow since upgrading to OSX 10.8.5

    Why does my iPhoto open and run slowly since upgrading to OS X 10.8.5? I have an iMac (Intel 2.5 GHz quad-core i5) with 20 GB of memory. Before the upgrade it was much faster.


  • Compressing a 16 minute mini dv video for use on the web

    Hello All...
    Back after a brief absence, things look a little bit different.
    I'm trying to take a 16 minute mini dv video and compress it for use on the web. I'm interested in any suggestions you may have on settings for the video and audio tracks. I've tried using Sorenson 3 (15 frames, key frames set to automatic, 320 x 240) for video and IMA 4:1 (mono) for audio. The resulting video looked great but the file size came in at about 255 Mb.
    Thanks!
    PowerMac G5 1.8 Dual   Mac OS X (10.4.3)  
    Message was edited by: Dan Foley

    Thank you for the replies. Everyone was correct about the jack, interface, and phasing problems. I have been unplugging my MOTU audio interface and then using headphones at work. I have not changed any detailed audio output settings in Logic. When I read that the jack might be a problem, I tried switching headphones; this actually helped. I am using Dre Beats headphones and they seem to be having issues with the Mac jack (the phasing/panning problems). I can use these headphones with other devices but not the Mac. I have to use iPod earbuds, and with those the phasing seems fixed. Hopefully this information is helpful to someone else.
    If anyone knows how to correct this issue, please let me know; it's difficult to know what my final mixes are going to sound like, and I have had to keep bouncing everything into iTunes, syncing to an iPod, and then listening on my car radio.

  • E-mail forwards and downloads flicker and are slow to download

    Never had these problems until last night, after I downloaded 4.0b12. When downloads are moved to either side for visibility, the window returns to its original position, and the e-mail also flickers off and on - very disconcerting.
    I've also had problems when I delete e-mail. Rather than going back to the Inbox, it sometimes returns to the Deleted folder.
    I mainly chose 4.0b12 so I could add frequent downloads in the upper status bar; however, I now learn that they are not available in 4.0b12.

    I tried the sync fix but it did not have any impact on the problem. Still struggling with freezing mail and very slow downloads: mail comes in at about 4 kb/sec when I have a broadband connection capable of about 1 Mb/sec.
    I assume you were referring to running Sync from .Mac in System Preferences. I ran Sync and left it set to synchronize with .Mac automatically.
    Any other thoughts?
    Regards
    Robert

  • iPhone very slow for the last two days opening apps and the notification window

    Hi, for the last two days my iPhone (an iPhone 4 with iOS 5) has been very slow to open apps, and very slow when I check the notification window; it takes too much time to open when I swipe down. Help me resolve the issue.

    The Basic Troubleshooting Steps are:
    Restart... Reset... Restore...
    iPhone Reset
    http://support.apple.com/kb/ht1430
    Try this First... You will Not Lose Any Data...
    Turn the Phone Off...
    Press and Hold the Sleep/Wake Button and the Home Button at the Same Time...
    Wait for the Apple logo to Appear and then Disappear...
    Usually takes about 15 - 20 Seconds... ( But can take Longer...)
    Release the Buttons...
    Turn the Phone On...
    If that does not help... See Here:
    Backing up, Updating and Restoring
    http://support.apple.com/kb/HT1414

  • RE: Cleaning and Compression Environment Repository

    Mark,
    I'm not sure what I did wrong the first time. I thought I
    covered all the bases, but just to be certain, I did the
    whole sequence again and now it works.
    Before, I did have problems restarting the environment,
    because the repository was locked. I had to kill a
    hanging process to solve the problem. Most likely
    it was this process that was interfering with my original
    attempts.
    With this script, you offered some very useful information. Where did you get it? Like I said in my first posting, I couldn't find any documentation. The "help" function of "envedit" doesn't show any of the commands you used. No way you stumbled on this by accident.
    Thanks,
    Pascal Rottier
    STP - MSS Support & Coordination Group
    Philip Morris Europe
    e-mail: [email protected]
    Phone: +49 (0)89-72472530
    +++++++++++++++++++++++++++++++++++
    Origin IT-services
    Desktop Business Solutions Rotterdam
    e-mail: [email protected]
    Phone: +31 (0)10-2428100
    +++++++++++++++++++++++++++++++++++
    /* Ever stop to think, and forget to start again? */
    -----Original Message-----
    From: [email protected]
    [SMTP:[email protected]]
    Sent: Wednesday, June 30, 1999 1:04 PM
    To: [email protected]
    Cc: [email protected]
    Subject: Re: Cleaning and Compression Environment Repository
    The script mentioned is the exact sequence of commands that should be used.
    Did you shut down the environment first? You cannot clean an environment repository while the environment is online. I know that these commands work for both NT and Unix environment managers.


  • My external hard drive goes in to constant access mode and really slows down my computer

    I'm using an external hard drive for Time Machine. After working for an hour or so the drive goes into a constant-access mode and really slows down my iMac. I look at Activity Monitor and everything looks good: no high CPU usage or memory issues. The only fix is to use the power button to shut down my iMac. After a restart it works fine for a while.
    If I turn off Time Machine the problem seems not to happen. I tried Disk Utility on the external hard drive and it says the drive is fine. I changed out external drives and the problem still occurs. Any thoughts?

    I didn't realize my profile stated otherwise.
    Please check near the bottom of your original post above to see OS X (10.7.2).  You might want to update that to avoid confusion and save some time in the future.
    OK, please take a deep breath.
    Apple does not officially support Time Machine backups to a drive at the USB port of the AirPort Extreme....likely because it is not reliable. You might want to review this Apple Support document to confirm:
    http://support.apple.com/kb/HT2038
    So, we are not going to be able to provide you with much assistance on that issue.
    Some users seem to be able to make this work....some have some problems....and some (like me) who have tried this have nothing but problems.
    Sorry, I can't help on this one, but maybe another user who has had better luck will post to provide his secrets.
    The iTunes and iPhoto libraries and files are my only copies now.  I moved everything from my Macbook to the external then deleted the originals.
    Not sure if you have thought about this.
    If you deleted the "originals", then your only copy of your data is on one hard drive. You have no backups if the drive has a problem.
    A minimum backup plan would be to have a copy of important data on two different drives. When....not if...one drive fails, you have the data on the "other" drive.
    You have no "other" drive according to the information that you have provided.
    Further.....Time Machine backs up the changes on your Mac. At some point....you cannot know when.....Time Machine will pick up the change on your Mac and delete the iTunes and iPhoto data from your Time Machine backups.
    The files might stay there a few months, or even longer. But, I have seen instances where the files were deleted within a week or two.
    So my advice would be to only delete data from your Mac that you can afford to lose.

  • How can I use my Mac G3 (OS 9.2.2) to make Windows disk images and compress them into Windows stuffed archives?

    I have a collection of old Windows diskettes, 800 K and 1.4 MB. I want to make disk images (or whatever the Windows equivalent is) and compress them into Windows StuffIt archives. I want Windows users to be able to download them, expand the StuffIt archives, extract the disk images, and use the software on old Windows computers. I need to do the work on a Power Macintosh beige G3 tower running OS 9.2.2. What software can I get to do it?

    To Jan, Greetings
    Thank you for your message and for your suggestions.
    Is WinZip a Mac application that makes compressed files that can be opened on a Windows computer? If so, do you have any notion where I could find a version old enough to run on OS 9.2.2?
    I noticed that DropStuff 6.0, which I use on my G3 tower, has an option to make compressed files that are self-extracting on Windows. Do you know if that works? I suppose I could make such a file and then find somebody with a Windows computer to test it for me.
    Yes, I could (shudder) get an old Windows computer somewhere, learn to use it, and do the project that way. Do you know if the old versions of Windows include disk-image programs and file-compression programs? Or would I need to buy some old software?
    Thank you for the information about the size of the diskettes. As you've probably noticed, I don't know very much about Windows things.
    Sincerely,
    Frontiersman
