Unzipping files compressed with unix "compress"

I'm writing a bit of code that interrogates database activity snapshot files and stores key data in a database. The snapshot files are produced every hour and are compressed using the unix "compress" command.
I want to read the contents (a single text file) of these compressed files, and I thought the java.util.zip package would give me what I was looking for.
In the code below, I already have an array of file references. I'm just iterating over the files, creating a ZipFile object from the File reference. That's where I get my ZipException (ZipfileUtility.java:57).
for (int i = 0; i < snapshots.length; i++) {
    System.out.println(snapshots[i]);
    ZipFile zipFile = new ZipFile(snapshots[i]);
    System.out.println("About to read zipfile");
    Enumeration enumeration = zipFile.entries();
    while (enumeration.hasMoreElements()) {
        System.out.println("Getting zip entry");
        ZipEntry entry = (ZipEntry) enumeration.nextElement();
        System.out.println(entry);
    }
    zipFile.close();
}
And the exception stack trace...
java.util.zip.ZipException: error in opening zip file
     at java.util.zip.ZipFile.open(Native Method)
     at java.util.zip.ZipFile.<init>(Unknown Source)
     at java.util.zip.ZipFile.<init>(Unknown Source)
     at org.lawford.zip.ZipfileUtility.main(ZipfileUtility.java:57)
I couldn't find any reference to ZipFile NOT being able to read compress-ed files. Mind you, I couldn't find any reference to it being able to either.
Am I out of luck? Does ZipFile understand the compress format? Am I missing something that I should have seen in the API?
I currently have no control over how the snapshots are compressed at source. I understand that the preferred archiving and compression command on unix is gzip.
Any suggestions?
Thanks.

Hi,
ZIP and compress are incompatible. compress is based on the LZW algorithm, which is also used to compress GIF images. Unisys held patents on this algorithm, which was one of the reasons royalty-free alternatives were developed; the ZIP format is one of them. ZIP can also create archives of multiple files, while compress only handles a single file.
So you're out of luck using the java.util.zip package to read a compress-ed file.
You could, however, start the uncompress command via Runtime.exec and read the uncompressed data from the InputStream of that process.
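A minimal sketch of that approach, assuming zcat (or uncompress -c) is on the PATH of the machine running the JVM; the snapshot path below is a placeholder:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SnapshotReader {
    public static void main(String[] args) throws Exception {
        // "zcat" (or "uncompress -c") writes the decompressed text to its stdout.
        String[] cmd = {"zcat", "/path/to/snapshot.Z"};   // placeholder file name
        Process p = Runtime.getRuntime().exec(cmd);

        // The child's stdout is read through the Process's InputStream.
        BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);   // replace with the real snapshot parsing
        }
        in.close();

        int rc = p.waitFor();           // real code should also drain stderr before waiting
        if (rc != 0) {
            System.err.println("Decompression failed with exit code " + rc);
        }
    }
}

Note that java.util.zip.GZIPInputStream only understands the gzip (.gz) format, so it will not read .Z files either; it would only remove the need for the external process if the source ever switched to gzip.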

Similar Messages

  • Integration of BIP Report file bursting with Unix FTP directory(using IBot)

    We have a requirement to dynamically create a folder on the FTP server, which receives the bursted file when a BIP report is executed.
    We have used the following approach:
    Created an IBot with a custom Java program which creates a folder dynamically on the FTP server; now we are facing the problem of integrating this IBot with the BIP report file bursting.
    Can anyone suggest how to overcome this issue, or is there an alternative approach to achieve the above requirement?
    Any help would be appreciated.

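    For the folder-creation half of this, a minimal sketch assuming the custom Java program uses the Apache Commons Net library; the host, credentials, and folder name below are placeholders, not values from the post, and the hook from the IBot into the bursting itself is not covered here:

    import org.apache.commons.net.ftp.FTPClient;

    public class FtpFolderCreator {
        public static void main(String[] args) throws Exception {
            FTPClient ftp = new FTPClient();
            try {
                ftp.connect("ftp.example.com");             // placeholder host
                if (!ftp.login("ftpuser", "secret")) {      // placeholder credentials
                    throw new IllegalStateException("FTP login failed: " + ftp.getReplyString());
                }
                ftp.enterLocalPassiveMode();
                // The folder name would normally be built at run time (e.g. from the report/batch id).
                String folder = "/bursting/2024-01-01";     // placeholder folder
                if (!ftp.makeDirectory(folder)) {
                    System.err.println("Could not create " + folder + ": " + ftp.getReplyString());
                }
                ftp.logout();
            } finally {
                ftp.disconnect();
            }
        }
    }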

  • ODI file append with unix command problem

    Hi everyone,
    I want to append multiple files to a main file in a directory. I wrote Jython code like this:
    import os
    sourceDirectory = "/home/oracle1/Desktop/test"
    inFileNames = "#FILE_NAMES"
    inFileNamesList = inFileNames.split(" ")
    inFileIDS = "'a','1','2','3'"
    inFileIDSList = inFileIDS.split(",")
    i = 0
    for item in inFileNamesList:
         command ="awk 'BEGIN {OFS=\"" + "#FILE_DELIMITER\"" + "} {print $0,\"[|]\"" + inFileIDSList[i] + ",\"[|]\"NR}' " + sourceDirectory + os.sep + item + " >> " + sourceDirectory + os.sep + "#SESS_CURR_TS"
         os.system(command)
         i = i + 1
    Now my problem is here :
    You can see my inFileIDS values. I have split them on the comma, and for each file I write the line, then inFileIDSList[i], and at the end NR.
    But the character 'a' is not written. The numeric values are appended, but the character value is not.
    Why might that be?
    Does anyone have an idea about this problem?
    Could anyone help me solve it?
    Regards

    import os
    sourceDirectory = "#SOURCE_DIRECTORY"
    inFileNames = "#FILE_NAMES"
    inFileNamesList = inFileNames.split(" ")
    inFileIDS = "#FILE_IDS"
    inFileIDSList = inFileIDS.split(",")
    i = 0
    for item in inFileNamesList:
         command ="awk 'BEGIN {OFS=\"" + "#FILE_DELIMITER\"" + "} {print $0,\"\"FILENAME}' " + sourceDirectory + os.sep + item + " >> " + sourceDirectory + os.sep + "#SESS_CURR_TS"
         os.system(command)
         i = i + 1
    My original code is above. Now I am printing the file data, and I also want to print the name of the file being read.
    First I print $0 (the whole line) and the field delimiter (;), and then I want to print the name of the file being processed. My output file should look like this:
    all line;FILENAME
    But when I put FILENAME into the awk command it returns the full path, for example /d102/odi/uca/arrival/data/T02344903302310.txt, and I want to print only the file name without the path.
    I want to print T02344903302310.txt so my output file will be like this:
    all_line;T02344903302310.txt, and there are many files like this.
    Does anyone have an idea?
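    On the path question: the directory part of FILENAME can simply be stripped off before (or instead of) going through awk. As an illustration only, here is the same append-with-bare-file-name step sketched in plain Java (the language of the main thread above, not the poster's Jython setup); the file list, output name, and delimiter are placeholders:

    import java.io.*;

    public class AppendWithFileName {
        public static void main(String[] args) throws IOException {
            String[] sources = {"/d102/odi/uca/arrival/data/T02344903302310.txt"};  // placeholder input list
            String delimiter = ";";                                                 // stand-in for #FILE_DELIMITER
            // Open the target file in append mode, like ">>" in the shell command.
            PrintWriter out = new PrintWriter(new FileWriter("merged.txt", true));
            for (String path : sources) {
                String bareName = new File(path).getName();   // drops the directory part of the path
                BufferedReader in = new BufferedReader(new FileReader(path));
                String line;
                int lineNo = 0;
                while ((line = in.readLine()) != null) {
                    lineNo++;
                    out.println(line + delimiter + bareName + delimiter + lineNo);  // line;filename;NR
                }
                in.close();
            }
            out.close();
        }
    }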

  • How to unzip files in Shell script

    Hi Gurus,
    I have a requirement where I need to unzip files at the unix level and then move them to the respective folders. How can I code this in a shell script?
    As this is an entirely new thing for me, please let me know how to do it. I am seeking a few program steps.
    Thanks in advance for your help.
    Regards
    Nagendra

    You can specify unzip -d directory to tell unzip where to extract the files. For instance:
    dest_dir="/tmp/$(date +%N)"
    # zip file is cmd line argument $1
    unzip -d "$dest_dir" "$1"
    cd "$dest_dir"
    I don't know of any way to automatically change into a zipped directory stored inside the zip file. Besides, where would you want to go in case there are several directories stored?
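    If the same extraction ever has to be done from Java rather than a shell script (as in the main thread above), a minimal java.util.zip sketch, with a placeholder destination directory:

    import java.io.*;
    import java.util.Enumeration;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class Unzipper {
        public static void main(String[] args) throws IOException {
            File destDir = new File("/tmp/unzipped");               // placeholder destination
            ZipFile zip = new ZipFile(args[0]);                     // zip file given on the command line
            Enumeration<? extends ZipEntry> entries = zip.entries();
            while (entries.hasMoreElements()) {
                ZipEntry entry = entries.nextElement();
                File target = new File(destDir, entry.getName());  // real code should reject ".." entry names
                if (entry.isDirectory()) {
                    target.mkdirs();
                    continue;
                }
                target.getParentFile().mkdirs();
                InputStream in = zip.getInputStream(entry);
                OutputStream out = new FileOutputStream(target);
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
                out.close();
                in.close();
            }
            zip.close();
        }
    }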

  • Does my Mac 10.9.5 have a built in compression system to unzip files?  If not what utility is best?

    Does my Mac 10.9.5 have a built in compression system to unzip files?  If not what utility is best?

    It does. Select any file you want to compress. CTRL- or RIGHT-click on it and select Compress ... from the context menu. To decompress simply double-click on the compressed file to open it. If you have any problems then get The Unarchiver 3.9.1 for decompressing. Use The Archive Browser 1.9.1 to do both.

  • File Compression Using ByteArrayInput/OutputStream

    Hi,
    I want to write a file compression program using ByteArrayInput/OutputStream. It should access multiple files, zip their contents, and return the result in a ByteArrayOutputStream so the user can name the file and store it as a zip. How do I do this? I have done it with an OutputStream. I need the code urgently!

    You can use this code to compress and zip the files:
    import java.io.*;
    import java.util.zip.*;

    public class Compress {

        public static void doit(String filein[], String filepath[], String fileout) {
            // Common input stream for all the files that will be zipped
            FileInputStream fis = null;
            // Output stream for the ZIP file
            FileOutputStream fos = null;
            try {
                fos = new FileOutputStream(fileout);
                // Initialize ZIP output stream
                ZipOutputStream zos = new ZipOutputStream(fos);
                // Entry to be added to the ZIP file
                ZipEntry ze = null;
                // Initialize buffer for the output data
                final int BUFSIZ = 4096;
                byte inbuf[] = new byte[BUFSIZ];
                int n;
                for (int i = 0; i < filein.length; i++) {
                    fis = new FileInputStream(filein[i]);
                    // Add an entry to the ZIP file and give it its display name.
                    // Upon unzip, the same directory structure as given in filepath will be followed.
                    // The file path is the complete path of the file, including the file name.
                    ze = new ZipEntry(filepath[i]);
                    zos.putNextEntry(ze);
                    // Write the file's contents to the ZIP output stream
                    while ((n = fis.read(inbuf)) != -1) {
                        zos.write(inbuf, 0, n);
                    }
                    zos.closeEntry();
                    fis.close();
                    fis = null;
                }
                zos.close();
                fos = null;
            } catch (IOException e) {
                System.err.println(e);
            } finally {
                try {
                    if (fis != null) {
                        fis.close();
                    }
                    if (fos != null) {
                        fos.close();
                    }
                } catch (IOException e) {
                    System.err.println("\nError occurred while zipping the files: " + e.getMessage());
                }
            }
        }

        public static void main(String args[]) {
            // Files to be zipped
            String filein[] = {"file1.txt", "file2.txt", "file3.txt"};
            // Display names that appear in WinZip or any other unzip utility
            String filepath[] = {"temp\\file1.txt", "temp\\file2.txt", "temp\\file3.txt"};
            // Name of the ZIP file
            String fileout = "compress.zip";
            // Zip it
            doit(filein, filepath, fileout);
        }
    }
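    Since the original request was for the archive to end up in a ByteArrayOutputStream (so the caller can pick the file name later), here is a minimal sketch of that variant; the input file names are hypothetical:

    import java.io.*;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipOutputStream;

    public class CompressToMemory {
        // Zips the given files into memory; the caller decides later where the bytes go.
        public static ByteArrayOutputStream zipToMemory(String[] files) throws IOException {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            ZipOutputStream zos = new ZipOutputStream(baos);
            byte[] buf = new byte[4096];
            for (String name : files) {
                FileInputStream fis = new FileInputStream(name);
                zos.putNextEntry(new ZipEntry(name));
                int n;
                while ((n = fis.read(buf)) != -1) {
                    zos.write(buf, 0, n);
                }
                zos.closeEntry();
                fis.close();
            }
            zos.close();   // finishes the ZIP structure inside the byte array
            return baos;
        }

        public static void main(String[] args) throws IOException {
            // Hypothetical input files; the user-chosen name is applied only at write time.
            ByteArrayOutputStream zipped = zipToMemory(new String[] {"file1.txt", "file2.txt"});
            FileOutputStream out = new FileOutputStream("user-chosen-name.zip");
            zipped.writeTo(out);
            out.close();
        }
    }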

  • Does Time Machine use a differential/delta file compression when copying files ?

    Hello,
    I would like to use Time Machine to back up a MacBook Air, but that computer has a virtual machine stored in a single file of 50 GBytes.
    Once the initial backup is done, will Time Machine only copy the changes in this large file, or will it copy the full 50 GBytes every day?
    In other words, does Time Machine use a differential/delta file compression algorithm (like rsync)?
    If it is not yet the case, can you please file a feature request with the development team for me?
    If others are also interested in such a feature, you’re welcome to vote for it.
    Kind regards,
    Olivier

    OK, it looks like the current version of Time Machine cannot handle large files like virtual machine images efficiently in network terms, and this is a real issue today.
    Is anybody here able to file an official feature request with Apple so that Time Machine can be used efficiently with large files (let's say > 5 GBytes) as well?
    I probably mean using a differential compression algorithm over the network, like the examples below:
    http://rsync.samba.org/tech_report/tech_report.html
    http://en.wikipedia.org/wiki/Remote_Differential_Compression

  • Applescript:: PICT File Compression doesn't work [?]

    I have tried many options for getting Pict file compression to work via scripting but to no avail. Can anyone confirm if this works in PS CS4?
    PICT file options I am trying to get to work are:  {class:PICT file save options, compression:maximum quality JPEG, embed color profile:false, resolution:thirty two, save alpha channels:false}
    All options except compression seem to work... :/
    Example code:
    set newPictFilePath to ((path to desktop folder) & "newPictFile.pct") as string
    tell application "Adobe Photoshop CS4"
    set docRef to current document
    set photoshopSaveFormat to PICT file
    set photoshopSaveOptions to {class:PICT file save options, compression:maximum quality JPEG, embed color profile:false, resolution:thirty two, save alpha channels:false}
    set kSavedDocument to save docRef in file newPictFilePath as photoshopSaveFormat with options photoshopSaveOptions appending no extension with copying
    end tell

    Hello,
    I did not have any problems with the script you posted, but I ran into issues when my document was not flattened, or was not RGB. After I added that to the script, everything seemed to work.
    I also have a habit of telling a document to save instead of telling the app to save a document, but that might just be my quirk.

  • Unzip file with .sar and .car extensions.

    Hi,
    how do I unzip files with .sar and .car extensions?
    Best regards

    Hello,
    in the directory in which the SAP kernel is installed there has to be an executable called SAPCAR or sapcar.exe.
    Windows: C:\usr\sap\SID\SYS\exe\run\sapcar.exe -xvf archive.sar
    Unix: /usr/sap/SID/SYS/exe/run/SAPCAR -xvf archive.sar
    or
    Windows: C:\usr\sap\SID\SYS\exe\run\sapcar.exe -xvf archive.car
    Unix: /usr/sap/SID/SYS/exe/run/SAPCAR -xvf archive.car
    Regards, Michael

  • File Compression Error Occurred on Expert Settings

    I finally thought I had a decent set of settings to compress the QT movie for YouTube (and will tweak this now that each ten minute movie can be 1 gig! Hooray!). So I used what had worked on the previous movies and got the error
    File Compression Error Occurred
    The movie could not be properly compressed. The movie may contain some data that is invalid for the type of compression selected.
    The movie itself contains imported footage from my camcorder (DV), as I usually use. And I used only iMovie stock music loops and one Sfx...all aif files.
    I've tried tweaking the Expert Settings here and there: changed the 15 fps to 30 and adjusted the key frame setting. At this point, what do I look at? Something within the movie itself, or the settings (H.264; quality High, but also tried Medium; frame rate tried at 15 and 30; key frame rate tried at Auto and 10; multi-pass encoding; dimensions 640x480; scale Letterbox)? Sound is AAC, 24.000 kHz, stereo, bit rate 48 kbps. "Prepare for internet streaming" is currently unchecked; I think it was that way when I last made a movie.
    Don't know where to continue to troubleshoot this. URmpfff. Help!

    Wow...this is a sticky problem. I could not resave the movie as another name. Frozen program, pinwheel of death, etc. I've had about three frozen attempts since I first wrote...
    Googling a phrase from the error message turned up only a few lost souls who never did post the solution. One person figured the problem was with Flip4Mac and suggested uninstalling it, though they had problems uninstalling it. So I did it... frozen computer.
    I did get one solution to work to move my movie along. It saved rather quickly to a full-quality file of 350 MB or so on Expert settings. Then I got a single .dv file and imported that into a new iMovie project (which, wonderfully, I titled My Great Movie for Pete's sake... hope I remember to retitle it).
    Along the way were error messages like Error 54, no permission to write to file, no hard drive space, etc. Baloney, I think...or is there any clue in all this?
    Thanks.

  • File Compression

    I have a web service created in a .net environment that examines existing pdf files in a staging directory prior to sending them over the wire using FTP.
    Part of that process requires that I rename individual files in order to associate them with a particular batch.
    I also have a requirement to reduce the size of individual files as much as possible in order to reduce the traffic going over the line.
    So far I have managed about a 30% compression rate by using an open source library (iTextSharp).
    I  am hoping that I can get a better compression rate using the Acrobat SDK, but I need someone to show me how, hopefully with an example that I can follow.
    The following code snippet is a model I wrote that accomplishes the rename and file compression...
    const string filePrefix = "19512-";
    string[] fileArray = Directory.GetFiles(@"c:\temp");
    foreach (var pdffile in fileArray) {
        string[] filePath = pdffile.Split('\\');
        var newFile = filePath[0] + '\\' + filePath[1] + '\\' + filePrefix + filePath[2];
        var reader = new PdfReader(pdffile);
        var stamper = new PdfStamper(reader, new FileStream(newFile, FileMode.Create), PdfWriter.VERSION_1_5);
        int pageNum = reader.NumberOfPages;
        for (int i = 1; i <= pageNum; i++) {
            reader.SetPageContent(i, reader.GetPageContent(i), PdfStream.BEST_COMPRESSION);
        }
        stamper.Writer.CompressionLevel = PdfStream.BEST_COMPRESSION;
        stamper.FormFlattening = true;
        stamper.SetFullCompression();
        stamper.Close();
    }
    Any assistance is appreciated.
    regards,
    Greymajek

    Greymajek wrote:
    ...using the Acrobat SDK...
    Then you better ask in the Acrobat SDK forum.

  • JPEG File Compression

    I just found out that when raw files are converted (via the export function) to jpeg, the resulting jpeg quality is extremely low. My posted online picture quality certainly reflects the low jpeg file size.
    When I export pictures, I get several choices, but none of the choices allows me to set the jpeg quality to a "high" level.
    Can someone help me maximize my converted jpeg file size? I shoot with a 21 megapixel camera and end up with a 1 MB file (not good).

    Thank you Ian ... I appreciate your quick reply. It seems that after upgrading my camera from a Canon XTi to a Canon 5D Mark II and moving from a PC to an iMac, I find that I am very ignorant of photography and post processing.
    The combination of the equipment above with Aperture is causing me huge problems, since I cannot get the picture quality (on screen or in prints) that I got from the XTi and the PC. My pictures were postcard quality; the upgraded equipment combination is giving me poor quality photos. Very frustrating. I wonder where I can go for help?

  • File Compression Problem

    Hello all,
    I'm having a strange file compression problem with CS5 on our new Mac Pro.  We purchased the Mac Pro to scan and process images, but the JPEGs and GIFs we create from this computer are much larger than they should be when closed (e.g. images that should be compressed to 6KB are reading as 60KB, and the file size is often smaller when opened than closed). Furthermore, anytime we use these image files in other programs (e.g. Filemaker Pro) the inflated file size will carry over.  What's even more puzzling is that the same files that are reading as 60KB on our Mac Pro will read correctly as 6KB from a PC.  Similarly, if we embed these images -- that were created on the Mac Pro -- into Filemaker from a PC, the image file size is correct.  We cannot use the compressed files we create on our Mac Pro because the inflated file size will be passed on to whatever application we use on the Mac Pro (except for Photoshop).
    We have been processing images for years on a PC and haven't had any troubles with this.   We were thinking for a while that the problem was with the Mac operating system, but after many calls with expert Apple advisers it seems like Photoshop for Mac has to be the issue.  We have already tried reformatting and re-indexing the hard drive, and at this point there is nothing else that can be done from Apple's end.  The last expert I spoke with at Apple said that it sounds like the way Photoshop for Mac compresses files and how Mac reads file sizes is very different from the way Photoshop for PC compresses files and how Windows reads file sizes.  If he was correct, and there is no work-around, we'll have no other choice and will have to return our Mac.
    Has anyone else experienced this before?  The experts at Apple were thoroughly confused by the problem and so are we.
    Thanks,
    Jenny

    This has nothing to do with compression.
    Macintosh saves more metadata, and more previews than Windows - that's one part.
    Macintosh shows the size of the file on disk, including wasted space due to the disk block size - that's another part. (look at the byte count, not the size in K or MB).
    When you upload the files to a server, or use them in most programs, that extra metadata is lost and the file sizes should be identical.
    I can't believe that your advisors spent any time on such a trivial issue...

  • File compression question

    Question about using the file compression feature within Finder. I understand that when I select multiple files and compress them, it creates a ZIP file with the default name 'archive.zip'. I created a working folder on my desktop and put in about 16 aliases of photos that are down in my iPhoto 'originals' folder. When I created the archive.zip file, it was only 2.1 MB (the jpgs varied in size from <1 MB to 3 MB).
    I did this with the presumption that by using aliases in my working folder, the compression utility actually compressed the jpg's in the various folders but not the alias pointers.
    So, since the archive.zip is only 2.1 MB, what conclusion can I draw? Can the compression really squeeze them down that much?

    Dr. Livingstone wrote:
    Question about using the file compression feature within Finder. I understand that when I select multiple files and compress them, it creates a ZIP file with the default name 'archive.zip'. I created a working folder on my desktop and put in about 16 aliases of photos that are down in my iPhoto 'originals' folder. When I created the archive.zip file, it was only 2.1 MB (the jpgs varied in size from <1 MB to 3 MB).
    I did this with the presumption that by using aliases in my working folder, the compression utility actually compressed the jpgs in the various folders but not the alias pointers.
    That's wrong. The compression utility compressed the aliases, as it should have. If you want to compress the original images, that's what you need to put in that folder, not aliases. But keep in mind that JPEGs are already compressed, so you'll hardly gain any space by compressing them further.
    So, since the archive.zip is only 2.1 MB, what conclusion can I draw? Can the compression really squeeze them down that much?

  • MAC COMPATIBLE file compression FREEware

    Does anyone know of any freeware for compressing video files (for either iPod storage, or of course burning to DVD-R)? My local Apple IT guy told me that "Stuffit" has long been the sort of bona fide software program used for file compression. However, there is a price tag on it.
    I have a very reliable ripping program (Mac the Ripper) in order to back up my purchased DVDs onto my hard drive, but I would like to be able to compress the files so that I can store them in my iPod and take them with me, etc.
    Thanks!

    I would like to be able to compress the files so that I can store them in my iPod and take them with me, etc.
    Do you simply want to compress them to store on the iPod or do you want to view them on the iPod also?
    If you want to simply compress a file, select it in Mac OS X, right click (or option click) File -> Make archive.
    If you want to view it on the iPod
    Get Media Fork.
