File Compression Using ByteArrayInput/OutputStream

Hi,
I want to write a file compression program using ByteArrayInput/OutputStream. It should access multiple files, zip their contents, and deliver the result in a ByteArrayOutputStream so the user can name that file and store it as a zip. How do I do this? I have already done this with an OutputStream. I need the code; it's very urgent!

You can use this code to compress and zip the files:
import java.io.*;
import java.util.zip.*;

public class Compress {

    public static void doit(String filein[], String filepath[], String fileout) {
        // Input stream for the file currently being zipped
        FileInputStream fis = null;
        // Output stream for the ZIP file
        FileOutputStream fos = null;
        try {
            fos = new FileOutputStream(fileout);
            // Initialize the ZIP output stream
            ZipOutputStream zos = new ZipOutputStream(fos);
            // Buffer for the data copied from each input file
            final int BUFSIZ = 4096;
            byte inbuf[] = new byte[BUFSIZ];
            int n;
            for (int i = 0; i < filein.length; i++) {
                fis = new FileInputStream(filein[i]);
                // Add an entry to the ZIP file and give it its display name.
                // On unzip, the directory structure given in filepath is recreated.
                // filepath holds the complete path of the file, including the file name.
                ZipEntry ze = new ZipEntry(filepath[i]);
                zos.putNextEntry(ze);
                // Copy the file contents into the ZIP entry
                while ((n = fis.read(inbuf)) != -1) {
                    zos.write(inbuf, 0, n);
                }
                zos.closeEntry();
                fis.close();
                fis = null;
            }
            // Closing the ZIP stream also closes the underlying file stream
            zos.close();
            fos = null;
        } catch (IOException e) {
            System.err.println("\nError occurred while zipping the files: " + e.getMessage());
        } finally {
            try {
                if (fis != null)
                    fis.close();
                if (fos != null)
                    fos.close();
            } catch (IOException e) {
                System.err.println(e);
            }
        }
    }

    public static void main(String args[]) {
        // Files to be zipped
        String filein[] = {"file1.txt", "file2.txt", "file3.txt"};
        // Display names that appear in WinZip or any other unzip utility
        String filepath[] = {"temp\\file1.txt", "temp\\file2.txt", "temp\\file3.txt"};
        // Name of the ZIP file
        String fileout = "compress.zip";
        // Zip it
        doit(filein, filepath, fileout);
    }
}
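
If you specifically want the ZIP data in a ByteArrayOutputStream, as asked above, so the caller can decide on the file name afterwards, here is a minimal sketch along the same lines (the class and method names CompressToBytes/zipToBytes are just placeholders):

import java.io.*;
import java.util.zip.*;

public class CompressToBytes {

    // Zips the given files into memory and returns the raw ZIP bytes.
    public static byte[] zipToBytes(String filein[], String filepath[]) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ZipOutputStream zos = new ZipOutputStream(baos);
        byte buf[] = new byte[4096];
        int n;
        for (int i = 0; i < filein.length; i++) {
            FileInputStream fis = new FileInputStream(filein[i]);
            zos.putNextEntry(new ZipEntry(filepath[i]));
            while ((n = fis.read(buf)) != -1) {
                zos.write(buf, 0, n);
            }
            zos.closeEntry();
            fis.close();
        }
        zos.close();
        // The ZIP data is now held in memory; the caller can write it to any
        // destination, e.g. a FileOutputStream opened with a user-chosen name.
        return baos.toByteArray();
    }
}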

Similar Messages

  • Playback issues from files compressed using the Divx Encoder

    Hey there,
    As the title implies, I'm trying to edit HyperCam screen capture footage that was compressed with the DivX encoder it comes with.
    The footage then gets exported to an .avi format, and in Windows it plays fine. However, when I import the footage into Premiere, all manner of horrible things happen. I have imported different footage encoded to the same format with no issues, though. Has anyone else had an issue similar to this?
    the Rig:
    2.4 quad core
    3 gbyte of ram
    nvidia 8800 gts driver vers 182.xx
    Thanks

    If it is possible to NOT use the DivX CODEC, jump at that opportunity. It is a delivery format, pure and simple; it is not meant for editing and does not edit well.
    Eddie's links will cover converting, but quality will be very poor.
    When faced with this CODEC, I use DigitalMedia Converter (~US$50), and it handles it fine. It just does not improve the quality, one iota. If there is no way around DivX (or Xvid), I can recommend the above. One must have the DivX CODEC (free) installed, but it sounds like you do. BTW, per Eddie's link, you WILL want to convert to DV-AVI Type II with PCM/WAV 48KHz 16-bit Audio. Again, you will need the MS DV CODEC installed, but most systems have this already.
    Good luck,
    Hunt

  • Does Time Machine use a differential/delta file compression when copying files ?

    Hello,
    I would like to use Time Machine to back up a MacBook Air, but that computer has a virtual machine stored in a single 50 GByte file.
    Once the initial backup is done, will Time Machine only copy the changes in this large file, or will it copy the full 50 GBytes every day?
    In other words, does Time Machine use a differential/delta file compression algorithm (like rsync)?
    If that is not yet the case, can you please file a feature request with the development team internally for me?
    If others are also interested in such a feature, you’re welcome to vote for it.
    Kind regards,
    Olivier

    OK, it looks like the current version of Time Machine cannot efficiently handle large files such as virtual machine images over the network, and this is a real issue today.
    Is anybody here able to file an official feature request with Apple so that Time Machine can also be used efficiently with large files (let's say > 5 GBytes)?
    What I probably mean is using a differential compression algorithm over the network, like the examples below:
    http://rsync.samba.org/tech_report/tech_report.html
    http://en.wikipedia.org/wiki/Remote_Differential_Compression

  • Extracting compressed file (zip) using PL/SQL

    Hi!
    Can anyone help me with how to extract data out of a compressed file (ZIP) using PL/SQL?
    Regards,
    dhekz

    user8707902 wrote:
    Can anyone help me with how to extract data out of a compressed file (ZIP) using PL/SQL?
    Bear in mind that the Lempel-Ziv-Welch (LZW) compression used in zip files may still have patent issues relating to Unisys (not sure if the patent has expired now or what; it's always been somewhat confusing). So, if you already have software written to zip/unzip files, you should use that, as it should be licensed already. If you write your own LZW compression/decompression routine for use in any commercial software, you may be required to register and submit royalties to Unisys for the privilege. As I say, I don't know the latest, so you may be OK, but it's something to be aware of and check out if you intend to write your own and it's for commercial reasons.

  • Unzipping files compressed with unix "compress"

    I'm writing a bit of code that interrogates database activity snapshot files and stores key data in a database. The snapshot files are produced every hour and are compressed using the unix "compress" command.
    I want to read the contents (a single text file) of these compressed files, and I thought that the java.util.zip package would give me what I was looking for.
    In the code below, I already have an array of file references. I'm just iterating over the files, creating a ZipFile object from the File reference. That's where I get my ZipException (ZipfileUtility.java:57).
    for (int i = 0; i < snapshots.length; i++) {
        System.out.println(snapshots[i]);
        ZipFile zipFile = new ZipFile(snapshots[i]);
        System.out.println("About to read zipfile");
        Enumeration enumeration = zipFile.entries();
        while (enumeration.hasMoreElements()) {
            System.out.println("Getting zip entry");
            ZipEntry entry = (ZipEntry) enumeration.nextElement();
            System.out.println(entry);
        }
        zipFile.close();
    }
    And the exception stack trace...
    java.util.zip.ZipException: error in opening zip file
         at java.util.zip.ZipFile.open(Native Method)
         at java.util.zip.ZipFile.<init>(Unknown Source)
         at java.util.zip.ZipFile.<init>(Unknown Source)
         at org.lawford.zip.ZipfileUtility.main(ZipfileUtility.java:57)
    I couldn't find any reference to ZipFile NOT being able to read compress-ed files. Mind you, I couldn't find any reference to it being able to either.
    Am I out of luck? Does ZipFile understand the compress format? Am I missing something that I should have seen in the API?
    I currently have no control over how the snapshots are compressed at source. I understand that the preferred archiving and compression command on unix is gzip.
    Any suggestions?
    Thanks.

    Hi,
    ZIP and compress are incompatible. compress is based on the LZW algorithm, which is also used to compress GIF images. Unisys holds intellectual property on this algorithm, which was one of the reasons to develop royalty-free alternatives; one of them is the ZIP format. ZIP can also create archives of multiple files, while compress only handles a single file.
    So you're out of luck using the java.util.zip package to read a compress-ed file.
    You might be able to start the uncompress command via Runtime.exec and read the uncompressed data from the process's standard output (Process.getInputStream).
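    A minimal sketch of that approach, assuming the host provides an uncompress (or zcat) command; the class and method names here are only placeholders:

    import java.io.*;

    public class UncompressViaExec {

        // Runs "uncompress -c <path>" and returns the decompressed text.
        public static String readCompressedFile(String path) throws IOException, InterruptedException {
            // "uncompress -c" writes the decompressed data to its standard output
            Process p = Runtime.getRuntime().exec(new String[] {"uncompress", "-c", path});
            BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
            StringBuilder sb = new StringBuilder();
            String line;
            while ((line = r.readLine()) != null) {
                sb.append(line).append('\n');
            }
            r.close();
            p.waitFor();
            return sb.toString();
        }
    }

    If the snapshots were gzipped instead, java.util.zip.GZIPInputStream could read them directly without an external process.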

  • Yosemite 10.10.1 will not allow me to copy multiple files to a NAS -error file in use, but other OS OK.

    I am running Mac OS 10.10.1 on a Mac Mini with 16G.
    I have a Buffalo NAS and wanted to copy files from one folder on the NAS  to another on the NAS.
    Under Yosemite I can only copy them one at a time. If I try to copy more than one I get an error message that the file is in use.
    Under SNS 10.6.3 (running under Parallels) I can easily copy a large number of files.
    Again under 10.10.1 I get the same problem when copying from my desktop to the NAS.
    The files are definitely not in use elsewhere and in any case if they were I wouldn't be able to copy one at a time.
    When I try to copy multiple files the OS creates greyed out icons of all the files to copy and then aborts when it actually tries to copy the first file.
    Any clues please? My current workaround is to compress all the files to move, move the archive, and expand the archive. This is not going to work for very large numbers of files - I'll be at it all day.
    Tony

    I have been unable to drag and drop large files, even onto the Desktop, let alone onto my NAS - I get the dragged files freezing in the middle of the screen or I get the 'unable to copy / file in use' error.
    Talking to Apple Support, they had me set up a new user, and within that new account the trouble went away. This pointed to the startup items being run in the original, troublesome account.
    I deleted all of my 3rd-party programs and then installed them one by one until the problem came back; in my case (MacBook Air 5,2 running 10.10.3 Yosemite) I found it was Dropbox causing the conflict. I have Dropbox running in the menu bar (top of screen), and disabling sync (bottom left corner) stopped the file copy problem. Try disabling Dropbox sync if file copying is failing.

  • Firefox 33 doesn't display a pdf file when using the response object

    Firefox 33.0.2 does not display pdf files when using the code below from an asp.net program, which works for previous versions of Firefox, and also works with IE. I'm using the built-in pdf viewer. All of my plugins are disabled.
    Dim strPDF As String
    strPDF = Session("filname") 'pdf filename
    Response.Clear()
    Response.ClearHeaders()
    Response.Buffer = True
    Response.ContentType = "application/pdf"
    Response.CacheControl = "Private"
    Response.AddHeader("Pragma", "no-cache")
    Response.AddHeader("Expires", "0")
    Response.AddHeader("Cache-Control", "no-store, no-cache, must-revalidate")
    Response.AddHeader("Content-Disposition", "inline; filename=" + strPDF)
    Response.WriteFile(strPDF)
    Response.Flush()
    Response.Close()
    Response.Clear()
    Response.End()
    Session("filname") = ""

    Thanks cor-el. You pointed me in the right direction. It appears to me that a reported Firefox 33 bug with the handling of compression (Transfer-Encoding: chunked) is the culprit (https://support.mozilla.org/en-US/questions/1026743). I was able to find a work-around by specifying the file size and buffering. Below is my code, with some code from http://www.codeproject.com/Questions/440054/How-to-Open-any-file-in-new-browser-tab-using-ASP.
    Dim strPDF As String
    strPDF = Session("filname") 'pdf filename
    Dim User As New WebClient()
    Dim FileBuffer As [Byte]() = User.DownloadData(strPDF)
    If Not (FileBuffer Is Nothing) Then
    Response.Clear()
    Response.ClearHeaders()
    Response.CacheControl = "Private"
    Response.AddHeader("Pragma", "no-cache")
    Response.AddHeader("Expires", "0")
    Response.AddHeader("Cache-Control", "no-store, no-cache, must-revalidate")
    Response.ContentType = "application/pdf"
    Response.AddHeader("content-length", FileBuffer.Length.ToString())
    Response.BinaryWrite(FileBuffer)
    Response.Flush()
    Response.Close()
    Response.Clear()
    Response.End()
    End If
    Session("filname") = ""

  • How can remove child into file xml using J2ME

    I want to delete a child from an XML file using a MIDlet, but nothing is changed in the file.
    Please help me.
    /*
     * To change this template, choose Tools | Templates
     * and open the template in the editor.
     */
    package ajou;

    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.util.Vector;
    import javax.microedition.io.Connector;
    import javax.microedition.io.HttpConnection;
    import javax.microedition.midlet.MIDlet;
    import org.kxml2.io.KXmlParser;
    import org.kxml2.kdom.Document;
    import org.kxml2.kdom.Element;
    import org.kxml2.kdom.Node;

    /**
     * @author -Manel-
     */
    public class manelGO extends MIDlet {

        public void startApp() {
            try {
                // Open HTTP connection
                HttpConnection httpConnection = (HttpConnection) Connector.open("http://localhost:8080/examples/users.xml");
                // Initialize XML parser
                KXmlParser parser = new KXmlParser();
                parser.setInput(new InputStreamReader(httpConnection.openInputStream()));
                Document theDoc = new Document();
                theDoc.parse(parser);
                Element rootElement = theDoc.getRootElement();
                // Removes the child from the in-memory document only;
                // the modified document is never written back to the server or to a file.
                rootElement.removeChild(1);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public void pauseApp() {
        }

        public void destroyApp(boolean unconditional) {
        }
    }

    To achieve this you can use the bpelx:remove function: http://download.oracle.com/docs/cd/E12483_01/integrate.1013/b28981/manipdoc.htm#CIHJBJFD
    your assign will look like:
    <bpel:assign>
    <bpelx:remove>
    <target variable="person" query="/person/@per" />
    </bpelx:remove>
    </bpel:assign>
    Regards,
    Melvin

  • Could someone please recommend the best program to convert AVI and MPEG files for use (and exporting) into IMovie 11?

    Hi - I understand I need to convert AVI and MPEG files for use within iMovie 11. I have one particular file that has a watermark stating "Created with Flip4Mac WMV Demo". Can someone please recommend the best program for a novice? Thanks!

    Hi Susan,
    It is no simpler converting vobs to mp4 than mpg (mpeg2).
    There is *always* a loss of quality converting video from one lossy-compressed format to another, e.g. mpeg2 to mp4.
    So, if you want to not lose any quality, then converting vobs to mpeg2 (mpg) is the thing to do (since there will be no video/audio re-compression), assuming Lr5 will accept them (sorry, but I do not know). example commands:
    Mac: ffmpeg -i myvideo.vob -sameq myvideo.mpg
    Win: ffmbc -i myvideo.vob -sameq myvideo.mpg
    If Lr5 won't accept them, then you can convert to mp4 with minimal quality loss using a command like:
    Mac: ffmpeg -i myvideo.vob -sameq myvideo.mp4
    Win: ffmbc -i myvideo.vob -sameq myvideo.mp4
    You may need variations of these commands depending on encoding of vob.
    Note: -i means -input-file; -sameq means -same-quality, as much as possible.
    ffm... is smart enough to avoid re-encoding, if possible, when -sameq is used, and makes reasonable choices for audio/video codecs based on extension of output file.
    Of course, you'll need to download/install ffm... program if not already on your machine, and execute commands in a terminal or command window.
    Do keep us posted, please.
    Cheers,
    Rob

  • Is there a way to compress using Java deflator & uncompress using UTL ?

    hi,
    I was wondering if there is a way to compress using the Java Deflater on the Java side and then uncompress in the stored procedure with UTL_COMPRESS.LZ_UNCOMPRESS(BLOB)?
    I tried that, but I'm currently getting invalid data exceptions.
    The other option is to use the Java Inflater in a Java stored procedure, but I want to avoid Java stored procedures.
    Thanks in advance.
    // Java side
    String inputString = loadXML(inputfile);
    byte[] input = inputString.getBytes("UTF-8");
    byte[] output = new byte[input.length];
    Deflater compresser = new Deflater(Deflater.BEST_SPEED, true);
    compresser.setInput(input);
    compresser.finish();
    // Compress into the output buffer; compressedLength is the number of bytes produced
    int compressedLength = compresser.deflate(output);
    OracleCallableStatement insertALLStatement = (OracleCallableStatement) con.prepareCall(insertALLSQL);
    InputStream stream = new ByteArrayInputStream(output, 0, compressedLength);
    insertALLStatement.setBinaryStream(1, stream, compressedLength);
    insertALLStatement.execute();
    -- PL/SQL side
    create or replace PROCEDURE INSERTBYTES
    ( compressed IN BLOB
    ) AS
      uncompressed BLOB;
    BEGIN
      uncompressed := UTL_COMPRESS.lz_uncompress(compressed);
      ...

    That depends.
    Does the Java Deflater use the same compression technique as UTL_COMPRESS.LZ_UNCOMPRESS, i.e. is it using Lempel-Ziv compression algorithms? If so, then yes, there's a possibility it could work; however, it also depends on whether the Java Deflater stores header information about the compressed data differently from how LZ_UNCOMPRESS expects it, or even whether header information is included at all or it's just raw compressed data.
    It sounds a bit like compressing a file with Yoshi compression and then trying to use PKZIP to uncompress it methinks, in which case you're going to be out of luck. You have to ensure the compression algorithms and file formats are compatible.
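    On that point, note that new Deflater(Deflater.BEST_SPEED, true) produces raw deflate data with no header at all, which is one plausible reason for the invalid data errors. If your UTL_COMPRESS version accepts gzip-framed data (an assumption you would need to verify against your own database), a sketch of the Java side using GZIPOutputStream instead of a raw Deflater would be:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPOutputStream;

    public class GzipForDb {

        // Compresses the given bytes with standard gzip framing (header + trailer).
        public static byte[] gzip(byte[] input) throws IOException {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            GZIPOutputStream gzos = new GZIPOutputStream(baos);
            gzos.write(input);
            gzos.close();
            return baos.toByteArray();
        }
    }

    If that still fails, the safest route is the one already mentioned: do the inflation in a Java stored procedure, or keep both the compression and the decompression on UTL_COMPRESS.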

  • File Compression Error Occurred on Expert Settings

    I finally thought I had a decent set of settings to compress the QT movie for YouTube (and will tweak this now that each ten minute movie can be 1 gig! Hooray!). So I used what had worked on the previous movies and got the error
    File Compression Error Occurred
    The movie could not be properly compressed. The movie may contain some data that is invalid for the type of compression selected.
    The movie itself contains imported footage from my camcorder (DV), as I usually use. And I used only iMovie stock music loops and one Sfx...all aif files.
    I've tried tweaking the Expert Settings here and there: changed the 15 fps to 30, changed the key frame setting. At this point, what do I look at? Something within the movie itself? Or the settings (H.264; quality High, but also tried Medium; frame rate tried at 15 and 30; key frame rate tried at Auto and 10; encoding multi-pass; dimensions 640x480; scale Letterbox)? Sound is AAC, 24 kHz, stereo, bit rate 48 kbps. Prepare for internet streaming is currently unchecked; I think it was that way when I last made a movie.
    Don't know where to continue to troubleshoot this. URmpfff. Help!

    Wow...this is a sticky problem. I could not resave the movie as another name. Frozen program, pinwheel of death, etc. I've had about three frozen attempts since I first wrote...
    Googling phrase from error message got only a few lost souls who never did post the solution. One solution someone had figured the problem was with Flip4Mac...to uninstall it, but they had problems uninstalling it. So did it...frozen computer).
    I did get this solution to work to move my movie along. It saved rather quickly to a 350 MB or so full-quality file on Expert Settings. Then I got a single .dv file and imported that into a new iMovie project (which, wonderfully, I titled My Great Movie for Pete's sake... hope I remember to retitle it).
    Along the way were error messages like Error 54, no permission to write to file, no hard drive space, etc. Baloney, I think...or is there any clue in all this?
    Thanks.

  • Converting VOB Files For use in FCP

    Hello,
    I have a large FCP project using VOB files shot on a DVD Recorder.
    I have been using a freeware tool called MPEG Streamclip to convert them into QuickTime files for use with FCP 7. The resulting QT files have all been very choppy when used in the FCP timeline, etc.
    I have options with this freeware to convert to MPEG, TS or export to Quicktime, DV, AVI, MPEG4 or other formats.
    Any suggestions on how to bring these VOB files into Final Cut Pro so that they work well?
    Thanks,
    Marc

    Nick's tutorial is good. Read it.
    I had been using Cinematize for this but would always have trouble with the aspect ratio.
    With MPEG Streamclip I have no problems if I do it this way:
    Export to QuickTime
    Change Compression to: Apple DV/DVCPro-NTSC
    Quality: 100%
    Frame Size: 720x480 (DV-NTSC)
    Sound: Uncompressed (the default)
    Make Movie
    I do a preset of the above.
    Now, when you take that resultant file into FCP:
    In FCP make sure your sequence set-up is DV anamorphic (if footage is anamorphic/16:9). When you bring that file into FCP it will not be flagged as anamorphic. In the browser scroll over and check mark it as anamorphic. NOW drag into the timeline. Aspect ratio is good if you do it this way.
    Sharon

  • File Compression

    I have a web service created in a .net environment that examines existing pdf files in a staging directory prior to sending them over the wire using FTP.
    Part of that process requires that I rename individual files in order to associate them with a particular batch.
    I also have a requirement to reduce the size of individual files as much as possible in order to reduce the traffic going over the line.
    So far I have managed about a 30% compression rate by using an open source library (iTextSharp).
    I  am hoping that I can get a better compression rate using the Acrobat SDK, but I need someone to show me how, hopefully with an example that I can follow.
    The following code snippet is a model I wrote that accomplishes the rename and file compression...
    const string filePrefix = "19512-";
    string[] fileArray = Directory.GetFiles(@"c:\temp");
    foreach (var pdffile in fileArray) {
        string[] filePath = pdffile.Split('\\');
        var newFile = filePath[0] + '\\' + filePath[1] + '\\' + filePrefix + filePath[2];
        var reader = new PdfReader(pdffile);
        var stamper = new PdfStamper(reader, new FileStream(newFile, FileMode.Create), PdfWriter.VERSION_1_5);
        // Recompress the content stream of every page
        int pageCount = reader.NumberOfPages;
        for (int i = 1; i <= pageCount; i++) {
            reader.SetPageContent(i, reader.GetPageContent(i), PdfStream.BEST_COMPRESSION);
        }
        stamper.Writer.CompressionLevel = PdfStream.BEST_COMPRESSION;
        stamper.FormFlattening = true;
        stamper.SetFullCompression();
        stamper.Close();
    }
    Any assistance is appreciated.
    regards,
    Greymajek

    Greymajek wrote:
    ...using the Acrobat SDK...
    Then you better ask in the Acrobat SDK forum.

  • Need to extract compression used in TIFF...

    I would like to extract some information from a TIFF image. I would like to know the compression scheme used for each page. I don't really need to decode the TIFF image. I specifically need to know whether the TIFF file is using type 6 JPEG compression or type 7 JPEG-in-TIFF compression, and I need to know this information for each page (as some images seem to be "mixed").
    Do I need JAI to do this? How would I do this?

    Have you actually managed to use ImageIO to read any images yet? That's where you need to start. I'm not going to go over that now as it has been covered many times by many people.
    Once you have an ImageReader object up and running, you'll need to know a little bit more about TIFF metadata. The tag number for the compression value is '259'. I don't know what values the compressions you're talking about will be, but you'll soon find out.
    The following code should (hopefully) take the TIFF metadata out of the ImageReader and display what the image's compression is. The TIFF files I've been working with all have Group 4 Fax compression (the value this displays is just '4'), so I don't know how your results are going to look. Please let us know how you get on!
    TIFFDirectory tiffDir = TIFFDirectory.createFromMetadata(reader.getImageMetadata(image));
    TIFFField tf = tiffDir.getTIFFField(259);
    TIFFTag tag = tf.getTag();
    String compressionString = tf.getAsString(0);
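    For completeness, here is a rough sketch of getting that reader in the first place; it assumes a TIFF-capable ImageIO plugin (such as JAI Image I/O Tools, which also provides TIFFDirectory) is installed, and the file name is just a placeholder:

    import java.io.File;
    import java.util.Iterator;
    import javax.imageio.ImageIO;
    import javax.imageio.ImageReader;
    import javax.imageio.stream.ImageInputStream;

    public class TiffCompressionDump {
        public static void main(String[] args) throws Exception {
            ImageInputStream iis = ImageIO.createImageInputStream(new File("scan.tif"));
            Iterator<ImageReader> readers = ImageIO.getImageReaders(iis);
            ImageReader reader = readers.next();   // fails if no TIFF-capable plugin is found
            reader.setInput(iis);
            int pages = reader.getNumImages(true); // one "image" per TIFF page
            for (int image = 0; image < pages; image++) {
                // run the tag-259 metadata code above for each page index
            }
        }
    }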

  • File Compression Problem

    Hello all,
    I'm having a strange file compression problem with CS5 on our new Mac Pro.  We purchased the Mac Pro to scan and process images, but the JPEGs and GIFs we create from this computer are much larger than they should be when closed (e.g. images that should be compressed to 6KB are reading as 60KB, and the file size is often smaller when opened than closed). Furthermore, anytime we use these image files in other programs (e.g. Filemaker Pro) the inflated file size will carry over.  What's even more puzzling is that the same files that are reading as 60KB on our Mac Pro will read correctly as 6KB from a PC.  Similarly, if we embed these images -- that were created on the Mac Pro -- into Filemaker from a PC, the image file size is correct.  We cannot use the compressed files we create on our Mac Pro because the inflated file size will be passed on to whatever application we use on the Mac Pro (except for Photoshop).
    We have been processing images for years on a PC and haven't had any troubles with this.   We were thinking for a while that the problem was with the Mac operating system, but after many calls with expert Apple advisers it seems like Photoshop for Mac has to be the issue.  We have already tried reformatting and re-indexing the hard drive, and at this point there is nothing else that can be done from Apple's end.  The last expert I spoke with at Apple said that it sounds like the way Photoshop for Mac compresses files and how Mac reads file sizes is very different from the way Photoshop for PC compresses files and how Windows reads file sizes.  If he was correct, and there is no work-around, we'll have no other choice and will have to return our Mac.
    Has anyone else experienced this before?  The experts at Apple were thoroughly confused by the problem and so are we.
    Thanks,
    Jenny

    This has nothing to do with compression.
    Macintosh saves more metadata, and more previews than Windows - that's one part.
    Macintosh shows the size of the file on disk, including wasted space due to the disk block size - that's another part. (look at the byte count, not the size in K or MB).
    When you upload the files to a server, or use them in most programs, that extra metadata is lost and the file sizes should be identical.
    I can't believe that your advisors spent any time on such a trivial issue...
