Uploading Very Large Files via HTTP

I am developing some classes that must upload files to a web server via HTTP using multipart/form-data. On the server side I am using the Apache Commons FileUpload library (commons-fileupload-1.0.jar). My code fails on large files, or on large quantities of small files, because of the VM's memory limit. For example, when uploading a 429 MB file I get this exception:
java.lang.OutOfMemoryError
Exception in thread "main"
Regardless of the server-side component, I have never succeeded in uploading more than ~30 MB.
In a production environment I cannot alter the client VM's memory settings, so I must code my client classes to handle such cases.
How can this be done in Java? This is the method that reads in a selected file and immediately writes it to the output stream (bufferedOutputStream) connected to the web resource:
private void write(File file) throws IOException {
  byte[] buffer = new byte[bufferSize];
  BufferedInputStream fileInputStream =
      new BufferedInputStream(new FileInputStream(file));
  // read in the file
  if (file.isFile()) {
    System.out.print("----- " + file.getName() + " -----");
    while (fileInputStream.available() > 0) {
      if (fileInputStream.available() >= 0 &&
          fileInputStream.available() < bufferSize) {
        buffer = new byte[fileInputStream.available()];
      }
      fileInputStream.read(buffer, 0, buffer.length);
      bufferedOutputStream.write(buffer);
      bufferedOutputStream.flush();
    }
    // close the file's input stream
    try {
      fileInputStream.close();
    } catch (IOException ignored) {
      fileInputStream = null;
    }
  } else {
    // do nothing for now
  }
}

The problem is that the entire file, and any subsequent files being read in, are all packed onto the output stream and don't actually begin moving until close() is called. Eventually the VM gives way.
I require my client code to behave no differently from a typical web browser uploading or downloading a file via HTTP. I know of several commercial applets that can do this; why can't I? Can someone please educate me, or at least point me to a useful resource?
Thank you,
Henryiv

Are you guys suggesting that the failures I'm experiencing in my client code are a direct result of the web resource's (servlet's) caching of my request (files)? Because the exception that I am catching is on the client machine and is not generated by the web server.
trumpetinc, your last statement intrigues me. It sounds as if you are suggesting having the client code and the servlet code open sockets and talk directly with one another. I don't think our customers would like that too much.

Answering your first question:
Your original post made it sound like the server is running out of memory. Is the out of memory error happening in your client code???
If so, then the code you provided is a bit confusing - you don't tell us where you are getting the bufferedOutputStream - I guess I'll just assume that it is a properly configured member variable.
OK - so now, on to what is actually causing your problem:
You are sending the stream in a very odd way. I highly suspect that your call to
  buffer = new byte[fileInputStream.available()];
is resulting in a massive buffer (fileInputStream.available() probably just returns the size of the file).
This is what is causing your out of memory problem.
The proper way to send a stream is as follows:
     static public void sendStream(InputStream is, OutputStream os, int bufsize)
                 throws IOException {
          byte[] buf = new byte[bufsize];
          int n;
          while ((n = is.read(buf)) > 0) {
               os.write(buf, 0, n);
          }
     }

     static public void sendStream(InputStream is, OutputStream os)
                 throws IOException {
          sendStream(is, os, 2048);
     }

The simple implementation with the hard-coded 2048 buffer size is fine for almost any situation.
Note that in your code, you are allocating a new buffer every time through your loop. The purpose of a buffer is to have a block of memory allocated that you then move data into and out of.
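For instance (a sketch, assuming bufferedOutputStream really is the properly configured member variable assumed above), the original write method collapses to something like:

     private void write(File file) throws IOException {
          // sendStream reuses one small buffer; no per-iteration allocation
          InputStream in = new BufferedInputStream(new FileInputStream(file));
          try {
               sendStream(in, bufferedOutputStream);
          } finally {
               in.close();
          }
     }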
Answering your second question:
No - actually, I'm suggesting that you use an HttpURLConnection to connect to your servlet directly - no need for special sockets or ports, or even custom protocols.
Just emulate what your browser does, but do it in the applet instead.
There's nothing that says you can't send a large payload to an HTTP servlet without multipart MIME encoding it. Multipart is simply what browsers do when uploading a file through a standard HTML form tag.
I can't see that a customer would have anything to say on the matter at all - you are using standard ports and standard communication protocols. The exception is if you are not in control of the server-side implementation and it has already been dictated that you will MIME-encode the upload. If that is the case, and they really do support uploads of huge files like this, then their architect should be encouraged to think about a more efficient upload mechanism (like the one I describe) that does NOT MIME-encode the file contents.
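A minimal sketch of that approach (servletUrl and file are placeholders; on Java 5+ the setFixedLengthStreamingMode call keeps HttpURLConnection from buffering the entire body in memory just to compute the Content-Length header):

     // Sketch: raw (non-multipart) upload straight to a servlet.
     HttpURLConnection conn = (HttpURLConnection) servletUrl.openConnection();
     conn.setRequestMethod("POST");
     conn.setDoOutput(true);
     conn.setRequestProperty("Content-Type", "application/octet-stream");
     // Java 5+: stream the body instead of buffering it for Content-Length
     conn.setFixedLengthStreamingMode((int) file.length()); // files < 2 GB
     InputStream in = new BufferedInputStream(new FileInputStream(file));
     try {
          sendStream(in, conn.getOutputStream());
     } finally {
          in.close();
     }
     int status = conn.getResponseCode(); // forces the request to complete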
- K

Similar Messages

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files with a size of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB I either receive a SocketException (connection reset by peer) or an error from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask you how to work around the security validation error. As far as I understand, the token which I have added to the X-RequestDigest header of my HTTP POST request seems to have expired (by default it expires after 1800 seconds).
    Uploading such large files is time consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to continuously send POST requests with the same token while uploading the file, and thereby prevent the token from expiring? Is there any other strategy for uploading such large files, which need much more than 1800 seconds to upload?
    Additionally, any thoughts on the socket exception? It happens quite sporadically: sometimes after uploading 100 MB, and other times after I have already uploaded 1 GB.
    Thanks in advance!
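    One commonly suggested approach for the digest problem (a sketch, assuming the standard /_api/contextinfo endpoint is available on your site; siteUrl and the authentication plumbing are placeholders) is to re-request the form digest on a timer and use the fresh value in the X-RequestDigest header of whatever you send next:

    // Sketch: refresh the SharePoint form digest before it expires.
    HttpURLConnection conn = (HttpURLConnection)
        new URL(siteUrl + "/_api/contextinfo").openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Accept", "application/json;odata=verbose");
    conn.setDoOutput(true);
    conn.getOutputStream().close(); // empty POST body
    // Parse d.GetContextWebInformation.FormDigestValue (and
    // FormDigestTimeoutSeconds) out of the JSON response, then use that
    // value as the X-RequestDigest header on subsequent upload requests.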

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2 GB, so I thought I would try and cover this.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a file of 512 MB to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached:
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and SendChunked, but still no success.

  • WebLogic Apache bridge problems on uploading large files via HTTP post

    I have a problem uploading files larger than a quarter of a megabyte: the JSP page does a POST to a servlet which reads the input stream and writes to a file.
    Configuration: Apache webserver 1.3.12 connected to the WebLogic 5.1 application server via the bridge (mod_wl_ssl.so) from WebLogic Service Pack 4.
    The upload goes on for about 30 secs and then throws the following error:
    "Failure of WebLogic APACHE bridge:
    IO error writing POST data to 100.12.1.2:7001; sys err#: [32] sys err msg [Broken pipe]
    Build date/time: Jul 10 2000 12:29:18"
    The same upload (in fact I uploaded an 8 MB file) works using the Netscape (NSAPI) WebLogic connector.
    Any answers would be deeply appreciated.


  • Very large file upload 2 GB with Adobe Flash

    Hello, does anyone know how I can upload very large files with Adobe Flash, or how to use SWFUpload?
    Thanks in advance; all help will be much appreciated.

    1. yes
    2. I'm getting an error message from PHP:
    if( $_FILES['Filedata']['error'] == 0 ){
      if( move_uploaded_file( $_FILES['Filedata']['tmp_name'], $uploads_dir.$_FILES['Filedata']['name'] ) ){
        echo 'ok';
        exit();
      }
    }
    echo 'error';    // Overhere
    exit();

  • How to upload large file with http via post

    Hi guys,
    Does anybody know how to upload a large file (>100 MB) from an applet to a servlet over HTTP using the POST method? Thanks in advance.
    Regards,
    Mark.

    Hi SuckRatE
    Thanks for your reply. Could you give me some client-side code to upload a large file? I use URL to connect to the server, and it throws an out of memory exception. Part of the client code is below:
    // connect to the servlet
    URL theServlet = new URL(servletLocation);
    URLConnection servletConnection = theServlet.openConnection();
    // inform the connection that we will send output and accept input
    servletConnection.setDoInput(true);
    servletConnection.setDoOutput(true);
    // don't use a cached version of the URL connection
    servletConnection.setUseCaches(false);
    servletConnection.setDefaultUseCaches(false);
    // specify the content type of the data we will send
    servletConnection.setRequestProperty("Content-Type", "application/octet-stream");
    // send the file to the servlet
    OutputStream outStream = servletConnection.getOutputStream();
    FileInputStream filein = new FileInputStream(largeFile);
    //BufferedReader in = new BufferedReader(new InputStreamReader(servletConnection.getInputStream()));
    //System.out.println("tempCurrent = " + in.readLine());
    byte abyte[] = new byte[2048];
    int cnt = 0;
    while ((cnt = filein.read(abyte)) > 0) {
        outStream.write(abyte, 0, cnt);
    }
    filein.close();
    outStream.flush();
    outStream.close();
    Regards,
    Mark.
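    For what it's worth, the out of memory error in code like this usually isn't the read/write loop (which is fine); it's that URLConnection buffers the entire request body in memory so it can set a Content-Length header. On Java 5 or later, one extra line before getOutputStream() switches the connection to chunked transfer so the body streams out as it is written (a sketch against the code above):

    // Sketch: send the body chunked instead of buffering it all in
    // memory to compute Content-Length (Java 5+).
    HttpURLConnection servletConnection =
        (HttpURLConnection) theServlet.openConnection();
    servletConnection.setChunkedStreamingMode(2048); // before getOutputStream()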

  • Tips or tools for handling very large file uploads and downloads?

    I am working on a site that has a document repository feature. The documents are stored as BLOBs in an Oracle database, and for reasonably sized files it's no problem to stream the files out directly from the database. For file uploads, I am using the Struts module to get them onto disk and am then putting the BLOB in the database.
    We are now being asked to support very large files of 250MB+. I am concerned about problems I've heard of with HTTP not being reliable for files over 256MB. I'd also like a solution that would give the user a status bar and allow for restarts of broken uploads or downloads.
    Does anyone know of an off-the-shelf module that might help in this regard? I suspect an ActiveX control or Applet on the client side would be necessary. Freeware or Commercial software would be ok.
    Thanks in advance for any help/ideas.

    Hi. There is nothing wrong with HTTP handling 250MB+ files (per se).
    However, connections can get reset.
    Consider offering the files via FTP. Most FTP clients are good about resuming transfers.
    Or if you want to keep using HTTP, try supporting chunked encoding. Then a user can use something like 'GetRight' to auto resume HTTP downloads.
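    As a sketch of the resume half of that (standard HTTP/1.1 Range semantics; url and partFile are placeholders for the download source and the partially downloaded file):

    // Sketch: resume an interrupted HTTP download with a Range request.
    long have = partFile.length();
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestProperty("Range", "bytes=" + have + "-");
    if (conn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL) { // 206
        // server honored the range: append conn.getInputStream() to partFile
    } else {
        // server ignored the Range header: restart the download from byte 0
    }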
    Hope that helps,
    Peter
    http://rimuhosting.com - JBoss EJB/JSP hosting specialists

  • Very large files checkin to DMS/content server

    Dear experts,
    We have DMS with content server up and running.
    However, there is a problem with very large files (up to 2GB!) - these files cannot be checked in due to extremely long upload times and maybe GUI limitations.
    Does anyone have suggestions for how we could get very large files into the content server? I only want to check them in (either via DMS or directly to the content server) and then get the URL back.
    best regards,
    Johannes

    Hi Johannes,
    unfortunately there is a file-size limit for the Content Server of about 2 GB. Please note that such large files will cause very long upload times. If possible, I would recommend splitting the files into smaller parts and trying to check those in.
    From my point of view it is not recommended to put files directly on the Content Server, because this could lead to inconsistencies on the Content Server.
    Best regards,
    Christoph
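    If you go the splitting route described above, a minimal Java sketch of cutting a file into fixed-size parts (source and partSize are placeholders) could look like:

    // Sketch: split source into numbered parts of at most partSize bytes.
    InputStream in = new BufferedInputStream(new FileInputStream(source));
    byte[] buf = new byte[8192];
    OutputStream out = null;
    long remaining = 0; // bytes still allowed into the current part
    int part = 0, n;
    while ((n = in.read(buf)) > 0) {
        int off = 0;
        while (off < n) {
            if (out == null || remaining == 0) {
                if (out != null) out.close();
                out = new BufferedOutputStream(new FileOutputStream(
                    source.getName() + ".part" + part++));
                remaining = partSize;
            }
            int chunk = (int) Math.min(n - off, remaining);
            out.write(buf, off, chunk);
            off += chunk;
            remaining -= chunk;
        }
    }
    if (out != null) out.close();
    in.close();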

  • After some running time I cannot download large files via any browser.

    I am running OS X 10.9.2 on a 27inch iMac 2013 model.
    If I keep it running (without sleeping) for some time, perhaps a week, I can no longer download any large files via any browser. A download will start, drop to 0 B/s after a very short time, stall there, and finally abort with an "Unknown network error" or something similar.
    Has anyone seen similar problems?

    Open the router setup using http://192.168.1.1 ..... you will see username & password ..... leave the username blank & for the password use admin ...............
    Click the Status tab .......... check the firmware version ....... download the latest firmware from http://linksys.com/download ..........
    Update the firmware .... reset the router for 20-30 seconds ........ then reconfigure the router ........

  • Web Service to transmit a file (a very large file)

    Greetings,
    Is it possible to use a web service to transmit a file, sometimes a very large file? If so, what are the security implications of doing such a thing?
    Thanks,
    Charles

    Quick answer: It depends... Did you try using Google to look up examples of doing this? Here is a URL to try: http://www.google.com/#sclient=psy&hl=en&source=hp&q=web+service+transfer+file&pbx=1&oq=web+service+transfer+file&aq=f&aqi=g1g-j3g-b1&aql=&gs_sm=e&gs_upl=1984731l1997518l0l1997874l25l19l0l1l1l0l304l3879l0.7.10.1l18l0&bav=on.2,or.r_gc.r_pw.&fp=7caaed5eb0414d97&biw=1280&bih=871
    Thank you,
    Tony Miller
    Webster, TX
    Never Surrender Dreams!
    JMS
    If this question is answered, please mark the thread as closed and assign points where earned.

  • How do I share a very large file?

    How do I share a very large file?

    Do you want to send a GarageBand project or the bounced audio file?  Sending an audio file is not critical, but if you want to send the project, use "File > Compress" to create a .zip file of the project before you send it.
    If you have a Dropbox account,  I'd simply copy the file into the Dropbox "public" folder and mail the link. Right-click the file in the Dropbox, then choose Dropbox > Copy Public Link. This copies an Internet link to your file that you can paste anywhere: emails, instant messages, blogs, etc.
    2 GB on Dropbox are free.     https://www.dropbox.com/help/category/Sharing

  • Handling a very large file from the sender system

    Hi SDN,
    Can you please tell me what should be done on the SAP XI side to handle a very large file coming from the sender system to be posted to the receiver? What should be done on the XI end so that there is no problem handling the large file?
    Thanks and regards,
    Aniruddha Bhattacharya

    Hi Aniruddha,
    Go to the link below and check point 3.1.1; maintain the parameters as mentioned in that point.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad

  • Problem downloading a file via http

    Hi
    I'm just getting started with WLS (sp5) and am having a problem downloading
    a file via http. The document is stored in the main html docs directory and
    whenever I link to it or try to download it directly (eg:
    http://<host>:<port>/myfile.doc) I get the following error in a message box:
    Your current security settings do not allow this file to be downloaded.
    Can anyone point me in the right direction as to where I grant permissions
    to do this - I've tried using the weblogic.security.URLAclFile and adding
    the directory as a weblogic.io.fileSystem (a desperation move, I know).
    Thanks in advance,
    Peter Villiers

    PLEASE IGNORE THIS POST
    The problem was caused by someone (me, though I honestly don't remember doing it) setting the content security level to high in my web browser, which stopped this type of download.
    Peter

  • Unable to copy very large file to eSATA external HDD

    I am trying to copy a VMWare Fusion virtual machine, 57 GB, from my Macbook Pro's laptop hard drive to an external, eSATA hard drive, which is attached through an ExpressPort adapter. VMWare Fusion is not running and the external drive has lots of room. Disk Utility finds no problems with either drive. I have excluded both the external disk and the folder on my laptop hard drive that contains my virtual machine from my Time Machine backups. At about the 42 GB mark, an error message appears:
    The Finder cannot complete the operation because some data in "Windows1-Snapshot6.vmem" could not be read or written. (Error code -36)
    After I press OK to remove the dialog, the copy does not continue, and I cannot cancel the copy. I have to force-quit the Finder to make the copy dialog go away before I can attempt the copy again. I've tried rebooting between attempts, still no luck. I have tried a total of 4 times now, exact same result at the exact same place, 42 GB / 57 GB.
    Any ideas?

    Still no breakthrough from Apple. They're telling me to terminate the VMWare processes before attempting the copy, but had they actually read my description of the problem first, they would have known that I already tried this. Hopefully they'll continue to investigate.
    From a correspondence with Tim, a support representative at Apple:
    Hi Tim,
    Thank you for getting back to me, I got your message. Although it is true that at the time I ran the Capture Data program there were some VMWare-related processes running (PID's 105, 106, 107 and 108), this was not the case when the issue occurred earlier. After initially experiencing the problem, this possibility had occurred to me so I took the time to terminate all VMWare processes using the activity monitor before again attempting to copy the files, including the processes mentioned by your engineering department. I documented this in my posting to apple's forum as follows: (quote is from my post of Feb 19, 2008, 1:28pm, to the thread "Unable to copy very large file to eSATA external HDD", relevant section in >bold print<)
    Thanks for the suggestions. I have since tried this operation with 3 different drives through two different interface types. Two of the drives are identical - 3.5" 7200 RPM 1TB Western Digital WD10EACS (WD Caviar SE16) in external hard drive enclosures, and the other is a smaller USB2 100GB Western Digital WD1200U0170-001 external drive. I tried the two 1TB drives through eSATA - ExpressPort and also over USB2. I have tried the 100GB drive only over USB2 since that is the only interface on the drive. In all cases the result is the same. All 3 drives are formatted Mac OS Extended (Journaled).
    I know the files work on my laptop's hard drive. They are a VMWare virtual machine that works just fine when I use it every day. >Before attempting the copy, I shut down VMWare and terminated all VMWare processes using the Activity Monitor for good measure.< I have tried the copy operation both through the finder and through the Unix command prompt using the drive's mount point of /Volumes/jfinney-ext-3.
    Any more ideas?
    Furthermore, to prove that there were no file locks present on the affected files, I moved them to a different location on my laptop's HDD and renamed them, which would not have been possible if there had been interference from vmware-related processes. So, that's not it.
    Your suggested workaround, to compress the files before copying them to the external drive, may serve as a temporary workaround but it is not a solution. This VM will grow over time to the point where even the compressed version is larger than the 42GB maximum, and compressing and uncompressing the files will take me a lot of time for files of this size. Could you please continue to pursue this issue and identify the underlying cause?
    Thank you,
    - Jeremy

  • Upload a large file to OCS10g workspace

    Hi,
    I have a workspace on OCS10g and the quota of the workspace is 2GB.
    But when I upload a large file (about 100 MB), errors occur.
    Undoubtedly, it works well in the case of small files.
    I have already set the parameter IFS.DOMAIN.MEDIA.CONTENTTRANSFER.ContentLimit to 0.
    That means unlimited permission to upload to a workspace.
    Is there any other prerequisite for uploading a large file to a workspace?
    Regards,
    Kitae

    Have you checked the Apache Timeout directive? It might need to be larger. There are no other general tricks or parameters to tweak that our dev team is aware of. Please file a metalink tar for this and detail the error message. I'm sure Oracle Support Services will be able to help you mine the logs to find out what is going on.
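    For reference, the directive in question lives in httpd.conf; the value is in seconds, and something like the following (1200 is purely illustrative, not a recommendation) gives slow, large uploads more time before the connection is dropped:

    # httpd.conf -- allow slow, large uploads more time before timing out
    Timeout 1200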

  • How do I control read start position in a very large file where start byte position may be larger than I32 (+/- 2^31)?

    Using LabView, I am trying to read a very large file which may be on the order of 2^32 bytes. I need to be able to step into the file at a byte position which may be greater than the I32 limit set by the read file.vi. Are there any options to the read file.vi or a method of circumventing this limitation?

    I'm not sure, but I think you can set the "pos mode" input on the "seek" sub-VI.
    The "pos mode" input lets you choose the position from which the byte offset you supply is measured.
    I think you can first seek by an I32 offset with "pos mode" set to "start", and then seek again with "pos mode" set to "current" to add a further offset. That way you can reach a position more than 2^31 bytes from the start of the file.
    I hope you understand my idea; I haven't tried it myself, but I think it would work.
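    For comparison only, since this thread is about LabVIEW: file APIs that take a 64-bit offset avoid the relative-seek workaround entirely. A minimal Java illustration of the same idea (path is a placeholder):

    // Sketch: RandomAccessFile.seek takes a long, so positions past 2^31
    // need no accumulate-relative-offsets trick.
    RandomAccessFile raf = new RandomAccessFile(path, "r");
    raf.seek(3L * 1024 * 1024 * 1024); // jump straight to the 3 GB mark
    byte[] chunk = new byte[4096];
    int n = raf.read(chunk);           // read from that position
    raf.close();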
