About downloading a big file (like *.avi)

Hi, all:
Sorry, my English is not good.
I have a problem with a JSP that downloads a big file.
The file's real name is not the same as its saved name.
This is the JSP code:
FileInputStream file = null;
BufferedInputStream bis = null;
ServletOutputStream aaa = null;
BufferedOutputStream bos = null;
try {
    file = new FileInputStream(DLNAME);   // DLNAME is the path of the file on disk
    bis = new BufferedInputStream(file);
    response.setContentType("APPLICATION/OCTET-STREAM;charSet=Shift_JIS");
    // filename is the name the client should save the file as
    response.setHeader("Content-Disposition", "attachment; filename=\"" + filename + "\"");
    aaa = response.getOutputStream();
    bos = new BufferedOutputStream(aaa);
    byte[] buff = new byte[2048];
    int bytesRead = 0;
    while ((bytesRead = bis.read(buff, 0, buff.length)) != -1) {
        bos.write(buff, 0, bytesRead);
    }
    bos.flush();
} catch (IOException e) {
    System.out.println(e.getMessage());
} catch (Exception e) {
    System.out.println("ddddd");
} finally {
    // Close in reverse order of creation; closing the buffered wrappers
    // also closes the streams they wrap.
    if (bos != null) try { bos.close(); } catch (IOException ignored) {}
    if (bis != null) try { bis.close(); } catch (IOException ignored) {}
}
When the client downloads the file, if he chooses to save it and does not cancel, the JSP works well.
But if he cancels or closes the download partway through, the web page becomes very slow.
Please help me!!!
my E-mail: [email protected]
Thanks!!!

Hello,
Sorry! My English is not good.
The server's OS is Windows 2000, and the clients use Windows too.
When clients run my code, the standard Windows download dialog appears,
so I can't tell what the client chooses to do.
Can you tell me how to detect the client's action?
Thanks!
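A note on detecting the cancel: the browser never tells the server which button was pressed; the only signal the server gets is that the connection is gone, at which point writing to the response stream fails. A minimal sketch of the copy loop rewritten under that assumption (bis and bos are the same streams as in the JSP code above):

byte[] buff = new byte[2048];
int bytesRead;
boolean clientAborted = false;
try {
    while ((bytesRead = bis.read(buff, 0, buff.length)) != -1) {
        bos.write(buff, 0, bytesRead);
        bos.flush(); // flushing each chunk makes an abort surface quickly
    }
} catch (IOException e) {
    // In this loop an IOException almost always means the client
    // cancelled or closed the download mid-transfer.
    clientAborted = true;
}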

Similar Messages

  • Tips for downloading big files (World of Warcraft)

    I need to download 8 more gigs of the game. I'm downloading directly from Blizzard at about 150 kB/s. I'm just wondering what tips y'all have for downloading big files like this. Also, if my MacBook Pro goes to sleep, will it slow down the download? That seems to be what happens when I leave it on overnight.

    Do it in the evening through overnight, and set the MBP to not sleep: System Preferences > Energy Saver > Computer Sleep = Never.

  • Firefox crashes whenever I want to install any addon or download any file. Sometimes I receive an error while downloading a file, like "could not be saved, because you cannot change the contents of that folder"

    I am using the latest Firefox browser, 3.6.6. I am getting frequent crashes whenever I try to install any addon. The crashes also occur whenever I try to download any file.
    I am also receiving errors while downloading files, like:
    "C:\Users\****\AppData\Local\Temp\******.001.part could not be saved, because you cannot change the contents of that folder.
    Change the folder properties and try again, or try saving in a different location." I have already changed it many, many times, but still have the same problem.
    Adobe Flash Player is also giving "not responding" problems. I am already using the updated version of it. Java is also updated.
    I have already uninstalled Firefox completely and reinstalled it many, many times, but still have the same problem. I also scanned my computer with Avira and Malwarebytes' Anti-Malware and got no detections. Please rectify this problem ASAP, lest my profession suffer.
    == Crash ID(s) ==
    b7f518f2-8d86-41ca-8bab-aee632100709; 1d790e10-d8eb-4904-98c9-94bc62100708; f042d319-b9f8-42ed-a8cb-57c7d2100708

    Please help.
    It is getting worse.
    Adobe Flash Player is crashing. I already uninstalled and reinstalled the latest version. It also hangs randomly.
    Please help.

  • Sudhir Choudhrie - Unable to Download Big Files

    Hi, this is Sudhir Choudhrie. I've chosen Firefox for my daily browsing and downloading needs because it has a function that lets you pause a running download and resume it whenever you want, even the next day, without needing any third-party trial download manager. But nowadays I'm finding it difficult to download big files. Whenever I try to resume a paused download, I cannot download that file again. Please tell me if it's a website error or an error in Firefox. Thanks,

    Hi,
    Thank you for your question. I do not know the answer to this, but I would be happy to test a download URL. If this is an issue with downloads, it is also possible to troubleshoot: [https://support.mozilla.org/en-US/kb/cant-download-or-save-files]

  • rt2860 Wi-Fi network hangs when downloading big files

    After upgrading to the 3.2 kernel, my rt2860 PCI card is not working properly. Wi-Fi connects fine, and I can browse the internet and download small files. But if I try to download a big file (> 1 GB) over the LAN, it starts and then hangs after downloading 8-10 MB. I have to disconnect the network and connect again.
    I fixed it by installing the rt2860 package from the AUR (https://aur.archlinux.org/packages.php?ID=14557) and blacklisting rt2800pci.
    I would be happy with this solution, but now every time the kernel is updated I lose rt2860 after restart, and I have to manually recompile and install the AUR rt2860 package again.
    Are there any tweaks or configs to fix the rt2800pci hanging problem? Or how can I keep from losing the AUR package after every kernel upgrade?

    I do not use rt2800pci on either Arch or Ubuntu. For me, it just doesn't work. You are doing better than I, because I can't get a connection at all with it.
    I put up with the effort of recompiling rt2860 after every kernel update, because it works. Besides, it helps me keep up my chops on compiling and installing kernel modules.
    Tim

  • Download big files

    I am trying to download a big file from the internet. Currently I am doing the following:
                URLConnection conn = downloadURL.openConnection();
                InputStream in = conn.getInputStream();
                OutputStream out = new FileOutputStream(dst);
                // Transfer bytes from in to out; 'cancelled' is a flag set elsewhere
                byte[] buf = new byte[SIZE];
                int len;
                // read() returns -1 at end of stream, so test against -1, not > 0
                while ((len = in.read(buf)) != -1 && !cancelled) {
                    out.write(buf, 0, len);
                }
    That code works for me most of the time, but sometimes the file is not downloaded correctly, and I do not know how to test whether the download completed, nor how to guarantee that the file is downloaded completely.
    Is there some way to do this?
    Greetings,
    Magus

    "That statement makes no sense. I'm programming in Java, and Java has a fine mechanism for throwing exceptions."
    No, that statement makes no sense. Your Java code is talking TCP/IP to a server, which doesn't have such a mechanism. It can close or reset the connection; that's it. If it had reset the connection, your read() would block forever, or time out. If it had closed the connection, your read() would have returned -1, as it did, so that is what happened.
    "1. Can you expound upon what makes you think that is the reason that -1 was returned, as opposed to the connection timing out, the connection dropping, etc.?"
    Because no exception was thrown. Or else one was thrown and you have swallowed it, but the code you posted doesn't indicate that.
    "2. Can you indicate why Java would return a -1 instead of throwing some I/O exception?"
    Because the server, or possibly an intermediate firewall, closed the connection, rather than Java incurring some exception at the client end such as a timeout. The documentation you quoted at me before bears that out. That's what the -1 means. You quoted that at me yourself.
    "The whole point here is that, according to the API documentation, -1 apparently indicates that the end of the stream was reached normally, and that error conditions are indicated by exception."
    Exactly so. So the server closed the connection normally, or an intermediate firewall did, but before it had sent all the data. Why is another question. Have a look at the server logs, or investigate the firewall configuration.
    "I guess I'm asking if anyone has seen this behavior, and has any insight on why it doesn't seem to follow the API."
    It does.
    "You seem to think that this behavior is in line with the API documentation; I disagree."
    Well, you've quoted enough of it yourself: have another look, or another think. Premature (but normal) closing of the connection by the server or a firewall is the only possible explanation for what you're seeing. If it wasn't closed you wouldn't be getting the -1; if there was a timeout you would get a SocketTimeoutException; if there was any other exception you would catch that.
    I've seen plenty of short downloads in my life, but the server doesn't have a way of indicating that via TCP/IP. You have to count.
    NB: some firewalls do close long-lived connections on their own account. Is there a client-side firewall that might be doing that?
    Or else the transmitted Content-Length is wrong; for example, it overflows an integer.
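    The "you have to count" advice above can be made concrete. A minimal sketch, assuming the server sends a Content-Length header; the URL and output file name are placeholders:

        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.URL;
        import java.net.URLConnection;

        // Minimal sketch: count the bytes actually received and compare them
        // against the advertised Content-Length to detect a short download.
        public class CountingDownload {
            public static void main(String[] args) throws IOException {
                URL downloadURL = new URL("http://example.com/big.iso"); // placeholder
                URLConnection conn = downloadURL.openConnection();
                // Parse the header ourselves as a long: getContentLength()
                // returns an int and can overflow for files over 2 GB.
                String lengthHeader = conn.getHeaderField("Content-Length");
                long expected = (lengthHeader == null) ? -1 : Long.parseLong(lengthHeader);
                long received = 0;
                InputStream in = conn.getInputStream();
                OutputStream out = new FileOutputStream("big.iso"); // placeholder
                try {
                    byte[] buf = new byte[8192];
                    int len;
                    while ((len = in.read(buf)) != -1) {
                        out.write(buf, 0, len);
                        received += len;
                    }
                } finally {
                    in.close();
                    out.close();
                }
                if (expected != -1 && received != expected) {
                    // -1 from read() only means the server closed the connection;
                    // counting tells us whether the download actually completed.
                    throw new IOException("Short download: got " + received
                            + " of " + expected + " bytes");
                }
            }
        }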

  • Safari Incompletely Downloads Big Files

    In the next Safari update, the browser should be able to download 100% of files that are at least 1GB in size.

    Hello Michael,
    I meant to get support there to supplement support here. :)
    Happy to help you.
    Since it works in the test/other account, try the following:
    • To reset Safari: under the Safari menu, choose Reset, then click the Reset button in the dialog box.
    • Quit Safari & Sherlock, go to Home ~/Library/Preferences (in list mode) and move the com.apple.Safari.plist & com.apple.Sherlock.plist files to the trash.
    Do a reboot.
    • In the Home ~/Library/Caches folder, drag the Safari cache folder to the trash.
    • Move the Home ~/Library/Cookies/Cookies.plist file to the trash.
    Do a reboot, run Disk Utility, and repair disk permissions. See Travis A.'s Regular Maintenance/General Troubleshooting if you need more on running Disk Permissions Repair.
    Launch & run Safari. Any better?
    Try this freeware maintenance utility if you do not use one already: OnyX. Make sure to download the correct version for your OS. Use the article "Clearing caches with OnyX" to guide you.
    See the Apple knowledge base article "Safari quits unexpectedly" for more.
    Good luck! Please let us know how you fare.
    Eme'~[)

  • Impossible to download "big" file: error 3259

    Hey everyone,
    I got an iPad for Christmas with iOS 3.2.2.
    I installed the latest version of iTunes, 10.1.1.4.
    iTunes wants me to download and upgrade to iOS 4.2... the download starts, but after a while it stops and says error 3259!!!
    I have been fighting for 3 days trying to find a solution!!!
    - I have no antivirus running (as it could be the problem)...
    - My computer is connected to my triple-play box via Ethernet (a Wi-Fi connection could also have been a problem).
    Beyond that, I don't know...
    Please help me... LOL

    no ideas ?????

  • Problems uploading big files via FTP and downloading files

    I've been having problems uploading big files, like video files (.mov, 12 MB), via FTP to my website (small files like .html or .doc I can upload, but it takes longer than usual). I'm using Fetch 4.0.3 as my FTP client. I have the same problems when downloading files via BitTorrent. I recently moved to Spain; since then I seem to have the problem. But my roommate with a PC doesn't have that problem. Connecting to the internet with an Ethernet cable also didn't resolve the problem. I also tested it from a Starbucks, connecting to the internet from there, but still couldn't upload that 12 MB file to the FTP server. The security settings for the firewall are set to "allow all incoming connections". I didn't change any of my settings, so I don't know what the problem could be. I have a MacBook Pro, Mac OS X (10.5.7). Any suggestions? Thanks!

    Welcome to Apple Discussions!
    Much of what is available on Bittorrent is not legal, beta, or improperly labelled versions. If you want public domain software, see my FAQ*:
    http://www.macmaps.com/macosxnative.html#NATIVE
    for search engines of legitimate public domain sites.
    http://www.rbrowser.com/ has a light mode that supports binary without SSH security.
    http://rsug.itd.umich.edu/software/fugu/ has ssh secure FTP.
    I find both are quick and a lot more reliable than Fetch. I know Fetch used to be the staple FTP program, but it grew too big for my use.
    - * Links to my pages may give me compensation.

  • Downloading big files on 3G/4G

    Hello all,
    First of all, I'd like to say I am mad at Nokia for blocking big 3G/4G downloads over 20 MB.
    I have an unlimited internet plan on my SIM deal,
    but my phone stops me from using it.
    OK, here we go:
    I would like Nokia to add an on/off option for downloading files bigger than 20 MB over 3G.
    First, I understand 3G can sometimes be slower than Wi-Fi, and unstable, but I believe it is up to the user to use it or not.
    Second, I understand some people download and go over their data allowance, so I suggest you ship with the 20 MB limit on by default, but let it be turned off in the settings menu.
    Now I will explain why I am making this post today asking for an on/off option for downloads over 20 MB, instead of having them disabled all the time.
    Today I went out in my car and I got lost. Yes, lost. I went to use my phone to get me to a familiar place,
    but as I was about to use Nokia HERE Maps, it asked to download my region's maps. I tried just England on its own: it was 200 MB, and it asked for Wi-Fi. I had no way of getting Wi-Fi, I was lost, and Nokia would not allow me to even try the download on 3G, despite my unlimited internet plan.
    I believe this option should be up to the user to use or not.
    Even if it is slow, downloads crash, or there are data limits, it's up to the users. You are making customers hate Nokia/Windows Phones.
    Thanks, keep the peace. Please add your comments.
    Solved!
    Go to Solution.

    Nokia hasn't told you to go to the Windows webpage; I have suggested you do that. If you want an answer from Nokia you need to contact Nokia directly, as this is a public forum.
    Microsoft controls the software, so it's them that you need to let know that you are unhappy; it really is that simple. I don't know what exactly can be said here to make you feel better.
    Nokia does pass on user feedback to Microsoft but what is the harm done if you also tell Microsoft yourself? Is it really that difficult to copy and paste your comments on to another site?
    There is nothing further that the users of this forum can say or do to help you on this.

  • Problems in Downloading a file from a web site using HttpClient

    Hi,
    My requirement is to download a file from a website using a Java program. I have written a program using the HttpClient API, but I am unable to do so. The reality is that I don't know how to proceed with the HttpClient API to get the file downloaded. The same file can also be downloaded manually by logging in to that website; the steps below are what it takes to download the file manually.
    1. Log in to the website using login/password (I have a valid login/password).
    2. Two options (links) are there: (a) Report, (b) Search. [I am choosing the Report option.]
    3. The newly opened window shows two tabs: (a) Today, (b) From & To. [I am selecting the Today tab.]
    4. Every tab has two radio buttons: (a) File, (b) Destination. [Destination is selected by me.]
    5. When the Search button is pressed, a link gets created. When that link is clicked, a file download popup opens which allows you to open/save the file.
    All 5 of these steps are to be followed in a Java program to download the file. Please help in doing the same.

    // First URL, used to open the website; the login page has two text fields, userName and password
    String finalURL = "http://www.xyz.in/mstatus/index.php";
    SMSGatewayComponentImpl obj = new SMSGatewayComponentImpl();
    obj.sendMessage(finalURL);

    public void sendMessage(String finalURL) {
        String ipAddrs = "a.b.c.d";
        int port = 8080;
        boolean flag = false;
        try {
            // Create an instance of HttpClient and route it through the proxy.
            HttpClient client = new HttpClient();
            HostConfiguration hostConfig = new HostConfiguration();
            hostConfig.setProxy(ipAddrs, port);
            client.setHostConfiguration(hostConfig);
            String cookieName = null;
            String cookieValue = null;
            // Here the login-page URL is posted along with the user/password.
            PostMethod method = new PostMethod(finalURL);
            method.setRequestHeader("userid", "userName");
            method.setRequestHeader("passwd", "pwd");
            // Execute the method.
            int statusCode = client.executeMethod(method);
            Cookie[] cookies = client.getState().getCookies();
            for (int i = 0; i < cookies.length; i++) {
                Cookie cookie = cookies[i];
                cookieName = cookie.getName();
                cookieValue = cookie.getValue();
                System.err.println(
                    "Cookie: " + cookie.getName() +
                    ", Value: " + cookie.getValue() +
                    ", IsPersistent?: " + cookie.isPersistent() +
                    ", Expiry Date: " + cookie.getExpiryDate() +
                    ", Comment: " + cookie.getComment());
            }
            NameValuePair[] respParameters = method.getResponseHeaders();
            String cookie = "";
            for (NameValuePair o : respParameters) {
                System.out.println("Name : " + o.getName());
                System.out.println("Value : " + o.getValue());
                if ("Set-Cookie".equalsIgnoreCase(o.getName()))
                    cookie = o.getValue();
            }
            NameValuePair[] footParameters = method.getResponseFooters();
            System.out.println("****************** Footer Values *******************");
            for (NameValuePair o1 : footParameters) {
                System.out.println("Name : " + o1.getName());
                System.out.println("Value : " + o1.getValue());
            }
            // This is the URL which comes up when the login/password are entered and the Login button is pressed.
            // I am trying to take the cookie from the first URL and pass it to the second URL so the session is maintained.
            // Here I may be wrong... I don't know if this is the right way to download the file.....????
            finalURL = "http://www.xyz.in/mstatus/mainmenugrsms.php";
            method = new PostMethod(finalURL);
            method.setRequestHeader(cookieName, cookieValue);
            method.setRequestHeader("userid", "userName");
            method.setRequestHeader("passwd", "pwd");
            method.setRequestHeader("Set-Cookie", cookie);
            statusCode = client.executeMethod(method);
            respParameters = method.getResponseHeaders();
            for (NameValuePair o : respParameters) {
                System.out.println("Name : " + o.getName());
                System.out.println("Value : " + o.getValue());
            }
            // And this is the final URL, obtained by copying the link that triggers the file download
            // and pasting it into Notepad. I was thinking this would return the file as an input stream,
            // but it's not happening.
            finalURL = "http://www.xyz.in/mstatus/dlr_date.php#";
            method = new PostMethod(finalURL);
            method.setRequestHeader("Set-Cookie", cookie);
            method.setRequestHeader("userid", "userName");
            // The userid and passwd field names were obtained from the view-source of the login page HTML.
            method.setRequestHeader("type", "1");
            // Trying to set the cookie so that the session can be maintained.
            method.setRequestHeader(cookieName, cookieValue);
            method.setRequestHeader("passwd", "pwd");
            statusCode = client.executeMethod(method);
            ObjectInputStream objRetInpuStream = new ObjectInputStream(method.getResponseBodyAsStream());
            System.out.println("objRetInpuStream : " + objRetInpuStream);
            if (objRetInpuStream != null)
                System.out.println("objRetInpuStream available bytes : " + objRetInpuStream.available());
            String returnFile = (String) objRetInpuStream.readObject();
            System.out.println("Returned value \n : " + returnFile);
            respParameters = method.getResponseHeaders();
            for (NameValuePair o : respParameters) {
                System.out.println("Name : " + o.getName());
                System.out.println("Value : " + o.getValue());
            }
            byte[] responseBody = method.getResponseBody();
            System.out.println("Response Body : " + new String(responseBody));
            if (statusCode != HttpStatus.SC_OK) {
                System.out.println("Error: " + method.getStatusLine());
            } else {
                System.out.println(method.getStatusLine());
            }
        } catch (Exception nfe) {
            System.out.println("Exception " + nfe);
        }
    }
    Output
    =====
    /home/loguser/batch> sh run.sh SMSGatewayComponentImpl
    Classname : SMSGatewayComponentImpl
    run.sh[4]: test: 0403-004 Specify a parameter with this command.
    final URL : http://www.xyz.in/mstatus/index.php
    client is :org.apache.commons.httpclient.HttpClient@190e190e
    Cookie: PHPSESSID, Value: anqapu83ktgp8hlot06jtbmdf1, IsPersistent?: false, Expiry Date: null, Comment: null
    Name : Date
    Value : Thu, 06 May 2010 09:08:47 GMT
    Name : Server
    Value : Apache/2.2.3 (Red Hat)
    Name : X-Powered-By
    Value : PHP/5.1.6
    Name : Set-Cookie
    Value : PHPSESSID=anqapu83ktgp8hlot06jtbmdf1; path=/
    Name : Expires
    Value : Thu, 19 Nov 1981 08:52:00 GMT
    Name : Cache-Control
    Value : no-store, no-cache, must-revalidate, post-check=0, pre-check=0
    Name : Pragma
    Value : no-cache
    Name : Content-Length
    Value : 4792
    Name : Content-Type
    Value : text/html; charset=UTF-8
    Name : X-Cache
    Value : MISS from dcp.pas.abc.in
    Name : X-Cache-Lookup
    Value : MISS from dcp.pas.abc.in:8080
    Name : Via
    Value : 1.0 dcp.pas.abc.in:8080 (squid/2.6.STABLE21)
    Name : Proxy-Connection
    Value : keep-alive
    Cookie Value : PHPSESSID=anqapu83ktgp8hlot06jtbmdf1; path=/
    ****************** Footer Values *******************
    Name-2 : Date
    Value-2: Thu, 06 May 2010 09:08:47 GMT
    Name-2 : Server
    Value-2: Apache/2.2.3 (Red Hat)
    Name-2 : X-Powered-By
    Value-2: PHP/5.1.6
    Name-2 : Expires
    Value-2: Thu, 19 Nov 1981 08:52:00 GMT
    Name-2 : Last-Modified
    Value-2: Thu, 06 May 2010 09:08:47 GMT
    Name-2 : Cache-Control
    Value-2: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
    Name-2 : Pragma
    Value-2: no-cache
    Name-2 : Location
    Value-2: index.php
    Name-2 : Content-Length
    Value-2: 0
    Name-2 : Content-Type
    Value-2: text/html; charset=UTF-8
    Name-2 : X-Cache
    Value-2: MISS from dcp.pas.abc.in
    Name-2 : X-Cache-Lookup
    Value-2: MISS from dcp.pas.abc.in:8080
    Name-2 : Via
    Value-2: 1.0 dcp.pas.abc.in:8080 (squid/2.6.STABLE21)
    Name-2 : Proxy-Connection
    Value-2: keep-alive
    Cookie Value second time : PHPSESSID=anqapu83ktgp8hlot06jtbmdf1; path=/
    **Exception java.io.EOFException**
    Is my approach to downloading the file from the website OK????? Please help me.
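    One likely cause of the EOFException above (offered as a reading of the posted code, not a confirmed answer from this thread): the response body is ordinary HTTP content, not Java serialized objects, so ObjectInputStream.readObject() cannot parse it. Note also that form fields such as userid/passwd are normally sent with PostMethod.addParameter(...) rather than as request headers. A minimal sketch of saving the final URL's response as a raw byte stream with the same Commons HttpClient 3.x API (the output file name is a placeholder; the client is assumed to already hold the login session cookies, which HttpClient carries across requests automatically):

        import java.io.FileOutputStream;
        import java.io.InputStream;
        import java.io.OutputStream;
        import org.apache.commons.httpclient.HttpClient;
        import org.apache.commons.httpclient.HttpStatus;
        import org.apache.commons.httpclient.methods.GetMethod;

        // Minimal sketch: write the response body of the download URL to disk
        // as raw bytes instead of wrapping it in an ObjectInputStream.
        public class RawDownload {
            public static void fetch(HttpClient client, String downloadUrl) throws Exception {
                GetMethod get = new GetMethod(downloadUrl);
                int status = client.executeMethod(get);
                if (status != HttpStatus.SC_OK) {
                    System.out.println("Error: " + get.getStatusLine());
                    return;
                }
                InputStream in = get.getResponseBodyAsStream();
                OutputStream out = new FileOutputStream("report.dat"); // placeholder
                try {
                    byte[] buf = new byte[4096];
                    int len;
                    while ((len = in.read(buf)) != -1) {
                        out.write(buf, 0, len);
                    }
                } finally {
                    in.close();
                    out.close();
                    get.releaseConnection(); // hand the connection back to HttpClient
                }
            }
        }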

  • Unable to download large files via my Mobile Network

    I have unlimited mobile data, but I am unable to download large files, like podcasts, unless I connect to Wi-Fi. My settings are set to use mobile data; is there anything I can do?

    There is a 50 MB (recently upgraded from 20 MB) per-file download limit over cellular data. No, you can't change that.

  • Java program to download a file from a location on the internet

    Hi,
    I want to write a Java program that can download a file (like an exe or zip) from the internet. It should also have provision for a proxy server, and it is going to run behind a firewall.

    Read the documentation for URLConnection (and HttpURLConnection); these classes can use a proxy (set via system properties, as mentioned in the documentation), and you can obtain an InputStream to download the remote file, as sketched below.
    Working behind a firewall depends on the firewall and its configuration.
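    A minimal sketch of that suggestion; the proxy host/port and the URL are placeholder values, and http.proxyHost/http.proxyPort are the standard system properties the documentation describes:

        import java.io.FileOutputStream;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.URL;
        import java.net.URLConnection;

        // Minimal sketch: download a file through an HTTP proxy using only java.net.
        public class ProxyDownload {
            public static void main(String[] args) throws Exception {
                // Route HTTP traffic through the proxy (placeholder host/port).
                System.setProperty("http.proxyHost", "proxy.example.com");
                System.setProperty("http.proxyPort", "8080");

                URL url = new URL("http://example.com/tool.zip"); // placeholder
                URLConnection conn = url.openConnection();
                InputStream in = conn.getInputStream();
                OutputStream out = new FileOutputStream("tool.zip");
                try {
                    byte[] buf = new byte[8192];
                    int len;
                    while ((len = in.read(buf)) != -1) {
                        out.write(buf, 0, len);
                    }
                } finally {
                    in.close();
                    out.close();
                }
            }
        }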

  • Firefox crashes when downloading huge files

    When downloading huge files, like a 4 GB DVD image of Linux, Firefox starts to build up memory usage. After downloading 3 GB it takes around 4 GB of memory and then crashes, since it can't possibly grow any larger. WHY is it doing this? In earlier versions of Firefox it created a .part file and stored all the data in there. But right now it stores the WHOLE download in memory and then saves it directly to the file. No .part files on the drive.

    I installed an Apache web server for faster local downloading and put a 4 GB DVD up.
    First try, on Windows 7: after 500 MB of downloading it stops with a memory load of 1.6 GB, then it suddenly tells me the download is finished - at 500 MB of 4 GB. No crash this time, but still not the desired result.
    Try on Windows Vista: after 1.5 GB of downloading and 1.7 GB of memory usage, the CPU activity jumped to 100% on the core where Firefox executes. It took around 5 minutes, then the CPU went down to 2% and it told me the download finished and started a virus scan. No crash either. It downloaded only 1.6 GB of 4 GB.
    Maybe it didn't crash this time because the download is too fast. Earlier, while downloading from the internet, it always crashed while downloading a Debian DVD image. Later I used Internet Explorer to complete the download. I'll install a bandwidth control in my Apache and try again over a longer period of time.

  • WRT160NL: copying big files to storage will not succeed

    When I put a big file, like a 7 GB ISO image, on my hard disk, which is connected to my router, the connection breaks down in the middle of the copying process. I tried two different hard disks but had the same problem, so I figure it is not the hard disks. Smaller files are no problem. Is there a timeout setting or something?
    Who can help me?
    Solved!
    Go to Solution.

    Actually, 7 GB is a large amount of data. However, try changing the wireless channel on the router and check.
    Which security type are you using on the router? It is recommended to use WPA2 security to get the proper speed from the WRT160NL router.
