Encoding to FLV results in 10x larger files in CS5 vs CS4

I've recently upgraded to CS5 and the Media Encoder creates incredibly HUGE files compared to the CS4 media encoder.
For example, I can convert a 15 MB mpg file to a 12 MB flv file in CS4 using the preset "FLV same as source".
I can't find a matching preset in CS5, but tried "FLV - match source attributes (medium quality)". Encoding the same file in CS5 with this preset results in a 170 MB file.
I've tried adjusting the settings, comparing the options in both versions, and I still can't get a reasonably sized file out of the CS5 media encoder.
This seems really broken to me.  A 170 MB file is not useful in any way.  Do I have bad presets?
Adobe Media Encoder version is 5.0.1.0  (32-bit)
thanks for any suggestions.
rp.

I am having the exact same issue with one exception: I'm trying to convert a 10-second mpg into an FLV. Like you, in CS4 it took just seconds to convert and the final FLV was about 1 MB, if that. Now the same 10-second mpg in CS5, even after trimming as much as I can from the settings, still leaves me looking at 900 MB, and it starts at over 3000 MB. I'm doing the same thing in 5 that I was in 4 (web, 640x480) and disabling the audio (which helps by a few hundred MB). The absolute best I can get is 70 MB, and the conversion process still takes about 2.5 hours to complete. If you hear back on this, please pass it along to me in case I don't notice.
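Rough arithmetic on why those numbers are so alarming (mine, not from either poster): encoded size ≈ bitrate × duration ÷ 8. A 10-second clip coming out at about 1 MB in CS4 implies roughly 800 kbps, a normal web bitrate; the same 10 seconds coming out at 900 MB implies roughly 720 Mbps, which is effectively uncompressed video. The CS5 "match source attributes" presets therefore appear to be defaulting to a near-lossless bitrate rather than a sensible web one.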

Similar Messages

  • [svn] 1625: Avoiding an NPE instead of blindly instantiating a new localization manager here when encoding the compilation result of a CSS file to a SWF to unblock QE .

    Revision: 1625
    Author: [email protected]
    Date: 2008-05-08 13:53:33 -0700 (Thu, 08 May 2008)
    Log Message:
    Avoiding an NPE instead of blindly instantiating a new localization manager here when encoding the compilation result of a CSS file to a SWF to unblock QE.
    Doc: No
    QE: Yes
    Bugs: SDK-15490 - Compiler gives nullpointer in encode of incremental compile if benchmark is set to true
    Reviewer: Paul
    Ticket Links:
    http://bugs.adobe.com/jira/browse/SDK-15490
    Modified Paths:
    flex/sdk/trunk/modules/compiler/src/java/flex2/compiler/CompilerAPI.java

Hi, thank you for your replies. I found out a few things about my servlet and its portability,
and I have a few questions; although I marked this topic as answered, I guess it's OK to post.
I am using javax.servlet.context.tempdir to store my files in the servlet context's temporary directory, but I don't know how to give the user a hyperlink
to the modified files so they can download them.
Here is what I am using to get the tempdir:
    // fetch the container-managed temporary directory for this web app
    File baseurl = (File) this.getServletContext().getAttribute("javax.servlet.context.tempdir");
    System.out.println(baseurl);
    // create a working subdirectory under it
    baseurl = new File(baseurl.getAbsolutePath() + File.separator + "temp" + File.separator + "files");
    baseurl.mkdirs();
So I am storing my files in that temp/files folder; the servlet processes and modifies them. How do I then present them as
links to the user for download?
Also, since the servlet is multithreaded by nature, if my servlet gets two different requests with the same file names, I guess one of them will be overwritten.
I want to create a unique directory for each request made to the servlet, so file names don't clash.
One other thing: I want my servlet to be executed by my <form action> only; I don't want the user to be able to simply type the URL and trigger the servlet.
Reply A.S.A.P., please.
Thanks and regards,
Mihir Pandya
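For the unique-directory and download-link parts of the question above, a minimal sketch of the idea (illustrative only; the /download mapping and the file name are inventions for this example, not a standard API):

    import java.io.File;
    import java.io.IOException;
    import java.util.UUID;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Illustrative: give every request its own directory so identical file
    // names from concurrent requests can never overwrite each other.
    public class ProcessServlet extends HttpServlet {
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            File tempDir = (File) getServletContext()
                    .getAttribute("javax.servlet.context.tempdir");
            String requestId = UUID.randomUUID().toString();   // unique per request
            File workDir = new File(tempDir, "files" + File.separator + requestId);
            workDir.mkdirs();

            // ... save the upload into workDir and run the modification here ...

            // The container tempdir is not served over HTTP, so the hyperlink must
            // point at a second servlet (mapped to /download in web.xml, a name
            // chosen for this example) that streams the file back out of workDir.
            String link = req.getContextPath() + "/download?id=" + requestId
                    + "&name=modified.txt";                    // hypothetical file name
            resp.setContentType("text/html");
            resp.getWriter().println("<a href=\"" + link + "\">Download modified file</a>");
        }
    }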

  • Creating .pdf from a ppt or .doc in acrobat 9 resulting in strangely large file sizes

I am printing to .pdf from Microsoft Word and PowerPoint (previously Office 07, now Office 10) and the file sizes are sometimes doubling. Also, for some larger documents, multiple .pdfs are generated, as if Acrobat is choosing where to cut off a document due to large file size and generating secondary, tertiary, etc. docs to continue the .pdf. The documents have placed images (Excel charts, etc.) but they are not enormous - that is, until they are generated into a .pdf. I had hoped that when I upgraded to Office 10 this issue would go away, but unfortunately it hasn't. Any ideas? Thoughts? Thanks so much!

One thing that can happen with MS Office applications when an image contains transparency is that the "image" is actually composed of multiple one-pixel images when converted to PDF. This could be confirmed by examining the file. Try selecting the images using the TouchUp Object tool to see if what looks like a single image is entirely selected.

  • MBP Retina 8GB vs 16GB for data analysis - R, Python large files?

    I increasingly find myself working with graph data (Neo4J, Gephi) and document databases (Mongo, CouchDB) as well as using R for clustering and analysis and Python to munge fairly large files of 1GB or larger.
8GB would probably serve me well, and coming from a machine with 4GB it will feel like plenty, but I suspect that I am the kind of user who would get his money's worth out of 16GB.
In addition to this, I will be running a Windows VM for Office Excel/VBA work and to run stats packages for university very frequently.
Is getting the upgraded model worth it? Or will I not notice any difference?
    Thanks,
    David

    I would say you will want the 16GB for a few reasons. You are working with memory-intensive datasets, and also you are running a virtual machine. VMs tend to use up a lot of RAM since it means you are running two complete OSs at the same time, on top of any other apps you might be running.
The third reason is that the Retina models cannot be upgraded after purchase. You would not want to be stuck with 8GB forever if you discovered that your future work needed a bit more memory. You've already described doing tasks that can be RAM-intensive, so just max out the RAM and never worry about it again.
    If you still have doubts, learn how to analyze RAM usage and particularly swap file behavior in Activity Monitor. If your Mac is already paging to swap files frequently, you already need the RAM. If you do work for several days and swap paging activity is low, you don't need the RAM.

  • Two work-arounds for Siena when working with large files

MonaTech pointed out that Project Siena will crash when switching to the desktop or another app.  I've also noticed this happening even when the monitor goes to sleep from inactivity.  To be clear, this is while *developing* in Siena, not after the app
you're working on has been compiled and installed.
    The 'non-technical' work-around that has been successful for me is to split the screen before I switch to another app.  It's not perfect but at least I can switch to another app (browser, desktop app, etc.) without losing where I'm at.
A more technical work-around is proposed in this thread: http://social.technet.microsoft.com/Forums/en-US/c768be8f-3c85-444e-bb44-6f29abdecee7/project-siena-app-memory-issues-with-large-project-?forum=projectsiena
    If you have a version of Visual Studio (that's not Express) it may do the trick for you.
    FYI - In the post you'll see that Olivier responded and indicates that this is a known issue.
    Thor

    Wow - that's what you get for multi-tasking! :)
    The link is now fixed in my original post and posted here for convenience:
    http://social.technet.microsoft.com/Forums/en-US/c768be8f-3c85-444e-bb44-6f29abdecee7/project-siena-app-memory-issues-with-large-project-?forum=projectsiena
    Thor

  • Saving CS Illustrator file in script results in much larger file

I would like to use a script that saves a CS Illustrator document PDF-compatible. When I use the script, the created file is much larger (around 3 times) than when I use the Save As dialog box in the Illustrator application with default settings (PDF compatibility and compression turned on).
Dim saveOptions As New Illustrator.IllustratorSaveOptions
saveOptions.Compatibility = aiIllustrator11   ' save in CS (version 11) format
saveOptions.PDFCompatible = True              ' keep the PDF-compatible stream
saveOptions.Compressed = True                 ' compression flag - seems to be ignored
' ... then save with it, e.g.: docRef.SaveAs filePath, saveOptions
Manually setting Compressed = True in the script doesn't work; it still makes a huge file.
I've seen a few other posts here where people have exactly the same problem. Is this a bug in the Illustrator DLL?
Please help.

Clearly that's not the entire script, since it doesn't run as-is, but that's not the problem.
The code does actually work - at least, assuming your paths are valid, it saves the file with the contents you expect. However, your script says:
    set fileSpec to "Photo [Data]:Archive error LOG:" & "test"
    save document 1 in fileSpec
Even though you tell TextEdit to save the file as 'Photo [Data]:Archive error LOG:test', TextEdit actually creates the file 'Photo [Data]:Archive error LOG:test.rtf'. The filename extension is appended automatically by TextEdit.
Consequently, when you come along later and say:
    open fileSpec
it's trying to open the filename without the filename extension, and that's where it fails.
Your best solution is to use the entire filename in your variable. That way there's no disconnect when TextEdit adds an extension that you don't notice:
    set fileSpec to "Photo [Data]:Archive error LOG:" & "test.rtf"

Application to split large file sizes?

    Dear guys....
I am looking for an application to split large files before moving them off my Mac onto an external HD ... any good suggestions? I have googled it, but it is hard to find a reliable one ... I hope you guys can offer your expertise here ... a million thanks

    Splitting a document may corrupt it and make it no longer workable. Get a larger storage space for your document. If you must do it, do it on a copy, not the original:
    http://www.soft32.com/download_192750.html
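For what it's worth, OS X also has a built-in way to do this from Terminal: the Unix split tool chops a file into fixed-size pieces, and cat joins them back together. A sketch with hypothetical file names:
    split -b 1000m bigfile.dmg bigfile.part.
    cat bigfile.part.* > bigfile.dmg
As with the suggestion above, run it on a copy, since the pieces are useless until rejoined.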

  • Does FW CS5 Mac crap out w large file sizes?

Does FW CS5 Mac crap out with large file sizes like CS4 did? It seems like files over 4-5 MB tend to be flaky... Is there an optimum file size? I'm running a Mac tower, 2.93 GHz quad-core, with 8 GB RAM.

    Why not download the trial version and find out?

  • Best compression for exporting large files from iMovie for use in iDVD?

I have a doco that I've made as separate projects (too slow to make it all in one project), which I then exported as QuickTime .mov files. When I assembled the .mov files in iDVD I ended up with 4.7GB of material, which iDVD flipped out on and crashed.
What would be the best compression to apply to the large .mov files to reduce the overall file sizes by, say, 10%?
It's sometimes taking up to 6 hours to render each project; average total screen time per project is about 70 minutes, and each project's file size ranges from 1 to 6GB...
    I feel like I'm a little out of my depth on this one!
    Any suggestions for this dilemma?
    Thanks
    Tony

    When I assembled the .mov files in iDVD I ended up with 4.7GB of material which iDVD flipped out on and crashed... Any suggestions for this dilemma?
Not sure if your 4.7GB reference is for the source files or the resulting "muxed" MPEG2/PCM encoded content. In any case, a single-layer DVD is limited to 4.7 GB of total content. To be safe, this usually means you are limited to 4.2 to 4.3 GB of encoded movie and/or slideshow content, depending on how many and what type of menus you are creating. Thus, if you are encoding for a single-layer DVD and your project's encoded content exceeds the capacity of your DVD, then this would likely explain your problem.
If your project size reference is for the source content being added to your iDVD project, then it is totally irrelevant, and you need to concentrate on the total duration of your iDVD project and the method of encoding instead of file size. Basically, for a single-layer DVD, only about 60 minutes of content can be encoded using the "Best Performance" encode option, 90 minutes using the "High Quality" setting, or 2 hours in the "Professional Quality" encode mode. Thus, if your content duration exceeds the capacity of your encoding mode here, that would likely explain your problem.
    In either case, the solution would be to either burn to higher capacity optical media/higher capacity encoding mode or burn your content to multiple DVDs as individual projects at the current media capacity/encoding mode.
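A rough back-of-the-envelope (my arithmetic, assuming roughly 4.3 GB of usable space, i.e. about 34,400 megabits) shows where those duration limits come from:
    60 minutes (3,600 s)  at ~9.6 Mbps average ("Best Performance")
    90 minutes (5,400 s)  at ~6.4 Mbps ("High Quality")
    120 minutes (7,200 s) at ~4.8 Mbps ("Professional Quality")
The longer the mode allows, the lower the average bitrate it must use to keep the encoded content on the disc.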

  • Encoding for Encore from Premiere Pro - File Size

    I'm working on a Blu-ray project that contains four separate tutorials, each with several individual clips.  I exported each clip (chapter) out of After Effects CS6 using the standard H264 preset.  The tutorials range from 2 to 5 hours each.  After all was said and done, the total file size of all H264 video clips was just under 25GB, perfect for a Blu-ray burn.
    I created four sequences in Premiere Pro, one for each tutorial.  All of the H264 video files were brought into Premiere Pro, chapter markers were added, ready to rock.
    When I attempted to export each sequence using the H264 Blu-ray preset, I found that using that preset would result in a grand total of about 75GB worth of data.  So I selected the MPEG2 Blu-ray preset, and it was almost the same.  I manually adjusted target bitrate settings, dropped quality to as low as it would go, and still the estimated file size far exceeded the 25GB I had available.
    All tutorials total about 15 hours of 1280x720 30p content.  I realize that's a ton of data, but AE exported all of this content using H264 and the grand total file size was under 25GB.  But bringing those very same video clips into Premiere Pro, assembling them, and exporting them results in much, much larger files.  I even attempted to import the AE exports directly into Encore to avoid having to re-encode anything, and still it estimated that the project's file size exceeded 75GB, even though the source footage of H264 video was less than 25GB.
Right now I'm down-rezzing all of the footage to 720x480 and the file sizes are smaller, although I'll still need to decrease the bit rates significantly if I have any shot of dumping all of this onto a single 25GB Blu-ray.
    I do realize that 15 hours is a lot for a single disc, but what I don't understand is how I can have 15 hours of H264 exported out of AE result in less than 25GB, but using that exact same footage in PPro/Encore results in over 75GB of footage.
    Any insight would be appreciated.
    Windows 7 64-bit
    Adobe Creative Suite Production Premium CS6
    (After Effects/Premiere Pro/Encore)
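A quick bitrate check makes the discrepancy less mysterious (my arithmetic, not from the thread): 15 hours is 54,000 seconds, and 25 GB is roughly 200,000 megabits, so the AE exports must average only about 3.7 Mbps. The ~75 GB estimate from the Blu-ray presets corresponds to about 11 Mbps, a typical Blu-ray video bitrate. In other words, the AE preset encoded far below what the Blu-ray presets (and, as the follow-up below notes, the BD spec's minimum bitrates) will accept.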

    The thing is, the AE exports look just fine.  (The tutorials are just a bunch of screen capture/software stuff.)  I didn't mess with any of the default preset settings for H264 out of AE, but yeah, they must've been encoded at pretty low bitrates.
    I wasn't aware that BD specifications have minimum bitrate requirements, so that would explain why I'm unable to decrease the bitrates within PPro or Encore.  Thanks for the info on that.  I just figured as long as the footage was at least 720P with square pixel ratio encoded in H264 or MPEG2 I'd be fine, but apparently not!
    The tutorials are for my own personal use, so I'm not too concerned about decreased resolution if I am able to make the SD content fit.  Otherwise, yeah, burning multiple discs is my only option.
    Thanks for the prompt feedback.

  • File Splitting for Large File processing in XI using EOIO QoS.

    Hi
I am currently working on a scenario to split a large file (700MB) using the sender file adapter's "Recordset Structure" property (e.g., Row,5000). As the files are split and mapped, they are appended to a destination file. In an example scenario, a file of 700MB comes in (say with 20000 records); the destination file should have 20000 records.
To ensure no records are missed during the process through XI, EOIO QoS is used. A trigger record is appended to the incoming file (the trigger record's structure is the same as the main payload recordset) using a UNIX shell script before it is read by the sender file adapter.
XPATH conditions are evaluated in the receiver determination to either append the record to the main destination file or create a trigger file with only the trigger record in it.
The problem we are faced with is that the "Recordset Structure" (e.g., Row,5000) splits in chunks of 5000, and when the remaining records of the main payload are fewer than 5000 (say 1300), those remaining 1300 lines get grouped up with the trigger record and written to the trigger file instead of the actual destination file.
For the sake of this forum I have listed a sample scenario XML file representing the inbound file, with the last record with Duns = "9999" as the trigger record that will be used to mark the end of the file after splitting and appending.
    <?xml version="1.0" encoding="utf-8"?>
    <ns:File xmlns:ns="somenamespace">
    <Data>
         <Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
    </Data>
    </ns:File>
In the sender file adapter I have, for test purposes, changed the "Recordset Structure" to "Row,5" for the sample XML inbound file above.
I have two XPATH expressions in the receiver determination to take the last recordset with Duns = "9999" and send it to the receiver (communication channel) that creates the trigger file.
In my test case the first 5 records get appended to the correct destination file. But the last two records (the 6th and 7th) get sent to the receiver channel that is only supposed to take the trigger record (the last record, with Duns = "9999").
Destination file (this is where all the records with Duns NE "9999" are supposed to get appended):
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
         <R3Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
     <R3Row>
          <Duns>"001001926"</Duns>
          <Duns_Plus_4>""</Duns_Plus_4>
          <Extract_Code>"A"</Extract_Code>
     </R3Row>
     <R3Row>
          <Duns>"001001927"</Duns>
          <Duns_Plus_4>""</Duns_Plus_4>
          <Extract_Code>"A"</Extract_Code>
     </R3Row>
     <R3Row>
          <Duns>"001001928"</Duns>
          <Duns_Plus_4>""</Duns_Plus_4>
          <Extract_Code>"A"</Extract_Code>
     </R3Row>
</R3File>
    Trigger File:
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
     <R3Row>
          <Duns>"001001929"</Duns>
          <Duns_Plus_4>""</Duns_Plus_4>
          <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
     </R3Row>
     <R3Row>
          <Duns>"9999"</Duns>
          <Duns_Plus_4>""</Duns_Plus_4>
          <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
     </R3Row>
    </R3File>
I've tested the XPATH condition in XML Spy and it works fine. My doubts are about the "Recordset Structure" property set to "Row,5".
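(For reference: the two receiver determination conditions would be of roughly this shape; this is a reconstruction, since the originals weren't posted. Note that the Duns values carry literal quote characters, so the comparison has to include them.)
    Trigger receiver:  /ns:File/Data/Row[Duns = '"9999"']
    Other receiver:    /ns:File/Data/Row[not(Duns = '"9999"')]
Since receiver determination is evaluated once per split message, and with "Row,5" the last message holds both the leftover rows and the trigger row, the entire final chunk satisfies the trigger condition. That is consistent with the 6th and 7th records travelling together into the trigger file.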
    Any suggestions on this will be very helpful.
    Thanks,
    Mujtaba

    Hi Debnilay,
We do have a 64-bit architecture and still we have the file processing problem. Currently we are splitting the file into smaller chunks and processing them. But we want to process the file as a whole.
    Thanks
    Steve

  • How to make Adobe Flash Player the default player for .flv, .f4v files?

    I have currently installed from Adobe the following applications:
    Adobe Bridge CS6 (64bit)
    Adobe Bridge CS6
    Adobe Dreamweaver CS6
    Adobe Encore CS6
    Adobe ExtendScript Toolkit CS6
    Adobe Extension Manager CS6
    Adobe Fireworks CS6
    Adobe Flash Professional CS6
    Adobe Illustrator CS6 (64bit)
    Adobe InDesign CS6
    Adobe Media Encoder CS6
    Adobe Photoshop CS6 (64bit)
    Adobe Prelude CS6
    Adobe Premiere Pro CS6
    Adobe SpeedGrade CS6
    Adobe Flash Builder is not installed.
    Adobe Premiere After Effects is not installed.
I've just downloaded Adobe Flash Player 13, but .flv files and .f4v files are not played by Adobe Flash Player. I've tried to make it the default player, but I couldn't work out which .exe file to browse for (by right-clicking and choosing Open with > Choose default program...).

    There's a Flash Player Projector (standalone) in the Flash Professional CS6/Players folder, but it's usually outdated.
When browsing for an app to use from the Properties window, go to that folder - but first download this one: Windows Flash Player 13 Projector - and replace the outdated Flash Player that's in that folder.
    If you're looking to test Flash files you've authored, then you'll need the Projector, but if you're just looking for a good player for FLV and F4V, then you should use VLC Media Player. http://www.videolan.org/

  • How can I change file encoding for use in Itunes?

For instance, AIFF to Apple Lossless files, or MP3 to Apple Lossless, or any other conversion. And which is best?

    I don't really understand some of the settings that you posted. Stereo and joint stereo are two different things, so you can't encode as both at the same time. For modern MP3 compression, joint stereo is usually preferred, because it allows the encoder to use fewer bits in encoding the stereo information, and it doesn't result in any perceptual losses in stereo imaging.
    48.000 kHz is pointless unless you are ripping from source material that is actually at that sample rate. Redbook-standard CD audio is at 44.100 kHz, so just stick with that (or better yet, leave the sample rate setting at "auto" so it will pick the setting that matches the source material you are ripping from).
320 kbps is kind of overkill as well. Actually, if you want to rip to MP3, I would advise you to stay away from the iTunes encoder, because most listening tests indicate that it hasn't been updated much since the first versions of iTunes back in the late 1990s. It is no longer a modern encoder. Assuming that you are on a Mac, I suggest that you download the iTunes-LAME plugin (http://www.apple.com/downloads/macosx/ipod_itunes/ituneslameencoder.html), which allows you to encode tracks into the iTunes Music Library using the LAME MP3 encoder. LAME is an open-source encoder project that has been continuously developed over many years and has been heavily tested for transparency. The latest version (3.97b1) has been shown in listening tests to be almost the same quality as equivalent-bitrate AAC files.
    Once you download the iTunes-LAME plug-in, you can install the latest version of LAME by going to http://www.rarewares.org/mp3.html . Scroll down to "Encoders/decoders built using LAME 3.97 beta 1" and download "For Mac OS X." This is a Unix Executable file. To install it in the iTunes-LAME plugin, go to Your User Folder/Library/iTunes/Scripts. Right click on the iTunes-LAME application and select "Show Package Contents." Open the "Contents" folder that appears and then open the "Resources" folder. You should see a Unix Executable there titled "lame." Replace this file with the 3.97b1 Unix Executable that you downloaded. Now iTunes-LAME will encode with LAME v.3.97b1 instead of its default LAME build. 3.97b1 has some major improvements over earlier LAME versions, so even though this installation process may seem daunting for some people, it is well worth it.
    To use iTunes-LAME, you first select a CD or playlist from the Source list (the left-hand column) in iTunes. You then click on the Script menu (the little script icon) and select "Import with LAME...". iTunes-LAME opens, showing you a command line and an Import button. LAME encoding options are selected using command-line switches. Fortunately, there are variable bit-rate (VBR) presets that have been heavily tested for transparency, so you don't really need to know anything about the command line options. Just type in "-V 2 --vbr-new," which is one of the most heavily-tested presets, and hit the Import button. This results in variable bit rate files that usually average out to between 170 and 240 kbps, depending on the complexity of the source. For more information on the presets, see http://wiki.hydrogenaudio.org/index.php?title=LAME#RecommendedEncoderSettings
    The only major downside to using iTunes-LAME is that it cannot take advantage of iTunes' error correction option when ripping CDs. If you want error correction though, you can always just have iTunes rip to WAV first, and then convert those WAV files to MP3 using the iTunes-LAME plugin.
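For anyone who wants to sanity-check that preset outside of iTunes-LAME: the same settings invoked directly from the command line (Terminal) would be, with hypothetical file names,
    lame -V 2 --vbr-new input.wav output.mp3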
    Feel free to ask any more questions.

  • Encoding seems longer in CS5 for flv

I recently upgraded to CS5 and have been doing a lot of encoding.  Is it my imagination, or does the encoding take longer than CS4 did for flv files?  I have an HP DV7 running Windows 7 Home Premium, Intel i7 Q720 @ 1.60GHz, 6 GB RAM, 64-bit OS.  For example, my 8 minute piece (AVCHD) is taking 3 hours to encode to the "same as source" flv preset.  I queued up 4 projects: the first was a 1 hour piece for which I used one of the smaller web flv presets, the second was a 5 minute HDV piece for which I used the "same as source" flv preset, and the third was the 8 minute piece referenced above.  These 3 files have taken over 10 hours to encode.  Is this typical or normal?  I am not running any other programs except the ones that run in the background.  Any suggestions on how I might speed up the process?  Also, if it has any impact on time, I am using a Glyph external hard drive, connected via eSATA, to house my footage.
    Thanks
    Lisa

    You're not the first to claim that CS5 is slower than CS4.  I didn't follow those complaints too closely, myself.  You might try rooting around in the Premiere Pro CS5 forum to see what others came up with.

  • Tips or tools for handling very large file uploads and downloads?

I am working on a site that has a document repository feature. The documents are stored as BLOBs in an Oracle database, and for reasonably sized files it's no problem to stream the files out directly from the database. For file uploads, I am using the Struts module to get them onto disk and am then putting the BLOB in the database.
    We are now being asked to support very large files of 250MB+. I am concerned about problems I've heard of with HTTP not being reliable for files over 256MB. I'd also like a solution that would give the user a status bar and allow for restarts of broken uploads or downloads.
    Does anyone know of an off-the-shelf module that might help in this regard? I suspect an ActiveX control or Applet on the client side would be necessary. Freeware or Commercial software would be ok.
    Thanks in advance for any help/ideas.

    Hi. There is nothing wrong with HTTP handling 250MB+ files (per se).
    However, connections can get reset.
    Consider offering the files via FTP. Most FTP clients are good about resuming transfers.
    Or if you want to keep using HTTP, try supporting chunked encoding. Then a user can use something like 'GetRight' to auto resume HTTP downloads.
    Hope that helps,
    Peter
    http://rimuhosting.com - JBoss EJB/JSP hosting specialists
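One nuance worth adding: resumable HTTP downloads (what GetRight and friends do) rely on the server honoring Range request headers and answering 206 Partial Content, rather than on chunked encoding itself. A minimal illustrative servlet sketch (the file location and parameter name are made up for the example):

    import java.io.File;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.io.RandomAccessFile;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Illustrative download servlet supporting resume via HTTP Range requests.
    public class DownloadServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            // NOTE: validate the name parameter in real code to block path traversal.
            File file = new File("/data/blobs", req.getParameter("name"));
            long start = 0, end = file.length() - 1;

            String range = req.getHeader("Range");        // e.g. "bytes=1048576-"
            if (range != null && range.startsWith("bytes=")) {
                String[] parts = range.substring(6).split("-");
                start = Long.parseLong(parts[0]);
                if (parts.length > 1 && parts[1].length() > 0)
                    end = Long.parseLong(parts[1]);
                resp.setStatus(206);                      // Partial Content
                resp.setHeader("Content-Range",
                        "bytes " + start + "-" + end + "/" + file.length());
            }
            resp.setHeader("Accept-Ranges", "bytes");     // advertise resume support
            resp.setHeader("Content-Length", String.valueOf(end - start + 1));

            RandomAccessFile raf = new RandomAccessFile(file, "r");
            try {
                raf.seek(start);                          // jump to the resume point
                OutputStream out = resp.getOutputStream();
                byte[] buf = new byte[8192];
                long left = end - start + 1;
                int n;
                while (left > 0 && (n = raf.read(buf, 0,
                        (int) Math.min(buf.length, left))) != -1) {
                    out.write(buf, 0, n);
                    left -= n;
                }
            } finally {
                raf.close();
            }
        }
    }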
