FTP and HTTP large file (>300MB) uploads failing

We have two IronPort S370 Web Security Appliances in a WCCP transparent proxy cluster.  We are experiencing problems with users who upload large video files: the upload will not complete.  Some of the files are 2GB in size, but most are in the hundreds of megabytes.  Files smaller than a hundred megabytes seem to work just fine.  Some users are using the FTP proxy and some are using HTTP methods, such as YouSendIt.com. We have tried explicit proxy settings with some improvement, but it varies by situation.
Is anyone else having problems with users uploading large files and having them fail?  If so, any advice?
Thanks,
   Chris

Have you got any maximum sizes set in the IronPort Data Security Policies section, under Web Security Manager?
Thanks
Chris

Similar Messages

  • BT Cloud - large file (~95MB) uploads failing

    I am consistently getting upload failures for any file over approximately 95MB in size.  This happens with both the web interface and the PC client.
    With the web interface the file upload gets to a percentage that would be around the 95MB mark, then fails, showing a red icon with an exclamation mark.
    With the PC client the file gets to the same percentage equating to approximately 95MB, then resets to 0%, and repeats this continuously.  I left my PC running 24/7 for 5 days, and this resulted in around 60GB of upload bandwidth being used just trying to upload a single 100MB file.
    I've verified this on two PCs (Win XP, SP3), one laptop (Win 7, 64-bit), and also my work PC (Win 7, 64-bit).  I've also verified it with multiple different types and sizes of files.  Everything from 1KB to ~95MB uploads perfectly, but anything above this size (I've tried 100MB, 120MB, 180MB, 250MB, 400MB) fails every time.
    I've completely uninstalled the PC Client, done a Windows "roll-back", reinstalled, but this has had no effect.  I also tried completely wiping the cloud account (deleting all files and disconnecting all devices), and starting from scratch a couple of times, but no improvement.
    I phoned technical support yesterday and had a BT support rep remote-control my PC, but he was completely unfamiliar with the application, and after fumbling around for over two hours he had no suggestion other than waiting longer to see if the failure would clear itself!
    Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However I'm not sure how to get them to do this as calling technical support was futile.
    Any suggestions?
    Thanks,
    Elinor.

    Hi,
    I too have been having problems uploading a large file (362MB) for many weeks now, and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
    All I want to do is share a video with a friend and thought that BT cloud would be perfect!  Oh, if only that were the case :-(
    I first tried web upload (as I didn't want to use the PC client's Backup facility) - it failed.
    I then tried the PC client backup... after about 4 hours of "progress" it reached 100% and an icon appeared.  I selected it and tried to share it by email, only to have the share fail and produce no link.   Cloud backup thinks it's there, but there are no files in my Cloud storage!
    I too spent a long time on the phone to Cloud support, during which the tech took over my PC.  When he began trying to do completely inappropriate and irrelevant things, such as cleaning up my temporary internet files and cookies, I stopped him.
    Together we did successfully upload a small file, and sharing that was successful - trouble is, it's not that file I want to share!
    Finally he said he would escalate the problem to next level of support.
    After a couple of weeks of hearing nothing, I called again and went through the same farce with a different tech, after which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem, and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say, it isn't.
    A couple of weeks later now and I've still heard nothing and it still doesn't work.
    Why can't Cloud support at least send me an email to let me know they exist and are working on this problem?
    I despair of ever being able to share this file with BT Cloud.
    C'mon BT Cloud surely you can do it - many other organisations can!

  • Downloading via XHR and writing large files using WinJS fails with message "Not enough storage is available to complete this operation"

    Hello,
    I have an issue that some users are experiencing but I can't reproduce myself on my laptop. What I am trying to do is grab a file (a zip file) via XHR. The file can be quite big, like 500MB. Then I want to write it to the user's storage.
    Here is the code I use:
    DownloadOperation.prototype.onXHRResult = function (file, result) {
        var status = result.srcElement.status;
        if (status == 200) {
            var bytes = null;
            try {
                // Copy the XHR response (an ArrayBuffer) into a typed array.
                // This is the line that throws "Not enough storage" on some devices.
                bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
            } catch (e) {
                try {
                    Utils.logError(e);
                    var message = "Error while extracting the file " + this.fileName + ". Try emptying your Windows bin.";
                    if (e && e.message) {
                        message += " Error message: " + e.message;
                    }
                    var popup = new Windows.UI.Popups.MessageDialog(message);
                    popup.showAsync();
                } catch (e2) { /* ignore errors thrown while reporting the error */ }
                this.onWriteFileError(e);
                return;
            }
            // Write the downloaded bytes to the target StorageFile.
            Windows.Storage.FileIO.writeBytesAsync(file, bytes).then(
                this.onWriteFileComplete.bind(this, file),
                this.onWriteFileError.bind(this)
            );
        } else if (status > 400) {
            this.error(null);
        }
    };
    The error happens at this line:
    bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
    With the description "Not enough storage is available to complete this operation". The user has only a C drive, with plenty of space available, so I believe the error message given by IE might be a little misleading. Maybe in some situations Uint8Array can't handle such a large file? The program fails on an ASUS T100TA but not on my laptop (a standard one).
    Can somebody help me with that? Is there a better way to write a downloaded binary file to disk without passing through a Uint8Array?
    Thanks a lot,
    Fabien

    Hi Fabien,
    If Uint8Array works fine on the other computer, the problem should not be in the API itself; instead it could be a setting or configuration issue with IE.
    Actually, using XHR for a 500MB zip file is not recommended. According to the documentation (How to download a file), XHR wraps an XMLHttpRequest call in a promise, which is not a good approach for downloading big items; please use Background Transfer instead, which is designed to receive big items.
    A quick search on the Internet suggests that the "not enough storage" error is a known issue when using XMLHttpRequest:
    http://forums.asp.net/p/1985921/5692494.aspx?PRB+XMLHttpRequest+returns+error+Not+enough+storage+is+available+to+complete+this+operation. However, I'm not familiar with how to solve the XMLHttpRequest issue itself.
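    For reference, here is a minimal sketch of the Background Transfer approach; the URI, file name, and completion handlers are placeholder assumptions, not code from your app:

    // Minimal sketch with assumed names: stream a large download straight
    // to a StorageFile with Background Transfer instead of buffering the
    // whole response in memory via XHR and Uint8Array.
    var uri = new Windows.Foundation.Uri("http://example.com/archive.zip"); // placeholder URL
    Windows.Storage.ApplicationData.current.localFolder
        .createFileAsync("archive.zip", Windows.Storage.CreationCollisionOption.replaceExisting)
        .then(function (file) {
            var downloader = new Windows.Networking.BackgroundTransfer.BackgroundDownloader();
            // The payload is written directly to the file as it arrives,
            // so it never has to fit in RAM as a single Uint8Array.
            return downloader.createDownload(uri, file).startAsync();
        })
        .done(
            function (op) { /* download complete; op.resultFile is the file */ },
            function (err) { /* handle or log the failure */ }
        );

    Background Transfer also survives app suspension, which matters for a 500MB download.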
    --James

  • Scripting with FTP and HTTP

    Hi All,
    To help us with future planning, we would like to get a feel for how many developers are using the FTP and HTTP objects that are available with scripting in CS3 (through webaccesslib). If you are using them, could you send me a quick email describing how you use the component? My email address is [email protected]
    Thanks in advance.
    Alan Morris
    Dev Tech Engineer
    Adobe Systems

    Yeah, this is so aggravating!
    Adobe builds all of these cool ideas, then doesn't test them.
    The HTTPConnection object does not do POST at all. I have tried nearly every possibility. The documentation is either way off or the object just does not work. I can see the post in raw form, and the POST variables are not coming across.
    After working on this for a few hours I thought to myself, hey, maybe I should just create a Flash pane instead, load the files into it, and have the Flash object do the upload. Well, I ran into a big fat wall there too! As it is with PatchPanel and Bridge, these technologies only accept SWF objects. This whole concept of using SWF and cross-scripting has a huge flaw: the SWF security model does not allow local file access for doing simple things like uploads. If I can't synchronize file data to web-based clouds, then I can't do much worth talking about.
    I love these products and their possibilities, but I have to have the ability to communicate with the world. HTTP is the way!
    Also, a side note: FTP is an insecure/inflexible solution, and it looks like a lot more time was spent on that aspect of the scriptable product.
    PLEASE HELP, ADOBE!

  • Ftp and http access over XDB repository is not allowed...

    When I try to execute the following command on a reasonably fresh Oracle 11 installation:
    insert into "XMLTEST" ( "name", "xmlfof" ) values ( 'small', DBMS_XDB.GETCONTENTXMLTYPE('/public/small.xml') );
    -- The schema is correctly registered, the file "small.xml" is in the /public repository folder, and the user has every conceivable role and privilege
    -- HTTP access works fine from a remote location; I tried to execute the command on the server and from a remote system...
    I get the following error message:
    ORA-31020: The operation is not permitted, reason: For security reasons, ftp and http access over XDB repository is not allowed on server side ORA-06512: at "XDB.DBMS_XDB", line 1915
    Searching for an answer on the forum didn't produce any concrete explanation... Does anyone have any idea how to solve this problem?

    As it turns out, the XML file contained a reference to a DTD at an external web-site, which caused the problem - it was identical to that described here:
    Re: ORA-31020 when using XML with external DTD or entities
    After removing the reference, everything works perfectly...

  • Using Windows 8.1 and copying large files over USB 3.0 to a Seagate Backup Plus USB hard drive - I get the following error:

    Using Windows 8.1 and copying large files over USB 3.0 to a Seagate hard drive - I get the following error:
    Error 0x80070079: The semaphore timeout period has expired
    I am using Windows 8.1 with a 3-4TB USB Seagate drive that is using Microsoft's BitLocker.  It seems I can copy small files without issues; however, copying large files and directories causes this error.  Connectivity is USB 3.0 from a Dell Latitude E6520 to the Seagate drive.
    Can anyone share a fix, or suggest a methodology for finding one?

    Hi Joe,
    Was the error gone when you turned off BitLocker for this drive?
    If not, please refer to the solution in this similar thread:
    https://social.technet.microsoft.com/Forums/windows/en-US/c3fc9f5d-c073-4a9f-bb3d-b7bb8f893f78/error-0x80070079-the-semaphore-timeout-period-has-expired?forum=itprovistanetworking

  • Hi, I upgraded to OS X Mountain Lion, and now I can't upload any pictures from iPhoto/desktop to email, etc.  After I choose my file and hit "attach file" to upload, it says: "Invalid File Specified".  I am using jpg, so I'm confused. Any suggestions?

    Hi, I recently upgraded to OS X Mountain Lion, and now I cannot upload any of my pictures from iPhoto or even my desktop to email, etc.  After I choose my file and hit "attach file" to upload, it says: "Invalid File Specified".  I am using jpg, so I'm confused.  Any suggestions?

    Not having a media card in your BlackBerry may affect your ability to transfer media files onto it. There is a file size limit of 2.86MB when transferring files into device memory. http://bbry.lv/9XNuy0
    Have you verified that the applications you are trying to install are supported by your current wireless provider? Unlocked devices from one provider may not have full functionality on another network.
    -FS

  • HT201442 I get an error while trying to restore - error 3194. I have tried many steps, installing the latest version and fixing the hosts file, but it still fails. Can anybody give me any pointers to solve this problem? Thank you

    I get an error while trying to restore - the only error is 3194. I have tried many steps, installing the latest version and fixing the hosts file, but it still fails. Can anybody give me any pointers to solve this problem? Thank you

    Error 3194:
    This device isn't eligible for the requested build
    It means that Apple has stopped signing the version of iOS that you have. Try downloading the latest iOS version, then try the restore again.

  • Uploading large files to share fails - advice appreciated

    I wondered if anyone could help me with uploading large files. 
    The ability to distribute files to clients with the included 20GB of server space was part of the reason I signed up to Creative Cloud; unfortunately I'm having very mixed results, and I'm constantly having to fall back on the free DropBox service, which seems much more reliable.
    I'm working on a Mac Pro (10.6.8) with an optical-fibre 20Mb/s ISP service; the service is, dare I say, usually very quick and reliable.  Before uploading a large file I set Energy Saver to "never sleep" in case I wander off for a cup of tea, but I inevitably come back to what looks like a "stuck" upload.  I've tried three times today to send a 285MB file and they all failed, so I reverted to sending the file by DropBox... with no problem - I've also sent files in excess of 1GB by DropBox in the past without an issue.
    Today's file had reached about 50% according to the little progress bar... then it stopped.   (It's a shame the progress bar can't relay the actual upload speed and a percentage like DropBox does, so you can see what's going on.)
    I have to state, I'm not a DropBox "fanboy" but it just seems to work...
    Are there any tricks or settings that could make Adobe's 20GB of space a functioning part of my workflow?  It would be great to feel confident in it.
    Thanks in advance for any advice.

    Either Firefox or Chrome would be a good alternative.
    I would see if one of these browsers works for the upload, but we can also debug the issue on Safari.
    Here are the steps to debug on Safari:
    Add the Develop menu to Safari. From the menu choose Safari > Preferences. Select the Advanced tab and check the box for "Show Develop menu in menu bar".
    Once logged in to the Creative Cloud Files page (https://creative.adobe.com/files), Ctrl+click anywhere on the page and choose Inspect Element from the browser context menu.
    In the window that opens in the bottom half of the browser, click on the clock icon (the mouse-over tooltip says Instrument) and on the left select Network Requests.
    Try the upload and you will see various network requests. Look for those whose Name is "notification". When you select one in the list there will be an arrow next to "notification", and clicking on it will take you to the details. If successful, you will have a Status of 200. Some of these "notification" items are for your storage quota being updated, but the one we want is for the successful upload.

  • Upload and Process large files

    We have a SharePoint 2013 on-premises installation and a business application that provides an option to copy local files into a UNC path, with some processing logic applied before copying them into a SharePoint library. The current implementation is:
    1. The user opens the application and clicks the "Web Upload" link in the left navigation. This opens a custom \Layouts page for selecting the upload file and its properties.
    2. The user specifies the file details and chooses a web zip file from his local machine.
    3. The Web Upload page's submit action will:
         a. Call a WCF service to copy the zip file from the local machine to a preconfigured UNC path.
         b. Create a list item to store its properties along with the UNC path details.
    4. A timer job executes at a periodic interval to:
         a. Query the list for items that are NOT processed and find the path of the ZIP file folder.
         b. Unzip the selected file.
         c. Loop over the unzipped file content and push it into the SharePoint library.
         d. Update the list item in the "Manual Upload List".
    Can someone suggest a different design approach that can manage the large file outside of the SharePoint context? Something like:
       1. Some option to initiate the file copy from the user's local machine to the UNC path when he submits the layouts page.
       2. Instead of timer jobs, external services that grab data from the UNC path and process it at periodic intervals to push it into SharePoint.

    Hi,
    According to your post, my understanding is that you want to upload and process files for a SharePoint 2013 server.
    The following suggestions are for your reference:
    1. We can create a service to process the uploaded file and copy the files to the UNC folder.
    2. Create an upload-file visual web part and call the file-processing service (see the sketch below).
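    For the web-part half, a rough client-side sketch of the idea using the SharePoint 2013 REST API; the library name, file name, and the jQuery dependency are assumptions for illustration, not the actual design:

    // Hedged sketch: upload browser-selected file bytes into a document
    // library via the SharePoint 2013 REST API. Names below are placeholders.
    function uploadToLibrary(webUrl, libraryUrl, fileName, arrayBuffer) {
        var endpoint = webUrl +
            "/_api/web/GetFolderByServerRelativeUrl('" + libraryUrl + "')" +
            "/Files/Add(url='" + fileName + "', overwrite=true)";
        return jQuery.ajax({
            url: endpoint,
            type: "POST",
            data: arrayBuffer,            // raw file bytes, e.g. from a FileReader
            processData: false,           // do not let jQuery serialize the body
            headers: {
                "Accept": "application/json; odata=verbose",
                "X-RequestDigest": jQuery("#__REQUESTDIGEST").val()
            }
        });
    }

    Note that this still routes the bytes through browser memory, so for really large zips the WCF-to-UNC route you describe, with an external service pushing into SharePoint, is the safer split of responsibilities.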
    Thanks,
    Dennis Guo
    TechNet Community Support

  • Splitting large files for upload

    I need to upload a very large file to an FTP site - 9.8GB. This takes a hugely long time with my bandwidth and invariably fails somewhere before it finishes. Does anyone know if there is software that will take a large file like this and split it up, perhaps into disk images, so that the pieces can be uploaded separately and then reassembled?
    Thanks very much.

    If you perform a web search there are some file splitters available for the Mac; I don't use them myself.
    Hopefully you will find a suitable one.
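    If none of them suit, the technique itself is straightforward; here is an illustrative sketch in Node.js (the file name and piece size are placeholder assumptions, not a recommendation of a specific tool):

    // Illustrative sketch: split big.dmg into 1GB pieces that can be
    // uploaded separately and reassembled by concatenation.
    const fs = require("fs");

    const PIECE = 1024 * 1024 * 1024;            // 1GB per piece
    const buf = Buffer.alloc(64 * 1024 * 1024);  // 64MB read buffer
    const input = fs.openSync("big.dmg", "r");

    let part = 0;     // index of the current piece
    let written = 0;  // bytes already written to the current piece
    let out = null;   // file descriptor of the current piece
    let n;
    while ((n = fs.readSync(input, buf, 0, buf.length, null)) > 0) {
        let off = 0;
        while (off < n) {
            if (out === null) {
                out = fs.openSync("big.dmg.part" + part++, "w");
                written = 0;
            }
            const take = Math.min(n - off, PIECE - written);
            fs.writeSync(out, buf, off, take);
            off += take;
            written += take;
            if (written === PIECE) {             // piece full; start the next one
                fs.closeSync(out);
                out = null;
            }
        }
    }
    if (out !== null) fs.closeSync(out);
    fs.closeSync(input);

    Reassembly on the far side is just concatenating the pieces in order.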

  • Airport Extreme, Airdisk, and Vista - large file problem with a twist

    Hi all,
    I'm having a problem moving large-ish files to an external drive attached to my Airport Extreme SOMETIMES. Let me explain.
    My system - MacBook Pro, 4GB RAM, 10GB free HD space on the MacBook, running the latest updates for Mac and Vista; the external hard drive on the AE is an internal WD in an enclosure with 25GB free (it's formatted for PC, but I've used it directly connected to multiple computers, Mac and PC, without fail); the AE is using firmware 7.3.2, and I'm only allowing 802.11n. The problem occurs on the Vista side - I haven't checked the Mac side yet.
    The Good - I have BitTorrent set up, using uTorrent, to automatically copy files over to my Airdisk once they've completed. This works flawlessly. If I connect the hard drive directly to my laptop (MacBook running Boot Camp with Vista, all updates applied), large files copy over without a hitch as well.
    The Bad - For the past couple of weeks (it could be longer, but I've just noticed it recently being a problem - is that a firmware problem clue?), if I download the files to my Vista desktop and copy them over manually, the copy just sits for a while and eventually fails with a "try again" error. If I try to copy any file over 300MB, the same error occurs.
    What I've tried - well, not a lot. The first thing I did was to make sure my hard drive was error-free and worked when physically connected - it is, and it did. I've read a few posts about formatting the drive for Mac, but this really isn't a good option for me since all of my music is on this drive and iTunes pulls from it (which also works without a problem). I do, however, get the hang in iTunes if I try to import large files (movies). I've also read about trying to go back to an earlier AE firmware, but the posts were outdated. I can try the Mac side and see if large files move over, but again, I prefer to do this in Windows Vista.
    This is my first post, so I'm sure I'm leaving out vital info. If anyone wants to take a stab at helping, I'd love to discuss. Thanks in advance.

    Hello,
    Just noticed the other day that I am having the same problem. I have two Vista machines attached to a Time Capsule with a Western Digital 500GB drive attached via USB. I can write to the TC (any file size) with no problem; however, I cannot write larger files to my attached WD drive. I can write smaller folders of music and such, but I cannot back up larger video files. Yet if I directly attach the drive to my laptop, I can copy over the files, no problem. I could not find any setting in the AirPort Utility with regard to file-size limits or anything of the like. Any help on this would be much appreciated.

  • Difficulty compressing a large file - compact command fails

    I am trying to compress a 419 GB SQL Server database, following the instructions in the compression example described here:
    http://msdn.microsoft.com/en-us/library/ms190257(v=sql.90).aspx
    I have set the filegroup to read-only, and taken the database offline.  I logged into the server that hosts the file, and ran this from a command prompt:
    ===================================================== 
    F:\Database>compact /c CE_SecurityLogsArchive_DataSecondary.ndf
    Compressing files in F:\Database\
    CE_SecurityLogsArchive_DataSecondary.ndf [ERR]
    CE_SecurityLogsArchive_DataSecondary.ndf: Insufficient system resources exist to
    complete the requested service.
    0 files within 1 directories were compressed.
    0 total bytes of data are stored in 0 bytes.
    The compression ratio is 1.0 to 1.
    F:\Database>
    ===============================================
    As you can see, it gave me an error: "Insufficient system resources exist to complete the requested service."  The drive has 564 GB free, so I doubt that is the issue.  Here are some specs on the server:
    MS Windows Server 2003 R2, Enterprise x64 Edition, SP2
    Intel Xeon E7420 CPU @ 2.13 GHz; 8 logical processors (8 physical)
    7.99 GB RAM
    Any suggestions on how to handle this?  I really need to get this large file compressed, and this method seems appealing if I can make it work, because you can supposedly continue to query the database even though it has been compressed. If I use a compression utility like 7-Zip, I doubt that I'll be able to query the compressed file.

    Hi,
    Based on my knowledge, if you compress a file that is larger than 30 gigabytes, the compression may not succeed.
    For detailed information:
    http://msdn.microsoft.com/en-us/library/aa364219(VS.85).aspx
    Regards,
    Yan Li

  • File adapter reading and writing large files

    Hi, we are getting an error when trying to process large files using file adapters - files of 80 to 100MB in size. We need to read the inbound files and write them to another folder on another server. The error we are getting is out of memory. Thanks

    Hi,
    Use an asynchronous process, or a checkpoint(), to dehydrate your instance before it times out.
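    Independent of the adapter, the underlying principle is to stream the file through in small chunks instead of reading all 80-100MB into memory at once. A generic illustration in Node.js (paths and chunk size are placeholder assumptions, not PI adapter code):

    // Generic illustration: copy a large file chunk by chunk so the whole
    // payload never sits in memory at once. Paths below are placeholders.
    const fs = require("fs");

    fs.createReadStream("/inbound/big.dat", { highWaterMark: 1024 * 1024 }) // 1MB chunks
        .on("error", function (err) { console.error("read failed:", err); })
        .pipe(fs.createWriteStream("/outbound/big.dat"))
        .on("error", function (err) { console.error("write failed:", err); })
        .on("finish", function () { console.log("copy complete"); });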
    --Khaleel

  • FTP - how large a file can it take?

    Hi Experts,
    Can anyone please tell me how large a file FTP can pull? What is the maximum file size for FTP?
    Thanks,
    Narendran

    Hi Narendran,
    If you read question 14 (Q: Which memory requirements does the File Adapter have? Is there a restriction on the maximum file size it can process?) in SAP Note 821267 - FAQ: XI 3.0 / PI 7.0 / PI 7.1 / PI 7.3 File Adapter, it shows that the size restriction depends on several factors; I recommend that you read it.
    Check also this thread: Size recommendation for placing file from FTP to Application Server??
    Also, I suggest that you read this blog if you need to work with big files: File/FTP Adapter - Large File Transfer (Chunk Mode)
    Regards.
