BT Cloud - large file ( ~95MB) uploads failing

I am consistently getting upload failures for any file over approximately 95MB in size.  This happens with both the web interface and the PC client.
With the web interface the upload gets to a percentage corresponding to around the 95MB mark, then fails, showing a red icon with an exclamation mark.
With the PC client the file gets to the same percentage equating to approximately 95MB, then resets to 0% and repeats this continuously.  I left my PC running 24/7 for 5 days, and this used around 60GB of upload bandwidth just trying to upload a single 100MB file.
I've verified this on two PCs (Win XP SP3), one laptop (Win 7, 64-bit) and my work PC (Win 7, 64-bit).  I've also verified it with multiple different types and sizes of files.  Everything from 1KB to ~95MB uploads perfectly, but anything above this size (I've tried 100MB, 120MB, 180MB, 250MB and 400MB) fails every time.
I've completely uninstalled the PC client, done a Windows "roll-back" and reinstalled, but this had no effect.  I also tried completely wiping the cloud account (deleting all files and disconnecting all devices) and starting from scratch a couple of times, but no improvement.
I phoned technical support yesterday and had a BT support rep remote-control my PC, but he was completely unfamiliar with the application and, after fumbling around for over two hours, had no suggestion other than waiting longer to see if the failure would clear itself!
Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However, I'm not sure how to get them to do this, as calling technical support was futile.
Any suggestions?
Thanks,
Elinor.

Hi,
I too have been having problems uploading a large file (362MB) for many weeks now, and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
All I want to do is share a video with a friend, and I thought BT Cloud would be perfect!  Oh, if only that were the case :-(
I first tried the web upload (as I didn't want to use the PC client's Backup facility) - it failed.
I then tried the PC client Backup... after about 4 hours of "progress" it reached 100% and an icon appeared.  I selected it and tried to share it by email, only for the share to fail with no link.  Cloud Backup thinks the file is there, but there are no files in my Cloud storage!
I too spent a long time on the phone to Cloud support, during which the tech took over my PC.  When he began trying to do completely inappropriate and irrelevant things such as cleaning up my temporary internet files and cookies, I stopped him.
Together we did successfully upload and share a small file - trouble is, that's not the file I want to share!
Finally he said he would escalate the problem to the next level of support.
After a couple of weeks of hearing nothing, I called again and went through the same farce with a different tech, after which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem, and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say, it isn't.
A couple of weeks later I've still heard nothing and it still doesn't work.
Why can't Cloud support at least send me an email to let me know they exist and are working on this problem?
I despair of ever being able to share this file with BT Cloud.
C'mon BT Cloud, surely you can do it - many other organisations can!

Similar Messages

  • FTP and HTTP large file ( 300MB) uploads failing

    We have two IronPort Web S370 proxy servers in a WCCP transparent proxy cluster.  We are experiencing problems with users who upload large video files where the upload will not complete.  Some of the files are 2GB in size, but most are in the hundreds of megabytes.  Files under a hundred megabytes seem to work just fine.  Some users are using the FTP proxy and some are using HTTP methods, such as YouSendIt.com.  We have tried explicit proxy settings with some improvement, but it varies by situation.
    Is anyone else having problems with users uploading large files and having them fail?  If so, any advice?
    Thanks,
       Chris

    Have you got any maximum sizes set in the IronPort Data Security Policies section, under Web Security Manager?
    Thanks
    Chris

  • Uploading large files to share fails - advice appreciated

    I wondered if anyone could help me with uploading large files. 
    The ability to distribute files to clients with the included 20GB of server space was part of the reason I signed up to Creative Cloud; unfortunately I'm having very mixed results and I'm constantly having to fall back on the free DropBox service, which seems much more reliable.
    I'm working on a MacPro (10.6.8) with an optical-fibre ISP 20Mb/s service; the service is, dare I say, usually very quick and reliable.  Before uploading a large file I set Energy Saver to "never sleep" in case I wander off for a cup of tea, but inevitably come back to what looks like a "stuck" upload.  I've tried three times today to send a 285MB file and they all failed, so I reverted to sending the file by DropBox... with no problem - I've also sent files in excess of 1GB by DropBox in the past without an issue.
    Today's file had reached about 50% according to the little progress bar... then it stopped.  (It's a shame the progress bar can't relay the actual upload speed and a percentage like DropBox, so you can see what's going on.)
    I have to state, I'm not a DropBox "fanboy", but it just seems to work...
    Are there any tricks or settings that could make Adobe's 20GB of space a functioning part of my workflow?  It would be great to feel confident in it.
    Thanks in advance for any advice.

    Either Firefox or Chrome would be a good alternative.
    I would see if one of these browsers works for upload but we can also debug the issue on Safari.
    Here are the steps to debug on Safari:
    Add the Develop menu to Safari. From the menu choose Safari > Preferences. Select the Advanced tab and check the box for "Show Develop menu in menu bar".
    Once logged onto the Creative Cloud Files page (https://creative.adobe.com/files) Ctrl + Click anywhere on the page and from the browser context menu choose Inspect Element.
    In the window that opens in the bottom half of the browser click on the Clock icon (mouse over info tip says Instrument) and on the left select Network Requests.
    Try the upload and you will see various network requests. Look for those whose Name is "notification". When you select one in the list there will be an arrow next to "notification", and clicking on it will take you to the details. If successful you will have a Status of 200. Some of these "notification" items are for your storage quota being updated, but the one we want is for the successful upload.

  • Splitting large files for upload

    I need to upload a very large file to an FTP site, 9.8GB. This takes a hugely long time with my bandwidth and invariably fails somewhere before it finishes. Does anyone know if there is software that will take a large file like this and split it up, perhaps into disk images, so that the pieces can be uploaded separately and then reassembled?
    Thanks very much.

    If you perform a web search you will find some file splitters available for the Mac; I don't use them myself, but hopefully you will find a suitable one.

  • Downloading via XHR and writing large files using WinJS fails with message "Not enough storage is available to complete this operation"

    Hello,
    I have an issue that some users are experiencing but I can't reproduce myself on my laptop. What I am trying to do is grab a file (a zip file) via XHR. The file can be quite big, like 500MB. Then I want to write it to the user's storage.
    Here is the code I use:
    DownloadOperation.prototype.onXHRResult = function (file, result) {
        var status = result.srcElement.status;
        if (status == 200) {
            var bytes = null;
            try {
                bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
            } catch (e) {
                try {
                    Utils.logError(e);
                    var message = "Error while extracting the file " + this.fileName + ". Try emptying your windows bin.";
                    if (e && e.message) {
                        message += " Error message: " + e.message;
                    }
                    var popup = new Windows.UI.Popups.MessageDialog(message);
                    popup.showAsync();
                } catch (ignored) { }
                this.onWriteFileError(e);
                return;
            }
            Windows.Storage.FileIO.writeBytesAsync(file, bytes).then(
                this.onWriteFileComplete.bind(this, file),
                this.onWriteFileError.bind(this)
            );
        } else if (status > 400) {
            this.error(null);
        }
    };
    The error happens at this line:
    bytes = new Uint8Array(result.srcElement.response, 0, result.srcElement.response.byteLength);
    with the description "Not enough storage is available to complete this operation". The user has only a C drive with plenty of space available, so I believe the error message given by IE might be a little misleading. Maybe in some situations Uint8Array can't handle such a large file? The program fails on an ASUSTek T100TA but not on my laptop (a standard one).
    Can somebody help me with that? Is there a better way to write a downloaded binary file to the disk not passing via a Uint8Array?
    Thanks a lot,
    Fabien

    Hi Fabien,
    If Uint8Array works fine on the other computer, it should not be a problem with the API; instead it could be a setting or configuration issue in IE.
    Actually, using XHR for a 500MB zip file is not recommended. Based on the documentation ("How to download a file"), XHR wraps an XMLHttpRequest call in a promise, which is not a good approach for downloading big items; please use Background Transfer instead, which is designed to receive big items.
    A quick search on the Internet suggests the "not enough storage" error is a known issue when using XMLHttpRequest:
    http://forums.asp.net/p/1985921/5692494.aspx?PRB+XMLHttpRequest+returns+error+Not+enough+storage+is+available+to+complete+this+operation - however, I'm not familiar with how to solve XMLHttpRequest issues.
    --James

  • Difficulty compressing a large file - compact command fails

    I am trying to compress a 419 GB SQL Server database, following the instructions in the compression example described here:
    http://msdn.microsoft.com/en-us/library/ms190257(v=sql.90).aspx
    I have set the filegroup to read-only, and taken the database offline.  I logged into the server that hosts the file, and ran this from a command prompt:
    ===================================================== 
    F:\Database>compact /c CE_SecurityLogsArchive_DataSecondary.ndf
    Compressing files in F:\Database\
    CE_SecurityLogsArchive_DataSecondary.ndf [ERR]
    CE_SecurityLogsArchive_DataSecondary.ndf: Insufficient system resources exist to
    complete the requested service.
    0 files within 1 directories were compressed.
    0 total bytes of data are stored in 0 bytes.
    The compression ratio is 1.0 to 1.
    F:\Database>
    ===============================================
    As you can see, it gave me an error: "Insufficient system resources exist to complete the requested service."  The drive has 564 GB free, so I doubt that is the issue.  Here are some specs on the server:
    MS Windows Server 2003 R2, Enterprise x64 Edition, SP2
    Intel Xeon E7420 CPU @ 2.13 GHz; 8 logical processors (8 physical)
    7.99 GB RAM
    Any suggestions on how to handle this?  I really need to get this large file compressed, and this method seems appealing if I can make it work, because you can supposedly continue to query the database even though it has been compressed.  If I use a compression utility like 7Zip, I doubt I'll be able to query the compressed file.

    Hi,
    Based on my knowledge, if you compress a file that is larger than 30 gigabytes, the compression may not succeed.
    For detailed information:
    http://msdn.microsoft.com/en-us/library/aa364219(VS.85).aspx
    Regards,
    Yan Li

  • Can't Upload Large Files (Upload Fails using Internet Explorer but works with Google Chrome)

    I've been experiencing an issue uploading large (75MB & greater) PDF files to a SharePoint 2010 document library. Using normal upload procedures in Internet Explorer 7 (our company standard for the time being) the upload fails. No error message is thrown: the upload screen goes away, the page refreshes and the document isn't there. I tried uploading multiple files and it throws a failure error after a while.
    Using Google Chrome I made an attempt just to see what it did, and the file uploaded in seconds using "Add a document". I can't figure out why one browser works and the other doesn't. We are getting sporadic inquiries with the same issue.
    We have previously set up large file support in the appropriate areas, and large files are uploaded to the sites successfully. Any thoughts?

    File upload size has to be configured at the server farm level. Your administrator most likely set a limit on the size of files that can be uploaded. This limit can be increased, and you would then be able to upload your documents.

  • "Unable to sync" when uploading large files to Creative Cloud

    Hi,
    I recently tried to upload some large files (1.1GB, 2GB) to my Creative Cloud. When the upload completed, nine hours later, I got the error message "Unable to sync".
    Any suggestions?
    Regards,
    pdm208

    I'm having the same issue. I removed the offending big files. I turned sync on and off, but it's still unable to sync. Any thoughts?

  • Large File Upload ~50mb Fails using Firefox/Safari. Upload Limit is 1gig set at Web App. IIS timeout set at 120 seconds.

    Strange issue.  A user can upload the file using IE9, but cannot do the same using Safari, Chrome, or Firefox.  The library resides in a site collection that was migrated from a 2010 site collection.  The timeout on IIS is 120 seconds.  Very strange indeed.
    I haven't been able to trace the correlation ID at this point.

    Try using the REST API for uploading files this large if you are using a custom solution.
    Try the links below for reference:
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    https://angler.wordpress.com/2012/03/21/increase-the-sharepoint-2010-upload-file-size-limit/
    http://social.technet.microsoft.com/wiki/contents/articles/20299.sharepoint-2010-uploading-large-files-to-sharepoint.aspx
    http://stackoverflow.com/questions/21873475/sharepoint-2013-large-file-upload-1gb
    https://answers.uchicago.edu/page.php?id=24860
    http://sharepoint.stackexchange.com/questions/91767/multiple-file-upload-not-working

  • Transfer of large files to and from Egnyte failing...

    One of my clients uses Egnyte for file management. For a typical job I will usually be required to download 5GB of files and upload 1.5GB.
    However, when at home, transfer of large files to and from Egnyte will often fail. (On download, Chrome gives the error message: "connection failed". Uploading, Egnyte's error message is: "HTTP error").
    I have three machines at home. Two Macs (running Yosemite and Lion) and a PC running Windows 7. I've had no luck with any of them on any browser but when using other people's broadband I have no problem at all (using my MacBook).
    I have no firewalls running. Yes, I've turned everything off and on again. So that leaves me to think the problem lies with my BT Homehub 4 router. But why would my router be botching the transfer of large files? I've switched the router's firewall off and tried adding my Mac to the DMZ (whatever that is), but that seems to be the most I can do. Ethernet is no different to wireless.
    I've not noticed this problem when using other file transfer sites (like WeTransfer).
    What's going on?
    Please help!

    From my own experience (I admin a few gaming servers and often get disconnected from them in the middle of monitoring operations), and based on other users' experiences here on the forums, I suspect BT has been having some core infrastructure issues which can lead to a) intermittent packet loss and b) extended packet delay - both of which can cause servers to assume a 'failure' and disconnect or suspend the upload/download.
    I don't know what package you are on from BT (I'm on Infinity 2) - and as it's Hogmanay I'm the one that drew the short straw to keep cheaters off our servers, so I'm a bit intoxicated and may not make total sense atm.
    https://community.bt.com/t5/BT-Infinity-Speed-Connection/BT-Infinity-issues-for-the-last-few-days/td...
    ^^ This thread illustrates issues that people have been having over the last few weeks.
    This probably won't help - but it might make you aware that you aren't alone in ONGOING issues.
    Happy New Year !

  • OOM happens inside Weblogic 10.3.6 when application uploads large files

    The Oracle Fusion BI Apps application uploads large files (100+ MB) to Oracle Cloud Storage. The application works properly when run outside the WebLogic server. When deployed on Fusion Middleware WebLogic 10.3.6, we get this OOM error during uploads of large files:
    java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 268435472
        at jrockit/vm/Allocator.allocLargeObjectOrArray(JIZ)Ljava/lang/Object;(Native Method)
        at jrockit/vm/Allocator.allocObjectOrArray(Allocator.java:349)[optimized]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.resizeBuffer(UnsyncByteArrayOutputStream.java:59)[inlined]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.write(UnsyncByteArrayOutputStream.java:89)[optimized]
        at com/sun/jersey/api/client/CommittingOutputStream.write(CommittingOutputStream.java:90)
        at com/sun/jersey/core/util/ReaderWriter.writeTo(ReaderWriter.java:115)
        at com/sun/jersey/core/provider/AbstractMessageReaderWriterProvider.writeTo(AbstractMessageReaderWriterProvider.java:76)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:98)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:59)
        at com/sun/jersey/api/client/RequestWriter.writeRequestEntity(RequestWriter.java:300)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:213)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
    It looks like WebLogic is using its default HTTP handler; switching to the Sun HTTP handler via the startup JVM option "-DUseSunHttpHandler=true" solves the OOM issue.
    It seems that instead of streaming the file content through a fixed-size byte array, the whole file is being buffered into memory during upload.
    Is it possible to solve this OOM by changing a setting of the WebLogic HTTP handler without switching to the Sun HTTP handler, as there are many other applications deployed on this WebLogic instance?
    We are concerned whether there will be any impact on performance or any other issue.
    Please advise; we highly appreciate your response.
    Thanks!

    Hi,
    If you have a backup, restore the file below and then try to start WebLogic:
    \Oracle\Middleware\user_projects\domains\<domain_name>\config\config.lok
    Thanks,
    Sharmela

  • Unable to submit form without attachment if file upload fails

    Hi All,
    I have a form in which I have an input file component.
    Scenario 1 - If I don't attach anything and submit the form, it is successful.
    Scenario 2 - If I attach a file of size less than 2MB and then submit, it is successful.
    Scenario 3 - If I attach a file of size greater than 2MB and then submit, ADF gives an error message saying "File is large", which is expected.
    3.1 - In scenario 3, after the file upload fails, I want to submit the form without attaching any file. But the form isn't submitted and the warning message always pops up.
    3.2 - If I upload a file of size less than 2MB again and then submit, it is successful.
    Please help me in resolving the issue.
    Thanks,
    Harini.

    Try clearing the error messages:
    <af:resource type="javascript">
      function clearMessagesForComponent(evt){
          AdfPage.PAGE.clearAllMessages();
          evt.cancel();
      }
    </af:resource>
    Now all you need to do is call this JavaScript method, e.g. using an <af:clientListener method="clearMessagesForComponent" type="focus"/> on a button or any other field.
    Timo

  • File upload fails

    File upload fails when the file is larger than about 50kb. The upload returns a blank page when it fails. I have searched the world over and cannot find an answer to this problem. Does anyone have any suggestion of what to do?
    Thank you,
    Chris

    What mechanism are you using to handle uploaded files? Some libraries place a configurable limit on the size of the file.
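    The configurable limit the reply refers to is typically just a byte count checked against the request's declared size before the body is read. A minimal sketch, where `checkUploadSize` and the 50KB default are hypothetical (chosen to mirror the symptom above), not any specific library's API:

```javascript
// Hypothetical configurable cap, mirroring the ~50KB failure described above.
const MAX_UPLOAD_BYTES = 50 * 1024;

// Validate a request's declared Content-Length before accepting the body,
// as many upload handlers do; returns whether (and why) it would be rejected.
function checkUploadSize(contentLength, maxBytes = MAX_UPLOAD_BYTES) {
  const size = Number(contentLength);
  if (!Number.isFinite(size) || size < 0) {
    return { ok: false, reason: 'invalid Content-Length' };
  }
  if (size > maxBytes) {
    return { ok: false, reason: 'file exceeds the ' + maxBytes + '-byte limit' };
  }
  return { ok: true };
}
```

    A blank page on failure is consistent with the server rejecting the body after this kind of check without rendering an error, so the library's limit setting is the first thing to inspect.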

  • File upload script not getting the file name for larger files

    Hi
    I have the following code (see extract below) and find that
    when the size of the file to upload is larger than about 300kb the
    code does not grab the file name. Consequently the upload fails.
    The code works fine when the file size is smaller.
    The code in the form page is:
    <form
    action="UploadAttachment.asp?SubjectName=<%=Request.QueryString("SubjectName")%>&VersionNumber=<%=Request.QueryString("VersionNumber")%>&QualificationName=<%=Request.QueryString("QualificationName")%>"
    method="post" enctype="multipart/form-data" name="form1">
    <input name="file" type="file" size="100">
    <input name="Upload" type="submit" id="Upload" value="Upload">
    </form>
    The code in the UploadAttachment.asp page is:
    <%
    'Grab the file name
    Dim objUpload, strPath, SQLString
    Set objUpload = New clsUpload
    'There is a problem that this next line doesn't grab the file name if the file is too large.
    strFileName = objUpload.Fields("file").FileName
    'etc.
    %>
    If you have any idea how to resolve this I'd be grateful.
    Neil


  • ITunes (Win) crashing when trying to match/upload large files

    Hi there,
    I've got a little problem with iTunes Match again. Imho it is still veeery buggy but it certainly improved a bit!
    So the problem is:
    I listen to mixes a lot - meaning large MP3 files with runtimes of up to 3 hours.
    When I load files like these into my iTunes library on Windows 7 and then try to upload them to the cloud, 9 times out of 10 iTunes simply crashes while analysing these large files ("iTunes stopped working"). Nothing I can do. Even trying to upload these files one by one didn't help.
    Does anyone have any ideas on what to do about this?
    thanks, cheers
    P

    Thanks for the reply!
    I know that 200MB is the limit. But as you already noted, it should just say "not compatible" or whatever, not simply crash. The problem is I can't have these tracks in my iTunes library then, because every time I try to sync my library with Match it tries to sync these files as well and iTunes simply crashes.
    I also have a MacBook, so I will try to upload the files via iTunes on the Mac. If that doesn't help, the last idea may be to reduce the file size... although I hate doing this as I prefer 256kbit files.
    Splitting them in two would be another idea.
    Anyway, I've had these problems with large files a couple of times already, but could always fix them by syncing one file at a time. Not this time though.
