Bandwidth control during large file transfer

My user has an application that requires him to transfer huge files from one location to another.
Our whole company is connected via an IP VPN tunnel, and each office has a 20 Mb Ethernet WAN link.
When this user starts his file transfer, it chokes up the whole 20 Mb at that particular office.
We have checked at the application layer, but there is no option to control the file transfer speed or bandwidth.
Is there any way I can control the bandwidth usage of this user?

Your options for bandwidth control depend very much on your platform.  Some devices will allow you to police or shape the traffic, but as Vasilii describes, probably the best solution is to deprioritize this bulk-transfer traffic using QoS.  (For 20 Mbps, you might need a hierarchical shaper.  Also, for this to work well with an Internet VPN, nothing else can share the Internet ingress.)
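For illustration only: on a Cisco IOS WAN router that supports MQC, a hierarchical shaper that caps the link at the 20 Mb access rate and deprioritizes the bulk transfer would look roughly like the sketch below. The ACL, addresses, class/policy names and interface are placeholders made up for the example, not anything from your network, and other platforms will have different syntax.

! Match the bulk-transfer traffic (placeholder: the user's PC at 10.1.1.50)
access-list 100 permit ip host 10.1.1.50 any
!
class-map match-all BULK-TRANSFER
 match access-group 100
!
! Child policy: give the bulk class only a small guaranteed share
policy-map DEPRIORITIZE
 class BULK-TRANSFER
  bandwidth percent 5
 class class-default
  fair-queue
!
! Parent policy: shape everything to the 20 Mb access rate so queuing takes effect
policy-map SHAPE-20MB
 class class-default
  shape average 20000000
  service-policy DEPRIORITIZE
!
interface GigabitEthernet0/0
 service-policy output SHAPE-20MB

The point of the parent shaper is to create the congestion point on your own router, so the child policy's queuing decisions take effect before traffic is dropped at the 20 Mb access rate.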

Similar Messages

  • SharePoint Foundation 2013 Optimization For Large File Transfer?

    We are considering upgrading from  WSS 3.0 to SharePoint Foundation 2013.
    One of the improvements we want to see after the upgrade is a better user experience when downloading large files.  It can be done now, but it is not reliable.
    Our document library consists of mostly average sized Office documents, but it also includes some audio and video files and software installer package zip files ranging from 100MB to 2GB in size.
    I know we can change the settings to "allow" larger than default file downloads, but how do we optimize the server setup to make these large file transfers work as seamlessly as possible? More RAM on the SharePoint Foundation server? Other Windows, SharePoint or IIS optimizations? The files will often be downloaded from the Internet, so we will not have control over the download speed.

    SharePoint is capable of sending large files, it is an HTTP stateless system like any other website in that regard. Given your server is sized appropriately for the amount of concurrent traffic you expect, I don't see any special optimizations required.
    Trevor Seward
    I see information like this posted warning against doing it as if large files are going to cause your SharePoint server and SQL to crash. 
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    "Though SharePoint is meant to handle files that are up to 2 gigs in size, it is not practically feasible and not recommended as well."
    "Not practically feasible" sounds like a pretty dire warning to stay away from large files.
    I had seen some other links warning that large files in the SharePoint database cause problems with fragmentation and large amounts of wasted space that doesn't go away when files are removed, or that the server may run out of memory because downloaded files are held in RAM.

  • IdeaTab A2109A - large file transfer

    Hello,
    Trying to transfer a large file (4.5GB) to an IdeaTab A2109A tablet connected to a laptop, the file gets truncated at 1.2GB and the transfer stops. Smaller files are copied fine, but it seems that larger files have a problem.
    The tablet has no SD card attached, so I'm transferring directly to its internal storage (11GB free space).
    Android ver. 4.1.1/ kernel 3.1.10
    Please share your ideas/experience/solutions on transferring large files to the IdeaTab A2109A.
    Thanks!

    I've never tried transferring a file that big, but I've found using adb push and pull to transfer faster than using MTP.
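    If you go the adb route, the basic commands look like the lines below (USB debugging must be enabled on the tablet, and the local path is just a placeholder):
    adb devices
    adb push C:\Videos\bigfile.mp4 /sdcard/Movies/
    adb pull /sdcard/Movies/bigfile.mp4 C:\Videos\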

  • Slow large file transfer speed with LaCie 1T firewire 800 drives

    I am transferring two large files (201GB and 95GB) from one LaCie 1TB FireWire external drive to another (using separate connections to a PCI Express FireWire 800 card in my Quad G5). The transfer time is incredibly slow – over four hours for the 201GB file and over 2 hours for the 95GB file.
    Does anyone have any ideas why this is so slow or what I might try to speed up the transfer rates?
    Thank you.
    G5 Quad 2.5 8GB DDR2 SDRAM two SATA 400GB   Mac OS X (10.4.5)  

    You posted this in the Powerbook discussion forum. You may want to post it in the Power Mac G5 area, located at http://discussions.apple.com/category.jspa?categoryID=108

  • Large file transfer problems over client/server socket

    Hi,
    I wrote a simple chat program in which I can send files from client to client. The problem is that when I send large files over 101 MB, the transfer freezes. I do not even get any error messages in the console. The files I am sending are of any type (e.g. mp3, movies, etc.). I am serializing the data into a "byteArray[packetSize]" and sending the file packet by packet until the whole file has been sent and reconstructed on the other side. The process works perfectly for files smaller than 101MB, but for some reason it freezes if the file is larger. I have read many forums and there aren't too many solutions out there. I made sure to use the .reset() method to reset my ObjectOutputStream each time I write to it.
    Here's my file sending code:
    byte[] byteArray = new byte[defaultPacketSize];
    numPacketsRequired = Math.ceil(fileSize / defaultPacketSize);
    try {
        int i = 0;
        reader = new FileInputStream(filePath);
        while (reader.available() > 0) {
            if (reader.available() < defaultPacketSize) {
                // Last, possibly partial, packet followed by the "DONE" marker
                byte[] lastPacket = new byte[reader.available()];
                reader.read(lastPacket);
                try {
                    if (socket == null || output == null) {
                        throw new SocketException("socket does not exist");
                    }
                    output.writeObject(lastPacket);
                    output.reset();
                    output.writeObject("DONE");
                    output.reset();
                    output.close();
                    socket.close();
                } catch (Exception e) {
                    System.out.println("Exception ***: " + e);
                    output.close();
                    socket.close();
                }
            } else {
                // Full-size packet
                reader.read(byteArray);
                try {
                    if (socket == null || output == null) {
                        throw new SocketException("socket does not exist");
                    }
                    output.writeObject(byteArray);
                    output.reset();
                } catch (Exception e) {
                    System.out.println("Exception ***: " + e);
                    output.close();
                    socket.close();
                }
            }
        }
        reader.close();
    } catch (Exception e) {
        System.out.println("COULD NOT READ PACKET");
    }
    Here's my file receiving code:
    try {
        // The message from the client
        Object streamInput;
        FileOutputStream writer;
        byte[] packet;
        while (true) {
            streamInput = input.readObject();
            if (streamInput instanceof byte[]) {
                packet = (byte[]) streamInput;
                try {
                    writer = new FileOutputStream(outputPath, true);
                    writer.write(packet); // Storing the bytes on file
                    writer.close();
                } catch (Exception e) {
                    System.out.println("Exception: " + e);
                }
            } else if (streamInput.equals("DONE")) {
                socket.close();
                input.close();
                break;
            }
        }
    } catch (Exception e) {
    }
    I'm looking for any way I can possibly send large files from client to client without having it freeze. Are there any better ways to transfer files other than sockets? I don't really want FTP; I think I want to keep it HTTP.
    Any suggestions would be helpful.Thanks!
    Evan

    I've taken a better look at the code you posted, and there is one problem with the receiving code. You keep repeatedly opening and closing the FileOutputStream. This is not going to be efficient, as the file will keep needing to be positioned to its end.
    Yes, sorry, I did change that code so that it leaves the file open until writing is completely done. Basically I have a progress bar that records how far along in the writing process the client is, and when the progress bar reaches 100%, meaning the file is complete, the file.close() method is invoked. Sorry about that.
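    For reference, a minimal sketch of that approach, keeping the stream open until the "DONE" marker arrives (the class and method names are just illustrative, not the actual revised code):
    import java.io.FileOutputStream;
    import java.io.ObjectInputStream;
    class FileReceiver {
        // Open the output file once, append each packet, close only when "DONE" arrives.
        static void receiveFile(ObjectInputStream input, String outputPath) throws Exception {
            FileOutputStream writer = new FileOutputStream(outputPath);
            try {
                while (true) {
                    Object streamInput = input.readObject();
                    if (streamInput instanceof byte[]) {
                        writer.write((byte[]) streamInput);   // append this packet
                    } else if ("DONE".equals(streamInput)) {
                        break;                                // sender is finished
                    }
                }
            } finally {
                writer.close();   // closed exactly once, after the whole transfer
            }
        }
    }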
    I also ran some memory tests using the "Runtime.getRuntime().totalMemory()" and "Runtime.getRuntime().freeMemory()" methods. I put these calls inside the loop where I read in the file and send it to the client. Here's the output:
    Sender's free memory: 704672
    File reader read 51200 bytes of data.
    767548 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 702968
    File reader read 51200 bytes of data.
    716348 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 701264
    File reader read 51200 bytes of data.
    665148 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 699560
    File reader read 51200 bytes of data.
    613948 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 697856
    File reader read 51200 bytes of data.
    562748 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 696152
    File reader read 51200 bytes of data.
    511548 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 694448
    File reader read 51200 bytes of data.
    460348 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 692744
    File reader read 51200 bytes of data.
    409148 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 691040
    File reader read 51200 bytes of data.
    357948 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 689336
    File reader read 51200 bytes of data.
    306748 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 687632
    File reader read 51200 bytes of data.
    255548 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 685928
    File reader read 51200 bytes of data.
    204348 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 684224
    File reader read 51200 bytes of data.
    153148 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 682520
    File reader read 51200 bytes of data.
    101948 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 680816
    File reader read 51200 bytes of data.
    50748 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 679112
    File reader read 50748 bytes of data.
    0 left to read.
    Creating last packet of size: 50748
    Last packet size after setting it equal to byteArray: 50748
    Here's the memory stats from the receiver:
    Receiver's free memory: 639856
    Receiver's total memory: 2842624
    Receiver's free memory: 638920
    Receiver's total memory: 2842624
    Receiver's free memory: 637984
    Receiver's total memory: 2842624
    Receiver's free memory: 637048
    Receiver's total memory: 2842624
    Receiver's free memory: 636112
    Receiver's total memory: 2842624
    Receiver's free memory: 635176
    Receiver's total memory: 2842624
    Receiver's free memory: 634240
    Receiver's total memory: 2842624
    Receiver's free memory: 633304
    Receiver's total memory: 2842624
    Receiver's free memory: 632368
    Receiver's total memory: 2842624
    Receiver's free memory: 631432
    Receiver's total memory: 2842624
    Receiver's free memory: 630496
    Receiver's total memory: 2842624
    Receiver's free memory: 629560
    Receiver's total memory: 2842624
    Receiver's free memory: 628624
    Receiver's total memory: 2842624
    Receiver's free memory: 627688
    Receiver's total memory: 2842624
    Receiver's free memory: 626752
    Receiver's total memory: 2842624
    Receiver's free memory: 625816
    Receiver's total memory: 2842624
    Receiver's free memory: 624880
    Receiver's total memory: 2842624
    Receiver's free memory: 623944
    Receiver's total memory: 2842624
    Receiver's free memory: 623008
    Receiver's total memory: 2842624
    Receiver's free memory: 622072
    Receiver's total memory: 2842624
    Receiver's free memory: 621136
    Receiver's total memory: 2842624
    Receiver's free memory: 620200
    Receiver's total memory: 2842624
    Receiver's free memory: 619264
    Receiver's total memory: 2842624
    Receiver's free memory: 618328
    Receiver's total memory: 2842624
    Receiver's free memory: 617392
    Receiver's total memory: 2842624
    Receiver's free memory: 616456
    Receiver's total memory: 2842624
    Receiver's free memory: 615520
    Receiver's total memory: 2842624
    Receiver's free memory: 614584
    This is just a sample of both the receiver's and the sender's stats. Everything appears to be fine! Hope this post isn't too long.
    Thanks!

  • Large file transfer

    Does anybody know how I can transfer large files (600 MB) between 2 client applications? I am using TCP, and I can retrieve the files from a client hard disk along with the total size, but I can't figure out how to transfer them between clients without getting an out-of-memory exception. Any help will be appreciated!!!

    http://onesearch.sun.com/search/developers/index.jsp?col=devforums&qp_name=Java+Programming&qp=forum%3A31&qt=transfer+large+files
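    For illustration, a minimal sketch of streaming a file over a TCP socket in fixed-size chunks, so memory use is bounded by the buffer size rather than the file size (the class name, host, port and buffer size are placeholders, not from this thread):
    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.Socket;
    class ChunkedSender {
        public static void main(String[] args) throws Exception {
            Socket socket = new Socket("receiver.example", 9000);   // placeholder host/port
            InputStream in = new BufferedInputStream(new FileInputStream(args[0]));
            OutputStream out = socket.getOutputStream();
            byte[] buffer = new byte[64 * 1024];                    // only 64 KB in memory at a time
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);                            // write exactly what was read
            }
            out.flush();
            in.close();
            socket.close();
        }
    }
    The receiver does the mirror image: read into a buffer and append to a FileOutputStream until read() returns -1.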

  • Network Drive Losing Connection During Large File Transfer...

    Hey, everyone.
    Recently bought an Airport Extreme and LaCie USB HDD/Hub for my hybrid Windows/Mac home network. I formatted the drive with FAT32 as instructed, and hooked it up.
    My first order of business was to back up all of the media on my Windows desktop machine, but it hasn't been working out so well. A short while after starting the 35+ gig transfer (a few minutes at most), I get an error message saying that the destination folder is no longer there. Sure enough, checking 'My Computer' shows that the drive is now a 'Disconnected Network Drive'.
    Cycling power on everything and rebooting Windows restores the network drive, but the core problem remains.
    Anyone else seen this and have any idea how to deal with it?
    Thanks in advance for any help.

    Hi,
    I'm having the same problem. Here's the relevant parts of my setup:
    Airport Extreme
    HP printer and Western Digital "My Book" 500 G USB drive connected by 4/1 usb splitter
    MacBookPro connected via Gigabit
    Windows XP PC connected via Gigabit
    I loaded my media onto the USB drive by connecting it directly to my PC; I've also loaded other iTunes media onto the USB drive from the MacBook. If I point iTunes on the PC at the USB drive, I can play the media just fine. When I try to copy large files (700 MB) from my PC to the USB drive (via the AirPort), I'm told the destination folder isn't there. I've also noticed Windows tooltips telling me that it lost the Internet connection, and if I have AirPort Utility open on my Mac it also loses the AirPort for a few seconds. I am able to copy small files (the largest that has worked so far is 1.6 MB) to the USB drive via the AirPort.

  • Large file transfer drop

    Hello,
    I cannot transfer large files from an iMac (Vista/Win7) to another Windows share. I always get a disconnection at some point before it's done.
    I have checked some older threads where an alternative Broadcom driver provided by HP did the trick, but those threads are rather old and the solutions suggested there are not working for me.
    Any ideas?
    Thanks.

    This?
    http://www.drop.io/index.html
    http://blog.drop.io/2010/10/29/an-important-update-on-the-future-of-drop-io/

  • Large file transfer limitations

    Does anyone know of a limitation on the size of files that can be uploaded and downloaded using the Java client in Agile 9.3.0.1 or 9.2.2.2?
    We want to transfer files up to 5GB. We have successfully transferred smaller files up to 500 MB, but we get failures above that consistently.
    If you know of any setting that can be set to allow this please let me know.
    Thanks
    Erik

    Erik,
    I never worked on this software before, but did you check the [Agile Documentation|http://www.oracle.com/technology/documentation/agile.html] and see if it mentions anything about the file size?
    You may also review these documents and see if they help.
    Note: 758844.1 - Fileload Fails Loading Of Files With Filesize Greater Than 2 GB on AIX Platform
    Note: 736962.1 - What are the Limitations of File Management Server (FMS)?
    Regards,
    Husein

  • Large file transfer from iPhone 4S to MacBook Pro

    Hello,
    I have a video that's nearly 10 gigs on my iPhone 4S that I can't get on to my MacBook Pro.  I've taken a few videos that are around three to six minutes and haven't had any issues.  I get a message telling me, "Error Downloading Image, iPhoto can not import your photos because there was a problem downloading one of your images."
    I'm stumped.  I really don't want to delete and lose this video, but I can't be having a 10 gig video on my iPhone forever.  Can anyone help?
    Thanks!

    Try getting it on the Mac using something other than iPhoto.
    In Windows, it would simply be a matter of opening Explorer and copying the file(s).  I'm fairly certain that Mac has a similar feature, but I'm not familiar with Macs.
    Not really an iPhone issue... more of an issue with the application being used to import the file.

  • Can you stop AirDrop midway through a large file transfer?

    I'm moving 400 photos from my laptop to desktop Mac. It's taking a very long time. Can I stop without screwing things up?


  • Large file transfer for windows computer

    I made an HD movie that is about 6 GB in size and would like to give copies to friends with Windows computers. I have an external drive with a FAT32 partition, but it's not possible to copy the file onto this drive; I think the limit is about 4 GB with FAT32-formatted disks. Any idea how to manage this?
    One of my employees has a Mac with Windows running on it; can he copy the file for me? The external drive has two partitions, one FAT32 and one OS X journaled. Is Windows running on a Mac able to read an OS X journaled disc and copy the file onto the FAT32 partition?

    If you don't know, then look something up.
    exFAT (Extended File Allocation Table, also sometimes referred to as FAT64) is a proprietary, patent-pending file system suited especially for USB flash drives.
    (EXtended File Allocation Table) An enhanced version of the FAT file system from Microsoft that uses less overhead than NTFS. It extends the maximum file size of 4GB in FAT32 to virtually unlimited.
    Read more: http://www.answers.com/topic/exfat#ixzz1ASNAm82m
    http://en.wikipedia.org/wiki/ExFAT
    As to NTFS support being 'experimental', in which universe is that? Mac NTFS drivers have been sold by major companies for over 3 years, and there is also MacFUSE (from a well-respected developer of considerable talent and knowledge of operating systems, I might add).
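    For what it's worth, if you do go the exFAT route on the Mac side (OS X 10.6.5 or later), reformatting the FAT32 partition is a one-liner. Note that this erases that partition, and the volume name and disk identifier below are placeholders; check diskutil list first and substitute your own:
    diskutil list
    diskutil eraseVolume ExFAT MOVIES disk2s2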

  • Transfer "large" files (above 300 mb) freezes Airport?

    I know this is already mentioned in this forum, but the problem still remains unsolved (and I have filed an error report with Apple). However, perhaps someone can help me (and yes... I did HFS+ format my external HD...).
    Setup: an iMac with Vista on Boot Camp, and a Vista MCE on a separate machine. When I try to transfer files between the two systems with file sizes of 300+ MB, the transfer freezes and nothing can be transferred... in Vista I have to restart Explorer, and in Leopard the disc suddenly becomes unavailable... so what to do? Throw out my AirPort Extreme N router? Or something else? I have already tried using FTP, but transferring the files still does not work.

    Some additional things that I've tried since the original message:
    - Reinstalled Snow Leopard back to 10.6.0. When booted from the installer, and then re-booting after it completed, the OS could not detect the Airport card. After a Shut Down and reboot, it still could not detect the card, but after an additional cycle, it came back up. However, a large file transfer still caused it to fail again (~100 MB into a 300 MB file)
    - I can cycle the power to the Airport card successfully if there hasn't been a reason for it to fail yet.
    - The failure also occurs during an FTP put operation.
    - Other people's suggestions for solving "Airport card not found" regarding creating a new location are ineffective. Only power cycling the computer seems to resolve the issue.
    As I said, I'm really starting to suspect hardware issue, but if there's anything else to try, I'd be happy to give it a shot. I'm about to start a clean install of Snow Leopard to an external disc to see if it still happens there as well.

  • Connection drop when transfer large file

    We are using an IBM T40 (with 802.11b built in). We are able to surf the web and ping successfully. Once we transfer a large file, the connection just drops, and the debug log from the AP1200 shows:
    Dec 18 10:39:57.095 H: %DOT11-4-MAXRETRIES: Packet to client xxxx.xxxx.xxxx reached max retries, removing the client
    Dec 18 10:39:57.095 H: %DOT11-6-DISASSOC: Interface Dot11Radio0, Deauthenticating Station xxxx.xxxx.xxxx Reason: Previous authentication no longer valid
    However, when we try using Cisco 350 PCMCIA cards and Orinoco cards, there is no problem with large file transfers.
    We have upgraded the driver to the latest version.
    Any help? Thanks.

    This type of error is usually caused by RF interference. When the AP sends a packet to the client, the client must ACK that packet back to the AP; if the AP receives no ACK, it resends the packet. It will repeat this process 16 times (default config) before assuming the client is off the network (the person moved out of range or powered off their computer) and removing the client from its association table. I would verify coverage via a site survey, looking not only at signal strength but also at signal quality.

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using ICO. I am on PI 7.3 and I am using a new PI 7.3 feature to split the input file into chunks.
    And I know that we cannot use mapping while using chunk mode.
    While trying, I noticed the points below:
    1) I had created a Data Type, Message Type and interfaces in the ESR and used them in my scenario (no mapping was defined); the sender and receiver data types were the same.
    Result: The scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    So, please confirm whether we should always use dummy interfaces in the scenario while using chunk mode in PI 7.3, or whether there is something that I am missing.
    Thanks in Advance,
    - Pooja.

    Hello,
    While trying, I noticed the points below:
    1) I had created a Data Type, Message Type and interfaces in the ESR and used them in my scenario (no mapping was defined); the sender and receiver data types were the same.
    Result: The scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    So, please confirm whether we should always use dummy interfaces in the scenario while using chunk mode in PI 7.3, or whether there is something that I am missing.
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    the following limitations apply to chunk mode in the File Adapter.
    As per the screenshots in that blog, the split never considers the payload; it's just a binary split. So the following limitations would apply:
    Only for File Sender to File Receiver
    No Mapping
    No Content Based Routing
    No Content Conversion
    No Custom Modules
    You are probably doing content conversion; that is why it is not working.
    Hope this helps,
    Mark
    Edited by: Mark Dihiansan on Mar 5, 2012 12:58 PM
