Large File problem

Hi!
I have a Swing application that must be able to open a large file (4.8 GB), but when I process the file it comes out incomplete.
I think there is not enough system memory to open the file.
Is there some way to know in advance whether a file cannot be opened completely (some way to know the free system memory)?
Or is there a way to open very large files?
Thank you

Might it be that you are using an int instead of a long somewhere for file positions?
Holding the entire file in memory makes no sense.
You might try java.nio to memory-map the file.
And you might use java.util.zip to compress file parts on the fly while loading them.
The JVM switches -Xms and -Xmx might further help out.
Your application could use soft and weak references to allow for cleanups when memory runs low.
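For example, a minimal sketch (not from your application; the file name is made up) that first checks how much heap is still usable and then walks a huge file with java.nio memory mapping instead of loading it. Note that a single MappedByteBuffer cannot cover more than 2 GB, so a 4.8 GB file has to be mapped in several windows, and all positions must be long, not int:

import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class LargeFileDemo {
    public static void main(String[] args) throws Exception {
        Runtime rt = Runtime.getRuntime();
        // Heap still usable by this JVM (not the free memory of the whole system).
        long usableHeap = rt.maxMemory() - (rt.totalMemory() - rt.freeMemory());
        System.out.println("Usable heap (approx.): " + usableHeap + " bytes");

        // Walk the file in 512 MB windows; only the mapped window is paged in.
        final long WINDOW = 512L * 1024 * 1024;
        RandomAccessFile raf = new RandomAccessFile("huge.dat", "r");
        FileChannel channel = raf.getChannel();
        long size = channel.size();                       // long, never int
        for (long pos = 0; pos < size; pos += WINDOW) {
            long len = Math.min(WINDOW, size - pos);
            MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_ONLY, pos, len);
            // process buf here, e.g. buf.get(...)
        }
        channel.close();
        raf.close();
    }
}

If the data really has to sit on the heap, raising -Xmx (e.g. java -Xmx6g YourApp) only helps on a 64-bit JVM with enough physical RAM; for a 4.8 GB file, mapping or streaming it is usually the safer route.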

Similar Messages

  • Airport Extreme, Airdisk, and Vista - large file problem with a twist

    Hi all,
    I'm having a problem moving large-ish files to an external drive attached to my Airport Extreme, but only sometimes. Let me explain.
    My system: MacBook Pro, 4 GB RAM, 10 GB free HD space on the MacBook, running the latest updates for Mac and Vista; the external hard drive on the AE is an internal WD in an enclosure with 25 GB free (it's formatted for PC, but I've used it directly connected to multiple computers, Mac and PC, without fail); the AE is using firmware 7.3.2, and I'm only allowing 802.11n. The problem occurs on the Vista side; I haven't checked the Mac side yet.
    The good: I have BitTorrent set up, using uTorrent, to automatically copy files over to my AirDisk once they've completed. This works flawlessly. If I connect the hard drive directly to my laptop (MacBook running Boot Camp with Vista, all updates applied), large files copy over without a hitch as well.
    The bad: For the past couple of weeks (could be longer, but I've only noticed it recently being a problem; is that a firmware clue?), if I download the files to my Vista desktop and copy them over manually, the copy just sits for a while and eventually fails with a "try again" error. If I try to copy any file over 300 MB, the same error occurs.
    What I've tried: well, not a lot. The first thing I did was to make sure my hard drive was error-free and worked when physically connected; it is and it did. I've read a few posts about formatting the drive for Mac, but that really isn't a good option for me since all of my music is on this drive and iTunes pulls from it (which also works without a problem). I do, however, get the hang in iTunes if I try to import large files (movies). I've also read about going back to an earlier AE firmware, but the posts were outdated. I can try the Mac side and see if large files move over, but again, I prefer to do this in Windows Vista.
    This is my first post, so I'm sure I'm leaving out vital info. If anyone wants to take a stab at helping, I'd love to discuss. Thanks in advance.

    Hello,
    Just noticed the other day that I am having the same problem. I have two Vista machines attached to a TC with a Western Digital 500 GB drive connected over USB. I can write to the TC (any file size) with no problem; however, I cannot write larger files to my attached WD drive. I can write smaller folders of music and such, but I cannot back up larger video files. Yet, if I directly attach the drive to my laptop, I can copy over the files, no problem. I could not find any setting in the AirPort Utility with regard to file size limits or anything of the like. Any help on this would be much appreciated.

  • Copy a small version into a larger file - problem with typing

    Hi,
    I have pasted a copy of a Captivate file into a file that is 10px larger on each side. Now I have the problem that the typing animation is not in the correct place. Take a look:
    the original:
    the pasted version in the 10px larger file:
    Is this a known problem? Is there a way to fix this?

    Hi there,
    I would suggest that you use a screen capture utility such as Snagit to capture your Request Membership box and then insert this into your Captivate slide as an image. This will then enable you to accurately position the graphic and should address the typing problem.
    Best - Mark
    Visit the macrofireball blog

  • Copying large files problem in windows 7 x64 professional

    Hi!
    I'm new to this forum and I recently installed Windows 7 Professional without too much trouble on my PC. The only things that don't work are the quick-launch buttons (which I'm expecting HP to release by October 22). The system is functioning OK, but I discovered a very serious problem when transferring large files (copy or move) to an external source (in my case an external HD: a Western Digital My Book 1 TB connected via USB).
    When I try to move a file that is larger than 1 GB, the system gets slower and slower during the process, rendering it useless unless I cancel the operation and restart. This only happens when moving files from my notebook to the external HD.
    I've been doing my homework and I checked microsoft technet forums, here's what I found:
    1. It seems a lot of people have this problem, in particular those with an ASUS motherboard (which I know isn't my case). But since I don't know which motherboard my notebook has, I can't come up with a solution that way.
    2. It seems to be a particular problem with the 64-bit version of Windows.
    3. Some users claim the problem was solved when they upgraded the chipset drivers, which I tried (though with the current Vista drivers), but the problem wasn't solved.
    4. Some users claim that using other software like TeraCopy, or granting full permission control, works around this, but not in my case. It doesn't work.
    5. With SiSoft Sandra I managed to identify the motherboard of this notebook, but had no luck finding information or updates for a Quanta motherboard.
    Here are my system specs (link to the HP site describing it):
    Model: HP pavilion 6950la
     http://h10025.www1.hp.com/ewfrf/wc/document?docname=c01470050&lc=en&dlc=en&cc=us&lang=en&softwareitem=ob-55781-1&os=2100&product=3753762
    Windows 7 professional x64
    I'm worried about this, and it seems weird to me that no one who has tried Windows 7 Professional x64 has mentioned this in the forum, because I think it is a really serious issue and I would like to make sure it gets known and hopefully solved soon. I really don't want to go back to Vista just for this. And I hope my post will be useful to the HP people so they can work around this problem.
    Thanks in advance, I will wait for your answers.

    Please repost your inquiry in the Microsoft Windows 7 Performance Forum. 
     Thank you!
    Carey Frisch
    Microsoft MVP
    Windows Expert - Consumer

  • Large file problems with an external FAT32 formatted drive (solutions???)

    I have an external hard drive that was from my PC and formatted as NTFS so it was read only on my Mac. I formatted it to FAT32 so that I could read/write to it from both my Mac and PC.
    Now, because it is formatted FAT32, I can only have files up to 4 GB, which has proved to be a pain for me. Is there not a format that satisfies both platforms with unrestricted file sizes?
    Thanks in advance.

    Well, the answer is: not easily.
    NTFS, EXT2, and Mac HFS can all be read and written by Windows, Mac OS X and Linux, with appropriate software. You should probably settle on what you are going to use most, and use that as your preferred filesystem.
    For example, if you were going to use the Mac mostly to deal with the disk, format the disk using HFS+. On Windows, buy MacDrive and install it to read/write the Mac-format disk. If you are going to use Windows most of the time, then install MacFUSE and NTFS-3g on your Mac (http://code.google.com/p/macfuse/). et cetera.
    There's nothing in the way of a really good cross-platform filesystem. ZFS may eventually become that (one might hope), but that's still not quite there.

  • Privileges problem when transferring large files

    This just happened, and I don't know what changed. I cannot transfer large files from my MacBook Pro to my Mac Pro desktop, neither over the network nor via a USB hard drive. I get the infamous "You do not have sufficient privileges" message. The files start copying, and after about 1 GB or so, the message pops up. Small files are no problem; there's no restriction on those. It does not matter which disk I am trying to copy to.
    I thought it was just a problem with iMovie files and folders (these get big fast), but it seems to be an issue with any large file. I have repaired permissions on both disks, and I have made sure that the padlock is unlocked and that all the boxes are checked "Read & Write".
    And this problem just arose today. Nothing has changed, so far as I know.

    I assume I am always logged in as administrator because I have never logged in as anything else. It also does not matter what folder, or even what disk drive, I am transferring from. If the file is above a certain size, it stops transferring and I get the error.
    What's interesting is that this is not the immediate "insufficient privileges" error you get when you try to transfer a genuinely forbidden folder or file; I've seen that. Here, the error pops up only after about a gig of data has already transferred.

  • Problems copying large files to external disks

    I have a lot of media and so multiple USB and network based external hard disks.
    I'm having trouble with two in particular that are recent buys. I initially mounted them via a USB hub onto my Time Capsule, but when I had errors, I've now tried mounting them directly to my MacBook Air - and for comparison directly to a Windows laptop.
    And I'm only having problems when it's a Mac doing the copying (MBA to either USB mounted via Time Capsule or directly USB mounted on the MBA).
    The problem is that the drive appears to behave OK for initial copies - but I'm trying to put a set of old movies (captured from a VCR ages ago that I'd recorded off TV) onto one of the drives and (a) it takes ages to copy and (b) eventually I get a write failure. The specific error message is
    The Finder can't complete the operation because some data in "" can't be written. (Error code -36)
    I"ve tried a whole variety of setups - as I've said via Time Capsule and directly mounted. I also wondered if the file system on the drive would make a difference. Out of the box it was formatted with FAT32 and when that failed I've now reformatted with MacOS file system - which I would have thought would give better compatibility (and I've read that FAT32 has a large file size limit - although I think it's 4GB and while I do have one file at 4.04GB, it's not failing the copy on that file).
    I've also connected the drive (when formatted FAT32) to a Windows laptop and (a) it copies faster and (b) it copies successfully.
    Any suggestions for this? Is there some kind of large file (all are >1GB) copy issue with USB mounted devices in OSX?

    As I mentioned in my original post, while the disks were originally formatted FAT32, I reformatted them and changed them to Mac OS Extended (Journaled), so that isn't the issue. I still have the problem with the disks using the Apple format.
    I've noticed that if I do the copy in pieces, it seems to work. That is, if I select the full set of 45 GB in one copy-and-paste operation (dragging and dropping using Finder), it fails part way through.
    But if I copy 3 or 4 movies at a time, going back until I have copied all of them, then I can get all of them onto the disk.
    It suggests to me that it's some kind of issue with copying and pasting a very large amount of data in Snow Leopard?

  • Large Data file problem in Oracle 8.1.7 and RedHat 6.2EE

    I've installed RedHat 6.2EE (Enterprise Edition Optimized for Oracle8i) and Oracle EE 8.1.7. I am able to create very large files (> 2 GB) using standard commands such as 'cat', 'dd', etc. However, when I create a large data file in Oracle, I get the following error messages:
    create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
    extent management local autoallocate;
    create tablespace ts datafile '/data/u1/db1/data1.dbf' size 10000M autoextend off
    ERROR at line 1:
    ORA-19502: write error on file "/data/u1/db1/data1.dbf", blockno 231425
    (blocksize=8192)
    ORA-27069: skgfdisp: attempt to do I/O beyond the range of the file
    Additional information: 231425
    Additional information: 64
    Additional information: 231425
    Does anyone know what's wrong?
    Thanks
    david

    I've finally solved it!
    I downloaded the following jre from blackdown:
    jre118_v3-glibc-2.1.3-DYNMOTIF.tar.bz2
    It's the only one that seems to work (and god, have I tried them all!)
    I've no idea what the DYNMOTIF means (apart from being something to do with Motif - but you don't have to be a linux guru to work that out ;)) - but, hell, it works.
    And after sitting in front of this machine for 3 days trying to deal with Oracle's frankly PATHETIC install, which is so full of holes and bugs, that's all I care about.
    The JRE bundled with Oracle 8.1.7 doesn't work with RedHat Linux 6.2EE.
    Doesn't Oracle test their software?
    Anyway I'm happy now, and I'm leaving this in case anybody else has the same problem.
    Thanks for everyone's help.

  • SFTP MGET of large files fails - connection closed - problem with spool file

    I have a new SFTP job to get files from an FTP server. The files are large (80 MB, 150 MB). I can get smaller files from the FTP site with no issue, but when attempting the larger files the job completes abnormally after 2 min 1 sec each time. I can see the file is created on our local file system with 0 bytes; then, when the FTP job fails, the 0-byte file is deleted.
    Is there a limit to how large an ftp file can be in Tidal?  How long an ftp job can run?
    The error in the job audit is Problem with spool file for job XXXX_SFTPGet and an exit code of 127 (whatever that is).
    In the log, the error is that the connection was closed.  I have checked with the ftp host and their logs show that we are disconnecting unexpectedly also.
    Below is an excerpt from the log
    DEBUG [SFTPMessage] 6 Feb 2015 14:17:33.055 : Send : Name=SSH_FXP_STAT,Type=17,RequestID=12
    DEBUG [SSH2Channel] 6 Feb 2015 14:17:33.055 : Transmit 44 bytes
    DEBUG [ChannelDataWindow] 6 Feb 2015 14:17:33.055 : Remote window size decreased to 130808
    DEBUG [PlainSocket] 6 Feb 2015 14:17:33.071 : RepeatCallback received 84 bytes
    DEBUG [SSH2Connection] 6 Feb 2015 14:17:33.071 : ProcessPacket pt=SSH_MSG_CHANNEL_DATA
    DEBUG [SFTPMessageFactory] 6 Feb 2015 14:17:33.071 : Received message (type=105,len=37)
    DEBUG [SFTPMessageStore] 6 Feb 2015 14:17:33.071 : AddMessage(12) - added to store
    DEBUG [SFTPMessage] 6 Feb 2015 14:17:33.071 : Reply : Name=SSH_FXP_ATTRS,Type=105,RequestID=12
    DEBUG [SFTPMessage] 6 Feb 2015 14:17:33.071 : Send : Name=SSH_FXP_OPEN,Type=3,RequestID=13
    DEBUG [SSH2Channel] 6 Feb 2015 14:17:33.071 : Transmit 56 bytes
    DEBUG [ChannelDataWindow] 6 Feb 2015 14:17:33.071 : Remote window size decreased to 130752
    DEBUG [PlainSocket] 6 Feb 2015 14:17:33.087 : RepeatCallback received 52 bytes
    DEBUG [SSH2Connection] 6 Feb 2015 14:17:33.087 : ProcessPacket pt=SSH_MSG_CHANNEL_DATA
    DEBUG [SFTPMessageFactory] 6 Feb 2015 14:17:33.087 : Received message (type=102,len=10)
    DEBUG [SFTPMessageStore] 6 Feb 2015 14:17:33.087 : AddMessage(13) - added to store
    DEBUG [SFTPMessage] 6 Feb 2015 14:17:33.087 : Reply : Name=SSH_FXP_HANDLE,Type=102,RequestID=13
    DEBUG [SFTPMessage] 6 Feb 2015 14:17:33.087 : Send : Name=SSH_FXP_READ,Type=5,RequestID=14
    DEBUG [SSH2Channel] 6 Feb 2015 14:17:33.087 : Transmit 26 bytes
    DEBUG [ChannelDataWindow] 6 Feb 2015 14:17:33.087 : Remote window size decreased to 130726
    DEBUG [PlainSocket] 6 Feb 2015 14:17:33.118 : RepeatCallback received 0 bytes
    DEBUG [SFTPChannelReceiver] 6 Feb 2015 14:17:33.118 : Connection closed:  (code=0)
    ERROR [SFTPMessageStore] 6 Feb 2015 14:17:33.118 : Disconnected unexpectedly ( [errorcode=0])
    ERROR [SFTPMessageStore] 6 Feb 2015 14:17:33.118 : EnterpriseDT.Net.Ftp.Ssh.SFTPException:  [errorcode=0]
    ERROR [SFTPMessageStore] 6 Feb 2015 14:17:33.118 :    at EnterpriseDT.Net.Ftp.Ssh.SFTPMessageStore.CheckState()
    ERROR [SFTPMessageStore] 6 Feb 2015 14:17:33.118 :    at EnterpriseDT.Net.Ftp.Ssh.SFTPMessageStore.GetMessage(Int32 requestId)

    I believe there is a limitation on FTP, and what you are seeing is a timeout built into the third-party library that Tidal uses (I believe it was hardcoded and would be a big deal to change, but this was before Cisco purchased Tidal). There may have been a tagent.ini setting that tweaks that, but I can't find any details.
    We wound up purchasing our own FTP software (Ipswitch MOVEit Central & DMZ) because we also had the need to host as well as get/put to other FTP sites. It now does all our FTP and internal file delivery activity (we use its API and call it from Tidal if we need to trigger it inside a workflow).

  • Problem while processing large files

    Hi
    I am facing a problem while processing large files.
    I have a file which is around 72 MB. It has more than 100,000 records. XI is able to pick the file up if it has 30,000 records. If the file has more than 30,000 records, XI picks the file up (and deletes it once picked), but I don't see any information under SXMB_MONI: no error, no success, no processing status. It simply picks up and ignores the file. If I process these records separately, it works.
    How do I process this file? Why is XI simply ignoring it? How do I solve this problem?
    Thanks & Regards
    Sowmya.

    Hi,
    XI picks the file up subject to a maximum processing limit as well as the memory and resource consumption of the XI server.
    Processing a 72 MB file is on the higher side; it increases the memory utilization of the XI server, and processing may fail at the peak.
    You should divide the file into smaller chunks and run multiple instances, for example along the lines of the sketch below. It will be faster and will not create any problems.
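    If the file is a simple record-per-line flat file, a small splitter like this can create the chunks before XI picks them up (just a sketch: the file names, the 30,000-record chunk size and the one-record-per-line assumption are only examples, adjust them to your actual format):

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.FileReader;
    import java.io.FileWriter;

    public class FileSplitter {
        public static void main(String[] args) throws Exception {
            final int RECORDS_PER_CHUNK = 30000;   // the size XI handled without trouble
            BufferedReader in = new BufferedReader(new FileReader("bigfile.txt"));
            BufferedWriter out = null;
            String line;
            int count = 0, chunk = 0;
            while ((line = in.readLine()) != null) {
                if (count % RECORDS_PER_CHUNK == 0) {          // start a new chunk file
                    if (out != null) out.close();
                    out = new BufferedWriter(new FileWriter("bigfile_part" + (chunk++) + ".txt"));
                }
                out.write(line);
                out.newLine();
                count++;
            }
            if (out != null) out.close();
            in.close();
        }
    }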
    Refer
    SAP Network Blog: Night Mare-Processing huge files in SAP XI
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
    Processing huge file loads through XI
    File Limit -- please refer to SAP note: 821267 chapter 14
    File Limit
    Thanks
    swarup

  • PROBLEM SAVING LARGE FILES ON ACROBAT READER XI

    I downloaded a largish document (70-100 MB) from the online storage folder of a friend, but am unable to save it on my PC. I have the recent version of Acrobat Reader, 11.0.5, on a PC running Windows 8, and it opens this or any other PDF file without problem. But as soon as I go to save the large file, the programme crashes with a 'There's a problem' notice. I then discover it has saved a file name but without file content. Any ideas how I can tackle this problem? Colin

    "Online storage" - does that mean a website that is accessed via a browser?  If so, you can download the document by right-clicking on the link to the PDF, then select 'Save link as' or 'Save target as'.

  • I have to send large files like 5 MB or bigger via mobile phone, but in those cases the phone tells me to plug in power. And when I do that, there is no problem sending the file. For smaller files there is no problem. How can I solve the problem? Because wh

    I have to send large files like 5 MB or bigger via mobile phone, but in those cases the phone tells me to plug in power. And when I do that, there is no problem sending the file. For smaller files there is no problem. How can I solve the problem? Because when I'm not at home I can't plug in the power.

    hi,
    I am sending a file from server to client, i.e. the client will request a file and the service will send it back. There is no socket connection; I am using JBoss and Apache Axis.
    Please help me out.
    Rashi

  • Large file transfer problems over client/server socket

    Hi,
    I wrote a simple chat program in which I can send files from client to client. The problem is when I send large files over 101 MB, the transfer freezes. I do not even get any error messages in the console. The files I am sending are of any type (e.g. mp3s, movies, etc.). I am serializing the data into a "byteArray[packetSize]" and sending the file packet by packet until the whole file has been sent and reconstructed on the other side. The process works perfectly for files smaller than 101 MB, but for some reason freezes after that if the file is larger. I have read many forums and there aren't too many solutions out there. I made sure to use the .reset() method to reset my ObjectOutputStream each time I write to it.
    Here's my file sending code:
    byte[] byteArray = new byte[defaultPacketSize];
    numPacketsRequired = Math.ceil(fileSize / defaultPacketSize);
    try {
        int i = 0;
        reader = new FileInputStream(filePath);
        while (reader.available() > 0) {
            if (reader.available() < defaultPacketSize) {
                byte[] lastPacket = new byte[reader.available()];
                reader.read(lastPacket);
                try {
                    if (socket == null || output == null) {
                        throw new SocketException("socket does not exist");
                    }
                    output.writeObject(lastPacket);
                    output.reset();
                    output.writeObject("DONE");
                    output.reset();
                    output.close();
                    socket.close();
                } catch (Exception e) {
                    System.out.println("Exception ***: " + e);
                    output.close();
                    socket.close();
                }
            } else {
                reader.read(byteArray);
                try {
                    if (socket == null || output == null) {
                        throw new SocketException("socket does not exist");
                    }
                    output.writeObject(byteArray);
                    output.reset();
                } catch (Exception e) {
                    System.out.println("Exception ***: " + e);
                    output.close();
                    socket.close();
                }
            }
        }
        reader.close();
    } catch (Exception e) {
        System.out.println("COULD NOT READ PACKET");
    }
    Here's my file receiving code:
    try {
        // The message from the client
        Object streamInput;
        FileOutputStream writer;
        byte[] packet;
        while (true) {
            streamInput = input.readObject();
            if (streamInput instanceof byte[]) {
                packet = (byte[]) streamInput;
                try {
                    writer = new FileOutputStream(outputPath, true);
                    writer.write(packet); // Storing the bytes on file
                    writer.close();
                } catch (Exception e) {
                    System.out.println("Exception: " + e);
                }
            } else if (streamInput.equals("DONE")) {
                socket.close();
                input.close();
                break;
            }
        }
    } catch (Exception e) {
    }
    I'm looking for any way I can possibly send large files from client to client without having it freeze. Are there any better ways to transfer files other than sockets? I don't really want FTP. I think I want to keep it HTTP.
    Any suggestions would be helpful.Thanks!
    Evan

    I've taken a better look at the code you posted, and there is one problem with the receiving code. You keep repeatedly opening and closing the FileOutputStream. This is not going to be efficient, as the file will keep needing to be positioned to its end.
    Yes, sorry, I did change that code so that it leaves the file open until it is completely done writing. Basically I have a progress bar that records how far along in the writing process the client is, and when the progress bar reaches 100%, meaning the file is complete, the file.close() method is invoked. Sorry about that.
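    For reference, the receiving loop with the stream opened only once looks roughly like this (just a sketch; it assumes the same input, socket and outputPath variables as the code above, plus imports for java.io.FileOutputStream, java.io.ObjectInputStream and java.net.Socket):

    private void receiveFile(ObjectInputStream input, Socket socket, String outputPath) throws Exception {
        FileOutputStream writer = new FileOutputStream(outputPath);   // opened once
        try {
            while (true) {
                Object streamInput = input.readObject();
                if (streamInput instanceof byte[]) {
                    writer.write((byte[]) streamInput);               // append to the open stream
                } else if ("DONE".equals(streamInput)) {
                    break;                                            // sender is finished
                }
            }
        } finally {
            writer.close();                                           // close once, at the very end
            input.close();
            socket.close();
        }
    }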
    I also ran some memory tests using the Runtime.getRuntime().totalMemory() and Runtime.getRuntime().freeMemory() methods. I put these calls inside the loop where I read in the file and send it to the client. Here's the output:
    Sender's free memory: 704672
    File reader read 51200 bytes of data.
    767548 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 702968
    File reader read 51200 bytes of data.
    716348 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 701264
    File reader read 51200 bytes of data.
    665148 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 699560
    File reader read 51200 bytes of data.
    613948 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 697856
    File reader read 51200 bytes of data.
    562748 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 696152
    File reader read 51200 bytes of data.
    511548 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 694448
    File reader read 51200 bytes of data.
    460348 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 692744
    File reader read 51200 bytes of data.
    409148 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 691040
    File reader read 51200 bytes of data.
    357948 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 689336
    File reader read 51200 bytes of data.
    306748 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 687632
    File reader read 51200 bytes of data.
    255548 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 685928
    File reader read 51200 bytes of data.
    204348 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 684224
    File reader read 51200 bytes of data.
    153148 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 682520
    File reader read 51200 bytes of data.
    101948 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 680816
    File reader read 51200 bytes of data.
    50748 left to read.
    Sender's runtime memory: 2818048
    Sender's free memory: 679112
    File reader read 50748 bytes of data.
    0 left to read.
    Creating last packet of size: 50748
    Last packet size after setting it equal to byteArray: 50748
    Here's the memory stats from the receiver:
    Receiver's free memory: 639856
    Receiver's total memory: 2842624
    Receiver's free memory: 638920
    Receiver's total memory: 2842624
    Receiver's free memory: 637984
    Receiver's total memory: 2842624
    Receiver's free memory: 637048
    Receiver's total memory: 2842624
    Receiver's free memory: 636112
    Receiver's total memory: 2842624
    Receiver's free memory: 635176
    Receiver's total memory: 2842624
    Receiver's free memory: 634240
    Receiver's total memory: 2842624
    Receiver's free memory: 633304
    Receiver's total memory: 2842624
    Receiver's free memory: 632368
    Receiver's total memory: 2842624
    Receiver's free memory: 631432
    Receiver's total memory: 2842624
    Receiver's free memory: 630496
    Receiver's total memory: 2842624
    Receiver's free memory: 629560
    Receiver's total memory: 2842624
    Receiver's free memory: 628624
    Receiver's total memory: 2842624
    Receiver's free memory: 627688
    Receiver's total memory: 2842624
    Receiver's free memory: 626752
    Receiver's total memory: 2842624
    Receiver's free memory: 625816
    Receiver's total memory: 2842624
    Receiver's free memory: 624880
    Receiver's total memory: 2842624
    Receiver's free memory: 623944
    Receiver's total memory: 2842624
    Receiver's free memory: 623008
    Receiver's total memory: 2842624
    Receiver's free memory: 622072
    Receiver's total memory: 2842624
    Receiver's free memory: 621136
    Receiver's total memory: 2842624
    Receiver's free memory: 620200
    Receiver's total memory: 2842624
    Receiver's free memory: 619264
    Receiver's total memory: 2842624
    Receiver's free memory: 618328
    Receiver's total memory: 2842624
    Receiver's free memory: 617392
    Receiver's total memory: 2842624
    Receiver's free memory: 616456
    Receiver's total memory: 2842624
    Receiver's free memory: 615520
    Receiver's total memory: 2842624
    Receiver's free memory: 614584
    This is just a sample of both the receiver's and sender's stats. Everything appears to be fine! Hope this post isn't too long.
    Thanks!

  • WRT110 router - FTP problem - Timeouts on large files

    When I upload files via FTP, if the transfer of a file takes more than 3 minutes, my FTP program doesn't jump to the next file to upload when the previous one has just finished.
    The problem occurs only if I connect to the internet behind the WRT110 router.
    I have tried the FileZilla, FTPRush and FlashFXP FTP programs to upload via FTP.
    In FileZilla, sending keep-alives doesn't help; it only works in FlashFXP, but I have a trial version.
    On FileZilla's site I can read this:
    http://wiki.filezilla-project.org/Network_Configuration#Timeouts_on_large_files
    Timeouts on large files
    If you can transfer small files without any issues, but transfers of larger files end with a timeout, a broken router and/or firewall exists between the client and the server and is causing a problem.
    As mentioned above, FTP uses two TCP connections: a control connection to submit commands and receive replies, and a data connection for actual file transfers. It is the nature of FTP that during a transfer the control connection stays completely idle.
    The TCP specifications do not set a limit on the amount of time a connection can stay idle. Unless explicitly closed, a connection is assumed to remain alive indefinitely. However, many routers and firewalls automatically close idle connections after a certain period of time. Worse, they often don't notify the user, but just silently drop the connection.
    For FTP, this means that during a long transfer the control connection can get dropped because it is detected as idle, but neither client nor server are notified. So when all data has been transferred, the server assumes the control connection is alive and sends the transfer confirmation reply. Likewise, the client thinks the control connection is alive and waits for the reply from the server. But since the control connection got dropped without notification, the reply never arrives and eventually the connection times out.
    In an attempt to solve this problem, the TCP specifications include a way to send keep-alive packets on otherwise idle TCP connections, to tell all involved parties that the connection is still alive and needed. However, the TCP specifications also make it very clear that these keep-alive packets should not be sent more often than once every two hours. Therefore, with added tolerance for network latency, connections can stay idle for up to 2 hours and 4 minutes.
    However, many routers and firewalls drop connections that have been idle for less than 2 hours and 4 minutes. This violates the TCP specifications (RFC 5382 makes this especially clear). In other words, all routers and firewalls that drop idle connections too early cannot be used for long FTP transfers. Unfortunately, manufacturers of consumer-grade routers and firewalls do not care about specifications; all they care about is getting your money (and they only deliver barely working, lowest-quality junk).
    To solve this problem, you need to uninstall affected firewalls and replace faulty routers with better-quality ones.
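    If the uploads are ever scripted rather than done through a GUI client, the same workaround (keep-alive traffic on the otherwise idle control connection) can be applied in code. A rough sketch with Apache Commons Net, which is not one of the clients mentioned above, so the host, credentials and file names are placeholders:

    import java.io.FileInputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;

    public class KeepAliveUpload {
        public static void main(String[] args) throws Exception {
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");               // placeholder host
            ftp.login("user", "password");                // placeholder credentials
            ftp.setFileType(FTP.BINARY_FILE_TYPE);
            ftp.enterLocalPassiveMode();
            // Send a NOOP on the idle control connection every 60 seconds during a transfer,
            // so a router that drops idle connections after ~3 minutes keeps it open.
            ftp.setControlKeepAliveTimeout(60);
            FileInputStream in = new FileInputStream("bigfile.iso");
            try {
                ftp.storeFile("bigfile.iso", in);
            } finally {
                in.close();
                ftp.logout();
                ftp.disconnect();
            }
        }
    }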


  • Printing problem with ADS on AS Java when printing large files

    hi all
    we have a Web Dynpro for Java application running on AS Java 7.0 SP18 with Adobe Document Services. If we print small files, everything works fine; with large files it fails with the following error (after around 2 minutes).
    Any ideas?
    #1.5#869A6C5E590200710000092C000B20D000046E16922042E2#1246943126766#com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl#sap.com/tcwddispwda#com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl#KRATHHO#8929##sabad19023_CHP_5307351#KRATHHO#63a60b106ab311de9cb4869a6c5e5902#SAPEngine_Application_Thread[impl:3]_15##0#0#Error#1#/System/Server/WebRequests#Plain###application [webdynpro/dispatcher] Processing HTTP request to servlet [dispatcher] finished with error.
    The error is: com.sap.tc.webdynpro.clientserver.adobe.pdfdocument.base.core.PDFDocumentRuntimeException: Failed to UPDATEDATAINPDF
    Exception id: [869A6C5E590200710000092A000B20D000046E1692201472]#

    Hello
    On which support package level is the Java stack?
    kr,
    andreas
