WRT350N Large File Issue

I am connecting wirelessly to my WRT350N using the WPC300N wireless card, on XP Pro. If I try to back up my laptop, either using the USB storage connection or using a gigabit NAS, the wireless connection either degrades to 1 Mbps or drops off completely. This happens whether I use a firewall or not.
If I connect using a slower NAS, this does not happen. Any idea what the problem may be, or whether there is a setting that can be adjusted?
Thanks.


Similar Messages

  • TDMS viewer + large files issue

    I built a VI that stores large [up to 50x50] 2D arrays continuously; every new 2D array is stored as a new group in a TDMS file.
    The VI streams out the data fine; however, when it finishes and the TDMS Viewer pops up, LabVIEW hangs and [seemingly] does nothing. The TDMS files that the viewer tries to display can easily contain over 10^6 2D arrays if the measurements have been running for days, for example.
    Is there a clever way to make the TDMS Viewer display my file, or at least, say, a quarter of it?
    Thanks,
    Krivan

    Also, if I use the attached code and try to open one of these large files, LabVIEW hangs [all open LV windows]; the program never enters the while loop because the TDMS Open VIs do nothing.
    However, if I run this program with a very small TDMS file [~300 kB] containing only a couple of thousand matrices, it runs without any problem.
    I haven't tried the defragmenter so far because it was reported here to be inadvisable.
    I also tried to read only a small portion of the files by wiring 1000 to the read VI, but [with the large files] the problem remains the same: the operation never gets past the TDMS Open VI.
    Is there a clever way to handle large TDMS files?
    Thanks,
    Krivan
    Attachments:
    read_average_TDMS_difference_plots.vi (22 KB)

  • Stor.E TV+ Samba Large Files Issue

    Hi,
    Using Samba I can copy small files to my Stor.E TV+ but cannot copy files larger than about 500 MB. Is there any update to the software/BIOS that resolves this? I can't see any available downloads on the Support Pages for this device. Can we configure the Samba share ourselves at all?
    I need to be able to copy large files over Samba so that I don't have to attach the device locally to a PC via USB each time I want to copy my large digital files over to the media player.
    Thanks

    Hi
    I think this could be due to the energy-saving settings of the computer, or the NAS may have interrupted the connection.
    If the energy-saving settings of your laptop or NAS are active, they can go into an idle state or sleep mode even while they are being accessed over the network.
    I would recommend adjusting the settings of the media source to prevent it from going to sleep while transferring files over the network.
    You could also try different connection modes:
    - LAN setup
    - WLAN setup (ad-hoc connection)
    - PPPoE setup (if your device is connected to the network via DSL)
    I think it would be interesting to know what connection is affected.
    Good luck mate

  • Acrobat 9 Large File Issues?

    I work with very large Excel files that I need to turn into PDFs. Very large meaning 30,000 to 50,000 rows in the spreadsheet; once turned into a PDF it is 586 pages long. I had Adobe Acrobat 9 Standard and began to experience issues. If I was in the Excel spreadsheet and clicked Acrobat > Create PDF, the process would take 10-15 minutes and the resulting PDF was not correct (in Excel, I would do a page-break preview and the last column would be indicated as falling on a new page; I would drag the line to include the last column and save the file that way, but the PDF created would still have that last column as a separate page, thus creating over 1,000 pages in the PDF). Whatever I did, I could not get it to work. However, if I selected Print and chose Adobe Acrobat as the printer, it would create a perfect-looking PDF in about a minute. Much faster, and it looked correct. The issue I found, however, was that when searching for words in that PDF using the PDF search, some of the words were findable and some it indicated could not be found, even though the words were in fact in the PDF.
    I thought that perhaps I was beyond the capabilities of Acrobat 9 Standard, so I installed Acrobat 9 Professional. Now I can in fact create a PDF directly from Excel. It takes about 5 minutes, but it completes, it looks good, and when I search for words it now finds all of them. However, as a test, I chose the print method, which again took less than a minute and produced a perfect-looking PDF; when doing the word searches on that one, it again did not find some of the words.
    Does anyone have any idea why? I would much prefer to create the PDF using Print because it is so much faster, but I have to have search find all the words.

    Are those poorly searchable PDFs confined to Excel source docs, or does the problem also occur when starting from other Office apps such as Word?
    FWIW, I once saw a web review of a free PDF writer utility that gave searchable PDFs from word-processor apps but not from Excel. Maybe some pointers there?
    http://www.techsupportalert.com/best-free-pdf-writer.htm
    Just a wild-assed guess (I don't even have Pro 9 or Office 2007): do those missing words in the poorly searchable PDFs contain ligatures, where text character strings such as 'fi' and 'fl' are replaced by single, special characters? Possibly such words remain invisible to your PDF text search because you are typing the plain-vanilla text version into the Acrobat search box?

  • Photoshop Elements 7 saving large files issue

    I use Photoshop Elements 7 and recently bought a new camera (24 megapixels). PSE7 is not happy; I am having lots of issues saving my work. Is anyone else having these issues?

    DJ, my computer runs Windows 8; below are the system requirements for PSE12 - looks like it will run in 64-bit. Thanks for pointing me in the right direction!
    No, you have seriously misunderstood what I said. PSE will not run as a 64-bit application under Windows. Even though you have a 64-bit version of Windows, PSE will NOT run as a 64-bit application; it will run as a 32-bit application under 64-bit Windows, giving it a maximum of approximately 3 GB of memory to work with. I don't know whether that will make your problems go away or not. Please download the 30-day free trial of PSE 12 and see if your problems go away before you make a purchase.
    From the system requirements
    all other applications run native on 32-bit operating systems and in 32-bit compatibility mode on 64-bit operating systems

  • Streaming large files issue (flash streaming media server)

    We are streaming a one-hour F4V from streaming media server 3.5.2 and for some reason it is seeing our one-hour video as being 10 hours long. We have tons of other videos and never ran into this problem with any of the other files. This is the only file that exceeds one hour. Any ideas? This occurs in the default player from the streaming server.

    Also, before anyone asks: we are using RTMP and the file type is F4V, not FLV.

  • Why does the Reduced Size PDF command create larger files?

    I was a happy user of Adobe Acrobat version 7 until Adobe ceased support for that version, and after my last Mac OS 10.6.8 update I was unable to print and do other things, so I was forced to upgrade to Adobe's latest Acrobat X, version 10.0.0. I had hoped to be able to produce Acrobat 3D files, but found out later that Adobe does not do that anymore and I would have to buy it as a plugin from Tetra 4D for another $399.00, oh well.
    I then wanted to reduce the file size of some larger PDFs I had edited. It took a while to locate the command, now in File > Save > Reduced Size PDF, and once I ran the command on my file it actually increased its size!! Why did this happen? Should I continue to use my disabled, Adobe-unsupported version 7 to get the superior performance I once had? This issue was mentioned in an earlier thread that started out asking where to locate this command, but the resulting larger-file issue remained unanswered. Currently unhappy with my purchase. Will Adobe offer a free upgrade to fix this?

    I agree with Peter. Use Pages '09, but keep a document migration strategy in place. When you create a Pages document in Pages '09, export it to Word .doc too. Export any current content from Pages v5.2.2, and then stop using it. That means no double-clicked documents.
    Pages v5 is not an evolution of Pages '09; it is an incomplete rewrite and uses a different internal document architecture than Pages '09 did. This internal architecture is to blame for the export translation bloat into Word's different document architecture. It does not matter whether you type a single character or multiple pages; you will get about a 500 KB .docx file.
    Opening and resaving this monster .docx file in MS Word from Office for Mac 2011, LibreOffice, or TextEdit will result in a very small .docx footprint.

  • Help with Aperture/T2i/iMovie/large file (>2 GB) issue.

    I am having an issue with a very large file (3.99 GB) that I shot with my Rebel T2i. The file imports fine into Aperture, but then it isn't recognized at all by iMovie. I found out that iMovie can only handle files that are 2 GB or smaller.
    So now, I am trying to figure out how to chop my mega file neatly into a pair of 2GB halves. When I use the trim function, that does not seem to do the trick -- this may be a case of Aperture's non-destructive nature actually working against me.
    Does anyone have a solution for this? My intuition suggests that this may be a job for QuickTime Pro -- but I wasn't sure how that works now that we all have QuickTime X.
    Much appreciated.

    The file may well be in the wrong format; can you tell us more about it? See This.

  • Issues opening very large files in PS 12.1 (64-bit) - do I need to upgrade?

    Hello, I am having issues opening very large files I created, one being 1,241,776 KB. I have PS 12.1, 64-bit version. I am not sure if the issues I am having are because of the PS version I have, and whether or not I have to upgrade?

    I think more likely, it's a memory / scratch disk issue.  1.25 gigabytes is a very big image file!!
    Nancy O.

  • [BUG] Performance Issue When Editing Large Files

    Are other people experiencing this with the latest JDeveloper releases (11.1.1.4.0 and the version before it)?
    People in the office have been experiencing extremely slow performance happening seemingly at random while editing Java files. We have been using JDev for almost 10 years and this has only become an issue for us recently. Typically we only use the Java editing and the database functionality in JDev.
    I have always felt that the issue was related to network traffic created by ClearCase and never really paid attention, but a few days ago, after upgrading to the latest version of JDev, I started for the first time having slowdowns that affect my speed of work, and decided to look into it.
    The main symptom is the editor hanging for an unknown reason in the middle of typing a line (even in a comment) or immediately after hitting the carriage return. All PCs in the office have 2 GB or more of RAM and are well within the recommended spec.
    I've been experimenting for a few days to try and determine what exactly is at the root of the slowness. Among the things I have tried:
    o cutting ClearCase out of the equation; not using it for the project filesystem; not connecting to it in JDev
    o Not using any features other than Java editing in JDev (no database connections)
    o never using split panes for side by side editing
    o downloading the light version of JDev
    o Increasing speed of all pop-ups/dynamic helpers to maximum in the options
    o disabling as many helpers and automations as possible in the options
    None of these have helped. Momentary freezes of 3-5 seconds while editing are common. My basic test case is simply to move the cursor from one line to another and type a simple one-line comment that takes up most of the line. I get the freeze most often right after typing the "//" to open the comment - it happens almost 100% of the time.
    I have, however, noticed a link to file size/complexity.
    If I perform my tests on a small/medium-sized file of about 1000 lines (20-30 methods), performance is always excellent.
    If I perform my test on one of our larger files of about 10,000 lines (more than 100 methods), the freezes while editing almost always occur.
    It looks like there is some processor-intensive work going on (which cannot be turned off via the options panel) that is taking control of the code editor and not completing in a reasonable amount of time on large Java files. I suspect it is somehow related to the gutter on the right-hand side, which shows little red and yellow marks for run-time reports of compile errors and warnings, and I haven't found any way to disable it so I can check.

    Just a small follow-up....
    It looks like the problem is happening on only a single Java file in our product! Unfortunately it happens to be the largest and most often updated file in the project.
    I'm still poking around to figure out why JDev is choking consistently on this one particular file and not on any of the others. The size/complexity is not much bigger than the next largest which can be edited without problems. The problem file is a little unusual in that it contains a large number of static functions and members.
    Nice little mystery to solve.

  • Rtorrent: issue with downloading large files (>4 GB) to NTFS

    Using the latest rtorrent/rutorrent: every time I download a large (>4 GB) file with rtorrent to the NTFS drive, it shows the whole file downloading MB by MB, but when I go to hash-check (via rutorrent), only a partial percentage is downloaded. Say I download a 4.36 GB .mkv file; I hash-check and only 10% is done, ~400 MB or about 6 minutes of the video.
    Oddly:
    If I do ls -l --block-size=MB, the file shows normal 4GB+ size.
    If I do ls -s, file appears to be only a few hundred MB.
    If I DL to my root ext4 drive, there's no issue unless I change the save path of the torrent in rutorrent and elect for the files to be moved to the NTFS drive.
    I've transferred large files with 'cp' from another NTFS to this NTFS with no issue.
    I thought the problem was rutorrent plugin autotools, but I removed it from my plugins folder and the problem persists.
    Permissions:
    I have all the relevant directories in /etc/php.ini open_basedir:  the user/session, the mounted drive, and /srv/http/rutorrent
    I did #chown -R http:http /srv/http/rutorrent
    http is a member of the group with NTFS drive access
    the rutorrent/tmp directory is changed to be within /srv/http/rutorrent
    This is a pesky issue that I didn't have with my last Arch install using the same general setup.
    I DL to an NTFS formatted drive and mount it the same way I did before: ntfs-3g defaults,auto,uid=XXXX,gid=XXXX,dmask=027,fmask=037
    My rtorrent user is the uid (owner) and is in the group that has access to the drive (along with my audio server user and http)
    I run rtorrent in screen as the rtorrent user
    I imagine this is an issue with rutorrent?
    Any tips before I reformat the whole 4TB to ext4?
    EDIT: the issue is definitely isolated to rtorrent. I manually added a large torrent using rtorrent and it completed. I then hash-checked (in rtorrent) and again only ~10% was shown as complete.
    EDIT2: It is most definitely not a permissions issue. I tried this again without the mount permissions options and the same thing happens.

    I'm afraid I don't understand the question.
    7.2 now correctly parses the Canon XF .CIF sidecar files to determine whether the media is supposed to be spanned or not. This is a feature request that has finally been addressed to work correctly.
    (It was also there in 7.1 and previous versions, but with limitations: the performance wasn't as good, there had been issues in the past with audio pops at cut points, and it required that the Canon XF folder structure remain intact, i.e. if you copied the media to a flattened folder structure, it would fail to do the spanning correctly.)
    If you are looking for a means to disable the automatic spanning, simply removing the .CIF files will achieve that, although I'm not sure I understand why you're looking to do that. Most people *want* spanning to happen automatically; otherwise you're forced to manually sync spanned media segments by hand.

  • CF8.01 Large Files Upload issue

    We are having an issue with posting large files to the server through CFFile. Our server is running on Windows 2003 R2 SP2 with 2 GB of RAM. The average upload size is 800 MB and we may run into multiple simultaneous uploads at that file size, so we have adjusted "Maximum size of post data" to 2000 MB and "Request Throttle Memory" to 5000 MB in the ColdFusion admin settings, hoping to allow up to 5 simultaneous uploads.
    However, when we tried to launch two 800 MB uploads at the same time from different machines, only one upload got through. The other one returned a "Cannot connect to the server" error after a few minutes. No errors can be found in the W3C log or the ColdFusion logs (coldfusion-out.log and exception.log), but it is reported in HTTPErr1.log with the following message:
    2008-04-18 08:16:11 204.13.x.x 3057 65.162.x.x 80 HTTP/1.1 POST /testupload.cfm - 1899633270 Timer_EntityBody DefaultAppPool
    Can anyone shed some light on it? Thanks!

    quote:
    Originally posted by: Newsgroup User
    Don't forget that your web server (IIS, Apache, etc.) can have upload throttles as well.
    We did not throttle our IIS to limit the upload/download bandwidth.
    Is there a maximum limit for the "Request Throttle Memory" setting? Can we set it over the available physical RAM size?

  • Large file download issue

    When we are downloading a large file (100 MB+) with Internet Explorer, the workstation will at some point report that the connection was reset by the server and we lose the download. This is kind of random and I've only seen it in IE. I have and use Firefox at my desk and have no trouble like this, but our campus standard is IE and I really would like to pin this down.
    My fear is the packet filters. I went and looked at the packet filters we have set up and was shocked to see we had 77 filter exceptions. My last look only showed 45, so I need to sit down, review every filter exception and thin the herd. Anyway, below are the exceptions for HTTP, which is what I believe IE uses when downloading files. Right? I wanted a second opinion from you guys.
    We have NW 6.5 SP5, BM 3.8 SP4 IR3, and I use Craig's tuneup and proxy.cfg.
    The packet type HTTP is defined as:
    Protocol TCP
    SRC Port ALL
    DEST Port 80
    ACK bit Filtering disabled
    Stateful Filtering enabled
    The filter exception are:
    #15
    Packet Type: HTTP
    Source: B57_1_EII
    All Circuits
    Any Address
    Destination: B57_2_EII
    All Circuits
    66.89.73.96/255.255.255.224
    Comment:
    #22
    Packet Type: HTTP
    Source: <All Interfaces>
    All Circuits
    Any Address
    Destination: CE1000_1
    All Circuits
    Any Address
    Comment:
    #24
    Packet Type: HTTP
    Source: B57_2_EII
    All Circuits
    Any Address
    Destination: <All Interfaces>
    All Circuits
    Any Address
    Comment: Added by BRDCFG to allow HTTP proxy.
    #67
    Packet Type: HTTP
    Source: B57_2_EII
    All Circuits
    12.22.25.0/255.255.255.240
    Destination: <All Interfaces>
    All Circuits
    192.168.3.7
    Comment: Inbound http for powerschool
    #75
    Packet Type: HTTP
    Source: B57_1_EII
    All Circuits
    192.168.3.6
    Destination: <All Interfaces>
    All Circuits
    Any Address
    Comment:
    Thanks
    David

    When I looked at the version level of the proxy.cfg I use (the one you wrote), it was version 21; I did download the current version. Would the differences between the versions cause any of the behavior I've seen? I didn't see much in there that looked too different.
    Other Comments inside the threaded message:
    > On your stateful HTTP exceptions, you are:
    > #15 Allowing all HTTP to network 66.89.73.96/255.255.255.224
    To allow the private LAN access to the public IP subnet range, for management of routers etc. that are outside of this firewall.
    > #22 Allowing all HTTP to cross interface CE1000_1 - if this is the public
    > interface, you are allowing all outbound HTTP without using a proxy. If
    > this is the private interface, you are allowing all inbound HTTP through
    > any static NAT connection.
    CE1000_1 is the interface that is connected to our DMZ. We host 3 web servers in the DMZ for public access. One is our GroupWise WebAccess.
    > #24 Allowing all HTTP as long as it originates from the B57_2 interface.
    > If the public IP address were also specified, it would be one of the
    > default exceptions intended to allow proxy to send HTTP requests out from
    > the server to the Internet. Because source/destination IP address is
    > missing, it effectively allows traffic into the B57_2 interface as well,
    > assuming the B57_2 interface is public.
    #24 here is the default added by BM. I don't think I ever touched this one.
    This is a set of filters that were imported from our earlier BM versions
    when we built this box. Sounds like it should have the Public IP address
    listed as the source IP. Would you suggest adding that in?
    > #67 Assuming B57_2 is public, you are allowing HTTP traffic to come from
    > a specific network, into a host at 192.168.3.7, via static NAT.
    Correct. We have a turnkey box with a Windows application that is managed by the vendor via HTTP; that is their IP range, the target IP is 192.168.3.7, and it holds a static NAT in the 66.89.73 range on the BM server.
    > #75 Hard for me to tell what this is doing, as it appears the B57
    > interface is your public interface. It will allow any HTTP from IP
    > address 192.168.3.6 to anywhere, but only if that address is on the B57
    > side of the server, which sounds incorrect. I'm guessing that the
    > 192.168.3.x network is on the CE1000 side of the server, in which case
    > this is a useless filter exception.
    The interface is B57_1_EII and this is the private LAN interface. The intent is to allow the spam filter box HTTP access to the internet without requiring a proxy. The box is unable to accept a proxy setting, and the way it gets its updates from the vendor is via HTTP.
    Thanks
    David

  • SharePoint Foundation 2013 Optimization For Large File Transfer?

    We are considering upgrading from  WSS 3.0 to SharePoint Foundation 2013.
    One of the improvements we want to see after the upgrade is a better user experience when downloading large files.  It can be done now, but it is not reliable.
    Our document library consists of mostly average sized Office documents, but it also includes some audio and video files and software installer package zip files ranging from 100MB to 2GB in size.
    I know we can change the settings to "allow" larger than default file downloads but how do we optimize the server setup to make these large file transfers work as seamlessly as possible? More RAM on the SharePoint Foundation server? Other Windows,
    SharePoint or IIS optimizations?  The files will often be downloaded from the Internet, so we will not have control over the download speed.

    SharePoint is capable of sending large files; it is a stateless HTTP system like any other website in that regard. Given that your server is sized appropriately for the amount of concurrent traffic you expect, I don't see any special optimizations required.
    Trevor Seward
    Follow or contact me at...
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

    I see information like this posted warning against doing it, as if large files are going to cause your SharePoint server and SQL to crash:
    http://blogs.technet.com/b/praveenh/archive/2012/11/16/issues-with-uploading-large-documents-on-document-library-wss-3-0-amp-moss-2007.aspx
    "Though SharePoint is meant to handle files that are up to 2 gigs in size, it is not practically feasible and not recommended as well."
    "Not practically feasible" sounds like a pretty dire warning to stay away from large files.
    I had seen some other links warning that large files in the SharePoint database cause problems with fragmentation and large amounts of wasted space that doesn't go away when files are removed, or that the server may run out of memory because downloaded files are held in RAM.
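    Not a SharePoint-specific optimization, but since the concern above is large HTTP downloads failing partway through, one common way to make them more tolerable is for the client to resume an interrupted transfer with an HTTP Range request instead of restarting it. Below is a minimal Java sketch under that assumption; it presumes the server honors byte ranges and returns 206 Partial Content (whether a given SharePoint document library does so is not verified here), and the URL and destination path arguments are placeholders:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.RandomAccessFile;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ResumableDownload {
        public static void main(String[] args) throws IOException {
            String url = args[0];   // source URL (placeholder)
            String dest = args[1];  // local destination file (placeholder)
            try (RandomAccessFile out = new RandomAccessFile(dest, "rw")) {
                long have = out.length();  // bytes kept from a previous, interrupted attempt
                HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
                if (have > 0) {
                    conn.setRequestProperty("Range", "bytes=" + have + "-");
                }
                int code = conn.getResponseCode();
                if (code == 206) {
                    out.seek(have);      // server honored the range: append to what we already have
                } else if (code == 200) {
                    out.setLength(0);    // ranges not supported: start over from scratch
                } else {
                    throw new IOException("Unexpected HTTP status: " + code);
                }
                try (InputStream in = conn.getInputStream()) {
                    byte[] buf = new byte[64 * 1024];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                }
            }
        }
    }

    If the connection is reset, re-running the same program picks the download up where it left off rather than pulling the whole file again.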

  • Loading large files in Java Swing GUI

    Hello Everyone!
    I am trying to load large files (more than 70 MB of XML text) into a Java Swing GUI. I tried several approaches:
    1) Byte-based loading with a loop similar to
    // assumes: import java.io.*; and that 'file' and 'pane' are already defined
    pane.setText("");
    InputStream file_reader = new BufferedInputStream(new FileInputStream(file));
    int BUFFER_SIZE = 4096;
    byte[] buffer = new byte[BUFFER_SIZE];
    int bytesRead;
    String line;
    while ((bytesRead = file_reader.read(buffer, 0, BUFFER_SIZE)) != -1) {
        line = new String(buffer, 0, bytesRead);
        pane.append(line);
    }
    file_reader.close();
    But this gives me unacceptable response times for large files and runs out of Java heap memory.
    2) I read in several places that I could load only small chunks of the file at a time, and when the user scrolls up or down the next/previous chunk is loaded. To achieve this I am guessing extensive manipulation of the scroll bar in the JScrollPane will be needed, or perhaps adding an external JScrollBar? Can anyone provide sample code for that approach? (Bearing in mind that I am writing code for an editor, so I will need to interact via clicks, mouse wheel rotation, keyboard buttons and so on...)
    If anyone can help me by posting sample code or pointing me to useful links that deal with this issue, or with writing code for editors in general, I would be very grateful.
    Thank you in advance.

    Hi,
    I'm replying to your question from another thread.
    To handle large files I used the new I/O (NIO) library. I'm trying to remember off the top of my head, but the classes involved were RandomAccessFile, FileChannel and MappedByteBuffer. The MappedByteBuffer was the best way for me to read and write to the file.
    When opening the file I had to scan through its contents using a Swing worker thread and a progress monitor. Whilst doing this I indexed the file into manageable chunks. I also created a cache to further optimise file access.
    In all it worked really well and I was surprised by the performance of the new I/O libraries. I remember loading 1 GB files, and whilst you had to wait a few seconds for the indexing, you wouldn't know that the data for the JList was being retrieved from a file while the application was running.
    Good Luck,
    Martin.
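    A minimal sketch of the memory-mapped approach described above, assuming a plain text file read in fixed-size windows; the class name, chunk size and main() wiring are illustrative, and the indexing, caching and Swing worker pieces Martin mentions are left out:

    import java.io.IOException;
    import java.io.RandomAccessFile;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.charset.StandardCharsets;

    public class MappedChunkReader {
        private final FileChannel channel;
        private final long fileSize;

        public MappedChunkReader(String path) throws IOException {
            RandomAccessFile raf = new RandomAccessFile(path, "r");
            channel = raf.getChannel();
            fileSize = channel.size();
        }

        // Map only the requested window of the file and decode it as text.
        // Note: a window boundary can split a multi-byte character; indexing on line breaks avoids this.
        public String readChunk(long offset, int length) throws IOException {
            int size = (int) Math.min(length, fileSize - offset);
            MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_ONLY, offset, size);
            byte[] bytes = new byte[size];
            buf.get(bytes);
            return new String(bytes, StandardCharsets.UTF_8);
        }

        public static void main(String[] args) throws IOException {
            MappedChunkReader reader = new MappedChunkReader(args[0]);
            // Display only the first 64 KB; later chunks are mapped on demand as the user scrolls.
            System.out.println(reader.readChunk(0, 64 * 1024));
        }
    }

    Because only the mapped window and its decoded String live on the heap at any one time, the memory cost is bounded by the chunk size rather than the file size, which is what makes the scroll-driven loading idea in point 2) workable.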
