Stor.E TV+ Samba Large Files Issue

Hi,
Using Samba I can copy small files to my Stor.E TV+, but I cannot copy files larger than roughly 500MB. Is there a software or BIOS update that resolves this? I can't see any downloads available on the support pages for this device. Can we configure the Samba share ourselves at all?
I need to be able to copy large files over Samba so that I don't have to attach the device to the PC over USB each time I want to copy my large digital files to the media player.
Thanks

Hi
I think this could be caused by the energy-saving settings of the computer, or by the NAS interrupting the connection.
If the energy-saving settings of your laptop or NAS are active, they can enter an idle or sleep state even while they are being accessed over the network.
I would recommend adjusting the settings of the media source so that it does not go to sleep while transferring files over the network (see the sketch after the list below).
You could also try different connection modes:
- LAN setup
- WLAN setup (ad-hoc connection)
- PPPoE setup (if your device connects to the network via DSL)
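On Windows, for example, you can temporarily stop the PC from sleeping on mains power before starting a long transfer (a minimal sketch using the built-in powercfg tool; the timeout values are illustrative, and the change applies to the active power plan):

powercfg /change standby-timeout-ac 0     (never sleep while on AC power)
powercfg /change standby-timeout-ac 30    (restore a 30-minute timeout afterwards)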
It would also be interesting to know which connection type is affected.
Good luck mate

Similar Messages

  • TDMS Viewer + large files issue

    I built a VI that stores large [up to 50x50] 2D arrays continuously; every new 2D array is stored as a new group in a TDMS file.
    The VI streams the data out fine; however, when it finishes and the TDMS Viewer pops up, LabVIEW hangs and [seemingly] does nothing. The TDMS files the viewer tries to display can easily contain over 10^6 2D arrays if the measurements have been running for days, for example.
    Is there a clever way to make the TDMS Viewer display my file, or at least a quarter of it?
    Thanks,
    Krivan

    Also, if I use the attached code and try to open such a large file, LabVIEW hangs [all open LabVIEW windows]; the program never enters the while loop because the TDMS Open VIs do nothing.
    However, if I run this program with a very small TDMS file [~300 kB] containing only a couple of thousand matrices, it runs without any problem.
    I haven't tried the defragmenter so far because it was reported here as inadvisable.
    I also tried to read only a small portion of the files by wiring 1000 to the read VI, but [for the large files] the problem remains the same: the operation never gets past the TDMS Open VIs.
    Is there a clever way to handle large TDMS files?
    Thanks,
    Krivan
    Attachments:
    read_average_TDMS_difference_plots.vi 22 KB

  • Acrobat 9 Large File Issues?

    I work with very large Excel files that I need to turn into PDFs - very large meaning 30,000 to 50,000 rows in the spreadsheet. Once turned into a PDF, it is 586 pages long. I had Adobe Acrobat 9 Standard and began to experience issues. If I was in the Excel spreadsheet and clicked Acrobat > Create PDF, the process would take 10-15 minutes and the resulting PDF was not correct. (In Excel, the page-break preview would show the last column falling on a new page. I would drag the line to include the last column and save the file that way, but the PDF created would still have that last column as a separate page, creating over 1,000 pages in the PDF.) Whatever I did, I could not get it to work. However, if I selected Print and chose Adobe Acrobat as the printer, it would create a perfect-looking PDF in about a minute - much faster, and it looked correct. The issue I found, however, was that when searching for words in that PDF using the PDF search, some of the words were findable and some it said could not be found, even though the words were in fact in the PDF.
    I thought that perhaps I was beyond the capabilities of Acrobat 9 Standard, so I installed Acrobat 9 Professional. Now I can, in fact, create a PDF directly from Excel. It takes about 5 minutes, but it completes and it looks good. When I search for the words, it now finds all of them. However, as a test, I chose the print method, which again took less than a minute and produced a perfect-looking PDF - but when doing the word searches again, it did not find some of the words.
    Does anyone have any idea why? I would much prefer to create the PDF using print because it is so much faster, but I have to have search find all the words.

    Are those poorly-searchable PDFs confined to Excel source docs, or does the problem also occur when starting from other Office apps such as Word?
    FWIW, I once saw a web review of a free PDF writer utility that gave searchable PDFs from word-processor apps but not from Excel. Maybe some pointers there?
    http://www.techsupportalert.com/best-free-pdf-writer.htm
    Just a wild-assed guess (I don't even have Pro 9 or Office 2007): do those missing words in the poorly-searchable PDFs contain ligatures, where character strings such as 'fi' and 'fl' are replaced by single, special characters? Such words might remain invisible to your PDF text search because you are typing the plain-vanilla text version into the Acrobat search box.
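    One quick way to test the ligature guess (an editorial sketch, not an Acrobat feature: it assumes the pdftotext utility from poppler-utils is installed, and the file name and search word are illustrative) is to extract the PDF's text layer and count lines containing a word spelled with plain letters versus with the U+FB01 'fi' ligature:

    pdftotext report.pdf - | grep -c 'file'         # lines with the plain f-i spelling
    pdftotext report.pdf - | grep -c $'\ufb01le'    # lines with the U+FB01 ligature (bash syntax)

    If only the ligature form matches the words that search can't find, the print-to-PDF path is embedding ligature glyphs, which would explain the failed searches.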

  • Photoshop Elements 7 saving large files issue

    I use Photoshop Elements 7 and recently bought a new camera (24 megapixels). PSE 7 is not happy; I am having lots of issues saving my work. Is anyone else having these issues?

    DJ, my computer is Windows 8; below are the system requirements for PSE 12 - looks like it will run in 64-bit. Thanks for pointing me in the right direction!
    No, you have seriously misunderstood what I said. Even though you have a 64-bit version of Windows, PSE will NOT run as a 64-bit application; it will run as a 32-bit application under 64-bit Windows, giving it a maximum of approximately 3 GB of memory to work with. I don't know whether that will make your problems go away. Please download the 30-day free trial of PSE 12 and see whether your problems go away before you make a purchase.
    From the system requirements
    all other applications run native on 32-bit operating systems and in 32-bit compatibility mode on 64-bit operating systems

  • Streaming large files issue (flash streaming media server)

    We are streaming a one-hour F4V from streaming media server 3.5.2, and for some reason it sees our one-hour video as being 10 hours long. We have tons of other videos and never ran into this problem with any of the other files. This is the only file that exceeds one hour. Any ideas? It occurs in the default player from the streaming server.

    Also, before anyone asks: we are using RTMP, and the file type is F4V, not FLV.
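    One first check worth making (an editorial sketch, assuming the ffprobe tool from FFmpeg is available; the file name is illustrative): ask what duration the container itself claims, since players generally trust the metadata rather than the actual stream length.

    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 onehour.f4v

    If this prints roughly 36000 (seconds) instead of 3600, the duration written into the F4V header is wrong, and re-muxing or re-encoding the file should fix the player's display.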

  • WRT350N Large File Issue

    I am connecting wirelessly to my WRT350N using the WPC300N wireless card on XP Pro. If I try to back up my laptop, either to the USB storage connection or to a gigabit NAS, the wireless connection either degrades to 1Mbps or drops off completely. This happens whether I use a firewall or not.
    If I connect using a slower NAS, this does not happen. Any idea what the problem may be, or whether there is a setting that could be adjusted?
    Thanks.

  • Why does the Reduced Size PDF command create larger files?

    I was a happy user of Adobe Acrobat version 7 until Adobe ceased support for that version, and I was unable to print and do other activities after my last Mac OS 10.6.8 update, so I was forced to upgrade to Acrobat X, version 10.0.0. I had hoped to be able to produce Acrobat 3D files, but found out later that Adobe does not do that anymore, and I would have to buy it as a plugin from Tetra 4D for another $399.00 - oh well. I then wanted to reduce the file size of some larger PDFs I had edited. It took a while to locate the command, now in File > Save > Reduced Size PDF; once I ran the command on my file, it actually increased its size! Why did this happen? Should I continue to use my disabled, Adobe-unsupported version 7 to get the superior performance I once had? This issue was mentioned in an earlier thread that started out asking where to locate this command, but the resulting larger-file issue remained unanswered. Currently unhappy with my purchase. Will Adobe offer a free upgrade to fix this?

    I agree with Peter. Use Pages '09, but keep a document migration strategy in mind. When you create a Pages document in Pages '09, export it to Word .doc too. Export any current content from Pages v5.2.2, and then stop using it. That means no double-clicked documents.
    Pages v5 is not an evolution of Pages '09; it is an incomplete rewrite and uses a different internal document architecture than Pages '09 did. This internal architecture is to blame for the export-translation bloat into Word's different document architecture. It does not matter whether you type a single character or multiple pages: you will get roughly a 500 KB .docx file.
    Opening and resaving this monster .docx file in MS Word in Office for Mac 2011, LibreOffice, or TextEdit will result in a very small .docx footprint.

  • Partition External HD to store large files

    Can I partition my WD My Book for Mac external hard drive so that I can move large files, such as movies, from my MacBook Pro to the newly created partition on the external HD and then delete them from the MacBook Pro? I assume I would need to format it Mac OS Extended (Journaled)? Also, I use Time Machine to back up the MacBook Pro to the external HD, so I have not downloaded the WD SmartWare software. Without downloading WD's software, can I manually place and retrieve the large files using the Finder? And how difficult would it be?
    Thanks in advance,
    AndrewMx17

    So I was able to shrink the partition on the external HD that I was using for Time Machine backups (so I wouldn't lose my backup), create another partition for the large files I would like to store, and then use the Finder to place and retrieve files. One thing I did encounter: the partition I was using to back up the MacBook was found to be corrupt, but the First Aid tool in Disk Utility was able to fix it so I could partition the drive.
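    For anyone doing the same from Terminal, the shrink-and-add step looks roughly like this (a sketch only; the disk identifier, sizes, and volume name are illustrative, so run diskutil list first and make sure the Time Machine backup is safe before resizing anything):

    diskutil list
    diskutil resizeVolume disk2s2 1.5T JHFS+ LargeFiles 500G

    The first command shows the backup volume's identifier (disk2s2 here is made up); the second shrinks that volume to 1.5 TB and adds a 500 GB Mac OS Extended (Journaled) volume named LargeFiles - JHFS+ is diskutil's code for that format, matching the format suggested above.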

  • Ok to store large files on desktop?

    Are there consequences (e.g., the machine running slowly) to storing large files, or numerous files, on the desktop?
    Sometimes I use the desktop to store photos until I complete a project. There are other files on the desktop that I could remove if they cause the computer to slow down. If not, I'll keep them there because it's easy and convenient.
    I have 4 GB of memory.

    Hi Junk,
    I can't see any problem with storing a few large files on your desktop; after all, from your system's point of view, your desktop is just another folder. Storing lots of files there, however, can decrease performance. But with 4 GB of memory, I'm not sure how much of a difference you would see, and there's a big gap between "I'm storing 15 photos on my desktop" and "I can no longer see my hard disk icon in this mess." If the idea of even a potential slowdown bothers you, create a work folder, leave it on your desktop, and drop all of those photos into it. If you're not getting too insane with the amount, though, I'm sure you'll be fine.
    Hope that helps!
    —Hazy

  • Help with Aperture/T2i/iMovie/large file (>2GB) issue.

    I am having an issue with a very large file (3.99 GB) that I shot with my Rebel T2i. The file imports fine into Aperture, but it isn't recognized at all by iMovie. I found out that iMovie can only handle files that are 2 GB or smaller.
    So now I am trying to figure out how to chop my mega-file neatly into a pair of 2 GB halves. The trim function does not seem to do the trick; this may be a case of Aperture's non-destructive nature actually working against me.
    Does anyone have a solution for this? My intuition suggests this may be a job for QuickTime Pro, but I wasn't sure how that works now that we all have QuickTime X.
    Much appreciated.

    The file may well be in the wrong format; can you tell us more about it? See This
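    If the format checks out and it really is just iMovie's 2 GB limit, one possible route (a sketch using FFmpeg rather than QuickTime Pro; the file names and split point are illustrative) is to cut the clip in two with a stream copy, which is fast and lossless:

    ffmpeg -i clip.mov -t 00:06:00 -c copy part1.mov
    ffmpeg -ss 00:06:00 -i clip.mov -c copy part2.mov

    The first command keeps the first six minutes, the second everything after that point. Because -c copy cuts on keyframes, the actual split point may shift slightly; re-encode instead if frame accuracy matters.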

  • Stor.E TV+ doesn't transfer large file or large selection of files

    Hi Everyone,
    I've just bought a 2TB Stor.E TV+ and I've found a very annoying problem related to file transfers from the PC to the HDD through the LAN (wired and/or wireless).
    The unit doesn't transfer large files, or large selections of files, and it hangs the copy process on the PC.
    Does anyone know if there's a firmware update or a workaround to solve this?
    Looking forward to hearing from someone.
    Thanks
    G.

    Hi
    As far as I know this was solved by a firmware update.
    You can find the firmware on the Toshiba European driver page:
    http://eu.computers.toshiba-europe.com/innovation/download_drivers_bios.jsp
    Here choose:
    - Options
    - Drive Devices
    - Multimedia HDDs
    - 3.5 inch StorE TV +

  • Testing an iMac in store, I checked iPhoto editing. A photo said to be 42MB pixellated immediately when zooming, so clearly a 42MB file was not being displayed. When I zoom in PS on my PC, a large file retains detail through a lot of zooming. What's going on?

    While testing a MacBook Pro in the Apple Store, I checked iPhoto. Looking at what was said to be a 42MB file, I noticed that zooming caused immediate pixellation, so clearly a 42MB RAW file was not what I was looking at. In PS Elements on my PC, when I zoom, the photo remains unpixellated until quite a lot of zoom is used (marked as 100%); this depends on the file size of the photo, of course. Maybe the photo displayed in iPhoto is a low-res JPEG. This is not too good for me. I wanted to see how the base model of the MacBook would handle photos, especially large files, where I need to zoom in. Will the base model cope with this in PSE for Mac?

    iPhoto will happily zoom to 300%. The quality at that magnification will depend on the quality of the image.  But if it's a sharp image - of any size - it will zoom to 300% with no problems.
    PSE for Mac has nothing to do with iPhoto. I have used it with no problems, but for best information why not ask over at the Adobe forums?
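    A side note on the underlying confusion: a file's size in MB says little about how far it can be zoomed; pixel dimensions are what matter, and an embedded low-resolution preview can come from a very large RAW. On a Mac you can check the real dimensions from Terminal (a sketch using the built-in sips tool; the file name is illustrative):

    sips -g pixelWidth -g pixelHeight photo.jpg

    If iPhoto was showing the RAW's embedded preview rather than the full image, immediate pixellation at modest zoom is exactly what you would see.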

  • Issues opening very large files in PS 12.1 (64-bit)

    Hello, I am having issues opening very large files I created, one being 1,241,776 KB. I have PS 12.1, the 64-bit version. I am not sure if the issues I am having are because of the PS version I have, and whether or not I need to upgrade.

    I think it is more likely a memory / scratch-disk issue. 1.25 gigabytes is a very big image file!
    Nancy O.

  • [BUG] Performance Issue When Editing Large Files

    Are other people experiencing this on the latest JDeveloper releases (11.1.1.4.0 and the version before it)?
    People in the office have been experiencing extremely slow performance, seemingly at random, while editing Java files. We have been using JDev for almost 10 years, and this has only become an issue for us recently. Typically we use only the Java editing and the database functionality in JDev.
    I had always felt that the issue was related to network traffic created by ClearCase and never really paid attention, but a few days ago, after upgrading to the latest version of JDev, I started having slowdowns that affect my speed of work, and I decided to look into it.
    The main symptom is the editor hanging for no apparent reason in the middle of typing a line (even in a comment) or immediately after hitting the carriage return. All PCs in the office have 2 GB or more of RAM and are well within the recommended spec.
    I've been experimenting for a few days to try and determine what exactly is at the root of the slowness. Among the things I have tried:
    o cutting ClearCase out of the equation; not using it for the project filesystem; not connecting to it in JDev
    o Not using any features other than Java editing in JDev (no database connections)
    o never using split panes for side by side editing
    o downloading the light version of JDev
    o Increasing speed of all pop-ups/dynamic helpers to maximum in the options
    o disabling as many helpers and automations as possible in the options
    None of these have helped. Momentary freezes of 3-5 seconds while editing are common. My basic test case is simply to move the cursor from one line to another and to type a simple one-line comment that takes up most of the line. I get the freeze most usually right after typing the "//" to open the comment - it happens almost 100% of the time.
    I have however noticed a link to the file size/complexity.
    If I perform my tests on a small/medium-sized file of about 1000 lines (20-30 methods), performance is always excellent.
    If I perform my test on one of our larger files of around 10000 lines (more than 100 methods), the freezes while editing almost always occur.
    It looks like there is some processor-intensive work going on (which cannot be turned off via the options panel) that takes control of the code editor and does not complete in a reasonable amount of time on large Java files. I suspect it is somehow related to the gutter on the right-hand side, which shows little red and yellow marks for run-time reports of compile errors and warnings; I haven't found any way to disable it so I can check.

    Just a small follow-up....
    It looks like the problem is happening on only a single Java file in our product! Unfortunately, it happens to be the largest and most frequently updated file in the project.
    I'm still poking around to figure out why JDev chokes consistently on this one particular file and not on any of the others. Its size/complexity is not much greater than that of the next-largest file, which can be edited without problems. The problem file is a little unusual in that it contains a large number of static functions and members.
    Nice little mystery to solve.
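    One diagnostic that might pin this down (a sketch using the standard JDK jps and jstack tools, assuming you can run them against the JVM hosting JDeveloper; the process ID is illustrative): capture a thread dump during one of the freezes and see what the editor's event thread is blocked on.

    jps -l                      # find the JDeveloper JVM's process id, e.g. 4242
    jstack 4242 > freeze.txt    # take the dump while the editor is frozen

    If the AWT event thread's stack in freeze.txt runs through a code-audit or analysis component, that would support the suspicion about the error/warning gutter.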

  • Rtorrent: issue with DL of large files (>4GB) to NTFS

    Using the latest rtorrent/rutorrent: every time I DL a large (>4GB) file with rtorrent to the NTFS drive, it shows the whole file downloading MB by MB, but when I go to hash-check (via rutorrent), only a partial percentage is actually downloaded. Say I DL a 4.36 GB .mkv file: I hash-check and only 10% is done, ~400MB or about 6 minutes of the video.
    Oddly:
    If I do ls -l --block-size=MB, the file shows the normal 4GB+ size.
    If I do ls -s, the file appears to be only a few hundred MB.
    If I DL to my root ext4 drive, there's no issue unless I change the save path of the torrent in rutorrent and elect for the files to be moved to the NTFS drive.
    I've transferred large files with 'cp' from another NTFS to this NTFS with no issue.
    I thought the problem was rutorrent plugin autotools, but I removed it from my plugins folder and the problem persists.
    Permissions:
    I have all the relevant directories in /etc/php.ini open_basedir:  the user/session, the mounted drive, and /srv/http/rutorrent
    I did #chown -R http:http /srv/http/rutorrent
    http is a member of the group with NTFS drive access
    the rutorrent/tmp directory is changed to be within /srv/http/rutorrent
    This is a pesky issue that I didn't have with my last arch install using the same general set up.
    I DL to an NTFS formatted drive and mount it the same way I did before: ntfs-3g defaults,auto,uid=XXXX,gid=XXXX,dmask=027,fmask=037
    My rtorrent user is the uid (owner) and is in the group that has access to the drive (along with my audio server user and http)
    I run rtorrent in screen as the rtorrent user
    I imagine this is an issue with rutorrent?
    Any tips before I reformat the whole 4TB to ext4?
    EDIT: the issue is definitely isolated to rtorrent. I manually added a large torrent using rtorrent, and it completed. I then hash-checked (in rtorrent), and again only ~10% was shown as complete.
    EDIT2: It is most definitely not a permissions issue. I tried this again without the mount permissions options, and the same thing happens.
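    The ls -l versus ls -s mismatch above is the classic signature of a sparse file: the apparent size is the full 4GB+, but only a few hundred MB of blocks were ever allocated, which lines up with the ~10% hash-check result. One way to confirm (a sketch using GNU coreutils; the file name is illustrative):

    du -h --apparent-size film.mkv    # logical size, what ls -l reports
    du -h film.mkv                    # blocks actually allocated on disk
    stat -c 'size=%s blocks=%b blocksize=%B' film.mkv

    If the allocated size is far below the apparent size, the file was created at full length but most of its data was never written, which points at rtorrent's preallocation/write path on the ntfs-3g mount rather than at permissions.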

    I'm afraid I don't understand the question.
    7.2 now correctly parses the Canon XF .CIF sidecar files to determine whether the media is supposed to be spanned or not. This is a feature request that has finally been addressed.
    (It was also there in 7.1 and earlier, but with limitations: the performance wasn't as good, there had been issues in the past with audio pops at cut points, and it required that the Canon XF folder structure remain intact, i.e. if you copied the media to a flattened folder structure, it would fail to do the spanning correctly.)
    If you are looking for a means to disable the automatic spanning, simply removing the .CIF files will achieve that, although I'm not sure I understand why you would want to. Most people *want* spanning to happen automatically; otherwise you're forced to manually sync spanned media segments by hand.
