Large file slow loading

For a large project that takes a long time to load from the
Internet, is there a way to configure or set things up so that the
movie starts playing while the balance is loaded?

Thanks, Rick.
The 'connectpreloader' didn't do anything different, but when
I tried the 'defaultpreloader' it did what I wanted: it started the
movie at 60% loaded... Cool.
My first check on this problem was to look in your Tips
book that I bought a couple of weeks ago, since you do cover so
much that isn't in the documentation... next time you update it,
you should add this, as I'm sure that many people are frustrated by
how long it takes their movies to load and this mitigates that
concern. BTW, your Tips book is great and well worth the money.
Who do I email at Adobe, if you know who would be
appropriate, to try to get them to have someone like you write
their documentation for Captivate? I've been in the software
publishing business for about 30 years and I've never run across
such pitiful documentation as Captivate's. MOST of the features and
options in the program aren't even referenced. As an example, this
preloader capability isn't discussed anywhere.
They are lucky that you monitor this forum and provide such
GREAT help for people.
Cary

Similar Messages

  • In making a "highlights" movie, using clips from different imported iMovie events, can I delete the larger iMovie event file from the Events browser and still work w the smaller clips in the Projects browser w/o having the larger files still loaded?

    I have successfully imported 150 Sony digital 8mm movies (each one hour in length) into iMovie as 150 iMovie events. I have since successfully converted them from their original 13 GB (.dv) files to exported smaller 1.3 GB (.m4v "large file") movies that I am happy with, using the iMovie Projects browser. So now I have 150 .m4v movies on my internal HD, as well as about half of my original raw-data .dv movies on my internal hard drive.
    Due to their large size (over 2 TB), I do not have all the larger raw-data (.dv) files on my 2 TB internal drive, just about half of them. What I want to do now is to create a new project in the Projects browser for each of my kids and reload, starting with Tape #1, each of these larger files and do a highlights movie for each of my kids, wherein I pick out smaller clips from each 1-hour .dv iMovie event and paste them into the appropriate kid's Highlights project in the Projects browser.
    Here's my question: if I load the first 5 large files back onto my internal HD, paste various shorter clips into each of my kids' Highlights projects, and then delete those first 5 large files (they are backed up on 2 other 3 TB external HDs), can I keep doing this (reloading the next 5 large .dv files to work with) and ultimately take each kid's Highlights project and export it as a .m4v movie EVEN THOUGH the earlier large .dv files are no longer on my internal HD? Or does my iMac need to have all these larger files loaded on my internal HD for me to eventually export each kid's Highlights project to a .m4v movie?
    I have a 2011-era 27" iMac desktop with a 2 TB internal HD and a 250 GB flash drive, running Lion OS X and iMovie 11.

    Thanks. I tested it out and you were correct. I loaded 2 .dv movies from my external HD back onto my internal HD, got them re-imported into iMovie, took a few short clips from each of the 2 iMovie events and pasted them into a new project in the Project browser. Then I deleted these 2 "source clips" from my internal HD, closed iMovie, re-opened it, and found that iMovie would NOT export the smaller clips for a "highlights" .m4v movie without the "source clips" being available.
    I read your link on QuickTime. It talks mostly about trimming, which is what I did with each iMovie event before I took each one as a project to export as a smaller .m4v file. But if I use QuickTime (do I need QT Pro or basic?), what advantage is it to me to use QT over iMovie (I must admit I am a novice at iMovie and have never used QT or QT Pro as a tool)? Will it then convert any edits I make to a .m4v movie, or do I need iMovie to do that?
    Does QT allow trimming multiple segments out of a movie during one edit session, or can you only do 1 at a time? By that I mean that, for example, when I use Sony's Picture Motion Browser for my .mts movies, you can only set one Start and one End point for each edit/trim you do: it does not allow you to set multiple Start/Stop points like iMovie allows in its Event or Project browser. You can only do one "trim" at a time, save it, and then reopen to do another trim. Not very useful.

  • Large Jars -Slow loading of Applets

    I have a large jar file, as a result of which my applet takes a long time to load in the browser. Any suggestions to rectify this problem?
    The GUI is primarily Swing, and the browser is Netscape.

    Use more than one jar. Then when one changes it takes less time to download.
    Or just ignore the problem. You are more likely to see this problem in development (or testing) than in production. In production they only need to download it once.
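For reference, the applet tag's archive attribute takes a comma-separated list of jars, so a split build might look like the sketch below (the class and jar names are purely illustrative):

```html
<!-- The browser fetches each jar separately, so after a change
     only the modified jar needs to be re-downloaded. -->
<applet code="com.example.MainApplet"
        archive="core.jar,ui.jar,thirdparty.jar"
        width="640" height="480">
</applet>
```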

  • iTunes 7 & large library - slow loading

    After upgrading to iTunes 7 it takes forever to launch iTunes, and now I get a progress bar that says "Loading iTunes Library" that I didn't get under iTunes 6. I have about 30,000 songs in my library, so I imagine it is only an issue with large libraries. Does anyone else have this issue?

    I also have a rather large library. iTunes 7 definitely takes longer to load. In addition, I am now having a problem opening my Music folder in my home directory using the Finder to view the files in the folder. If I right-click on the folder I get the spinning beachball for approximately 1 minute (on a dual 2 GHz Power Mac G5 w/ 2 GB of RAM). When I am finally able to select "Get Info" I get the spinning beachball again for quite a while (at least several minutes!). When the Finder finally reports back it says there is NOTHING in the folder! Aside from my iTunes library I have numerous other tracks in my Music folder that have not been added to my iTunes library. Since iTunes 7 can "see" my iTunes library and play the tracks in there, I'll assume that my other tracks are still there. The permissions appear to be OK for the folder. I can't believe that Apple has rushed out such a flawed "upgrade" to iTunes! The new features are desirable, but it seems that there are way too many glitches in iTunes 7.
    Dual 2GHz Power Mac G5   Mac OS X (10.4.7)   2GB DDR RAM

  • Can't publish larger files in iWeb 09

    Hi
    Have recently updated my iWeb to 09 and had accidentally deleted my old Domain file, so I had to rebuild the site from scratch. Anyway, since I have, it gets stuck on pages that have any larger files on them (movie and audio files) despite having plenty of space in my MobileMe account. I get a "Try again later" message.
    Have managed to publish successfully to a local folder, which I then tried copying with index to 'Sites' on my iDisk, but that too presents a problem, saying "The Finder cannot complete the operation because some data in "" could not be read or written. (Error code -36)"
    I'm baffled! All I want to do is publish my site without any hassle; the files aren't that big, only about 40 MB each. Can anyone suggest anything? Have had a look at iwebfaq and on here and have tried a few things, but nothing seems to help so far. Any suggestions would be very gratefully received! Thanks!

    That error code is -36, ioErr (I/O error). Can you publish a different site? Create a small test site and see if it will go through. If not, then it's not just your primary site.
    If you expect to revise the site occasionally, or want your visitors not to have to wait for those large files to load while the host page loads, you might look at these ways to add a movie or audio file:
    Movie: QT Movies and Opening Item in New, Specially Sized Window. The code used is shown.
    Audio: Flash Audio Players - 3 different types of flash audio players. The code used is shown.
    These methods require loading the movie or audio files only once and do not load on the pages until they are played.
    OT

  • 4.2.3/.4 Data load wizard - slow when loading large files

    Hi,
    I am using the data load wizard to load CSV files into an existing table. It works fine with small files up to a few thousand rows. When loading 20k rows or more, the loading process becomes very slow. The table has a single numeric column for the primary key.
    The primary key is declared at "shared components" -> logic -> "data load tables" and is recognized as "pk(number)" with "case sensitive" set to "No".
    While loading data, this configuration leads to the execution of the following query for each row:
    select 1 from "KLAUS"."PD_IF_CSV_ROW" where upper("PK") = upper(:uk_1)
    which can be found in the v$sql view while loading.
    It makes the loading process slow because, due to the UPPER function, no index can be used.
    It seems that the setting of "case sensitive" is not evaluated.
    Dropping the numeric index for the primary key and using a function based index does not help.
    Explain plan shows an implicit "to_char" conversion:
    UPPER(TO_CHAR(PK)) = UPPER(:UK_1)
    The TO_CHAR is missing in the query text itself, but maybe it is necessary for the function-based index to work.
    Please provide a solution or workaround for the data load wizard to work with large files in an acceptable amount of time.
    Best regards
    Klaus
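One thing worth double-checking (a sketch only, using the table and column names from the post and untested against the wizard's internals): a function-based index has to match the predicate the optimizer actually evaluates, including the implicit TO_CHAR shown in the explain plan. Whether the optimizer then uses it can also depend on NLS settings.

```sql
-- Index expression must match UPPER(TO_CHAR("PK")) exactly,
-- since that is the form the explain plan shows.
CREATE INDEX pd_if_csv_row_fbi
  ON "KLAUS"."PD_IF_CSV_ROW" (UPPER(TO_CHAR("PK")));
```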

    Nevertheless, a bulk loading process is what I would really like to have as part of the wizard.
    If all of the CSV files are identical:
    use the Excel2Collection plugin (Process Type Plugin - EXCEL2COLLECTIONS)
    create a VIEW on the collection (makes it easier elsewhere)
    create a procedure (in a Package) to bulk process it.
    The most important thing is to have, somewhere in the Package (i.e. your code that is not part of APEX), information that clearly states which columns in the Collection map to which columns in the table, the view, and the variables (APEX_APPLICATION.g_fxx()) used for Tabular Forms.
    MK
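As a minimal sketch of the bulk step MK describes, assuming the Excel2Collection plugin has loaded the CSV rows into a collection named 'EXCEL_DATA' and that the first two collection columns map to the table's columns (both the collection name and the column mapping here are assumptions for illustration):

```sql
-- Move all collection rows into the target table in one set-based statement,
-- instead of one lookup/insert per row.
INSERT INTO pd_if_csv_row (pk, payload)
SELECT TO_NUMBER(c001), c002
  FROM apex_collections
 WHERE collection_name = 'EXCEL_DATA';
```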

  • Flash media server taking forever to load large files

    We purchased FMIS and we are encoding large 15+ hour MP4 recordings using Flash Media Encoder. When opening these large files for playback, which have not been opened recently, the player displays the loading indicator for up to 4 minutes! Once it has apparently been cached on the server, it opens immediately from any browser, even after clearing the local browser cache. So a few questions for the experts:
    1. Why is it taking so long to load the file? Is it because the MP4 metadata is in the wrong format and the file is so huge? I read somewhere that Media Encoder records with incorrect MP4 metadata; is that still the case?
    2. Once it's cached on the server, exactly how much of it is cached? Some of these files are larger than 500 MB.
    3. What FMS settings do you suggest I change? FMIS is running on Windows Server R2 64-bit, but FMIS itself is 32-bit. We have not upgraded to the 64-bit version. We have 8 GB of RAM. Is it OK to set the FMS cache to 3 GB? And would that only have enough room for 3-4 large files, because we have hundreds of them?
    best,
    Tuviah
    Lead programmer, Solid State Logic Inc.

    Hi Tuviah,
    You may want to email me offline about further questions, as it can get a little specific, but I'll hit the general problems here.
    MP4 is a fine format, and I won't speak ill of it, but it does have weaknesses. In the FMS implementation those weaknesses tend to manifest around the combination of recording and very large files, so some of these things are a known issue.
    The problem is that MP4 recording is achieved through what's called MP4 fragmentation. It's a part of the MP4 spec that not every vendor supports, but it has a very particular purpose, namely the ability to continually grow an MP4-style file efficiently. Without fragments one has the problem that a large file must be constantly rewritten as a whole to update the MOOV box (the file's index); fragments allow simple appending. In other words, it's tricky to make MP4 recording scalable (as for a server) while keeping the basic MP4 format, hence fragments.
    There's a tradeoff to this, however, in that the index of the file is broken up over the whole file. Also, these large files are likely tucked away on a NAS or something similar, which is normal, as you likely can't store all of them locally. However, that is the bad combination of needing to index the file (touching parts of the whole thing) and doing network reads to do it. This is likely the cause of the long delay you're facing. Here are some things you can do to help:
    1. Post-process the F4V/MP4 files into non-fragmented format. This may help significantly with load time; though it could still be considered slow, it should increase in speed. It's cheap to try out on a few files. (F4V and MP4 are the same thing for this purpose, so don't worry about the tool naming.)
    http://www.adobe.com/products/flashmediaserver/tool_downloads/
    2. Alternatively, this is why we created the raw: format. For long recordings MP4 is just less than ideal, and the raw format solves many of the problems involved in doing this kind of recording. Check it out:
    http://help.adobe.com/en_US/flashmediaserver/devguide/WSecdb3a64785bec8751534fae12a16ad0277-8000.html
    3. You may also want to check out FMS HTTP Dynamic Streaming. It also solves this problem, along with others like content protection and DVR, and it's our most recent offering, so it has a lot of strengths the other approaches don't.
    http://www.adobe.com/products/httpdynamicstreaming/
    Hope that helps,
    Asa

  • AME CS6 painfully slow loading large quantities of clips.

    I'm using AME CS6 to transcode clips to prepare them for delivery to a client that will only accept OP1a MXFs. I have hundreds of clips to transcode and AME CS6 is unbearably slow loading them. Usually I set up a watch folder, but in this case I have two batches to run, one with about 750 individual clips and one with about 450. Using watch folders just completely brought AME to its knees trying to transcode a file while updating the queue. I've just spent two hours loading the clips into AME by dragging/dropping them into the queue (that time includes time spent waiting while AME hangs while trying to change the codec/output setting).
    Earlier versions of AME were nowhere near as slow as this. Even just scrolling up/down the queue, which used to be fluid, is now a click, wait, update... click, wait, update process. I could have copied the files from drive to drive in less time than this took to load/index. Is there anything that can be done to speed up this process?
    FYI
    Supermicro workstation
    Win7/64
    Dual Xeon X5650
    48Gb RAM
    nVidia Quadro 4000
    2x nVidia GTX 580
    Red Rocket
    SSD (OS)
    RAID-0 (media-internal)
    RAID-0 (render-external)

    lasvideo, thanks but that's not the same problem I am having. Appreciate you chiming in though!
    Stephen, thanks for writing. It's a pretty massive project and because clips often criss-cross between segments I have been working everything inside the same edit. I also wanted to see just how hard I could push Premiere (I've been using Premiere for 7 years now). Mind you once everything loads it still does just fine. Edits like a champ. Just was a bit frustrating with the load time.
    On #1, I don't have time machine set-up anymore as I ate up all my space.
    On #2 this is an interesting suggestion. I'm letting project manager analyze my project right now. I've never used project manager to break off individual edits so this should be interesting. Yes, just me on one machine. I'm shooting and editing at the same time and also doing graphics so I reckon by December. I have two full backups of all footage and edit files and save to a new edit file every couple of days (plus I have auto-save rolling).
    And for whatever reason, today when I opened a different project and then went back to the large project, suddenly the problem disappeared. Weird, but there it is. All 20,000 clips now load again in under 4 minutes.
    Thanks for the assistance.
    Best, Dare

  • Large and many layered file slow on CS5

    I've installed CS5 on a new 17" MBP i5 with 8 GB. Opening a CMYK PSB of 5.75 GB and around 100 layers, I was disheartened to find that scrolling and zooming were noticeably slower in CS5 than in CS3 on the same machine, and not substantially faster than CS3 on a 2007 17" MBP. It's frustrating, as Adobe has suggested that I might "process very large images up to ten times faster by taking advantage of cross-platform 64-bit support."
    Zooms for this file vary between a few seconds and a minute. At the extremes, it is noticeably worse than CS3 on the same MBP. But even where the time to redraw is similar, CS5's only-when-complete redraw offers less information and less ability to responsively modify zooms than CS3.
    Scrolls cause freezes of several seconds to a half minute. I assume that CS3's progressive redraw may sometimes feel faster than it is. But that rapid if partial response is usable information when scrolling. It's often unnecessary to fully image a screen, but CS5 makes it unavoidable. In both CS3 and CS5, once the whole image has been viewed at a particular zoom, scrolling becomes relatively fluid within that zoom until changes have been made.
    Looking around this site, I've seen and taken suggestions to repair permissions with Disk Utility, turn off Font Preview and increase the Cache Tile Size to 1024k. There are no 3rd party plugins loaded. I've not seen a substantial improvement in zooming or scrolling, after these changes.
    I'm a bit confused by the "tall and thin" and "big and fat" options, as this file and many that I work on have both large height/width and many layers, so neither description fits.
    Can anyone recommend settings that might substantially speed up CS5's handling of large files with many layers? Thanks!

    Unless you work with us to find the cause of your slowdown, it won't get fixed.
    We don't know why you're running slow.
    So far the slowdowns seem to be due to bad fonts, bad third party plugins.
    But without steps/files to reproduce a slowdown, we aren't going to be able to identify a cause, much less fix anything.
    Chris, are you going to ask me for information that I have not already supplied? Where do I send the multi-GB files?

  • Slow loading speed for static files

    We are running an Azure website on an S2 hosting plan and have a bundled and minified JavaScript file that is 1.6 MB. The load times for this file are sometimes 600 ms, and other times it suddenly jumps up to 4 seconds or even higher.
    To improve the loading times we tried creating an Azure CDN endpoint that loads the files from the Azure website, and now we sometimes see loading times of around 200 ms. But sometimes it still jumps up to 4 seconds or higher.
    Can anyone give me an idea what is going on? I assume the larger loading times are due to disk access and figured that adding a CDN would fix this problem, but as it turns out it didn't.
    What is going on here?

    To isolate and help us investigate, can you try the following:
    Create a dummy site in the same web hosting plan as the site in question, e.g. give it some random test name.
    Drop that JavaScript file in the wwwroot folder of that site.
    Do you see the same slowness? If so, can you share the name of this dummy site to help us investigate?

  • Loading large files in Java Swing GUI

    Hello Everyone!
    I am trying to load large files (more than 70 MB of XML text) in a Java Swing GUI. I tried several approaches.
    1) Byte-based loading with a loop similar to:
    pane.setText("");
    InputStream file_reader = new BufferedInputStream(new FileInputStream(file));
    int BUFFER_SIZE = 4096;
    byte[] buffer = new byte[BUFFER_SIZE];
    int bytesRead;
    String line;
    while ((bytesRead = file_reader.read(buffer, 0, BUFFER_SIZE)) != -1) {
        line = new String(buffer, 0, bytesRead);
        pane.append(line);
    }
    But this gives me unacceptable response times for large files and runs out of Java heap memory.
    2) I read in several places that I could load only small chunks of the file at a time, and when the user scrolls upwards or downwards the next/previous chunk is loaded. To achieve this, I am guessing extensive manipulation of the scroll bar in the JScrollPane will be needed, or perhaps adding an external JScrollBar? Can anyone provide sample code for that approach? (Bearing in mind that I am writing code for an editor, so I will need to interact via clicks, mouse wheel rotation, keyboard buttons and so on...)
    If anyone can help me, post sample code or point me to useful links that deal with this issue or with writing code for editors in general, I would be very grateful.
    Thank you in advance.

    Hi,
    I'm replying to your question from another thread.
    To handle large files I used the new IO library. I'm trying to remember off the top of my head, but the classes involved were RandomAccessFile, FileChannel and MappedByteBuffer. The MappedByteBuffer was the best way for me to read and write to the file.
    When opening the file I had to scan through the contents of the file using a Swing worker thread and a progress monitor. Whilst doing this I indexed the file into manageable chunks. I also created a cache to further optimise file access.
    In all it worked really well and I was surprised by the performance of the new IO libraries. I remember loading 1 GB files, and whilst having to wait a few seconds to perform the indexing, you wouldn't know that the data for the JList was being retrieved from a file whilst the application was running.
    Good Luck,
    Martin.
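The core of Martin's approach can be sketched roughly like this: map only a region of the file with FileChannel and decode just that chunk, instead of pulling the whole file into the text component. This is a self-contained toy demo (the indexing, cache and SwingWorker layers he mentions are omitted, and the file here is a throwaway temp file):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Minimal sketch: memory-map a slice of a large file and decode only that slice.
public class ChunkReader {

    // Map [offset, offset+length) of the file and decode it as UTF-8.
    static String readChunk(Path file, long offset, int length) throws IOException {
        try (FileChannel ch = FileChannel.open(file, StandardOpenOption.READ)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, offset, length);
            byte[] bytes = new byte[length];
            buf.get(bytes);
            return new String(bytes, StandardCharsets.UTF_8);
        }
    }

    // Self-contained demo: write a small temp file, then read a 10-byte chunk.
    static String demo() {
        try {
            Path tmp = Files.createTempFile("chunk", ".txt");
            Files.write(tmp, "Hello, large file!".getBytes(StandardCharsets.UTF_8));
            String chunk = readChunk(tmp, 7, 10); // bytes 7..16 -> "large file"
            Files.delete(tmp);
            return chunk;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(demo()); // prints "large file"
    }
}
```

In an editor, a scroll listener would call something like readChunk with offsets taken from the pre-built index, off the EDT, and hand the decoded text to the component.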

  • (urgent) SQL*Loader Large file support in O734

    hi there,
    I get the following SQL*Loader error when trying to upload data files,
    each 10 GB to 20 GB in size, to an Oracle 7.3.4 DB on SunOS 5.6.
    >>
    SQL*Loader-500: Unable to open file (..... /tstt.dat)
    SVR4 Error: 79: Value too large for defined data type
    <<
    I know there's a bug fix for large file support in Oracle 8 -
    >>
    Oracle supports files over 2GB for the oracle executable.
    Contact Worldwide Support for information about fixes for bug 508304,
    which will add large file support for imp, exp, and sqlldr
    <<
    However, I really want to know: is there any fix for Oracle 7.3.4?
    thx.

    Example
    Control file
    C:\DOCUME~1\MAMOHI~1>type dept.ctl
    load data
    infile dept.dat
    into table dept
    append
    fields terminated by ',' optionally enclosed by '"'
    trailing nullcols
    (deptno integer external,
    dname char,
    loc char)
    Data file
    C:\DOCUME~1\MAMOHI~1>type dept.dat
    50,IT,VIKARABAD
    60,INVENTORY,NIZAMABAD
    C:\DOCUME~1\MAMOHI~1>
    C:\DOCUME~1\MAMOHI~1>dir dept.*
    Volume in drive C has no label.
    Volume Serial Number is 9CCC-A1AF
    Directory of C:\DOCUME~1\MAMOHI~1
    09/21/2006  08:33 AM               177 dept.ctl
    04/05/2007  12:17 PM                41 dept.dat
                   2 File(s)          8,043 bytes
                   0 Dir(s)   1,165 bytes free
    Intelligent sqlldr command
    C:\DOCUME~1\MAMOHI~1>sqlldr userid=hary/hary control=dept.ctl
    SQL*Loader: Release 10.2.0.1.0 - Production on Thu Apr 5 12:18:26 2007
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Commit point reached - logical record count 2
    C:\DOCUME~1\MAMOHI~1>sqlplus hary/hary
    SQL*Plus: Release 10.2.0.1.0 - Production on Thu Apr 5 12:18:37 2007
    Copyright (c) 1982, 2005, Oracle.  All rights reserved.
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    As I am appending I got two extra rows. One department in your district and another in my district :)
    SQL> select * from dept;
        DEPTNO DNAME          LOC
            10 ACCOUNTING     NEW YORK
            20 RESEARCH       DALLAS
            30 SALES          CHICAGO
            40 OPERATIONS     BOSTON
            50 IT             VIKARABAD
            60 INVENTORY      NIZAMABAD
    6 rows selected.
    SQL>

  • Slow searches in large file ...

    The Find function is almost unusable as-is in my large file, as the program won't let me finish typing my text in the search field once it starts looking for the first string of text I type. I have enabled the whole-words option, but this doesn't help much. I tried to use the "use selection for find" option in the menu, but this is always disabled, so I'm not sure if this is of any use...
    Any ideas how to get search to wait until I type in the whole text field before it starts to bog down?
    Thanks
    jcm

    Thanks for the reply.
    File is 7 MB, 1000 rows, columns to BB.
    I don't mind that the search is slow, but e.g. when I want to search for "johnson", it stops me from typing after "jo", then again after "john", then again after "johns", etc. The best workaround so far is to type "Johnson" into a blank cell somewhere, then paste it into the search field... but if this is the answer, it's back to Excel.

  • Slow large file transfer speed with LaCie 1T firewire 800 drives

    I am transferring two large files (201 GB and 95 GB) from one LaCie 1 TB FireWire external drive to another (using separate connections to a PCI Express FireWire 800 card in my Quad G5). The transfer time is incredibly slow: over four hours for the 201 GB file and over 2 hours for the 95 GB file.
    Does anyone have any ideas why this is so slow or what I might try to speed up the transfer rates?
    Thank you.
    G5 Quad 2.5 8GB DDR2 SDRAM two SATA 400GB   Mac OS X (10.4.5)  
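As a rough sanity check (assuming FireWire 800's nominal 800 Mbit/s bus and a sustained real-world rate somewhere around 60-70 MB/s; both figures are ballpark assumptions, not measurements), a 201 GB copy should finish in roughly an hour, so four-plus hours does suggest something is wrong:

```java
// Back-of-envelope: expected minutes to copy a file at a sustained rate.
public class TransferTime {
    // sizeGB in gigabytes, mbPerSec in megabytes per second.
    static long minutes(long sizeGB, long mbPerSec) {
        long sizeMB = sizeGB * 1024;   // GB -> MB
        return sizeMB / mbPerSec / 60; // MB / (MB/s) -> seconds -> minutes
    }

    public static void main(String[] args) {
        System.out.println(minutes(201, 65) + " min"); // ~52 min at 65 MB/s
    }
}
```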

    You posted this in the Powerbook discussion forum. You may want to post it in the Power Mac G5 area, located at http://discussions.apple.com/category.jspa?categoryID=108

  • I need to update to the latest version of Snow Leopard (currently running v10.6). Because of where we live, the slow download speed for such a large file has kept me from downloading the update. What can I do short of hooking up the computer elsewhere?

    Do you ever visit a friend or relative with a Mac who has faster internet? Maybe the local library has Macs on a fast line. If so, get a USB thumb drive and put this link on it from your computer:
    http://support.apple.com/kb/DL1399
    Then put this link on the drive:
    http://support.apple.com/kb/DL1429
    When you are at some place that has a Mac with a decent download speed, insert the thumb drive in that Mac, click on the first link ("DL1399") and direct the download to your thumb drive. Now do the same thing with the second link.
    The installer files will now be on the thumb drive and, when you get home, drag them from the thumb drive to your desktop. Install the Combo update first.
