Error when merging large clips in QuickTime

I am trying to merge two 2-hour clips into a single 4-hour clip in QuickTime (OS X 10.7.2). If I combine very short clips and export for iPad, everything works. But the long combination generates the error "The operation not supported for this media." What is the problem? Is there a size limit I don't know about? I can create the large (4-hour) film in QuickTime and watch it, but I can't export the result. Does anyone have an answer? [The two long clips were generated by HandBrake for Apple TV 2, and the clips work everywhere they should.]
Thanks for any advice or help.
Jerry

What is the encoded resolution and file size of each file? While HandBrake generates iPad files using "large file" addressing, the QT X player may or may not use this mode for smaller resolutions, given the contextually adaptive nature of the preset. Just a guess, but it is possible the QT X player refuses to export the content if the target file is projected to exceed 4 GB, or if processing the files exceeds the available memory.
If true, then depending on the original source files' compression format(s) and the apps available, I might try one of the following:
1) Create a merged "reference" file containing the two source files using QT 7 Pro, or create a single "merged" file from the two source files in a single wrapper, and then use HandBrake to make a single conversion/export to the target iPad compression format.
2) Use QT 7 Pro or MPEG Streamclip to save the data in the two M4V files to a single MOV file (without recompressing the data), and check whether this file can be imported/managed by iTunes for use on your iPad.
3) Try a lower-resolution export from QT X to see if this avoids the error message.

Similar Messages

  • Error when opening large data forms

    Hi,
    We are working on a Workforce planning implementation. We have 2 large custom defined dimensions.
    When opening large data forms we get a standard "Error has occurred" error. If we reduce the member selection the data form opens fine.
    Is anyone aware of a setting that can be increased to open large data forms? I'm not referring to the "Warn if data form is larger than 3000 cells" setting.
    I'm pretty sure there is a parameter that can be increased, but I can't find it.
    Thanks for your help.
    Seb

    Hi Seb,
    If you do find the magic parameter then let us know because I would be interested to know.
    It is probably something like ALLOW_LARGE_FORMS = true :)
    In the Planning logs, is the error related to Planning, or is it an Essbase-related error?
    Is it failing due to the number of rows, or because it is going beyond the maximum of 256 columns?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Jvm startup fails with error when using large -Xmx value

    I'm running JDK 1.6.0_02-b05 on a RHEL5 server. I'm getting an error when starting the JVM with a large -Xmx value. The host has ample memory to succeed, yet it fails. I see this error when starting Tomcat with a bunch of options, but I found that it can be easily reproduced by starting the JVM with -Xmx2048M and -version, so it's this boiled-down test case that I've been examining more closely.
    host% free -mt
                         total     used     free   shared  buffers   cached
    Mem:                  6084     3084     3000        0      184     1531
    -/+ buffers/cache:             1368     4716
    Swap:                 6143        0     6143
    Total:               12228     3084     9144
    Free reveals the host has 6 GB of RAM, approximately half of which is available. Swap is completely free, so I should have access to about 9 GB of memory at this point.
    host% java -version
    java version "1.6.0_02"
    Java(TM) SE Runtime Environment (build 1.6.0_02-b05)
    Java HotSpot(TM) Server VM (build 1.6.0_02-b05, mixed mode)
    java -version succeeds
    host% java -Xmx2048M -version
    Error occurred during initialization of VM
    Could not reserve enough space for object heap
    Could not create the Java virtual machine.
    java -Xmx2048M -version fails. A trace reveals that the mmap call fails:
    mmap2(NULL, 2214592512, PROT_READ|PROT_WRITE|PROT_EXEC, MAP_PRIVATE|MAP_ANONYMOUS|MAP_NORESERVE, -1, 0) = -1 ENOMEM (Cannot allocate memory)
    Any ideas?

    These are the relevant java options we are using:
    -server -XX:-OmitStackTraceInFastThrow -XX:+PrintClassHistogram -XX:+UseLargePages -Xms6g -Xmx6g -XX:NewSize=256m -XX:MaxNewSize=256m -XX:PermSize=128m -XX:MaxPermSize=192m -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled -XX:+CMSPermGenSweepingEnabled -XX:+ExplicitGCInvokesConcurrent -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000 -Djava.awt.headless=true
    This is a web application that is very dynamic and uses lots of database calls to build pages. We use a large clustered cache to reduce trips to the database, so being able to access lots of memory is important to our application.
    I'll explain some of the more uncommon options:
    We use the concurrent mark-sweep garbage collector to reduce stop-the-world GCs. Here are the CMS options:
    -XX:+UseConcMarkSweepGC
    -XX:+CMSClassUnloadingEnabled
    -XX:+CMSPermGenSweepingEnabled
    An explicitly coded GC invokes the concurrent GC instead of a stop-the-world GC:
    -XX:+ExplicitGCInvokesConcurrent
    The default PermGen sizes were not large enough for our application, so we increased them:
    -XX:PermSize=128m
    -XX:MaxPermSize=192m
    We had some exceptions that were omitting their stack traces. This option fixes that problem:
    -XX:-OmitStackTraceInFastThrow
    We see approximately a 10% to 20% performance improvement with large page support. This is an advanced feature:
    -XX:+UseLargePages
    UseLargePages requires OS-level configuration as well. In SUSE 10 we configured the OS's hugepages by executing
    echo "vm.nr_hugepages = 3172" >> /etc/sysctl.conf
    and then rebooting. kernel.shmmax may also need to be modified. If you use large pages, be sure to google for complete instructions.
    When we transitioned to 64-bit, we also went from much slower systems with 4 GB of RAM to much faster machines with 8 GB, so I can't answer the question of degraded performance. With our application, however, the bigger our cache the better our performance, so even if 64-bit is slower we more than make up for it by being able to access more memory. I suspect the performance difference depends on the application; you should do your own profiling.
    You can run both the 32-bit and the 64-bit version on most 64-bit OSes, so if there is a significant difference, run the version your application needs: if you need the memory, use the 64-bit version; if you don't, use the 32-bit version.
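    As a quick sanity check on the original -Xmx2048M failure, it can help to confirm whether the java binary in question is a 32-bit or a 64-bit VM; a 32-bit server VM on Linux often cannot reserve a contiguous 2 GB heap no matter how much physical memory is free, which would match the ENOMEM from the mmap trace above. A minimal diagnostic sketch (assuming a Sun/Oracle JVM, which exposes the sun.arch.data.model property; the class name HeapCheck is just for illustration):

        // HeapCheck.java -- print the VM's data model and the heap it actually reserved.
        public class HeapCheck {
            public static void main(String[] args) {
                System.out.println("os.arch     = " + System.getProperty("os.arch"));
                System.out.println("data model  = " + System.getProperty("sun.arch.data.model", "unknown") + " bit");
                System.out.println("max heap MB = " + Runtime.getRuntime().maxMemory() / (1024 * 1024));
            }
        }

    Running it as plain "java HeapCheck" shows the data model; running it with the same -Xmx value that fails reproduces the reservation problem if the limit is the 32-bit address space rather than free memory.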

  • Error when saving large XML Forms - XFBuilder

    When creating large XML Forms (many entry fields, many links to metadata properties), the XFBuilder aborts the operation with an error message stating a network communication failure / server response time-out. When some fields are deleted, saving takes some time but does succeed...
    Has anybody experienced these problems as well, and is there a solution so that larger XML Forms can be created? Currently the XML Forms producing errors have over 100 entry fields.
    Rgds Caspar

    OK, I found it: when editing the Project Options, the server time-out can be set. In the NW04 edition this defaults to 120; in EP60 SP2 the value is empty. Increasing the value (e.g. to 180) does the trick!

  • Constraint error when merging workspace

    Hi,
    We have a problem with Workspace Manager. I believe there are duplicate rows in the underlying OWM tables that are causing lots of problems. Firstly, when we try to merge a child with its parent, we get a constraint violation on the xx_AUX table. Secondly, when trying to do a difference between the workspaces, we get a "multiple rows returned from subquery" error when accessing the xx_DIFF table. These errors seem to indicate some duplication, but I have no idea how it got there.
    We had a similar problem a couple years back, and there was a post in this forum at that time.
    Any suggestions? 1) how to fix this problem? 2) how to keep it from happening again?

    Ben,
    We have not been able to reproduce the problem on a dev system, so no help there. I guessed that you would need the metadata, but that is not an option right now (we need a quick solution).
    Newly created children of LIVE merge just fine, so I'm writing a program to duplicate the differences in the original child into a new child. I will then merge the new child and remove all the children workspaces (leaving only LIVE).
    Hopefully, this will clear up all our problems. Next, I will have to keep a very close eye on the state of the workspaces to see if this ever happens again. With luck, I will then be able to replicate the conditions and let you know.
    Thanks for your reply. I will post again if I ever determine how this problem occurred.

  • Unknown error when merging files

    I have encountered an unknown error when trying to combine files into a single PDF. It specifically states: "No PDF file was created because Acrobat encountered an unknown error." This is unusual, as just yesterday I was able to perform this task with ease, and being able to do it will save me hours of work time. What has changed between yesterday and today? I updated Windows 7 and may have updated Adobe Acrobat 9 Pro; I also deleted some malware that was unnecessary for my computer: Qwiklinx, Yontoo, ... I did a restore to a point when this action worked, but to no avail.
    Thank you for your thoughts on this matter.

    Hi Kylie,
    Which version of Premiere Pro are you using? Can we get more details about this case? FAQ: What information should I provide when asking a question on this forum?
    Thanks,
    Kevin

  • Error when downloading large SharePoint document using REST API

    Hello,
    The following code works fine for small files, but I get a "SCRIPT14: Not enough storage is available to complete this operation." error when downloading files over 100 MB.
    Is it possible to write the data directly to a local storage (IndexedDB for example) instead of keeping it in memory?
        var getFileEndpoint = String.format(
            "{0}/_api/SP.AppContextSite(@target)/web/getfilebyserverrelativeurl('{1}')/$value?@target='{2}'",
            _SPAppWebUrl, _DocumentUrl, _SPHostUrl);
        var executor = new SP.RequestExecutor(_SPAppWebUrl);
        var info = {
            url: getFileEndpoint,
            method: "GET",
            //binaryStringResponseBody: true,
            success: function (data) {
                //Save data to IndexedDB
            },
            error: function (err) {
                //Error handling
            }
        };
        executor.executeAsync(info);
    Since my page is located in a SharePoint-hosted app, I can't use XMLHttpRequest directly to access the document in the host site. It seems like the MS cross-domain library is the only choice for me.
    Thanks,
    Matt.

    Hi,
    Per my understanding, you want to download the content of files into a database from a SharePoint-hosted app.
    In a SharePoint-hosted app, since we can only use JavaScript, which is executed in the browser, the retrieved content is kept in memory before it can be written to local storage.
    As a workaround, I suggest you develop a provider-hosted app and do the same job there instead. In a provider-hosted app, we can use C# to handle the file download, which should be more appropriate in your scenario.
    For how to download files using the SharePoint Client Object Model, see:
    http://www.lucacostante.com/item/15-download-and-upload-files-via-sharepoint-client-object-model.html
    Thanks
    Patrick Liang
    TechNet Community Support

  • "General Error" when exporting unrendered clip?

    I have .mov clips in my project that already have an alpha channel in them (format is Animation millions of colors +).
    They're on the timeline in a sequence, with a single transition between them.
    I want to export back to Animation Millions of Colors + in order to keep the alpha channel (this is for Flash video conversion).
    The composite mode for the clips is set to normal, with straight alpha. I can see that the alpha is intact on the clips in the timeline by temporarily placing an image on a layer under them.
    I understand that you have to export without rendering in order to preserve the alpha channel.
    However, when I try to export to ANY codec without rendering, I get a dialog that says only "General Error." Rendering will allow export, but of course the alpha channel is gone now.
    I'm running System 10.4.10 with the QuickTime 7.3 upgrade installed, FCE 3.5.1. Did QT 7.3 break something?
    Thanks in advance,
    Steve

    Tom,
    Thanks for your reply. The problem isn't that I didn't use the right codec (I did, as I said in my original post). The problem is that I am not able to export an unrendered clip using any codec. All I get is the "General Error" message.
    Thus no transparency in the exported file.
    Also to clarify, I am not keying the source files to create transparency in FCE. The source files already have an alpha channel in them.
    I have downgraded to QuickTime 7.2, so 7.3 was not the culprit. I have also run the usual system cleanup utilities like AppleJack, etc.
    Thanks,
    Steve

  • Disk space error when copying/pasting clips between iMHD projects

    Anyone know how to get around this problem? I have emptied all my trash, but continue to get a "not enough disk space" error message when I attempt to paste one copied iMHD clip to another iMHD project. How do I check my available disk space?
    I assume that it's my startup disk because I run iMHD off of a Lacie 250GB external HD... Any thoughts?

    When we Copy and Paste a clip between projects, iMovie HD duplicates the clip's entire source file to the project. That can be a very large file, so the "not enough disk space" error sometimes occurs when we don't expect it.
    One way to learn which disk a project is on is to Command-click (that's the Apple key) on the project name in the titlebar at the top of the project window. The path to the project will appear.
    In the Finder, select that disk and choose File > Get Info.
    If there appears to be sufficient disk space, please tell us the exact error message.
    Also make sure the disk is formatted correctly. iMovie requires the Mac OS Extended disk format. The disk format is reported in the same Get Info window.
    Karl

  • Multi-Value error when merging queries

    I am merging two different tables to generate a report with combined information. To pull data from one table where there is no matching dimension, I have declared the dimensions as detail objects. This works fine; however, I am getting an issue with multiple values in some rows.
    I have 2 queries that I merge
    Query1
    Application_name
    Version     
    Query 2
    Server hostname
    Application_name
    Report
    Application_name  (Query 1)
    Version (Query 1)
    Server hostname (Query 2)
    The issue comes when I have multiple Server hostnames for an Application_name: those rows show an error message of "#MULTIVALUE". I presume this is because the merge is trying to put more than one value in the Server hostname cell.
    How can I rectify this so that multiple servers are shown for one application?
    e.g.
    Report Data
    Application_name (Query 1)     Version (Query 1)     Server hostname (Query 2)
    Application  X                            Version 10.1          server1
    Application  X                             Version 10.1          server2
    Application  X                             Version 10.1          server3
    Adam

    Try using a report-level calculation context, for example:
    Server hostname ForEach (Server) In (Query)
    Hope this will help you out.

  • Insufficient System Resources when merging large files

    My client is running Windows Server 2003, 64-bit. He has 30 GB of RAM and a large amount of file storage. The drives are NTFS.
    I have a program that produces a number of text files from different processes, and then merges them when done. After running the code for many days (we're working with a lot of data), the merge process is failing with "Insufficient System Resources Exist to complete the requested process".
    Insufficient system resources exist to complete the requested service
    java.io.IOException: Insufficient system resources exist to complete the requested service
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(Unknown Source)
    at sun.nio.cs.StreamEncoder.writeBytes(Unknown Source)
    at sun.nio.cs.StreamEncoder.implWrite(Unknown Source)
    at sun.nio.cs.StreamEncoder.write(Unknown Source)
    at java.io.OutputStreamWriter.write(Unknown Source)
    at java.io.BufferedWriter.write(Unknown Source)
    at functionality.ScenarioThreadedLoanProcessor.processLoans(ScenarioThreadedLoanProcessor.java:723)
    at functionality.ScenarioThreadedLoanProcessor.construct(ScenarioThreadedLoanProcessor.java:227)
    at utility.SwingWorker$2.run(SwingWorker.java:131)
    at java.lang.Thread.run(Unknown Source)
    I've investigated this problem in other places, and most of the answers seem to not apply to my case.
    I've looked a this: http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4938442
    But I am not using file channels (I don't think...), and I am copying data in chunks of <32MB.
    Here's the relevant code (I realize I don't need to be re-allocating the buffer, that's a legacy from an older version, but I don't think that's the cause.)
    There are usually four threads, 1-4 reports, and 1 scenario, so this loop shouldn't be executing thousands and thousands of times.
    for (int scenario = 0; scenario < scenarios.size(); scenario++) {
        for (int report = 0; report < reportPages.size(); report++) {
            for (LoanThread ln : loanThreads) {
                BufferedReader br = new BufferedReader(new FileReader(new File(ln.reportPtr.get(
                        (scenario * reportPages.size()) + report).getFileName())));
                br.readLine(); // header, throw away
                BufferedWriter bw = new BufferedWriter(new FileWriter(formReport.get(
                        (scenario * reportPages.size()) + report).getFileName(), true)); // append
                char[] bu = new char[1024 * 1024 * 16];
                int charsRead = 0;
                while ((charsRead = br.read(bu)) != -1) {
                    bw.write(bu, 0, charsRead);
                }
                bw.flush();
                bw.close();
                br.close();
                File f = new File(ln.reportPtr.get((scenario * reportPages.size()) + report).getFileName());
                f.delete();
            }
        }
    }
    Any thoughts?

    1) You can allocate the buffer at the start of the outer loop to save the GC working overtime. You might even be able to move it out of the loops altogether, but I would need to see more code to be sure of that.

    +1. Exactly. This is the most important change you must make.
    The other change I would make is to reduce the buffer size. The disk only works in units of 4-16k at a time anyway. You will be surprised how much you can reduce it without affecting performance at all. I would cut it down to no more than a megabyte.
    You could also speed it up probably by a factor of at least two by using InputStreams and OutputStreams and a byte[] buffer, instead of Readers and Writers and char[], as you're only copying the file anyway. Also, those BufferedReaders and Writers are contributing nothing much, in fact nothing after the readLine(), as you are already using a huge buffer. Finally, you should also investigate FileChannel.transferTo() to get even more performance, and no heap memory usage whatsoever. Note that like your copy loop above, you have to check the result it returns and loop until the copy is complete. There are also upper limits on the transferCount that are imposed by the operating system and will cause exceptions, so again don't try to set it too big. A megabyte is again sufficient.
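    As a rough illustration of the FileChannel.transferTo() suggestion, a minimal sketch of an append-style copy might look like the following (the helper name appendTo is made up for the example, it assumes Java 7+ for try-with-resources, and note it copies the header line too, unlike the original loop, so skip that separately if needed):

        import java.io.File;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.nio.channels.FileChannel;

        public class ChannelAppend {
            // Append src to dest without pulling the bytes through the Java heap.
            static void appendTo(File src, File dest) throws IOException {
                final long CHUNK = 1024L * 1024L; // keep each transfer request modest
                try (FileChannel in = new FileInputStream(src).getChannel();
                     FileChannel out = new FileOutputStream(dest, true).getChannel()) {
                    long pos = 0;
                    long size = in.size();
                    while (pos < size) {
                        // transferTo may move fewer bytes than requested, so loop on its return value
                        pos += in.transferTo(pos, Math.min(CHUNK, size - pos), out);
                    }
                }
            }
        }

    Keeping each transfer request to about a megabyte, as suggested above, stays clear of OS-imposed upper limits on the transfer count while still copying at full speed.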

  • Receive an error when merging files

    When I try to merge files, each file receives an error that says "cannot open file, remove file from list".

    You're either using Acrobat (http://forums.adobe.com/community/acrobat) or CreatePDF (http://forums.adobe.com/community/createpdf), because you CAN'T merge files with the free Reader.

  • Error when starting iTunes and no Quicktime

    Ok, I read a topic about involving the same problem, but with that one, Quicktime would open, mine will not.
    My issue is simple: when I click to start iTunes (the new one, 7.0.2 I believe), I get the Windows error saying:
    "iTunes has encountered a problem and needs to close. We are sorry for the inconvenience."
    I have tried re-installing iTunes and QuickTime, but with no result. Also, QuickTime simply will not open. No error message, it just does nothing.
    Thanks for the help!

    Hey CoasterGuy987,
    try the tips in this article: http://docs.info.apple.com/article.html?artnum=93976
    After reinstalling iTunes and QuickTime as described, does QuickTime launch normally?
    Jason

  • Out of memory error when writing large file

    I have the piece of code below which works fine for writing small files, but when it encounters much larger files (>80M), the jvm throws an out of memory error.
    I believe it has something to do with the Stream classes. If I were to replace my PrintStream reference with the System.out object (which is commented out below), then it runs fine.
    Anyone else encountered this before?
    try {
        print = new PrintStream(new FileOutputStream(new File(a_persistDir, getCacheFilename()), false));
        // print = System.out;
        for (Iterator strings = m_lookupTable.keySet().iterator(); strings.hasNext(); ) {
            StringBuffer sb = new StringBuffer();
            String string = (String) strings.next();
            String id = string;
            sb.append(string).append(KEY_VALUE_SEPARATOR);
            Collection ids = (Collection) m_lookupTable.get(id);
            for (Iterator idsIter = ids.iterator(); idsIter.hasNext(); ) {
                IBlockingResult blockingResult = (IBlockingResult) idsIter.next();
                sb.append(blockingResult.getId()).append(VALUE_SEPARATOR);
            }
            print.println(sb.toString());
            print.flush();
        }
    } catch (IOException e) {
    } finally {
        if (print != null)
            print.close();
    }

    Yes, my first version of the code just printed the strings as I got them, but it was running out of memory then as well. I thought of constructing a StringBuffer first because I was afraid that the PrintStream wasn't allocating memory correctly.
    I've also tried flushing the PrintStream after every line is written, but I still run into trouble.
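    For what it's worth, a minimal sketch of streaming each entry straight to a buffered writer, without building a per-key StringBuffer (this assumes the same m_lookupTable, a_persistDir, KEY_VALUE_SEPARATOR and VALUE_SEPARATOR members as in the snippet above, and that the separators are Strings), would be:

        // Only one entry's worth of characters is ever built up by this code itself.
        Writer out = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream(new File(a_persistDir, getCacheFilename()), false)));
        try {
            for (Iterator keys = m_lookupTable.keySet().iterator(); keys.hasNext(); ) {
                String key = (String) keys.next();
                out.write(key);
                out.write(KEY_VALUE_SEPARATOR);
                Collection ids = (Collection) m_lookupTable.get(key);
                for (Iterator idsIter = ids.iterator(); idsIter.hasNext(); ) {
                    out.write(String.valueOf(((IBlockingResult) idsIter.next()).getId()));
                    out.write(VALUE_SEPARATOR);
                }
                out.write(System.getProperty("line.separator"));
            }
        } finally {
            out.close();
        }

    If m_lookupTable itself is close to the size of the heap, though, no amount of streaming will help; the -Xmx setting would have to be raised.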

  • -34 error when exporting from iMovie to QuickTime

    When attempting to export an edited movie using the iMovie app on my MacBook Pro into QuickTime, I receive the message "quick time export error -34"
    Can anyone tell me what to do?  THANK YOU!

    I tried that and it makes a file without video... sound only.
    It sounds like you didn't have the Sound box checked.
