Low resolution - a result of too much compression or is it the source file?

A friend of mine is helping me produce a DVD using DVD Studio Pro 4, and as our source files we are using 14 video files that were generated in Final Cut Pro.
When the final DVD was produced, we were both disappointed to discover that the resolution appears to be quite low - the darks are pixelated and the entire image lacks sharpness. It looks fine on anything up to a large computer screen, but it definitely seems inadequate when viewed on a large flat-screen HD television.
The DVD is not HD formatted, by the way.
Being quite the amateur with DVD Studio Pro, my question is this: have the original video files been compressed too much, and is this the cause of the problem?
The 14 video files that were loaded into DVD Studio Pro take up over 15 GB of space in their original format - they are all QuickTime files (.mov), exported from Final Cut Pro as DVCPRO files in NTSC (those were my export settings) - yet once compressed and loaded into DVD Studio Pro, they take up only 1.4 GB.
It would seem that we have at least 3 GB to spare.
Any feedback would be welcome, and please excuse my beginner questions.
Additionally, it would be great if someone could take the time to explain where I can change the compression settings/preferences in DVD Studio Pro.
Again, thanks very much.

OK, do not use DVD SP to compress your video. Use Compressor to create a .m2v file of your video and an .ac3 file of your audio. Tutorials can be found at: http://www.kenstone.net/fcphomepage/fcp_homepageindex.html
With Compressor you will have much more control over the encoding process and can get better results. That said, Compressor is good, but there are even better third-party MPEG-2 encoders out there, and they will give you even better results at lower bitrates.
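The point of encoding it yourself is that you can spend the whole disc on your video. As a rough illustration (the disc capacity, running time, and audio rate below are assumed example numbers, not figures from this post), the video bitrate you can afford is simply the disc capacity divided by the running time, minus the audio bitrate:

```java
// Rough DVD bitrate budget. All inputs are illustrative assumptions.
class BitrateBudget {

    /** Video bitrate budget in kbps for a disc size (GiB), duration (s), and audio rate (kbps). */
    static double videoBudgetKbps(double discGiB, double durationSec, double audioKbps) {
        double totalBits = discGiB * 1024 * 1024 * 1024 * 8; // usable disc capacity in bits
        return totalBits / durationSec / 1000.0 - audioKbps; // what is left for video
    }

    public static void main(String[] args) {
        // Single-layer DVD-5 (~4.37 GiB usable), 90 minutes of video, 192 kbps AC3 audio.
        double video = videoBudgetKbps(4.37, 90 * 60, 192);
        System.out.printf("Video bitrate budget: ~%.0f kbps%n", video);
    }
}
```

With 14 files compressing to only 1.4 GB on a 4.7 GB disc, the encode is almost certainly running at a much lower bitrate than the disc could support, which is consistent with the softness you are seeing.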

Similar Messages

  • I have an iMac and a MacBook with Intel Core 2 Duo processors. I realize that this is within the stated requirements for Lion.  However was wondering if by migrating to Lion this will result in too much demand on processor resources, thus a slower machine

    I have an iMac and a MacBook with Intel Core 2 Duo processors. I realize that this is within the stated requirements for Lion. However, I was wondering whether migrating to Lion will result in too much demand on processor resources, and thus a slower machine than under Snow Leopard?

    Which iMac?  Which MacBook?  Both have had several model iterations, even within the framework of a Core 2 Duo processor. With that said, I have an early 2009 iMac 24" with a 2.66 GHz Core 2 Duo and 8 GB of RAM, and in my opinion it seems to run smoother and faster with Lion.

  • PRS600 ereader taking too much time to load all the books

    Whenever I open my PRS600 after a complete shutdown, it takes too much time to load all the files on the memory card. Is anyone else facing the same issue?

    Hello,
    Recently I have been having some serious issues with my Sony PRS-600. I am running Windows 7 64-bit and an updated Reader Library 3.3.
    The issue comes when transferring books from the library to the e-reader, and from the e-reader into a collection. The software becomes intolerably slow while it processes the command: the Reader Library window grays out, displays "(Not Responding)", and if clicked on fades to white and offers either "cancel operation" or "wait until the program responds". If I cancel the operation, the e-reader appears not to register it and still displays "Do not disconnect". Since I see no other way to disconnect (other than the eject option), I remove the USB plug, which causes further issues with the reader (such as removing all of my collections!).
    But anyway, that's not the main issue. The main issue is that book transfers are really slow. I have to wait a couple of minutes (or more) just for the software to respond; moving 1 MB of data takes as long as if it were 1 GB. Sometimes it is fast, and sometimes the application is best left alone while it processes a command. Simply opening My Computer and letting the e-reader's storage icons load can make Windows Explorer crash (i.e., close all windows and then reopen them). Even creating a collection can make the software slow.
    So to recap: the reader software is slow when adding and moving books.
    I hope someone will help me resolve this annoyance.
    Thank you,
    KQ

  • Hi, I'm trying to download pictures from my camera, but when I connect the Lightning to USB camera adapter I get a message that too much power is required to support the device. Can anyone help me?

    Hi, I'm trying to use the Lightning to USB camera adapter, but I get a message that "too much power is required to support the device", so I can't download my pictures. Can anyone help me?

    This discussion may help.
    It mostly depends on the camera. Some cameras have a transfer mode, which does not attempt to charge while transferring pictures. Does yours have such a setting?
    Alternatively, you can try putting a powered USB hub between the camera and the adapter so the power is provided by the hub rather than the iPad.
    Other than that, yes, a memory card reader may be a better option, assuming your camera uses Secure Digital (SD) memory cards.

  • Cannot get Genius to work - always the same error message: "Genius results can't be updated right now. The required file cannot be found." The problem also seems to be affecting iCloud and iTunes Match.

    Cannot get Genius to work - I always get the same error message: "Genius results can't be updated right now. The required file cannot be found." The problem also seems to be affecting iCloud and iTunes Match. Can anyone help me with this? What file is missing, and from where?

    This is a known problem with Windows 7 Enterprise using certain Group Policy settings. The solution for me was to use another machine running Windows 7 Ultimate.

  • I have 10.0.3 - why can I no longer export to Vimeo? Also, my QuickTime movies are not compressed; they export as the whole file.

    I have 10.0.3 - why can I no longer export to Vimeo? Also, my QuickTime movies are not compressed; they export as the whole file. This didn't happen before the update. Do I now have to get Compressor? Help.

    You should buy Compressor; it never fails. It is a must-have. I have given up on the buggy Share function and use Compressor all the time.

  • Too much compression on FCE project

    I have an FCE project that is 14.4 minutes long and 3.81 GB. When I use DVDSP to create the menu and DVD, it ends up as a 995 MB DVD. Compared to the original FCE project it looks really grainy. I upped the encoding as high as it would go, but I can't get it larger than 995 MB. Since it is the only movie on the DVD, is there any way to get DVDSP not to compress at all?

    DVD footage is most often compressed to MPEG-2, and the bitrate can vary widely - up to 9.8 Mbps if you use a replicated disc. You can also use MPEG-1 on an SD disc, which is not as high quality but lets you fit vastly more minutes of video on the same size disc.
    If you are burning a DVD on your Mac, then anecdotal evidence suggests that a video bitrate of no more than 7.4 Mbps is prudent. More than this and the disc may 'choke' in some players.
    This isn't necessarily to do with the laser so much as the reflective properties of the writable disc not being as good as a replicated one. All DVD players should be able to cope with 9.8 Mbps from a DVD-Video disc (although sadly, and surprisingly, not all do), but keep in mind that a DVD-R is not a DVD-Video disc.
    If your pristine footage comes out grainy when you encode it, then you really ought to check the settings you are using. Are the source footage and lighting good quality? Have you applied any filters or effects? Are you using a good-quality encoder?
    The final size of the file will vary according to the encoding used: the higher the bitrate, the larger the file. Audio adds to the overall bitrate on a disc too, so using AC3 (Dolby Digital) is a very good idea - AC3 files have a smaller size and lower bitrate, which lets you raise the video bitrate to gain quality.
    Be aware that the visual difference between footage encoded at around 7.4 Mbps and footage encoded at 8 or even 9 Mbps is so slight that most people won't notice. There is a law of diminishing returns in operation - you gain very little by encoding at much higher levels, but you will prevent some players from playing your disc reliably.
    If you've not yet used Compressor, you really ought to look at it. If your results are already coming from Compressor, then it is time you looked at BitVice and, if necessary, used its DVNC setting to reduce noise in the final output. There are other encoders, but none so easy to use as BitVice, IMO.
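    As a sanity check on the 995 MB figure, note that an MPEG-2 file's size is just duration times bitrate. A minimal sketch (the ~9 Mbps video cap and 192 kbps audio rate are assumptions about the project's settings, not stated in the post):

```java
// Why a 14.4-minute movie tops out near 995 MB: size = duration x bitrate.
class DvdSize {

    /** Expected stream size in decimal megabytes for a duration (s) and bitrates. */
    static double expectedSizeMB(double durationSec, double videoMbps, double audioKbps) {
        double bitsPerSec = videoMbps * 1_000_000 + audioKbps * 1_000;
        return durationSec * bitsPerSec / 8 / 1_000_000;
    }

    public static void main(String[] args) {
        // 14.4 minutes at an assumed ~9.0 Mbps video cap plus 192 kbps AC3 audio.
        double mb = expectedSizeMB(14.4 * 60, 9.0, 192);
        System.out.printf("Expected size: ~%.0f MB%n", mb);
    }
}
```

    Under those assumed settings, the result lands within a few megabytes of the 995 MB observed, which suggests DVDSP was already encoding at or near its maximum bitrate; the graininess would then come from the source or the encoder, not from a size cap you can raise.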

  • I'm trying to do a book using iPhoto 11 (9.4.2), but I don't get a low resolution warning. Why, and how can I get the warning?

    I'm trying to do a book using iPhoto, but it no longer gives a low resolution warning like it used to in older versions.
    I don't know why it doesn't anymore (or maybe I have to add it somehow?), but it makes it difficult to know how much I can zoom in on a picture and still have good printable resolution.
    Does anyone know how to resolve this, or how to add the warning back in the latest version of iPhoto?
    Thanks

    With iPhoto 9 the low-resolution warning for photos has been dropped. The only warning you will get now is for text that has overflowed its text box.
    As Larry suggested, send a feature request to Apple via http://www.apple.com/feedback/iphoto.html to get it back. That's a big omission, in my opinion.
    You can determine the minimum resolution of your images (if all were taken with the same camera) by dividing the pixel dimensions by the size of the largest frame in the book, which would be 8.5 x 11 inches. If the result is at or above 150 dpi, you'll be above Apple's previous resolution warning limit.
    OT
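    The division above is all the dropped warning was really doing, so you can run the check yourself. A small sketch (the pixel dimensions are made-up examples):

```java
// Effective print resolution: pixels divided by printed size in inches.
class PrintDpi {

    static double effectiveDpi(int pixels, double inches) {
        return pixels / inches;
    }

    public static void main(String[] args) {
        // e.g. a 3000 x 2400 pixel photo filling an 11 x 8.5 inch page
        double dpi = Math.min(effectiveDpi(3000, 11.0),   // long edge
                              effectiveDpi(2400, 8.5));   // short edge; the worse axis governs
        System.out.printf("Effective resolution: ~%.0f dpi%n", dpi);
    }
}
```

    Anything that keeps both axes at or above 150 dpi would have passed the old warning; zooming in crops pixels away and lowers the effective dpi accordingly.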

  • How many lines of code are too many inside a class file?

    Hi,
    I was wondering how many lines can be considered too many in a Java class file. I have seen class files with more than two thousand lines of code, and even more than three thousand. With that many lines, I've noticed the file becomes slow to edit and is not as readable as it should be. So, as a best practice, or even due to technical issues, roughly how many lines of code should we write per class to keep it readable and avoid compile and build issues? How many lines can a class have without putting our code and performance at risk?
    thanks a lot

    My general rule of thumb: everything in a class should be very cohesive, that is to say, you should be focusing on one idea in your class. IMO, if you have a class for users, you need to be consistent about just what that class does. When the class becomes large, look for related themes running through its methods, and if there are many, make a "related class" that handles satellite functions of the "core" offered in the main user class.
    Thousands of lines do not a good class make. I break into related categories of classes long before then, and as a rule I keep a class under 500 lines - usually under 300. In any case, no more lines of code need go into a class than are needed to implement the core features of that class's main idea - and no fewer, either; don't relegate a method to another class just because you feel it would make the class it belongs to too big.
    Along the idea of the user class:
    Storage for the user data
    Home information
    Work information
    Physical information
    Tastes and preferences
    Family
    etc.
    All of these could be classes in and of themselves, but they are definitely closely related to the user class.
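    A minimal sketch of that decomposition (all class and field names here are illustrative, not from any real codebase):

```java
// Keep User focused on one idea and delegate satellite themes to small,
// closely related classes. Names and fields are illustrative only.
class HomeInfo {
    private final String address;
    HomeInfo(String address) { this.address = address; }
    String address() { return address; }
}

class WorkInfo {
    private final String employer;
    WorkInfo(String employer) { this.employer = employer; }
    String employer() { return employer; }
}

class User {
    private final String name;
    private final HomeInfo home;   // satellite theme: home information
    private final WorkInfo work;   // satellite theme: work information

    User(String name, HomeInfo home, WorkInfo work) {
        this.name = name;
        this.home = home;
        this.work = work;
    }

    String name() { return name; }
    HomeInfo home() { return home; }
    WorkInfo work() { return work; }
}
```

    Each file stays comfortably under a few hundred lines, and adding a new theme (family, preferences, etc.) means adding a new small class rather than growing User.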

  • HT3918 What is the effect of too much memory, that is, over the recommended amount?

    What is the effect of too much RAM? Is it likely to do any damage?
    brendan

    Welcome to Apple Support Communities
    Your Mac won't detect the RAM if you install more than your computer supports.

  • Windows - too much RAM?  Is this the root cause of many problems

    I know this is going to sound counterintuitive, but I have reason to believe that there are problems with Windows systems with large amounts of memory (4 GB, or perhaps somewhat less).
    Why? I just recently bought a new notebook - fully maxed out for performance in every conceivable way - including 4 GB of RAM.
    But since the first day, I noticed a problem with a particular piece of pre-installed software: MSVCR80.DLL would keep crashing every time specific programs started to run. I liaised with the vendor and went through all the usual troubleshooting, plus reinstalled Windows from both the factory config and the original disks more times than I care to remember, but the problem remained. They replaced all the hardware, and the problem continues. The vendor tried to reproduce the problems internally but couldn't. (Note: none of these problems are Lightroom problems - I haven't even installed Lightroom on this new machine yet.)
    About the same time I noticed a couple of mentions of this same DLL in a number of threads on this site:
    http://www.adobeforums.com/cgi-bin/webx?7@@.3bc44c18/3
    http://www.adobeforums.com/cgi-bin/webx/.3bc489fe
    http://www.adobeforums.com/cgi-bin/webx/.3bc4644c/10
    Reading these, you see people with 3 or 4 GB of RAM also experiencing weird problems.
    Last night I was working through what could be different and decided to remove one of the two 2 GB RAM modules from my new computer - taking it down to (only) 2 GB of RAM - and suddenly the software I was having problems with works correctly. Reinstall the second RAM module and it stops working, crashing in the same way it did initially. So in this case, the problem is definitely having this much RAM.
    So as a test, it would be interesting to see whether some of the people experiencing problems in Lightroom (on Windows) with over 2 GB of memory could reduce the amount of installed memory and see if the Lightroom problems go away as well.
    I'm also debating whether the problem is actually a software problem in MSVCR80.DLL, or whether there is something more sinister in the hardware. 4 GB of RAM doesn't give 4 GB of usable RAM, because various hardware (video cards, for instance) is mapped into memory - on my new notebook, Vista reported 3.5 GB usable. If you search the web you'll find Vista users with 4 GB of RAM showing less than 3 GB usable, so there are definitely hardware dependencies at work here: some machines with 4 GB of RAM might work, and some with different hardware configs might not.
    It will be interesting to see what results people get with less memory.

    Hi everyone
    I'd like to thank everyone for trying to help me with this problem, but that's not why I posted the original message.
    The problem I'm experiencing is not bad hardware or a faulty software installation. It is a 100% reproducible problem that appears when there is 4 GB of RAM in the machine and doesn't when there is 2 GB - with no other change except the memory.
    The old DLL conflict issue is apparently not much of an issue with this file these days, because each application can carry a manifest describing the precise versions of its supporting files; Windows maintains these separate versions and presents the appropriate one to each app. Regardless, this doesn't behave like a DLL conflict, because the only change is the amount of memory.
    I agree that 32-bit operating systems should not have a problem with 4 GB of RAM, but clearly in my case there is one. And in the Lightroom topics I cited initially, the others experiencing similar issues with the same DLL also had machines with over 2 GB of RAM.
    It may not be that all machines with over 2 GB of RAM have problems; there may be a unique interaction between various pieces of hardware / BIOS / installed software that causes it on specific configurations. The problem may or may not lie with the actual DLL that reports the crash - I just don't know.
    The reason I posted initially was that while investigating an unrelated application, I happened upon a strong correlation of specific problems with this DLL across multiple different applications (the security application where I noticed the problem, reports here in Lightroom, and if you Google you'll find references to a large range of other applications, including PS3). The common theme is that everyone who has had these problems appears to have had a high-end machine with over 2 GB of RAM. Further, in the case of the security application I'm having problems with, I have proof that the problem disappears if you reduce the memory.
    My intent was really just to hone the number of scenarios where we know this problem either occurs or doesn't. I'm curious as to:
    - anyone who has over 2 GB of RAM and is having problems with Lightroom: see whether the same problems occur if you reduce the amount of memory in your system. There are certainly some people on these forums who have been very vocal about problems - I just want to see if the Lightroom problems these people are experiencing are a symptom of some more fundamental issue.
    - anyone who has over 2 GB of RAM but is operating successfully with no problems whatsoever (forums have a selection bias whereby you predominantly hear from people who have problems) - I'd like to balance this out by having people with no problems also jump into the discussion.
    - anyone who has less than 2 GB of RAM and has had Lightroom report problems with MSVCR80.DLL (this is the control).
    If we can narrow down the circumstances in which it occurs, it will be much easier for the developers to reproduce, trace, and eliminate the problem. But if everyone just keeps saying "it doesn't work", we'll never get anywhere.
    Thanks
    Greg

  • Too much compression in CS5?

    My camcorder puts out AVCHD files. So if they are imported into CS5, are they then compressed to MPEG-4 when a rendered file is made? Someone told me that Elements and Premiere Pro CS5 do this?

    As far as I know, PPro and PreEl do not output AVCHD... so yes, editing AVCHD and outputting does result in compressing to another codec.
    AVCHD direct to BluRay http://forums.adobe.com/message/2785066?tstart=0
    AND a possible workaround http://forums.adobe.com/thread/706465?tstart=0

  • Cache query results in too much garbage collection activity

    Oracle Coherence Version 3.6.1.3 Enterprise Edition: Production mode
    JRE 6 Update 21
    Linux OS 64 bit
    The application uses a Customer object with the following structure:
    Customer(CustID, FirstName, LastName, CCNumber, OCNumber)
    Each property of Customer is an inner class with getValue as one of its methods, returning a value. The getValue method of CCNumber and OCNumber returns a Long. There are 150m instances of Customer in the cache; to hold this much data we are running several nodes on 2 machines.
    The following code is used to create indexes on CCNumber and OCNumber:

        ValueExtractor[] valExt = new ValueExtractor[]{
                new ReflectionExtractor("getCCNumber"), new ReflectionExtractor("getValue")};
        ChainedExtractor chExt = new ChainedExtractor(valExt);
        Long value = new Long(0);
        Filter f = new NotEqualsFilter(chExt, value);
        ValueExtractor condExtractor = new ConditionalExtractor(f, chExt, true);
        cache.addIndex(condExtractor, false, null);

    The client code queries the cache as follows:

        ValueExtractor[] valExt1 = new ValueExtractor[]{
                new ReflectionExtractor("getCCNumber"), new ReflectionExtractor("getValue")};
        ChainedExtractor chExt1 = new ChainedExtractor(valExt1);
        EqualsFilter filter1 = new EqualsFilter(chExt1, ccnumber);
        ValueExtractor[] valExt2 = new ValueExtractor[]{
                new ReflectionExtractor("getOCNumber"), new ReflectionExtractor("getValue")};
        ChainedExtractor chExt2 = new ChainedExtractor(valExt2);
        EqualsFilter filter2 = new EqualsFilter(chExt2, ocnumber);
        AnyFilter anyFilter = new AnyFilter(new Filter[]{filter1, filter2});
        cache.entrySet(anyFilter);

    The observation is that for 20 client threads the application performs well (average response time = 200 ms), but as the number of client threads increases, performance degrades disproportionately (queries take anywhere between 1000 ms and 8000 ms for 60 threads). I think this is because the eden space fills up very fast as the number of client threads goes up. The number of collections per second rises with the thread count: there are 2-3 ParNew collections every second with 60 client threads, versus only 1 per second with 20. Even a 100-200 ms pause degrades overall query performance.

    Hi Coh,
    The reason for so much garbage is that you are using ReflectionExtractors in your filters; I assume you do not have any indexes on your caches either. This means that each time you execute a query, Coherence has to scan the cache for matches to the filter - like a full table scan in a DB. For each entry in the cache, Coherence has to deserialize that entry into a real object and then call the methods in the filters via reflection. Once the query is finished, all these deserialized objects are garbage that needs to be collected. For a big cache, that can be a lot of garbage.
    You can switch to POF extractors to skip the deserialization step, which should reduce the garbage quite a bit, although not eliminate it. You could also use indexes, which should eliminate pretty much all of the garbage you are seeing during queries.
    JK
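    A sketch of what the POF-extractor version of the same query could look like (the POF property indexes below are assumptions - use whatever indexes your Customer class actually serializes CCNumber and OCNumber with, and note this requires Customer to be POF-serialized in the first place):

```java
import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;
import com.tangosol.util.Filter;
import com.tangosol.util.ValueExtractor;
import com.tangosol.util.extractor.PofExtractor;
import com.tangosol.util.filter.AnyFilter;
import com.tangosol.util.filter.EqualsFilter;

public class PofQuerySketch {
    // Assumed POF indexes for the two properties; adjust to your serializer.
    private static final int POF_CC_NUMBER = 3;
    private static final int POF_OC_NUMBER = 4;

    public static void main(String[] args) {
        NamedCache cache = CacheFactory.getCache("customers");

        // PofExtractor reads the value straight from the binary POF stream,
        // so no Customer object is deserialized during index builds or queries.
        ValueExtractor ccExt = new PofExtractor(Long.class, POF_CC_NUMBER);
        ValueExtractor ocExt = new PofExtractor(Long.class, POF_OC_NUMBER);

        // Index both properties so the query becomes an index lookup, not a scan.
        cache.addIndex(ccExt, false, null);
        cache.addIndex(ocExt, false, null);

        Long ccnumber = 1234L, ocnumber = 5678L; // example lookup values
        Filter query = new AnyFilter(new Filter[]{
                new EqualsFilter(ccExt, ccnumber),
                new EqualsFilter(ocExt, ocnumber)});
        cache.entrySet(query);
    }
}
```

    The same extractor instances are used for both the index and the filters, so the query can be satisfied from the index without touching the serialized entries.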

  • 3000 old emails back to 2009 came along for the ride when I transferred aol address book to Mac address book. All attempts to delete en masse have no results. Too much stuff. Procedures?

    3000 old emails, some as far back as 2009, came along for the ride when I transferred my AOL address book to the iMac Address Book. I can't find the magic key to permanently delete them. Any ideas on procedures?

    I'd try deleting 50 at a time - 3000 at once is a stretch.
    By the way, double posting is not at all helpful.

  • 0HR_PT_3 load is taking too much time when I execute the InfoPackage

    Dear All,
    I am working on the HR module.
    I am extracting data from 0HR_PT_3 from the R/3 PRD system through an InfoPackage, and it is taking a long time to load the data.
    I checked RSA3 in the R/3 PRD system; whenever I extract 0HR_PT_3, instead of showing records it shows this error message: "Infotype 2002 not read due to lack of authorization".
    Could anyone who has resolved this issue please let me know?
    Thanks,
    Venkat.
    Edited by: venkatnaresh7 on Jul 12, 2011 11:49 AM

    Hi,
    I am facing the same issue. Were you able to fix this?
