Best approach to merge .mov movies?

Hello there,
I have some movies I'd like to merge into a single one while maintaining as much of their current quality as possible, and I'd like to see if anybody has a recommended approach. For context, I currently have FCE4 and the iLife suite v9, but no other tools. However, I'm open to suggestions for additional tools if necessary.
For example, I have a slideshow created with iPhoto and saved as a movie in a .m4v file (MPEG-4 video file; codecs: AAC, H.264; dimensions: 960x540) and a QuickTime movie created with LiveType (QuickTime movie; codecs: Animation; dimensions: 720x486).
Regards,
MV

Hi Miguel,
The goal would be to see if we could get the two movies as close as possible to a sequence size that we could use to combine the two in FCE. I've never made a slideshow within iPhoto, but what frame-sizes does it let you choose from when you export it from the program? My preference has always been arranging slideshows in FCE or iMovie to give you as complete control of the process and compression as possible.
As for the LiveType project, go to +Edit > Project Properties+ and select the *NTSC DV 3:2* preset so that the frame-size changes to 720x480. Make any tweaks needed to your project to make sure everything fits into this new frame-size. Do not export the project as a movie; you can simply import it into FCE as a project file, and everything should work fine.
Within FCE, create a new DV-NTSC sequence and import your LiveType _Project File_ as well as your iPhoto slideshow movie file. Arrange both of them in the DV-NTSC sequence to your liking, then press Option-R and go to +Sequence > Render Only > Mixdown Audio+ to render the sequence and prepare the audio for export. If you like how it looks, go to +File > Export > QuickTime Movie+ to export the sequence as a full-quality QuickTime file of the original sequence.
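Since the original poster is open to additional tools, a command-line alternative worth mentioning is the free ffmpeg utility, which can do a similar conform-then-combine job. The sketch below uses made-up file names, and the exact scale/pad settings depend on your clips: the idea is to re-encode only the LiveType clip to match the slideshow's 960x540 H.264/AAC parameters, then concatenate by stream copy so the slideshow suffers no second generation of compression.

```shell
# Hypothetical file names: slideshow.m4v is the iPhoto export,
# titles.mov is the LiveType export.
cat > concat.txt <<'EOF'
file 'slideshow.m4v'
file 'titles_conformed.mp4'
EOF

# Only attempt the conversion when ffmpeg and the source clip exist.
if command -v ffmpeg >/dev/null 2>&1 && [ -f titles.mov ]; then
  # Conform the LiveType clip: scale into 960x540, padding to keep
  # its aspect ratio, and encode to H.264/AAC like the slideshow.
  ffmpeg -i titles.mov \
    -vf "scale=960:540:force_original_aspect_ratio=decrease,pad=960:540:(ow-iw)/2:(oh-ih)/2" \
    -c:v libx264 -crf 18 -c:a aac titles_conformed.mp4
  # Concatenate with stream copy (no further re-encode). This works
  # only when both files share codec parameters; otherwise drop
  # "-c copy" and accept one re-encode of the combined output.
  ffmpeg -f concat -safe 0 -i concat.txt -c copy merged.mov
fi
```

The stream-copy concatenation preserves quality, but it only succeeds when both inputs genuinely match in codec, frame size, and frame rate, which is why the conform step comes first.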

Similar Messages

  • Best approach to upgrade MaxDB AND move to new OS release

    What would be the best approach to upgrade Contentserver MaxDB 7.5 on W2K server (32-Bit) to MaxDB 7.7 on Windows 2008 Server x64 (R2?)?
    1.  a) Upgrade MaxDB on the existing installation
         b) Create a complete backup
         c) Install latest release on new machine
         d) Recover with backup from existing installation on new machine
    2. a) Create a complete backup from existing installation
        b) Install same release on new machine
        c) Recover with backup from existing installation on new machine
        d) Upgrade to latest release on new machine
    Both approaches share the problem that you have to run an OS/DB release combination that is not released according to the SAP PAM. The second approach has the advantage that the source system will not be touched and is still available in its original state if something fails. But the main issue is the compatibility question (7.5 is not released for W2008, and 7.7 is not released for W2K).
    Any suggestions?
    Thanks,
    Matthias

    Natalia,
    thanks for your answer.
    > Do you have Content Server 6.40 ?
    yes.
    > the MAXDB version 7.5 is not released on W2008
    neither is 7.7 on W2K.
    > I recommend to go with 1 option.
    Which would be this one:
    >>1. a)Upgrade MaxDB on the existing installation
    So can I upgrade to 7.7 on W2K though it is not released for this combination?
    Best Regards,
    Matthias

  • Best approach to creating a TOC for product catalog using data merge

    What is the best approach for creating a TOC for a product catalog (over 1,000 items) using Data Merge?
    The TOC would contain the product Categories. 
    So for example, Category A items could go from pages 1 - 3, and Category B items would start at pg 4, but if new items were added to Category A, then Category B may start from pg 6. 
    From the Data Source, there are 5 Data Fields I've chosen to be displayed.  If this were a regular digital print document, I could use the Paragraph Style method for creating a TOC, but if I make any one of the Data Fields a certain Paragraph Style and use that for the TOC, it'll populate the TOC with that Data Field for all the items. 
    Any suggestions?

    Peter Spier wrote:
    TOC is not interactive in the ID file, though it can be in a PDF that you export (there's a checkbox to create PDF bookmarks). You might want to think about using Cross-references (rather than hyperlinks, I think) to build the TOC. You have to do it manually, but once done it should maintain itself, whereas a TOC is built automatically, but must be regenerated after you edit the doc.
    One caveat with TOCs created from cross-references: although changing the text of an x-ref source paragraph (for example, from "Patatas and tamatas" to "Tomatoes and Potatoes"), or having the source paragraph flow to the next or previous page, updates automatically (or when invoking "Update cross-references"), MOVING a cross-reference source paragraph to a location before or after another source paragraph does not change their sequence in the pseudo-TOC. You'll need to manually move the reference in the pseudo-TOC to the correct position in the sequence of cross-refs. So, put the task of checking the order of x-refs in the pseudo-TOC on your before-hand-off checklist.
    HTH
    Regards,
    Peter
    Peter Gold
    KnowHow ProServices

  • HT1364 I just bought a new PC and now have ample space on my C drive to house my music library, which is currently installed on an external drive. What is the best way to install iTunes and move the iTunes library to my C drive?

    I just bought a new PC and now have ample space on my C drive to house my music library, which is currently installed on an external drive. What is the best way to install iTunes and move the iTunes library to my C drive?

    If the entire library is on the external drive then simply copy the iTunes folder into <User's Music> on the new computer, then install iTunes. If you've already installed iTunes you will want to remove the empty iTunes folder in <User's Music> first.
    If it turns out you only have the media folder on the external drive then take a look at this post...
    tt2
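    The copy step tt2 describes can be illustrated with throwaway paths. This shell sketch just mirrors "copy the iTunes folder into <User's Music>" using stand-in directories; on the real machine the source would be the external drive and the destination the new computer's Music folder.

```shell
# Stand-in paths: on the real machine SRC would be the external
# drive's iTunes folder and DEST the new computer's Music folder.
SRC="/tmp/demo_external/iTunes"
DEST="/tmp/demo_music"
mkdir -p "$SRC/iTunes Media" "$DEST"
touch "$SRC/iTunes Library.itl"    # the library database file

# Copy the whole iTunes folder, keeping its internal layout intact,
# so the restored library still finds its media.
cp -R "$SRC" "$DEST/"
```

    Copying the entire folder (database plus media) is what lets iTunes pick up the library unchanged on the new machine.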

  • I need help to decide which MacBook Pro is best for photo editing, editing movies, and doing all the rest too, like Excel, Word, etc. 13"

    I need help to decide which MacBook Pro is best for photo editing, editing movies, and doing all the rest too, like Microsoft Office products (Excel, Word, etc.). I am new to the Apple world and have liked the idea of the MacBook Pro 13", but I really don't know if this is good enough or if the computer will soon be overwhelmed.
    13-inch: 2.6GHz
    with Retina display
    Specifications
    2.6GHz dual-core Intel Core i5
    Turbo Boost up to 3.1GHz
    8GB 1600MHz memory
    512GB PCIe-based flash storage
    Intel Iris Graphics
    Built-in battery (9 hours)

    That's a fine machine and, with 8GB of RAM and 512GB of flash storage, should serve you well for light video/photo editing as well as for 'normal' usage. And it should last you for years to come.
    Good luck in making your decision!!
    Clinton

  • Images merging and movies merging

    Hello!
    I am new to the Java Media Framework, and I'm sorry that my English is really poor.
    I read articles , like :
    [Generating a Movie File from a List of (JPEG) Images|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/JpegImagesToMovie.html]
    [Merging Tracks from Multiple Inputs|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Merge.html]
    I'm on Ubuntu 9.04 i386 Desktop, using JDK 1.6.0_16 and [JMF 2.1.1e Software|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/download.html].
    [ Question 1 ]
    I want to merge images into .mov files, two images per file:
    $ java JpegImagesToMovie -w 320 -h 240 -f 1 -o file:1.mov file:foo1.jpg file:foo2.jpg
    $ java JpegImagesToMovie -w 320 -h 240 -f 1 -o file:2.mov file:foo3.jpg file:foo4.jpg
    $ ...
    The resulting 1.mov and 2.mov work correctly. Then I want to merge the 2 mov files into 1 mov file:
    $ java Merge -o file:result.mov file:1.mov file:2.mov
    The file size of result.mov is almost the sum of 1.mov and 2.mov, but I cannot play it correctly; result.mov only shows the content of 1.mov.
    The actual commands:
    $ java -classpath jmf.jar:. JpegImagesToMovie -w 691 -h 299 -f 1 -o file:a.mov /images/1.jpg /images/2.jpg
    $ java -classpath jmf.jar:. JpegImagesToMovie -w 691 -h 299 -f 1 -o file:b.mov /images/3.jpg /images/4.jpg
    $ java -classpath jmf.jar:. Merge -o file:result.mov file:a.mov file:b.mov
    [ Question 2 ]
    I use [Writing AVI videos using pure Java|http://blog.hslu.ch/rawcoder/2008/08/12/writing-avi-videos-using-pure-java/] to merge images (jpg) into an avi file, and it works correctly. The type is MJPG/MJPEG.
    As in [Question 1], 1.avi and 2.avi are made, and then I can use the avimerge tool to merge them into result.avi, and it works correctly.
    But I cannot merge them with [Merging Tracks from Multiple Inputs|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Merge.html] even after I change the code of Merge.java:
    @49 : String videoEncoding = "MJPG";
    @51 : String outputType = FileTypeDescriptor.MSVIDEO;
    which is according to [Class VideoFormat|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/apidocs/javax/media/format/VideoFormat.html] and [Class FileTypeDescriptor|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/apidocs/javax/media/protocol/FileTypeDescriptor.html].
    The actual command:
    $ java -classpath jmf.jar:. Merge -o file:result.avi file:a.avi file:b.avi
    but I cannot play it correctly.
    Thanks for your help !!

    hello.there wrote:
    hi shadowLife ,
    I am also a newbie to this field. Currently I am working on a video conference tool using JMF. I also need to merge multiple audio files, and I found it is possible.
    One of my friends tried to merge two video files, but the problem is that when you play them, only the first video plays. As these videos are stored as multiple tracks, only one track is played at a time. If you want to see the other video, there should be an option for selecting the video track in your player, as there is in VLC.
    So I don't know whether merging two video tracks into a single video track is possible or not (I feel it is not possible), because a video stream is different from an audio stream. Merging two video streams together doesn't make sense the way it does with audio streams. Thus, if a video has more than one video stream running concurrently, it'd be for something like "alternate angles" on a DVD, or perhaps security camera software combining all of the video feeds into a single file like that.
    The OP was wanting to concatenate video files, not merge them: one right after the other. This concept makes a lot more sense for most applications.
    One way to do this is to play both videos on some panel and capture the screen (I heard that is possible in Java), then create a video from the captured images. It is possible to do that, but you would never want to; there are more problems with that idea than I can count, stemming from the fact that it is a very, very hacky solution.
    Certainly there are instances where this might be necessary, but anything like that would absolutely be a last-ditch effort and could never be included in any sort of publishable software...
    Hope this information will help you.
    Maybe I am wrong, so I'm hoping for others' replies on this; it will also help me make my fundamentals and knowledge strong. Could be wrong, and mostly we are ;-) But it's a learning experience for everyone, eh?

  • Best Program for editing quicktime movies

    What is the best program for editing QuickTime movies?
    For example, I have several QuickTime movies in the 4:3 aspect ratio, but the video is meant to be 16:9, so there is black on the top and bottom. I want to be able to crop out the black and make the video file a true 16:9 widescreen video, because my laptop has a widescreen 16:10 LCD, and when playing back these videos you end up with a couple inches of black on the top and bottom, and right and left, so a lot of screen space gets wasted. The only programs I have found so far that can make a 16:9 QuickTime video have not supported the H.264 format, and only stretch the video to 16:9 rather than cropping it, so the black remains on the top and bottom, the aspect ratio is off, and everyone looks short and fat because it's not in its native aspect ratio.
    Does anyone have any advice on what program I should use?
    Also, does anyone have any recommendations for how to convert wmv files to QuickTime/iTunes videos? I have found one program that does so, but the output comes out very blocky and the audio is out of sync with the video by a good five seconds.
    Thanks

    MPEG Streamclip (free) can crop and Save As, so you wouldn't need to add any new compression to the existing files.
    They make a Windows version now (it used to be Mac only).
    No tips on converting WMV files, but you'll find dozens of solutions in the iPod Discussions pages.
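    As an alternative to MPEG Streamclip, the same crop can be computed and applied with the free ffmpeg tool. This is only a sketch with made-up file names; the arithmetic is the useful part. For a 640x480 letterboxed source whose picture is really 16:9, the visible picture occupies 640x360, leaving 60-pixel black bars top and bottom:

```shell
# Compute the crop rectangle for a letterboxed 4:3 file whose
# picture content is really 16:9. The frame size is an example.
W=640; H=480
CH=$((W * 9 / 16))        # visible picture height: 640*9/16 = 360
OFF=$(( (H - CH) / 2 ))   # black bar height top and bottom: 60
echo "crop=${W}:${CH}:0:${OFF}"

# Apply it only if ffmpeg and a source file are actually present.
if command -v ffmpeg >/dev/null 2>&1 && [ -f input.mov ]; then
  ffmpeg -i input.mov -vf "crop=${W}:${CH}:0:${OFF}" \
    -c:v libx264 -crf 18 -c:a copy output.mov
fi
```

    Unlike stretching, cropping removes the bars and keeps the picture at its native proportions, which is exactly what the poster was after.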

  • Joining or merging two movie files

    If I want to join or merge two movie files (.m4v) into one, can I use one of the included applications on my iMac (iMovie, QuickTime, etc) or do I need a 3rd party app? If so, could you recommend one please?
    Thanks!

    On a related note-
    I have imported my "Das Boot" onto my drive, so I don't have to pull out my discs. Food for thought before I go any further: the files show up as ".dvdmedia". Since this version is the uncut 4-hour version, it is on two different discs, which converted into two files. How do I, if possible, make these into one file? I also have, and am planning to import, "Ben Hur". I tried to use iMovie (I have iLife '09) and it said I needed to "upgrade" to iMovie 3 (no luck with that one).
    Any legal suggestions?

  • How to merge two movie files into 1 in iTunes?

    Can anyone tell me how to merge two movie files into one in iTunes? Sorry, maybe this is an easy and dumb question for you, but I have totally no idea. I downloaded some movies in Windows and used Kigo Video Converter to convert them into mp4 format so I can watch them through the new Apple TV on my big-screen TV. Now I am thinking maybe I can upgrade my experience a bit more. Some movies are downloaded as 2 or 3 separate files, and I have to choose and click each one while watching. Is there any software that can merge those separate files into one single mp4 file? I wish you could give me several options to choose from. Freeware is better. Thanks in advance.

    Thanks. I've tried the simplemovies application, but it seems too complicated for me. I read some articles online, bought QuickTime Pro ($30), and finally linked the 2 mp4 files together. So easy. The only issue is I had to pay $30.

  • Best approach to do Range partitioning on Huge tables.

    Hi All,
    I am working on 11gR2 oracle 3node RAC database. below are the db details.
    SQL> select * from v$version;
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE 11.2.0.3.0 Production
    TNS for Linux: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    In my environment we have 10 big transaction tables (10 billion rows each) and they keep growing. Now management is planning to do a range partition based on the created_dt partition key column.
    We tested this partitioning strategy with a few million records in another environment with the steps below.
    1. CREATE TABLE TRANSACTION_N
    PARTITION BY RANGE ("CREATED_DT")
    ( PARTITION DATA1 VALUES LESS THAN (TO_DATE('2012-08-01 00:00:00', 'YYYY-MM-DD HH24:MI:SS')) TABLESPACE &&TXN_TAB_PART1,
    PARTITION DATA2 VALUES LESS THAN (TO_DATE('2012-09-01 00:00:00', 'YYYY-MM-DD HH24:MI:SS')) TABLESPACE &&TXN_TAB_PART2,
    PARTITION DATA3 VALUES LESS THAN (TO_DATE('2012-10-01 00:00:00', 'YYYY-MM-DD HH24:MI:SS')) TABLESPACE &&TXN_TAB_PART3 )
    AS (SELECT * FROM TRANSACTION WHERE 1=2);
    2. Exchange partitions to move data from the old table into the new partitioned table.
    ALTER TABLE TRANSACTION_N
    EXCHANGE PARTITION DATA1
    WITH TABLE TRANSACTION
    WITHOUT VALIDATION;
    3. create required indexes (took almost 3.5 hrs with parallel 16).
    4. Rename the table names and drop the old tables.
    This took around 8 hrs for one table with 70 million records, so for billions of records it will take much longer. But the problem is that we get only 2 to 3 hrs of downtime in production to implement these changes for all tables.
    Can you please suggest the best approach to copy that much data from the existing table to the newly created partitioned table and create the required indexes?
    Thanks,
    Hari

    >
    In my environment we have 10 big transaction tables (10 billion rows each) and they keep growing. Now management is planning to do a range partition based on the created_dt partition key column.
    We tested this partitioning strategy with a few million records in another environment with the steps below.
    1. CREATE TABLE TRANSACTION_N
    PARTITION BY RANGE ("CREATED_DT")
    ( PARTITION DATA1 VALUES LESS THAN (TO_DATE('2012-08-01 00:00:00', 'YYYY-MM-DD HH24:MI:SS')) TABLESPACE &&TXN_TAB_PART1,
    PARTITION DATA2 VALUES LESS THAN (TO_DATE('2012-09-01 00:00:00', 'YYYY-MM-DD HH24:MI:SS')) TABLESPACE &&TXN_TAB_PART2,
    PARTITION DATA3 VALUES LESS THAN (TO_DATE('2012-10-01 00:00:00', 'YYYY-MM-DD HH24:MI:SS')) TABLESPACE &&TXN_TAB_PART3 )
    AS (SELECT * FROM TRANSACTION WHERE 1=2);
    2. Exchange partitions to move data from the old table into the new partitioned table.
    ALTER TABLE TRANSACTION_N
    EXCHANGE PARTITION DATA1
    WITH TABLE TRANSACTION
    WITHOUT VALIDATION;
    3. create required indexes (took almost 3.5 hrs with parallel 16).
    4. Rename the table names and drop the old tables.
    This took around 8 hrs for one table with 70 million records, so for billions of records it will take much longer. But the problem is that we get only 2 to 3 hrs of downtime in production to implement these changes for all tables.
    Can you please suggest the best approach to copy that much data from the existing table to the newly created partitioned table and create the required indexes?
    >
    Sorry to tell you, but that test and partitioning strategy is essentially useless and won't work for your entire table anyway. One reason is that if you use the WITHOUT VALIDATION clause, you must ensure that the data being exchanged actually belongs to the partition you are putting it in. If it doesn't, you won't be able to re-enable or rebuild any primary key or unique constraints that exist on the table.
    See Exchanging Partitions in the VLDB and Partitioning doc
    http://docs.oracle.com/cd/E18283_01/server.112/e16541/part_admin002.htm#i1107555
    >
    When you specify WITHOUT VALIDATION for the exchange partition operation, this is normally a fast operation because it involves only data dictionary updates. However, if the table or partitioned table involved in the exchange operation has a primary key or unique constraint enabled, then the exchange operation is performed as if WITH VALIDATION were specified to maintain the integrity of the constraints.
    If you specify WITHOUT VALIDATION, then you must ensure that the data to be exchanged belongs in the partition you exchange.
    >
    Comments below are limited to working with ONE table only.
    ISSUE #1 - ALL data will have to be moved regardless of the approach used. This should be obvious, since your current data is all in one segment but each partition of a partitioned table requires its own segment. So the nut of partitioning is splitting the existing data into multiple segments, almost as if you were splitting it up and inserting it into multiple tables, one table for each partition.
    ISSUE #2 - You likely cannot move that much data in the 2-to-3-hour window you have available for downtime, even if all you had to do was copy the existing datafiles.
    ISSUE #3 - Even if you can avoid issue #2, you likely cannot rebuild ALL of the required indexes in whatever remains of the outage window after moving the data itself.
    ISSUE #4 - Unless you have conducted full-volume performance testing in another environment prior to doing this in production, you are taking on a tremendous amount of risk.
    ISSUE #5 - Unless you have fully documented the current, actual execution plans for your most critical queries in your existing system, you will have great difficulty overcoming issue #4, since you won't have the requisite plan baseline to know whether the new partitioning and indexing strategies are giving you equivalent, or better, performance.
    ISSUE #6 - Things can, and will, go wrong and cause delays no matter which approach you take.
    So assuming you plan to take care of issues #4 and #5 you will probably have three viable alternatives:
    1. Use DBMS_REDEFINITION to do the partitioning online. See the Oracle docs and this example from oracle-base for more info.
    Redefining Tables Online - http://docs.oracle.com/cd/B28359_01/server.111/b28310/tables007.htm
    Partitioning an Existing Table using DBMS_REDEFINITION
    http://www.oracle-base.com/articles/misc/partitioning-an-existing-table.php
    2. Do the partitioning offline and hope that you don't exceed your outage window. Recover by continuing to use the existing table.
    3. Do the partitioning offline, but remove the oldest data to minimize the amount of data that has to be worked with.
    You should review all of the tables to see if you can remove older data from the current system. If you can, you could use online redefinition that ignores the older data; afterwards, you can extract that old data from the old table for archiving.
    If the amount of old data is substantial you can extract the new data to a new partitioned table in parallel and not deal with the old data at all.

  • What's the best approach to work with Excel, csv files

    Hi gurus, I have a question for you. According to your experience, what's the best approach to work with Excel or csv files that have to be uploaded through Data Services to your data warehouse?
    Let's say your end user, who is not a programmer, creates a group of 4 Excel files with different calculations on a monthly basis, so that a set of reports can be generated from the data warehouse once the files have been uploaded. The calculations vary from month to month. The user doesn't have a front-end to upload the Excel files directly to Data Services. The end user also needs to keep track of which person uploaded the files for a given month.
    1. The end user should place their 4 excel files in a shared directory that will be seen by DataServices.
    2. DataServices will execute certain scheduled job that will read the four files and upload them to the Datawarehouse at a determined time, lets say at 9:00pm.
    It makes me wonder... what happens if the user needs to present their reports immediately and can't wait until 9:00pm? Is it possible for the end user to trigger some kind of action (outside the Data Services environment) so Data Services "could know" that it has to process those files right now, instead of waiting for the nightly schedule?
    Is there a way that DS will track who was the person who uploaded those files?
    Would it be better to build a front-end for the end users so they can upload their four files directly to the data warehouse?
    Waiting for your comments to resolve this dilemma
    Best Regards
    Erika

    Hi,
    There are functions in DS that detect input files automatically. You could use the file_exists() or wait_for_file() function to do that. Schedule the job to run every few minutes and, if the file exists, run the load. This could be done by using a file naming convention with a date and timestamp, or by moving the old files to an archive after each run so DS waits for new files to show up.
    Check this - Selective Reading and Postprocessing - Enterprise Information Management - SCN Wiki
    Hope this helps.
    Arun
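    The polling pattern Arun describes can also be sketched outside Data Services. The shell illustration below uses made-up directory and file names: wait until all four expected files are present, log who owns each one (a crude audit trail for the "who uploaded it" requirement), and move them to a staging area.

```shell
# Made-up locations: DROP is the shared directory users write to,
# STAGE is where processed files and the audit log end up.
DROP=/tmp/dwh_drop
STAGE=/tmp/dwh_stage
mkdir -p "$DROP" "$STAGE"

# Simulate the user's monthly drop of four calculation files.
touch "$DROP/calc1.csv" "$DROP/calc2.csv" "$DROP/calc3.csv" "$DROP/calc4.csv"

# Proceed only when every expected file has arrived.
ready=yes
for f in calc1.csv calc2.csv calc3.csv calc4.csv; do
  [ -f "$DROP/$f" ] || ready=no
done

if [ "$ready" = yes ]; then
  for f in calc1.csv calc2.csv calc3.csv calc4.csv; do
    # The file's owning user doubles as a simple "who uploaded it" record.
    owner=$(ls -l "$DROP/$f" | awk '{print $3}')
    echo "$f uploaded by $owner" >> "$STAGE/upload.log"
    mv "$DROP/$f" "$STAGE/$f"
  done
fi
```

    A scheduler (cron, or a DS job run every few minutes) would invoke something like this; the owner-of-file check only answers "who uploaded it" if users write to the share under their own accounts.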

  • Best approach to replicate the data.

    Hello Every one,
    I want to know about the best approach to replicate the data.
    First let me clarify the scenario: I have one Oracle 10g Enterprise Edition database and 20 Oracle 10g Standard Edition databases. The Enterprise Edition will run at the center, and the other 20 Standard Edition databases will run at the sites.
    The data will move from the center to the sites and vice versa, not between sites. There is only one schema to replicate, with more than 500 tables.
    What would be best for replication (updatable MVs, Oracle Streams, or anything else)? It's urgent, please.
    Thanx in advance.
    Edited by: user560669 on Dec 13, 2009 11:01 AM

    Hello,
    Yes, MVs and Oracle Streams are the common ways to replicate data between databases.
    I think that in your case (you have to replicate a whole schema) Oracle Streams is interesting (it's not so easy to maintain 500 MVs).
    But you must take care with the edition types. I'm not sure that Standard Edition allows advanced replication features; it seems to me (but I may be wrong) that updatable MVs are an EE feature. Streams seems to be available even in SE.
    Please, find enclosed some links about it:
    [http://www.oracle.com/database/product_editions.html]
    [http://www.oracle.com/technology/products/dataint/index.html]
    Hope it can help,
    Best regards,
    Jean-Valentin

  • Best approach for backup and transport (NW BPC 7.5)

    Hi,
    we are currently in a situation where we have developed a prototype in a Q (quality) environment. This has been a standalone system so far.
    Now we have a Development -> Quality -> Production environment, and I want to move my prototype from the Quality system to my Development system.
    I'm thinking about the "UJT_BACKUP_RESTORE_UI" program. I will not copy master data and transactional data.
    My question is now: is it possible to back up my AppSet prototype in Quality, restore it under a new AppSet name in Development, and then put it in a transport request?
    Or would I have problems with the technical names for cubes and dimensions?
    What is the best approach ?
    Thank you,
    Joergen
    Edited by: jdalby on Jun 14, 2011 9:49 PM

    Hi,
    First of all, you should have posted your question in the BPC for NW forum.
    Coming back to your question: you can try transaction UJBR. You can take a backup of metadata / master data / transaction data, or any combination of those.
    I wouldn't suggest changing the names of the backup files. Instead, you can create a copy of the AppSet, then take the backup and restore it in DEV.
    Hope this helps.

  • Best approach to exporting and importing an environment

    Client is on 11.1.0.7. Windows 2008 R1. We have a need to export an environment that is using ASM and import that environment to another server that is not using ASM. Environment name will stay the same. Was hoping to get some advice on best approach for this and what would be involved. Thanks in advance.

    ...a need to export an environment... What do you mean by that? Are you referring to moving the data only, or physically migrating the database? How big is the data/database size?
    If you only need to move data, export/import is transparent with respect to ASM. If you need to migrate the physical database from ASM to non-ASM on the same platform, you can use the RMAN utility.

  • 40k file size, best approach ?

    Hello,
    I'm just dipping my toes into the Flash water and I'm wondering about the best approach for a project... that may not even involve Flash at all? Basically, my station wants to put some of our commercials on our site, but they need to be 40k. I use CS4 After Effects, so naturally I imported the spot and then exported it as a Flash movie with the new dimensions 300x100, down from 1920x1080. Incredibly, I got it down to about 200k and it looked pretty darn good. So my question: what's the best way to do this type of thing? Do I need Flash, and how can I get it down to 40k and still look good... or should I piss off because I'm asking the wrong group :)
    Thanks for any help or pointers !!

    Explain, please :)
    So let's say I have a :30 spot that's a QuickTime, and our web folks need it at 300x150 and 40k. What should be my steps, generally? You don't have to go into great detail here, but I'm pretty blind on this topic.
    Thanks for the help
