Encore audio and video different lengths

Hi everybody. I am using Adobe Production Premium CS 3.2, and I am currently using Encore to make a Blu-ray disc. My show consists of eight MPEG files that I turn into timelines in Encore (they become eight chapters in the menu; each is about 5 to 10 minutes long). I made a DVD with no problem at all. I also made a Blu-ray of an earlier version of the program using the following settings (in Media Encoder) and was 100% successful.
MPEG-2 (1280x720)
720p 59.94 fps
VBR 2 pass
(25, 30, 35)
Quality 4.5
Audio: 48 kHz, 16-bit PCM
I made some changes to my original footage and decided to re-create everything from scratch. I changed the Quality setting in Media Encoder from 4.5 to 5. I import the new MPEG-2 Blu-ray files as assets into Encore and create timelines for each file/chapter. Now, on every single timeline without exception, the video is one or two frames longer than the audio, and I cannot build a Blu-ray. I have gone into the timeline, zoomed in far enough to see individual frames, and tried to adjust the audio and video manually, but I cannot get them to match up.
Any clues as to what might be going on? I've read similar threads but found nothing with a definite solution. I am also wondering whether Encore or Media Encoder just "doesn't like" the quality being changed to 5.0; remember, at 4.5 I didn't seem to have a problem. Any help would be appreciated.

Encore seems unhappy with many assets imported into timelines if they are not a whole number of seconds long; it appears to adjust their length to the nearest half frame. I have only ever seen this happen with Blu-ray MPEG-2 files encoded in AME.
Another possibility is that the video timescale is minutes:seconds:frames, whereas the audio may be measured in samples or minutes:seconds. This too can lead to marginal differences in length.
I have never found this to prevent me from building a disc, but whichever of audio or video is longer gets cropped to match the shorter, which ruins fades to black/silence.
I should add that I have no experience of Encore 3, only Encore 2, 4 and 5.
These days I make sure that the length of my Premiere timeline is a whole number of seconds before encoding.
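To see why whole seconds matter, it helps to work out when a clip duration in frames lands on an exact number of real-time seconds. A minimal sketch (the frame counts are illustrative, not taken from the posts above): at PAL 25 fps any multiple of 25 frames qualifies, while at 59.94 fps (60000/1001) virtually no practical frame count does, so durations routinely fall on fractional seconds.

```java
// Sketch: is a clip of `frames` frames a whole number of seconds
// at a frame rate of rateNum/rateDen fps? Duration in seconds is
// frames * rateDen / rateNum, which is whole exactly when the
// numerator divides evenly.
public class WholeSeconds {
    static boolean wholeSeconds(long frames, long rateNum, long rateDen) {
        return (frames * rateDen) % rateNum == 0;
    }
    public static void main(String[] args) {
        System.out.println(wholeSeconds(250, 25, 1));       // PAL: 10 s exactly -> true
        System.out.println(wholeSeconds(600, 60000, 1001)); // 59.94 fps -> false
    }
}
```

Since 1001 shares no factors with 60000, "whole seconds" at NTSC-family rates effectively means whole timecode seconds rather than whole real-time seconds, which is consistent with Encore nudging such assets by fractions of a frame.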

Similar Messages

  • Audio and video clip lengths differ in encore timeline

    Hi
    I have been working to build a BD-R with H.264 video and AC3 audio. I used AME from within Premiere Pro to export the M4V and (at the same time) the AC3, using the preset default parameters with no tweaking of any settings. The clips appear to play great in Encore's monitor (everything in sync) and on the burned BD-R.
    However, I am getting a black screen between clips in the timeline when played in the standalone BD player, or (when I had one clip per timeline) in a playlist. A previous poster noted different-length audio and video clips; I zoomed in on the timeline and, lo and behold, the audio and video clips are different lengths.
    This is with the timeline fully expanded, so each division is a frame. Note that the break between the two audio clips is offset from the break between the two video clips by two frames. This offset appears after several clips. I could find no information on what the one-frame crosshatching at the end of the leading clips means. Can anyone explain this crosshatching, and what do I do about it? The image below is after the first clip in the timeline:
    In this image, the offset is less than a full frame?
    I tried using the trim tools to bring these together, but the trim tool only jumps a full frame, and then only to a GOP boundary, which doesn't help because the boundaries are obviously different for the video and audio tracks.
    I want to emphasize that these assets were created from the Premiere CS4 timeline (which had no offset between audio and video) using the default H.264 and AC3 presets in AME, with the same presets selected in Encore.
    The offsets do not appear to cause the actual audio and video to go out of sync when I play the burned disc, but I do get the 1-2 second wait between menus and chapters, or between items in a playlist.
    The audio offset suggests that the player may not know what to do when faced with a chapter point that falls differently on the audio and video tracks, although I will note that Encore snapped to the joins in the video track, not the audio track.
    You can see from this listing that several of the clips have audio and video lengths that differ by one frame. By the end of the timeline, the cumulative offset is 3 seconds (on a 48-minute timeline).
    Can anyone shed any light on this?  Thanks

    I don't think I had the same problem you do, but I thought I'd throw in my two cents in case it helps. Encore would not build the output; it said something like the in-point and the length did not match the end-point. I think this happened in several different situations. In one case I simply adjusted the audio and video tracks to match end-points. In another case I went back to Premiere and found that where I had added transitions, they had shifted the end-points of the audio and video tracks so they no longer matched; I removed the transitions. It didn't matter much to me since I'm doing this for personal use. Is it possible those crosshatched lines you're seeing are transitions?
    I am using CS3 and editing HD. I have had many problems with Encore. I read in the forum someplace that it had a hard time handling H.264, so I changed to MPEG-2 and that fixed one problem. In another case it couldn't find some files; I found advice not to use HD or wide menus, so I switched to standard 4:3 menus and it worked.

  • Video Different Length Than Audio

    Hi
    I created a reference file in FCP and encoded it through Compressor. For some reason the video comes out a few frames longer than the audio, which is confusing my timeline in DVD Studio Pro.
    Does anyone know what I'm doing wrong?
    Thanks

    I am shooting and capturing in DVCAM, editing in Final Cut Pro 5, and exporting for DVD using Compressor. I'm not doing any sample-rate conversions or changes in frame rate.
    The interesting thing was that, when I investigated further, Final Cut and DVD Studio Pro saw the compressed MPEG-2 video of some clips as being different lengths. Also, even though the audio and video showed as up to 1 second different in length in DVD Studio Pro (on a 24-minute clip), when I put both on a timeline they reconciled to within a frame. There must be some kind of embedded timecode keeping them in sync.

  • Audio and Video out of Sync in Encore 2.0

    Here's my issue:
    I opened a project in Encore 2.0 that was an Encore 1.5 project. I have one timeline with a 2 hour movie and the AC3 audio file that goes with it. Encore 1.5 played the audio and video in sync from the timeline and burned discs with the audio in sync. The video is an MPEG2 file encoded at a 3.1 constant bit rate.
    In Encore 2.0, when I play the timeline, the audio becomes progressively out of sync as it plays. Within the first few minutes it's maybe a frame or 2 off. By the end of the film it is about 18 seconds off (audio is playing ahead of the video).
    Is this a result of Encore 2.0 conforming the audio file? If so, how am I (or anyone) supposed to attain sync between video and audio in Encore 2.0?
    I have Encore 1.5 and 2.0 both installed on the same machine. As a test, I opened the project in 1.5 and played it, everything was fine. I closed Encore 1.5 and opened the project in Encore 2.0, and the audio is out of sync.
    Any comments, ideas or suggestions?
    Mike

    Joe,
    I tried creating a project with the original WAV file and its corresponding video and found that the audio and video were still out of sync. A little more testing showed that the audio is actually playing correctly; it's the video that is not. Here's what I did:
    I opened the WAV file version of the audio file in Premiere Pro and noted where a hammer hit in the video occurs (01;40;30;28). I then opened Encore 2.0 and imported the video and AC3 version of the WAV file, placed them in a timeline and the audio's hammer hit is right where it's supposed to be (01;40;30;28), but the video's hammer hit occurs at 01;40;43;18.
    Next I went into Encore 1.5 and brought in the AC3 audio and video files and placed them on a timeline. I went to 01;40;30;28 in the timeline and the audio and video were both at the frame they should be at that timecode.
    Seeing how the video appears to be playing incorrectly, here's some info on my video files:
    I have 4 versions of the same film in varying encode rates with the same start and end timecodes. All four files were encoded on a Sonic DVD Creator authoring station using hardware to encode. The files are encoded at the following rates: 2.9 constant bit rate, 3.1 constant bit rate, 3.4 variable bit rate and 4.0 variable bit rate. They are each NTSC with drop frame timecode. All four files play correctly in Encore 1.5 but incorrectly in Encore 2.0.
    This almost seems like a drop frame/non-drop frame issue. Do you have any ideas?
    Mike
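If the drop-frame suspicion is right, the size of the discrepancy is easy to estimate. Drop-frame timecode skips two frame numbers at the start of every minute except each tenth minute, so a drop-frame and a non-drop reading of the same timecode diverge by a fixed, computable frame count. A rough sketch (the conversion below is the standard SMPTE drop-frame counting rule, not something taken from this thread):

```java
// Compare drop-frame and non-drop interpretations of the same
// 30 fps timecode. DF skips frame numbers 0 and 1 at the start of
// each minute, except minutes divisible by 10.
public class DropFrame {
    static long nonDropFrames(int h, int m, int s, int f) {
        return ((60L * h + m) * 60 + s) * 30 + f;
    }
    static long dropFrames(int h, int m, int s, int f) {
        long totalMinutes = 60L * h + m;
        long skipped = 2 * (totalMinutes - totalMinutes / 10);
        return nonDropFrames(h, m, s, f) - skipped;
    }
    public static void main(String[] args) {
        long diff = nonDropFrames(1, 40, 30, 28) - dropFrames(1, 40, 30, 28);
        System.out.println(diff + " frames (~" + (diff / 30) + " s)"); // 180 frames, ~6 s
    }
}
```

Roughly 6 seconds at the 1:40 mark is the right order of magnitude, though smaller than the ~13-second offset observed above, so a drop-frame misinterpretation is a plausible contributor rather than a complete explanation.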

  • Create a DVD using Premiere Pro and Encore - delay between audio and video (in ms)?

    I created a DVD and need to be sure that audio and video are in perfect sync (smaller than 1 millisecond).
    Is this even possible with Premiere Pro / with DVDs in general?
    My sequence settings in Premiere are:
    DV PAL, 25 fps, 720x576 (16:9), D1/DV PAL Widescreen, lower field first, Audio: 48000 Hz
    My DVD settings in Encore are:
MPEG-2 (MainConcept MPEG Video), 720x576, 25 fps, lower field first, Widescreen 16:9 (1.458), Bitrate: VBR, 2-pass, min 1.5 Mbit/s, target 4 Mbit/s, max 9 Mbit/s, GOP: M-frames 3, N-frames 12, Audio: Dolby Digital, 192 kbit/s

    I created a DVD and need to be sure that audio and video are in perfect sync (smaller than 1 millisecond).
    Is this even possible with Premiere Pro / with DVDs in general?
    Frame-accurate is the best you can achieve, and at the 25 fps PAL rate of this project that means 40 milliseconds at best (about 33 ms for NTSC).
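Because one video frame is the smallest unit of audio/video alignment on a DVD, the frame period is the sync bound, and it can be computed directly:

```java
// Frame period = the smallest A/V alignment step on a DVD.
public class FramePeriod {
    public static void main(String[] args) {
        double pal  = 1000.0 / 25.0;               // PAL: 40.00 ms per frame
        double ntsc = 1000.0 / (30000.0 / 1001.0); // NTSC: ~33.37 ms per frame
        System.out.printf("PAL: %.2f ms, NTSC: %.2f ms%n", pal, ntsc);
    }
}
```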

  • Capture Audio and Video to different disks or not?

    I thought capturing audio and video to separate disks was more efficient, but now I see the FCS 2 documentation says this is not recommended for any of the DV formats. It does not say why, or what the consequences of doing so are...
    Does anyone out there know?
    Thanks,
    Pete d.

    Peter Durso wrote:
    Now, Is there any value in putting the render files on a drive that is separate from the Capture disks?
    Pete D.
    Absolutely. Listen to your drives while doing an intense render. The Capture drive is grabbing all of the video clips in the stack, the system drive is grabbing software. If you render to the Capture drive, that same drive has to write, too. Reduce the load by using a Render drive.
    I have four drives: System, Capture, Render, Output. The last is used for all exports, encodes, and transcodes.
    bogiesan

  • When exporting separate audio and video streams, the audio drifts out of sync

    So this is my first slightly larger project in Final Cut Pro X.
    When I export my finished project (around 12 minutes long) straight to QuickTime, it plays back perfectly.
    When I export using Compressor, the resulting audio and video streams are two different lengths; the audio comes out a full second longer. If you lay them down in a program like Adobe Encore, the audio starts out perfectly in sync, two minutes in the video is slightly out of sync, and by the end it's comically bad, over a second out.
    What's even more vexing is that if I take the QuickTime file that IS in sync on playback and convert and burn it to a DVD or Blu-ray using Toast, the problem re-emerges.
    I tried searching around, but most of the suggestions that came up were vague, like "Upgrade to a real program like Final Cut." (That was in reply to people using free software.)
    I did see one idea that 44.1 and 48 kHz sampling rates don't play well together, so I combed through my project, found all the 44.1 kHz songs, and swapped them out for upsampled versions. Nothing changed. It didn't sound like the right fix anyway; people usually reported being a specific number of frames off, not drifting progressively out of sync.
    So the question is, does anyone have any suggestions on what could be causing this? Thanks! :-)
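One hypothesis consistent with these symptoms (an assumption on my part, not a confirmed diagnosis) is a 1000/1001 NTSC-style clock mismatch between the two exported streams: a 0.1% rate difference accumulates to roughly the reported one second over a 12-minute program.

```java
// Estimate accumulated drift when one stream runs at the nominal
// rate and the other at the 1000/1001 NTSC-adjusted rate.
public class DriftEstimate {
    public static void main(String[] args) {
        double programSeconds = 12 * 60;
        double drift = programSeconds * (1.0 - 1000.0 / 1001.0);
        System.out.printf("Drift over 12 min: %.2f s%n", drift); // ~0.72 s
    }
}
```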

    Project settings can be accessed and changed by selecting the project and pressing Command-J. Then, in the inspector, click the little gear in the lower-right corner to access the settings.
    Why are you trying to output video and audio separately? If the content plays in sync when exported as a QuickTime movie (using current project settings), you could try separating the video and audio with MPEG Streamclip as a second step instead of using Compressor. For audio, don't change the sample rate from the project settings, and use uncompressed audio (AIFF) as the output format.

  • Lync 2013 and MacBook Pro (cannot complete the call) on audio and video calls

    hi everyone ,
    We installed Lync 2013 Server on Windows Server 2012. The Windows clients work perfectly without any problems, but we have some MacBook Pros (Mavericks) with Lync 2011 (14.0.8) installed. The Mac clients can log in and use instant messaging but cannot make audio and video calls at the moment. When I did some logging on the Lync server, I found the following error message:
    "Start-Line: SIP/2.0 401 Unauthorized"
    TL_INFO(TF_PROTOCOL) [LYNCFE\LYNCFE]15EC.26C8::05/30/2014-13:08:46.224.00000BF4 (SIPStack,SIPAdminLog::ProtocolRecord::Flush:ProtocolRecord.cpp(265)) [3440658878]
    Trace-Correlation-Id: 3440658878
    Instance-Id: 10DF1
    Direction: outgoing;source="local"
    Peer: 10.45.40.22:54285
    Message-Type: response
    Start-Line: SIP/2.0 401 Unauthorized
    From: <sip:[email protected]>;tag=118c2f6503;epid=1dac663933
    To: <sip:[email protected]>;tag=59BF4D7B62264F8A815140BEC75A72BC
    Call-ID: b50d1d78ea20324eb00dbe3ed316c4b6
    CSeq: 2 INVITE
    Via: SIP/2.0/TLS 10.45.40.22:3540;received=10.45.40.22;ms-received-port=54285;ms-received-cid=243800
    Content-Length: 0
    $$end_record
    Any ideas related to this error?
    1. I tried 3 different MacBooks with 4 different user accounts.
    2. Lync 2011 was freshly installed on each MacBook.

    Hi,
    Please try to clear all preferences for Mac Lync.
    You can do as following steps in the link:
    http://www.unicom.iu.edu/kb.php?docid=bave :
    1.  Quit Lync for Mac.
    2.  In your Home folder, open the Library folder. Note that Mac OS X 10.7 and later hides your Library folder. To access it:
           1. Press Command-Shift-g, or from the Go menu, select Go to Folder... .
           2. In the Go to Folder drop-down window, enter ~/Library, and click Go.
    3.  Remove the following files from your Library folder:
              /Users/username/Library/Preferences/com.microsoft.Lync.plist
              /Users/username/Library/Preferences/ByHost/MicrosoftLyncRegistrationDB.xxxx.plist
              /Users/username/Library/Logs/Microsoft-Lync-x.log (This file is present only if you turned on Lync Logging.)
              /Users/username/Library/Logs/Microsoft-Lync.log
    4.  In your Documents folder, remove the following:
              /Users/username/Documents/Microsoft User Data/Microsoft Lync Data
    5.  Optionally, also remove Microsoft Lync History:
              Users/username/Documents/Microsoft User Data/Microsoft Lync History
              Note: This optional step will delete saved conversations. For Mac users, the conversation history is not saved to the Exchange account, but instead is saved locally to the Mac.
    6.  Open Keychain Access from the /Applications/Utilities folder:
         1. Delete any keychains on the left that look like the following, where emailaddress is your email address: 
    OC__KeyContainer__emailaddress
         2. In your Login keychain, delete the following, where emailaddress is your email address: 
    emailaddress.cer
    7.  In the /Users/username/Library/Keychains folder, delete all files that look like the following, where emailaddress is your email address: 
    OC__KeyContainer__emailaddress  
    Note: Microsoft is providing this information as a convenience to you. The sites are not controlled by Microsoft. Microsoft cannot make any representations regarding the quality, safety, or suitability of any software or information
    found there. Please make sure that you completely understand the risk before retrieving any suggestions from the above link.
    Best Regards,
    Eason Huang
    TechNet Community Support

  • Do I need some extra hardware interface for receiving both audio and video

    Hi, I'm doing an e-learning project. I have to capture video from a webcam and voice from a headset microphone and send them to a client.
    My code works fine for either one at a time.
    Do I need some extra hardware interface for receiving both audio and video? I'm using the AVTransmit and AVReceive code found on this site.
    After running Tx, if I give "dsound:// & vfw://0" in Media Locator, only sound is received and no video; when I give "vfw://0" in Media Locator, only live video is transmitted.
    I'm using JMF 1.1.2e.
    If anyone knows how to make this work, or the cause of the problem, please reply; I'd be very thankful.
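For what it's worth, the usual JMF answer to "audio or video, but not both" is not extra hardware: each capture device is opened as its own DataSource, and the two are combined in code with Manager.createMergingDataSource before the Processor is built. A minimal sketch under that assumption (the locator strings are the same ones quoted above; error handling and the Processor/RTP setup are omitted, and the JMF libraries must be on the classpath):

```java
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.protocol.DataSource;

public class MergeAV {
    // Returns one DataSource carrying both the audio and video tracks;
    // a Processor built from it can then transmit both in one session setup.
    public static DataSource mergedCapture() throws Exception {
        DataSource audio = Manager.createDataSource(new MediaLocator("dsound://"));
        DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));
        return Manager.createMergingDataSource(new DataSource[] { audio, video });
    }
}
```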
    Transmitter/server-side code; run Tx on the server first:
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    }
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    }
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
              targets.addElement( config.targets.elementAt( i));
         }
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    }
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
              String host= InetAddress.getLocalHost().getHostAddress();
              tf_local_host.setText( host);
         } catch( UnknownHostException e) {
              // leave the address field empty if the local host cannot be resolved
         }
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
    }
    public void actionPerformed( ActionEvent event) {
         Object source= event.getSource();
         if( source == addTarget) {
              String ip= tf_remote_address.getText().trim();
              String port= tf_remote_data_port.getText().trim();
              String localPort= tf_data_port.getText().trim();
              addTargetToList( localPort, ip, port);
              if( avTransmitter != null) {
                   avTransmitter.addTarget( ip, port);
              }
         } else if( source == removeTarget) {
              int index= list.getSelectedIndex();
              if( index != -1) {
                   Target target= (Target) targets.elementAt( index);
                   if( avTransmitter != null) {
                        avTransmitter.removeTarget( target.ip, target.port);
                   }
                   targets.removeElement( target);
                   listModel.setData( targets);
              }
         } else if( source == startXmit) {
              if( startXmit.getLabel().equals( "Start Transmission")) {
                   int data_port= new Integer( tf_data_port.getText()).intValue();
                   avTransmitter= new AVTransmitter( this, data_port);
                   avTransmitter.start( tf_media_file.getText().trim(), targets);
                   avTransmitter.setLooping( cb_loop.isSelected());
                   startXmit.setLabel( "Stop Transmission");
              } else if( startXmit.getLabel().equals( "Stop Transmission")) {
                   avTransmitter.stop();
                   avTransmitter= null;
                   removeNonBaseTargets();
                   listModel.setData( targets);
                   startXmit.setLabel( "Start Transmission");
              }
         } else if( source == rtcp) {
              if( rtcpViewer == null) {
                   rtcpViewer= new RTCPViewer();
              } else {
                   rtcpViewer.setVisible( true);
                   rtcpViewer.toFront();
              }
         } else if( source == cb_loop) {
              if( avTransmitter != null) {
                   avTransmitter.setLooping( cb_loop.isSelected());
              }
         }
    }
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
              Target target= (Target) targets.elementAt( i - 1);
              if( !target.localPort.equals( localPort)) {
                   targets.removeElement( target);
              }
              i--;
         }
    }
    public void addTargetToList( String localPort,
                             String ip, String port) {
         ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                       port, listModel, targets);
         SwingUtilities.invokeLater( listUpdater);
    }
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
              rtcpViewer.report( report);
         }
    }
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
     // ==================== AVTransmitter.java ====================
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
     /**
      * Starts the transmission. Returns null if transmission started ok.
      * Otherwise it returns a string with the reason why the setup failed.
      */
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
     /**
      * Use the RTPManager API to create sessions for each media
      * track of the processor.
      */
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
               rtpMgrs[i].initialize( localAddr);
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
     /** Stops the transmission if already started. */
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
          // Do we have at least one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
     /**
      * For JPEG and H263, we know that they only work for particular
      * sizes. So we'll perform extra checking here to make sure they
      * are of the right sizes.
      */
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
     /**
      * Setting the encoding quality to the specified value on the JPEG encoder.
      * 0.5 is a good default.
      */
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
     /* Convenience methods to handle processor's state changes. */
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
     /* Inner Classes */
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
         // If there was an error during configure or
         // realiz
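     The listing above cuts off inside the StateListener inner class. The surrounding code (waitForState, getStateLock, setFailed) closely follows Sun's JMF AVTransmit2 sample, where the listener flags failure on a ControllerClosedEvent and wakes the thread blocked in waitForState. The same wait/notify idiom, stripped of the JMF types so it runs standalone (class and method names below are illustrative, not from the post):

```java
// Standalone sketch of the wait/notify idiom behind waitForState()/StateListener.
// A worker thread plays the role of the JMF controller delivering events.
public class StateWaitDemo {
    private final Object stateLock = new Object();   // mirrors getStateLock()
    private volatile int state = 0;                  // mirrors Processor.getState()
    private volatile boolean failed = false;         // mirrors setFailed()

    // Called by the "controller" thread; mirrors StateListener.controllerUpdate().
    void deliverEvent(int newState, boolean failure) {
        if (failure) failed = true; else state = newState;
        synchronized (stateLock) { stateLock.notifyAll(); } // wake waitForState()
    }

    // Mirrors waitForState(): block until the target state is reached or a failure arrives.
    boolean waitForState(int target) {
        synchronized (stateLock) {
            while (state < target && !failed) {
                try { stateLock.wait(); } catch (InterruptedException ie) { return false; }
            }
        }
        return !failed;
    }

    public static void main(String[] args) {
        StateWaitDemo d = new StateWaitDemo();
        new Thread(() -> d.deliverEvent(2, false)).start(); // async "RealizeCompleteEvent"
        System.out.println(d.waitForState(2) ? "Realized" : "Failed");
    }
}
```

     The key detail is that the waiter re-checks its condition in a loop after every wakeup; a bare wait() without the loop can miss the event or wake spuriously.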

     I do this all the time; I connect my MBP to a 60-inch Sharp. If you have the video working, do the simple thing first: check that the sound is turned on on both your TV and your Mac. If that doesn't work, go to System Preferences > Sound, open the Output tab, and see if your TV is listed; if it is, select it as the output.
     Hope it works.

  • Can two FireWire audio and video interfaces work together on FCP?

     Hello,
     My audio interface is a TC Electronic Konnekt 24D,
     my video interface is a Canopus ADVC 110,
     my hard drives are connected via FireWire too,
     and I use Final Cut Pro 5.0.4.
     The Konnekt is recognized by FCP only when the Canopus is not plugged in. And even then, the audio sometimes drops out and sometimes causes the app to crash, so that I have to trash the preferences.
     When I plug in the Canopus, the Konnekt 24D is recognized by OS X but not by FCP.
     I want to use the Voice Over tool with the Konnekt 24D, so what should I do?
     Can two FireWire audio and video interfaces work together?
     If so, maybe I only have a driver problem from TC Electronic.
     But if not, I have to change this interface. Let me know if only RME, MOTU, PreSonus and Edirol work, as stated on Apple's hardware-compatibility page.
     Thanks
     eMac G4 1,25 GHz, 1 GB RAM DDR, Mac OS X (10.3.9)

     The 1082 does show up as a possible input when I open
     the VO tool.
     OK, so an audio interface can be used with FCP without crashing the app. Because of the Konnekt driver problem, I wasn't sure of this.
     I use my AJA box for export and external monitoring.
     All my audio sources run back into the Tascam, and I
     use the analog mixer in the 1082 to monitor different
     sources.
     When I view external video, the audio follows it to
     keep sync.
     That's exactly what I want with other I/O interfaces. I'd like to monitor video with the Canopus ADVC 110 and monitor audio with the Konnekt 24D (or another interface if this one is not compatible).
     I also input dailies with a deck or a camera (DV or HDV), but that's not the point here.
     I want to input audio with the Konnekt 24D because of its audio quality: not to sync audio to video dailies, but for voice over, in sync with FCP video playback (through the ADVC 110, in my case).

  • Audio and Video tracks don't go up/down together EVEN when linked selection is checked!

     I'm sure this is something really simple, but I just can't figure it out. It must be a button or an option I've toggled, because it didn't behave this way a few days ago.
     Basically, I have a clip on the timeline with a video track and an audio track. I want to move the video clip up to the next video track, and I want the audio to move down to the next audio track at the same time. Right now that's not happening: only the video moves up to the next track, and the audio clip stays where it is. How do I fix this so that however many tracks I move the video up by, the audio moves down by the same number of tracks?

    Hi Shooternz,
    shooternz wrote:
    Not sure how other NLES manage it but I think it would be a pain if audio  automatically followed video in this respect.  eg Blocked layers b'cos something is already there.
     FCP and earlier NLEs behaved that way. I prefer the way Premiere Pro does it: you often do not need to move the audio to a different track when you move video to another track. I want to control the placement of audio and video separately now.
    Thanks,
    Kevin

  • Audio and Video out of sync in Final Cut, but not in QuickTime

     I have searched the forums to no end for a solution to this issue; unfortunately, I have not solved the problem.
     I am using Final Cut 4.5 on a G4 running OS X 10.3.4. This system has worked perfectly for years with no problems. Neither the system, the hardware, nor the software has ever been changed on this machine. Suddenly, last week, despite perfect sync while previewing the capture, the audio and video were out of sync in Final Cut; when checked in QuickTime, there were no problems. That is how I eventually figured out this was a Final Cut problem.
     By out of sync, I don't mean it is off by a specific number of frames that can be adjusted and then everything is fine: the sync seems to drift, so that if you edit the audio to be in sync in one part of the video, it won't be in sync at the end. It is also out of sync from the first frame played, not just a gradual drift.
    This is definitely not an issue of frame rates or final cut settings as will be evident by the steps I took below to find the source of this problem.
    This is what I did to attempt to fix the problem:
     - Check drives using Disk Utility and repair permissions
    - Remove Final Cut Preferences using Apple guidelines
    - Switch hard drives
     - Delete Final Cut and reinstall a newer version of Final Cut (since the capture files played fine in Final Cut 5.1 on another machine)
     - Reinstall OS X 10.3.9 entirely, wiping the hard drive and reinstalling Final Cut 5.1
     - Switch RAM slots, and even replace the RAM with different sticks from another G4 machine
    None of these solutions worked. Is there some hardware problem with my G4 that is suddenly causing Final Cut to play audio and video out of sync?
     Hope you can help; otherwise this otherwise-great G4 is going in the dumpster!
    Thanks

    For what it's worth, I'm having what looks to be the same problem. Driving me insane. Here's what's going on-
    Editing in FCP 6.0.2
    QT was just updated to 7.3.1 the other day (hmm)
     On a MacBook Pro, OS X 10.5.1
    Software updater says everything's up to date of course.
    DV-NTSC 24p (23.98) Advanced Pulldown Removal (footage was shot at 24PA on a DVX100b)
     In QuickTime, the file previews fine, everything's in sync, and everyone's happy. In FCP, the video and audio are wildly out of sync. I did a test inside QuickTime: I sliced out a small portion of the media clip, did a "Save As", and this small slice played in sync in FCP. However, this did not work for the entire clip. The clip weighs in at 5.58 GB for a 32-minute run.
     I'm trying to multicam two tapes together, and manually syncing them up is proving difficult at best. If anyone gets any ideas/leads/patches, let us know.
    Grr,
    Nicole
    Editor, RHED Pixel

  • Audio and Video out of Sync after Pro Res export

     I'm getting an audio/video sync issue when I'm exporting to ProRes.
     I started off in an HDV1080i50 sequence. The audio sampling rate is 48 kHz (16-bit depth).
     The sequence is less than 20 minutes long. It plays correctly on the timeline.
     When I export the sequence for DVD and view it on a set-top player, it plays absolutely fine.
     When I export the video to ProRes (422 HQ; I used Export Using QuickTime Conversion) and the audio to AIFF, they quickly go out of sync. This was first brought to my attention by the guys who were making the DigiBeta master, and when I view the exported material in a ProRes sequence I can see it for myself.
     The audio and video clips are of the exact same duration.
     Possible issues...
     Some of the narration was recorded at 44.1 kHz. I didn't resample it (I'm still a learning novice), but if that is the issue, should it not have affected the DVD as well? All the other audio in the project is 48 kHz.
     Most of the material is HDV1080i50, but one clip was originally an NTSC clip and therefore at a different frame rate. However, I converted it to a 25 fps clip, and this is reflected in the Vid Rate column in the browser, where it says 25 fps.
     Final point: when I import the exported AIFF files back into the HDV1080i50 sequence and play back, they are perfectly in sync.
    Any help would be greatly appreciated.

     First, yes, resample the narration to 48 kHz/16-bit. You will probably be able to reconnect to the resampled files; that usually works for me. It's a simple fix and will probably solve the problem. If it doesn't, you can try changing the codec (compressor) in the sequence settings to ProRes 422 (HQ), render fully, and see if that solves the problem.
     Wait, I just reread your original post. Export from FCP using File > Export > QuickTime Movie, NOT QuickTime Conversion, with current settings. If that works without sync issues, use Compressor to convert to ProRes.
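     To see why the 44.1 kHz narration matters: audio delivered at 44.1 kHz but interpreted as 48 kHz plays about 8% fast, roughly a minute and a half short over a 20-minute program. Sample-rate conversion is best left to the NLE or an audio app, but conceptually it just reads the source at a fractional step. A naive linear-interpolation sketch (illustrative only; real converters use proper low-pass filtering):

```java
// Naive linear-interpolation resampler for a mono PCM buffer.
// Real converters use polyphase/low-pass filtering; this is only a sketch.
public class Resample {
    static float[] resample(float[] in, int srcRate, int dstRate) {
        int outLen = (int) ((long) in.length * dstRate / srcRate);
        float[] out = new float[outLen];
        double step = (double) srcRate / dstRate; // source samples per output sample
        for (int i = 0; i < outLen; i++) {
            double pos = i * step;                // position in the source buffer
            int i0 = (int) pos;
            int i1 = Math.min(i0 + 1, in.length - 1);
            double frac = pos - i0;
            out[i] = (float) ((1.0 - frac) * in[i0] + frac * in[i1]);
        }
        return out;
    }

    public static void main(String[] args) {
        // One second of audio at 44.1 kHz becomes one second at 48 kHz.
        float[] dst = resample(new float[44100], 44100, 48000);
        System.out.println(dst.length); // 48000
    }
}
```

     Calling resample(oneSecondAt44100, 44100, 48000) yields a 48000-sample buffer covering the same second of audio, so duration is preserved instead of drifting.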

  • Audio and Video out of sync during editing. PLEASE HELP

    I am using a MacBook Pro. Specs are:
    - OS X version 10.9.2
    - 2.5GHz Intel Core i5 Processor
    - 4GB 1600MHz DDR3 RAM
     When I import a video into After Effects CC, the audio and video are out of sync. The audio and video files are the right size, length, etc., but are way out of sync. By the end of a 2-minute video, the offset can be as much as 5-6 seconds. This applies to videos imported from any device: iPhone, USB, and even videos I've recorded directly on my computer. Most files I try to import are MP4 files, but I've found the same issue with all other video files. I've even tried importing them separately as an audio file and then as a video file. Nothing seems to work. I am also experiencing the same issue with Premiere CC.
     Please help. I need these programs to work for assessment items.
    - Andrew

     Without knowing the specs of your videos it is hard to say. I'm also running a Mac and am having no such issues.
     My first thought is that you have something from a third party installed that is causing conflicts. If all else fails, try transcoding your MP4 files to a frame-based codec using Adobe Media Encoder. Pick one of the Apple production codecs or JPEG-compressed QuickTime.
     Are you running a RAM preview? Did you render your Premiere timeline? Are you new to AE?

  • Audio and Video Out of Sync (AV sync problem)

     I edited a bunch of clips together from a Canon T1i (720p), iPhone 3GS, Canon SX110, and Sony DSC-W70, about an hour long. Everything looks great when played within iMovie '11, but when exported, the audio and video get out of sync. It gets progressively worse the farther along you are in the sequence; the AV is as much as 6 seconds off by the end, with the audio ahead of the video. I've tried exporting in several different ways, including Export to iDVD and Export to Media Browser (720p), with the same results.
     Anyone else having this problem? I know I've done this successfully in the past with iMovie '09.

     I'm having the same issue. I have a project that is 9 minutes long; I tried everything before even looking at the forum, but it will not sync. Someone even told me to repair permissions, and still nothing worked. I went to the Mac store and nobody knew what I was talking about, so they asked me to call Apple. I did, and was hung up on 3 times. Not only that, but I was advised that since I just bought the product I did have 3 months of tech support.
     I'm having an issue and they tell me I have 90 days of free service? This has got to be a joke. I then asked whether I could just take it back and install my '09 version again, and was told no, you won't be able to do that, and you can't get your money back.
     After this whole mishap I did learn that the fewer snips, clips, and edits there are in the project, the better the sync will be; but if you have lots of clips, as most of us do, we will keep having this issue.
     We need a fix ASAP.
