Audio and Video Synchronization

Is it possible to have synchronized audio and video in a video conferencing application running on JMF? While working on this project I have found that audio is slower than video to transmit. Also, which formats should be used for the audio and the video so that they stay synchronized? If anyone has achieved synchronization, could you please email me at [email protected]? Thanks. :)

Hi,
The videos are small: 3 to 5 minutes, maximum.
However, I think I know what the problem is. When Captivate
builds the project it converts the AVI to MP3. In this process it uses
the configured options of the audio file, including the bitrate
option. The problem is that the MP3 bitrate will be different from
the AVI bitrate, and they should be equal.
Despite knowing this, I couldn't solve the problem, because
Captivate and the other product I'm using, "DubIt", seem to
have incompatible bitrates! I can't get an equivalent bitrate
in both products.
Does anyone use DubIt and know how to get an equivalent
bitrate in both products? In Captivate we can configure a custom
bitrate, but the choices are still limited.
Thank you

Similar Messages

  • Audio and video not in sync when exporting to QuickTime file

    I use a digital camera to shoot video, which is in .avi format. I then import the clips into iMovie HD (selecting all the clips and dragging them all at the same time into iMovie). If I play the video from beginning to end in iMovie, the audio and video line up fine. But when I 'share' to a QuickTime file, the longer the video is, the more out of sync the audio and video become.
    I didn't have this problem when I used a mini-DV cam and imported using a firewire cable (in real time).
    Any ideas as to how to get the audio to sync up with the video once it's shared as a QuickTime file? I've tried .mp4 and .mov, and I have the same problem.

    Anything here help?
    iMovie: Improving audio and video synchronization
    http://docs.info.apple.com/article.html?artnum=42974

  • How do I sync the audio and video to play together in iMovie? And why is it not synced?

    In my iMovie project the video is not synced to the audio.  It can be about two seconds off.  And sometimes the sound just stops. I tested it in QuickTime Player and burned it to a DVD, and it does the same thing.  The sound then stops halfway through, apart from my iLife sound effects.  How do I sync the audio and video to play together in iMovie and on DVD?  I taped the movie with my digital camera and it hasn't done this in iMovie before. Any help?

    This may help:
    iMovie: Improving audio and video synchronization
    http://support.apple.com/kb/TA25603?viewlocale=en_US

  • Audio and video sync problem

    Hi,
    I have a problem with audio and video synchronization. I'm using the Merge class to merge video and audio tracks. But the audio track can be shorter than the whole video track (for example, video length 2 min., audio length 1 min.). When I merge them everything is OK. After that I'd like to concatenate several tracks (generated from Merge) into one final video file. For this purpose I'm using the Concat class.
    But here is the problem: after I concatenate all these tracks, the resulting video is OK, but each audio track starts right after the previous audio track ended. (If I have video1 - 2 min., audio1 - 1 min., video2 - 2 min., audio2 - 30 seconds, then audio2 starts at 1:30, right after the first audio track. But I want it to start when video2 starts.) Do you have any ideas??
    Merge class http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Merge.html
    Concat class http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Concat.html
    Thanks for help in advance.
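    One way to avoid the gap (a sketch of the idea, not of the Merge/Concat classes themselves): pad each audio track with silence to exactly its video's length before merging, so every merged segment has equal-length tracks and Concat cannot pull the later audio forward. For signed PCM the padding is just zero bytes, so the arithmetic can be checked with the standard javax.sound.sampled types (the class name below is made up):

    ```java
    import java.util.Arrays;
    import javax.sound.sampled.AudioFormat;

    public class PadAudio {
        // Pad raw signed PCM with zero bytes (silence) so it lasts exactly targetSeconds.
        public static byte[] padWithSilence(byte[] pcm, AudioFormat fmt, double targetSeconds) {
            long targetFrames = Math.round(targetSeconds * fmt.getFrameRate());
            int targetBytes = (int) (targetFrames * fmt.getFrameSize());
            if (targetBytes <= pcm.length) return pcm;   // already long enough
            return Arrays.copyOf(pcm, targetBytes);      // zero-filled tail == silence
        }

        public static void main(String[] args) {
            AudioFormat fmt = new AudioFormat(8000f, 16, 1, true, false); // 8 kHz, 16-bit, mono
            byte[] oneMinute = new byte[8000 * 2 * 60];                   // 60 s of audio
            byte[] padded = padWithSilence(oneMinute, fmt, 120.0);        // pad to 2 min
            System.out.println(padded.length);                            // 1920000
        }
    }
    ```

    A 30-second audio2 padded to 2 minutes this way would then start exactly when video2 starts after concatenation.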

    Ok thanks.
    I tried to implement PullBufferDataSource and everything went OK; the audio .wav file was generated, but when I play that file its length is 0 and not the length I wanted (40.00). Where could the problem be?
    Here is the code I wrote/changed:
    TimeDataSource tds = new TimeDataSource(40.00);
    Processor p;
    try {
        System.err.println("- create processor for the image datasource ...");
        p = Manager.createProcessor(tds);
    } catch (Exception e) {
        System.err.println("Yikes!  Cannot create a processor from the data source.");
        return false;
    }
    I set the content type to WAVE:
    // Set the output content descriptor to WAVE.
    p.setContentDescriptor(new ContentDescriptor(FileTypeDescriptor.WAVE));
    I use the following DataSource:
    class TimeDataSource extends PullBufferDataSource {
        private double duration;
        TimeSourceStream streams[];

        public TimeDataSource(double duration) {
            this.duration = duration;
            streams = new TimeSourceStream[1];
            streams[0] = new TimeSourceStream(40);
        }

        @Override
        public PullBufferStream[] getStreams() {
            return streams;
        }

        @Override
        public void connect() throws IOException {
        }

        @Override
        public void disconnect() {
        }

        @Override
        public String getContentType() {
            return ContentDescriptor.RAW;
        }

        @Override
        public Object getControl(String arg0) {
            return null;
        }

        @Override
        public Object[] getControls() {
            return new Object[0];
        }

        @Override
        public Time getDuration() {
            return new Time(duration);
        }

        @Override
        public void start() throws IOException {
        }

        @Override
        public void stop() throws IOException {
        }
    }

    class TimeSourceStream implements PullBufferStream {
        private AudioFormat format;
        private boolean ended = false;
        private long duration;

        public TimeSourceStream(long duration) {
            this.duration = duration;
            format = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
        }

        @Override
        public Format getFormat() {
            return format;
        }

        @Override
        public void read(Buffer buf) throws IOException {
            buf.setDuration(duration);
            buf.setFormat(format);
            buf.setEOM(true);
            buf.setOffset(0);
            buf.setLength(0);
            ended = true;
        }

        @Override
        public boolean willReadBlock() {
            return false;
        }

        @Override
        public boolean endOfStream() {
            return ended;
        }

        @Override
        public ContentDescriptor getContentDescriptor() {
            return new ContentDescriptor(ContentDescriptor.RAW);
        }

        @Override
        public long getContentLength() {
            return 0;
        }

        @Override
        public Object getControl(String arg0) {
            return null;
        }

        @Override
        public Object[] getControls() {
            return new Object[0];
        }
    }
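    For what it's worth, the read() above sets EOM with a zero-length buffer on the very first call, so the Processor never receives any media data, which would explain a 0-length file: the stream has to actually deliver duration × sampleRate × frameSize bytes. If the goal is just a silent WAV of a known duration, it can be sanity-checked outside JMF with the standard javax.sound.sampled API. A sketch (not JMF code; 16-bit signed PCM is used as a safe choice for the WAV writer, and the output path is made up):

    ```java
    import java.io.ByteArrayInputStream;
    import java.io.File;
    import java.io.IOException;
    import javax.sound.sampled.AudioFileFormat;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;

    public class SilentWav {
        // Write `seconds` of 8 kHz 16-bit mono silence as a WAV file.
        public static void writeSilence(File out, double seconds) throws IOException {
            AudioFormat fmt = new AudioFormat(8000f, 16, 1, true, false);
            long frames = Math.round(seconds * fmt.getFrameRate());
            byte[] silence = new byte[(int) (frames * fmt.getFrameSize())]; // zeros = silence
            AudioInputStream ais =
                    new AudioInputStream(new ByteArrayInputStream(silence), fmt, frames);
            AudioSystem.write(ais, AudioFileFormat.Type.WAVE, out);
        }

        public static void main(String[] args) throws Exception {
            File f = File.createTempFile("silence", ".wav");  // hypothetical output path
            writeSilence(f, 40.0);
            AudioFileFormat aff = AudioSystem.getAudioFileFormat(f);
            System.out.println(aff.getFrameLength() / (long) aff.getFormat().getFrameRate()); // 40
        }
    }
    ```

    A JMF read() would have to do the equivalent: hand out those silence bytes buffer by buffer and only set EOM once the full 40 seconds' worth has been delivered.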

  • Audio and video out of synchronization in iTunes for Windows 8.1

    When playing back a downloaded iTunes file, the audio and video are out of synchronization a lot of the time.  The audio can be several seconds behind the video.
    How can this be corrected/fixed?
    I am running the latest iTunes software on a 3.8 GHz AMD-processor machine running Windows 8.1.

    Hi John,
    Welcome to Apple Support Communities.
    Take a look at the article linked below; it provides a lot of great troubleshooting tips that you can try, which should resolve most video playback issues in iTunes.
    Troubleshooting iTunes for Windows Vista or Windows 7 video playback performance issues
    I hope this helps.
    -Jason

  • How to synchronize audio and video RTP streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?

    camelstrike wrote:
    Does this mean I can't synchronize these? If so is there any way I can change them into Pull...DataSources?

    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end...so you should look into the other side of the equation.
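    To line streams up on the receiving end anyway, the standard RTP mechanism is the RTCP Sender Report, which pairs each stream's RTP timestamp with the sender's NTP wallclock; mapping both streams onto that common wallclock tells you which audio samples and video frames belong together. A sketch of the arithmetic only (all values invented; wrap-around of the 32-bit RTP timestamp is ignored):

    ```java
    public class RtpToWallclock {
        // Map an RTP timestamp to sender wallclock seconds, given one RTCP SR
        // pairing (srRtpTs observed at srNtpSeconds) and the stream's clock rate.
        public static double toWallclock(long rtpTs, long srRtpTs,
                                         double srNtpSeconds, int clockRate) {
            return srNtpSeconds + (double) (rtpTs - srRtpTs) / clockRate;
        }

        public static void main(String[] args) {
            // Audio at an 8 kHz clock: SR said RTP ts 160000 == wallclock 1000.0 s
            double a = toWallclock(168000, 160000, 1000.0, 8000);
            // Video at a 90 kHz clock: SR said RTP ts 900000 == wallclock 1000.2 s
            double v = toWallclock(972000, 900000, 1000.2, 90000);
            // Both map to ~1001.0 s: that audio and that frame play together.
            System.out.println(a + " " + v);
        }
    }
    ```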

  • Synchronize audio and video over RTP

    Hi all,
    I am new to JMF. Could anyone tell me how to synchronize audio and video streams when we capture them with audio and video capture devices, send them over the network via the RTP protocol, and receive them on the client side? How do we synchronize on the client side? Please reply if anyone has a solution in JMF.
    Thanks in advance.

    Does anyone know how this is done? I'm building a videoconferencing application with JMF, and QoS is a required component! So if anyone knows how this is done, please share it in this discussion! All I know so far is that I can buffer the incoming streams to some extent and then call syncStart() to start the players at a common start time!
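    The buffer-then-syncStart idea boils down to a common playout deadline: hold both streams until the later one has buffered, then start both at once. A sketch of that arithmetic (plain Java, not JMF; in JMF the computed instant would be handed to each player's syncStart(Time)):

    ```java
    public class PlayoutDeadline {
        // Hold both streams until the later one has arrived plus a jitter-buffer
        // depth, then start both at once; returns how long each stream must wait.
        public static long[] waits(long audioFirstMs, long videoFirstMs, long bufferMs) {
            long deadline = Math.max(audioFirstMs, videoFirstMs) + bufferMs;
            return new long[] { deadline - audioFirstMs, deadline - videoFirstMs };
        }

        public static void main(String[] args) {
            // Audio's first packet at t=1000 ms, video's at t=1150 ms, 200 ms buffer:
            long[] w = waits(1000, 1150, 200);
            System.out.println(w[0] + " " + w[1]);  // 350 200 -- audio waits longer
        }
    }
    ```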

  • How to synchronize multiple, unorganized audio- and video clips

    I'm editing a video with audio recorded by an external recorder.
    I've been given a lot of unorganized files, and I wonder if there's a quick way to synchronize them all without knowing which audio and video files fit together.

    I've been pretty amazed by how well the synchronize clips command works. You might take a chance and select all of your audio/video files at once, right click and choose Synchronize Clips.

  • How to synchronize audio and video

    Hello,
    I have an application that streams video images. I can also stream what's going on at the other end (the audio).
    Now I want to synchronize the audio and video for properly synchronized output.
    Can I do that? I don't know which APIs are available in DoJa. Are there any time-base or time-synchronization APIs through which I can synchronize both?
    Any suggestions and ideas are welcome.
    Thanks

    I know that in MIDlet development using MMAPI there are APIs available to synchronize players, like
    p1.setTimeBase(p2.getTimeBase())
    But I am not using Players. Through an HTTP connection I get the raw image data and display it through an ImageItem, i.e. no players are required.
    So if I stream the audio like this, how do I synchronize it?
    I really need your help.
    Thanks
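    Without Players for the images, one common approach is to slave the frame display to the audio clock: on each tick, read the audio player's media time and show the latest frame whose timestamp is at or below it. A sketch of just the selection logic (plain Java; in MMAPI the clock reading would come from the audio Player's getMediaTime()):

    ```java
    public class FrameScheduler {
        // Index of the latest frame whose timestamp is <= the audio clock reading.
        // frameTimesMs must be sorted ascending; clamps to the last frame.
        public static int frameFor(long[] frameTimesMs, long audioMs) {
            int i = 0;
            while (i + 1 < frameTimesMs.length && frameTimesMs[i + 1] <= audioMs) {
                i++;
            }
            return i;
        }

        public static void main(String[] args) {
            long[] frames = {0, 100, 200, 300};        // 10 fps frame timestamps (ms)
            System.out.println(frameFor(frames, 250)); // 2 -- the frame at 200 ms
            System.out.println(frameFor(frames, 999)); // 3 -- clamped to the last frame
        }
    }
    ```

    Driving the display from the audio clock rather than a free-running timer means the images automatically skip or hold whenever the audio stalls or jumps.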

  • Audio and video out of sync after import

    I'm trying to bring some H.264 video into a Premiere project. But when I place it in a sequence (a sequence that is created based upon this very clip), the audio and video are about a second out of sync. (It's also odd: no amount of adjustment will bring them in sync. Seemingly the video is running at some variable rate that makes synchronization impossible...?)
    As a test, I tried bringing my video into iMovie, and it works just fine there. But I don't want to use iMovie: I want to use Premiere.
    Any ideas?

    Thanks. Here's the output:
    General
    Format : MPEG-4
    Format profile : Base Media / Version 2
    Codec ID : mp42
    File size : 1.05 GiB
    Duration : 51mn 20s
    Overall bit rate mode : Variable
    Overall bit rate : 2 937 Kbps
    Encoded date : UTC 1904-01-01 00:00:00
    Tagged date : UTC 1904-01-01 00:00:00
    Writing application : HandBrake 0.10.1 2015030800
    Video
    ID : 1
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : [email protected]
    Format settings, CABAC : Yes
    Format settings, ReFrames : 4 frames
    Codec ID : avc1
    Codec ID/Info : Advanced Video Coding
    Duration : 51mn 20s
    Bit rate : 2 606 Kbps
    Width : 720 pixels
    Height : 478 pixels
    Display aspect ratio : 4:3
    Frame rate mode : Variable
    Frame rate : 24.145 fps
    Minimum frame rate : 0.167 fps
    Maximum frame rate : 29.970 fps
    Color space : YUV
    Chroma subsampling : 4:2:0
    Bit depth : 8 bits
    Scan type : Progressive
    Bits/(Pixel*Frame) : 0.314
    Stream size : 957 MiB (89%)
    Writing library : x264 core 142 r2479 dd79a61
    Encoding settings : cabac=1 / ref=3 / deblock=1:0:0 / analyse=0x3:0x113 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=1 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=18 / lookahead_threads=3 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=3 / b_pyramid=2 / b_adapt=1 / b_bias=0 / direct=1 / weightb=1 / open_gop=0 / weightp=2 / keyint=240 / keyint_min=24 / scenecut=40 / intra_refresh=0 / rc_lookahead=40 / rc=crf / mbtree=1 / crf=20.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / vbv_maxrate=62500 / vbv_bufsize=78125 / crf_max=0.0 / nal_hrd=none / filler=0 / ip_ratio=1.40 / aq=1:1.00
    Encoded date : UTC 1904-01-01 00:00:00
    Tagged date : UTC 1904-01-01 00:00:00
    Color range : Limited
    Color primaries : BT.601 NTSC
    Transfer characteristics : BT.709
    Matrix coefficients : BT.601
    Audio #1
    ID : 2
    Format : AAC
    Format/Info : Advanced Audio Codec
    Format profile : LC
    Codec ID : 40
    Duration : 51mn 20s
    Bit rate mode : Variable
    Bit rate : 132 Kbps
    Channel(s) : 2 channels
    Channel positions : Front: L R
    Sampling rate : 48.0 KHz
    Compression mode : Lossy
    Stream size : 47.9 MiB (4%)
    Title : Stereo / Stereo
    Language : English
    Encoded date : UTC 1904-01-01 00:00:00
    Tagged date : UTC 1904-01-01 00:00:00
    Audio #2
    ID : 3
    Format : AC-3
    Format/Info : Audio Coding 3
    Mode extension : CM (complete main)
    Format settings, Endianness : Big
    Codec ID : ac-3
    Duration : 51mn 20s
    Bit rate mode : Constant
    Bit rate : 192 Kbps
    Channel(s) : 2 channels
    Channel positions : Front: L R
    Sampling rate : 48.0 KHz
    Bit depth : 16 bits
    Compression mode : Lossy
    Stream size : 70.5 MiB (7%)
    Title : Stereo / Stereo
    Language : English
    Encoded date : UTC 1904-01-01 00:00:00
    Tagged date : UTC 1904-01-01 00:00:00
    Menu
    ID : 4
    Codec ID : text
    Duration : 51mn 20s
    Language : English
    Encoded date : UTC 1904-01-01 00:00:00
    Tagged date : UTC 1904-01-01 00:00:00
    Bit rate mode : CBR
    00:00:00.000 : Chapter 2
    00:01:50.877 : Chapter 3
    00:03:59.218 : Chapter 4
    00:13:26.285 : Chapter 5
    00:20:57.685 : Chapter 6
    00:32:58.489 : Chapter 7
    00:39:14.031 : Chapter 8
    00:50:51.611 : Chapter 9
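    The "Frame rate mode : Variable" line above is the likely culprit: if the editor conforms the clip to a constant 24.145 fps, frame n is placed at n/24.145 s while the real timestamps wander between 0.167 and 29.970 fps, so the audio/video error grows with running time and no fixed offset can correct it. A sketch of how the drift accumulates (the per-frame durations below are invented):

    ```java
    import java.util.Arrays;

    public class VfrDrift {
        // Seconds of drift after playing all frames if a VFR clip's true
        // per-frame durations are replaced by a constant frame rate.
        public static double drift(double[] realDurations, double assumedFps) {
            double realTime = 0;
            for (double d : realDurations) realTime += d;
            double assumedTime = realDurations.length / assumedFps;
            return realTime - assumedTime;
        }

        public static void main(String[] args) {
            // 1000 frames that really averaged 30 fps, conformed to 24.145 fps:
            double[] real = new double[1000];
            Arrays.fill(real, 1.0 / 30.0);
            double d = drift(real, 24.145);
            System.out.println(Math.round(d * 100) / 100.0);  // -8.08
            // The frames really span ~33.3 s of audio but get stretched over
            // ~41.4 s of video, so the sync error keeps growing with running time.
        }
    }
    ```

    The usual fix is to re-encode the source to constant frame rate (HandBrake has a CFR option) before importing into Premiere.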

  • Adobe Presenter - audio and video

    Could someone please tell me the difference between adding
    audio and video to a PP presentation via PowerPoint vs. importing
    and adding audio and video using the Presenter tab?
    Also, I'm looking for a list of things that won't work when
    publishing. For example, I noticed that animations, slide
    transitions and QuickTime movies won't work after publishing.
    Thanks SO much.
    Leona

    Check out the documentation on Presenter -- there are some
    very specific ways you need to add audio (use "Record Audio" or
    "Edit Audio" in the Presenter menu) and video (use "Insert video"
    in the Presenter menu).
    Powerpoint has facilities to do these things as well, but
    Presenter won't be able to publish them, from what we've
    determined.
    For animations, again, read the documentation carefully --
    all your animations have to be set to "On Click" or "With Previous"
    -- i.e., they can't be timed animations. Again, Presenter won't be
    able to translate these timed animations -- you must "click" them
    into view (you do this as you're recording audio with "Next
    Animation", or you can do it after recording audio using
    "Synchronize Audio").

  • My synchronized clips (audio and video) have lost their compound clip icon after I bought the full version of FCP X, why is that?

    I have noticed a few changes to my FCP workflow after installing FCP X over the trial version I had before. When I sync audio and video files together via Clip > Synchronize Clips, the icon at the top left corner of the thumbnail is no longer there as it was in the trial version. I also have to right-click and select 'Open in Timeline' to work on the files. I much preferred clicking the thumbnail's icon in the Event Library and having it open in the timeline. Also, Stabilization is missing from my video effects! How odd.

    I no longer have the trial version; that is the point. I bought the release version about 6 weeks ago. It still doesn't explain why the stabilization option randomly appears for some clips and not for others. I would also like to know if I can get the compound icon back on the thumbnails for synchronized clips.

  • Synchronizing several audio and video files automatically in premiere?

    Hi there!
    I'm working on a short film shot on the Arri Alexa, and all my audio is in separate audio files, though synced through timecode. So I'm wondering: in Premiere CC, is there a way to automatically synchronize several audio and video files using timecode (as you can in Avid), without having to merge them individually? This would save me a lot of time.

    You can try using the "create multicamera sequence" command. That will create "multiclips" of your footage with its related audio. It can sync with timecode or use the audio for synchronization. When you need to export the edit for color grading, you can select all of them and choose multicamera>flatten. But beware that if you apply effects, such as speed ramp or intrinsic effects, those will be removed on a "flatten" command.

  • Audio and Video syncing issues

    I have a video file where the audio and video file are normal when I play the clip in Quicktime. When I import the file into an FCP project and drag it into the timeline, the audio and video are no longer synced. My sequence setting is ProRes422 as is my video clip. The only difference I can see is that my clip is 24 bit integer and the sequence is 32 bit floating point. Although, this is my 50th project capturing old BetacamSP tapes and all my projects say this and I haven't had a problem until this particular file and not afterwards. Any thoughts?

    There you go then ... make sure you monitor audio and video by the same path, i.e. both via the Kona card or both on the Mac, or use the Playback Offset to compensate for the delay between the Canvas window and the Kona output.
    From the manual:
    To set the frame offset between the computer display and the external video and audio outputs:
    1. Choose Final Cut Pro > System Settings.
    2. Click the Playback Control tab.
    3. Enter a number of frames in the Frame Offset number field.
       Frame offset can be any whole number between 0 and 30. The default value is 4. For example, if a video monitor connected to your DV camcorder shows your program four frames later than your computer display, a frame offset of 4 will synchronize the two.
       Note: Depending on your external monitor configuration, you may need to experiment with frame offset values to synchronize the external monitor and the computer display.
    4. Click OK.
    5. Play your video in the Canvas or Viewer and compare the video offset between your external monitor and your computer display.
    If the computer display and external monitor are still not synchronized, repeat steps 1 through 5 using different frame offset values until the display and monitor are in sync.
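    Note that the Frame Offset is specified in frames, so its real-time effect depends on the sequence frame rate; a quick conversion (the rates below are just examples):

    ```java
    public class FrameOffset {
        // Convert a Playback Offset in frames to milliseconds at a frame rate.
        public static double offsetMs(int frames, double fps) {
            return frames * 1000.0 / fps;
        }

        public static void main(String[] args) {
            System.out.println(offsetMs(4, 25.0));   // 160.0 (ms at PAL 25 fps)
            System.out.println(offsetMs(4, 29.97));  // roughly 133.5 ms at NTSC 29.97 fps
        }
    }
    ```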

  • Can I receive both audio and video in the same window?

    Hi,
    I'm using AVTransmit2 and AVReceive2 for transmission and receiving. Everything works, but the problem is that AVReceive2 shows the audio and video transmissions in two different windows and uses different players for audio and video. Can you tell me how I can merge audio and video at the receiving end? I'm already merging audio and video at the transmission end. Can anyone tell me how to solve this problem?
    Thanks in advance.

    Hi,
    I have the same situation - to use one window to show both rtp audio and video stream, but use the ControlPanelComponent of audio player to control both players.
    I want to use the audio player as the master player so that the video player could be synchronized with it. So: audioPlayer.addController(videoPlayer).
    In the single window frame for both players, I want to show the VisualComponent (VC) of the video and the ControlPanelComponent (CC) of the audio. So I add both components, respectively, to the window frame after I get them from each player. Because the streams come from the network over RTP and do not become realized at the same time, I first add the VC and CC of the first realized player (whether it is audio or video) to the window frame, and then change them to the desired video VC and audio CC.
    Though I check that both players are realized and not started before I call the addController() method, the application gets stuck in this method. Of course I call it in a try block, but no exception is thrown.
    When I use a TimeBase to synchronize them, i.e., videoPlayer.setTimeBase(audioPlayer.getTimeBase()), everything works fine. But I cannot control both players from one ControlPanelComponent this way.
    Anybody with any idea how to solve this problem, please reply.
    By the way, in example code such as AVReceiver, where the audio and video streams are shown in two separate windows, the user can use the ControlPanelComponent of the audio to control both players, though there seems to be no code to support that function. The ControlPanelComponent of the video cannot control the audio player, however. Does anyone have an explanation for this?
