Align audio to video

I had a project where the audio did not record. I took a cassette tape of the event, imported it into a new movie, and then brought it into iMovie as a clip. I extracted the audio and aligned it with the start of the video clip. The audio starts to run ahead noticeably, and by the end of the clip it is about four seconds ahead. Is there a way to stretch or compress the audio so that it lines up with the video from start to finish?
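If the drift grows steadily, a single constant tempo change will bring the audio back in line. As a rough sketch of the arithmetic (the 40-minute clip length is an assumed example, not a figure from the question), the percentage to enter in an effect such as Audacity's Change Tempo or Change Speed works out like this:

    public class AudioDriftFix {
        public static void main(String[] args) {
            // Assumed example: a 40-minute video clip whose extracted audio
            // has drifted 4 seconds ahead by the end (the audio plays too fast).
            double videoSeconds = 40 * 60.0;   // length the audio should occupy
            double driftSeconds = 4.0;         // how far ahead the audio ends up

            // The audio currently occupies (videoSeconds - driftSeconds) seconds,
            // so it must be slowed down by this percentage to match the video:
            double percentChange = -100.0 * driftSeconds / videoSeconds;

            // About -0.167% for these numbers.
            System.out.printf("Enter roughly %.3f%% as the tempo change%n", percentChange);
        }
    }

A negative value slows the audio down (lengthens it), which is what is needed here, since the audio runs ahead of the video.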

Well . . . "Built-in Audio" is the option you need to select, and unless you have another audio device plugged in (by that I mean a USB or FireWire audio interface) it will be the only one you get. You will never be able to select your MiniDisc player directly from Audacity's input selection; it doesn't work like that.
By selecting "Built-in Audio" in Audacity, you are selecting either the built-in mic on your PowerBook or the line-in socket, depending on which option you have selected in System Preferences. If you have it set up as I suggested in my earlier post, it should work. If you have got it to work once and haven't changed any settings, it should work again. Make sure you check the following:
1: System Preferences > Sound > Input should be set to "Line In".
2: Your MiniDisc's output is plugged into the PowerBook's line-in, with the volume set at about 80%.
3: You should get a signal in Audacity if you have followed these steps.
Any problems should be solved by the help pages on Audacity's website: http://audacity.sourceforge.net/help/
If Audacity just isn't working for you, you might try Sound Studio (which may have come free on your Mac), or it can be downloaded from http://www.felttip.com/products/soundstudio/ for a 15-day free trial.
You could even have a bash at doing it with GarageBand, if you have it.

Similar Messages

  • Aligning audio tracks to video

    In Final Cut I import my video and audio from my camera. I then export the audio to GarageBand and edit it (I don't change the length of the audio, I just try to make it sound better by adding reverb and EQ), then I export that audio to iTunes and import it back into Final Cut.
    I am having problems lining up the two audio tracks in Final Cut. I'm working with music videos, so the audio and video must be in sync. In GarageBand there is a feature called "snap to grid" that lets me slide my track to a desired increment of time; I can choose whether or not to snap, and by not snapping to the grid I am able to line up tracks exactly where I want them. Is there a feature like that in Final Cut? I need to be able to slide the newly imported track so that it lines up exactly with the original audio.
    Any input would be greatly appreciated!!!

    Hello Nate
    In FCP the "n" key is used to toggle on/off the snapping function.
    Question: Why the convoluted workflow? How come you don't just roundtrip your audio out of FCP directly to Soundtrack Pro (instead of GarageBand) and back again?
    Best
    Andy

  • Sync Music audio with Video Audio PE10

    Hi.
    I am new to Premiere Elements 10 (I changed from Corel to Adobe, so I'm not new to video editing, just not used to PE10).
    I film a lot of my daughter's circus performances, so I often find the music number used at the performance and merge it with the video audio in the project.
    But I have a problem syncing the music from the audio track on the video with the separate music track.
    To be specific: I want both audio tracks running at the same time, so that I keep the clapping and the whistling from the audience but get the better quality of the music from the external file merged with the music in the video-audio clip.
    Is there a special tool, or can someone help me with a special technique?
    Regards, Settemand

    Along with the instructions offered above, when nudging the Audio I would make sure that Snap is OFF, as Snap will try to align your Audio with the beginning (Head) or end (Tail) of your Video Clips. That is useful for many editing operations, but for nudging Audio it will frustrate you. The Snap function is a toggle with the S key. Just do not forget to turn it back ON when done.
    Also, though the Soundtrack Audio Track seems to be the ideal location for your extra music, some users have experienced issues, when using it, or the Narration Audio Track. While it seems that those issues are Project-specific, and those Tracks often work well, to keep from being tripped up later, I recommend putting your music on a free Audio Track, such as Audio 2 - just to be safe.
    Also, with either an Audio or Video Track, you can Zoom IN/OUT horizontally with the Timeline Zoom slider, AND can also Zoom IN/OUT vertically, by going to the Track's Header (where the Track Name is located), hovering your cursor over the little line between the Tracks until it changes to an equal sign with up/down arrows, then clicking and dragging to enlarge the Track (and the Clip display) vertically. I find that this makes viewing the Audio Waveform Display easier, and the peaks are more obvious. When done, you can just reverse that operation to Zoom OUT on the Track vertically, so that it looks like it did before you started.
    Good luck. This FAQ Entry might be useful; although it talks about OOS (Out Of Sync) Audio, the functions are the same for syncing additional Audio: http://forums.adobe.com/thread/436751?tstart=0
    Hunt

  • Grouping Audio and Video tracks?

    iMovie HD:
    When I insert new clips or edit existing ones, my imported audio tracks lose sync with the clips that come after the point of the edit. I am continuously re-aligning the audio manually, which is very time consuming.
    Is there a way to group or link audio and video so that the imported audio stays in sync with my clips?
    I understand that it may not be possible for the clip I am working on, but what about the clips after the clip I'm editing?
    Christo

    Found the answer: "Lock audio clip at playhead"

  • Audio and Video are Out of Sync

    Hello,
    This seems to be a recurring problem for me and I'd love some help getting it resolved once and for all. After capturing footage, the audio and video are out of sync in the clip AND when I drag it to the timeline. When I go through each part, it looks like the video hesitates often, which seems to be causing the misalignment of the audio. But once it is moved into alignment, it goes out of sync again throughout the remainder of the one-hour video.
    I'm capturing through a Canon Optura 50 MiniDV camera and editing with FCP 6.0.1. I have checked that the audio mode is set to 16-bit on the camera, and the capture settings have the audio rate set to 48 kHz.
    This same problem occurred when I was capturing another event last week.
    PLEASE advise. It won't stay aligned even if I spend hours and hours moving the audio left and right, frame by frame.

    I am still wondering why you're recommending that I use an external drive for this project.
    Because the internal has more than enough to do running Mac OS X and the applications. It is constantly writing files. While DV video is at the lower end of the scale, it is still asking a lot of the drive's read and write head.
    I always used external drives, but we had many problems with delays and processing problems.
    At the very least, you should be using FireWire external drives. USB is no good for video as it delivers data in packets, not the sustained stream that video needs. The drives must also be formatted as Mac OS Extended. Windows FAT 16 or FAT 32 formatted drives have file size limits of 2 GB and 4 GB respectively; anything larger is segmented and can cause problems on a Mac.
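    To put those file-size limits in perspective, here is a back-of-the-envelope sketch (the 3.6 MB/s figure is the usual approximation for DV with audio, not a number from this thread) of how little DV footage fits under the FAT32 4 GB ceiling:

        public class DvCaptureSizeCheck {
            public static void main(String[] args) {
                // Approximate DV data rate: ~25 Mbit/s video plus audio, about 3.6 MB per second.
                double dvMBPerSecond = 3.6;
                double fat32LimitMB = 4 * 1024.0;    // 4 GB FAT32 file-size ceiling

                double maxCaptureSeconds = fat32LimitMB / dvMBPerSecond;
                System.out.printf("FAT32 caps a single DV capture file at about %.0f minutes%n",
                        maxCaptureSeconds / 60);     // roughly 19 minutes

                double oneHourGB = dvMBPerSecond * 3600 / 1024;
                System.out.printf("One hour of DV is roughly %.1f GB%n", oneHourGB);  // about 12-13 GB
            }
        }

    So an hour-long capture simply will not fit in one file on a FAT-formatted drive, which is one reason Mac OS Extended is recommended for scratch disks.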

  • How do I use my web cam to record audio and video? I can get the video to work but no sound.

    I want to record both audio and video at the same time. The video works fine but not the audio. I do use Skype without any problems using both.

    PwmBoat,
    This made me smile, because things like this do make me smile...  I never make videos on YouCam, but I did this time, so I could test this one out.
    You did not say what webcam software you are using...
    ======================================================================
     Video and audio recording works on YouCam V3.5.1.4606
    Make sure you have turned on a working microphone in Settings:
    1: In YouCam, open Settings (right side).
    2: At the lower right, select a working microphone from the drop-down list.
    3: Check "Capture with audio", then click OK.
    You can also buy (for $38 US) YouCam 5.0 from the Cyberlink.com website and it has audio/video and all sorts of other goodies, too.  (They have 3.x as well, but why buy what HP already gives you?)
    ==========================================================================
    I hope this helps!                                 
    Kind Regards,
    Dragon-Fur

  • My shuffle doesn't get synchronized with iTunes on my PC (I get an audio and video settings error), but it does synchronize with my friend's PC


    Try:
    - iOS: Not responding or does not turn on
    - Also try DFU mode after trying recovery mode:
    How to put iPod touch / iPhone into DFU mode « Karthik's scribblings
    - If not successful and you can't turn the iOS device fully off, let the battery fully drain. After charging for at least an hour, try the above again.
    - Try another cable.
    - Try on another computer.
    - If still not successful, that usually indicates a hardware problem, and an appointment at the Genius Bar of an Apple Store is in order:
    Apple Retail Store - Genius Bar
    The missing apps could be hidden by Restrictions. If the backup was made with those restrictions set, the Restrictions are also restored.
    Thus, if you get it to work, restore to factory settings/new iPod, not from a backup.
    You can redownload most iTunes purchases by:
      Downloading past purchases from the App Store, iBookstore, and iTunes Store

  • Is the Mini DisplayPort able to transfer audio and video to an external monitor (for example, a TV) from a 2011 13" MacBook Pro with an HDMI cable?

    I was wondering this because I want to hook up my 2011 MacBook Pro to an external monitor (TV) using an HDMI cable.  This would be helpful to know before I buy a Mini DisplayPort to HDMI adapter and then maybe find out that the Mini DisplayPort might not transfer audio, only video.

    Yes, Mini DisplayPort supports audio and video for the 2011 13" MBP via HDMI.

  • I am new to iMovie. When I make a video using the mic and camera in my MacBook Pro, the audio and video are not in sync. Very annoying. How can I fix this?


    DVD drives read the bottom of the disk, not the top. Smooth out the paper & try again.

  • Premiere Pro CS6 importing/ audio and video problems

    I recently recorded gameplay from my Xbox. When I open this recording using Windows Media Player it states that the video is 3 hours 30 minutes long, which is correct. When I then import it into Premiere Pro, Premiere Pro changes it to be 5 hours and 44 minutes long, and when I play it back it plays in slow motion and the audio and video start to desync. I thought it might just be the Source Monitor playing up, so I made a clip and put it into the timeline, and it still had the same outcome. The speed/duration is still at 100%, so I'm not sure what to do. Does anyone have any idea what this is, or why it is happening?

    What are the attributes of your PVR video...
    specifically, is it variable frame rate?
    Media file evaluation utilities:
    GSpot
    http://www.headbands.com/gspot/
    MediaInfo
    http://mediainfo.sourceforge.net/en
    If it is VFR, Premiere no-likey.
    Variable frame rate video problems:
    http://forums.adobe.com/message/4897424#4897424
    http://forums.adobe.com/message/4071506#4071506

  • How to get audio and video in one file

    Hi... I am a Premiere beginner and I have a simple question, I think :)... Every time I export media from Premiere (using Media Encoder) I get a video file and some audio files, but what I need is only ONE file that contains both audio and video. For example, I have edited a film in HD quality and I want to export it; I choose the H.264 format (make some settings: PAL, audio quality...), and when it's all done, the only result I get is one video file and two audio files, and they are not together :)... So please, can anyone help me? As you can see I am not from an English-speaking country, so sorry for my English. Thanks for any help.

    There is no indication that something is out of sync, nor any ability to move/slip it back into sync as there is in FCE/FCP.
    To the best of my knowledge, using match frame and overwriting the video with video+audio (or attaching it as a connected clip if you don't want to overwrite the video, which might have effects applied) is the best way to do this.
    You can use the blade tool first in the timeline to get just the section you want to get back into sync. Make cuts so that you have a single clip that needs the audio resynced, select it, and use shift-F to select the original synced clip in the Event Browser. Make sure your play head is at the beginning of this video clip in the timeline. Press option-2, then 'q', and now you have the audio back in your timeline, synced with the video.  Delete the old, no-longer-synced audio and you should be set.

  • Do I need some extra hardware interface for receiving both audio and video?

    Hi, I'm doing an e-learning project. I have to capture video from a webcam and voice from a headset mic and send them to a client.
    My code works fine for either one at a time.
    Do I need some extra hardware interface for receiving both audio and video? I'm using the AVTransmit and AVReceive code found on this site.
    After running Tx, if I give "dsound:// & vfw://0" in the Media Locator, only sound is received and no video; and when I give "vfw://0" in the Media Locator, only live video is transmitted.
    I'm using JMF 1.1.2e.
    If anyone knows how to make this work, or knows the cause, please reply soon; I will be very thankful. (A software-only approach using a merging DataSource is sketched after this thread item, below.)
    Transmitter/server-side code; first run Tx on the server:
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
         targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
    String host= InetAddress.getLocalHost().getHostAddress();     
         tf_local_host.setText( host);
         } catch( UnknownHostException e) {
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
         if( source == addTarget) {
         String ip= tf_remote_address.getText().trim();
         String port= tf_remote_data_port.getText().trim();
         String localPort= tf_data_port.getText().trim();
         addTargetToList( localPort, ip, port);
         if( avTransmitter != null) {
         avTransmitter.addTarget( ip, port);
         } else if( source == removeTarget) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              if( avTransmitter != null) {
         avTransmitter.removeTarget( target.ip, target.port);
              targets.removeElement( target);
              listModel.setData( targets);          
         } else if( source == startXmit) {
         if( startXmit.getLabel().equals( "Start Transmission")) {          
         int data_port= new Integer( tf_data_port.getText()).intValue();
              avTransmitter= new AVTransmitter( this, data_port);
         avTransmitter.start( tf_media_file.getText().trim(), targets);          
              avTransmitter.setLooping( cb_loop.isSelected());
         startXmit.setLabel( "Stop Transmission");
         } else if( startXmit.getLabel().equals( "Stop Transmission")) {
              avTransmitter.stop();
              avTransmitter= null;
              removeNonBaseTargets();
              listModel.setData( targets);
         startXmit.setLabel( "Start Transmission");          
         } else if( source == rtcp) {
         if( rtcpViewer == null) {
         rtcpViewer= new RTCPViewer();
         } else {
              rtcpViewer.setVisible( true);
              rtcpViewer.toFront();
         } else if( source == cb_loop) {
         if( avTransmitter != null) {
              avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
         Target target= (Target) targets.elementAt( i - 1);
         if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
         i--;
    public void addTargetToList( String localPort,
                             String ip, String port) {     
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                  port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);           
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
         rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
     // ==== AVTransmitter.java ====
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
    * Starts the transmission. Returns null if transmission started ok.
    * Otherwise it returns a string with the reason why the setup failed.
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
    * Use the RTPManager API to create sessions for each media
    * track of the processor.
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
          rtpMgrs[i].initialize( localAddr);
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
    * Stops the transmission if already started
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
         // Do we have atleast one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
    * For JPEG and H263, we know that they only work for particular
    * sizes. So we'll perform extra checking here to make sure they
    * are of the right sizes.
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
    * Setting the encoding quality to the specified value on the JPEG encoder.
    * 0.5 is a good default.
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
    * Convenience methods to handle processor's state changes.
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
    * Inner Classes
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
         // If there was an error during configure or
         // realiz

    I do this all the time; I connect my MBP to a 60-inch Sharp. If you have the video working, do the simple thing first: check to make sure the sound is turned up on both your TV and your Mac. Then, if that doesn't work, go to System Preferences, open Sound, go to the Output tab, and see if your TV is listed; if it is, change the output to that setting.
    Hope It Works
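
    For the JMF question above, the usual approach is not extra hardware but merging the two capture sources in software: create a DataSource for the audio device and one for the video device, combine them with Manager.createMergingDataSource(), and hand the merged source to the processor so it carries an audio track and a video track at once. A minimal sketch (the locator strings come from the post; the rest is an untested outline, not a drop-in replacement for AVTransmitter):

        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Processor;
        import javax.media.protocol.DataSource;

        public class MergedCaptureSketch {
            public static void main(String[] args) throws Exception {
                // One DataSource per capture device, using the locators from the post.
                DataSource audio = Manager.createDataSource(new MediaLocator("dsound://"));
                DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));

                // Merge both capture sources into a single DataSource so that one
                // Processor can expose both an audio track and a video track.
                DataSource merged = Manager.createMergingDataSource(
                        new DataSource[] { audio, video });

                // The merged source replaces the single MediaLocator; from here the
                // flow is the same as AVTransmitter.createProcessor(): configure,
                // set the RAW_RTP content descriptor, realize, and create one RTP
                // send stream per track.
                Processor processor = Manager.createProcessor(merged);
            }
        }

    In AVTransmitter this would mean letting start() accept a ready-made DataSource (or building the merged source inside createProcessor()) instead of constructing one from a single locator string.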

  • How can I get Netflix streaming audio? The video is fine, but the audio coming through is from the TV (Dish)


    It's exactly as I stated. Whenever I try to drag these kinds of loops (EXS24 / software instrument loops, the ones marked in green with the white music note next to them) from the loop browser into the timeline, a message comes up saying Audio Not Found for that loop.  A new track is created automatically when loops are dragged into the timeline, so I'm not creating some other random/synth instrument track, so I'm not sure what the deal is... But perhaps I'll try creating a software instrument track first and then drag the loop into that track and see what happens - maybe there's something in the default settings that automatically creates audio tracks whenever loops are imported?

  • OT: What viewers do you need to use ALL audio and video files?

    I had a lot of trouble with audio and video files on my PC. I now have a MacBook Pro, which is a big improvement. But I still frequently encounter files that I can't use. I try to open them, but Apple doesn't know what to do with them. For example, I often encounter unusable videos while perusing news websites. I also have some audio files with .mva or .mvw extensions that don't work.
    So what viewers/players do I need to download in order to be able to play every audio/video file that comes my way? iTunes handles .mp3 files, and I also have QuickTime installed. I just viewed another video after I installed a Flash plugin. What else do I need to install?
    Thanks.
    www.geobop.org - Family Websites
    www.invisible-republic.org - Adult political websites (Mature adults only)

    VLC is a big favorite
    http://www.macupdate.com/app/mac/5758/vlc-media-player

  • How do I find the drivers for Windows 8 audio and video to uninstall?

    I need to uninstall the Windows 8 drivers for audio and video so that I may reinstall them, but I don't know where to go to find them in the first place.
    Also, if I purchase the program for updating drivers, will that program also install the new audio/video drivers and uninstall the former ones so that I don't have to go through so many steps by myself? I find the directions for doing this myself very complicated.

    Hi, I would like to assist you on this matter; however, I would need some information from you first.  I see that you are running Windows 8. What is the make and model number of the computer that you are using?  The Windows 8 that you are using, is it a 32-bit or a 64-bit operating system?  Please write back and I will be happy to research this issue.
    Thank you
    Waterboy71
    W a t e r b o y 71
    I work on behalf of HP
