Separate audio and video.

How do you separate an audio clip from the video clip before you import it into your project? Any and all help is greatly appreciated.

If what you're asking is how to select video without audio or audio without video from a clip in your Browser and bring only that element into your timeline, just click the tab(s) in the patch panel area at the left end of the timeline to disconnect the element of the clip (audio or video) you don't want to use. Then bring the clip into your timeline in the usual way.

Similar Messages

  • Separating audio and video

    It's been a while since I've used CS4, but I remember being able to separate audio and video right in the source monitor. There were icons there that I can't seem to locate now. What could cause this change?

    I don't have CS4 in front of me right now, but it seems to me that there is a Take Audio only/Video only/Audio and Video pulldown menu near the bottom of that monitor which will control what gets added to the TL.
    In CS5 we added icons there to facilitate the process.

  • Capture audio and video files to separate files

    Forum,
    Having suffered problems with OS 10.5 and QuickTime 7.4.1, I have done a clean install with 10.4.11.
    When I select capture, the following notice appears:
    "Audio only capture selected, video preview disabled"
    In System Settings, "Capture audio and video to separate files" is NOT checked.
    But that may not mean the same as the statement in the capture window.
    I have searched Help and a book called "Optimizing Your Final Cut Pro System" but cannot find an answer.
    Advice would be appreciated.
    Thank you
    Michael Craven

    Hi,
    Check this:
    In the Log & Capture window, select the Clip Settings tab and make sure the Video box is checked.
    Cheers,
    G.

  • AV digital adapter plays audio only on iPod touch but both audio and video on iPad

    The AV digital accessory outputs audio only on the iPod touch but works fine on the iPad, showing both audio and video. Why is that?

    To try and keep it simple, there are 3 variables here.
    1) Acquisition - You can either record with a stereo or mono mic, you've probably used mono.
    2a) Capturing - In the 'clip settings' tab of the log and capture window, you can choose how you want to capture your audio, choosing from the 'Audio Format' drop down menu.
    2b) Editing - If you've already captured some material, check the audio tracks in the timeline. If, for example, a clip has audio tracks a1 and a2, and you can see 2 sets of small triangles at the start and end of the clip, you have stereo audio.
    (Double-click the audio track and in the viewer you will see a 'stereo a1a2' tab with linked stereo - any changes you apply here apply equally to both tracks - although you can 'pan' the audio using the 'pan slider'; for example, if you want the sound of a car approaching to start in the left speaker, then move to the right speaker as it passes on screen.)
    If the triangles aren't there, double-click the audio and in the viewer you'll have 2 separate mono tracks - mono a1 and mono a2 - assuming you haven't captured the audio format as 'mono mix', ch1(L) or ch2(R).
    You can link separate mono tracks in the timeline by selecting the audio clips and hitting opt (alt) L (similarly, you can unlink a stereo track by the same procedure).
    If you've got a single mono track, you can duplicate the track and then link, pan, etc as required - can't remember exactly as it's ages since I captured single channels.
    3) Output - The decisions you make in 2) determine how your content will be output.
    I'm by no means an audio expert so I hope this is clearer than mud.

  • Audio and video not linked by default?

    Sometimes the audio and video tracks seem to be linked and sometimes they are not, without me unlinking them. I have also noticed that sometimes I am unable to link tracks back together. Are there common reasons why my video and audio might behave this way?
    Thanks.

    Tom Wolsky wrote:
    You can only link one track of video to up to 24 separate items of audio as long as they are not linked to any other video
    AND as long as those audio clips are on separate tracks.
    Clips that are captured with Video and Audio are Linked by default.
    You can tell whether clips are linked in the Timeline: linked clips have an underlined clip name, while unlinked items are not underlined.
    Linked Selection can only select linked clips. So if Linked Selection is ON (see luca's screengrab earlier in this thread), clicking an underlined audio clip will also select its linked video clip (and any other linked audio clips).
    Now on how clips become unlinked. One example: when you have a linked clip in the timeline (with 1 video and 2 audio layers) and you cut somewhere in the middle of the audio portions of that clip, you'll notice that the left part of the audio clips is still linked to the video clip (all underlined clip names), but the right part of the audio clips is unlinked from the video (not underlined). This is because the video can only have a link to 1 audio clip per audio track...
    Hope this makes sense...
    Rienk

  • No audio and video with iChatAV connecting to one computer

    Hi all, very strange issue here. I have two new alu iMacs, one 2.4 24" and one 2.0 20". I set them up at home and everything worked fine. I set up two separate AIM accounts and got audio and video iChat working with no problems at all. I got it working between my two computers and with another one. I then moved my 20" iMac to my office. I am now only able to send text messages, no audio and video with iChat any more. I get a message that the account doesn't answer. Of course we accept and everything. The strange thing is that both my computers do get audio and video working when connecting to my friend's iMac with iChatAV, just not when connecting to each other. When we tried to make a conference call, my friend had to invite us, and then my home computer could see and hear me at the office and my friend, while my friend and I could see and hear each other but could only hear my home computer. What could be wrong here?

    Can you mark the thread as Answered then please.
    10:13 PM Friday; January 25, 2008

  • Synchronizing several audio and video files automatically in Premiere?

    Hi there!
    I'm working on a short film shot on the Arri Alexa, and all my audio is in separate audio files, though synced through timecode. So I'm wondering: in Premiere CC, is there a way to automatically synchronize several audio and video files using timecode (as you can in Avid), without having to merge them individually? This would save me a lot of time.

    You can try using the "create multicamera sequence" command. That will create "multiclips" of your footage with its related audio. It can sync with timecode or use the audio for synchronization. When you need to export the edit for color grading, you can select all of them and choose multicamera>flatten. But beware that if you apply effects, such as speed ramp or intrinsic effects, those will be removed on a "flatten" command.
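    If you ever need to check an offset by hand, timecode-based syncing boils down to converting each clip's start timecode into a frame count at the project frame rate and comparing the two. A minimal sketch in Java (a hypothetical helper, not a Premiere or Avid API; it assumes non-drop-frame timecode and a frame rate common to both files):

    // Hypothetical helper for illustration only - not part of Premiere or Avid.
    public class TimecodeSync {
        // "HH:MM:SS:FF" -> absolute frame count (non-drop-frame assumed)
        static long toFrames(String tc, int fps) {
            String[] p = tc.split(":");
            return ((Long.parseLong(p[0]) * 60 + Long.parseLong(p[1])) * 60
                    + Long.parseLong(p[2])) * fps + Long.parseLong(p[3]);
        }
        public static void main(String[] args) {
            int fps = 25;                              // e.g. 25p Alexa material
            long video = toFrames("10:04:12:08", fps); // video clip start TC
            long audio = toFrames("10:04:10:00", fps); // audio file start TC
            // The audio sample matching the first video frame sits this many
            // frames into the audio file (58 frames here at 25 fps).
            System.out.println("offset in frames: " + (video - audio));
        }
    }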

  • Audio and Video on the same RSS feed

    I'm trying to place audio and video podcasts (MP3 and MP4) on the same RSS feed. The feed validates. When I do audio only, it works fine. When I add a video to the feed, only the video displays in iTunes.
    Do I have to run two separate feeds?
    Thanks

    You can't play both the audio and the video simultaneously, you'll need to merge them.
    Neil

  • How do I use my web cam to record audio and video? I can get the video to work but no sound.

    I want to record both audio and video at the same time. The video works fine but not the audio. I do use Skype without any problems using both.

    PwmBoat,
    This made me smile, because things like this do make me smile...  I never make videos on YouCam, but I did this time, so I could test this one out.
    You did not say what webcam software you are using...
    ======================================================================
     Video and audio recording works on YouCam V3.5.1.4606
    Make sure you have turned on a working microphone in Settings:
    YouCam > Right-Side   Settings >
    Right-Side, Lower-Right, Select a working Microphone from the Drop down list
    AND
    CHECK "Capture with audio"
    OK
    You can also buy (for $38 US) YouCam 5.0 from the Cyberlink.com website and it has audio/video and all sorts of other goodies, too.  (They have 3.x as well, but why buy what HP already gives you?)
    ==========================================================================
    I hope this helps!                                 
    Kind Regards,
    Dragon-Fur

  • My shuffle doesn't get synchronized with iTunes on my PC (audio and video settings error), but it syncs on my friend's PC

    My shuffle doesn't get synchronized with iTunes on my PC; I get an audio and video settings error, but I am able to get it synchronized on my friend's PC.

    Try:                                               
    - iOS: Not responding or does not turn on           
    - Also try DFU mode after try recovery mode
    How to put iPod touch / iPhone into DFU mode « Karthik's scribblings
    - If not successful and you can't fully turn the iOS device off, let the battery fully drain. After charging for at least an hour, try the above again.
    - Try another cable                     
    - Try on another computer                                                       
    - If still not successful that usually indicates a hardware problem and an appointment at the Genius Bar of an Apple store is in order.
    Apple Retail Store - Genius Bar
    The missing apps could have been hidden by setting Restrictions. If the backup was made with those restrictions set, then the Restrictions are also restored.
    Thus, if you get it to work, restore to factory settings/new iPod, not from backup.
    You can redownload most iTunes purchases by:        
      Downloading past purchases from the App Store, iBookstore, and iTunes Store

  • Is the Mini DisplayPort able to transfer audio and video to an external monitor (for example, a TV) from a 2011 13" MacBook Pro with an HDMI cable?

    I was wondering this because I want to hook up my 2011 MacBook Pro to an external monitor (TV) using an HDMI cable. This would be helpful to know before I buy a Mini DisplayPort to HDMI adapter and then maybe find out that the Mini DisplayPort might not transfer audio, only video.

    Yes, Mini DisplayPort supports audio and video for the 2011 13" MBP via HDMI.

  • I am new to iMovie. When I make a video using the mic and camera in my MacBook Pro, the audio and video are not in sync. Very annoying. How can I fix this?

    I am new to iMovie. When I make a video using the mic and camera in my MacBook Pro, the audio and video are not in sync. Very annoying. How can I fix this?

    DVD drives read the bottom of the disk, not the top. Smooth out the paper & try again.

  • Premiere Pro CS6 importing/ audio and video problems

    I recently recorded gameplay from my Xbox. When I open this recording in Windows Media Player it says the video is 3 hours 30 minutes long, which is correct. When I import it into Premiere Pro, Premiere changes it to 5 hours 44 minutes, and when I play it back it plays in slow motion and the audio and video start to desync. I thought it might just be the source monitor playing up, so I made a clip and put it into the timeline, and it had the same outcome. Speed/Duration is still at 100%, so I'm not sure what to do. Does anyone have any idea what this is, or why it is happening?

    What are the attributes of your PVR video...
    specifically, is it variable frame rate?
    Media file evaluation utilities:
    GSpot
    http://www.headbands.com/gspot/
    MediaInfo
    http://mediainfo.sourceforge.net/en
    If it is VFR, Premiere no-likey.
    Variable frame rate video problems:
    http://forums.adobe.com/message/4897424#4897424
    http://forums.adobe.com/message/4071506#4071506

  • How to get audio and video in one file

    Hi... I'm a Premiere beginner and I have a simple question, I think :)... Every time I try to export media from Premiere (using Media Encoder) I get a video file and some audio files, but what I need is only ONE file that contains both (audio and video). For example, I have edited a film in HD quality and I want to export it; I choose the H.264 format (make some settings: PAL, audio quality...) and when it's all done, the only result I get is one video file and two audio files, and they are not together :)... So please, can anyone help me? And as you can see I am not from an English-speaking country, so sorry for my English... Thanks for the help.

    There is no indication that something is out-of-sync or the ability to move/slip into sync like in FCE/FCP.
    To the best of my knowledge, using match frame and overwriting the video with video+audio (or attaching it as a connected clip if you don't want to overwrite the video, which might have effects applied) is the best way to do this.
    You can use the blade tool first in the timeline to get just the section you want to get back into sync. Make cuts so that you have a single clip that needs the audio resynced, select it, and use shift-F to select the original synced clip in the Event Browser. Make sure your play head is at the beginning of this video clip in the timeline. Press option-2, then 'q', and now you have the audio back in your timeline, synced with the video.  Delete the old, no-longer-synced audio and you should be set.

  • Do I need some extra hardware interface for receiving both audio and video?

    Hi, I'm doing an e-learning project. I have to capture video from a webcam and voice from a headset and send them to a client,
    but my code works for only one at a time.
    Do I need some extra hardware interface for receiving both audio and video? I'm using the AVTransmit and AVReceive code found on this site.
    After running Tx,
    if I give dsound:// & vfw://0 in the Media Locator, only sound is received and no video,
    and when I give vfw://0 in the Media Locator, only live video is transmitted.
    I'm using JMF 1.1.2e.
    If anyone knows how to get this running, or the cause of the problem, please reply soon. I will be very thankful.
    Transmitter/server-side code; first run Tx on the server:
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
         targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
    String host= InetAddress.getLocalHost().getHostAddress();     
         tf_local_host.setText( host);
         } catch( UnknownHostException e) {
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
         if( source == addTarget) {
         String ip= tf_remote_address.getText().trim();
         String port= tf_remote_data_port.getText().trim();
         String localPort= tf_data_port.getText().trim();
         addTargetToList( localPort, ip, port);
         if( avTransmitter != null) {
         avTransmitter.addTarget( ip, port);
         } else if( source == removeTarget) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              if( avTransmitter != null) {
         avTransmitter.removeTarget( target.ip, target.port);
              targets.removeElement( target);
              listModel.setData( targets);          
         } else if( source == startXmit) {
         if( startXmit.getLabel().equals( "Start Transmission")) {          
         int data_port= new Integer( tf_data_port.getText()).intValue();
              avTransmitter= new AVTransmitter( this, data_port);
         avTransmitter.start( tf_media_file.getText().trim(), targets);          
              avTransmitter.setLooping( cb_loop.isSelected());
         startXmit.setLabel( "Stop Transmission");
         } else if( startXmit.getLabel().equals( "Stop Transmission")) {
              avTransmitter.stop();
              avTransmitter= null;
              removeNonBaseTargets();
              listModel.setData( targets);
         startXmit.setLabel( "Start Transmission");          
         } else if( source == rtcp) {
         if( rtcpViewer == null) {
         rtcpViewer= new RTCPViewer();
         } else {
              rtcpViewer.setVisible( true);
              rtcpViewer.toFront();
         } else if( source == cb_loop) {
         if( avTransmitter != null) {
              avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
         Target target= (Target) targets.elementAt( i - 1);
         if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
         i--;
    public void addTargetToList( String localPort,
                             String ip, String port) {     
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                  port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);           
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
         rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
    // ==================== AVTransmitter.java ====================
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
    /**
     * Starts the transmission. Returns null if transmission started ok.
     * Otherwise it returns a string with the reason why the setup failed.
     */
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
    /**
     * Use the RTPManager API to create sessions for each media
     * track of the processor.
     */
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
               rtpMgrs[i].initialize( localAddr);          
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
    /**
     * Stops the transmission if already started.
     */
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
         // Do we have atleast one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
    /**
     * For JPEG and H263, we know that they only work for particular
     * sizes. So we'll perform extra checking here to make sure they
     * are of the right sizes.
     */
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
    /**
     * Setting the encoding quality to the specified value on the JPEG encoder.
     * 0.5 is a good default.
     */
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
    /**
     * Convenience methods to handle processor's state changes.
     */
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
    /**
     * Inner Classes
     */
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
         // If there was an error during configure or
         // realiz

    I do this all the time; I put my MBP on a 60-inch Sharp. If you have the video working, do the simple thing first: check that the sound is turned on (and up) on both your TV and your Mac. If that doesn't work, go to System Preferences > Sound, open the Output tab, see if your TV is listed, and if it is, change the output to that setting.
    Hope It Works
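
    Back on the JMF question further up this thread: one software-only approach (no extra hardware) is to create a separate capture DataSource for audio and for video and merge them before building the Processor, so the transmitter sees both tracks in a single source. A minimal sketch, assuming JMF's Manager.createMergingDataSource API and whatever capture locators JMStudio reports on your machine (dsound:// or javasound:// for audio, vfw://0 for video):

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.protocol.DataSource;

    public class MergedCaptureSource {
        // Build one DataSource that carries both the audio and the video capture
        // tracks; hand the result to Manager.createProcessor(...) instead of
        // creating the Processor from a single MediaLocator string.
        public static DataSource create() throws Exception {
            DataSource audio = Manager.createDataSource(new MediaLocator("dsound://"));  // or "javasound://44100"
            DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));
            return Manager.createMergingDataSource(new DataSource[] { audio, video });
        }
    }

    The AVTransmitter above would then need a start(...) variant that accepts a DataSource rather than a locator string; its per-track loop in createTransmitter() already opens one RTP session per track, so the merged audio and video tracks each get their own session.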
