Trouble getting both audio and video out of 2009 MBP.

I had this working fine a few months ago, but then I upgraded to a Pioneer Elite receiver. Now either audio or video works, but not both at once. Even if I connect the video to the TV directly and the audio to the stereo directly, I can't get both to work; it's one or the other. I've set the stereo to accept the optical input, and as I said, it doesn't work even when only the audio goes to the stereo while the video goes to the TV. It really seems like a software issue. Any ideas?

The receiver needs to be set so that the TV is assigned to one of the optical inputs; that is why you are only getting one or the other. Right now the receiver expects the TV's sound to come from a source other than your optical input.
On my JVC there is a setting where I tell the receiver that the TV sound is digital, so it finds the digital signal and associates it with the TV input.
What is your exact receiver model? I'm sure there is a PDF of the manual online.

Similar Messages

  • I am trying to export a video from Premiere Pro. I have both audio and video checked and the sequence is active, but only the audio exports. There is no video at all. Can anyone tell me how to get the video to export? It was working fine until today.

    Premiere Pro CC has been working fine until today. I have completed my sequence and it is active. I have both audio and video checked when I am exporting using H.264 with the Match Source - High bitrate preset, and I am choosing the entire sequence. The audio exports, but the video and title do not. Can anyone tell me how to troubleshoot this so I can complete this project? Please!

    Greetings,
    I've never seen this issue, and I handle many iPads of all versions. WiFi issues are generally local to the WiFi router - routers are not all of the same quality, range, or immunity to interference. You also have distance, building construction, and the biggie - interference.
    At home I use Apple routers and have no issues with any of my WiFi-enabled devices - computers, mobile devices, even the lowly PeeCees. At other locations I have Juniper Networks, Aruba, and a few Netgear units - all of them work as they should.
    The cheaper routers - Linksys, D-Link, Siemens home units, and many no-name devices - have caused issues of various kinds, including basic connectivity.
    I have no idea what Starbucks uses, but I always have a good connection there, and I go nearly every morning to get some work done, as well as play.
    You could try changing channels, switching from 2.4 GHz to 5 GHz, or moving the router. I have had to do all of these at one time or another over the many years I have been a network engineer.
    Good Luck - Cheers,
    M.

  • Do I need some extra hardware interface for receiving both audio and video?

    Hi, I am doing an e-learning project. I have to capture video from a webcam and voice from a headset microphone and send them to a client, but my code works for only one of them at a time.
    Do I need some extra hardware interface for receiving both audio and video? I am using the AVTransmit and AVReceive code found on this site. (See the merging-data-source sketch after this message.)
    After running Tx, if I give Dsound:// & vfw://0 in the Media Locator, only sound is received and no video; when I give vfw://0 in the Media Locator, only live video is transmitted.
    I am using JMF 1.1.2e.
    If anyone knows how to run this, or the cause of the problem, please reply soon. I will be very thankful.
    Transmitter/server side code; first run Tx on the server.
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
         targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
    String host= InetAddress.getLocalHost().getHostAddress();     
         tf_local_host.setText( host);
         } catch( UnknownHostException e) {
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
         if( source == addTarget) {
         String ip= tf_remote_address.getText().trim();
         String port= tf_remote_data_port.getText().trim();
         String localPort= tf_data_port.getText().trim();
         addTargetToList( localPort, ip, port);
         if( avTransmitter != null) {
         avTransmitter.addTarget( ip, port);
         } else if( source == removeTarget) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              if( avTransmitter != null) {
         avTransmitter.removeTarget( target.ip, target.port);
              targets.removeElement( target);
              listModel.setData( targets);          
         } else if( source == startXmit) {
         if( startXmit.getLabel().equals( "Start Transmission")) {          
         int data_port= new Integer( tf_data_port.getText()).intValue();
              avTransmitter= new AVTransmitter( this, data_port);
         avTransmitter.start( tf_media_file.getText().trim(), targets);          
              avTransmitter.setLooping( cb_loop.isSelected());
         startXmit.setLabel( "Stop Transmission");
         } else if( startXmit.getLabel().equals( "Stop Transmission")) {
              avTransmitter.stop();
              avTransmitter= null;
              removeNonBaseTargets();
              listModel.setData( targets);
         startXmit.setLabel( "Start Transmission");          
         } else if( source == rtcp) {
         if( rtcpViewer == null) {
         rtcpViewer= new RTCPViewer();
         } else {
              rtcpViewer.setVisible( true);
              rtcpViewer.toFront();
         } else if( source == cb_loop) {
         if( avTransmitter != null) {
              avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
         Target target= (Target) targets.elementAt( i - 1);
         if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
         i--;
    public void addTargetToList( String localPort,
                             String ip, String port) {     
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                  port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);           
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
         rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
     // ===== second source file: AVTransmitter.java =====
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
     /**
     * Starts the transmission. Returns null if transmission started ok.
     * Otherwise it returns a string with the reason why the setup failed.
     */
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
     /**
     * Use the RTPManager API to create sessions for each media
     * track of the processor.
     */
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
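          // One RTPManager (one RTP session) is created per track of the processor's
          // output, so an audio+video source gets two sessions: the first on
          // local_data_port, the next on local_data_port + 2 (RTCP uses the odd
          // port just above each RTP port).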
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
               rtpMgrs[i].initialize( localAddr);          
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
     /** Stops the transmission if already started */
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
         // Do we have atleast one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
     /**
     * For JPEG and H263, we know that they only work for particular
     * sizes. So we'll perform extra checking here to make sure they
     * are of the right sizes.
     */
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
     /**
     * Setting the encoding quality to the specified value on the JPEG encoder.
     * 0.5 is a good default.
     */
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
     /** Convenience methods to handle processor's state changes. */
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
     /** Inner Classes */
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
          // If there was an error during configure or
          // realize, the processor will be closed

    I do this all the time; I hook my MBP up to a 60-inch Sharp. If you have the video working, do the simple thing first: check that the sound is turned on on both your TV and your Mac. If that doesn't work, go to System Preferences, click Sound, open the Output tab, and see if your TV is listed; if it is, select it as the output.
    Hope it works.
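    For reference, the usual reason AVTransmit-style code sends only one stream is that a single capture MediaLocator (vfw://0 or dsound://) yields a DataSource with only one track, so the Processor only ever has one track to hand to the RTP sessions. Below is a minimal sketch of one possible fix, assuming JMF 2.x and placeholder capture locators (the real locator strings come from JMFRegistry on the capture machine): merge the audio and video capture DataSources before creating the Processor.

    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Processor;
    import javax.media.protocol.DataSource;

    public class MergedCaptureSource {
        // Build one DataSource that contains both a video track and an audio track.
        // The locator strings below are assumptions; substitute the device locators
        // that JMFRegistry reports on your machine.
        public static DataSource create() throws Exception {
            DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));
            DataSource audio = Manager.createDataSource(new MediaLocator("dsound://"));
            // Merging lets a single Processor expose both tracks, so the existing
            // createTransmitter() loop opens one RTP session per track.
            return Manager.createMergingDataSource(new DataSource[] { video, audio });
        }

        public static void main(String[] args) throws Exception {
            Processor p = Manager.createProcessor(create());
            System.out.println("Created processor with merged source: " + p);
        }
    }

    AVTransmitter.createProcessor() could then be changed to build its Processor from such a merged DataSource instead of a single MediaLocator, leaving the rest of the transmit path unchanged.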

  • Extra hardware interface for transmitting both audio and video to a client?

    Hi, I am doing an e-learning project. I have to capture video from a webcam and voice from a headset microphone and send them to a client, but my code works for only one of them at a time.
    Do I need some extra hardware interface for transmitting both audio and video to the client? I am using the AVTransmit and AVReceive code found on this site.
    After running Tx, if I give dsound:// & vfw://0 in the Media Locator, only sound is received and no video; when I give vfw://0 in the Media Locator, only live video is transmitted.
    I am using JMF 1.1.2e.
    If anyone knows how to run this, or the cause of the problem, please reply soon. I will be very thankful.
    Transmitter/server side code (the same Tx and AVTransmitter source listed in the previous message); first run Tx on the server.
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
    config= new Config();
    GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
    JPanel p= new JPanel();
    p.setLayout( gridBagLayout);
    JPanel localPanel= createLocalPanel();
    gbc= new GridBagConstraints();
    gbc.gridx= 0;
    gbc.gridy= 0;
    gbc.gridwidth= 2;
    gbc.anchor= GridBagConstraints.CENTER;
    gbc.fill= GridBagConstraints.BOTH;
    gbc.insets= new Insets( 10, 5, 0, 0);
    ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
    p.add( localPanel);
    JPanel targetPanel= createTargetPanel();
    gbc= new GridBagConstraints();
    gbc.gridx= 1;
    gbc.gridy= 1;
    gbc.weightx= 1.0;
    gbc.weighty= 1.0;
    gbc.anchor= GridBagConstraints.CENTER;
    gbc.fill= GridBagConstraints.BOTH;
    gbc.insets= new Insets( 10, 5, 0, 0);
    ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
    JPanel mediaPanel= createMediaPanel();
    gbc= new GridBagConstraints();
    gbc.gridx= 1;
    gbc.gridy= 2;
    gbc.weightx= 1.0;
    gbc.weighty= 1.0;
    gbc.anchor= GridBagConstraints.CENTER;
    gbc.fill= GridBagConstraints.BOTH;
    gbc.insets= new Insets( 10, 5, 0, 0);
    ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
    update.setEnabled( false);
    rtcp.addActionListener( this);
    update.addActionListener( this);
    buttonPanel.add( rtcp);
    buttonPanel.add( update);
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 3;
    gbc.gridwidth= 2;
    gbc.weightx = 1.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.CENTER;
    gbc.fill = GridBagConstraints.HORIZONTAL;
    gbc.insets = new Insets( 5,5,10,5);
    ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
    p.add( buttonPanel);
    getContentPane().add( p);
    list.addMouseListener( this);
    addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
    GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
    p.setLayout( gridBagLayout);
    JLabel label= new JLabel( "Media Locator:");
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 0;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.EAST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,10,5);
    ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
    p.add( label);
    tf_media_file= new JTextField( 35);
    gbc= new GridBagConstraints();
    gbc.gridx = 1;
    gbc.gridy = 0;
    gbc.weightx = 1.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.WEST;
    gbc.fill = GridBagConstraints.HORIZONTAL;
    gbc.insets = new Insets( 5,5,10,5);
    ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
    p.add( tf_media_file);
    tf_media_file.setText( config.media_locator);
    cb_loop= new JCheckBox( "loop");
    startXmit= new JButton( "Start Transmission");
    startXmit.setEnabled( true);
    startXmit.addActionListener( this);
    gbc= new GridBagConstraints();
    gbc.gridx = 2;
    gbc.gridy = 0;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.WEST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,10,5);
    ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
    p.add( cb_loop);
    cb_loop.setSelected( true);
    cb_loop.addActionListener( this);
    gbc= new GridBagConstraints();
    gbc.gridx = 1;
    gbc.gridy = 1;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.CENTER;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,10,5);
    ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
    p.add( startXmit);
    TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
    p.setBorder( titledBorder);
    return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
    GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
    p.setLayout( gridBagLayout);
    targets= new Vector();
    for( int i= 0; i < config.targets.size(); i++) {
    targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
    list.addKeyListener( this);
    list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
    gbc= new GridBagConstraints();
    gbc.gridx= 0;
    gbc.gridy= 0;
    gbc.weightx= 1.0;
    gbc.weighty= 1.0;
    gbc.anchor= GridBagConstraints.CENTER;
    gbc.fill= GridBagConstraints.BOTH;
    gbc.insets= new Insets( 10, 5, 0, 0);
    ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
    p.add( scrollPane);
    JPanel p1= new JPanel();
    p1.setLayout( gridBagLayout);
    JLabel label= new JLabel( "IP Address:");
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 0;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.EAST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
    p1.add( label);
    tf_remote_address= new JTextField( 15);
    gbc= new GridBagConstraints();
    gbc.gridx = 1;
    gbc.gridy = 0;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.WEST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
    p1.add( tf_remote_address);
    label= new JLabel( "Data Port:");
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 1;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.EAST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
    p1.add( label);
    tf_remote_data_port= new JTextField( 15);
    gbc= new GridBagConstraints();
    gbc.gridx = 1;
    gbc.gridy = 1;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.WEST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
    p1.add( tf_remote_data_port);
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");
    removeTarget= new JButton( "Remove Target");
    p2.add( addTarget);
    p2.add( removeTarget);
    addTarget.addActionListener( this);
    removeTarget.addActionListener( this);
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 2;
    gbc.weightx = 1.0;
    gbc.weighty = 0.0;
    gbc.gridwidth= 2;
    gbc.anchor = GridBagConstraints.CENTER;
    gbc.fill = GridBagConstraints.HORIZONTAL;
    gbc.insets = new Insets( 20,5,0,5);
    ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
    p1.add( p2);
    gbc= new GridBagConstraints();
    gbc.gridx= 1;
    gbc.gridy= 0;
    gbc.weightx= 1.0;
    gbc.weighty= 1.0;
    gbc.anchor= GridBagConstraints.CENTER;
    gbc.fill= GridBagConstraints.BOTH;
    gbc.insets= new Insets( 10, 5, 0, 0);
    ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
    p.add( p1);
    TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
    p.setBorder( titledBorder);
    return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
    GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
    p.setLayout( gridBagLayout);
    JLabel label= new JLabel( "IP Address:");
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 0;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.EAST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
    p.add( label);
    JTextField tf_local_host= new JTextField( 15);
    gbc= new GridBagConstraints();
    gbc.gridx = 1;
    gbc.gridy = 0;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.WEST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
    p.add( tf_local_host);
    try {
    String host= InetAddress.getLocalHost().getHostAddress();
    tf_local_host.setText( host);
    } catch( UnknownHostException e) {
    label= new JLabel( "Data Port:");
    gbc= new GridBagConstraints();
    gbc.gridx = 0;
    gbc.gridy = 1;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.EAST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,0,5);
    ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
    p.add( label);
    tf_data_port= new JTextField( 15);
    gbc= new GridBagConstraints();
    gbc.gridx = 1;
    gbc.gridy = 1;
    gbc.weightx = 0.0;
    gbc.weighty = 0.0;
    gbc.anchor = GridBagConstraints.WEST;
    gbc.fill = GridBagConstraints.NONE;
    gbc.insets = new Insets( 5,5,10,5);
    ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
    p.add( tf_data_port);
    tf_data_port.setText( config.local_data_port);
    TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
    p.setBorder( titledBorder);
    return p;
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
    if( source == addTarget) {
    String ip= tf_remote_address.getText().trim();
    String port= tf_remote_data_port.getText().trim();
    String localPort= tf_data_port.getText().trim();
    addTargetToList( localPort, ip, port);
    if( avTransmitter != null) {
    avTransmitter.addTarget( ip, port);
    } else if( source == removeTarget) {
    int index= list.getSelectedIndex();
    if( index != -1) {
    Target target= (Target) targets.elementAt( index);
    if( avTransmitter != null) {
    avTransmitter.removeTarget( target.ip, target.port);
    targets.removeElement( target);
    listModel.setData( targets);
    } else if( source == startXmit) {
    if( startXmit.getLabel().equals( "Start Transmission")) {
    int data_port= new Integer( tf_data_port.getText()).intValue();
    avTransmitter= new AVTransmitter( this, data_port);
    avTransmitter.start( tf_media_file.getText().trim(), targets);
    avTransmitter.setLooping( cb_loop.isSelected());
    startXmit.setLabel( "Stop Transmission");
    } else if( startXmit.getLabel().equals( "Stop Transmission")) {
    avTransmitter.stop();
    avTransmitter= null;
    removeNonBaseTargets();
    listModel.setData( targets);
    startXmit.setLabel( "Start Transmission");
    } else if( source == rtcp) {
    if( rtcpViewer == null) {
    rtcpViewer= new RTCPViewer();
    } else {
    rtcpViewer.setVisible( true);
    rtcpViewer.toFront();
    } else if( source == cb_loop) {
    if( avTransmitter != null) {
    avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
    String localPort= tf_data_port.getText().trim();
    for( int i= targets.size(); i > 0;) {
    Target target= (Target) targets.elementAt( i - 1);
    if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
    i--;
    public void addTargetToList( String localPort,
    String ip, String port) {
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
    port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);
    public void rtcpReport( String report) {
    if( rtcpViewer != null) {
    rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
    config.local_data_port= tf_data_port.getText().trim();
    config.targets= new Vector();
    for( int i= 0; i < targets.size(); i++) {
    Target target= (Target) targets.elementAt( i);
    if( target.localPort.equals( config.local_data_port)) {
    config.addTarget( target.ip, target.port);
    config.media_locator= tf_media_file.getText().trim();
    config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
    if( source == list) {
    int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
    if( source == list) {
    int index= list.getSelectedIndex();
    if( index != -1) {
    Target target= (Target) targets.elementAt( index);
    tf_remote_address.setText( target.ip);
    tf_remote_data_port.setText( target.port);
    int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
    this.options= options;
    public int getSize() {
    int size;
    if( options == null) {
    size= 0;
    } else {
    size= options.size();
    return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
    Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
    } else {
    name= null;
    return name;
    public void setData( Vector data) {
    options= data;
    fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
    TargetListModel listModel, Vector targets) {
    this.localPort= localPort;
    this.ip= ip;
    this.port= port;
    this.listModel= listModel;
    this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
    if( !targetExists( localPort, ip, port)) {
    targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
    boolean exists= false;
    for( int i= 0; i < targets.size(); i++) {
    Target target= (Target) targets.elementAt( i);
    if( target.localPort.equals( localPort)
    && target.ip.equals( ip)
    && target.port.equals( port)) {
    exists= true;
    break;
    return exists;
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
    this.tx= tx;
    local_data_port= data_port;
    * Starts the transmission. Returns null if transmission started ok.
    * Otherwise it returns a string with the reason why the setup failed.
    public synchronized String start( String filename, Vector targets) {
    String result;
    locator= new MediaLocator( filename);
    // Create a processor for the specified media locator
    // and program it to output JPEG/RTP
    result = createProcessor();
    if (result != null) {
    return result;
    // Create an RTP session to transmit the output of the
    // processor to the specified IP address and port no.
    result = createTransmitter( targets);
    if (result != null) {
    processor.close();
    processor = null;
    return result;
    // Start the transmission
    processor.start();
    return null;
    * Use the RTPManager API to create sessions for each media
    * track of the processor.
    private String createTransmitter( Vector targets) {
    // Cheated. Should have checked the type.
    PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
    PushBufferStream pbss[] = pbds.getStreams();
    rtpMgrs = new RTPManager[pbss.length];
    localPorts = new int[ pbss.length];
    SessionAddress localAddr, destAddr;
    InetAddress ipAddr;
    SendStream sendStream;
    int port;
    SourceDescription srcDesList[];
    for (int i = 0; i < pbss.length; i++) {
    // for (int i = 0; i < 1; i++) {
    try {
    rtpMgrs[i] = RTPManager.newInstance();
    port = local_data_port + 2*i;
    localPorts[ i]= port;
    localAddr = new SessionAddress( InetAddress.getLocalHost(),
    port);
    rtpMgrs.initialize( localAddr);
    rtpMgrs[i].addReceiveStreamListener(this);
    rtpMgrs[i].addRemoteListener(this);
    for( int k= 0; k < targets.size(); k++) {
    Target target= (Target) targets.elementAt( k);
    int targetPort= new Integer( target.port).intValue();
    addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
    sendStream = rtpMgrs[i].createSendStream(dataOutput, i);
    sendStream.start();
    } catch (Exception e) {
    e.printStackTrace();
    return e.getMessage();
    return null;
    public void addTarget( String ip, String port) {
    for (int i= 0; i < rtpMgrs.length; i++) {
    int targetPort= new Integer( port).intValue();
    addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
    try {
    SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
    new Integer( port).intValue());
    mgr.addTarget( addr);
    tx.addTargetToList( localPort + "", ip, port + "");
    } catch( Exception e) {
    e.printStackTrace();
    public void removeTarget( String ip, String port) {
    try {
    SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
    new Integer( port).intValue());
    for (int i= 0; i < rtpMgrs.length; i++) {
    rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
    } catch( Exception e) {
    e.printStackTrace();
    boolean looping= true;
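    // When looping is enabled, controllerUpdate() below rewinds a file-based source
    // to time 0 and restarts the processor on each EndOfMediaEvent.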
    public void controllerUpdate( ControllerEvent ce) {
    System.out.println( ce);
    if( ce instanceof DurationUpdateEvent) {
    Time duration= ((DurationUpdateEvent) ce).getDuration();
    System.out.println( "duration: " + duration.getSeconds());
    } else if( ce instanceof EndOfMediaEvent) {
    System.out.println( "END OF MEDIA - looping=" + looping);
    if( looping) {
    processor.setMediaTime( new Time( 0));
    processor.start();
    public void setLooping( boolean flag) {
    looping= flag;
    public void update( ReceiveStreamEvent event) {
    String timestamp= getTimestamp();
    StringBuffer sb= new StringBuffer();
    if( event instanceof InactiveReceiveStreamEvent) {
    sb.append( timestamp + " Inactive Receive Stream");
    } else if( event instanceof ByeEvent) {
    sb.append( timestamp + " Bye");
    } else {
    System.out.println( "ReceiveStreamEvent: "+ event);
    tx.rtcpReport( sb.toString());
    public void update( RemoteEvent event) {
    String timestamp= getTimestamp();
    if( event instanceof ReceiverReportEvent) {
    ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
    StringBuffer sb= new StringBuffer();
    sb.append( timestamp + " RR");
    if( rr != null) {
    Participant participant= rr.getParticipant();
    if( participant != null) {
    sb.append( " from " + participant.getCNAME());
    sb.append( " ssrc=" + rr.getSSRC());
    } else {
    sb.append( " ssrc=" + rr.getSSRC());
    tx.rtcpReport( sb.toString());
    } else {
    System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
    String timestamp;
    Calendar calendar= Calendar.getInstance();
    int hour= calendar.get( Calendar.HOUR_OF_DAY);
    String hourStr= formatTime( hour);
    int minute= calendar.get( Calendar.MINUTE);
    String minuteStr= formatTime( minute);
    int second= calendar.get( Calendar.SECOND);
    String secondStr= formatTime( second);
    timestamp= hourStr + ":" + minuteStr + ":" + secondStr;
    return timestamp;
    private String formatTime( int time) {
    String timeStr;
    if( time < 10) {
    timeStr= "0" + time;
    } else {
    timeStr= "" + time;
    return timeStr;
    * Stops the transmission if already started
    public void stop() {
    synchronized (this) {
    if (processor != null) {
    processor.stop();
    processor.close();
    processor = null;
    for (int i= 0; i < rtpMgrs.length; i++) {
    rtpMgrs[ i].removeTargets( "Session ended.");
    rtpMgrs[ i].dispose();
    public String createProcessor() {
    if (locator == null) {
    return "Locator is null";
    DataSource ds;
    DataSource clone;
    try {
    ds = javax.media.Manager.createDataSource(locator);
    } catch (Exception e) {
    return "Couldn't create DataSource";
    // Try to create a processor to handle the input media locator
    try {
    processor = javax.media.Manager.createProcessor(ds);
    processor.addControllerListener( this);
    } catch (NoProcessorException npe) {
    return "Couldn't create processor";
    } catch (IOException ioe) {
    return "IOException creating processor";
    // Wait for it to configure
    boolean result = waitForState(processor, Processor.Configured);
    if (result == false)
    return "Couldn't configure processor";
    // Get the tracks from the processor
    TrackControl [] tracks = processor.getTrackControls();
    // Do we have atleast one track?
    if (tracks == null || tracks.length < 1)
    return "Couldn't find tracks in processor";
    // Set the output content descriptor to RAW_RTP
    // This will limit the supported formats reported from
    // Track.getSupportedFormats to only valid RTP formats.
    ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
    processor.setContentDescriptor(cd);
    Format supported[];
    Format chosen;
    boolean atLeastOneTrack = false;
    // Program the tracks.
    for (int i = 0; i < tracks.length; i++) {
    Format format = tracks[i].getFormat();
    if (tracks[i].isEnabled()) {
    supported = tracks[i].getSupportedFormats();
    // We've set the output content to the RAW_RTP.
    // So all the supported formats should work with RTP.
    // We'll just pick the first one.
    if (supported.length > 0) {
    if (supported[0] instanceof VideoFormat) {
    // For video formats, we should double check the
    // sizes since not all formats work in all sizes.
    chosen = checkForVideoSizes(tracks[i].getFormat(),
    supported[0]);
    } else
    chosen = supported[0];
    tracks[i].setFormat(chosen);
    System.err.println("Track " + i + " is set to transmit as:");
    System.err.println(" " + chosen);
    atLeastOneTrack = true;
    } else
    tracks[i].setEnabled(false);
    } else
    tracks[i].setEnabled(false);
    if (!atLeastOneTrack)
    return "Couldn't set any of the tracks to a valid RTP format";
    // Realize the processor. This will internally create a flow
    // graph and attempt to create an output datasource for JPEG/RTP
    // audio frames.
    result = waitForState(processor, Controller.Realized);
    if (result == false)
    return "Couldn't realize processor";
    // Set the JPEG quality to .5.
    setJPEGQuality(processor, 0.5f);
    // Get the output data source of the processor
    dataOutput = processor.getDataOutput();
    return null;
    static SessionAddress destAddr1, destAddr2;
    * For JPEG and H263, we know that they only work for particular
    * sizes. So we'll perform extra checking here to make sure they
    * are of the right sizes.
    Format checkForVideoSizes(Format original, Format supported) {
    int width, height;
    Dimension size = ((VideoFormat)original).getSize();
    Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
    Format h263Fmt = new Format(VideoFormat.H263_RTP);
    if (supported.matches(jpegFmt)) {
    // For JPEG, make sure width and height are divisible by 8.
    width = (size.width % 8 == 0 ? size.width :
    (int)(size.width / 8) * 8);
    height = (size.height % 8 == 0 ? size.height :
    (int)(size.height / 8) * 8);
    } else if (supported.matches(h263Fmt)) {
    // For H.263, we only support some specific sizes.
    if (size.width < 128) {
    width = 128;
    height = 96;
    } else if (size.width < 176) {
    width = 176;
    height = 144;
    } else {
    width = 352;
    height = 288;
    } else {
    // We don't know this particular format. We'll just
    // leave it alone then.
    return supported;
    return (new VideoFormat(null,
    new Dimension(width, height),
    Format.NOT_SPECIFIED,
    null,
    Format.NOT_SPECIFIED)).intersects(supported);
    * Setting the encoding quality to the specified value on the JPEG encoder.
    * 0.5 is a good default.
    void setJPEGQuality(Player p, float val) {
    Control cs[] = p.getControls();
    QualityControl qc = null;
    VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
    // Loop through the controls to find the Quality control for
    // the JPEG encoder.
    for (int i = 0; i < cs.length; i++) {
    if (cs[i] instanceof QualityControl &&
    cs[i] instanceof Owned) {
    Object owner = ((Owned)cs[i]).getOwner();
    // Check to see if the owner is a Codec.
    // Then check for the output format.
    if (owner instanceof Codec) {
    Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
    for (int j = 0; j < fmts.length; j++) {
    if (fmts[j].matches(jpegFmt)) {
    qc = (QualityControl)cs[i];
    qc.setQuality(val);
    System.err.println("- Setting quality to " +
    val + " on " + qc);
    break;
    if (qc != null)
    break;
    * Convenience methods to handle processor's state changes.
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
    return stateLock;
    void setFailed() {
    failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
    p.addControllerListener(new StateListener());
    failed = false;
    // Call the required method on the processor
    if (state == Processor.Configured) {
    p.configure();
    } else if (state == Processor.Realized) {
    p.realize();
    // Wait until we get an event that confirms the
    // success of the method, or a failure event.
    // See StateListener inner class
    while (p.getState() < state && !failed) {
    synchronized (getStateLock()) {
    try {
    getStateLock().wait();
    } catch (InterruptedException ie) {
    return false;
    if (failed)
    return false;
    else
    return true;
    * Inner Classes
    class StateListener implements ControllerListener {
    public void controllerUpdate(ControllerEvent ce) {
    // If there was an error during configure or
    // realize, the processor will be closed
    if (ce instanceof ControllerClosedEvent)
    setFailed();
    // All controller events, send a notification
    // to the waiting thread in waitForState method.
    if (ce instanceof ControllerEvent) {
    synchronized (getStateLock()) {
    getStateLock().notifyAll();
    import java.io.*;
    import java.util.*;
    public class Config {
    private String pathPrefix;
    public String local_data_port;
    public Vector targets;
    public String media_locator;
    public Config() {
    pathPrefix= System.getProperty( "user.home") + File.separator;
    read();
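    // xmit.dat layout (all strings are length-prefixed, see readString/writeString):
    // local data port, number of targets, an (ip, port) pair per target, then the media locator.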
    public void read() {
    targets= new Vector();
    try {
    String path= pathPrefix + "xmit.dat";
    FileInputStream fin= new FileInputStream( path);
    BufferedInputStream bin= new BufferedInputStream( fin);
    DataInputStream din= new DataInputStream( bin);
    local_data_port= readString( din);
    int n_targets= din.readInt();
    for( int i= 0; i < n_targets; i++) {
    String ip= readString( din);
    String port= readString( din);
    targets.addElement( new Target( local_data_port, ip, port));
    media_locator= readString( din);
    fin.close();
    } catch( IOException e) {
    System.out.println( "xmit.dat file missing!");
    local_data_port= "";
    media_locator= "";
    public void write() {
    try {
    String path= pathPrefix + "xmit.dat";
    FileOutputStream fout= new FileOutputStream( path);
    BufferedOutputStream bout= new BufferedOutputStream( fout);
    DataOutputStream dout= new DataOutputStream( bout);
    writeString( dout, local_data_port);
    dout.writeInt( targets.size());
    for( int i= 0; i < targets.size(); i++) {
    Target target= (Target) targets.elementAt( i);
    writeString( dout, target.ip);
    writeString( dout, target.port);
    writeString( dout, media_locator);
    dout.flush();
    dout.close();
    fout.close();
    } catch( IOException e) {
    System.out.println( "Error writing xmit.dat!");
    public String readString( DataInputStream din) {
    String s= null;
    try {
    short length= din.readShort();
    if( length > 0) {
    byte buf[]= new byte[ length];
    din.read( buf, 0, length);
    s= new String( buf);
    } catch( IOException e) {
    System.err.println( e);
    return s;
    public void writeString( DataOutputStream dout, String str) {
    try {
    if( str != null) {
    dout.writeShort( str.length());
    dout.writeBytes( str);
    } else {
    dout.writeShort( 0);
    } catch( Exception e) {
    e.printStackTrace();
    public void addTarget( String ip, String port) {
    targets.addElement( new Target( "", ip, port));
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class RTCPViewer extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    private JList list;
    private Vector reports;
    private RtcpListModel listModel;
    private JButton clear;
    private JButton start;
    private boolean recording;
    public RTCPViewer() {
    setTitle( "JMF/RTCP Tracer");
    re

    I'm also getting some really bad sound quality on Facebook notifications. I've been forced to disable them; they sound horrible, like they are clipping.
    I'm trying to figure out whether this is a problem on all Late 2013 15" MacBook Pros or whether mine is defective. The Facebook notification sound problem occurs even at around 20-30% volume.

  • Audio And Video out of sync in Final Cut, but not in quicktime

    I have searched the forums to no end to find a solution to this issue; unfortunately, I have not solved the problem.
    I am using Final Cut 4.5 on a G4 running OSX 10.3.4. This system has worked perfectly for years with no problems. Neither the system, hardware, nor software has ever been changed on this machine. Suddenly last week, despite perfect sync while previewing the capture, the audio and video were out of sync in Final Cut, but when checked in QuickTime, there were no problems. That is how I eventually figured out this was a Final Cut problem.
    By out of sync, I don't mean it is off by a specific number of frames that can be adjusted and then everything is fine; the sync seems to drift, so that if you edit the audio to be in sync in one part of the video, it won't be in sync at the end. Also, it is out of sync from the first frame played, not a gradual drift.
    This is definitely not an issue of frame rates or final cut settings as will be evident by the steps I took below to find the source of this problem.
    This is what I did to attempt to fix the problem:
    - Check drives using Disk Utility and repair permissions
    - Remove Final Cut Preferences using Apple guidelines
    - Switch hard drives
    - Delete Final Cut and reinstall newer version of final cut (since the capture files played fine on Final Cut 5.1 on another machine)
    - reinstall entire OSX 10.3.9, wiping the hard drive and reinstalling Final Cut 5.1
    - switch ram slots, and even replace ram with different sticks from another G4 machine
    None of these solutions worked. Is there some hardware problem with my G4 that is suddenly causing Final Cut to play audio and video out of sync?
    Hope you can help, otherwise this otherwise great G4 is going in the dumpster!
    Thanks

    For what it's worth, I'm having what looks to be the same problem. Driving me insane. Here's what's going on-
    Editing in FCP 6.0.2
    QT was just updated to 7.3.1 the other day (hmm)
    On a MacBook Pro, OS 10.5.1
    Software updater says everything's up to date of course.
    DV-NTSC 24p (23.98) Advanced Pulldown Removal (footage was shot at 24PA on a DVX100b)
    In Quicktime, the file previews fine, everything's in sync, everyone's happy. In FCP, the video and audio are wildly out of sync. I did a test inside of quicktime and sliced out a small portion of the media clip and did a "save as" and this small slice played in sync in FCP. However, this did not work for the entire clip. Clip is weighing in at 5.58GB for a 32 minute run.
    I'm trying to multicam 2 tapes together and manually syncing these up is proving to be difficult at best. When anyone gets any ideas/leads/patches, let us know.
    Grr,
    Nicole
    Editor, RHED Pixel

  • Audio and video out of sync on short recordings

    I've scanned the discussions and see that a lot of posters are experiencing
    the same problem I'm having which is recording quicktime movies
    and getting audio and video out of sync.
    In most of those posts the responses seem to be that the problem gets
    bad if you're making a long recording. I'm experiencing that, but...
    ...I'm also having the same problem on really short recordings... less than a
    minute. I'm running video and audio out of my cable box, into a Sony DSR-11 VCR, then FireWire out of the deck into the computer, recording the movies with QuickTime 7.1.6.
    I know it's coming out of the cable box and then out of the SONY deck via firewire in sync because when I log and capture through Final Cut Pro, audio and video are in sync in preview and playback. I don't want to use Final Cut Pro to capture the movies because I want to end up with MPEG4, and I'd have to go through an extra conversion process out of FCP.
    Any thoughts on why even short recordings through Quicktime are out of sync?
    Thanks.

    I'm having similar problems transferring VHS video to Quicktime. Another problem I'm having is when I transfer a VHS video to iMovie and then export the video to H.264 format, I have similar sync problems, even though the iMovie seems perfectly in sync.
    I'm going to try to transfer several alternative video formats (video from my cable DVR, video from live TV, and video from a digital mini cassette) to try and figure out whether the VHS format is the culprit. Will keep you posted.

  • Mini DisplayPort to support both audio and video coming soon?

    So I notice that Apple updated the MacBook Pro Mini DisplayPort implementation to support BOTH audio AND video.
    Think this will make its way to the Mac mini soon, or is an update still a long way off?

    Probably in the next refresh, but the mini was refreshed twice last year, so another refresh in less than a year would probably be too much for which to hope!
    There is always this US$49 wonder;
    http://www.monoprice.com/products/product.asp?cid=104&cp_id=10428&cs_id=1042802&pid=6331&seq=1&format=2
    Dah•veed

  • Why, after three months of Apple TV working perfectly fine, is it now cutting out constantly with the audio and video out of sync?

    Why, after three months of Apple TV working perfectly fine, is it now cutting out constantly with the audio and video out of sync?

    Playing rentals, Netflix, from local iTunes, other?
    First, restart the router and Apple TV after powering them off, to see if that helps - it may or may not, but it's quick and easy to try.
    Connection via Ethernet or WiFi? If WiFi, are there any new devices in the house which may be causing interference, or new neighbouring networks? If feasible, try to eliminate WiFi issues by using an Ethernet connection.
    AC

  • I have a MacBookPro5,5; how do I connect it to my HDTV WITH BOTH AUDIO AND VIDEO, and where can I buy the cable(s)?

    I have a MacBookPro5,5. How do I connect it to my HDTV WITH BOTH AUDIO AND VIDEO? And where can I buy the cable(s)?

    Thank you, Roger. I have attempted to connect my MBP three times now (all different ways, with different cables purchased) but unfortunately with no success. I understand that with an older MBP it can be a challenge. I have found this page http://www.wikihow.com/Connect-a-Macbook-to-a-TV but with my unsuccessful history, I have become skeptical. I will try your suggestion and let you know. Thank you again.

  • AV digital works audio only on iPod touch but both audio and video on iPad

    The AV digital accessory does audio only on the iPod touch but works fine on the iPad, showing both audio and video?

    To try and keep it simple, there are 3 variables here.
    1) Acquisition - You can either record with a stereo or mono mic; you've probably used mono.
    2a) Capturing - In the 'clip settings' tab of the log and capture window, you can choose how you want to capture your audio, choosing from the 'Audio Format' drop down menu.
    2b) Editing - If you've already captured some material, in the timeline, check the audio tracks. If, for example, a clip has audio tracks a1 and a2, and you can see 2 sets of small triangles at the start and end of the clip, you have stereo audio.
    (double click the audio track and in the viewer you will see a 'stereo a1a2' tab with linked stereo - any changes you apply here will equally apply to both tracks - although you can 'pan' the audio using the 'pan slider' for example, if you want the sound of a car approaching to start in the left speaker, then move to the right speaker as it moves past on screen)
    If there aren't the triangles, double click the audio and in the viewer you'll have 2 separate mono tracks - mono a1 and mono a2 - assuming you haven't captured the audio format as 'mono mix', ch1(L) or ch2(R).
    You can link separate mono tracks in the timeline by selecting the audio clip and hitting opt (alt) L (similarly, you can unlink a stereo track by the same procedure).
    If you've got a single mono track, you can duplicate the track and then link, pan, etc as required - can't remember exactly as it's ages since I captured single channels.
    3) Output - The decisions you make in 2) determine how your content will be outputted. (Is 'outputted' a word?)
    I'm by no means an audio expert so I hope this is clearer than mud.

  • Capturing both AUDIO and VIDEO at a time.....

    Hi Everyone,
    I was able to capture audio separately and save it in a file, and video separately and save it in a file...
    I want to capture both audio and video at the same time and save them in one file. Is that possible? If so, please suggest the way.

    Merging Tracks from Multiple Inputs
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Merge.html]
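    Following up on that link: the idea is to merge the separate audio and video capture DataSources into one DataSource before building the Processor, so a single Processor (and a single file or RTP transmitter) sees both tracks. Below is a minimal sketch of that approach - it is not code from the linked page, and the capture locators ("javasound://44100" for audio, "vfw://0" for video) plus the QuickTime output type are assumptions that depend on your machine.
    import javax.media.*;
    import javax.media.protocol.*;

    public class MergedCaptureSketch {
        public static void main(String[] args) throws Exception {
            // Separate capture sources for audio and video (locators are platform-dependent assumptions).
            DataSource audio = Manager.createDataSource(new MediaLocator("javasound://44100"));
            DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));
            // Merge them so one DataSource exposes both an audio track and a video track.
            DataSource merged = Manager.createMergingDataSource(new DataSource[] { audio, video });
            // A Processor built on the merged source can be programmed to write both tracks
            // to a single file type (or set RAW_RTP here and hand it to the RTP code above).
            Processor p = Manager.createRealizedProcessor(
                    new ProcessorModel(merged,
                                       null, // let JMF pick the track formats
                                       new ContentDescriptor(FileTypeDescriptor.QUICKTIME)));
            p.start();
            // A DataSink writing p.getDataOutput() to a file URL would follow here.
        }
    }
    With a merged source, the processor ends up with one audio track and one video track, and code like createTransmitter() above then opens one RTP session per track, which is how audio and video get sent together.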

  • Audio and video out of sync after converting to mp4

    I've tried 3 converters (ImToo, Xilisoft, and Videora) and for 2 videos I really want on my iPod, the audio and video are offset. In fact, in Xilisoft one wouldn't even convert to anything (both are WMV and I thought that might be the problem, but I found I had another WMV whose audio matched the video fine). For the other video, I tried making it an AVI first in Xilisoft, but that AVI also had messed-up audio/video.
    This is a really broad question, but what can I do and what programs can I try to get the videos to convert properly??
    (I've tried uploading the WMVs to iTunes to have iTunes "convert for iPod", but they wouldn't even upload to iTunes.)
    30gb video ipod   Windows XP  

    I have also had similar problems. Unfortunately, I have yet to discover a program that works perfectly. Just keep searching is really all I have to say. However, if you do find one, could you please post it?

  • Audio and Video out of Sync in Encore 2.0

    Here's my issue:
    I opened a project in Encore 2.0 that was an Encore 1.5 project. I have one timeline with a 2 hour movie and the AC3 audio file that goes with it. Encore 1.5 played the audio and video in sync from the timeline and burned discs with the audio in sync. The video is an MPEG2 file encoded at a 3.1 constant bit rate.
    In Encore 2.0, when I play the timeline, the audio becomes progressively out of sync as it plays. Within the first few minutes it's maybe a frame or 2 off. By the end of the film it is about 18 seconds off (audio is playing ahead of the video).
    Is this a result of Encore 2.0 conforming the audio file? If so, how am I (or anyone) supposed to attain sync between video and audio in Encore 2.0?
    I have Encore 1.5 and 2.0 both installed on the same machine. As a test, I opened the project in 1.5 and played it, everything was fine. I closed Encore 1.5 and opened the project in Encore 2.0, and the audio is out of sync.
    Any comments, ideas or suggestions?
    Mike

    Joe,
    I tried creating a project with the original WAV file and its corresponding video and found that the audio and video were still out of sync. I did a little more testing and discovered that the audio is actually being played correctly; it's the video that is not being played correctly. Here's what I did:
    I opened the WAV file version of the audio file in Premiere Pro and noted where a hammer hit in the video occurs (01;40;30;28). I then opened Encore 2.0 and imported the video and AC3 version of the WAV file, placed them in a timeline and the audio's hammer hit is right where it's supposed to be (01;40;30;28), but the video's hammer hit occurs at 01;40;43;18.
    Next I went into Encore 1.5 and brought in the AC3 audio and video files and placed them on a timeline. I went to 01;40;30;28 in the timeline and the audio and video were both at the frame they should be at that timecode.
    Seeing how the video appears to be playing incorrectly, here's some info on my video files:
    I have 4 versions of the same film in varying encode rates with the same start and end timecodes. All four files were encoded on a Sonic DVD Creator authoring station using hardware to encode. The files are encoded at the following rates: 2.9 constant bit rate, 3.1 constant bit rate, 3.4 variable bit rate and 4.0 variable bit rate. They are each NTSC with drop frame timecode. All four files play correctly in Encore 1.5 but incorrectly in Encore 2.0.
    This almost seems like a drop frame/non-drop frame issue. Do you have any ideas?
    Mike

  • Audio and video out of sync

    I'm using Premiere Elements 12.  When I open a VOB file, the audio and video are not in sync.  I can open that same file with Windows Media Player and all is well.  What do I need to do with respect to editing a file like this?
    The information about the display on my computer is:
    Name NVIDIA GeForce GT 220
    PNP Device ID PCI\VEN_10DE&DEV_0A20&SUBSYS_069A10DE&REV_A2\4&22063C0D&0&0018
    Adapter Type GeForce GT 220, NVIDIA compatible
    Adapter Description NVIDIA GeForce GT 220
    Adapter RAM 1.00 GB (1,073,741,824 bytes)
    Installed Drivers nvd3dumx.dll,nvwgf2umx.dll,nvwgf2umx.dll,nvd3dum,nvwgf2um,nvwgf2um
    Driver Version 9.18.13.3523
    INF File oem38.inf (Section002 section)
    Color Planes Not Available
    Color Table Entries 4294967296
    Resolution 1920 x 1080 x 60 hertz
    Bits/Pixel 32
    Memory Address 0xFA000000-0xFAFFFFFF
    Memory Address 0xD0000000-0xDFFFFFFF
    Memory Address 0xCE000000-0xCFFFFFFF
    I/O Port 0x0000EC00-0x0000EC7F
    IRQ Channel IRQ 16
    I/O Port 0x000003B0-0x000003BB
    I/O Port 0x000003C0-0x000003DF
    Memory Address 0xA0000-0xBFFFF
    Thanks,
    Pete Blair

    Pete Blair
    In all your information, I did not see which computer operating system your Premiere Elements 12 is running on. Have you updated 12 to the 12.1 Update, using an opened project's Help Menu/Update? If not, please do so.
    Your issue may relate to the details of your "VOB file" and how it was ripped/copied from the DVD-VIDEO and DVD disc to get it into a Premiere Elements project.
    a. project preset set for NTSC DV Standard or NTSC DV Widescreen (whichever corresponded to your recording details)...or PAL equivalent.
    b. DVD disc in burner tray
    c. Add Media/DVD camera or computer drive/Video Importer/.....
    It does not happen often, but if audio is out of sync and you are not using the above route, then consider the Command Prompt way described in the following:
    ATR Premiere Elements Troubleshooting: PE: DVD-VIDEO/Seamless VOB Ripping
    A just-in-case note: contrary to some advice, it is not necessary to convert VOB files (the VTS_01_1.VOB video file and any files in that series) to any other format in Premiere Elements.
    Please review and consider and then we can decide what next.
    Thank you.
    ATR

  • Audio and Video out of Sync after Pro Res export

    I'm getting an audio/video sync issue when I'm exporting to Pro Res.
    I started off in a HDV1080i50 sequence. Audio sampling rate is 48khz. (16 bit depth)
    The sequence is less than 20 minutes long. It plays correctly on the timeline.
    When I export the sequence for DVD and view it on a set top it plays absolutely fine.
    When I export the video to ProRes (422 HQ - I used Export using Quicktime Conversion) and the audio to AIFF, they quickly go out of sync. This was brought to my attention first by the guys who were making the DigiBeta master, and when I view the exported material in a Pro Res sequence I can see it for myself.
    The audio and video clips are of the exact same duration.
    Possible issues...
    Some of the narration was recorded at 44.1 kHz. I didn't resample it (I'm still a learning novice)... but if that is the issue, should it not have affected the DVD as well? All the other audio in the project is 48 kHz.
    Most of the material is HDV1080i50..but one clip used was originally an NTSC clip and therefore a different frame rate. However I converted it to a 25fps clip and this is reflected in the vid rate column in the browser where it says 25fps.
    Final point, the exported AIFF files..when I import them back into the HDV1080i50 sequence and play back, it is perfectly in sync.
    Any help would be greatly appreciated.

    First, yes, resample the narration to 48 kHz 16 bit.  You will probably be able to reconnect to the resampled files.  Usually works for me.  That's a simple fix and will probably solve the problem.  If it doesn't, you can try to change the codec (compressor) in the sequence settings to ProRes 422 HQ, fully render and see if that solves the problem.
    Wait, I just reread your original post.  Export from FCP using File > Export > QuickTime Movie, NOT QuickTime Conversion, with current settings.  If that works without sync issues, use Compressor to convert to ProRes.
