Another bug? Failure to lock audio to video

I don't know if others are experiencing this, or if it is just my install...
I put a few pictures into the video, dropped a "camera click" into the audio line and locked it to the video.
But after further edits of other sections earlier in the movie, I found that the audio camera click was no longer wed to the picture. The video had moved when earlier cuts were made, but the audio apparently stayed where it was.
Anyone else get that, or just lucky me?
--ET

ET,
I've experienced some unpredictable behavior like you describe in the past with the lock audio feature.
I suggest you take a cue from the pros. Generally, final audio sweetening with effects and music isn't done until the picture is finalized and "locked".
It's usually best to resist the temptation of putting in sound effects or music until your picture editing is complete.
It's also worth noting that with iLife '06 you can now create the soundtrack of your movie in GarageBand. It's pretty cool and worth checking out.
Matt

Similar Messages

  • Locking Audio and Video Tracks in CS6 Tip

    I discovered, again by accident :-), that if you hold down the Shift key and click the Lock Track icon on an audio or video track, it locks or unlocks all the tracks.
    It also works with the "eyeball" icon, selecting all tracks and turning video or audio on or off for all of them too.
    I found this pretty helpful on the project I am working on.

    That's great. A simple adjustment and it works like a charm. Voila! One click of the mouse and all the locks snap like a year-old bagel.
    Thanks once more for your help, Tom.

  • How to lock audio to video track?

    If I place a video track on the timeline, the audio comes along and is automatically locked with the video track. If I want to add a NEW audio track (recorded separately), there doesn't seem to be a way to lock that audio with the video track once it's synced. Or am I missing something?

    Select the video & audio tracks and then do *Modify > Link*
    And to double-protect the sync'ing, put a marker on each of the video & audio tracks at a common point (playhead at the same point in the timeline) so that if they ever get separated it will be easier to line them back up.

  • Lock audio and video track

    If I put a video on the timeline with no audio, then bring in an audio file for the video, how can I "lock" them together so I can move them around the timeline as one? I know I can just select both tracks and move them together, but I would rather have them locked so that if I cut, they stay together. Thanks!
    -Caleb

    Glad to help.
    Not begging for points here, but, at the least, please mark the thread as answered so others will know it's resolved.

  • Want to lock video to audio, not audio to video--can 08 do this?

    I know that this functionality isn't available in iMovieHD, but perhaps it got added in 08?
    I'm making a music video--one long audio track that I'm cutting video to--and I want to be able to lock clips to the audio track. HD allows you to lock audio to video, but not the other way around.
    Can I do this in 08?
    Thanks!

    **! I've got to go to Final Cut Express to do this? Seems ridiculous...
    From your point of view, maybe...
    Editing 'videos' is no beginner's task.
    Adding some clips somewhere over some music? THAT is a 14-year-old's wet dream, and iM08 fulfills it in a snap.
    What you have in mind is advanced editing, it's that simple.
    You need a timeline, you need a corresponding audio waveform, you probably need more tracks to test 'versions', to manipulate content...
    Sure, you can build a boat with just a saw and a hammer, but I'd prefer proper tools.

  • Can't drag audio and video together?

    I have numerous assets in this dvd project. I'm setting up a button to play 10 commercials in a row, a "play all" button.
    I'm editing the 10 commercials in what looks like a mini-video editor, with the video track and audio tracks shown.
    PROBLEM:
    When I grab the video to change the order of my "play all", the audio doesn't move with it. For instance, say I'm shifting commercial #3 to become commercial #1. For the audio from commercial #3 to become the audio for commercial #1, I have to shift the audio for commercials #2-10 to the right to create space for commercial #3's audio to be inserted there. VERY FRUSTRATING and painstaking. And if you accidentally clip a bit off the end or front of an audio portion (easy to do in these tiny little tracks), the sync is off for the rest of the spots!
    I called Apple "tech support", and the rep told me you can't drag audio and video together, and made up some excuse that 'this should already be done in Final Cut'. What? I'm just bringing in QuickTime files as assets. Why can't I lock audio and video together?
    Incidentally, Apple's tech support is useless unless you spend $199, or "$799 for a year". This cost was NOT made clear in the original purchase of the product. I'd prefer spending $1000 for a "PROFESSIONAL APPLICATION" and having good tech support included!
    Arrrrgh!
    G4 dual 1.25 gig - 1 gig ram   Mac OS X (10.3.8)   DVD Studio "Pro" 4

    Once in the track, A and V don't get moved together.
    I usually just rebuild the track order from scratch anyway, because you can easily lose sync when you are shuffling clip order.

  • Lock external audio to video

    I am editing a project in iMovie that is using audio recorded on a separate device than the video was shot on.  I have successfully synced the audio and video together with the use of hand claps at the beginning of the video.  My problem is now editing the project and removing the hand claps from the video.  How can I lock the video and external audio clip together so that I can trim the front and back of the project?  Right now, the video and audio are 2 separate pieces and trimming one does not trim the other.  Basically a reverse of the "detach audio" function that makes the camera audio a separate entity from the video clip.
    I am not doing this as background audio as all videos that I saw on how to perform the sync did not do this as background audio.  Unfortunately, none of the videos show the step of trimming out the claps.
    Thanks,
    Roger

    The real answer is to use Final Cut Pro. It has a feature that will automatically sync audio clips with video clips from multiple cameras. You could use this feature to automatically line up the external audio with your video clips. Once you do, you can edit your movie easily.
    You can do this in iMovie, but it is a lot more of a manual effort.
    1. When you add the audio, do not drop it in the background. Drop it on one of the first video clips. The sound file will appear as a ribbon beneath the video clips. You can easily drag it left or right, or trim it.
    2. If you need to edit out stuff like hand claps, I suggest you first get the audio and the video lined up. Set the volume of the video clips to zero. Then Share the project. Use SHARE/EXPORT USING QUICKTIME. Choose Movie to QuickTime Movie. Choose Apple Intermediate Codec as the Video codec in the Options. Leave the audio codec as uncompressed.  This will create an .mov file in the Apple Intermediate Codec. You can use FILE/IMPORT MOVIE to import this file into an Event. Now the good audio is joined with the good video, and you can edit it like you are used to.
    3. There is another way. For simplicity, let's say you have an audio clip with Song A, then applause and a pause, and then Song B. You can drag your audio file in twice, again dropping it on a video clip rather than on the background. Then use the clip trimmer to trim away the back half of one clip (leaving only Song A), and use the clip trimmer to trim away the front half of the other clip (leaving Song B). Now you have two separate audio clips that you can move around.
    4. You can also accomplish the steps in #3 above by dragging in the audio clips twice and then dragging the handles on the edges of the audio clips. Drag the front handle of one clip to the right to delete Song A and leave Song B. Drag the right handle of the other clip to the left to delete Song B and leave Song A.
    Bottom line: it is easy in Final Cut Pro X. It is a kluge in iMovie.

  • Want to lock video to audio, NOT audio to video--I'm making a music video

    Hi
    I'm making a music video--one long audio clip which I'm cutting video to.
    I've got Pogue's book and it explains how to pushpin audio to video, but the reverse doesn't seem to work.
    I've got one long audio clip and I want to lock video segments to it to preserve the sync between the video clips and the audio track.
    Help doing this would be greatly appreciated. Also, is there a good breakdown somewhere of how to make a music video in iMovie?
    THANKS!

    daveemac wrote:
    I know they gutted a bunch of stuff in iMovie08, but perhaps they added the functionality I'm referring to? Do you happen to know?
    Steve told us at the keynote that iM08 was done by an engineer who's a scuba diver... so, audio + iM08? Forget it... the audio features are at version 0.1, almost nil.
    Plus: in iMovie HD 6 and earlier, the timeline lets you drag clips to the right to create 'holes' in the video timeline (= black clips); iM08 doesn't support that. And in iMovie HD 6 and earlier you can easily 'crop' a clip from the clip pane BEFORE adding it to the project's timeline; frame-precise selection is tricky with iM08 (see my site, iM08Tricks).
    iM08 is extremely fast at 'gluing' some clips together, and for serious video editing it has some very interesting concepts (skimming) I would like to see in any forthcoming version of FCE. But 'videos' with iM08? Not for me...

  • Import failure - GoPro Hero4 4k30fps -"The file has no audio or video streams"

    I am trying to import a 4k30fps video from a GoPro Hero4 Black and I get this message "The file has no audio or video streams"
    I have tried importing in a number of different ways, does anyone have any help or advice?
    I am using Windows 7 64 bit and Adobe Premiere 8.0.1 (21)
    Thank you

    Uninstalling and reinstalling a "Typical" Apple Quicktime installation seemed to fix the problem.

  • iPad background audio from video when multitasking

    Hello.
    This is a request for people either to give feedback on their present experience or to try this out themselves and then report back.
    I have an iPad 1. Whenever I try to get background audio to play from a video while multitasking in another app (e.g. Safari), from any video app including Plex, AirVideo, VLC, and the original Apple Videos app, all that happens is that the icon changes to the iPod and a random song plays from my music library instead of the audio from the video I was watching. It makes no difference whether the video file is streamed or stored on the iPad itself.
    There is one slight and occasional exception: at times, if the video file is tagged in iTunes as 'music' rather than 'movie' or 'TV show', it does work, and on occasion YouTube videos will play in the background. But most of the time these don't work in the background either, with the iPod icon instantly replacing whichever video app icon I tried to use in the background.
    My son's iPhone 4, by contrast, works correctly and always plays video in the background when multitasking.
    I have asked this question before on this forum with no response, so I booked a visit to my local Genius Bar, where they checked my iPad and agreed it wasn't working as iOS should. Interestingly, they then attempted the same thing on an iPad 2 and it failed in exactly the same way. Their response was that this is evidently a fault in the iPad's iOS, possibly because the iPad has a separate Videos app whereas the iPhone has video within its generic iPod app.
    I did the usual and reported this bug to Apple through the feedback system, and asked the Genius Bar to do the same. But having thought about it, and having seen no other users on this forum puzzled by the lack of what I think is an important function, I wonder whether I am the only one who likes to listen to video while browsing and so nobody else needs it, or whether everyone else's iPads work as iOS should and, for some reason, both my iPad and the iPad 2 tested in the shop are at fault.
    So could I ask others reading this to have a go and see whether their iPad will play background audio from video when multitasking, or whether the same problem occurs?
    Thanks very much this is really appreciated.

    Oops, clicked the wrong button; I meant to click 'helpful answer', as it's hard to have a correct answer to such a question.
    But thanks for the replies.
    It looks like this is a bug in iOS solely for the iPad, as it works fine on the iPhone, although I wonder how the iPod touch behaves. Interestingly, it looks like I may also be in the minority in wishing to listen to video files as I browse :)
    Which probably means Apple may never get around to sorting it, as no one else uses the function.
    Anyway, thanks for the replies. If anyone else wants to see the bug, have a go and comment; I find it funny that I seem to be the only one to have found it to date.
    R:)

  • Syncing Audio and Video but wish for the Audio not to move on timeline?

    Has anyone worked out how to sync audio and video within Premiere Pro but have the audio stay fixed on the timeline and the video move? Every way I try it, Premiere Pro insists on moving the audio. Locking the track does not work, as you cannot select the audio within it to sync with.
    I have a music video shoot with 14 camera angles used on the lead singer (don't ask me why). Multi-camera editing is not an option, as there are too many sources for my brand new MacBook Pro to handle. I think I would need the computer from Superman II to edit that many clips together.
    What I am doing is syncing the first clip, the one I'm going to use as the baseline fallback shot, and then I need to sync the other clips onto the timeline. However, Premiere Pro insists on moving the audio to fit the video, not vice versa.
    Any help would be appreciated.

    @slitchfield: on the mentioned page, it is stated that "Improvements in screen real estate, with slimmer top status bar and optional (in some apps) bottom toolbar, meaning that all phones will have a larger useful display area."
    But on the E6 the bottom toolbar decreases available screen size in most apps, most noticeable in Web browser where the full view has gone. The bottom toolbar is not optional, you can't configure it yourself.
    And: "Homescreen widgets will now come in up to five different sizes (1x1, 2x1, 4x1, 2x2, 4x4) and allow a greater degree of interactivity.", but on the E6 they are in only one size, always with large borders and other visual effects.
    So beware of differences between phones, perhaps some aspects of Belle work differently depending on the type of device.
    "Notes now brings up a white (and AMOLED-unfriendly...) editing screen.": that you can change by switching themes, "Dark Solid for Anna" does a good job. It is a shame that the lines of Notes have gone though.
    When switching themes, I found you have to restart the phone, otherwise the theme doesn't get properly loaded in all apps. Stopping and starting apps is not enough. So when experimenting and loading one theme after another, you may end up with dysfunctional combinations of black-on-black and white-on-white, but it may not be due to the last theme. You may also end up with a good combination that isn't around anymore the next time you restart your phone.
    Current N900 and former E6 user regretting Belle upgrade
    Devices owned: 2110, 2110i, Cellular Data Card, 8110, 7110, 6210, 6310i, 6100, 9500, 6233, 5140i, 3109c, N900, E6-00

  • Premiere Pro CS6 importing .mp4 = "The file has no audio or video streams."

    Hi,
    When I import an MP4 clip into Adobe Premiere Pro CS6, I get the following message:
    "File Import Failure
    151_0010_01.mp4
    Error Message
    The file has no audio or video streams."
    I have the latest Quicktime, and I have reinstalled Premiere Pro CS6 a few times already.  If anyone knows a fix for this, I would greatly appreciate it.  My system and clip info are found below.  Thank you, Robert.
    OPERATING SYSTEM
    Win 7 Ultimate 64bit
    CLIP INFO
    Complete name                         \BPAV\CLPR\151_0009_01\151_0009_01.MP4
    Format                                   : MPEG-4
    Commercial name                    : XDCAM EX 35
    Format profile                         : Base Media / Version 2
    Codec ID                                : mp42
    File size                                : 1.76 GiB
    Duration                                 : 7mn 5s
    Overall bit rate mode                 : Variable
    Overall bit rate                         : 35.6 Mbps
    Encoded date                             : UTC 2009-07-19 04:22:10
    Tagged date                              : UTC 2009-07-19 04:22:10
    Video
    ID                                       : 1
    Format                                   : MPEG Video
    Commercial name                          : XDCAM EX 35
    Format version                           : Version 2
    Format profile                           : Main@High
    Format settings, BVOP                    : Yes
    Format settings, Matrix                  : Custom
    Format settings, GOP                     : M=3, N=15
    Codec ID                                 : 61
    Duration                                 : 7mn 5s
    Bit rate mode                            : Variable
    Bit rate                                 : 34.0 Mbps
    Maximum bit rate                         : 35.0 Mbps
    Width                                    : 1 280 pixels
    Height                                   : 720 pixels
    Display aspect ratio                     : 16:9
    Frame rate mode                          : Constant
    Frame rate                               : 29.970 fps
    Color space                              : YUV
    Chroma subsampling                       : 4:2:0
    Bit depth                                : 8 bits
    Scan type                                : Progressive
    Compression mode                         : Lossy
    Bits/(Pixel*Frame)                       : 1.231
    Stream size                              : 1.68 GiB (96%)
    Language                                 : English
    Encoded date                             : UTC 2009-07-19 04:22:10
    Tagged date                              : UTC 2009-07-19 04:22:10
    Color primaries                          : BT.709
    Transfer characteristics                 : BT.709
    Matrix coefficients                      : BT.709
    Audio
    ID                                       : 2
    Format                                   : PCM
    Format settings, Endianness              : Big
    Format settings, Sign                    : Signed
    Codec ID                                 : twos
    Duration                                 : 7mn 5s
    Bit rate mode                            : Constant
    Bit rate                                 : 1 536 Kbps
    Channel(s)                               : 2 channels
    Sampling rate                            : 48.0 KHz
    Bit depth                                : 16 bits
    Stream size                              : 77.9 MiB (4%)
    Language                                 : English
    Encoded date                             : UTC 2009-07-19 04:22:10
    Tagged date                              : UTC 2009-07-19 04:22:10

    I encountered a similar problem. I had QuickTime files that would play in the QuickTime player from the desktop, but Premiere wouldn't import them ("the file has no audio or video streams"). I had just reinstalled QuickTime and Premiere with no change. I finally realized that our Barracuda protection software was at fault: it was preventing communication with the QuickTime codec, so Premiere could not interpret the footage. Another symptom: I would launch After Effects and it would tell me that QuickTime wasn't installed.
    I solved this by uninstalling Barracuda. We reported the fault to Barracuda and have not seen any change in the software.

  • Grouping Audio and Video

    After adding several audio / video clips to my project, I remembered that I forgot to add a few. Now when I add the video clips, the audio files that used to align with the previous ones are no longer lined up correctly. Is there a way to group video and audio clips so that they always move together? Or is there another way to avoid this problem?
    Thanks

    rakusuira,
    You can use the Lock Audio Clip feature.
    First select your audio clip, then position your playhead where you want the audio clip to start, then choose Advanced > Lock Audio Clip at Playhead.
    You'll see a little yellow pin appear, indicating the lock.
    Matt
    P.S. I've noticed some quirky behavior of this feature. For example, if iMovie runs out of room in your movie it will allow locked audio to slide out of position. Still, I think it's worth trying.

  • My shuffle doesn't get synchronized with iTunes on my PC; I get an audio and video settings error, but I'm able to get it synchronized with my friend's PC

    My shuffle doesn't get synchronized with iTunes on my PC. I'm getting an audio and video settings error, but I am able to get it synchronized with my friend's PC.

    Try:                                               
    - iOS: Not responding or does not turn on           
    - Also try DFU mode after trying recovery mode
    How to put iPod touch / iPhone into DFU mode « Karthik's scribblings
    - If not successful and you can't fully turn the iOS device off, let the battery fully drain. After charging for at least an hour, try the above again.
    - Try another cable                     
    - Try on another computer                                                       
    - If still not successful that usually indicates a hardware problem and an appointment at the Genius Bar of an Apple store is in order.
    Apple Retail Store - Genius Bar
    The missing apps could be the result of Restrictions being set, which can hide those apps. If the backup was made with those Restrictions in place, then the Restrictions are also restored.
    Thus, if you get it to work, restore to factory settings/new iPod, not from a backup.
    You can redownload most iTunes purchases by:        
      Downloading past purchases from the App Store, iBookstore, and iTunes Store

  • Do I need some extra hardware interface for receiving both audio and video?

    Hi, I'm doing an e-learning project. I have to capture video from a webcam and voice from a headset and send them to a client,
    but my code works fine for either one at a time, not both.
    Do I need some extra hardware interface for receiving both audio and video? I'm using the AVTransmit and AVReceive code found on this site.
    After running Tx,
    if I give dsound:// & vfw://0 in the Media Locator, only sound is received and no video,
    and when I give vfw://0 in the Media Locator, only live video is transmitted.
    I'm using JMF 1.1.2e.
    If anyone knows how to make this work, or the cause of the problem, please reply soon. I will be very thankful. (One way to merge the two capture sources is sketched after the code below.)
    Transmitter/server-side code; first run Tx on the server:
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
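    // Tx: Swing front end that collects the media locator, local data port, and RTP targets, and starts/stops the AVTransmitter.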
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
         targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
    String host= InetAddress.getLocalHost().getHostAddress();     
         tf_local_host.setText( host);
         } catch( UnknownHostException e) {
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
         if( source == addTarget) {
         String ip= tf_remote_address.getText().trim();
         String port= tf_remote_data_port.getText().trim();
         String localPort= tf_data_port.getText().trim();
         addTargetToList( localPort, ip, port);
         if( avTransmitter != null) {
         avTransmitter.addTarget( ip, port);
         } else if( source == removeTarget) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              if( avTransmitter != null) {
         avTransmitter.removeTarget( target.ip, target.port);
              targets.removeElement( target);
              listModel.setData( targets);          
         } else if( source == startXmit) {
         if( startXmit.getLabel().equals( "Start Transmission")) {          
         int data_port= new Integer( tf_data_port.getText()).intValue();
              avTransmitter= new AVTransmitter( this, data_port);
         avTransmitter.start( tf_media_file.getText().trim(), targets);          
              avTransmitter.setLooping( cb_loop.isSelected());
         startXmit.setLabel( "Stop Transmission");
         } else if( startXmit.getLabel().equals( "Stop Transmission")) {
              avTransmitter.stop();
              avTransmitter= null;
              removeNonBaseTargets();
              listModel.setData( targets);
         startXmit.setLabel( "Start Transmission");          
         } else if( source == rtcp) {
         if( rtcpViewer == null) {
         rtcpViewer= new RTCPViewer();
         } else {
              rtcpViewer.setVisible( true);
              rtcpViewer.toFront();
         } else if( source == cb_loop) {
         if( avTransmitter != null) {
              avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
         Target target= (Target) targets.elementAt( i - 1);
         if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
         i--;
    public void addTargetToList( String localPort,
                             String ip, String port) {     
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                  port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);           
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
         rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
    >>>>>>>>>>>>>>>>>
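    // AVTransmitter: wraps the media locator in a Processor, programs each track to an RTP format, and creates one RTPManager per track to stream to the configured targets.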
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
    * Starts the transmission. Returns null if transmission started ok.
    * Otherwise it returns a string with the reason why the setup failed.
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
    * Use the RTPManager API to create sessions for each media
    * track of the processor.
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
               rtpMgrs[i].initialize( localAddr);          
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
    * Stops the transmission if already started
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
         // Do we have atleast one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
    * For JPEG and H263, we know that they only work for particular
    * sizes. So we'll perform extra checking here to make sure they
    * are of the right sizes.
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
    * Setting the encoding quality to the specified value on the JPEG encoder.
    * 0.5 is a good default.
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
    * Convenience methods to handle processor's state changes.
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
    * Inner Classes
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
         // If there was an error during configure or
         // realiz
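
    A note on the question above: no extra hardware interface should be needed to send audio and video at the same time. The usual JMF approach is to open the audio and video capture devices separately and merge their DataSources into a single two-track source before creating the Processor, instead of passing one media locator string. Below is a minimal, untested sketch of that idea; the device queries, class name, and method name are illustrative assumptions, not taken from the code above.

    import java.util.Vector;
    import javax.media.CaptureDeviceInfo;
    import javax.media.CaptureDeviceManager;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.format.AudioFormat;
    import javax.media.format.VideoFormat;
    import javax.media.protocol.DataSource;

    public class CaptureMerge {
        // Builds a single DataSource carrying both an audio and a video track,
        // suitable for wrapping in a Processor and transmitting over RTP.
        public static DataSource createMergedCaptureSource() throws Exception {
            // Ask JMF for the first available audio and video capture devices
            // (a null-encoded VideoFormat acts as a wildcard for any video format).
            Vector audioDevs = CaptureDeviceManager.getDeviceList(new AudioFormat(AudioFormat.LINEAR));
            Vector videoDevs = CaptureDeviceManager.getDeviceList(new VideoFormat(null));
            if (audioDevs.isEmpty() || videoDevs.isEmpty()) {
                throw new Exception("Need at least one audio and one video capture device");
            }
            MediaLocator audioLoc = ((CaptureDeviceInfo) audioDevs.elementAt(0)).getLocator();
            MediaLocator videoLoc = ((CaptureDeviceInfo) videoDevs.elementAt(0)).getLocator();

            // Create a DataSource per device, then merge them so the Processor
            // sees one source with two tracks (audio + video).
            DataSource audioDS = Manager.createDataSource(audioLoc);
            DataSource videoDS = Manager.createDataSource(videoLoc);
            return Manager.createMergingDataSource(new DataSource[] { audioDS, videoDS });
        }
    }

    The merged DataSource could then be handed to javax.media.Manager.createProcessor() in place of the single MediaLocator used in createProcessor() above; the rest of the RTP setup would stay the same, with one RTPManager per resulting track.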

    I do this all the time; I connect my MBP to a 60-inch Sharp. If you have the video working, do the simple thing first: check that the sound is turned on on both your TV and your Mac. If that doesn't work, go to System Preferences > Sound, look under the Output tab, see whether your TV is listed, and if it is, change the output to that setting.
    Hope it works
