Poor Audio and Video Quality

I am using iDVD to burn an iMovie project containing pictures and music, and I am getting very poor quality audio and video. The video is very low resolution, lacking clarity and definition, and the audio comes out flat, with over-saturated vocals that are unpleasant to listen to.
The content was made in iMovie '11, importing photos from iPhoto and music from iTunes.
I exported the movie via Share -> Export Movie -> HD 1080p.
In iDVD 7.1.2 I imported the movie (actually several) into a slideshow menu item.
I tried the iDVD encoding preference set to both 'Best Performance' and 'Professional Quality', with no difference between the two.
Then, to ensure it wasn't the DVD disc causing the problem, I used 'Save as Disc Image' and confirmed that the disc image saved to disk had the same quality issues as the burned DVD.
Then, to eliminate iMovie as the problem, I ran a test using an iPhoto slideshow in place of the iMovie project, and it had the same low quality issues.
I have observed that when previewing within iDVD, the A/V quality is fine. It's after the burn to disc that the problem occurs.
At this point I'm not sure what to try. I'm viewing the DVD on my 50" widescreen TV using a Sony DVD player.
A year or so back I did similar steps and the DVD came out fine... the sound quality was excellent and the video quality, while not outstanding, was reasonable.
Any ideas -- I'm really stumped.

Hi
In iMovie '11 I do not use Share to iDVD. Instead:
• Share to Media Browser, and as Large, NOT HD (the result will not be better but rather worse, because DVD-Video is standard definition, so an HD export only gets downscaled and re-encoded by iDVD).
• In iDVD, import the movie from the Media button and Movie tab.
Yours Bengt W

Similar Messages

  • Poor audio and video quality when in conference

    I have a customer with poor quality in audio/video conferences.
    P2P calls go really well; there are no problems with sharing desktops and applications until they try to share a PowerPoint presentation OR invite a third participant. The behaviour is the same even if the two users are on different network sites with a
    WAN between them. Audio and video are perfect until they share a PPT or invite a third participant.
    The environment is as follows:
    3 hardware Front End servers, two Edge servers, and an Office Web Apps farm with two servers. All web services are load-balanced with BigIP. There is a Cisco firewall between clients and servers. QoS is implemented on clients, Edge and FE servers. Latest CU on all
    servers. Approx. 390 users now (pilot); it should be approx. 6,500 when fully deployed. There is another pool with the same setup, but we do not have users on that pool yet. Users do not have internet access. All conversations going bad are classified as green in the monitoring reports.
    No extreme values in the reports. Clients are running no policies; all servers are default. Servers are flatlining. Approx. 20 video calls a day at this point, i.e. there is more than enough capacity in the network and on the servers.
    At this point I suspect either packet inspection on the Cisco firewall OR some odd configuration needed on the Front Ends.
    I have asked the network guys whether packet inspection is being done on the firewall, and whether anything has been done to optimize it.
    I have read about the conferencing policy, but I do not understand what the maximum limits would have to do with it.
    I am not sure where to start here.

    Hi,
    The media traffic bandwidth usage can be challenging to calculate because of the number of different variables, such as codec usage, resolution, and activity levels. The bandwidth usage is a function of the codec used and the activity
    of the stream, both of which vary between scenarios.
    You can refer to the link below, "Network bandwidth requirements for media traffic in Lync Server 2013", for the audio codecs commonly used in Lync Server 2013 scenarios and to check whether the network bandwidth matches the requirements (especially for the conference
    situation):
    http://technet.microsoft.com/en-us/library/jj688118.aspx
    Best Regards,
    Eason Huang
    TechNet Community Support
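
    A rough back-of-the-envelope sketch of this estimate, for illustration only: the aggregate media bandwidth of a small conference is roughly participants x (audio bitrate + video bitrate). The per-stream figures below are placeholders, not the official Lync 2013 codec numbers; substitute the values from the linked article.
    public class ConferenceBandwidthEstimate {
        public static void main(String[] args) {
            // Placeholder per-stream bitrates in kbps (assumed, not from the TechNet table).
            double audioKbps = 100.0;
            double videoKbps = 1500.0;
            int participants = 3; // the point at which this customer sees quality drop
            double totalKbps = participants * (audioKbps + videoKbps);
            System.out.printf("Estimated peak media bandwidth: %.0f kbps (%.1f Mbps)%n",
                    totalKbps, totalKbps / 1000.0);
        }
    }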

  • HT1933: when I play my purchased movies in HD, the audio and video quality is poor

    I have purchased movies in HD, but when I try to play them the audio skips and there is skipping in the video as well. However, I also have a few movies in SD and they do not have these problems. Even though the computer I'm using is not a Mac, it is an extremely nice Asus with a 4th-gen i5 and supports HD video playback. Anyone know what to do? I tried to get help from Apple and they scheduled me a phone call two weeks from now. Some help, right?!

    Can anybody suggest how I can improve the quality of audio and video in the JMF Player? Below I am providing the source code of the JMF Player I am using.
    My suggestion would be to use a supported OS with your JMF. JMF doesn't support "Ubuntu 8.04 and MAC OS 10.5"; it only supports Red Hat and Solaris (and Windows 9X or NT 4.0).

  • iChat: Poor Audio and Video

    Hi everyone, hoping for some insights into improving my connection.
    Well, I've read everything I can find and, whilst I can now connect (DON'T put iChat and Skype in the Login Items), I'm getting pretty unusable audio and video.
    Here's what I've tried and where I'm at:
    QT streaming is at 1.5
    iChat is on None
    Mac Firewall is OFF
    I connect via WIFI to a "FreeBox" - a combi router and modem here in France issued by my ISP.
    For simplicity I've put my iMac in the DMZ on the router setup page.
    The iMac has never run anything before 10.4.8
    I do DHCP manually.
    There is an AirPort Extreme card inside (AirPort Extreme (0x14E4, 0x87))
    Wireless Card Locale: Worldwide
    Wireless Card Firmware Version: 4.80.46.0
    NO iChat add-ons
    Bandwidth... the last speed test says 2,27 Mbit/s down and 136,17 kbit/s up
    I Just tried chatting with defcom1 (thanks mate!) and it was really bad - from my end. At his it was apparently OK...???
    Would really appreciate some thoughts on this
    CHeers, Adam
    iMac C2D 2GHZ   Mac OS X (10.4.8)  

    Is this the speed your ISP is providing or was this
    something that was recommended here on these forums?
    This QT setting is as recommended here in the forums
    Do you have a Direct Subscriber Line (DSL) or is it
    faster?
    Yes, it's a 2Mb DSL line
    Obviously you are using the built in iSight camera
    and the lighting is OK. Can you try an external
    camera, possibly a firewire camcorder?
    Haven't access to any other cameras...
    Incidentally, I've never had to be in the DMZ but I
    don't think this is going to be an A/V quality
    issue.
    I wanted to establish that this wasn't related to any ports not being open or whatever. Being thorough.
    Any other ideas ?
    I'm gathering strength to physically move the iMac down to the "FreeBox" (modem/router) and try the same with an Ethernet connection so as to rule out poor WiFi, but I'm not convinced about it.
    TIA
    AP
    iMac C2D 2GHZ Mac OS X (10.4.8)

  • Old Projects in HD poor audio and video

    I had an old ongoing project that I have added to for 2 or 3 years,
    using iMovie 2, 3, 4, 5 and HD. It would not 'share' back to tape.
    Some nice people on this forum advised me to export the project to
    QuickTime and then back into a new project. I did that and it seemed to
    work fine; I even burned an iDVD.
    But the next day, when I played the DVD and also tried the tape,
    the video and audio were not good quality. Is a reduction in quality
    to be expected, or did I do something wrong?
    I really appreciate the help and information I've gotten from all
    the great people on this discussion group
    Terry


  • How to get audio and video in one file

    Hi... I am a Premiere beginner and I have a simple question, I think :)... Every time I try to export media from Premiere (using Media Encoder) I get a video file and some audio files, but what I need is only ONE file that contains both audio and video. For example, I have edited a film in HD quality and I want to export it. I choose the H.264 format (make some settings, PAL, audio quality...) and when it's all done, the only result I get is one video file and two audio files, and they are not together :)... So please help me, anyone. And as you can see I am not from an English-speaking country, so sorry for my English... Thanks for the help.

    There is no indication that something is out-of-sync or the ability to move/slip into sync like in FCE/FCP.
    To the best of my knowledge, using match frame and overwriting the video with video+audio (or attaching it as a connected clip if you don't want to overwrite the video, which might have effects applied) is the best way to do this.
    You can use the blade tool first in the timeline to get just the section you want to get back into sync. Make cuts so that you have a single clip that needs the audio resynced, select it, and use shift-F to select the original synced clip in the Event Browser. Make sure your play head is at the beginning of this video clip in the timeline. Press option-2, then 'q', and now you have the audio back in your timeline, synced with the video.  Delete the old, no-longer-synced audio and you should be set.

  • Do I need some extra hardware interface for receiving both audio and video?

    Hi, I am doing an e-learning project. I have to capture video from a webcam and voice from a headset and send them to a client,
    but my code works for only one of them at a time.
    Do I need some extra hardware interface for receiving both audio and video? I am using the AVTransmit and AVReceive code found on this site.
    After running Tx,
    if I give dsound:// & vfw://0 in the Media Locator, only sound is received and no video,
    and when I give vfw://0 in the Media Locator, only live video is transmitted.
    I am using JMF 1.1.2e.
    If anyone knows how to make this work, or the cause of the problem, please reply soon. I will be very thankful.
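    Extra hardware is usually not needed for this; a common JMF approach is to merge the separate audio and video capture DataSources into one DataSource before creating the processor. The following is only an untested, illustrative sketch (the locator strings are typical Windows/JavaSound examples and may differ on other systems):
     import javax.media.Manager;
     import javax.media.MediaLocator;
     import javax.media.Processor;
     import javax.media.protocol.DataSource;
     public class MergedCaptureExample {
         public static void main(String[] args) throws Exception {
             // Separate capture sources: video from the webcam, audio from the sound card.
             DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));
             DataSource audio = Manager.createDataSource(new MediaLocator("javasound://44100"));
             // Combine both into a single DataSource so one Processor (and one
             // transmitter) can carry an audio track and a video track together.
             DataSource merged = Manager.createMergingDataSource(new DataSource[] { video, audio });
             // The merged source can then be fed to a Processor in the same way the
             // AVTransmitter code below feeds its single MediaLocator.
             Processor processor = Manager.createProcessor(merged);
             System.out.println("Processor created: " + processor);
         }
     }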
    Transmitter/server side code. First run Tx on the server:
    import java.io.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.net.*;
    import java.util.*;
    import javax.media.rtp.*;
    import javax.swing.*;
    import javax.swing.event.*;
    import javax.swing.border.*;
    public class Tx extends JFrame implements ActionListener, KeyListener,
    MouseListener, WindowListener {
    Vector targets;
    JList list;
    JButton startXmit;
    JButton rtcp;
    JButton update;
    JButton expiration;
    JButton statistics;
    JButton addTarget;
    JButton removeTarget;
    JTextField tf_remote_address;
    JTextField tf_remote_data_port;
    JTextField tf_media_file;
    JTextField tf_data_port;
    TargetListModel listModel;
    AVTransmitter avTransmitter;
    RTCPViewer rtcpViewer;
    JCheckBox cb_loop;
    Config config;
    public Tx() {
    setTitle( "JMF/RTP Transmitter");
         config= new Config();
         GridBagLayout gridBagLayout= new GridBagLayout();
         GridBagConstraints gbc;
         JPanel p= new JPanel();
         p.setLayout( gridBagLayout);
         JPanel localPanel= createLocalPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.gridwidth= 2;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
         p.add( localPanel);
         JPanel targetPanel= createTargetPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 1;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
    p.add( targetPanel);
         JPanel mediaPanel= createMediaPanel();
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 2;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
    p.add( mediaPanel);
    JPanel buttonPanel= new JPanel();
    rtcp= new JButton( "RTCP Monitor");
    update= new JButton( "Transmission Status");
         update.setEnabled( false);
         rtcp.addActionListener( this);
         update.addActionListener( this);
         buttonPanel.add( rtcp);
         buttonPanel.add( update);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 3;
    gbc.gridwidth= 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
         p.add( buttonPanel);
    getContentPane().add( p);
         list.addMouseListener( this);
         addWindowListener( this);
    pack();
    setVisible( true);
    private JPanel createMediaPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "Media Locator:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_media_file= new JTextField( 35);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
         p.add( tf_media_file);
         tf_media_file.setText( config.media_locator);
         cb_loop= new JCheckBox( "loop");
         startXmit= new JButton( "Start Transmission");
         startXmit.setEnabled( true);
         startXmit.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 2;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
         p.add( cb_loop);
         cb_loop.setSelected( true);
         cb_loop.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
         p.add( startXmit);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
         p.setBorder( titledBorder);
         return p;
    private JPanel createTargetPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         targets= new Vector();
         for( int i= 0; i < config.targets.size(); i++) {
         targets.addElement( config.targets.elementAt( i));
    listModel= new TargetListModel( targets);
    list= new JList( listModel);
         list.addKeyListener( this);
         list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
    JScrollPane scrollPane= new JScrollPane( list,
    ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
    ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
         gbc= new GridBagConstraints();
         gbc.gridx= 0;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
         p.add( scrollPane);
    JPanel p1= new JPanel();
         p1.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_address= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
         p1.add( tf_remote_address);
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
         p1.add( label);
         tf_remote_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
         p1.add( tf_remote_data_port);     
    JPanel p2= new JPanel();
    addTarget= new JButton( "Add Target");     
    removeTarget= new JButton( "Remove Target");
         p2.add( addTarget);
         p2.add( removeTarget);
         addTarget.addActionListener( this);
         removeTarget.addActionListener( this);
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 2;
         gbc.weightx = 1.0;
         gbc.weighty = 0.0;
         gbc.gridwidth= 2;
         gbc.anchor = GridBagConstraints.CENTER;
         gbc.fill = GridBagConstraints.HORIZONTAL;
         gbc.insets = new Insets( 20,5,0,5);
         ((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
         p1.add( p2);
         gbc= new GridBagConstraints();
         gbc.gridx= 1;
         gbc.gridy= 0;
         gbc.weightx= 1.0;
         gbc.weighty= 1.0;
         gbc.anchor= GridBagConstraints.CENTER;
         gbc.fill= GridBagConstraints.BOTH;
         gbc.insets= new Insets( 10, 5, 0, 0);
         ((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
         p.add( p1);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
         p.setBorder( titledBorder);
         return p;
    private JPanel createLocalPanel() {
    JPanel p= new JPanel();
         GridBagLayout gridBagLayout= new GridBagLayout();
    GridBagConstraints gbc;
         p.setLayout( gridBagLayout);
         JLabel label= new JLabel( "IP Address:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         JTextField tf_local_host= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 0;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
         p.add( tf_local_host);
         try {
    String host= InetAddress.getLocalHost().getHostAddress();     
         tf_local_host.setText( host);
         } catch( UnknownHostException e) {
         label= new JLabel( "Data Port:");
         gbc= new GridBagConstraints();
         gbc.gridx = 0;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.EAST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,0,5);
         ((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
         p.add( label);
         tf_data_port= new JTextField( 15);
         gbc= new GridBagConstraints();
         gbc.gridx = 1;
         gbc.gridy = 1;
         gbc.weightx = 0.0;
         gbc.weighty = 0.0;
         gbc.anchor = GridBagConstraints.WEST;
         gbc.fill = GridBagConstraints.NONE;
         gbc.insets = new Insets( 5,5,10,5);
         ((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
         p.add( tf_data_port);
         tf_data_port.setText( config.local_data_port);
         TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
         p.setBorder( titledBorder);
         return p;
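     // Handles all button and checkbox events: adding/removing RTP targets,
     // starting and stopping the AVTransmitter, opening the RTCP monitor,
     // and toggling looping on the running transmitter.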
    public void actionPerformed( ActionEvent event) {
    Object source= event.getSource();
         if( source == addTarget) {
         String ip= tf_remote_address.getText().trim();
         String port= tf_remote_data_port.getText().trim();
         String localPort= tf_data_port.getText().trim();
         addTargetToList( localPort, ip, port);
         if( avTransmitter != null) {
         avTransmitter.addTarget( ip, port);
         } else if( source == removeTarget) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              if( avTransmitter != null) {
         avTransmitter.removeTarget( target.ip, target.port);
              targets.removeElement( target);
              listModel.setData( targets);          
         } else if( source == startXmit) {
         if( startXmit.getLabel().equals( "Start Transmission")) {          
         int data_port= new Integer( tf_data_port.getText()).intValue();
              avTransmitter= new AVTransmitter( this, data_port);
         avTransmitter.start( tf_media_file.getText().trim(), targets);          
              avTransmitter.setLooping( cb_loop.isSelected());
         startXmit.setLabel( "Stop Transmission");
         } else if( startXmit.getLabel().equals( "Stop Transmission")) {
              avTransmitter.stop();
              avTransmitter= null;
              removeNonBaseTargets();
              listModel.setData( targets);
         startXmit.setLabel( "Start Transmission");          
         } else if( source == rtcp) {
         if( rtcpViewer == null) {
         rtcpViewer= new RTCPViewer();
         } else {
              rtcpViewer.setVisible( true);
              rtcpViewer.toFront();
         } else if( source == cb_loop) {
         if( avTransmitter != null) {
              avTransmitter.setLooping( cb_loop.isSelected());
    private void removeNonBaseTargets() {
         String localPort= tf_data_port.getText().trim();
         for( int i= targets.size(); i > 0;) {
         Target target= (Target) targets.elementAt( i - 1);
         if( !target.localPort.equals( localPort)) {
    targets.removeElement( target);
         i--;
    public void addTargetToList( String localPort,
                             String ip, String port) {     
    ListUpdater listUpdater= new ListUpdater( localPort, ip,
                                  port, listModel, targets);
    SwingUtilities.invokeLater( listUpdater);           
    public void rtcpReport( String report) {
         if( rtcpViewer != null) {
         rtcpViewer.report( report);
    public void windowClosing( WindowEvent event) {
         config.local_data_port= tf_data_port.getText().trim();
         config.targets= new Vector();
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( config.local_data_port)) {
              config.addTarget( target.ip, target.port);
         config.media_locator= tf_media_file.getText().trim();
         config.write();
    System.exit( 0);
    public void windowClosed( WindowEvent event) {
    public void windowDeiconified( WindowEvent event) {
    public void windowIconified( WindowEvent event) {
    public void windowActivated( WindowEvent event) {
    public void windowDeactivated( WindowEvent event) {
    public void windowOpened( WindowEvent event) {
    public void keyPressed( KeyEvent event) {
    public void keyReleased( KeyEvent event) {
    Object source= event.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
    public void keyTyped( KeyEvent event) {
    public void mousePressed( MouseEvent e) {
    public void mouseReleased( MouseEvent e) {
    public void mouseEntered( MouseEvent e) {
    public void mouseExited( MouseEvent e) {
    public void mouseClicked( MouseEvent e) {
    Object source= e.getSource();
         if( source == list) {
         int index= list.getSelectedIndex();
         if( index != -1) {
              Target target= (Target) targets.elementAt( index);
              tf_remote_address.setText( target.ip);
              tf_remote_data_port.setText( target.port);
         int index= list.locationToIndex( e.getPoint());
    public static void main( String[] args) {
    new Tx();
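     // Simple ListModel backing the targets JList; Tx refreshes it via setData()
     // whenever the Vector of targets changes.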
    class TargetListModel extends AbstractListModel {
    private Vector options;
    public TargetListModel( Vector options) {
         this.options= options;
    public int getSize() {
         int size;
         if( options == null) {
         size= 0;
         } else {
         size= options.size();
         return size;
    public Object getElementAt( int index) {
    String name;
    if( index < getSize()) {
         Target o= (Target)options.elementAt( index);
    name= o.localPort + " ---> " + o.ip + ":" + o.port;
         } else {
         name= null;
         return name;
    public void setData( Vector data) {
         options= data;
         fireContentsChanged( this, 0, data.size());
    class ListUpdater implements Runnable {
    String localPort, ip, port;
    TargetListModel listModel;
    Vector targets;
    public ListUpdater( String localPort, String ip, String port,
                   TargetListModel listModel, Vector targets) {
         this.localPort= localPort;
         this.ip= ip;
         this.port= port;
         this.listModel= listModel;
         this.targets= targets;
    public void run() {
    Target target= new Target( localPort, ip, port);
         if( !targetExists( localPort, ip, port)) {
         targets.addElement( target);
    listModel.setData( targets);
    public boolean targetExists( String localPort, String ip, String port) {
         boolean exists= false;
         for( int i= 0; i < targets.size(); i++) {
         Target target= (Target) targets.elementAt( i);
         if( target.localPort.equals( localPort)
         && target.ip.equals( ip)
              && target.port.equals( port)) {          
              exists= true;
         break;
         return exists;
     // ================= AVTransmitter.java =================
    import java.awt.*;
    import java.io.*;
    import java.net.InetAddress;
    import java.util.*;
    import javax.media.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.TrackControl;
    import javax.media.control.QualityControl;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
    ControllerListener {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private int portBase;
    private Processor processor = null;
    private RTPManager rtpMgrs[];
    private int localPorts[];
    private DataSource dataOutput = null;
    private int local_data_port;
    private Tx tx;
    public AVTransmitter( Tx tx, int data_port) {
         this.tx= tx;
         local_data_port= data_port;
    * Starts the transmission. Returns null if transmission started ok.
    * Otherwise it returns a string with the reason why the setup failed.
    public synchronized String start( String filename, Vector targets) {
         String result;
         locator= new MediaLocator( filename);
         // Create a processor for the specified media locator
         // and program it to output JPEG/RTP
         result = createProcessor();
         if (result != null) {
         return result;
         // Create an RTP session to transmit the output of the
         // processor to the specified IP address and port no.
         result = createTransmitter( targets);
         if (result != null) {
         processor.close();
         processor = null;
         return result;
         // Start the transmission
         processor.start();
         return null;
    * Use the RTPManager API to create sessions for each media
    * track of the processor.
    private String createTransmitter( Vector targets) {
         // Cheated. Should have checked the type.
         PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
         PushBufferStream pbss[] = pbds.getStreams();
         rtpMgrs = new RTPManager[pbss.length];
         localPorts = new int[ pbss.length];
         SessionAddress localAddr, destAddr;
         InetAddress ipAddr;
         SendStream sendStream;
         int port;
         SourceDescription srcDesList[];
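          // One RTPManager (one RTP session) is created per media track; each
          // session binds to local_data_port + 2*i, so the audio and video
          // streams use consecutive even-numbered port pairs.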
         for (int i = 0; i < pbss.length; i++) {
         // for (int i = 0; i < 1; i++) {
         try {
              rtpMgrs[i] = RTPManager.newInstance();     
              port = local_data_port + 2*i;
              localPorts[ i]= port;
              localAddr = new SessionAddress( InetAddress.getLocalHost(),
                                  port);
               rtpMgrs[i].initialize( localAddr);
              rtpMgrs[i].addReceiveStreamListener(this);
              rtpMgrs[i].addRemoteListener(this);
         for( int k= 0; k < targets.size(); k++) {
              Target target= (Target) targets.elementAt( k);
              int targetPort= new Integer( target.port).intValue();
              addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
              sendStream = rtpMgrs[i].createSendStream(dataOutput, i);          
              sendStream.start();
         } catch (Exception e) {
              e.printStackTrace();
              return e.getMessage();
         return null;
    public void addTarget( String ip, String port) {
         for (int i= 0; i < rtpMgrs.length; i++) {
         int targetPort= new Integer( port).intValue();
         addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
    public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
         try {
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         mgr.addTarget( addr);
         tx.addTargetToList( localPort + "", ip, port + "");
         } catch( Exception e) {
         e.printStackTrace();
    public void removeTarget( String ip, String port) {
         try {     
         SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
                                  new Integer( port).intValue());
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
         } catch( Exception e) {
         e.printStackTrace();
    boolean looping= true;
    public void controllerUpdate( ControllerEvent ce) {
         System.out.println( ce);
         if( ce instanceof DurationUpdateEvent) {
         Time duration= ((DurationUpdateEvent) ce).getDuration();
         System.out.println( "duration: " + duration.getSeconds());
         } else if( ce instanceof EndOfMediaEvent) {
         System.out.println( "END OF MEDIA - looping=" + looping);
         if( looping) {
         processor.setMediaTime( new Time( 0));
              processor.start();
    public void setLooping( boolean flag) {
         looping= flag;
    public void update( ReceiveStreamEvent event) {
         String timestamp= getTimestamp();
         StringBuffer sb= new StringBuffer();
         if( event instanceof InactiveReceiveStreamEvent) {
         sb.append( timestamp + " Inactive Receive Stream");
         } else if( event instanceof ByeEvent) {
         sb.append( timestamp + " Bye");
         } else {
         System.out.println( "ReceiveStreamEvent: "+ event);
         tx.rtcpReport( sb.toString());     
    public void update( RemoteEvent event) {     
         String timestamp= getTimestamp();
         if( event instanceof ReceiverReportEvent) {
         ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
         StringBuffer sb= new StringBuffer();
         sb.append( timestamp + " RR");
         if( rr != null) {
              Participant participant= rr.getParticipant();
              if( participant != null) {
              sb.append( " from " + participant.getCNAME());
              sb.append( " ssrc=" + rr.getSSRC());
              } else {
              sb.append( " ssrc=" + rr.getSSRC());
              tx.rtcpReport( sb.toString());
         } else {
         System.out.println( "RemoteEvent: " + event);
    private String getTimestamp() {
         String timestamp;
         Calendar calendar= Calendar.getInstance();
         int hour= calendar.get( Calendar.HOUR_OF_DAY);
         String hourStr= formatTime( hour);
         int minute= calendar.get( Calendar.MINUTE);
         String minuteStr= formatTime( minute);
         int second= calendar.get( Calendar.SECOND);
         String secondStr= formatTime( second);
         timestamp= hourStr + ":" + minuteStr + ":" + secondStr;     
         return timestamp;
    private String formatTime( int time) {     
         String timeStr;
         if( time < 10) {
         timeStr= "0" + time;
         } else {
         timeStr= "" + time;
         return timeStr;
    * Stops the transmission if already started
    public void stop() {
         synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         for (int i= 0; i < rtpMgrs.length; i++) {
         rtpMgrs[ i].removeTargets( "Session ended.");
              rtpMgrs[ i].dispose();
    public String createProcessor() {
         if (locator == null) {
         return "Locator is null";
         DataSource ds;
         DataSource clone;
         try {
         ds = javax.media.Manager.createDataSource(locator);
         } catch (Exception e) {
         return "Couldn't create DataSource";
         // Try to create a processor to handle the input media locator
         try {
         processor = javax.media.Manager.createProcessor(ds);
         processor.addControllerListener( this);     
         } catch (NoProcessorException npe) {
         return "Couldn't create processor";
         } catch (IOException ioe) {
         return "IOException creating processor";
         // Wait for it to configure
         boolean result = waitForState(processor, Processor.Configured);
         if (result == false)
         return "Couldn't configure processor";
         // Get the tracks from the processor
         TrackControl [] tracks = processor.getTrackControls();
         // Do we have atleast one track?
         if (tracks == null || tracks.length < 1)
         return "Couldn't find tracks in processor";
         // Set the output content descriptor to RAW_RTP
         // This will limit the supported formats reported from
         // Track.getSupportedFormats to only valid RTP formats.
         ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
         processor.setContentDescriptor(cd);
         Format supported[];
         Format chosen;
         boolean atLeastOneTrack = false;
         // Program the tracks.
         for (int i = 0; i < tracks.length; i++) {
         Format format = tracks[i].getFormat();
         if (tracks[i].isEnabled()) {
              supported = tracks[i].getSupportedFormats();
              // We've set the output content to the RAW_RTP.
              // So all the supported formats should work with RTP.
              // We'll just pick the first one.
              if (supported.length > 0) {
              if (supported[0] instanceof VideoFormat) {
                   // For video formats, we should double check the
                   // sizes since not all formats work in all sizes.
                   chosen = checkForVideoSizes(tracks[i].getFormat(),
                                       supported[0]);
              } else
                   chosen = supported[0];
              tracks[i].setFormat(chosen);
              System.err.println("Track " + i + " is set to transmit as:");
              System.err.println(" " + chosen);
              atLeastOneTrack = true;
              } else
              tracks[i].setEnabled(false);
         } else
              tracks[i].setEnabled(false);
         if (!atLeastOneTrack)
         return "Couldn't set any of the tracks to a valid RTP format";
         // Realize the processor. This will internally create a flow
         // graph and attempt to create an output datasource for JPEG/RTP
         // audio frames.
         result = waitForState(processor, Controller.Realized);
         if (result == false)
         return "Couldn't realize processor";
         // Set the JPEG quality to .5.
         setJPEGQuality(processor, 0.5f);
         // Get the output data source of the processor
         dataOutput = processor.getDataOutput();
         return null;
    static SessionAddress destAddr1, destAddr2;
    * For JPEG and H263, we know that they only work for particular
    * sizes. So we'll perform extra checking here to make sure they
    * are of the right sizes.
    Format checkForVideoSizes(Format original, Format supported) {
         int width, height;
         Dimension size = ((VideoFormat)original).getSize();
         Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
         Format h263Fmt = new Format(VideoFormat.H263_RTP);
         if (supported.matches(jpegFmt)) {
         // For JPEG, make sure width and height are divisible by 8.
         width = (size.width % 8 == 0 ? size.width :
                        (int)(size.width / 8) * 8);
         height = (size.height % 8 == 0 ? size.height :
                        (int)(size.height / 8) * 8);
         } else if (supported.matches(h263Fmt)) {
         // For H.263, we only support some specific sizes.
         if (size.width < 128) {
              width = 128;
              height = 96;
         } else if (size.width < 176) {
              width = 176;
              height = 144;
         } else {
              width = 352;
              height = 288;
         } else {
         // We don't know this particular format. We'll just
         // leave it alone then.
         return supported;
         return (new VideoFormat(null,
                        new Dimension(width, height),
                        Format.NOT_SPECIFIED,
                        null,
                        Format.NOT_SPECIFIED)).intersects(supported);
    * Setting the encoding quality to the specified value on the JPEG encoder.
    * 0.5 is a good default.
    void setJPEGQuality(Player p, float val) {
         Control cs[] = p.getControls();
         QualityControl qc = null;
         VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);
         // Loop through the controls to find the Quality control for
         // the JPEG encoder.
         for (int i = 0; i < cs.length; i++) {
         if (cs[i] instanceof QualityControl &&
              cs[i] instanceof Owned) {
              Object owner = ((Owned)cs[i]).getOwner();
              // Check to see if the owner is a Codec.
              // Then check for the output format.
              if (owner instanceof Codec) {
              Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
              for (int j = 0; j < fmts.length; j++) {
                   if (fmts[j].matches(jpegFmt)) {
                   qc = (QualityControl)cs[i];
                   qc.setQuality(val);
                   System.err.println("- Setting quality to " +
                             val + " on " + qc);
                   break;
              if (qc != null)
              break;
    * Convenience methods to handle processor's state changes.
    private Integer stateLock = new Integer(0);
    private boolean failed = false;
    Integer getStateLock() {
         return stateLock;
    void setFailed() {
         failed = true;
    private synchronized boolean waitForState(Processor p, int state) {
         p.addControllerListener(new StateListener());
         failed = false;
         // Call the required method on the processor
         if (state == Processor.Configured) {
         p.configure();
         } else if (state == Processor.Realized) {
         p.realize();
         // Wait until we get an event that confirms the
         // success of the method, or a failure event.
         // See StateListener inner class
         while (p.getState() < state && !failed) {
         synchronized (getStateLock()) {
              try {
              getStateLock().wait();
              } catch (InterruptedException ie) {
              return false;
         if (failed)
         return false;
         else
         return true;
    * Inner Classes
    class StateListener implements ControllerListener {
         public void controllerUpdate(ControllerEvent ce) {
         // If there was an error during configure or
         // realiz

    I do this all the time; I connect my MBP to a 60-inch Sharp. If you have the video working, do the simple thing first: check to make sure your sound is on, on both your TV and your Mac. If that doesn't work, go to System Preferences and, under Sound, go to the tab called Output and see if your TV is listed; if it is, change it to that setting.
    Hope it works

  • Can 2 FireWire audio and video interfaces work together on FCP?

    hello,
    My audio interface is a TC Electronic Konnekt 24D
    my video interface is a Canopus ADVC 110
    my hard drives are plugged in firewire too
    and I use Final Cut Pro 5.0.4
    So, the Konnekt is recognized by FCP only when the Canopus is not plugged in. And even then, the audio sometimes drops out and sometimes causes the app to crash, so that I have to trash the preferences.
    When I plug in the Canopus, the Konnekt 24D is recognised by OS X but not by FCP.
    I want to use the voice-over tool with the Konnekt 24D, so what should I do?
    Can 2 FireWire audio and video interfaces work together?
    If so, maybe I only have a driver problem from TC Electronic.
    But if not, I have to change this interface. Let me know if only RME, MOTU, PreSonus and Edirol work, as is said on Apple's webpage for hardware compatibility.
    thanks
    eMac G4 1,25 Ghz Mac OS X (10.3.9)
    eMac G4 1,25 Ghz 1 Go RAM DDR   Mac OS X (10.3.9)  

    The 1082 does show up as a possible input when I open
    the VO tool.
    OK. So an audio interface can be plugged into FCP without crashing the app. Because of the Konnekt driver problem, I wasn't sure of this.
    I use my Aja box for export & external monitoring.
    All my audio sources run back into the Tascam and I
    use the analog mixer in the 1082 to monitor different
    sources.
    When I view external video, the audio follows it to
    keep sync.
    That's exactly what I want with other I/O interfaces. I'd like to monitor video with the Canopus ADVC 110, and monitor audio with the Konnekt 24D (or another interface if this one is not compatible).
    I also input dailies with a deck or a camera (DV or HDV), but that's not the point here.
    I want to input audio with the Konnekt 24D because of its audio quality. Not audio synced to video dailies, but voice over, in sync with FCP video playback (for me, the ADVC 110).

  • I am trying to export a video from Premiere Pro. I have both audio and video checked and the sequence is active, but only the audio exports. There is no video at all. Can anyone tell me how to get the video to export? It was working fine until today.

    Premiere Pro CC has been working fine until today.  I have completed my sequence and it is active.  I have both audio and video checked when I am exporting using H.264 with Match Source Hi bitrate.  I am choosing entire sequence.  The audio exports but the video and title does not.  Can anyone tell me how to troubleshoot this so I can complete this project?  Please!

    Greetings,
    I've never seen this issue, and I handle many iPads, of all versions. WiFi issues are generally local to the WiFi router - they are not all of the same quality, range, immunity to interference, etc. You have distance, building construction, and the biggie - interference.
    At home, I use Apple routers, and have no issues with any of my WiFi enabled devices, computers, mobile devices, etc - even the lowly PeeCees. I have locations where I have Juniper Networks, as well as Aruba, and a few Netgears - all of them work as they should.
    The cheaper routers - Linksys, D-Link, Siemens home units, and many other no-name devices - have caused issues of various kinds, even with basic connectivity.
    I have no idea what Starbucks uses, but I always have a good connection, and I go there nearly every morning and get some work done, as well as play.
    You could try changing channels, 2.4 to 5 Gigs, changing locations of the router. I have had to do all of these at one time or another over the many years that I have been a Network Engineer.
    Good Luck - Cheers,
    M.

  • Audio and video out of synch on export

    Hi,
    I've searched the forum for this. I have a movie in FCE 4 in which the audio and video are in sync in the timeline. The audio was recorded into Logic at 96, noodled, passed into FCE at 48, and then had to be lined up with the video. Then, when I go to export, the resulting QT movie is out of sync, the audio now about one second ahead of the video. I'm using QT conversion. My compression is H.264, quality best, frame rate 30, multi-pass, 640 x 480, format AAC or linear PCM, various sample rates, stereo and mono, etc. I've tried various other settings for export, but the problem remains.
    What am I doing wrong? Many thanks for any help,
    Ray

    Hi,
    Thanks for the reply. I think you're on the right track. I want the best sound possible, so I record into Logic; knowing I'm going into FCE, I record at 96 so it's an even multiple down to 48. It goes out as a .wav file, though, and maybe that's the problem. Into FCE, but there is other audio there as well from the camera mic, which I occasionally include in the soundtrack for ambience. All of this is noodled a fair amount and then exported, the intent being to end up at 44, as it will be uploaded to YouTube. I switch to 16-bit out of Logic, but the AIFF vs WAV question may be the problem. Do you think there might be a better way to do this?
    Ray

  • Audio and video out of sync when using quicktime conversion

    hello,
    I am editing an HDV project and want to export it to QuickTime. The client needs it in a specific size (700x394). They do not want the video in the file compressed, just the image resized; they are compressing the video through Flash for the web. Anyway, I used to just use QuickTime Conversion with a custom size of 700x394, but now when I do that the audio and video are sporadically out of sync in the result. The weird thing is that the project plays fine in the timeline, but when I export, the problem occurs. I then exported to a full-res FCP movie and it is fine. I then tried re-exporting through QT Pro to 700x394; when I did this there were no sync issues, but the quality of the file was not so good - it looked more pixelated, certainly not as good as when I did it using QuickTime Conversion.
    How can I export through FCP to get the 700x394 file? Someone said to 'use current settings' instead of using QuickTime Conversion. How do I do this? What do I need to do, and what are my settings?
    Any help is appreciated. Thanks a ton in advance.
    cheers,
    clay

    What format are you converting the HDV to?
    If it's animation, it is quite possible your computer can't pull the data rate. In this case it would be a processor / hard drive issue not an out of sync issue.
    x

  • Audio and video not synchronized in recording made with Lync 2013

    There appears to be no forum for the Lync 2013 client yet, so I am posting this in the Lync 2013 server forum.
    Environment: Lync 2013 Server and Lync 2013 32-bit client. Only one Lync pool. Conference call was scheduled using "Lync Meeting" in Outlook about a week before the call. Call recorded on a laptop that was plugged into our LAN with an Ethernet
    cable.
    Problem: A user scheduled a conference call with screen sharing (PowerPoint) and recorded the call. The audio and video are not synchronized in the recording. There is a delay of several minutes, with the audio lagging behind the PowerPoint. The last slide
    on the call was presented, the call ended, and the presenter ended the recording (I was present for all this and can confirm there was no user error in these steps); however, in the recording, the slides change before the speaker talks about them, and the last
    couple of minutes are cut off.
    Any ideas on why this happened? Web research has failed me. Thank you.

    You may need to lower the recording quality as your PC might not be able to keep up the pace.
    Go to Options in Lync client -> Recording -> Select 480p
    Then test again
    - Belgian Unified Communications Community : http://www.pro-lync.be - MCM/MVP/MCT

  • iMovie will not import or play the clips at the proper speed, and the audio and video do not match up

    I have a Samsung HD camcorder. It records at 40 fps, but iMovie will not play the clips at the proper speed, and the audio and video do not match up. I have tried recording at standard quality, to no avail. Help. Thanks in advance.

    Samsung SC-HMX10C, NTSC.
    I'm guessing that is the speed it records at, due to the fact that iMovie puts a blue 40 at the top-left corner of every clip I import.

  • I need easy audio and video converting and editing software?

    I need easy audio and video converting and editing software that won't cost $400.

    I highly recommend you try Video Monkey. It's free, though donations are certainly accepted, and does an excellent job of converting .mov to .wmv. Choose Best (2 Pass) for Encoding Speed and depending upon the size of your original file, uncheck 320 pixels wide to create a larger converted image. I set Quality to "Go Nuts".
    Another good commercial program is Video Converter Ultimate for Mac. It works as a video converter and editor. The version with 2-pass conversion is approximately $69, which is considerably less than the $400+ that Adobe Premiere is asking for its editor. I'd still prefer a freebie, though, but there's little choice...

  • Audio and video is out of sync!

    When I record audio and video together with the built-in iSight camera on my MacBook Pro, with either QuickTime Pro, Photo Booth or iMovie, the audio and video are out of sync when I play the clip back - like watching a kung fu movie. I've tried everything. I took my machine in to see a "Genius" at the Mac store, and he told me that the iSight is really made for video conferencing and is not really supposed to be fully in sync? Uh, that's weird - so Mac makes a product, puts it on all their laptops, and it only sort of works?? I've been on the phone with AppleCare and did a bunch of things like Disk Utility repairs on permissions, then used the startup disk to do another permissions repair. I've uninstalled QuickTime Pro and re-installed it a couple of times. If anyone could give me some advice, that would be great! I'm a little frustrated, as you can probably see. Thanks
    cheers
    m

    Welcome to Apple Discussions, m
    See if any of the suggestions offered here reduce your sync quality problem enough:
      http://discussions.apple.com/thread.jspa?messageID=7089133&#7089133
    If, after making certain that your Mac is running well, has plenty of free space on the correct drive, etc., your sync is still not close enough, you can manually re-sync any parts that are unacceptable to you using either QT Pro or iMovie.
    If you check carefully, you will see that not even pro video is always perfectly synced. It is a lot of work to get things "nearly perfect," but, with a bit of practice, it doesn't take long to get things "close enough" for me.
    I use the same simple process with either QT Pro or iMovie, so you can use whichever you find easiest. Whichever you use, keep the original AV clip safe and *_work on a copy_* in case something goes wrong.
    If you do not already know how, here is my process, but you can adjust for your workflow:
    (1) Make a copy of the audio track.
    (2) Add the audio track copy to the AV file.
    (3) Silence the original audio and work on audio copy.
    (4) Insert small blank audio clips (or split and slide if using iMovie) if needed to make the audio later.
    (5) Cut small bits out of quiet parts (or split and shorten in iMovie) if needed to make audio sooner.
    (6) Repeat until satisfied and delete any tracks no longer needed.
    (7) Save the adjusted AV file.
    (8) Enjoy!
    EZ Jim
    G5 DP 1.8GHz w/Mac OS X (10.5.7) PowerBook 1.67GHz (10.4.11)   iBookSE 366MHz (10.3.9)  External iSight
