Add audio to video
First off, I am trying to add a secondary audio track to an MP4 file and was wondering what the best way to go about that is. Secondly, I want to add black to the beginning and end of the movie and was wondering what would do that best. The reason I need these two things is that a few friends and I recorded our own in-movie commentary and would like to add it to the movie. Our audio runs a little longer than the movie does, so I would like to add the black as filler.
AeroQ wrote:
If audio and video don't have exactly the same length and you want to stretch audio to match video length, then the command is: Edit > Add to Selection & Scale
(Having selected the video segment previously)
Regards, BJ
Is there a way to silence the movie's own audio and have the "extra" audio that is Added to Selection & Scaled be the only audio?
I tried doing just what you described and there is a bit of an echo; although pretty slight, it's enough to be annoying. I thought I had gotten the start and end points for both the movie and the "extra" audio just about lined up in GarageBand.
Any help would be greatly appreciated!
Deborah
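For the black-filler part of the question, the amount of padding can be worked out directly from the two durations. A minimal sketch of the arithmetic (the durations and lead-in length here are hypothetical examples, not taken from Deborah's actual files):

```java
// Work out how much black filler is needed when the commentary
// audio runs longer than the movie (all durations in seconds).
public class BlackPadding {
    static long[] padding(long videoSec, long audioSec, long leadInSec) {
        long extra = Math.max(0, audioSec - videoSec); // total filler needed
        long head = Math.min(leadInSec, extra);        // black before the movie
        long tail = extra - head;                      // remaining black after it
        return new long[] { head, tail };
    }

    public static void main(String[] args) {
        // Hypothetical example: a 92-minute movie, 95 minutes of commentary,
        // 60 seconds of which was recorded before the movie started.
        long[] p = padding(92 * 60, 95 * 60, 60);
        System.out.println("head=" + p[0] + "s tail=" + p[1] + "s");
    }
}
```

With the head and tail lengths known, two black clips of those durations can be placed before and after the movie, so the commentary can be added at its natural length instead of being scaled (which is one way to avoid the echo-inducing drift).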
Similar Messages
-
Can't add audio to video file in QT 7.1.5 on WinXP.
Quicktime Pro 7.1.5 on a PC running WinXP.
I have created a video file (.mov) and now wish to add an audio track. I am following the instructions in Help, but nothing seems to happen: I click the Edit > Add To Movie menu item and nothing happens; in fact the menu doesn't even disappear, as if the button is locked.
Any hints?
TIA.
Open your audio-only movie in QT Pro.
Select all and copy.
Switch to your video only file, position the playhead at the beginning and "add".
If your files have different durations you'll need to reverse the copy (copy the video instead), select all of the audio track, and use "Add Scaled". -
How to add audio or video in KeyNote?
Can music from an iPad be added to a Keynote presentation?
Hi there,
You may find the information at the website below helpful.
Keynote Help for iPad - Add video and audio
http://help.apple.com/keynote/ipad/2.2/#/tan63d61519a
-Griff W. -
Unable to see video & audio tracks as shown in "Create a sequence or timeline and add audio"
hi,
I'm on the https://helpx.adobe.com/creative-cloud/learn/start/premiere.html page trying to work through the "Create a sequence or timeline and add audio" tutorial,
and cannot see the Video & Audio tracks, as shown in the bottom-right sub-window.
I'm using a trial version of Premiere Pro CC downloaded yesterday, on my Mac mini running 10.9.2.
Does anyone know how to fix this problem?
thanks
david
I am able to hear the audio.
Message was edited by: spottedsilvertabby
Hi,
Error 7 is when a router or the connection to the router is broken in some way.
Some devices have features such as Denial Of Service protection (DoS) that cut a particular Internet Port when it thinks too much data is coming (Presuming it is an attack.)
iChat 5 in Leopard is not capped by the System Preferences > Quicktime Streaming speed (which we used to suggest was set at 1.5Mbps)
It now sees your whole Connection speed and your Upload may be much faster than this.
Your Download is likely to be much faster. However iChat will tend to operate at the lower figure of your Upload.
DoS features are threshold based.
You may now be bumping in to this Threshold where you were not before.
SPI (Stateful Packet Inspection) does a different job but has the same effect when it is overloaded by the speed of the data.
If you have either of these features, turn them off (disable them).
7:25 PM Friday; October 2, 2009 -
Add audio from 25 fps video to 23.976 fps
Hello,
I want to add the audio track from a 25 fps video to another video at 23.976 fps, using Audition. How can I do that?
25 fps video settings:
General
Complete name : 01.avi
Format : AVI
Format/Info : Audio Video Interleave
File size : 170 MiB
Duration : 20mn 28s
Overall bit rate : 1 161 Kbps
Writing application : VirtualDubMod 1.5.10.2 (build 2540/release)
Writing library : VirtualDub build 24415/release
Video
ID : 0
Format : MPEG-4 Visual
Format settings, BVOP : 1
Format settings, QPel : No
Format settings, GMC : No warppoints
Format settings, Matrix : Default (H.263)
Muxing mode : Packed bitstream
Codec ID : DX50
Codec ID/Hint : DivX 5
Duration : 20mn 28s
Bit rate : 1 039 Kbps
Width : 640 pixels
Height : 480 pixels
Display aspect ratio : 4:3
Frame rate : 25.000 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Compression mode : Lossy
Bits/(Pixel*Frame) : 0.135
Stream size : 152 MiB (90%)
Writing library : DivX 6.7.0 (UTC 2007-09-20)
Audio
ID : 1
Format : MPEG Audio
Format version : Version 1
Format profile : Layer 3
Mode : Joint stereo
Mode extension : MS Stereo
Codec ID : 55
Codec ID/Hint : MP3
Duration : 20mn 27s
Bit rate mode : Constant
Bit rate : 112 Kbps
Channel(s) : 2 channels
Sampling rate : 44.1 KHz
Compression mode : Lossy
Delay relative to video : 22ms
Stream size : 16.4 MiB (10%)
Alignment : Split accross interleaves
Interleave, duration : 40 ms (1.00 video frame)
Interleave, preload duration : 500 ms
23.976 fps video settings:
General
Complete name : new_01.avi
Format : AVI
Format/Info : Audio Video Interleave
File size : 557 MiB
Duration : 24mn 7s
Overall bit rate : 3 228 Kbps
Writing library : VirtualDub build 32842/release
Video
ID : 0
Format : MPEG-4 Visual
Format profile : Advanced Simple@L5
Format settings, BVOP : 2
Format settings, QPel : No
Format settings, GMC : No warppoints
Format settings, Matrix : Default (H.263)
Muxing mode : Packed bitstream
Codec ID : XVID
Codec ID/Hint : XviD
Duration : 24mn 7s
Bit rate : 1 807 Kbps
Width : 1 024 pixels
Height : 768 pixels
Display aspect ratio : 4:3
Frame rate : 23.976 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Compression mode : Lossy
Bits/(Pixel*Frame) : 0.096
Stream size : 312 MiB (56%)
Writing library : XviD 64
Audio
ID : 1
Format : PCM
Format settings, Endianness : Little
Format settings, Sign : Signed
Codec ID : 1
Duration : 24mn 7s
Bit rate mode : Constant
Bit rate : 1 411.2 Kbps
Channel(s) : 2 channels
Sampling rate : 44.1 KHz
Bit depth : 16 bits
Stream size : 244 MiB (44%)
Alignment : Aligned on interleaves
Interleave, duration : 42 ms (1.00 video frame)
Interleave, preload duration : 500 ms
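If the 23.976 fps file is simply a slowed-down copy of the 25 fps one (the usual PAL-to-film-rate relationship, which is an assumption here), the audio has to be stretched by the same ratio before it will stay in sync. A rough sketch of the arithmetic, using the durations from the MediaInfo dumps above:

```java
// Stretch factor when 25 fps material is slowed to 23.976 fps.
public class StretchRatio {
    public static void main(String[] args) {
        double ratio = 25.0 / 23.976;       // ≈ 1.0427: everything runs ~4.3% longer
        int srcAudioSec = 20 * 60 + 27;     // 20mn 27s MP3 from the 25 fps AVI
        double stretchedSec = srcAudioSec * ratio;
        System.out.printf("ratio=%.4f stretched=%.0f s%n", ratio, stretchedSec);
    }
}
```

The stretched track comes out at about 1279 s (roughly 21mn 19s), while the 23.976 fps file runs 24mn 7s, so in this case the two files are clearly different cuts: rate conversion alone will not line them up without additional editing.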
Best Regards,
Thanks SteveG,
OK, what about adding the same audio from the 25 fps video to a 29.97 fps video with the settings below:
General
Complete name : 01.wmv
Format : Windows Media
File size : 259 MiB
Duration : 24mn 8s
Overall bit rate mode : Constant
Overall bit rate : 1 501 Kbps
Maximum Overall bit rate : 1 507 Kbps
Encoded date : UTC 2007-06-25 14:54:25.750
Writing application : TMPGEnc 4.0 XPress Version. 4.0.3.169
Video
ID : 2
Format : VC-1
Format profile : MP@ML
Codec ID : WMV3
Codec ID/Info : Windows Media Video 9
Codec ID/Hint : WMV3
Description of the codec : Windows Media Video 9 - Professional
Duration : 24mn 8s
Bit rate mode : Constant
Bit rate : 1 300 Kbps
Width : 640 pixels
Height : 480 pixels
Display aspect ratio : 4:3
Frame rate : 29.970 fps
Bit depth : 8 bits
Scan type : Progressive
Compression mode : Lossy
Bits/(Pixel*Frame) : 0.141
Stream size : 224 MiB (87%)
Language : Japanese
Audio
ID : 1
Format : WMA
Format version : Version 2
Codec ID : 161
Codec ID/Info : Windows Media Audio
Description of the codec : Windows Media Audio 9.1 - 192 kbps, 48 kHz, stereo 1-pass CBR
Duration : 24mn 8s
Bit rate mode : Constant
Bit rate : 192 Kbps
Channel(s) : 2 channels
Sampling rate : 48.0 KHz
Bit depth : 16 bits
Stream size : 33.1 MiB (13%)
Language : Japanese -
Love the Adobe products, first off. OK, I have my instructional coaches creating PowerPoint presentations, adding audio to each slide with the Presenter 10 add-in. I need these presentations converted into MP4, like you can do with Adobe Presenter Video Creator.
If you import the PPT deck into Captivate it will not bring in the audio from Presenter. But you can import the audio into Captivate after the slide deck has been imported. The audio files aren't necessarily named in a logical fashion in the source files for the presentation, so I would recommend that you publish the presentation locally and then find the 'data' folder in the published output; there the audio files will be named in a way that lets you infer which slide they are associated with. Not the simplest procedure, but it should be pretty painless.
-
Do I need some extra hardware interface for receiving both audio and video?
Hi, I'm doing an e-learning project. I have to capture video from a webcam and voice from a headset and send them to a client,
but my code only works for one of the two at a time.
Do I need some extra hardware interface for receiving both audio and video? I'm using the AVTransmit and AVReceive code found on this site.
After running Tx,
if I give dsound:// & vfw://0 in the Media Locator, only sound is received and no video,
and when I give vfw://0 in the Media Locator, only live video is transmitted.
I'm using JMF 1.1.2e.
If anyone knows how to make this run, or the cause of the problem, please reply soon; I will be very thankful.
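One thing worth knowing when reading the transmitter code below: it opens one RTP session per media track and assigns ports as basePort + 2*i, with the odd port of each pair conventionally left for RTCP, so a receiver has to listen on the matching pair for each track. A stand-alone sketch of that port layout (the base port and track names are just examples):

```java
// Port layout mirroring the AVTransmitter code below:
// track i sends RTP on basePort + 2*i; the next (odd) port
// is conventionally reserved for RTCP.
public class PortPlan {
    static int rtpPort(int basePort, int track)  { return basePort + 2 * track; }
    static int rtcpPort(int basePort, int track) { return rtpPort(basePort, track) + 1; }

    public static void main(String[] args) {
        int base = 22222;                       // example base port
        String[] tracks = { "video", "audio" }; // e.g. one session per track
        for (int i = 0; i < tracks.length; i++) {
            System.out.println(tracks[i] + ": RTP " + rtpPort(base, i)
                    + ", RTCP " + rtcpPort(base, i));
        }
    }
}
```

This is why transmitting both sources requires the receiver to join two sessions (e.g. 22222/22223 and 22224/22225 in this example), not just one.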
Transmitter/server-side code. First run Tx on the server:
import java.io.*;
import java.awt.*;
import java.awt.event.*;
import java.net.*;
import java.util.*;
import javax.media.rtp.*;
import javax.swing.*;
import javax.swing.event.*;
import javax.swing.border.*;
public class Tx extends JFrame implements ActionListener, KeyListener,
MouseListener, WindowListener {
Vector targets;
JList list;
JButton startXmit;
JButton rtcp;
JButton update;
JButton expiration;
JButton statistics;
JButton addTarget;
JButton removeTarget;
JTextField tf_remote_address;
JTextField tf_remote_data_port;
JTextField tf_media_file;
JTextField tf_data_port;
TargetListModel listModel;
AVTransmitter avTransmitter;
RTCPViewer rtcpViewer;
JCheckBox cb_loop;
Config config;
public Tx() {
setTitle( "JMF/RTP Transmitter");
config= new Config();
GridBagLayout gridBagLayout= new GridBagLayout();
GridBagConstraints gbc;
JPanel p= new JPanel();
p.setLayout( gridBagLayout);
JPanel localPanel= createLocalPanel();
gbc= new GridBagConstraints();
gbc.gridx= 0;
gbc.gridy= 0;
gbc.gridwidth= 2;
gbc.anchor= GridBagConstraints.CENTER;
gbc.fill= GridBagConstraints.BOTH;
gbc.insets= new Insets( 10, 5, 0, 0);
((GridBagLayout)p.getLayout()).setConstraints( localPanel, gbc);
p.add( localPanel);
JPanel targetPanel= createTargetPanel();
gbc= new GridBagConstraints();
gbc.gridx= 1;
gbc.gridy= 1;
gbc.weightx= 1.0;
gbc.weighty= 1.0;
gbc.anchor= GridBagConstraints.CENTER;
gbc.fill= GridBagConstraints.BOTH;
gbc.insets= new Insets( 10, 5, 0, 0);
((GridBagLayout)p.getLayout()).setConstraints( targetPanel, gbc);
p.add( targetPanel);
JPanel mediaPanel= createMediaPanel();
gbc= new GridBagConstraints();
gbc.gridx= 1;
gbc.gridy= 2;
gbc.weightx= 1.0;
gbc.weighty= 1.0;
gbc.anchor= GridBagConstraints.CENTER;
gbc.fill= GridBagConstraints.BOTH;
gbc.insets= new Insets( 10, 5, 0, 0);
((GridBagLayout)p.getLayout()).setConstraints( mediaPanel, gbc);
p.add( mediaPanel);
JPanel buttonPanel= new JPanel();
rtcp= new JButton( "RTCP Monitor");
update= new JButton( "Transmission Status");
update.setEnabled( false);
rtcp.addActionListener( this);
update.addActionListener( this);
buttonPanel.add( rtcp);
buttonPanel.add( update);
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 3;
gbc.gridwidth= 2;
gbc.weightx = 1.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.CENTER;
gbc.fill = GridBagConstraints.HORIZONTAL;
gbc.insets = new Insets( 5,5,10,5);
((GridBagLayout)p.getLayout()).setConstraints( buttonPanel, gbc);
p.add( buttonPanel);
getContentPane().add( p);
list.addMouseListener( this);
addWindowListener( this);
pack();
setVisible( true);
private JPanel createMediaPanel() {
JPanel p= new JPanel();
GridBagLayout gridBagLayout= new GridBagLayout();
GridBagConstraints gbc;
p.setLayout( gridBagLayout);
JLabel label= new JLabel( "Media Locator:");
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 0;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.EAST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,10,5);
((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
p.add( label);
tf_media_file= new JTextField( 35);
gbc= new GridBagConstraints();
gbc.gridx = 1;
gbc.gridy = 0;
gbc.weightx = 1.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.WEST;
gbc.fill = GridBagConstraints.HORIZONTAL;
gbc.insets = new Insets( 5,5,10,5);
((GridBagLayout)p.getLayout()).setConstraints( tf_media_file, gbc);
p.add( tf_media_file);
tf_media_file.setText( config.media_locator);
cb_loop= new JCheckBox( "loop");
startXmit= new JButton( "Start Transmission");
startXmit.setEnabled( true);
startXmit.addActionListener( this);
gbc= new GridBagConstraints();
gbc.gridx = 2;
gbc.gridy = 0;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.WEST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,10,5);
((GridBagLayout)p.getLayout()).setConstraints( cb_loop, gbc);
p.add( cb_loop);
cb_loop.setSelected( true);
cb_loop.addActionListener( this);
gbc= new GridBagConstraints();
gbc.gridx = 1;
gbc.gridy = 1;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.CENTER;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,10,5);
((GridBagLayout)p.getLayout()).setConstraints( startXmit, gbc);
p.add( startXmit);
TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Source");
p.setBorder( titledBorder);
return p;
private JPanel createTargetPanel() {
JPanel p= new JPanel();
GridBagLayout gridBagLayout= new GridBagLayout();
GridBagConstraints gbc;
p.setLayout( gridBagLayout);
targets= new Vector();
for( int i= 0; i < config.targets.size(); i++) {
targets.addElement( config.targets.elementAt( i));
listModel= new TargetListModel( targets);
list= new JList( listModel);
list.addKeyListener( this);
list.setPrototypeCellValue( "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx");
JScrollPane scrollPane= new JScrollPane( list,
ScrollPaneConstants.VERTICAL_SCROLLBAR_AS_NEEDED,
ScrollPaneConstants.HORIZONTAL_SCROLLBAR_NEVER);
gbc= new GridBagConstraints();
gbc.gridx= 0;
gbc.gridy= 0;
gbc.weightx= 1.0;
gbc.weighty= 1.0;
gbc.anchor= GridBagConstraints.CENTER;
gbc.fill= GridBagConstraints.BOTH;
gbc.insets= new Insets( 10, 5, 0, 0);
((GridBagLayout)p.getLayout()).setConstraints( scrollPane, gbc);
p.add( scrollPane);
JPanel p1= new JPanel();
p1.setLayout( gridBagLayout);
JLabel label= new JLabel( "IP Address:");
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 0;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.EAST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
p1.add( label);
tf_remote_address= new JTextField( 15);
gbc= new GridBagConstraints();
gbc.gridx = 1;
gbc.gridy = 0;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.WEST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_address, gbc);
p1.add( tf_remote_address);
label= new JLabel( "Data Port:");
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 1;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.EAST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p1.getLayout()).setConstraints( label, gbc);
p1.add( label);
tf_remote_data_port= new JTextField( 15);
gbc= new GridBagConstraints();
gbc.gridx = 1;
gbc.gridy = 1;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.WEST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p1.getLayout()).setConstraints( tf_remote_data_port, gbc);
p1.add( tf_remote_data_port);
JPanel p2= new JPanel();
addTarget= new JButton( "Add Target");
removeTarget= new JButton( "Remove Target");
p2.add( addTarget);
p2.add( removeTarget);
addTarget.addActionListener( this);
removeTarget.addActionListener( this);
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 2;
gbc.weightx = 1.0;
gbc.weighty = 0.0;
gbc.gridwidth= 2;
gbc.anchor = GridBagConstraints.CENTER;
gbc.fill = GridBagConstraints.HORIZONTAL;
gbc.insets = new Insets( 20,5,0,5);
((GridBagLayout)p1.getLayout()).setConstraints( p2, gbc);
p1.add( p2);
gbc= new GridBagConstraints();
gbc.gridx= 1;
gbc.gridy= 0;
gbc.weightx= 1.0;
gbc.weighty= 1.0;
gbc.anchor= GridBagConstraints.CENTER;
gbc.fill= GridBagConstraints.BOTH;
gbc.insets= new Insets( 10, 5, 0, 0);
((GridBagLayout)p.getLayout()).setConstraints( p1, gbc);
p.add( p1);
TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Targets");
p.setBorder( titledBorder);
return p;
private JPanel createLocalPanel() {
JPanel p= new JPanel();
GridBagLayout gridBagLayout= new GridBagLayout();
GridBagConstraints gbc;
p.setLayout( gridBagLayout);
JLabel label= new JLabel( "IP Address:");
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 0;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.EAST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
p.add( label);
JTextField tf_local_host= new JTextField( 15);
gbc= new GridBagConstraints();
gbc.gridx = 1;
gbc.gridy = 0;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.WEST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p.getLayout()).setConstraints( tf_local_host, gbc);
p.add( tf_local_host);
try {
String host= InetAddress.getLocalHost().getHostAddress();
tf_local_host.setText( host);
} catch( UnknownHostException e) {
label= new JLabel( "Data Port:");
gbc= new GridBagConstraints();
gbc.gridx = 0;
gbc.gridy = 1;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.EAST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,0,5);
((GridBagLayout)p.getLayout()).setConstraints( label, gbc);
p.add( label);
tf_data_port= new JTextField( 15);
gbc= new GridBagConstraints();
gbc.gridx = 1;
gbc.gridy = 1;
gbc.weightx = 0.0;
gbc.weighty = 0.0;
gbc.anchor = GridBagConstraints.WEST;
gbc.fill = GridBagConstraints.NONE;
gbc.insets = new Insets( 5,5,10,5);
((GridBagLayout)p.getLayout()).setConstraints( tf_data_port, gbc);
p.add( tf_data_port);
tf_data_port.setText( config.local_data_port);
TitledBorder titledBorder= new TitledBorder( new EtchedBorder(), "Local Host");
p.setBorder( titledBorder);
return p;
public void actionPerformed( ActionEvent event) {
Object source= event.getSource();
if( source == addTarget) {
String ip= tf_remote_address.getText().trim();
String port= tf_remote_data_port.getText().trim();
String localPort= tf_data_port.getText().trim();
addTargetToList( localPort, ip, port);
if( avTransmitter != null) {
avTransmitter.addTarget( ip, port);
} else if( source == removeTarget) {
int index= list.getSelectedIndex();
if( index != -1) {
Target target= (Target) targets.elementAt( index);
if( avTransmitter != null) {
avTransmitter.removeTarget( target.ip, target.port);
targets.removeElement( target);
listModel.setData( targets);
} else if( source == startXmit) {
if( startXmit.getLabel().equals( "Start Transmission")) {
int data_port= new Integer( tf_data_port.getText()).intValue();
avTransmitter= new AVTransmitter( this, data_port);
avTransmitter.start( tf_media_file.getText().trim(), targets);
avTransmitter.setLooping( cb_loop.isSelected());
startXmit.setLabel( "Stop Transmission");
} else if( startXmit.getLabel().equals( "Stop Transmission")) {
avTransmitter.stop();
avTransmitter= null;
removeNonBaseTargets();
listModel.setData( targets);
startXmit.setLabel( "Start Transmission");
} else if( source == rtcp) {
if( rtcpViewer == null) {
rtcpViewer= new RTCPViewer();
} else {
rtcpViewer.setVisible( true);
rtcpViewer.toFront();
} else if( source == cb_loop) {
if( avTransmitter != null) {
avTransmitter.setLooping( cb_loop.isSelected());
private void removeNonBaseTargets() {
String localPort= tf_data_port.getText().trim();
for( int i= targets.size(); i > 0;) {
Target target= (Target) targets.elementAt( i - 1);
if( !target.localPort.equals( localPort)) {
targets.removeElement( target);
i--;
public void addTargetToList( String localPort,
String ip, String port) {
ListUpdater listUpdater= new ListUpdater( localPort, ip,
port, listModel, targets);
SwingUtilities.invokeLater( listUpdater);
public void rtcpReport( String report) {
if( rtcpViewer != null) {
rtcpViewer.report( report);
public void windowClosing( WindowEvent event) {
config.local_data_port= tf_data_port.getText().trim();
config.targets= new Vector();
for( int i= 0; i < targets.size(); i++) {
Target target= (Target) targets.elementAt( i);
if( target.localPort.equals( config.local_data_port)) {
config.addTarget( target.ip, target.port);
config.media_locator= tf_media_file.getText().trim();
config.write();
System.exit( 0);
public void windowClosed( WindowEvent event) {
public void windowDeiconified( WindowEvent event) {
public void windowIconified( WindowEvent event) {
public void windowActivated( WindowEvent event) {
public void windowDeactivated( WindowEvent event) {
public void windowOpened( WindowEvent event) {
public void keyPressed( KeyEvent event) {
public void keyReleased( KeyEvent event) {
Object source= event.getSource();
if( source == list) {
int index= list.getSelectedIndex();
public void keyTyped( KeyEvent event) {
public void mousePressed( MouseEvent e) {
public void mouseReleased( MouseEvent e) {
public void mouseEntered( MouseEvent e) {
public void mouseExited( MouseEvent e) {
public void mouseClicked( MouseEvent e) {
Object source= e.getSource();
if( source == list) {
int index= list.getSelectedIndex();
if( index != -1) {
Target target= (Target) targets.elementAt( index);
tf_remote_address.setText( target.ip);
tf_remote_data_port.setText( target.port);
int index= list.locationToIndex( e.getPoint());
public static void main( String[] args) {
new Tx();
class TargetListModel extends AbstractListModel {
private Vector options;
public TargetListModel( Vector options) {
this.options= options;
public int getSize() {
int size;
if( options == null) {
size= 0;
} else {
size= options.size();
return size;
public Object getElementAt( int index) {
String name;
if( index < getSize()) {
Target o= (Target)options.elementAt( index);
name= o.localPort + " ---> " + o.ip + ":" + o.port;
} else {
name= null;
return name;
public void setData( Vector data) {
options= data;
fireContentsChanged( this, 0, data.size());
class ListUpdater implements Runnable {
String localPort, ip, port;
TargetListModel listModel;
Vector targets;
public ListUpdater( String localPort, String ip, String port,
TargetListModel listModel, Vector targets) {
this.localPort= localPort;
this.ip= ip;
this.port= port;
this.listModel= listModel;
this.targets= targets;
public void run() {
Target target= new Target( localPort, ip, port);
if( !targetExists( localPort, ip, port)) {
targets.addElement( target);
listModel.setData( targets);
public boolean targetExists( String localPort, String ip, String port) {
boolean exists= false;
for( int i= 0; i < targets.size(); i++) {
Target target= (Target) targets.elementAt( i);
if( target.localPort.equals( localPort)
&& target.ip.equals( ip)
&& target.port.equals( port)) {
exists= true;
break;
return exists;
>>>>>>>>>>>>>>>>>
import java.awt.*;
import java.io.*;
import java.net.InetAddress;
import java.util.*;
import javax.media.*;
import javax.media.protocol.*;
import javax.media.format.*;
import javax.media.control.TrackControl;
import javax.media.control.QualityControl;
import javax.media.rtp.*;
import javax.media.rtp.event.*;
import javax.media.rtp.rtcp.*;
public class AVTransmitter implements ReceiveStreamListener, RemoteListener,
ControllerListener {
// Input MediaLocator
// Can be a file or http or capture source
private MediaLocator locator;
private String ipAddress;
private int portBase;
private Processor processor = null;
private RTPManager rtpMgrs[];
private int localPorts[];
private DataSource dataOutput = null;
private int local_data_port;
private Tx tx;
public AVTransmitter( Tx tx, int data_port) {
this.tx= tx;
local_data_port= data_port;
* Starts the transmission. Returns null if transmission started ok.
* Otherwise it returns a string with the reason why the setup failed.
public synchronized String start( String filename, Vector targets) {
String result;
locator= new MediaLocator( filename);
// Create a processor for the specified media locator
// and program it to output JPEG/RTP
result = createProcessor();
if (result != null) {
return result;
// Create an RTP session to transmit the output of the
// processor to the specified IP address and port no.
result = createTransmitter( targets);
if (result != null) {
processor.close();
processor = null;
return result;
// Start the transmission
processor.start();
return null;
* Use the RTPManager API to create sessions for each media
* track of the processor.
private String createTransmitter( Vector targets) {
// Cheated. Should have checked the type.
PushBufferDataSource pbds = (PushBufferDataSource)dataOutput;
PushBufferStream pbss[] = pbds.getStreams();
rtpMgrs = new RTPManager[pbss.length];
localPorts = new int[ pbss.length];
SessionAddress localAddr, destAddr;
InetAddress ipAddr;
SendStream sendStream;
int port;
SourceDescription srcDesList[];
for (int i = 0; i < pbss.length; i++) {
// for (int i = 0; i < 1; i++) {
try {
rtpMgrs[i] = RTPManager.newInstance();
port = local_data_port + 2*i;
localPorts[ i]= port;
localAddr = new SessionAddress( InetAddress.getLocalHost(),
port);
rtpMgrs[i].initialize( localAddr);
rtpMgrs[i].addReceiveStreamListener(this);
rtpMgrs[i].addRemoteListener(this);
for( int k= 0; k < targets.size(); k++) {
Target target= (Target) targets.elementAt( k);
int targetPort= new Integer( target.port).intValue();
addTarget( localPorts[ i], rtpMgrs[ i], target.ip, targetPort + 2*i);
sendStream = rtpMgrs[i].createSendStream(dataOutput, i);
sendStream.start();
} catch (Exception e) {
e.printStackTrace();
return e.getMessage();
return null;
public void addTarget( String ip, String port) {
for (int i= 0; i < rtpMgrs.length; i++) {
int targetPort= new Integer( port).intValue();
addTarget( localPorts[ i], rtpMgrs[ i], ip, targetPort + 2*i);
public void addTarget( int localPort, RTPManager mgr, String ip, int port) {
try {
SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
new Integer( port).intValue());
mgr.addTarget( addr);
tx.addTargetToList( localPort + "", ip, port + "");
} catch( Exception e) {
e.printStackTrace();
public void removeTarget( String ip, String port) {
try {
SessionAddress addr= new SessionAddress( InetAddress.getByName( ip),
new Integer( port).intValue());
for (int i= 0; i < rtpMgrs.length; i++) {
rtpMgrs[ i].removeTarget( addr, "target removed from transmitter.");
} catch( Exception e) {
e.printStackTrace();
boolean looping= true;
public void controllerUpdate( ControllerEvent ce) {
System.out.println( ce);
if( ce instanceof DurationUpdateEvent) {
Time duration= ((DurationUpdateEvent) ce).getDuration();
System.out.println( "duration: " + duration.getSeconds());
} else if( ce instanceof EndOfMediaEvent) {
System.out.println( "END OF MEDIA - looping=" + looping);
if( looping) {
processor.setMediaTime( new Time( 0));
processor.start();
public void setLooping( boolean flag) {
looping= flag;
public void update( ReceiveStreamEvent event) {
String timestamp= getTimestamp();
StringBuffer sb= new StringBuffer();
if( event instanceof InactiveReceiveStreamEvent) {
sb.append( timestamp + " Inactive Receive Stream");
} else if( event instanceof ByeEvent) {
sb.append( timestamp + " Bye");
} else {
System.out.println( "ReceiveStreamEvent: "+ event);
tx.rtcpReport( sb.toString());
public void update( RemoteEvent event) {
String timestamp= getTimestamp();
if( event instanceof ReceiverReportEvent) {
ReceiverReport rr= ((ReceiverReportEvent) event).getReport();
StringBuffer sb= new StringBuffer();
sb.append( timestamp + " RR");
if( rr != null) {
Participant participant= rr.getParticipant();
if( participant != null) {
sb.append( " from " + participant.getCNAME());
sb.append( " ssrc=" + rr.getSSRC());
} else {
sb.append( " ssrc=" + rr.getSSRC());
tx.rtcpReport( sb.toString());
} else {
System.out.println( "RemoteEvent: " + event);
private String getTimestamp() {
String timestamp;
Calendar calendar= Calendar.getInstance();
int hour= calendar.get( Calendar.HOUR_OF_DAY);
String hourStr= formatTime( hour);
int minute= calendar.get( Calendar.MINUTE);
String minuteStr= formatTime( minute);
int second= calendar.get( Calendar.SECOND);
String secondStr= formatTime( second);
timestamp= hourStr + ":" + minuteStr + ":" + secondStr;
return timestamp;
private String formatTime( int time) {
String timeStr;
if( time < 10) {
timeStr= "0" + time;
} else {
timeStr= "" + time;
return timeStr;
* Stops the transmission if already started
public void stop() {
synchronized (this) {
if (processor != null) {
processor.stop();
processor.close();
processor = null;
for (int i= 0; i < rtpMgrs.length; i++) {
rtpMgrs[ i].removeTargets( "Session ended.");
rtpMgrs[ i].dispose();
public String createProcessor() {
if (locator == null) {
return "Locator is null";
DataSource ds;
DataSource clone;
try {
ds = javax.media.Manager.createDataSource(locator);
} catch (Exception e) {
return "Couldn't create DataSource";
// Try to create a processor to handle the input media locator
try {
processor = javax.media.Manager.createProcessor(ds);
processor.addControllerListener( this);
} catch (NoProcessorException npe) {
return "Couldn't create processor";
} catch (IOException ioe) {
return "IOException creating processor";
// Wait for it to configure
boolean result = waitForState(processor, Processor.Configured);
if (result == false)
return "Couldn't configure processor";
// Get the tracks from the processor
TrackControl [] tracks = processor.getTrackControls();
// Do we have at least one track?
if (tracks == null || tracks.length < 1)
return "Couldn't find tracks in processor";
// Set the output content descriptor to RAW_RTP
// This will limit the supported formats reported from
// Track.getSupportedFormats to only valid RTP formats.
ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
processor.setContentDescriptor(cd);
Format supported[];
Format chosen;
boolean atLeastOneTrack = false;
// Program the tracks.
for (int i = 0; i < tracks.length; i++) {
Format format = tracks[i].getFormat();
if (tracks[i].isEnabled()) {
supported = tracks[i].getSupportedFormats();
// We've set the output content to the RAW_RTP.
// So all the supported formats should work with RTP.
// We'll just pick the first one.
if (supported.length > 0) {
if (supported[0] instanceof VideoFormat) {
// For video formats, we should double check the
// sizes since not all formats work in all sizes.
chosen = checkForVideoSizes(tracks[i].getFormat(),
supported[0]);
} else
chosen = supported[0];
tracks[i].setFormat(chosen);
System.err.println("Track " + i + " is set to transmit as:");
System.err.println(" " + chosen);
atLeastOneTrack = true;
} else
tracks[i].setEnabled(false);
} else
tracks[i].setEnabled(false);
if (!atLeastOneTrack)
return "Couldn't set any of the tracks to a valid RTP format";
// Realize the processor. This will internally create a flow
// graph and attempt to create an output datasource for JPEG/RTP
// audio frames.
result = waitForState(processor, Controller.Realized);
if (result == false)
return "Couldn't realize processor";
// Set the JPEG quality to .5.
setJPEGQuality(processor, 0.5f);
// Get the output data source of the processor
dataOutput = processor.getDataOutput();
return null;
static SessionAddress destAddr1, destAddr2;
/**
 * For JPEG and H263, we know that they only work for particular
 * sizes. So we'll perform extra checking here to make sure they
 * are of the right sizes.
 */
Format checkForVideoSizes(Format original, Format supported) {

    int width, height;
    Dimension size = ((VideoFormat)original).getSize();
    Format jpegFmt = new Format(VideoFormat.JPEG_RTP);
    Format h263Fmt = new Format(VideoFormat.H263_RTP);

    if (supported.matches(jpegFmt)) {
        // For JPEG, make sure width and height are divisible by 8.
        width = (size.width % 8 == 0 ? size.width :
                 (int)(size.width / 8) * 8);
        height = (size.height % 8 == 0 ? size.height :
                  (int)(size.height / 8) * 8);
    } else if (supported.matches(h263Fmt)) {
        // For H.263, we only support some specific sizes.
        if (size.width < 128) {
            width = 128;
            height = 96;
        } else if (size.width < 176) {
            width = 176;
            height = 144;
        } else {
            width = 352;
            height = 288;
        }
    } else {
        // We don't know this particular format. We'll just
        // leave it alone then.
        return supported;
    }

    return (new VideoFormat(null,
                            new Dimension(width, height),
                            Format.NOT_SPECIFIED,
                            null,
                            Format.NOT_SPECIFIED)).intersects(supported);
}
/**
 * Setting the encoding quality to the specified value on the JPEG encoder.
 * 0.5 is a good default.
 */
void setJPEGQuality(Player p, float val) {

    Control cs[] = p.getControls();
    QualityControl qc = null;
    VideoFormat jpegFmt = new VideoFormat(VideoFormat.JPEG);

    // Loop through the controls to find the Quality control for
    // the JPEG encoder.
    for (int i = 0; i < cs.length; i++) {

        if (cs[i] instanceof QualityControl &&
            cs[i] instanceof Owned) {
            Object owner = ((Owned)cs[i]).getOwner();

            // Check to see if the owner is a Codec.
            // Then check for the output format.
            if (owner instanceof Codec) {
                Format fmts[] = ((Codec)owner).getSupportedOutputFormats(null);
                for (int j = 0; j < fmts.length; j++) {
                    if (fmts[j].matches(jpegFmt)) {
                        qc = (QualityControl)cs[i];
                        qc.setQuality(val);
                        System.err.println("- Setting quality to " +
                                val + " on " + qc);
                        break;
                    }
                }
            }
            if (qc != null)
                break;
        }
    }
}
/**
 * Convenience methods to handle processor's state changes.
 */
private Integer stateLock = new Integer(0);
private boolean failed = false;

Integer getStateLock() {
    return stateLock;
}

void setFailed() {
    failed = true;
}

private synchronized boolean waitForState(Processor p, int state) {
    p.addControllerListener(new StateListener());
    failed = false;

    // Call the required method on the processor
    if (state == Processor.Configured) {
        p.configure();
    } else if (state == Processor.Realized) {
        p.realize();
    }

    // Wait until we get an event that confirms the
    // success of the method, or a failure event.
    // See StateListener inner class
    while (p.getState() < state && !failed) {
        synchronized (getStateLock()) {
            try {
                getStateLock().wait();
            } catch (InterruptedException ie) {
                return false;
            }
        }
    }

    if (failed)
        return false;
    else
        return true;
}
/**
 * Inner Classes
 */
class StateListener implements ControllerListener {

    public void controllerUpdate(ControllerEvent ce) {

        // If there was an error during configure or
        // realize, the processor will be closed
        if (ce instanceof ControllerClosedEvent)
            setFailed();

        // All controller events, send a notification
        // to the waiting thread in waitForState method.
        if (ce instanceof ControllerEvent) {
            synchronized (getStateLock()) {
                getStateLock().notifyAll();
            }
        }
    }
}

I do this all the time; I put my MBP to a 60-inch Sharp. If you have the video working, do the simple thing first: check that the sound is turned on on both your TV and your Mac. If that doesn't work, go to System Preferences, and under Sound, go to the Output tab and see if your TV is listed; if it is, change it to that setting.
Hope it works -
Is there a way to make an audio clip not cover the whole project in iMovie? I want to add an audio clip or song and have it start at a certain point in the project. Whenever I add audio or a song, it covers the whole project. I'm working with iMovie on iPad!
Thank you for your reply Karsten but unfortunately this didn't help me so far. Or maybe I'm missing something?
First, the link is a tutorial for iMovie on a Mac. I'm using iMovie on iPad, so the steps are inapplicable.
Second, I can only trim the end of the sound clip to whichever duration I want; I can't do the same with the beginning of the sound clip.
I simply want to place some photos at the beginning of my video with no sound in the background, and then after about 2 seconds I want the music clip to start. For some reason that is not possible! Every time I drop the music clip onto my project timeline, it automatically places itself along with the first frame in the project, and consequently the photos and music are forced to start together.
Hope I'm making sense... -
HT2494 what connector to buy to transfer audio and video from macbook to tv?
What connector should I buy to transfer audio and video from a MacBook to a TV? I read that if I buy the normal Mini DisplayPort to HDMI converter, the audio will not transmit through my TV. Is there a certain thing I need to buy to make this happen? I have a MacBook 13 inch Early 2008.
You can sync photos and videos to your iPad via your computer's iTunes.
To sync photos, connect and select your iPad on the left-hand sidebar of your computer's iTunes (you can enable the sidebar via option-command-S on a Mac), and on the right-hand side there should be a series of tabs, one of which should be Photos - if you select that tab you can then select which photo folders to sync to the iPad. There is a bit more info on this page. You will need to sync all the photos that you want on the iPad together in one go as only the most recent photo sync remains on the iPad - synced photos can't be deleted directly on the iPad, instead they are deleted by not including them in the next photo sync.
If the videos are films that you want to go into the Videos app then you first need to add them to your computer's iTunes library via File > Add To Library, and you can then use the Films/Movies tab to select and sync them
Syncing media : iTunes: Syncing media content to iPod
What format are the videos in that you want to download, and have you got an app on your iPad that supports that format ? -
No audio with video Premiere 11, distorted work area
I downloaded Premiere 11 onto my MacBook Pro. I have two issues: I have no audio on videos or with narration (MP3s play fine), and when I bring media into the workspace, everything is distorted and I can't see where I am dropping it or what I am doing with it. We have Premiere 11 on my husband's Mac and it works fine. Any help would be appreciated.
nursesharonhaz
Thanks for the details. Moving forward...assuming that all new projects behave the same with regard to your problem media in your Premiere Elements 11 Mac.
1. Narration. Please create a .wav narration clip in the free audio editor named Audacity and import it into Premiere Elements 11 Mac with the project's Add Media/Files and Folder/Project Assets, from where you drag it to the Timeline Narration Track.
2. MOV File. What is the brand/model/settings for the camera that recorded that video?
Double check to assure that you have the latest version of QuickTime installed on your computer with Premiere Elements 11 Mac.
In what way does the editing workspace distort when the .mov file is dragged from Project Assets to Video Track 1 of the Timeline?
Does the MacBook Pro have themes as Windows does (Aero, Basic, and High Contrast)?
3. Miscellaneous. Delete the Adobe Premiere Elements Prefs file in the 12.0 Folder
Users\Libraries\............\Adobe\Premiere Elements\12.0
For Mac, I am not sure if the ......is Preferences or Application Support
I am wondering about your Intel Iris video card/graphics card. Need to find out if there is any history on that with Premiere Elements.
If none of the above presents as contributing to the core issue, then let us try downloading a new source of installation files from the following web site. We will be using the installation files for the Premiere Elements 11 tryout, into which you can insert your purchased serial number.
Photoshop Elements 11 Direct Download Links: Free Trials, Premiere | ProDesignTools
You can also look at the Premiere Elements 12 tryout to determine if there is any difference in behavior between 11 and 12.
Adobe Photoshop Elements 12 Direct Download Links, Premiere too | ProDesignTools
Let us see if any of the above has any impact on your situation.
Thanks.
ATR -
No audio with video chat, but can audio chat!
When I video chat, I don't send or receive audio; neither does the person I'm "chatting" with. I can audio chat with that person, no problem, but the combination of video and audio is a bust. And here's the kicker: it's only when I'm chatting with one particular person. With others, the audio and video combo works perfectly. And to top it all off, this person can audio/video chat with others just fine too! Any hints? My OS X and router firewalls are both inactive.
My computer:
QT streaming: 1 Mbps cable
OS X firewall: off
Modem: D-Link cable modem (DCM-201)
Router: Belkin 802.11 wireless router (Model # F5D6231-4)
- firewall is off
- not in bridge mode
No iChat add-ons
Connection speed: 4175 Kbps down, 302.3 Kbps up
Will work on getting the other person's info... I know they're on a MacBook Pro as well. Seems more like a bad preference file or something to me, as this person is the only one I have trouble with and vice versa. Thanks for any help!
Hi MizzouHusker,
Thanks for starting a new thread.
Is the Modem in Bridge mode ?
Or is it doing DHCP ?
What ports are opened in either device ?
Do either have UPnP ?
With those speeds you can set QuickTime to 1.5 Mbps.
It will need you to restart iChat to see the new speed.
If devices use NAT you can have problems if the devices at either end do not like the NAT method used (there are about 6)
UPnP will get around this.
4:26 PM Thursday; July 13, 2006 -
No audio with video, website navigation impaired, I can no longer recommend Firefox
I maintain the video site shaleshockmedia.org. I can no longer recommend Firefox to viewers because of the severe problems I have been experiencing: no audio with video, and navigation often does not work. Maybe this is purely a local problem. Please let me know whether or not the site is fully functional when you go there. There is no problem with Chrome or Safari. My computer is a MacBook Pro 3,1 Intel Core 2 Duo, Mac OS X 10.6.8.
Note that there are a lot of popular extensions that can cause issues with videos not working apart from issues with the Flash plugin.
In what way is the navigation not working properly?
Can you give some examples?
Start Firefox in <u>[[Safe Mode|Safe Mode]]</u> to check if one of the extensions (Firefox/Tools > Add-ons > Extensions) or if hardware acceleration is causing the problem (switch to the DEFAULT theme: Firefox/Tools > Add-ons > Appearance).
*Do NOT click the Reset button on the Safe mode start window or otherwise make changes.
*https://support.mozilla.org/kb/Safe+Mode
*https://support.mozilla.org/kb/Troubleshooting+extensions+and+themes
Check for problems with current Flash plugin versions and try these:
*disable a possible RealPlayer Browser Record Plugin extension for Firefox and update the RealPlayer if installed
*disable protected mode in Flash 11.3 and later
*disable hardware acceleration in the Flash plugin
*http://kb.mozillazine.org/Flash#Troubleshooting -
I'm using Premiere Elements 12. When I open a VOB file, the audio and video are not in sync. I can open that same file with Windows Media Player and all is well. What do I need to do to edit a file like this?
The information about the display on my computer is:
Name NVIDIA GeForce GT 220
PNP Device ID PCI\VEN_10DE&DEV_0A20&SUBSYS_069A10DE&REV_A2\4&22063C0D&0&0018
Adapter Type GeForce GT 220, NVIDIA compatible
Adapter Description NVIDIA GeForce GT 220
Adapter RAM 1.00 GB (1,073,741,824 bytes)
Installed Drivers nvd3dumx.dll,nvwgf2umx.dll,nvwgf2umx.dll,nvd3dum,nvwgf2um,nvwgf2um
Driver Version 9.18.13.3523
INF File oem38.inf (Section002 section)
Color Planes Not Available
Color Table Entries 4294967296
Resolution 1920 x 1080 x 60 hertz
Bits/Pixel 32
Memory Address 0xFA000000-0xFAFFFFFF
Memory Address 0xD0000000-0xDFFFFFFF
Memory Address 0xCE000000-0xCFFFFFFF
I/O Port 0x0000EC00-0x0000EC7F
IRQ Channel IRQ 16
I/O Port 0x000003B0-0x000003BB
I/O Port 0x000003C0-0x000003DF
Memory Address 0xA0000-0xBFFFF
Thanks,
Pete Blair

Pete Blair
In all your information, I did not see which computer operating system your Premiere Elements 12 is running on. Have you updated 12 to the 12.1 Update, using an opened project's Help Menu/Update? If not, please do so.
Your issue may relate to the details of your "VOB file" and how it was ripped/copied from the DVD-VIDEO and DVD disc to get it into a Premiere Elements project.
a. project preset set for NTSC DV Standard or NTSC DV Widescreen (whichever corresponded to your recording details)...or PAL equivalent.
b. DVD disc in burner tray
c. Add Media/DVD camera or computer drive/Video Importer/.....
It does not happen often, but if audio is out of sync and you are not using the above route, then consider the Command Prompt way described in the following:
ATR Premiere Elements Troubleshooting: PE: DVD-VIDEO/Seamless VOB Ripping
A just-in-case note: contrary to some claims, it is not necessary to convert VOB files (the VTS_01_1.VOB video file and any files in that series) to any other format for Premiere Elements.
Please review and consider and then we can decide what next.
Thank you.
ATR -
Audio and video out of sync on converted AVI file
I am trying to convert a DVD to an AVI for playback on a device. When I import the DVD and publish it as an .AVI the output audio and video are out of sync. After about an hour of playback the audio lags the video by a full second. How can I correct this problem?
Walter0325
If you have Premiere Elements 12 on Windows 8 and you want to rip VOBs from the DVD-VIDEO to export subsequently to DV AVI, here are the details....
1. Do you have DVD-VIDEO standard or widescreen? Set the Premiere Elements 12 project preset manually.
File Menu/New/Project. Set the project preset to NTSC or PAL DV Standard or NTSC or PAL DV Widescreen depending on what you have.
Before you exit the new project dialog, make sure that you have a check mark next to Force Selected Project Setting on this Project.
2. In the Premiere Elements workspace, use Add Media/DVD camera or computer drive/Video Importer to get the VOBs (only the ones in the VTS_01_1.VOB series). Set a folder in the Save In: location. Check mark next to Insert in Timeline. Hit Get Media.
3. If you have an orange line OVER the Timeline content, press the Enter key to get the best possible preview of what you have on that Timeline.
If after all that, the audio is out of sync, then use the Command Prompt method to get the individual VOBs into a single file with all the VOBs seamlessly merged. This worked nicely for a recent user here. See details in my blog post how to for the Command Prompt way.
http://www.atr935.blogspot.com/2013/09/pe-dvd-videoseamless-vob-ripping.html
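For reference, the "Command Prompt way" described in that blog post amounts to a byte-for-byte concatenation of the VOB series into a single file, which is then imported in one piece so audio sync is preserved across part boundaries. A minimal sketch of the idea (file names here are illustrative; your disc may have more or fewer parts):

```shell
# Windows Command Prompt form, as referenced in the blog post:
#   copy /b VTS_01_1.VOB+VTS_01_2.VOB+VTS_01_3.VOB merged.VOB
# POSIX equivalent: append each existing part's raw bytes, in order.
for part in VTS_01_1.VOB VTS_01_2.VOB VTS_01_3.VOB; do
  if [ -f "$part" ]; then
    cat "$part"
  fi
done > merged.VOB
```

The merged.VOB can then be brought into the project with Add Media in place of the individual parts. This is a sketch of the general technique, not a substitute for the detailed steps in the linked post.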
Please do not hesitate to ask if you need clarification on anything that I have written.
Depending on your results, more information may be requested about the specific properties of what you have on that DVD disc.
Thank you.
ATR -
How to combine discrete audio and video files?
I am working on a project which requires me to merge a video file with a PCM audio file from another source.
I just bought Compressor 4.1.1, expecting to be able to do exactly this.
The only "multiple input - single output" option I can see is creating a surround sound group, adding my audio as left and right channels, and then attaching the video.
This does not work. Pressing "add" at the end of the process does not result in a new job.
Help please!
G.
QT7 Pro is able to do this. Cost is $30.
In QT7 Pro, open up the audio and video files to be matched.
Select all the audio and then select copy
In the video window, position the playhead where you want the audio to start, then select Add to Movie.
Verify the playback is as desired, then save the movie. Done.
Takes longer to describe the process than to do it.
x
Of course, life is good when the in point is exactly the same for both audio and video ...
Maybe you are looking for
-
How can i get firefox to ask me to open or save a download?
Firefox used to ask me to open or save a file when I downloaded it and now it doesn't. Is there a way to change it back to the way it was?
-
Mountain lion installation failing
Hi, after downloading Mountain Lion about 3 times, it is still failing. I browsed the net and tried the various tips, such as repairing the folder permissions and ensuring that the Mac is up to date. However, it still doesn't work. Any other tips or procedu
-
Shared photo's & music...
Guys/Gals, my wife and I share a MacBook Pro with a 120 GB hard drive. We use separate logins so we have our own internet shortcuts, email accounts, etc. However, because of that setup, we also have duplicate photo and music file folders. If we could s
-
CE 7.1 EHP1 installation on windows 7
Hello, I am trying to install the 64-bit release of CE on a Windows 7 laptop, in order to evaluate the opportunity to change all the laptops for my developers. Unfortunately I systematically get the same error: Unable to find any Java instances on the host. In
-
Apps say I need Flash Player after the upgrade. I have Windows 7, IE 11 and an adequate system. The install said complete, but a check from the Adobe site says it isn't installed. I have uninstalled and run the install again. Same response. What am I missing?