Audio/Video classes??

Hey all-
I heard that there are audio/video training classes for certification (no physical tutor) and that the cost is pretty low compared to the normal training classes. Is this true?
Will it be good to go with such audio/video classes? Will they be helpful in writing the exam? Please suggest.
Thanks
M A

Hi Shakir,
Yes, this information is correct. SAP Education does provide these audio/video classes, usually known as e-Learning sessions. These
e-Learnings come in a few formats: some of the material is plain PDFs, some is voice-enabled, and some is in SAP Tutor (.sim) format, which works like a demo. There is no tutor, and yes, they are much less costly than classroom sessions.
Yes, it is beneficial to go for these sessions when you cannot manage to attend classroom sessions. They will certainly help you in writing the exam as well, but keep in mind that these sessions will always be less helpful than classroom training.
To access these sessions you should register with SAP Education; you will get a Service Marketplace user ID and password with which you can access them.
Rewarding points to helpful replies will give you a point yourself.
Regards,
Subhasha

Similar Messages

  • Codes for audio video receiver class

    I am currently writing an audio and video receiver for my project based on AVReceive3. It works, but I only get video; everything else seems normal, and I don't see why I am not getting audio. My transmitter is working fine and sending both streams.
    If anyone can spot the error or mistake I am making, please let me know as soon as possible.
    my email address is [email protected]
    import java.awt.*;
    import java.awt.event.*;
    import java.net.InetAddress;
    import java.util.Vector;
    import javax.media.*;
    import javax.media.control.BufferControl;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.SourceDescription;
    public class AudVidReceive extends Frame implements ActionListener, ReceiveStreamListener, SessionListener, ControllerListener {
        // Input MediaLocator: can be a file, http, or capture source
        private MediaLocator alocator;
        MediaLocator vlocator, vloct, aloct;
        private String ipAddress;
        private int portBase;
        Button transtop, transtart, recstop, recstart;
        Component controlPanel, visualComponent;
        // For receiving the media streams
        String sessions[] = null;
        SessionManager mgrs[] = null;
        Vector playerWindows = null;
        boolean dataReceived = false;
        Object dataSync = new Object();
        private Processor processor = null;
        Player player = null;
        Format fmt = null;
        private SessionManager rtpMgrs[];
        private DataSource dataOutput = null;
        DataSource ads = null;
        DataSource vds = null;
        DataSource source[];
        DataSource m = null;
        public AudVidReceive() {
            super("Audio Video Receive");
            recstart = new Button("StartReceiving");
            //recstop = new Button("Stop Receiving");
            add(recstart);
            recstart.setBounds(80, 250, 100, 30);
            recstart.addActionListener(this);
            setLayout(null);
            setSize(400, 300);
            setVisible(true);
            //this.sessions = sessions;
        }
        public void actionPerformed(ActionEvent e) {
            String ac = e.getActionCommand();
            if (ac.equalsIgnoreCase("StartReceiving")) {
                if (!initialize()) {
                    System.err.println("Failed to realize the sessions");
                    System.exit(-1);
                }
                // Re-check isDone() on every pass; caching it in a local
                // flag (as the original did) would loop forever.
                try {
                    while (!isDone())
                        Thread.sleep(1000);
                } catch (Exception e1) {
                    e1.printStackTrace();
                }
            }
        }
        // Initialize the sessions for receiving the incoming streams.
        protected boolean initialize() {
            try {
                InetAddress ipAddr;
                SessionAddress localAddr = new SessionAddress();
                SessionAddress destAddr;
                String sessions[] = { "192.168.2.143/42060" };
                mgrs = new com.sun.media.rtp.RTPSessionMgr[sessions.length];
                playerWindows = new Vector();
                SessionLabel session;
                // Open the RTP sessions.
                for (int i = 0; i < sessions.length; i++) {
                    // Parse the session addresses.
                    try {
                        session = new SessionLabel(sessions[i]); // was: new SessionLabel(sessions)
                    } catch (IllegalArgumentException e) {
                        System.err.println("Failed to parse the session address given: " + sessions[i]);
                        return false;
                    }
                    System.err.println(" - Open RTP session for: addr: " + session.addr + " port: " + session.port + " ttl: " + session.ttl);
                    mgrs[i] = new com.sun.media.rtp.RTPSessionMgr();
                    mgrs[i].addSessionListener(this);
                    mgrs[i].addReceiveStreamListener(this);
                    ipAddr = InetAddress.getByName(session.addr);
                    destAddr = new SessionAddress(ipAddr, session.port,
                            ipAddr, session.port + 1);
                    mgrs[i].initSession(localAddr, getSDES(mgrs[i]), .05, .25);
                    // You can try out some other buffer size to see
                    // if you can get better smoothness.
                    BufferControl bc = (BufferControl) mgrs[i].getControl("javax.media.control.BufferControl");
                    if (bc != null)
                        bc.setBufferLength(350);
                    mgrs[i].startSession(destAddr, session.ttl, null);
                }
            } catch (Exception e) {
                System.err.println("Cannot create the RTP Session: " + e.getMessage());
                return false;
            }
            // Wait for data to arrive before moving on.
            long then = System.currentTimeMillis();
            long waitingPeriod = 120000; // wait for a maximum of 120 secs
            try {
                synchronized (dataSync) {
                    while (!dataReceived && System.currentTimeMillis() - then < waitingPeriod) {
                        if (!dataReceived)
                            System.err.println(" - Waiting for RTP data to arrive...");
                        dataSync.wait(1000);
                    }
                }
            } catch (Exception e) {}
            if (!dataReceived) {
                System.err.println("No RTP data was received.");
                close();
                return false;
            }
            return true;
        }
        // Find out the host info.
        String cname = null;
        private SourceDescription[] getSDES(SessionManager mgr) {
            SourceDescription[] desclist = new SourceDescription[3];
            if (cname == null)
                cname = mgr.generateCNAME();
            desclist[0] = new SourceDescription(SourceDescription.SOURCE_DESC_NAME,
                    System.getProperty("user.name"), 1, false);
            desclist[1] = new SourceDescription(SourceDescription.SOURCE_DESC_CNAME,
                    cname, 1, false);
            desclist[2] = new SourceDescription(SourceDescription.SOURCE_DESC_TOOL,
                    "AVReceive powered by JMF", 1, false);
            return desclist;
        }
        // We are done when all the player windows have been closed.
        public boolean isDone() {
            return playerWindows.size() == 0;
        }
        // Close the players and the session managers.
        protected void close() {
            for (int i = 0; i < playerWindows.size(); i++) {
                try {
                    ((PlayerWindow) playerWindows.elementAt(i)).close();
                } catch (Exception e) {}
            }
            playerWindows.removeAllElements();
            // Close the RTP sessions.
            for (int i = 0; i < mgrs.length; i++) {
                if (mgrs[i] != null) {
                    mgrs[i].closeSession("Closing session from AudVidReceive");
                    mgrs[i] = null;
                }
            }
        }
        // Find the window that hosts the given player.
        PlayerWindow find(Player p) {
            for (int i = 0; i < playerWindows.size(); i++) {
                PlayerWindow pw = (PlayerWindow) playerWindows.elementAt(i);
                if (pw.player == p)
                    return pw;
            }
            return null;
        }
        // Find the window that plays the given receive stream.
        PlayerWindow find(ReceiveStream strm) {
            for (int i = 0; i < playerWindows.size(); i++) {
                PlayerWindow pw = (PlayerWindow) playerWindows.elementAt(i);
                if (pw.stream == strm)
                    return pw;
            }
            return null;
        }
        // SessionListener.
        public synchronized void update(SessionEvent evt) {
            if (evt instanceof NewParticipantEvent) {
                Participant p = ((NewParticipantEvent) evt).getParticipant();
                System.err.println(" - A new participant has just joined: " + p.getCNAME());
            }
        }
        // ReceiveStreamListener.
        public synchronized void update(ReceiveStreamEvent evt) {
            SessionManager mgr = (SessionManager) evt.getSource();
            Participant participant = evt.getParticipant(); // could be null.
            ReceiveStream stream = evt.getReceiveStream(); // could be null.
            if (evt instanceof RemotePayloadChangeEvent) {
                System.err.println(" - Received an RTP PayloadChangeEvent.");
                System.err.println("Sorry, cannot handle payload change.");
                System.exit(0);
            } else if (evt instanceof NewReceiveStreamEvent) {
                try {
                    stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                    DataSource ds = stream.getDataSource();
                    // Find out the format.
                    RTPControl ctl = (RTPControl) ds.getControl("javax.media.rtp.RTPControl");
                    if (ctl != null)
                        System.err.println(" - Received new RTP stream: " + ctl.getFormat());
                    else
                        System.err.println(" - Received new RTP stream");
                    if (participant == null)
                        System.err.println(" The sender of this stream has yet to be identified.");
                    else
                        System.err.println(" The stream comes from: " + participant.getCNAME());
                    // Create a player by passing the DataSource to the Media Manager.
                    Player p = javax.media.Manager.createPlayer(ds);
                    if (p == null)
                        return;
                    p.addControllerListener(this);
                    p.realize();
                    PlayerWindow pw = new PlayerWindow(p, stream);
                    playerWindows.addElement(pw);
                    // Notify initialize() that a new stream has arrived.
                    synchronized (dataSync) {
                        dataReceived = true;
                        dataSync.notifyAll();
                    }
                } catch (Exception e) {
                    System.err.println("NewReceiveStreamEvent exception " + e.getMessage());
                    return;
                }
            } else if (evt instanceof StreamMappedEvent) {
                if (stream != null && stream.getDataSource() != null) {
                    DataSource ds = stream.getDataSource();
                    // Find out the format.
                    RTPControl ctl = (RTPControl) ds.getControl("javax.media.rtp.RTPControl");
                    System.err.println(" - The previously unidentified stream ");
                    if (ctl != null)
                        System.err.println("   " + ctl.getFormat());
                    System.err.println("   has now been identified as sent by: " + participant.getCNAME());
                }
            } else if (evt instanceof ByeEvent) {
                System.err.println(" - Got \"bye\" from: " + participant.getCNAME());
                PlayerWindow pw = find(stream);
                if (pw != null) {
                    pw.close();
                    playerWindows.removeElement(pw);
                }
            }
        }
        // ControllerListener for the players.
        public synchronized void controllerUpdate(ControllerEvent ce) {
            Player p = (Player) ce.getSourceController();
            if (p == null)
                return;
            // Get this when the internal players are realized.
            if (ce instanceof RealizeCompleteEvent) {
                PlayerWindow pw = find(p);
                if (pw == null) {
                    // Something strange happened.
                    System.err.println("Internal error!");
                    System.exit(-1);
                }
                pw.initialize();
                pw.setVisible(true);
                p.start();
            }
            if (ce instanceof ControllerErrorEvent) {
                p.removeControllerListener(this);
                PlayerWindow pw = find(p);
                if (pw != null) {
                    pw.close();
                    playerWindows.removeElement(pw);
                }
                System.err.println("AVReceive internal error: " + ce);
            }
        }
        // A utility class to parse the session addresses.
        static class SessionLabel {
            public String addr = null;
            public int port;
            public int ttl = 1;
            SessionLabel(String session) throws IllegalArgumentException {
                int off;
                String portStr = null, ttlStr = null;
                if (session != null && session.length() > 0) {
                    while (session.length() > 1 && session.charAt(0) == '/')
                        session = session.substring(1);
                    // Now see if there's an addr specified.
                    off = session.indexOf('/');
                    if (off == -1) {
                        if (!session.equals(""))
                            addr = session;
                    } else {
                        addr = session.substring(0, off);
                        session = session.substring(off + 1);
                        // Now see if there's a port specified.
                        off = session.indexOf('/');
                        if (off == -1) {
                            if (!session.equals(""))
                                portStr = session;
                        } else {
                            portStr = session.substring(0, off);
                            session = session.substring(off + 1);
                            // Now see if there's a ttl specified.
                            off = session.indexOf('/');
                            if (off == -1) {
                                if (!session.equals(""))
                                    ttlStr = session;
                            } else {
                                ttlStr = session.substring(0, off);
                            }
                        }
                    }
                }
                if (addr == null)
                    throw new IllegalArgumentException();
                if (portStr != null) {
                    try {
                        Integer integer = Integer.valueOf(portStr);
                        if (integer != null)
                            port = integer.intValue();
                    } catch (Throwable t) {
                        throw new IllegalArgumentException();
                    }
                } else
                    throw new IllegalArgumentException();
                if (ttlStr != null) {
                    try {
                        Integer integer = Integer.valueOf(ttlStr);
                        if (integer != null)
                            ttl = integer.intValue();
                    } catch (Throwable t) {
                        throw new IllegalArgumentException();
                    }
                }
            }
        }
        // GUI class for the player.
        static class PlayerWindow extends Frame {
            Player player;
            ReceiveStream stream;
            PlayerWindow(Player p, ReceiveStream strm) {
                player = p;
                stream = strm;
            }
            public void initialize() {
                add(new PlayerPanel(player));
            }
            public void close() {
                player.close();
                setVisible(false);
                dispose();
            }
            public void addNotify() {
                super.addNotify();
                pack();
            }
        }
        // GUI class that lays out the player's visual and control components.
        static class PlayerPanel extends Panel {
            Component vc, cc;
            PlayerPanel(Player p) {
                setLayout(new BorderLayout());
                if ((vc = p.getVisualComponent()) != null)
                    add("Center", vc);
                if ((cc = p.getControlPanelComponent()) != null)
                    add("South", cc);
            }
            public Dimension getPreferredSize() {
                int w = 0, h = 0;
                if (vc != null) {
                    Dimension size = vc.getPreferredSize();
                    w = size.width;
                    h = size.height;
                }
                if (cc != null) {
                    Dimension size = cc.getPreferredSize();
                    if (w == 0)
                        w = size.width;
                    h += size.height;
                }
                if (w < 160)
                    w = 160;
                return new Dimension(w, h);
            }
        }
        public static void main(String[] args) {
            AudVidReceive at1 = new AudVidReceive();
            System.err.println("Start receiving incoming streams");
            try {
                Thread.sleep(60000);
            } catch (InterruptedException ie) {
                ie.printStackTrace();
            }
        }
    }

    I have to use RTP and RTSP for transferring the
    audio/video streams in real time.
    How can I send and receive RTP packets?
    How can I make the RTP player?
    What is the role of JMF in it?
    Please suggest, and provide the code if possible.
    Thanks a lot,
    shobhit verma

    There are two ways you can send and receive packets using RTP: one without using the SessionManager and the other with it.
    1.) Not using the SessionManager makes things easy for you, but it offers very little flexibility. You can do this by just specifying a MediaLocator pointing to the specific URL.
    2.) Using the SessionManager is what is usually suggested. This process is more complex. First you instantiate an RTPSessionMgr into a SessionManager reference, through which you can send and receive streams to and from the network. The process is more involved than this, and I suggest you read some tutorials to get a better understanding rather than me explaining the entire process.
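    To make the two options concrete, here is a minimal sketch (the addresses are made up and error handling is omitted; RTPManager is the JMF 2.x replacement for the deprecated SessionManager):
    import java.net.InetAddress;
    import javax.media.*;
    import javax.media.rtp.*;
    public class RtpTwoWays {
        // 1) Without a session manager: point a MediaLocator at the RTP
        //    session and let Manager build the Player. Easy, little control.
        static Player receiveViaLocator() throws Exception {
            return Manager.createRealizedPlayer(
                    new MediaLocator("rtp://192.168.2.143:42060/audio"));
        }
        // 2) With a session manager: more setup, but full control over
        //    targets, listeners, and send streams.
        static SendStream sendViaManager(javax.media.protocol.DataSource processorOutput) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), 42060));
            mgr.addTarget(new SessionAddress(InetAddress.getByName("192.168.2.143"), 42060));
            SendStream out = mgr.createSendStream(processorOutput, 0); // first stream of the DataSource
            out.start();
            return out;
        }
    }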

  • How to use rtp to transmit data other than audio / video

    hi all,
    as you all know the jmf is based upon a specific paradigm:
    data source -> player (processor) -> renderer.
    i am not being very precise on this but that does not matter.
    i want (for testing purposes) to create a tool that transmit some custom data
    (maybe even nonsense) over the rtp packages.
    so as you may now see this would not work quite right with the mentioned
    paradigm since there would be no audio/video to play-process-render.
    so how would i go about to utilise the rtp abilities of the jmf without adhering
    to the whole player-whatever shebang.
    what i have figured out is that there is a way to transmit custom rtp-payload.
    there is even an example on the jmf solutions page but even this one uses
    an audio format (pcm).
    i hope it is clear to the dear reader what i want to do and maybe some-
    one can help me with this.
    p.s. i am not one who wants to get the whole source code (like so many
    others do on this forum) for a project someone may have done on this, what
    i need is a hint in the right direction.
    thanks for your attention and thanks in advance for any useful hints.

    Hi, I would like to make some comments that may help
    you find out how to manage your problem.
    If you have a look at the JMF 2.0 API Guide,
    "Understanding the JMF RTP API", Figure 8-2
    http://java.sun.com/products/java-media/jmf/2.1.1/guide/RTPArchitecture.html#105171
    you can notice that the processor is not essential.
    In this picture, the processor is inserted between two data sources
    to let you perform some operations on the media that is read from
    a file or a capture device.
    But the SessionManager just needs a DataSource.
    From what you say, I imagine that you want to transmit
    real-time data that is not related to standard media formats.
    In principle, the only thing you must do is extend the
    javax.media.protocol.DataSource abstract class and implement
    it at your own convenience.
    Once your custom class is written, you can call the SessionManager's
    method createSendStream(DataSource ds, int streamindex), passing
    as argument an instance of your custom implementation.
    Hope this helps.
    Good Luck ;-) !
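    To make that last step concrete, a minimal sketch of the wiring; the names are hypothetical ("customDs" stands for your own DataSource subclass, "mgr" for a session manager that has already been initialized and started):
    import javax.media.protocol.DataSource;
    import javax.media.rtp.*;
    public class CustomPayloadSender {
        // Hand the custom DataSource straight to the session manager;
        // no Processor is involved for non-media payloads.
        public static SendStream wire(SessionManager mgr, DataSource customDs) throws Exception {
            customDs.connect();
            customDs.start();
            SendStream out = mgr.createSendStream(customDs, 0); // streamindex 0 = first stream
            out.start();
            return out;
        }
    }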

  • How to send files like audio,video,images and text via RMI..

    Hi everyone,
    As I am working on a project of my own, a chat program, I've thought to make it capable of doing everything that MSN or Yahoo Messenger can do. So far I've been able to send messages and some small icons as emoticons. My next step is making my program able to send other files, like audio, video, images and text, to the person on the other machine with whom I'm chatting. As I don't have any idea how to start doing it, I'd like anyone who thinks he or she can help me to give me the basic logic that is used to do so. I would very much appreciate it. I've used vectors to store the text messages, which are visible to all users of the chat program, enabling them to see the various messages.
    thank you...
    Jay

    Hi,
    Now I'm stuck because the code doesn't seem to work well: large files of around 40 MB or more cannot be sent. I have constructed the code, just a rough sketch, as follows:
    ** In the server implementation class I've used FileInputStream to read the contents of the file that is passed as an argument to the method.
    ** Similarly, on the client side I've used RandomAccessFile to save the received array of bytes.
    public void sendFile(File f) throws Exception {
       ChatServer cs = (ChatServer) Naming.lookup("rmi://localhost/ChatServer");
       cs.readsAndStoreTheFileInTheServer(f); // In the server implementation the contents of the file are read and saved in an array of bytes; later a method is invoked by the client to get the saved array.
       cs.message("-Accept-"); // When a client receives this word, a JComponent with Accept and Cancel buttons is constructed, from which the other clients can save or cancel the sent file.
    }
    For small files this code works well, but for files of more than 40 MB it is useless. I wonder if there's any other alternative.
    regards,
    Jay
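    A common alternative (a sketch only, not Jay's code): pull the file in bounded chunks over RMI instead of shipping one huge byte array, so heap use stays constant regardless of file size. The remote interface below is hypothetical:
    import java.io.*;
    import java.rmi.Remote;
    import java.rmi.RemoteException;
    interface FileChunkServer extends Remote {
        long fileLength(String name) throws RemoteException;
        byte[] readChunk(String name, long offset, int len) throws RemoteException;
    }
    class ChunkedDownload {
        static void download(FileChunkServer server, String name, File dest) throws Exception {
            final int CHUNK = 64 * 1024; // 64 KB per remote call
            long total = server.fileLength(name);
            try (RandomAccessFile out = new RandomAccessFile(dest, "rw")) {
                for (long off = 0; off < total; off += CHUNK) {
                    byte[] part = server.readChunk(name, off, (int) Math.min(CHUNK, total - off));
                    out.seek(off);
                    out.write(part);
                }
            }
        }
    }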

  • Streaming audio-video on the mobile device

    Please help: I must implement audio/video streaming to a mobile device using a standard server.
    I have to program the server application using J2SE: in this case, how can I send the audio/video file? Are there specific classes for this?
    For the client I must use J2ME: in this case, how can I receive the audio file and create the player?
    Thanks.

    Hi, please help,
    I have a problem in this area also.
    I intend to stream audio and video data to a mobile phone using J2ME for my BS project.
    Is it true that MIDlets can only play back one file at a time, specified by URL?
    I want to store many files in a database (e.g. MS Access) and deliver them to the MIDlet, to make it somewhat interactive and real, because I have a Tomcat server installed on my PC.
    Is this possible?
    How can I access a file stored beyond the path of the WTK?
    Can I store audio/video files in MS Access?
    Please, please help!
    Thanks in advance.

  • Problem with running example 'Generating Live Audio/Video Data'

    Hello,
    Concerning the example 'Generating Live Audio/Video Data', I'm having trouble with the run instructions.
    http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/LiveData.html
    How does JMFRegistry know about the location of jmfsample?
    How is 'live' resolved as a URL?
    2.Register the package prefix for the new data source using JMFRegistry
    - Run JMFRegistry
    - In the Protocol Prefix List section add "jmfsample" and hit Commit.
    4.Select File->Open URL and enter "live:"
    Much thanks,
    Ben

    I'm getting the following error message: "Could not create player for live"
    That implies you've either not registered the "live:" protocol prefix in the JMF Registry, or it couldn't load the class you registered for it... or it might be erroring out inside the actual live protocol; I'm not sure what that would look like, but a System.err.println statement in the constructors of both of those classes might be a good idea.
    I added the output of javac (DataSource.class and LiveStream.class) to a directory on the classpath:
    C:\Program Files\JMF2.1.1e\lib\jmfsample\media\protocol\live
    Eh, that looks a little questionable to me. I'm not 100% sure that the JRE will automatically descend into package subdirectories like that, looking for class files, for every folder on the path. I am, of course, fully open to the idea that it does and I just never thought about it... but I guess I thought it only did that for JAR files, not CLASS files. (For how the prefix lookup resolves to a class location, see the sketch after this list.) Regardless, I'd recommend:
    1) Make sure you've registered the protocol prefix "live:" correctly in JMF Registry
    2) Try to run it with the 2 compiled class files in the same folder as your project
    3) Try to run it with the 2 compiled class files in the lib directory, if that's on the classpath
    4) Try to run it with the 2 compiled class files installed in the JRE as an extension (google for how to do this because I don't remember off the top of my head)
    5) Reinstall JMF and see if that helps
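    On the "How is 'live' resolved as a URL?" question above, this is the lookup JMF performs, as far as I can tell: for the locator "live:" it walks the protocol prefix list, and for the prefix "jmfsample" it tries to load the class jmfsample.media.protocol.live.DataSource. So the compiled DataSource.class must sit at <classpath-root>\jmfsample\media\protocol\live\DataSource.class, i.e. the package directories must start under a classpath root. A tiny probe to test it:
    import javax.media.*;
    public class LiveProbe {
        public static void main(String[] args) throws Exception {
            // Fails with "Could not create player for live" when the prefix
            // is unregistered or the registered class cannot be loaded.
            Player p = Manager.createPlayer(new MediaLocator("live:"));
            System.out.println("Created player: " + p);
        }
    }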

  • Please, help with Live Audio/Video example from jmf solutions

    Hello,
    I'm desperately looking for a solution to a particular problem.
    I'm trying to feed JMF with an AudioInputStream generated via Java Sound, so that I can send it via RTP. The problem is that I don't know how to properly create a DataSource from an InputStream. I know the Live Audio/Video Data example from the JMF Solutions focuses on something similar.
    The problem is that I don't know exactly how it works. So the question is: how can I modify that example in order to create a proper DataSource from the AudioInputStream, and then send it via RTP?
    I think I managed to create a DataSource and pass it to the AVTransmit2 class from the JMF examples, and from that DataSource create a processor, which is created successfully, and then find a corresponding format and try to send it. But when I try to send it or play it I get garbage sound, so I'm not really sure whether I create the DataSource correctly or not, as I've made some changes to the Live Audio/Video Data example from the JMF Solutions to construct a LiveStream from the AudioInputStream. Actually, I don't understand where in the code it constructs the DataSource from the LiveStream, from an InputStream, because there's no constructor like DataSource(InputStream) nor anything similar.
    Please help me, as I'm getting very stuck with this; I would really appreciate your help.
    Thanks for your time, bye.

    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import java.io.IOException;
    import javax.sound.sampled.AudioInputStream;
    public class LiveAudioStream implements PushBufferStream, Runnable {
        protected ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW);
        protected int maxDataLength;
        protected int vez = 0;
        protected AudioInputStream data;
        public AudioInputStream audioStream;
        protected byte[] audioBuffer;
        protected javax.media.format.AudioFormat audioFormat;
        protected boolean started;
        protected Thread thread;
        protected float frameRate = 20f;
        protected BufferTransferHandler transferHandler;
        protected Control[] controls = new Control[0];
        public LiveAudioStream(byte[] audioBuf) {
            audioBuffer = audioBuf;
            audioFormat = new AudioFormat(AudioFormat.ULAW,
                    8000.0,
                    8,
                    1,
                    Format.NOT_SPECIFIED,
                    AudioFormat.SIGNED,
                    8,
                    Format.NOT_SPECIFIED,
                    Format.byteArray);
            maxDataLength = 40764;
            thread = new Thread(this);
        }
        // SourceStream
        public ContentDescriptor getContentDescriptor() {
            return cd;
        }
        public long getContentLength() {
            return LENGTH_UNKNOWN;
        }
        public boolean endOfStream() {
            return false;
        }
        // PushBufferStream
        int seqNo = 0;
        double freq = 2.0;
        public Format getFormat() {
            return audioFormat;
        }
        public void read(Buffer buffer) throws IOException {
            synchronized (this) {
                Object outdata = buffer.getData();
                if (outdata == null || !(outdata.getClass() == Format.byteArray) ||
                        ((byte[]) outdata).length < maxDataLength) {
                    outdata = new byte[maxDataLength];
                }
                buffer.setData(audioBuffer);
                buffer.setFormat(audioFormat);
                buffer.setTimeStamp(1000000000 / 8);
                buffer.setSequenceNumber(seqNo);
                buffer.setLength(maxDataLength);
                buffer.setFlags(0);
                buffer.setHeader(null);
                seqNo++;
            }
        }
        public void setTransferHandler(BufferTransferHandler transferHandler) {
            synchronized (this) {
                this.transferHandler = transferHandler;
                notifyAll();
            }
        }
        void start(boolean started) {
            synchronized (this) {
                this.started = started;
                if (started && !thread.isAlive()) {
                    thread = new Thread(this);
                    thread.start();
                }
                notifyAll();
            }
        }
        // Runnable
        public void run() {
            while (started) {
                synchronized (this) {
                    while (transferHandler == null && started) {
                        try {
                            wait(1000);
                        } catch (InterruptedException ie) {
                        }
                    } // while
                }
                if (started && transferHandler != null) {
                    transferHandler.transferData(this);
                    try {
                        Thread.sleep(10);
                    } catch (InterruptedException ise) {
                    }
                }
            } // while (started)
        } // run
        // Controls
        public Object[] getControls() {
            return controls;
        }
        public Object getControl(String controlType) {
            try {
                Class cls = Class.forName(controlType);
                Object cs[] = getControls();
                for (int i = 0; i < cs.length; i++) {
                    if (cls.isInstance(cs[i])) // was: cls.isInstance(cs)
                        return cs[i];
                }
                return null;
            } catch (Exception e) { // no such controlType or control
                return null;
            }
        }
    }
    and the other one, the DataSource,
    import javax.media.Time;
    import javax.media.protocol.*;
    import java.io.IOException;
    import java.io.InputStream;
    import javax.sound.sampled.AudioInputStream;
    public class CustomDataSource extends PushBufferDataSource {
        protected Object[] controls = new Object[0];
        protected boolean started = false;
        protected String contentType = "raw";
        protected boolean connected = false;
        protected Time duration = DURATION_UNKNOWN;
        protected LiveAudioStream[] streams = null;
        protected LiveAudioStream stream = null;
        public CustomDataSource(LiveAudioStream ls) {
            streams = new LiveAudioStream[1];
            stream = streams[0] = ls;
        }
        public String getContentType() {
            if (!connected) {
                System.err.println("Error: DataSource not connected");
                return null;
            }
            return contentType;
        }
        public byte[] getData() {
            return stream.audioBuffer;
        }
        public void connect() throws IOException {
            if (connected)
                return;
            connected = true;
        }
        public void disconnect() {
            try {
                if (started)
                    stop();
            } catch (IOException e) {}
            connected = false;
        }
        public void start() throws IOException {
            // We need to throw an error if connect() has not been called.
            if (!connected)
                throw new java.lang.Error("DataSource must be connected before it can be started");
            if (started)
                return;
            started = true;
            stream.start(true);
        }
        public void stop() throws IOException {
            if ((!connected) || (!started))
                return;
            started = false;
            stream.start(false);
        }
        public Object[] getControls() {
            return controls;
        }
        public Object getControl(String controlType) {
            try {
                Class cls = Class.forName(controlType);
                Object cs[] = getControls();
                for (int i = 0; i < cs.length; i++) {
                    if (cls.isInstance(cs[i])) // was: cls.isInstance(cs)
                        return cs[i];
                }
                return null;
            } catch (Exception e) { // no such controlType or control
                return null;
            }
        }
        public Time getDuration() {
            return duration;
        }
        public PushBufferStream[] getStreams() {
            return streams;
        }
    }
    hope this helps
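    On the question of where the DataSource gets built from the stream: nowhere, because there is no DataSource(InputStream) constructor in JMF. The LiveAudioStream above wraps the raw bytes, the CustomDataSource wraps that stream, and the Processor is created from the DataSource. A minimal sketch of the wiring (assuming "audioBytes" was already read out of the AudioInputStream):
    import javax.media.*;
    import javax.media.protocol.DataSource;
    public class WireUp {
        public static Processor buildProcessor(byte[] audioBytes) throws Exception {
            LiveAudioStream stream = new LiveAudioStream(audioBytes);
            DataSource ds = new CustomDataSource(stream);
            ds.connect();
            return Manager.createProcessor(ds); // hand this to AVTransmit2-style code
        }
    }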

  • Use FMS & Actionscript to Stream Audio/Video and control (powerpoint) slides

    Hi there,
    My company provides webcasting services to numerous clients, and we currently use Windows Media Encoder as our base software for these projects.
    However, I'm interested in switching to Flash Media Server, but I need to figure out one issue: how to implement it the same way we currently use Windows Media Encoder.
    We need a way to stream out a live audio/video feed AND have a way to control a presenter's POWERPOINT slides, advancing them on our command while streaming live.
    Basically, the way we do it now: we stream audio/video from WME and also use the encoder's SCRIPTING functions to send a "url" from the encoder to the page in which the viewer is watching the live embedded stream. On that page is an embedded/internal frame which we can target (using its id/name) so that the URL gets loaded there. This actually advances the slides (or rather pulls up a pre-created HTML page of the next slide in the presenter's PowerPoint, which is a screenshot or exported image of that slide embedded within a specific HTML file and hosted on the same server as the embedded player/HTML file).
    Would there be a way to send out this same type of script using FMS? I've been playing around with it and notice the interface itself does not have a built-in scripting function like Windows Media Encoder does. However, I did read somewhere that it supports ActionScript... but I have no clue how to implement this feature.
    I am just curious how I might go about doing this. Is there a way to use an [action]script with FMS and have it send out a URL to a specific frameset embedded on the same page as the Flash Media Encoder playback?
    Hopefully this makes sense and someone can point me in the right direction. Switching to FMS from Windows Media Encoder is the way I would love to go; I just need to figure out how this could be done.
    Would a better method be converting the powerpoint slides to a swf or flv file, embedding that file within the same page as the FMS playback, and somehow using a script to "advance" the frames/slides/scenes within that file?
    I also heard something about Flash "Shared Objects" and how this could be a way of controlling a swf file remotely.....
    Could anyone give me some advice and know a method that I could use to achieve this type of implementation?
    Any help is GREATLY appreciated!   Thanks!

    FMS (the interactive version, FMIS) has complete scripting support.
    What you want to do can be done with FMS, and there are a lot of ways you can approach it. Converting to .swf can be helpful, as it means you have a single resource to load rather than a bunch of individual slide images (the flashplayer can load .swf, jpg, gif, png). Rather than embedding, I'd opt to make it an external resource and load it at runtime... that would keep the player application reusable.
    To get an idea of how to use server side actionscript, see the getting started portion of the FMS 3.5 dev docs
    http://help.adobe.com/en_US/FlashMediaServer/3.5_Deving/
    Then, see the SSAS docs for the following classes in particular:
    The Client class for Client.call()
    The Stream class for Stream.send()
    The SharedObject class
    The Application class for Application.broadcastMsg()
    The docs give some general overview for the usage of each. The best choice really depends on your deployment architecture and scalability requirements.

  • How To combine 2 PlayerPanel (Audio & Video) To only oneFrame

    How do I combine two PlayerPanels (audio & video) into only one Frame? Right now I run video conferencing over IP multicast and use two port numbers, one for audio and one for video... so when I receive, there are in fact two streams, right? And when I receive them I get two Frames (an audio Frame and a video Frame).
    That does not work properly; it may confuse the user, so I want to group them into one. How can I do that? Because the SSRC is created by a random method, the SSRC cannot be used to identify which streams belong together.
    NuT
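    A minimal sketch of one way to do it, assuming you already hold the realized audio and video Players: put both players' components into a single Frame instead of opening one PlayerWindow per stream. Matching the audio Player to the video Player still needs an out-of-band hint such as the participant's CNAME, since SSRCs are random:
    import java.awt.*;
    import javax.media.Player;
    public class CombinedWindow extends Frame {
        public CombinedWindow(Player video, Player audio) {
            super("Audio + Video");
            setLayout(new BorderLayout());
            Component vc = video.getVisualComponent();
            if (vc != null) add(vc, BorderLayout.CENTER);
            Component acc = audio.getControlPanelComponent(); // audio has no visual component
            if (acc != null) add(acc, BorderLayout.SOUTH);
            pack();
            setVisible(true);
        }
    }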

    Dear NuT, please send me the code for this video conferencing over LAN with the multicast feature; I am in great need of it urgently. The code you have already posted has a problem: the Transmitter class contains the code for the receiver class, while the Receiver class already contains the same code.
    Moreover, we also want to see the video we transmit on our own machine. Please send some suggestions regarding this too: we want to see two videos at a time, one our own and the other the streamed one.
    I know you must be a busy programmer, but I am compelled to ask you for help.
    Please send me the right code to my email, because that would make it easy for me. My email:
    <[email protected]>

  • Audio Video Conference application with JMF

    hi there!
    I am developing an "Audio Video Conference application". So far I've been able to develop an application which opens an RTP player on the client side. Now my purpose is to enhance the functionality of the RTP player by adding various functions like "Quit", "Pause" etc. I am looking for more options for enhancing the functionality of the RTP player in order to make it more user friendly. For example, I've been able to find the method for quitting the player, which is "KillThePlayer()"; it is found in the class "PlayerWindow".
    java.lang.Object -> java.awt.Component -> java.awt.Container -> java.awt.Window -> java.awt.Frame -> com.sun.media.ui.PlayerWindow -> RTPPlayerWindow
    Please suggest some classes and methods where I can find the method(s) for pausing the player, so that the player stops receiving data and, when un-paused, starts receiving again; also please suggest more functions to add to the RTP player to make it more user friendly.
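    A minimal sketch: JMF's own Controller API already covers most of this. stop() halts playback (note the RTP session keeps receiving underneath, so a true receive pause would have to go through the session manager), start() resumes, and close() is the "Quit" case:
    import javax.media.Player;
    public class PlayerControls {
        public static void pause(Player p) { p.stop(); }
        public static void resume(Player p) { p.start(); }
        public static void quit(Player p) { p.close(); }
    }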

    hi!
    thanx, I appreciate your interest! I know I am hitting the wrong forum, but I also posted my query in the JMF forum; I waited long but unfortunately no reply was posted to my request, so I thought why not try some other relevant forums.
    if there is any sort of reference available on my topic, please help me.
    thanx for your help,
    bbye!

  • Audio/Video Applet

    Hi,
    I'm starting with JMF and I'd like to develop an audio/video applet with JMF, which seems to be the best solution for my project.
    My project must capture the audio/video device and display the video in an applet. It must work over the Internet.
    I have installed JMF on my machine and I'm working with JDK 1.1.
    I have tried:
    - the AVTransmit2 and AVReceive2 java.sun examples, but they don't work.
    - then the RTPPlayerApplet -> same trouble.
    - and AVTransmit3 and AVReceive3 -> not better.
    - SimplePlayerApplet -> idem.
    - RTPConnector -> :(
    I think I haven't understood how to make those examples work. Maybe it's because of my poor English :(
    So I don't know whether I should keep trying to make one of these examples work or find another solution :(
    Could somebody tell me if my project can be solved with one of the java.sun technologies?
    Could somebody explain to me how to make these examples work?
    Thanks in advance.
    - Christel

    hi,
    1st example:
    The building of the .class files is correct for AVTransmit2 and AVReceive2.
    But when I use them, AVTransmit2 works for 60 seconds:
    < java AVTransmit2 file:\C:\CAV\TransmitAV2\zidanevsleverkusen.mpeg 192.168.0.2 42050
    but AVReceive2 doesn't:
    < java AVReceive2 192.168.0.2 42050
    the error is: Failed to parse the session address given: 192.168.0.2
    Failed to initialize the session
    2nd example:
    And for the RTPPlayerApplet example, I get this during the build:
    < javac -classpath src -d classes src\RTPPlayerApplet.java
    the result is:
    < Note: src\RTPPlayerApplet.java uses or overrides a deprecated API.
    < Note: Recompile with -deprecation for details
    When I use this option I get 6 warnings like this one:
    < src\RTPPlayerApplet.java:282: warning: javax.media.rtp.SessionManager in javax.media.rtp
    < has been deprecated
    What is my mistake?
    What have I not understood?
    Thanks for help (in advance) ;D
    Christel

  • How can I display & split the audio & video from the same digitized clip?

    I digitized a scene into iMovie that I edited on a professional system which I don't have access to anymore. The whole scene is 1 clip. Now I see a few tweaks that I want to make, so I was hoping to do them in iMovie.
    I want to "pull up" the audio in one section - meaning I want to take cut about 20 frames of audio from the end of a shot, and then move all the other audio up to fill the hole. To compensate for the missing 20 frames, I'll cut video off the head of the next shot. Some call this prelapping. Some call it an L-cut. Some call it asymmetrical trimming. Either way, I can't figure out how to do it in iMovie.
    My clip appears in the timeline as one track - a single track that contains the video and 2 audio tracks. How can I display the audio that's tied to the video on its own track? Then I think I could split audio & video wherever I wanted and trim things up - but I can't figure out how to do it.
    Am I asking too much of this software?
    BTW, I never see the option to "Split audio clip at playhead". I'm not displaying clip volume or waveforms. Choosing to display waveforms doesn't show me anything. Maybe iMovie thinks I'd only want to see waveforms of audio that isn't tied to my video-and-audio clips?
    Thanks in advance for any help...

    Jordon,
    "Am I asking too much of this software?"
    No, you're not.
    You first want to select your clip(s) and choose Advanced>Extract Audio.
    This will copy the audio from the video clip and place it on one of the two separate audio tracks while lowering the audio level to zero in the original video track.
    You can now edit and move the audio independently of the video.
    With the audio clip selected, you'll find you now have access to Edit>Split Selected Audio Clip at Playhead.
    Matt

  • IDVD 08 audio/video sync drift toward end of DVD movie - fix/workaround?

    Anyone know how I can fix this issue? It's driving me crazy.
    I've just spent a few evenings in iMovie 08, editing together the video of my son's birth. It all looked amazingly good in iMovie. The "Large" .m4v file output by iMovie - shared with the Media Browser - plays perfectly in QuickTime too.
    However, after adding that shared movie to iDVD 08 and burning a DVD, I noticed that the audio sync drifts toward the end of the movie when played back. At the start of the DVD all seems well, but by the end of the ~1 hr 20 minute movie the audio is unbearably out of sync with the video. Completely unwatchable.
    After a little Googling I came across some reports that iDVD might not deal well with video that's been recorded with 12-bit audio. I've confirmed that my camcorder is (was!) set to 12-bit rather than 16-bit (the default, I'm sure). So if that's the problem I'm hosed... it's not like I can get a retake on that footage.
    Is there a reasonable workaround for this issue? I'm obviously unable to re-take the video (or the other 10 hours of DV video tape I've yet to use in an iMovie project). Ideally I'd like Apple to release a patch for this. Shouldn't it be possible for iDVD to account for audio that was originally recorded in 12-bit... even if it's just a checkbox you have to select to tip it off?
    Then again, perhaps this is totally unrelated to the 12-bit issue, and this is just a run of the mill bug?
    As I said... iMovie generates a MPEG4 video file of the Movie that looks/sounds perfect, so it seems this sync issue's being created somewhere in iDVD as it's rendering the audio/video to burn to the DVD...
    Thanks in advance.

    I've tried most of the options I can think of, and the audio sync drift issue continues on this ~1hr 12 minute iMovie/iDVD 08 project.
    A couple of promising ideas used QT Pro: exporting the audio track from the iMovie-generated MPEG-4 file as a WAV file, resampling to 48 kHz and 16-bit, then adding it back to the video track in the MPEG-4 file (having deleted the original audio track). In another test I also tried the Add to Selection and Scale option. Oddly enough, the Add to Selection and Scale attempt generated more sync issues than before in the final DVD movie... so I stuck with the MPEG-4 file containing the video track and the added WAV audio track resampled at 48 kHz and 16-bit.
    This was saved as a self-contained movie file, producing a QT .mov file. I moved this .mov file into the iDVD projects directory structure so that it was available in iDVD's Media Browser. I then reworked the iDVD project to use this QT large.mov file instead of the original iMovie large.m4v file.
    As a reminder... the original large.m4v file has the following properties:
    29.97 FPS H.264 Decoder, 960 x 540, Millions
    AAC, Stereo (L R), 44.100 kHz
    0:01:12:07.16 duration
    The resampled and saved QT MOV file has the following properties:
    29.97 FPS H.264 Decoder, 960 x 540, Millions
    16-bit Integer (Little Endian), Stereo, 48.000 kHz
    0:01:12:07.16 duration
    BOTH versions of this Movie file play flawlessly within QT Pro... there's no audio sync drift evident toward the end of the movie unless I use iDVD 08 to burn either movie file to a Physical DVD (or DVD Disk Image). In both cases the resultant DVD Movie exhibits significant audio sync drift as the ~1hr 12 minute movie progresses. The sync goes out by ~ <= 1 second based on a rough visual assessment.
    I've not yet tried to burn either the original iMovie produced MPEG-4 video file or my QT Pro tweaked audio track .MOV file version to a DVD on my PC...
    ... But I have used an application called Burn (referenced in this earlier/archived iDVD sync drift forum post: http://discussions.apple.com/thread.jspa?threadID=1332100 by user F Shippey - http://burn-osx.sourceforge.net/) to take the QT Pro tweaked .mov file version of the original MPEG-4 file and burn it to a DVD...
    Results:
    The DVD burnt with the Burn app plays with NO audio sync issues!... even though it was generated from the same audio-tweaked (48 kHz/16-bit resampled) QT Pro .mov file.
    However, whenever I burn a DVD using iDVD 08 the audio sync gradually drifts. It matters not whether I use the original iMovie 08 generated MPEG-4 movie file, or the QT Pro version of that file with the resampled audio track (the same one that works fine with Burn).
    It sure looks to me like iDVD has a problem... Again, refer to http://discussions.apple.com/thread.jspa?threadID=1332100 for someone else who seems to have run into similar challenges.
    Am I overlooking something?
    Thanks in advance.

  • How do I connect an iPhone 5S or 6 with audio video cables?

    How do I connect an iPhone 5S or 6 with audio video cables? Our van only has these connections. Apple did have a cable that worked with the 4/4S, but it does not work with an adapter on the 5S. I want audio only.

    I thought I probably was.
    Something like this should work then, right?
    Belkin Y Audio Cable (12 foot) https://www.amazon.com/dp/B000067RBT/ref=cm_sw_r_awd_fYWzub11XY5MQ
    OR
    eForCity 3.5mm to RCA Auxiliary Cable Cord for iPod/MP3 https://www.amazon.com/dp/B0042DIZTE/ref=cm_sw_r_awd_R0Wzub1X33PD9
    thx SO much )

  • Audio / video sync issue in playback

    for every recording i've made, when i play back the archive there is a noticeable lack of sync between the audio and the person's mouth. the video seems to lag behind by almost a second. the good news, at least, is that the delta is constant.
    has anyone else experienced this? it's happening in any scenario i try, even if i'm just recording myself.
    is the LCCS dev team aware of this? since audio and video streams are recorded separately, is there any way for me to programmatically 'nudge' the streams into sync?
    it's currently a show stopper; it looks really bad when the sync is so obviously absent.

    finally getting back to this..
    i did as you suggested and recorded a video of myself using the recording and playback examples provided with the SDK (the 2.2 newest version btw).
    the de-syncing of audio and video is definitely there.  it seems especially bad for longer recordings.  however, an interesting thing i discovered in playback:
    - sometimes, if i drag the playhead somewhere in the recording, pause, and then hit play, video and audio fall into sync. 
    - if i start playback from the beginning of the recording, audio/video is always out of sync.
    i understand that the video and audio streams are recorded separately and thus played back separately.  my guess is that when those streams are retrieved and playback begins, one starts before/after the other because they're not retrieved at exactly the same time.  it seems some extra work is required to ensure that their playback commences at exactly the same timestamp within each stream.
    i can provide my recordings as an example if required.
    this is a total showstopper for me.  it renders the entire recording service useless if the playback can't have sync between audio and video.  i would be happy to help with testing this more.  can someone from the LCCS team comment on this issue?
    thanks
    adam
