RTP Transmission, Receive Woes

Hi ppl,
I developed an app that transmits captured video and audio as RTP streams. I built a monitor of the captured video and audio, which is displayed on the frame, and I also built a player for the RTP data received from another system on the network that transmits its data as RTP streams.
My problem is that as soon as the player is realized and started, the other components in the window (a JList, JButton controls, and so on) become invisible, and the JFrame on which the monitor and player are placed also becomes unresponsive. The monitor still displays the video/audio being captured, and the player still plays the video/audio received from the network.
Can anybody offer some help on this? Please ...

Hi!
You are probably running into the problem of mixing lightweight components (the Swing components) with heavyweight components (the AWT components returned by the getVisual...() methods). The heavyweight components do not layer properly with the lightweight ones, which is why the rest of the window stops drawing. For more background, look for the article titled something like "Heavy weight and light weight components" on this site.
I'm not completely sure, but this might point you in the right direction.
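For example (just a sketch I have not tested, and the class and variable names are mine), forcing Swing's popups and tooltips to be heavyweight and adding the player's visual component as the plain AWT component it is sometimes helps:

    import java.awt.BorderLayout;
    import java.awt.Component;
    import javax.media.Player;
    import javax.swing.JFrame;
    import javax.swing.JPopupMenu;
    import javax.swing.ToolTipManager;

    public class PlayerFrame extends JFrame {
        public PlayerFrame(Player realizedPlayer) {
            // Swing popups and tooltips are lightweight by default and can be
            // hidden behind the heavyweight AWT component that JMF returns.
            JPopupMenu.setDefaultLightWeightPopupEnabled(false);
            ToolTipManager.sharedInstance().setLightWeightPopupEnabled(false);

            Component video = realizedPlayer.getVisualComponent(); // heavyweight AWT component
            if (video != null) {
                getContentPane().add(video, BorderLayout.CENTER);
            }
            pack();
        }
    }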

Similar Messages

  • RTP transmission in linux

    I am using Linux for JMF RTP transmission, but the RTP data is not being transmitted. Can anyone suggest what the problem might be? On Windows it works fine. Is there any special configuration to be done on the Linux side for RTP transmission? I am using the Linux performance pack of JMF on the Linux machine, and I also tried the cross-platform pack. Thanks in advance...

    We finally did resolve this issue!!!!
    The thing is that Linux behaves very differently from Windows when you create datagram or multicast sockets.
    Windows assumes that localhost maps to the machine's LAN address, while Linux does not: InetAddress.getLocalHost() often returns the LAN IP address on Windows but the loopback address on Linux. So we bind the data socket to an explicit local address instead:
         // Bind the RTP data socket to the machine's real LAN address...
         InetSocketAddress local = new InetSocketAddress(InetAddress.getByAddress(new byte[]{(byte)192,(byte)168,(byte)3,(byte)2}),port);
         dataSock.bind(local);
         // ...and connect it to the destination address and port.
         InetAddress destiny = InetAddress.getByAddress(new byte[]{(byte)224,(byte)1,(byte)1,(byte)0});
         dataSock2.connect(destiny, 9000);
    When you want to receive multicast packets you also need to specify the network interface through which you join the multicast group. If you don't, it just uses InetAddress.getLocalHost() as the address for reaching the group, and on Linux that simply does not work:
         MulticastSocket socket = new MulticastSocket(4446);
         InetAddress group = InetAddress.getByName("230.0.0.1");
         // Look up the NetworkInterface that owns the LAN address...
         InetAddress address = InetAddress.getByName("192.168.3.2");
         NetworkInterface interf = NetworkInterface.getByInetAddress(address);
         // ...and join the group explicitly on that interface.
         SocketAddress socketAddress = new InetSocketAddress(group, 4446);
         socket.joinGroup(socketAddress, interf);
    We are still working on making it work with broadcast addresses.
    Greetings!
    Nico and Charly

  • Help! How to change the video frame size during RTP transmission

    Hi there.
    In a RTP transmission:
    How can I change the size of the frames of the video while I have the processor in the server running?
    i.e. while the server is running, I want to change the video from 400x200 to 200x100.
    Thanks

    I am trying to implement the same functionality. I've tried to reset the format using the trackControl.setFormat method, but this only appears to be supported while the processor is in the Configured state. I'm still investigating, but if you happen to know a solution please let me know.
    Here is what I'm trying to do... In the middle of a session, I'm trying to change the video format from h263/rtp:352x288 to h263/rtp:176x144
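    A rough sketch of the only workaround I can see so far (completely untested; it rebuilds the Processor with the new size rather than resizing it on the fly, and the 176x144 value just mirrors my case):

        import java.awt.Dimension;
        import javax.media.Format;
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Processor;
        import javax.media.control.TrackControl;
        import javax.media.format.VideoFormat;
        import javax.media.protocol.ContentDescriptor;

        class ResizeByRebuild {
            static Processor buildProcessor(MediaLocator src, Dimension size) throws Exception {
                Processor p = Manager.createProcessor(Manager.createDataSource(src));
                p.configure();
                // ... block here until the Processor reports the Configured state ...
                p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
                TrackControl[] tracks = p.getTrackControls();
                for (int i = 0; i < tracks.length; i++) {
                    if (tracks[i].getFormat() instanceof VideoFormat) {
                        // setFormat is only honored while the Processor is Configured.
                        tracks[i].setFormat(new VideoFormat(VideoFormat.H263_RTP, size,
                                Format.NOT_SPECIFIED, Format.byteArray, Format.NOT_SPECIFIED));
                    }
                }
                p.realize();
                // ... block until Realized, then start() and hand getDataOutput() to the RTP sender ...
                return p;
            }
        }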

  • RTP Transmission over HTTPS (Internet)

    Hello people!
    I am currently working on my undergraduate thesis. I need to transmit the feed from a video capture device (i.e. a webcam) and also from an audio capture device towards a client in a secure transmission/session. I am already able to capture the feed from the camera and from the microphone and I also am able to transmit it in a nonsecure transmission/session.
    I am stuck on transmitting the feed securely. I tried implementing my own Effect, overriding its process method to encrypt the data, but that breaks the RTP transmission: the RTP session only accepts JPEG_RTP and H263_RTP. I have already abandoned that approach and am now looking for another way, in particular transmitting over HTTPS (across the Internet).
    Could somebody please explain how to transmit my RTP session over HTTPS? Some sample code would really help...
    Thanks a lot!!

    I think you might have the wrong forum. This is a forum about Forte 4GL.
    ka

  • How can I test a RTP(send&receive) application on the same computer?

    I am writing an application that lets one computer (A) send a media file to another computer (B), which receives and plays it.
    But how can I test this application on the same computer?
    E.g. Transmitter: localhost 43000
    Receiver: localhost 43000
    It returns an error, since the port number is already in use by the other program.
    I know it would be fine on two different computers, but how can I test it on a single machine?

    JMStudio sends RTP packets from the same UDP port as the one specified as the destination port, so you can't test it on a single computer. The examples provided at http://java.sun.com/products/java-media/jmf/2.1.1/solutions/index.html follow the same rule, but it's easy to modify them:
    rtpMgrs[streamid] = RTPManager.newInstance();
    // By default the examples use remoteDataPort here instead of localDataPort.
    // You can set localDataPort to any free port number, or to
    // SessionAddress.ANY_PORT if you don't care which port is used.
    localAddr = new SessionAddress(InetAddress.getLocalHost(), localDataPort);
    remoteAddr = new SessionAddress(destIpAddr, remoteDataPort);
    rtpMgrs[streamid].initialize(localAddr);
    rtpMgrs[streamid].addTarget(remoteAddr);
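    Putting it together, a minimal loopback setup looks something like this (a sketch only; the class and variable names are mine):

        import java.net.InetAddress;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SessionAddress;

        class LoopbackSender {
            // Sender half of a single-machine test: bind to ANY free local port so the
            // receiver can still bind the "real" port (e.g. 43000) on the same host.
            static RTPManager newSender(int receiverPort) throws Exception {
                InetAddress local = InetAddress.getLocalHost();
                RTPManager mgr = RTPManager.newInstance();
                mgr.initialize(new SessionAddress(local, SessionAddress.ANY_PORT));
                mgr.addTarget(new SessionAddress(local, receiverPort));
                return mgr;
            }
        }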

  • To retransmit the rtp streams received

    hi !
    Program A transmits media data to program B using RTPSessionMgr. B has to forward the received streams to another receiver, program C, which then plays them. In other words, I am trying to implement a client/router/server model.
    So I maintain a session between A and B and another between B and C. When a NewReceiveStreamEvent occurs at B, it retrieves the DataSource from the stream and passes it to the session manager's createSendStream() method to send it on to C.
    My problem is that C receives the audio and video streams but never plays them: I get a pink screen for the video and no audio.
    Can somebody point out my mistake, or suggest another strategy for implementing this?
    I did previously try using plain datagram sockets on the router side, i.e. two sockets for B to receive the data, forwarded to C through two other sockets, and that did work. But then the client does not know the sender details, and I need to maintain the sender and receiver reports, which is why I moved to the session manager (RTPManager/RTPSessionMgr).
    Kindly help.
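    P.S. In case it clarifies what B is doing, the forwarding step looks roughly like this (a simplified sketch; class names changed, error handling stripped):

        import javax.media.protocol.DataSource;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.ReceiveStream;
        import javax.media.rtp.ReceiveStreamListener;
        import javax.media.rtp.SendStream;
        import javax.media.rtp.event.NewReceiveStreamEvent;
        import javax.media.rtp.event.ReceiveStreamEvent;

        class Forwarder implements ReceiveStreamListener {
            private final RTPManager toC;   // already initialized and targeted at C

            Forwarder(RTPManager toC) { this.toC = toC; }

            public void update(ReceiveStreamEvent evt) {
                if (evt instanceof NewReceiveStreamEvent) {
                    try {
                        ReceiveStream rs = ((NewReceiveStreamEvent) evt).getReceiveStream();
                        DataSource ds = rs.getDataSource();
                        // Hand the received DataSource straight to the B->C session.
                        SendStream out = toC.createSendStream(ds, 0);
                        out.start();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }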

    Hi all!
    Nice to meet you. I'm excited to have been given a project that integrates JMF and JXTA: it takes RTP streams from the JMF framework, sends them through the JXTA framework, and delivers them to any peer in certain groups for display. Some aspects of it are similar to yours, so would you mind adding me to your MSN contact list? (My MSN account is [email protected].)

  • The RTP transmission of a message is slow?

    Hi
    I use JMF 2.2.1e on Windows, transmitting a PCM .wav file over RTP.
    However, when a DNS server is configured in the network settings, it takes about 20 seconds before the sound starts to play. If I remove the DNS setting, or add the destination address to /etc/hosts, it plays immediately.
    I don't understand why, so is there any good way to handle this within the program itself, instead of relying on the workarounds above?
    The transmission destination is specified as follows:
    String url = "rtp://" + addr + ":" + port + "/audio";
    MediaLocator m = new MediaLocator(url);
    datasink = Manager.createDataSink(ds, m);
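    For reference, the whole send sequence is roughly the following (a sketch; the destination address and port are only examples):

        import javax.media.DataSink;
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Processor;
        import javax.media.protocol.DataSource;

        class RtpAudioSend {
            static DataSink startRtpSink(Processor processor) throws Exception {
                DataSource out = processor.getDataOutput();  // RAW_RTP output of the send Processor
                // A numeric IP in the locator avoids a forward DNS lookup at this point,
                // although RTCP may still attempt reverse lookups elsewhere.
                MediaLocator dest = new MediaLocator("rtp://192.168.0.10:22224/audio");
                DataSink sink = Manager.createDataSink(out, dest);
                sink.open();
                sink.start();
                processor.start();
                return sink;
            }
        }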

    The dialog you are referring to is a "shell dialog".
    The most likely reason for this to take a long time is that the default location is taking a long time to resolve, as can happen if it's a network location or if the default folder doesn't exist.
    To reset the download location, reset browser.download.dir in the advanced configuration interface:
    1. Select Firefox's location bar (press CTRL+L or click it)
    2. Type about:config and press Enter
    3. Click "I'll be careful, I promise!"
    4. In the Filter box, type browser.download.dir
    5. Right-click the browser.download.dir preference below and choose Reset

  • RTP transmission problem.

    Hi,
    I developed an audio/video conferencing application that transmits the RTP data from a server to all clients. The problem I'm facing is that the system works perfectly on my home network, but when I try it on my university network it doesn't work.
    At home I use the address 192.168.0.255 (the subnet broadcast address) for the transmitting server, and the clients use the server's IP to access the data. In the university environment I change the IP address accordingly.
    What could be the cause of this problem? Is it likely to be my university's network settings?
    Please help me... thanks

    I've figured out my problem... My mistake was not closing the Player objects created by the receiver.
    I thought that closing the RTPManager would also close the associated Player objects, but that is not true.
    Each Player object needs to be closed explicitly...

  • It takes 3-5 seconds to start a RTP Transmission?

    Hi there,
    Has anyone experienced the above? I can't hear anything on the receiving end until about 3-5 seconds after the transmission has started.

    It could be a delay in identifying the right device to play the sound/media file.
    Do you have a streaming server installed to transmit the audio files, where:
    1. you have written a transmit program to send all the audio files, and
    2. you have a receiver program to receive them?
    If that is the case, can you elaborate on the streaming server details?
    rgds,
    seetesh

  • RTP Transmission always lose the last packet

    Hi,
    I transmit an audio file over RTP using JMStudio (v2.1.1e), then capture the RTP packets on the sending port, but the last RTP packet is missing every time.
    Several developers on my team tried this on their own machines and got the same result.
    Has anyone tried this before, or can anybody tell me how to solve it?
    Thanks a lot!

    Several thoughts...
    If the last "packet" isn't big enough, i.e. less than a full payload's worth of data, the underlying transport mechanism may keep it trapped in its buffer forever. If it sends packets based on a buffer-full event of some sort, then the last packet will never be sent because it stays stuck in the buffer. This is a big problem with TCP sockets, and in that instance the buffer is flushed automatically when a close operation is called on the socket.
    I have no idea how you're managing the sending of RTP packets, but that's one possible reason.
    Another possible reason would be how you're handling the EndOfMedia event. If you were to, say, shut the whole pipeline down before the final RTP packet got sent out, then you'd also not get the last packet. Pretty much everything in JMF spawns its own execution thread, so it's possible the main thread is shutting those other threads down before they're done working, with calls to whatever.stop() and whatever.close()...
    Those are just two ideas off the top of my head to explain why a last packet might not be sent out...
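    If it's the second case, the usual fix is to delay the teardown until the EndOfMediaEvent has actually arrived and the data has had a chance to drain. Something along these lines (a sketch only; the two-second delay is an arbitrary guess, and your stop/close calls may differ):

        import javax.media.ControllerEvent;
        import javax.media.ControllerListener;
        import javax.media.EndOfMediaEvent;
        import javax.media.Processor;

        class ShutdownOnEom implements ControllerListener {
            private final Processor p;

            ShutdownOnEom(Processor p) { this.p = p; }

            public void controllerUpdate(ControllerEvent ev) {
                if (ev instanceof EndOfMediaEvent) {
                    try {
                        // Crude flush delay so buffered RTP data can go out before teardown.
                        Thread.sleep(2000);
                    } catch (InterruptedException ignored) { }
                    p.stop();
                    p.close();
                }
            }
        }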

  • Re: RTP transmission problem

    Is it necessary to use the AVTransmit3 and RTPSocketAdapter classes for your project?
    If not, then for your requirement you can use AVTransmit2 and add a remote SessionAddress whenever you need to (see the sketch below).
    If you do want to use AVTransmit3 and RTPSocketAdapter, keep the createTransmitter code in a separate class and create an object whenever you want; just be careful to use the local DataSource...
    Karthikeyan R
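    For the AVTransmit2 route, adding a receiver mid-session is just an addTarget call; a minimal sketch (the address and port are made up):

        import java.net.InetAddress;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SessionAddress;

        class AddReceiver {
            static void addReceiver(RTPManager mgr) throws Exception {
                // Packets from the existing send streams now also go to this destination.
                SessionAddress extra = new SessionAddress(
                        InetAddress.getByName("192.168.1.50"), 42050);
                mgr.addTarget(extra);
            }
        }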

    I've figured out my problem... My mistake was not closing the Player objects created by the receiver.
    I thought that closing the RTPManager would also close the associated Player objects, but that is not true.
    Each Player object needs to be closed explicitly...

  • Is JMF required on the receiving side during RTP transmission

    Hi to all,
    I am new to JMF.
    I am transmitting media over RTP using AVTransmit2, but I don't want to use JMF on the receiving side.
    Can someone suggest how to do that?
    An early response would be a great help to me.

    rohit_rdx wrote: I am transmitting media over RTP using AVTransmit2, but I don't want to use JMF on the receiving side.
    By 'I' I assume you actually mean you do not want to require your end user to install JMF?
    Note that it is easy (for the user) if you (the developer) deploy JMF* using Web Start. Installing JMF can then be as easy as 'one click' for some types of deployments** for most users***.
    * JMF, or an application that uses JMF.
    ** E.g. if you have a sandboxed app that suggests desktop shortcuts and menu items, the user will be prompted with dialogs - no longer a 'one click' install.
    *** If the user has a suitable minimum version of Java to run the application; otherwise the Java plug-in might prompt them to update their Java.

  • JMF 1.0 supports RTP receive

    Hi all,
    I thought JMF 1.0 only supported playback of media from a network source, but then I came across this PowerPoint which says that it supports RTP, receive only. I am confused: does this mean a JMF 1.0 client player can receive and play real-time media but cannot transmit RTP data? I searched a lot on this topic and did not find much. I have read the JMF 2.0 spec but did not find anything related, and I haven't found the JMF 1.0 spec yet. I would really appreciate it if someone could explain this.
    https://www.research.ibm.com/haifa/p...eo/jmf_std.ppt
    Thanks,
    limbu
    Edited by: 878310 on Aug 9, 2011 8:25 AM
    Edited by: 878310 on Aug 9, 2011 8:26 AM

    The latest version of JMF, 2.1.1, supports both sending and receiving via RTP.
    The fact that you've said you can't find anything on the entire internet specifying that makes me think you're probably a troll...

  • JMF How to stream rtp from udp packet

    I implemented an RTSP client and use it to set up two RTP sessions (audio, video). But when I use the example program "AVReceive3" to receive the UDP packets, it doesn't work.
    When AVReceive3 receives a UDP packet on the RTP port, it calls the update(ReceiveStreamEvent evt) method, but the event type is StreamMappedEvent and the call to evt.getReceiveStream().getDataSource() returns null.
    I thought the first event should be NewReceiveStreamEvent. Please help me solve the problem.
    What kind of RTP packet causes a StreamMappedEvent?
    Following is the code of AVReceive3.java:
    /*
     * AVReceive3.java
     * Created on 2007-10-30, 4:11 PM
     * To change this template, choose Tools | Template Manager
     * and open the template in the editor.
     */
    package PlayerTest;
    import java.io.*;
    import java.awt.*;
    import java.net.*;
    import java.awt.event.*;
    import java.util.Vector;
    import javax.media.*;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.rtcp.*;
    import javax.media.protocol.*;
    import javax.media.protocol.DataSource;
    import javax.media.format.AudioFormat;
    import javax.media.format.VideoFormat;
    import javax.media.Format;
    import javax.media.format.FormatChangeEvent;
    import javax.media.control.BufferControl;
    /**
     * AVReceive3 to receive RTP transmission using the RTPConnector.
     */
    public class AVReceive3 extends Thread implements ReceiveStreamListener, SessionListener,
    ControllerListener
    String sessions[] = null;
    RTPManager mgrs[] = null;
    Vector playerWindows = null;
    boolean dataReceived = false;
    Object dataSync = new Object();
    public AVReceive3(String sessions[])
    this.sessions = sessions;
    public void run()
    initialize();
    public boolean initialize() {
    try {
    mgrs = new RTPManager[sessions.length];
    playerWindows = new Vector();
    SessionLabel session;
    // Open the RTP sessions.
    for (int i = 0; i < sessions.length; i++) {
    // Parse the session addresses.
    try {
    session = new SessionLabel(sessions[i]);
    } catch (IllegalArgumentException e) {
    System.err.println("Failed to parse the session address given: " + sessions[i]);
    return false;
    System.err.println(" - Open RTP session for: addr: " + session.addr + " port: " + session.port + " ttl: " + session.ttl);
    mgrs[i] = (RTPManager) RTPManager.newInstance();
    mgrs[i].addSessionListener(this);
    mgrs[i].addReceiveStreamListener(this);
    // Initialize the RTPManager with the RTPSocketAdapter
    mgrs[i].initialize(new RTPSocketAdapter(
    InetAddress.getByName(session.addr),
    session.port, session.ttl));
    // You can try out some other buffer size to see
    // if you can get better smoothness.
    BufferControl bc = (BufferControl)mgrs[i].getControl("javax.media.control.BufferControl");
    if (bc != null)
    bc.setBufferLength(350);
    } catch (Exception e){
    System.err.println("Cannot create the RTP Session: " + e.getMessage());
    return false;
    // Wait for data to arrive before moving on.
    long then = System.currentTimeMillis();
    long waitingPeriod = 30000; // wait for a maximum of 30 secs.
    try{
    synchronized (dataSync) {
    while (!dataReceived &&
    System.currentTimeMillis() - then < waitingPeriod) {
    if (!dataReceived)
    System.err.println(" - Waiting for RTP data to arrive");
    dataSync.wait(1000);
    } catch (Exception e) {}
    if (!dataReceived) {
    System.err.println("No RTP data was received.");
    close();
    return false;
    return true;
    public boolean isDone() {
    return playerWindows.size() == 0;
    * Close the players and the session managers.
    protected void close() {
    for (int i = 0; i < playerWindows.size(); i++) {
    try {
    ((PlayerWindow)playerWindows.elementAt(i)).close();
    } catch (Exception e) {}
    playerWindows.removeAllElements();
    // close the RTP session.
    for (int i = 0; i < mgrs.length; i++) {
    if (mgrs[i] != null) {
    mgrs[i].removeTargets( "Closing session from AVReceive3");
    mgrs[i].dispose();
    mgrs[i] = null;
    PlayerWindow find(Player p) {
    for (int i = 0; i < playerWindows.size(); i++) {
    PlayerWindow pw = (PlayerWindow)playerWindows.elementAt(i);
    if (pw.player == p)
    return pw;
    return null;
    PlayerWindow find(ReceiveStream strm) {
    for (int i = 0; i < playerWindows.size(); i++) {
    PlayerWindow pw = (PlayerWindow)playerWindows.elementAt(i);
    if (pw.stream == strm)
    return pw;
    return null;
    * SessionListener.
    public synchronized void update(SessionEvent evt) {
    if (evt instanceof NewParticipantEvent) {
    Participant p = ((NewParticipantEvent)evt).getParticipant();
    System.err.println(" - A new participant had just joined: " + p.getCNAME());
    * ReceiveStreamListener
    public synchronized void update( ReceiveStreamEvent evt) {
    System.out.println("\nReceive an receiveStreamEvent:"+evt.toString());
    RTPManager mgr = (RTPManager)evt.getSource();
    Participant participant = evt.getParticipant(); // could be null.
    ReceiveStream stream = evt.getReceiveStream(); // could be null.
    System.out.println("The RTPManager is:");
    if (evt instanceof RemotePayloadChangeEvent) {
    System.err.println(" - Received an RTP PayloadChangeEvent.");
    System.err.println("Sorry, cannot handle payload change.");
    System.exit(0);
    else if (evt instanceof NewReceiveStreamEvent) {
    try {
    stream = ((NewReceiveStreamEvent)evt).getReceiveStream();
    DataSource ds = stream.getDataSource();
    // Find out the formats.
    RTPControl ctl = (RTPControl)ds.getControl("javax.media.rtp.RTPControl");
    if (ctl != null){
    System.err.println(" - Recevied new RTP stream: " + ctl.getFormat());
    } else
    System.err.println(" - Recevied new RTP stream");
    if (participant == null)
    System.err.println(" The sender of this stream had yet to be identified.");
    else {
    System.err.println(" The stream comes from: " + participant.getCNAME());
    // create a player by passing datasource to the Media Manager
    Player p = javax.media.Manager.createPlayer(ds);
    if (p == null)
    return;
    p.addControllerListener(this);
    p.realize();
    PlayerWindow pw = new PlayerWindow(p, stream);
    playerWindows.addElement(pw);
    pw.setVisible(true);
    // Notify intialize() that a new stream had arrived.
    synchronized (dataSync) {
    dataReceived = true;
    dataSync.notifyAll();
    } catch (Exception e) {
    System.err.println("NewReceiveStreamEvent exception " + e.getMessage());
    return;
    else if (evt instanceof StreamMappedEvent) {
    if (stream != null)
    if(stream.getDataSource()!=null)
    DataSource ds = stream.getDataSource();
    // Find out the formats.
    RTPControl ctl = (RTPControl)ds.getControl("javax.media.rtp.RTPControl");
    System.err.println(" - The previously unidentified stream ");
    if (ctl != null)
    System.err.println(" " + ctl.getFormat());
    System.err.println(" had now been identified as sent by: " + participant.getCNAME());
    else if (evt instanceof ByeEvent) {
    System.err.println(" - Got \"bye\" from: " + participant.getCNAME());
    PlayerWindow pw = find(stream);
    if (pw != null) {
    pw.close();
    playerWindows.removeElement(pw);
    * ControllerListener for the Players.
    public synchronized void controllerUpdate(ControllerEvent ce) {
    Player p = (Player)ce.getSourceController();
    if (p == null)
    return;
    // Get this when the internal players are realized.
    if (ce instanceof RealizeCompleteEvent) {
    PlayerWindow pw = find(p);
    if (pw == null) {
    // Some strange happened.
    System.err.println("Internal error!");
    System.exit(-1);
    pw.initialize();
    pw.setVisible(true);
    p.start();
    if (ce instanceof ControllerErrorEvent) {
    p.removeControllerListener(this);
    PlayerWindow pw = find(p);
    if (pw != null) {
    pw.close();
    playerWindows.removeElement(pw);
    System.err.println("AVReceive3 internal error: " + ce);
    * A utility class to parse the session addresses.
    class SessionLabel {
    public String addr = null;
    public int port;
    public int ttl = 1;
    SessionLabel(String session) throws IllegalArgumentException {
    int off;
    String portStr = null, ttlStr = null;
    if (session != null && session.length() > 0) {
    while (session.length() > 1 && session.charAt(0) == '/')
    session = session.substring(1);
    // Now see if there's a addr specified.
    off = session.indexOf('/');
    if (off == -1) {
    if (!session.equals(""))
    addr = session;
    } else {
    addr = session.substring(0, off);
    session = session.substring(off + 1);
    // Now see if there's a port specified
    off = session.indexOf('/');
    if (off == -1) {
    if (!session.equals(""))
    portStr = session;
    } else {
    portStr = session.substring(0, off);
    session = session.substring(off + 1);
    // Now see if there's a ttl specified
    off = session.indexOf('/');
    if (off == -1) {
    if (!session.equals(""))
    ttlStr = session;
    } else {
    ttlStr = session.substring(0, off);
    if (addr == null)
    throw new IllegalArgumentException();
    if (portStr != null) {
    try {
    Integer integer = Integer.valueOf(portStr);
    if (integer != null)
    port = integer.intValue();
    } catch (Throwable t) {
    throw new IllegalArgumentException();
    } else
    throw new IllegalArgumentException();
    if (ttlStr != null) {
    try {
    Integer integer = Integer.valueOf(ttlStr);
    if (integer != null)
    ttl = integer.intValue();
    } catch (Throwable t) {
    throw new IllegalArgumentException();
    * GUI classes for the Player.
    class PlayerWindow extends Frame {
    Player player;
    ReceiveStream stream;
    PlayerWindow(Player p, ReceiveStream strm) {
    player = p;
    stream = strm;
    public void initialize() {
    add(new PlayerPanel(player));
    public void close() {
    player.close();
    setVisible(false);
    dispose();
    public void addNotify() {
    super.addNotify();
    pack();
    * GUI classes for the Player.
    class PlayerPanel extends Panel {
    Component vc, cc;
    PlayerPanel(Player p) {
    setLayout(new BorderLayout());
    if ((vc = p.getVisualComponent()) != null)
    add("Center", vc);
    if ((cc = p.getControlPanelComponent()) != null)
    add("South", cc);
    public Dimension getPreferredSize() {
    int w = 0, h = 0;
    if (vc != null) {
    Dimension size = vc.getPreferredSize();
    w = size.width;
    h = size.height;
    if (cc != null) {
    Dimension size = cc.getPreferredSize();
    if (w == 0)
    w = size.width;
    h += size.height;
    if (w < 160)
    w = 160;
    return new Dimension(w, h);
    public static void main(String argv[]) {
    //if (argv.length == 0)
    // prUsage();
    String sessions[]= new String[] {"127.0.0.1/6670","127.0.0.1/6672"};
    AVReceive3 avReceive = new AVReceive3(sessions);
    if (!avReceive.initialize()) {
    System.err.println("Failed to initialize the sessions.");
    System.exit(-1);
    // Check to see if AVReceive3 is done.
    try {
    while (!avReceive.isDone())
    Thread.sleep(1000);
    } catch (Exception e) {}
    System.err.println("Exiting AVReceive3");
    static void prUsage() {
    System.err.println("Usage: AVReceive3 <session> <session> ");
    System.err.println(" <session>: <address>/<port>/<ttl>");
    System.exit(0);
    }// end of AVReceive3
    Following is the code of RTPSocketAdapter.java:
    /*
     * RTPSocketAdapter.java
     * Created on 2007-10-30, 4:13 PM
     * To change this template, choose Tools | Template Manager
     * and open the template in the editor.
     */
    package PlayerTest;
    import java.io.IOException;
    import java.net.InetAddress;
    import java.net.DatagramSocket;
    import java.net.MulticastSocket;
    import java.net.DatagramPacket;
    import java.net.SocketException;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.PushSourceStream;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.SourceTransferHandler;
    import javax.media.rtp.RTPConnector;
    import javax.media.rtp.OutputDataStream;
    /**
     * An implementation of RTPConnector based on UDP sockets.
     */
    public class RTPSocketAdapter implements RTPConnector {
    DatagramSocket dataSock;
    DatagramSocket ctrlSock;
    InetAddress addr;
    int port;
    SockInputStream dataInStrm = null;
    SockInputStream ctrlInStrm = null;
    SockOutputStream dataOutStrm = null;
    SockOutputStream ctrlOutStrm = null;
    public RTPSocketAdapter(InetAddress addr, int port) throws IOException {
    this(addr, port, 1);
    public RTPSocketAdapter(InetAddress addr, int port, int ttl) throws IOException {
    try {
    if (addr.isMulticastAddress()) {
    dataSock = new MulticastSocket(port);
    ctrlSock = new MulticastSocket(port+1);
    ((MulticastSocket)dataSock).joinGroup(addr);
    ((MulticastSocket)dataSock).setTimeToLive(ttl);
    ((MulticastSocket)ctrlSock).joinGroup(addr);
    ((MulticastSocket)ctrlSock).setTimeToLive(ttl);
    } else {
    dataSock = new DatagramSocket(port, InetAddress.getLocalHost());
    ctrlSock = new DatagramSocket(port+1, InetAddress.getLocalHost());
    } catch (SocketException e) {
    throw new IOException(e.getMessage());
    this.addr = addr;
    this.port = port;
    * Returns an input stream to receive the RTP data.
    public PushSourceStream getDataInputStream() throws IOException {
    if (dataInStrm == null) {
    dataInStrm = new SockInputStream(dataSock, addr, port);
    dataInStrm.start();
    return dataInStrm;
    * Returns an output stream to send the RTP data.
    public OutputDataStream getDataOutputStream() throws IOException {
    if (dataOutStrm == null)
    dataOutStrm = new SockOutputStream(dataSock, addr, port);
    return dataOutStrm;
    * Returns an input stream to receive the RTCP data.
    public PushSourceStream getControlInputStream() throws IOException {
    if (ctrlInStrm == null) {
    ctrlInStrm = new SockInputStream(ctrlSock, addr, port+1);
    ctrlInStrm.start();
    return ctrlInStrm;
    * Returns an output stream to send the RTCP data.
    public OutputDataStream getControlOutputStream() throws IOException {
    if (ctrlOutStrm == null)
    ctrlOutStrm = new SockOutputStream(ctrlSock, addr, port+1);
    return ctrlOutStrm;
    * Close all the RTP, RTCP streams.
    public void close() {
    if (dataInStrm != null)
    dataInStrm.kill();
    if (ctrlInStrm != null)
    ctrlInStrm.kill();
    dataSock.close();
    ctrlSock.close();
    * Set the receive buffer size of the RTP data channel.
    * This is only a hint to the implementation. The actual implementation
    * may not be able to do anything to this.
    public void setReceiveBufferSize( int size) throws IOException {
    dataSock.setReceiveBufferSize(size);
    * Get the receive buffer size set on the RTP data channel.
    * Return -1 if the receive buffer size is not applicable for
    * the implementation.
    public int getReceiveBufferSize() {
    try {
    return dataSock.getReceiveBufferSize();
    } catch (Exception e) {
    return -1;
    * Set the send buffer size of the RTP data channel.
    * This is only a hint to the implementation. The actual implementation
    * may not be able to do anything to this.
    public void setSendBufferSize( int size) throws IOException {
    dataSock.setSendBufferSize(size);
    * Get the send buffer size set on the RTP data channel.
    * Return -1 if the send buffer size is not applicable for
    * the implementation.
    public int getSendBufferSize() {
    try {
    return dataSock.getSendBufferSize();
    } catch (Exception e) {
    return -1;
    * Return the RTCP bandwidth fraction. This value is used to
    * initialize the RTPManager. Check RTPManager for more detauls.
    * Return -1 to use the default values.
    public double getRTCPBandwidthFraction() {
    return -1;
    * Return the RTCP sender bandwidth fraction. This value is used to
    * initialize the RTPManager. Check RTPManager for more detauls.
    * Return -1 to use the default values.
    public double getRTCPSenderBandwidthFraction() {
    return -1;
    * An inner class to implement an OutputDataStream based on UDP sockets.
    class SockOutputStream implements OutputDataStream {
    DatagramSocket sock;
    InetAddress addr;
    int port;
    public SockOutputStream(DatagramSocket sock, InetAddress addr, int port) {
    this.sock = sock;
    this.addr = addr;
    this.port = port;
    public int write(byte data[], int offset, int len) {
    try {
    sock.send(new DatagramPacket(data, offset, len, addr, port));
    } catch (Exception e) {
    return -1;
    return len;
    * An inner class to implement an PushSourceStream based on UDP sockets.
    class SockInputStream extends Thread implements PushSourceStream {
    DatagramSocket sock;
    InetAddress addr;
    int port;
    boolean done = false;
    boolean dataRead = false;
    SourceTransferHandler sth = null;
    public SockInputStream(DatagramSocket sock, InetAddress addr, int port) {
    this.sock = sock;
    this.addr = addr;
    this.port = port;
    public int read(byte buffer[], int offset, int length) {
    DatagramPacket p = new DatagramPacket(buffer, offset, length, addr, port);
    try {
    sock.receive(p);
    } catch (IOException e) {
    return -1;
    synchronized (this) {
    dataRead = true;
    notify();
    System.out.println("RTPSocketAdapter receive RTP packet from port:"+port);
    System.out.println("The received RTP packet:"+new String(buffer));
    return p.getLength();
    public synchronized void start() {
    super.start();
    if (sth != null) {
    dataRead = true;
    notify();
    public synchronized void kill() {
    done = true;
    notify();
    public int getMinimumTransferSize() {
    return 2 * 1024; // twice the MTU size, just to be safe.
    public synchronized void setTransferHandler(SourceTransferHandler sth) {
    this.sth = sth;
    dataRead = true;
    notify();
    // Not applicable.
    public ContentDescriptor getContentDescriptor() {
    return null;
    // Not applicable.
    public long getContentLength() {
    return LENGTH_UNKNOWN;
    // Not applicable.
    public boolean endOfStream() {
    return false;
    // Not applicable.
    public Object[] getControls() {
    return new Object[0];
    // Not applicable.
    public Object getControl(String type) {
    return null;
    * Loop and notify the transfer handler of new data.
    public void run() {
    while (!done) {
    synchronized (this) {
    while (!dataRead && !done) {
    try {
    wait();
    } catch (InterruptedException e) { }
    dataRead = false;
    if (sth != null && !done) {
    sth.transferData(this);
    Thanks.

    The error "No format has been registered for RTP Payload type 96" is caused by the dynamic payload mapping. Even after I add the mapping between the dynamic payload number and a format, the Player still doesn't work. I think that's because JMF doesn't support the formats of my clips. For example:
    video: a=rtpmap:96 H263-2000/90000
    audio: a=rtpmap:97 MP4A-LATM/12000/1
    Is there some available plugin to support these formats?
    Thanks
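    For the record, the way I register the dynamic payloads before calling initialize() is roughly this (the encoding strings are my guesses from the SDP lines above, so treat them as assumptions; and as I said, stock JMF probably has no decoder for MP4A-LATM anyway):

        import javax.media.Format;
        import javax.media.format.AudioFormat;
        import javax.media.format.VideoFormat;
        import javax.media.rtp.RTPManager;

        class DynamicPayloads {
            static void register(RTPManager mgr) {
                // Map payload 96 to a video format and 97 to an audio format.
                mgr.addFormat(new VideoFormat("h263-2000/rtp"), 96);
                mgr.addFormat(new AudioFormat("mp4a-latm/rtp", 12000, Format.NOT_SPECIFIED, 1), 97);
            }
        }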

  • Audio not being sent in RTP

    Hi,
    I'm developing a simple RTP application for capturing and transmitting live audio. I have used [AVTransmit2|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/AVTransmit2.java] on the sender side and [AVReceive2|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/AVReceive2.java] on the receiver side. Everything works fine when I transmit a wav file, but when I capture live audio from the mic and try to transmit it, there is no audio at the receiver side.
    I have used the following code to create a MediaLocator for the microphone.
     public static MediaLocator createLocator() {
          CaptureDeviceInfo captureDeviceInfo = null;
          // List every capture device JMF knows about (audio and video).
          Vector captureDevices = CaptureDeviceManager.getDeviceList(null);
          System.out.println(
                    "- number of capture devices: " + captureDevices.size());
          for (int i = 0; i < captureDevices.size(); i++) {
               captureDeviceInfo = (CaptureDeviceInfo) captureDevices.elementAt(i);
               System.out.println(
                         "    - name of the capture device: " + captureDeviceInfo.getName());
               Format[] formatArray = captureDeviceInfo.getFormats();
               for (int j = 0; j < formatArray.length; j++) {
                    Format format = formatArray[j];
                    if (format instanceof AudioFormat) {
                         System.out.println(
                                   "         - format accepted by this AUDIO device: "
                                   + format.toString().trim());
                    } else {
                         System.out.println("         - format of type UNKNOWN");
                    }
               }
          }
          // Return the locator of the last device enumerated (null if none were found).
          if (captureDeviceInfo != null) {
               MediaLocator audioLocator = captureDeviceInfo.getLocator();
               return audioLocator;
          }
          return null;
     }
     I've used this to initialize and start the transmitter:
     MediaLocator locator = createLocator();
     AVTransmit2 at = new AVTransmit2(locator, "10.0.0.226", "42050");
     String result = at.start();
    Is this the correct way to capture live audio? Please advise me if my approach is wrong.
    At the receiver side, I get the notification that a new participant has joined. Then it keeps waiting for the RTP packets to arrive and finally times out:
    - Waiting for RTP data to arrive...
    - Waiting for RTP data to arrive...
    I have not made any changes to AVTransmit2 or AVReceive2. Both systems are Intel Macs running Snow Leopard. I tried another Java application to capture live audio and it works fine, but I don't know how to create a DataSource or MediaLocator out of it. :(
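    For completeness, another way to pick the capture device would be to ask CaptureDeviceManager for one matching a specific audio format rather than taking the last entry of the unfiltered list; a sketch (the 44100/16/2 format is an assumption):

        import java.util.Vector;
        import javax.media.CaptureDeviceInfo;
        import javax.media.CaptureDeviceManager;
        import javax.media.MediaLocator;
        import javax.media.format.AudioFormat;

        class MicLocator {
            static MediaLocator find() {
                // Ask only for devices that can capture 44.1 kHz, 16-bit, stereo linear audio.
                AudioFormat wanted = new AudioFormat(AudioFormat.LINEAR, 44100, 16, 2);
                Vector devices = CaptureDeviceManager.getDeviceList(wanted);
                if (devices.isEmpty()) {
                    return null;   // no matching capture device
                }
                CaptureDeviceInfo info = (CaptureDeviceInfo) devices.elementAt(0);
                return info.getLocator();   // e.g. javasound://44100
            }
        }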

    Thanks for replying, captfoss. :)
    Actually, in the code for getting the media locator I put in some extra redundant lines just to print out the list of formats supported by the devices; other than that they don't do anything useful. On this Mac, javasound://44100 is the only device detected, so I use it directly (the first device).
    I have tried javasound://0 as well, but still no luck. Any wav file is transmitted and received successfully. Could it be related to the OS? I searched here and read that there might be issues with the Mac JVM not working as expected with capture devices. I have no idea how to proceed. Are there any workarounds? I mean, I have used code from another open-source Java project for RTP transmission: it captures audio with a TargetDataLine and a PipedOutputStream, does its own encoding to ULAW, and sends it. That works fine with AVReceive2; it can't identify the stream owner, but there is still a voice path. Can I create a custom DataSource from it?
    Sorry for asking so many questions, but I'm a little bit excited as well as disappointed...
