RTP transmission problems

I wrote an application that sends a video stream to another PC. The RTP transmission code follows the AVTransmit sample, and it works well. But when I want to close or shut down my RTP transmission, I have a big problem: how can I close it completely without quitting the application? My close method looks like this:
public void stopTrans() {
    // synchronized (this)
    if (processor != null) {
        processor.stop();
        processor.close();
        processor = null;
    }
    for (int i = 0; i < rtpMgrs.length; i++) {
        rtpMgrs[i].removeTargets("Session ended.");
        rtpMgrs[i].dispose();
        rtpMgrs[i] = null;
    }
    rtpMgrs = null;
}
When I press the send button to start a new transmission, I get this exception:
Can not create RTP session: javax.media.rtp.InvalidSessionAddressException: Can't open local data port: 3000
Port 3000 is the port number I used last time. So creating the transmission again on the same port fails. Why? How can I release the port from an RTP transmission? Please help me. Thank you very much.
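Summing up the cleanup steps discussed in this thread, a teardown sketch might look like the following. It assumes the same fields as the post above (processor, rtpMgrs) plus a list of the SendStreams created for each session; the helper class and parameter names are illustrative, not from the AVTransmit sample. Replies further down this thread suggest that leaving SendStreams or Players open is what keeps the port bound.

```java
// Sketch of a fuller RTP teardown. Assumes the caller kept references to
// the Processor, the RTPManagers, and every SendStream it created.
// Class and parameter names are illustrative.
import java.util.List;
import javax.media.Processor;
import javax.media.rtp.RTPManager;
import javax.media.rtp.SendStream;

class Teardown {
    static void stopTrans(Processor processor, RTPManager[] rtpMgrs,
                          List<SendStream> sendStreams) {
        try {
            // Stop and close each send stream before touching the managers.
            for (SendStream s : sendStreams) {
                s.stop();
                s.close();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (processor != null) {
            processor.stop();
            processor.close();
        }
        for (int i = 0; i < rtpMgrs.length; i++) {
            rtpMgrs[i].removeTargets("Session ended.");
            rtpMgrs[i].dispose(); // dispose() is what should free the UDP ports
        }
    }
}
```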

I have exactly the same problem as Rick.T, that is, I can't use the same port a second time. I get the message "can't open local data port xxxxx".
For every unicast session I use 4 ports per client: 2 to transmit (video and audio) and 2 to receive (video and audio). So it doesn't make sense to use new ports every time.
I have tried closing/stopping/deallocating the Processor, stopping/disconnecting the DataSource, stopping/closing the SendStream, and of course removeTargets/dispose on the RTPManager. But I still face the same problem.
It's the very final stage of my project and I don't know what else to try.
I have searched for other similar topics here, but there is no answer to this problem.
Is there anyone who has a solution to release the ports?
Thanks in advance

Similar Messages

  • RTP transmitting problem

    I am having a problem.
    I read in the JMF API that there are three ways to transmit voice over the network:
    1. Use a MediaLocator that has the parameters of the RTP session to construct an RTP DataSink by calling Manager.createDataSink.
    2. Use a session manager to create send streams for the content and control the transmission.
    3. Through sockets.
    I got the code for the first method (the DataSink one), but I don't know how to build the string it needs,
    that is, the URL:
    // Hand this DataSource to the Manager to create an RTP
    // DataSink; our RTP DataSink will multicast the audio.
    try {
        String url = "rtp://224.144.251.104:49150/audio/1";
        MediaLocator m = new MediaLocator(url);
        DataSink d = Manager.createDataSink(ds, m);
        d.open();
        d.start();
    } catch (Exception e) {
        System.exit(-1);
    }

    I've faced a similar problem; what I did was to use TWO session managers, one for the transmitter and the other for the receiver. I then used these objects in a container.
    Both peers use the same port and know each other's address.
    For receiving data from the peer I do:
    RTPManager mymgr = RTPManager.newInstance();
    // add as listener to create players for each incoming stream
    mymgr.addReceiveStreamListener(this);
    SessionAddress localaddr = new SessionAddress(InetAddress.getLocalHost(), port);
    SessionAddress sessaddr = new SessionAddress(peerAddress, port);
    mymgr.initialize(localaddr);
    mymgr.addTarget(sessaddr);
    For transmitting data to the peer I do:
    RTPManager mymgr = RTPManager.newInstance();
    mymgr.addReceiveStreamListener(this);
    SessionAddress localaddr = new SessionAddress(InetAddress.getLocalHost(), SessionAddress.ANY_PORT);
    SessionAddress sessaddr = new SessionAddress(peerAddress, port);
    mymgr.initialize(localaddr);
    mymgr.addTarget(sessaddr);
    // the processor is capturing audio from the mic
    processor.start();
    SendStream sendStream = mymgr.createSendStream(processor.getDataOutput(), i);
    sendStream.start();
    I probably forgot something when doing copy-paste from my code... but I think this might help you.
    Good luck
    Marco

  • RTP transmitting

    Hi, I'm now working on RTP transmission of video. However, most of the examples I've seen take their input only from the DOS command line. So I'm wondering if anybody could share code for making a dialog to input the values for the RTP transmission. Thanks a lot :)

    Converting Sun's sample programs to take their inputs from an input dialog box is very simple.
    You have to make the changes where the constructor (of AVReceive and AVTransmit) is called.
    Hope this helps !
    Max !

  • RTP transmission problem.

    Hi,
    I developed an audio/video conferencing application which transmits the RTP data from a server to all clients. The problem I'm facing is that the system works perfectly on my home network, but when I try it on my university network it doesn't work.
    I use the IP address 192.168.0.255 for the transmitting server, and the client uses the IP of the server to access the data. In the university environment, I change the IP address accordingly.
    What could be the cause of this problem? Is it perhaps my university's network settings?
    Please help me... thanks

    I've understood my problem... My error was not closing the Player objects created by the receiver.
    I thought that closing the RTPManager would close the associated Player objects, but this is not true.
    Each Player object needs to be closed.
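That fix can be sketched as follows, assuming the receiver keeps every Player it creates in a list (the class and field names are illustrative, the Player calls are JMF's):

```java
// Sketch: bookkeeping so every Player created by the receiver can be
// closed explicitly. Closing the RTPManager alone does not close them.
import java.util.ArrayList;
import java.util.List;
import javax.media.Player;

class ReceiverCleanup {
    private final List<Player> players = new ArrayList<Player>();

    // Call this from update(ReceiveStreamEvent) after creating a Player.
    void register(Player p) {
        players.add(p);
    }

    // Call this before disposing the RTPManager.
    void closeAll() {
        for (Player p : players) {
            p.stop();
            p.close(); // releases the renderer and its resources
        }
        players.clear();
    }
}
```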

  • RTP Transmission Problem

    Dear Friends
    I am working with JMF to transmit a video (.avi) file from one computer to another over a TCP/IP network. The computers are connected in a peer-to-peer architecture. When I run the transmission server code, it works and transmits (I verified it on the same computer), but on a remote computer (connected to the same network) the stream either arrives after a long time, and not fully, or the receiver just waits for data, so there is no transmission. I have to show a demo to my staff, so please help me, dear friends. If you have any questions or doubts regarding this problem, please contact me at [email protected] or post your replies here. I am waiting...
    with regards
    Muthukumar

    "when i run the transmission server code it works and transmits (i verified it in same computer)"
    I don't understand: how do you know it is transmitting over the network on the same computer?
    Are you using JMStudio?
    The file should be transmitted to the IP and port of the destination computer, and the destination computer should open its own IP and port.
    Alternatively, transmit to xxx.xxx.xxx.255 portNum and receive on xxx.xxx.xxx.255 portNum.

  • H263 RTP and JPEG RTP buffering problem

    Hi
    I have a problem with buffering a received video stream. I use the BufferControl of the RTPManager; for the audio stream everything works fine, but for video the BufferControl is null, so I can't set it (the same problem occurs with the Player or Processor). The audio stream has all its parameters (bitrate, sample rate, etc.) but video has only the encoding and size parameters.
    I tried to change the format (via a Processor) of the received stream to a supported format (for example JPEG), set all parameters (size, frame rate, max data length, encoding), and use the output DataSource, but the Player still can't get a BufferControl. When I use a Processor as a player, I can see in the media properties that everything is set the way I want, but the BufferControl is null.
    I think the H263_RTP and JPEG_RTP video formats don't support a frame rate (I tried setting it, but nothing happens), and the video is played as it comes.
    I need to set a buffer because the frame rate of the incoming stream is really low (it's a p2p application) and the smoothness of the video is bad.
    Please help me: where can I find any suggestions to solve this problem? Maybe I have to use a custom transport layer or something like that.

    cziz13 wrote:
    "I don't know what you want to say."
    I said exactly what I wanted to say.
    "i got buffercontrol on realized player (it's in the AudioBufferControl.java solution). I got the buffer even on a video file, but only when I play it from my HD. When I try to do it on a received video stream the buffer is not working."
    Good for you. But that wasn't my point.
    Whenever you request a "Player" object, you don't get a "Player" object; you get some specific "Player" subclass that, depending on what it's designed to do, will implement certain interfaces.
    The BufferControl object you're getting on the Player class that plays from the hard drive is more of a prefetch cache than a buffer (in the sense that I think of a buffer, at least). It's using a pull DataSource, so it can control getting data from its DataSource.
    However, the Player object that's playing an RTP stream relies on a push buffer DataSource. It has absolutely no control over the data; it's told by the DataSource when data is ready to be played... so it just loads when it's told to by the DataSource.
    And that DataSource is just handed data from the RTP streams inside the RTPManager, based on the internal RTP buffering that is handled by the RTPManager...
    "Any more suggestions?"
    My original "suggestion" still stands. Get your BufferControl object from the RTPManager that you're using to receive the stream...
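As a sketch of that suggestion (JMF-specific; the control-type string is the one used in the JMF buffering samples, and the buffer values here are illustrative):

```java
// Sketch: obtaining the BufferControl from the RTPManager itself rather
// than from the Player, as the reply above suggests.
import javax.media.control.BufferControl;
import javax.media.rtp.RTPManager;

class RtpBufferSetup {
    static void configure(RTPManager mgr) {
        BufferControl bc = (BufferControl) mgr.getControl(
                "javax.media.control.BufferControl");
        if (bc != null) {
            bc.setBufferLength(2000);    // buffer length in milliseconds
            bc.setMinimumThreshold(500); // wait until 500 ms is queued
        }
    }
}
```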

  • G723_RTP decoding from RTP stream problem

    Hi,
    I've managed to establish an RTP connection and send the audio stream in G.723 format. The stream is OK, but I can't decode the response from the Asterisk server. I use the same way to set the audio format for both sending and receiving:
    TrackControl[] tcs = processor.getTrackControls();
    if (tcs.length == 0) {
        return; // should not enter this case
    }
    TrackControl tc = tcs[0]; // we expect one channel only
    String codecType = SoftPhoneSettings.audio_codec;
    tc.setFormat(new AudioFormat(AudioFormat.G723_RTP, 8000.0, 16, 1, -1, -1, 192, -1.0, null));
    tc.setEnabled(true);
    If I use the gsm_rtp or ulaw_rtp codec it works just fine on both sides (also decoded correctly on my end), but with G723_RTP I hear no voice. I also get an EXCEPTION_ACCESS_VIOLATION on processor.close(), which likewise happens only with this codec.
    The Asterisk server is working fine, but I can't figure out what exactly the problem is, or why I can send correctly yet can't decode the response. I've also tried setCodecChain, but the situation is the same. Can anybody help me with this? I can't find the source of the problem and, more importantly, I still can't solve it. Thanks in advance.

    Is there any standard way to retrieve a G723_RTP stream, such as using a Player?

  • Video RTP transmission problem

    Hey guys (and girls),
    I'm writing two programs that send and receive an RTP video stream captured from a webcam. Using JMStudio I can capture and transmit the video and happily receive it with my client program, but when I try to capture and transmit the video using my server program, the client can't realize because it doesn't get any data.
    Here is my transmitter code; my server class creates a new instance of the camTransmitter class.
    import java.io.*;
    import java.util.Vector;
    import javax.media.*;
    import javax.media.control.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.rtp.*;
    import java.net.InetAddress;

    public class camTransmitter implements ControllerListener {
        Processor p = null;
        CaptureDeviceInfo di = null;
        boolean configured, realized = false;
        public String ipAddress = "192.168.123.150";
        public int port = 22222;
        public RTPManager rtpMgrs[];
        public DataSource vDS;
        public SessionAddress localAddr, destAddr;
        public InetAddress ipAddr;
        public SendStream sendStream;

        public camTransmitter() {
            // First, we'll need a DataSource that captures live video
            Format vFormat = new VideoFormat(VideoFormat.RGB);
            Vector devices = CaptureDeviceManager.getDeviceList(vFormat);
            if (devices.size() > 0) {
                di = (CaptureDeviceInfo) devices.elementAt(0);
            } else {
                // exit if we could not find the relevant capture device
                System.out.println("no devices");
                System.exit(-1);
            }
            MediaLocator camLocation = di.getLocator();
            // Create a processor for this capture device & exit if we cannot create it
            try {
                vDS = Manager.createDataSource(camLocation);
                p = Manager.createProcessor(vDS);
                p.addControllerListener(this);
            } catch (IOException e) {
                System.exit(-1);
            } catch (NoProcessorException e) {
                System.exit(-1);
            } catch (NoDataSourceException e) {
                System.exit(-1);
            }
            // At this point, we have successfully created the processor.
            // Configure it and block until it is configured.
            p.configure();
            while (configured != true) {}
            p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            TrackControl track[] = p.getTrackControls();
            boolean encodingOk = false;
            // Go through the tracks and try to program one of them
            // to output RTP data.
            for (int i = 0; i < track.length; i++) {
                if (!encodingOk && track[i] instanceof FormatControl) {
                    if (((FormatControl) track[i]).setFormat(vFormat) == null) {
                        track[i].setEnabled(false);
                    } else {
                        encodingOk = true;
                    }
                } else {
                    // we could not set this track, so disable it
                    track[i].setEnabled(false);
                }
            }
            // Realize it and block until it is realized.
            p.realize();
            while (realized != true) {}
            // Get the output DataSource of the processor and exit if we fail.
            DataSource ds = null;
            try {
                ds = p.getDataOutput();
            } catch (NotRealizedError e) {
                System.exit(-1);
            }
            PushBufferDataSource pbds = (PushBufferDataSource) ds;
            PushBufferStream pbss[] = pbds.getStreams();
            rtpMgrs = new RTPManager[pbss.length];
            for (int i = 0; i < pbss.length; i++) {
                try {
                    rtpMgrs[i] = RTPManager.newInstance();
                    ipAddr = InetAddress.getByName(ipAddress);
                    localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
                    destAddr = new SessionAddress(ipAddr, port);
                    System.out.println(localAddr);
                    rtpMgrs[i].initialize(destAddr);
                    rtpMgrs[i].addTarget(destAddr);
                    System.out.println("Created RTP session: " + ipAddress + " " + port);
                    sendStream = rtpMgrs[i].createSendStream(ds, i);
                    sendStream.start();
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                }
            }
            // System.out.println("RTP Stream created and running");
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("Configure Complete");
                configured = true;
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("Realizing complete");
                realized = true;
            }
        }
    }
    When I run this, the processor realizes and the RTP session seems to stream, but I have no idea why I can't receive the video.
    Please help if anyone can. Thanks
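Incidentally, the `while (configured != true) {}` loops in that code busy-wait on flags that aren't volatile, so they can burn CPU or never observe the update. A common alternative is to block on a monitor; a sketch under the same listener structure (class and field names here are illustrative, the event types are JMF's):

```java
// Sketch: blocking on a monitor until the Processor is configured,
// instead of spinning on a boolean flag.
import javax.media.ConfigureCompleteEvent;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.Processor;

class StateWaiter implements ControllerListener {
    private final Object lock = new Object();
    private boolean configured = false;

    void configureAndWait(Processor p) throws InterruptedException {
        p.addControllerListener(this);
        p.configure();
        synchronized (lock) {
            while (!configured) {
                lock.wait(); // released by controllerUpdate below
            }
        }
    }

    public void controllerUpdate(ControllerEvent e) {
        if (e instanceof ConfigureCompleteEvent) {
            synchronized (lock) {
                configured = true;
                lock.notifyAll();
            }
        }
    }
}
```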

    Dear andreyvk,
    I've read your post
    http://forum.java.sun.com/thread.jspa?threadID=785134&tstart=165
    about how to use a single RTP session for both media reception and transmission (I'm referring to your modified RTPSocketAdapter version), but at the moment I get a BIND error.
    I think your post is an EXCELLENT solution. I've modified AVReceive3 and AVTransmit3 to accept all parameters (local IP & port, remote IP & port).
    Can you please give me a simple scenario so I can understand my mistake?
    I run AVTransmit3 and AVReceive3 from different prompts. If I run these 2 classes at the same time on 2 different PCs (172.17.32.27 and 172.17.32.30), I can transmit the media (vfw://0, for example) using AVTransmit3, but I can't receive anything if I also run AVReceive3 on the same PC.
    What's the problem? Furthermore, if I first run AVReceive3 from an MS-DOS prompt and subsequently run AVTransmit3 from another prompt, I get a BIND error (port already in use).
    How can I use your modified RTPSocketAdapter to send and receive a single media stream on the same port (e.g. 7500)?
    I've used this scenario: PC1 with IP 172.17.32.30 and local port 5000, and PC2 with IP 172.17.32.27 and local port 10000.
    So on PC1 I run:
    AVTransmit3 vfw://0 <Local IP 172.17.32.30> <5000> <Remote IP 172.17.32.27> <10000>
    AVReceive3 <Local IP 172.17.32.30/5000> <Remote IP 172.17.32.27/10000>
    and on PC2:
    AVTransmit3 vfw://0 <Local IP 172.17.32.27> <10000> <Remote IP 172.17.32.30> <5000>
    AVReceive3 <Local IP 172.17.32.27/10000> <Remote IP 172.17.32.30/5000>
    I'd like to use the same port, 5000 (on PC1) and 10000 (on PC2), to both transmit and receive RTP packets. How can I do that without getting a bind error? How can I receive packets (and play the media, if audio and/or video) on the same port used to send the stream over the network?
    How can I obtain a symmetric RTP transmission/reception solution?
    Please give me a hint. If you can't post, this is my email: [email protected]

  • RTP session problem

    Hi. I'm using a custom DataSource for JMF to send audio over an RTPManager, using JavaSound to capture live audio from a microphone, while at the same time receiving audio from an RTP session using JMF. The code for the custom DataSource is one posted here a while ago, and it is working fine. The problem is that when I start sending audio, the incoming audio stops, although the sound is sent correctly to the other party. So if I don't send audio, I get the audio from the RTP session; if I send audio, I don't hear the incoming audio. I'm only talking about one side of the call, because the other party sends the RTP session through a SIP server connected to a regular phone line, and I don't know exactly how that works; I only know that it works. Does anyone know what could be wrong here? Could it be a buffer problem? Or a blocking device, or just an error in the code? Thanks in advance.

    Your question doesn't make a whole lot of sense, but I'd assume you've coded something incorrectly.

  • Maps transmitting problem via Ovi Suite

    Hi there, 
    I have a Nokia N78 with Nokia Ovi Maps 3.01. I am trying to upload maps from Ovi Suite. I started to upload the full Europe maps. After some time all maps had been downloaded and the upload started, but after about 570 MB the upload stopped, and after some time the phone disconnected (according to Map Loader). A similar problem occurs when uploading music. Any idea what could cause this? I also tried uploading only one country's map, but the same thing happens: the upload just freezes.

    If you are still able to reproduce this issue, could you please provide more information so that someone can help you solve your problem?
    A. Device details (type *#0000#)
    B. System Information details (Help > About Ovi Suite > System Information)
    C. Connection type (BT/USB), BT stack details

  • Re: JMF video transmitting problem, please get me out of this trouble

    Two suggestions:
    try running your server code on a different server (it could be a problem with the Java install),
    and second, change your transmit code back to transmitting for 60 seconds, then try refreshing Explorer.

    I think I had the same problem, but I stopped working on my project.
    I think the problem is that your JSP page can't create a new instance of your AVTx class each time the page is loaded; there must be only one instance.
    Is it possible that when you click IE refresh, a second instance of AVTx tries to run?
    The way I planned to solve mine was to create either DatagramSockets or Sockets and have AVTx listen on a socket; the JSP would then pass the IP and port numbers to AVTx using packets/sockets, and AVTx would parse those parameters and transmit.
    You would then be moving AVTx out of your servlet container (I used Tomcat).
    I had very strange problems trying to run the transmitter from within Tomcat.

  • Re: RTP transmission problem

    Is it necessary to use the AVTransmit3 and RTPSocketAdapter classes for your project?
    If not, for your requirement you can use AVTransmit2 and add a remote SessionAddress whenever you need one.
    If you want to use AVTransmit3 and RTPSocketAdapter, keep createTransmitter in a separate class and create an object whenever you want (say, when a new avatar is met); here you have to be careful to use the local DataSource...
    Karthikeyan R

    I've understood my problem... My error was not closing the Player objects created by the receiver.
    I thought that closing the RTPManager would close the associated Player objects, but this is not true.
    Each Player object needs to be closed.

  • RTP synchronization problem

    I tried to send a movie from the server to a client. I used RTPManager to receive the audio and video streams in the client applet. They work fine, and the client can receive both streams. However, the two streams are not synchronized: the time gap between them is around 2 seconds. How can I synchronize the audio and video streams?
    Thanks a lot.
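One JMF technique worth trying (a sketch, not a guaranteed fix for the 2-second gap) is to slave both Players to a single TimeBase and start them at a common time:

```java
// Sketch: driving the video Player from the audio Player's TimeBase so
// both advance on one clock, then starting them at a common media time.
// Both players are assumed to be prefetched before this is called.
import javax.media.IncompatibleTimeBaseException;
import javax.media.Player;
import javax.media.Time;

class AvSync {
    static void syncStart(Player audio, Player video)
            throws IncompatibleTimeBaseException {
        video.setTimeBase(audio.getTimeBase());
        // Give both players a small lead before the common start time.
        Time start = new Time(audio.getTimeBase().getTime().getSeconds() + 0.5);
        audio.syncStart(start);
        video.syncStart(start);
    }
}
```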

    Hi, I'm doing a project nearly the same as the ones you all mentioned, but I am facing a problem: my client side, which uses an applet, cannot find its own IP; the IP it gets is localhost/127.0.0.1. Because of this, I can't even load the player to connect to the server. Since you can already stream the video and audio, I'm wondering how you obtain the IP of the client side.
    p.s. The applet is hosted in the IIS of the server.

  • FM transmission problem when guitar plugged into GarageBand!

    Hello,
    I'm having a curious problem: when I plug my guitar into GarageBand, I pick up interference from a Turkish radio station... it's like my guitar is an antenna, and I don't know where it comes from. So when I play, I have the radio as a background sound, and even when I cut the volume, the radio keeps playing in the background.
    Can anyone help me??
    THANKS!

    My first suspect would be the guitar cable; make sure you're using a good-quality shielded cable.

  • Problem with the BufferControl of the RTPManager

    Hi
    When I receive a stream in an RTP session, I set the buffer length of the BufferControl.
    It works well, but when I play a DataSource from this stream, at the beginning it plays too fast (it plays everything it has in the buffer) and then slows down (playing at the rate the stream comes in).
    I want to set a big buffer and play at the same frame rate all the time (playing from the buffer without speeding up). I used the threshold, but the effect is the same.
    The problem is that (in a p2p application) when I want to retransmit a received stream from B to C (I clone the received DataSource), the frame rate is lower than the original DataSource's. So I want to store more in the buffer to play at the same rate. How can I prevent the player from speeding up?
    Could anyone help me?

    cziz13 wrote:
    "Any suggestion where i can find any solution?"
    My "guess" as to why there's the RTP acceleration problem is...
    In "real time", if you drop a frame, you have to wait for the next one to arrive. So there is an inherent "delay" where the dropped frame was supposed to go.
    In "buffer time", if you're missing a frame, playback goes immediately to the next frame without a delay where the dropped frame goes. If you drop a reasonable number of frames (say 5 out of every 30), you essentially get about a 20% speedup...
    So there is no solution based on the frame rate, because the player is playing at the correct frame rate... it's just not waiting during the dropped frames, because how would it know to do that?
    There are 2 "solutions" to this problem.
    The first:
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    Write a custom depacketizer for the video format that, essentially, duplicates the last received frame whenever a skip in the frame number is detected. So if frames 15 and 16 were dropped, you'd put 3 copies of frame 14 into the video. This ensures the video plays back at the correct speed.
    The second:
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/RTPConnector.html]
    Swap out the UDP socket in that example for a TCP socket. This ensures that no packets are dropped, and thus no frames are dropped, so your video won't be missing any data.
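The gap-filling arithmetic behind the first approach can be sketched independently of JMF: given the last frame emitted and the frame that just arrived, emit enough duplicates of the previous frame to cover the hole (the class and method names below are illustrative, not JMF API):

```java
// Computes how many extra copies of the LAST frame a depacketizer should
// emit before the newly arrived frame, so that dropped frames 15 and 16
// become two extra copies of frame 14. Names are illustrative.
public class GapFiller {
    public static int duplicatesNeeded(int lastEmitted, int justArrived) {
        int gap = justArrived - lastEmitted - 1; // frames that never arrived
        return Math.max(gap, 0);                 // out-of-order packets => 0
    }

    public static void main(String[] args) {
        // Frames 15 and 16 dropped: frame 14 was last, frame 17 arrives.
        System.out.println(duplicatesNeeded(14, 17)); // prints 2
    }
}
```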
