Re: RTP transmission problem

Is it necessary to use the AVTransmit3 and RTPSocketAdapter classes for your project?
If not, for your requirement you can use AVTransmit2 and add a remoteSessionAddress whenever you need one.
If you want to use AVTransmit3 and RTPSocketAdapter, keep createTransmitter in a separate class and create an object whenever you want (say, when a new avatar is met); here you have to be careful to use the local DataSource...
Karthikeyan R

I've understood my problem... My error was not closing the Player objects created by the Receiver...
I thought that by closing the RTPManager I would close the associated Player objects, but this is not true...
Each Player object needs to be closed...
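
The cleanup described above can be sketched schematically. The types below are stand-ins (hypothetical; in real JMF the players are javax.media.Player instances created in the ReceiveStreamListener.update() callback), but the pattern is the same: keep a list of every player you create, and close each one on shutdown.

```java
import java.util.ArrayList;
import java.util.List;

public class ReceiverCleanup {
    // Stand-in for javax.media.Player; only close() matters for this sketch.
    interface Player {
        void close();
    }

    private final List<Player> players = new ArrayList<>();

    // Call this wherever a new incoming stream gets a player created for it.
    void onNewStream(Player p) {
        players.add(p);
    }

    // Disposing the RTPManager does NOT close these players;
    // each one must be closed explicitly.
    void shutdown() {
        for (Player p : players) {
            p.close();
        }
        players.clear();
        // ...then removeTargets()/dispose() the RTPManager itself.
    }
}
```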

Similar Messages

  • RTP transmission problem.

    Hi,
    I developed an audio/video conferencing application which transmits RTP data from a server to all clients. The problem I'm facing is that the system works perfectly on my home network, but when I try it on my university network it doesn't work.
    I use the IP address 192.168.0.255 for the transmitting server, and the client uses the server's IP to access the data. In the university environment, I change the IP address accordingly.
    What could be the cause of this problem? Is it perhaps because of my university network settings?
    Please help me... thanks
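
    192.168.0.255 is a directed broadcast address, and many campus networks filter broadcast traffic between subnets, which could explain the difference from the home network. Outside JMF, a plain-UDP sketch shows the one OS-level switch a sender needs for such addresses (the address itself being the hypothetical one from the post):

```java
import java.net.DatagramSocket;

public class BroadcastCheck {
    // A socket must opt in before the OS will let it send to a
    // directed-broadcast address such as 192.168.0.255.
    public static DatagramSocket openBroadcastSocket() throws Exception {
        DatagramSocket sock = new DatagramSocket();
        sock.setBroadcast(true);
        return sock;
    }
}
```

    Even with SO_BROADCAST set on both ends, routers usually drop directed broadcasts, so checking with the network administrators is worthwhile.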

  • RTP Transmission over HTTPS (Internet)

    Hello people!
    I am currently working on my undergraduate thesis. I need to transmit the feed from a video capture device (i.e. a webcam) and also from an audio capture device towards a client in a secure transmission/session. I am already able to capture the feed from the camera and from the microphone and I also am able to transmit it in a nonsecure transmission/session.
    I am stuck with the problem of transmitting the feed securely. I've tried implementing my own Effect, overriding the process method of Effect to encrypt the data. However, there is a problem with the RTP transmission: the RTP session only allows JPEG_RTP and H263_RTP. I have already junked this method, and I am now looking for another way, particularly transmitting it using HTTPS (over the internet).
    Could somebody please help me with how to transmit my RTP session over HTTPS? Some sample code would really help...
    Thanks a lot!!

    I think you might have the wrong forum. This is a forum about Forte 4GL.
    ka

  • RTP transmission in linux

    I am using Linux for JMF RTP transmission, but the RTP data is not being transmitted. Can anyone suggest what the problem might be? On Windows it works fine. Is there any special configuration to be done on the Linux side for RTP transmission? I am using the Linux Performance Pack of JMF on the Linux machine; I also tried the Cross-Platform Pack of JMF. Thanks in advance...

    We finally did resolve this issue!!!!
    The thing is that Linux just behaves very differently from Windows when you create Datagram or Multicast sockets.
    Windows assumes that localhost is the local IP address, but Linux does not: when you call InetAddress.getLocalHost().getAddress(), Windows often returns the LAN IP address while Linux returns localhost.
         // dataSock and dataSock2 are DatagramSockets (declarations omitted here)
         InetSocketAddress local = new InetSocketAddress(InetAddress.getByAddress(new byte[]{(byte)192,(byte)168,(byte)3,(byte)2}), port);
         dataSock.bind(local);
         InetAddress destiny = InetAddress.getByAddress(new byte[]{(byte)224,(byte)1,(byte)1,(byte)0});
         dataSock2.connect(destiny, 9000);
    When you want to receive multicast packets you need to specify the network interface through which you access the multicast group. If you don't, it will just use InetAddress.getLocalHost().getAddress() as the address to reach the multicast group, and on Linux this simply does not work.
         MulticastSocket socket = new MulticastSocket(4446);
         InetAddress group = InetAddress.getByName("230.0.0.1");
         InetAddress address = InetAddress.getByName("192.168.3.2");
         NetworkInterface interf = NetworkInterface.getByInetAddress(address) ;
         SocketAddress socketAddress = new InetSocketAddress(group,4446) ;
         socket.joinGroup(socketAddress, interf );
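
    Since InetAddress.getLocalHost() may resolve to the loopback address on Linux, as described above, one robust alternative is to enumerate the interfaces and pick a non-loopback IPv4 address explicitly; a sketch:

```java
import java.net.Inet4Address;
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Collections;

public class LocalAddr {
    // Returns the first non-loopback IPv4 address on an interface that
    // is up, falling back to loopback if the machine has none.
    public static InetAddress pickLocalAddress() throws Exception {
        for (NetworkInterface ni : Collections.list(NetworkInterface.getNetworkInterfaces())) {
            if (!ni.isUp() || ni.isLoopback()) continue;
            for (InetAddress a : Collections.list(ni.getInetAddresses())) {
                if (a instanceof Inet4Address && !a.isLoopbackAddress()) {
                    return a;
                }
            }
        }
        return InetAddress.getLoopbackAddress();
    }
}
```
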
    We are still working on making it work with broadcast addresses.
    Greetings!
    Nico and Charly

  • Help! How to change the video frame size during RTP transmission

    Hi there.
    In an RTP transmission:
    How can I change the size of the frames of the video while the processor on the server is running?
    i.e. while the server is running, I want to change a video from 400x200 to 200x100.
    Thanks

    I am trying to implement the same functionality. I've tried to reset the format using the trackControl.setFormat method but this only appears to be supported while the processor is in a configured state. I'm still investigating, but if you happen to know a solution please let me know.
    Here is what I'm trying to do... In the middle of a session, I'm trying to change the video format from h263/rtp:352x288 to h263/rtp:176x144

  • W3d with transparent textures renders broken (see-through)

    When a W3D model's texture has an alpha channel, the model renders
    broken and see-through (showing what is behind it); without the
    alpha channel it renders normally! Please tell me why. Thanks
    snap:
    have alpha channel:
    snap-alpha.JPG
    no alpha channel(normal):
    snap-no-alpha.JPG
    3d model Source file & W3D file:
    http://www.3dzone.cn/beijixiong.zip

    As we have written, you can get it to work. Just do one of the following:
    1. Make the model a complete mesh, with no parts just stuck into others. Make sure vertices are welded together properly, etc.
    2. a) Separate the head, hat, hands etc., b) group them together, c) export (the exporter will combine them into one model again).
    3. a) Assign a non-transparent texture to your model, b) select the polygons that need transparency, c) assign the transparent texture to those polygons.
    4. Outsource the importing of models to someone else.
    I used method 2, as I mentioned in an earlier post, and got it to work; here is the result:
    Scarecrow
    with transparency working
    The URL for reporting bugs to Adobe was recently posted; use the search function.

  • Video RTP transmission problem

    Hey guys (and girls),
    I'm writing two programs that either send or receive an RTP video stream captured with a webcam. Using JMStudio I can capture and transmit the video and happily receive it with my client program, but when I try to capture and transmit the video using my server program, the client can't realize because it never gets any data.
    Here is my transmitter code; my server class creates a new instance of the camTransmitter class.
    import java.io.*;
    import java.util.Vector;
    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.control.*;
    import javax.media.rtp.*;
    import java.net.InetAddress;

    public class camTransmitter implements ControllerListener {
        Processor p = null;
        CaptureDeviceInfo di = null;
        volatile boolean configured, realized = false;
        public String ipAddress = "192.168.123.150";
        public int port = 22222;
        public RTPManager rtpMgrs[];
        public DataSource vDS;
        public SessionAddress localAddr, destAddr;
        public InetAddress ipAddr;
        public SendStream sendStream;

        public camTransmitter() {
            // First, we'll need a DataSource that captures live video
            Format vFormat = new VideoFormat(VideoFormat.RGB);
            Vector devices = CaptureDeviceManager.getDeviceList(vFormat);
            if (devices.size() > 0) {
                di = (CaptureDeviceInfo) devices.elementAt(0);
            } else {
                // exit if we could not find the relevant capture device
                System.out.println("no devices");
                System.exit(-1);
            }
            MediaLocator camLocation = di.getLocator();
            // Create a processor for this capture device & exit if we
            // cannot create it
            try {
                vDS = Manager.createDataSource(camLocation);
                p = Manager.createProcessor(vDS);
                p.addControllerListener(this);
            } catch (IOException e) {
                System.exit(-1);
            } catch (NoProcessorException e) {
                System.exit(-1);
            } catch (NoDataSourceException e) {
                System.exit(-1);
            }
            // At this point, we have successfully created the processor.
            // Configure it and block until it is configured.
            p.configure();
            while (!configured) { }
            p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            TrackControl track[] = p.getTrackControls();
            boolean encodingOk = false;
            // Go through the tracks and try to program one of them to
            // output RTP data.
            for (int i = 0; i < track.length; i++) {
                if (!encodingOk && track[i] instanceof FormatControl) {
                    if (((FormatControl) track[i]).setFormat(vFormat) == null) {
                        track[i].setEnabled(false);
                    } else {
                        encodingOk = true;
                    }
                } else {
                    // we could not set this track to the desired format, so disable it
                    track[i].setEnabled(false);
                }
            }
            // Realize it and block until it is realized.
            p.realize();
            while (!realized) { }
            // Get the output DataSource of the processor and exit if we fail.
            DataSource ds = null;
            try {
                ds = p.getDataOutput();
            } catch (NotRealizedError e) {
                System.exit(-1);
            }
            PushBufferDataSource pbds = (PushBufferDataSource) ds;
            PushBufferStream pbss[] = pbds.getStreams();
            rtpMgrs = new RTPManager[pbss.length];
            for (int i = 0; i < pbss.length; i++) {
                try {
                    rtpMgrs[i] = RTPManager.newInstance();
                    ipAddr = InetAddress.getByName(ipAddress);
                    localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
                    destAddr = new SessionAddress(ipAddr, port);
                    System.out.println(localAddr);
                    // initialize() takes the local address; the remote one is a target
                    rtpMgrs[i].initialize(localAddr);
                    rtpMgrs[i].addTarget(destAddr);
                    System.out.println("Created RTP session: " + ipAddress + " " + port);
                    sendStream = rtpMgrs[i].createSendStream(ds, i);
                    sendStream.start();
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                }
            }
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("Configure complete");
                configured = true;
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("Realize complete");
                realized = true;
            }
        }
    }
    When I run this, the processor realizes and the RTP seems to stream, but I have no idea why I can't receive the video.
    Please help if anyone can. Thanks
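
    Busy-waiting on plain boolean flags, as in the while loops above, is fragile: without synchronization the constructor thread may never observe the update from the listener thread, and it burns CPU. The usual pattern is to wait on a lock and notify from the listener; a stripped-down sketch with the JMF types omitted:

```java
public class StateWaiter {
    private final Object stateLock = new Object();
    private boolean reached = false;

    // Call from the controller-listener thread, e.g. on
    // ConfigureCompleteEvent or RealizeCompleteEvent.
    public void stateReached() {
        synchronized (stateLock) {
            reached = true;
            stateLock.notifyAll();
        }
    }

    // Call from the thread that drives configure()/realize().
    public void waitForState() throws InterruptedException {
        synchronized (stateLock) {
            while (!reached) {
                stateLock.wait();
            }
        }
    }
}
```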

    Dear andreyvk,
    I've read your post
    http://forum.java.sun.com/thread.jspa?threadID=785134&tstart=165
    about how to use a single RTP session for both media reception and transmission (I'm referring to your modified RTPSocketAdapter version), but at the moment I get a BIND error.
    I think that your post is an EXCELLENT solution. I modified AVReceive3 and AVTransmit3 to accept all parameters (local IP & port, remote IP & port).
    Can you please give me a simple scenario so I can understand the mistake?
    I use AVTransmit3 and AVReceive3 from different prompts. If I run these two classes at the same time on two different PCs (172.17.32.27 and 172.17.32.30), I can transmit the media (vfw://0, for example) using AVTransmit3, but I can't receive anything if I also run AVReceive3 on the same PC.
    What's the problem? Furthermore, if I first run AVReceive3 from an MS-DOS prompt and subsequently run AVTransmit3 from another prompt, I get a BIND error (port already in use).
    How can I use your modified RTPSocketAdapter to send and receive a single media stream from the same port (e.g. 7500)?
    I've used this scenario: PC1 with IP 172.17.32.30 and local port 5000, and PC2 with IP 172.17.32.27 and local port 10000.
    So on PC1 I run:
    AVTransmit3 vfw://0 <Local IP 172.17.32.30> <5000> <Remote IP 172.17.32.27> <10000>
    AVReceive3 <Local IP 172.17.32.30/5000> <Remote IP 172.17.32.27/10000>
    and in PC2:
    AVTransmit3 vfw://0 <Local IP 172.17.32.27> <10000> <Remote IP 172.17.32.30> <5000>
    AVReceive3 <Local IP 172.17.32.27/10000> <Remote IP 172.17.32.30/5000>
    I'd like to use the same port, 5000 (on PC1) and 10000 (on PC2), to both transmit and receive RTP packets. How can I do that without getting a bind error? How can I receive packets (and play the media, if audio and/or video) on the same port used to send the stream over the network?
    How can I obtain a symmetric RTP transmission/reception solution?
    Please give me a hint. If you can't post, this is my email: [email protected]
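
    The bind errors come from two separate sockets (one for AVTransmit3, one for AVReceive3) each trying to claim the same local port; symmetric operation means one socket handles both directions, which is what the shared RTPSocketAdapter gives you at the RTP level. The idea at plain-UDP level (loopback and a hypothetical port here):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class SymmetricUdp {
    // One socket, bound once to one local port, both sends and receives;
    // here it sends to itself over loopback to show both directions.
    public static String echoToSelf(int port) throws Exception {
        try (DatagramSocket sock = new DatagramSocket(port)) {
            sock.setSoTimeout(2000);
            byte[] out = "hello".getBytes("UTF-8");
            sock.send(new DatagramPacket(out, out.length,
                    InetAddress.getLoopbackAddress(), port));
            byte[] in = new byte[64];
            DatagramPacket p = new DatagramPacket(in, in.length);
            sock.receive(p);
            return new String(p.getData(), 0, p.getLength(), "UTF-8");
        }
    }
}
```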

  • RTP transmitting problem

    I am facing a problem.......
    I read in the JMF API that there are three ways to transmit voice over the network:
    1. Use a MediaLocator that has the parameters of the RTP session to construct an RTP DataSink by calling Manager.createDataSink.
    2. Use a session manager to create send streams for the content and control the transmission.
    3. Through sockets.
    I got the code for the first method (datasinking), but I don't know how to get this string, that is, the URL:
    // hand this DataSource to Manager for creating an RTP
    // DataSink; our RTP DataSink will multicast the audio
    try {
    String url= "rtp://224.144.251.104:49150/audio/1";
    MediaLocator m = new MediaLocator(url);
    DataSink d = Manager.createDataSink(ds, m);
    d.open();
    d.start();
    } catch (Exception e) {
    System.exit(-1);
    }
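
    The string the snippet needs follows JMF's RTP MediaLocator syntax, rtp://<address>:<port>/<content-type>/<ttl>, where the trailing number is the multicast TTL (to the best of my reading of the JMF docs). Building it from parts might look like:

```java
public class RtpUrl {
    // Assembles a JMF-style RTP locator, e.g.
    // rtp://224.144.251.104:49150/audio/1
    public static String build(String address, int port, String contentType, int ttl) {
        return "rtp://" + address + ":" + port + "/" + contentType + "/" + ttl;
    }
}
```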

    I've faced a similar problem. What I did was to use TWO session managers, one for the transmitter and the other for the receiver; I then used these objects in a container.
    Both peers use the same port and know each other's address.
    For receiving data from the peer I do:
    RTPManager mymgr = RTPManager.newInstance();
    // add as listener to create players for each incoming stream
    mymgr.addReceiveStreamListener(this);
    SessionAddress localaddr =new SessionAddress( InetAddress.getLocalHost(), port);
    SessionAddress sessaddr = new SessionAddress( peerAddress, port);
    mymgr.initialize(localaddr);
    mymgr.addTarget(sessaddr);
    For transmitting data to the peer I do:
    RTPManager mymgr = RTPManager.newInstance();
    // add as listener to create players for each incoming stream
    mymgr.addReceiveStreamListener(this);
    SessionAddress localaddr =new SessionAddress( InetAddress.getLocalHost(), SessionAddress.ANY_PORT);
    SessionAddress sessaddr = new SessionAddress( peerAddress, port);
    mymgr.initialize(localaddr);
    mymgr.addTarget( sessaddr);
    // the processor is capturing audio from the mic
    processor.start();
    SendStream sendStream = mymgr.createSendStream(processor.getDataOutput(), i);
    sendStream.start();
    I probably forgot something when doing copy-paste from my code... but I think this might help you.
    Good luck
    Marco

  • Multicast transmission problem

    Hi,
    I'm trying to multicast audio streams from one computer to another on the same network. The multicast address for my network is 10.255.255.255. When I continually check the number of transmitted bytes with ifconfig, I see a gradual increase of bytes being sent out to the network.
    When I look at ifconfig on the receiving computer, it is receiving bytes as well, but AVReceive2 can't detect the RTP stream; it just keeps waiting for the data to come. By the way, I use AVTransmit2 to transmit the data. When I just unicast the RTP stream from one computer to another, the RTP streaming works!!
    Does anyone have the same problem? Can anyone point out any problems that I may be having? Oh yeah, and I am running JMF between two Red Hat Linux 9 notebooks.
    Do I have to make some extra modification to an IP routing table somewhere or something?
    Thanks to anybody that can PLEASE help me!!!

    Hi Stefan,
    If I unicast an audio file from one computer to the other, it works fine. I didn't try sending a video file, but I think it should work the same. I've got two computers, one with IP address 10.0.0.1 and the other with 10.0.0.2. Doing an ifconfig, the broadcast address on each computer is stated as 10.255.255.255. When I use 10.255.255.255 to send my audio file, ifconfig again shows that the receiving computer is receiving bytes, but the program isn't recognising the RTP stream. My program uses a similar address resolution to AVTransmit2 and AVReceive2.
    So I can't figure out what's wrong with the code. OK, for AVReceive2, what do you usually type on the command line to receive broadcast packets? Do we have to modify the /etc/hosts file again??
    Thanks
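
    Part of the confusion here may be that 10.255.255.255 is a broadcast address, not a multicast group; JMF multicasting expects an address in the 224.0.0.0 to 239.255.255.255 range. java.net can tell the two apart:

```java
import java.net.InetAddress;

public class AddrKind {
    // True only for IP multicast group addresses (224.0.0.0/4).
    public static boolean isMulticast(String addr) throws Exception {
        return InetAddress.getByName(addr).isMulticastAddress();
    }
}
```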

  • RTP Transmission Problem

    Dear Friends
    I am working with JMF to transmit a video (.avi) file from one computer to another over a TCP/IP network. The computers are connected in a peer-to-peer architecture. When I run the transmission server code it works and transmits (I verified it on the same computer), but on the remote computer (connected to the same network) the stream either arrives after a long delay, and even then not fully, or the receiver just waits for data, so there is no transmission. I have to show a demo to my staff, so please help me. If you have any questions or doubts regarding this problem, please contact me at [email protected] or post your replies here. I am waiting...
    with regards
    Muthukumar

    "when I run the transmission server code it works and transmits (I verified it in the same computer)"
    I don't understand: how do you know it is transmitting over the network when it's the same computer?
    Are you using JMStudio?
    Transmit the file to the IP and port of the destination computer, and have the destination computer receive on its own IP and port;
    or transmit to xxx.xxx.xxx.255 portNum
    and receive on xxx.xxx.xxx.255 portNum.

  • H263 RTP and JPEG RTP buffering problem

    Hi
    I have a problem buffering a received video stream. I use the BufferControl of the RTPManager; for the audio stream everything works fine, but for video the BufferControl is null, so I can't set it (the same problem occurs with the Player or Processor). The audio stream has all parameters (bitrate, sample rate, etc.), but the video has only encoding and size parameters.
    I tried to change the format (via a Processor) of the received stream to a supported format (for example JPEG), set all parameters (size, frame rate, max data length and encoding) and use the output DataSource, but the Player still can't get a BufferControl. When I use a Processor as a player I can see in the media properties that it is set as I want, but the BufferControl is null.
    I think the H263_RTP and JPEG_RTP video formats don't support frame rate (I tried to set it but nothing happens), and the stream is played as it comes.
    I need to set a buffer because the frame rate of the incoming stream is really low (it's a P2P application) and the smoothness of the video is bad.
    Please help me find any suggestion to solve this problem. Maybe I have to use a custom transport layer or something like that.

    cziz13 wrote:
    "I don't know what you want to say."
    I said exactly what I wanted to say.
    "I got a BufferControl on a realized player (it calls rendering and it's in the AudioBufferControl.java solution). I got the buffer even on a video file, but only when I play it from my hard drive. When I try to do it on a received video stream the buffer is not working."
    Good for you. But that wasn't my point.
    Whenever you request a "Player" object, you don't get a "Player" object; you get some specific "Player" subclass that, depending on what it's designed to do, will implement certain interfaces.
    The BufferControl object you're getting on the Player that plays from the hard drive is more of a prefetch cache than a buffer (in the sense that I think of a buffer, at least). It's using a pull DataSource, so it can control getting data from its DataSource.
    However, the player object that's playing an RTP stream is relying on a push-buffer DataSource. It has absolutely no control over the data; it's told by the DataSource when data is ready to be played... so it just loads when it's told to by the DataSource.
    And the DataSource is just handed data from the RTP streams inside the RTPManager, based on the internal RTP buffering that is handled by the RTPManager...
    "Any more suggestions?"
    My original "suggestion" still stands. Get your BufferControl object from the RTPManager that you're using to receive the stream...

  • RTP transmission problems!!!

    I wrote an application that sends a video stream to another PC. The RTP transmission code is like the AVTransmit sample, and it works well. But when I want to close or shut down my RTP transmission, there is a big problem: how can I close it completely without quitting the application? The close method is like this:
    public void stopTrans() {
         //synchronized (this) {
         if (processor != null) {
              processor.stop();
              processor.close();
              processor = null;
         }
         for (int i = 0; i < rtpMgrs.length; i++) {
              rtpMgrs[i].removeTargets("Session ended.");
              rtpMgrs[i].dispose();
              rtpMgrs[i] = null;
         }
         rtpMgrs = null;
         //}
    }
    When I press the send button to start a new transmission, there is an exception:
    Can not create RTP session: javax.media.rtp.InvalidSessionAddressException: Can't open local data port: 3000
    Port 3000 is the port number I used last time, so if I want to create the transmission again using the same port, it fails. Why? How can I release the port from an RTP transmission? Please help me. Thank you very much.
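
    The "Can't open local data port" message means something still holds the UDP port from the previous run. The underlying behaviour is easy to reproduce with plain sockets (hypothetical port number): a port stays busy until the socket holding it is actually closed.

```java
import java.net.DatagramSocket;
import java.net.SocketException;

public class PortRelease {
    // True if the UDP port can be bound right now, i.e. nothing holds it.
    public static boolean portFree(int port) {
        try (DatagramSocket s = new DatagramSocket(port)) {
            return true;
        } catch (SocketException e) {
            return false;
        }
    }
}
```

    So if dispose() runs but the port stays busy, some socket-holding object (a second RTPManager, a DataSource, a SendStream) was left unclosed.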

    I have exactly the same problem as Rick.T: I can't use the same port a second time... I get the message "can't open local data port xxxxx".
    For every unicast session I use 4 ports per client: 2 to transmit (video and audio) and 2 to receive (video and audio). So it doesn't make sense to use new ports every time...
    I have tried to close/stop/deallocate the Processor, to stop/disconnect the DataSource, to stop/close the SendStream, and of course to removeTargets/dispose the RTPManager... but I still face the same problem.
    It's the very final stage of my project and I don't know what else to try...
    I have searched for other similar topics here but there is no answer to this problem.
    Is there anyone who has the solution to release the ports??
    Thanks in advance

  • G723_RTP decoding from RTP stream problem

    Hi,
    I've managed to establish an RTP connection and send the audio stream in G.723 format. The stream is OK, but I can't decode the response from the Asterisk server. I use the same way to set the audio format for both sending and receiving:
    TrackControl[] tcs = processor.getTrackControls();
    if (tcs.length == 0) {
         return; // should not enter this case
    }
    TrackControl tc = tcs[0]; // we expect one channel only
    String codecType = SoftPhoneSettings.audio_codec;
    tc.setFormat(new AudioFormat(AudioFormat.G723_RTP, 8000.0, 16, 1, -1, -1, 192, -1.0, null));
    tc.setEnabled(true);
    If I use the GSM_RTP or ULAW_RTP codec it works just fine on both sides (also decoded correctly on my end), but with G723_RTP I hear no voice. I also get an EXCEPTION_ACCESS_VIOLATION on processor.close(), which likewise happens only with this codec.
    The Asterisk server is working fine, but I can't figure out what exactly the problem is here, and why I can send correctly but can't decode the response. I've also tried setCodecChain, but the situation is the same. Can anybody help me with this? I can't find the source of the problem and, more importantly, I still can't solve it. Thanks in advance.

    Is there any standard way of retrieving the G723_RTP stream, like using a Player?

  • RTP session problem

    Hi. I'm using a custom DataSource for JMF to send audio over an RTPManager, using JavaSound to capture live audio from a microphone, while at the same time receiving audio from an RTP session using JMF. The code for the custom DataSource is one posted here a while ago, and it works fine. The problem is that when I start sending audio, the incoming audio stops, although the sound is sent correctly to the other party. So if I don't send audio, I hear the audio from the RTP session; if I send audio, I don't hear the incoming audio. I'm only talking about one side of the call, because the other party sends the RTP session via a SIP server that is connected to a regular phone line; I don't know exactly how that works, I only know that it works. Does anyone know what could be wrong here? Could it be a buffer problem, a blocked device, or just an error in the code? Thanks in advance.

    Your question doesn't make a whole lot of sense, but I'd assume you've coded something incorrectly.

  • RTP transmission is slow to start?

    Hi
    I use JMF 2.2.1e on Windows to transmit a PCM .wav file over RTP.
    However, when a DNS server is configured in the network settings, it takes about 20 seconds before the sound starts playing.
    If I remove the DNS setting, or add the destination address to /etc/hosts, it plays immediately.
    Since that workaround isn't possible within the program's specifications, is there any good method?
    The destination is specified as follows:
    String url = "rtp://" + addr + ":" + port + "/audio";
    MediaLocator m = new MediaLocator(url);
    datasink = Manager.createDataSink(ds, m);

    The dialog you are referring to is a ''shell dialog''.
    The most likely reason for this to take a long time is that the default location is taking a long time to resolve, as can happen if it's a network location or if the default folder doesn't exist.
    To reset the download location, reset '''browser.download.dir''' in the advanced configuration interface:
    # Select Firefox's location bar (press CTRL+L or click it)
    # Type '''about:config''' and press enter
    # Click '''I'll be careful, I promise!'''
    # in the Filter box, type '''browser.download.dir'''
    # Right-click the '''browser.download.dir''' preference below and choose '''Reset'''
