RTP Server?

Has anyone done this? I want to set up streaming video/audio from an applet (camera) to a server and broadcast it out to an applet (browser). I need the server in the middle to manage the viewers. Do I need a specific RTP server, or can I use a regular web server? Thanks.

@rkippen: So, 2 months... Why don't you tell us how to solve the NAT problem, if you have already done it? Consider that he wants to send and receive from an applet, which means:
1. signing applets twice
2. writing an RTP server
3. installing JMF on the sender side
4. solving the NAT & firewall problems
5. he will probably try to implement some effects, maybe sound or video or both
Two months? All right...
Best regards from Germany,
r.v.

Similar Messages

  • Can anyone send/post me the code for RTP server which reads soundcard

    hi all,
    can anyone send me the code for an RTP server which reads the soundcard, and an RTP client to transmit the sound.

    How much are you going to pay?

  • Code sample for rtp server. please!

    hi
    I need to get some RTP server code, as simple as possible, in order to learn from it. I did not find anything from browsing. Could anyone send me a sample or a link?
    thanx a lot

    Hi,
    thanks for the link. I have already checked that; I should have specified more.
    The thing is that these examples require that we have the client IP and port. But in the case of a NATed client, I don't see how to get this client IP.
    I looked in the JMF API desperately for a way to get client info. But the only thing I get is the Participant class. No IP address nor a port to work with.
    Anyone have a clue, or a code sample of a small server that listens for clients and retrieves their IP/port dynamically? It would help a lot.
    thanx
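For what it's worth, the usual way around this is not to look the client's IP up at all: have the NATed client send a datagram first, and read the source address off that packet. Behind NAT, that source address is the router's public mapping, which is exactly where the server must send RTP back to. A hedged sketch with invented names, using only java.net (demonstrated on loopback):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.SocketAddress;

public class LearnClientAddress {
    // Block until the first datagram arrives and return its source address.
    // For a NATed client this is the public IP:port mapping the server
    // must reply to.
    public static SocketAddress firstSender(DatagramSocket server) throws Exception {
        byte[] buf = new byte[1500];
        DatagramPacket p = new DatagramPacket(buf, buf.length);
        server.receive(p);            // waits for the client's "hello"
        return p.getSocketAddress();  // client's IP:port as seen by us
    }

    public static void main(String[] args) throws Exception {
        // Demo on loopback: the "client" sends one packet, the "server"
        // learns where it came from.
        try (DatagramSocket server = new DatagramSocket(0);
             DatagramSocket client = new DatagramSocket(0)) {
            byte[] hello = "hello".getBytes();
            client.send(new DatagramPacket(hello, hello.length,
                    InetAddress.getLoopbackAddress(), server.getLocalPort()));
            InetSocketAddress seen = (InetSocketAddress) firstSender(server);
            System.out.println(seen.getPort() == client.getLocalPort()); // true
        }
    }
}
```

In a JMF setup, the port learned this way is what you would feed into the remote SessionAddress.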

  • RTP Server/ Client Program using JMF Library

    Hi folks,
    I am trying to send A/V data from a camcorder (SONY DCR-PC115) to a PC (Red Hat Linux 7.3, kernel version 2-18.3), and then send that data to a client program via a network (i.e. the A/V data should be delivered over the network using the RTP/RTCP protocol).
    I guess I need to make RTP server and RTP client programs, using the JMF library in order to send data with the RTP protocol. Is that right?
    So what I want here is: does anybody have ideas/experience in making an RTP client/server program using the JMF library? If so, please let me know.
    my e-mail id is [email protected], [email protected]
    Many thanks
    looking forward to hearing from you soon
    Regards
    Madhu Chundu

    Hi,
    I'm also working on the same type of problem, so if anyone has already worked on this or has an idea of how it works, please mail me the details at my mail-id: [email protected]
    Thanks In Advance,
    Rajanikanth Joshi

  • Why I can not connect RTP server

    Hi all,
    Here I encounter a problem: I cannot connect to the RTP server at the address below:
    rtsp://159.226.42.36/mp3
    I can play that address link using RealOne or MS Media Player over the internet, although it takes a little long to connect to the server.
    I just wonder why the address style in the examples is not as above, but like this:
    rtp://159.226.42.36:554/1
    How can I play a link like rtsp://159.226.42.36/mp3 or rtsp://159.226.42.36/mp3/looks.mp3, something like that?
    Thanks!

    BTW, the code is below:
    private void startRTPPlayer(String session) {
        try {
            player = JMFEngine.getInstance().createRTPMediaPlayer(
                    new RTPConnectionInfo(session), null);
            statelabel.setText("Opening " + session);
            player.addControllerListener(new RTPControllerUpdate());
            RTPManager mgrs = RTPManager.newInstance(); // note: created but never used
            player.realize();
        } catch (IllegalArgumentException e) {
            logger.warn(e);
            statelabel.setText(e.getMessage());
            return;
        } catch (MalformedURLException e) {
            logger.warn("The media resource location is not available");
            statelabel.setText("The media resource location is not available");
            return;
        } catch (IOException e) {
            logger.warn("The resource cannot be opened or connected currently");
            statelabel.setText("The resource cannot be opened or connected currently");
            return;
        } catch (NoPlayerException ex) {
            logger.error("No available player", ex);
            statelabel.setText("Cannot play the resource");
            return;
        }
    }

    class RTPControllerUpdate implements ControllerListener {
        public void controllerUpdate(ControllerEvent ce) {
            // Get this when the internal players are realized.
            if (ce instanceof RealizeCompleteEvent) {
                player.start();
            }
            if (ce instanceof ControllerErrorEvent) {
                player.close();
                logger.error(((ControllerErrorEvent) ce).getMessage());
            }
        }
    }
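A note on the two locator styles in this thread: rtsp:// names a presentation on a streaming server (RTSP is a control protocol; the server then sets up the actual RTP transport), while rtp:// points JMF directly at an RTP session's address and port, so the two are not interchangeable. A sketch with plain java.net.URI just to show the parts (JMF itself would consume these through MediaLocator):

```java
import java.net.URI;

public class LocatorParts {
    public static void main(String[] args) {
        // RTSP locator: a presentation URL; RTSP's default port is 554.
        URI rtsp = URI.create("rtsp://159.226.42.36/mp3");
        System.out.println(rtsp.getScheme() + " host=" + rtsp.getHost()
                + " port=" + (rtsp.getPort() == -1 ? 554 : rtsp.getPort()));

        // RTP locator: the session address and port are explicit.
        URI rtp = URI.create("rtp://159.226.42.36:554/1");
        System.out.println(rtp.getScheme() + " host=" + rtp.getHost()
                + " port=" + rtp.getPort());
    }
}
```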

  • Peak level + rtp server / client

    Hi all JMF developper,
    Here are my questions. I have to develop a simple application that capture voice coming from a microphone and then transmit it over a network (the Internet in general). I would like to develop a peak level as many audio tools provide (just to see if the recording level is good).
    Then it will be very interesting for me to get materials, resources, codes or anything else about a simple JMF server / client. We plan to use the Darwin streaming server.
    From the client side, I would like to know if there a buffer security mechanism built in or not (for instance, listener that warns that the buffer will become empty in order to stop anything else that is synchronized with the voice)
    Kind regards,
    St�phane

    Does anybody know how I can create a peak level meter in my GUI (a volume meter while recording)?
    Thanks
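Not an answer to the JMF capture side, but the arithmetic behind a peak meter is small. A sketch under the assumption that the buffers arrive as 16-bit little-endian signed PCM (what JMF's LINEAR AudioFormat typically delivers); the class name and sample values are invented for illustration:

```java
public class PeakLevel {
    // Peak of a block of 16-bit little-endian signed PCM samples, as a
    // fraction of full scale -- the number a recording-level meter displays.
    static double peak(byte[] pcm) {
        int max = 0;
        for (int i = 0; i + 1 < pcm.length; i += 2) {
            // Reassemble the two bytes into one signed 16-bit sample.
            int sample = (short) ((pcm[i] & 0xFF) | (pcm[i + 1] << 8));
            max = Math.max(max, Math.abs(sample));
        }
        return max / 32768.0;
    }

    public static void main(String[] args) {
        byte[] quiet = { 0x10, 0x00, (byte) 0xF0, (byte) 0xFF }; // samples +16, -16
        byte[] loud  = { 0x00, 0x40, 0x00, (byte) 0xC0 };        // samples +16384, -16384
        System.out.println(peak(quiet)); // ~0.0005
        System.out.println(peak(loud));  // 0.5
    }
}
```

Feed it each buffer you pull from the capture DataSource and map the result onto a progress bar to get the meter.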

  • Viability of an RTP relay server using JMF

    Hello!
    I am thinking of making an RTP server that would just relay the streams received from a single user to multiple users.
    I want to ask the following questions:
    1- Is it viable to make such a server using JMF?
    2- If it is, then how should I proceed?
    3- Should I create a Processor for each received stream at the server, or would just cloning the received DataSource and sending the clones using separate RTPManagers solve my problem?
    4- Is cloning of the data source needed at all?
    I am asking this before doing any coding, just in case it is not possible and you can warn me, and because I had really bad experiences while cloning the data source, and this server, I think, depends on cloning.
    Also, I want some help regarding code. I would highly appreciate some code snippets.
    Thanks in advance!
    P.S.: captfoss, are you listening?

    Now some simple questions from a novice, that's me: I will answer them out of order.
    3- Are these terms specific to JMF or are they general networking terms?
    They are general networking terms.
    2- What is the difference b/w unicasting and multicasting?
    Uni = Latin prefix for "one".
    Multi = Latin prefix for "many".
    Broad = Latin prefix for "all" (okay, this one is probably not true, but...)
    unicast = sending data to a single recipient.
    broadcast = sending data to all recipients.
    multicasting = sending data to many recipients.
    It deals with how the underlying UDP packets are handled.
    Unicast addresses the packets to a specific host, so all other hosts that receive those packets go "Not for me" and ignore them.
    Broadcast addresses the packets to a special IP address, the broadcast IP, so all hosts that receive it say "Oh, this is a broadcast message, so it's for me."
    Multicast addresses the packets to an IP address in a special range (Class D addresses), and all hosts can opt to join the "multicast session". If they join the multicast session, it basically means that when they receive packets addressed to any multicast address whose session they have joined, they will consider those packets to be "for them".
    1- What exactly is multicasting in JMF?
    JMF multicasting is basically where the "host" can send out a single stream, and any number of "clients" can receive the stream.
    4- How is multicasting handled at the transmitter and receiver side using JMF? Some Java statements, please.
    Multicasting is almost handled "automatically".
    It's handled by giving the transmitter a multicast IP address to send to. "224.123.123.123" is an example of the one I always used for testing (because it was easy to remember). Transmitting multicast packets is handled automatically.
    Receiving multicast packets requires a little special handling.
    From AVReceive2.java
    if (ipAddr.isMulticastAddress()) {
        // local and remote address pairs are identical:
        localAddr = new SessionAddress(ipAddr, session.port, session.ttl);
        destAddr  = new SessionAddress(ipAddr, session.port, session.ttl);
    } else {
        localAddr = new SessionAddress(InetAddress.getLocalHost(), session.port);
        destAddr  = new SessionAddress(ipAddr, session.port);
    }
    The main difference here is that your "local" address isn't going to be the localhost address, "127.0.0.1"; it's going to be the multicast address.
    And you should define the TTL (time to live) for the SessionAddress, because multicast packets can only travel a certain number of "hops" (number of times forwarded).
    But honestly, I'm pretty sure I ripped that IF statement out and multicasting still worked (but I define the TTL for unicast as well, so bear that in mind...)
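As an aside, the condition in that IF statement can be exercised without JMF at all: java.net.InetAddress already knows whether an address falls in the Class D multicast range. A sketch using the example addresses from this thread:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class MulticastCheck {
    // Mirrors the branch in the AVReceive2 snippet above: does this
    // address need multicast session handling?
    static boolean needsMulticastHandling(String ip) throws UnknownHostException {
        // getByName on a literal IP does not trigger a DNS lookup.
        return InetAddress.getByName(ip).isMulticastAddress();
    }

    public static void main(String[] args) throws UnknownHostException {
        System.out.println(needsMulticastHandling("224.123.123.123")); // true  (Class D)
        System.out.println(needsMulticastHandling("192.168.0.4"));     // false (unicast)
    }
}
```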
    Ignoring all of the stuff above, multicasting is a great idea for a local LAN application where there are no routers between the host and the clients. If you've got a router between them, then the router may not forward the multicast packets and your stream may never get to the remote clients.
    That's why in my project, which is web based, clients attempt to get the multicast packets and if they fail at that, then they request a unicast from my server.
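Coming back to the relay question at the top of this thread: stripped of JMF, the core of a one-to-many relay is tiny, because RTP packets are just UDP payloads that can be re-addressed unchanged. A hedged sketch (invented class and method names; a real relay would also have to forward RTCP on the session port + 1, and would need the client-registration and NAT handling discussed elsewhere on this page):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetSocketAddress;
import java.util.List;

public class UdpRelay {
    // Take one datagram from the single sender and re-send its payload,
    // byte for byte, to every registered client. No Processor and no
    // DataSource cloning is involved at this level.
    static void relayOnce(DatagramSocket in, DatagramSocket out,
                          List<InetSocketAddress> clients) throws Exception {
        byte[] buf = new byte[2048];
        DatagramPacket p = new DatagramPacket(buf, buf.length);
        in.receive(p); // one RTP packet from the sender
        for (InetSocketAddress client : clients) {
            out.send(new DatagramPacket(p.getData(), p.getLength(), client));
        }
    }
}
```

Loop relayOnce on its own thread per input stream. Whether this beats cloning DataSources inside JMF depends on whether you need transcoding, which a packet-level relay cannot do.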

  • Streaming MP3 with RTP

    Hello,
    I am trying to implement an RTP server and an RTP client; for that purpose I am using JMF. It seemed to work well until I tried to transmit MP3 files. When I transmit MP3, I get the following error message:
    Unable to handle format: mpeglayer3, Unknown Sample Rate
    Failed to realize: com.sun.media.PlaybackEngine@1700391
    Error: Unable to realize com.sun.media.PlaybackEngine@1700391
    Error in ControllerErrorEvent: javax.media.ResourceUnavailableEvent[source=com.sun.media.content.unknown.Handler@fa39d7,message=Failed to realize: input media not supported: mpeglayer3 audio]
    I have installed the MP3 plugin, and I have used the AVTransmit2 and AVReceive2 classes with some modifications to try to solve the problem, but I cannot solve it.
    I tried to use JMStudio on both sides (server side and client side) and I got the same error:
    -->failed to handle a data format change!
    I would like to add that I used two different computers (one for the server and one for the client) for all the tests.
    Could you give me some advice, help or a solution for this problem? If someone has a solution (code) to transmit streaming MP3 using RTP and knows that it runs OK, could you show me?
    I doubt whether MP3/RTP is supported in JMF. If someone knows another alternative to implement a streaming MP3 audio server and client, could you tell me what solution it is?
    Thank you very much; I hope someone can help me.
    PS: Sorry for my English, I think it is not so good.

    Hi, I had the same problem but I managed to solve it.
    First of all you have to install the MP3 plug-in, and as you mentioned, you already did that. Make sure the mp3plugin.jar file is inside lib/ext of the JRE you are using for your project, and one important thing: first register the plug-in by issuing the command
    java com.sun.media.codec.audio.mp3.JavaDecoder
    It will modify the jmf.properties binary file. Once you're done with this, put the updated jmf.properties file in your project folder. After this your code should run perfectly. In case of any problem, post it.
    cheers

  • High Delay of MPEG-1 P-Frames first slices in RTP Streams

    Hi,
    I'm trying to implement a simple RTP server using JMF 2.1.1e (Windows Performance Pack). I started off with the sample application available at
    http://java.sun.com/products/java-media/jmf/2.1.1/solutions/ToolsTx.html
    For receiving and displaying the MPEG-1 stream I'm using JMStudio. Basically, the transmission seems to work and JMStudio displays the video I'm transmitting from the server computer.
    Unfortunately, the video does not display fluently / smoothly; the frame rate fluctuates between 10 and 30 fps for a video stream that was originally 25 fps.
    Trying to track down the source of the problem, I used Ethereal to sniff the network traffic. According to this, the server seems to have problems especially with the first slice / packet of most P-frames. They are delayed by more than 0.1 s (the normal delay seems to be around 0.0004 s).
    Using this insight, I tried MPEG-1 videos with other GOP structures.
    1. IPPPPPPPPPPPPPP: As one would expect, using this GOP the video displays even worse, since almost all frames are P-frames.
    2. I: Using only I-frames, the problem almost disappears and the video displays very smoothly.
    This leads me to the conclusion that inside JMF there seems to be a problem while packetizing the P-frames.
    Did anybody experience similar problems, or maybe even has a solution for this problem?
    Any help would be greatly appreciated!
    Kind regards
    Sebastian Seitz

    Dag Norum:
    First of all, thank you for your quick reply ;-)
    Second, I'm very sorry to reply almost a month after your answer. The reason is that I went on holiday without an opportunity to test your tip.
    I used Harm's link and then set a bitrate of 8.0000 for all three fields (Min., Target and Max.), and that makes all the difference. Now I have a video with very good quality.
    But I haven't tested your tip yet to see if it improves the final video quality.
    I always want to work with the maximum quality, no matter what time is needed to convert the video and no matter the size (GB) it costs.
    Do you think I can export to an uncompressed file (AVI for Windows and compressor set to None) and then make the menus with Encore? Wouldn't it be better to work in Encore with an MPEG2 file? The DV compressor is MPEG2, isn't it?
    And why did you suggest not having "Optimize stills" checked? This is to optimize the frames without movement, isn't it? Many times I use photos inserted in the movie. Isn't this option good for that?
    Your last suggestion, to go directly from the timeline to the end target (MPEG2): is it much different from what I use (the DV compressor)?
    If you could answer my questions I would appreciate it.
    Many thanks!
    (sorry if my English isn't the best)
    Message was edited by: Warlord_LA (01-Oct-2009 23h51)
    Sorry, I'd made a mistake. When I answered you I didn't realise that your suggestions were about AVI export and not MPEG2.
    Of course you are absolutely right: for an intermediate file, no compression is always better than some compression :-)
    But my question about not having "Optimize stills" checked remains. What does it really do?

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the timebase on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind, because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end... so you should look into the other side of the equation.
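For the curious: the transmit-side sync the reply refers to happens through RTCP Sender Reports. Each SR pairs the stream's RTP timestamp with an NTP wallclock time, so a receiver can place audio and video samples on one common clock. A hedged arithmetic sketch (sample values invented; 90 kHz video and 8 kHz audio are the conventional RTP clock rates):

```java
public class RtcpSync {
    // Map an RTP timestamp to wallclock seconds using the
    // (NTP seconds, RTP timestamp) pair from that stream's latest
    // RTCP Sender Report.
    static double toWallclock(long rtpTs, long srRtpTs, double srNtpSeconds, int clockRate) {
        return srNtpSeconds + (double) (rtpTs - srRtpTs) / clockRate;
    }

    public static void main(String[] args) {
        // Hypothetical SRs anchoring both streams at wallclock t = 1000.0 s.
        double video = toWallclock(90000 + 45000, 90000, 1000.0, 90000); // half a second in
        double audio = toWallclock(8000 + 4000, 8000, 1000.0, 8000);     // half a second in
        System.out.println(video); // 1000.5
        System.out.println(audio); // 1000.5 -> render these samples together
    }
}
```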

  • Receiving Video RTP Stream (JMF) in JME ( MMAPI ) - URGENT !!!

    Hi Folks...
    I'm trying to develop an application that sends the images from a web cam connected to the computer to a PDA, so that the images can be viewed by the user...
    My code for the JMF RTP video stream is as follows.
    Processor proc = null;
    javax.media.protocol.DataSource ds = null;
    TrackControl[] tc = null;
    int y;
    boolean encodingOk = false;
    Vector<javax.media.protocol.DataSource> datasources =
            new Vector<javax.media.protocol.DataSource>();
    for (int x = 0; x < camerasInfo.length; x++) {
        try {
            proc = Manager.createProcessor(camerasInfo[x].getLocator());
        } catch (NoProcessorException e) {
            System.out.println("Error instantiating PROCESSOR: " + e);
        } catch (IOException e) {
            System.out.println("Error instantiating PROCESSOR: " + e);
        }
        proc.configure();
        try {
            Thread.sleep(2000);
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
        proc.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
        tc = proc.getTrackControls();
        for (y = 0; y < tc.length; y++) {
            if (!encodingOk && tc[y] instanceof FormatControl) {
                if (((FormatControl) tc[y]).setFormat(new VideoFormat(VideoFormat.RGB)) != null) {
                    tc[y].setEnabled(true);
                    encodingOk = true;
                } else {
                    tc[y].setEnabled(false);
                }
            } else {
                tc[y].setEnabled(false);
            }
        }
        proc.realize();
        try {
            Thread.sleep(2000);
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
        try {
            ds = proc.getDataOutput();
        } catch (NotRealizedError e) {
            System.out.println("Error getting datasource: " + e);
        } catch (ClassCastException e) {
            System.out.println("Error getting datasource: " + e);
        }
        datasources.add(ds);
        System.out.println(ds.getLocator());
        encodingOk = false;
    }
    MediaLocator ml = new MediaLocator("rtp://10.1.1.100:99/video");
    try {
        DataSink datSink = Manager.createDataSink(ds, ml);
        datSink.open();
        datSink.start();
    } catch (NoDataSinkException e) {
        System.out.println("Error instantiating DataSink: " + e);
    } catch (SecurityException e) {
        System.out.println("Error starting DataSink: " + e);
    } catch (IOException e) {
        System.out.println("Error starting DataSink: " + e);
    }
    I'm not sure if this code is correct... is it?
    So... the next part of the system runs on the PDA.
    The code that accesses this RTP stream is as follows.
    VideoControl c = null;
    try {
        player = Manager.createPlayer("rtp://10.1.1.100:99/video");
        c = (VideoControl) player.getControl("VideoControl");
        tela = (Item) c.initDisplayMode(GUIControl.USE_GUI_PRIMITIVE, null);
        player.start();
        append(tela);
    } catch (IOException e) {
        str.setText(e.toString());
        append(str);
    } catch (MediaException e) {
        str.setText(e.toString());
        append(str);
    }
    So when the app tries to create a player for "rtp://10.1.1.100:99/video", a MediaException is thrown:
    javax.microedition.media.MediaException: Unable to create Player for the locator: rtp://10.1.1.100:99/video
    So... I don't know what is happening =/
    Is the error in the PDA module? Or in the computer's initialization of the RTP video streaming?
    I need to finish this job next week... so any help is useful.
    Waiting for answers,
    Rodrigo Kerkhoff

    First of all: before going on to the J2ME part, make sure the server works before doing anything else! Apparently, it doesn't...
    The MediaLocator is generally used to specify a kind of URL describing where the data put into the DataSink should go. In your case, this cannot be just some URL where you want it to act as an RTSP server. You'll need to implement that server yourself, I guess.

  • How to synchronize audio track and video when transmitting by RTP

    Since the audio stream and video stream are transmitted separately on the network, even if merged into one DataSource, how do you synchronize the audio stream and video stream at the receiving end?


  • JMF-RTP in Internet

    The streaming examples (AVTransmit and AVReceive) worked fine on a LAN, but how do I get them working over the internet (where the issues of proxies and routers have to be solved)? I read through the forum; some suggest that this might be due to the computer having 2 IP stacks (one for the LAN and one for the Internet). Can anyone provide a solution/reason for this?
    Also, is an RTP server like the Darwin Streaming Server a solution to this problem?

    I was reading a couple of the problems that are on the forum and I am having similar problems to many people.
    Basically I have made two files based on the AVTransmit and AVReceive files; however, I cannot get these two files to transmit RTP voice data across two separate networks without carrying out port forwarding on the routers at each network. That is, the routers must be configured to forward traffic incoming at a specific port to a specific machine on the internal network, and firewalls must also be turned off.
    E.g. port 42050 must be opened up on the router/personal computer and RTP voice data forwarded to machine 192.168.0.4, for instance.
    I was wondering if there is a generic port reserved for audio that is not closed down on a router or personal computer which could be used, thus omitting the need to port forward (i.e. making audio conversation easier without going into which ports are open/closed). Or does anyone know how to adapt these files so that the HTTP tunnelling mechanism can be used? Basically I don't want to have to resort to port forwarding and would just like to use one port or a subset of ports.
    Thanks

  • Does JMF support RTP packets being sent "Faster than real time"?

    I have a situation where some stored audio is passed to a speech recognizer using RTP. This is all working well with JMF. However, since this operation is "offline" (i.e. no live person is actually speaking or hearing this audio stream) and the recognizer is capable of processing the audio very quickly, the RTP stream could be sending the audio "faster than real time". What settings in the following components would allow this?
    DataSource _dataSource = Manager.createDataSource(source);
    Processor _processor = Manager.createProcessor(_dataSource);
    TrackControl[] trackControls = _processor.getTrackControls();
    Codec codec[] = new Codec[3];
    codec[0] = new com.ibm.media.codec.audio.rc.RCModule();
    codec[1] = new com.ibm.media.codec.audio.ulaw.JavaEncoder();
    codec[2] = new com.sun.media.codec.audio.ulaw.Packetizer();
    ((com.sun.media.codec.audio.ulaw.Packetizer) codec[2]).setPacketSize(160);
    _processor.realize();
    DataSource dataOutput = _processor.getDataOutput();
    SendStream _sendStream = _rtpManager.createSendStream(dataOutput, 0);
    _sendStream.start();
    _processor.start();
    I tried "setRate" on the processor but this had no effect; getRate showed that it was still 1.0.
    Best Regards,
    Jamie

    I wrote my own RTP client in about an hour (it seemed simpler than navigating the JMF options). It is very basic, but works as I want. The RTP server (the speech recognizer) is able to consume the stream and gives exactly the same results.
    package com.sss.mrcp;
    import java.io.InputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.util.Random;
    public class RTP extends Thread {
        InputStream is;
        String address;
        int port;
        int localPort;

        public RTP(InputStream is, int localPort, String address, int port) {
            this.is = is;
            this.address = address;
            this.port = port;
            this.localPort = localPort;
        }

        public void run() {
            try {
                DatagramSocket socket = new DatagramSocket(localPort);
                Random r = new Random();
                int sequenceNumber = r.nextInt();
                int syncId = r.nextInt();
                int timeStamp = 0;
                int len = 256;
                byte[] buf = new byte[len];
                int code;
                int headerLength = 12;
                while ((code = is.read(buf, headerLength, len - headerLength)) > -1) {
                    int i = 0;
                    buf[i++] = (byte) 0x80; // version info
                    buf[i++] = (byte) 0x08; // payload type: 8 = A-law, 0 = u-law
                    sequenceNumber++;
                    buf[i++] = (byte) (sequenceNumber / 0x100);
                    buf[i++] = (byte) (sequenceNumber % 0x100);
                    timeStamp += (len - headerLength);
                    int timeStampTop = (timeStamp / 0x10000);
                    buf[i++] = (byte) (timeStampTop / 0x100);
                    buf[i++] = (byte) (timeStampTop % 0x100);
                    int timeStampBottom = (timeStamp % 0x10000);
                    buf[i++] = (byte) (timeStampBottom / 0x100);
                    buf[i++] = (byte) (timeStampBottom % 0x100);
                    int syncIdTop = (syncId / 0x10000);
                    buf[i++] = (byte) (syncIdTop / 0x100);
                    buf[i++] = (byte) (syncIdTop % 0x100);
                    int syncIdBottom = (syncId % 0x10000);
                    buf[i++] = (byte) (syncIdBottom / 0x100);
                    buf[i++] = (byte) (syncIdBottom % 0x100);
                    DatagramPacket packet = new DatagramPacket(buf, code + headerLength,
                            InetAddress.getByName(address), port);
                    socket.send(packet);
                    Thread.sleep(1); // this sets the speed of delivery "faster than real time"
                }
                socket.close();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }
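One subtlety in the header arithmetic above (an editorial aside, not from the original post): r.nextInt() can return a negative value, and integer division by 0x100 only matches big-endian bit-shifting for non-negative numbers, so shift-and-mask is the safer way to emit network byte order:

```java
import java.util.Arrays;

public class BigEndianPack {
    // Big-endian packing of a 16-bit value with shifts: correct for any
    // int, because >>> and & operate on the raw bit pattern.
    static byte[] packShift(int v) {
        return new byte[] { (byte) ((v >>> 8) & 0xFF), (byte) (v & 0xFF) };
    }

    // The division/modulo form used in the post: only matches when v >= 0,
    // since Java's / and % round toward zero for negative operands.
    static byte[] packDivide(int v) {
        return new byte[] { (byte) (v / 0x100), (byte) (v % 0x100) };
    }

    public static void main(String[] args) {
        System.out.println(Arrays.equals(packShift(0x1234), packDivide(0x1234))); // true
        System.out.println(Arrays.equals(packShift(-2), packDivide(-2)));         // false
    }
}
```

Masking the field with & 0xFFFF before packing, or seeding with r.nextInt(0x10000), avoids the problem in the client above.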

  • About port 5060 in asa 5505/

    Hello! My name is Denis; I have a problem with the Cisco ASA 5505 in the office. We need to open the following ports and protocols:
    SIP Server IP: 212.24.34.36, SIP Port (UDP/TCP): 5060
    RTP Server IP: 212.158.160.92, RTP Ports (UDP): 7000-27000
    RTP Server IP: 212.24.48.76, RTP Ports (UDP): 7000-27000
    RTP Server IP: 217.23.132.36, RTP Ports (UDP): 7000-27000
    HTTP Server IP: 212.24.34.36, HTTP Port (TCP): 80
    I've done this the standard way through the administration interface on the Cisco. All the ports are open, but the problem is with port 5060: it is still not working, and there is no ping. I need instructions to open port 5060 on our ASA 5505. I have also attached a picture (a screenshot). I really hope you can help me. With best regards, Denis, Pilotmoto.

