G723_RTP decoding from RTP stream problem

Hi,
I've managed to establish an RTP connection and send the audio stream in G.723 format. The outgoing stream is fine, but I can't decode the response from the Asterisk server. I set the audio format the same way for both sending and receiving:
TrackControl[] tcs = processor.getTrackControls();
if (tcs.length == 0) {
    return; // should not enter this case
}
TrackControl tc = tcs[0]; // we expect one track only
String codecType = SoftPhoneSettings.audio_codec;
tc.setFormat(new AudioFormat(AudioFormat.G723_RTP, 8000.0, 16, 1, -1, -1, 192, -1.0, null));
tc.setEnabled(true);
If I use the GSM_RTP or ULAW_RTP codec it works just fine on both sides (I also decode it correctly), but with G723_RTP I hear no voice. I also get an EXCEPTION_ACCESS_VIOLATION on processor.close(), which likewise only happens with this codec.
The Asterisk server is working fine, but I can't figure out what exactly the problem is, or why what I send is decoded correctly while I can't decode the response. I've also tried setCodecChain, but the situation is the same. Can anybody help me with this? I can't find the source of the problem and, more importantly, I still can't solve it. Thanks in advance.

Is there any standard way of retrieving and playing the G723_RTP stream, such as using a Player?
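One check that may help narrow this down (a minimal diagnostic sketch; the class name G723DecoderCheck is invented for this example): JMF can only decode a format for which a codec plugin is registered, so listing the plugins that accept G723_RTP input shows whether a G.723 decoder is installed at all. An empty list would be consistent with both the silence and the native crash on close().

import java.util.Vector;
import javax.media.Format;
import javax.media.PlugInManager;
import javax.media.format.AudioFormat;

// Diagnostic sketch: list codec plugins that can consume G723_RTP input.
// If nothing is printed, no G.723 decoder is registered with JMF and the
// received stream cannot be decoded, whatever format the track is set to.
public class G723DecoderCheck {
    public static void main(String[] args) {
        Format g723 = new AudioFormat(AudioFormat.G723_RTP);
        Vector codecs = PlugInManager.getPlugInList(g723, null, PlugInManager.CODEC);
        if (codecs.isEmpty()) {
            System.out.println("No codec registered for " + g723);
        } else {
            for (Object className : codecs) {
                System.out.println("Candidate decoder: " + className);
            }
        }
    }
}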

Similar Messages

  • Problem creating an RTP stream on Linux using JMStudio

    Hello All,
    I am trying to use JMStudio in order to create an RTP stream,
    but I have a problem with capture.
    When I ran jmfinit I got the following messages:
    JavaSound Capture Supported = true
    JavaSoundAuto: Committed ok
    java.lang.Error: Can't open video card 0
    java.lang.Error: Can't open video card 1
    java.lang.Error: Can't open video card 2
    java.lang.Error: Can't open video card 3
    java.lang.Error: Can't open video card 4
    java.lang.Error: Can't open video card 5
    java.lang.Error: Can't open video card 6
    java.lang.Error: Can't open video card 7
    java.lang.Error: Can't open video card 8
    java.lang.Error: Can't open video card 9
    It seems to me that I have no DirectSound capture,
    therefore I cannot create the RTP packet stream.
    How can I add DirectSound capture to my capture devices?
    Is there any suggestion for this issue?
    Kind regards
    BEKIR BALCIK
    ARGELA TECHNOLOGIES

  • How to extract data from a Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java], and more specifically at a method like printDataInfo(Buffer buffer), to extract the data.
    Is it possible to create an RTP stream, only having access to the byte array "data" in a Buffer?
    Thanks in advance.

    camelstrike wrote:
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks. I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java], and more specifically at a method like printDataInfo(Buffer buffer), to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream, only having access to the byte array "data" in a Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
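    For readers who land here: below is a rough skeleton of the kind of PushBufferStream you would put behind a custom DataSource so that byte[] chunks received over your own protocol can be handed back to JMF (an illustrative sketch only, not the CustomPayload solution itself; the class name RawChunkStream is invented). Note how every Buffer must carry the original Format, which is exactly the point made above.
    import javax.media.Buffer;
    import javax.media.Format;
    import javax.media.protocol.BufferTransferHandler;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.PushBufferStream;

    class RawChunkStream implements PushBufferStream {
        private final Format format;            // must be captured on the sending side
        private byte[] pending;                 // latest chunk delivered by the custom protocol
        private BufferTransferHandler handler;

        RawChunkStream(Format format) { this.format = format; }

        // Called by the receiving code whenever a chunk arrives.
        synchronized void push(byte[] chunk) {
            pending = chunk;
            if (handler != null) handler.transferData(this);
        }

        public synchronized void read(Buffer buffer) {
            buffer.setData(pending);
            buffer.setOffset(0);
            buffer.setLength(pending == null ? 0 : pending.length);
            buffer.setFormat(format);           // the format travels with every buffer
        }

        public Format getFormat() { return format; }
        public void setTransferHandler(BufferTransferHandler h) { handler = h; }
        public ContentDescriptor getContentDescriptor() {
            return new ContentDescriptor(ContentDescriptor.RAW);
        }
        public long getContentLength() { return LENGTH_UNKNOWN; }
        public boolean endOfStream() { return false; }
        public Object[] getControls() { return new Object[0]; }
        public Object getControl(String type) { return null; }
    }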

  • Problems Presenting RTP Stream

    Hi, I have a video streaming server developed in Java using the
    Java Media Framework, and I need to view the videos with a Flash
    player, but I know that Flash works with RTMP and JMF doesn't
    support it. Is there any way to present the RTP streams with Flash?
    Thanks.

    Flash Player only supports streams delivered over the RTMP
    protocol from Flash Media Server.

  • Write (export) the RTP stream from a SIP call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in a ReceiveStreamEvent handler and start recording.
    The problem is that I get a file and play it, but there is no sound
    when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    When I try FileTypeDescriptor.WAVE with AudioFormat.IMA4_MS, I get a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG the processor cannot be realized and
    the thread blocks!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream(); // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
            }
            return;
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;
        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                    // ignore
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    Which shows that the output stream cannot be of that particular format and descriptor.
    Look at this code
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
                eX.printStackTrace();
            }
            System.out.println("Just before start");
            p.start();
        }

        /*
        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dsk.stop();
                    dsk.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }
        */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }

  • RTP streaming noise from JMStudio

    I am using the AVTransmit3 (also tried AVTransmit2) downloaded code to send an RTP stream to JMStudio. It does recognize it as ULAW, as does Ethereal, however JMStudio only plays noise. This is a priority project so any help would be much appreciated.
    java AVTransmit3 rtp:172.16.85.2:8000/audio 192.168.10.223 9000
    Track 0 is set to transmit as:
    ULAW/rtp, 8000.0 Hz, 8-bit, Mono
    Created RTP session: 192.168.10.223 9000
    Start transmission for 60 seconds...
    ...transmission ended.

    Change the video format to JPEG/RTP

  • Mixing audio RTP streams into a conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G711_ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I met this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about the Many2One class and more, but there was nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write the merged streams to a file, so it seems possible to write them to a file and then transmit to the net from that file. But how do you realize more than one Processor?..
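    For reference, a minimal sketch of the merging step both posts describe (the call is standard JMF; the helper class MergeSketch is invented for illustration). It also shows why merging alone does not solve the problem: the result is a multi-track DataSource, not one mixed G.711 track.
    import javax.media.Manager;
    import javax.media.protocol.DataSource;

    public class MergeSketch {
        // Merge the DataSources taken from two ReceiveStreams into a single
        // multi-track DataSource. Throws IncompatibleSourceException (here
        // widened to Exception) if the sources cannot be merged.
        static DataSource merge(DataSource dsA, DataSource dsB) throws Exception {
            DataSource merged = Manager.createMergingDataSource(new DataSource[] { dsA, dsB });
            // 'merged' exposes one track per input rather than one mixed track,
            // so a single RTP send stream created from it still carries only
            // one of the original audio tracks.
            return merged;
        }
    }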

  • To retransmit received RTP streams

    Hi!
    Program A transmits media data to another program B using RTPSessionMgr. What B has to do is send the received streams to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintained a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream and uses the createSendStream() method of the session manager to send it to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me out in pointing out my mistake, or tell me a new strategy to implement this?
    Actually, I have tried to use only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded to C using 2 other sockets. And this did work, but then the client does not know the sender details. I need to maintain the sender and receiver reports, so I went for the session manager (RTPManager/RTPSessionMgr).
    Kindly help.

    Hi all!
    Nice to meet you. It is very exciting that I got a project which will integrate JMF and JXTA: it gets RTP streams from the JMF framework, sends them to the JXTA framework, and then on to any peer in some group for visualization. Some aspects of it are similar to yours, so would you mind adding me to your MSN contact list? (My MSN account is: [email protected])
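    For anyone following this thread, here is the shape of the forwarding step described in the question (a sketch that reproduces the poster's approach rather than fixing the playback problem; it assumes a second RTPManager, here called toC, already initialized and targeted at C):
    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.ReceiveStream;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.event.NewReceiveStreamEvent;
    import javax.media.rtp.event.ReceiveStreamEvent;

    public class ForwardingListener implements ReceiveStreamListener {
        private final RTPManager toC; // session toward C, initialized elsewhere

        public ForwardingListener(RTPManager toC) { this.toC = toC; }

        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    // When a stream arrives from A, hand its DataSource
                    // straight to the session manager that targets C.
                    ReceiveStream rs = ((NewReceiveStreamEvent) evt).getReceiveStream();
                    DataSource ds = rs.getDataSource();
                    SendStream out = toC.createSendStream(ds, 0);
                    out.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }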

  • H263 RTP and JPEG RTP buffering problem

    Hi
    I have a problem with buffering a received video stream. I use the BufferControl of the RTPManager, and for the audio stream everything works fine, but for video the BufferControl is null, so I can't set it (the same problem occurs with the Player or Processor). The audio stream has all parameters (bitrate, sample rate, etc.) but the video has only the encoding and size parameters.
    I tried to change the format (via a Processor) of the received stream to a supported format (for example JPEG), set all parameters (size, frame rate, max data length and encoding) and use the output DataSource, but the player still can't get a BufferControl. When I use a Processor as a player I can see in the media properties that it is set like I want, but the BufferControl is null.
    I think the H263_RTP and JPEG_RTP video formats don't support frame rate (I tried setting it but nothing happens) and the video is played as it comes.
    I need to set a buffer because the frame rate of the incoming stream is really low (it's a p2p application) and the smoothness of the video is bad.
    Please help me find any suggestion to solve this problem. Maybe I have to use a custom transport layer or something like that.

    cziz13 wrote:
    I don't know what you want to say.
    I said exactly what I wanted to say.
    cziz13 wrote:
    I got the BufferControl on a realized player (it controls rendering, and it's in the solution AudioBufferControl.java). I get the buffer even on a video file, but only when I play it from my hard drive; when I try to do it on a received video stream the buffer is not working.
    Good for you. But that wasn't my point.
    Whenever you request a "Player" object, you don't get a "Player" object, you get some specific "Player" subclass that, depending on what it's designed to do, will implement certain interfaces.
    The BufferControl object you're getting on your Player class that plays from the hard drive is more of a prefetch cache than a buffer (in the sense that I think of a buffer, at least). It's using a pull data source, so it can control getting data from its data source.
    However, the player object that's playing an RTP stream is relying upon a push buffer data source. It has absolutely no control over the data; it's told by the DataSource when data is ready to be played, so it just loads when it's told to by the DataSource.
    And the DataSource is just handed data from the RTP streams inside the RTPManager, based on the internal RTP buffering that is handled by the RTPManager...
    cziz13 wrote:
    Any more suggestions?
    My original "suggestion" still stands. Get your BufferControl object from the RTPManager that you're using to receive the stream...
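    Concretely, the suggestion above looks something like this (a minimal sketch; the class name RtpBufferSetup and the two values are illustrative, while the getControl key is the standard JMF idiom):
    import javax.media.control.BufferControl;
    import javax.media.rtp.RTPManager;

    public class RtpBufferSetup {
        // Ask the RTPManager itself (not the Player) for its BufferControl,
        // so the settings apply to the session-level jitter buffer that
        // feeds the video DataSource.
        static void configureJitterBuffer(RTPManager mgr) {
            BufferControl bc = (BufferControl)
                    mgr.getControl("javax.media.control.BufferControl");
            if (bc != null) {
                bc.setBufferLength(350);      // buffer length in ms (illustrative value)
                bc.setMinimumThreshold(100);  // fill level before playback resumes, in ms
            }
        }
    }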

  • Video RTP transmission problem

    Hey guys (and girls),
    I'm writing two programs that either send or receive an RTP video stream captured with a webcam. Using JMStudio I can capture and transmit the video and happily receive it with my client program, but when I try to capture and transmit the video using my server program, the client can't realize because it hasn't received any data.
    Here is my transmitter code; my server class creates a new instance of the camTransmitter class.
    import java.io.*;
    import java.awt.*;
    import java.util.Vector;
    import javax.swing.*;
    import java.awt.event.*;
    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.control.*;
    import javax.media.rtp.*;
    import javax.media.rtp.rtcp.*;
    import javax.media.rtp.event.*;
    import java.net.InetAddress;

    public class camTransmitter implements ControllerListener {
        Processor p = null;
        CaptureDeviceInfo di = null;
        boolean configured, realized = false;
        public String ipAddress = "192.168.123.150";
        public int port = 22222;
        public RTPManager rtpMgrs[];
        public DataSource vDS;
        public SessionAddress localAddr, destAddr;
        public InetAddress ipAddr;
        public SendStream sendStream;

        public camTransmitter() {
            // First, we'll need a DataSource that captures live video
            Format vFormat = new VideoFormat(VideoFormat.RGB);
            Vector devices = CaptureDeviceManager.getDeviceList(vFormat);
            if (devices.size() > 0) {
                di = (CaptureDeviceInfo) devices.elementAt(0);
            } else {
                // exit if we could not find the relevant capture device.
                System.out.println("no devices");
                System.exit(-1);
            }
            MediaLocator camLocation = di.getLocator();
            // Create a processor for this capture device & exit if we
            // cannot create it
            try {
                vDS = Manager.createDataSource(camLocation);
                p = Manager.createProcessor(vDS);
                p.addControllerListener(this);
            } catch (IOException e) {
                System.exit(-1);
            } catch (NoProcessorException e) {
                System.exit(-1);
            } catch (NoDataSourceException e) {
                System.exit(-1);
            }
            // at this point, we have successfully created the processor.
            // Configure it and block until it is configured.
            p.configure();
            while (configured != true) { }
            p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            TrackControl track[] = p.getTrackControls();
            boolean encodingOk = false;
            // Go through the tracks and try to program one of them to
            // output RGB data.
            for (int i = 0; i < track.length; i++) {
                if (!encodingOk && track[i] instanceof FormatControl) {
                    if (((FormatControl) track[i]).setFormat(vFormat) == null) {
                        track[i].setEnabled(false);
                    } else {
                        encodingOk = true;
                    }
                } else {
                    // we could not set this track to the format, so disable it
                    track[i].setEnabled(false);
                }
            }
            // Realize it and block until it is realized.
            p.realize();
            while (realized != true) { }
            // get the output datasource of the processor and exit if we fail
            DataSource ds = null;
            try {
                ds = p.getDataOutput();
            } catch (NotRealizedError e) {
                System.exit(-1);
            }
            PushBufferDataSource pbds = (PushBufferDataSource) ds;
            PushBufferStream pbss[] = pbds.getStreams();
            rtpMgrs = new RTPManager[pbss.length];
            for (int i = 0; i < pbss.length; i++) {
                try {
                    rtpMgrs[i] = RTPManager.newInstance();
                    ipAddr = InetAddress.getByName(ipAddress);
                    localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
                    destAddr = new SessionAddress(ipAddr, port);
                    System.out.println(localAddr);
                    rtpMgrs[i].initialize(destAddr);
                    rtpMgrs[i].addTarget(destAddr);
                    System.out.println("Created RTP session: " + ipAddress + " " + port);
                    sendStream = rtpMgrs[i].createSendStream(ds, i);
                    sendStream.start();
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                }
            }
            //System.out.println("RTP Stream created and running");
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("Configure Complete");
                configured = true;
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("Realizing complete");
                realized = true;
            }
        }
    }
    When I run this, the processor realizes and the RTP seems to stream, but I have no idea why I can't receive the video from this.
    Please help if anyone can. Thanks

    Dear andreyvk,
    I've read your post
    http://forum.java.sun.com/thread.jspa?threadID=785134&tstart=165
    about how to use a single RTP session for both media reception and transmission (I'm referring to your modified RTPSocketAdapter version), but at the moment I receive a BIND error.
    I think that your post is an EXCELLENT solution. I modified AVReceive3 and AVTransmit3 in order to accept all parameters (local IP & port, remote IP & port).
    Can you please give me a simple scenario so I can understand the mistake?
    I run AVTransmit3 and AVReceive3 from different prompts. If I run these 2 classes simultaneously on 2 different PCs (172.17.32.27 and 172.17.32.30) I can transmit the media (vfw://0 for example) using AVTransmit3, but I can't receive anything if I also run AVReceive3 on the same PC.
    What's the problem? Furthermore, if I first run AVReceive3 from an MS-DOS prompt and subsequently run AVTransmit3 from another prompt, I see a BIND error (port already in use).
    How can I use your modified RTPSocketAdapter in order to send and receive a single media stream from the same port (e.g. 7500)?
    I've used this scenario: PC1: IP 172.17.32.30, local port 5000, and PC2: IP 172.17.32.27, local port 10000.
    So on PC1 I run:
    AVTransmit3 vfw://0 <Local IP 172.17.32.30> <5000> <Remote IP 172.17.32.27> <10000>
    AVReceive3 <Local IP 172.17.32.30/5000> <Remote IP 172.17.32.27/10000>
    and on PC2:
    AVTransmit3 vfw://0 <Local IP 172.17.32.27> <10000> <Remote IP 172.17.32.30> <5000>
    AVReceive3 <Local IP 172.17.32.27/10000> <Remote IP 172.17.32.30/5000>
    I'd like to use the same port 5000 (on PC1) and 10000 (on PC2) in order to transmit and receive RTP packets. How can I do that without receiving a bind error? How can I receive packets (and play the media, if audio and/or video) from the same port used to send the stream over the network?
    How can I obtain an RTP symmetric transmission/reception solution?
    Please give me a hint. If you can't post, this is my email: [email protected]
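    One constraint worth making explicit (a sketch under the assumption that sending and receiving happen in the same JVM; this is not andreyvk's RTPSocketAdapter): a UDP port can only be bound once, so AVTransmit3 and AVReceive3 running as two separate processes can never share port 5000. Symmetric RTP means one RTPManager, bound once, used for both directions:
    import java.net.InetAddress;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SessionAddress;

    public class SymmetricSession {
        // Open one session bound to the local port and aimed at the remote peer.
        // The returned manager must be reused for BOTH createSendStream() and
        // for ReceiveStream events; a second bind on the same port would fail.
        public static RTPManager open(String localIp, int localPort,
                                      String remoteIp, int remotePort) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.initialize(new SessionAddress(InetAddress.getByName(localIp), localPort));
            mgr.addTarget(new SessionAddress(InetAddress.getByName(remoteIp), remotePort));
            return mgr;
        }
    }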

  • Send RTP stream to NAT address

    Hi,
    I want to transmit an RTP stream from a server to a host in a LAN.
    This host has a NAT address and not a real IP address, so I can't send any stream through the SessionManager API, because it needs to know a public IP.
    The other issue is that in a LAN, in the most common cases, there is a firewall that blocks connections from the internet to its hosts.
    I am thinking of this solution:
    1) The LAN's hosts can initiate the connection with the server by sending some dummy RTP data
    2) The server stores the SessionManager of this connection
    3) The server can now send its RTP stream
    Does someone have a better solution or any suggestion?
    Thanks for everything
    [email protected]

    I have an appletTransmitter that captures video from a webcam and transmits it to other clients on the internet.
    I am trying to transmit a MediaLocator from appletTransmitter to servlet1 and then save the MediaLocator as a servlet attribute, so that another client can connect to servlet2, which sends the saved MediaLocator to appletClient.
    APPLETTRANSMITTER:
    URL url = null;
    MediaLocator media = new MediaLocator("vfw://0");
    try {
        url = new URL("http://localhost:8080/servlet1");
    } catch (MalformedURLException mue) {
        mue.printStackTrace();
    }
    URLConnection conn = null;
    try {
        conn = url.openConnection();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
    conn.setDoOutput(true);
    OutputStream os = null;
    ObjectOutputStream oos = null;
    InputStream in = null;
    ObjectInputStream iin = null;
    MediaLocator mResp = null;
    String r = null;
    try {
        os = conn.getOutputStream();
        oos = new ObjectOutputStream(os);
        oos.writeObject(media);
        //oos.writeObject("Prova Servlet");
        oos.flush();
    } catch (IOException io) {
        io.printStackTrace();
    }
    SERVLET1:
    ObjectInputStream objin = new ObjectInputStream(request.getInputStream());
    MediaLocator ml = null;
    try {
        ml = (MediaLocator) objin.readObject();
        context.setAttribute("media", ml);
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    }
    But on servlet1 there is a ClassNotFoundException: MediaLocator.
    What do you think about the solution and the exception problem?
    Best regards,
    Nico from Italy

  • Using RTP stream as data source

    I have successfully written a program that reads G.711 audio from a wav file and sends it out over the network.
    I'm trying to modify that program so that instead of getting its source audio from a file, it receives the audio from an RTP stream. So, in effect, it is simply receiving audio from one socket and sending it back out another.
    The error I'm encountering:
    [java] javax.media.NoProcessorException: Cannot find a Processor for: rtp://172.30.18.140:32916
    [java] at javax.media.Manager.createProcessorForContent(Manager.java:1663)
    [java] at javax.media.Manager.createProcessor(Manager.java:627)
    Here is the code fragment where the exception occurs:
    processor = javax.media.Manager.createProcessor(new MediaLocator("rtp://172.30.18.140:32916"));
    Can anybody help? Thanks!

    Hi,
    I had a very similar problem, but in addition I had to store data in a buffer before retransmission. I resolved it by employing low-level classes. At this time, I still do not know any other solution, and nobody helped me in any way, including in this forum...
    So, first you have to access the single frames of the stream (Buffer or ExtBuffer objects) using a RawBufferParser; then you have to reconstruct a DataSource by multiplexing the frames (use an appropriate Multiplexer class, such as RTPSynchBufferMux).
    The big problem is that there is no documentation available about how to use these classes: I have learned all I know by "reverse engineering" (lots of hours spent reading very long, annoying code). A little help can be had from the PlugIn Viewer of JMStudio, which tells you which classes are to be employed.
    I have not tried to retransmit the data "straightforward", without accessing single frames, because that was not my task. Maybe if you use RTPManager, it is possible to get a DataSource from the ReceiveStream object, and then to pass it to another RTPManager to create a SendStream.
    I do not know any simpler way to do that, but this does not mean that it does not exist...!
    Good luck.
    Alberto M. (Italy)
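    The RTPManager route mentioned in the last paragraph looks roughly like this (a sketch; the listener class name is invented and session setup is omitted). The point is that the Processor is created from the ReceiveStream's DataSource rather than from an "rtp://" MediaLocator, which is what throws the NoProcessorException above:
    import javax.media.Manager;
    import javax.media.Processor;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.ReceiveStream;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.event.NewReceiveStreamEvent;
    import javax.media.rtp.event.ReceiveStreamEvent;

    public class RtpSourceListener implements ReceiveStreamListener {
        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    ReceiveStream rs = ((NewReceiveStreamEvent) evt).getReceiveStream();
                    DataSource ds = rs.getDataSource();
                    Processor p = Manager.createProcessor(ds);
                    p.configure(); // then program its track to G.711 and wire a send stream
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }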

  • RTP streaming

    Hi guys,
    what I need to do is to stream YUV images between two JMF applications. I am using the VideoTransmit application (taken from this site) to transmit, and JMStudio to receive. From a little research I have noticed the following steps to transmit:
    1) Create the DataSource using Manager.createDataSource(MediaLocator)
    2) Create a Processor using the above created DataSource
    3) Create a player, etc.
    My question now is the following:
    How can I transmit a YUV file? How can I create the DataSource that holds this YUV file without causing problems for the further creation of the Processor, etc.?
    Thanks in advance guys,
    Chris

    Ok, I think I know what was confusing me. Apparently setRate() isn't affecting the rate of the stream. The audio file I am streaming usually runs about five minutes, so I sped the rate up to 10. That sped up the process of writing to a file significantly, but writing to an RTP stream seems to take the full run-time of the file. Does this sound right, or is there something else I'm missing?
    Thanks,
    Khanathor

  • RTP-Streaming (MP3)

    Dear Java-Users,
    we like to build a little RTP-Streaming radio.
    We found a lot of code samples for example the AVTransmit/Recieve from java.sun.
    Our client is working well and is able to recieve RTP-Streams.
    But our server isn't running at the moment.
    When we start both (server and client) on one local machine, the client can recieve the Stream. When we start the server on an other machine from the same network, the client cannot recieve the stream.
    Well there are two problems left :
    1) As I said the server does not work except from localhost.
    2) We read that there is no MP3 support on Java-RTP-Streaming. But as you know MP3 is the most used audio-format and that's why we like to implement it.
    If somebody could help me, I would be really glad.
    Please send me an email or post here. ([email protected])
    I appreciate for your time. with good greetings,
    Tobias Belch

    Really, I have to pay to use MP3?
    No, I didn't know this.
    I want to use MP3 because I want to build a simple and small client/server for RTP streaming over a standard network. And MP3 is the most-used format, so it is simply the most comfortable for the user.
    But thank you very much for your hint.
    Greetings

  • Play and save RTP streams

    Hello
    I need help. Can someone tell me how I can play and save an RTP stream at the same time?
    Thanks
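    One standard JMF approach (a sketch with invented class and method names; the DataSink wiring is the same as in the recording example earlier on this page): clone the incoming DataSource, play one copy and record the other.
    import javax.media.Manager;
    import javax.media.Player;
    import javax.media.Processor;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.SourceCloneable;

    public class PlayAndSave {
        // Duplicate the incoming RTP DataSource so one copy can be played
        // while the other is written to a file.
        static void split(DataSource rtpSource) throws Exception {
            DataSource cloneable = Manager.createCloneableDataSource(rtpSource);
            DataSource forSaving = ((SourceCloneable) cloneable).createClone();
            Player player = Manager.createPlayer(cloneable); // play the original
            player.start();
            Processor saver = Manager.createProcessor(forSaving);
            // ... configure 'saver' and hook it to a DataSink as shown in the
            // CallRecorderImpl example above.
        }
    }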

    Thanks for your response.
    Very much appreciated; it was very informative.
    This is my current situation with your 3 suggestions:
    Daniele,
    Your suggestion 1's result:
    In Wireshark, under Statistics, I have VoIP Calls.
    (I don't see VoIP Calls under Telephony; it may be a different version of Wireshark.)
    Anyway, there is only one call, because Wireshark had a capture filter to track traffic between one source and one destination IP address. So I select that call, click on the Player button and then click on the Decode button. Then I select the forward stream (from IP1 to IP2), click on Play, and I don't hear anything at all. All silence. Same when I select the reverse stream from IP2 to IP1 and play.
    Your suggestion 2's result:
    In Wireshark, under Statistics, I selected Stream Analysis (not Show All Streams; I'm not sure what the difference is), then Save Payload, and selected "au" instead of raw. It says: "Can't save in a file: saving in au format supported only for alaw/ulaw streams".
    Your suggestion 3's result:
    Saved the file in .raw format. Opened Audacity and imported the file as raw data, specifying first the A-law codec for G.711A at 8000 Hz, and that didn't work; then I tried the u-law coding for G.711u, again with the sample frequency equal to 8000 Hz, and that didn't work either.
    "Didn't work" means: when I play the imported audio I get all noise (like a heavy metallic sound) and no voice.
    So my guess is that this capture is neither A-law nor u-law codec, right? This capture was given to me by a customer.
    Any other suggestions would be much appreciated, Daniele.
