RTP-Streaming (MP3)

Dear Java users,
we would like to build a little RTP streaming radio.
We found a lot of code samples, for example AVTransmit/AVReceive from java.sun.
Our client is working well and is able to receive RTP streams.
But our server isn't running at the moment.
When we start both (server and client) on one local machine, the client can receive the stream. When we start the server on another machine in the same network, the client cannot receive the stream.
So there are two problems left:
1) As I said, the server does not work except from localhost.
2) We read that there is no MP3 support in Java RTP streaming. But as you know, MP3 is the most widely used audio format, and that's why we would like to implement it.
If somebody could help me, I would be really glad.
Please send me an email or post here. ([email protected])
I appreciate your time. With kind greetings,
Tobias Belch
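
As a hedged pointer on problem 1): a frequent cause of "works on localhost only" is binding or targeting the loopback interface. Below is a minimal sketch of unicast RTPManager setup with explicit addresses; the host and port values are placeholders, not taken from this thread, and firewalls on the UDP ports can produce the same symptom.

    import java.net.InetAddress;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SessionAddress;

    void startSession() throws Exception {
        // Bind to a real network interface and target the remote machine
        // explicitly; using 127.0.0.1 here reproduces the localhost-only symptom.
        RTPManager mgr = RTPManager.newInstance();
        SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), 42050);
        SessionAddress remote = new SessionAddress(InetAddress.getByName("192.168.1.20"), 42050);
        mgr.initialize(local);
        mgr.addTarget(remote);
        // createSendStream(dataSource, 0) and SendStream.start() would follow here.
    }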

Do I really have to pay to use MP3?
No, I didn't know this.
I want to use MP3 because I want to build a simple and small client/server for RTP streaming over a standard network. And MP3 is the most widely used format, so it is simply the most comfortable choice for the user.
But thank you very much for your hint.
greetings

Similar Messages

  • Streaming MP3 with RTP

    Hello,
    I am trying to implement an RTP server and an RTP client; for that purpose I am using JMF. It seems to work well until I try to transmit MP3 files. When I transmit MP3, I get the following error message:
    Unable to handle format: mpeglayer3, Unknown Sample Rate
    Failed to realize: com.sun.media.PlaybackEngine@1700391
    Error: Unable to realize com.sun.media.PlaybackEngine@1700391
    Error in ControllerErrorEvent: javax.media.ResourceUnavailableEvent[source=com.sun.media.content.unknown.Handler@fa39d7,message=Failed to realize: input media not supported: mpeglayer3 audio]
    I have installed the MP3 plugin, and I have used the AVTransmit2 and AVReceive2 classes with some modifications to try to solve the problem, but I cannot solve it.
    I tried to use JMStudio on both sides (server and client) and I got the same error.
    -->failed to handle a data format change!
    I would like to add that I have used two different computers (one for the server and one for the client) to run all the tests.
    Could you give me some advice, help or a solution for this problem? If someone has a solution (code) to transmit streaming MP3 using RTP and knows that it runs OK, could you show me?
    I am doubting whether MP3/RTP is supported in JMF. If someone knows another alternative to implement a streaming MP3 audio server and client, could you tell me what it is?
    Thank you very much, I hope that someone can help me.
    PS: Sorry for my English, I think that it is not so good.

    Hi, I had the same problem but I managed to solve it.
    First of all you have to install the MP3 plug-in, and as you mentioned you already did that. Make sure the mp3plugin.jar file is inside lib/ext of the JRE you are using for your project. One important thing: first register the plug-in by issuing the command
    java com.sun.media.codec.audio.mp3.JavaDecoder
    It will modify the jmf.properties binary file. Once you're done with this, put the updated jmf.properties file in your project folder. After this your code should run perfectly. In case of any problem, post it.
    cheers
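    If running that command is inconvenient, the same registration can be done programmatically at startup. A minimal sketch; the input/output format list mirrors what the JavaDecoder step is understood to register, so treat the exact formats as an assumption to verify:
        import javax.media.Format;
        import javax.media.PlugInManager;
        import javax.media.format.AudioFormat;

        public class RegisterMp3 {
            public static void main(String[] args) throws Exception {
                // Register the JMF MP3 decoder without editing jmf.properties by hand.
                Format[] in = {
                    new AudioFormat(AudioFormat.MPEGLAYER3),
                    new AudioFormat(AudioFormat.MPEG)
                };
                Format[] out = { new AudioFormat(AudioFormat.LINEAR) };
                PlugInManager.addPlugIn("com.sun.media.codec.audio.mp3.JavaDecoder",
                                        in, out, PlugInManager.CODEC);
                // Persist the registration into jmf.properties for later runs.
                PlugInManager.commit();
            }
        }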

  • How can I manage controls in an RTP Streaming

    Hello,
    I'm currently using RTP streaming in order to play some music. I started with AVReceive2 and AVTransmit2 from Sun, and by adding a plugin I managed to play MP3. So far so good.
    Because I didn't want to use the view components given by Sun, I created my own GUI for my player and I use it instead of the one in Sun's examples.
    I use listeners for my "buttons" (I prefer using MouseListener and Panel like in Sun's Jamp example) but I can't find a way to make those buttons take effect.
    If I press pause, it has to pause the streaming in AVTransmit2.
    If I press next, or previous, it has to change the AVTransmit2 process.
    Each control has to be executed in AVTransmit2, but my player is part of AVReceive2.
    My first guess was to check whether I could use an event to tell my AVTransmit2 object to execute my controls, but I haven't found a way to tell my AVTransmit2 object that it was this button I pressed.
    I finally gave up trying to transmit my commands through events.
    I then tried to find a way to get my AVTransmit2 object into my AVReceive2 object so that I could call methods on it, but I failed :(
    How can I manage my controls so that they call methods on my server and not my client?
    Thanks :)
    Shad.
    Edited by: Shadwolf on Feb 9, 2010 2:15 AM

    Hi, again,
    For the next & previous buttons, I was thinking about something.
    Do you think it would be good to create a new class RTPClientManager which extends RTPManager and holds 2 booleans, previous & next, that are set to true when I press the buttons?
    From there, couldn't I modify my function in AVTransmit2 like this (AVTransmit2 implements ReceiveStreamListener):
    @Override
    public void update(ReceiveStreamEvent evt) {
        RTPClientManager mgr = (RTPClientManager) evt.getSource();
        Participant participant = evt.getParticipant();   // could be null.
        ReceiveStream stream = evt.getReceiveStream();    // could be null.
        // Detect the client closing its connection so the streaming transmission can be shut down
        if (evt instanceof ByeEvent) {
            System.err.println("  - Got \"bye\" from: " + participant.getCNAME());
            if (mgr.isNext()) {
                this.stop();
                /* SOME STUFF FOR NEXT */
            } else if (mgr.isPrevious()) {
                this.stop();
                /* SOME STUFF FOR PREVIOUS */
            } else { // means it's the stop button that was pressed
                this.stop();
            }
        }
    }
    Would it work?
    If you have better ideas I am ready to hear them :P
    But if this works, I will manage stop, next and previous, but how could I manage the play & pause buttons?
    I can't find a proper solution to manage my controls :(
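    RTP and RTCP don't carry application-level commands like play/pause/next, so one common approach is a separate control channel alongside the media session. Below is a minimal hedged sketch of a TCP command listener the transmitter could run; the port and command names are made up for illustration, and the handler bodies are left to AVTransmit2:
        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStreamReader;
        import java.net.ServerSocket;
        import java.net.Socket;

        // Transmitter side: accept one receiver and read one-line commands.
        public class ControlChannel implements Runnable {
            public void run() {
                try (ServerSocket server = new ServerSocket(5000);
                     Socket s = server.accept();
                     BufferedReader in = new BufferedReader(
                             new InputStreamReader(s.getInputStream()))) {
                    String cmd;
                    while ((cmd = in.readLine()) != null) {
                        if ("PAUSE".equals(cmd)) {
                            // call the transmitter's pause logic
                        } else if ("NEXT".equals(cmd)) {
                            // stop the current stream and load the next file
                        } else if ("PREV".equals(cmd)) {
                            // stop the current stream and load the previous file
                        }
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    The receiver would open a plain Socket to the same port and write one line per button press, which keeps the GUI code entirely on the AVReceive2 side.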

  • Write(Export) the RTP Stream from a SIP Call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start to record.
    The problem is that I get a file and can play it, but there is no sound
    when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    I tried FileTypeDescriptor.WAVE with AudioFormat.IMA4_MS and got a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot be realized;
    the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream(); // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
            }
            return;
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;
        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    Which shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /* TrackControl track[] = p.getTrackControls();
               boolean encodingPossible = false;
               for (int i = 0; i < track.length; i++) {
                   try {
                       track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                       encodingPossible = true;
                   } catch (Exception e) {
                       track[i].setEnabled(false);
                   }
               } */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
            }
            System.out.println("Just before start");
            p.start();
        }

        /* public void dataSinkUpdate(DataSinkEvent event) {
               if (event instanceof EndOfStreamEvent) {
                   try {
                       System.out.println("EndOfStreamEvent");
                       dsk.stop();
                       dsk.close();
                       System.exit(1);
                   } catch (Exception e) {
                   }
               }
           } */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }
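    For the original WAV goal above: one combination that is commonly reported to realize for G.711/ULAW RTP input is keeping WAVE as the container but transcoding the track to linear PCM. A hedged two-line substitution in CallRecorderImpl (an assumption to verify, not a confirmed fix):
        // Keep WAVE as the container, but transcode the ULAW track to linear PCM.
        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.LINEAR);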

  • Streaming MP3s

    Hi all,
    I am trying to stream MP3 using JMF and RTP; however, I am getting an error at the receiver saying:
    RTP Handler internal error: javax.media.ControllerErrorEvent[source=com.sun.media.content.unknown.Handler@183f74d,message=Internal module com.sun.media.BasicRendererModule@1386000: failed to handle a data format change!]
    I have installed the mp3 plugin and the mp3 does play locally using a simple JMF player. The stream is also being transmitted properly, because I was able to play it using VLC player. I can post my code if needed.
    Please, please, please could someone help me or suggest a better way than JMF for streaming audio in any language.
    Thanks

    I have found the answer. MP3s cannot be streamed under the MPEG format; they cannot be decoded. However, DVI can, plus others listed at:
    http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/formats.html
    The quality is shit, but for voice or other purposes it should be fine.
    Cheers anyway
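    A hedged sketch of what switching to DVI looks like on the transmit side (AVTransmit2-style; the 8 kHz / 4-bit / mono parameters are assumptions the codec chain must support):
        // In the configured-Processor step: program the audio track to a
        // payload JMF can packetize for RTP (DVI instead of MP3).
        TrackControl[] tracks = processor.getTrackControls();
        for (int i = 0; i < tracks.length; i++) {
            if (tracks[i].getFormat() instanceof AudioFormat) {
                tracks[i].setFormat(new AudioFormat(AudioFormat.DVI_RTP, 8000, 4, 1));
            }
        }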

  • How to extract data from Buffer and create a RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create a RTP stream, only having access to the byte array "data" in Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create a RTP stream, only having access to the byte array "data" in Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
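    As a hedged illustration of the extraction step: the bytes and the Format both have to travel over the custom protocol. The handleChunk method below is hypothetical, standing in for whatever the custom protocol's send looks like:
        // Copy the valid payload region out of a JMF Buffer; the backing
        // array may be larger than the payload, so honor offset and length.
        public void processBuffer(javax.media.Buffer buffer) {
            byte[] data = (byte[]) buffer.getData();
            int offset = buffer.getOffset();
            int length = buffer.getLength();
            javax.media.Format format = buffer.getFormat(); // needed to rebuild the stream
            byte[] chunk = new byte[length];
            System.arraycopy(data, offset, chunk, 0, length);
            handleChunk(chunk, format); // hypothetical custom-protocol send
        }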

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this may be the root cause, but when creating cloneable data sources from incoming RTP sources, I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }
    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)
    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    Turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conf. server does the following:
        Party(RTPManager mgr, DataSource ds) {
            this.mgr = mgr;
            this.ds = Manager.createCloneableDataSource(ds);
        }

        synchronized DataSource cloneDataSource() {
            DataSource retVal;
            if (getNeedsCloning()) {
                retVal = ((SourceCloneable) ds).createClone();
            } else {
                retVal = ds;
                setNeedsCloning();
            }
            return retVal;
        }

        private void setNeedsCloning() {
            needsCloning = true;
        }

        private boolean getNeedsCloning() {
            return needsCloning;
        }

        private synchronized void addSendStreamFromNewParticipant(Party newOne)
                throws UnsupportedFormatException, IOException {
            debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
            Iterator pIt = participants.iterator();
            while (pIt.hasNext()) {
                Party p = (Party) pIt.next();
                assert p != newOne;
                // update existing participant
                SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
                sendStream.start();
                // send data from existing participant to the new one
                sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
                sendStream.start();
            }
            debug("*** - Done creating the streams.");
        }
    So I made some progress, but I'm still not quite there.
    The RTP manager JavaDoc for createSendStream states the following :
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conf. server creates 2 send streams to each of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • Mixing audio RTP-Streams to conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (Audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G.711 ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I met this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about the Many2One class and more and more, but found nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write merged streams to a file, so it seems possible to write them to a file and then transmit to the net from that file. But how to realize more than one Processor?..

  • Streamed MP3 music files losing first few seconds

    My Apple TV seems to drop the first few seconds of MP3 music files when they are streamed from iTunes, but not when they are synced to the Apple TV.
    The number of seconds that are dropped varies from none at all for some songs to 20, or much more than that ... but seemingly never more than a minute.
    When I select one certain song in a particular playlist -- "Song for the Mira" by the Garrison Bros. -- the time shown is 5:19. When I press play on the Apple TV remote the time changes to 5:00, very briefly ... just before the Apple TV starts to play the song with its first 19 seconds missing, as if it were really a 5:00 song.
    From that time on (unless I reload the iTunes library) every time I select the same song from any playlist, or from under Songs, or Albums, or whatever, the time is always shown right away as 5:00. So Apple TV is "remembering" the time of the song.
    If I cause Apple TV to reload that iTunes library, the time goes back to 5:19 until I actually play the song again, at which time it becomes 5:00 again.
    The amount of time that is sliced off from the beginning of a song is always consistent for that song. So if the amount is 12 seconds one time for a particular song, it is always 12 seconds.
    I have not altered the start or end points of affected songs in iTunes.
    As far as I can tell, this never happens with Protected AACs.
    It never happens with synced copies of the MP3s, only with streamed playback.
    I have the latest versions of iTunes (7.3.2) and of the Apple TV software (1.1).
    Restoring the Apple TV to factory settings, re-downloading the software, etc., have no effect on the problem. Nor does restarting my iMac, which is running OS X 10.4.10. Nor does restarting the various base stations on my wireless network, all of which have the latest firmware.
    Anyone else have a problem with the beginnings of streamed MP3s being truncated? Anyone know what to do about it? Thanks.

    Since my last post to this thread I have discovered:
    (1) I was wrong in thinking there is no pattern to the number of seconds lopped off at the beginning of various MP3s when streamed to my Apple TV. Actually, I have found that the same number of seconds is dropped from every song in a particular album or compilation!
    So my "Sounds of Nova Scotia, Vol. 1" compilation -- which contains the "Song for the Mira" that is reduced from 5:19 to 5:00 at the start of Apple TV's streamed playback -- has 19 seconds removed at the start of every track when its tracks are streamed to Apple TV. (It does not appear to be significant that this particular track seems to be rounded to the nearest minute. Cutting the first 19 seconds off the other tracks in the compilation does not bring them to an even minute.)
    My Beatles' "Abbey Road" album has not 19 but 30 seconds dropped at the start of every streamed track!
    (2) I was also wrong in reporting that the time change from, say, 5:19 to 5:00 is "remembered" by Apple TV the next time I select the affected track from a list on the TV screen. Actually, the next time I select the track, it shows up as 5:19 again. But if I press Play, that changes to 5:00 in the split second just before the track actually starts to play ... and then the progress bar that appears at that point, when the actual song is being played, reflects a 5:00 total time, not 5:19.
    To recap what I said before, this happens only when I stream (even from the syncing iTunes), not when I play a synced copy. AACs are unaffected. Not all MP3s seem to be affected, though all in an affected album or compilation are affected the same way, by having an identical number of seconds lopped off. I have the latest software/firmware in all my devices. Restarts/resets do not help. The problem persists in copies of copies of the affected MP3 tracks streamed from iTunes _on my other Mac_, a MacBook Pro. The problem does not occur when I play the tracks right in iTunes on either machine. I am not using manually set start and stop times on the affected tracks in iTunes. I do not have *Part of a Gapless Album* or *Remember Playback Position* set in Info in iTunes Preferences, though I have tried setting *Part of a Gapless Album*, with no effect on the problem.
    Anyone have any more ideas as to what might be causing this? Thanks.

  • Can I play and save the incoming RTP streams simultaneously

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using
    JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in advance
    bye
    Srikanth

    I think that, today, this is lamentably impossible. However much I have tried, I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks
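    For reference, the approach one would normally try is the same cloning trick as in the conference-server thread above: play one branch and record the other. A hedged sketch with exception handling elided; whether it actually works for live RTP input is exactly what this thread disputes:
        // Clone the incoming RTP DataSource: play one branch, record the other.
        DataSource ds = stream.getDataSource();
        DataSource cloneable = Manager.createCloneableDataSource(ds);
        DataSource recordBranch = ((SourceCloneable) cloneable).createClone();
        // Branch 1: live playback of the cloneable source itself.
        Player player = Manager.createRealizedPlayer(cloneable);
        player.start();
        // Branch 2: feed recordBranch to a Processor + DataSink to write the
        // file (as in the CallRecorderImpl example earlier on this page).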

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?
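    A hedged sketch of what the API Guide passage describes, assuming both players are realized and stopped, and that the push-source player (here the video player) supplies the time base:
        // Let the video player's time base drive the audio player; both calls
        // throw IncompatibleTimeBaseException when a player cannot adopt a
        // foreign time base.
        try {
            audioPlayer.setTimeBase(videoPlayer.getTimeBase());
            videoPlayer.addController(audioPlayer);
        } catch (IncompatibleTimeBaseException e) {
            e.printStackTrace();
        }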

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end...so you should look into the other side of the equation.

  • To retransmit the rtp streams received

    hi!
    Program A transmits media data to another program B using RTPSessionMgr. What B has to do is send the received streams to another receiver, program C, which has to play the stream. That is, I am trying to implement a client-router-server model.
    So I maintained a session between A & B and also between B & C. Once a NewReceiveStream event occurs at B, it retrieves the DataSource from the stream, uses the createSendStream() method of the session manager, and sends to C.
    My problem is that C receives the audio and video streams but never plays them. I get a pink screen for video and no audio.
    Can somebody help me out in pointing out my mistake, or tell me a new strategy to implement this?
    Actually, I have tried to use only datagram sockets on the router side, i.e. I created 2 sockets for B to receive data and forwarded to C using 2 other sockets. And this did work. But then the client does not know the sender details. I need to maintain the sender and receiver reports, so I went for the session manager (RTPManager / RTPSessionMgr).
    kindly help.
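    A hedged sketch of the forwarding step at B as described; downstreamMgr is a stand-in name for the RTPManager of the B-to-C session, and error handling is minimal:
        // At B: when a stream arrives from A, forward its DataSource toward C.
        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    ReceiveStream stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                    DataSource ds = stream.getDataSource();
                    SendStream out = downstreamMgr.createSendStream(ds, 0); // session toward C
                    out.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }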

    Hi all!
    Nice to meet you. It is very exciting that I got a project which will integrate JMF and JXTA: it gets RTP streams from the JMF framework and then sends them to the JXTA framework, to be delivered to any peer in some groups for visualization. So some aspects of it are similar to yours; would you mind adding me to your MSN contact list? (My MSN account is: [email protected])

  • Problems streaming mp3's

    Hi,
    I'm having some trouble with streaming mp3's and clicking
    links at the same time. It seems that while the mp3 file is
    downloading it's impossible to click on links; everything freezes until the
    song is fully downloaded. To view the mp3 player I've built with a
    friend, please feel free to visit
    http://www.musicplayer.fm/ -
    if you sign in with username: "test" password: "password" you can
    play any song, and you'll see what I mean. I'm stuck
    getting around this problem.
    I've tried embedding the player using the SWFobject method in
    javascript, cross domain streaming... and nothing seems to work.
    The player is streaming the mp3 using loadSound(location, true);
    and the location is fed using an xml file.
    Thanks for any advice in advance.
    Matt

    No surprise at all - these are two different technologies for two entirely different purposes.
    A podcast is meant for files to be downloaded to a user's computer so that they can be listened to at any time or transferred to an iPod. A streamed file is more like a radio broadcast - it can only be listened to while the streaming server is "broadcasting" the file to the client computer.
    If you want to take advantage of both options you'll need to upload your file to two different locations.

  • Capture streaming MP3 to file

    Is there a way to open an MP3 stream and write it to a file? A Java-based way to record internet radio for later listening?

    1) Can someone point me to a URL where if I connect it will send me streaming MP3 data?
    No. I mean, I could make a URL that sends you MP3 data, but it'll only be what I want to send, not necessarily what you want.
    The file you get is a meta file, which tells where the content is. If you want to connect to that, you need to know what the protocol is. If it's not publicly available, then I'm sure a reasonably motivated person could sniff it out. And that's presuming that the audio is actually MP3 data to begin with.
    2) Is there a better way of doing this?
    That depends. Better for whom? You the application developer? The end user? The copyright owner?
    Back to your original questions, which were actually reasonable...
    Is there a way to open an MP3 stream and write it to a file?
    If you have an MP3 stream, it's either a TCP or UDP stream, in which case, yes, you can have Java connect and record it, presuming you can talk to the server in the appropriate protocol that the server expects.
    A Java-based way to record internet radio for later listening?
    This is entirely up to the radio protocol.
    You'd probably be better off intercepting the raw audio from the sound card, as it's more universal.
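    As a hedged sketch of the "connect and record" case for a plain HTTP MP3 stream (the URL is a placeholder; real stations usually hand out a .pls/.m3u playlist pointing at the actual stream):
        import java.io.FileOutputStream;
        import java.io.InputStream;
        import java.net.URL;

        public class StreamRipper {
            public static void main(String[] args) throws Exception {
                // Copy raw MP3 bytes from the stream to a local file until
                // the connection closes or the process is stopped.
                URL url = new URL("http://example.com/stream.mp3");
                try (InputStream in = url.openStream();
                     FileOutputStream out = new FileOutputStream("capture.mp3")) {
                    byte[] buf = new byte[8192];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n);
                    }
                }
            }
        }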

  • Receiving Video RTP Stream (JMF) in JME (MMAPI) - URGENT !!!

    Hi Folks...
    I'm trying to develop an application that sends the images from a webcam connected to the computer to a PDA, so that the images can be viewed by the user...
    My code for the JMF RTP video stream is as follows.
    Processor proc = null;
    javax.media.protocol.DataSource ds = null;
    TrackControl[] tc = null;
    int y;
    boolean encodingOk = false;
    Vector<javax.media.protocol.DataSource> datasources = new Vector<javax.media.protocol.DataSource>();
    for (int x = 0; x < camerasInfo.length; x++) {
        try {
            proc = Manager.createProcessor(camerasInfo[x].getLocator());
        } catch (NoProcessorException e) {
            System.out.println("Error instantiating PROCESSOR: " + e);
        } catch (IOException e) {
            System.out.println("Error instantiating PROCESSOR: " + e);
        }
        proc.configure();
        try {
            Thread.sleep(2000); // crude wait for the configure transition
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
        proc.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
        tc = proc.getTrackControls();
        for (y = 0; y < tc.length; y++) {
            if (!encodingOk && tc[y] instanceof FormatControl) {
                if (((FormatControl) tc[y]).setFormat(new VideoFormat(VideoFormat.RGB)) != null) {
                    // mark encoding OK once one track accepts the format
                    tc[y].setEnabled(true);
                    encodingOk = true;
                }
            } else {
                tc[y].setEnabled(false);
            }
        }
        proc.realize();
        try {
            Thread.sleep(2000); // crude wait for the realize transition
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }
        try {
            ds = proc.getDataOutput();
        } catch (NotRealizedError e) {
            System.out.println("Error realizing datasource: " + e);
        } catch (ClassCastException e) {
            System.out.println("Error realizing datasource: " + e);
        }
        datasources.add(ds);
        System.out.println(ds.getLocator());
        encodingOk = false;
    }
    MediaLocator ml = new MediaLocator("rtp://10.1.1.100:99/video");
    try {
        DataSink datSink = Manager.createDataSink(ds, ml);
        datSink.open();
        datSink.start();
    } catch (NoDataSinkException e) {
        System.out.println("Error instantiating DataSink: " + e);
    } catch (SecurityException e) {
        System.out.println("Error starting DataSink: " + e);
    } catch (IOException e) {
        System.out.println("Error starting DataSink: " + e);
    }
    I'm not sure whether this code is correct... is it?
    So... the next part of the system runs on the PDA.
    The code that accesses this RTP stream is as follows.
    VideoControl c = null;
    try {
        player = Manager.createPlayer("rtp://10.1.1.100:99/video");
        c = (VideoControl) player.getControl("VideoControl");
        tela = (Item) c.initDisplayMode(GUIControl.USE_GUI_PRIMITIVE, null);
        player.start();
        append(tela);
    } catch (IOException e) {
        str.setText(e.toString());
        append(str);
    } catch (MediaException e) {
        str.setText(e.toString());
        append(str);
    }
    So when the app tries to create a player for "rtp://10.1.1.100:99/video", a MediaException is thrown:
    javax.microedition.media.MediaException: Unable to create Player for the locator: rtp://10.1.1.100:99/video
    So... I don't know what is happening =/
    Is the error in the PDA module? Or in the computer's initialization of the RTP video streaming?
    I need to finish this job by next week... so any help is useful.
    Waiting for answers
    Rodrigo Kerkhoff

    First of all: before going on to the J2ME part, make sure the server works before doing anything else! Apparently, it doesn't...
    The MediaLocator is generally used to specify a kind of URL describing where the data put into the DataSink should go. In your case, this cannot be just some URL where you want it to act as an RTP server. You'll need to implement that server yourself, I guess.
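    One hedged observation on the transmit side of this thread: with a RAW_RTP content descriptor, each enabled track has to carry an RTP payload format, and VideoFormat.RGB is not one. A sketch of the substitution (JPEG_RTP chosen purely as an illustration, not a confirmed fix for the PDA error):
        // Replace RGB with an RTP-capable video payload format.
        if (((FormatControl) tc[y]).setFormat(new VideoFormat(VideoFormat.JPEG_RTP)) != null) {
            tc[y].setEnabled(true);
            encodingOk = true;
        }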
