Multiple RTP streams + local video Player - S.O.S.

Hi guys,
I ran out of ideas, so I need your help now. It's gonna be a long one...
Given:
MediaLocator(vfw://0) --> DataSource(video) --> Processor(video)
Then I send the processor's output (videoProcessor.getDataOutput()) over RTP to multiple destinations. Of course I'm using cloneable DataSources for this, since there's no other way. Now I also need a local video feed just for self-view, but that's turning out to be a tough one! Let me go over what I've tried so far:
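For readers following along, the setup described so far looks roughly like this (a sketch only: the variable names are mine, and configure/realize handling is omitted):

```java
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.protocol.DataSource;
import javax.media.protocol.SourceCloneable;

// Capture device -> DataSource -> Processor, as described above.
MediaLocator loc = new MediaLocator("vfw://0");
DataSource videoSource = Manager.createDataSource(loc);
Processor videoProcessor = Manager.createProcessor(videoSource);
// ... configure and realize videoProcessor, set a RAW_RTP content descriptor ...

// Wrap the processor output so one clone can go to each RTP destination.
DataSource out = videoProcessor.getDataOutput();
DataSource cloneable = Manager.createCloneableDataSource(out);
DataSource cloneForDest1 = ((SourceCloneable) cloneable).createClone();
DataSource cloneForDest2 = ((SourceCloneable) cloneable).createClone();
```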
1. Using a DataSource clone from the video processor, I tried to create a Player (i.e. Manager.createPlayer(clonedDS)). BOOM!
javax.media.NoPlayerException: Cannot find a Player for: com.ibm.media.protocol.SuperCloneableDataSource$PushBufferDataSourceSlave@126c6ea
     at javax.media.Manager.createPlayerForSource(Manager.java:1512)
     at javax.media.Manager.createPlayer(Manager.java:500)
     at org.interlab.mc.media.MediaManager.setVideoProcessorEnabled(MediaManager.java:2046)
     at org.interlab.mc.media.MediaManager.openVideoStreams(MediaManager.java:1807)
     at org.interlab.mc.UserAgent.callStateChanged(UserAgent.java:2516)
     at org.interlab.mc.call.Call.fireCallStateChangedEvent(Call.java:366)
     at org.interlab.mc.call.Call.setState(Call.java:244)
     at org.interlab.mc.call.CallManager.processInviteOK(CallManager.java:330)
     at org.interlab.mc.UserAgent.processResponse(UserAgent.java:1632)
     at gov.nist.javax.sip.EventScanner.deliverEvent(EventScanner.java:288)
     at gov.nist.javax.sip.EventScanner.run(EventScanner.java:489)
     at java.lang.Thread.run(Unknown Source)

So I checked. If I create a clone from my DataSource(video), then Manager returns this instance:
class com.sun.media.multiplexer.RawBufferMux$RawBufferDataSource

But if, like now, I create the clone from the video processor's output DataSource (i.e. Processor.getDataOutput()), then I get this instance:
com.ibm.media.protocol.SuperCloneableDataSource$PushBufferDataSourceSlave

Hope you noticed the difference. :) From the latter I can create neither a Player nor a Processor. Too bad. So I tried another approach.
2. I took my DataSource(video), created a cloneable (just like in Clone.java), and from there I finally got my precious Player! Yes!! But not so fast... The Player was working, but the rest wasn't anymore. Once I had my DataSource(video) cloned, my Processor(video)'s data output got starved: I could not send video anymore. What a life?! Fine... Let's try another one.
3. Just like I send video to the remote parties, I decided to send the video locally in a loop, for instance from 203.159.2.3:25000 to 203.159.2.3:25002. And what do you think? It still doesn't work!! This "loop" stream is not detected (i.e. the controller update doesn't sense anything). But if I open JMStudio and start listening (i.e. open an RTP session) on 203.159.2.3:25002, I get my stream showing nice and clear!
Someone, anyone, please, help! Point my nose to some document, piece of code - anything to make this work.
Kind regards.

I'd better post it here, so everyone else can get it too. Here it goes.
This is RTPSocketAdapter:
import java.io.IOException;
import java.net.InetAddress;
import java.net.DatagramSocket;
import java.net.MulticastSocket;
import java.net.DatagramPacket;
import java.net.SocketException;

import javax.media.protocol.PushSourceStream;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.SourceTransferHandler;
import javax.media.rtp.RTPConnector;
import javax.media.rtp.OutputDataStream;

/**
 * An implementation of RTPConnector based on UDP sockets.
 */
public class RTPSocketAdapter implements RTPConnector {

    DatagramSocket dataSock = null;
    DatagramSocket ctrlSock = null;

    InetAddress remoteAddress = null;
    int remotePort = 0;
    InetAddress localAddress = null;
    int localPort = 0;

    SockInputStream dataInStrm = null, ctrlInStrm = null;
    SockOutputStream dataOutStrm = null, ctrlOutStrm = null;

    public RTPSocketAdapter(InetAddress localAddress, int localPort,
                            InetAddress remoteAddress, int remotePort) throws IOException {
        this(localAddress, localPort, remoteAddress, remotePort, 1);
    }

    public RTPSocketAdapter(InetAddress localAddress, int localPort,
                            InetAddress remoteAddress, int remotePort, int ttl) throws IOException {
        try {
            if (remoteAddress.isMulticastAddress()) {
                dataSock = new MulticastSocket(localPort);
                ctrlSock = new MulticastSocket(localPort + 1);
                ((MulticastSocket) dataSock).joinGroup(remoteAddress);
                ((MulticastSocket) dataSock).setTimeToLive(ttl);
                ((MulticastSocket) ctrlSock).joinGroup(remoteAddress);
                ((MulticastSocket) ctrlSock).setTimeToLive(ttl);
            } else {
                dataSock = new DatagramSocket(localPort, localAddress);
                ctrlSock = new DatagramSocket(localPort + 1, localAddress);
            }
        } catch (SocketException e) {
            throw new IOException(e.getMessage());
        }

        this.localAddress = localAddress;
        this.localPort = localPort;
        this.remoteAddress = remoteAddress;
        this.remotePort = remotePort;
    }

    /**
     * Returns an input stream to receive the RTP data.
     */
    public PushSourceStream getDataInputStream() throws IOException {
        if (dataInStrm == null) {
            dataInStrm = new SockInputStream(dataSock, remoteAddress, remotePort);
            dataInStrm.start();
        }
        return dataInStrm;
    }

    /**
     * Returns an output stream to send the RTP data.
     */
    public OutputDataStream getDataOutputStream() throws IOException {
        if (dataOutStrm == null) {
            dataOutStrm = new SockOutputStream(dataSock, remoteAddress, remotePort);
        }
        return dataOutStrm;
    }

    /**
     * Returns an input stream to receive the RTCP data.
     */
    public PushSourceStream getControlInputStream() throws IOException {
        if (ctrlInStrm == null) {
            ctrlInStrm = new SockInputStream(ctrlSock, remoteAddress, remotePort + 1);
            ctrlInStrm.start();
        }
        return ctrlInStrm;
    }

    /**
     * Returns an output stream to send the RTCP data.
     */
    public OutputDataStream getControlOutputStream() throws IOException {
        if (ctrlOutStrm == null) {
            ctrlOutStrm = new SockOutputStream(ctrlSock, remoteAddress, remotePort + 1);
        }
        return ctrlOutStrm;
    }

    /**
     * Close all the RTP, RTCP streams.
     */
    public void close() {
        if (dataInStrm != null) {
            dataInStrm.kill();
        }
        if (ctrlInStrm != null) {
            ctrlInStrm.kill();
        }
        dataSock.close();
        ctrlSock.close();
    }

    /**
     * Set the receive buffer size of the RTP data channel.
     * This is only a hint to the implementation.  The actual implementation
     * may not be able to do anything about this.
     */
    public void setReceiveBufferSize(int size) throws IOException {
        dataSock.setReceiveBufferSize(size);
    }

    /**
     * Get the receive buffer size set on the RTP data channel.
     * Return -1 if the receive buffer size is not applicable for
     * the implementation.
     */
    public int getReceiveBufferSize() {
        try {
            return dataSock.getReceiveBufferSize();
        } catch (Exception e) {
            return -1;
        }
    }

    /**
     * Set the send buffer size of the RTP data channel.
     * This is only a hint to the implementation.  The actual implementation
     * may not be able to do anything about this.
     */
    public void setSendBufferSize(int size) throws IOException {
        dataSock.setSendBufferSize(size);
    }

    /**
     * Get the send buffer size set on the RTP data channel.
     * Return -1 if the send buffer size is not applicable for
     * the implementation.
     */
    public int getSendBufferSize() {
        try {
            return dataSock.getSendBufferSize();
        } catch (Exception e) {
            return -1;
        }
    }

    /**
     * Return the RTCP bandwidth fraction.  This value is used to
     * initialize the RTPManager.  Check RTPManager for more details.
     * Return -1 to use the default values.
     */
    public double getRTCPBandwidthFraction() {
        return -1;
    }

    /**
     * Return the RTCP sender bandwidth fraction.  This value is used to
     * initialize the RTPManager.  Check RTPManager for more details.
     * Return -1 to use the default values.
     */
    public double getRTCPSenderBandwidthFraction() {
        return -1;
    }

    /**
     * An inner class to implement an OutputDataStream based on UDP sockets.
     */
    class SockOutputStream implements OutputDataStream {

        DatagramSocket sock;
        InetAddress addr;
        int port;

        public SockOutputStream(DatagramSocket sock, InetAddress addr, int port) {
            this.sock = sock;
            this.addr = addr;
            this.port = port;
        }

        public int write(byte data[], int offset, int len) {
            try {
                sock.send(new DatagramPacket(data, offset, len, addr, port));
            } catch (Exception e) {
                return -1;
            }
            return len;
        }
    }

    /**
     * An inner class to implement a PushSourceStream based on UDP sockets.
     */
    class SockInputStream extends Thread implements PushSourceStream {

        DatagramSocket sock;
        InetAddress addr;
        int port;
        boolean done = false;
        boolean dataRead = false;

        SourceTransferHandler sth = null;

        public SockInputStream(DatagramSocket sock, InetAddress addr, int port) {
            this.sock = sock;
            this.addr = addr;
            this.port = port;
        }

        public int read(byte buffer[], int offset, int length) {
            DatagramPacket p = new DatagramPacket(buffer, offset, length, addr, port);
            try {
                sock.receive(p);
            } catch (IOException e) {
                return -1;
            }
            synchronized (this) {
                dataRead = true;
                notify();
            }
            return p.getLength();
        }

        public synchronized void start() {
            super.start();
            if (sth != null) {
                dataRead = true;
                notify();
            }
        }

        public synchronized void kill() {
            done = true;
            notify();
        }

        public int getMinimumTransferSize() {
            return 2 * 1024;     // twice the MTU size, just to be safe.
        }

        public synchronized void setTransferHandler(SourceTransferHandler sth) {
            this.sth = sth;
            dataRead = true;
            notify();
        }

        // Not applicable.
        public ContentDescriptor getContentDescriptor() {
            return null;
        }

        // Not applicable.
        public long getContentLength() {
            return LENGTH_UNKNOWN;
        }

        // Not applicable.
        public boolean endOfStream() {
            return false;
        }

        // Not applicable.
        public Object[] getControls() {
            return new Object[0];
        }

        // Not applicable.
        public Object getControl(String type) {
            return null;
        }

        /**
         * Loop and notify the transfer handler of new data.
         */
        public void run() {
            while (!done) {

                synchronized (this) {
                    while (!dataRead && !done) {
                        try {
                            wait();
                        } catch (InterruptedException e) { }
                    }
                    dataRead = false;
                }

                if (sth != null && !done) {
                    sth.transferData(this);
                }
            }
        }
    }
}

See how to create the RTPManager in the next post.
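In the meantime, wiring the adapter into a session usually looks something like the sketch below. The addresses, ports, and the dataOutput variable are placeholders, and exception handling is omitted:

```java
import java.net.InetAddress;
import javax.media.rtp.RTPManager;
import javax.media.rtp.SendStream;

RTPManager mgr = RTPManager.newInstance();
// If the outgoing format uses a dynamic payload type, register it first, e.g.:
// mgr.addFormat(someVideoFormat, somePayloadType);
RTPSocketAdapter adapter = new RTPSocketAdapter(
        InetAddress.getByName("192.168.0.10"), 25000,   // local address/port
        InetAddress.getByName("192.168.0.20"), 25000);  // remote address/port
mgr.initialize(adapter);

// dataOutput would be the Processor's output DataSource.
SendStream stream = mgr.createSendStream(dataOutput, 0);
stream.start();
```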

Similar Messages

  • Get a event for a new received rtp stream in a player

    Hi!
    I'm trying to implement an RTP player that receives an AV stream and plays it. The special thing about this player should be that even if the stream is interrupted, the player waits on the same IP and port for a new stream and opens it in the SAME frame (not in a new window like JMStudio).
    I tried to catch the ReceiveStreamEvent so I can restart the player, but I don't get any events for this. I tried to do it with an RTPManager, but I don't know how.
    Does anybody have an example of how to get the ReceiveStreamEvent, so I know the RTP stream has been interrupted?
    Thanks
    Adam

    See AVReceive2 in JMF Solutions, on the JMF web site
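    The relevant part of that sample is registering a ReceiveStreamListener on the RTPManager and reacting to stream events. Roughly (a sketch from memory, not the exact sample code; the address and port are placeholders):

    ```java
    import java.net.InetAddress;
    import javax.media.rtp.*;
    import javax.media.rtp.event.*;

    RTPManager mgr = RTPManager.newInstance();
    mgr.addReceiveStreamListener(new ReceiveStreamListener() {
        public void update(ReceiveStreamEvent ev) {
            if (ev instanceof NewReceiveStreamEvent) {
                ReceiveStream rs = ((NewReceiveStreamEvent) ev).getReceiveStream();
                // Create a Player from rs.getDataSource() and show it
                // in the existing frame instead of opening a new window.
            } else if (ev instanceof ByeEvent) {
                // The sender left: dispose of the old Player but keep the
                // session open; a new stream on the same port will fire
                // NewReceiveStreamEvent again.
            }
        }
    });
    mgr.initialize(new SessionAddress(InetAddress.getByName("192.168.0.10"), 6970));
    // Keep the RTPManager alive between streams instead of closing it.
    ```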

  • Newbie. local video player issue, please help.

    Hi,
    I have been using Dreamweaver CS3 for a few months but I can't seem to figure out my problem and was hoping someone here with much more wisdom could please help me.
    I am creating a local site, just for use on my personal computer, to better organize my computer as well as my life.
    I am trying to link some videos I have on my computer so that when I click on them they will open in my default media player (VLC); however, I only seem to be able to get them to open on another webpage.
    i.e. I click on my morning workout video and instead of it opening in the VLC player, my site goes to a new page that plays the video there (with no controls). Is there a way I can get my video player to open the local file instead of it being sent to a new page? The files are in .avi, .mp4, .mp3, .flv and .mpeg formats.
    I know I could simply find all the files individually when I want to watch them but that would defeat the purpose of creating this site to organize my computer.
    This issue is driving me crazy. I would really appreciate any and all help! Thank you!

    Yes, I hope they will provide that option in the software update asap.

  • Problem while sending RTP stream to QuickTime player

    Hello,
    I am trying to send an RTP video stream from a local file to a QuickTime player.
    The player is waiting for the stream on its default ports (6970/6971). I achieve this by using my own RTSP server.
    I start the transmission using a processor and a datasink. It seems that I send one or two RTCP packets to port 6971, but no RTP packets are sent.
    This is my code:
    pr = Manager.createProcessor(new MediaLocator("file:/rootDirectory/file.mpg"));
    pr.configure();
    while (pr.getState() == Processor.Configuring) {}
    TrackControl[] tracks = pr.getTrackControls();
    tracks[0].setFormat(new VideoFormat(VideoFormat.JPEG_RTP));
    tracks[0].setEnabled(true);
    pr.realize();
    while (pr.getState() == Processor.Realizing) {}
    DataSource ds = null;
    ds = pr.getDataOutput();
    String url1 = "rtp://193.147.59.231:6970/video/10";
    MediaLocator m1 = new MediaLocator(url1);
    DataSink d1 = Manager.createDataSink(ds, m1);
    d1.open();
    d1.start();

    JMF doesn't show any error, but my application is just not sending RTP packets. What could be wrong in my code?
    Kind regards.

    Any clue please? I am frustrated!!
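    Without running it I can't be sure, but two things stand out in the code above: the busy-wait loops on getState() are fragile, and the Processor is never start()ed, so nothing ever flows into the DataSink. A sketch of the usual pattern (the stateLock variable is mine):

    ```java
    // Wait for state transitions via a ControllerListener instead of spinning.
    final Object stateLock = new Object();
    pr.addControllerListener(new ControllerListener() {
        public void controllerUpdate(ControllerEvent ev) {
            if (ev instanceof ConfigureCompleteEvent
                    || ev instanceof RealizeCompleteEvent) {
                synchronized (stateLock) { stateLock.notifyAll(); }
            }
        }
    });
    pr.configure();
    synchronized (stateLock) { stateLock.wait(); }   // until ConfigureCompleteEvent
    // ... set the track formats, then pr.realize() and wait again ...

    d1.open();
    d1.start();
    pr.start();   // without this, the Processor never pushes media to the sink
    ```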

  • Apple TV and Sony Bravia, Slight video distortion when streaming local video

    I have a 3rd Generation Apple TV and a Sony Bravia TV
    When I am streaming movies from iTunes, it appears it is skipping frames every few seconds. I have it hooked up through a Sony ES receiver, and I have bypassed it to go directly to the TV. Below is all the troubleshooting I have done:
    - Changed the HDMI cable
    - Plugged it in directly to the TV
    - I have tried 2 Different TV's in the house (Video Streaming is fine on both TVs)
    - Tested the video on 2 macs and window media center, videos play fine
    - Moved the AirPort Extreme and internet connection next to the unit
    - Streamed from both my Macs and the WMC PC
    - Tried connecting the Mac to the AirPort Extreme via Ethernet
    - Streamed a movie trailer and it looked fine
    - Changed the resolution of the Apple TV from 1080p to 720p
    - Played movies from the PS3 and DVD player; no problems with the picture on the TV
    - Power cycled EVERYTHING!
    I have over 400 movies on my media server, and none of them acted like this on the 1st generation Apple TV I just sold.
    The weird part is that it only happens to the movies I am streaming on this specific TV. I know the TV is good and the movies are fine. And the connection between the device and the system hosting the movies is fine.
    Anyone have any idea what could be causing this?

    Make sure its compatible
    H.264 and protected H.264 (from iTunes Store): Up to 5 Mbps, Progressive Main Profile (CAVLC) with AAC-LC audio up to 160 Kbps (maximum resolution: 1280 by 720 pixels at 24 fps, 960 by 540 pixels at 30 fps) in .m4v, .mp4, and .mov file formats

  • Slow Streaming - Local Files (Video ) only

    I'm trying to pin down this slow streaming and excessive buffering problem:
    Apple TV 2nd gen - updated to latest software - operating wirelessly
    Fios Actiontec Router
    PC quadcore intel (fast)
    Everything else seems to be working fine, photos, internet streaming, etc. But when I try to stream local video files, it is extremely slow (if it starts at all)
    I have just run a test and Netflix and Youtube are blazing fast to load and then play long videos.
    I have just tested it with 2 new videos in my iTunes Home Video folder:
    The original is an mp4, 258 MB with a 5.75 bitrate.
    I also converted this via iTunes to an m4v file for testing purposes.
    Both files are extremely slow if they load at all. Virtually unwatchable.
    The same file (mp4) uploaded to YouTube and played through the YouTube app plays almost instantaneously.
    Could this be some router setting? Or is this all I can expect to get,
    Is there some other fix?
    Thanks

    It likely has to do with the encoding. Local streaming is done over the local network. Try converting with handbrake.
    Here are the supported specs for ATV
    H.264 video up to 1080p, 30 frames per second, High or Main Profile level 4.0 or lower, Baseline profile level 3.0 or lower with AAC-LC audio up to 160 Kbps per channel, 48kHz, stereo audio in .m4v, .mp4, and .mov file formats
    MPEG-4 video up to 2.5 Mbps, 640 by 480 pixels, 30 frames per second, Simple Profile with AAC-LC audio up to 160 Kbps, 48kHz, stereo audio in .m4v, .mp4, and .mov file formats
    Motion JPEG (M-JPEG) up to 35 Mbps, 1280 by 720 pixels, 30 frames per second, audio in ulaw, PCM stereo audio in .avi file format

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating a cloneable data sources instead, thinking that this may be the root cause, but when creating cloneable data sources from incoming RTP sources, I am unable to get the Processor into Configured state, it seems to deadlock. Here's the code outline :
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }

    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)

    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter. :)
    Turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conf. server does the following:
    Party(RTPManager mgr, DataSource ds) {
        this.mgr = mgr;
        this.ds = Manager.createCloneableDataSource(ds);
    }

    synchronized DataSource cloneDataSource() {
        DataSource retVal;
        if (getNeedsCloning()) {
            retVal = ((SourceCloneable) ds).createClone();
        } else {
            retVal = ds;
            setNeedsCloning();
        }
        return retVal;
    }

    private void setNeedsCloning() {
        needsCloning = true;
    }

    private boolean getNeedsCloning() {
        return needsCloning;
    }

    private synchronized void addSendStreamFromNewParticipant(Party newOne) throws UnsupportedFormatException, IOException {
        debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
        Iterator pIt = participants.iterator();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            assert p != newOne;
            // update existing participant
            SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
            sendStream.start();
            // send data from existing participant to the new one
            sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
            sendStream.start();
        }
        debug("*** - Done creating the streams.");
    }

    So I made some progress, but I'm still not quite there.
    The RTP manager JavaDoc for createSendStream states the following :
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conf. server creates 2 send streams to every one of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call failed. Consequently, each participant in the conference can receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • How to  stream audio/video to real Audio Player

    Hello
    I am planning on doing a project on streaming audio/video to Real Audio Player. I am very new to JMF. I would appreciate some pointers on how to start off; I have read the documents but am not able to get started right.
    I wanted to know the following:
    1. Isn't RTP streaming for live audio or video? If I want to stream something I have stored on my local machine, how do I do it? Do I need to use RTP?
    2. I understand that if I need to stream the audio/video file, I need to convert it to a format Real Audio Player supports. How do I go about that?
    Any help would be greatly appreciated!!!
    Thanks in advance
    Shailaja

    JMF doesn't have any support for Real media. Real has gone open source with their Helix Community, so maybe you could find some useful information there: http://www.helixcommunity.org

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams and then I create processors for the incoming streams' DataSources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not with as above ?

    camelstrike wrote:
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?

    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end...so you should look into the other side of the equation.
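    For reference, the API Guide approach quoted in the question translates to something like this (audioPlayer and videoPlayer are hypothetical names; as noted above, it may simply fail for push-based RTP sources):

    ```java
    try {
        // Preferred: let one Player manage and drive the other.
        audioPlayer.addController(videoPlayer);
    } catch (IncompatibleTimeBaseException e) {
        try {
            // Fallback: drive both Players from the same time base.
            videoPlayer.setTimeBase(audioPlayer.getTimeBase());
        } catch (IncompatibleTimeBaseException e2) {
            // A push-data-source Player may refuse any foreign time base.
        }
    }
    ```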

  • Does Media Player cache http streams locally?

    Hi,
    I am developing a secure video playback system. I am planning to store encrypted data, decrypt it, and serve it from an HTTP server. This would be read by the media player using HTTP progressive download. The security is lost if the media player stores the stream locally. Any help on finding out where the media player might store data, if it does? If I use RTSP, will that solve the problem?
    Regards,
    Arun

    On my Linux system, I played a JavaFX program designed to read media from http. Then I used the command line to print a list of all file descriptors in use. None of the file descriptors being used by the JavaFX program looked caching related, but I am honestly not sure whether or not it does in fact cache.
    Since the documentation doesn't say anything, whether or not it caches and how it does so may very well be OS specific. As such, figuring out whether there even IS a problem, let alone how to solve it would be very tricky.
    Even if it does cache, you'll still be about as safe as it gets. The vast majority of the population won't have a clue how to steal your video.
    As for hackers, they probably have better things to do with their time and there isn't much you can do to stop them (short of not placing the film online in the first place of course!). Even if a hacker can't find a video cache, they could still decompile your program to figure out where the media is being downloaded, which would make the caching issue irrelevant. And even if you made it difficult to read the URL from a decompiler, there is nothing you can do to prevent them from just filming the screen.

  • Receiving Video RTP Stream (JMF) in JME ( MMAPI ) - URGENT !!!

    Hi Folks...
    I'm trying to develop an application that sends the images from a web cam connected to the computer to a PDA, so that the images can be viewed by the user...
    My code for the JMF RTP video stream is as follows.
    Processor proc = null;
    javax.media.protocol.DataSource ds = null;
    TrackControl[] tc = null;
    boolean encodingOk = false;
    Vector<javax.media.protocol.DataSource> datasources =
            new Vector<javax.media.protocol.DataSource>();

    for (int x = 0; x < camerasInfo.length; x++) {
        try {
            proc = Manager.createProcessor(camerasInfo[x].getLocator());
        } catch (NoProcessorException e) {
            System.out.println("Error creating processor: " + e);
            continue;
        } catch (IOException e) {
            System.out.println("Error creating processor: " + e);
            continue;
        }

        proc.configure();
        // Sleeping for a fixed time is a race; blocking on a
        // ControllerListener until ConfigureComplete is the safe way.
        try {
            Thread.sleep(2000);
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }

        proc.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));

        tc = proc.getTrackControls();
        for (int y = 0; y < tc.length; y++) {
            if (!encodingOk && tc[y] instanceof FormatControl) {
                // Note: raw RGB is not an RTP payload format; an RTP-capable
                // format such as VideoFormat.JPEG_RTP is more likely to give
                // a stream a remote player can consume.
                if (((FormatControl) tc[y]).setFormat(new VideoFormat(VideoFormat.RGB)) != null) {
                    tc[y].setEnabled(true);
                    encodingOk = true; // the format was accepted
                } else {
                    tc[y].setEnabled(false);
                }
            } else {
                tc[y].setEnabled(false);
            }
        }

        proc.realize();
        try {
            Thread.sleep(2000); // same caveat: wait for RealizeComplete instead
        } catch (InterruptedException e1) {
            e1.printStackTrace();
        }

        try {
            ds = proc.getDataOutput();
        } catch (NotRealizedError e) {
            System.out.println("Error getting data output: " + e);
        }
        datasources.add(ds);
        System.out.println(ds.getLocator());
        encodingOk = false;
    }

    MediaLocator ml = new MediaLocator("rtp://10.1.1.100:99/video");
    try {
        DataSink datSink = Manager.createDataSink(ds, ml);
        datSink.open();
        datSink.start();
    } catch (NoDataSinkException e) {
        System.out.println("Error creating DataSink: " + e);
    } catch (SecurityException e) {
        System.out.println("Error opening DataSink: " + e);
    } catch (IOException e) {
        System.out.println("Error starting DataSink: " + e);
    }
    I'm not sure whether this code is correct... is it?
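    As an aside on those Thread.sleep(2000) calls: a JMF Processor moves through its states asynchronously, so a fixed sleep is a race. The usual pattern is to block until the ConfigureComplete/RealizeComplete event arrives. Below is a library-free sketch of that wait-for-state idea; the class name and state numbers are my own, and in real JMF code a ControllerListener's controllerUpdate() would call stateReached().

    ```java
    // Library-free sketch of JMF's "block until the Processor reaches a
    // state" pattern. In real code a javax.media.ControllerListener would
    // call stateReached() on ConfigureCompleteEvent / RealizeCompleteEvent;
    // the integer state values here are arbitrary stand-ins.
    public class StateWaiterDemo {

        static class StateWaiter {
            private int currentState = 0;

            // Called from the controller's event thread when a state is reached.
            synchronized void stateReached(int state) {
                currentState = state;
                notifyAll();
            }

            // Blocks until the controller reports at least targetState,
            // or the timeout expires (returns false, like a failed transition).
            synchronized boolean waitForState(int targetState, long timeoutMs)
                    throws InterruptedException {
                long deadline = System.currentTimeMillis() + timeoutMs;
                while (currentState < targetState) {
                    long remaining = deadline - System.currentTimeMillis();
                    if (remaining <= 0) return false;
                    wait(remaining);
                }
                return true;
            }
        }

        public static void main(String[] args) throws InterruptedException {
            final StateWaiter waiter = new StateWaiter();
            new Thread(() -> {
                // Simulate the Processor finishing configuration a bit later.
                try { Thread.sleep(50); } catch (InterruptedException ignored) {}
                waiter.stateReached(2);
            }).start();
            System.out.println(waiter.waitForState(2, 1000)); // true once the event arrives
        }
    }
    ```

    The same waiter object can be reused for the realize step by waiting for a higher state value.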
    So... the next part of the system runs on the PDA.
    The code that accesses this RTP stream is as follows:
              VideoControl c = null;
              try {
                   player = Manager.createPlayer("rtp://10.1.1.100:99/video");
                   c = (VideoControl) player.getControl("VideoControl");
                   tela = (Item) c.initDisplayMode( GUIControl.USE_GUI_PRIMITIVE, null);
                   player.start();
                   append(tela);
              } catch (IOException e) {
                   str.setText(e.toString());
                   append( str );
              } catch (MediaException e) {
                   str.setText(e.toString());
                   append( str );
              }
    So when the app tries to create a Player for "rtp://10.1.1.100:99/video", a MediaException is thrown:
    javax.microedition.media.MediaException: Unable to create Player for the locator: rtp://10.1.1.100:99/video
    So... I don't know what's happening =/
    Is the error in the PDA module, or in the computer's initialization of the RTP video streaming?
    I need to finish this job by next week... so any help is useful.
    Waiting for answers
    Rodrigo Kerkhoff

    First of all: before going on to the J2ME part, make sure the server works. Apparently, it doesn't...
    The MediaLocator is generally used to specify a kind of URL describing where the data put into the DataSink should go. In your case, this cannot be just some arbitrary URL that you want to act as an RTP server. You'll need to set up that server side yourself, I guess.
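    One concrete thing worth checking in the locator itself: by RTP convention (RFC 3550) the data port is even, with the next odd port reserved for RTCP, and JMF follows that convention; the example above uses the odd port 99. A small sketch of such a sanity check (the class and method names are my own):

    ```java
    import java.net.URI;

    public class RtpLocatorCheck {

        // Checks an rtp:// locator of the form rtp://host:port/contentType.
        // By RTP convention (RFC 3550) the data port should be even; the
        // next odd port is used for RTCP.
        public static boolean isValidRtpLocator(String locator) {
            URI uri = URI.create(locator);
            if (!"rtp".equals(uri.getScheme())) return false;
            int port = uri.getPort();
            if (port <= 0 || port % 2 != 0) return false;      // even data port
            String path = uri.getPath();
            return path != null && path.length() > 1;          // e.g. "/video"
        }

        public static void main(String[] args) {
            System.out.println(isValidRtpLocator("rtp://10.1.1.100:99/video"));    // false: odd port
            System.out.println(isValidRtpLocator("rtp://10.1.1.100:22224/video")); // true
        }
    }
    ```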

  • Unload multiple youtube video player

    Hey guys... I have a problem when trying to stop multiple YouTube streaming videos in Flex. I'm using 3 files in this project: videoTutorial.mxml (the main application), windowVideoTutorial.mxml (a TitleWindow component), and videoTutorialx.xml (an external XML file). This is the code:
    videoTutorial.mxml
    <?xml version="1.0" encoding="utf-8"?>
    <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
        initialize="videoService.send();" width="100%" height="100%" >
        <mx:HTTPService id="videoService" url="xml/VideoTutorialx.xml" result="videoHandler(event);" />
        <mx:Script>
            <![CDATA[
                import mx.rpc.events.ResultEvent;
                import mx.collections.ArrayCollection;
                import mx.managers.PopUpManager;
                [Bindable]
                public var repvid:ArrayCollection;
                private function videoHandler(event:ResultEvent):void {
                    repvid = event.result.VideoTutorial.myvideo;
                }
                private function launchPopUp(e:MouseEvent):void {
                    if (thumbClick.selectedItem) {
                        var win:windowVideoTutorial = new windowVideoTutorial();
                        win.sourceVideo = thumbClick.selectedItem.urlVid;
                        PopUpManager.addPopUp(win, this, true);
                        PopUpManager.centerPopUp(win);
                    }
                }
            ]]>
        </mx:Script>
        <mx:ApplicationControlBar width="100%" height="100%" horizontalAlign="center">
            <mx:VBox>
                <mx:ApplicationControlBar width="100%" verticalAlign="middle">
                    <mx:Label text="Tutorial Multimedia Player"
                        width="100%" textAlign="center" />
                </mx:ApplicationControlBar>
                <mx:TileList id="thumbClick" dataProvider="{repvid}"
                    width="700" height="525" rowHeight="200" columnWidth="325"
                    click="launchPopUp(event)" verticalAlign="middle"
                    horizontalCenter="0" verticalCenter="0" >
                    <mx:itemRenderer>
                        <mx:Component>
                        <mx:VBox horizontalAlign="left" verticalAlign="top" width="100%">
                            <mx:Label text="{data.titleVid}" fontWeight="bold" fontSize="12"/>
                            <mx:Image source="{data.imageVideo}" width="150" height="150"/>
                        </mx:VBox>
                        </mx:Component>
                    </mx:itemRenderer>
                </mx:TileList>       
              </mx:VBox>
        </mx:ApplicationControlBar>
    </mx:Application>
    windowVideoTutorial.mxml
    <?xml version="1.0" encoding="utf-8"?>
    <mx:TitleWindow xmlns:mx="http://www.adobe.com/2006/mxml"
        width="100%" height="100%"
        showCloseButton="true"
        close="closeWindow(event);"
        creationComplete="Init();">
        <mx:Script>
            <![CDATA[
                import flash.system.Security;
                import mx.events.CloseEvent;
                import mx.managers.PopUpManager;
                [Bindable]
                public var sourceVideo:String;
                private function closeWindow(e:CloseEvent):void {
                    PopUpManager.removePopUp(this);
                    // {what should i write in here . . .}
                }
                private function Init():void {
                    var url:String = sourceVideo;
                    Security.allowDomain(url);
                    youtubevid.load(url);
                }
            ]]>
        </mx:Script> 
        <mx:SWFLoader 
            id="youtubevid" verticalAlign="top" 
            horizontalAlign="center"   
            width="100%" height="100%" 
            /> 
    </mx:TitleWindow> 
    videoTutorialx.xml
    <?xml version="1.0" standalone="yes"?>
    <VideoTutorial>
      <myvideo>
        <titleVid>Tutorial After Effect</titleVid>
        <imageVideo>assets/videoThumbnail01.jpg</imageVideo>
        <urlVid>http://www.youtube.com/v/Gq5jTY76slU</urlVid>
      </myvideo>
      <myvideo>
        <titleVid>Tutorial Wayang</titleVid>
        <imageVideo>assets/videoThumbnail02.jpg</imageVideo>
        <urlVid>http://www.youtube.com/v/7PrXlEJb-0Y</urlVid>
      </myvideo>
      <myvideo>
        <titleVid>Tutorial premierre</titleVid>
        <imageVideo>assets/videoThumbnail03.jpg</imageVideo>
        <urlVid>http://www.youtube.com/v/XsdJA3AweXE</urlVid>
      </myvideo>
      <myvideo>
        <titleVid>Tutorial sound audio</titleVid>
        <imageVideo>assets/videoThumbnail04.jpg</imageVideo>
        <urlVid>http://www.youtube.com/v/vCWSUVi-bII</urlVid>
      </myvideo>
      <myvideo>
        <titleVid>Tutorial masking</titleVid>
        <imageVideo>assets/videoThumbnail05.jpg</imageVideo>
        <urlVid>http://www.youtube.com/v/wyXxIMWjdA4</urlVid>
      </myvideo>
      <myvideo>
        <titleVid>Video Show off</titleVid>
        <imageVideo>assets/VideoTutorial/videoThumbnail06.jpg</imageVideo>
        <urlVid>http://www.youtube.com/v/1nFXms0pqF8</urlVid>
      </myvideo>
    </VideoTutorial>
    I've attached the source code here... I'm waiting for your reply. Thank you very much.


  • How to connect flash video player (like youtube) to live streaming video from a program like webcamXP?

    How do I connect a Flash video player (like YouTube's) to live streaming video from a program like webcamXP?
    Or, through a browser, view in a video player the video coming from another ordinary personal computer running webcam-broadcasting software.

    You can use Google to search for tutorials on skinning the component.

  • How to stream with the player strobe media playback videos using http dynamic streaming?

    Hi,
    could I please have some help with the Strobe Media Playback player?
    I would like to know how to stream videos with Strobe Media Playback using HTTP Dynamic Streaming.
    I have already installed the server and put my own video in the video-on-demand folder. I downloaded the player, but I don't know how to stream that video using Adobe's HTTP Dynamic Streaming protocol.

    http://osmf.org/configurator/fmp/
    Use this configurator, and use the code it generates with your strobemediaplayback.swf.
    Also make sure the domain that strobemediaplayback.swf is called from and resides on is listed in your /webroot/crossdomain.xml file, or it will not work.
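    For reference, a minimal crossdomain.xml of the kind described above might look like this; the domain value is a placeholder for wherever the SWF is actually hosted:

    ```xml
    <?xml version="1.0"?>
    <cross-domain-policy>
        <!-- Replace with the domain that serves strobemediaplayback.swf -->
        <allow-access-from domain="player.example.com" />
    </cross-domain-policy>
    ```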

  • Local video files on native player - any examples or official response?

    Sorry to bring this up again as there are several posts in this forum asking the same questions but I figured now that Packager for iPhone is once again being actively supported it would be a good time to try and get some clarification.
    Playing local video files in the iPhone/iPad's native video player is a well-documented feature of the packager, but one that I have yet to get working or even see an example of. There are quite a few posts here and elsewhere from people asking how to get this feature working, especially with reference to the navigateToURL() function as described in the developer guide.
    Has anyone ever managed to get this feature working, and are there any examples of code using navigateToURL() that have been successful for anyone? If not, could one of the Adobe reps on this board give a response just to clarify whether H.264 video playback is working as intended and documented?
    Many thanks for any incoming responses, I have been very happy to hear the news from Apple/Adobe over the last few days and getting this feature working would top off a few days of good news!

    I've tried that with no luck.  Here's a screenshot of the app.  You can see the many ways I've tried to play video files and it just doesn't work the way you suggested, or the way the documentation suggests.  What am I doing wrong?  Or does it just not work since no one can seem to get it to work?
    And yes, I did include the video:
