Adding an Effect to an incoming H.263/RTP stream

Hi,
I am playing an H.263/RTP stream which is multicast over a network. I want to play the stream with a rotation or some other effect applied to it. The forums suggested the following link for help: http://java.sun.com/products/java-media/jmf/2.1.1/solutions/RotationEffect.html. But the link is dead, and the new link they provide does not contain the example.
Can anyone help me add an effect to H.263/RTP?
I have already created my effect class and tried adding it, but could not get the effect to apply.
Kindly give me some directions.

Is there some error in the above code, or is there some other issue?

There's an error in the above code because you're assuming I'm being imprecise with my language. I am not.
You must copy the data from the input buffer to the output buffer...
Buffer objects are wrappers around byte[], and you cannot copy data from one array to another by modifying the object references...which is what your code does.
public int process(Buffer inBuffer, Buffer outBuffer) {
    // Make sure the output buffer will hold data
    if (!(outBuffer.getData() instanceof byte[])) {
        outBuffer.setData(new byte[inBuffer.getLength()]);
        outBuffer.setLength(inBuffer.getLength());
        outBuffer.setOffset(0);
    }
    // Make sure the output buffer is long enough
    if (outBuffer.getLength() + outBuffer.getOffset() < inBuffer.getLength()) {
        int offset = outBuffer.getOffset();
        int length = inBuffer.getLength() + outBuffer.getOffset();
        byte[] oldData = (byte[]) outBuffer.getData();
        byte[] newData = new byte[length];
        // Copy the previous data
        for (int i = 0; i < offset; i++) {
            newData[i] = oldData[i];
        }
        // Set the variables for the output buffer
        outBuffer.setData(newData);
        outBuffer.setLength(length);
        outBuffer.setOffset(offset);
    }
    // Copy the data from in to out
    int j = outBuffer.getOffset();
    for (int i = inBuffer.getOffset(); i < inBuffer.getOffset() + inBuffer.getLength(); i++) {
        ((byte[]) outBuffer.getData())[j++] = ((byte[]) inBuffer.getData())[i];
    }
    return BUFFER_PROCESSED_OK;
}
That's untested code above, fix my dumb mistakes as necessary.
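
For context, here is a minimal sketch of wiring a custom Effect into a Processor's video track via TrackControl.setCodecChain(). The class and method names are illustrative only, and it assumes the Processor has already reached the Configured state; the Effect instance passed in would be your own effect class:

import javax.media.Codec;
import javax.media.Effect;
import javax.media.Processor;
import javax.media.UnsupportedPlugInException;
import javax.media.control.TrackControl;
import javax.media.format.VideoFormat;

// Sketch: insert a custom Effect plug-in into the video track of a
// Processor. The Processor must already be in the Configured state.
public class AddEffectSketch {
    public static void addEffect(Processor p, Effect effect) {
        for (TrackControl track : p.getTrackControls()) {
            if (track.getFormat() instanceof VideoFormat) {
                try {
                    track.setCodecChain(new Codec[] { effect });
                } catch (UnsupportedPlugInException e) {
                    e.printStackTrace();
                }
            }
        }
        p.realize();
    }
}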

Similar Messages

  • H.263/RTP payload

    As we know, JMF can stream H.263/RTP with payload 34. I am streaming H.263/RTP to a device which does not understand payload 34, only payload 96, which is used by the updated H.263-1998 and H.263-2000 versions.
    Does anyone have any idea how I can change the payload from 34 to be compatible with 96?
    Sorry if this is not the right place/forum to ask this question. But I'd be grateful if anyone has any idea where I can ask about transcoding H.263/RTP packets from the form compatible with RFC 2190 (which JMF supports and which is almost outdated) to packets compatible with RFC 2429.

    Sorry captfoss, I was on leave for some time so could not respond to your reply on time.
    While working today I found that JMF has a video format, VideoFormat.H263_1998_RTP, which is compatible with RFC 2429. But there is a problem: I tried several times to transcode an H.263 file read from the local computer to the H263_1998_RTP format, but it fails with the following error:
    Failed to realize: com.sun.media.ProcessEngine@19209ea
      Cannot build a flow graph with the customized options:
        Unable to transcode format: H263, 176x144, FrameRate=15.0, Length=76032 0 extra bytes
          to: H263-1998/RTP
          outputting to: RAW/RTP
    Error: Unable to realize com.sun.media.ProcessEngine@19209ea
    Then I checked the JMF registry and found a codec, com.sun.media.codec.video.vh263.NativeDecoder, which decodes H263, H263/RTP, as well as H263-1998/RTP into YUV format. But I could not find any encoder which converts any format (YUV, RGB, H263, or any other) to H263-1998/RTP.
    Is there any way I can transcode to H263-1998/RTP?
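
    For what it's worth, if the device only objects to the payload number itself rather than the RFC 2429 packetization, a dynamic payload mapping can be registered through RTPManager.addFormat(). A minimal sketch (class and method names illustrative); note this only changes the payload number announced on the wire, it does not re-packetize RFC 2190 payloads into RFC 2429 ones:

    import javax.media.format.VideoFormat;
    import javax.media.rtp.RTPManager;

    // Sketch: register H.263-1998/RTP under dynamic payload type 96 before
    // creating send streams on this manager.
    public class PayloadMapping {
        public static RTPManager newManagerWithPayload96() {
            RTPManager mgr = RTPManager.newInstance();
            mgr.addFormat(new VideoFormat(VideoFormat.H263_1998_RTP), 96);
            // ... initialize targets and create send streams as usual ...
            return mgr;
        }
    }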

  • Can I play and save the incoming RTP streams simultaneously?

    Hi,
    I want to know whether I can play and save an incoming RTP stream simultaneously using JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in advance,
    bye
    Srikanth

    I think that, today, this is lamentably impossible. However much I have tried, I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks

  • Adobe Premiere Pro for Mac: why is there no playback / severe lag after adding effects? What can I do?

    I recently added effects to a number of clips in Adobe Premiere Pro; now playback has virtually stopped and is lagging. It will play for a millisecond, then pause.
    Computer information:
    Hardware Overview:
      Model Name:          MacBook Pro
      Model Identifier:          MacBookPro8,1
      Processor Name:          Intel Core i5
      Processor Speed:          2.4 GHz
      Number of Processors:          1
      Total Number of Cores:          2
      L2 Cache (per Core):          256 KB
      L3 Cache:          3 MB
      Memory:          4 GB
    Intel HD Graphics 3000:
      Chipset Model:          Intel HD Graphics 3000
      Type:          GPU
      Bus:          Built-In
      VRAM (Total):          384 MB
      Vendor:          Intel (0x8086)
      Device ID:          0x0126
      Revision ID:          0x0009

    With an i5 and only 4 GB of RAM, your computer is underpowered for editing HD video.
    What are you editing, and what effects are you using?
    Report back with the codec details of your file, using the programs below... A screen shot works well to SHOW people what you are doing.
    http://forums.adobe.com/thread/592070?tstart=30
    For Mac http://mediainspector.massanti.com/

  • CT differences with Effect to Net income

    Hi,
    I am running CT for the current month, i.e. July 2010.
    For one of the company codes (9019), it shows two steps under differences, as below:
    CT differences without effect on net income: XXX amount
    CT differences with EFFECT on net income: 1200 USD
    Our group currency is USD.
    All months have postings like the above, but the "with effect on net income" amount is usually negligible, i.e. always less than 1 USD, so there is no issue.
    Now when I look into the balances using totals records, the PNL items (all accounts) are equal to zero in LC.
    But in GC they show a balance equal to USD 1200, exactly the same amount as "CT differences with EFFECT on net income".
    Could someone explain what "CT differences with effect on net income" is, and why the PNL items in GC are not equal to zero?
    Thanks in Advance,
    Richard.

    Hi Dan,
    The issue is resolved after following the steps provided by you.
    I had also made one more mistake of assigning the account type balance sheet instead of PNL in the FS item, so that was misleading the results.
    In fact, the issue was because of the wrong account type. The solution you provided works 100%, but in the next period I faced the same issue again.
    After correcting the account type, the issue is resolved. I also executed the previous month's process again for better reporting.
    Thanks a lot for the help you provided. In the entire project you have helped a lot.
    Many thanks for your support.
    Regards,
    Richard

  • Adding an effect (e.g. shadow) to an image in FXZ files

    Hi
    Can somebody show me how I can add an effect like a shadow to images in FXZ files?
    Below is an example of an FXZ file converted by "JavaFX Production Suite" from an SVG file produced by Inkscape.
    I know an alternative is to edit the FXZ file.
    Thanks in advance.
    Fire-fly
    /*
     * Generated by JavaFX svg2fx tool.
     * Created on Thu Jun 03 23:35:13 SGT 2010.
     */
    //@version 1.3
    FXD {
        content: [
            Group {
                content: [
                    Rectangle {
                        fill: Color.rgb(0xcc, 0xcc, 0xcc, 1.0)
                        height: 37.142857
                        id: "shaft"
                        width: 462.85715
                        x: 100.0
                        y: 369.50504
                        // effect: Glow {
                        //   level: 0.5
                        // }
                    },
                    Group {
                        content: [
                            SVGPath {
                                content: "m 251.42857,386.64789 a 50,54.285713 0 1 1 -100,0 50,54.285713 0 1 1 100,0 z"
                                fill: Color.rgb(0xff, 0xff, 0x00, 1.0)
                                id: "thumb"
                            }
                        ]
                    }
                ]
            }
        ]
    }

    Hi
    I am very new to JavaFX.
    I tried the following, but it did not work.
    /*
     * To change this template, choose Tools | Templates
     * and open the template in the editor.
     */
    package insertobject;
    import javafx.stage.Stage;
    import javafx.scene.Scene;
    import javafx.scene.text.Text;
    import javafx.scene.text.Font;
    import javafx.fxd.FXDLoader;
    import javafx.fxd.FXDNode;
    import javafx.scene.paint.Color;
    import javafx.scene.effect.Shadow;
    import javafx.scene.effect.Effect;
    var URL = "{__DIR__}slider.fxz";
    // Initialize the Shadow with an object literal; declaring it without
    // initializing leaves it null, so its properties cannot be set.
    var shadow = Shadow {
        color: Color.BLACK
        radius: 15
        height: 7
    };
    var fxdNode: FXDNode = FXDNode {
        url: URL
        backgroundLoading: true
        placeholder: Text { x: 10 y: 10 content: "Loading graphics ..." }
        loader: FXDLoader {
            onDone: function() {
                var layer1 = fxdNode.getNode("shaft");
                var layer2 = fxdNode.getNode("thumb");
                var rect = fxdNode.getShape("shaft");
                rect.effect = shadow;
                layer1.translateX = 70;
                layer1.translateY = 100;
                layer2.translateX = 70;
                layer2.translateY = 100;
            }
        }
    };
    Stage {
        title: "Application title"
        scene: Scene {
            width: 550
            height: 700
            //fill: Color.web("0x00001F")
            content: [
                fxdNode
            ]
        }
    }
    Edited by: fire-fly on Jun 14, 2010 11:10 AM

  • Simultaneous recording to a file and playing an incoming RTP stream

    I want to simultaneously record to a file and play an incoming RTP stream. I have cloned the data source. After cloning, I created two data sinks and two processors. One processor/data sink is meant for recording the stream to a file, and the other processor/data sink is used to play the RTP through the speaker.
    But I am only able to achieve one operation at a time, i.e. either recording or playing. For recording into the WAV file I have to put Thread.sleep(30) so that I can record the RTP into the file. During this time I am not able to play the audio through the speaker. I also created a separate thread, but this does not help.
    Please give suggestions on how to resolve this issue.
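
    For reference, a minimal sketch of the cloning approach described above, assuming a JMF DataSource already obtained from the incoming RTP stream; the Processor/DataSink configuration for the recording branch is elided, and the class name is illustrative:

    import javax.media.Manager;
    import javax.media.Player;
    import javax.media.Processor;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.SourceCloneable;

    // Sketch: play the incoming stream while a clone feeds a recording branch.
    public class PlayAndSaveSketch {
        public static void playAndSave(DataSource incoming) throws Exception {
            DataSource cloneable = Manager.createCloneableDataSource(incoming);
            DataSource clone = ((SourceCloneable) cloneable).createClone();
            // Branch 1: play the cloneable (parent) source.
            Player player = Manager.createRealizedPlayer(cloneable);
            player.start();
            // Branch 2: hand the clone to a Processor whose output goes to a
            // file-writing DataSink (configuration omitted here).
            Processor recorder = Manager.createProcessor(clone);
            recorder.configure();
            // ... set content descriptor, realize, create and start DataSink ...
        }
    }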

    thanks for the reply, but I think I've already solved the problem.

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this might be the root cause, but when creating cloneable data sources from incoming RTP sources I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }

    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)

    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conf. server does the following:
    Party(RTPManager mgr, DataSource ds) {
        this.mgr = mgr;
        this.ds = Manager.createCloneableDataSource(ds);
    }

    synchronized DataSource cloneDataSource() {
        DataSource retVal;
        if (getNeedsCloning()) {
            retVal = ((SourceCloneable) ds).createClone();
        } else {
            retVal = ds;
            setNeedsCloning();
        }
        return retVal;
    }

    private void setNeedsCloning() {
        needsCloning = true;
    }

    private boolean getNeedsCloning() {
        return needsCloning;
    }

    private synchronized void addSendStreamFromNewParticipant(Party newOne)
            throws UnsupportedFormatException, IOException {
        debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
        Iterator pIt = participants.iterator();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            assert p != newOne;
            // update existing participant
            SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
            sendStream.start();
            // send data from existing participant to the new one
            sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
            sendStream.start();
        }
        debug("*** - Done creating the streams.");
    }

    So I made some progress, but I'm still not quite there.
    The RTPManager JavaDoc for createSendStream states the following:
    "This method is used to create a sending stream within the RTP session. For each time the call is made, a new sending stream will be created. This stream will use the SDES items as entered in the initialize() call for all its RTCP messages. Each stream is sent out with a new SSRC (Synchronisation SouRCe identifier), but from the same participant i.e. local participant."
    For 3 participants, my conf. server creates 2 send streams to every one of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call had failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • Mixing audio RTP-Streams to conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (Audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G711_ulaw codec.
    Is there anyone out there who can help me?
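
    For illustration, a minimal sketch of the MergingDataSource step described above (class name illustrative); note that, as the post says, this yields one DataSource carrying one track per input rather than a single mixed track:

    import javax.media.Manager;
    import javax.media.protocol.DataSource;

    // Sketch: merge several incoming audio DataSources into one multi-track
    // DataSource. JMF does not mix the tracks into a single audio track here.
    public class MergeSketch {
        public static DataSource merge(DataSource[] inputs) throws Exception {
            return Manager.createMergingDataSource(inputs);
        }
    }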

    I'm sorry, but I met this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about the Many2One class and more and more, but there was nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write merged streams to a file, so it seems possible to write them to a file and then transmit to the net from that file. But how to realize more than one Processor?..

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive RTP streams, and then I create processors for the incoming streams' datasources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video if not as above?
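
    For reference, the Player-synchronization pattern the JMF guide describes looks roughly like this for media whose time base can be changed; as the reply below explains, live RTP streams cannot accept a new time base, which is why this throws for push data sources (class and method names illustrative):

    import javax.media.IncompatibleTimeBaseException;
    import javax.media.Player;

    // Sketch: drive playerB from playerA's time base, then let playerA
    // manage playerB. Both players must already be realized.
    public class SyncSketch {
        public static void sync(Player playerA, Player playerB) {
            try {
                playerB.setTimeBase(playerA.getTimeBase());
                playerA.addController(playerB);
                playerA.start(); // starts both on playerA's time base
            } catch (IncompatibleTimeBaseException e) {
                // push data sources (e.g. RTP) cannot assume a new time base
                e.printStackTrace();
            }
        }
    }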

    camelstrike wrote:
    Does this mean I can't synchronize these. If so is there any way I can change them into Pull...DataSources ?

    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end...so you should look into the other side of the equation.

  • How can I manage controls in an RTP Streaming

    Hello,
    I'm currently using RTP streaming in order to play some music. I started with AVReceive2 and AVTransmit2 from Sun, and by adding a plugin I managed to play MP3. So far so good.
    Because I didn't want to use the view components given by Sun, I created my own GUI for my player, and I use it instead of the one in Sun's examples.
    I use listeners for my "buttons" (I prefer using MouseListener and Panel, like in Sun's Jamp example), but I can't find a way to make those buttons act.
    If I press pause, it has to pause the streaming in AVTransmit2.
    If I press next or previous, it has to change the process in AVTransmit2.
    Each control has to be done in AVTransmit2, but my player is part of AVReceive2.
    My first guess was to check if I could use an event to tell my AVTransmit2 object to execute my controls, but I haven't found a way to tell my AVTransmit2 object which button was pressed.
    I finally stopped trying to transmit my orders through events.
    I then tried to find a way to get my AVTransmit2 object into my AVReceive2 object so that I could call methods on it, but I failed :(
    How can I manage my controls so that they call methods on my server and not my client?
    Thanks :)
    Shad.
    Edited by: Shadwolf on Feb 9, 2010 2:15 AM

    Hi again,
    For the next & previous buttons, I was thinking of something.
    Do you think it would be good to create a new class RTPClientManager, which extends RTPManager and holds 2 booleans, previous & next, that are set to true when I press the buttons?
    From there, couldn't I modify my function in AVTransmit2 like this (AVTransmit2 implements ReceiveStreamListener):
    @Override
    public void update(ReceiveStreamEvent evt) {
        RTPClientManager mgr = (RTPClientManager) evt.getSource();
        Participant participant = evt.getParticipant();    // could be null.
        ReceiveStream stream = evt.getReceiveStream();     // could be null.
        /*
         * Detect the client closing its connection, so the streaming
         * transmission can be shut down.
         */
        if (evt instanceof ByeEvent) {
            System.err.println("  - Got \"bye\" from: " + participant.getCNAME());
            if (mgr.isNext()) {
                this.stop();
                /* SOME STUFF FOR NEXT */
            } else if (mgr.isPrevious()) {
                this.stop();
                /* SOME STUFF FOR PREVIOUS */
            } else { // means it's the stop button that was pressed
                this.stop();
            }
        }
    }

    Would it work?
    If you have better ideas, I am ready to hear them :P
    But if this works, I will have handled stop, next and previous; how could I then manage the play & pause buttons?
    I can't find a proper solution to manage my controls :(

  • Merging RTP streams and writing to file

    I'm in the middle of a project using JMF. I'm trying to merge incoming RTP streams and write them to a file, but I always have problems with formats. I created a processor with the merged DataSource, set its ContentDescriptor to different FileTypeDescriptors, and tried to transcode the RTP streams to different formats, but the processor could never realize because of this error:
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Unable to handle format: LINEAR, 8000.0 Hz, 16-bit, Mono, LittleEndian, Signed
    Could someone tell me what content descriptors I need to use and what formats would solve this problem? Any help will be appreciated.
    Message was edited by:
    kaligula

    Hello,
    Here is a procedure I wrote to write the contents of any table to text files (and also generate the INSERT SQL orders):
    http://oracle.developpez.com/sources/?page=developpement#Extraction_table
    You can download the script file here: http://sheikyerbouti.developpez.com/src/extraction_table.zip
    Hope this helps you.
    Francois

  • Write(Export) the RTP Stream from a SIP Call to an audio file

    I am working on a telephony system that needs to record each call to a file.
    I get the RTP stream in the ReceiveStreamEvent handler and start to record.
    The problem is that I get a file and can play it, but there is no sound when I set FileTypeDescriptor.WAVE and AudioFormat.ULAW.
    If I try FileTypeDescriptor.WAVE and AudioFormat.IMA4_MS, I get a fixed-size file of 60 KB.
    With FileTypeDescriptor.MPEG_AUDIO and AudioFormat.MPEG, the processor cannot realize and
    the thread is blocked!!
    Could anyone save me? Thanks in advance!
    =================================Code===================================================
    public void update(ReceiveStreamEvent evt) {
        logger.debug("received a new incoming stream. " + evt);
        RTPManager mgr = (RTPManager) evt.getSource();
        Participant participant = evt.getParticipant(); // could be null.
        ReceiveStream stream = evt.getReceiveStream(); // could be null.
        if (evt instanceof NewReceiveStreamEvent) {
            try {
                stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                DataSource ds = stream.getDataSource();
                recorder = new CallRecorderImpl(ds);
                recorder.start();
            } catch (Exception e) {
                logger.error("NewReceiveStreamEvent exception ", e);
            }
            return;
        }
    }
    ========================================================================================
    This is the CallRecorderImpl class:
    public class CallRecorderImpl implements DataSinkListener, ControllerListener {
        private Processor processor = null;
        private DataSink dataSink = null;
        private DataSource source = null;
        private boolean bRecording = false;
        FileTypeDescriptor contentType = new FileTypeDescriptor(FileTypeDescriptor.WAVE);
        Format encoding = new AudioFormat(AudioFormat.IMA4_MS);
        MediaLocator dest = new MediaLocator("file:/c:/bar.wav");

        public CallRecorderImpl(DataSource ds) {
            this.source = ds;
        }

        public void start() {
            try {
                processor = Manager.createProcessor(source);
                processor.addControllerListener(this);
                processor.configure();
            } catch (Exception e) {
                System.out.println("exception:" + e);
            }
        }

        public void stop() {
            try {
                System.out.println("stopping");
                this.dataSink.stop();
                this.dataSink.close();
            } catch (Exception ep) {
                ep.printStackTrace();
            }
        }

        public void controllerUpdate(ControllerEvent evt) {
            Processor pr = (Processor) evt.getSourceController();
            if (evt instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                processConfigured(pr);
            }
            if (evt instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                processRealized(pr);
            }
            if (evt instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (evt instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                pr.stop();
            }
            if (evt instanceof StopEvent) {
                System.out.println("StopEvent");
                pr.close();
                try {
                    dataSink.stop();
                    dataSink.close();
                } catch (Exception ee) {
                    ee.printStackTrace();
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dataSink.stop();
                    dataSink.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }

        public void processConfigured(Processor p) {
            // Set the output content type
            p.setContentDescriptor(this.contentType);
            // Get the track control objects
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            // Go through the tracks and try to program one of them
            // to output ima4 data.
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(this.encoding);
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            if (!encodingPossible) {
                System.out.println("cannot encode to " + this.encoding);
                return;
            }
            p.realize();
        }

        public void processRealized(Processor p) {
            System.out.println("Entered processRealized");
            DataSource source = p.getDataOutput();
            try {
                dataSink = Manager.createDataSink(source, dest);
                dataSink.open();
                dataSink.start();
                dataSink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            dataSink.close();
                        }
                    }
                });
            } catch (Exception e) {
                e.printStackTrace();
                return;
            }
            p.start();
        }
    }
    ============================================

    That shows that the output stream cannot be of that particular format and descriptor.
    Look at this code:
    import javax.swing.*;
    import javax.media.*;
    import java.net.*;
    import java.io.*;
    import javax.media.datasink.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import javax.media.control.*;

    class Abc implements ControllerListener {
        DataSource ds;
        DataSink sink;
        Processor pr;
        MediaLocator mc;

        public void maam() throws Exception {
            mc = new MediaLocator("file:C:/Workspace/aaaaa.mpg");
            pr = Manager.createProcessor(new URL("file:G:/java files1/jmf/aa.mp3"));
            pr.addControllerListener(this);
            pr.configure();
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("ConfigureCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processConfigured(p);
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("RealizeCompleteEvent");
                Processor p = (Processor) e.getSourceController();
                processRealized(p);
            }
            if (e instanceof ControllerClosedEvent) {
                System.out.println("ControllerClosedEvent");
            }
            if (e instanceof EndOfMediaEvent) {
                System.out.println("EndOfMediaEvent");
                Processor p = (Processor) e.getSourceController();
                p.stop();
            }
            if (e instanceof StopEvent) {
                System.out.println("StopEvent");
                Processor p = (Processor) e.getSourceController();
                p.close();
                try {
                    sink.stop();
                    sink.close();
                } catch (Exception ee) {
                }
            }
        }

        public void processConfigured(Processor p) {
            System.out.println("Entered processConfigured");
            p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.MPEG_AUDIO));
            /*
            TrackControl track[] = p.getTrackControls();
            boolean encodingPossible = false;
            for (int i = 0; i < track.length; i++) {
                try {
                    track[i].setFormat(new VideoFormat(VideoFormat.MPEG));
                    encodingPossible = true;
                } catch (Exception e) {
                    track[i].setEnabled(false);
                }
            }
            */
            p.realize();
        }

        public void processRealized(Processor p) {
            pr = p;
            System.out.println("Entered processRealized");
            try {
                MediaLocator dest = new MediaLocator("file:C:/Workspace/ring1.mpg");
                sink = Manager.createDataSink(p.getDataOutput(), dest);
                sink.addDataSinkListener(new DataSinkListener() {
                    public void dataSinkUpdate(DataSinkEvent e) {
                        if (e instanceof EndOfStreamEvent) {
                            System.out.println("EndOfStreamEvent");
                            sink.close();
                        }
                    }
                });
                sink.open();
                sink.start();
            } catch (Exception eX) {
            }
            System.out.println("Just before start");
            p.start();
        }

        /*
        public void dataSinkUpdate(DataSinkEvent event) {
            if (event instanceof EndOfStreamEvent) {
                try {
                    System.out.println("EndOfStreamEvent");
                    dsk.stop();
                    dsk.close();
                    System.exit(1);
                } catch (Exception e) {
                }
            }
        }
        */
    }

    public class JMFCapture6 {
        public static void main(String args[]) throws Exception {
            Abc a = new Abc();
            a.maam();
        }
    }

  • Using RTP stream as data source

    I have successfully written a program that reads G.711 audio from a wav file and sends it out over the network.
    I'm trying to modify that program so that, instead of getting its source audio from a file, it receives the audio from an RTP stream. So, in effect, it is simply receiving audio from one socket and sending it back out another.
    The error I'm encountering:
    [java] javax.media.NoProcessorException: Cannot find a Processor for: rtp://172.30.18.140:32916
    [java] at javax.media.Manager.createProcessorForContent(Manager.java:1663)
    [java] at javax.media.Manager.createProcessor(Manager.java:627)
    Here is the code fragment where the exception occurs:
    processor = javax.media.Manager.createProcessor(new MediaLocator("rtp://172.30.18.140:32916"));
    Can anybody help? Thanks!
    Message was edited by:
    feh

    Hi,
    I had a very similar problem, but in addition I had to store data in a buffer before retransmission. I resolved it employing low-level classes. At this time, I still do not know any other solution, and nobody helped me in any way, even in this forum...
    So, first you have to access single frames of the stream (Buffer or ExtBuffer objects) using a RawBufferParser; then you have to reconstruct a DataSource by multiplexing the frames (use an appropriate Multiplexer class, such as RTPSyncBufferMux).
    The big problem is that there is no documentation available about how to use these classes: I have learned all I know by "reverse engineering" (lots of hours spent reading very long, annoying code). A little help can be given by the PlugIn Viewer of JMStudio, which tells you which classes are to be employed.
    I have not tried to retransmit data "straightforward", without accessing single frames, because that was not my task. Maybe if you use RTPManager, it is possible to get a DataSource from the ReceiveStream object, and then to pass it to another RTPManager to create a SendStream.
    I do not know any simpler way to do that, but this does not mean that it does not exist...!
    good luck.
    Alberto M. (Italy)
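
    Following the RTPManager suggestion above, here is a minimal sketch of such a relay; it assumes both RTPManager instances are already initialized with their session targets, and the class and method names are illustrative only:

    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.event.NewReceiveStreamEvent;
    import javax.media.rtp.event.ReceiveStreamEvent;

    // Sketch: when a new stream arrives on the receiving session, hand its
    // DataSource to a second RTPManager as a SendStream.
    public class RtpRelay implements ReceiveStreamListener {
        private final RTPManager sender;

        public RtpRelay(RTPManager receiver, RTPManager sender) {
            this.sender = sender;
            receiver.addReceiveStreamListener(this);
        }

        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    DataSource ds = ((NewReceiveStreamEvent) evt).getReceiveStream().getDataSource();
                    SendStream out = sender.createSendStream(ds, 0);
                    out.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }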

  • RTP streaming

    Hi guys,
    What I need to do is stream YUV images between two JMF applications. I am using the VideoTransmit application (taken from this site) to transmit, and JMStudio to receive. From a little research I have noticed the following steps to transmit:
    1) Create the DataSource using Manager.createDataSource(MediaLocator)
    2) Create a processor using the above-created DataSource
    3) Create a player, etc.
    My question now is the following:
    How can I transmit a YUV file? How can I create a DataSource holding this YUV file without causing problems for the further creation of the processor, etc.?
    Thanks in advance guys,
    Chris

    Ok, I think I know what was confusing me. Apparently setRate() isn't affecting the rate of the stream. The audio file I am streaming usually runs about five minutes, so I sped the rate up to 10. That sped up the process of writing to a file significantly, but writing to an RTP stream seems to take the full run-time of the file. Does this sound right, or is there something else I'm missing?
    Thanks,
    Khanathor
