AVTransmit2 and AVReceive2

I'm able to compile and run both these examples without error, but nothing is ever received.
Here's the output from the two programs:
C:\>java AVReceive2 192.168.0.149/1338
- Open RTP session for: addr: 192.168.0.149 port: 1338 ttl: 1
- Waiting for RTP data to arrive...
No RTP data was received.
Failed to initialize the sessions.
C:\>java AVTransmit2 vfw://0 192.168.0.128 1338
Track 0 is set to transmit as:
  JPEG/RTP, 320x240
- Setting quality to 0.5 on com.sun.media.codec.video.jpeg.NativeEncoder$1$QCA@1a679b7
Created RTP session: 192.168.0.128 1338
Start transmission for 60 seconds...
...transmission ended.
Any ideas as to why this would run without error, yet not do anything?
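One detail worth checking in the transcript above: the receiver opens its session on 192.168.0.149, while the transmitter sends to 192.168.0.128. Assuming the receiving machine is the one at 192.168.0.149, the two invocations would need to agree on that address, e.g.:

```shell
# Receiver listens on its own address/port
java AVReceive2 192.168.0.149/1338

# Transmitter must target the RECEIVER's address, not some other host
java AVTransmit2 vfw://0 192.168.0.149 1338
```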

How about using Observable/Observer on both ends...
Hope it helps.
With regards

Similar Messages

  • Problem in using AVTransmit2 and AVReceive2

    Hello,
    I am very new to JMF. I started to work on it and took the sample examples from http://java.sun.com/products/java-media/jmf/2.1.1/solutions/AVTransmit.html and http://java.sun.com/products/java-media/jmf/2.1.1/solutions/AVReceive.html.
    To AVTransmit2 I pass the parameters: AVTransmit2 file://F://Songs//MalgudiDays.mp3 192.168.19.51 42050
    On the receiving side: AVReceive2 192.168.19.51/42050 192.168.19.29/42506.
    Here 192.168.19.29 is the sender and 192.168.19.51 is the receiver.
    I checked that all the ports used are free.
    Here is my output:
    AVTransmit2.java Side
    Track 0 is set to transmit as:
    mpegaudio/rtp, 44100.0 Hz, 16-bit, Stereo, LittleEndian, Signed, 16000.0 frame rate, FrameSize=32768 bits
    Created RTP session: 192.168.19.51 42050
    Start transmission for 60 seconds...
    ...transmission ended.
    Process completed.
    AVReceive2.java Side
    - Open RTP session for: addr: 192.168.19.51 port: 42050 ttl: 1
    - Open RTP session for: addr: 192.168.19.29 port: 42506 ttl: 1
    - Waiting for RTP data to arrive...
    - Received new RTP stream: mpegaudio/rtp, Unknown Sample Rate
    The sender of this stream had yet to be identified.
    AVReceive2 internal error: javax.media.ControllerErrorEvent[source=com.sun.media.content.unknown.Handler@fa3ac1,message=Internal module com.sun.media.BasicRendererModule@1c39a2d: failed to handle a data format change!]
    Exiting AVReceive2
    Process completed.
    Can anyone please guide me to identify my mistake?
    Thank you in advance.

    Navid,
    JHeadstart checks whether the attribute definition is queryable before adding it to the list of advanced search attributes. A transient VO attribute can still be set to queryable, for a transient EO attribute this is not possible. We will remove this check on queryability in the upcoming patch release, since you have a valid use case. For now, you can subclass JhsSearchBean and override method createArgumentListForAdvancedSearch and comment out the check for "def.isQueryable()".
    Steven Davelaar,
    JHeadstart Team.

  • Difference between AVTransmit2 and AVTransmit3

    I was looking over the two files and wanted to know what the difference is between them, and whether AVTransmit2 uses better code as far as the transmit process is concerned. I have used AVTransmit2 and AVReceive2 to send videos from one computer to another and have had mixed results: sometimes it works OK, but not great. Does anyone know if one works better, or if there is another Java option out there that could be implemented in a program to stream video?
    Thanks

    Perhaps.

  • JMF examples AVTransmit2.java and AVReceive2.java are not working for me?

    Dear all
    I want to develop a class which will capture my voice data from my microphone and then send it to the sound system of my PC following the RTP protocol. For that I tried both the AVTransmit2.java and AVReceive2.java classes from http://java.sun.com/products/java-media/jmf/2.1.1/solutions/ but they are just not doing what I want. I am sure I am making some mistake; please let me know.
    Moreover, when I try AudioBufferControl.java from the above link, it does what I want and lets me hear what I am saying into my microphone, but it is using "javasound", which I think is not using RTP. Likewise, if I use JMStudio I can hear my voice from the microphone through the sound system.
    Please can you help me write such a class and let me know what I am doing wrong!
    With care
    Arif.

    Try to follow these steps:
    1. make sure db tables have the pk and fk's pre-existing.
    2. create a brand new workspace
    a) Select New Business Components: business components from tables.
    b) Select these two tables
    c) on the define view pages (step 2), push both objects over, then just click [finish]
    3. Open the ViewController Node
    a) expand Web Content, then WEB-INF
    b) double click struts-config.xml to open it
    4. In Struts-config.xml
    a) drop a datapage, name it BrowseRoster, double click it, then drag and drop the Roster table from your DataControl Palette, as a READ only table.
    b) insert a column as the first cell in the table and drop the operation: setCurrentRowWithKey as a find Row link. This should create the ANCHOR link, i.e.
    ">Show Emps</a>
    c) create another page, EditOneEmp, double click to open that, and EXPAND the Roster table in the Data Control Palette. Drop the Emp table on this as an input form.
    d) Now go back to the Struts-config.xml and draw a forward from the browse to the detail and name the forward setCurrentRowWithKey.
    e) run the browse page from the struts-config.xml file (right click and run). Click on the hyperlink and you should go to the detail input page.
    You may have messed up in Step 4(c). Also, when you run into these situations, as we've run into this NUMEROUS times at work, follow through the example again, IN A NEW WORKSPACE, and redo each and every step. You must be EXACT or things don't work.
    Let me know if it works out.
    Thanks.

  • AVTransmit2/AVReceive2 won't work with multicast???

    hi!
    I tried AVTransmit2 and AVReceive2 with some friends of mine on a little LAN. Because we don't want to use broadcast, we tried to use multicast and started the server with the following command: java AVTransmit2 javasound:// 224.112.112.112 1200
    The server created the session without any problems. So we started the client with the following command: java AVReceive2 224.112.112.112/1200 But it didn't get the stream from the server.
    Do these two solutions not support multicasting, or is it because of our network?
    i hope u can help me :)
    thx
    Tweezer

    I've already tried this. So I started it with: java AVReceive2 224.112.112.112/1200/1
    but it's still waiting for data. So I think AVTransmit2 doesn't support multicast??? Because the constructor just allows sourceurl, destip and destport, but there is no ttl :/
    thx 4 help!
    Tweezer
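Before blaming AVTransmit2, it may be worth verifying that multicast works on the network at all, independent of JMF. Here is a minimal self-test sketch using only java.net (the class and method names are made up for illustration; the group address and port just mirror the ones used above). If the receive call times out, the problem is in the network or OS configuration, not in JMF:

```java
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

public class MulticastCheck {
    // Joins the group, sends one datagram to it, and returns what was received.
    // Throws (e.g. SocketTimeoutException) if multicast delivery is not working.
    static String multicastSelfTest(String groupAddr, int port) throws Exception {
        InetAddress group = InetAddress.getByName(groupAddr);
        try (MulticastSocket receiver = new MulticastSocket(port);
             MulticastSocket sender = new MulticastSocket()) {
            receiver.joinGroup(group);
            receiver.setSoTimeout(2000);   // don't wait forever
            sender.setTimeToLive(1);       // keep the packet on the local network
            byte[] payload = "ping".getBytes("UTF-8");
            sender.send(new DatagramPacket(payload, payload.length, group, port));
            byte[] buf = new byte[64];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            receiver.receive(packet);      // blocks until the datagram arrives
            receiver.leaveGroup(group);
            return new String(packet.getData(), 0, packet.getLength(), "UTF-8");
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(multicastSelfTest("224.112.112.112", 1200));
    }
}
```

Run this on both machines first; only once it succeeds is it worth debugging the JMF side.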

  • Can I receive both audio and video in the same window?

    hi,
    I am using AVTransmit2 and AVReceive2 for transmission and receiving. All is going well, but the problem is that AVReceive2 shows the audio and video transmissions in two different windows and uses different players for audio and video. Can you tell me how I can merge audio and video at the receiving end? I am already merging audio and video at the transmission end. Can anyone tell me how to solve this problem?
    thx in advance.

    Hi,
    I have the same situation - to use one window to show both rtp audio and video stream, but use the ControlPanelComponent of audio player to control both players.
    I want to use the audio player as the master player so that the video player could be synchronized with it. So: audioPlayer.addController(videoPlayer).
    In the only window frame for both players, I want to show the VisualComponent (VC) of the video and the ControlPanelComponent (CC) of the audio. So I add both components respectively to the window frame, after I get them from each player. Because the streams are coming from the network over RTP, and they do not become realized at the same time, I add the VC and CC of the first realized player (whether it is audio or video) to the window frame at first, and then change them to the desired video VC and audio CC.
    Though I check that both players are realized and not started before I call the addController() method, the application gets stuck in this method. Of course I call it in a try block, but no exception is thrown.
    When I use TimeBase to synchronize them, i.e., videoPlayer.setTimeBase(audioPlayer.getTimeBase()), everything goes fine. But I cannot control both players from one ControlPanelComponent in this way.
    Anybody having any idea how to solve this problem, please reply.
    By the way, in the example code such as AVReceive2, where audio and video streams are shown in two windows separately, the user can use the ControlPanelComponent of the audio to control both players, though there seems to be no relevant code to support that function. The ControlPanelComponent of the video cannot control the audio player anyway. Does anyone have an explanation for this?

  • Help on RTP session!

    Actually, I'm currently testing a Java program that can send voice through the network, but I faced a problem only on Windows 2000. I had tested on Windows 98, Windows XP and Windows 2000 locally. But when I send voice between two Windows 2000 PCs, this error comes out: "Cannot create the RTP session: Can't open local data port: 12468". Can anybody tell me what this error means? Thank you.

    Sorry for my English, I'm from Spain.
    The problem is in the transmitter, because it sends the stream on the same port that the receiver is going to use to receive.
    I changed this code in AVTransmit2:
    private String createTransmitter() {
        int port, port2;
        ...
        port2 = portBase - 2*i - 2;
        localAddr = new SessionAddress(InetAddress.getLocalHost(), port2);
        ...
    }
    Now you can use AVTransmit2 and AVReceive2 on the same machine.
    PS: if you want to save the sound to an MPG file, you must open another session in AVReceive2 with the port + 2.

  • Audio+Video chat works, but one small problem. Please help.

    HI..
    as I posted once before in a different query's thread, I have implemented audio and video chat using AVTransmit2 and AVReceive2.
    Audio and video chat work well. By video chat I mean not only video but VIDEO+AUDIO as well, just like the usual Yahoo video chat.
    But the problem is like this:
    I have given the user an option to switch from audio chat to video chat.
    Now if a user first goes to audio chat and then switches to video chat, the problem comes. When he is shifted to video chat I close down the previous audio chat with RTPManager.dispose(); I close the player and everything.
    Then I start video transmit/receive ALONG WITH audio transmit/receive.
    This time the video starts but the audio doesn't work;
    it says "waiting for rtp data to arrive..."
    The problem is that when this new stream of audio data comes, the receiver somehow thinks that it is the same old stream, since it is from the same transmitter IP, and so it tries to UPDATE the previous stream.
    It means there is some problem with the close method or RTPManager.dispose(), which should have disposed of all the stuff well before the new connection was made.
    Please help ASAP.
    This is crunch time for my project.

    Hi anandreyvk,
    well, I had tried doing removeTargets, and I had also disposed of the RTPManager properly. As it turns out, the problem was not with any of that; it was that I was incorrectly passing the RTPManager from AVReceive2 to AVTransmit2.
    Actually I am using just one RTPManager, since I am receiving and transmitting on the same port.
    I've solved the problem, but I am not sure if this is the right way. What do you think?
    Is creating just one RTPManager {by that, I mean initializing it only once} a good idea? Since we are supposed to call both AVTransmit2 and AVReceive2 with the "LOCAL PORT" {which in itself is a matter of discussion though!}, my project mate and I thought initializing the RTPManager only once was the better option.
    What's your take on all of this?

  • Video transmission over internet

    Hi to all,
    I am developing video and audio transmission over the internet using JMF in Java.
    It is working on the LAN with a multicast IP address like 224.235.202:42050.
    My problem is: when I want to transmit over the internet, which IP address and port do I need to use for transmission and receiving?
    Please help me out.
    If possible, give me some sample code which works over the internet for transmitting video and audio.

    Hi khrisna,
    I decided to make a conference over the LAN first before it goes to the internet.
    It doesn't work yet. I modified the code of AVTransmit2 and AVReceive2.
    And I think I don't set the frame rate (default from the web cam).
    Video and audio encoding are set just like AVTransmit2.
    I have a plan to make a client-server transmit. In a Local Area Network, I use socket network programming.
    Now, I have a problem cloning the datasource. I think someone in this forum has the same problem as I have.
    What's your suggestion about these:
    1. merge the datasources for audio and video capture first and then make one processor. After the processor has realized, clone the dataOutput from the processor, or
    2. make a processor for each of the audio and video captures; after both processors have been realized, merge the dataOutput from both, and then clone the mergedDataOutput.
    Which one can work better? Your help, please!

  • Video Conference over Internet??

    Hi all experienced JMF-ers,
    I'm new to Java, and of course to JMF too. But I have a strong will to learn it.
    I'm interested in developing a video conference using JMF.
    I've read the JMF Guide (although I don't understand it all yet) and some references, and I study several free source codes on the internet (it helps me to understand JMF faster).
    With my limited experience I modified AVTransmit2 and AVReceive2, and I can make the code transmit captured video and audio over the LAN (still a one-way stream), although it still has a delay of 1-2 seconds.
    But when I look at this forum, I read several messages saying that video conferencing using JMF still has problems, such as not being able to transmit a video stream over the Internet yet.
    Is there anyone who has solved this problem? (I have not tested DaddyE's code for video conferencing yet.) If yes, I would appreciate it if you would share your in-depth experience of its high-level architecture. And what is the algorithm? I'll try to write the code by myself. Thanks in advance.
    Regards,
    Adhi


  • Video conferencing over LAN

    I want to do a project on video conferencing over a LAN. Please help me with the implementation.

    Have you tried AVTransmit2 and AVReceive2 to transmit a multicast stream, and then had multiple receivers join and disconnect?
    It works without any code modification, so I would study the differences between that and yours.
    I think I asked this before: are you transmitting multisession, and receiving multisession?
    Are you using Linux by chance?

  • Very low video framerate when sound activated

    Hi Guys,
    hope you can help me with a problem that has already cost me 2 days.
    Basically I used the AVTransmit2 and AVReceive2 classes as a draft for my own implementations (see below).
    DataSource video_ds = getVideoSource(new Dimension(640, 480), 30.0f);
    DataSource audio_ds = getAudioSource();
    // when I use only one datasource here everything is ok (normal audio or 30fps video)
    DataSource[] sources = new DataSource[] { video_ds, audio_ds };
    try {
        processor_ds = Manager.createMergingDataSource(sources);
    } catch (IncompatibleSourceException e) {
        return "IncompatibleSourceException creating data sources";
    }
    processor = javax.media.Manager.createProcessor(processor_ds);
    .......
    The problem I have is when using both datasources, video and audio, to create a merged datasource.
    Then the framerate drops down to 1-2 fps. I tested this on two different PCs here with three different cameras, and with JMStudio it seems to be the same.
    When capturing audio and video and transmitting via RTP, the resulting frame shows video at 1 fps (the monitor seems to be OK).
    When using AVTransmit2 and AVReceive2 without any changes I have the same problem.
    Is this a known problem that a webcam stream with audio can't be transferred fluently?
    When needed, I can post the code of my two classes to be run standalone (It's quite big so I leave it out for the moment).
    Thanks in advance for your help.
    Edit:
    This is the output of my capture program:
    Found video capture device vfw:Microsoft WDM Image Capture (Win32):0
    DataSource created for device
    Supported video format: YUV Video Format: Size = java.awt.Dimension[width=640,height=480] MaxDataLength = 460800 DataType = class [B yuvType = 2 StrideY = 640 StrideUV = 320 OffsetY = 0 OffsetU = 307200 OffsetV = 384000
    Supported video format: RGB, 160x120, Length=57600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=480, Flipped
    Supported video format: RGB, 176x144, Length=76032, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=528, Flipped
    Supported video format: RGB, 320x240, Length=230400, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=960, Flipped
    Supported video format: RGB, 352x288, Length=304128, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=1056, Flipped
    Supported video format: RGB, 640x480, Length=921600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=1920, Flipped
    Setting capture format to RGB, 640x480, FrameRate=30.0
    Input format for RTP conversion: RGB, 640x480, FrameRate=30.0, Length=921600, 24-bit, Masks=3:2:1, PixelStride=3, LineStride=1920, Flipped
    Track 0 is set to transmit as: JPEG/RTP, 640x480, FrameRate=30.0
    Input format for RTP conversion: LINEAR, 44100.0 Hz, 16-bit, Stereo, LittleEndian, Signed
    Track 1 is set to transmit as: gsm/rtp, 8000.0 Hz, 8-bit, Mono
    - Set quality to 0.5 on com.sun.media.codec.video.jpeg.NativeEncoder$1$QCA@1ef9f1d
    RTP channel opened to: 192.168.0.101 5030
    RTP channel opened to: 192.168.0.101 5032
    and here is the output of the receiver / playback program:
      - Open RTP session for: addr: 192.168.0.101 port: 5030 ttl: 1
      - Waiting for RTP data to arrive...
      - Received new RTP stream: gsm/rtp, 8000.0 Hz, Mono
          The sender of this stream had yet to be identified.
      - Received new RTP stream: JPEG/RTP
          The sender of this stream had yet to be identified.
      - A new participant had just joined: otto@meier
      - A new participant had just joined: otto@meier
      - The previously unidentified stream
          gsm/rtp, 8000.0 Hz, Mono
          had now been identified as sent by: otto@meier
      - A new participant had just joined: otto@meier
      - The previously unidentified stream
          JPEG/RTP
          had now been identified as sent by: rlamotte@chumbellAlso curious and it would be great if somebody could bring some light in there:
    I thought I would need two RTPManager instances for the two tracks, so one socket for audio and one for video.
    But, when I configure my receiver only for listening on port 5030 with one RTPManager instance, I get both, audio and video.
    So, what's the normal approach to implement a video/audio transmission via RTPManager?
    Edited by: user2293420 on 18.07.2011 08:33

    Hi captfoss,
    thanks for your reply.
    "Highest possible quality" and JMF shouldn't really ever be used in the same sentence. JMF is 10-year-old streaming technology designed to run on machines and networks that were mediocre 12 years ago.
    I know, I was just referring to the 640x480 resolution. This has to be enough for the moment, and maybe next year we will see how to improve that.
    So, make your code work like the AVTransmit2 example works. They wrote the example the way they did for a reason.
    That's exactly how I have implemented it now.
    private String createTransmitter() {
        DataSource ds = dataOutput;
        PushBufferDataSource pbds = (PushBufferDataSource) dataOutput;
        PushBufferStream pbss[] = pbds.getStreams();
        SendStream sendStream;
        String portNumber;
        String media = "";
        int port = 0;
        try {
            for (int i = 0; i < pbss.length; ++i) {
                SessionAddress address = null;
                RTPManager mgr = RTPManager.newInstance();
                if (pbss[i].getFormat() instanceof VideoFormat) {
                    port = localVideoPort;
                    media = "Video";
                    videoRtpManager = mgr;
                } else if (pbss[i].getFormat() instanceof AudioFormat) {
                    port = localAudioPort;
                    media = "Audio";
                    audioRtpManager = mgr;
                }
                address = new SessionAddress(InetAddress.getLocalHost(), port);
                mgr.initialize(address);
                mgr.addTarget(address);
                sendStream = mgr.createSendStream(ds, i);
                sendStream.start();
                portNumber = (port == SessionAddress.ANY_PORT) ? "ANY_PORT" : Integer.toString(port);
                log.info("RTP Manager for " + media + " capture created on port " + portNumber);
            }
        } catch (Exception e) {
            e.printStackTrace();
            return e.getMessage();
        }
        return null;
    }
    It's working very well and stably now. When I transmit H263_RTP video with DVI_RTP audio, it works great (apart from the resolution; the FPS is constantly > 25). The problem is when I change to JPEG_RTP (640x480) with DVI_RTP: then the framerate drops to 1 FPS. This is just the encoding part of the processor; the capture format of the datasource is always RGB and is not touched when I change the encoding format.
    Also, I can transmit JPEG_RTP with 640x480 and 30 FPS but only when I disable the audio track in the processor.
    Any clue why this could happen?

  • JMF Receiving Media source

    I need to display several transmitted media streams in a single frame. I planned to have a JPanel for each media source, but I couldn't accomplish that. I followed the AVTransmit2 and AVReceive2 example and customized it according to my needs. (My situation is one receiver for many transmitters.) Can anyone help me, please?

    I planed to have JPanels for each media source.
    OK, that is just about the correct approach.
    but i couldn't accomplish that.
    This is a simple Swing question now; here is an outline to accomplish displaying 4 videos in a 2 x 2 grid:
    Player p1, p2, p3, p4;
    JPanel getVideoPanel(Player p) {
        JPanel panel = new JPanel(new BorderLayout());
        panel.add(p.getVisualComponent(), BorderLayout.CENTER);
        panel.add(p.getControlPanelComponent(), BorderLayout.SOUTH);
        return panel;
    }
    void addAllVideos(JFrame f) {
        Container cont = f.getContentPane();
        cont.setLayout(new GridLayout(2, 2));
        cont.add(getVideoPanel(p1));
        cont.add(getVideoPanel(p2));
        cont.add(getVideoPanel(p3));
        cont.add(getVideoPanel(p4));
    }
    // * somewhere in the constructor or main method, make a display JFrame
    // * make realized players for each media locator
    // * call addAllVideos() with the display JFrame as the parameter
    // * call pack() on the display JFrame
    Note: The above code is neither tested nor even compiled.
    My situation is one receiver for many transmitters.
    A much better approach for this is the JDesktopPane + JInternalFrame combination.
    For this you would have to make an instance of JInternalFrame for each player and then add it to a JDesktopPane (which should be added to the main JFrame's content pane).
    Here is a link which may help you if you choose this (much better) approach:
    [http://java.sun.com/docs/books/tutorial/uiswing/components/internalframe.html]
    Thanks!
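Following the JDesktopPane + JInternalFrame suggestion above, here is a minimal sketch. The class and method names are made up for illustration, and `visual` stands in for whatever `player.getVisualComponent()` returns in the JMF application; any AWT Component can be dropped in:

```java
import java.awt.BorderLayout;
import java.awt.Component;
import javax.swing.JDesktopPane;
import javax.swing.JInternalFrame;

public class PlayerDesktop {
    // Wraps one player's visual component in its own internal frame.
    // Flags: resizable, closable, maximizable, iconifiable.
    static JInternalFrame makePlayerFrame(String title, Component visual) {
        JInternalFrame frame = new JInternalFrame(title, true, true, true, true);
        frame.getContentPane().add(visual, BorderLayout.CENTER);
        frame.setSize(320, 240);
        frame.setVisible(true);
        return frame;
    }

    // One desktop holds an internal frame per transmitter; in the real
    // application, add the returned desktop to the main JFrame's content pane.
    static JDesktopPane makeDesktop(Component... visuals) {
        JDesktopPane desktop = new JDesktopPane();
        int offset = 0;
        for (Component visual : visuals) {
            JInternalFrame frame = makePlayerFrame("Stream " + (offset / 30 + 1), visual);
            frame.setLocation(offset, offset);   // cascade the frames
            desktop.add(frame);
            offset += 30;
        }
        return desktop;
    }
}
```

As new transmitters join (via ReceiveStreamListener), you would call makePlayerFrame for each newly realized player and add the frame to the existing desktop.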

  • Multicast transmission problem

    Hi,
    I'm trying to multicast audio streams from one computer to another on the same network. The multicast address for my network is 10.255.255.255. When I continually check the number of transmitted bytes with ifconfig, I see a gradual increase in bytes being sent out to the network.
    When I look at ifconfig on the receiving computer, it is receiving bytes as well, but AVReceive2 can't detect the RTP stream. It just keeps on waiting for the data to come. By the way, I use AVTransmit2 to transmit the data. But when I just unicast the RTP stream from one computer to another, the RTP streaming works!!
    Has anyone had the same problem as me? Can anyone point out any problems that I may be having? Oh yeah, and I am running JMF between two Red Hat Linux 9 notebooks.
    Do I have to make some extra modification to an IP routing table somewhere or something?
    Thanks to anybody that can PLEASE help me!!!
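A side note on the address: 10.255.255.255 is the subnet's broadcast address rather than a multicast group; IPv4 multicast groups live in the 224.0.0.0/4 range. On Linux it is also worth checking that a route exists for that range. A sketch of the usual checks (eth0 is an assumption; substitute your interface, and run the route change as root):

```shell
# Is the interface multicast-capable? (look for the MULTICAST flag)
ifconfig eth0

# Is there a route for the multicast range? (look for 224.0.0.0)
route -n

# If not, add one for 224.0.0.0/4 on the chosen interface
route add -net 224.0.0.0 netmask 240.0.0.0 dev eth0
```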

    Hi Stefan,
    If I unicast an audio file from one computer to the other, it works fine. I didn't try sending a video file, but I think it should work the same way. I've got two computers, one with the IP address 10.0.0.1 and the other with 10.0.0.2. Doing an ifconfig, the broadcast address on each computer is stated as 10.255.255.255. When I use 10.255.255.255 to send my audio file, using ifconfig again I notice that the receiving computer is receiving bytes, but the program isn't recognising the RTP stream. My program uses a similar address resolution to AVTransmit2 and AVReceive2.
    So I can't figure out what's wrong with the code. OK, for AVReceive2, what do you usually type on the command line to receive broadcast packets? Do we have to modify the /etc/hosts file again??
    Thanks

  • AVTransmit2, AVReceive2 or AVTransmit3, AVReceive3

    What is the difference between (AVTransmit2 and AVTransmit3) and (AVReceive2 and AVReceive3)? And which is better for transmitting and receive multiple video and audio sessions? I have looked at the two programs and am not sure which to use.

    The "2" versions of the programs are designed for sending normal RTP sessions that conform to the RTP standard of using UDP sockets as the transport protocol.
    The "3" versions of the programs are for special applications where you want the RTP packets to go over something other than UDP, although the default custom transport layer just uses UDP (it's there more for demonstration purposes than for actual use).
    If you don't know which you want, you want "2"... :-)
