JMF+JTAPI, JMF+COMMAPI, or JMF+IBM's J323

Hi all,
I've tried and tried and tried... to use JMF and COMMAPI to read data from a modem. It's human voice data. I can read from an InputStream the bytes the modem receives, so I THINK I'm getting the voice in the InputStream as bytes (please correct me if I'm wrong). Now the problem is to transform this InputStream into a DataSource. I tried with the example from the JMF page (LiveStream and DataSource), but I have no idea how to convert the InputStream into a DataSource and 'play' the InputStream over the phone. Then I tried the XTAPI implementation of JTAPI... but it was impossible to get the following code working:
JtapiPeer peer = JtapiPeerFactory.getJtapiPeer("net.xtapi.XJtapiPeer");
myprovider = peer.getProvider("Serial");
// or:
myprovider = peer.getProvider("net.xtapi.serviceProvider.OpenH323");
Terminal[] t = myprovider.getTerminals();
because there was no way to get a Provider...
Then I saw that some people were talking about using IBM's J323 and I tried it, because they said the other JTAPI implementation was incomplete:
JtapiPeer peer = JtapiPeerFactory.getJtapiPeer("com.ibm.telephony.mm.MM_JtapiPeer");
myprovider = peer.getProvider("H.323 Endpoint;login=223.254.254.94");
Terminal[] t = myprovider.getTerminals();
Using this I got the Provider, but the getTerminals method returns an empty array...
I'm DESPERATE, I'll GIVE YOU 10 DUKES if you help me.
I'm losing a lot of time on this and my boss thinks I'm twiddling my thumbs instead of working on it, because at this moment I have NOTHING.
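
A rough, untested sketch of one way to wrap the modem's InputStream as a JMF DataSource: a PullDataSource whose single stream just reads the InputStream. The RAW content type is an assumption, and JMF still has to be told the real audio format (sample rate, u-law/linear, mono, etc.) before a Player or Processor can do anything with the bytes - that part depends on what the modem actually delivers.

import java.io.IOException;
import java.io.InputStream;
import javax.media.Time;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PullDataSource;
import javax.media.protocol.PullSourceStream;

// Wraps an already-open InputStream (e.g. from the COMMAPI serial port) as a JMF DataSource.
public class InputStreamDataSource extends PullDataSource {

    private final InputStream in;
    private final PullSourceStream[] streams;

    public InputStreamDataSource(InputStream in) {
        this.in = in;
        this.streams = new PullSourceStream[] { new ModemStream(in) };
    }

    public PullSourceStream[] getStreams() { return streams; }

    // "raw" bytes: a Processor/Player still needs the real AudioFormat from somewhere
    public String getContentType() { return ContentDescriptor.RAW; }

    public void connect() throws IOException { }          // stream is already open
    public void disconnect() { try { in.close(); } catch (IOException e) { } }
    public void start() throws IOException { }
    public void stop() throws IOException { }
    public Object getControl(String type) { return null; }
    public Object[] getControls() { return new Object[0]; }
    public Time getDuration() { return DURATION_UNKNOWN; }

    private static class ModemStream implements PullSourceStream {
        private final InputStream in;
        private boolean eos = false;

        ModemStream(InputStream in) { this.in = in; }

        public int read(byte[] buf, int off, int len) throws IOException {
            int n = in.read(buf, off, len);               // blocks until the modem delivers bytes
            if (n == -1) eos = true;
            return n;
        }

        public boolean willReadBlock() {
            try { return in.available() == 0; } catch (IOException e) { return true; }
        }

        public ContentDescriptor getContentDescriptor() {
            return new ContentDescriptor(ContentDescriptor.RAW);
        }

        public long getContentLength() { return LENGTH_UNKNOWN; }
        public boolean endOfStream() { return eos; }
        public Object getControl(String type) { return null; }
        public Object[] getControls() { return new Object[0]; }
    }
}

A Processor created from something like this would still need its track format set explicitly (for example 8 kHz u-law) before it can be played or transmitted.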

Two other places to find out about J323:
http://www.zurich.ibm.com/csc/distribsys/j323/j323-2.html
---start quote---
Can I use the IBM J323 Engine to control my modem?
Can I use the IBM J323 Engine send a fax?
No. The IBM J323 Engine is an implementation of the H.323 version 2 protocol as defined by the ITU-T (International Telecommunication Union). The H.323 protocol is a Voice over IP protocol. It is used to establish phone calls over an IP network (e.g. the internet). Therefore the J323 Engine will only communicate with H.323 endpoints. An H.323 endpoint can be a terminal (e.g. Microsoft NetMeeting) or a gateway (to connect to the PSTN - Public Switched Telephony Network - which is the "normal" telephony network).
---end quote---
Also, when I run the J323 demo-media.bat, the "-localaddr" argument seems to map to the "login=" part of the provider string, but this devolves to simply the string used to identify you when you contact other H.323 peers. The trace I get is:
C:\Program Files\j323\bin>C:\Java\jdk\j2sdk1.4.2\bin\java -Djmf.package=com.ibm
.media.teljmf.TjTmProvider -Dh323.endpoint.timer.factor=3 com.ibm.ct.cphone.Main
-trace 1 -mtrace 2 -sdfile demo.sd -locaddr "<[email protected]>"
CPhone: Obtaining JtapiPeer "null"...
CPhone: Obtaining Provider "H.323 Endpoint;login=<[email protected]>"...
Setting h323.endpoint.timer.factor to '3'
IBM J323 Engine v5.0.0
blah blah blah

Similar Messages

  • JMF+COMMAPI, JMF+JTAPI+XTAPI or JMF+JTAPI+IBM's J323

    Hi all,
    we are trying to develop a program for an operations center where policemen will call the control centre from a private telephone line in the street. The operator must receive the incoming call and dispatch it to the right department; well, more or less that's what the end user wants us to build.
    Now we are at the design stage, so we need to test the best solution. First we thought about using COMMAPI, so we tried to make a little test application. We can hang up the modem or dial a number through the modem from the application. We are able to get the data the modem sends us in an InputStream. Here the problems begin... Do you think this InputStream we receive from the modem is the human voice that speaks from the telephone that made the call to the modem? Then, another problem we couldn't solve... If this InputStream is in fact the data (the voice) received by the modem, how can we convert it into a DataSource class to be able to reproduce it on the computer? We've read and tried thousands of times to use the examples on the JMF solutions page, but we couldn't work it out. We think the example most similar to ours is 'Generate LiveStream', which uses LiveStream and a custom DataSource class. But we don't know how to make LiveStream or DataSource read from our InputStream each time the modem sends us data.
    Then we thought that maybe it would be faster to develop the test program using XTAPI, an implementation of JTAPI... but it was impossible to get the following code working:
    JtapiPeer peer = JtapiPeerFactory.getJtapiPeer("net.xtapi.XJtapiPeer");
    myprovider = peer.getProvider("Serial");
    // or:
    myprovider = peer.getProvider("net.xtapi.serviceProvider.OpenH323");
    Terminal[] t = myprovider.getTerminals();
    because there was no way to get a Provider; t was null...
    Then I saw that some people were talking about using IBM's J323 and I tried it, because they said that JTAPI was an incomplete(?) implementation of the H.323 specification. OK, we downloaded the trial version but... nothing: now we get the Provider, but no Terminals are available. We've tried everything and we think everything is installed properly. We used:
    JtapiPeer peer = JtapiPeerFactory.getJtapiPeer("com.ibm.telephony.mm.MM_JtapiPeer");
    myprovider = peer.getProvider("H.323 Endpoint;login=223.254.254.94");
    Terminal[] t = myprovider.getTerminals();
    Although you have to pay for a J323 license, the documentation and support are very poor.
    We think the best way to solve our problem is using COMMAPI, because it would give us more control over the application and the modem, but although we have thought hard about converting the InputStream into a DataSource, we cannot make it work.
    I'm DESPERATE, I will GIVE YOU 10 DUKES if you help us.
    We are losing a lot of time on this and the boss cannot understand why we are going so slowly and why there are so many problems. We see there are a lot of people posting similar problems, but there are no answers. Maybe this would help a lot of people with this matter.
    Thanks in advance.

    How can I integrate JMF and the openh323 project? I want to place calls using the openh323 library and transmit/receive using JMF. However openh323, which uses the H.245 protocol for port and channel negotiation, doesn't let me use the ports that were already negotiated. So, how can I transmit with JMF to these ports?
    Please, help me.
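
    No answer was posted here, but as a rough, untested sketch of the JMF side: once H.245 has negotiated the RTP ports, an RTPManager can be bound to exactly those ports, the same way the RTP code further down this page does it. The host/port values below are placeholders, and the DataSource is assumed to come from a realized Processor that already outputs an RTP format (e.g. ULAW_RTP).

    import java.net.InetAddress;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    public class RtpToNegotiatedPorts {
        // remoteHost/remotePort and localPort are whatever H.245 negotiated; placeholders here
        public static SendStream send(DataSource rtpReadySource, String remoteHost,
                                      int remotePort, int localPort) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), localPort));
            mgr.addTarget(new SessionAddress(InetAddress.getByName(remoteHost), remotePort));
            SendStream stream = mgr.createSendStream(rtpReadySource, 0); // first stream of the source
            stream.start();
            return stream;
        }
    }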

  • How to get a JTAPI implementation

    On compiling my code as given in the JTAPI documentation,
    I found this error
    (I am using Windows 2000 Professional and a Lucent modem):
    Can't get Provider: javax.telephony.JtapiPeerUnavailableException: JtapiPeer: DefaultJtapiPeer could not be instantiated.
    -Rana-

    There is one implementation from IBM called the J323 Engine; try it. You can also find a list of other implementations at http://java.sun.com/products/jtapi/implementations.html
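
    A minimal bootstrap sketch along the lines of the snippets earlier on this page; the peer class and provider string are vendor-specific (the IBM J323 values are shown here with a placeholder address), and getJtapiPeer() throws exactly the exception above when the implementation jar is not on the classpath.

    import javax.telephony.JtapiPeer;
    import javax.telephony.JtapiPeerFactory;
    import javax.telephony.Provider;
    import javax.telephony.Terminal;

    public class JtapiBootstrap {
        public static void main(String[] args) throws Exception {
            // null asks for DefaultJtapiPeer; name the vendor peer class explicitly instead
            JtapiPeer peer = JtapiPeerFactory.getJtapiPeer("com.ibm.telephony.mm.MM_JtapiPeer");
            Provider provider = peer.getProvider("H.323 Endpoint;login=192.168.0.10"); // placeholder address
            Terminal[] terminals = provider.getTerminals();
            System.out.println("Terminals: " + (terminals == null ? 0 : terminals.length));
        }
    }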

  • Help on JTAPI -how to work with it

    Hi, I want to start working with JTAPI.
    I downloaded the class files for JTAPI - it's a zip file containing classes.
    How can I start working with it, and where do I need to extract it?
    Do I need certain hardware?
    And how do I compile a new application I create?
    Please reply to me ASAP, I really need this info!!!
    Thanks!!!

    Sir, my project is to use a fax modem and JTAPI to send and receive faxes. Now I am trying to dial the fax number. I am new to JTAPI, so I have downloaded the Outcall program in JTAPI. I have to execute it, so please kindly advise me...
    Since every JTAPI application needs an implementation, I have downloaded the J323 Engine. I have set the classpath for j323.jar.
    What else should I download?
    Now I have:
    1. downloaded JTAPI 1.2
    2. set the classpath of jtapi.jar
    3. downloaded J323 and only set the classpath of j323.jar
    (is it necessary to follow the instructions in config.html?)
    Please suggest the next step to follow; I have to execute it (Outcall).
    In JTAPI, how do I get the Provider? Please kindly suggest how to execute the program step by step, and what hardware/software I am missing. I am using Win95.
    I will be very thankful to you.
    Please reply to me...
    mail id : [email protected]
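
    For reference, the Outcall example mentioned above boils down to roughly this. It is only a sketch: the provider string, the terminal/address names and the dialled number below are placeholders that depend entirely on the JTAPI implementation installed.

    import javax.telephony.Address;
    import javax.telephony.Call;
    import javax.telephony.JtapiPeer;
    import javax.telephony.JtapiPeerFactory;
    import javax.telephony.Provider;
    import javax.telephony.Terminal;

    public class Outcall {
        public static void main(String[] args) throws Exception {
            JtapiPeer peer = JtapiPeerFactory.getJtapiPeer(null);    // or a vendor peer class
            Provider provider = peer.getProvider("MyServiceString");  // vendor-specific string
            Terminal origTerm = provider.getTerminal("my-terminal");  // originating terminal name
            Address origAddr = provider.getAddress("my-address");     // originating address/number
            Call call = provider.createCall();
            call.connect(origTerm, origAddr, "5551234");               // digits to dial
        }
    }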

  • JTAPI jar file

    Hi
    I want to use JTAPI technology but I am unable to find its jar file.
    If anyone knows about it then please tell me soon. You can also email me at [email protected]
    Thanx in advance.
    Best Regards.
    Asif

    Sir,
    I am a beginner, please help me...
    My project is to use a fax modem and JTAPI to send and receive faxes. Now I am trying to dial the fax number. I am new to JTAPI, so I have downloaded the Outcall program in JTAPI. I have to execute it, so please kindly advise me...
    Since every JTAPI application needs an implementation, I have downloaded the J323 Engine. I have set the classpath for j323.jar.
    What else should I download?
    Now I have:
    1. downloaded JTAPI 1.2
    2. set the classpath of jtapi.jar
    3. downloaded J323 and only set the classpath of j323.jar
    (is it necessary to follow the instructions in config.html?)
    Please suggest the next step to follow; I have to execute it (Outcall).
    In JTAPI, how do I get the Provider? Please kindly suggest how to execute the program step by step, and what hardware/software I am missing. I am using Win95.
    Please mail your ideas to the address below.
    I will be very thankful to you.
    Please reply to me...
    mail id : [email protected]

  • IBM MP4, DivX, JMF 2.1.1e

    Hello.
    I've tested the above configuration to play a standard DivX. I had problems, of course, and in the end nothing works.
    Using JMF Studio on XP, I get a pink background, and finally the program refuses to exit. In software mode (no acceleration selected in Windows), nothing better: it loads, plays something (hard drive accesses) and then crashes, still with the pink background.
    Has anyone succeeded in playing a DivX? If so, what codec was used to compress the file?
    Thanks a lot

    Yes, I can play it in Windows Media Player, which is the most intriguing part.
    The codec used is DX50, which the IBM codec is said to decompress. In some parts it does, as the playback starts.
    By the way, that IBM logo is dumb. I can't understand why they cripple the codec for the 'evaluation version' if they don't even have a registered version. I'd love to register it, if it were possible, and if the codec ran correctly.. :<

  • JTAPI, JMF, and the 7921G

    We have a JTAPI application that uses the JMF library to send audio. The streamed audio can be heard on the desktop phones (e.g. 7941, 7961), wireless 7920 phones, and the Cisco IP Communicator, but cannot be heard on the wireless 7921G or 7925G phones. There are no errors in the programming logs when the audio is streamed with JMF to these phones.
    To try to pinpoint the problem, we have gathered a number of traces and logs using the Cisco Real-Time Monitoring Tool and Wireshark. The only suspicious thing that we saw was in the Call Statistics on the 7921G itself. The received packets showed 38 and the discarded packets also showed 38. But we have no idea what the discarded packets are or why the 7921G would discard packets to begin with. We don't see discarded packets during a normal phone call.
    People that we have talked to about this problem have indicated that they think it's a codec problem, but when we look at the codec in the traces, the phone status log, and the events that we get back at the programmatic level, everything indicates that we are sending and receiving with G.711 u-law with a 30 msec packet size, and this matches the codec of the streaming application. We have even turned off the CUCM service parameter "G722 Codec Enabled" to make sure that this setting was not overriding the G.711 negotiation. Changing the parameter had no effect on the problem.
      We have checked the more obvious problems such as the volume on the 7921G phone and the volume of the original recording.  In addition, we tried to rule out a WAP problem by using the CIPC on a wireless laptop connected to the same WAP as the 7921G phone.  The announcement played just fine on the CIPC on the wireless laptop.  
      At the programming level, the audio is being streamed over a CTI port by the application when the call is connected to the destination phone. So, essentially, once the JTAPI application receives the CiscoRTPOutputStartedEv event, we set up the remote connection and start the stream. 
    The CTI port is set with:
    CiscoMediaCapability[] caps = new CiscoMediaCapability[1];
    caps[0] = CiscoMediaCapability.G711_64K_30_MILLISECONDS;

    The audio stream format is set to ULAW_RTP (codec G.711 u-law):
    Format audioFormat = new AudioFormat( AudioFormat.ULAW_RTP, 8000, 8, 1 );
    
    I've adjusted the packet size to 20 ms for the CUCM defaults as well as for the CiscoMediaCapabilities, thinking that something might not be in sync. While making those adjustments affected the quality of the audio on the phones that were working, it did not affect being able to hear the audio on the 7921G. This problem occurs on the CCM 4.X, CUCM 6.0 and CUCM 7.0.1 platforms.
      Hopefully, there is a setting that we've overlooked, and there is a simple solution to the problem.
    Thanks,
    -Judy

    Yes, your statement is an accurate way to state it. We wait until the receiving phone has been answered and use the remote port obtained from the RTPOutputProperties to stream a unicast announcement to the phone. Then, we wait for a DTMF response from the user in IVR fashion.
    We'd assumed that other people were streaming announcements to the 7921G and not having a problem but wondered if anyone was doing it the way that we are. The previous post is helpful in trying to understand more about the phone. But, the question remains, what is different about how the 7921G reacts to the stream and how CIPC reacts to the stream going through the same WAP such that the announcement is not heard on the 7921G?
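
    In case it helps anyone reproducing this: one way to pin the RTP packet size on the JMF side is through the u-law packetizer in a track's codec chain. This is a rough, untested sketch, assuming a configured Processor whose audio track is already negotiated to ULAW_RTP; at 8 kHz, 8-bit mono u-law, 30 ms is 240 bytes (the 20 ms default is 160).

    import javax.media.Codec;
    import javax.media.control.TrackControl;

    public class UlawPacketSize {
        // audioTrack comes from processor.getTrackControls() on a configured Processor
        public static void force(TrackControl audioTrack, int bytesPerPacket) throws Exception {
            com.sun.media.codec.audio.ulaw.Packetizer packetizer =
                    new com.sun.media.codec.audio.ulaw.Packetizer();
            packetizer.setPacketSize(bytesPerPacket);             // e.g. 240 for 30 ms
            audioTrack.setCodecChain(new Codec[] { packetizer });
        }
    }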

  • Help needed- Caller tone application using JTAPI and JMF

    Hi to All,
    I want to make a Caller Tone Application for Cisco IP phone.
    For that application I am using JTAPI and JMF.
    I am new to these two technologies.
    Can somebody help me with how to accomplish this?
    ---Ashish

    Hi Jerry,
    You can run analog input and counter tasks concurrently.  You can start them using software timing basically at the same time which may be fine for your needs and is the easiest to implement.  You also can use a hardware start trigger to start the tasks if you prefer.  It all depends on the level of synchronization you need. 
    You have not mentioned at what rate you will be acquiring data on your analog inputs.  The M Series PCI-6225 has 80 analog inputs and may suit your needs.  You will need to know what sampling rate you are trying to achieve.  Any M Series device will definitely give you the best value. 
    Hope this helps!
    Laura

  • IBM MPG4 FOR JMF ERROR WITH DIVX

    I have installed and registered the IBM alphaWorks MPEG-4 codec, and it works with the MPG4 parser on AVI files, but with DivX it opens the file and when I play it (with jmfstudio.exe) the following happens:
    # An unexpected error has been detected by HotSpot Virtual Machine:
    # EXCEPTION_ACCESS_VIOLATION (0xc0000005) at pc=0x784ac024, pid=2592, tid=184
    # Java VM: Java HotSpot(TM) Client VM (1.5.0_02-b09 mixed mode)
    # Problematic frame:
    # C [ntdll.dll+0x4c024]
    --------------- T H R E A D ---------------
    Current thread (0x0ae3b438): JavaThread "Loop thread: com.sun.media.renderer.video.GDIRenderer@17f242c" [_thread_in_vm, id=184]
    If I use the sample app from IBM, everything seems to work OK. Does anybody know anything about this? Thanks in advance...

    Hello,
    Thank you for your message! At least I am not the only one…
    I just cannot find any clues by "googling" my problem.
    It's been more than 48 hours and I still haven't gotten an answer from DivX support nor from the online shop I bought the videos from.
    Did you check the Mac OS console after an unsuccessful login attempt?
    I got something like this:
    03/11/09 21:33:50 [0x0-0x106106].com.divx.DivX_Player[1642] error code: 336151573 in s3_pkt.c line: 1052 data: SSL alert number 45 flags: 3
    03/11/09 21:33:50 [0x0-0x106106].com.divx.DivX_Player[1642] human readable: error:14094415:SSL routines:SSL3READBYTES:sslv3 alert certificate expired
    At least I can enjoy the DivX in a "thumbnail mode" running Windows XP on my iMac…
    Thank you Parallels desktop…

  • Problem with using JMF audio over a network

    Hiya, I'm using IBM JMF code but I'm having problems trying to get it to transmit data from the MediaTransmitter to the MediaPlayerFrame.
    I'm kinda new to JMF so I assume I'm missing something basic that explains why this doesn't work.
    Any help would be greatly appreciated.
    MediaPlayerFrame
    import javax.media.*;
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.io.*;
    * An instance of the MediaPlayerFrame may be used to display any media
    * recognized * by JMF.  This is intended to be a very simple GUI example,
    * displaying all possible controls for the given media type.
    public class MediaPlayerFrame extends JFrame {
         * The frame title.
        private static final String FRAME_TITLE = "developerWorks JMF Tutorial " +
            "Media Player";
         * The panel title of the main control panel.
        private static final String CONTROL_PANEL_TITLE = "Control Panel";
        // location and size variables for the frame.
        private static final int LOC_X = 100;
        private static final int LOC_Y = 100;
        private static final int HEIGHT = 500;
        private static final int WIDTH = 500;
         private final static long serialVersionUID = 1;
         * The current player.
        private Player player = null;
         * The tabbed pane for displaying controls.
        private JTabbedPane tabPane = null;
         * Create an instance of the media frame.  No data will be displayed in the
         * frame until a player is set.
        public MediaPlayerFrame() {         
            super(FRAME_TITLE);
            System.out.println("MediaPlayerFrame");
            setLocation(LOC_X, LOC_Y);
            setSize(WIDTH, HEIGHT);
            tabPane = new JTabbedPane();
            getContentPane().add(tabPane);
            /* adds a window listener so that the player may be cleaned up before
               the frame actually closes.
            addWindowListener(new WindowAdapter() {
                                   * Invoked when the frame is being closed.
                                  public void windowClosing(WindowEvent e) {
                                      closeCurrentPlayer(); 
                                      /* Closing this frame will cause the entire
                                         application to exit.  When running this
                                         example as its own application, this is
                                         fine - but in general, a closing frame
                                         would not close the entire application. 
                                         If this behavior is not desired, comment
                                         out the following line:
                                      System.exit(0);
         * Creates the main panel.  This panel will contain the following if they
         * exist:
         * - The visual component: where any visual data is displayed, i.e. a
         * movie uses this control to display the video.
         * - The gain component:   where the gain/volume may be changed.  This
         * is often * contained in the control panel component (below.)
         * - The control panel component: time and some extra info regarding
         * the media.
        private JPanel createMainPanel() {
            System.out.println("createMainPanel");
            JPanel mainPanel = new JPanel();
            GridBagLayout gbl = new GridBagLayout();
            GridBagConstraints gbc = new GridBagConstraints();
            mainPanel.setLayout(gbl);
            boolean visualComponentExists = false;
            // if the visual component exists, add it to the newly created panel.
            if (player.getVisualComponent() != null) {
                visualComponentExists = true;
                gbc.gridx = 0;
                gbc.gridy = 0;
                gbc.weightx = 1;
                gbc.weighty = 1;
                gbc.fill = GridBagConstraints.BOTH;
                mainPanel.add(player.getVisualComponent(), gbc);
            // if the gain control component exists, add it to the new panel.
            if ((player.getGainControl() != null) &&
                (player.getGainControl().getControlComponent() != null)) {
                gbc.gridx = 1;
                gbc.gridy = 0;
                gbc.weightx = 0;
                gbc.weighty = 1;
                gbc.gridheight = 2;
                gbc.fill = GridBagConstraints.VERTICAL;
                mainPanel.add(player.getGainControl().getControlComponent(), gbc);
            // Add the control panel component if it exists (it should exists in
            // all cases.)
            if (player.getControlPanelComponent() != null) {
                gbc.gridx = 0;
                gbc.gridy = 1;
                gbc.weightx = 1;
                gbc.gridheight = 1;
                if (visualComponentExists) {
                    gbc.fill = GridBagConstraints.HORIZONTAL;
                    gbc.weighty = 0;
                } else {
                    gbc.fill = GridBagConstraints.BOTH;
                    gbc.weighty = 1;
                mainPanel.add(player.getControlPanelComponent(), gbc);
            return mainPanel;
         * Sets the media locator.  Setting this to a new value effectively
         * discards any Player which may have already existed.
         * @param locator the new MediaLocator object.
         * @throws IOException indicates an IO error in opening the media.
         * @throws NoPlayerException indicates no player was found for the
         * media type.
         * @throws CannotRealizeException indicates an error in realizing the
         * media file or stream.
        public void setMediaLocator(MediaLocator locator) throws IOException,
            NoPlayerException, CannotRealizeException {
              System.out.println("setMediaLocator: " +locator);
            // create a new player with the new locator.  This will effectively
            // stop and discard any current player.
            setPlayer(Manager.createRealizedPlayer(locator));       
         * Sets the player reference.  Setting this to a new value will discard
         * any Player which already exists.  It will also refresh the contents
         * of the pane with the components for the new player.  A null value will
         * stop the discard the current player and clear the contents of the
         * frame.
        public void setPlayer(Player newPlayer) {      
            System.out.println("setPlayer");
            // close the current player
            closeCurrentPlayer();          
            player = newPlayer;
            // refresh the tabbed pane.
            tabPane.removeAll();
            if (player == null) return;
            // add the new main panel
            tabPane.add(CONTROL_PANEL_TITLE, createMainPanel());
            // add any other controls which may exist in the player.  These
            // controls should already contain a name which is used in the
            // tabbed pane.
            Control[] controls = player.getControls();
            for (int i = 0; i < controls.length; i++) {
            if (controls[i].getControlComponent() != null) {
                tabPane.add(controls[i].getControlComponent());
    * Stops and closes the current player if one exists.
    private void closeCurrentPlayer() {
    if (player != null) {
    player.stop();
    player.close();
    * Prints a usage message to System.out for how to use this class
    * through the command line.
    public static void printUsage() {
    System.out.println("Usage: java MediaPlayerFrame mediaLocator");
    * Allows the user to run the class through the command line.
    * Only one argument is allowed, which is the media locator.
    public static void main(String[] args) {
    try {
    if (args.length == 1) {
    MediaPlayerFrame mpf = new MediaPlayerFrame();
    /* The following line creates a media locator using the String
    passed in through the command line. This version should
    be used when receiving media streamed over a network.
    mpf.setMediaLocator(new MediaLocator(args[0]));
    /* the following line may be used to create and set the media
    locator from a simple file name. This works fine when
    playing local media. To play media streamed over a
    network, you should use the previous setMediaLocator()
    line and comment this one out.
    //mpf.setMediaLocator(new MediaLocator(
    // new File(args[0]).toURL()));
    mpf.setVisible(true);
    } else {
    printUsage();
    } catch (Throwable t) {
    t.printStackTrace();
    MediaTransmitter
    import javax.media.*;
    import javax.media.control.*;
    import javax.media.protocol.*;
    import javax.media.format.*;
    import java.io.IOException;
    import java.io.File;
    * Creates a new media transmitter.  The media transmitter may be used to
    * transmit a data source over a network.
    public class MediaTransmitter {
         * Output locator - this is the broadcast address for the media.
        private MediaLocator mediaLocator = null;
         * The data sink object used to broadcast the results from the processor
         * to the network.
        private DataSink dataSink = null;
         * The processor used to read the media from a local file, and produce an
         * output stream which will be handed to the data sink object for
         * broadcast.
        private Processor mediaProcessor = null;
         * The track formats used for all data sources in this transmitter.  It is
         * assumed that this transmitter will always be associated with the same
         * RTP stream format, so this is made static.
        private static final Format[] FORMATS = new Format[] {
            new AudioFormat(AudioFormat.MPEG_RTP)};
         * The content descriptor for this transmitter.  It is assumed that this
         * transmitter always handles the same type of RTP content, so this is
         * made static.
        private static final ContentDescriptor CONTENT_DESCRIPTOR =
            new ContentDescriptor(ContentDescriptor.RAW_RTP);
         * Creates a new transmitter with the given outbound locator.
        public MediaTransmitter(MediaLocator locator) {
            mediaLocator = locator;
         * Starts transmitting the media.
        public void startTransmitting() throws IOException {
            // start the processor
            mediaProcessor.start();
            // open and start the data sink
            dataSink.open();
            dataSink.start();
         * Stops transmitting the media.
        public void stopTransmitting() throws IOException {
            // stop and close the data sink
            dataSink.stop();
            dataSink.close();
            // stop and close the processor
            mediaProcessor.stop();
            mediaProcessor.close();
         * Sets the data source.  This is where the transmitter will get the media
         * to transmit.
        public void setDataSource(DataSource ds) throws IOException,
            NoProcessorException, CannotRealizeException, NoDataSinkException {
            /* Create the realized processor.  By calling the
               createRealizedProcessor() method on the manager, we are guaranteed
               that the processor is both configured and realized already. 
               For this reason, this method will block until both of these
               conditions are true.  In general, the processor is responsible
               for reading the file from a file and converting it to
               an RTP stream.
            mediaProcessor = Manager.createRealizedProcessor(
                new ProcessorModel(ds, FORMATS, CONTENT_DESCRIPTOR));
            /* Create the data sink.  The data sink is used to do the actual work
               of broadcasting the RTP data over a network.
            dataSink = Manager.createDataSink(mediaProcessor.getDataOutput(),
                                              mediaLocator);
         * Prints a usage message to System.out for how to use this class
         * through the command line.
        public static void printUsage() {
            System.out.println("Usage: java MediaTransmitter mediaLocator " +
                               "mediaFile");
            System.out.println("  example: java MediaTransmitter " +
                "rtp://192.168.1.72:49150/audio mysong.mp3");
            System.out.println("  example: java MediaTransmitter " +
                "rtp://192.168.1.255:49150/audio mysong.mp3");
         * Allows the user to run the class through the command line.
         * Only two arguments are allowed; these are the output media
         * locator and the mp3 audio file which will be broadcast
         * in the order.
        public static void main(String[] args) {
            try {
                if (args.length == 2) {
                    MediaLocator locator = new MediaLocator(args[0]);
                    MediaTransmitter transmitter = new MediaTransmitter(locator);
                    System.out.println("-> Created media locator: '" +
                                       locator + "'");
                    /* Creates and uses a file reference for the audio file,
                       if a url or any other reference is desired, then this
                       line needs to change.
                    File mediaFile = new File(args[1]);
                    DataSource source = Manager.createDataSource(
                        new MediaLocator(mediaFile.toURL()));
                    System.out.println("-> Created data source: '" +
                                       mediaFile.getAbsolutePath() + "'");
                    // set the data source.
                    transmitter.setDataSource(source);
                    System.out.println("-> Set the data source on the transmitter");
                    // start transmitting the file over the network.
                    transmitter.startTransmitting();
                    System.out.println("-> Transmitting...");
                    System.out.println("   Press the Enter key to exit");
                    // wait for the user to press Enter to proceed and exit.
                    System.in.read();
                    System.out.println("-> Exiting");
                    transmitter.stopTransmitting();
                } else {
                    printUsage();
            } catch (Throwable t) {
                t.printStackTrace();
            System.exit(0);

    Okay, here's the output copied out.
    Media Transmitter
    C:\John\Masters Project\Java\jmf1\MediaPlayer>java MediaTransmitter rtp://127.0.
    0.1:2000/audio it-came-upon.mp3
    -> Created media locator: 'rtp://127.0.0.1:2000/audio'
    -> Created data source: 'C:\John\Masters Project\Java\jmf1\MediaPlayer\it-came-u
    pon.mp3'
    streams is [Lcom.sun.media.multiplexer.RawBufferMux$RawBufferSourceStream;@1decd
    ec : 1
    sink: setOutputLocator rtp://127.0.0.1:2000/audio
    -> Set the data source on the transmitter
    -> Transmitting...
       Press the Enter key to exit
    MediaPlayerFrame
    C:\John\Masters Project\Java\jmf1\MediaPlayer>java MediaPlayerFrame rtp://127.0.
    0.1:2000/audio
    MediaPlayerFrame
    setMediaLocator: rtp://127.0.0.1:2000/audio
    As I said, it just kinda stops there; what it should be doing is opening the MediaPlayer.
    "MediaPlayerFrame" and "setMediaLocator: rtp://127.0.0.1:2000/audio" are just printouts I used to track where the code is getting to.

  • Problem with file descriptors not released by JMF

    Hi,
    I have a problem with file descriptors not released by JMF. My application opens a video file, creates a DataSource and a Processor, and the video frames generated are transmitted using the RTP protocol. Once video transmission ends, if we stop and close the Processor associated with the DataSource, the file descriptor identifying the video file is not released (checkable through /proc/pid/fd). If we repeat this process again and again, the process reaches the maximum number of file descriptors allowed by the operating system.
    The same problem has been reproduced with JMF-2.1.1e-Linux in several environments:
    - Red Hat 7.3, Fedora Core 4
    - jdk1.5.0_04, j2re1.4.2, j2sdk1.4.2, Blackdown Java
    This is part of the source code:
    // video.avi with tracks audio(PCMU) and video(H263)
    String url="video.avi";
    if ((ml = new MediaLocator(url)) == null) {
    Logger.log(ambito,refTrazas+"Cannot build media locator from: " + url);
    try {
    // Create a DataSource given the media locator.
    Logger.log(ambito,refTrazas+"Creating JMF data source");
    try
    ds = Manager.createDataSource(ml);
    catch (Exception e) {
    Logger.log(ambito,refTrazas+"Cannot create DataSource from: " + ml);
    return 1;
    p = Manager.createProcessor(ds);
    } catch (Exception e) {
    Logger.log(ambito,refTrazas+"Failed to create a processor from the given url: " + e);
    return 1;
    } // end try-catch
    p.addControllerListener(this);
    Logger.log(ambito,refTrazas+"Configure Processor.");
    // Put the Processor into configured state.
    p.configure();
    if (!waitForState(p.Configured))
    Logger.log(ambito,refTrazas+"Failed to configure the processor.");
    p.close();
    p=null;
    return 1;
    Logger.log(ambito,refTrazas+"Configured Processor OK.");
    // So I can use it as a player.
    p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.RAW_RTP));
    // videoTrack: track control for the video track
    DrawFrame draw= new DrawFrame(this);
    // Instantiate and set the frame access codec to the data flow path.
    try {
    Codec codec[] = {
    draw,
    new com.sun.media.codec.video.colorspace.JavaRGBToYUV(),
    new com.ibm.media.codec.video.h263.NativeEncoder()};
    videoTrack.setCodecChain(codec);
    } catch (UnsupportedPlugInException e) {
    Logger.log(ambito,refTrazas+"The processor does not support effects.");
    } // end try-catch CodecChain creation
    p.realize();
    if (!waitForState(p.Realized))
    Logger.log(ambito,refTrazas+"Failed to realize the processor.");
    return 1;
    Logger.log(ambito,refTrazas+"realized processor OK.");
    /* After realize processor: THESE LINES OF SOURCE CODE DOES NOT RELEASE ITS FILE DESCRIPTOR !!!!!
    p.stop();
    p.deallocate();
    p.close();
    return 0;
    // It continues up to the end of the transmission, properly drawing each video frame and transmit them
    Logger.log(ambito,refTrazas+" Create Transmit.");
    try {
    int result = createTransmitter();
    } catch (Exception e) {
    Logger.log(ambito,refTrazas+"Error Create Transmitter.");
    return 1;
    } // end try-catch transmitter
    Logger.log(ambito,refTrazas+"Start Procesor.");
    // Start the processor.
    p.start();
    return 0;
    } // end of main code
    * stop when event "EndOfMediaEvent"
    public int stop () {
    try {   
    /* THIS PIECE OF CODE AND VARIATIONS HAVE BEEN TESTED
    AND THE FILE DESCRIPTOR IS NEVER RELEASED */
    p.stop();
    p.deallocate();
    p.close();
    p= null;
    for (int i = 0; i < rtpMgrs.length; i++)
    if (rtpMgrs[i]==null) continue;
    Logger.log(ambito, refTrazas + "removeTargets;");
    rtpMgrs[i].removeTargets( "Session ended.");
    rtpMgrs[i].dispose();
    rtpMgrs[i]=null;
    } catch (Exception e) {
    Logger.log(ambito,refTrazas+"Error Stoping:"+e);
    return 1;
    return 0;
    } // end of stop()
    * Controller Listener.
    public void controllerUpdate(ControllerEvent evt) {
    Logger.log(ambito,refTrazas+"\nControllerEvent."+evt.toString());
    if (evt instanceof ConfigureCompleteEvent ||
    evt instanceof RealizeCompleteEvent ||
    evt instanceof PrefetchCompleteEvent) {
    synchronized (waitSync) {
    stateTransitionOK = true;
    waitSync.notifyAll();
    } else if (evt instanceof ResourceUnavailableEvent) {
    synchronized (waitSync) {
    stateTransitionOK = false;
    waitSync.notifyAll();
    } else if (evt instanceof EndOfMediaEvent) {
    Logger.log(ambito,refTrazas+"\nEvento EndOfMediaEvent.");
    this.stop();
    else if (evt instanceof ControllerClosedEvent)
    Logger.log(ambito,refTrazas+"\nEvent ControllerClosedEvent");
    close = true;
    waitSync.notifyAll();
    else if (evt instanceof StopByRequestEvent)
    Logger.log(ambito,refTrazas+"\nEvent StopByRequestEvent");
    stop =true;
    waitSync.notifyAll();
    Many thanks.

    It's a bug in the H.263 codec; if you test it without an H.263 track or with another video codec, the release will be OK.
    You can try using a non-Sun H.263 codec like the one from the fobs or jffmpeg projects.

  • PLUG-IN IN JMF

    Hi everybody!
    Please answer my question very fast.
    How can I set up a new plug-in in JMF?
    Thanks ,very much
    ROZA

    What you should do is extract the plugin you have downloaded into your JMF/lib directory.
    After that, launch jmfregistry from JMF/bin, go to the PlugIns and Codec tabs, and add the relevant plugin.
    For instance, type com.ibm.media.codec.video.mpeg.MpegVideo and click "Add".
    If it works correctly no error message should be shown.
    Normally, on the plugin's site you should find the relevant documentation on how to do it, as each plugin has its own installation procedure.
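
    The same registration that jmfregistry does can also be done programmatically with PlugInManager. A rough sketch, assuming the codec class named above is already on the classpath: the supported formats are queried from the codec itself, and commit() writes the registry so the registration survives restarts.

    import javax.media.Codec;
    import javax.media.Format;
    import javax.media.PlugInManager;

    public class RegisterCodec {
        public static void main(String[] args) throws Exception {
            String name = "com.ibm.media.codec.video.mpeg.MpegVideo";
            Codec codec = (Codec) Class.forName(name).newInstance();
            Format[] in = codec.getSupportedInputFormats();
            Format[] out = codec.getSupportedOutputFormats(null);
            boolean added = PlugInManager.addPlugIn(name, in, out, PlugInManager.CODEC);
            if (added) {
                PlugInManager.commit();   // persist the registration
            }
            System.out.println(name + (added ? " registered" : " was not registered"));
        }
    }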

  • HELP, Diagnostics Applet JMF classes.....Not Found

    Hi,
    I installed JMF and verified the jars are included in the classpath and path, but when I run the diagnostic applet page, it still says the JMF classes were not found. Does anyone have any idea?
    classpath:
    .;.;.;C:\PROGRA~1\JMF21~1.1E\lib\sound.jar;C:\PROGRA~1\JMF21~1.1E\lib\jmf.jar;C:\PROGRA~1\JMF21~1.1E\lib
    ;C:\WINDOWS\java\classes;.
    path:
    C:\Program Files\ThinkPad\Utilities;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\Program F
    iles\IBM\Infoprint Select;C:\Notes;C:\Program Files\XLView;C:\lotus\compnent;C:\Utilities;C:\Program Fil
    es\IBM\Personal Communications\;C:\Program Files\IBM\Trace Facility\;C:\Program Files\ThinkPad\ConnectUt
    ilities;C:\WINDOWS\Downloaded Program Files;C:\Program Files\Common Files\Lenovo;C:\Program Files\ATI Te
    chnologies\ATI Control Panel
    JMF Diagnostics:
    Java 1.1 compliant browser.....Maybe
    JMF classes.....Not Found
    http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/jmfdiagnostics.html

    Everything looks fine as far as the setup goes... Unless you're having problems with a specific JMF application, I wouldn't worry about the diagnostic applet telling you it isn't installed correctly.
    Very well could be a browser security issue with the applet...

  • Does anyone know how to encode PCM to A-law using JMF?

    I have a problem: how do I encode PCM to A-law?
    I found some code for it, but it does not work in JMF.
    A friend on the forum told me to write it myself, because JMF does not support A-law.
    But I'm new to this, so I want to know whether somebody could help me if you have code for this.
    My code is like this:
    capture.java
    * support : locator, datasource
    public class Capture extends MainFrame {
        private static final long serialVersionUID = 1L;
        private Processor cp = null;
        private Boolean isMix = false;
        private Boolean isUlaw = false;
        private MainFrame mf = null;
        public Capture(MainFrame mf, boolean useVideo) {
            this.isMix = useVideo;
            this.mf = mf;
        public Capture(MainFrame mf) {
            this.mf = mf;
        public Capture(boolean useVideo) {
            this.isMix = false;
        //get video MediaLocator
        public MediaLocator getVideoLocator() {
            MediaLocator VideoLocator = null;
            Vector VideoCaptureList = null;
            CaptureDeviceInfo VideoCaptureDeviceInfo = null;
            VideoFormat videoFormat = new VideoFormat(VideoFormat.YUV);
            VideoCaptureList = CaptureDeviceManager.getDeviceList(videoFormat);
            if (VideoCaptureList.size() > 0) {
                VideoCaptureDeviceInfo = (CaptureDeviceInfo) VideoCaptureList.elementAt(0);
                VideoLocator = VideoCaptureDeviceInfo.getLocator();
            } else {
    //--- Search video device
                VideoCaptureList = (Vector) CaptureDeviceManager.getDeviceList(null).clone();
                Enumeration getEnum = VideoCaptureList.elements();
                while (getEnum.hasMoreElements()) {
                    VideoCaptureDeviceInfo = (CaptureDeviceInfo) getEnum.nextElement();
                    String name = VideoCaptureDeviceInfo.getName();
                    if (name.startsWith("vfw:")) {
                        CaptureDeviceManager.removeDevice(VideoCaptureDeviceInfo);
                int nDevices = 0;
                for (int i = 0; i < 10; i++) {
                    String name = com.sun.media.protocol.vfw.VFWCapture.capGetDriverDescriptionName(i);
                    if (name != null && name.length() > 1) {
                        System.err.println("Found device" + name);
                        System.err.println("Querying device.Please wait....");
                        com.sun.media.protocol.vfw.VFWSourceStream.autoDetect(i);
                        nDevices++;
                VideoCaptureDeviceInfo = (CaptureDeviceInfo) CaptureDeviceManager.getDeviceList(videoFormat).elementAt(0);
                VideoLocator = VideoCaptureDeviceInfo.getLocator();
            return VideoLocator;
        //get audio MediaLocator
        public MediaLocator getAudioLocator() {
            MediaLocator AudioLocator = null;
            Vector AudioCaptureList = null;
            CaptureDeviceInfo AudioCaptureDeviceInfo = null;
            AudioFormat audioFormat = new AudioFormat(AudioFormat.LINEAR);
            AudioCaptureList = CaptureDeviceManager.getDeviceList(audioFormat);
            if (AudioCaptureList.size() > 0) {
                AudioCaptureDeviceInfo = (CaptureDeviceInfo) AudioCaptureList.elementAt(0);
                AudioLocator = AudioCaptureDeviceInfo.getLocator();
                System.out.println("get audio locator");
            } else {
                System.out.println("empty");
                AudioLocator = null;
            return AudioLocator;
        public DataSource getDataSource() {
            try {
                DataSource[] source = new DataSource[2];    //audio 0; video 1;
                source[0] = Manager.createDataSource(getAudioLocator());
                if (isMix) {
                    DataSource MixDataSource = null;
                    source[1] = Manager.createDataSource(getVideoLocator());
                    MixDataSource = Manager.createMergingDataSource(source);
                    return MixDataSource;
                } else {
                    return source[0];
            } catch (Exception ex) {
                System.out.println("---error0---" + ex);
                return null;
        public DataSource getDataSourceByProcessor() {
            StateHelper sh = null;
            DataSource OutDataSource = null;
            DataSource InputSource = null;
            InputSource = getDataSource();
            try {
                cp = Manager.createProcessor(InputSource);
            } catch (IOException ex) {
                System.out.println("---error1---" + ex);
            } catch (NoProcessorException ex) {
                System.out.println("---error2---" + ex);
            sh = new StateHelper(cp);
            if (!sh.configure(10000)) {
                System.out.println("Processor Configured Error!");
                System.exit(-1);
            ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
            cp.setContentDescriptor(cd);
            AudioFormat ulawFormat = new AudioFormat(AudioFormat.ULAW_RTP, 8000.0, 8, 1);
           // change encode to alaw here?
            TrackControl[] tracks = cp.getTrackControls();
            boolean encodingState = false;
            for (int i = 0; i < tracks.length; i++) {
                if (!encodingState && tracks[i] instanceof FormatControl) {
                if (tracks[i].setFormat(ulawFormat) == null) {
                    tracks[i].setEnabled(false);
    } else {
    encodingState = true;
    //---- ptime set
                   if (isUlaw) {
         try {
         System.out.println("Cambiando la lista de codecs...");
         Codec[] codec = new Codec[3];
         codec[0] = new com.ibm.media.codec.audio.rc.RCModule();
         codec[1] = new com.ibm.media.codec.audio.ulaw.JavaEncoder();
         //codec[2] = new com.ibm.media.codec.audio.ulaw.Packetizer();
         codec[2] = new com.sun.media.codec.audio.ulaw.Packetizer();
         ((com.sun.media.codec.audio.ulaw.Packetizer) codec[2]).setPacketSize(160);
         (tracks[i]).setCodecChain(codec);
         } catch (UnsupportedPlugInException ex) {
         } catch (NotConfiguredError ex) {
    } else {
    tracks[i].setEnabled(false);
    if (!encodingState) {
    System.out.println("Encode error");
    System.exit(-1);
    if (!sh.realize(10000)) {
    System.out.println("Processor Realized Error!");
    System.exit(-1);
    OutDataSource = cp.getDataOutput();
    return OutDataSource;
    //run
    public void start() {
    cp.start();
    //stop
    public void close() {
    cp.close();

    test.java (get outputDatasource form capture.java)
    package siphone.test;
    import java.io.IOException;
    import java.net.InetAddress;
    import javax.media.Format;
    import javax.media.Manager;
    import javax.media.Player;
    import javax.media.PlugInManager;
    import javax.media.control.BufferControl;
    import javax.media.format.UnsupportedFormatException;
    import javax.media.protocol.DataSource;
    import javax.media.protocol.PushBufferDataSource;
    import javax.media.protocol.PushBufferStream;
    import javax.media.rtp.InvalidSessionAddressException;
    import javax.media.rtp.Participant;
    import javax.media.rtp.RTPControl;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.ReceiveStream;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SendStreamListener;
    import javax.media.rtp.SessionAddress;
    import javax.media.rtp.SessionListener;
    import javax.media.rtp.event.ByeEvent;
    import javax.media.rtp.event.NewParticipantEvent;
    import javax.media.rtp.event.NewReceiveStreamEvent;
    import javax.media.rtp.event.NewSendStreamEvent;
    import javax.media.rtp.event.ReceiveStreamEvent;
    import javax.media.rtp.event.RemotePayloadChangeEvent;
    import javax.media.rtp.event.SendStreamEvent;
    import javax.media.rtp.event.SessionEvent;
    import javax.media.rtp.rtcp.SourceDescription;
    import siphone.decode.AlawRtpDecoder;
    import siphone.device.Capture;
    import siphone.gui.MainFrame;
    * @author kaiser
    public class Dial implements SessionListener, ReceiveStreamListener, SendStreamListener {
        private String ip = null;
        private int TargetPort;
        private int LocalPort;
        private MainFrame mf = null;
        private Capture capture = null;
        private DataSource oDataSource = null;
        private RTPManager rtp = null;
        private SendStream stream = null;
        private Player play = null;
        private Boolean LogStatu = false;
        static private Format AlawRtpFormat;
        public Dial(String ip, String rport, int lport, MainFrame mf) {
            this.ip = ip;
            this.TargetPort = Integer.parseInt(rport);
            this.LocalPort = lport;
            this.mf = mf;
            capture = new Capture(false);
            oDataSource = capture.getDataSourceByProcessor();
        public static void main(String[] args) throws IOException {
            Dial dial = new Dial("192.168.200.40", "40000", 10000, null);
            dial.Init();
            System.in.read();
            dial.stop();
        public synchronized void Init() {
            PushBufferDataSource pushDataSource = (PushBufferDataSource) oDataSource;
            PushBufferStream pushStream[] = pushDataSource.getStreams();
            SessionAddress localAddress, targetAddress;
            try {
                rtp = RTPManager.newInstance();
                rtp.addReceiveStreamListener(this);
                rtp.addSendStreamListener(this);
                rtp.addSessionListener(this);
                //port: TargetPort or LocalPort;
                localAddress = new SessionAddress(InetAddress.getLocalHost(), TargetPort);
                targetAddress = new SessionAddress(InetAddress.getByName(ip), TargetPort);
                rtp.initialize(localAddress);
                rtp.addTarget(targetAddress);
                BufferControl bc = (BufferControl) rtp.getControl("javax.media.control.BufferControl");
                if (bc != null) {
                    bc.setBufferLength(100);
                stream = rtp.createSendStream(oDataSource, 0);
                stream.start();
                capture.start();
                System.out.println("voice starting...........\n");
            } catch (UnsupportedFormatException ex) {
                System.err.println("error0:\n" + ex);
            } catch (InvalidSessionAddressException ex) {
                System.err.println("error1:\n" + ex);
            } catch (IOException ex) {
                System.err.println("error2:\n" + ex);
        //stop
        public void stop() {
            capture.close();
            if (rtp != null) {
                rtp.removeTargets("close session");
                rtp.dispose();
                rtp = null;
            System.out.println("close session...........\n");
        public void update(SessionEvent evt) {
            if (evt instanceof NewParticipantEvent) {
                Participant sPart = ((NewParticipantEvent) evt).getParticipant();
                System.out.print("  - join: " + sPart.getCNAME());
        public void update(ReceiveStreamEvent rex) {
            ReceiveStream rStream = rex.getReceiveStream();
            Participant rPart = rex.getParticipant();
            if (rex instanceof NewReceiveStreamEvent) {
                try {
                    DataSource rData = rStream.getDataSource();
                    RTPControl rcl = (RTPControl) rData.getControl("javax.media.rtp.RTPControl");
                    if (rcl != null) {
                        System.out.print("  - receive RTPControl info | the AudioFormat: " + rcl.getFormat());
                    } else {
                        System.out.print("  - receive RTPControl info");
                    if (rPart == null) {
                        System.out.print(" UNKNOWN");
                    } else {
                        System.out.print("data form " + rPart.getCNAME());
                    play = Manager.createRealizedPlayer(rData);
                    if (play != null) {
                        play.start();
                } catch (Exception ex) {
                    System.out.print("NewReceiveStreamEvent Exception " + ex.getMessage());
                    return;
            } else if (rex instanceof ByeEvent) {
                System.out.print("  - receive bye message: " + rPart.getCNAME() + "   exit");
                stop();
                return;
    }
    Can anyone tell me how to do this?
    Thank you very much!
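
    As a rough, untested sketch of what "JMF does not support A-law" means in practice: you can ask for A-law on a track and check whether setFormat() accepts it. With a stock JMF install it normally comes back null (no A-law encoder is installed), and then the options are a custom Codec plugin registered through PlugInManager, or staying with u-law as in the code above.

    import javax.media.Processor;
    import javax.media.control.TrackControl;
    import javax.media.format.AudioFormat;

    public class ALawCheck {
        // p must already be a configured Processor (same state as cp above)
        public static boolean tryALaw(Processor p) {
            AudioFormat alaw = new AudioFormat(AudioFormat.ALAW, 8000, 8, 1);
            TrackControl[] tracks = p.getTrackControls();
            for (int i = 0; i < tracks.length; i++) {
                if (tracks[i].getFormat() instanceof AudioFormat
                        && tracks[i].setFormat(alaw) != null) {
                    return true;    // an A-law codec chain was found
                }
            }
            return false;           // no A-law encoder installed
        }
    }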

  • JMF on Pocket PC problem

    Hello all,
    I'm now using the IBM J9 Personal Profile to develop an application.
    I need to write an audio player first. I am trying to port JMF to the Pocket PC.
    I have read this post...
    http://forums.java.sun.com/thread.jspa?threadID=238274&start=0&tstart=30
    For me, I use the JMF cross-platform edition and use customerizer.exe from the Windows version to generate custom.jar. However, I need to compile RegistryLib.java manually because the program said it cannot find it. I tried using J2SE SDK 1.4.2 and 1.1.8 to generate RegistryLib.class, placed it under the correct directory, and re-ran the customerizer using the same settings as before.
    Then I import custom.jar into IBM WebSphere Studio Device Developer and run the application. The exception follows...
    IOException in readRegistry: java.io.EOFException
    Unable to handle format: msadpcm, 22050.0 Hz, 4-bit, Stereo, Unsigned, 22311.0 frame rate, FrameSize=8192 bits
    Failed to realize: com.sun.media.PlaybackEngine@287a287a
    Error: Unable to realize com.sun.media.PlaybackEngine@287a287a
    When I port this program to the PPC (I use an iPAQ 5550) and run it with the J9 VM, the exception is a NoPlayerException.
    I am thinking the problem may be in compiling RegistryLib.java.
    It seems that someone has successfully ported JMF to the PPC. Does anyone have similar experience? How can I solve it?

    Hi,... regarding your post, you have succeeded in implementing JMF cross-platform. I've downloaded it to my PC but I can't find any RegistryLib.java... where is it??
    I have been running this project for months... I really need your reply,.. thanks
