RTP streaming port question

I want to stream captured audio from a Linux box to a PC. It works when I run JMStudio on both machines. I wrote my own Java program for the server side. It captures and can play back locally. I use a DataSink to stream the audio, and it seems to set up and run, but I don't receive anything at the PC (using JMStudio). Here is where I set up the DataSink:
         try {
            // ds is the output DataSource of the capture Processor
            String url = "rtp://192.168.1.160:22000/audio/1";
            MediaLocator m = new MediaLocator(url);
            DataSink d = Manager.createDataSink(ds, m);
            System.out.println("data sink created");
            d.open();
            System.out.println("data sink opened");
            d.start();
            System.out.println("Streaming...");
         } catch (Exception e) {
            System.out.println("data sink exception: " + e.getMessage());
            e.printStackTrace();
            System.exit(-1);
         }
    I've captured the LAN packets. With JMStudio I see the UDP data packets go from server port 22000 to client port 22000. I also see occasional UDP packets on 22001, which I assume are status and command (RTCP). With my code, I see only the status packets and no data packets. The odd thing (to me) is that the server port for the command/status traffic is random each time I run it; the client side is always 22001.
Do I need to force my server data port to 22000? And if so how? If not, any ideas on what's going on? I'm not sure what would help: code listing, log file, and/or LAN activity lists.
- Steve

(sheepish look) I forgot to start the DataSink and the Processor; the port thing was not the problem.
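For the record, if the local data port ever did need pinning: the DataSink/MediaLocator route leaves the local (server-side) port up to JMF, which is why it looks random. The usual way to control it is to transmit through an RTPManager with explicit SessionAddress values instead of a DataSink. A rough sketch of that approach, with the addresses from the post used purely as illustration and ds standing for the capture Processor's output DataSource:

         import java.net.InetAddress;
         import javax.media.protocol.DataSource;
         import javax.media.rtp.RTPManager;
         import javax.media.rtp.SendStream;
         import javax.media.rtp.SessionAddress;

         public class FixedPortTransmit {
             // 'ds' must already be in an RTP format (e.g. ULAW_RTP) coming out of the Processor.
             public static SendStream transmit(DataSource ds) throws Exception {
                 RTPManager mgr = RTPManager.newInstance();
                 // Local data port 22000 (RTCP on 22001), receiver (JMStudio) at 192.168.1.160:22000.
                 mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), 22000));
                 mgr.addTarget(new SessionAddress(InetAddress.getByName("192.168.1.160"), 22000));
                 SendStream stream = mgr.createSendStream(ds, 0); // first stream of the DataSource
                 stream.start();
                 return stream;
             }
         }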

Similar Messages

  • RTP stream question

    Hello community, I have a question.
    The setup is this: a Cisco SCCP phone located in Russia is registered to a CallManager located in Denmark. The phone dials a number that is translated and routed via a route list to a local gateway in Russia. In the local Cisco gateway the dialed number matches a dial peer with a SIP trunk to a Lync mediation server, also located in Denmark.
    How will the final RTP stream flow from the Cisco phone to the Lync server?
    Br. Kasper

    Hi. It actually all comes down to the reporting in Lync. The issue is that Lync reports "bad" conferences per SIP trunk. I want to make a more detailed design where I place a SIP trunk in every affiliate around Europe, in their local gateway, but since the phones are registered in the CUCM cluster in Denmark and the Lync mediation server (the SIP trunk connection to the local gateway) is also located in Denmark, I need a redesign of the RTP path, including the MTP. That is why I want to put the MTP resource in the local affiliate gateway (in this example, Russia). By doing this, the RTP path would be phone -> local MTP -> local VGW SIP trunk -> Lync mediation in Denmark. I just needed a second opinion on the design.
    The current setup would report all "bad" conferences as being in Denmark, even if they originated in other countries. The local dial pattern is just call-forwarded to the DK SIP trunk -> Lync mediation. The SIP trunks in CUCM are limited to only one connection to the Lync mediation server.

  • VoIP ports, RTP stream

    I have no audio with VoicePulse VoIP and am told by their technical support that the router must allow ports 10000 - 20000 to be opened. The situation is that I cannot hear an inbound call but they can hear me, and VoicePulse states that the RTP stream is not being sent.
    Does anyone have any insight to this? I have used other VoIP carriers successfully with a TimeCapsule.

    What was the port range? I use VoicePulse and am considering switching to a current AirPort Extreme router. Do you have any other issues with VoicePulse that you'd blame on the AirPort? Any issues with call quality given the lack of QoS?

  • RTP Streaming seek & stop question.

    Is it possible to seek in an RTP stream received at the client? Since RTP works on a push-data concept, is there a way to implement this?
    I want the server to stop streaming when the client exits. Normally the sender sends a Bye event to the receiver. Is it possible to go the other way around and tell the sender to stop streaming?
    Thanks.

    From what I understand, RTP on its own is basically a blind transmission, like watching TV or listening to the radio: if the RTP stream (analogous to the TV signal or radio waves) is being broadcast, you can receive it. With this method you have no real control; you are just given whatever is being streamed.
    However, RTSP lets you send commands to a server that streams RTP content, so you can start, stop, pause, and in some cases move forward or backward to a particular time in the sequence.
    You may find the following link useful: http://www.cs.columbia.edu/~hgs/teaching/ais/slides/RTSP.pdf
    hope this helps,
    Stef
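    One rough way to get the "receiver tells the sender to stop" behaviour is an out-of-band control channel alongside the RTP session: the receiver sends a plain "STOP" message over TCP and the sender then closes its SendStream and disposes of the RTPManager. A sketch only; the port number, the "STOP" message and the StopListener class are made up for illustration:

        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.net.ServerSocket;
        import java.net.Socket;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SendStream;

        // Runs on the sender: waits for a "STOP" line from the receiver, then shuts the transmission down.
        public class StopListener implements Runnable {
            private final RTPManager mgr;
            private final SendStream stream;

            public StopListener(RTPManager mgr, SendStream stream) {
                this.mgr = mgr;
                this.stream = stream;
            }

            public void run() {
                try (ServerSocket control = new ServerSocket(5555);     // illustrative control port
                     Socket client = control.accept();
                     BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()))) {
                    if ("STOP".equals(in.readLine())) {
                        stream.close();                     // stop sending on this stream
                        mgr.removeTargets("Session over");  // the reason string goes out in the RTCP BYE
                        mgr.dispose();
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }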

  • RTP/RTCP Port query for JXTA

    Hello everyone, I have a question about JXTA.
    I am trying to develop a video transmission system using AVReceive3.java and AVTransmit3.java. I have already made GUIs for both sides, and they work pretty well.
    By replacing the default DatagramSocket with JxtaMulticastSocket, I then want to implement P2P. In RTPJxtaSocketAdapter.java (implemented by myself), RTP and RTCP read two different, unrelated advertisements to communicate. The transmitter side seems able to transmit packets, but the receiver side cannot receive the stream and just shows the message "waiting for RTP data to arrive...". What is the problem? Or am I totally wrong about the design?
    Since RTP and RTCP use neighboring ports to coordinate the transmission, what am I supposed to do so that the receiver side can get the RTP stream in JXTA?
    Sorry about my poor English; I hope I was clear enough.
    Looking forward to your reply. Thanks.

    I had the same problem, and when I saw this post I was angry that no one had left an answer. So, I will.
    My application switches between transmitter and receiver to make a half-duplex communication. The port numbers are static: a user (receiver) always uses the same port, and the transmitter connects to all of the ports.
    Example:
    User A -> port 22222
    User B -> port 22225
    User A talks to user B on port 22225
    User A hands control to user B
    User B talks to user A on port 22222
    The fix is very simple: leave at least 3 ports between each user.
    I hope this could help someone.
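    For what it's worth, the 3-port spacing works because each user's session occupies a pair of neighbouring ports: the even RTP data port plus the next (odd) port for RTCP, so two adjacent users would otherwise collide. In plain JMF (JXTA aside) that pair is spelled out through SessionAddress, roughly as below; addresses and ports are illustrative only:

        import java.net.InetAddress;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SessionAddress;

        public class PortPairExample {
            public static RTPManager openSession() throws Exception {
                RTPManager mgr = RTPManager.newInstance();
                InetAddress local = InetAddress.getLocalHost();
                // User A binds RTP on 22222 and RTCP on 22223.
                mgr.initialize(new SessionAddress(local, 22222, local, 22223));
                // User B (illustrative address) uses the next free pair: RTP 22224, RTCP 22225.
                InetAddress peer = InetAddress.getByName("192.168.0.5");
                mgr.addTarget(new SessionAddress(peer, 22224, peer, 22225));
                return mgr;
            }
        }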

  • Mixing audio RTP streams for a conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (audio/G711_ulaw) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks included in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G711 ulaw codec.
    Is there anyone out there who can help me?

    I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about the Many2One class, and more, but found nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write the merged streams to a file, so it seems possible to write them to a file and then transmit to the network from that file. But how would you realize more than one Processor?
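    For reference, the MergingDataSource attempt both posters describe comes down to something like the sketch below (the incoming DataSources would be taken from the ReceiveStreams; names are illustrative). As noted above, the merged source exposes one track per input, and pushing those tracks out over a single RTP session is exactly the part that was never resolved in this thread:

        import javax.media.Manager;
        import javax.media.Processor;
        import javax.media.protocol.DataSource;

        public class MergeSketch {
            // in1 and in2 are the DataSources of two incoming G.711 u-law RTP streams.
            public static Processor merge(DataSource in1, DataSource in2) throws Exception {
                DataSource merged = Manager.createMergingDataSource(new DataSource[] { in1, in2 });
                Processor p = Manager.createProcessor(merged); // one track per merged input
                p.configure();                                 // a real application must wait for ConfigureComplete
                return p;
            }
        }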

  • RTP streaming and Cisco IP phones problem

    Hello,
    I'm trying to write an application that should dial some numbers and play the voice message from the file into the phone line using Cisco JTAPI and Java Media Framework.
    I've found some samples that seemed useful, but unfortunately they do not work. There are no errors and no exceptions, and I have no idea what to do.
    A small brief: I make a call from one Cisco IP phone (7960) to another using Cisco JTAPI, then I catch the CiscoRTPInputStartedEv event, get the IP and port of the IP phone, and call the RtpStreamer class constructor with them. It gives no errors or exceptions (just the message shown below), but there is only silence on the phone line. Message:
    Should b streamin'...
    Encoding ok?: true
    streams is [Lcom.sun.media.multiplexer.RawBufferMux$RawBufferSourceStream;@53d : 1
    sink: setOutputLocator rtp://192.168.1.22:20794/audio
    Please see the RtpStreamer class code below.
    I set the packet size to 160 as recommended for Cisco IP phones, and I use the greeting.wav from the Cisco example, whose properties are 8 kHz, 8-bit mono, but it still doesn't work.
    Could you help me? Thank you for any advice!
    import java.io.*;
    import java.util.*;
    import java.net.*;
    import javax.media.*;
    import javax.media.control.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import stream.*;

    public class RtpStreamer {

        public static int PlayCounter = 0;

        private RtpStreamer() {
            // not supported
        }

        public RtpStreamer(String IP, String Port) {
            PlayCounter++;
            new RtpStreamer("rtp://" + IP + ":" + Port + "/");
        }

        public RtpStreamer(String CurrentMediaUrl) {
            PlayCounter++;
            System.out.println("Should b streamin'...");

            // Create a Processor for the selected file. Exit if the
            // Processor cannot be created.
            Processor processor = null;
            StateHelper sh = null;
            try {
                String mediaUrl = "file:\\C:\\greetings.wav";
                processor = Manager.createProcessor(new MediaLocator(mediaUrl));
                sh = new StateHelper(processor);
            } catch (IOException e) {
                System.out.println("Exception occurred (1a): " + e);
            } catch (NoProcessorException e) {
                System.out.println("Exception occurred (1b): " + e);
            }

            // for logging purposes
            //sh.setContext( getServletContext() );

            // configure the processor and block until it has been configured
            if (!sh.configure(10000)) {
                System.out.println("Configuration failed!!");
            }

            TrackControl track[] = processor.getTrackControls();
            boolean encodingOk = false;

            // Go through the tracks and try to program one of them to
            // output ulaw data.
            for (int i = 0; i < track.length; i++) {
                if (!encodingOk && track[i] instanceof FormatControl) {
                    if (((FormatControl) track[i]).setFormat(
                            new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1)) == null) {
                        track[i].setEnabled(false);
                    } else {
                        encodingOk = true;

                        // set the packet size to 160 bytes (20 ms of 8 kHz ulaw)
                        try {
                            Codec codec[] = new Codec[3];
                            codec[0] = new com.ibm.media.codec.audio.rc.RCModule();
                            codec[1] = new com.ibm.media.codec.audio.ulaw.JavaEncoder();
                            codec[2] = new com.sun.media.codec.audio.ulaw.Packetizer();
                            ((com.sun.media.codec.audio.ulaw.Packetizer) codec[2]).setPacketSize(160);
                            ((TrackControl) track[i]).setCodecChain(codec);
                        } catch (Exception e) {
                            System.out.println("Error setting packet size to 160: " + e + " in " + e.getMessage());
                        }
                    }
                } else {
                    // we could not set this track to ulaw, so disable it
                    track[i].setEnabled(false);
                }
            }

            System.out.println("Encoding ok?: " + encodingOk);

            // At this point, we have determined whether we can send out
            // ulaw data or not.
            if (encodingOk) {
                // realize the processor and block until it is realized
                if (!sh.realize(10000)) {
                    System.out.println("Realization failed!!");
                }

                // get the output datasource of the processor and exit if we fail
                DataSource ds = null;
                try {
                    ds = processor.getDataOutput();
                } catch (NotRealizedError e) {
                    System.out.println("Exception occurred (2): " + e);
                }

                // hand this datasource to the manager to create an RTP datasink;
                // our RTP datasink will send the audio to the phone
                try {
                    //String mediaUrl= "rtp://192.168.1.12:20002/audio/1"; // it works without errors
                    String mediaUrl = CurrentMediaUrl + "audio";
                    MediaLocator m = new MediaLocator(mediaUrl);
                    DataSink d = Manager.createDataSink(ds, m);
                    d.open();
                    d.start();
                } catch (Exception e) {
                    System.out.println("Exception occurred (3): " + e);
                }
            }
        }
    }

    BTW, is there any way to figure out whether the RTP application generates any network activity or not?
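    On that last question: if the transmission goes through an RTPManager (rather than a DataSink as in the code above), you can poll the session's global statistics to see whether packets are actually leaving the machine. A rough sketch, assuming you hold on to the RTPManager instance:

        import javax.media.rtp.GlobalTransmissionStats;
        import javax.media.rtp.RTPManager;

        public class TxCheck {
            // Print how many RTP packets and bytes this session has put on the wire so far.
            public static void report(RTPManager mgr) {
                GlobalTransmissionStats stats = mgr.getGlobalTransmissionStats();
                System.out.println("RTP packets sent: " + stats.getRTPSent());
                System.out.println("Bytes sent: " + stats.getBytesSent());
            }
        }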

  • Help forwarding an RTP stream

    I need help forwarding an RTP audio stream from one client to another. I know I need to create a DataSource from the NewReceiveStreamEvent and use it as the DataSource for the processor on the transmitting side, but I have not had success. Can anyone help me with the code that does the forwarding?

    Well, I'll try to help you. I don't really know what you are trying to do, but if you want to pass the RTP object to another client, then have a look at this article on how to send objects over sockets.
    http://java.sun.com/developer/technicalArticles/ALT/sockets/
    Why don't you just send the information required for the RTP session (IP, port) to the other client and have them create a new RTP session?
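    For the forwarding itself, the usual JMF pattern is roughly the one sketched below: pick the incoming stream's DataSource up in a ReceiveStreamListener and hand it straight to a second RTPManager as a send stream. A sketch only; outMgr is assumed to be already initialized and targeted at the other client:

        import javax.media.protocol.DataSource;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.ReceiveStream;
        import javax.media.rtp.ReceiveStreamListener;
        import javax.media.rtp.SendStream;
        import javax.media.rtp.event.NewReceiveStreamEvent;
        import javax.media.rtp.event.ReceiveStreamEvent;

        public class Forwarder implements ReceiveStreamListener {
            private final RTPManager outMgr; // already initialized, target = the second client

            public Forwarder(RTPManager outMgr) {
                this.outMgr = outMgr;
            }

            public void update(ReceiveStreamEvent ev) {
                if (ev instanceof NewReceiveStreamEvent) {
                    try {
                        ReceiveStream rs = ((NewReceiveStreamEvent) ev).getReceiveStream();
                        DataSource ds = rs.getDataSource();            // the incoming RTP audio
                        SendStream out = outMgr.createSendStream(ds, 0);
                        out.start();                                   // forward it unchanged
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }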

  • RTP stream - record and listen simultaneously ...

    Hi!
    I would like to record and listen to an RTP stream simultaneously, but I don't know how... I can record to a file OR listen in the headphones, but only separately.
    If I create a datasink for recording on a port, I cannot use that port to listen. But in that case, how can I listen to the audio stream on the headphones?
    Maybe the right solution would be to clone the stream, but I cannot do that because I get an exception:
    javax.media.NoProcessorException: Cannot find a Processor for: com.ibm.media.protocol.CloneablePushBufferDataSource@1d8957f
    This is the code:
    DataSource ds = Manager.createCloneableDataSource(Manager.createDataSource(media_locator));
    processor = Manager.createRealizedProcessor(new ProcessorModel(ds, formats, outputType));
    Has anybody got any ideas?
    Thanks!
    Ric Flair

    I'd point you in this direction initially.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Clone.java]
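    Following that example, the gist is: make the cloneable DataSource from the original, give the cloneable (master) source to one consumer and a clone to the other, and make sure the master is in use before the clone is expected to deliver data. A sketch under those assumptions; the file-writing Processor/DataSink chain is only indicated, not shown:

        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Player;
        import javax.media.protocol.DataSource;
        import javax.media.protocol.SourceCloneable;

        public class RecordAndListen {
            public static void split(MediaLocator rtpLocator) throws Exception {
                DataSource original = Manager.createDataSource(rtpLocator);
                DataSource cloneable = Manager.createCloneableDataSource(original);
                DataSource clone = ((SourceCloneable) cloneable).createClone();

                // Consumer 1: play the master source in the headphones.
                Player player = Manager.createRealizedPlayer(cloneable);
                player.start();

                // Consumer 2: hand 'clone' to the Processor/DataSink chain that records to a file.
            }
        }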

  • RTP streaming

    Hi guys,
    what I need to do is to stream YUV images between two JMF applications. I am using the VideoTransmit application (taken from this site) to transmit and JMStudio to receive. From a little research I have found the following steps to transmit:
    1) Create the DataSource using the Manager.CreateDataSource(MediaLocator)
    2) create processor using the above created datasource
    3) create player etc.
    My question now is the following:
    How can I transmit a YUV file? How can I create a DataSource for this YUV file without causing problems for the further creation of the processor, etc.?
    Thanks in advance guys,
    Chris

    Ok, I think I know what was confusing me. Apparently setRate() isn't affecting the rate of the stream. The audio file I am streaming usually runs about five minutes, so I sped the rate up to 10. That sped up the process of writing to a file significantly, but writing to an RTP stream still seems to take the full run-time of the file. Does this sound right, or is there something else I'm missing?
    Thanks,
    Khanathor

  • Get an event for a new received RTP stream in a player

    Hi!
    I'm trying to implement an RTP player that receives an A/V stream and plays it. The special thing about this player is that even if the stream is interrupted, it should wait on the same IP and port for a new stream and open it in the SAME frame (not in a new window, as JMStudio does).
    I tried to catch the ReceiveStreamEvent so I can restart the player, but I don't get any events for this. I tried to do it with an RTPManager, but I don't know how.
    Does anybody have an example of how to get the ReceiveStreamEvent, so I know the RTP stream has been interrupted?
    Thanks
    Adam

    See AVReceive2 in the JMF Solutions section of the JMF web site.
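    The core of AVReceive2 is roughly an RTPManager with a ReceiveStreamListener: a NewReceiveStreamEvent tells you a stream has arrived (create or re-create the Player in your existing frame), a ByeEvent tells you the sender has left, and the manager itself stays open on the same IP and port waiting for the next stream. A compressed sketch, with the sender address purely illustrative and error handling kept minimal:

        import java.net.InetAddress;
        import javax.media.Manager;
        import javax.media.Player;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.ReceiveStreamListener;
        import javax.media.rtp.SessionAddress;
        import javax.media.rtp.event.ByeEvent;
        import javax.media.rtp.event.NewReceiveStreamEvent;
        import javax.media.rtp.event.ReceiveStreamEvent;

        public class RestartableReceiver implements ReceiveStreamListener {

            public static RTPManager listen(int port) throws Exception {
                RTPManager mgr = RTPManager.newInstance();
                mgr.addReceiveStreamListener(new RestartableReceiver());
                mgr.initialize(new SessionAddress(InetAddress.getLocalHost(), port));
                mgr.addTarget(new SessionAddress(InetAddress.getByName("192.168.1.100"), port)); // sender, illustrative
                return mgr; // keep it open: it reports every new stream on this port
            }

            public void update(ReceiveStreamEvent ev) {
                try {
                    if (ev instanceof NewReceiveStreamEvent) {
                        // New stream: create a Player and put its visual/controls into the existing frame.
                        Player p = Manager.createRealizedPlayer(
                                ((NewReceiveStreamEvent) ev).getReceiveStream().getDataSource());
                        p.start();
                    } else if (ev instanceof ByeEvent) {
                        // The sender left: dispose of the old Player here and simply wait for
                        // the next NewReceiveStreamEvent on the same manager.
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }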

  • Monitoring RTP streams

    I have downloaded RTPMonitor.java to monitor the RTP streams. It executes fine but when I give the session address and the session name as arguments and click on 'Start', it gives:
    SessionManagerException creating RtpMonitorManager:
    Control Port must be valid and odd.
    Please help me out on this..
    Thanks,
    Meghana

    Where can I download RTPMonitor.java?
    I am interested in reading all the fields in the RTP header. Can I do that with it?
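    On the original error: by convention the RTP data port is even and the RTCP control port is the next odd number, which is what the monitor is checking when it complains that the control port "must be valid and odd". In JMF the pair is spelled out through SessionAddress; a small sketch with illustrative values:

        import java.net.InetAddress;
        import javax.media.rtp.SessionAddress;

        public class MonitorAddress {
            public static SessionAddress build() throws Exception {
                InetAddress session = InetAddress.getByName("224.144.251.104"); // illustrative session address
                // data (RTP) port even, control (RTCP) port the next odd number
                return new SessionAddress(session, 49150, session, 49151);
            }
        }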

  • No Inter-Cluster RTP Stream with Gatekeepers

    Hello,
    Firstly I am no expert in Cisco telephony as we have just recently migrated to a full Cisco solution, so apologies if I ask a fundamental question.
    Client A = Site A
    Client B = Site B
    Client C = Site C
    Site A and Site B are in 1 cluster
    Site C is in another cluster
    Intra-Cluster Traffic Works
    Client A -> Client B within the same cluster (across 2 sites with a low latency link) RTP stream comes up and the call functions as expected.
    Inter-Cluster Traffic Fails (GK to GK)
    Client C -> Client A this works, the RTP stream comes up and the call functions as expected.
    Client A -> Client C this call connects but there is no RTP stream.
    We are using G711 across the board, and I have taken a Wireshark capture of a failed Client A -> Client C call.
    I have been going through this capture and noticed that when I search H225 (for the gatekeepers) I see the following –
    CS: setup
    RAS: admissionRequest
    RAS: admissionConfirm
    CS: callProceeding
    CS: alerting
    CS: notify
    RAS: registrationRequest
    RAS: registrationConfirm
    CS: notify
    CS: connect
    CS: notify
    CS: releaseComplete
    RAS: disengageRequest  (DISCONECT_REASON=2,TIME=1321266127,DURATION=24,DISCONNECT_STRING=no resource,ORIGIN=0,LINE_NUMBER=GK,OUTBUND_GW_IP=..
    RAS: disengageConfirm
    There are firewalls in between, and these were the first things I looked at, but I don't even see any RTP stream being initiated from the far side. Would anyone have any ideas about where I could start looking?
    Thanks,
    Peter

    Pat,
    I cannot speak to the UC540, but I ran into a situation recently where the SIP gateway was sending out the private extension of the phone number instead of the full DID that was registered with the provider. The provider was then blocking the call.
    In our SIP debugs we saw the RDNIS information with the private extension, I believe. The error code we were getting back from the SP was 404 or something along those lines.
    I recommend you run some debugs and track where the call fails, compare the SNR call versus a normal call in the debugs, and then, if you still get stuck, post your running configs and debugs back here.

  • How to extract data from a Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to interrupt a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream with access only to the byte array "data" in a Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to interrupt a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream with access only to the byte array "data" in a Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
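    As a concrete illustration of reading the data from inside the pipeline: you can hang a BufferTransferHandler on the Processor's output PushBufferStream and lift the raw byte[] out of each Buffer there, then ship it with the custom protocol. A sketch only; how the bytes (plus the Format metadata) are framed and reassembled on the receiving side is up to your protocol:

        import java.io.IOException;
        import javax.media.Buffer;
        import javax.media.Processor;
        import javax.media.protocol.BufferTransferHandler;
        import javax.media.protocol.PushBufferDataSource;
        import javax.media.protocol.PushBufferStream;

        public class BufferTap {
            // Attach to an already realized Processor whose output is a PushBufferDataSource.
            public static void tap(Processor processor) throws IOException {
                PushBufferDataSource out = (PushBufferDataSource) processor.getDataOutput();
                PushBufferStream stream = out.getStreams()[0];

                stream.setTransferHandler(new BufferTransferHandler() {
                    public void transferData(PushBufferStream s) {
                        Buffer buf = new Buffer();
                        try {
                            s.read(buf);
                            byte[] data = (byte[]) buf.getData();
                            int off = buf.getOffset();
                            int len = buf.getLength();
                            // hand data[off .. off+len) plus buf.getFormat() to the custom protocol here
                        } catch (IOException e) {
                            e.printStackTrace();
                        }
                    }
                });

                out.connect();
                out.start();
                processor.start();
            }
        }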

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this might be the root cause, but when creating cloneable data sources from the incoming RTP sources I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
          Party p = (Party) pIt.next();
          if (p != dest) {
            DataSource ds = p.getDataSource();
            DataSource cds = Manager.createCloneableDataSource(ds);
            DataSource clone = ((SourceCloneable) cds).createClone();
            dataSources.add(clone);
          }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }
    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)
    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now, for each party in the call, the conference server does the following:
    Party(RTPManager mgr, DataSource ds) {
      this.mgr = mgr;
      this.ds = Manager.createCloneableDataSource(ds);
    }

    synchronized DataSource cloneDataSource() {
      DataSource retVal;
      if (getNeedsCloning()) {
        retVal = ((SourceCloneable) ds).createClone();
      } else {
        retVal = ds;
        setNeedsCloning();
      }
      return retVal;
    }

    private void setNeedsCloning() {
      needsCloning = true;
    }

    private boolean getNeedsCloning() {
      return needsCloning;
    }

    private synchronized void addSendStreamFromNewParticipant(Party newOne) throws UnsupportedFormatException, IOException {
      debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
      Iterator pIt = participants.iterator();
      while (pIt.hasNext()) {
        Party p = (Party) pIt.next();
        assert p != newOne;
        // update the existing participant
        SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
        sendStream.start();
        // send data from the existing participant to the new one
        sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
        sendStream.start();
      }
      debug("*** - Done creating the streams.");
    }
    So I made some progress, but I'm still not quite there.
    The RTP manager JavaDoc for createSendStream states the following :
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conference server creates 2 send streams to every one of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I see only 1 SSRC, as if the 2nd createSendStream call had failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek
