No Inter-Cluster RTP Stream with Gatekeepers

Hello,
Firstly, I am no expert in Cisco telephony, as we have just recently migrated to a full Cisco solution, so apologies if I am asking a fundamental question.
Client A = Site A
Client B = Site B
Client C = Site C
Site A and Site B are in 1 cluster
Site C is in another cluster
Intra-Cluster Traffic Works
Client A -> Client B within the same cluster (across two sites with a low-latency link): the RTP stream comes up and the call functions as expected.
Inter-Cluster Traffic Fails (GK to GK)
Client C -> Client A: this works; the RTP stream comes up and the call functions as expected.
Client A -> Client C: the call connects, but there is no RTP stream.
We are using G.711 across the board, and I have taken a Wireshark capture of a failed Client A -> Client C call.
I have been going through this capture and noticed that when I filter on H.225 (for the gatekeepers) I see the following:
CS: setup
RAS: admissionRequest
RAS: admissionConfirm
CS: callProceeding
CS: alerting
CS: notify
RAS: registrationRequest
RAS: registrationConfirm
CS: notify
CS: connect
CS: notify
CS: releaseComplete
RAS: disengageRequest  (DISCONECT_REASON=2,TIME=1321266127,DURATION=24,DISCONNECT_STRING=no resource,ORIGIN=0,LINE_NUMBER=GK,OUTBUND_GW_IP=..
RAS: disengageConfirm
There are firewalls in between, and these were the first thing I looked at, but I don't even see any RTP stream being initiated from the far side. Would anyone have any ideas where I could start looking?
Thanks,
Peter

Pat,
I can't speak to the UC540, but I ran into a situation recently where the SIP gateway was sending out the private extension of the phone number instead of the full DID that was registered with the provider. The provider was then blocking the call.
In our SIP debugs we saw the RDNIS information with the private extension, I believe. The error code we were getting back from the SP was a 404 or something along those lines.
I recommend you run some debugs and track where the call fails, compare the SNR call versus a normal call in the debugs, and then, if you still get stuck, post your running configs and debugs back here.

Similar Messages

  • Problems Presenting RTP Stream

    Hi, I have a video streaming server developed in Java using the
    Java Media Framework (JMF), and I need to view the videos with a Flash
    player, but I know that Flash works with RTMP and JMF doesn't
    support it. Is there any way to present the RTP streams with Flash?
    Thanks.

    Flash Player only supports streams delivered over the RTMP
    protocol from Flash Media Server.

  • RTP Media Stream with "Third Party SIP Device" always through CUCM

    Hello,
    I have quite a strange problem at one of my customer's locations:
    We have a CUCM 7.1.5(SU4), with the CUCM in the datacenter, and a small branch-office location which has a small WAN connection to the datacenter (1 Mbit/s).
    In this location we have several Kirk (Polycom) DECT phones which register as "Third Party SIP Device - Basic" on the CUCM.
    (The problem is the same if I use the X-Lite SIP client instead.)
    When these SIP phones or the X-Lite client dial an internal number at the same location, the RTP media stream goes directly from the SIP client to the phone. But if they dial an external number, the RTP stream goes from the SIP client over the WAN connection to the CUCM and back over the WAN connection to the 2901 H.323 gateway (at the same location).
    And of course, if I then start a big download or upload, I am no longer able to complete the phone call, because we have no QoS on the WAN connection; we never intended to carry calls over it.
    When I look at the sniffer files in Wireshark, I see that the CUCM puts its own IP address in the SDP for the RTP stream to the SIP client. This is of course wrong, because the RTP stream should stay within the branch office.
    I tested this in my lab (CUCM 8.5) and it is the same. I used the "Standard SIP Profile" and a "Third Party SIP Device - Basic".
    The SCCP phones at the same location, which are configured with the same region, location, device pool, and media resources, and which use the same gateway for external calls, do not have this problem.
    In the gateway configuration "MTP Required" is not enabled, and I tested with some Cisco SIP phones (9971) in my lab; they are not affected by this problem either.
    Any ideas?

    Do you have a SIP trunk to the external devices with "MTP Required" checked?

  • RTP (UDP) multicast stream with JMF...?!

    Hi.
    I am quite new to JMF and its plugins. I have a network which offers me a UDP stream (MPEG-2 or MPEG-4; both could be available). I can watch this stream with a freeware player called VLC.
    When using the JMF Studio tool, I can't watch the stream. The player waits for signals and gives up after 60 seconds. So, okay, maybe JMF doesn't ship with the right codec.
    Now I've installed jffmpeg, but that doesn't help a lot. I added the JAR file to the classpath and registered the plugin, MIME type and demultiplexer (as described) in the JMF Registry. The behaviour of JMF Studio is the same. Does anybody have an idea whether jffmpeg supports this kind of streaming,
    and/or what I can do to receive the stream? Maybe there is a better plugin for my purpose.
    I can't believe that nobody here has ever tried to receive an MPEG-2 or MPEG-4 multicast stream with JMF. There must be a solution.
    Many thanks for your help and reply!
    Jan Stanetzki

    Hi funnyjanni,
    Maybe I am writing this 4 years later.
    I am now at the same stage where you faced this problem. You might have found the solution by now.
    Please help me with this. I am trying to use jffmpeg to play a UDP multicast stream.
    Thanks
    jklanka
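    For reference, if the stream is actually RTP-encapsulated (rather than a raw UDP MPEG transport stream, for which stock JMF has no demultiplexer, which would match the timeout described above), JMF can be pointed at it with an rtp:// MediaLocator. This is a minimal, unverified sketch; the multicast group, port and track type are placeholders:
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Player;

        public class MulticastRtpPlayer {
            public static void main(String[] args) throws Exception {
                // Placeholder multicast group/port -- replace with the real stream address.
                // The "/video" suffix tells JMF which track type the session carries.
                MediaLocator locator = new MediaLocator("rtp://224.1.1.1:1234/video");

                // Manager picks a handler based on the locator; this only works if a
                // matching depacketizer/decoder (e.g. from jffmpeg) is registered in JMF.
                Player player = Manager.createPlayer(locator);
                player.start();   // start() realizes and prefetches the player first
            }
        }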

  • RTP stream question

    Hello community, I have a question
    The setup is this: a Cisco SCCP phone is located in Russia and registered to a CallManager located in Denmark. The phone dials a number that is translated and uses a route list pointing to a local gateway in Russia. The dialled number on the local Cisco gateway matches a dial peer with a SIP trunk to a Lync mediation server, also located in Denmark.
    How will the final RTP stream flow from the Cisco phone to the Lync server?
    Br. Kasper

    Hi. It actually all comes down to the reporting in Lync. The issue is that Lync reports "bad" conferences per SIP trunk. I want to make a more detailed design where I place a SIP trunk in every affiliate around Europe in its local gateway, but as the phones are registered to the CUCM cluster in Denmark and the Lync mediation server (the SIP trunk connection to the local gateway) is also located in Denmark, I need a redesign of the RTP path, including the MTP. That is why I want to put the MTP resource in the local affiliate gateway (in this example, Russia). By doing this the RTP path would be phone -> local MTP -> local VGW SIP trunk -> Lync mediation in Denmark. I just needed a second opinion on the design.
    The current setup would report all "bad" conferences as being in Denmark, even if they originated in other countries. The local dial pattern is just call-forwarded to the DK SIP trunk -> Lync mediation. The SIP trunks in CUCM are limited to only one connection to the Lync mediation server.

  • How to extract data from Buffer and create an RTP stream

    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data, send the data with a custom protocol, and on the receiving side reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    Is it possible to create an RTP stream, having access only to the byte array "data" in a Buffer?
    Thanks in advance.

    camelstrike wrote:
    Hi
    I'm working on a project where I need to intercept a media stream (video and audio) and extract the data. Send the data with a custom protocol. On the receiving side I would like to reconstruct the stream using only the data chunks.
    I'm currently looking at [DataSourceReader.java|http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/DataSourceReader.java] and more specifically at a method like printDataInfo(Buffer buffer) to extract the data.
    There are a couple of different ways to get the data. Reading it from inside a DataSink is perfectly fine...
    Is it possible to create an RTP stream, having access only to the byte array "data" in a Buffer?
    Yes and no.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    You need to know the format of the media in addition to the actual media data...
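    Building on the reply above, here is one way the raw bytes might be pulled out of an incoming stream's data source. This is a rough sketch only, assuming the receive-side DataSource is a PushBufferDataSource (as RTP ones are); the custom-protocol send step is left as a stub:
        import javax.media.Buffer;
        import javax.media.protocol.BufferTransferHandler;
        import javax.media.protocol.PushBufferDataSource;
        import javax.media.protocol.PushBufferStream;

        // Attaches to each track of a PushBufferDataSource and hands the raw
        // byte[] payload of every Buffer to user code.
        public class BufferTap implements BufferTransferHandler {

            private final Buffer buffer = new Buffer();

            public void attach(PushBufferDataSource ds) throws java.io.IOException {
                ds.connect();
                for (PushBufferStream stream : ds.getStreams()) {
                    stream.setTransferHandler(this);
                }
                ds.start();
            }

            // Called by JMF whenever a stream has data ready.
            public void transferData(PushBufferStream stream) {
                try {
                    stream.read(buffer);
                    byte[] data = (byte[]) buffer.getData();
                    int offset = buffer.getOffset();
                    int length = buffer.getLength();
                    // Hand data[offset .. offset+length) to the custom protocol here.
                    // Note: buffer.getFormat() is also needed on the far side to rebuild the stream.
                } catch (java.io.IOException e) {
                    e.printStackTrace();
                }
            }
        }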

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this might be the root cause, but when creating cloneable data sources from the incoming RTP sources, I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
            Party p = (Party) pIt.next();
            if (p != dest) {
                DataSource ds = p.getDataSource();
                DataSource cds = Manager.createCloneableDataSource(ds);
                DataSource clone = ((SourceCloneable) cds).createClone();
                dataSources.add(clone);
            }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the Configured state.
        p.configure();
        if (!cl.waitForState(p, p.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }
    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)
    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    It turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now, for each party in the call, the conference server does the following:
        Party(RTPManager mgr, DataSource ds) {
            this.mgr = mgr;
            this.ds = Manager.createCloneableDataSource(ds);
        }

        synchronized DataSource cloneDataSource() {
            DataSource retVal;
            if (getNeedsCloning()) {
                retVal = ((SourceCloneable) ds).createClone();
            } else {
                retVal = ds;
                setNeedsCloning();
            }
            return retVal;
        }

        private void setNeedsCloning() {
            needsCloning = true;
        }

        private boolean getNeedsCloning() {
            return needsCloning;
        }

        private synchronized void addSendStreamFromNewParticipant(Party newOne) throws UnsupportedFormatException, IOException {
            debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
            Iterator pIt = participants.iterator();
            while (pIt.hasNext()) {
                Party p = (Party) pIt.next();
                assert p != newOne;
                // send data from the new participant to each existing participant
                SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
                sendStream.start();
                // send data from each existing participant to the new one
                sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
                sendStream.start();
            }
            debug("*** - Done creating the streams.");
        }
    So I made some progress, but I'm still not quite there.
    The RTPManager JavaDoc for createSendStream states the following:
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conference server creates 2 send streams to each of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call had failed. Consequently, each participant in the conference can receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • Mixing audio RTP-Streams to conference

    I'm trying to create a telephone conference application with JMF.
    My problem:
    I need to mix multiple incoming RTP streams (audio, G.711 u-law) together into one outgoing RTP stream.
    If I use a MergingDataSource I can get one DataSource with multiple tracks, but I'm not able to send them out together over one RTP session.
    Is there any way to do this?
    I have also thought about using RawBufferMux or WAVMux to multiplex the tracks contained in the MergingDataSource, but I can't find out how to use them.
    Or is there an easier way to create a conference over RTP?
    I need to connect to Cisco IP phones, so I can only use the G.711 u-law codec.
    Is there anyone out there who can help me?

    I'm sorry, but I ran into this problem in the past and found it impossible to resolve.
    I also thought about MergingDataSource, about a Many2One class and more, but nothing I could use. And nobody could answer this question in this forum half a year ago.
    Oh, I nearly forgot: it is possible to write merged streams to a file, so it seems possible to write them to a file and then transmit them to the network from that file. But how to realize it with more than one Processor?
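    For reference, a rough sketch of the MergingDataSource-plus-Processor chain being discussed; as the reply above notes, whether the merged tracks actually come out as a single mixed G.711 stream is exactly the unresolved question in this thread, so treat this as a starting point, not a working recipe. The class and method names outside the JMF API are placeholders:
        import javax.media.Manager;
        import javax.media.Processor;
        import javax.media.control.TrackControl;
        import javax.media.format.AudioFormat;
        import javax.media.protocol.DataSource;
        import javax.media.rtp.RTPManager;
        import javax.media.rtp.SendStream;

        public class MixSketch {
            // incoming: DataSources obtained from the received RTP streams
            public static void sendMerged(DataSource[] incoming, RTPManager rtpMgr) throws Exception {
                // Merge the per-participant sources into one DataSource with several tracks.
                DataSource merged = Manager.createMergingDataSource(incoming);

                // Ask a Processor to output G.711 u-law in RTP packaging.
                Processor proc = Manager.createProcessor(merged);
                proc.configure();
                waitFor(proc, Processor.Configured);
                for (TrackControl track : proc.getTrackControls()) {
                    track.setFormat(new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1));
                }
                proc.realize();
                waitFor(proc, Processor.Realized);

                // Hand the processor's output to the RTP session.
                SendStream out = rtpMgr.createSendStream(proc.getDataOutput(), 0);
                out.start();
                proc.start();
            }

            // Crude synchronous wait; a real application would use a ControllerListener.
            private static void waitFor(Processor p, int state) throws InterruptedException {
                while (p.getState() < state) {
                    Thread.sleep(50);
                }
            }
        }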

  • Can I play and save the incoming RTP streams simultaneously

    Hi,
    I want to know whether I can play and save the incoming RTP stream simultaneously using
    JMF 2.1. The idea is that by saving it to a file, I can play back the same file at a later time.
    Is there any example code available?
    Thanks in Advance
    bye
    Srikanth

    I think that, unfortunately, this is impossible today. For all that I have tried, I have not been able to achieve it.
    If there is someone who has achieved it, please help us with this subject. It's very important.
    Thanks
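    For what it's worth, the cloneable-DataSource technique from the conference-server thread above is the approach usually suggested for this: play the original and feed a clone through a Processor into a DataSink that writes a file. Whether it actually works reliably in JMF 2.1 is exactly what is being disputed above, so this is an unverified sketch; the WAV output and the file name are placeholders:
        import javax.media.DataSink;
        import javax.media.Manager;
        import javax.media.MediaLocator;
        import javax.media.Player;
        import javax.media.Processor;
        import javax.media.protocol.DataSource;
        import javax.media.protocol.FileTypeDescriptor;
        import javax.media.protocol.SourceCloneable;

        public class PlayAndRecord {

            public static void playAndSave(DataSource rtpSource) throws Exception {
                // Wrap the incoming RTP DataSource so it can be consumed twice.
                DataSource cloneable = Manager.createCloneableDataSource(rtpSource);
                DataSource clone = ((SourceCloneable) cloneable).createClone();

                // Branch 1: play the original wrapper.
                Player player = Manager.createPlayer(cloneable);
                player.start();

                // Branch 2: transcode the clone to a file format and write it to disk.
                Processor proc = Manager.createProcessor(clone);
                proc.configure();
                waitFor(proc, Processor.Configured);
                proc.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.WAVE));
                proc.realize();
                waitFor(proc, Processor.Realized);

                DataSink sink = Manager.createDataSink(proc.getDataOutput(),
                        new MediaLocator("file:./capture.wav"));   // placeholder path
                sink.open();
                sink.start();
                proc.start();
            }

            // Crude synchronous wait; a real application would use a ControllerListener.
            private static void waitFor(Processor p, int state) throws InterruptedException {
                while (p.getState() < state) {
                    Thread.sleep(50);
                }
            }
        }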

  • How to synchronize audio and video rtp-streams

    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' data sources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    How are you supposed to sync audio and video, if not as above?

    camelstrike wrote:
    Hi everyone!
    I've tried to make one of the processors, processor1, control the other processor, processor2, with processor1.addController(processor2) but I get a
    javax.media.IncompatibleTimeBaseException. I need to synchronize audio and video rtp-streams because they are sometimes completely out of sync.
    In JMF API Guide I've read:
    "2. Determine which Player object's time base is going to be used to drive
    the other Player objects and set the time base for the synchronized
    Player objects. Not all Player objects can assume a new time base.
    For example, if one of the Player objects you want to synchronize has
    a *push-data-source*, that Player object's time base must be used to
    drive the other Player objects."
    I'm using a custom AVReceive3 to receive the RTP streams, and then I create processors for the incoming streams' data sources, and they are ALL PushBufferDataSources.
    Does this mean I can't synchronize these? If so, is there any way I can change them into Pull...DataSources?
    The RTP packets are timestamped when they leave, and they are played in order. You can't change the time base on an RTP stream because, for all intents and purposes, it's "live" data. It plays at the speed the transmitting RTP server wants it to play, and it never gets behind, because it drops out-of-order and old data packets.
    If your RTP streams aren't synced correctly on the receiving end, it means they aren't being synced on the transmitting end... so you should look into the other side of the equation.

  • How can I manage controls in an RTP stream

    Hello,
    I'm currently using RTP streaming in order to play some music. I started with AVReceive2 and AVTransmit2 from Sun, and by adding a plugin I managed to play MP3. So far so good.
    Because I didn't want to use the view components given by Sun, I created my own GUI for my player, and I use it instead of the one in Sun's examples.
    I use listeners for my "buttons" (I prefer using MouseListener and Panel, as in Sun's Jamp example), but I can't find a way to make those buttons take effect.
    If I press pause, it has to pause the streaming in AVTransmit2.
    If I press next, or previous, it has to change the process AVTransmit2.
    Each control has to be done in AVTransmit2 but my player is part of AVReceive2.
    My first guess was to see if I could use an event to tell my AVTransmit2 object to execute my controls, but I haven't found a way to tell my AVTransmit2 object which button was pressed.
    I finally stopped trying to transmit my commands through events.
    I then tried to find a way to get a reference to my AVTransmit2 object in my AVReceive2 object so that I could call methods on it, but I failed :(
    How can I manage my controls so that they call methods on my server and not on my client?
    Thanks :)
    Shad.

    Hi, again,
    For the next & previous buttons, I was thinking of something.
    Do you think it would be good to create a new class RTPClientManager which extends RTPManager and has two booleans, previous & next, that are set to true when I press the buttons?
    From there, couldn't I modify my function in AVTransmit2 like this (I implement ReceiveStreamListener in AVTransmit2):
    @Override
    public void update(ReceiveStreamEvent evt) {
        RTPClientManager mgr = (RTPClientManager) evt.getSource();
        Participant participant = evt.getParticipant();     // could be null.
        ReceiveStream stream = evt.getReceiveStream();       // could be null.
        // Detect the client closing the connection, so we can stop the streaming transmission.
        if (evt instanceof ByeEvent) {
            System.err.println("  - Got \"bye\" from: " + participant.getCNAME());
            if (mgr.isNext()) {
                this.stop();
                /* SOME STUFF FOR NEXT */
            } else if (mgr.isPrevious()) {
                this.stop();
                /* SOME STUFF FOR PREVIOUS */
            } else {    // means it's the stop button that was pressed
                this.stop();
            }
        }
    }
    Would it work?
    If you have better ideas I am ready to hear them :P
    But if this works, I will be able to manage stop, next and previous; how could I manage the play & pause buttons, though?
    I can't find a proper solution to manage my controls :(
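    Since the question is really about getting the receiver's GUI to invoke methods on the transmitter, one common pattern (independent of JMF) is a small out-of-band command channel between AVReceive2 and AVTransmit2, for example a plain TCP socket. A sketch only, with placeholder hooks where the AVTransmit2 calls would go:
        import java.io.BufferedReader;
        import java.io.InputStreamReader;
        import java.io.PrintWriter;
        import java.net.ServerSocket;
        import java.net.Socket;

        // Transmitter side: listens for one-word commands and maps them onto
        // the AVTransmit2-style controls (the hooks below are placeholders).
        public class ControlServer implements Runnable {
            private final int port;
            public ControlServer(int port) { this.port = port; }

            public void run() {
                try (ServerSocket server = new ServerSocket(port)) {
                    while (true) {
                        try (Socket client = server.accept();
                             BufferedReader in = new BufferedReader(
                                     new InputStreamReader(client.getInputStream()))) {
                            String cmd = in.readLine();
                            if ("PAUSE".equals(cmd)) {
                                // avTransmit.stop();                         // placeholder hook
                            } else if ("NEXT".equals(cmd)) {
                                // avTransmit.stop(); load next track; start  // placeholder hook
                            }
                        }
                    }
                } catch (java.io.IOException e) {
                    e.printStackTrace();
                }
            }
        }

        // Receiver/GUI side: send one line per button press.
        class ControlClient {
            static void send(String host, int port, String cmd) throws java.io.IOException {
                try (Socket s = new Socket(host, port);
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(cmd);
                }
            }
        }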

  • Adding Effect on an incoming H.263/RTP stream

    Hi,
    I am playing an H.263/RTP stream which is multicast over a network. I want to play it with rotation or some other effect added to it. I tried the following link for help (as the forums suggested): http://java.sun.com/products/java-media/jmf/2.1.1/solutions/RotationEffect.html. But the link is dead, and the new link they provide does not contain the example.
    Can anyone help me with how to add an effect on H.263/RTP?
    I have already created my effect class and tried adding it, but I could not get the effect added.
    Kindly give me some directions.

    Is there some error in the above code, or is there some other issue?
    There's an error in the above code because you're assuming I'm being imprecise with my language. I am not.
    You must copy the data from the input buffer to the output buffer...
    Buffer objects are wrappers around byte[], and you cannot copy data from one array to another by modifying the object references... which is what your code does.
    public int process(Buffer inBuffer, Buffer outBuffer) {
        // Make sure the output buffer will hold data
        if (!(outBuffer.getData() instanceof byte[])) {
            outBuffer.setData(new byte[inBuffer.getLength()]);
            outBuffer.setLength(inBuffer.getLength());
            outBuffer.setOffset(0);
        }
        // Make sure the output buffer is long enough
        if (outBuffer.getLength() + outBuffer.getOffset() < inBuffer.getLength()) {
            int offset = outBuffer.getOffset();
            int length = inBuffer.getLength() + outBuffer.getOffset();
            byte[] oldData = (byte[]) outBuffer.getData();
            byte[] newData = new byte[length];
            // Copy the previous data
            for (int i = 0; i < offset; i++) {
                newData[i] = oldData[i];
            }
            // Set the variables for the output buffer
            outBuffer.setData(newData);
            outBuffer.setLength(length);
            outBuffer.setOffset(offset);
        }
        // Copy the data from in to out
        int j = outBuffer.getOffset();
        for (int i = inBuffer.getOffset(); i < inBuffer.getOffset() + inBuffer.getLength(); i++) {
            ((byte[]) outBuffer.getData())[j++] = ((byte[]) inBuffer.getData())[i];
        }
        return BUFFER_PROCESSED_OK;
    }
    That's untested code above; fix my dumb mistakes as necessary.

  • RTP stream being stripped

    How can I check to see if 802.1Q VLAN tagging information is being stripped from the RTP stream by the workstation's NIC?

    Some network cards strip the dot1q headers, so your packet capture won't have this data. If you're not seeing dot1q headers, please check the following link from the Wireshark wiki:
    http://wiki.wireshark.org/CaptureSetup/VLAN#head-81781716144f2855ab0aff2f8b752e95f2562efb
    Apply those settings to the machine you're capturing with.

  • How to receive video stream with different codec

    Hello Experts,
    I am transmitting a video stream from one system using RGBFormat with dimensions 320x240, but on the receiver side, in the absence of this codec, I need to receive the stream as RGBFormat with dimensions 160x120.
    Please guide me on how I can do this.
    Apart from this, please let me know whether there is a way I can add a codec for RGBFormat with dimensions 320x240.

    I am transmitting a video stream from one system using RGBFormat with dimensions 320x240, but on the receiver side, in the absence of this codec, I need to receive the stream as RGBFormat with dimensions 160x120.
    You need to do what, exactly? Can you explain that better? Because that doesn't make much sense to me. You want to transmit a video stream at 320x240 and have the client receive it at a different resolution than you sent it at... that just doesn't make sense.
    I'd like to mail you a letter on a legal-sized sheet of paper, but I'd like for you to receive it on standard paper... See? Doesn't really make any sense...
    Apart from this, please let me know whether there is a way I can add a codec for RGBFormat with dimensions 320x240.
    The RGB format is already defined for JMF; it just doesn't have an RGB-RTP variant for sending over RTP... and creating all of the things you have to create to make JMF send RGB is pretty complex, complicated stuff.

  • Setting dimension of RTP packet with 'rtp' jmf

    How is it possible to set the size of the RTP packets with JMF when transmitting an audio stream over RTP?

    thesti wrote:
    how does JMF deal with RTP packet loss? Since my application doesn't handle anything related to RTP packet loss, I believe that JMF has a mechanism to deal with it.
    It "deals" with it by having a blank spot in the rendering where that packet would have gone...
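    On the original packet-size question, the control usually pointed to is javax.media.control.PacketSizeControl, queried from the transmitting Processor. Whether it is honoured depends on the packetizer for the codec in use, so treat this as an assumption to verify; the 480-byte value is just an example:
        import javax.media.Processor;
        import javax.media.control.PacketSizeControl;

        public class PacketSizeSketch {
            // Call after the transmitting Processor has been realized.
            public static void setRtpPayloadSize(Processor proc, int bytes) {
                PacketSizeControl psc = (PacketSizeControl)
                        proc.getControl("javax.media.control.PacketSizeControl");
                if (psc != null) {
                    // setPacketSize returns the size actually applied, which may differ.
                    int actual = psc.setPacketSize(bytes);
                    System.out.println("RTP payload size set to " + actual + " bytes");
                } else {
                    System.out.println("The packetizer does not expose PacketSizeControl");
                }
            }
        }
    Usage would be something like setRtpPayloadSize(processor, 480); a smaller payload means more packets per second but less audio lost per dropped packet, which also ties back into the packet-loss behaviour described in the reply above.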
