Custom Streams

Is anyone else having issues with their Custom Streams in SocialCast? We used to get emails from these when we had them set up, but for roughly the last month (I'm not sure exactly when it started) we haven't been receiving the emails in our inboxes.
We use hashtags to set up Custom Streams, for example for #helpwanted. However, we aren't getting those emails anymore, so no one in our company knows when someone needs help.
If anyone can help me with this issue, that would be great. We really need to get this resolved ASAP. We only noticed it yesterday and had no idea other people were having the same issue.
Thank you,

Samantha:
Thank you for answering my question. I do have a couple of items I need to
clarify with you.
1. We have many groups set up for the different teams we have throughout our
CVA. Does this mean that if you are receiving an email notification from any of
the groups we have checked under the email notification settings, you can no
longer receive any emails from the Custom Streams we have set up? These
wouldn't be duplicate emails, as you stated above; it would be any time we use
our Custom Stream to receive emails while also belonging to a group with email
notifications enabled.
2. We use Gmail as our email client throughout our company, so unless there is
an issue between SC and Gmail, we should be getting the emails. We receive all
other emails from SC, such as when we comment on other people's posts.
3. We have tried resetting our Custom Stream as well.
4. There aren't any messages from SC in our spam folders.
Thank you for taking the time to reply! Look forward to hearing back from
you.
Twila

Similar Messages

  • Custom Streaming Transport Settings

    I can't get QT preferences to retain custom settings for Streaming Transport. I need to see whether I am receiving the stream via HTTP port 80 or UDP port 554. Each time I change and apply the settings, close the window, and re-open the preferences, the Streaming Transport is reset to Automatic.
    I have the Windows version of QuickTime Pro 7.04.

    My location: at work:
    If you are using a workplace computer on a local network, this may well be the case. A firewall blocks all Internet traffic except those services that the local network administrators consider essential. You will need to contact your local network administrators and have the University's streaming server included in the acceptable services. There's an administrator's guide to allowing streaming QuickTime through a firewall at http://www.apple.com/quicktime/resources/qt4/us/proxy/proxy.html. To check that you can access any streaming movie, open the QuickTime Player and click Edit > Preferences > QuickTime Preferences > Streaming Transport. Click the Auto Configure button. QuickTime should confirm that it can use UDP/RTSP port 554. If it switches to HTTP port 80, then you are most likely behind a firewall that is configured to block QuickTime streaming files.
    My location: at home:
    If you have a DSL router with a NAT firewall, you will need to configure it to allow streaming media files. Open the web-based interface for the router, select Advanced and choose "Port Forwarding" (wording may differ for your router). Open port 554 for RTSP/TCP data and ports 6970 through 6999 (inclusive) for RTP/UDP data. Consult your router's documentation for specific instructions on configuring port forwarding, or contact the router vendor. To check that you can access any streaming movie, open the QuickTime Player and click Edit > Preferences > QuickTime Preferences > Streaming Transport. Click the Auto Configure button. QuickTime should confirm that it can use UDP/RTSP port 554. If it switches to HTTP port 80, then you are most likely behind a firewall that is configured to block QuickTime streaming files.
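
    If you want a rough check outside of QuickTime, here is a small sketch in Java that simply probes whether the RTSP control port (554) is reachable from your machine. The host name is only a placeholder, not the University's actual server, and the real media still travels over UDP, so treat this as a first-pass test only.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    // Rough connectivity probe for the RTSP control port (TCP 554).
    public class RtspPortProbe {
        public static void main(String[] args) {
            String host = args.length > 0 ? args[0] : "stream.example.edu"; // placeholder host
            try (Socket socket = new Socket()) {
                socket.connect(new InetSocketAddress(host, 554), 3000);     // 3-second timeout
                System.out.println("Port 554 reachable; RTSP streaming may work from here.");
            } catch (IOException e) {
                System.out.println("Port 554 blocked or unreachable; expect a fallback to HTTP port 80 (" + e + ")");
            }
        }
    }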

  • Playback delay of customized DataSource/Stream

    I have developed a centralized voice chat conferencing program using JMF. The server mixes received streams into a single stream and sends the mixed stream to each client. To make this possible, I have developed my own DataSource and Stream classes. My stream class uses the standard GSM_RTP audio format. When a client receives the mixed stream, the client immediately creates a player for the stream. The player spends about four seconds realizing the stream before playback begins. If I send a non-mixed/non-customized stream to the clients, there is no delay. Any ideas as to why a customized stream would cause the player to delay for so long?

    I found the problem. I had some old code left over from the previous version that was setting the buffer length and threshold for the receive RTPManager. As soon as I removed this code, there was no longer a delay realizing the player.
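
    For anyone hitting the same symptom, the leftover code was presumably something along these lines (a hedged reconstruction, not the poster's actual source): forcing a large receive buffer and threshold on the RTPManager makes the player wait until the buffer fills before it finishes realizing and starts playback.

    import javax.media.control.BufferControl;
    import javax.media.rtp.RTPManager;

    // Illustrative only: aggressive receive-buffer settings that delay playback.
    public class ReceiveBufferTuning {
        static void tuneReceiveBuffer(RTPManager rtpManager) {
            BufferControl bc =
                (BufferControl) rtpManager.getControl("javax.media.control.BufferControl");
            if (bc != null) {
                bc.setBufferLength(4000); // buffer ~4 seconds of audio (illustrative value)
                bc.setThreshold(2000);    // wait until half the buffer is full before playing
            }
        }
    }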

  • Java.rmi.NoSuchObjectException with custom InputStream/OutputStream

    I'm experiencing a strange problem here...and have spent the night digging for an answer,
    but I don't see any...
    I am using a custom client/server socket factory, which works great until I return my
    custom In/Out streams from the custom Socket. I can return a BufferedInputStream
    wrapping the socket's InputStream, but if I write a subclass of BufferedInputStream
    (and override no methods) I get the NoSuchObjectException...
    any ideas?
    public class BInputStream extends java.io.BufferedInputStream {
      private InputStream in = null;
      public BInputStream(InputStream in) {
        super(in);
        this.in = in;
      }
    }

    What I am trying to do (and have done with the minor limitation that I cannot use
    custom streams to compress the RMI traffic...) is create a totally transparent proxy
    for RMI services.
    The Proxy object is for any Remote object. A Proxy is returned to the client
    containing a Remote Invoker that is called by the Proxy's InvocationHandler.
    Since this is for a known environment, there is no need for dynamic class loading,
    so the JVMs must all have the classes to talk to one another...
      A <---> B <---> C
    I now want to compress the A <--> B link, because it is over a saturated link, and
    a cursory look at the actual data tells me I can compress 300k, which is a standard
    transaction size (after pruning classpath entries from the clients), to around 40k
    (conservative estimate). Obviously, this is a huge improvement in any environment,
    but an especially appealing one considering the fact that this link slows, literally, to
    a crawl at times. I already have the inflate/deflate streams worked out as far as I can
    test them, so all I need to do is get past this current stumbling block and test the
    compression bits in an environment more representative of the deployment env.
    Anyway, that's the highest level goal :-)
    So, a question...to confirm what I believe I found earlier that led me into this bowl
    of noodles (really, it's not that bad, except that I can't debug the code once
    it wanders off into RMI land, which may indicate a problem in my design :-P)...
    [after some more time poking around to make sure I'm not asking a stupid
    question]
    It seems my spaghetti (the exporting of the Invoker inside of the InvocationHandler)
    is necessary for the transparency I desire. As it turns out, I know enough now to
    find useful information from google on this subject :-) That's some progress!
    It seems springframework has something exactly like the server half of my
    system, so I'm gonna take a look at that as I wind this day down.
    cheers.
    b
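
    For reference, the client half of a compressing factory might look roughly like the sketch below. This is only an assumption about the shape of such code (the names are made up), not the fix for the NoSuchObjectException itself; a matching RMIServerSocketFactory is needed on the other end, and the factory must be Serializable with sensible equals/hashCode so RMI can ship it in the stub and reuse connections.

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.io.Serializable;
    import java.net.Socket;
    import java.rmi.server.RMIClientSocketFactory;
    import java.util.zip.Deflater;
    import java.util.zip.DeflaterOutputStream;
    import java.util.zip.InflaterInputStream;

    // Hypothetical client-side factory that compresses the A <--> B link.
    public class CompressingClientSocketFactory implements RMIClientSocketFactory, Serializable {
        public Socket createSocket(String host, int port) throws IOException {
            return new Socket(host, port) {
                @Override public InputStream getInputStream() throws IOException {
                    return new InflaterInputStream(super.getInputStream());
                }
                @Override public OutputStream getOutputStream() throws IOException {
                    // syncFlush=true so each RMI call is actually pushed onto the wire
                    return new DeflaterOutputStream(
                            super.getOutputStream(), new Deflater(Deflater.BEST_SPEED), 512, true);
                }
            };
        }
        @Override public boolean equals(Object o) { return o instanceof CompressingClientSocketFactory; }
        @Override public int hashCode() { return getClass().hashCode(); }
    }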

  • How to stream the content of a buffer[] via RTP?

    Hi,
    I have a Buffer[] object which encapsulates my audio data. I now have to make a DataSource out of my Buffer array so that I can stream it via an RTP Manager.
    How to do it?
    regards
    einherjar

    You have to create a custom DataSource and a custom stream for the DataSource. The custom stream must implement/override the read() method. The JMF calls read() to get data from the stream to send via RTP. In the body of the read method, you will take a chunk of your buffer[] data and put it into the Buffer object passed as a parameter.
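
    As a rough illustration of that advice (assuming a PullBufferStream over pre-encoded GSM/RTP data; the class and field names here are invented), the stream half could look something like this. You would still need a DataSource (for example a PullBufferDataSource subclass) whose getStreams() returns an instance of this class.

    import java.io.IOException;
    import javax.media.Buffer;
    import javax.media.Format;
    import javax.media.format.AudioFormat;
    import javax.media.protocol.ContentDescriptor;
    import javax.media.protocol.PullBufferStream;

    // Feeds pre-encoded audio chunks from a Buffer[] to JMF via read().
    public class BufferArrayStream implements PullBufferStream {
        private final Buffer[] source;   // the already-encoded audio data
        private int next = 0;
        private final AudioFormat format = new AudioFormat(AudioFormat.GSM_RTP);

        public BufferArrayStream(Buffer[] source) { this.source = source; }

        public void read(Buffer buffer) throws IOException {
            if (next >= source.length) {           // no more data: signal end of media
                buffer.setEOM(true);
                buffer.setLength(0);
                return;
            }
            Buffer chunk = source[next++];
            buffer.setData(chunk.getData());       // hand the next chunk to JMF
            buffer.setOffset(chunk.getOffset());
            buffer.setLength(chunk.getLength());
            buffer.setFormat(format);
            buffer.setTimeStamp(chunk.getTimeStamp());
        }

        public boolean willReadBlock() { return false; }
        public Format getFormat() { return format; }

        // SourceStream / Controls plumbing
        public ContentDescriptor getContentDescriptor() {
            return new ContentDescriptor(ContentDescriptor.RAW_RTP);
        }
        public long getContentLength() { return LENGTH_UNKNOWN; }
        public boolean endOfStream() { return next >= source.length; }
        public Object[] getControls() { return new Object[0]; }
        public Object getControl(String type) { return null; }
    }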

  • How do I get the default Photostream on my MacBook Pro to match the Photostream on my iPhone and iPad?

    Essentially, I am having issues getting photos that I save on my iPhone or iPad to automatically appear in my MacBook's iPhoto app (and vice versa). The iPad and iPhone automatically share all photos between each other. I have iCloud set up and all of the photo-sharing options selected on all devices, so that's not the issue. I used a bit of a workaround and created and shared a custom stream, which ultimately works. However, it involves a lot of moving photos around on all three devices, so I'd like to get the Photo Stream service working the way it's intended, by getting the default Photo Stream connected to my iCloud account to sync automatically among all three devices.
    Thanks.

    Photos are kept in My Photo Stream for 30 days or until more than 1000 have been added. The photos will remain in the MPS for longer than 30 days if the 1000-photo limit is not reached.
    However, depending on when a device joined the iCloud feature and started receiving the MPS, it will not see any photos that were added more than 30 days before the date of joining. So if your MBP signed into iCloud and the MPS feature later than the other two devices, it will not see photos that they might. Also, if you've turned off My Photo Stream on the MBP and then added it back, it will only see the last 30 days' worth.

  • iTunes - Incoming network connections error message

    Hello,
    I am hoping someone can help me.
    I have a Mac and I keep getting an error message when I open iTunes.
    The error message reads "Do you want the application iTunes to accept incoming network connections?"
    I have changed the security settings to accept incoming network connections, but the error message keeps coming up.
    Any ideas??
    Thanks
    Gavin

    My father has a perfectly good working 17-inch MacBook Pro, running 10.4.8, which is the most modern version it's allowed to run, plus iTunes 9.x, and has the same problem. It gets tiring clicking each time, it gets annoying not being able to use AirPlay because it blames the firewall, and it gets annoying entering the administrator name and password each time.
    When he selects the AirPort Express in question, it used to throw up the "not set in your firewall to allow AirPlay" message.
    Clicking the firewall preferences button is useless, as iTunes has already been given permission to allow incoming Internet connections, and clicking Ignore makes it either crash, stop working, or go inactive.
    Turning the firewall off to allow all connections defeats the point of having a firewall; that used to work most of the time, but doesn't seem to work now.
    This coincides with him suddenly not being able to browse the Internet streams through the old iTunes Radio button, though the ones that he did save to his personal playlists play fine, and he can enter custom stream URLs fine. So something on Apple's side must be stopping the use of the old Radio button that lets you browse through various genres of Internet music streams.
    For the record, there have been no iTunes or operating system updates for 10.4.8 for a couple of years now at the very least. So that is where our situation lies. We need a solution other than iTunes, which no longer allows browsing.

  • Messing with JavaCompiler

    I have a following problem :
    I need to put into a web service a method that compiles custom source code at runtime and executes it. Compilation is done by calling the JavaCompiler class, which I've rewritten to compile from/to memory and send error messages to a custom stream. During compilation, some of the web service's classes have to be visible to the compiler. The problem is that they are inside a JAR nested in an EAR file.
    My question: is there any possible way to do this without using the classpath parameter? All classes are already loaded when the compiler is called, so maybe I could do some magic with class loaders?

    I have done a very indirect form of compilation, using generated JSPs, as hot swapping was needed (just what JSPs offer). That involved an extra redirect/forward.
    If the library classes are available (loaded), then the rest is a pure ClassLoader problem. The best advice there is that you digest the tricky material yourself, by reading and testing. If you wrote the JavaCompiler part, that should be no great challenge. You'll probably find a couple of container-related class loaders in your application.
    Sometimes it is easier to do an interface or even use a proxy class.
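
    As a baseline for the compile-from-memory part, a minimal sketch using javax.tools could look like the following (the class names and sample source string are made up). It compiles a source string and captures the compiler's messages in our own stream; note that with the standard file manager the resulting .class still lands on disk, and making the EAR-hosted classes visible would additionally need either explicit -classpath options or a custom JavaFileManager backed by the application's class loader.

    import java.io.ByteArrayOutputStream;
    import java.io.PrintWriter;
    import java.net.URI;
    import java.util.Arrays;
    import javax.tools.JavaCompiler;
    import javax.tools.SimpleJavaFileObject;
    import javax.tools.ToolProvider;

    public class InMemoryCompile {
        // A compilation unit backed by a String rather than a file on disk.
        static class StringSource extends SimpleJavaFileObject {
            final String code;
            StringSource(String className, String code) {
                super(URI.create("string:///" + className.replace('.', '/') + ".java"), Kind.SOURCE);
                this.code = code;
            }
            @Override public CharSequence getCharContent(boolean ignoreEncodingErrors) { return code; }
        }

        public static void main(String[] args) {
            String src = "public class Hello { public String msg() { return \"hi\"; } }";
            JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
            ByteArrayOutputStream errors = new ByteArrayOutputStream();   // our "custom stream"
            PrintWriter messageWriter = new PrintWriter(errors, true);
            // No DiagnosticListener is given, so compiler messages go to the Writer above.
            boolean ok = compiler.getTask(
                    messageWriter,
                    null,   // standard file manager; output .class goes to the working directory
                    null,   // default diagnostic handling
                    null,   // no extra options such as -classpath
                    null,   // no annotation-processing classes
                    Arrays.asList(new StringSource("Hello", src))).call();
            messageWriter.flush();
            System.out.println("compiled=" + ok + " messages=" + errors);
        }
    }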

  • Serialization back-doors?

    Anyone know any good tips for speeding up serialization by customizing ObjectOutputStream / ObjectInputStream?
    I want to keep serialization transparent to my domain objects (i.e., Externalizable at the code level isn't acceptable).
    Looking at the code for the two classes, I realised that I could do away with writing out ObjectStreamClasses, which gave a 30-40% gain.
    (I'm only serializing classes between multiple instances of the same application, so all the class versioning stuff isn't important to me.)
    However, I'm still burning serious CPU on (de)serialization.
    I thought of writing a custom stream, in conjunction with load-time weaving, to weave the equivalent of Externalizable implementations into all serialized classes at load time. My motivation was that I believed Java serialization used reflection to set/get fields. On closer inspection, though, it seems that the infamous sun.misc.Unsafe class is used to set/get fields using native code.
    I'm wondering if anyone has found any other good tricks for stripping unnecessary stuff out of serialization in this context?

    Not really transient...
    My domain classes already have the fields which don't need to be serialized marked as transient.
    It could be that using Externalizable would speed things up a fair bit. The problem is that I don't control the domain classes (offshore developers, etc.), and I don't want to make it hard to write domain classes by forcing developers to do all the Externalizable stuff: I'd like any solution to be largely transparent to the domain classes.
    The context of the problem is significant here: serialized data is short-lived and is communicated between running instances of the same application using the same code base.
    I've thus written extensions to ObjectOutputStream and ObjectInputStream which do away with writing class descriptors, yielding a good gain in throughput (I'm really surprised that there's nothing about this on the web; it's a cheap trick with good results).
    So I suppose I'm looking for more tricks like this which I might have missed :o)
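
    For readers looking for the class-descriptor trick itself, a minimal sketch is below (assuming, as above, that both ends run exactly the same code base; it will not cope with versioning or with classes only resolvable by non-default class loaders). It writes just the class name instead of the full descriptor and rebuilds the descriptor on the receiving side.

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.ObjectStreamClass;
    import java.io.OutputStream;

    // Writes only the class name in place of the full class descriptor.
    public class CompactObjectOutputStream extends ObjectOutputStream {
        public CompactObjectOutputStream(OutputStream out) throws IOException { super(out); }
        @Override protected void writeClassDescriptor(ObjectStreamClass desc) throws IOException {
            writeUTF(desc.getName());
        }
    }

    // Rebuilds the descriptor locally from the class name written above.
    class CompactObjectInputStream extends ObjectInputStream {
        public CompactObjectInputStream(InputStream in) throws IOException { super(in); }
        @Override protected ObjectStreamClass readClassDescriptor()
                throws IOException, ClassNotFoundException {
            return ObjectStreamClass.lookup(Class.forName(readUTF()));
        }
    }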

  • VOD content from HTTP

    Is it possible to get VOD content from an HTTP address?

    Thank you for your answer, but neither RTMP nor HDS/HLS worked.
    For RTMP, I edited the fms.ini and added the second substitution
    VOD_DIR2 = \\172.16.40.142\vod_ortak
    then edited the application.xml of applications/vod and added the custom stream tags:
    <Streams>/;${VOD_DIR2}</Streams>
    but I got the following error:
    Invalid substitution variable : VOD_DIR2
    For HDS/HLS, I edited apache2.2/conf/httpd.conf and made the following changes:
    <Location /hls-vod>
        HLSHttpStreamingEnabled true
        HLSMediaFileDuration 8000
        # HttpStreamingContentPath "../webroot/vod"
        HttpStreamingContentPath "\\172.16.40.142\vod_ortak"
        HLSFmsDirPath ".."
    </Location>
    <Location /hds-vod>
        HttpStreamingJITPEnabled true
        #    HttpStreamingContentPath "../webroot/vod"
        HttpStreamingContentPath "\\172.16.40.142\vod_ortak"
        JitFmsDirPath ".."
        Options -Indexes FollowSymLinks
    </Location>
    However, I see from the logs that the path is not resolved the way I want:
    mod_jithttp [404]: [err=1] C:/172.16.40.142/vod_ortak/n_HOBBIT_720p.mp4 does not exist
    Besides these, my main question is whether FMS can refer to files at an HTTP address instead of a network folder (for example, my file location would be http://172.16.40.142/mediafiles rather than "\\172.16.40.142\vod_ortak"), and whether FMS can cache these remote materials.
    I can send you logs and config files.
    Best regards

  • Central recording server using MTP at remote

    I am getting ready to implement a voice recording solution. We will be using automatic recording / RTP forking. We will have centralized recording servers and wish to transcode the G.711 recording streams to G.729 using MTP on the remote-site voice gateways. I am trying to calculate the required DSPs for this solution. I can see our current MTP max sessions when I configure the MTP, but I don't know whether this equates one-for-one to recording sessions. There will be two recording streams in the same direction, remote to HQ; this includes the agent and customer streams. Does anyone know how many MTP sessions are used per recording session?
    I would assume it would be 1 MTP session per normal bi-directional call, but I'm not sure with both streams going in the same direction for the recording sessions.
    Thanks,
    Mark

    I have g711ulaw and g729r8 configured. There is a mix of 2811 and 2911 voice routers. This is a typical config.
    Show inventory:
    NAME: "CISCO2911/K9 chassis", DESCR: "CISCO2911/K9 chassis"
    PID: CISCO2911/K9      , VID: V02 , SN: FTX1533AN65
    NAME: "VWIC2-1MFT-T1/E1 - 1-Port RJ-48 Multiflex Trunk - T1/E1 on Slot 0 SubSlot 0", DESCR: "VWIC2-1MFT-T1/E1 - 1-Port RJ-48 Multiflex Trunk - T1/E1"
    PID: VWIC2-1MFT-T1/E1  , VID: V01 , SN: FOC15240VK1
    NAME: "3rd generation two port FXS DID voice interface daughtercard on Slot 0 SubSlot 1", DESCR: "3rd generation two port FXS DID voice interface daughtercard"
    PID: VIC3-2FXS/DID     , VID: V03 , SN: FOC15273DM3
    NAME: "3rd generation two port EM voice interface daughtercard on Slot 0 SubSlot 2", DESCR: "3rd generation two port EM voice interface daughtercard"
    PID: VIC3-2E/M         , VID: V02 , SN: FOC15297W3H
    NAME: "PVDM3 DSP DIMM with 64 Channels on Slot 0 SubSlot 4", DESCR: "PVDM3 DSP DIMM with 64 Channels"
    PID: PVDM3-64          , VID: V01 , SN: FOC153013MW
    NAME: "C2911 AC Power Supply", DESCR: "C2911 AC Power Supply"
    PID: PWR-2911-AC       , VID: V03 , SN: DCA1522R1GX
    Thanks,
    Mark

  • iTunes 10.5.2 AirPlay problem

    Since upgrading to iTunes 10.5.2, I have problems using airplay to connect to my hi-fi speakers.
    When I enable them in iTunes I get the message
    "Do you want the application "iTunes.app" to accept incoming network connections? "
    immediately followed by the message
    "An error occurred while connecting to the AirPlay device "hi-fi". The network connection failed".
    If I choose "allow" for the first message, deselect "hi-fi" in iTunes and select it again, it works!
    It is becoming annoying having to do this every single time.
    I am using an AirPort Express with AirPort Utility 5.5.3 and Mac OS 10.6.8.
    Any advice is appreciated.
    Nigel

    My father has a perfectly good working 17-inch MacBook Pro, running 10.4.8, which is the most modern version it's allowed to run, plus iTunes 9.x, and has the same problem. It gets tiring clicking each time, it gets annoying not being able to use AirPlay because it blames the firewall, and it gets annoying entering the administrator name and password each time.
    When he selects the AirPort Express in question, it used to throw up the "not set in your firewall to allow AirPlay" message.
    Clicking the firewall preferences button is useless, as iTunes has already been given permission to allow incoming Internet connections, and clicking Ignore makes it either crash, stop working, or go inactive.
    Turning the firewall off to allow all connections defeats the point of having a firewall; that used to work most of the time, but doesn't seem to work now.
    This coincides with him suddenly not being able to browse the Internet streams through the old iTunes Radio button, though the ones that he did save to his personal playlists play fine, and he can enter custom stream URLs fine. So something on Apple's side must be stopping the use of the old Radio button that lets you browse through various genres of Internet music streams.
    For the record, there have been no iTunes or operating system updates for 10.4.8 for a couple of years now at the very least. So that is where our situation lies. We need a solution other than iTunes, which no longer allows browsing.

  • WebServiceSoap_Stub - is it thread safe?

    Hi,
    Is the WebServiceSoap_Stub class thread-safe, i.e. can it be used by multiple EJBs (threads) at the same time? I would say yes, but a confirmation would be nice.
    Thanks!

    Well, that assumes you are using thread-safe stream implementations. I believe all the ones in the JVM are safe for this, but if you are using any custom streams, you'll have to check those out for yourself.

  • Data Streaming in JSP custom tag

    We have a JSP custom tag which uses a StringBuffer object to store the HTML content. In the doEndTag method, we use the JspWriter's print method to send the buffer's content to the browser.
    Now, there are cases when the HTML content becomes very large, on the order of 1 to 10 MB. In such cases, is there a way we can start streaming the data from the custom tag as the buffer is being built, rather than waiting for the entire buffer to be written out in doEndTag?

    Yeah, just call the JspWriter's print method instead of using the StringBuffer, or write your own subclass of StringBuffer (or StringBuilder if Java 5+) which will flush for you automatically once the size reaches a certain point and then clear itself out.
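
    A bare-bones sketch of the first suggestion might look like this (the tag class, the loop, and the renderRow helper are all invented for illustration): each chunk is printed straight to the JspWriter as it is produced, with an occasional flush so the client starts receiving data long before the tag finishes.

    import java.io.IOException;
    import javax.servlet.jsp.JspException;
    import javax.servlet.jsp.JspWriter;
    import javax.servlet.jsp.tagext.TagSupport;

    // Streams the generated HTML instead of accumulating it in a StringBuffer.
    public class StreamingContentTag extends TagSupport {
        @Override
        public int doStartTag() throws JspException {
            JspWriter out = pageContext.getOut();
            try {
                for (int row = 0; row < 100000; row++) {   // stand-in for the real content loop
                    out.print(renderRow(row));             // goes straight to the response buffer
                    if (row % 1000 == 0) {
                        out.flush();                       // push what we have to the client
                    }
                }
            } catch (IOException e) {
                throw new JspException("Failed to stream tag content", e);
            }
            return SKIP_BODY;
        }

        private String renderRow(int row) {
            return "<tr><td>" + row + "</td></tr>";
        }
    }

    Note that flushing commits the response, so this approach trades the ability to change headers or forward afterwards for lower memory use.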

  • BizTalk custom send pipeline using MemoryStream

    Hi friends,
    I am developing a custom send pipeline and calling a custom helper class. In the helper class I am using a MemoryStream.
    The helper class method looks like:
    public MemoryStream UpdateProcess(MemoryStream ms)
    In the pipeline I need to call this method:
    public IBaseMessage Execute(IPipelineContext pc, IBaseMessage inmsg)
    {
        try
        {
            // Get the incoming message body
            System.IO.Stream originalStream = inmsg.BodyPart.GetOriginalDataStream();

            // Load the body into an XDocument
            XDocument xDoc;
            using (XmlReader reader = XmlReader.Create(originalStream))
            {
                reader.MoveToContent();
                xDoc = XDocument.Load(reader);
            }

            // Copy the XML into a MemoryStream and pass it to the helper
            byte[] output = System.Text.Encoding.ASCII.GetBytes(xDoc.ToString());
            MemoryStream memoryStream = new MemoryStream();
            memoryStream.Write(output, 0, output.Length);
            ProcessHelper mf = new ProcessHelper();
            mf.UpdateProcess(memoryStream);

            // Rewind the stream and put it back on the message
            memoryStream.Position = 0;
            inmsg.BodyPart.Data = memoryStream;
            return inmsg;
        }
        catch (Exception ex)
        {
            throw ex;
        }
        finally
        {
            if (parms != null)
                parms.Clear();
        }
    }
    Can anyone let me know if my logic in the pipeline component is correct?
    Thanks
    hk

    Realistically, we can't say for sure whether it's right, since we don't know the implementation of the helper.
    However, since it looks like the helper expects a MemoryStream with XML content, you can probably just copy the data from the original stream to the local MemoryStream. Passing it through an XDocument might be unnecessary.
    Also, unless you are absolutely sure the source document is ASCII, you shouldn't use ASCII encoding, since you can lose meaningful character data.
