Adobe Access Server for Protected Streaming Configuration Problem

I cannot get the Adobe Access Server for Protected Streaming sample implementation up and running.  Could somebody help me figure out where I'm going wrong?
I've configured the license server with the correct PFX files and when I run the validation tool, I get the following output.  This is a clean install with only the license server configuration (the content server is a separate machine).  I also get a similar error in the tomcat logs when I try to access a piece of content licensed to this server. 
$ java -jar libs/flashaccess-validator.jar -g -r /var/lib/tomcat6/licenseserver/
Validating global deployment ...
        Validating partition deployment - flashaccessserver...
                Validating tenant deployment - flashaccessserver/mediafly...Failed to validate tenant deployment 'flashaccessserver/mediafly' - Error reading key server certificate for partition='flashaccessserver', tenant='mediafly'
See log for details:
(/tmp/temp281722781181937073071254678876423/flashaccessserver/flashaccess-partition.log)
$ cat /tmp/temp281722781181937073071254678876423/flashaccessserver/flashaccess-partition.log
[] 2013-02-12 15:37:26,383 INFO  [[Partition(flashaccessserver)].com.adobe.flashaccess.server.license.context.SimpleContextFactory] Creating class loader for partition 'flashaccessserver' with libraries '[file:/var/lib/tomcat6/licenseserver/flashaccessserver/libs/, file:/var/lib/tomcat6/licenseserver/flashaccessserver/libs/flashaccess-license-server-ext-sample.jar]'
[] 2013-02-12 15:38:41,247 ERROR [[Partition(flashaccessserver)].com.adobe.flashaccess.server.license.tools.Validator] Failed to validate tenant deployment 'flashaccessserver/mediafly'
com.adobe.flashaccess.server.common.configuration.ConfigurationException: Error reading key server certificate for partition='flashaccessserver', tenant='mediafly'
        at com.adobe.flashaccess.server.license.configuration.commonsadapter.TenantConfigurationImpl$ServerCertsImpl.readCerts(TenantConfigurationImpl.java:720)
        at com.adobe.flashaccess.server.license.configuration.commonsadapter.TenantConfigurationImpl$ServerCertsImpl.<init>(TenantConfigurationImpl.java:705)
        at com.adobe.flashaccess.server.license.configuration.commonsadapter.TenantConfigurationImpl.<init>(TenantConfigurationImpl.java:115)
        at com.adobe.flashaccess.server.license.configuration.commonsadapter.CommonsConfigurationBasedFactory.getTenantConfiguration(CommonsConfigurationBasedFactory.java:90)
        at com.adobe.flashaccess.server.license.tools.Validator.validateTenantDeployment(Validator.java:255)
        at com.adobe.flashaccess.server.license.tools.Validator.validatePartitionDeployment(Validator.java:283)
        at com.adobe.flashaccess.server.license.tools.Validator.validateGlobalDeployment(Validator.java:301)
        at com.adobe.flashaccess.server.license.tools.Validator.process(Validator.java:173)
        at com.adobe.flashaccess.server.license.tools.Validator.main(Validator.java:117)
Caused by: java.security.cert.CertificateException: java.lang.IllegalArgumentException: unknown object in factory: org.bouncycastle.asn1.DERInteger
        at org.bouncycastle.jce.provider.JDKX509CertificateFactory.engineGenerateCertificate(Unknown Source)
        at java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:322)
        at com.adobe.flashaccess.core.crypto.CertUtil.loadCert(CertUtil.java:33)
        at com.adobe.flashaccess.sdk.cert.CertificateFactory.loadCert(CertificateFactory.java:45)
        at com.adobe.flashaccess.server.license.configuration.commonsadapter.TenantConfigurationImpl$ServerCertsImpl.readCert(TenantConfigurationImpl.java:739)
        at com.adobe.flashaccess.server.license.configuration.commonsadapter.TenantConfigurationImpl$ServerCertsImpl.readCerts(TenantConfigurationImpl.java:718)
        ... 8 more
Some additional information:
$ uname -s -r -v -m
Linux 2.6.32-350-ec2 #59-Ubuntu SMP Mon Jan 7 14:20:59 UTC 2013 x86_64
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 10.04.4 LTS
Release:        10.04
Codename:       lucid
$ dpkg -l | grep tomcat
ii  libtomcat6-java                                             6.0.24-2ubuntu1.12                Servlet and JSP engine -- core libraries
ii  tomcat6                                                     6.0.24-2ubuntu1.12                Servlet and JSP engine
ii  tomcat6-common                                              6.0.24-2ubuntu1.12                Servlet and JSP engine -- common files
$ dpkg -l | grep jdk
ii  openjdk-6-jre-headless                                      6b24-1.11.5-0ubuntu1~10.04.2      OpenJDK Java runtime, using Hotspot JIT (hea
ii  openjdk-6-jre-lib                                           6b24-1.11.5-0ubuntu1~10.04.2      OpenJDK Java runtime (architecture independe
$ md5sum AdobeAccessProSDK_4_0_LS1_java.exe
73068d8348cbdc1c29211a463a58b8df  AdobeAccessProSDK_4_0_LS1_java.exe
Thanks,
Bryan

I found the problem. 
Certificates/KeyServer/File[@path] needs to point to your key server certificate, and it must be in ".cer" format.  The server won't start without this file; it was not mentioned anywhere in the quick start guide, but it is covered in the protected streaming document.
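For anyone hitting the same "Error reading key server certificate" failure, the relevant piece of flashaccess-tenant.xml looks roughly like this (the nesting and path below are illustrative only; verify against the schema in the Protected Streaming guide):

```xml
<!-- Illustrative fragment of flashaccess-tenant.xml; element nesting is
     approximate -- consult the Protected Streaming guide for the schema. -->
<Certificates>
    <KeyServer>
        <!-- Must point to a DER-encoded ".cer" file, not the PFX -->
        <File path="/var/lib/tomcat6/licenseserver/flashaccessserver/mediafly/keyserver.cer"/>
    </KeyServer>
</Certificates>
```

If you only have PEM or PFX files, `openssl pkcs12 -in keyserver.pfx -clcerts -nokeys -out keyserver.pem` followed by `openssl x509 -in keyserver.pem -outform DER -out keyserver.cer` is one way to produce the DER form. That also fits the BouncyCastle "unknown object in factory: DERInteger" error above, which suggests the certificate factory was handed something other than a bare X.509 certificate.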
Bryan

Similar Messages

  • Configuring the Flash Access Server for Protected Streaming

    Does anybody have information on how to configure the files flashaccess-global.xml and flashaccess-tenant.xml?
    I know that the Flash Access Server for Protected Streaming ignores the content policy, and that the policy has to be set in those configuration files.
    Where can I read about that? There is very little information available on it.
    Thanks

    Hello,
    Please check chapter "Deploying the Flash Access Server for Protected Streaming" in http://help.adobe.com/en_US/flashaccess/2.0/protecting_content.pdf.
    Best regards,
    Wang Chao

  • Why don't my DRM policies seem to work? Using "License Server for Protected Streaming"

    Do you find yourself setting rights/restrictions in a DRM policy (like disallowing playback on Linux), only to see that the right/restriction isn't being enforced when you play your content?  There may be several reasons for this:
    1. If you are using the "Adobe Access Server for Protected Streaming" that came on your ESD/DVD, this streaming-only server is configured to specifically ignore any policy that was used to package content.  Instead, the server itself overrides the entire packaged policy and instead, reads rights from its tenant configuration XML file (flashaccess-tenant.xml) to generate licenses issued back to the requesting device.
    2. Is your license server configured to use a PolicyUpdateList?  This is a list that can be used to override any old policy (according to its Policy ID) and to use a newly defined policy instead.  This is useful if content was packaged with a policy and then afterwards the business rules changed for that particular policy, warranting an across-the-board change to all content packaged with it.  Using a PolicyUpdateList is essentially a way to version your older (already used) policies.
    3. Is your license server overriding the policy manually in its code?  If you're using the Reference Implementation license server (or building your own server from the SDK), your server has direct access to the policy.  The server can add/remove/change any right in the policy before using it to generate a license for the requesting device.  If you're using the Reference Implementation, by default it will simply generate a license directly from the policy without any manipulation.
    4. Are you sure you re-packaged your content using the new policy where you made your changes?  On your license server, since you have access to the policy (it's sent up to the license server as part of the license request), the license server can log the policy ID of the policy.  Check your logs to verify the new policy is actually being used.
    5. Are you sure you're playing back the newly-packaged content?  Caches may be stale.
    cheers,
    /Eric.

    Not possible with a POP account.
    The email account sync process with iTunes syncs (or transfers) email account setup information for select accounts only, from your computer to the iPhone. This does not involve messages.
    The reason the account's Inbox mailbox seemed to sync is that received messages in a POP account are the only server-stored messages, and if they are not removed from the server when downloaded by one email client, they can be downloaded from the server by another - in this case the iPhone.
    If you want Sent messages kept synchronized between your computer's email client and the iPhone's email client, you MUST use an IMAP account for this purpose.

  • Oracle streams configuration problem

    Hi all,
    I'm trying to configure Oracle Streams on my source database (Oracle 9.2), and when I execute the package DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS'); I get the error below:
    ERROR at line 1:
    ORA-01353: existing Logminer session
    ORA-06512: at "SYS.DBMS_LOGMNR_D", line 2238
    ORA-06512: at line 1
    When checking some docs, they said I have to destroy all LogMiner sessions, but when I checked the v$session view I could not identify any LogMiner session. If someone can help me - I need this Streams tool for schema synchronization between my production database and my data warehouse database.
    What I want to know is how to destroy or stop the LogMiner session.
    Thanks for your help
    regards
    raitsarevo

    Thanks Werner, it's OK now - my problem is solved, and below is the output of your script.
    If you have some docs or advice for my database schema synchronisation: is using Oracle Streams the best approach, or can I use anything else - but not the Data Guard concept or a standby database, because I only want to apply DML changes, not DDL? I'd appreciate any docs on Oracle Streams, especially for schema (not table) synchronization.
    many thanks again, and please send to my email address [email protected] if needed
    ABILLITY>DELETE FROM system.logmnr_uid$;
    1 row deleted.
    ABILLITY>DELETE FROM system.logmnr_session$;
    1 row deleted.
    ABILLITY>DELETE FROM system.logmnrc_gtcs;
    0 rows deleted.
    ABILLITY>DELETE FROM system.logmnrc_gtlo;
    13 rows deleted.
    ABILLITY>EXECUTE DBMS_LOGMNR_D.SET_TABLESPACE('LOGMNRTS');
    PL/SQL procedure successfully completed.
    regards
    raitsarevo

  • Enough bandwidth to use a NAS server for iTunes & stream via AirPort Express?

    Hi,
    I posted this question in the iTunes forum but got zero feedback. I was hoping someone here could possibly answer my question:
    I was thinking of getting a ByteCC LANDisk NAS to use as an iTunes music 'jukebox' server for 2 Macs (a G5 and a MacBook). I have an AirPort Express that I'd like to use to stream music to my living room stereo via iTunes.
    My question is: is there enough bandwidth to stream from the NAS server to the Macs, and then to also stream the audio to the living room stereo via the AirPort Express at the same time?
    I'm sure someone must've attempted this setup.
    Any help or info is appreciated!

    This works. I have a WD NetCenter (320GB) which I store my main iTunes library on (holding down the Command key when starting iTunes allows you to choose which library you want to use). I'm not using an Airport base (a D-Link and now a Zyxel) and I can stream music AND video no problem - all my movies are on there too and they play fine through Front Row.
    Oh, and I usually mount it as NFS rather than SMB - I have a handy little AppleScript for this which I run as an app (when are Apple going to bring out a proper location manager that automounts networked drives etc. based on where you are?!).
    Have fun.
    Charlie

  • How to get the trial version of adobe access whether can protect content

    I submitted my info at
    https://www.adobe.com/cfusion/mmform/index.cfm?name=flash_access_trial_cert
    but nobody has contacted me.
    How can I get the trial version of Adobe Access?
    Thanks

    Hello,
    Apologies for that!  Currently, Adobe is shut down for the holidays which is why you're seeing a delay. Adobe will resume work again on January 2nd.
    If you can private message me your contact information, I can contact our sales team to make sure they are aware of your request.
    Sent from my mobile,
    /Eric

  • Writing a conference server for RTP streams

    Hello,
    I'm trying to write a conference server which accepts multiple RTP streams (one for each participant), creates a mixed RTP stream of all other participants and sends that stream back to each participant.
    For 2 participants, I was able to correctly receive and send the stream of the other participant to each party.
    For 3 participants, creating the merging data source does not seem to work - i.e. no data is received by the participants.
    I tried creating cloneable data sources instead, thinking that this may be the root cause, but when creating cloneable data sources from incoming RTP sources, I am unable to get the Processor into the Configured state; it seems to deadlock. Here's the code outline:
        Iterator pIt = participants.iterator();
        List dataSources = new ArrayList();
        while (pIt.hasNext()) {
          Party p = (Party) pIt.next();
          if (p != dest) {
            DataSource ds = p.getDataSource();
            DataSource cds = Manager.createCloneableDataSource(ds);
            DataSource clone = ((SourceCloneable) cds).createClone();
            dataSources.add(clone);
          }
        }
        Object[] sources = dataSources.toArray(new DataSource[0]);
        DataSource dataSource = Manager.createMergingDataSource((DataSource[]) sources);
        Processor p = Manager.createProcessor(dataSource);
        MixControllerListener cl = new MixControllerListener();
        p.addControllerListener(cl);
        // Put the Processor into the configured state.
        p.configure();
        if (!cl.waitForState(p, Processor.Configured)) {
            System.err.println("Failed to configure the processor.");
            assert false;
        }
    Here are a couple of stack traces:
    "RTPEventHandler" daemon prio=1 tid=0x081d6828 nid=0x3ea6 in Object.wait() [98246000..98247238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f37e4a8> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at demo.Mixer$MixControllerListener.waitForState(Mixer.java:248)
            - locked <0x9f37e4a8> (a java.lang.Object)
            at demo.Mixer.createMergedDataSource(Mixer.java:202)
            at demo.Mixer.createSendStreams(Mixer.java:165)
            at demo.Mixer.createSendStreamsWhenAllJoined(Mixer.java:157)
            - locked <0x9f481840> (a demo.Mixer)
            at demo.Mixer.update(Mixer.java:123)
            at com.sun.media.rtp.RTPEventHandler.processEvent(RTPEventHandler.java:62)
            at com.sun.media.rtp.RTPEventHandler.dispatchEvents(RTPEventHandler.java:96)
            at com.sun.media.rtp.RTPEventHandler.run(RTPEventHandler.java:115)
    "JMF thread: com.sun.media.ProcessEngine@a3c5b6[ com.sun.media.ProcessEngine@a3c5b6 ] ( configureThread)" daemon prio=1 tid=0x082fe3c8 nid=0x3ea6 in Object.wait() [977e0000..977e1238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f387560> (a java.lang.Object)
            at java.lang.Object.wait(Object.java:429)
            at com.sun.media.parser.RawBufferParser$FrameTrack.parse(RawBufferParser.java:247)
            - locked <0x9f387560> (a java.lang.Object)
            at com.sun.media.parser.RawBufferParser.getTracks(RawBufferParser.java:112)
            at com.sun.media.BasicSourceModule.doRealize(BasicSourceModule.java:180)
            at com.sun.media.PlaybackEngine.doConfigure1(PlaybackEngine.java:229)
            at com.sun.media.ProcessEngine.doConfigure(ProcessEngine.java:43)
            at com.sun.media.ConfigureWorkThread.process(BasicController.java:1370)
            at com.sun.media.StateTransitionWorkThread.run(BasicController.java:1339)
    "JMF thread" daemon prio=1 tid=0x080db410 nid=0x3ea6 in Object.wait() [97f41000..97f41238]
            at java.lang.Object.wait(Native Method)
            - waiting on <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Object.wait(Object.java:429)
            at com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave.run(CloneableSourceStreamAdapter.java:375)
            - locked <0x9f480578> (a com.ibm.media.protocol.CloneableSourceStreamAdapter$PushBufferStreamSlave)
            at java.lang.Thread.run(Thread.java:534)
    Any ideas?
    Thanks,
    Jarek

    bgl,
    I was able to get past the cloning issue by following the Clone.java example to the letter :)
    Turns out that the cloneable data source must be added as a send stream first, and then the cloned data source. Now for each party in the call the conf. server does the following:
    Party(RTPManager mgr, DataSource ds) {
      this.mgr = mgr;
      this.ds = Manager.createCloneableDataSource(ds);
    }
    synchronized DataSource cloneDataSource() {
      DataSource retVal;
      if (getNeedsCloning()) {
        retVal = ((SourceCloneable) ds).createClone();
      } else {
        retVal = ds;
        setNeedsCloning();
      }
      return retVal;
    }
    private void setNeedsCloning() {
      needsCloning = true;
    }
    private boolean getNeedsCloning() {
      return needsCloning;
    }
    private synchronized void addSendStreamFromNewParticipant(Party newOne) throws UnsupportedFormatException, IOException {
      debug("*** - New one joined. Creating the send streams. Curr count :" + participants.size());
      Iterator pIt = participants.iterator();
      while (pIt.hasNext()) {
        Party p = (Party) pIt.next();
        assert p != newOne;
        // update existing participant
        SendStream sendStream = p.getMgr().createSendStream(newOne.cloneDataSource(), 0);
        sendStream.start();
        // send data from existing participant to the new one
        sendStream = newOne.getMgr().createSendStream(p.cloneDataSource(), 0);
        sendStream.start();
      }
      debug("*** - Done creating the streams.");
    }
    So I made some progress, but I'm still not quite there.
    The RTP manager JavaDoc for createSendStream states the following :
    * This method is used to create a sending stream within the RTP
    * session. For each time the call is made, a new sending stream
    * will be created. This stream will use the SDES items as entered
    * in the initialize() call for all its RTCP messages. Each stream
    * is sent out with a new SSRC (Synchronisation SouRCe
    * identifier), but from the same participant i.e. local
    * participant. <BR>
    For 3 participants, my conf. server creates 2 send streams to each of them, so I'd expect 2 SSRCs on the wire. Examining the RTP packets in Ethereal, I only see 1 SSRC, as if the 2nd createSendStream call had failed. Consequently, each participant in the conference is able to receive voice from only 1 other participant, even though I create an RTPManager instance for each participant and add 2 send streams.
    Any ideas ?
    Thanks,
    Jarek

  • Apache or Windows server for HTTP streaming

    Does it matter what HTTP server I use?
    In the process of upgrading from FMS 3.0 on Windows using IIS 5 to FMS 4.0. Will dynamic HTTP Streaming work from a Windows server?

    Hi,
    Thanks for your interest in HDS.
    HTTP Dynamic Streaming only works with the Apache web server. A basic Apache web server is bundled with FMS from version 4.0 onwards, with the HTTP Dynamic Streaming modules loaded by default. Alternatively, any external Apache installation can also be used, by copying the required module files and changing the Apache configuration files to enable them.
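    For an external Apache, enabling the bundled HDS origin module looks roughly like this (a sketch only; the module name and directives are from Adobe's HTTP Origin Module documentation, and the paths are placeholders - verify against the docs for your FMS version):

    ```apache
    # Illustrative httpd.conf fragment for HDS video-on-demand (paths are placeholders)
    LoadModule f4fhttp_module modules/mod_f4fhttp.so
    <Location /vod>
        HttpStreamingEnabled true
        HttpStreamingContentPath "../webroot/vod"
    </Location>
    ```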

  • Rich client access server for data

    Hi
    I have a rich client application and it needs to access a web server via the HTTP protocol to retrieve a large amount of data.
    Any idea what the best way to do this is?
    I'm thinking of writing some server-side code to serialize the data objects into XML files which the client can then retrieve (either that, or plain text files...).
    Please advise me on the best/common practice, because this is the first time I'm doing this.
    thx

    XML can be a good idea, but remember that XML can bloat your data because of all the added tag information - so if you are working with large amounts of data, it can become a huge amount of data. Still, you can use SAX to parse the XML on the fly instead of parsing it into one large DOM tree structure first, which is at least efficient with resources.
    If you can access the data in small parts, a web-service system might be a better, cleaner and more portable solution.
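    To illustrate the SAX-over-DOM point: a streaming parse visits each element as it arrives and never holds the whole document in memory. A minimal self-contained sketch (class and element names here are illustrative, not from the thread):

    ```java
    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.parsers.SAXParser;
    import javax.xml.parsers.SAXParserFactory;
    import org.xml.sax.Attributes;
    import org.xml.sax.helpers.DefaultHandler;

    public class SaxCountDemo {
        /** Count occurrences of an element without building a DOM tree. */
        static int countElements(String xml, final String name) throws Exception {
            final int[] count = {0};
            SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
            parser.parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                    new DefaultHandler() {
                        @Override
                        public void startElement(String uri, String localName,
                                                 String qName, Attributes attrs) {
                            if (name.equals(qName)) {
                                count[0]++;
                            }
                        }
                    });
            return count[0];
        }

        public static void main(String[] args) throws Exception {
            String xml = "<rows><row v=\"1\"/><row v=\"2\"/><row v=\"3\"/></rows>";
            System.out.println(countElements(xml, "row")); // prints 3
        }
    }
    ```

    The same handler shape scales to multi-megabyte payloads, since memory use stays constant regardless of document size.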

  • Adobe media server 5.0

    Hi,
    Where can I check the Adobe Media Server 5.0 trial configuration?
    It is giving very poor quality in live streaming on my local PC. I need to check whether it is a configuration problem.
    Thanks

    It's probably less to do with the GPU. While the video size is fairly small, you're actually asking for 60fps video. You're giving 1 second 2000kbit and want it compressed down to 60 quality (fair to poor; more importantly, expensive to compress). Compressing 60 frames that fast stresses your camera.
    For each frame you're giving the video about 33kbit to display (2000kbit/s ÷ 60fps). Setting quality and bandwidth higher (say double that) for 60FPS might be necessary to reduce the system's need to compress.
    Honestly, I think you should simply reduce your framerate to something a bit more standard at the same other settings, such as 30fps or 24fps. But it depends on what the video's purpose is.
    Lastly, while the FMS trial works fine on my Win7 install, I'm pretty over-jacked on hardware. I also have cameras that encode internally, offloading that chore from the CPU. I can stream up to 1280x720 @ 60FPS, but I need to crank the bandwidth up to about 12000 and keep quality above 70 or even I start seeing some serious bog, and my usage is ~1%. This just means my camera is inundated. If I change it to 30fps/5000kbit 1280x720 70Q, it's smooth as glass.

  • Adobe Access on iOS.

    This discussion is a follow-up to my previous discussion, which can be found at http://forums.adobe.com/message/6174242. I am starting a new discussion for clarity.
    The previous discussion addressed the question of whether Adobe Access DRM and its variant PHLS are available on iOS. The answer is NO, unless you sign up for Adobe Primetime.
    You have to sign up for Adobe Primetime to get the iOS Primetime Player SDK which includes the Adobe Access iOS APIs needed to stream PHLS in a native iOS app.
    The Adobe Primetime platform requires a large monthly commitment, so that means that for all practical purposes Adobe Access DRM and PHLS is available only to a small number of companies.
    The Adobe Access iOS API used to be part of the Adobe Access SDK but at some point was removed from the SDK.
    The Adobe Access website (http://www.adobe.com/products/adobe-access.html) continues to market Adobe Access as follows:
    “Publish and protect video streams carrying studio-grade content to desktops, connected TVs, tablets, and smartphones, including iOS and Android devices, using a single DRM platform”.
    After talking to several Adobe folks, the picture that is emerging is one of bait and switch. It is clear that at some point, Adobe decided to remove the Adobe Access iOS API from the Adobe Access SDK and make the iOS API part of the Adobe Primetime iOS SDK. This is a fact. The question is why?
    Only Adobe can answer that question, but it is safe to assume that this was a strategic decision. By removing the iOS API from the Adobe Access SDK, companies that want to deliver protected content to iOS devices have to sign up for Adobe Primetime.
    Adobe seems to have gone to some length to implement this strategic decision.
    It did not update the publicly available OSMF 2 to be able to consume Adobe Access protected HLS streams. Development of OSMF 2 was stopped in 2010.
    It removed support of DRMManager class and Netstream class members (required for DRM) from AIR for iOS. 
    It removed the Adobe Access iOS API from the Adobe Access SDK. I am not sure when this was done, but the Adobe SDK sent to me last August does not include the iOS API.  
    Adobe has not published any documentation on how Adobe Access DRM for HLS is implemented.
    The deception consists in the fact that Adobe is still marketing Adobe Access in a way that makes you believe that Adobe Access DRM is available on iOS devices. This is false.
    Purchasing Adobe Access alone is not enough to protect content on iOS devices. The Adobe Access SDK does not include the iOS API.
    The question is then why does Adobe continue to promote such deception.
    Again, this has to be a strategic decision… a classic example of bait and switch. Who in their right mind would develop around a half-baked product from the get-go, knowing that they would not be able to deliver protected content on iOS devices?
    The Adobe Access team has admitted that Adobe Access DRM and its variant PHLS are only available through Adobe Primetime iOS Player but the Adobe Access site continues to make false claims. This is deception.
    Of course, Adobe knows that by declaring no support for iOS, Adobe Access and Adobe Media server sales would be in danger. But that is the right thing to do both morally and legally.
    In my scenario the only right action is for Adobe to provide the Adobe Access iOS API. It's too late for Adobe to say "oops, sorry." I already have agreements with content owners which specify Adobe Access as the protection scheme. All the encoding and packaging is based on Adobe Access and its variant PHLS. If Adobe does not provide the iOS API, I will be forced to seek appropriate legal action.

    Hello,
    Adobe Access DRM on mobile devices is only provided by Adobe via the Primetime SDK.  You can get more information on Primetime, as well as contact Adobe on becoming a Trial user, here: http://www.adobe.com/solutions/primetime.html
    cheers,
    /Eric.

  • Routing Issue between router and Access Server

    Hi,
    We have a Lucent MAX TNT access server running TAOS version 9.0.9. I have configured the default route so that all dialup user traffic is diverted towards a Cisco 2611 series router, but this only happens with the IP subnets configured on the MAX TNT and Cisco router Ethernet interfaces.
    We need a solution in which dialup users on the MAX TNT with IPs from any network can be routed towards the Cisco router in order to reach the internet cloud across the router.

    Muhammad
    Your message states that you have configured the access server with a default route pointing to the 2611 router. But it does not say whether you have configured a route on the 2611 pointing to the access server for the address range used by the dial pool which the access server uses to assign addresses to dial up users. I suspect this is your problem. I believe that the 2611 needs a route to that address space and that the 2611 needs to advertise that address range if there are any other routers in your network.
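    Concretely, the missing piece would look something like this on the 2611 (a sketch; 172.16.50.0/24 stands in for your dial-pool address range and 192.168.1.2 for the MAX TNT's Ethernet address - both are placeholders):

    ```
    ! Route the dial-pool network back to the access server's Ethernet interface
    ip route 172.16.50.0 255.255.255.0 192.168.1.2
    ```

    With that in place, redistribute or advertise the same range to any other routers so return traffic for dial-up users always reaches the access server.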
    HTH
    Rick

  • Proxy between Flash Media Live Encoder and Adobe Media Server

    Hi,
    I encode from a PC with Adobe Flash Media Live Encoder to a server with Adobe Media Server, and this configuration works well.
    The actual configuration is:
    1) computer A (e.g. 192.168.1.5 - gw 192.168.1.254) points to computer B
    with FMS url: rtmp://COMPUTER_B_WAN_IP/live and StreamName
    test?user=useruser&password=userpassword
    2) computer B (e.g. 210.22.11.134) with Adobe Media Server
    But now I need to encode from Adobe Flash Media Live Encoder (computer A) to a PC (computer P) and then redirect the RTMP stream to the Adobe Media Server (computer B). Those 3 computers have different LAN configurations. If I use rtmpsuck (http://rtmpdump.mplayerhq.hu/) on computer P, can I use it as a proxy to redirect the computer A RTMP flow to computer B?
    I suppose:
    1) computer A (e.g. 192.168.1.5 - gw 192.168.1.254) points to computer P with FMS url: rtmp://COMPUTER_P_WAN_IP/live and StreamName test?user=useruser&password=userpassword
    2) on computer P (e.g. 10.10.10.8 - gw 10.10.10.254) I install
    rtmpsuck, open port 1935 on its firewall, and configure iptables
    like: iptables -t nat -A PREROUTING -p tcp --dport 1935 -j DNAT
    --to-destination COMPUTER_B_WAN_IP:1935
    3) on computer B I do nothing.
    Now I have 3 questions:
    1- Can it work?
    2- On computer P (the proxy) must I install both rtmpsuck and rtmpsrv, or can I install only rtmpsuck? I don't necessarily need to record the stream.
    3- Is there another way to do this? Is there other software to redirect the stream?
    Thank you
    M.
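    One caveat about the DNAT rule in step 2 above: with DNAT alone, computer B sees computer A's source address, so its replies bypass P and get dropped unless P is already on the return path. A common fix is to also masquerade the forwarded connection (a sketch; COMPUTER_B_WAN_IP is the placeholder from the question, and IP forwarding must be enabled on P):

    ```
    # Rewrite the destination to the media server (as in the question)...
    iptables -t nat -A PREROUTING -p tcp --dport 1935 -j DNAT --to-destination COMPUTER_B_WAN_IP:1935
    # ...and rewrite the source so replies come back through this proxy box
    iptables -t nat -A POSTROUTING -p tcp -d COMPUTER_B_WAN_IP --dport 1935 -j MASQUERADE
    ```

    Note this is plain TCP forwarding, so rtmpsuck itself would not be needed unless you want to inspect or alter the RTMP stream.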

    Please update us when FMLE for OS X is deployed. BTW, I get that the server market is dominated by Linux/Windows and this is an FMS add-on, but my shop's creative workflow is Mac-based for obvious reasons. Encoding is part of our editing workflow and thus needs to be Mac-based. Throw us a bone, please.

  • Some Adobe Media Server questions

    Hello,
    I need to set up Adobe Media Server for my production environment and I have some questions. I need to have an online media library (video on demand) available to end users as clips embedded in my site. My questions are:
    - What kind of source files are supported (source format, video/audio codec, etc.)?
    - I also need dynamic video quality (the user's internet connection quality should be detected and video quality adjusted automatically). In addition, video should be transcoded and the user should be able to choose video quality (and resolution) by hand. FlowPlayer is the preferred player to be embedded in my site, and it should be set up as a client for Adobe Media Server. What is the best way to implement this scenario? I can't find any step-by-step guide on how to set this up (including dynamic quality adjustment). Any help?
    - Live streams. The same as VOD. I assume I will need Flash Media Encoder to encode the stream from the input source, so the client will need to set the bitrate and other parameters? How do I set up dynamic quality for clients - will the server transcode the stream again? How do I set this up? I can't find any step-by-step guide; can you help me?
    - Video chat. How can I set this up using Adobe Media Server? Is it possible to make video chat via Adobe Media Server using flash applications like FSChat?

    Hi
    - What kind of source files are supported (source format, video/audio codec etc).
    Full list is available here:
    Adobe Media Server 5.0.7 * Supported clients, encoders, codecs, and file formats
    - I also need dynamic video quality (the user's internet connection quality should be detected and video quality adjusted automatically). In addition, video should be transcoded and the user should be able to choose video quality (and resolution) by hand. FlowPlayer is the preferred player to be embedded in my site, and it should be set up as a client for Adobe Media Server. What is the best way to implement this scenario? I can't find any step-by-step guide on how to set this up (including dynamic quality adjustment). Any help?
    Guidelines on multi bitrate streaming should help:
    Adobe Media Server 5.0.7 * Configure HTTP Dynamic Streaming and HTTP Live Streaming
    - Live streams. The same as vod. I assume I will need Flash Media Encoder to encode stream from input source, so client will need to setup bitrate and other parameters? How to setup dynamic quality for clients - will server transcode stream again? How to setup this? I can't find any guide step by step, can you help me?
    The guidelines on multi-bitrate streaming should help:
    Adobe Media Server 5.0.7 * Configure HTTP Dynamic Streaming and HTTP Live Streaming
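    To make the multi-bitrate idea concrete: for HTTP Dynamic Streaming, the client is given a set-level F4M manifest that lists each encoded rendition with its bitrate, and the player (e.g. an OSMF-based player) switches between them based on measured bandwidth. A minimal sketch of such a set-level manifest - the host name, file names, and bitrates below are placeholders, not values from the Adobe docs - might look like:

    ```xml
    <!-- Set-level F4M manifest (F4M 2.0) referencing one stream-level
         manifest per bitrate rendition. All names here are examples. -->
    <manifest xmlns="http://ns.adobe.com/f4m/2.0">
      <baseURL>http://ams.example.com/hds-vod/</baseURL>
      <media href="myvideo_500kbps.f4v.f4m" bitrate="500"/>
      <media href="myvideo_1000kbps.f4v.f4m" bitrate="1000"/>
      <media href="myvideo_1500kbps.f4v.f4m" bitrate="1500"/>
    </manifest>
    ```

    The automatic quality switching is then driven by the player; exposing a manual quality selector is also done client-side by letting the user pick one of the listed renditions.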
    - Video chat. How can I set this up using Adobe Media Server? Is it possible to make a video chat via Adobe Media Server using Flash applications like FSChat?
    This is possible but will require some coding skills; an example can be found here.
    Creating a video sharing web application using Flex, Flash Media Server, and Flash Media Encoding Server | Adobe Develo…
    I am sure there are more examples to be found online.
    Alternatively, you could try Adobe Connect if you want an out-of-the-box solution ready to go. It is also available as a pay-as-you-go hosted solution:
    Web conferencing software - Conference services | Adobe Connect 9
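    For a sense of the coding involved: a two-way chat boils down to each client publishing its camera/microphone to the server over RTMP and playing the other party's stream. A minimal AS3 sketch, assuming it runs in the main document class and using placeholder server/stream names ("ams.example.com", "alice", "bob"):

    ```actionscript
    import flash.net.NetConnection;
    import flash.net.NetStream;
    import flash.media.Camera;
    import flash.media.Microphone;
    import flash.media.Video;
    import flash.events.NetStatusEvent;

    var nc:NetConnection = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect("rtmp://ams.example.com/videochat"); // placeholder AMS app

    function onStatus(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            // Publish the local camera and microphone as a live stream.
            var outStream:NetStream = new NetStream(nc);
            outStream.attachCamera(Camera.getCamera());
            outStream.attachAudio(Microphone.getMicrophone());
            outStream.publish("alice", "live");

            // Subscribe to the remote peer's stream and display it.
            var inStream:NetStream = new NetStream(nc);
            var remoteVideo:Video = new Video(320, 240);
            remoteVideo.attachNetStream(inStream);
            addChild(remoteVideo);
            inStream.play("bob");
        }
    }
    ```

    A real application adds signaling (who is publishing which stream name), UI, and error handling on top of this skeleton, which is essentially what applications like FSChat do.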
    Hope this is of help.
    Conor

  • Integrate native WebRTC audio/video on all AIR platforms including Adobe Media Server

    Hey everybody,
    It is apparent that Adobe is very busy trying to keep up with improving and fixing video and audio bugs on all platforms. Thankfully, in the past month Adobe finally made an H.264 video stream with Nellymoser audio work on Android! That took a while. But even though that works, AIR on Android still cannot transmit H.264 video. Also, AIR on iOS cannot view live video; instead, the video has to be wrapped in Apple's HLS (HTTP Live Streaming) format, which introduces far too much latency for live audio/video streaming. AIR also cannot transmit AAC audio, and echo cancellation with Nellymoser just doesn't make the cut.
    Everybody is aware that Flash Player can only stream video/audio smoothly for 1 in 10 users. There is just too much for Adobe to do to get audio/video working again, and working for everybody on every device.
    So because WebRTC has much more development effort going into it, and because it is being promoted as a free, open alternative meant to make the proprietary licensing world rethink H.264 and AAC, perhaps Adobe should just focus its efforts on integrating the existing and mostly working WebRTC libraries into Adobe AIR.
    Since these libraries are becoming so popular, Adobe could also integrate support for WebRTC into the Adobe Media Server for recording, peer-to-peer negotiation, and firewall hole punching.
    See my feature request here to integrate native WebRTC audio/video on all platforms
    https://bugbase.adobe.com/index.cfm?event=bug&id=3728399
    So will you vote with me to get WebRTC into Adobe AIR and Adobe Media Server?
    Adobe could essentially deprecate many existing feature requests and bug fixes related to audio/video and solve many problems with WebRTC, such as:
    Implement Opus Codec:
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3016518
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3331640
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3633142
    Fix Enhanced Microphone issues:
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3711062
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3521224
    Add support to stream AAC HE-AAC v2 audio, allowing iOS to be an endpoint that can receive audio and video:
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3694393
    Add support to transmit H264 video from an Android:
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3648262
    Decrease video latency and improve performance on Android regardless of 32-bit/64-bit ARM or Intel processors:
    https://bugbase.adobe.com/index.cfm?event=selectBug&CFGRIDKEY=3648268
    Improve video quality and bandwidth:
    HEVC, H265, VP9

    No, there has been no update or comment from Adobe regarding adding support for WebRTC. According to this year's roadmap they are booked! So it appears we will not be seeing WebRTC in the ActionScript platform this year, unfortunately. Back in March I made some predictions as to what might be considered higher priority for Adobe to be working on. It appears that list is still being worked on by the Adobe AIR/Flash team. They still have not finished support for iOS 8, and they still have several months to get hardware-accelerated video finished. VideoTexture is still in beta and will probably remain in beta until AIR version 19, I would guess. Then the 64-bit AIR runtime will be completed in the third quarter, along with HTML5 improvements in the fourth quarter. It looks like Adobe is completely booked.
