Publish to iOS using GPU rendering?

Can the new publish to iOS feature use GPU rendering and if so, how does one set that?
AIR for iOS allows CPU, GPU, Auto, and Direct options.
Thanks.
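For reference, the render mode maps to the renderMode element of the AIR application descriptor (the same setting exposed as the Render mode dropdown in Flash Pro's AIR for iOS Settings dialog). A minimal descriptor fragment, as a sketch - the namespace version shown is an assumption and should match your AIR SDK:

```xml
<!-- AIR application descriptor fragment (sketch); the namespace
     version (3.2) is an assumption - use the one matching your SDK -->
<application xmlns="http://ns.adobe.com/air/application/3.2">
    <initialWindow>
        <!-- valid values: auto | cpu | gpu | direct -->
        <renderMode>gpu</renderMode>
    </initialWindow>
</application>
```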


Similar Messages

  • How to use filters on ios mobile devices (iPhone/iPad) using GPU rendering (Solved)

    Many moons ago I asked a question here on the forums about how to use filters (specifically a glow filter) on mobile devices (specifically the iPhone) when using GPU rendering and high resolution.
    At the time, there was no answer... filters were unsupported. Period.
    Well, thanks to a buddy of mine, this problem has been solved, and I can report that I have gotten a color matrix filter (for desaturation) AND a glow filter working on the iPhone and the iPad using GPU rendering and high resolution.
    The solution, in a nutshell, is as follows:
    1. Create your display object, e.g. a sprite.
    2. Apply your filter to the sprite as you normally would.
    3. Create a new BitmapData and draw the display object into it.
    4. Put the new BitmapData into a Bitmap and add that to the stage, or do what you want with it.
    When you draw the display object into the BitmapData, it is drawn WITH THE FILTER!
    So if you put the original display object on the stage the filter will not be visible, but it will be visible in the new BitmapData.
    Here is a sample app I created and tested on the iPhone and iPad:
    var bm:Bitmap;          // temp bitmap object
    var bmData:BitmapData;  // temp bitmapData object
    var m:Matrix;           // temp matrix object
    var gl:GlowFilter;      // the glow filter we are going to use
    var sprGL:Sprite;       // the source sprite we are going to apply the filter to
    var sprGL2:Sprite;      // the sprite that will hold the final bitmapData containing the original sprite with its filter
    // create the filter we are going to use
    gl = new GlowFilter(0xFF0000, 0.9, 10, 10, 5, 2, false, false);
    // create the source sprite that will use our glow filter
    sprGL = new Sprite();
    // create a bitmap from an image in our library to place into the source sprite
    bm = new Bitmap(new Msgbox_Background(), "auto", true);
    // add the bitmap to our source sprite
    sprGL.addChild(bm);
    // apply the glow filter to the source sprite
    sprGL.filters = [gl];
    // create the sprite that will hold our filtered bitmap
    sprGL2 = new Sprite();
    // create the bitmapData to hold the new image... remember, with glow filters
    // you need to add the padding manually: double the blur size
    bmData = new BitmapData(sprGL.width + 20, sprGL.height + 20, true, 0);
    // create a matrix to translate the source image when we draw it;
    // the offset should match the filter blur size
    m = new Matrix(1, 0, 0, 1, 10, 10);
    // draw the source sprite, filter included, into the bitmapData
    bmData.draw(sprGL, m);
    // put the new bitmapData into a bitmap so we can see it on screen
    bm = new Bitmap(bmData, "auto", true);
    // put the new bitmap into a sprite - only because the rest of my test app
    // needed it; you could add the bitmap to the stage directly
    sprGL2.addChild(bm);
    // put the source sprite with the filter on the stage;
    // it will draw, but you will not see the filter
    sprGL.x = 100;
    sprGL.y = 50;
    this.addChild(sprGL);
    // put the filtered sprite on the stage; it should look like the source sprite,
    // but a little bigger (because of the glow padding) - and unlike the source
    // sprite, the glow filter should actually be visible now!
    sprGL2.x = 300;
    sprGL2.y = 50;
    this.addChild(sprGL2);

    Great stuff, Dave.
    I currently have a slider which changes the hue of an image in a movieclip; I need it to move through the full range -180 to 180.
    I desperately need to get this working on a tablet but can't get the filters to work in GPU mode. My application runs too slowly in CPU mode.
    var Mcolor:AdjustColor = new AdjustColor();   // this object will hold the color properties
    var Mfilter:ColorMatrixFilter;                // will store the modified color filter to change the image
    var markerSli:SliderUI = new SliderUI(stage, "x", markerSli.track_mc, markerSli.slider_mc, -180, 180, 0, 1);   // using slider from http://evolve.reintroducing.com
    Mcolor.brightness = 0; Mcolor.contrast = 0; Mcolor.hue = 0; Mcolor.saturation = 0;   // set initial values for the filter
    markerSli.addEventListener(SliderUIEvent.ON_UPDATE, markerSlider);                   // listen for slider changes
    function markerSlider($evt:SliderUIEvent):void {
        Mcolor.hue = $evt.currentValue;
        updateM();
    }
    function updateM():void {
        Mfilter = new ColorMatrixFilter(Mcolor.CalculateFinalFlatArray());
        all.marker.filters = [Mfilter];
    }
    How would I use your solution in my case?
    Many thanks.
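    For what it's worth, the same draw-to-BitmapData trick from the glow example above should carry over to a ColorMatrixFilter. A sketch, untested, reusing the names from the snippet above (all.marker, Mcolor, Mfilter) plus a hypothetical resultBitmap assumed to already be on the stage:

    ```actionscript
    // Sketch only: rasterize the filtered marker so the result shows up
    // in GPU render mode. resultBitmap is a hypothetical Bitmap on stage.
    function updateM():void {
        Mfilter = new ColorMatrixFilter(Mcolor.CalculateFinalFlatArray());
        all.marker.filters = [Mfilter];
        // draw the filtered display object into fresh BitmapData
        // (no padding needed: ColorMatrixFilter does not change bounds)
        var bd:BitmapData = new BitmapData(all.marker.width, all.marker.height, true, 0);
        bd.draw(all.marker);
        // swap the rendered pixels into the on-stage bitmap
        resultBitmap.bitmapData = bd;
    }
    ```

    Note that allocating a new BitmapData on every slider update can get expensive; reusing one BitmapData and clearing it with fillRect before each draw would be cheaper.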

  • Cannot connect to camera when using gpu rendering

    Hi, does anyone know why I can't connect to the camera when I use GPU rendering? I can use the camera when I use direct. Thanks!

    Is this mobile or desktop, or both? Direct is the preferable option anyhow, FYI; it's required for Stage3D, as many engines make clear in their requirements.

  • Publish to iOS: Unable to find llvm JNI lib

    Whenever I try to publish to iOS using Flash Professional CS6 I get the following error.
    Exception in thread "main" java.lang.Error: Unable to find llvm JNI lib in:
    /Users/username/Tools/adobe/airsdk/3.5/lib/adt.jar/Darwin
    /Users/username/Tools/adobe/airsdk/3.5/lib/aot/lib/x64
    /Users/username/Tools/adobe/airsdk/3.5/lib/adt.jar
    /Users/username/Tools/adobe/airsdk/3.5/lib
    at adobe.abc.LLVMEmitter.loadJNI(LLVMEmitter.java:582)
    at adobe.abc.LLVMEmitter.<clinit>(LLVMEmitter.java:596)
    at com.adobe.air.ipa.AOTCompiler.generateExtensionsGlue(AOTCompiler.java:432)
    at com.adobe.air.ipa.AOTCompiler.generateMachineBinaries(AOTCompiler.java:1848)
    at com.adobe.air.ipa.IPAOutputStream.createIosBinary(IPAOutputStream.java:428)
    at com.adobe.air.ipa.IPAOutputStream.finalizeSig(IPAOutputStream.java:810)
    at com.adobe.air.ApplicationPackager.createPackage(ApplicationPackager.java:91)
    at com.adobe.air.ipa.IPAPackager.createPackage(IPAPackager.java:257)
    at com.adobe.air.ADT.parseArgsAndGo(ADT.java:571)
    at com.adobe.air.ADT.run(ADT.java:419)
    at com.adobe.air.ADT.main(ADT.java:469)
    This happens with the AIR 3.5 beta SDK as well as the built-in 3.4 and 3.2 SDKs.
    I'm running OS X 10.8.2 and suspect this thread (Mac Java Update Kills iOS Flash Publisher: http://forums.adobe.com/thread/1085439) may have something to do with my problem.
    Any help greatly appreciated.

    Is this really the only solution? I've been trying to go back to Java 1.6 following various threads found on the internet, but I haven't been able to get it to work. Does Adobe have any plans to fix this on their end? Here's the error message I'm getting; switching AIR versions didn't help. We've actually had to remove an app from the App Store because we're unable to export a bug fix, so any help would be appreciated.
    Exception in thread "main" java.lang.Error: Unable to find llvm JNI lib in:
    /Applications/Adobe Flash CS6/AIR3.2/lib/adt.jar/Darwin
    /Applications/Adobe Flash CS6/AIR3.2/lib/aot/lib/x64
    /Applications/Adobe Flash CS6/AIR3.2/lib/adt.jar
    /Applications/Adobe Flash CS6/AIR3.2/lib
         at adobe.abc.LLVMEmitter.loadJNI(LLVMEmitter.java:577)
         at adobe.abc.LLVMEmitter.<clinit>(LLVMEmitter.java:591)
         at com.adobe.air.ipa.AOTCompiler.generateExtensionsGlue(AOTCompiler.java:407)
         at com.adobe.air.ipa.AOTCompiler.generateMachineBinaries(AOTCompiler.java:1585)
         at com.adobe.air.ipa.IPAOutputStream.createIosBinary(IPAOutputStream.java:300)
         at com.adobe.air.ipa.IPAOutputStream.finalizeSig(IPAOutputStream.java:620)
         at com.adobe.air.ApplicationPackager.createPackage(ApplicationPackager.java:91)
         at com.adobe.air.ipa.IPAPackager.createPackage(IPAPackager.java:224)
         at com.adobe.air.ADT.parseArgsAndGo(ADT.java:557)
         at com.adobe.air.ADT.run(ADT.java:414)
         at com.adobe.air.ADT.main(ADT.java:464)

  • Stretched bitmaps blurred when rendered using GPU on iOS, how to avoid?

    Hi,
    in a project of mine (targeting iPhone 4) I'm using pixel art. To show it properly I stretch the bitmap using scaleX/scaleY. However, when rendering using GPU the bitmaps get blurred, ruining the clear pixel art effect that I am after. Rendering using CPU does not blur the bitmaps, but is slower.
    Any ideas on how to tell the renderer not to blur bitmaps?
    A similar question was raised in another thread (http://forums.adobe.com/thread/866712) but it derailed into a benchmarking discussion.
    Thank you in advance.

    I did not.
    Instead I rewrote my code to scale up all my spritesheets in software upon load. Something like this:
    var bd : BitmapData = new BitmapData( _spriteWidth * magicScale, _spriteHeight * magicScale, true, 0x0 );
    var matrix:Matrix = new Matrix( magicScale, 0, 0, magicScale );
    bd.draw( originalBitmapData, matrix );
    frames.push( bd );
    The variable magicScale is stored behind the scenes in the engine so the rest of the code just acts as if the scene was not scaled.
    Note that this solution does not use any blitting techniques, it places bitmaps on the stage and then swaps the bitmapData in them. No off screen buffers, just relying on the GPU.
    Works pretty well!
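    If you'd rather keep scaling on the GPU, the blur most likely comes from bilinear smoothing. As a sketch (untested on device), disabling smoothing both in draw() and on the Bitmap itself should preserve hard pixel edges; _spriteWidth, _spriteHeight, originalBitmapData and magicScale are assumed from the snippet above:

    ```actionscript
    // scale pixel art with nearest-neighbour edges (no smoothing)
    var matrix:Matrix = new Matrix(magicScale, 0, 0, magicScale);
    var bd:BitmapData = new BitmapData(_spriteWidth * magicScale,
                                       _spriteHeight * magicScale, true, 0x0);
    // last argument (smoothing) = false: no bilinear filtering on the draw
    bd.draw(originalBitmapData, matrix, null, null, null, false);
    // smoothing = false on the Bitmap as well, so display scaling stays crisp
    var bmp:Bitmap = new Bitmap(bd, PixelSnapping.AUTO, false);
    addChild(bmp);
    ```

    Whether the GPU re-samples the uploaded texture afterwards depends on the runtime, hence the hedge; pre-scaling in software as above sidesteps that entirely.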

  • Level-2 GPU publish for iOS and Box2D/WCK: does this fall under the revenue share requirements?

    WCK uses the Box2D Alchemy port and therefore uses "Domain Memory".  Part 1 of premium features usage.
    Does publishing to "Level-2 GPU" constitute use of Stage3D?  Is this now part 2?
    Does our project now fall under the revenue share requirements?
    We aren't using Starling or any framework other than WCK.
    I wouldn't think Level-2 GPU constitutes use of Stage3D but have to ask.
    We would appreciate some clarification from Adobe on this.

    Thanks Kevin. I need to read more carefully. We are targeting iOS/Android with this project.
    "Level 2 - GPU" is a choice in the "Hardware Acceleration" drop down box on the main Publish Settings screen when
    you have Flash (.swf) highlighted in the left pane on Flash Pro CS5.5
    The other settings pages have Auto, CPU or GPU under "Rendering" or "Render Mode"
    Like I said I need to read more carefully. 
    What is NOT considered part of the premium features?
    The following uses are NOT considered premium and there is no revenue share required for these uses:
    Use of domain memory alone,
    Use of hardware accelerated Stage3D features alone,
    Use of software rendering of Stage3D, or
    Use of these features within Flash content packaged as apps using AIR
    Thanks again guys

  • How to use GPU acceleration in final rendering process in AE CS6 updated?

    Hello guys!
    Maybe my question is a frequent one, but I didn't find an answer in the FAQs here.
    So I have a new NVIDIA GeForce card with CUDA support. I enabled the Ray-Traced 3D renderer in the Advanced tab of Composition Settings and also in the Fast Previews settings. Everything looks OK and I see no errors, but when I try to render one of VideoHive's projects, the CPU load is 95-100% while the GPU's is only 0-2%. What is the secret? How do I use the GPU for project rendering?

    The GPU is used very little by After Effects.
    See this page for details of what the GPU is used for in After Effects:
    http://blogs.adobe.com/aftereffects/2012/05/gpu-cuda-opengl-features-in-after-effects-cs6.html

  • Media Encoder CC not using GPU acceleration for After Effects CC raytrace comp

    I created a simple scene in After Effects that's using the raytracer engine... I also have GPU enabled in the raytracer settings for After Effects.
    When I render the scene in After Effects using the built-in Render Queue, it only takes 10 minutes to render the scene.
    But when I export the scene to Adobe Media Encoder, it indicates it will take 13 hours to render the same scene.
    So clearly After Effects is using GPU acceleration, but for some reason Media Encoder is not.
    I should also point out that my GeForce GTX 660 Ti card isn't officially supported and I had to manually add it into the list of supported cards in:
    C:\Program Files\Adobe\Adobe After Effects CC\Support Files\raytracer_supported_cards.txt
    C:\Program Files\Adobe\Adobe Media Encoder CC\cuda_supported_cards.txt
    While it's not officially supported, it's weird that After Effects has no problem with it yet Adobe Media Encoder does...
    I also updated After Effects to 12.1 and AME to 7.1 as well as set AME settings to use CUDA but it didn't make a difference.
    Any ideas?

    That is normal behavior.
    The "headless" version of After Effects that is called to render frames for Adobe Media Encoder (or for Premiere Pro) using Dynamic Link does not use the GPU for acceleration of the ray-traced 3D renderer.
    If you are rendering heavy compositions that require GPU processing and/or the Render Multiple Frames Simultaneously multiprocessing, then the recommended workflow is to render and export a losslessly encoded master file from After Effects and have Adobe Media Encoder pick that up from a watch folder to encode into your various delivery formats.

  • AME takes a long time to start encoding when using GPU Acceleration (OpenCL)

    I've had this issue for a while now (on the current update as well as at least the previous one).
    I switched to GPU hardware acceleration over software-only rendering. The actual encoding on my Mac is much faster than with software-only rendering. However, more often than not, the encoding process sticks at 0%, waiting to start for a long time - anywhere from 1 minute to several minutes. Then it eventually starts.
    I can't find what is causing this. It doesn't seem related to any particular project, codec, output format, or anything else I can see. I'd like to keep using GPU acceleration, but it's pointless if it takes 5 times as long to even start as software-only rendering would. It doesn't even show any elapsed time while it's hanging; it stays at 0 until it starts. Like I said, once it starts, it's a much faster render than software would be. Any suggestions? Thanks.
    using an iMac, 3.4Ghz, 24GB RAM, AMD Radeon HD 6970M 2048 MB

    Actually, I just discovered it doesn't seem to have anything to do with OpenCL. I put the rendering back to software-only and it does the same thing: hangs for a long time before it starts to run. Activity Monitor shows it running at 85-95% CPU, 17 threads, and about 615 MB of memory. Activity Monitor doesn't say it's not responding; it just seems kind of idle. It took almost 7 minutes to start rendering.
    1- Running version 7.2.2.29 of Media Encoder,
    version 7.2.2 (33) of Premiere.
    2- Yes, a premiere project of HD 1080 source shot with Canon 60D, output to H.264 YouTube 720, fps 23.976
    Not sure what you are referring to by native rendering. The rendering setting I tried now is Mercury Playback Engine Software Only, if that is what you mean. But OpenCL gives me the same issue now.
    3- H.264 YouTube 720, 23.976 - seems to happen on multiple output types
    4- In Premiere, I have my project timeline window selected. Command-M, selected the output location I wanted and what output codec, selected submit to Queue. Media Encoder comes up, the project loads and I select the run icon. It preps and shows the encoding process setup in the "Encoding" window, then just sits there with no "Elapsed" time occurring until almost 5-6 minutes later, then it kicks in.

  • Why doesn't the Mac Pro 2013 support Illustrator GPU rendering?

    Here, all nvidia cards supporting this:
    http://helpx.adobe.com/illustrator/kb/gpu-performance-preview-improvements.html
    Now I'm thinking that buying the new Mac Pro 2013 maybe wasn't the best idea... Apple put in AMD GPUs, which is fine by me, but come on, give us some more support for them! A normal Windows PC (for 800€) with an NVIDIA GPU is faster at rendering in Illustrator than my Mac workstation (3500+€)! Really? O.o
    And the story doesn't end here. I read an article about GPU usage in this new Mac Pro and then tested it myself. Only one GPU is used and the other is barely touched! And this goes for Maya, Photoshop, Illustrator and other Adobe programs. I was shocked to discover that I have one GPU warming up my room and only one actually doing all the work.
    Ok I understand, that also software developers needs to support stuff and write their software to actually use GPU's, but adobe creative suite is one of examples, where software is actually capable of using GPU via openCL and in some cases even CUDA from nvidia.
    And there is one more thing: new mac pro, has 2 identical AMD GPU's inside, which can work in crossfire configuration under windows in bootcamp, but not in OS X. Sometimes I even play some games on my mac and sometimes I'm testing them (I'm artist, but I also like to develop and test some games in my free time) and it's kinda a shame, that I can run games with better performance under windows in bootcamp.
    When I bought this computer I thought this is going to rock and sweep up with all my previous machines. Well in practice it is much of unused potential and great room warmer. It's kinda sad, that on my girlfriend's PC I can play games with better frames and details on GF 660GTX than on my mac pro... I know I know, mac's aren't for gaming, but hey, sometimes it is good to get my mind on rest and blow up some enemies!
    I'm not saying mac pro isn't great product, which is (very great). But support to use all computer capabilities is very limited and Apple should kick developers in their back side and push them to start developing things to use GPU's more.
    I'm also not expecting Apple to jump on their feet and fix problems overnight. No, just start thinking of adding more support and things that would make our computers much more used and not only good decoration on desk. First would be a good steep adding crossfire support to OS X. That might even help at rendering some 3D scenes?
    Next step would probably be better cooperation with software developers, like Adobe , Autodesk,... If adobe made good support for GPU rendering in Illustrator, aim that we get that support soon as well. I'm sometimes having a lot of paths and objects and rendering goes pretty slow then.
    I'm also thinking about that AMD choice Apple made wasn't so good when you see that Nvidia puts much more effort into supporting more and more stuff, while AMD is sitting idle not even updating their drivers anymore.
    Oh and one more thing: Can we please, please, please, please, preeeeeetty please get 10-bit output support on OS X? It's kinda sad, having good mac workstation with 10-bit capable monitor (dell UP2414Q) and no support for it, while other PC workstations all support it... Photoshop is working much better on OS X and here I have much better integration and everything and I really love working in Photoshop running on OS X, but some basic things are missing and I'm really asking myself sometimes: was that good decision? Photoshop was first written for macs and in 2014 there is much better support for adobe software on windows than on OS X. Where went wrong?
    Don't take me wrong and think I'm just ranting here without any good reason, but think: buying very expensive computer that is supposed to make creative work easy and painless is actually just on paper. In reality computer potential is pretty much unused and that makes me thinking if I would be much better buying iMac instead of mac pro...
    TL;DR: I wish for some more support from Apple for their new expensive mac pro's 2013. They are great piece of equipment, but equivalent PC's running windows are surpassing them in this department, specially when GPU's are involved. Now I would like to point this out to Apple, so I don't know where to write so they will read so I first wrote that here. I hope someone from Apple reads this and give us some feedback on those missing things or maybe I get direction where to write to Apple so I get maybe some response from them(?):
    -not enough GPU support for creative software (adobe suite, autodesk,...) - nvidia + windows offering much more for less money?
    -no crossfire support on OS X. Any particular reason for that?
    -no 10-bit support for 10-bit displays. Why not? Hardware is capable, why software isn't?
    -lack of openCL software out there. Apple isn't doing much to get more support or developers just too lazy to put more effort into this? GPGPU isn't the future as some companies are trying to convince us?

    Grant Bennet-Alder wrote:
    All displays are supported by one GPU -- the display GPU. The second GPU is reserved for un-interrupted GPU computing and has no Display output Hardware.
    AnandTech has a discussion of this in its review of the late 2013 Mac Pro.
    Under OS X the situation is a bit more complicated. There is no system-wide CrossFire X equivalent that will automatically split up rendering tasks across both GPUs.
    By default, one GPU is setup for display duties while the other is used exclusively for GPU compute workloads. GPUs are notoriously bad at context switching, which can severely limit compute performance if the GPU also has to deal with the rendering workloads associated with display in a modern OS. NVIDIA sought to address a similar problem with their Maximus technology, combining Quadro and Tesla cards into a single system for display and compute.
    from section 9:
    http://www.anandtech.com/show/7603/mac-pro-review-late-2013/9
    Another problem here: on windows bootcamp crossfire IS working, but not in OS X. As I wrote, sometimes it would be better to get more rendering power than computing power (actually at current state of GPU computing software, computing GPU isn't working very much if at all). It would be nice to have an option to switch on and off crossfire so we could use more rendering power when needed.
    Also, not all people do video rendering on their computers. What about 3D modelling and 2D work in Photoshop? And maybe people like me occasionally wish to play some games on their Mac?
    You linked me GPU usage from one program (probably Adobe After Effects, Premiere Pro, or Apple Final Cut Pro X?), which is probably one of the only pieces of software that actually uses both GPUs under OS X. I ran the same experiment with the software I use, and I even tested some software I don't use but which is advertised as "optimized" for the Mac Pro, like Pixelmator.
    And guess what? The second GPU is idle 99% of the time. Even when something uses OpenCL in Photoshop, the work is still done on the rendering GPU and not on the compute one, which makes the compute GPU just a good room warmer, as I wrote before. Wouldn't it be better to at least use the second GPU for something rather than heating air?

  • Publishing to iOS via Windows?

    I have been thinking about upgrading my D11.5 to D12 on Windows, but the prerequisites still seem to be Mac OS X 10.7.3 and Xcode. So what is the point of D12 for Windows? Am I correct in assuming that the main new feature (publishing to iOS) is not available? Will it be?
    I am still using D8 for Windows to develop educational applications for students because I cannot run the needed Xtras in 11.5. D12 still does not include the ability to print a cast member, correct? Or the ability to change screen resolution, correct? I use POM Lite in D8, but it is no longer available (and it does not work on the Mac), correct? The full POM costs as much as the entire Director 12 upgrade (that is totally ridiculous).
    I have access to a computer with Mac OS X 10.6.8, but why does D12 require 10.7.3 to publish to iOS? I have been developing educational apps in Director since D4 in 1994, but see no point in upgrading to D12 for Windows. Is there an answer to my questions or some type of workaround?

    You will only be able to publish for iOS on a Mac. Director 12 uses Apple provisioning profiles and distribution certificates which you will need to obtain via the Apple developer portal.

  • CS6/Bridge doesn't use GPU on Nvidia Optimus machines

    Photoshop CS6/Bridge
    Win8.1 64bit/Nvidia GTX 870M
    Is there a way to force Adobe Bridge to using the Nvidia GPU rather than “software rendering”?
    The problem: Starting Bridge on notebooks with the Nvidia Optimus (I tested several) in “preferences -advanced”, the option, “use software rendering”, is not tagged which is correct.
    But as soon as I open an image and recheck with the preferences, “use software rendering” is tagged and greyed out, hence zooming and all other manipulations suffer from a lack of speed and smoothness, slide-show for ex. is not usable at all.
    I’m familiar with the Nvidia control panel and of course the Nvidia GPU is assigned to Bridge.exe there.
    Strange enough, in the registry, the Bridge – preferences value for “use software rendering” always stays on “0”, which is correct.
    BTW. Photoshop is running fine on the Nvidia GPU.
    It is a shame that Bridge performs a lot better on a 6 years old laptop than on a brand new high end gaming machine only because there is Optimus on board.
    Does anybody have an idea for a workaround?

    Is there any insight here? I can't seem to get my brand new work Y70 to use the NVIDIA graphics card for any external displays! What the heck is going on!? And, more so, why can't even the crummy Intel onboard graphics push 2560 x 1600 via either HDMI or USB (through a replicator)? I've had a 30" monitor running 2560 x 1600 for 5 years. This is a brand new laptop and it can't push out high res?! It's crazy to me!! EDIT: I presently have the Intel display adapter disabled in Device Manager. I have video on the laptop display but can no longer get ANY external displays to work. Which suggests, to me, that the NVIDIA GPU is 100% relegated to the laptop display, ONLY.

  • One Flash CC project gets stuck when publishing to iOS. It will quickly publish for Android, and other projects will publish to iOS. Not sure what to change.

    One Flash CC project gets stuck when publishing to iOS. It will quickly publish for Android, and other projects will publish to iOS. Not sure what to change.

    Thanks for the info.
    I recently stumbled on an article about how to view a packaged file (.apk) published using the AIR 3.2 for Android setting (Adobe Flash CS6). All you need to do is change the file's extension from .apk to .rar or .zip and extract it as you would any other zip or rar file. I was able to confirm that the packaged .apk file does indeed contain all the videos for the presentation. I have more than 80 videos in the presentation. All videos included in the package are the same format and resolution (FLV 320x240) but of different durations. I intentionally included all the videos inside the package so I can use the presentation anywhere and anytime without needing to connect to the internet. The packaged .apk file size is 1.23GB; it includes the project's swf and xml files, the component swf, and the videos.
    When installed on Android, I noticed that the package remained an .apk file. Not really sure if Android temporarily unpacks it when you run the app.
    Testing phase:
    I see no problem when testing the project on my desktop computer. But when it's packaged and installed on my device, the problem arises.
    Problem:
    I'm able to install and run the app on my device, but it won't play some of the videos included in the package.
    Was this due to the limited RAM or processor of my Android device?
    I wanted to distribute the presentation to friends, so I was trying to package it for easy access and installation on their devices.
    Alternative Solution:
    I installed a third-party app called SWF Player by BIT LABS and copied all the files to my device (the project's swf and component swf with the videos). The project worked like a charm: SWF Player ran the presentation as if it were running on my desktop, with all buttons and videos working properly. Although this alternative worked for me, I still want to package the presentation for easy access and distribution to friends.
    Please help...Thanks in advance.

  • Is there doc for the exact steps to publish to iOS?

    I am looking for the EXACT steps for publishing to iOS. The Xcode doc talks about "making an archive" but that doesn't seem to work because it is specific to apps made entirely in Xcode. I can get the IPA onto an iPad or iPhone attached to the Mac, but that's it. Certainly I could FedEx a device to somebody for them to see it, but no, that is not a solution.
    This link (Publishing to iOS devices using Director 12) does NOT have enough information; there are steps missing. For example, if you make a provisioning file on developer.apple.com, that version will fail, showing a yellow icon when you drag it to the Organizer.
    Maybe there is a link that I am missing. Please help.

    Hello webdesignsf,
    I'm in the same situation as you - I'm just about to finish my first iPad app, and will be submitting it to the Apple App Store very soon.
    (Can someone please verify the procedure below?)
    As soon as I figure it all out, I'll write down the steps.
    Regards, Milky.
    In the meantime, I'm guessing…
    1. Get a distribution certificate and make a distribution provisioning profile.
    2. Create icons and screenshots.
    3. (??? Maybe – set up embedded icons by editing the info.plist to reference unique icons – see the bottom of this message for more info ???)
    4. Make your .ipa file with Director 12 (using the above distribution certificate / provisioning profile).
    5. Change the extension of your .ipa file from .ipa to .zip (you just rename the file!).
    6. Unzip this .zip file (this will create a folder called ‘Payload’).
    7. Find the ‘Payload’ folder – inside you’ll now see a .app file!
    8. Highlight this .app file and, with the right mouse button, select ‘Compress’. (This will pack the .app into a .zip file – this is what Apple calls a binary file.)
    9. Navigate to the iTunes Connect area of the iOS Dev Center.
    10. Download and install the ‘Application Loader’ utility from the iOS Dev Center (I think it’s now a part of Xcode too!).
    11. Go to the Manage Your Apps page and click ‘Add New App’.
    12. Fill out the forms describing your company and application.
    13. When asked, upload the icons and screenshots.
    14. Save the app description.
    15. Back on the Manage Your Apps page, select the app description you just created, and click the ‘Ready to Upload Binary’ button.
    16. Fill out the export compliance form.
    17. Your app should now be in a ‘Waiting for Upload’ state on the Manage Your Apps page.
    18. Now use the Application Loader utility to upload your binary file.
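    The rename / unzip / re-compress part of the procedure above can be sketched in Python. This is only an illustration – the app name "MyApp" is hypothetical, and a placeholder .ipa is fabricated first so the sketch actually runs. On a real Mac you would just rename the file and use Finder's Compress, which (unlike this sketch) preserves the bundle's symlinks and executable permissions that code signing depends on.

```python
import os
import shutil
import zipfile

# Demo setup: fabricate a minimal placeholder .ipa (a real one comes from
# Director 12's publish step). "MyApp" is a hypothetical name.
os.makedirs("Payload/MyApp.app", exist_ok=True)
with open("Payload/MyApp.app/Info.plist", "w") as f:
    f.write("placeholder")
with zipfile.ZipFile("MyApp.ipa", "w") as z:
    z.write("Payload/MyApp.app/Info.plist")

# An .ipa is just a zip archive, so "renaming" it to .zip is all it takes.
shutil.copy("MyApp.ipa", "MyApp.zip")

# Unzipping it recreates the 'Payload' folder, which holds the .app bundle.
with zipfile.ZipFile("MyApp.zip") as z:
    z.extractall("extracted")

# Re-compress the .app bundle itself into the zip Apple calls the "binary",
# storing paths relative to Payload/ so they start at MyApp.app/.
app_dir = os.path.join("extracted", "Payload", "MyApp.app")
with zipfile.ZipFile("MyApp-binary.zip", "w", zipfile.ZIP_DEFLATED) as z:
    for root, _, files in os.walk(app_dir):
        for name in files:
            path = os.path.join(root, name)
            z.write(path, os.path.relpath(path, os.path.join("extracted", "Payload")))

print(zipfile.ZipFile("MyApp-binary.zip").namelist())  # → ['MyApp.app/Info.plist']
```

    Again, for a real submission use Finder's right-click Compress on the .app; Python's zipfile does not preserve symlinks, which a signed iOS bundle contains.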
    Other notes:
    There is a whole list of recommendations for getting apps approved by Apple – I'll write these down one day.
    Make sure your app is as small as you can get it, and make sure it's fast to load the first screen (3 seconds or less).
    http://www.adobe.com/devnet/flash/articles/app_store_guide.html
    Guide for Apple App Store submissions | Adobe Developer Connection
    There are loads of YouTube videos that show submissions to the App Store with other development systems,
    but they are mostly the same when it comes to setting up the distribution certificate, etc.
    http://www.youtube.com/watch?v=D4iPyGyrhcM
    Uploading your GameSalad game to Apple - YouTube
    Setup for embedded icons:
    Submitting icons and screenshots is part of the app submission process.
    But you may need to create a Director-made .ipa file with unique icons already embedded first – I just don't know yet!
    (pallottadesign) wrote some instructions regarding this way back in May 2013 – I don't know if this step is needed anymore, but I added it here just in case....
    Re: director 12 IOS publishing icon plist
    pallottadesign May 1, 2013 6:17 AM (in response to ralph2511)
    Editing a separate info.plist to reference your app icons is actually a better approach than altering the defaults within Otto.app.
    The procedure is simple:
    1. Put your unique app icons in the same folder as your Director file. You can name these icons whatever you wish – the same as the defaults in Otto.app, i.e. "projector_72x72.png" and "projector_57x57.png", or with unique names.
    2. Copy the info.plist file to the same folder as your Director file.
    3. If you have given your icons unique names, open the info.plist file in Xcode and find the "Icon files" key. Edit "Item 0" and change the entry to the name of the 72x72-pixel PNG icon. Edit "Item 1" and change the entry to the name of the 57x57-pixel PNG icon.
    4. Save the info.plist file.
    5. Within Director, open Publish Settings and, in the "Files" tab, enable "Copy linked and dependent files." Click "Add Dependent Files" and navigate to the folder with your unique icons. Select each one and make sure the radio button is checked for each.
    6. In the iOS tab, select the "Info Plist Path" option and navigate to the location of your edited info.plist. Select the info.plist file in the dialog box and click "Choose."
    7. Publish your iOS app and test. The icons should now be embedded within your iOS app bundle.
    You may be able to put the icons in an "images" folder within your app folder structure, but you will still have to use the "Copy linked and dependent files" and "Add Dependent Files" options to have that folder and the icons bundled within your iOS app.
    If you intend to develop apps for the app store, you may need additional icons at larger sizes.
    For reference: http://developer.apple.com/library/mac/#documentation/FileManagement/Conceptual/FileSystemProgrammingGuide/FileSystemOverview/FileSystemOverview.html
    and for icons:
    http://developer.apple.com/library/ios/#documentation/userexperience/conceptual/mobilehig/IconsImages/IconsImages.html
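    The "Item 0" / "Item 1" edit described above can also be done without opening Xcode. Here is a minimal sketch using Python's standard plistlib, under two assumptions: the icon names ("myicon_72x72.png" / "myicon_57x57.png") are made up for illustration, and the key Xcode displays as "Icon files" is CFBundleIconFiles in the raw plist. A tiny info.plist is written out first so the sketch is runnable on its own.

```python
import plistlib

# Demo setup: write a minimal info.plist resembling the Otto.app defaults.
# (Assumption: Xcode's "Icon files" key is CFBundleIconFiles in raw form.)
with open("info.plist", "wb") as f:
    plistlib.dump({"CFBundleIconFiles": ["projector_72x72.png",
                                         "projector_57x57.png"]}, f)

# Replace "Item 0" (the 72x72 icon) and "Item 1" (the 57x57 icon)
# with the unique icon names.
with open("info.plist", "rb") as f:
    info = plistlib.load(f)
info["CFBundleIconFiles"][0] = "myicon_72x72.png"  # hypothetical name
info["CFBundleIconFiles"][1] = "myicon_57x57.png"  # hypothetical name
with open("info.plist", "wb") as f:
    plistlib.dump(info, f)

with open("info.plist", "rb") as f:
    print(plistlib.load(f)["CFBundleIconFiles"])
# → ['myicon_72x72.png', 'myicon_57x57.png']
```

    On a Mac, the same edit can be made with the bundled /usr/libexec/PlistBuddy utility or directly in Xcode's plist editor, as the original instructions describe.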

  • Media Encoder not using GPU in CC2014.2

    I updated to CC2014.2 yesterday.
    Rendering a project from Premiere Pro's Export function engaged the GPU as expected, I hear the fans spin up, renders are fast.
    However, when sending to Media Encoder, project render times are astronomical – the GPU is not engaging despite CUDA being switched on in all applicable options. The difference is a few minutes vs. 2 hours. No non-native plug-ins are in use.
    This was not a problem a few days ago with CC2014.1. For the moment I'm stuck with exporting directly from Premiere until it's fixed. Has anyone else experienced a similar bug?
    Windows 7 64
    GeForce 580GTX 3GB
    Driver 341.44

    Hmm, it seems we both broke the rule of "don't update mid-project!"
    Still, hopefully this will lead to a quick fix, since it will affect many professionals quite badly.
    Yes, for me it's just no GPU rendering in Media Encoder. I've been able to work around this by using Export, which is extremely fast.
    My timeline playback works fine too, as does SpeedGrade, so it appears this is an issue with Media Encoder alone on my machine.
