Can beginBitmapFill be handled by the GPU?

With the packager, is it possible to speed up calls to beginBitmapFill by using the GPU? Each call includes a warping matrix. The source in each case is the same large image (as big as 2048x2048).
The code implements the Thin Plate Spline algorithm to produce something similar to the new Photoshop Puppet Warp feature. It creates a triangular mesh (about 200 triangles) over a bitmap and uses a different matrix to render each triangle to the screen.
We have a prototype Objective-C++ version that uses texture mapping in OpenGL on the iPad. Because it's OpenGL, it uses the GPU and is very fast. In that version, we feed the triangles to OpenGL and let it iterate through them. But we would prefer to develop the app using Flex.
Thanks for any suggestions or opinions.
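The per-triangle matrix the question refers to can be derived directly: it is the unique affine transform taking a triangle's three source (texture) vertices to its three warped destination vertices. A minimal sketch of that solver, in Python for clarity (function names are illustrative, not from the original code):

```python
# Hypothetical sketch: compute the 2x3 affine transform that carries one
# triangle onto another -- the per-triangle matrix a beginBitmapFill-style
# texture mapper needs.

def triangle_affine(src, dst):
    """Return (a, b, c, d, tx, ty) with x' = a*x + b*y + tx and
    y' = c*x + d*y + ty, mapping the three src points onto the three
    dst points. Solved by Cramer's rule on a 3x3 system."""
    (x1, y1), (x2, y2), (x3, y3) = src
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)

    def solve(c1, c2, c3):
        # Solve [x y 1] . [p q r]^T = c for one output coordinate.
        p = (c1 * (y2 - y3) - y1 * (c2 - c3) + (c2 * y3 - c3 * y2)) / det
        q = (x1 * (c2 - c3) - c1 * (x2 - x3) + (x2 * c3 - x3 * c2)) / det
        r = (x1 * (y2 * c3 - y3 * c2) - y1 * (x2 * c3 - x3 * c2)
             + c1 * (x2 * y3 - x3 * y2)) / det
        return p, q, r

    (u1, v1), (u2, v2), (u3, v3) = dst
    a, b, tx = solve(u1, u2, u3)
    c, d, ty = solve(v1, v2, v3)
    return a, b, c, d, tx, ty

def apply_affine(m, pt):
    """Apply the affine transform m to a point, for verification."""
    a, b, c, d, tx, ty = m
    x, y = pt
    return (a * x + b * y + tx, c * x + d * y + ty)
```

In ActionScript the same six numbers would feed flash.geom.Matrix, though note (as an assumption to double-check against the docs) that its constructor order Matrix(a, b, c, d, tx, ty) applies x' = a·x + c·y + tx and y' = b·x + d·y + ty, so the skew terms are interleaved differently than above.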

It cannot be tracked by either. It is tracked via Find My iPhone, so long as that was activated before the device came up missing. It has to be powered on and connected to the Internet as well.

Similar Messages

  • Can I get handle to Database Row before an entity is deleted

    Hi,
    We are using stored procedures to list, insert, update and delete entity objects. To use the stored procedures I override the default queries generated by TopLink and use stored procedure queries instead. The issue here is that my stored procedure requires more arguments (like the name of the user doing the insert) than there are fields in the entity. I need to provide these argument values to the call. For this I do something like:
    session.getDescriptor(MyClass.class).getEventManager().addListener(new DescriptorEventAdapter() {
        public void aboutToInsert(DescriptorEvent evt) {
            evt.getRow().put("USER", evt.getSession().getProperty("username"));
        }
    });
    I set the username on the UnitOfWork, grab the value in the event handler, and put it into the row just before the insert. I do the same for modify. But I don't see a similar method for delete. Is there an aboutToDelete() method where I can get a handle to the row? preDelete() does not provide a handle to the row. How can I insert additional argument values into the delete query?
    Thanks,
    Harini

    Hello Harini,
    Wow, you are correct, there is no aboutToDelete. I guess manipulating the row before deletion is not that common for our TopLink clients, as this is the first request I have heard for it. I will add this request to our enhancement database. You might want to follow it up with support if it is a priority for you.
    As a workaround you could use the preDelete event to change the stored procedure call with the user name argument (i.e. build the procedure call dynamically in preDelete and set it into the query). The query will have been cloned at this point, so there should be no concurrency issues. When doing this, make sure you do not set the deleteQuery in the descriptor query manager, as that would override the original query.
    Example:
    descriptor.getEventManager().addListener(new DescriptorEventAdapter() {
        public void preDelete(DescriptorEvent event) {
            StoredProcedureCall call = new StoredProcedureCall();
            call.setProcedureName("Delete_proc");
            call.addNamedArgument("P_ID", "ID");
            call.addNamedArgumentValue("USER", "scott");
            call.addNamedArgumentValue("TIME", new java.util.Date());
            event.getQuery().setCall(call);
        }
    });

  • How to activate the GPU with Yosemite, iris pro on macBOOK ?

    With After Effects CS6 on a MacBook Pro (under Yosemite) with an Intel Iris Pro, I can't activate the GPU renderer in the graphics options, even after installing the latest CUDA and trying the fix I found on the video maze forum. Has anybody found a solution?

    Do not install CUDA software on a computer without CUDA (Nvidia) hardware. Doing so can be very damaging.
    The GPU is almost entirely irrelevant to After Effects.
    See this page for details of the few things that the GPU does with After Effects:
    GPU (CUDA, OpenGL) features in After Effects

  • Hi, I don't know who to turn to: the person who handles all the computer stuff is out of town. This A.M. I started up Firefox and there was a message re an update. I thought I knew just to click on install and wait to restart Firefox. But what I can remember

    Hi, I don't know who to turn to: the person who handles all the computer stuff is out of town. This A.M. I started up Firefox and there was a message re an update. I thought I knew just to click on install and wait to restart Firefox. But what I can remember is that another window opened and asked me to choose 'something' (an icon or banner or?); I thought I must click something (I guess I should read it all clearly). Firefox did update to 3.6.4, but it also inserted some sort of what might be called a banner? It only partly covers the right half of my 3 toolbars at the top of the Firefox window. It can't be used for anything, as it is just a mess of colors and unknowns, except what looks like a 'top' and what I recognize as a Firefox icon (but only partly visible)! Now, what it does do is partly hide that whole portion of the toolbars; they are hard to read, and in turn make this 'thing' impossible to figure out, if I would care to. I can't find any way to remove it or use it. But since the Firefox icon is on it, I guess you should know what it is and how to trash it. I am sorry for this occurrence caused by my carelessness, but I'm not much of a computer person. HELP! Thank you for any help. Judi
    == This happened ==
    Every time Firefox opened
    == Today, Thursday June 24, 2010

    The only thing I can imagine: it's a [https://addons.mozilla.org/en-US/firefox/personas/ Persona].
    Tools -> Add-ons -> Themes
    The Persona (lightweight theme) should be listed there, with a "Remove" button.
    Click the Remove button, and the Persona (this mix of colors) should go away instantly.
    Does that help?

  • How can I get the Airport Express to handle all the PPPoE stuff?

    Hi, I’m visiting my family in China, and now trying to help my dad, with his Airport Express and how to set up a PPPoE connection.
    We have currently set up the Airport Express in bridge mode (not distributing IP addresses, and selecting DHCP under the Internet tab in Admin Utility). The Airport settings on our two computers are set up to connect using PPPoE with the given login name and password. (PS: we cannot see the base station in Airport Admin Utility when using these settings; we would have to select a new location from the Apple menu to see it and make configurations.)
    What we want to do is have the Airport Express connect to the ISP using a PPPoE connection, rather than through the computer.
    I know there is a 'Connect using PPPoE' option in Airport Admin Utility, letting me input an account name and password. If I select this setting instead of DHCP, enable distribution of IP addresses, and configure my Airport card NOT to connect using PPPoE, I will see my base station in Airport Admin Utility with the IP address 10.0.1.1 (or similar), and my computer will have x.x.x.2. Next to the Airport icon in the menu bar, a scrolling message says 'Looking for PPPoE host' without anything happening. I am sure my account name and password are correct, as they've both worked when using this computer to connect via PPPoE (like now).
    How can I get the Airport Express to handle all the PPPoE stuff without using bridge mode?
    PS: Both my dad and I have iPhones, which we can't seem to get to connect unless they've been given an IP address, since as far as I know there's no option for inputting a PPPoE user name and password.

    Any solutions to this? I'm in China also, in Beijing, trying to get my Airport Express to work with an ADSL modem.
    Direct ethernet cable connection to my Macbook works fine.
    When I configure the Airport Express with the ID and password that seems to be fine also – Airport Express shows a green light.
    But I cannot figure out the settings to connect wirelessly from my Macbook to the Airport Express. I get a constantly scrolling message: "Looking for PPPoE host..."
    thanks
    Paul

  • Is there a way I can force an application to use the GPU?

    I own a MacBook Pro 15in with Retina display with an NVIDIA GeForce GT 750M discrete GPU. Currently I have my settings in the "Energy Saver" section of System Preferences set for automatic graphics switching. I use an interior design application called Live Interior 3D Pro. This application does a lot of 3D rendering that lags for 2-3 seconds. I checked Activity Monitor to see if the application was using the GPU, and it indicated it was not. I understand that I can uncheck automatic graphics switching to fix the problem, but I hear that hurts battery life. I do not want to switch off the automatic switching setting, but I do want this specific application to use the discrete GPU instead of the built-in Intel Iris Pro graphics. Is there a setting I can use to make just this application fire up the GPU when I open it? If not, do you know if switching off the automatic graphics switching setting affects the battery at all? Also, does this affect the overall performance of the machine, since I have 16GB of RAM and only 2GB of GPU memory, or are those totally different and irrelevant? Thank you for your time and thank you in advance for your help!

    jspatel1011,
    your MacBook Pro’s discrete NVIDIA GPU is more powerful than its integrated Intel graphics; that power comes at the cost of increased energy consumption. There’s no setting to allow exceptions to the automatic graphics switching; you’ll have to determine whether that power vs. energy tradeoff is worth it when running Live Interior 3D Pro or not. If it is worth it, then disable the automatic graphics switching before you run Live Interior 3D Pro, and reënable it after you quit that app. (You might look into AppleScript to see if that disabling and reënabling can be automated.) If it’s not worth it, then you’ll have to live with the lagging while it does the 3D rendering.

  • I bought a 2.4GHz MacBook Pro in December of 2008 and I think the GPU is starting to fail. Can I still be covered for a repair because of the faulty NVIDIA GPU?

    I bought a 2.4GHz MacBook Pro in December of 2008 and I think the GPU is starting to fail. Can I still be covered for a repair because of the faulty NVIDIA GPU?

    If your computer qualifies - MacBook Pro: Distorted video or no video issues  
    If it does, call Apple Customer Relations or print out a copy of the article and bring it in to the repair shop. 

  • How to Setup Forward Error Handling in PI Scenarios. Can you help me with the same with screen shots if possible?

    Dear all
    How to Setup Forward Error Handling in PI Scenarios. Can you help me with the same with screen shots if possible?
    Thanks
    Regards
    karan

    Hello
    These are the errors:
    1. <Trace level="1" type="T">no interface action for sender or receiver found</Trace>
    2. <Trace level="1" type="System_Error">Application-Error exception return from pipeline processing!</Trace>
    3.
    <Trace level="1" type="T">Application Error at Receiver... => ROLLBACK WORK</Trace>
    <Trace level="1" type="T">System Error at Receiver... => ROLLBACK WORK</Trace>
    <Trace level="1" type="B" name="CL_XMS_MAIN-WRITE_MESSAGE_LOG_TO_PERSIST" />
    <Trace level="1" type="System_Error">Application-Error exception return from pipeline processing!</Trace>
    <Trace level="3" type="T">No persisting of message after plsrv call, because of config</Trace>
    <Trace level="3" type="T">Error of the previous version:</Trace>
    <Trace level="3" type="T">Error ID APPLICATION_ERROR</Trace>
    There are repeating errors also.
    Thanks
    Regards
    karan

  • Can C++ AMP binaries be loaded straight to the GPU

    Hi,
    I'm currently investigating whether C++ AMP can be used in a situation where I need to quickly load pre-compiled kernel binaries to the GPU for execution.
    For example CUDA API can load CUDA binaries pre-compiled from PTX (GPU assembly code) that the GPU can execute straight away.
    My understanding is that C++ AMP sits on top of DirectX, so are we looking at loading shader code?
    What I want to do is build up a list of operations a user performs when executing our application, JIT those operations to binary code, and then, next time the same or similar operations are required, use some sort of hash to look up and load the pre-compiled binary for immediate execution on the GPU, avoiding the JIT stage. Is this possible with C++ AMP?
    Hope that makes sense. Thanks for any help,
    Julian
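    Whatever loading mechanism the GPU runtime ends up exposing, the hash-keyed lookup described above is API-agnostic. A minimal sketch of that cache layer in Python (all names here are illustrative, and compile_fn stands in for whatever JIT entry point the real runtime provides):

```python
import hashlib

class KernelCache:
    """Cache compiled kernel binaries, keyed by a hash of the operation list."""

    def __init__(self):
        self._binaries = {}
        self.compilations = 0  # counts how often the real JIT ran

    def _key(self, ops):
        # Hash each op with a separator byte so ["ab","c"] != ["a","bc"].
        h = hashlib.sha256()
        for op in ops:
            h.update(op.encode("utf-8"))
            h.update(b"\x00")
        return h.hexdigest()

    def get_or_compile(self, ops, compile_fn):
        """Return a cached binary for ops, JIT-compiling only on a miss."""
        k = self._key(ops)
        if k not in self._binaries:
            self.compilations += 1
            self._binaries[k] = compile_fn(ops)
        return self._binaries[k]
```

    In a real deployment the dictionary would be backed by on-disk storage so the cache survives across runs, which is where the "avoid the JIT stage next time" saving actually comes from.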

    The EoS for this product was January 20, 2014. That is the reason you can not find anything past 8.6.2. The replacement product is the BE6K.  For more information you can reference the EoS/EoL documentation.
    You will most likely find no documentation on running CME 10 or 10.5 on this platform for that reason.

  • How can I know what transaction is handled by the TM?

    How can I know what transaction is handled by the TM?

    Hi;
    Please see below
    E-Business Oracle Trade Management Product Information Center [ID 1297829.1]
    Regard
    Helios

  • In Aperture 3, creating a book using the Formal theme, in Edit Layout I can move but not resize the image box, even though handles appear and the manual says I should be able to. Help

    Can I resize image boxes with the displayed handles, as the User Manual suggests I can, when creating a book in Aperture 3 using the Formal theme (with the beveled mask) in Edit Layout mode? I can move the box with the yellow guidelines, but those handles will not move to allow me to adjust the size. If I delete the box and add a photo box, I can resize, but I lose those nice beveled masks!

    Well, I tried on my wife's MB and it worked fine, but it keeps the same aspect ratio. Nearly there. I tried the local Apple outlet, but they had just taken Aperture off the demo machine.
    However, I have discovered that I can use the right-click option for 'photo aspect ratio' and it will resize the box to accommodate my pictures. By fiddling I can select an alternative ratio and/or picture and change the size of the frame up or down to get an appropriate size and ratio. E.g. if I have a small landscape picture, I can pop in a bigger square picture (it gets cropped), then choose a portrait mode (cropped differently), then pop in my landscape picture (still cropped, but bigger), then use the 'photo aspect ratio' option to get it uncropped and bigger. Not a really satisfactory workaround, but better than nothing.
    By combining the fiddling with aspect ratios and the pinch/squeeze routine - on the MB (or via Lion and a trackpad, I assume) - I should get there.
    Pity those pesky handles don't work!!
    Thanks folks for the help

  • Can CS6 exploit the GPU on my desktop when accessed from my laptop with RDP?

    Just to clarify further: both machines are running Windows 7 Professional or better (so the full version of RDP is available), with CS6 running on the desktop, accessed from my laptop.
    As the GPU is being used as a compute resource, will CS use the desktop GPU for acceleration?
    Sad day if not - I am considering ditching expensive laptops and using a medium-spec laptop together with a good gutsy desktop (which is about 1/3rd the price for the performance), as I have a good network to run over at home.

    The water is somewhat muddied.  I have only a Windows 7 and Windows 8 system to try it with at the moment.
    This is what I got when I RDP'd into a Windows 8 VM from Windows 7:
    Adobe Photoshop Version: 13.0 (13.0 20120315.r.428 2012/03/15:21:00:00) x64
    Operating System: Windows NT
    Version: 6.2
    System architecture: Intel CPU Family:6, Model:7, Stepping:6 with MMX, SSE Integer, SSE FP, SSE2, SSE3, SSE4.1
    Physical processor count: 4
    Processor speed: 3159 MHz
    Built-in memory: 4095 MB
    Free memory: 3104 MB
    Memory available to Photoshop: 3466 MB
    Memory used by Photoshop: 60 %
    Image tile size: 128K
    Image cache levels: 4
    The GPU Sniffer crashed on 8/21/2012 at 5:33:29 PM
    Display: 1
    Display Bounds:=  top: 0, left: 0, bottom: 1200, right: 1600
    No video card detected
    Serial number: Tryout Version
    Application folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\
    Temporary file path: C:\TEMP\
    Photoshop scratch has async I/O enabled
    Scratch volume(s):
      Startup, 159.7G, 134.6G free
    Required Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Required\
    Primary Plug-ins folder: C:\Program Files\Adobe\Adobe Photoshop CS6 (64 Bit)\Plug-ins\
    Additional Plug-ins folder: \\Noelc4\c\DEV\ProDigital\trunk\plugins\Output\Win\LastBuild\x64\
    I have Photoshop working okay on the above system using the OpenGL VMware provides when I log directly into the console...
    I don't think the above failure says anything definitive one way or another what's up with your particular request yet, though it does tell me I need to do some additional testing to try to understand what's happening so that I know how my own products are going to behave in this situation.
    I have a Windows 7 VM I can test with.  Watch this space...
    -Noel

  • Can i upgrade the gpu of imac 21 2012late?

    Hi guys, I have a late-2012 iMac 21-inch (the one with the very slim edges).
    It has an i5 2.7GHz and a GT 640M with 512MB of VRAM.
    I'm a teen and I'm playing all those modern DX11 games (I've Boot Camped it onto Win7 64-bit).
    For example, I can run BF3 in full HD with almost everything on medium, but a next-gen game like Assassin's Creed 4 I can only run at 900p on lowest at 30fps, with lag.
    The 30 fps wouldn't be a problem, but it lags every second.
    I want a better GPU, but not the GT 750M - that's nothing for me. I want to fill it up with a GTX 970M or 980M.
    Last year I saw a 27-inch one with a GTX 780M, so is it possible to have that kind of GPU in the 21-inch?
    Will the power supply be OK, or do I need a better power supply?
    Can I somehow wire in a better power supply?
    Can I wire in a GTX 970 or 980 desktop version to my iMac via the Thunderbolt I/O port with that 100-pound box?

    The graphics processor is soldered on the logic board and is not upgradeable. To do so would require desoldering the GPU and replacing it with another, assuming the pins of the two matched perfectly.

  • Can I use Handel-C with the LabVIEW FPGA module?

    Hi,
    Can I use Handel-C with the LabVIEW FPGA module? I am working with CompactRIO right now, so I want to know whether I can use Handel-C with CompactRIO. Can I access the FPGA in CompactRIO independently of LabVIEW, i.e. can I program it without using LabVIEW?
    regards,
    Vishnu.

    Hi Vishnu,
    Although we don't support Handel-C directly, it is possible to design your algorithms using 3rd party tools or system integrators and consume them from a top-level LabVIEW VI. The webcast at http://zone.ni.com/wv/app/doc/p/id/wv-268 shows an example of how you can use IP from C-based tools provided by Celoxica in the LabVIEW FPGA environment.

  • How can I enable the GPU option on a new mac pro with 2 AMD D700 graphic cards?

    I have recently purchased the new Mac Pro and I see that the GPU option is not enabled; only the CPU one is recognized, which is frustrating. Does anybody know how to enable the GPU option?
    thanks

    See this for details of how the GPU is used in After Effects:
    http://blogs.adobe.com/aftereffects/2012/05/gpu-cuda-opengl-features-in-after-effects-cs6.html
    Note that your card will be used for all of the GPU acceleration features except one, the ray-traced 3D renderer, which requires an NVIDIA GPU. That is an obsolete feature that you don't need to worry about anyway. There are many better alternatives for 3D in After Effects, including Cinema 4D, which is now included with After Effects:
    http://blogs.adobe.com/aftereffects/2013/04/details-of-cinema-4d-integration-with-after-effects.html
