Large Images + MPE Hardware = Decreased Performance?

I have been enjoying the performance gain of using the GPU on my graphics card (GTS 250) even though it is not officially supported. I have found that some timelines that push the CPU to 95% in software mode play back at only 25% CPU using the GPU.
I have found only one anomaly so far, and I am not sure whether it is specific to the Mercury Playback Engine in hardware mode, to using an unsupported card, or just to my card/system in particular.  If I place large JPEG stills (over 4,320 pixels along the long edge) on the timeline (DSLR 1080p @ 29.97), animate the position and scale using the standard Motion effect, and apply a cross dissolve, MPE in hardware mode will either crash during export or slow it to a crawl.  It is the only case I have found where exporting in software mode is much faster.
However, if I reduce all the images so that the longest edge is below 4,000 pixels, the speed of MPE in hardware mode returns and exports work fine.  I am curious to hear whether others have noticed any performance lag/problems from using large images?
PPRO CS5 5.0
Nvidia GTS 250
Nvidia Driver Version 197.?

Found it... it was on Karl Soule's website. Here is what he says about CS5 and maximum resolutions.
" In the Adobe Mercury Engine, the maximum timeline resolution is 10240 pixels by 8192 pixels, more than enough for any mastering resolution we'll see in the near future. The maximum resolution for clips dropped in any timeline will be limited to 256 megapixels in any direction. So, for example, footage from a 32000 by 8000 pixel sensor could be imported and dropped onto a timeline."
From the bottom of this page: http://blogs.adobe.com/VideoRoad/mercury/
I am sure that Photoshop does a better job scaling images, but for projects with a lot of images, it just does not seem very practical to crop/scale each image based on how much panning will be done.  My project is not slowing down due to larger images, and playback on the timeline is great; I was just wondering why MPE in hardware mode bogs down.
By today's standards, an image of 4,300 pixels along the long edge is not that big....
I have found that the problem is related to the cross dissolve.  If I remove the cross dissolve, the sequence exports with all the GPU goodness speed.  Makes me wonder whether both images have to be loaded by the GPU to calculate the dissolve.  If that is the case, then two images over 4,096 pixels would put my card over the 8,192 maximum image texture... however, the GTX 285 has the same maximum image texture, so I wonder whether the problem is specific to my card/system or affects all GPU rendering?
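That guess can at least be written down as arithmetic. A hypothetical sketch (the 8,192 figure is the maximum texture dimension reported for these cards; the assumption that a dissolve needs both frames resident as textures is mine, not anything Adobe has documented):

```java
public class TextureCheck {
    // Assumption (mine): a GPU cross dissolve keeps both source stills
    // resident as textures, so their combined long edges must fit within
    // the card's maximum texture dimension.
    static boolean exceedsMaxTexture(int longEdgePx, int maxTexturePx) {
        return 2 * longEdgePx > maxTexturePx;
    }

    public static void main(String[] args) {
        int maxTexture = 8192; // reported limit for the GTS 250 / GTX 285
        System.out.println(exceedsMaxTexture(4320, maxTexture)); // original stills
        System.out.println(exceedsMaxTexture(3900, maxTexture)); // scaled-down stills
    }
}
```

Half of 8,192 is 4,096 px, which sits neatly between the failing 4,320 px stills and the working sub-4,000 px ones.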

Similar Messages

  • Larger image size and disk performance

    hey all,
    Granted, the majority of users here won't experience issues with larger image sizes, but I'm hoping a few may be able to share some experience.
    I work with an H3D, a Mamiya RB (with various Leaf backs) and also drum-scanned film. The issue is that a single image can often be anything from 150MB up to 500MB, excluding any post-production changes.
    My current aperture library is on a FW800 disk, but disk i/o is crippling the box and program at the moment. I'm looking at the express slot and wondering who here is running a disk off that and what they feel it's like from a performance perspective.
    An example is a recent location shoot which has a handful of images above 500mb each. Aperture takes around 10-15 mins to startup when this folder is selected (constantly processing the image every time it starts) and this leads to a totally unresponsive OS.
    How are you handling large files with Aperture?
      Mac OS X (10.3.9)  

    On the Quad I process 250MB+ TIF scans, not often, but often enough. External drives, 7200rpm, in a Sonnet 500P enclosure attached to an eSATA multiport card (Sonnet E2P). Performance is equal to internal so far as I can judge.
    I recall the PCMCIA then PC Card bus speed was horrendously slow. Not sure what the Expresscard bus speed is, but it would be a crying shame to attach 300Gb/s burst capable drives (or RAID 5 driving 200Mb+ continuous) to a backend bus capable of a few Mb.
    As Alan notes, a MBP may be OK for field work and tethered shooting, but for the image sizes you have, the preferred solution would be a Mac Pro.
    G.

  • How to increase performance speed of Photoshop CS6 v13.0.6 with transformations in LARGE image files (25,000 x 50,000 pixels) on an iMac 3.4 GHz Intel Core i7, 16 GB memory, Mac OS 10.7.5?   Should I purchase a Mac Pro?

    I have hundreds of these large image files to process.  I frequently use SKEW, WARP, and IMAGE ROTATION features in Photoshop.  I process the files ONE AT A TIME and have only Photoshop running on my iMac.  Most of the time I am watching SLOW progress bars complete the transformations.
    I have allocated as much memory as I have (about 14GB) exclusively to Photoshop using the Performance preferences panel.  Does anyone have a trick for speeding up the processing of these files--or is my only solution to buy a faster Mac, like the new Mac Pro?


  • How to get larger images in albums?

    I just realized that when you click on a thumbnail in one of my albums, the larger image is actually smaller than the original photo! All the photos I've put into my web albums have been shrunk to fit into the album display area. For instance, a photo that normally displays about 7 inches high when I view in Preview, comes out about 4.5 inches tall when displayed in my web album.
    How can I get it to display photos at actual size?

    Ray Dunakin wrote:
    I tried turning off "optimize images", but they still were showing up at the reduced size.
    providing a url to your site so people could have a look at what's going on may help...
    What I'd really like is a way to designated specific images to display at a larger size -- most don't need to be very large, but some do.
    I think that's not possible in iWeb photo pages, as you choose one format for all. However, to get larger images in albums, click on a photo and a window will pop up called "Photo Grid". You can increase the size of the displayed images by reducing the spacing and/or decreasing the number of columns.
    Regards,
    Cédric

  • While opening specific Document "Not enough Memory" while rendering the Document with Large Image

    When opening a PDF file with a large image, Adobe Reader and Acrobat crash while rendering, with the message "Out of Memory".
    After some investigation, this error is reproducible on Windows 2008 R2 and 2012 R2 machines running on HP ProLiant BL460c Gen8.
    The servers are updated with the latest firmware from HP.
    The error does not occur on a virtual machine running Windows 2008 R2, so it seems like a hardware-dependent bug.
    When the "Show large images" option is disabled, the error does not show up. But this is neither a solution nor a workaround, as we need to display all information in the PDF.
    Thanks for any Help

    Hello SumitV
    Thanks for your answer.
    Indeed, the output file can be opened without any errors. It's a good workaround, but it's not a solution.
    Why does it work (and much faster) on a virtual machine, but not on a very powerful HP machine?
    I hope Adobe will investigate in this case to solve it completely.
    We have a test machine where we could install some debug tools, are there any to get a deeper analysis of this problem?
    Thanks a lot.

  • Editing very large images

    I have several very large images that require minor editing. The tasks that I require are only basic rotation, cropping, and maybe minor color adjustments.
    The problem is that the images range from 20780 x 15000 px down to 11150 x 2600 px.
    With images this size, when I try to perform any operation on the entire image, like a 3 deg rotation, Fireworks simply times out. Maybe it can't generate the preview image at that size, or maybe it simply can't calculate the image at that size. If it is just the preview image, is there a way to lower the quality of the preview only (say, JPEG quality of 5%) while keeping the exported image at 100%?
    Thank you,
    -Adam
    (Yes, I did search, but the forums search seemed mildly useless and engine results consistently returned nothing)

    Fireworks is not designed to work with images of this size. It's a screen graphics application. Photoshop would be a better option here, or other software designed for working with high resolution files.
    Jim Babbage

  • Working with large images

    I need to work with very large images (3000x3000 pixels) and they take a lot of memory (50MB). The thing is that they don't use a lot of colors, but they are loaded into memory with a 24-bit color model.
    It is a JPEG image and it is decoded to a BufferedImage with the standard JPEG decoder. I want to use BufferedImage because it is very simple to work with; however, I need to find a way to reduce memory consumption.
    Is it possible to somehow use a different color model to reduce the color count to 256, which is enough for me? I am clueless about imaging and simply haven't got the time to get into it. I need some hints: I just want to know whether it is possible (and how) to reduce the size of a BufferedImage in memory by decreasing its number of colors.
    I was thinking about using a GIF encoder, but it works with Images, not BufferedImages, and the conversion takes a lot of time, so it is not a solution.
    Thanks!
    dario

    You can create a byte-indexed BufferedImage. This uses just 1 byte per pixel.
    Use
    public BufferedImage(int width,
    int height,
    int imageType,
    IndexColorModel cm)
    with
    imageType == BufferedImage.TYPE_BYTE_INDEXED
    (TYPE_BYTE_BINARY only covers palettes of 16 colors or fewer)
    and create a 256-entry color palette with
    new IndexColorModel()
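    A minimal runnable sketch of that suggestion (for a 256-entry palette the image type is BufferedImage.TYPE_BYTE_INDEXED; TYPE_BYTE_BINARY only covers palettes of up to 16 entries). The grayscale ramp palette here is a placeholder; a real application would build the 256 entries from the colors the JPEG actually uses:

```java
import java.awt.image.BufferedImage;
import java.awt.image.IndexColorModel;

public class IndexedImageDemo {
    // Build an 8-bit indexed image. The palette is a hypothetical grayscale
    // ramp; derive the 256 entries from your image's real colors in practice.
    static BufferedImage createIndexed(int width, int height) {
        byte[] ramp = new byte[256];
        for (int i = 0; i < 256; i++) ramp[i] = (byte) i;
        IndexColorModel cm = new IndexColorModel(8, 256, ramp, ramp, ramp);
        return new BufferedImage(width, height, BufferedImage.TYPE_BYTE_INDEXED, cm);
    }

    public static void main(String[] args) {
        BufferedImage indexed = createIndexed(3000, 3000);
        // One byte per pixel; to fill it, draw the decoded 24-bit JPEG into it
        // and colors are mapped to the nearest palette entry:
        //   indexed.createGraphics().drawImage(decodedJpeg, 0, 0, null);
        System.out.println(indexed.getRaster().getDataBuffer().getSize());
    }
}
```

    At one byte per pixel, a 3000x3000 image drops from roughly 36 MB (32-bit packed) to about 9 MB.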

  • Need to read/paint large images fast. Help!

    Hello Java community,
    I am having trouble writing a program which can read large image files from disk (~ 1-2 MB) and paint them to the screen while maintaining a frame rate of 15 or 20 fps. Right now I am doing this through a Producer/Consumer scheme. I have one thread which constantly creates ImageIcons (using the constructor which accepts a String file path), and places them on a blocking queue (production) with a maximum capacity of 5 or 10. I have a second thread which removes an image from the queue (consumption) and paints it to the JFrame every so many milliseconds to honor the frame rate.
    My problem is that this approach is not fast enough or smooth enough. With smaller images it works fine, but with larger images it cannot maintain a high frame rate. It also seems to consume more memory than it should. I am assuming that once I paint a new ImageIcon, the old one goes out of scope and is freed from memory by the Garbage Collector. However, even with a queue capacity of 5 or 10, I am getting out-of-memory exceptions. I plan on trying the flush() method of the Image class to see if that helps to recover resources. After searching around a bit, I see that there are many different ways to read/load an image, but giving a local path to an ImageIcon and letting it load the image seems to be a safe way to go, especially because in the documentation for ImageIcon, it says that it pre-loads the image using Media Tracker.
    Any help, ideas, or links would be appreciated!
    Thanks,
    Elliot

    Thanks for another insightful response.
    I've played a bit more, so let me update you. I am preloading images; the blocking queue (FIFO) is full when the animation begins. Right now the queue capacity is 10, but for smaller images it can be closer to 40 or 50. The image reader thread reads images and places them in this queue. Oddly, the problem does not seem to be that the reader thread can't keep up. I added a print statement which displays the size of the queue before each removal, and it remains very close to capacity, at least for my test images, which are only 400 x 400, approximately 100 KB each.
    I've tried animating the images in two distinct ways. My first approach, as I mentioned in the original question, was to animate the images using purely swing components and no manual painting of any kind. This is always nice because it frames the image perfectly and the code is minimal. To accomplish this I simply had the image reader thread pass in the image path to the constructor of ImageIcon and put the resulting ImageIcon in the queue. The animator thread had a simple swing timer which fired an ActionEvent every so-many milliseconds (in the event dispatch thread). In the actionPerformed method I simply wrote a few lines to remove the head of the queue and I used the JLabel setIcon(ImageIcon) to update the display. The code for this was very nice, however the animation was choppy (not flickering).
    In my second approach, which was a bit longer, I created a class called AnimationPanel which extended JPanel and implemented Runnable. I simply overrode paintComponent and inside I painted a member Image which was set in the thread loop. Rather than storing ImageIcons, the queue stored Images. The reader thread used ImageIO.read(...) to generate an Image. I used simple logic in the thread loop to monitor the fps and to fire repaints. This approach suffered from the same problem as the one above. The animation was choppy.
    I found the following things to be true:
    - the reader can keep up with the animator
    - if I run the image reader and perform a basic polygon animation, both of my approaches animate smoothly at high frame rates. In other words, it's not the work (disk reads) being done by the reader that is causing the lag
    - I believe the slowness can be attributed to the following calls: label.setIcon(imageIcon) or g.drawImage(image, 0, 0, this). This leads me to believe that the images are not fully processed as they are being placed on the queue. Perhaps Java is waiting until they are being used to fully read/process the images.
    - I need to find a way to process the images fully before trying to paint them, though I felt that my approaches above were doing that. I considered having the reader thread actually generate a JLabel or some other object that can be displayed without additional processing.
    - your idea about AWT components is a good one and I plan on trying that.
    Elliot
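    One cause consistent with Elliot's last observation is pixel-format conversion at paint time: ImageIO.read typically returns whatever layout the decoder produced (often TYPE_3BYTE_BGR for JPEGs), and each drawImage or setIcon call then converts it on the fly. A hedged sketch (class and method names are mine) of converting each frame once, in the reader thread, before it goes on the queue:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FramePreloader {
    // Copy a freshly decoded frame into TYPE_INT_RGB so later drawImage calls
    // are plain memory blits with no per-paint color conversion.
    static BufferedImage toDisplayFormat(BufferedImage decoded) {
        BufferedImage out = new BufferedImage(
                decoded.getWidth(), decoded.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = out.createGraphics();
        g.drawImage(decoded, 0, 0, null);
        g.dispose();
        return out;
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<BufferedImage> queue = new ArrayBlockingQueue<>(10);
        // In the real program the producer thread would do:
        //   queue.put(toDisplayFormat(ImageIO.read(new File(path))));
        BufferedImage fake = new BufferedImage(400, 400, BufferedImage.TYPE_4BYTE_ABGR);
        queue.put(toDisplayFormat(fake));
        System.out.println(queue.take().getType() == BufferedImage.TYPE_INT_RGB);
    }
}
```

    TYPE_INT_RGB usually matches common display formats; when a screen is available, GraphicsConfiguration.createCompatibleImage is the more precise choice.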

  • Creating a custom DataBuffer for very LARGE images

    I would like to work with very large images (up to 100MB) in an application.
    Only for image manipulating, not rendering for screen.
    My idea is to write my own DataBuffer which uses the hard drive, maybe with some "intelligent" swapping.
    But at first, performance doesn't matter.
    I tried this:
    ColorModel cm = ColorModel.getRGBdefault();
    SampleModel sm = cm.createCompatibleSampleModel(w, h);
    DataBuffer db = new MyDataBufferInt(size);
    WritableRaster raster = Raster.createWritableRaster(sm, db, null);
    But createWritableRaster doesn't like MyDataBufferInt, even though it is of type DataBuffer.TYPE_INT. It throws:
    java.awt.image.RasterFormatException: IntegerComponentRasters must have integer DataBuffers
    at sun.awt.image.IntegerComponentRaster.<init>(IntegerComponentRaster.java:155)
    at sun.awt.image.IntegerInterleavedRaster.<init>(IntegerInterleavedRaster.java:111)
    at sun.awt.image.IntegerInterleavedRaster.<init>(IntegerInterleavedRaster.java:78)
    at java.awt.image.Raster.createWritableRaster(Raster.java:980)
    My class:
    public class MyDataBufferInt extends DataBuffer {
    public MyDataBufferInt(int size) {
            super(TYPE_INT, size);
        . . . }
    The question is: how do I manage such a large image? What kind of subclasses do I have to write?
    Thank you all
    P.S: I don't want to use: java -Xmsn -Xmxn

    I have the same problem. Please let me know if you find an answer.
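    For what it's worth, a sketch of one way past the RasterFormatException (my own reading of the error, not a tested solution to the whole problem): the check comes from sun.awt.image.IntegerComponentRaster, which requires specifically a java.awt.image.DataBufferInt, so the custom class must extend DataBufferInt rather than the abstract DataBuffer:

```java
import java.awt.image.ColorModel;
import java.awt.image.DataBufferInt;
import java.awt.image.Raster;
import java.awt.image.SampleModel;
import java.awt.image.WritableRaster;

// Extending DataBufferInt (not DataBuffer) satisfies the instanceof check
// inside the integer raster classes, so raster creation succeeds.
public class DiskBackedDataBuffer extends DataBufferInt {
    public DiskBackedDataBuffer(int size) {
        super(size);
    }

    public static void main(String[] args) {
        ColorModel cm = ColorModel.getRGBdefault();
        SampleModel sm = cm.createCompatibleSampleModel(4, 4);
        // This no longer throws RasterFormatException:
        WritableRaster raster =
                Raster.createWritableRaster(sm, new DiskBackedDataBuffer(16), null);
        System.out.println(raster.getWidth() + "x" + raster.getHeight());
    }
}
```

    The larger caveat: the built-in rasters grab the backing int[] directly via DataBufferInt.getData(), so overridden getElem/setElem accessors are bypassed for pixel access; genuinely disk-backed processing usually means working on the image as separately loaded tiles instead.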

  • Large .bpel file size vs performance

    How does a large .bpel file size affect performance? Say I have a process of 0.9 MB with around 10,000 lines: how does this affect instance creation, fetching and message creation during the process life cycle?
    Edited by: arababah on Mar 8, 2010 7:23 AM

    Johnk93 wrote:
    MacDLS,
    I recently did a little house-cleaning on my startup drive (only 60Gb) and now have about 20Gb free, so I don't think that is the problem.
    It's probably not a very fast drive in the first place...
    I know that 5MB isn't very big, but for some reason it takes a lot longer to open these scanned files in photoshop (from aperture) than the 5MB files from my camera. Any idea why this is?
    Have a look at the file size of one of those externally edited files for a clue - it won't be 5MB. When Aperture sends a file out for editing, it creates either a PSD or an uncompressed TIFF after applying any image adjustments that you've applied in Aperture, and sends that out. Depending on the settings in Aperture's preferences this will be in either 8-bit or 16-bit.
    As a 16-bit uncompressed TIFF, a 44 megapixel image weighs in at a touch over 150MB...
    Ian

  • IIS Virtual Directory and Application Pool for Large Images folder

    I have a large images folder used for Upload and Download with approx. 1.5 TB for a web site.
    My Website use AppPool_A.
    For performance reasons, do I need to create a separate virtual directory and a separate application pool (AppPool_B) for the Images folder? Assuming the number of users uploading/downloading images increases, memory and CPU utilization will rise and App Pool recycling will happen more often. The other concern is that if the website and images share the same App Pool, application performance will be slow.
    Please suggest.
    AnilKumar Bedide

    Hi,
    Please post on the IIS forum instead of here.
    http://forums.iis.net/
    Best regards,
    Barry

  • Problem rotating a large image

    Application description:
    A JFrame with a control panel at the bottom, which contains a couple of JSlider's which controls the scaling, translation and rotation of a BufferedImage using an AffineTransform.
    JDK used:
    1.5.0_04
    Problem description:
    The application works fine when using a "normal" sized image. But when I use a large image (1600x1200) the application seems to freeze. The GUI isn't drawn, just the infamous grey rectangle. I've narrowed it down to the rotation. If I comment out the AffineTransform.rotate call, the application works fine and the very large image is displayed in all it's glory, with translate and scale still applied.
    Is this due to some limitation of AffineTransform?

    > Scale and translate would be less intensive than rotate.
    Possibly, but my guess is probably not. If you have a world matrix that everything gets multiplied through, rotate, translate and scale should all be pretty fast.
    > To rotate such a large image would be very cpu intensive and probably require a substantial amount of memory. You may want to check your computer performance during the operation to determine if your computer is being bogged down.
    Again, maybe, but IMO very unlikely. If Java uses a world matrix (and I've got no idea how else it could do things like scale or rotate) then there should be no or almost no penalty for transformations. And you certainly wouldn't need any more memory. It would be difficult to test for individual graphics operations, but you could check whether the CPU gets pinned while painting.

  • Resizing a large image(86400 x 43200) into smaller tiles

    Hello,
    I am working with a VERY large image (86400 x 43200). I need to cut it into smaller images (7200x7200). The fixed-size crop tool is painstakingly slow; is there a plugin or automation script that will do this for me?
    My system should be able to handle this:
    Phenom 9600
    XFX 5770
    4gb DDR 1066
    Windows 7-x86
    Photoshop CS4
    Any help would be greatly appreciated!
    -James

    You may be able to change that behavior by adjusting the amount of RAM you allow Photoshop to use in the Edit - Preferences - Performance dialog.  Keep in mind you'll need to shut down and restart Photoshop after having made a change to the setting.
    -Noel
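    Photoshop is scriptable, and the tiling loop such a script needs is plain arithmetic; as a sketch of just that arithmetic (Java here, all names hypothetical), walking the source in fixed steps and clamping the final row and column:

```java
public class Tiler {
    // Compute the {x, y, width, height} bounds of each tile. A 86400x43200
    // source won't fit in a single Java BufferedImage; this only illustrates
    // the loop a crop/save script would run per tile.
    static int[][] tileBounds(int width, int height, int tileSize) {
        int cols = (width + tileSize - 1) / tileSize;
        int rows = (height + tileSize - 1) / tileSize;
        int[][] bounds = new int[cols * rows][];
        int n = 0;
        for (int y = 0; y < height; y += tileSize)
            for (int x = 0; x < width; x += tileSize)
                bounds[n++] = new int[] {
                    x, y, Math.min(tileSize, width - x), Math.min(tileSize, height - y) };
        return bounds;
    }

    public static void main(String[] args) {
        int[][] b = tileBounds(86400, 43200, 7200);
        System.out.println(b.length);
    }
}
```

    86400 x 43200 at 7200 px tiles yields a 12 x 6 grid of 72 tiles, all exactly 7200x7200 since the dimensions divide evenly.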

  • When a pop up window comes up it is - search bookmarks and history window! I cannot log into my bank as login button should open new window to log in but I get the search page. I cannot see larger images as again I get the search bookmarks and history pa

    When a pop-up window comes up, it is the Search Bookmarks and History window! I cannot log into my bank: the login button should open a new window to log in, but I get the search page instead. I cannot see larger images, as again I get the Search Bookmarks and History page, etc. This happens with every option that should open a new page. I am so frustrated; this has been happening since Firefox updated itself 2 days ago to Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 ( .NET CLR 3.5.30729; .NET4.0C). It was fine before that. I am using Windows Vista. Can you please advise what I should do? Also, can you go back to a previous version? Error console examples:
    Warning: Error in parsing value for 'cursor'. Declaration dropped.
    Source File: https://ib.nab.com.au/nabib/styles/menu_nab.css?id=009
    Line: 116
    ib.nab.com.au : server does not support RFC 5746, see CVE-2009-3555
    Warning: Selector expected. Ruleset ignored due to bad selector.
    Source File: https://ib.nab.com.au/nabib/styles/nabstyle.css?id=014
    Line: 837
    This happened every time Firefox opened, starting 2 days ago after the update.

    Do you have that problem when running in the Firefox SafeMode?
    [http://support.mozilla.com/en-US/kb/Safe+Mode]
    ''Don't select anything right now, just use "Continue in SafeMode."''
    If not, see this:
    [http://support.mozilla.com/en-US/kb/troubleshooting+extensions+and+themes]

  • Open Dialog box: disable image preview (For large images, it's SLOOOWWWW)

    Open Dialog box in column view:
    Can I disable image preview?
    For large images, it's SLOOOWWWW !!!!!
    ( I often work with 8000 x 3500 panoramic images )

    William,
    > Mostly this is a problem in Photoshop
    OK, my Photoshop is still a Classic version. I was looking at one or two others, like GraphicConverter, but the only pref there seems to be to generate a preview if one doesn't exist.
    I'll look some more; it's irritating me now.
