GTX 570's CUDA performance in PrP CS6?

Any info on the GTX 570's CUDA performance in Premiere Pro CS6? Is it better or worse than the Quadro 4000?

Way better, and especially bang-for-the-buck wise at less than half the cost. See Benchmark Results and navigate to the MPE Gains charts.

Similar Messages

  • GTX 680 vs GTX Titan MPE Performance difference in cs6

    The GTX Titan is available here. Has anyone tried benchmarking it? How will it perform compared to the GTX 680, the present best card? Reading another thread, I found an MVP assuming it will be 30-40% faster. But considering the 6GB of memory, won't it give even more performance?
    Also, let me know: do we get any more performance by using a 4GB 680 instead of a 2GB 680? Is that performance difference noticeable?

    Not completely true. Bill has tested our new MPEG2-DVD timeline, which is a single AVCHD 1080i/29.97 clip with a duration of 2;39;04 and a number of effects like fast color correction, brightness & contrast, three-way color correction, gaussian blur, gamma correction and speed adjustments.
    Exporting to MPEG2-DVD with the NTSC 23.976 Widescreen High Quality preset and MRQ turned on, Bill tested this on his i7-2600K (OC to 4.4 GHz) with 32 GB of memory and the project on two Samsung 840 Pro SSDs in RAID0, and he found the following results:
    Video card performance.
    Just using two of Bill's observations plus his i7-960X system and my own gives these summary results:
    System                            Hardware render time     Software render time    Gain
    BillG i7-2600K and GTX 680 OC     31 seconds (5.1 x RT)    870 seconds             28.1 x
    BillG i7-2600K and GTX 680        35 seconds (4.5 x RT)    870 seconds             24.9 x
    BillG i7-960X and GTX 680 OC ?    30 seconds (5.3 x RT)    556 seconds             18.5 x
    Harm's Monster and GTX 680 OC     24 seconds (6.6 x RT)    436 seconds             18.2 x
    Here CUDA acceleration is severely tested, since there is scaling, blurring and frame blending going on. Note that the faster a system is, the lower the gain, but also the greater the impact of the faster system on the hardware render time itself.
    PS. Matt said it in his introduction, he tested with the PPBM6 file as it existed at that moment and since then it has been changed drastically. In addition Matt used the Total Time score for his comparison where the RPI score would have been better. The Disk I/O test is completely irrelevant when testing hardware acceleration, H.264-BR is largely irrelevant, it mostly is all about the MPEG2-DVD test and - at that moment at least - the rendering test.
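    For reference, the Gain column in the table above is simply the software-only render time divided by the hardware (MPE/CUDA) render time. A minimal sketch reproducing those figures, using the times quoted above:

```python
# Reproduce the MPE "Gain" figures from the table above.
# Each entry: (hardware render time, software-only render time), in seconds.
results = {
    "BillG i7-2600K + GTX 680 OC": (31, 870),
    "BillG i7-2600K + GTX 680":    (35, 870),
    "BillG i7-960X + GTX 680 OC":  (30, 556),
    "Harm's Monster + GTX 680 OC": (24, 436),
}

for system, (hw, sw) in results.items():
    gain = sw / hw  # software-only time divided by hardware (MPE/CUDA) time
    print(f"{system}: {gain:.1f} x")
```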

  • OpenCL and CUDA. AMD 6870 or gtx 570. (Keeping the future in mind).

    Which GPU would be better for After Effects and Premiere Pro, the 6870 or the GTX 570? With Apple completely jumping onto AMD GPUs, what is the future of OpenCL?
    Should I get an AMD 6870 for $40 or a GTX 570 for $50?

    No doubt in my mind. The nVidia card is generally twice as fast as the equivalent AMD GPU. I have tested both brands.

  • Gtx 680 vs two gtx 570?

    I don't think I have enough information to make an informed decision...
    How helpful can two GPUs be for After Effects? Would you link those GPUs in SLI?
    Anybody running two or more GPUs for After Effects?
    How does the GTX 680 compare to the GTX 570 in terms of After Effects performance?
    Sorry if these questions have been answered a thousand times... I've been having some trouble finding the info...

    You're looking for a simple answer that doesn't exist. In both scenarios the potential speed gains will be at best 10-15% compared to a single GTX 570/580 or whatever, which, given the overall slowness of the raytracing stuff, is marginal for all intents and purposes. A 680 may fare a bit better while you work - it is, after all, newer and more optimized - but that advantage may still evaporate when you crank up the quality for final rendering and enable DOF, motion blur and all those costly features. And as I have said many times: I wouldn't base any purchasing decisions on CS6's 3D. All the comparisons where people brag about rendering 5 seconds of some simple text animation in an hour make me go "So what? I've been doing that for the last 10 years in my 3D programs." So in all fairness, as far as I'm concerned, you are looking to solve a very specific need on the wrong end using the wrong means. You can have much more fun using Video Copilot's Element or a 3D program, and neither will impose such outrageous hardware requirements to work their magic. If you feel you still need that raytrace stuff, then personally I'd settle for a single GTX 570/580 right now. The simple truth is that half a year from now there will be much better Kepler-based cards than a costly GTX 680, and Adobe may even care to support them by then, so you'd regret spending a lot of money for nothing. And did I mention that Element will burn like crazy on a GTX 580, into which you could invest the money saved in the process?
    Mylenium

  • Quadro 2000 v. gtx 570 v. Quadro 4000

    Can anyone comment on the MPE performance difference between the Quadro 2000, the gtx 570 and the Quadro 4000?  I'll be using hacked GH2 footage with CBR intra-frame coding, which (I'm told!) can and should be put on AVC Intra time-lines (not AVCHD). 
    I'm concerned most with time-line responsiveness and playback performance.  DVD encoding, exporting footage, etc., won't be happening very often, so that's of less concern. 
    I'm aware that the gtx 570 is probably the best buy of the three; but that's not the question.  Many thanks.

    In any case, performance-wise the Quadro 2000 is a waste of money: it costs almost $400, yet it performs as slowly as a $100 card. And in Premiere Pro CS5.5, encoding performance becomes significantly slower with lesser GPUs. Look up posts by Bill Gehrke and you may find a list of GPUs along with their performance charts in the PPBM5 benchmarks. Bill tested a wide range of GPUs from a GTX 580 all the way down to an old 9500 GT. Pay particular attention to the MPEG-2 DVD scores. You will find that even on an overclocked i7-2600K system, the system with a GTX 550 Ti took more than twice as long (146 seconds) as the GTX 580 (60 seconds) or even a GTX 560 Ti 448-core (68 seconds) in that test. The Quadro 2000 would have performed even slower than the GTX 550 Ti in that same test (heck, the GTX 550 Ti itself is slightly slower than a first-generation GTX 260 in this test despite having an equal number of CUDA cores, due to the 550 Ti's slightly lower total memory bandwidth). The Quadro 4000 would have performed roughly on a par with Bill's tested GTX 285 (117 seconds) in that same test.
    On the other hand, if you're encoding to H.264, then the Quadro 2000 would have been only slightly slower than the GTX 570; you would have had to downgrade further to Quadro 600 (GeForce GT 430) level to see a significant degradation of H.264 encoding performance.
    Secondly, the Quadro 2000 has only 1GB of RAM total. With your footage, it is possible that any effects that you apply will eat up more than the amount of memory on the card. If a scene needs 1.5GB of VRAM to render using MPE GPU mode, then the 1GB card will run out of RAM. And when the rendering job runs out of VRAM, that entire frame or scene will default entirely to the MPE software-only mode, which will result in slower performance and may also degrade image quality.
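    To put the VRAM concern in concrete terms, here is a hypothetical back-of-the-envelope estimate. MPE processes frames as 32-bit float RGBA internally; the buffer count below is purely my assumption for illustration, not a measured figure:

```python
# Hypothetical VRAM estimate for GPU-accelerated rendering (illustration only).
# MPE works internally in 32-bit float RGBA: 4 channels x 4 bytes per channel.
def frame_bytes(width, height, channels=4, bytes_per_channel=4):
    """Size in bytes of one uncompressed float frame."""
    return width * height * channels * bytes_per_channel

frame = frame_bytes(1920, 1080)   # one 1080p float frame
buffers = 10                      # assumed: source frames, effect intermediates, scratch
needed = frame * buffers

print(f"One 1080p float frame: {frame / 2**20:.0f} MiB")
print(f"{buffers} buffers in flight: {needed / 2**20:.0f} MiB")
# On a 1GB card (minus whatever the OS and driver reserve), a stack of
# effects can exhaust VRAM and push the frame back to software-only MPE.
```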
    And I strongly recommend avoiding the purchase of off-the-shelf PCs or workstations to begin with: Those systems are way too expensive for such bottom-of-the-barrel performance, and upgrading such a system via the manufacturer would have cost you three to four times more than if you bought those same parts elsewhere. If you can't build an editing workstation yourself (or find it too much of a bother), consider contacting a vendor who specializes in custom-configured editing systems such as ADK.

  • Client Monitor Dropping Frames with a GTX 570?

    I have a two-monitor setup, both connected to my GTX 570. I edit on one monitor and play HD video full screen on the client monitor. The video was shot on my Panasonic GH2. Premiere Pro CS5.5 would play this video silky smooth, but CS6 drops frames so badly that I wouldn't be surprised if it was playing 12 frames per second.
    When I turn off CUDA acceleration the video plays a lot smoother.
    Is anybody else having this problem?
    P.S. I have the latest Premiere Pro update of 6.0.1 and I am using the latest Nvidia drivers. I even installed the older drivers that I was using with CS5.5 and the video is still dropping frames like crazy.

    This is a known problem with AVCHD and maybe other codecs. The new 6.0.1 helps but does not completely eliminate the playback problem, which I see especially after scrubbing. Maybe your video is more compressed than mine is. See also this thread.

  • Dilemma?? GTX 570, 580 or Quadro 4000

    Hey guys, apologies if I'm in the wrong place for this; it's my first ever post online! I have a slight dilemma. I have just recently ordered a new setup, however I'm confused as to which graphics card to go for. I will mainly use the system for HD video editing (Sony HVR-Z7) using Premiere Pro CS5 and After Effects. From what I have read, the GTX range is more than capable of accelerating certain effects in Premiere Pro, but will the Quadro be better?
    My other main use of the PC is that I would like to hook it up to my Sim2 Lumis Host projector via HDMI or DVI. Since the Quadro has 10-bit video, would this in any way reproduce a much better image quality than the GTX range? Or is this limited to the DisplayPort only? Is the 10-bit video sent through all ports, even HDMI/DVI? I know that my projector has 10-bit video processing. I would really appreciate some guidance on this, as I'm wanting to place an order for the card asap.
    (Money is not an issue with regards to those cards)
    Many Thanks guys.

    BTW, in case you need a reference, this is from Nvidia.com:
    http://www.nvidia.com/docs/IO/102043/GTX-570-Web-Datasheet-Final.pdf
    Page 3
    Advanced Display Functionality
    • Two pipelines for dual independent display
    • Two dual-link DVI outputs for digital flat panel display resolutions up to 2560×1600
    • Dual integrated 400 MHz RAMDACs for analog display resolutions up to and including 2048×1536 at 85 Hz
    • HDMI 1.4a support including GPU accelerated Blu-ray 3D support, x.v.Color, HDMI Deep Color, and 7.1 digital surround sound. See www.nvidia.com/3dtv for more details.
    • Displayport 1.1a support
    • HDCP support up to 2560×1600 resolution on all digital outputs
    • 10-bit internal display processing, including support for 10-bit scanout
    • Underscan/overscan compensation and hardware scaling
    In case you need a reference for what Deep Color is in the HDMI standard, BTW:
    http://www.hdmi.org/learningcenter/faq.aspx
    HDMI 1.3:
    Higher speed: HDMI 1.3 increases its single-link bandwidth to 340 MHz (10.2 Gbps) to support the demands of future HD display devices, such as higher resolutions, Deep Color and high frame rates. In addition, built into the HDMI 1.3 specification is the technical foundation that will let future versions of HDMI reach significantly higher speeds.
    Deep Color: HDMI 1.3 supports 10-bit, 12-bit and 16-bit (RGB or YCbCr) color depths, up from the 8-bit depths in previous versions of the HDMI specification, for stunning rendering of over one billion colors in unprecedented detail.
    Broader color space: HDMI 1.3 adds support for “x.v.Color™” (which is the consumer name describing the IEC 61966-2-4 xvYCC color standard), which removes current color space limitations and enables the display of any color viewable by the human eye.
    New mini connector: With small portable devices such as HD camcorders and still cameras demanding seamless connectivity to HDTVs, HDMI 1.3 offers a new, smaller form factor connector option.
    Lip Sync: Because consumer electronics devices are using increasingly complex digital signal processing to enhance the clarity and detail of the content, synchronization of video and audio in user devices has become a greater challenge and could potentially require complex end-user adjustments. HDMI 1.3 incorporates automatic audio synching capabilities that allows devices to perform this synchronization automatically with total accuracy.
    New HD lossless audio formats: In addition to HDMI’s current ability to support high-bandwidth uncompressed digital audio and all currently-available compressed formats (such as Dolby® Digital and DTS®), HDMI 1.3 adds additional support for new lossless compressed digital audio formats Dolby TrueHD and DTS-HD Master Audio™.
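    The "over one billion colors" figure quoted above follows directly from the arithmetic: 10 bits per channel across three RGB channels gives 2^30 distinct values. A quick sketch:

```python
# Colors representable at various per-channel bit depths (3 RGB channels).
for bits in (8, 10, 12, 16):
    colors = (2 ** bits) ** 3  # values per channel, cubed over R, G, B
    print(f"{bits}-bit per channel: {colors:,} colors")
# 8-bit gives ~16.8 million colors; Deep Color's 10-bit already exceeds
# one billion, matching the HDMI 1.3 claim above.
```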
    Eric
    ADK

  • Workstation with Quadro 2000 or GTX 570 HD 2,5GB, for PP CS 5.5?

    Hey there,
    I'm going to build up a new workstation for video-editing using the Production Premium Suite CS 5.5.
    But there is still one big question and I can't find a proper answer.
    What GPU should I take, or which one will be faster? A Quadro 2000 or a GTX 570 HD with 2.5GB?
    I know the Quadro has 192 cores and the GTX has 480 cores, so the GTX should be faster. But would it really be? I can't find any benchmark comparisons or anything.
    Some say a Quadro 2000 is better if it's only a workstation, but I also read that people prefer the GTX models.
    I know the GTX needs more energy and gets warmer when used, but those two facts wouldn't persuade me to buy the Quadro.
    The rest of my system would look like this:
    Intel Core i7-2600
    ASRock Z68 Extreme 3 Gen. 3
    G.Skill RipJaws-X DIMM Kit 16 GB
    Crucial m4 128GB for OS and programs
    Western Digital AV-GP for the media archive and the original video files
    WD Caviar Green for Export and stuff like that
    Fractal Design Arc
    Scythe Katana 3
    Super Flower Golden Green Pro 650 W
    So the only missing thing is the GPU.
    Thanks in advance for your help!

    For the most part, I second Harm. You see, the AV-GP is not designed for PCs at all; rather, it's a version of the WD Caviar Green built specifically for set-top DVRs/PVRs. And in either case, the current WD Greens spin far slower than 7200 RPM - in fact, most current WD Green drives spin at only about 5400 RPM (with a few spinning as slow as 4200 RPM). The slower rotational speed negatively affects both sequential transfer performance and random seek performance.
    As for the non-K 2600, it is limited unlocked, not completely locked. There are two disadvantages to this limited unlock: Only the maximum single-core Turbo Boost multiplier is manually selectable, with the differing multi-core Turbo multipliers also increasing by the exact same number of steps as the single-core Turbo frequency (unlike on the K chips, the multi-core Turbo multipliers on the non-K chips cannot be set independently of the single-core Turbo multiplier). Second, the maximum Turbo multiplier setting is limited to four steps above the normal single-core Turbo multiplier: In the case of the 2600, the maximum single-core Turbo multiplier can be set at up to 42x (this will force the maximum quad-core multiplier to be boosted to 39x, which will result in a maximum quad-core overclock to 3.9GHz with the BCLK remaining at its stock 100MHz). The 2600K is so much easier to overclock the way the user wants it while costing only a few dollars more than the non-K 2600.
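    To make the multiplier arithmetic concrete, here is a small sketch of the limited-unlock behavior described above. The stock 38x single-core and 35x quad-core Turbo multipliers are taken from Intel's published Turbo bins for the i7-2600 and are my assumption as to the baseline:

```python
# Sketch of the non-K i7-2600 limited-unlock arithmetic described above.
BCLK = 100                # MHz, stock base clock on Sandy Bridge
stock_1core_turbo = 38    # assumed stock single-core Turbo multiplier (i7-2600)
stock_4core_turbo = 35    # assumed stock quad-core Turbo multiplier
max_offset = 4            # non-K parts allow at most +4 bins over stock Turbo

max_1core_mult = stock_1core_turbo + max_offset  # 42x, as stated above
# The quad-core Turbo rises by the same number of bins as the single-core one:
max_4core_mult = stock_4core_turbo + max_offset  # 39x, as stated above

print(f"Max single-core: {max_1core_mult * BCLK / 1000:.1f} GHz")
print(f"Max quad-core:   {max_4core_mult * BCLK / 1000:.1f} GHz")
```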
    As for the original decision between the Quadro 2000 and the GTX 570, definitely the latter: The Quadro 2000, as far as CS5.5 is concerned, is little more than a slightly underclocked GeForce GTS 450 with a huge heatsink attached to it and still only 1GB of VRAM. And as Bill's testing with the various GeForce GPUs (to be specific, Bill tested the GTX 580, GTX 480, GTX 560 Ti 448, GTX 285, GTX 260, GTX 550 Ti and the 9500 GT, from fastest to slowest - however, the GTX 560 Ti 448 is roughly equal to the GTX 480 in performance) in CS5.5 has demonstrated, the Quadro 2000 would definitely be slower than a GTX 550 Ti, especially in MPEG-2 DVD encodes.

  • GTX 570 and Premiere CS 5.5: Mercury Playback Engine doesn't seem to work

    Hi everyone,
    I'm getting choppy playback and a red render bar in Premiere CS5.5 whenever I apply one of these effects to a clip: any effect in the Distort folder, Noise Alpha, or Magic Bullet Looks 2.
    The sequence is an AVCHD 1080/24p Premiere preset and the media files are AVCHD from a GH2 (1080/24p at 24Mbps).
    My system is more or less like this:
    Motherboard: Gigabyte Z68X-UD4-B3
    Processor: Core i7 2600k
    Ram: 16Gb DDR3 1600 Kingston HyperX
    Graphics Card: Gigabyte GTX 570 1280 Mb (it comes factory overclocked)
    (Only one 1920*1080 monitor is attached to the graphics card right now)
    HDDs: 3 Western Digital Caviar Black 1 TB, laid out as follows:
            c: OS, Program Files, pagefile
            g: Project and media files
            h: previews
    And now, the symptoms: in an AVCHD 1080/24p sequence with 24Mbps AVCHD clips, all in the same video track, whenever I apply an effect from the Distort folder, Magic Bullet Looks or Noise Alpha (and probably any hardware-accelerated effect), the render bar turns from yellow to red and I get choppy playback unless I set the playback resolution to 1/2. If I change the Mercury Playback Engine setting from hardware to software, nothing changes. So I'm thinking that the GTX 570 is not working with Premiere CS5.5 for some reason. In this thread (http://forums.adobe.com/message/3736541) some people had similar complaints. Also, rendering feels pretty slow. For example, a 30-second clip with a Magic Bullet Looks effect applied takes 150 seconds to turn from red to green.
    I've tried completely uninstalling the Nvidia drivers and installing the latest ones, to no effect. I'm kind of frustrated; I chose this system and CS5.5 thinking it was going to be a smooth experience and I'm a bit disappointed right now. Hopefully there will be a fix for this. Or maybe I'm doing something wrong myself.
    Thanks a lot in advance and sorry to bother you with this.
    Cheers.
    Diego

    Thanks a lot Harm and Jim,
    I continue to be amazed by the willingness of people around the web to help each other. Maybe there's still some hope for the human race after all.
    Regarding my MPE expectations, I suppose it's a sign of the times: spoiled kids like me expecting the world from a new system and then getting disappointed at the first symptoms of something not going as smoothly as they thought at first. But then again, when storytelling costs you more than it pays at the end of the month, sometimes it's difficult not to go into drama-queen mode every now and then.
    Sorry for going a little bit off-topic here, but is there a way to make completely sure that the graphics card is actually doing its job inside Premiere? This thread: http://forums.adobe.com/message/3736541 got me a little nervous, thinking that it may be a bug or something in CS5.5 that prevents the GTX 570 from working properly.
    Also, is there a way I can improve general performance and rendering times without investing a lot more in this new machine? I thought about buying a couple more disks and running a RAID through the motherboard, but maybe the impact on processing power is not worth it. Sorry again for asking so many questions, but video editing is almost completely new to me. Not so long ago writing was enough for me, but that's no longer the case. Now I'm using every storytelling medium that I can, so my mind is always filled with question marks dancing to Tchaikovsky ballet suites and who knows what else.
    Thanks a lot again. I hope I can repay the community in a not so distant future.
    Regards.
    Diego
    Atticus Diablowsky
    AKA D.B. Alonso
    AKA Diego Benito Alonso
    AKA "Hey you, little punk, where do you think you're going with that?" - atticusdiablowsky.com

  • Gtx 570 question

    Hello, I just purchased a GTX 570 to run with my 3930K / Asus X79 Pro board. However, I didn't realize I could spend $30 more to get double the onboard memory. Is it worth the hassle to exchange the card? Is it worth spending $100 more for a 580?
    I'm using mostly PP and AE, editing 5D stuff.
    thanks in advance

    It depends on your '5D stuff'. If you have the habit of downrezzing your 5D material to around 1920 x 1080, there is no need to get the 2.5 GB version, but if you always use the 5D material at its original high resolution, it may make sense to exchange the card for the 2.5 GB version. You will not likely notice any performance difference between the 570 and the 580.

  • GTX 570 plus Gaussian Blur etc Okay?

    I'm about to upgrade my GTX 285 to either the GTX 570 or GTX 470.
    I realize I will have to apply the "hack" to make either of those work properly. Before I pull the trigger I want to make sure I won't run into any unexpected problems. I'm doing mostly AVCHD files and use a lot of gaussian blur and masking.
    I can't remember where I read it, but there was something about a problem with gaussian blur and one of the NVIDIA drivers.
    If anyone here can share their experiences running a GTX 570 or 470 with CS5, that would be awesome.
    Side note: there's a fairly "cheap" 570 from EVGA on Amazon, at $349. Seems to be legit and have all the specs as expected, but I'm wondering why this one's so much "cheaper" than comparable 570s from other manufacturers. Thoughts?
    Thanks!

    >I'm about to upgrade my GTX 285 to either the GTX 570 or GTX 470.
    >I realize I will have to apply the "hack" to make either of those work properly.
    The GTX 470 is on the list of cards that provide the CUDA acceleration features.
    The official and up-to-date list of the cards that provide the CUDA processing features is here:
    http://www.adobe.com/products/premiere/systemreqs/
    Some of the cards on that list are only enabled if you have the recent updates.

  • Gtx 570 fan

    Hello everyone. I would really appreciate some guidance on how to find a replacement fan for a GTX 570 Twin Frozr II in Europe. All I stumble upon are aftermarket coolers and such; it could be that I'm not searching right.

    Thank you for the quick reply.
    Doesn't sound very "supportive" on behalf of MSI; I don't really want to spend an extra 30-40 on a 70-euro card. By no means does one propeller less cause any performance or noise issue; I might lower the price to get it sold like that. I'll give that Chinese option a try. Thanks again!

  • GTX 570 Twin Frozr III POWER Edition supply issues

    Hi there,
    I've always been a fan of MSI products, so when the time came for me to start looking for an upgrade to my current graphics card, I started looking at what MSI had in their range. After some research, the GTX 570 TFIII seems a good option for me, as it gives good performance at a reasonable price.
    Unfortunately MSI seems to have fallen off the map in South Africa. I cannot find any retailer or wholesaler to even give me an ETA for 570 or 580 chipset cards in the country, and I've tried all of the suppliers listed on your site and a few that aren't listed. Is there some worldwide shortage of these models? Did the ship sink? Has MSI forgotten where South Africa is?
    Any help with the procurement of these cards would be greatly appreciated. I really don't want to start importing from the U.S./Europe myself, and I'd like to get an MSI card if possible, rather than go for a different make.
    regards,
    Al

    Welcome to the forum Alandrix.
    None of us work for MSI, and as such we cannot speak to the reasons for the supply shortage.
    We are at the mercy of the importers. MSI has no official representation in South Africa, and all their products are imported by distributors who have standing contracts with MSI.
    Only these importers can supply you with information as to what is on order by them and when these shipments are expected.
    As for bringing this to MSI's attention, I suggest you open a ticket and describe the situation to them. That is the only way MSI will become aware of any seeming neglect with regard to product availability by their official importers/distributors. >>How to contact MSI.<<
    It's unfortunately the way it is, and it will remain that way until MSI decides to invest in some official representation on the African continent.
    Check your PM. I posted a local contact that may be able to assist in your query.

  • GTX 570

    Hello everyone.
    Yesterday I noticed that my MSI GTX 570 (stock) started running at 65-67 C at idle, and it started to underperform, with dropped fps in games. I have had it since 06/2012. I checked whether there was any dust in the heat sinks, but everything is clean. The room temperature is about 25-27 C.
    Any suggestions what it can be?
    Regards

    One monitor. Usually I play only World of Tanks, and on full settings I get about 72 C during gameplay and about 30-35 fps; in the hangar with all tanks it's usually 57 C. But now it's the same 72 C both in-game and in the hangar, and the fps drops below 30 and sometimes to 15. That never happened before a couple of days ago, and they have not released any patches since the beginning of August. And I just got Sleeping Dogs: on full settings I get 40 FPS and 80 C in-game, but if I don't reinstall the drivers like I did before, the FPS drops to 15-20. So after the drivers are reinstalled everything is back to normal, but after a fresh PC start the same thing happens: the idle temperature climbs to 65 C instead of the 38 it is now, and game performance degrades.

  • MSI GTX 570 TF III Power ed o/c Fuzzy Horizontal and vertical lines.

    Hello.
    Recently built, tested and stable PC with GPU issue. (System specs in Signature)
    I have installed an MSI GTX 570 Power Edition O/C in my rig. I have a clean install of Win 7 and carried out a clean install of the latest (non-beta) Nvidia drivers downloaded from the Nvidia site.
    Here is my problem:
    When running graphically demanding software I get fuzzy lines either horizontally or vertically across the screen (but not both at once).
    They only appear when the GPU is under load.
    When running the Heaven benchmark the lines are soft and horizontal, consistently in the middle of the screen.
    When running the Kombustor benchmark I get harder lines as the GPU load increases, and they are vertical.
    Interestingly, whilst playing BF3 on Ultra no lines are present (but the GPU load is not as heavy).
    I have not touched Afterburner settings.
    Here is what I have tried already.
        Removed the card and cleaned the contacts (card connection to the PCI slot)
        Cleaned the (although brand new) mobo PCI-E slot
        Checked it was seated correctly upon reinstall
        Cleaned the power connectors and the card's power connection
        Cleaned the modular power connection on the PSU
        Changed the PCI-E power cables' connection on the PSU (cable-management model)
        Changed the PCI-E power socket on the PSU.
    I have looked into what BIOS I have, but have not found any information regarding any other BIOS releases by MSI.
    My GPU BIOS is: 70.20.27.00.01
    Thanks for reading; all suggestions are welcome.

    No problems at all.
    However, when running some games, and in particular Kombustor's default "KMARK", I get those lines.
    In the Heaven benchmark I get them horizontally, but only one or two in the middle of the screen and very soft.
    I am sending the card back.
    No overclock was performed by me and no Afterburner settings were adjusted.
    It should work out of the box.
    Thanks for your help, and I will be sure to report back once my new card is installed.
    Please look here for an image of what I have described. http://forums.tweaktown.com/gigabyte/47221-ga-z68xp-ud4-rev1-0-install-bios-test-procedures-questions-3.html#post413137
