Does a high-end graphics card really make a difference, or is the CPU more important?

NVIDIA has the Quadro CX  (very expensive)
http://www.nvidia.com/object/builtforadobepros.html
They mainly talk about H.264 encoding.
Does it help with timeline editing?
I'm shooting in HDV, but the final project is SD MPEG-2 for DVD.
Or does a faster CPU with multiple cores work better?
Is it better to get a faster quad-core Intel Core i7 processor?
Intel® Core i7-940 Processor (2.93GHz) (Quad-Core)
Or a slower processor speed, but dual CPUs?
WS DUAL XEON E5504 2.0GHz, 8MB cache, 800 MHz, 4.8 GT/s QPI (Quad-Core)
WS DUAL XEON E5520 2.26GHz, 8MB cache, 1066 MHz, 5.86 GT/s QPI (Quad-Core)  [+$385 ]
What role does the graphics card really play?
I have read in some places that as long as you have a good card, you don't have to waste money on the top-end ones.
Any thoughts would be helpful.
Thanks, Glenn

IMO a Quadro CX is a waste of money, unless your editing life consists only of encoding to H.264, and even then it may be a waste of money.
You may be better off with more capable CPUs and a better disk setup.
Have a look at all three topics linked to here:
How to get the best from a PC? Some guides...
Do read them carefully, including the responses, since there is some info on the performance of dual Xeons versus i7 that may help you decide.

Similar Messages

  • Xserve and high end graphic cards?!

    Has anybody tried to put a new graphics card (like an ATI X850 XT or similar, with 256 MB of memory) into an Xserve?
    Could it work? I want to use the Xserve as a desktop machine with GBs of RAM for Photoshop, so I also need a high-end graphics card.
    Any suggestions?

    No, you can't do this. The Xserve will only take PCI video cards; high-end graphics cards all use the very high speed AGP interface. The best graphics card you can put in it is probably going to be an ATI Radeon 7000 or thereabouts.
    The Xserve is not designed as a desktop machine. My suggestion is that you buy a Power Mac G5 instead.

  • Auxiliary power supply for high-end graphics cards

    Dear All,
    I have an ATI FirePro card that needs 450 W, and of course the Mac Pro motherboard is not suited for the purpose. However, there are a few auxiliary power supplies advertised to supply extra power. Has anyone got any experience with using them?
    Is it safe to expand the power range of your motherboard?
    I'd really love to make use of this card (the FirePro V8750) as apparently it is the fastest for 3D graphics applications.
    Regards
    el

    I remember someone on MacRumors installing a power booster in the lower optical bay in order to power dual 4870s. I was trying to find the thread, but couldn't.
    I don't see a problem with using that to supply aux power, as long as it fits.
    Another option, though I can't say I recommend it: you can get aux power from the lower optical bay Molex connector. Half of the people say it's fine to do, while the other half say not to, and I don't know the long-term effects (if any).
    I do know that people were using this option to run 4870s in CrossFire; not sure how long it lasted, though.
    I'm assuming you want to use this card under Boot Camp, correct?

  • Problems With 4K at 60fps & a High-End Graphics Card

    I'm having major problems editing footage shot at 3840x2160 @ 59.94. The video will play back fine initially, but after a few seconds, the video starts chugging along until I get less than 1fps. I have the same problem using adaptive resolution during playback set to 1/8.
    So, I would think that I might need to upgrade my hardware, but what is the bottleneck?
    Premiere Pro 2014.2 & AME 2014.2
    Windows 8.1
    Intel Core i5-4670 CPU @ 3.4 GHz
    16GB of RAM
    NVIDIA GeForce GTX 780 Ti (that's 2880 CUDA cores!)
    The footage was shot on a Panasonic HC-X1000 with MP4 selected as the output format.
    I have a 1 minute test file from that camera on my SSD. It's 59.94fps and shot using AVC at about 155 Mbps.
    Since Windows 8 doesn't come with any way to monitor GPU usage, my sysadmin installed Geeks3D GPU Shark. It seems to work, as it accurately detected GPU usage in external benchmarking tools, maxing out at 99%.
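    For a scriptable alternative, the NVIDIA driver also ships a command-line tool, nvidia-smi, that can report utilization counters (on some GeForce cards and driver versions these read "N/A"). A minimal polling sketch, assuming the tool's usual Windows install path:

        # Minimal sketch: poll GPU and memory-controller utilization once per
        # second via nvidia-smi, which ships with the NVIDIA driver. The path
        # below is the usual Windows install location -- adjust if different.
        import subprocess
        import time

        NVSMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

        while True:
            out = subprocess.check_output(
                [NVSMI, "--query-gpu=utilization.gpu,utilization.memory",
                 "--format=csv,noheader"]).decode()
            print(out.strip())  # e.g. "29 %, 12 %"
            time.sleep(1)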
    I moved Premiere's cache folder to my SSD to see if that had any effect. Mercury Playback Engine is definitely ON.
    When I try to play back the aforementioned 1-minute clip in Premiere, it goes through three stages:
    1. very few dropped frames, GPU usage maxing out around 29%, CPU at virtually 100%
    2. plenty of dropped frames, GPU percentage decreasing, CPU at virtually 100%
    3. around 1 fps, GPU at virtually 0%, CPU at virtually 100%
    I didn't list the RAM, but it's nowhere near 100%, and goes up about a gig or so during playback, and it's not near the Memory ceiling of 10.9 GB that After Effects, PPro, AME, PS, and others share.
    Why does the GPU stop working when the number of frames dropped significantly increases? And why is it only using less than 30% of my GPU during playback while editing?
    With Mercury Playback Engine off, I noticed only a minor decrease in performance. The GPU was still used, but much less: maybe 10% or so.
    Also, why on earth does my encode with Mercury Playback Engine GPU Acceleration (CUDA) switched on encode with 100% CPU and 0% GPU? I'm trying to encode H.264 with the Match Source - High Bitrate preset. AME + Mercury Playback Engine = 0% GPU usage?! There are no effects applied: no panning, scaling, video effects, or audio effects. I literally drag the clip into a new timeline (that matches the clip settings) and encode using AME. What the hell is going on here?
    Can someone explain how the Mercury Playback engine is supposed to work, and why it doesn't seem to be leveraging my powerful GPU?
    Any recommendations for alternate GPU profiling tools, perhaps?
    I can ask my sysadmin to upgrade the CPU, but I'm not sure how much to go up.
    Note: I also tried bringing the footage into a sequence at 29.97fps, but it didn't help very much. I was hoping that reducing the playback framerate would help.

    There are no effects applied. No panning, scaling, video effects,
    That's why there's no GPU during export.  Encoding is a CPU-only process.  The things you're not using are some of the very things processed on the GPU.  Add some and you may see different results.
    I would expect playback to be better with your hardware.  If you care to upload a clip for me to download, I can test it here.

  • Multiple High End graphics cards

    Does anyone know if the Mac Pro is able to use multiple Quadro FX 4500's or possibly a Quadro FX 4500 X2. And if this is not possible, would a Quadro FX 4500 in addition to one of the consumer cards work? I require at least 4 graphics outputs (2 to drive a walleye stereo wall, and 2 to use with desktop monitors) and would prefer to use the Quadros to the 7300's at least for the stereo wall and if possible I would like to get the same level of performance on the desktop monitors.
    Thanks
    macbook pro   Mac OS X (10.4.6)  

    Based on careful reading of the recently released development notes
    <http://developer.apple.com/documentation/Hardware/Conceptual/HWTechVideo/Articles/Video_implementation.html#//appleref/doc/uid/TP40003994> and
    <http://developer.apple.com/documentation/Hardware/Conceptual/HWtechPCI/Articles/pci_implementation.html#//appleref/doc/uid/TP40003937>,
    it may be possible to install two ATI Radeon X1900 XT cards, one in slot 1, and one in slot 2 (also covering slot 3). The PCI bus would be set to x8 x8 x1 x8.
    The configuration would meet all the cooling and power restrictions. There are auxiliary power connectors for both slots 1 and 2.
    This would give you four displays (all could be up to 30") with independent resolution and rotation.
    It is not a standard configuration, so you would order the Mac Pro with one ATI Radeon X1900 XT and order the second as an add-on
    <http://store.apple.com/1-800-MY-APPLE/WebObjects/AppleStore.woa/wo/6.RSLID?mco=220D662A&nplm=MA631Z%2FA>. Note where it says:
    "System Requirements:
    Mac Pro with available double-wide PCI Express slot or two standard slots". They wouldn't say that if it was only allowed in slot 1 (the only double-wide slot).
    There would still be one empty slot, x8, with 36 watts available.
    You might also be able to use two Nvidia Quadro FX 4500 boards if you could find a second board. That would leave 80 watts for the last slot. I don't think you get display rotation with Nvidia boards.

  • Does upgrading your RAM to 2GB really make a difference?

    Okay, so the obvious answer to the question is yes.
    But when I mentioned I was thinking of doing this upgrade to a close friend (who did an ICT A-level), he seemed a little skeptical and said that there are many other factors involved.
    I'm currently on 768 MB of RAM, and I run Logic pretty much all the time. But recently, as my sounds have gotten a little more complex, the system performance bar has been reaching the top (it goes yellow rather than orange).
    So, to put my question more accurately:
    Will I see a difference with 2 GB of RAM within Logic?
    And what else can I do (hardware-wise) to keep that system performance bar down?

    Increasing RAM will always be of benefit.
    Another issue may be how much disk space you have left on your hard drive.
    That version of the eMac shipped with either a 40 or 80 GB drive. If there is 15% or less left on the drive, this may be an additional reason why Logic is working harder.
    Also, if you haven't done any disk maintenance in a while, like using Tech Tool Pro/DiskWarrior to clean up disk directories and defragment your hard disk, that would tend to make Logic work harder, because it has to constantly, in real time, find all of the audio data bits scattered all over your hard drive. Defragmenting a hard drive puts data that should be contiguous (located together on the drive as one continuous data stream) back together.
    You need lots of free disk space when working with large audio/video files.
    Purchasing a larger external FireWire drive would be a benefit.
    I use Logic Express 8 on a 1.25 GHz G4 MDD with 2 GB of RAM.
    I do all of my recording to a external FW drive with plenty of free space.
    It works fairly well on a 6+ year old PowerPC Mac. I haven't had too many issues or dropouts.
    This version of Logic on my Mac is not completely trouble-free, but it performs well most of the time, even when I've done lots of audio tracks and ended up with somewhat large audio files.
    My issues are always with adding effects after recording audio, as these are very processor-intensive and need not only a really fast CPU, but lots o' RAM, too!
    I end up using many of the effects very sparingly and/or only on recorded tracks I feel really need them. I can't use too many effects, or use them on too many tracks, as this has given playback some hiccups and issues.
    It's an issue I can live with until I can, finally, afford a speedier Mac. That's not in the near future for me, though.
    I absolutely love Logic! Great Apple audio program!

  • Creative Cloud background service constantly uses high-performance graphics card on MacBook Pro

    On my MacBook Pro mid-2010, I recently noticed that the high-performance graphics card for some reason was constantly on. In MacBooks as of 2009, there are two graphics cards that can be used, one integrated, one PCIe. The integrated one is less powerful, but is used to preserve battery. The other one is enabled when there is a high demand for graphics power.
    I saw that when the Creative Cloud service (with this I mean the general software package that tracks for updates and syncs fonts and files) was loaded into the system, the MacBook automatically starts using this high-performance graphics card, even without any other software running, just freshly from boot. This, of course, drains battery and I don't think that much power is needed to run a background service. The only way to stop the high-performance graphics card from kicking in, is by terminating the complete Creative Cloud service, which is of course not very useful when you use the sync feature a lot.
    Is there a solution for this problem, or can you release an update of the software that addresses this issue?
    If I need to provide you with logs or other relevant information, please let me know.
    Thanks in advance, it's kind of annoying as it is now.
    Ruben Delil

    Same here. Macbook Retina 15", purchased earlier this year.
    A nice (and free) utility to monitor GPU usage is gfxCardStatus, from http://gfx.io/
    It helped me notice that it's better to quit Photoshop when I'm not using it, because it keeps the NVIDIA GPU active even when Photoshop is not the active application (e.g. running in the background).
    But still, sometimes after closing Photoshop, gfxCardStatus says that the NVIDIA GPU is in use for the process 'Creative Cloud'.
    I'm mostly running on battery, and the NVIDIA GPU consumes much, much more battery than the integrated Intel HD Graphics 4000, so this is a big issue for me.
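    If you want something scriptable instead of a menu-bar app, one rough heuristic is to parse the stock system_profiler output; the GPU whose entry lists the connected displays is normally the one currently driving them (this is a plain-text heuristic, not a supported API):

        # Rough heuristic: list the GPUs macOS reports; the chipset whose
        # entry is followed by display resolutions is the one in use.
        import subprocess

        out = subprocess.check_output(
            ["system_profiler", "SPDisplaysDataType"]).decode()
        for line in out.splitlines():
            stripped = line.strip()
            if stripped.startswith(("Chipset Model:", "Resolution:")):
                print(stripped)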

  • Does the NVIDIA GeForce GTX 780M with 4GB really make a difference

    I just bought a 27-inch late-2012 iMac: 3.4 GHz i7, 32 GB RAM, NVIDIA GTX 680MX with 2 GB. I'm curious whether the 780M with 4 GB really makes a difference. I do not play any games. I do have two 27-inch Thunderbolt displays. I do a lot of video editing and photography. I have all the Adobe products and Apple software, but I cannot find anywhere that says the GTX 780M with 4 GB will be any better. All I can find is that if you play games, then 4 GB is good.
    Any advice would be great!
    Thank you in advance,

    Certain applications such as Photoshop or Final Cut Pro will benefit from the higher video RAM; however, the GTX 680M is a very powerful card, and the 2 GB of RAM it comes with should be more than sufficient for most tasks.
    Only extremely large files/projects will truly benefit from the 4 GB of RAM that the GTX 780M supplies.
    The 2013 iMac offers a performance boost over the 2012 version, however the increases in terms of CPU/GPU power are mostly modest.
    The Haswell architecture was designed with mobile use as the primary focus and so areas such as energy usage were the priority.
    The NVIDIA 7XX graphics cards run on the same Kepler architecture as the 6XX cards; as with the Intel CPU, the improvements are there, but there is no huge leap forward.

  • When I open Thunderbird my graphics card temps rise; when I close it, the temps drop

    When I open Thunderbird, my graphics card temps rise; when I close it, the temps drop.
    I have tried reinstalling Thunderbird, but the problem persists.

    Does it make any difference if you disable hardware acceleration?
    https://support.mozilla.org/en-US/questions/1012145
    http://www.askvg.com/how-to-disable-hardware-acceleration-gpu-rendering-in-mozilla-firefox-4-0-to-fix-font-problem/
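    The checkbox lives under Tools > Options > Advanced > General ("Use hardware acceleration when available"). Alternatively, a hedged sketch that appends the equivalent Gecko pref to the profile's prefs.js (quit Thunderbird first; the profile path below is an assumption, so substitute your own):

        # Sketch: disable Gecko hardware acceleration by appending the pref
        # to the Thunderbird profile's prefs.js. Quit Thunderbird first.
        # The profile directory name below is hypothetical -- use your own.
        import os

        prefs = os.path.expanduser("~/.thunderbird/xxxxxxxx.default/prefs.js")
        with open(prefs, "a") as f:
            f.write('user_pref("layers.acceleration.disabled", true);\n')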

  • Need opinions on a VERY low-end power supply handling a low-end graphics card

    Greetings.
    I have acquired my father's old (old) HP Slimline S3400F and intend to turn it into a media computer.
    Problem: Due to the tiny form factor, it has an abysmal 160-watt PSU.
    Inquiry: I know that most manufacturers FAR overstate the actual required wattage for graphics cards; as such, I am looking at a fanless GT 610 to put in it. Do you think the PSU will be able to handle it, since all it will be doing is serving up video?
    (Full specs can be found easily on Google).
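    As a rough sanity check, here is a back-of-the-envelope power budget; the wattage figures are illustrative assumptions (NVIDIA rates the fanless GT 610 at roughly 29 W, but check your exact CPU and drive specs):

        # Back-of-the-envelope power budget for the Slimline + GT 610 idea.
        # All figures are rough assumptions -- verify against the real specs.
        psu_watts = 160
        cpu_watts = 65    # assumed desktop CPU TDP
        gpu_watts = 29    # NVIDIA's rated TDP for a fanless GT 610
        rest_watts = 35   # assumed motherboard + RAM + drive + fans

        total = cpu_watts + gpu_watts + rest_watts
        print(f"Estimated peak draw: {total} W of {psu_watts} W "
              f"({psu_watts - total} W headroom)")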

    Oh, I will be doing 1080p.
    My plan has slightly shifted to transplanting the HP's parts into a super cheap ATX case with a regular PSU. That's the going plan for now, at least.

  • Need new low-end graphics card for Power Mac G5 (PowerPC 970)

    My Power Mac G5 (PowerPC 970) has recently died, I believe due to a bad ATI Radeon X800 XT Mac Edition video card.
    The reason for thinking it's the card is that the machine kept performing its intended tasks for several days even though the monitor showed only snow.
    I'm having trouble determining a low-end replacement. (I no longer need the enhanced features of the Radeon X800 for this machine.)
    It would be much appreciated if someone could suggest a product appropriate for the PowerPC 970.
    Thanks!

    I also ran into this issue with my dual G5. When I bought it, I didn't get wireless because it was a dedicated desktop on the network, but I'm upgrading and giving it to the kids to use in a room without wired network. Now that wireless would be really nice and it looks like the official solution is:
    "Apple Wireless Upgrade Kit for Power Mac G5 Dual or Power Mac G5 Quad MA252G/A"
    which is really expensive and difficult to install. But the weird thing is that I do have bluetooth. I wonder if there's just some little doodad I need to connect to get wireless going too, but I can't find any info on the card, and it looks like it's buried on the motherboard behind all the cooling stuff. I'll probably just spring for one of the USB solutions.

  • Does CAS latency really make a difference?

    I have an MSI K8N Neo2 setup (AMD64 3500) and am wondering whether memory latency effects are really noticeable during heavy gaming or creating home movies for DVDs. Currently, I have two sticks of Corsair Value Select 512MB DDR PC-3200 (VS512MB400) with a CAS latency of 2.5. Is there any noticeable performance reason I should trade up and buy some better-performing memory? If so, does anyone have a tried-and-true favorite with the K8N Neo2? Any thoughts? I have never overclocked; I just run at stock speeds. Perhaps the performance gain is not worth the money?
    I also have:
    eVGA GeForce 6800NU
    2x 200 GB Seagate Barracuda IDE
    Sony DVD burner
    Enermax 465 W PSU
    Hauppauge PVR 250
    Netgear WG311v2
    4 case fans
    1 GB PC-3200

    Spread Spectrum Modulation was invented to reduce interference from high-order harmonics of the bus frequency. The theory is that, because every waveform generates higher-order harmonic waves (overtones), accumulation of the latter can result in interference with the original signal.
    One way to avoid this problem is to subject the base frequency to a slow (ca. 100,000 clock cycles) modulation, meaning that the FSB varies between e.g. +1% and -1% of the nominal value.
    In older boards, usually two different settings were available: either centered around the nominal value, or set with the nominal frequency as the maximum (low modulation). Most current boards employ the centered modulation.
    This is, at least, the official version of Spread Spectrum Modulation.
    In reality, there are different reasons for its implementation. With increasing operating frequency, electronic components emit electromagnetic interference (EMI). EMI, in turn, can cause interference with other devices and is therefore subject to regulation by the FCC, which limits the signal amplitude according to its guidelines. Any device exceeding the maximum allowable signal strength will not gain FCC approval and therefore cannot be marketed.
    In order to understand the reason for SSM, it is necessary to know how the FCC tests EMI. Basically, the testing device is a radio receiver, and the testing is done by sweeping its receiving frequency through the frequency range of interest and measuring the interference with the video and audio signals. The bandwidth sensitivity of the measuring device is on the order of about 1 MHz.
    If the operating frequency is modulated to spread over a bandwidth of typically 4-5 MHz, the same will happen to the EMI spectrum: instead of showing a sharp peak, the spectrum will be spread into a more or less Gaussian bell shape. In this case, the amplitude will of course be substantially smaller, on the order of 1/3 to 1/4 of the original peak. The energy, however, will be the same. The measuring instrument, with its bandwidth limited to only 1/4 of the spread, will consequently only see 1/3 to 1/4 of the EMI. Therefore, the system will obtain FCC approval even if it exceeds the guidelines.
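    To make the measurement argument concrete, here is the ratio as a tiny sketch (the bandwidth figures are the typical values quoted above):

        # The EMI energy is spread over ~4 MHz, but an FCC-style receiver
        # integrates only ~1 MHz at a time, so it sees just a fraction of it.
        spread_bw_mhz = 4.0    # typical spread-spectrum bandwidth (from text)
        receiver_bw_mhz = 1.0  # approximate measurement bandwidth (from text)

        visible_fraction = receiver_bw_mhz / spread_bw_mhz
        print(f"Receiver sees roughly {visible_fraction:.0%} of the spread EMI energy")
        # -> 25%, i.e. the 1/4 of the original peak quoted above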
    Recommended Settings
    If running at stock settings, enabling Spread Spectrum Modulation (SSM) may reduce EMI and cause less interference with wireless communication devices. On the other hand, enabling SSM can cause a system to crash. This is especially true when overclocking, simply because with the high multiplier values employed now, even a 0.5% modulation up and down can cause differences of up to as much as 10 MHz in clock speed within one modulation cycle. In other words, if the CPU is already operating at its limit, increasing the clock speed by another 10 MHz may be fatal. Therefore, for any overclocking, SSM should be turned off.
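    As a concrete check of that figure, here is the arithmetic as a short sketch (the 200 MHz FSB and 10x multiplier are assumed example values):

        # Worked example of the swing described above: a centered +/-0.5%
        # FSB modulation scales through the CPU multiplier.
        fsb_mhz = 200.0     # assumed nominal front-side bus
        multiplier = 10     # assumed CPU multiplier -> 2000 MHz core clock
        modulation = 0.005  # +/-0.5% spread spectrum modulation

        swing_mhz = fsb_mhz * multiplier * modulation
        print(f"Core clock varies by +/-{swing_mhz:.0f} MHz per modulation cycle")
        # -> +/-10 MHz, matching the figure quoted above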
    Another side effect of SSM is that it can interfere with the clock generator. This means that, instead of merely initializing SSM, it is possible to enable FSB settings that were never supported by the manufacturer. Examples are the Tyan Trinity 400, where disabling SSM results in an actual bus speed of 90 MHz instead of the selected 117 MHz, or the MSI 6309, where FSB settings of up to 200 MHz become available. The reason is that activating SSM can cancel out the FSB setting, since there can be a pin address overlap on the clock generator chip.
    (from lostcircuits.com)
    Let's say your CPU is putting out EMI; it's all concentrated on one "channel"/frequency of the spectrum. What spread spectrum does is broaden that out across multiple channels/frequencies so it isn't as "potent" or disruptive.
    http://www.rojakpot.com/default.aspx?location=8&var1=0&var2=115
    This is interesting: enabling FSB spread spectrum can cause internet dropouts.
    http://www.asus.com.tw/support/faq/qanda.aspx?KB_ID=84823
    If I can access it in the BIOS, I always disable it on any computer I'm maintaining.

  • Does MacKeeper really make a difference on your Mac

    Hi, does MacKeeper make a difference on your Mac? Will it make it fast, protect from viruses, and protect your data? I really need to know. Thanks

    Oh, it'll make a difference, but not in a good way. See below.
    https://discussions.apple.com/docs/DOC-3036
    Stedman

  • Single or double wide/high for graphic cards?

    Hi everyone,
    What is better in regard to graphics cards for the Mac Pro 2006 (model 1,1): double or single width? I am about to purchase an NVIDIA 8800 GT and don't know if I should choose the "flatter" or "thicker" one. The "thicker" one would obviously take up two spaces at the first PCI-E slot, and the "thinner" one would not. They both have the double DVI ports I was looking for. Is there a difference in performance?
    Rio

    Hi hatter,
    I had the ATI 5770, but seemingly it did not support Quartz Extreme (whatever that is), and I failed to notice that it did not provide two DVI ports (I don't like the mini port) and that it only "officially supports" Snow Leopard. I returned it to Apple.
    I understand where you are coming from regarding the NVIDIA 8800 GT card choices via eBay. According to their descriptions, some are listed as new items. Their descriptions also mention that they work with Tiger and up. Are you suggesting that the sellers are giving false info and that the cards may all be "flashed"? I was about to click the "Buy Now" button on one of the items but decided to seek last-minute advice via this forum. Now I'm unsure as to what to do...
    You also mention that Apple never made "double wide" 8800 GT. That means that the "flatter" 8800 GT card should work fine in my MacPro?
    PS: What is the difference between a single and double wide graphics card?
    If I may bring some humour to the situation: I feel I'm kind of "stuck" between a rock and a hard place, i.e., an ATI 5770 that doesn't seem to suit some apps on my Mac Pro, and an NVIDIA 8800 GT that "could" work for me?!?
    I'll probably need to take a "risk" with some choice of cards within a day or two.
    Anyway, thanks for your input The hatter.
    Rio

  • Does the T61 with integrated graphics card come with a 65W power adapter?

    Hi everyone, I am about to buy a T61 without the discrete graphics card.
    Please help! Does anyone know which type of power adapter it comes with, 65 W or 90 W?
    Thank you

    The integrated graphics card model should come with a 65 W power adapter, while the NVIDIA ones come with a 90 W power adapter.
