Quad-buffered OpenGL Stereo (Nvidia Quadro FX5600)

I have some early 2008 MacPros with Nvidia QuadroFX5600 graphics cards, which
were purchased expressly for doing stereo visualization and computation.  At the time they were
purchased, the best option for stereo monitors was from Planar (a setup with two monitors
and a teleprompter mirror) in which the left and right eye images are drawn to different
monitors (passive stereo, not requiring a fancy glasses/transmitter setup).
http://www.planar.com/products/desktop-touch-screen-monitors/stereoscopic-3d/
This setup works beautifully under Linux and Windows, but has never worked properly
under OSX - the only supported stereo model on OSX has been, it seems, the fancy
glasses/transmitter setup, and I have never been able to find a good reason why this
was not implemented in the graphics drivers on OSX.  The shutter glasses also give me
a fantastic headache with extended use due to eye strain, which is not a problem with
passive stereo.
We've been dealing with this by dual-booting into Ubuntu, but I'm seriously considering
ditching the whole OSX partition and settling on just using Ubuntu - the hardware has
been rock-solid otherwise.  It just seems a shame that the folks at Apple haven't
fully supported this part of the graphics/vis niche market, given that the MacOS is otherwise a
top-notch development environment.
Any thoughts on whether this will change with Mavericks, or whether anyone has had
any experience getting this kind of setup to work on either Lion or Mountain Lion?


Similar Messages

  • Quad-buffered OpenGL Stereo

    Hello all,
    Has anyone successfully gotten this to work? Currently I'm experimenting with a MacPro5,1.
    I'm trying to get an image sequence flipbook application called RV64 to output quad-buffered OpenGL stereo via an ATI Radeon HD 7950 to an ASUS VG278 HR monitor.
    The ATI card has a built-in HDMI 1.4 port, and the documentation for the card clearly states that Stereo 3D is supported over HDMI. I tested the monitor with a 3D blu-ray player and it works fine. I have a support ticket open with the manufacturer of the card, Sapphire Technology.
    I also have an NVidia Quadro 4000 for Mac that I can install, but from what I understand, the NVidia drivers on the Mac simply don't support 3D Vision. Is this still the case?
    Help is greatly appreciated.
    -n

    Hello Chris,
    Thanks for the information. I have a Quadro 4000 card around here that I can experiment with, but was hoping to get a response from Sapphire Technology regarding the ATI Radeon HD 7950 I had installed before I went down that road.
    I spoke to a very helpful tech at PNY on Friday who told me that it would be necessary to purchase the NVidia 3D Vision "Pro" kit, and that it would work with a monitor capable of a 120 Hz refresh rate over DVI or DisplayPort. Do you know if the "Pro" kit is required, which seems to run anywhere from $749 to $899, or if I could get away with the consumer/gaming version at $129?
    I'll add to the thread as soon as I hear back from Sapphire.
    Thanks for the information, I appreciate it!
    -n

  • Mac Pro 3.1 (early 2008) and nVIDIA Quadro FX5600

    Hi all,
    I am trying to install an nVIDIA Quadro FX5600 in a Mac Pro 3.1 (early 2008).
    In fact, I had already been using this graphics card in this computer; however, a few hours after installing the card, I updated the Mac OS, and that is when the problems began.
    Characteristics:
    -Mac Pro 3.1 (early 2008)
    -2 x 2.8 GHz Quad-Core Intel Xeon
    -12GB 800 MHz DDR2 FB-DIMM
    Graphics cards:
    -slot 1: ATI Radeon HD 2600 XT
    -slot 2: nVIDIA Quadro FX5600
    The story is a little bit long, but I think it is important to give as much information as possible:
    1-I purchased the nVIDIA Quadro FX5600 and took the computer and the card to an Apple reseller to install it.
    2-They installed it, physically, in slot 2, and left in slot 1 the ATI Radeon, the card included with the computer when I purchased it.
    3-They told me that after installing the drivers on Mac OS 10.10.2, they had kernel panics when restarting the computer.
    4-They installed the card in another Mac running Mac OS 10.9.5 to check if the card was compatible with Macs, and it worked perfectly.
    5-And finally, "they say", they repeated the same operation as in points 2 and 3 on my computer, and this time they didn't have kernel panics. So I took the computer home, ready to use my new graphics card.
    6-When I arrived home, I worked about 4-5 hours with the computer, and before ending the day and shutting it down, I installed two Mac updates. One of them was a security update and the other an iMovie update. The first one worked, but the iMovie one didn't; it gave an error. I decided not to give it any importance and closed the App Store.
    7-Just after closing the App Store, a message appeared from the nVidia controller saying that there was a new update. So I installed it, and once it was installed, it recommended restarting the computer, so I restarted.
    8-When the computer restarted, before the Mac OS desktop loaded, the kernel panics appeared, and seconds later the computer automatically restarted again. The same thing happened continuously, every time.
    9-The computer's disk is divided into two partitions, Mac OS and Windows 7 Professional. So I decided to erase all the content of the Mac OS partition and install OS 10.10.2 again from scratch (I have a Time Machine copy of all my documents).
    10-Once Mac OS 10.10.2 (14C1510) was installed again, I installed the Quadro & GeForce Mac OS X driver release 343.02.02 again, which is the driver compatible with the operating system.
    11-The same problem occurred: when restarting the computer after installing the drivers, the same problem as in point 3 happened again.
    12-I repeated the same operation as in points 9 and 10; however, this time, before restarting the computer after installing the nVIDIA drivers, I installed the CUDA 6.5.46 driver, and the same problem occurred.
    The key point is that the computer had been working with this graphics card; since then, however, the story has turned crazy...
    Has anyone experienced the same problems? Any clue about it?
    Many thanks in advance!

    You probably read how that 10.10.2 security update changed the build number of the OS, and how Apple is still using a really old Nvidia driver from 10.8.2.
    The so-called "Security Update 2015-002" from today will purposely break nVidia drivers!
    http://forums.macrumors.com/showthread.php?t=1853748
    Mixing ATI and Nvidia in older versions of Windows was a no-no.
    With 10.10 I would ditch the 2600XT; it is doing nothing for you except taking up a slot.
    You really could have just asked here; there was no need to drag 60 lbs of computer to have someone else put a GPU in and attach the 6-pin aux power.
    Another site along with MacIntouch to check daily and keep your eye on:
    http://www.xlr8yourmac.com/
    Nvidia Graphics Driver Update for OS X 10.10.2 w/Security Update 2015-002
    Nvidia download page for Graphics driver 343.02.02f03 for OS X 10.10.2 (build 14C1510). Here's a clip from the main D/L page tab (other than the version info, it's typical boilerplate).
    "New in Release 343.02.02f03
    Graphics driver updated for OS X Yosemite 10.10.2 (14C1510) Contains performance improvements and bug fixes for a wide range of applications.
    Includes NVIDIA Driver Manager preference pane."
    See linked page for more info. (As of March 12th post date still links to Cuda 6.5.46 from January 28th but that could change.)

  • Nvidia quadro 4000 for mac and os 10.6.8 on After Effects CS6 (11.0.2) Cuda don't work

    Hi,
    We bought an Nvidia Quadro 4000 for Mac to use the ray-traced renderer in After Effects CS6 (11.0.2) with our Mac Pro 5.1 on OS 10.6.8.
    The Mac loses the CUDA driver when we restart it.
    We use the Nvidia driver 256.02.25f01 for Mac OS 10.6.8 and have tested different CUDA drivers (4.1.25, 4.2.10, 5.0.37 and 5.0.47).
    When we install the CUDA driver and start After Effects 11.0.2 right away, it works (After Effects sees the GPU).
    But when we restart the Mac and try to launch After Effects, AE doesn't see the GPU and asks us to install CUDA driver 4.0 or later.
    Do you know this problem?
    Do you have a solution?
    Thanks.
    And sorry for my English!

    Hi All,
    I'm having the exact same problem on 10.6.8 with a brand new Quadro 4000.  My CUDA preferences are up to date, but After Effects CS6 is not seeing the card.
    GPUSniffer gives me some really weird results, which I've included.  Of particular note there is a repeated "just leaking" message, an "invalid drawable" and "Did not find any devices that support GPU computation."
    Anyone have a fix for this?  We just dropped a lot of money on this card so I could squeeze some more life out of this workstation (Mac Pro 4,1 dual quad 2.66 with 12GB RAM).
    Last login: Fri May 31 10:20:47 on console
    edit-e-room-20:~ johnlee$ /Applications/Adobe\ Premiere\ Pro\ CS6/Adobe\ Premiere\ Pro\ CS6.app/Contents/GPUSniffer.app/Contents/MacOS/GPUSniffer
    --- OpenGL Info ---
    2013-05-31 10:28:52.456 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x1046095a0 of class NSCFString autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.458 GPUSniffer[71708:903] invalid drawable
    Vendor: NVIDIA Corporation
    Renderer: NVIDIA Quadro 4000 OpenGL Engine
    OpenGL Version: 2.1 NVIDIA-1.6.37
    GLSL Version: 1.20
    Monitors: 2
    Monitor 0 properties -
       Size: (0, 0, 2560, 1600)
       Max texture size: 16384
       Supports non-power of two: 1
       Shaders 444: 1
       Shaders 422: 1
       Shaders 420: 1
    2013-05-31 10:28:52.467 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x10411d900 of class NSCFString autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.467 GPUSniffer[71708:903] invalid drawable
    Monitor 1 properties -
       Size: (2560, 0, 1600, 1200)
       Max texture size: 16384
       Supports non-power of two: 1
       Shaders 444: 1
       Shaders 422: 1
       Shaders 420: 1
    --- GPU Computation Info ---
    Did not find any devices that support GPU computation.
    2013-05-31 10:28:52.471 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x1041234d0 of class NSCFArray autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.471 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x1041236a0 of class NSCFArray autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.471 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x10411c030 of class NSView autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.471 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x10411c030 of class NSView autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.472 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x10411b000 of class NSCFArray autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.472 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x104110e00 of class NSCFArray autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.472 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x10411ccc0 of class NSView autoreleased with no pool in place - just leaking
    2013-05-31 10:28:52.473 GPUSniffer[71708:903] *** __NSAutoreleaseNoPool(): Object 0x10411ccc0 of class NSView autoreleased with no pool in place - just leaking
    edit-e-room-20:~ johnlee$
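    One way to check whether the CUDA driver survived a reboot, independently of Adobe's GPUSniffer, is to query the CUDA runtime directly. A minimal sketch (illustrative only; assumes the CUDA toolkit is installed so it can be built with nvcc):
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void)
    {
        int drv = 0, rt = 0, n = 0;
        cudaDriverGetVersion(&drv);   /* 0 here suggests no (or a stale) CUDA driver */
        cudaRuntimeGetVersion(&rt);
        printf("driver API %d, runtime API %d\n", drv, rt);
        cudaError_t err = cudaGetDeviceCount(&n);
        if (err != cudaSuccess)
            printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        else
            printf("CUDA devices found: %d\n", n);
        return 0;
    }
    If this finds the Quadro right after a CUDA install but fails after a reboot, the driver is being unloaded or replaced on restart, which matches the symptom described above.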

  • NVidia Quadro FX 1300 - CS4 - Sloooow!

    I have a Dell 690 workstation with an Nvidia Quadro FX 1300. The problem is that Photoshop CS4 is running super slow. It is impossible to get any work done.
    When I look under Preferences - Performance, under GPU settings, the Detected Video Card field is empty and the "Enable OpenGL" and "Advanced settings" controls are both disabled. I have the latest FX driver from NVidia installed.
    My workstation is a dual dual-core 2.33 GHz with 3 GB RAM. I tried the reg keys posted by the Adobe engineer with no luck.
    Please advise. Thanks in advance.
    I probably need to go back to CS3 until this issue is resolved. Very frustrating...

    It appears that I wasn't really too clear in my earlier email. I'm using Vista Ultimate x64, an OS that is, according to literature coming out of Adobe and Microsoft, supported by Photoshop CS4.
    Thanks Fred for the XPx64 link - I can find no similar articles about Vista though, so I'm still unclear whether the lack of OpenGL support is down to 64-bit issues or anything else.
    http://kb.adobe.com/selfservice/viewContent.do?externalId=kb404898#vista64 suggests that there should be no such problems with Vista64.
    Unfortunately there are reports of the Adobe Updater also working incorrectly with Vista x64, so I think I may just enjoy the phenomenal speed of the demo for a few days and wait 'til CS5.
    The only problem I am having is with CS4 spotting the video card. The image at: http://www.loveyourpix.com/images/preferences.png compares CS3's preferences (top, video card spotted) and CS4's (64-bit) preferences on the same machine. The KB article Fred pointed at had this classic line:
    "If the card is too old to understand the commands, or you are running Photoshop CS4 on Windows XP 64-bit Edition, Photoshop doesn't talk to the display card, and turns off all OpenGL features."
    Strange that a conversation started in CS3 extended should be stopped in CS4.
    Like I wrote before, on the whole CS4 performs at a very impressive rate, and even without the OpenGL support 3D handling is still much better than CS3's on the same machine.
    WRT drivers and video card settings, the card is using the prepackaged Photoshop CS4 and CS4x64 settings as defined in the NVIDIA control panel. (My main reason for trying the demo was actually because of Bridge - Bridge CS3 does not support quad core processors and Version Cue scripts in Bridge fail to run).
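    To rule out the OS-level OpenGL stack, a tiny program that creates a context and prints what the driver reports will show whether the card is visible outside Photoshop. A minimal sketch (illustrative only; assumes a GLUT implementation such as freeglut):
    #include <GL/glut.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
        glutCreateWindow("gl info");  /* glGetString returns NULL until a context exists */
        printf("vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("renderer: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("version:  %s\n", (const char *)glGetString(GL_VERSION));
        return 0;
    }
    If this reports the Quadro FX 1300 and a sane GL version while Photoshop's Detected Video Card field stays empty, the problem is in Photoshop's card detection rather than in the OpenGL installation itself.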

  • NVidia Quadro 4500 for mac and pc question

    I am thinking of purchasing a new Mac Pro 3.0GHz Quad-Core desktop with the nVidia Quadro 4500 graphics card. I want to run Solidworks on Windows XP SP2 and also be able to boot into Mac OS X. I have done hours of research trying to make sure it will work but can't seem to find any posts that suggest my system will work as I intend.
    I have found out that the nVidia Quadro 4500 needs to be configured for the Mac (firmware) and that you can't just buy one from eBay and install it. There are PC versions and Mac versions. Also, flashing it to work with a Mac is very difficult and may damage the card if you get it wrong. Not something I want to do.
    So, my question is: if I buy this Mac Pro from Apple with the nVidia Quadro 4500 card already installed, will I have any problems with it working on the Windows side of things when I boot into Windows XP? Does anyone have the setup that I am talking about and also running SolidWorks on the Windows side?
    Thanks in advance for any help you can give me.
    Yee Ha!

    Hi,
    Mac OS X Leopard has Boot Camp included with the software. I have read online (on the Apple site, amongst others) that the Boot Camp bundled with Leopard has all the drivers for Windows included, so you no longer need to create a disc with the drivers needed for Windows (as you did under the Boot Camp beta version).
    That being the case, Leopard should have the required drivers for your Quadro FX 4500 to function under Windows.
    See here for details:
    http://www.apple.com/uk/macosx/features/bootcamp.html
    Hope this helps

  • Vdpau and full screen youtube problems on Nvidia Quadro NVS 140M

    I have an Nvidia NVS 140M on 64-bit Arch. I am trying to find the perfect configuration for my Thinkpad, with no luck.
    I used to have the nouveau drivers and am now using nvidia-173xx. The problems are the following:
    1) YouTube videos:
    - with nouveau I was not able to watch any video (it was like a slideshow)
    - with nvidia everything works fine.
    I'd prefer nouveau over nvidia, because of KMS.
    2) Glxgears
    As far as I remember, you can check whether acceleration is working using this tool - CPU usage should not change after glxgears starts. With both drivers (nouveau and nvidia), a running glxgears process uses 100% CPU.
    3) vdpau
    According to this thread, my card supports vdpau. I cannot make it work.
    Additional information:
    thinkpad ~ $ pacman -Qs nvidia
    local/lib32-libvdpau 0.4.1-3
    Nvidia VDPAU library (32-bit)
    local/libvdpau 0.4.1-1
    Nvidia VDPAU library
    local/nvidia-173xx 173.14.28-3
    NVIDIA drivers for kernel26, 173xx branch.
    local/nvidia-173xx-utils 173.14.28-1
    NVIDIA drivers utilities and libraries, 173xx branch.
    thinkpad ~ $ vainfo
    libva: libva version 0.31.1
    Xlib: extension "XFree86-DRI" missing on display ":0.0".
    libva: va_getDriverName() returns 0
    libva: Trying to open /usr/lib/dri/nvidia_drv_video.so
    Failed to open VDPAU backend libvdpau_nvidia.so: cannot open shared object file: No such file or directory
    libva error: /usr/lib/dri/nvidia_drv_video.so init failed
    libva: va_openDriver() returns -1
    vaInitialize failed with error code -1 (unknown libva error),exit
    thinkpad ~ $ cat /etc/X11/xorg.conf.d/20-nvidia.conf
    Section "Module"
    Load "glx"
    Disable "dri"
    Disable "dri2"
    EndSection
    Section "Device"
    Identifier "Default nvidia Device"
    Driver "nvidia"
    Option "NoLogo" "True"
    EndSection
    thinkpad ~ $ glxinfo
    name of display: :0.0
    display: :0 screen: 0
    direct rendering: Yes
    server glx vendor string: NVIDIA Corporation
    server glx version string: 1.4
    server glx extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer,
    GLX_ARB_fbconfig_float, GLX_EXT_framebuffer_sRGB
    client glx vendor string: NVIDIA Corporation
    client glx version string: 1.4
    client glx extensions:
    GLX_ARB_get_proc_address, GLX_ARB_multisample, GLX_EXT_visual_info,
    GLX_EXT_visual_rating, GLX_EXT_import_context, GLX_SGI_video_sync,
    GLX_NV_swap_group, GLX_NV_video_out, GLX_SGIX_fbconfig, GLX_SGIX_pbuffer,
    GLX_SGI_swap_control, GLX_NV_float_buffer, GLX_ARB_fbconfig_float,
    GLX_EXT_fbconfig_packed_float, GLX_EXT_texture_from_pixmap,
    GLX_EXT_framebuffer_sRGB, GLX_NV_present_video
    GLX version: 1.3
    GLX extensions:
    GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    GLX_EXT_texture_from_pixmap, GLX_ARB_multisample, GLX_NV_float_buffer,
    GLX_ARB_fbconfig_float, GLX_EXT_framebuffer_sRGB,
    GLX_ARB_get_proc_address
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: Quadro NVS 140M/PCI/SSE2
    OpenGL version string: 2.1.2 NVIDIA 173.14.28
    OpenGL shading language version string: 1.20 NVIDIA via Cg compiler
    OpenGL extensions:
    GL_ARB_color_buffer_float, GL_ARB_depth_texture, GL_ARB_draw_buffers,
    GL_ARB_fragment_program, GL_ARB_fragment_program_shadow,
    thinkpad ~ $ cat /var/log/Xorg.0.log
    [ 2830.098]
    X.Org X Server 1.9.4
    Release Date: 2011-02-04
    [ 2830.098] X Protocol Version 11, Revision 0
    [ 2830.099] Build Operating System: Linux 2.6.37-ARCH x86_64
    [ 2830.099] Current Operating System: Linux thinkpad 2.6.37-ARCH #1 SMP PREEMPT Tue Mar 8 08:34:35 CET 2011 x86_64
    [ 2830.099] Kernel command line: root=/dev/sda3 ro
    [ 2830.100] Build Date: 04 February 2011 09:38:18PM
    [ 2830.100]
    [ 2830.100] Current version of pixman: 0.20.2
    [ 2830.101] Before reporting problems, check http://wiki.x.org
    to make sure that you have the latest version.
    [ 2830.101] Markers: (--) probed, (**) from config file, (==) default setting,
    (++) from command line, (!!) notice, (II) informational,
    (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
    [ 2830.103] (==) Log file: "/var/log/Xorg.0.log", Time: Sat Mar 12 21:24:44 2011
    [ 2830.104] (==) Using config directory: "/etc/X11/xorg.conf.d"
    [ 2830.104] (==) No Layout section. Using the first Screen section.
    [ 2830.104] (==) No screen section available. Using defaults.
    [ 2830.104] (**) |-->Screen "Default Screen Section" (0)
    [ 2830.104] (**) | |-->Monitor "<default monitor>"
    [ 2830.105] (==) No device specified for screen "Default Screen Section".
    Using the first device section listed.
    [ 2830.105] (**) | |-->Device "Default nvidia Device"
    [ 2830.105] (==) No monitor specified for screen "Default Screen Section".
    Using a default monitor configuration.
    [ 2830.105] (==) Automatically adding devices
    [ 2830.105] (==) Automatically enabling devices
    [ 2830.105] (WW) The directory "/usr/share/fonts/OTF/" does not exist.
    [ 2830.105] Entry deleted from font path.
    [ 2830.105] (==) FontPath set to:
    /usr/share/fonts/misc/,
    /usr/share/fonts/TTF/,
    /usr/share/fonts/Type1/,
    /usr/share/fonts/100dpi/,
    /usr/share/fonts/75dpi/
    [ 2830.105] (==) ModulePath set to "/usr/lib/xorg/modules"
    [ 2830.105] (II) The server relies on udev to provide the list of input devices.
    If no devices become available, reconfigure udev or disable AutoAddDevices.
    [ 2830.105] (II) Loader magic: 0x7d3b20
    [ 2830.105] (II) Module ABI versions:
    [ 2830.105] X.Org ANSI C Emulation: 0.4
    [ 2830.105] X.Org Video Driver: 8.0
    [ 2830.105] X.Org XInput driver : 11.0
    [ 2830.105] X.Org Server Extension : 4.0
    [ 2830.108] (--) PCI:*(0:1:0:0) 10de:0429:17aa:20d8 rev 161, Mem @ 0xd6000000/16777216, 0xe0000000/268435456, 0xd4000000/33554432, I/O @ 0x00002000/128
    [ 2830.108] (II) Open ACPI successful (/var/run/acpid.socket)
    [ 2830.108] (WW) "dri" will not be loaded unless you've specified it to be loaded elsewhere.
    [ 2830.108] (WW) "dri2" will not be loaded unless you've specified it to be loaded elsewhere.
    [ 2830.108] (II) "extmod" will be loaded by default.
    [ 2830.108] (II) "dbe" will be loaded by default.
    [ 2830.108] (II) "glx" will be loaded. This was enabled by default and also specified in the config file.
    [ 2830.108] (II) "record" will be loaded by default.
    [ 2830.108] (II) "dri" will be loaded even though the default is to disable it.
    [ 2830.108] (II) "dri2" will be loaded even though the default is to disable it.
    [ 2830.108] (II) LoadModule: "glx"
    [ 2830.109] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
    [ 2830.118] (II) Module glx: vendor="NVIDIA Corporation"
    [ 2830.118] compiled for 4.0.2, module version = 1.0.0
    [ 2830.118] Module class: X.Org Server Extension
    [ 2830.118] (II) NVIDIA GLX Module 173.14.28 Wed Sep 29 10:19:01 PDT 2010
    [ 2830.118] (II) Loading extension GLX
    [ 2830.118] (II) LoadModule: "extmod"
    [ 2830.118] (II) Loading /usr/lib/xorg/modules/extensions/libextmod.so
    [ 2830.118] (II) Module extmod: vendor="X.Org Foundation"
    [ 2830.118] compiled for 1.9.4, module version = 1.0.0
    [ 2830.119] Module class: X.Org Server Extension
    [ 2830.119] ABI class: X.Org Server Extension, version 4.0
    [ 2830.119] (II) Loading extension MIT-SCREEN-SAVER
    [ 2830.119] (II) Loading extension XFree86-VidModeExtension
    [ 2830.119] (II) Loading extension XFree86-DGA
    [ 2830.119] (II) Loading extension DPMS
    [ 2830.119] (II) Loading extension XVideo
    [ 2830.119] (II) Loading extension XVideo-MotionCompensation
    [ 2830.119] (II) Loading extension X-Resource
    [ 2830.119] (II) LoadModule: "dbe"
    [ 2830.119] (II) Loading /usr/lib/xorg/modules/extensions/libdbe.so
    [ 2830.119] (II) Module dbe: vendor="X.Org Foundation"
    [ 2830.119] compiled for 1.9.4, module version = 1.0.0
    [ 2830.119] Module class: X.Org Server Extension
    [ 2830.119] ABI class: X.Org Server Extension, version 4.0
    [ 2830.119] (II) Loading extension DOUBLE-BUFFER
    [ 2830.119] (II) LoadModule: "record"
    [ 2830.119] (II) Loading /usr/lib/xorg/modules/extensions/librecord.so
    [ 2830.119] (II) Module record: vendor="X.Org Foundation"
    [ 2830.119] compiled for 1.9.4, module version = 1.13.0
    [ 2830.119] Module class: X.Org Server Extension
    [ 2830.119] ABI class: X.Org Server Extension, version 4.0
    [ 2830.119] (II) Loading extension RECORD
    [ 2830.119] (II) LoadModule: "nvidia"
    [ 2830.119] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
    [ 2830.120] (II) Module nvidia: vendor="NVIDIA Corporation"
    [ 2830.120] compiled for 4.0.2, module version = 1.0.0
    [ 2830.120] Module class: X.Org Video Driver
    [ 2830.120] (II) NVIDIA dlloader X Driver 173.14.28 Wed Sep 29 10:00:06 PDT 2010
    [ 2830.120] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
    [ 2830.120] (--) using VT number 7
    [ 2830.124] (II) Loading sub module "fb"
    [ 2830.124] (II) LoadModule: "fb"
    [ 2830.124] (II) Loading /usr/lib/xorg/modules/libfb.so
    [ 2830.124] (II) Module fb: vendor="X.Org Foundation"
    [ 2830.124] compiled for 1.9.4, module version = 1.0.0
    [ 2830.124] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 2830.124] (II) Loading sub module "wfb"
    [ 2830.125] (II) LoadModule: "wfb"
    [ 2830.125] (II) Loading /usr/lib/xorg/modules/libwfb.so
    [ 2830.125] (II) Module wfb: vendor="X.Org Foundation"
    [ 2830.125] compiled for 1.9.4, module version = 1.0.0
    [ 2830.125] ABI class: X.Org ANSI C Emulation, version 0.4
    [ 2830.125] (II) Loading sub module "ramdac"
    [ 2830.125] (II) LoadModule: "ramdac"
    [ 2830.125] (II) Module "ramdac" already built-in
    [ 2830.125] (II) NVIDIA(0): Creating default Display subsection in Screen section
    "Default Screen Section" for depth/fbbpp 24/32
    [ 2830.125] (==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
    [ 2830.125] (==) NVIDIA(0): RGB weight 888
    [ 2830.125] (==) NVIDIA(0): Default visual is TrueColor
    [ 2830.125] (==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
    [ 2830.125] (**) NVIDIA(0): Option "NoLogo" "True"
    [ 2830.125] (**) NVIDIA(0): Enabling RENDER acceleration
    [ 2830.125] (II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
    [ 2830.126] (II) NVIDIA(0): enabled.
    [ 2834.780] (II) NVIDIA(0): NVIDIA GPU Quadro NVS 140M (G86GL) at PCI:1:0:0 (GPU-0)
    [ 2834.780] (--) NVIDIA(0): Memory: 524288 kBytes
    [ 2834.780] (--) NVIDIA(0): VideoBIOS: 60.86.3e.00.00
    [ 2834.780] (II) NVIDIA(0): Detected PCI Express Link width: 16X
    [ 2834.780] (--) NVIDIA(0): Interlaced video modes are supported on this GPU
    [ 2834.780] (--) NVIDIA(0): Connected display device(s) on Quadro NVS 140M at PCI:1:0:0:
    [ 2834.781] (--) NVIDIA(0): LEN (DFP-0)
    [ 2834.781] (--) NVIDIA(0): LEN (DFP-0): 330.0 MHz maximum pixel clock
    [ 2834.781] (--) NVIDIA(0): LEN (DFP-0): Internal Dual Link LVDS
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): HorizSync range (42.088-49.305 kHz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's HorizSync (32.9 kHz); ignoring HorizSync check for
    [ 2834.785] (WW) NVIDIA(0): mode "1280x800".
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): VertRefresh range (52.000-60.000 Hz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's VertRefresh (39.9 Hz); ignoring VertRefresh check
    [ 2834.785] (WW) NVIDIA(0): for mode "1280x800".
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): HorizSync range (42.088-49.305 kHz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's HorizSync (32.9 kHz); ignoring HorizSync check for
    [ 2834.785] (WW) NVIDIA(0): mode "1280x800".
    [ 2834.785] (WW) NVIDIA(0): The EDID for LEN (DFP-0) contradicts itself: mode "1280x800"
    [ 2834.785] (WW) NVIDIA(0): is specified in the EDID; however, the EDID's valid
    [ 2834.785] (WW) NVIDIA(0): VertRefresh range (52.000-60.000 Hz) would exclude this
    [ 2834.785] (WW) NVIDIA(0): mode's VertRefresh (39.9 Hz); ignoring VertRefresh check
    [ 2834.785] (WW) NVIDIA(0): for mode "1280x800".
    [ 2834.787] (II) NVIDIA(0): Assigned Display Device: DFP-0
    [ 2834.787] (==) NVIDIA(0):
    [ 2834.787] (==) NVIDIA(0): No modes were requested; the default mode "nvidia-auto-select"
    [ 2834.787] (==) NVIDIA(0): will be used as the requested mode.
    [ 2834.787] (==) NVIDIA(0):
    [ 2834.787] (II) NVIDIA(0): Validated modes:
    [ 2834.787] (II) NVIDIA(0): "nvidia-auto-select"
    [ 2834.787] (II) NVIDIA(0): Virtual screen size determined to be 1280 x 800
    [ 2836.069] (--) NVIDIA(0): DPI set to (98, 96); computed from "UseEdidDpi" X config
    [ 2836.069] (--) NVIDIA(0): option
    [ 2836.069] (==) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
    [ 2836.069] (--) Depth 24 pixmap format is 32 bpp
    [ 2836.074] (II) NVIDIA(0): Initialized GPU GART.
    [ 2836.084] (II) NVIDIA(0): Setting mode "nvidia-auto-select"
    [ 2836.968] (II) Loading extension NV-GLX
    [ 2837.035] (II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
    [ 2837.037] (II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
    [ 2837.037] (==) NVIDIA(0): Backing store disabled
    [ 2837.037] (==) NVIDIA(0): Silken mouse enabled
    [ 2837.040] (==) NVIDIA(0): DPMS enabled
    [ 2837.040] (II) Loading extension NV-CONTROL
    [ 2837.041] (==) RandR enabled
    [ 2837.041] (II) Initializing built-in extension Generic Event Extension
    [ 2837.041] (II) Initializing built-in extension SHAPE
    [ 2837.041] (II) Initializing built-in extension MIT-SHM
    [ 2837.041] (II) Initializing built-in extension XInputExtension
    [ 2837.041] (II) Initializing built-in extension XTEST
    [ 2837.041] (II) Initializing built-in extension BIG-REQUESTS
    [ 2837.041] (II) Initializing built-in extension SYNC
    [ 2837.041] (II) Initializing built-in extension XKEYBOARD
    [ 2837.041] (II) Initializing built-in extension XC-MISC
    [ 2837.041] (II) Initializing built-in extension SECURITY
    [ 2837.041] (II) Initializing built-in extension XINERAMA
    [ 2837.042] (II) Initializing built-in extension XFIXES
    [ 2837.042] (II) Initializing built-in extension RENDER
    [ 2837.042] (II) Initializing built-in extension RANDR
    [ 2837.042] (II) Initializing built-in extension COMPOSITE
    [ 2837.042] (II) Initializing built-in extension DAMAGE
    [ 2837.042] (II) Initializing extension GLX
    [ 2837.204] (II) config/udev: Adding input device Power Button (/dev/input/event4)
    [ 2837.204] (**) Power Button: Applying InputClass "evdev keyboard catchall"
    [ 2837.204] (II) LoadModule: "evdev"
    [ 2837.204] (II) Loading /usr/lib/xorg/modules/input/evdev_drv.so
    [ 2837.205] (II) Module evdev: vendor="X.Org Foundation"
    [ 2837.205] compiled for 1.9.4, module version = 2.6.0
    [ 2837.205] Module class: X.Org XInput Driver
    [ 2837.205] ABI class: X.Org XInput driver, version 11.0
    [ 2837.205] (**) Power Button: always reports core events
    [ 2837.205] (**) Power Button: Device: "/dev/input/event4"
    [ 2837.216] (--) Power Button: Found keys
    [ 2837.216] (II) Power Button: Configuring as keyboard
    [ 2837.216] (II) XINPUT: Adding extended input device "Power Button" (type: KEYBOARD)
    [ 2837.216] (**) Option "xkb_rules" "evdev"
    [ 2837.216] (**) Option "xkb_model" "evdev"
    [ 2837.216] (**) Option "xkb_layout" "us"
    [ 2837.264] (II) config/udev: Adding input device Video Bus (/dev/input/event3)
    [ 2837.264] (**) Video Bus: Applying InputClass "evdev keyboard catchall"
    [ 2837.264] (**) Video Bus: always reports core events
    [ 2837.264] (**) Video Bus: Device: "/dev/input/event3"
    [ 2837.279] (--) Video Bus: Found keys
    [ 2837.279] (II) Video Bus: Configuring as keyboard
    [ 2837.279] (II) XINPUT: Adding extended input device "Video Bus" (type: KEYBOARD)
    [ 2837.279] (**) Option "xkb_rules" "evdev"
    [ 2837.280] (**) Option "xkb_model" "evdev"
    [ 2837.280] (**) Option "xkb_layout" "us"
    [ 2837.284] (II) config/udev: Adding input device Lid Switch (/dev/input/event1)
    [ 2837.284] (II) No input driver/identifier specified (ignoring)
    [ 2837.284] (II) config/udev: Adding input device Sleep Button (/dev/input/event2)
    [ 2837.284] (**) Sleep Button: Applying InputClass "evdev keyboard catchall"
    [ 2837.284] (**) Sleep Button: always reports core events
    [ 2837.284] (**) Sleep Button: Device: "/dev/input/event2"
    [ 2837.306] (--) Sleep Button: Found keys
    [ 2837.306] (II) Sleep Button: Configuring as keyboard
    [ 2837.306] (II) XINPUT: Adding extended input device "Sleep Button" (type: KEYBOARD)
    [ 2837.306] (**) Option "xkb_rules" "evdev"
    [ 2837.306] (**) Option "xkb_model" "evdev"
    [ 2837.306] (**) Option "xkb_layout" "us"
    [ 2837.312] (II) config/udev: Adding input device Logitech USB Laser Mouse (/dev/input/event9)
    [ 2837.312] (**) Logitech USB Laser Mouse: Applying InputClass "evdev pointer catchall"
    [ 2837.312] (**) Logitech USB Laser Mouse: always reports core events
    [ 2837.312] (**) Logitech USB Laser Mouse: Device: "/dev/input/event9"
    [ 2837.333] (--) Logitech USB Laser Mouse: Found 12 mouse buttons
    [ 2837.333] (--) Logitech USB Laser Mouse: Found scroll wheel(s)
    [ 2837.333] (--) Logitech USB Laser Mouse: Found relative axes
    [ 2837.333] (--) Logitech USB Laser Mouse: Found x and y relative axes
    [ 2837.333] (II) Logitech USB Laser Mouse: Configuring as mouse
    [ 2837.333] (II) Logitech USB Laser Mouse: Adding scrollwheel support
    [ 2837.333] (**) Logitech USB Laser Mouse: YAxisMapping: buttons 4 and 5
    [ 2837.333] (**) Logitech USB Laser Mouse: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    [ 2837.333] (II) XINPUT: Adding extended input device "Logitech USB Laser Mouse" (type: MOUSE)
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) keeping acceleration scheme 1
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) acceleration profile 0
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) acceleration factor: 2.000
    [ 2837.333] (**) Logitech USB Laser Mouse: (accel) acceleration threshold: 4
    [ 2837.333] (II) Logitech USB Laser Mouse: initialized for relative axes.
    [ 2837.334] (II) config/udev: Adding input device Logitech USB Laser Mouse (/dev/input/mouse1)
    [ 2837.334] (II) No input driver/identifier specified (ignoring)
    [ 2837.335] (II) config/udev: Adding input device HDA Digital PCBeep (/dev/input/event7)
    [ 2837.335] (II) No input driver/identifier specified (ignoring)
    [ 2837.342] (II) config/udev: Adding input device Integrated Camera (/dev/input/event8)
    [ 2837.342] (**) Integrated Camera: Applying InputClass "evdev keyboard catchall"
    [ 2837.342] (**) Integrated Camera: always reports core events
    [ 2837.342] (**) Integrated Camera: Device: "/dev/input/event8"
    [ 2837.373] (--) Integrated Camera: Found keys
    [ 2837.373] (II) Integrated Camera: Configuring as keyboard
    [ 2837.373] (II) XINPUT: Adding extended input device "Integrated Camera" (type: KEYBOARD)
    [ 2837.373] (**) Option "xkb_rules" "evdev"
    [ 2837.373] (**) Option "xkb_model" "evdev"
    [ 2837.373] (**) Option "xkb_layout" "us"
    [ 2837.382] (II) config/udev: Adding input device AT Translated Set 2 keyboard (/dev/input/event0)
    [ 2837.382] (**) AT Translated Set 2 keyboard: Applying InputClass "evdev keyboard catchall"
    [ 2837.382] (**) AT Translated Set 2 keyboard: always reports core events
    [ 2837.382] (**) AT Translated Set 2 keyboard: Device: "/dev/input/event0"
    [ 2837.403] (--) AT Translated Set 2 keyboard: Found keys
    [ 2837.403] (II) AT Translated Set 2 keyboard: Configuring as keyboard
    [ 2837.403] (II) XINPUT: Adding extended input device "AT Translated Set 2 keyboard" (type: KEYBOARD)
    [ 2837.403] (**) Option "xkb_rules" "evdev"
    [ 2837.403] (**) Option "xkb_model" "evdev"
    [ 2837.403] (**) Option "xkb_layout" "us"
    [ 2837.404] (II) config/udev: Adding input device TPPS/2 IBM TrackPoint (/dev/input/event6)
    [ 2837.404] (**) TPPS/2 IBM TrackPoint: Applying InputClass "evdev pointer catchall"
    [ 2837.404] (**) TPPS/2 IBM TrackPoint: always reports core events
    [ 2837.404] (**) TPPS/2 IBM TrackPoint: Device: "/dev/input/event6"
    [ 2837.419] (--) TPPS/2 IBM TrackPoint: Found 3 mouse buttons
    [ 2837.419] (--) TPPS/2 IBM TrackPoint: Found relative axes
    [ 2837.419] (--) TPPS/2 IBM TrackPoint: Found x and y relative axes
    [ 2837.419] (II) TPPS/2 IBM TrackPoint: Configuring as mouse
    [ 2837.419] (**) TPPS/2 IBM TrackPoint: YAxisMapping: buttons 4 and 5
    [ 2837.419] (**) TPPS/2 IBM TrackPoint: EmulateWheelButton: 4, EmulateWheelInertia: 10, EmulateWheelTimeout: 200
    [ 2837.419] (II) XINPUT: Adding extended input device "TPPS/2 IBM TrackPoint" (type: MOUSE)
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) keeping acceleration scheme 1
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) acceleration profile 0
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) acceleration factor: 2.000
    [ 2837.420] (**) TPPS/2 IBM TrackPoint: (accel) acceleration threshold: 4
    [ 2837.420] (II) TPPS/2 IBM TrackPoint: initialized for relative axes.
    [ 2837.420] (II) config/udev: Adding input device TPPS/2 IBM TrackPoint (/dev/input/mouse0)
    [ 2837.420] (II) No input driver/identifier specified (ignoring)
    [ 2837.422] (II) config/udev: Adding input device ThinkPad Extra Buttons (/dev/input/event5)
    [ 2837.422] (**) ThinkPad Extra Buttons: Applying InputClass "evdev keyboard catchall"
    [ 2837.422] (**) ThinkPad Extra Buttons: always reports core events
    [ 2837.422] (**) ThinkPad Extra Buttons: Device: "/dev/input/event5"
    [ 2837.446] (--) ThinkPad Extra Buttons: Found keys
    [ 2837.446] (II) ThinkPad Extra Buttons: Configuring as keyboard
    [ 2837.446] (II) XINPUT: Adding extended input device "ThinkPad Extra Buttons" (type: KEYBOARD)
    [ 2837.446] (**) Option "xkb_rules" "evdev"
    [ 2837.446] (**) Option "xkb_model" "evdev"
    [ 2837.446] (**) Option "xkb_layout" "us"

    lukaszg wrote: GlxGears still uses 100% CPU - is it OK?
    Should I disable both dri and dri2 in xorg.conf, for nvidia drivers?
    Dunno.  Comment them out, restart X and see what happens.
    Last edited by skunktrader (2011-03-13 16:01:02)
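    For reference, commenting those two lines out as suggested would make the Module section of /etc/X11/xorg.conf.d/20-nvidia.conf from above look like this (a sketch of the suggested edit, untested):
    Section "Module"
    Load "glx"
    # Disable "dri"
    # Disable "dri2"
    EndSection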

  • Nvidia Quadro 600, GeForce GTX 560 Ti or cheaper for Photoshop CS5 and Lightroom 3?

    Hello,
    I am a professional photographer and I am setting up a new PC (i7, Windows 7 64-bit). But I am having some trouble choosing the graphics card.
    I use Photoshop CS5 and Lightroom 3, no video editing.
    Between OpenGL, OpenCL, CUDA acceleration, etc., and professional graphics card models versus consumer ones, I am lost!
    The Nvidia GeForce GTX 560 Ti seems more powerful and more versatile, but I tell myself that if Nvidia has a professional range there must be a reason.
    So for Photoshop CS5 and Lightroom 3, which would be best: the GeForce GTX 560 Ti or the Nvidia Quadro 600?
    http://www.geforce.com/Hardware/GPUs/geforce-gtx-560ti/specifications
    http://www.nvidia.com/object/product-quadro-600-us.html
    Any reason to get a Quadro 2000?
    http://www.nvidia.com/object/product-quadro-2000-us.html
    Or does the graphics card not matter much, and should I take an entry-level GeForce to enjoy the HDMI output and the silence? Which one then?
    Does the Quadro 600 manage 10-bit display? And the GeForce?
    Any changes announced with Photoshop CS6 and Lightroom 4?
    The only 3D application I use is Google Earth in 3D mode; does it make any difference?
    Thank you for your help.

    You say "no video editing"...  If that's going to be the case, and you won't use the Mercury engine in the Adobe Premiere Pro package, which needs the nVidia Cuda subsystem, then I recommend you consider the ATI brand over nVidia.
    Why?
    Because while neither brand's developers (ATI or nVidia) always release perfect drivers, I find ATI display drivers to be of consistently higher quality than nVidia's releases.  What this means to you is generally fewer crashes or quirks.  ATI has also traditionally supported older cards further into the future than nVidia - this might matter to you in a few years.
    People ask me what video card I would recommend, and right now that would be a VisionTek ATI Radeon HD 6670 1 GB GDDR5 card.  I like this particular card because:
    I've had 100% success with VisionTek cards in a number of different systems, not only initially but they have all run as long as I have used them, without ever breaking down.
    The 6670 model uses very little power (under 70 watts) and as such doesn't stress your computer's power supply, need a separate power connection, nor make a lot of fan noise. 
    It's not the fastest card made for 3D gaming, but it's inexpensive and excellent for Photoshop.  No matter what you choose, you should get a card that scores over 500 on the Passmark GPU benchmark, ideally over 1000:  http://www.videocardbenchmark.net/
    The ATI Catalyst display driver implementations for the 4670/5670/6670 line of cards have been good and solid.
    1 GB of on-card memory seems to be a good size, even for editing a lot of images, and GDDR5 memory provides faster access than DDR3.
    You should know that besides using Photoshop heavily, I also develop OpenGL-based software as well, so I have some additional insight into driver implementations.
    -Noel

  • Photomerge: no final image after 10.6.7 with Nvidia Quadro 4000

    Hi, as was stated in a similar thread, I have this issue. / http://forums.adobe.com/thread/852821?tstart=0 /
    But in the thread mentioned above, the problem was solved:
    "3. May 18, 2011 3:50 PM in response to: Chris Cox
    Re: Photomerge Crashes
    I think the problem was that there was an "ñ" in the file names. Now it works!"
    My file names are: _MG_7975_1z3.psd
    So probably it is not a problem with the name. I have:
      Model Name: Mac Pro
      Model Identifier: MacPro4,1
      Processor Name: Quad-Core Intel Xeon
      Processor Speed: 2.93 GHz
      Number Of Processors: 2
      Total Number Of Cores: 8
      Memory: 20 GB
    with Nvidia Quadro 4000 installed.
    After I downloaded 10.6.7 I had a problem with my graphics card; the new Mac OS version erased something, but I solved it when I found a solution on the Nvidia forum and downloaded new Quadro 4000 drivers. But help me now with the new problem: Photomerge produces no final image. I have a huge project that makes extensive use of that tool. Thanks in advance!

    I think that it is the answer:

  • Nvidia Quadro 4000 and GeForce GT 120

    Hi,
    I currently have 2 CUDA cards in my MacPro and I know that PPro can only use one card at a time.  I'm just trying to figure out if it is possible to tell which card PPro is using for CUDA acceleration, or hopefully to point PPro to the card I want it to use?
    I'm testing out Davinci Resolve and have added a GT 120 to my system which I've put in Slot 1, and moved my Quadro 4000 to Slot 2 as per the Resolve specs.
    The way I understand it, PPro is using the GT 120 now instead of the 4000.  I've noticed a couple of outright crashes of PPro today.  I'm wondering if the GT 120 is responsible for that.  (I did put a Speed/Duration of 100% reverse on a clip, which I think may be the problem.)
    My concern is that I want to continue to use the Quadro 4000 with PPro (I'm thinking, for the price I paid, it should be the higher performing card), and not have it just sitting there, only getting used by DaVinci.
    I guess my question really is: if I am not sold on going with DaVinci, does it make any sense to keep the GT 120 in my tower if PPro is my main editor?
    Also, on a related note... is the GT 120 a better performing card with AE? A quick render test did show the GT 120 render a comp in 43 secs as opposed to 62 secs for the 4000.
    Any insight is appreciated.
    Thanks,
    A.J.
    www.modrew.com
    MacPro OS 10.7.2
    2.93 GHz Quad
    12GB RAM
    Nvidia Quadro 4000
    Nvidia GeForce GT 120
    Decklink HD Extreme 3D

    One quick addition... I know the GT 120 is not a "supported" card, but I am getting Mercury hardware playback... that's why I'm not sure if the 4000 is being used.  What's adding to my confusion is the fact that After Effects will only use the GT 120 for OpenCL if both cards are installed (in either slot 1 or 2, the order doesn't seem to matter).
    A.J.
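    For what it's worth, the guesswork about which CUDA device an app grabs can be reduced by enumerating the devices directly. A minimal sketch (illustrative only; assumes the CUDA toolkit, built with nvcc):
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void)
    {
        int n = 0;
        if (cudaGetDeviceCount(&n) != cudaSuccess) {
            printf("no usable CUDA devices\n");
            return 1;
        }
        for (int i = 0; i < n; ++i) {
            struct cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);   /* name, core count, memory per device */
            printf("device %d: %s, %d multiprocessors, %zu MB\n",
                   i, p.name, p.multiProcessorCount,
                   p.totalGlobalMem / (1024 * 1024));
        }
        return 0;
    }
    The order reported is the CUDA runtime's, not necessarily slot order, but it is exactly the device list that CUDA-enabled applications choose from.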

  • NVIDIA QUADRO 4000 for MAC---Issues with Adobe Premeire

    Hello,
    Mac Pro (Tower)
    *OS 10.7.5
    *2.8 GHz Quad-Core Intel Xeon
    *16 GB RAM
    I've had a multitude of issues with Adobe Premiere on this system and I'm not sure what to attribute them to.  First of all: I have two MacBook Pros that run Premiere, with similar specs and only 8 GB of RAM.  These two laptops run Premiere flawlessly, and it's for this reason that I deduce that my 'tower' is having issues because of the NVIDIA QUADRO 4000 installed; it's the only difference, besides the RAM, between the computers.  Has anyone had issues using the NVIDIA QUADRO 4000 before?  Some of the issues I've been having are (please note, these issues are random, happen at random times and on random projects; the majority of the footage I use is MOV (from DSLR cameras) and MP4 (from a Sony EX-1)):
    *Renders crashing
    *Slow clip playback
    *An oddity: certain MP4 files will go out of sync on playback from Premiere and even in a VLC player
    *Adobe Media Encoder crashes
    Maybe the NVIDIA has nothing to do with it?
    Thanks for your time and help.
    Tim S.

    Maybe the NVIDIA has nothing to do with it?
    One way to accrue evidence would be to turn off Mercury Playback in the Project Settings.
    I have that card, and turning off MPE often gets me through a session.  Some here have claimed that the card is the cause of the frequent Serious Errors, and disabling it is the solution.
    That's hardly satisfactory when one of the best features of Pr is the ability to use CUDA.
    However, anything going through the card also has to pass through Pr and the OS.  Adobe claims that updating to 10.8 fixes some bugs, but according to some here (myself included), not enough.
    I had one project recently that I couldn't get the Serious Errors to stop until I re-installed the nVidia driver.
    But your list of problems could be due to corrupt Pr preferences.  If you have a Mac, you need to get used to refreshing your prefs several times a day.  The cause could also be corrupt media.  That will cause renders to crash.

  • NVidia Quadro 4000 First Impressions

    I've seen some questions and some discussion regarding the nVidia Quadro 4000 card. Mine arrived yesterday, so I figured I would share my initial experiences with it to give others guidance so they can make the best decisions for their own needs.
    First off, if your primary interest (or a significant interest) is playing games, then a Quadro card is not for you. nVidia's high-end cards are fine-tuned on both a hardware and software (driver) level for the needs of pro apps and for manipulating very large data sets. That comes at the expense of some performance stats that matter to gamers. Quadro cards aren't designed to get you a higher fps in your favorite shooter; they're designed to get you better performance with ray tracing, real-time 3D environments, and scientific use. I'll leave you to surf to nVidia's web site for more marketing speak on that. In my initial tests, I found that to be completely true. I don't do much gaming, but the couple of games I tested performed no faster than the GTX-285 I had in the machine before.
    Attempting to run some more tests, I found that RealTech VR's OpenGL extensions viewer (which has some decidedly gamer-centric benchmarks) showed little to no improvement over the GTX-285 (as expected).
    Running a few test renders in Cinema 4D, I found only about a 5-7% increase in performance. That might be due to immature drivers, but it may also be due to C4D renders being more about CPU muscle (Maxon doesn't have any specific CUDA support or acceleration). What I did notice was that moving/camming around in the app was much improved. I couldn't say if that had to do with the extra gigabyte of VRAM, or if it was some kind of 'Fermi' magic (Fermi is the name of this generation of chip technology from nVidia).
    I have not yet gotten the chance to give Adobe CS5 (and specifically Premiere Pro and After Effects) a serious workout, though just playing around I noticed that the Quadro card had much more capacity for handling multiple layers of video in real time (I threw a dozen videos onto a main track in varying sizes of 'picture-in-picture' display, and arbitrarily adjusted the speed of some and color corrected others). It handled everything I could throw at it without appearing to break into a sweat, and I haven't yet had time to give it a proper performance test.
    Being an early adopter, I have the expectation that on initial release there will be kinks and hiccups, and that as the drivers mature the performance will improve dramatically. Based on discussions with colleagues and what I've seen in reviews, this has been the case with both the GTX-285 and the Quadro FX 4800 card, and probably was also the case with older nVidia cards. The Quadro 4000 met those expectations - it feels like this is still a work in progress. The drivers (version 256.01.00f03) are stable (no crashes, no kernel panics, no horrible situations to speak of), but based on my early results I'd guess that they're not optimized for speed, either. On the Windows side, nVidia has driver version 259 available as a 'certified' release and a higher-performance version 260 available, and performance under Windows 7 Professional (64-bit) seems better. To be fair, the card's been on the market for PCs since late July, so those drivers are more mature.
    I still need to give the card a serious workout with Adobe CS5, but so far things look promising. Anecdotally, I've also noticed that system performance is greatly improved when I'm doing lots of multi-tasking. I often have several different apps running at once, and between the new technology and the additional video memory (my old card had 1GB, this has 2GB), I find I can juggle 20+ apps and dozens of Safari windows/tabs running without the Mac Pro batting an eyelash. That's hard to quantify in a specific benchmark, but it's very welcome considering the way I tend to work.
    As the drivers improve, and as my own workflow evolves to make more use of larger datasets and more complex 3D scenes, I see the Quadro 4000 really starting to shine. Heavy-duty CUDA users may be happy to know that this card only requires a single additional power connection, which means that you can install two of these cards into a single Mac Pro (for a total of 512 CUDA cores). If you're doing big scientific work or working with CUDA-supported ray tracing (or other plugins), or doing extremely elaborate things with RED camera footage, that may likely be a game-changer for you. For me, it'll likely be quite some time before I outgrow what this card can do.

    As I'd mentioned in another thread, the card began shipping last week. I expect that it will take a couple of months for places like Apple and Amazon to dig through the large number of backorders they have (I don't think this card is produced in mass quantity, even on the PC side).
    My system setup is a Mac Pro 8x2.26GHz, 32GB RAM, 8TB HD storage, an nVidia Quadro 4000 2GB driving the primary display, and an nVidia GT120 driving a secondary display. I'm considering getting a third-party power supply unit that sits in the second optical drive bay, plugs into the Mac Pro's power supply, and then provides additional power connectors that would allow me to plug in my GTX-285 as a secondary graphics card (since it uses 2 connectors, and the Mac Pro only has 2 total).
    Even when my machine was using a GTX-285 and the GT-120, I could see a difference in performance when dragging an application window (particularly a 3D app's) from the main display to the secondary (the GT-120 is a significantly lower-powered card, with only 32 CUDA cores and 512MB video memory). With the Quadro driving my primary display the difference is much more noticeable now.
    From what I understand, there are some technical issues with using ATI and nVidia GPUs in the same machine, so attempting to use it with a 5770 may not work. But if you were able to use them together, it would make more sense to have the Quadro card driving your primary display, since it's likely going to perform as well as or better than any other card you might be able to pair it with.
    I've already given some thought to a second Quadro card down the road. As the drivers mature, and the apps I use evolve to make better use of CUDA and OpenCL, and my own workflow and skills improve to the point where I'm doing more 3D modeling/rendering (and stuff like ray-tracing), then having 2 of these cards in a single machine could really come in handy. Today it appears that all those CUDA cores and VRAM are serving to help make the apps faster and more responsive at design time, but rendering is still very CPU-centric. But tomorrow those apps will hopefully be able to tap into the GPU to help improve render times.

  • Nvidia Quadro 4000 is  Freezing / shutting down / Buggy with Mac Pro 2009

    I currently have two Nvidia Quadro 4000 Mac cards, and they're causing my Mac Pro 2009 machine to kernel panic and freeze or shut down.
    I dropped it off at the Apple Store for 9 days for them to diagnose the problem, and they confirmed that it was the card that was causing it.
    NVIDIA, PLEASE UPDATE YOUR DRIVERS FOR THIS CARD FOR MAC.
    It's ridiculous that something you spend $1200 on (Apple Store) will crap out your Mac Pro. I'm waiting for an updated driver in order to test the stability with the Mac Pro.
    I'm almost 100% sure I did the 10.6.6 update with the stock card, then installed the most up-to-date drivers from the Nvidia website, then installed the CUDA drivers, and then finally installed the video card in the machine. After two days, the system was acting up.
    Once I get my machine back from Apple tomorrow, I will go ahead and give it one last try to see if it works. I mean, the cards are amazing with Adobe Premiere and Media Encoder (super fast), but at the cost of your machine being very buggy.
    Let's wait and see what Apple, Nvidia or PNY will do about this big problem. I want to keep these babies, so make some moves, people, and fix the issues for the professionals.

    I hadn't been experiencing the problems you have, but I have been having issues, and yes it absolutely is a case of immature drivers. When the card was released in December, nVidia merely did a simple patch job on the 256.01.00f03 driver that shipped with 10.6.5 rather than include an optimized driver that was comparable to the 259.x driver available on the Windows platform at the time.
    Since then, nVidia's engineers have been hard at work doing what appears to be nothing for the Mac. On the Windows side, the Quadro drivers have progressed to 267.11. Rather than provide Apple with updated drivers to include with 10.6.6 or this week's 10.6.7 release, they chose to sit back and wait for the 10.6.7 release and then release their own update.
    After 4 months, their best and brightest have brought us <drumroll> driver version 256.01.00f03. To be fair, they changed it from "v5" (the patch job to enable Quadro 4000 compatibility) to "v6". The idea was that it would add compatibility for the Quadro 4000 running under 10.6.7. Sadly for nVidia's Quadro engineers, that driver's installer didn't actually work. It took them nearly a full day to fix that, finally releasing 256.01.00f03v7. As expected, there are no improvements in either performance or stability. In fact, what happened to me is that the new driver actually broke compatibility with Adobe Premiere Pro CS5's Mercury Playback Engine GPU acceleration feature.
    Fortunately, I still had my GTX-285 card available, and this evening I pulled the Quadro and re-installed the older GTX card. I really wish nVidia would care enough to release a solid driver update, I really want to like the Quadro 4000. On paper the potential for video production and OpenGL rendering performance should be huge.

  • Support for new Nvidia Quadro gpu?

    When will there be support for the new Nvidia Quadro GPUs, for example the Nvidia Quadro K2200 and K4200?
    Thanks

    What do you mean when you say "support"? All of After Effects' OpenGL features already work on those cards.
    The only thing that isn't supported on those cards is GPU acceleration of the ray-traced 3D renderer, which is an obsolete and almost entirely irrelevant feature. No more GPUs will ever be added to the list of cards supported for that feature.
    Details:
    GPU (CUDA, OpenGL) features in After Effects

  • Video sync problem with nvidia quadro fx cards, edge blending on.

    When I start the edge blending feature of my Nvidia Quadro FX 3500 driver on 2 projectors, QuickTime's audio and video go out of sync. It's not directly related to framerate, because my own OpenGL application built with the QuickTime SDK renders at 300+ fps successfully, but the video is still out of sync...

    Hi there, welcome to the forum.
    "*Update*
    I have a feeling that my PSU might be causing the low performance, seeing that the Quadro requires two power connectors to attach to it: (http://www.pny.com/support/quadro/install/FX4000AdditionalPowerRequirements.pdf)
    Here is my PSU:
    http://www.xpcgear.com/500wseepsubl.html
    I do not know much about PSU's, so any input is greatly appreciated.  Thanks."
    Your PSU is a good one and should be powerful enough.
    Output    +3.3V@28A, +5V@30A, +12V@34A, [email protected], [email protected], +5VSB@2A
    It can't be power related, since, first, your PSU should be sufficient, and second, the same issue happened with your old MX 440 VGA card. Card malfunction is dismissed as a possibility as well, since it works fine in another machine.
    I think it's a driver mess. The ATI drivers (or parts of them) probably still exist; my guess is they were not completely removed. The same symptoms have happened many times when switching from ATI to NV - 2D scrolling lag and so on - caused by ATI Smart (part of the ATI drivers), which sometimes remains (it survives driver cleaners).
    I suggest a fresh, clean OS installation, as NV support already directed. Also ensure in the BIOS, under "Integrated Peripherals", that "Init Display First" is configured to AGP.
    Let us know how it's going; if the problem persists, we are ready with tips to resolve your issue.
