Nvidia versus NV: recompiling

Couple of questions regarding the Nvidia and NV drivers:
1) Isn't the Nvidia driver (in extra/x11/nvidia) the binary from Nvidia itself? If so, why do I keep hearing that it has to be compiled against the current kernel?
2) The NV driver (in extra/x11-drivers/xf86-video-nv) is the open source version, correct? Does it have to be compiled against a custom kernel as well?
Currently I'm using the NV drivers:
Section "Device"
### Available Driver options are:-
### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
### <string>: "String", <freq>: "<f> Hz/kHz/MHz"
### [arg]: arg optional
#Option "SWcursor" # [<bool>]
#Option "HWcursor" # [<bool>]
#Option "NoAccel" # [<bool>]
#Option "ShadowFB" # [<bool>]
#Option "UseFBDev" # [<bool>]
#Option "Rotate" # [<str>]
#Option "VideoKey" # <i>
#Option "FlatPanel" # [<bool>]
#Option "FPDither" # [<bool>]
#Option "CrtcNumber" # <i>
#Option "FPScale" # [<bool>]
#Option "FPTweak" # <i>
#Option "DualHead" # [<bool>]
Identifier "Card0"
Driver "nv"
VendorName "nVidia Corporation"
BoardName "NV43 [GeForce 6600]"
BusID "PCI:1:0:0"
EndSection
and I want to compile a custom kernel, so I'm wondering whether (and why) I would have to recompile the NV driver.

"nvidia.ko" is a kernel module, so most definitely needs to be compiled for the current module, otherwise it may be unstable if it works at all. It is part-binary and part-source, but the source part needs to be recompiled.
nv is not a kernel module.
To see the module:
find /lib/modules -name nvidia.ko
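The reason the match matters: a module records the kernel it was built for in its "vermagic" string, and the kernel rejects (or misbehaves with) modules whose vermagic doesn't line up with `uname -r`. A minimal sketch of that comparison, with example strings standing in for the live output (on a real system you'd feed it `uname -r` and `modinfo -F vermagic nvidia`):

```shell
#!/bin/sh
# Sketch: does a module's vermagic line up with the running kernel?
# Example strings are used here; on an installed system the values would
# come from `uname -r` and `modinfo -F vermagic nvidia`.
kernel_matches_module() {
    # $1 = kernel release, $2 = module vermagic string
    case "$2" in
        "$1"|"$1 "*) return 0 ;;   # vermagic begins with the kernel release
        *)           return 1 ;;
    esac
}

if kernel_matches_module "2.6.24-ARCH" "2.6.24-ARCH SMP mod_unload"; then
    echo "module matches kernel"
fi
```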
Last edited by brebs (2008-02-20 19:22:06)

Similar Messages

  • Video Card Question. Crossfire 7950s vs One GTX780

Which is the better option for video editing in Adobe Premiere CC? I have the Crossfire 7950s now, but only one is installed since I would have to buy a new power supply. With the price drop on the NVIDIA GTX 780, I might pick up one of those and sell my 7950s to avoid buying the power supply.
    I tried looking up which is more efficient for Adobe CC and could not find anything comparing the two. If anyone knows anything or has recommendations, I would really appreciate it! Thank you in advance.

If you do decide to go with one GTX 780, be aware that nVidia has announced a new high-end card called the GTX 780 Ti, supposed to ship mid-month. It will have a base price of $699 and be faster than the current top card, the GTX Titan. That should lead to lower prices on the GTX 780.
    As far as supporting two GPUs, that is true for Premiere 7 (CC). So far we have no comparison data on nVidia versus AMD/ATI.

  • Downgrade nvidia by recompiling against current kernel

    I would like to downgrade my Nvidia driver to 295.59 and still use my current kernel. The Arch Wiki says here to
    search for the package you wish to downgrade. Once you find it, click "View SVN entries" and select "View Log". Locate the version you need and click on the path. Then just download the files located in that directory and build it with makepkg
but this does not seem to exist. That is, all I can find is a "View Changes" feature, which results in what looks like a "diff" file. I've tried to modify the PKGBUILD accordingly, starting from the ABS PKGBUILD and then compiling, but it failed with:
    Starting package_opencl-nvidia()...
    install: cannot stat ‘libnvidia-opencl.so.295.59’: No such file or directory
    ==> ERROR: A failure occurred in package_opencl-nvidia().
    Aborting...
    Any help would be appreciated.

    Sounds like you are building the prerequisite package, nvidia-utils.   I think the opencl-nvidia package was built differently back in 295.59.  If you don't need the "opencl-nvidia" package, just remove 'opencl-nvidia' from the pkgname array in the PKGBUILD file.
    Change:
    pkgname=('nvidia-utils' 'opencl-nvidia')
    to
    pkgname=('nvidia-utils')
    Then rerun the makepkg.
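As a sketch, that edit and rebuild could look like this (a scratch directory and a cut-down two-line PKGBUILD are used here purely for illustration; run the sed against the real 295.59 PKGBUILD you downloaded, then `makepkg` in that directory):

```shell
#!/bin/sh
# Demo of the pkgname edit on a scratch PKGBUILD (the real one has many
# more fields); the two pkgname lines are the ones quoted in the reply.
dir=$(mktemp -d)
cat > "$dir/PKGBUILD" <<'EOF'
pkgname=('nvidia-utils' 'opencl-nvidia')
pkgver=295.59
EOF

# Drop 'opencl-nvidia' from the pkgname array
sed -i "s/^pkgname=.*/pkgname=('nvidia-utils')/" "$dir/PKGBUILD"
grep '^pkgname=' "$dir/PKGBUILD"

# then, in the real package directory: makepkg
```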

  • NVIDIA's AGP versus the kernel's AGPGART?

    I've just taken the time to figure out how to use nvidia-agp instead of the kernel's agpgart, but it doesn't give any difference in performance (at least based on <code>glxgears</code> results).  Is there any point to switching?
    To get it to work, I had to blacklist <code>agpgart</code> in my rc.conf file, and then add this line to the device section of my xorg.conf file:
    Option "NvAgp" "1"
    which gives me what I was looking for:
    $ cat /proc/driver/nvidia/agp/status
    Status: Enabled
    Driver: NVIDIA
    AGP Rate: 8x
    Fast Writes: Disabled
    SBA: Enabled
    Does anyone have any experience with these, or did I just waste my time?  Either way I get just about 1700fps using glxgears with a GeForce 6200.
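For reference, the two changes described above would look something like this (Arch's rc.conf blacklist syntax of that era; the exact module list varies per system, so treat this as a sketch):

```
# /etc/rc.conf -- a leading '!' stops the module from being autoloaded
MODULES=(!agpgart nvidia)

# xorg.conf "Device" section -- "1" means prefer NVIDIA's internal AGP driver
Option "NvAgp" "1"
```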

Does anybody know whether those settings are of any use for an nVidia PCI-Express graphics card?
    Here is some output I get with a GeForce 6200TC PCIe.
    $ lsmod | grep agp
    amd64_agp              11140  1
    agpgart                31368  2 nvidia,amd64_agp
    $ ls /proc/driver/nvidia/
    cards/    registry  version   warnings/
    $ cat /proc/driver/nvidia/cards/0
    Model:          GeForce 6200 TurboCache(TM)
    IRQ:            10
    Video BIOS:      05.44.02.11.00
    Card Type:      PCI-E
    $ cat /proc/driver/nvidia/registry
    VideoMemoryTypeOverride: 1
    EnableVia4x: 0
    EnableALiAGP: 0
    NvAGP: 3
    ReqAGPRate: 15
    EnableAGPSBA: 0
    EnableAGPFW: 0
    SoftEDIDs: 1
    Mobile: 4294967295
    ResmanDebugLevel: 4294967295
    FlatPanelMode: 0
    DevicesConnected: 0
    VideoEnhancement: 0
    RmLogonRC: 1
    VbiosFromROM: 0
    ModifyDeviceFiles: 1
    DeviceFileUID: 0
    DeviceFileGID: 0
    DeviceFileMode: 438
    RemapLimit: 0
    UseCPA: 4294967295
    DetectPrimaryVga: 0
    /etc/X11/xorg.conf :
    Section "Screen"
            Identifier      "NV Monitor"
            Device          "NV 6200TC"
            Monitor         "ViewSonic E92f+"
            Option          "DPMS"
            Option          "NoDDC"          "false"
            Option          "NoLogo"         "true"
            #Option         "Coolbits"       "true"  (WW) NVIDIA(0): Option "Coolbits" requires an integer value
            #Option         "UseEditDPI"     "true"  (WW) NVIDIA(0): Option "UseEditDPI" is not used
            #Option         "UseEditFreqs"   "true"  (WW) NVIDIA(0): Option "UseEditFreqs" is not used
            Option          "NvAGP"          "1"
            Option         "UseEdidFreqs"    "1"
            #Option         "EnablePageFlip"         "true" (WW) NVIDIA(0): Option "EnablePageFlip" is not used
            Option         "NoBandWidthTest" "1"
            Option         "DigitalVibrance" "3"
            Option          "RenderAccel"    "true"
            Option         "backingstore"    "true"
            #should be switched off *if* the system crash:
            Option          "RenderAccel"    "true"
Option          "AllowGLXWithComposite" "true"
    EndSection

  • Is nVidia Geforce GT640 good graphics card for PrE10 despite low memory bandwidth?

    Can anybody confirm that the nVidia Geforce GT640 is a reasonable graphics card for Premiere Elements 10 and Photoshop Elements 10?
The person who assembled my Core i7 3770K desktop with 16GB of RAM at 1600MHz installed the nVidia GT640 card with 2GB of DDR3 memory. He said that this was a good (fairly low-cost) card for video editing because it has 384 CUDA cores, very helpful in video editing. I am pretty ignorant about graphics cards, but I like the low power usage, 65 watts, and reputed cool operating temperatures. I have since read that DDR5 memory would have been much faster because of greater memory bandwidth: say 80-90GB per second, compared with 28.5GB per second for the DDR3 memory on the GT640 card. I was after economical power use. DDR5 cards use 110 watts upwards and run much hotter than DDR3 cards, all other things being equal. The really fast cards require special power units and cooling.
    Does anybody know whether limited memory bandwidth is important in video editing? Is speed much more critical in gaming than in video editing? Are other attributes, such as 384 CUDA cores, NVENC syncing, dedicated encoding, 28nm Kepler architecture, a 2GB memory frame buffer, 1.3 billion transistors, and plenty of texture units, more important than memory bandwidth in video editing? Does bandwidth limited by DDR3 memory affect the quality of the image?
    I read that the GT640 would be much faster (producing better image quality?) than the HD4000 integrated Intel graphics of the Core i7 Ivy Bridge processor. Is this so?
Windows 7 Home Premium 64-bit and all programs are installed on a 120GB OCZ Agility 3 solid state drive. My data drive is a 1TB Seagate SATA 3 at 7200 RPM. I have a beautiful 21-inch ASUS VS228N LED monitor and an LG Blu-ray burner.
    I did lots of editing with PrE 3 on a Dell 3.06GHz hyperthreading desktop with Win XP. The output and even the preview monitoring were clear and stable. I am still capturing standard-definition mini-DV tape by FireWire from a 3MOS Panasonic handycam, but plan to upgrade to HD 3MOS with flash memory. I make the preview monitor really small, about 7cm wide, in PrE 10, because the quality of the preview picture is much poorer than what I experienced with Premiere Elements 3 on Win XP. Is this just an indication of memory-saving in PrE 10 previews? I expect output to be much superior, although still MPEG2-DVD quality until I upgrade my camera. I have set rendering to maximum bitrate.
    Anyway, despite these reservations with preview quality, the GT640 seems to be performing fine. Picture quality in Photoshop, online and elsewhere on my computer is excellent.
    I updated the nVidia display driver only yesterday to version 306.23.
    Nearly all graphics cards forums are about gaming. I hope to see more forums about graphics in editing here.
What do you think of the 2GB nVidia GT640 for editing with PrE 10 and Photoshop Elements 10? What would you say about picture quality in the PrE 10 monitor versus the quality of the output? Was picture quality in the PrE 3 monitor sharper and more stable, as I imagine?
    Regards, Phil

    Sheltie,
    Thank you for the kind words. We all work very hard to help others with video-editing. Some of us also show up on other Adobe forums, depending on the products that we use most often.
    Besides helping out, I also find that I learn something new every day, even about programs that I have used for decades. Heck, I just learned something new about PrE vs PrPro (my main NLE program), when I went to try and help a user. I probably actually use PrE more to test my theories, or to replicate a user's problem, than I do to actually edit my videos. Still, when applicable, I do real work in the program.
    With about a dozen "regulars" here, if one of us is not around, several more usually are. Personally, I do not understand how Steve Grisetti and John T. can dedicate so very much time here. Steve is a noted author of books on PrE, PSE, Sony DVD Architect, and others, plus helps run a video/photography Web site, Muvipix.com, that is very active, and has so very much to offer. John T. is always under the watchful eye of The JobJarQueen, and gets dragged, kicking and screaming, out into the yard, or up on his roof, so can be gone for a bit.
    Neale usually beats us all, since he's in the UK, and normally answers all the questions, that come in too late for us to see. He is also a PrE power-user, so beats me hands down.
    I travel a great deal, but no one ever misses me. Was supposed to do a trip to Sydney last Dec., but had to cancel. Have not gotten details on the reschedule of that trip, but it would have been my first jaunt south of the Equator. Gotta' make that happen.
    Good luck, and happy editing,
    Hunt

  • Built-in NVIDIA sound chip doesn't play with ALSA

    I have a motherboard that has a built-in sound card. It's made by NVIDIA, the lspci line for it is below.
    00:07.0 Audio device: nVidia Corporation MCP78S [GeForce 8200] High Definition Audio (rev a1)
    When I run alsaconf, it finds the card and it thinks everything works. When it ends, I don't even get an error message. If I try to run alsamixer, it spits this out:
    alsamixer: function snd_ctl_open failed for default: No such file or directory
    However, all the sound modules are loaded! lsmod | grep snd shows this:
    snd_hda_intel 370736 0
    snd_seq_oss 31872 0
    snd_seq_midi_event 8192 1 snd_seq_oss
    snd_seq 49968 4 snd_seq_oss,snd_seq_midi_event
    snd_seq_device 8332 2 snd_seq_oss,snd_seq
    snd_hwdep 8964 1 snd_hda_intel
    snd_pcm_oss 40192 0
    snd_pcm 69636 2 snd_hda_intel,snd_pcm_oss
    snd_timer 21384 2 snd_seq,snd_pcm
    snd_page_alloc 9224 2 snd_hda_intel,snd_pcm
    snd_mixer_oss 16512 1 snd_pcm_oss
    snd 50724 9 snd_hda_intel,snd_seq_oss,snd_seq,snd_seq_device,snd_hwdep,snd_pcm_oss,snd_pcm,snd_timer,snd_mixer_oss
    soundcore 8160 1 snd
The *weirdest* part about this is that on my first boot it didn't work at all, before I had even tried to get it working. I installed X, it worked, and I rebooted for something completely unrelated. After booting, sound magically worked! Not complaining; I was listening to music. But then I rebooted for another non-sound-related issue, and now the sound is broken again. I can't figure this out at all! Anyone have any ideas?

Well, I found out that I need the module "snd-hda-intel" -- I think. But it doesn't seem to be on my system, despite documentation saying it's still available in 2.6.27. Do I really need to recompile my kernel just for this one module? I'd really prefer not to have to deal with that. If I do "modprobe snd-" and press Tab twice, it shows me this list of modules:
    snd-ac97-codec snd-au8820 snd-cs5530 snd-es1688-lib snd-ice1712 snd-mixart snd-oxygen-lib snd-sb8-dsp snd-usb-caiaq
    snd-ad1816a snd-au8830 snd-cs5535audio snd-es18xx snd-ice1724 snd-mona snd-pcsp snd-sbawe snd-usb-lib
    snd-ad1848 snd-aw2 snd-cs8427 snd-es1938 snd-ice17xx-ak4xxx snd-mpu401 snd-pcxhr snd-sc6000 snd-usb-usx2y
    snd-ad1848-lib snd-azt2320 snd-darla20 snd-es1968 snd-indigo snd-mpu401-uart snd-pdaudiocf snd-seq-midi snd-util-mem
    snd-ad1889 snd-azt3328 snd-darla24 snd-es968 snd-indigodj snd-mtpav snd-portman2x4 snd-seq-midi-emul snd-via82xx
    snd-adlib snd-bt87x snd-dt019x snd-fm801 snd-indigoio snd-mts64 snd-pt2258 snd-seq-virmidi snd-via82xx-modem
    snd-ak4114 snd-ca0106 snd-dummy snd-gina20 snd-intel8x0 snd-nm256 snd-rawmidi snd-serial-u16550 snd-virmidi
    snd-ak4117 snd-cmi8330 snd-echo3g snd-gina24 snd-intel8x0m snd-opl3-lib snd-riptide snd-sgalaxy snd-virtuoso
    snd-ak4xxx-adda snd-cmipci snd-emu10k1 snd-gus-lib snd-interwave snd-opl3-synth snd-rme32 snd-sis7019 snd-vx-lib
    snd-ali5451 snd-cs4231 snd-emu10k1-synth snd-gusclassic snd-interwave-stb snd-opl3sa2 snd-rme96 snd-soc-core snd-vx222
    snd-als100 snd-cs4231-lib snd-emu10k1x snd-gusextreme snd-korg1212 snd-opl4-lib snd-rme9652 snd-sonicvibes snd-vxpocket
    snd-als300 snd-cs4232 snd-emu8000-synth snd-gusmax snd-layla20 snd-opl4-synth snd-sb-common snd-sscape snd-wavefront
    snd-als4000 snd-cs4236 snd-emux-synth snd-hdsp snd-layla24 snd-opti92x-ad1848 snd-sb16 snd-tea575x-tuner snd-ymfpci
    snd-atiixp snd-cs4236-lib snd-ens1370 snd-hdspm snd-maestro3 snd-opti92x-cs4231 snd-sb16-csp snd-tea6330t
    snd-atiixp-modem snd-cs4281 snd-ens1371 snd-hifier snd-mia snd-opti93x snd-sb16-dsp snd-trident
    snd-au8810 snd-cs46xx snd-es1688 snd-i2c snd-miro snd-oxygen snd-sb8 snd-usb-audio
    Nothing "hda" related.
UPDATE: According to this and alsaconf (I checked /etc/modprobe.d/sound), I *do* need the "snd-hda-intel" module. I'm running the standard Arch kernel from pacman; do I really have to recompile my kernel...?
    UPDATE: Turns out I do have the module I need. I don't know why it didn't show up. In other news, I get a new dmesg error!
    hda-intel: no codecs initialized
    So, it seems like I need to load snd_hda_codec... but that was "merged" with snd_hda_intel back around 2.6.23. Any ideas?
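For what it's worth, the file alsaconf writes at /etc/modprobe.d/sound typically contains just alias lines along these lines (a sketch of that era's format; exact contents vary by card):

```
# /etc/modprobe.d/sound -- as typically written by alsaconf (sketch)
alias snd-card-0 snd-hda-intel
alias sound-slot-0 snd-hda-intel
```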
    Last edited by Nathan (2008-12-27 22:02:48)

  • Problem with nvidia video card GTX670 in PS CC

I have been using PE10, but have just installed Photoshop CC v. 13.1.2 x64.  Although my video card (nvidia GTX 670, 2GB RAM) is recognized by Photoshop, when I check "use graphics processor" under "preferences/performance", the converted RAW image is somewhat blurred at all zoom levels (except 100%) compared to when it is unchecked and sharp (for both Canon 1100D and 6D). The problem may possibly reside in the RAW conversion, because if I convert the file with "use graphics processor" unchecked, and then subsequently check it after the file has been opened, the image is not blurry and I have access to all of the video-card-supported functions. Has anyone else experienced this problem and found a solution?  Thanks, artmar4 (intel i5-3670, 24GB RAM, Windows 7, 64 bit).

    artmar4 wrote:
I must admit it's not intuitive to me that the image would be somewhat blurred when clicking on "fit screen" and "fill screen" (but then what do I know.)
    The GPU image-interpolation for display is more sophisticated than that of the CPU, but both methods cannot avoid compromising the image quality in some way in order to reduce the scale of an image. The GPU method can produce blurry results whereas the CPU method can produce ragged results.
    Both methods produce very good results when the zoom is 100% divided by a power of two. However, "Fit Screen" and "Fill Screen" won't give a nice neat zoom level like that, except on very rare occasions when an image has a particular size relative to the screen.
    Here's a (hideous) contrived example of the blurriness of GPU versus the raggedness of CPU. First is the image at 100% zoom. Then the doc on the left of each screenshot is being scaled by the GPU, the doc on the right by the CPU. Click the images to view without forum resampling!

  • Imac, NVIDIA GeForce 8800 GS

    Hello,
    I am planning to buy iMac 3.06 model.
I read somewhere that Core Animation is slower on the "NVIDIA GeForce 8800 GS" than the "ATI Radeon HD 2600 PRO".
    The guy on that forum was saying that the NVIDIA graphics card is faster in 3D rendering but slower at Core Animation. He was also saying that iMovie will be slower on the NVIDIA graphics card than the ATI.
    Is that true?
    Sorry, I couldn't find the link to that forum; I read it only once.
    Thanks
    Dhaval.

    Here are two interesting links:
    "Early 2008" iMac
    iMovie and iDVD Surprise:
    Radeon beats Geforce!
    OS X 10.5.3 versus 10.5.2:
    Has the GeForce 8800's Core Image Performance Improved?
    Also, note that you can order a Built-To-Order iMac with the 3.06 Ghz processor and the Radeon 2600 card on the Apple online store:
    Configure your iMac 24-inch

  • I'm looking at the dell Inspiron Desktop 4th Generation Intel Core i5 Processor for photoshop work versus the Dell XPS 8700 i7 IS IT WORTH SPENDING THE EXTRA $400?

I'm looking at the Dell Inspiron Desktop with a 4th Generation Intel® Core™ i5 processor for Photoshop work versus the Dell XPS 8700 i7. Is it worth spending the extra $400? My old desktop is an AMD about 5 years old, so there will be a huge change in speed from what I am used to.
    Here are the specs on both:
    Inspiron
    Processor
    4th Generation Intel® Core™ i5-4460 Processor (6M Cache, up to 3.40 GHz)
    Operating System
    Help Me Choose
    Windows® 8.1 (64Bit) English
    Memory2
    8GB Dual Channel DDR3 1600MHz (4GBx2)
    Hard Drive
    1TB 7200 rpm SATA 6Gb/s Hard Drive
    Video Card
    NVIDIA® GeForce® 705 1GB DDR3
    Ports
    Front
    (2) USB 2.0, MCR 8:1, Mic and Headphone Jacks
    Rear
    Four USB 2.0 connectors , Two USB 3.0 connectors, HDMI, VGA, RJ-45 (10/100/1000 Ethernet), 3-stack audio jacks supporting 5.1 surround sound
    Media Card Reader
    Integrated 8-in-1 Media Card Reader
    (supports Secure Digital (SD), Hi Speed SD (SDXC), Hi Capacity SD (SDHC), Memory Stick (MS), Memory Stick PRO (MS PRO), Multimedia Card (MMC), Multimedia Card Plus (MMC Plus), xD-Picture Card(XD))
    Memory Slots
    2 DIMM Slots
    Chassis
    Bluetooth
    BT 4.0 via 1705 WLAN card
    Chipset
    Intel® H81 PCH
    Power
    300 Watt Power Supply
    XPS 8700
    Processor
    4th Generation Intel® Core™ i7-4790 processor (8M Cache, up to 4.0 GHz)
    Operating System
    Help Me Choose
    Windows 8.1 (64Bit) English
    Choose Options  
    Memory2
    12GB Dual Channel DDR3 1600MHz (4GBx2 + 2GBx2)
    Hard Drive
    1TB 7200 RPM SATA Hard Drive 6.0 Gb/s
    Video Card
    NVIDIA GeForce GTX 745 4GB DDR3
    CPU Thermal
    86W
    Graphics Thermal
    225W/150W/75W
    Power
    460W, optional 80 PLUS Bronze, 85% efficient, supply available on ENERGY STAR configurations
    Ports
    Bays
    Support for 4 HDD bays: including (3) 3.5” HDDs
    –Capable of 1 SSD and 3 HDD configuration
    Media Card Reader
    19-in-1 Card Reader (CF Type I, CF Type II, Micro drive, mini SD, MMC, MMC mobile, MMC plus, MS, MS Pro, MS Pro Duo, MS Duo, MS Pro-HG, RS-MMC, SD, SDHC Class 2, SDHC Class 4, SDHC Class 6, SM, xD)
    Slots
    Memory Slots
    4 DIMM

From my personal experience, I wouldn't go for an integrated card. This is one of the most important components for Photoshop, so invest in a decent graphics card (ATI or NVidia). It doesn't have to be a really expensive one - I have been using an ATI Radeon with 256MB of RAM on my Dell Studio for almost two years and it still rocks! (even when I work with 3D in Ps CS5 Extended).
    I would also invest in more RAM (this can be added easily - I bought an extra 4GB as my Studio came with 3GB).
    I wouldn't worry about the processor - I'm on an Intel Core 2 Duo - and it works very, very well; it's very quick, which is very important as I'm doing Photoshop training.
    I hope this helps.

  • T510 with nVidia 3100M Optimus - not switching

Just received the above-mentioned PC and I was trying to test out the switching capability of the Optimus technology.  However, I can't seem to get it to switch.  I have a game inside Steam, and through the nVidia control panel I have set Steam to use the discrete card, but it never switches.  This is evident because I've started up the game with Optimus enabled versus entering the BIOS and explicitly choosing the discrete card, and it makes a world of difference.
Is there another application I could use to test out the switching capability (I read a previous message about Steam possibly not working)?  Or is there any way to explicitly switch the cards within the OS?  I am looking for a way to verify that this technology works.
    -john

    That would be beautiful, except that it wasn't switching automatically in certain cases.  In particular, Steam was having an issue.
    I made some headway though.  Basically, I updated the drivers for the nVidia chip by going to this page:
    http://www.nvidia.com/object/notebook_drivers.html
    And having them scan my system.  You have to allow them to use an ActiveX control (hint: use 32-bit IE for this).  This resulted in two options, one for the 3100M and one for the Intel chip.  I chose the 3100M download (version 2.67 from 3/24/2011) and installed it.
    The first thing I noticed is the nVidia control panel had a lot more configured apps.  Also, on the preview pane, I actually saw the spinning nVidia 3D logo whereas before it wasn't showing up in Optimus mode.  I was able to see the switching happen via this utility:
http://forums.laptopvideo2go.com/topic/26992-optimus-test-tools-finally-in-users-hands/
    I was a little worried downloading at first, but it seemed to work fine.  Anyway, you can play around with various applications and watch the switch happen.
Unfortunately, my game through Steam still crashed out, but it got a lot further after the driver update.  I've seen a lot of other people have problems with this particular game (Civ 5) using Optimus, so I'll just chalk it up to the game for now.  It plays fine if I explicitly set the discrete chip in the BIOS, which I can live with.
    My next step will be to test other games to make sure it truly is game specific.  Hope this helps people having the same problem.

  • Slow render speed with new Mac Pro / NVIDIA 8800

I bought a new Mac Pro to replace a dual G5 because I have a number of Motion projects to finish quickly. My local dealer recommended the NVIDIA 8800 as it has 512 MB of RAM (versus 128 MB on my old machine). I'm finding that the render time is worse than with the G5. I have yet to successfully render a 1-minute segment using Compressor (I've left it to run overnight several times with no success).
In Activity Monitor I'm seeing this message: CompressorTranscoder (Not Responding). But none of Batch Monitor, Compressor, or Motion appear unresponsive - nothing indicates a hung application.
    I've read that version 10.5.5 has compromised the NVIDIA card with Motion but haven't been able to get any reliable corroboration. As the Mac Pro shipped with 10.5.5 I can't really roll back to an earlier version.
Any advice? Anyone with similar experiences? I'm trying to render on a new iMac with 4 GB of RAM (with the ATI 2600 video card), but it doesn't seem to be working either: it is an hour in and shows at least three more hours.

    ChineesYouth wrote:
there's a better post; just use the search and type in 3870 or "best graphics card for motion"
    or click here

  • [Solved] Nvidia after kernel update

    The nvidia wiki clearly states that: "Every time the kernel26 package is updated, it is a requirement to reinstall the nvidia package and rebooting afterwards. "
But my question is: why? I mean, if the module were recompiled, that would make sense. But what does just reinstalling the driver accomplish? On my system, pacman -Ql nvidia shows:
    $ pacman -Ql nvidia
    nvidia /etc/
    nvidia /etc/modprobe.d/
    nvidia /etc/modprobe.d/nouveau_blacklist.conf
    nvidia /lib/
    nvidia /lib/modules/
    nvidia /lib/modules/2.6.35-ARCH/
    nvidia /lib/modules/2.6.35-ARCH/kernel/
    nvidia /lib/modules/2.6.35-ARCH/kernel/drivers/
    nvidia /lib/modules/2.6.35-ARCH/kernel/drivers/video/
    nvidia /lib/modules/2.6.35-ARCH/kernel/drivers/video/nvidia.ko
From what I've read, I think the kernel update process copies the compiled modules into the proper folder (inside /lib) named after the kernel version. Should this not be the case with nvidia? If it is not, is that why the reinstall is needed?
    Thanks in advance (and my apologies if I've glossed over something that's patently obvious --- while I've searched for the answer before posting, I am still quite the novice at tinkering with the kernel)
    EDIT: just after posting I've realized that maybe this post belongs in "Kernel and Hardware Issues" subforum. But the truth is this isn't really an "issue"... it's more a question from, well, a newbie :-)
    Last edited by gauthma (2010-09-28 21:53:04)

    gauthma wrote:
    Thank you for the feedback.
    ngoonee wrote:Of course, if you're using nvidia-beta or some other AUR nvidia driver, you need to reinstall it.
    Running the risk of asking what might be obvious to some, why?
The driver speaks to the kernel. When the kernel updates, the language used might have changed in various ways. Since nvidia is closed-source, it's not always obvious whether what has changed affects it. The same applies to xorg-server: when it updates, you should recompile your nvidia driver. It's sometimes not needed, but when it is, would you rather be solving it from the tty?
gauthma wrote:
    ngoonee wrote:That's what the wiki is referring to. Even if the kernel major version has not updated (for example 2.6.35.5 to 2.6.35.6 is only a minor version bump) the driver may need to be recompiled. Safer to just do it everytime and restart the machine.
    Now I'm confused. Are you saying that the safest approach is to recompile and (re)install the nvidia module after each kernel update? And does this only apply to modules from beta & AUR, or [extra] as well? And shouldn't all this issue of recompiling modules also apply to every other module installed in the system?
    Yes that's the safest approach. [extra] is taken care of by the devs, the simple way is to assume they'll bump it if needed (and you don't need to do anything).
The same thing applies to almost any kernel module (AFAIK). It applies to virtualbox's modules, for sure, as well as phc-intel (the only other two external modules I use). Most kernel modules are actually compiled into the kernel26 package anyway, so there aren't very many. I just automatically recompile the ones I use whenever I notice a kernel/xorg update.
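The check behind all this can be sketched as follows (a scratch tree stands in for the real /lib/modules so the logic is runnable anywhere; on a real system you'd use root="" and ver=$(uname -r), and reinstall the nvidia package if the file is missing):

```shell
#!/bin/sh
# Sketch: after a kernel update, verify that nvidia.ko exists for the
# (new) kernel version. A scratch tree is built here for illustration.
root=$(mktemp -d)
ver=2.6.35-ARCH
mkdir -p "$root/lib/modules/$ver/kernel/drivers/video"
touch "$root/lib/modules/$ver/kernel/drivers/video/nvidia.ko"

if [ -e "$root/lib/modules/$ver/kernel/drivers/video/nvidia.ko" ]; then
    echo "nvidia.ko present for $ver"
else
    echo "missing -- reinstall the nvidia package"
fi
```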

  • Can't build NVIDIA kernel module: .run file cannot be downloaded

I have patched my kernel to replace the O(1) scheduler with BFS. Then I tried to recompile the nvidia proprietary drivers. That didn't work:
    paul@paul-laptop-x86-64:~$ sudo abs
    Password:
    ==> Starting ABS sync...
    rsync: failed to connect to rsync.archlinux.org (66.211.214.132): Connection timed out (110)
    rsync error: error in socket IO (code 10) at clientserver.c(122) [Receiver=3.0.9]
So I tried to download the PKGBUILD manually via the Web interface.
    I tried to build it. That didn't work either:
    paul@paul-laptop-x86-64:/tmp/nvidia$ makepkg -cd
    ==> Making package: nvidia 302.17-1 (Wed Jun 20 16:33:23 MSK 2012)
    ==> WARNING: Skipping dependency checks.
    ==> Retrieving Sources...
    -> Downloading NVIDIA-Linux-x86_64-302.17-no-compat32.run...
    % Total % Received % Xferd Average Speed Time Time Time Current
    Dload Upload Total Spent Left Speed
    0 0 0 0 0 0 0 0 --:--:-- 0:01:03 --:--:-- 0curl: (7) couldn't connect to host
    ==> ERROR: Failure while downloading NVIDIA-Linux-x86_64-302.17-no-compat32.run
    Aborting...
I am not sure, but it could be network problems. I would be very pleased if someone could attach the file ftp://download.nvidia.com/XFree86/Linux-x86_64/302.17/NVIDIA-Linux-x86_64-302.17.run here.
    Also - is there any other way to build it, instead of skipping the dependency checks?
    I am using the Arch x86_64 version.
    Last edited by VisualPaul (2012-06-20 12:54:30)

    Your link is not working. Can't download.

  • Xorg-config isn't yet, trying nVidia-nouveau, on a desktop machine...

    Ok, I'm newbie on 'Arch', but fairly seasoned on Debian-family (MEPIS,sidux), and on openSUSE.
    Gotta admit, I am really LIKING this low-level installer, as a means to 'test' my Linux knowledge to date.
But it's certainly a *slower* way to get KDE-4 running, having to manually hack on all the individual /etc/<yada>/yada.conf files using 'nano', rather than just booting a LiveCD all the way into run-level 5 in one fell swoop, running a GUI-based installer, specifying a username/password, and being in KDE-4 twenty minutes later!
    But, I'm not complaining.
    Let's see...where am I?
    Got the basic (run-level-3) system installed to a hard-drive partition, using my existing /home partition.
[Had one small glitch getting ethernet to DHCP-acquire an IP address... had to "hook <some-conf?>" so I could add 'nameserver 192.168.0.1' into 'resolv.conf' (or whatever its name is) and get that to stick.]
    So, net access works.  I was then able to do whatever else that basic-install manual said... told it to configure nVidia-nouveau by creating the '20-nouveau.conf'.  Rebooted, as directed.  (Several times now.)
    Tried 'Xorg -configure'.  It fails with:
    1:  [drm] No DRICreatePCIBusID symbol.
    2: Num created screens does not match num of detected devices.
EDIT: Oh, I did read that I could grab an existing xorg.conf from another distro, but that would be cheating, eh?
    That's the one thing blocking getting to run-level 5.
    I did 'jump ahead' and installed KDE-4, but of course, until I make Xorg happy, that's really not gonna do me much good.
    So, in addition to advice on configuring Xorg, one other 'forest-for-the-trees question':
      Is there a higher-level (GUI-based) approach to installing Arch, rather than using 'netinstall'?
    (I did notice that other kit, recommended for 'people with unreliable networks', but that
    didn't sound like me.)
    TIA...
    Last edited by cookdav (2010-12-03 23:11:04)
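    For reference (this is not from the post, just the usual minimal form), a '20-nouveau.conf' dropped into /etc/X11/xorg.conf.d/ is typically just a Device section; the Identifier string is arbitrary:

    ```
    Section "Device"
        Identifier "card0"
        Driver     "nouveau"
    EndSection
    ```

    With that file in place, running 'Xorg -configure' is usually unnecessary; if X still fails to start, it's worth checking that both the nouveau kernel module and the xf86-video-nouveau package are actually installed.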

    I agree with 'some like the control of cmd-line, to achieve non-default settings', but I'd argue that
    doing the installation using a LiveCD like Chakra OUGHT to allow the 'same' choices and flexibility
    that the cmd-line method does.
    However, in my view, the cmd-line approach seemed to leave out actually guiding the user through a
    script that ensures the 'sound' and 'wifi' prerequisites get installed!  (The 'netcfg' package is
    a prime example.)  Instead, the user follows a written document (e.g. either the 'basic' or
    'complete' Arch manual), which mentions subjects like 'sound' and 'wifi' but does NOT even detect
    the presence of wifi-adapter or sound hardware, etc.  In other words, if ANY wifi hardware is
    detected, the 'netcfg' package should get installed, but it clearly doesn't.  Rather, it relies on
    the person installing to somehow know/determine that they need to manually install 'netcfg' (as
    well as then manually hunt down their firmware, etc).
    Whereas, the 'Chakra' (GUI) approach at least makes an attempt (e.g. causes 'netcfg' to
    get installed transparently), easing the learning curve.
    That said, the negative of the guided/GUI approach is that the person installing does NOT
    then learn/understand precisely WHAT STEPS got executed for them, and thus learns very
    little of what packages and what config-files need to be examined/modified to carry
    out those various steps.  As a result, while I'm convinced that SUCCESSFUL 'arch' owners/users
    become some of the most knowledgeable users in the details of Linux configuration,
    I'm left wondering just what percentage of newbies who first TRY Linux via 'arch'
    end up succeeding with it, versus either giving up on Linux or wandering off to some
    other distro that has a more scripted/automated installer.
    (i.e. an install-script/GUI that tries to get them up-and-running with minimal fuss, and allows them
    to  LATER absorb the details,  via  future updates/maintenance glitches/issues.).
    In other words, after my 5-10 years of Linux experience, I was surprised how much I had to
    struggle, to get a working laptop-with-wifi installation.  In a word, I 'failed' the test that
    arch provided me, and felt I had to resort to Chakra, to meet the other goal of getting
    that laptop up-and-running with sound and wifi and xserver working.  But, maybe that's
    just a reflection of our modern society's push into the "instant-gratification" direction?
    Thanks to ALL who have participated in this thread, for their guidance!  As I mentioned,
    I'm both humbled and more knowledgeable than I was when I started on the 'arch'
    journey, just a few short days ago.
    As I mentioned in the base post, I came to 'arch' to LEARN something and to TEST my
    Linux knowledge, and I met those goals.  That said, I found it disconcerting that the manual(s)
    I read seemed to gloss over such fundamental things: they never mention the need to install
    'netcfg', and they don't result in any flavor of sound support being put in place.
    Last edited by cookdav (2010-12-07 14:50:22)

  • Error after latest nvidia update...missing libpangox?

    Latest nvidia update (304.60-1):
    $ nvidia-settings
    "nvidia-settings: error while loading shared libraries: libpangox-1.0.so.0: cannot open shared object file: No such file or directory"
    Closest thing is libpango, libpangoxft....

    Or recompile nvidia-settings. Here's a snippet of how I do it (in a totally different distro), building from nvidia's nvidia-settings source code:
    if module_installed "gtk+-2" ; then
        # Build nvidia-settings from source
        cd nvidia-settings-$VERSION &&
        # xf86vmode.h has been removed in xf86vidmodeproto 2.3
        if [[ ! -e /usr/include/X11/extensions/xf86vmode.h ]] ; then
            sed -i -e "s:#include <X11/extensions/xf86vmode.h>:#include <X11/extensions/xf86vmproto.h>:" src/libXNVCtrlAttributes/NvCtrlAttributes{,VidMode,Glx}.c
        fi &&
        make clean &&
        make -C src/libXNVCtrl &&
        make || exit 1
    module_installed is distro-specific - of course, it's checking whether gtk+-2 is available, FYI.
    Edit: Corrected source-code file
    Last edited by brebs (2012-11-17 23:32:30)
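    To see what that sed line actually does in isolation, here is a self-contained demonstration on a scratch file (the /tmp path and sample file are hypothetical, not part of the nvidia-settings tree) performing the same header rename:

    ```shell
    # Create a sample source line using the old header name.
    printf '#include <X11/extensions/xf86vmode.h>\n' > /tmp/xf86vmode-demo.c
    # Same substitution as in the build snippet: swap the removed
    # xf86vmode.h header for xf86vmproto.h, editing the file in place.
    # The ':' delimiter avoids escaping the '/' characters in the paths.
    sed -i -e 's:#include <X11/extensions/xf86vmode.h>:#include <X11/extensions/xf86vmproto.h>:' /tmp/xf86vmode-demo.c
    cat /tmp/xf86vmode-demo.c
    # prints: #include <X11/extensions/xf86vmproto.h>
    ```

    In the real build, the same command is simply applied to the three NvCtrlAttributes*.c files at once via brace expansion.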
