30'' & the GeForce 6800 GT

Hello. Today I purchased the 30 in. Apple HD Cinema Display. I did try to research it a bit before buying. I use a GeForce 6800 GT, but the display won't let me go higher than 1280x800 resolution.
Any ideas whether this is a PC vs. Apple issue or my graphics card? I did download the latest driver.
If it isn't a hardware conflict, what can I do to increase the resolution?
Thanks for your time.
Az

I have a PC with a GeForce 6800 GT, and it supports 1920 x 1200 resolution on my 23" display, so it should go higher than 1280. Try downloading the new driver:
http://www.nvidia.com/object/winxp2k81.85.html
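One likely culprit, not confirmed in this thread, is the single-link DVI bandwidth limit: the 30" Cinema Display's native 2560x1600 mode needs a dual-link DVI connection, and a single-link port or cable caps out well below it. A rough back-of-envelope sketch (blanking intervals ignored, so real pixel clocks are somewhat higher):

```python
# Single-link DVI tops out at a 165 MHz pixel clock; dual-link doubles that.
# This is a sketch using active pixels only, not a full CVT timing calculation.
SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz=60):
    """Approximate pixel clock in MHz, counting active pixels only."""
    return width * height * refresh_hz / 1e6

native = pixel_clock_mhz(2560, 1600)   # 30" Cinema Display native mode
fallback = pixel_clock_mhz(1280, 800)  # the mode the poster is stuck at

print(f"2560x1600@60 needs ~{native:.0f} MHz (over the {SINGLE_LINK_LIMIT_MHZ:.0f} MHz single-link limit)")
print(f"1280x800@60 needs ~{fallback:.0f} MHz (fits on a single link)")
```

So even a correct driver cannot push 2560x1600 through a single-link connection; the Mac (DDL) versions of these cards carry dual-link ports for exactly this reason.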

Similar Messages

  • Is the geforce 6800 gt or ultra compatible with g5 june 2004?

    Is the GeForce 6800 GT or Ultra compatible with a G5 (June 2004)?
    If so, which I think it is (I just need to check), where's the cheapest place/website I can buy it from in the UK?

    Where's the cheapest place/website I can buy it from in the UK?
    eBay.co.uk, from this proven and reliable seller:
    OEM GeForce 6800 Ultra (best if you want two big monitors)
    Flashed, Mac-compatible GeForce 6800 GT (a performance value)

  • Is the geforce 6800 ultra ddr3 same as the ddr and will the ddr3 fit my dual powermac G5 june 2004?

    I want to get this video card, but there is a DDR and a DDR3 version. What's the difference, and will it fit my Power Mac dual-CPU (June 2004)?

    The Apple OEM versions use GDDR3 memory for both the GT DDL and the Ultra DDL.
    Both run 16 pipelines and both use the NV4x GPU, clocked at 350 MHz and 400 MHz respectively.
    Total bandwidth favors the Ultra DDL:
    GeForce 6800 GT: 32 GB/s
    GeForce 6800 Ultra DDL: 35.2 GB/s
    However, benchmarks show the performance of the two cards to be very close:
    http://www.barefeats.com/radx800.html
    The PC versions of the GT and Ultra use the same GPU, GDDR3 memory, and the same clock speeds, except in the case of overclocked (OC) variants.
    The memory clocks of the GT and Ultra (OEM or PC) are 500 MHz (1000 MHz effective) and 550 MHz (1100 MHz effective) respectively.
    Performance of the PC version is the same as the OEM equivalent; the major differences are port configuration and card width. PC cards are typically single slot, while OEM cards are dual slot, which is a negative if you have many PCI/PCI-X cards.
    Of course there are cooler variations, but all use either a single- or dual-slot Cooler Master cooler.
    Both OEM GeForce 6800 GT and Ultra cards will fit and work in any AGP G5 (yours included); only OS X Panther 10.3.6 or later is needed (Tiger saw an improvement in GeForce drivers).
    Of the flashed GT and Ultra cards, the same is true as of the OEM: either can be used in any AGP G5.
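    The bandwidth figures above follow directly from the memory clocks and the 256-bit bus these cards use (the bus width is assumed from the cards' published specs, not stated in the thread):

```python
# Memory bandwidth = effective (double-data-rate) clock x bus width.
# Both the 6800 GT and Ultra use a 256-bit (32-byte) memory bus.
BUS_BYTES = 256 // 8

def bandwidth_gb_s(effective_mhz):
    """GB/s for a GDDR3 bus at the given effective clock in MHz."""
    return effective_mhz * 1e6 * BUS_BYTES / 1e9

print(f"GT    (1000 MHz effective): {bandwidth_gb_s(1000):.1f} GB/s")
print(f"Ultra (1100 MHz effective): {bandwidth_gb_s(1100):.1f} GB/s")
```

    This reproduces the 32 GB/s and 35.2 GB/s figures quoted above, and shows why the real-world gap between the two cards is so small: a 10% memory-clock bump is the only bandwidth difference.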

  • Will the GeForce 6800 fit in an MSI K8N Neo (MS-7030)?

    Please see these pictures:
    http://www.hardwarezone.com/img/data/articles/2004/1247/msi_capacitor_bloc.jpg
    http://www.hardwarezone.com/articles/view.php?id=1247&cid=6&pg=12

    Might help a tad:
    https://forum-en.msi.com/index.php?topic=86098.0

  • DVI to ADC Adapter for GeForce 6800 video card?

    Do I need a DVI to ADC adapter for the Nvidia GeForce 6800 I just installed in my G5, to be able to run the ADC 23" Cinema off the GeForce 6800 card?
    I want to run a new 30" Cinema and the old ADC 23" Cinema from the GeForce 6800. I ran the old 23" from the ATI Radeon 9600 Pro.
    Thanks,
    Milan

    Hi Milan,
    If the GeForce 6800 does not have an ADC output, then yes, you do.
    You also need to check whether the card has a DVI-I output: some DVI to ADC adapters use a DVI-I plug, which will not fit into a DVI-D output because of the extra pins in the plug.
    Cheers!
    Karl

  • X800XT vs. GeForce 6800

    Chris?
    BD?
    Hope you guys are well. Chris, I know we've explored these issues before; forgive the repetition. Still working on a G5 SP 1.8 for a friend in need. He'll be using an older version of FC. The app will work fine, but I'd like to upgrade his Radeon 9600 to either an Applemacanix flashed X800XT or one of those 6800s we talked about. The 6800 (Nvidia A220 GeForce 6800, 603-7710) is used of course (with 30-day return), from a reputable eBayer. And it's also cheaper. Waddyatink? Which way?
    Best
    Mike

    The flashed X800XT (FireGL) is a good card.
    Running at Apple OEM X800XT clocks, it is slower than at the FireGL clock. The card, though, heats things up fast when running faster (at FireGL clock speeds and above). I have one OC'd to 550 MHz, but it also has an Arctic Cooling ATI Silencer on it.
    It's a bit of a struggle to cool the FireGL with its stock cooler, even at OEM ROM settings.
    The flashed X800 is a single-slot card, good for PCI card placement, has a dual-link DVI port, and provides very strong OpenGL support for OS X.
    Weak coolers are a carryover from previous Radeon cards...
    FC? Motion?
    I'd get the Geforce 6800 Ultra:
    http://www.barefeats.com/rad9650.html

  • Will this card work? - Nvidia Geforce 6800 ULTRA 256MB VRAM AGP Video Card

    I posted a question about getting the maximum resolution from an Apple 30 inch Cinema Display and another card, which didn't work.
    The replies were fast here in the forum and quite correct.
    My next question is regarding the Nvidia Geforce 6800 ULTRA 256MB VRAM AGP Video Card.
    I have an AGP card with an ATi Radeon 9600 Pro card that can't power the 30 inch because it's only 64MB and no DVI output.
    I'm wondering whether this card will give me the maximum resolution?
    Thanks again.

    Hi Japamac-
    Thanks for the quick reply.
    My System Profiler indicates an ATI Radeon 9600 Pro, but it doesn't have dual link, since this is an older G5 with only 64MB of VRAM.
    The GeForce 6800 is a refurb, but is Mac compatible.
    I'm hoping that it will work, but since it won't be returnable, I don't want to spend the money needlessly.
    I did order another card that the Apple support suggested should be compatible, but turned out that they sent me the PC version.
    I've spoken to Apple as well and get conflicting info.
    It boils down to the fact that this is a G5 with a late 2004 date.
    It is fast enough for me, and my only hang-up is that I'm only getting 1280x800 resolution.
    I'm not using it for gaming or graphics processing.
    HTH.

  • Graphics card upgrade from GeForce 6800 Ultra...

    Can anyone help me out?
    I'm working in Final Cut Studio (FCS 2 in a few weeks, wooo!!), Photoshop, AE7, Logic, etc. I'm on a Dual 2.7 G5 w/ 8GB SDRAM. I currently have the GeForce 6800 Ultra DDL card w/ a 23" Cinema HD display.
    I've read a few posts saying this card is pretty good; however, I feel my system still chugs a bit sometimes, depending on the application I'm in. I've read about the X800 XT, ATI Radeon 1900, and some others. My System Profiler says I have an AGP bus, so I understand compatibility is an issue; I just don't have expansive knowledge of graphics displays/setups.
    While I'm still not quite in the market for an 8-core, I would still like to max my graphics speed and efficiency on my current setup. Keep in mind I am close to purchasing a second 23" display as well.
    With those things in mind, can somebody educate me a bit on alternatives/upgrades/options? Although price is always an issue, I would like to exhaust ALL avenues in this situation.
    I'm also anticipating CS3 and the new Motion 3 w/ 3D space to chug my system, so anything I can do to combat this would help.
    Sorry about my lack of graphics knowledge; ANY help would be greatly appreciated. Thanks, Zack

    To add to Mike's response, here are some links to tests with both the 6800 Ultra and the X800:
    http://barefeats.com/radx800.html
    http://barefeats.com/radx850.html
    Note that the second link contains info on the OEM X850 as well as the retail X800, so read the graphs carefully!

  • Will the XFX Geforce 6800 Ultra work with G5 Mac

    I have the standard NVidia Geforce 6800 Ultra AGP card working in my G5 2.7DP, and have been given an XFX version of the card.
    But the literature that comes with the XFX suggests it's for Windows, not Mac. Will it work in my Mac? If so, I'll keep it for a backup; if not, it's no use to me.
    Description of card is XFX Geforce 6800 Ultra 256Mb DDR3 AGP8x Dual DVI TVout.

    It won't work as is.
    If you have the right ROM chip on it, you can put it into a Windows machine and flash it with a modified Mac ROM. If all is right and goes well, then it would work in a Mac.
    Otherwise, as you put it, it is of "no use" to you.

  • Problem with MSI Nvidia GeForce 6800 LE (128MB) when playing 3D games

    Games problem
    I bought a new MSI Nvidia GeForce 6800 LE (128MB) a couple of months ago and have been having loads of trouble when playing games.
    In Windows, the graphics card works fine. The computer never crashes and everything is speedy. Movies on DVD, XviD, and DivX all play fine without problems; the most I get is occasional freezing, after which Windows returns to normal.
    The problem arises when I try to play a 3D game. When I load a game, it goes into the menu screens OK, the computer loads the 3D engine, and it plays nice and crisp for around a minute; then the screen goes blank and the computer locks up. All the drives go quiet, and the computer makes three beeps with the PC speaker. Pressing CTRL+ALT+DEL at this point does nothing, nor does the reset button; I have to turn the power off and on.
    I really have no clue why this is happening. I have taken the covers off and kept the machine in a cool area, hoping it was something to do with overheating; my computer is not overly cluttered, and this made no difference.
    I have also unplugged the CD-R, DVD-R, USB ports, and one hard drive (storage) to see whether power was not getting to the video card, then tried to play a game, and the same problem arose.
    I have tried two games, Tiger Woods 2005 and Doom 3, both of which crash in the same manner.
    I have tried to be as detailed as I can with the description of my computer below. I'm not sure about the power details, so I posted a picture of the power unit to help.
    Many thanks for your help in this matter. I'm desperately trying to get some worth out of this card and feel frustrated that it only goes bonkers when trying to play games :/
    Please tell me if you require further information about the system.
    Neehar Shah
    Model : Intel(R) Pentium(R) 4 CPU 3.20GHz
    Speed : 3.21GHz
    L2 On-board Cache : 512kB ECC Synchronous ATC (8-way sectored, 64 byte line size)
    System BIOS : Phoenix Technologies, LTD 6.00 PG
    Mainboard : Legend QDI PLATINIX-8
    Total Memory : 511MB DDR-SDRAM
    CPU
    Model : Intel Corporation 82845G/GL/GE Brookdale Host-Hub Interface Bridge (B1-step)
    Front Side Bus Speed : 4x 200MHz (800MHz data rate)
    Total Memory : 512MB DDR-SDRAM
    Memory Bus Speed : 2x 200MHz (400MHz data rate)
    BIOS
    Manufacturer : Phoenix Technologies, LTD
    Version : 6.00 PG
    Date : 06/19/2003
    Plug & Play Version : 1.00
    SMBIOS/DMI Version : 2.30
    (EE)PROM Size : 256kB  (2Mbit)
    Video System
    Monitor/Panel :  SyncMaster 171S/ 175S/ 170S, SyncMaster Magic  CX175S-AZ/LX175S
    Adapter : NVIDIA GeForce 6800 LE
    OEM Device Name : Nvidia Corp ??? (0042)
    OEM Hardware ID : FUN_0, VEN_10DE, DEV_0042, REV_A1
    Device Name : Micro-Star International Co Ltd (MSI) ??? (0042)
    Product ID : VEN_1462, DEV_9751
    Revision : K / 2 (161)
    Physical Storage Devices
    Hard Disk : Maxtor 6Y080L0 (76.3GB)
    Hard Disk : ST3200822A (186.3GB)
    CD-ROM/DVD : AOPEN CD-RW CRW2440 (CD 40X Rd, 24X Wr)
    CD-ROM/DVD : PLEXTOR DVDR   PX-708A (CD 40X Rd, 40X Wr) (DVD 5X Rd, 5X Wr)
    Peripherals
    USB Controller/Hub : USB 2.0 Root Hub
    FireWire/1394 Controller/Hub : OHCI Compliant IEEE 1394 Host Controller
    MultiMedia Device(s)
    Device : MPU-401 Compatible MIDI Device
    Device : Standard Game Port
    Device : Creative SB Audigy
    Device : Creative Game Port
    Operating System(s)
    Windows System : Microsoft Windows XP/2002 Professional (Win32 x86) 5.01.2600 (Service Pack 2)
    Network Services
    Adapter : Realtek RTL8139/810x Family Fast Ethernet NIC

    Firstly, thank you for all your prompt responses; I really appreciate it. From the sounds of it, the PSU is ****.
    Antec:
    From what I have read, the Antec PSUs have good reliability and ratings. I looked around and saw this one: an Antec NEOPower 480W ATX PSU for around £65. I also read that article posted by Dr Stu, and then stumbled on more articles about PSUs.
    I have a couple more questions...
    The rating of my system (http://www.jscustompcs.com/power_supply/Power_Supply_Calculator.php) with all the components running came out to 422W. So do you think the Antec NEOPower 480W ATX PSU would be sufficient for the computer, considering I might add more RAM in the future?
    Or should I buy a 500W+ one instead?
    Does anyone know if the Antec PSUs are quiet? My current Q-Tec one makes me feel like I'm in an aeroplane.
    Lucky I read the forum!
    I was in Maplin (a computer shop) at lunch and thought: that shiny 650W Q-Tec looks nice, and it looks like loads of power; that should do the trick. I got to the counter and thought, let me wait till I get home from work and read some more on the forum about this PSU issue.
    Thank god I didn't buy that S***. Having read some of the reviews, I would have wasted lots of money for nothing.
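    For what it's worth, the headroom math behind the 480W-vs-500W question can be sketched like this (the ~20% headroom figure is a common forum rule of thumb, not a hard spec):

```python
# Estimated load vs. PSU rating: a common rule of thumb is to leave
# roughly 20% spare capacity so the supply isn't running near its maximum.
def headroom_pct(psu_watts, load_watts):
    """Spare capacity as a percentage of the PSU rating."""
    return (psu_watts - load_watts) / psu_watts * 100

load = 422  # the calculator estimate from the post
for psu in (480, 500, 550):
    print(f"{psu} W PSU -> {headroom_pct(psu, load):.0f}% headroom")
```

    By that yardstick, the 480W unit is cutting it fairly close at ~12% headroom with today's load, so a 500W+ unit leaves more room for added RAM or drives.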

  • Geforce 6800 Ultra - output only video to TV

    I just bought a new GeForce 6800 Ultra and I'm a little confused about the settings for TV output via the S-Video cable. Is it possible to have only the video I'm watching zoomed full screen on the TV? Currently I can only get the desktop shown on the TV, not just the video.
    I'm asking because I used to have a Matrox Parhelia 128, which had this function, and I liked watching all my DVDs and other video on my TV.
    Thank you for your help!

    I'm still confused about the TV-problem.
    When I switch to TV mode I can watch videos on the TV, but when I switch back I have to redo all the resolutions and screen adjustments in the settings. It's really frustrating to do it every time.
    Does anyone know a workaround for this?

  • [SOLVED] xcompmgr and nvidia geforce 6800 LE

    Hello everybody,
    I am installing Arch on my main desktop, which has an Nvidia 6800 LE. I used the wiki, and almost everything works fine. However, I am used to having xcompmgr running with the awesome WM, but my Xorg configuration for the GeForce 6800 LE must not be correct, because it's not working at all.
    I noticed that the freetype and type1 modules highlighted in the wiki are not found on my computer...
    You can find my xorg.conf here:
    # nvidia-xconfig: X configuration file generated by nvidia-xconfig
    # nvidia-xconfig:  version 1.0  (buildmeister@builder63)  Thu Apr 16 19:36:29 PDT 2009
    Section "ServerLayout"
        Identifier     "X.org Configured"
        Screen      0  "Screen0" 0 0
        InputDevice    "Mouse0" "CorePointer"
        InputDevice    "Keyboard0" "CoreKeyboard"
    EndSection
    Section "Files"
        ModulePath      "/usr/lib/xorg/modules"
        FontPath        "/usr/share/fonts/misc"
        FontPath        "/usr/share/fonts/100dpi:unscaled"
        FontPath        "/usr/share/fonts/75dpi:unscaled"
        FontPath        "/usr/share/fonts/TTF"
        FontPath        "/usr/share/fonts/Type1"
    EndSection
    Section "Module"
       # Load           "glx"
       # Load           "dri2"
       # Load           "record"
       # Load       "xtrap"
        Load           "dbe"
        Load           "extmod"
       # Load           "type1"
       # Load           "freetype"
        Load           "glx"
    EndSection
    Section "ServerFlags"
    #    Option         "AutoAddDevices" "False"
    #    Option         "AllowEmptyInput"  "False"
        Option         "DontZap" "false"
    EndSection
    Section "InputDevice"
        Identifier     "Keyboard0"
        Driver         "kbd"
        Option         "XkbLayout" "fr"
        Option         "XkbVariant" "latin9"
    EndSection
    Section "InputDevice"
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/input/mice"
        Option         "ZAxisMapping" "4 5 6 7"
    EndSection
    Section "Monitor"
        Identifier     "Monitor0"
        VendorName     "Monitor Vendor"
        ModelName      "Monitor Model"
        HorizSync       30.0 - 130.0
        VertRefresh     50.0 - 100.0
    EndSection
    Section "Device"
            ### Available Driver options are:-
            ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
            ### <string>: "String", <freq>: "<f> Hz/kHz/MHz"
            ### [arg]: arg optional
            #Option     "ShadowFB"               # [<bool>]
            #Option     "DefaultRefresh"         # [<bool>]
            #Option     "ModeSetClearScreen"     # [<bool>]
        Identifier     "Card0"
        Driver         "nvidia"
        VendorName     "nVidia Corporation"
        BoardName      "NV40.2 [GeForce 6800 LE]"
        Option       "RenderAccel" "true"
        Option         "NoLogo" "true"
        Option         "AGPFastWrite" "true"
        Option         "EnablePageFlip" "true"
        Option         "AllowGLXWithComposite" "true"
        Option         "XAANoOffscreenPixmaps"
        Option         "NoFlip" "True"
    EndSection
    Section "Screen"
        Identifier     "Screen0"
        Device         "Card0"
        Monitor        "Monitor0"
        DefaultDepth   24
        Option         "AddARGBGLXVisuals" "True"
        SubSection     "Display"
            Viewport    0 0
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       4
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       8
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       15
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       16
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       24
        Modes       "1280x1024" "1024x768" "800x600"
        EndSubSection
    EndSection
    Section "Extensions"
        Option         "Composite" "Enable"
       # Option       "RENDER" "Enable"
    EndSection
    I was wondering if you had any idea of the cause of this malfunction. Also, if someone has a good working xorg.conf for the GeForce 6800 LE, I would be happy to know about it.
    Thanks a lot for your help,
    Sirsurthur
    Last edited by Sirsurthur (2009-05-24 07:56:37)

    Solved :
    # nvidia-xconfig: X configuration file generated by nvidia-xconfig
    # nvidia-xconfig:  version 1.0  (buildmeister@builder63)  Thu Apr 16 19:36:29 PDT 2009
    Section "ServerLayout"
        Identifier     "X.org Configured"
        Screen      0  "Screen0" 0 0
        InputDevice    "Mouse0" "CorePointer"
        InputDevice    "Keyboard0" "CoreKeyboard"
    EndSection
    Section "Files"
        ModulePath      "/usr/lib/xorg/modules"
        FontPath        "/usr/share/fonts/misc"
        FontPath        "/usr/share/fonts/100dpi:unscaled"
        FontPath        "/usr/share/fonts/75dpi:unscaled"
        FontPath        "/usr/share/fonts/TTF"
        FontPath        "/usr/share/fonts/Type1"
    EndSection
    Section "Module"
         Load "bitmap"
         Load "ddc"
         Load "dbe"
         Load "dri"
         Load "extmod"
        # Load "freetype"
         Load "glx"
         Load "int10"
        # Load "type1"
         Load "vbe"
       # Load           "glx"
       # Load           "dri2"
       # Load           "record"
       # Load       "xtrap"
       # Load           "dbe"
       # Load           "extmod"
       # Load           "type1"
       # Load           "freetype"
       # Load           "glx"
    EndSection
    Section "ServerFlags"
        Option         "AutoAddDevices" "False"
        Option         "AllowEmptyInput"  "False"
        Option         "DontZap" "false"
    EndSection
    Section "InputDevice"
        Identifier     "Keyboard0"
        Driver         "kbd"
        Option         "XkbLayout" "fr"
        Option         "XkbVariant" "latin9"
    EndSection
    Section "InputDevice"
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/input/mice"
        Option         "ZAxisMapping" "4 5 6 7"
    EndSection
    Section "Monitor"
        Identifier     "Monitor0"
        VendorName     "Monitor Vendor"
        ModelName      "Monitor Model"
        HorizSync       30.0 - 130.0
        VertRefresh     50.0 - 100.0
    EndSection
    Section "Device"
            ### Available Driver options are:-
            ### Values: <i>: integer, <f>: float, <bool>: "True"/"False",
            ### <string>: "String", <freq>: "<f> Hz/kHz/MHz"
            ### [arg]: arg optional
            #Option     "ShadowFB"               # [<bool>]
            #Option     "DefaultRefresh"         # [<bool>]
            #Option     "ModeSetClearScreen"     # [<bool>]
        Identifier     "Card0"
        Driver         "nvidia"
        VendorName     "nVidia Corporation"
        BoardName      "NV40.2 [GeForce 6800 LE]"
        Option       "RenderAccel" "true"
        Option         "NoLogo" "true"
        Option         "AGPFastWrite" "true"
        Option         "EnablePageFlip" "true"
        Option         "AllowGLXWithComposite" "true"
        Option         "XAANoOffscreenPixmaps"
        Option         "NoFlip" "True"
    EndSection
    Section "Screen"
        Identifier     "Screen0"
        Device         "Card0"
        Monitor        "Monitor0"
        DefaultDepth   24
        Option         "AddARGBGLXVisuals" "True"
        SubSection     "Display"
            Viewport    0 0
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       4
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       8
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       15
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       16
        EndSubSection
        SubSection     "Display"
            Viewport    0 0
            Depth       24
        Modes       "1280x1024" "1024x768" "800x600"
        EndSubSection
    EndSection
    Section "Extensions"
        Option         "Composite" "Enable"
       # Option       "RENDER" "Enable"
    EndSection

  • Heat ram geforce 6800 ultra

    I'm planning on installing water cooling on my MSI GeForce 6800 Ultra because the fan just makes way too much noise. I can choose between a complete VGA cooling block or just a chipset (GPU) block. So I was wondering: does the RAM get hot when it's not cooled, or will it be just fine without any cooling?
    I was also wondering whether, once I've installed the water cooling, I can remove the extra power-supply connector and let the card run on AGP power alone (probably not, but I only have a 460 W supply and I'm having power shortages; I already had to remove the TV card)...

    I believe the RAM on the 6800 is GDDR3, which runs cooler than its predecessors; however, I am not entirely sure.
    I do know that some of the previous heatsink designs were basically consolidated solutions, meaning the GPU and RAM were cooled with a single heatsink.
    MSI's TopTech coolers found on some of the 8X Ti 4X series were a prime example of this particular design. In several cases where I had failing fans, I had no alternative but to replace the cooler assembly with a single GPU fan such as the Vantec IceberQ 4 (http://www.vantecusa.com/product-cooling.html#), but the package only came with 4 heatsinks and the cards in my possession needed 8. Needless to say I did not use them, as I discovered that the card ran much cooler without them when not overclocked; ramsinks would certainly have improved OC'ing performance, but in a client situation this was not necessary.
    Ultimately, what I noticed was that the all-in-one heatsink transferred heat from the GPU to the heatsink and back to the RAM chips, which ran cooler anyway. Not the best solution, IMHO.
    There was also an additional factor: the systems I built had a side-cover fan aimed directly at the card, so in fact I didn't really need ramsinks.
    In your situation, you may be able to get away with just the GPU waterblock, but lacking my own experience with the 6800 I cannot say definitively that this would be the best solution for you. I do know that my X800's RAM is GDDR3 and seems to run cooler, as mentioned above. But I am on air.
    If you stay current with systems from boutique builders featured in PC magazines such as Maximum PC, you may have noticed that one system builder actually made a separate cooling solution for the RAM as well.
    I do not know if this is available yet, and you may find you have to use a combination of air and liquid. You may not need to do anything at all; it's hard for me to speculate.

  • GeForce 6800, Screensaver and Timer of Audio Hijack

    My G5 with the GeForce 6800 wakes from sleep fine after touching any key. First there is a picture of my screensaver for a split second, and then the whole active screen. The sleep settings for the HD and the screen are at defaults in System Prefs > Energy Saver.
    When the Timer of Audio Hijack Pro is supposed to wake the machine for audio recording, the picture freezes, the screen doesn't open, and the recording can't proceed because the AHP application remains closed.
    I've tried a variety of screensaver and sleep settings, but the Timer won't wake the machine beyond the frozen screensaver. If I eliminate the screensaver, the Cinema 30 screen opens up but is frozen; the clock isn't running until I touch a key. It looks like the Timer's impulse isn't strong enough to fully wake the machine.
    Could this be a conflict between the 6800 and the Timer of AHP? Does that make sense? Has anybody experienced a similar issue? I would hate to test it by removing the 6800 and putting the 9800 back. In my old G4, without the 6800, the Timer works just fine.
    Milan

    Aside from Andy's reply:
    10.8.5 runs as smooth as, or even smoother than, 10.8.4 on my Mac mini. Sure, it's not an iMac, and I have none of the issues you report.
    Sublime Text 2 Build 2221 appears to launch, open, and save files quicker than on 10.8.4.
    Your other third-party apps may or may not require developer updates. Even IntelliJ IDEA 12.1.4 still has a JDK 6 installation requirement; I recommend the appropriate support community for IntelliJ.
    Also recommend that you run Disk Utility and verify/repair permissions. I found some stuff post-install that needed that.
    Update: Just saw your latest post. Good solve.

  • Colour Issues - NVIDIA GeForce 6800 Ultra in G5 2.5Ghz

    Hi
    I have an odd issue with NVIDIA GeForce 6800 Ultra card in a G5 Tower.
    The monitor (I tried several screens, so I ruled that out) displays as if it is only producing Thousands instead of Millions of colours, but this is only visible at the black and white ends of the scale. For example, if you view a folder's contents as a list, you can't see the pale blue lines under every other file. If you open another window on top, you can see the blue lines in what would be the shadow of the top window. So pale colours and dark colours are showing as white or black.
    Interestingly, I have a fix for this issue, but it's not permanent, as it reverts on restart: if I go to System Prefs > Displays, rotate the display 180˚, and then revert to 0˚, the problem is fixed!
    I've tried everything I know to find a permanent fix:
    Dumping Preferences
    Calibrating
    Resolutions and Refresh rate
    I did have this issue on an ATI card, but I resolved that with an ATI System Preference Pane.
    Can anybody suggest trying anything?
    thanks
    Matt

    Genius, this strikes a chord. I remember resolving this issue for a customer a couple of years ago, and this was the answer. I was very pleased with myself at the time, but obviously my memory is failing me now.
    Can't test until Monday, but will be in touch.
    PS: what does this say/mean - "Aha, good clue, the bloe bard!"
    thanks
    Matt
