New GeForce 6600GT AGP cards

Has anyone installed one of the new GeForce 6600GT cards (AGP, since there's no PCIe on existing AMD mobos)?
If so, any issues? Any good comments?
I'm looking to purchase the one from XFX, but I wanted feedback first. I know my mobo is sensitive to its components.

Quote
Originally posted by syar2003
The 6600 version with AGP needs an AGP/PCIe bridge, which makes it
more expensive than the native PCIe 6600.
Yes, I stated that in my post. But, as I said, I read some reviews saying the AGP version is supposed to have underclocked RAM to offset the cost of the bridge. Here's the link:
http://www.anandtech.com/video/showdoc.aspx?i=2277
"The GeForce 6600GT AGP runs at the same core clock speed as the PCI Express version (500MHz) but has a slightly lower memory clock (900MHz vs. 1GHz on the PCI Express version).  By lowering the memory clock NVIDIA helps to offset the additional cost of the PCI Express-to-AGP bridge.  The performance impact of the reduction in memory clock as well as the on-board bridge is between 0 - 5%.  For example, in Doom 3 at 1024 x 768 (High Quality) the PCI Express version of the GeForce 6600GT is 3.5% faster than the AGP version.  There is a performance difference, but it does not appear to be huge. "
Maybe the cheaper RAM isn't enough to make up the cost of the bridge, but I can't believe the bridge is that expensive. I think they cost more mainly due to much higher demand for AGP.
NOTE: Also in the same article: "The XFX card is identical to NVIDIA's reference board except for one major factor - it is clocked at the PCI Express 6600GT speeds of 500/1000."  Ah hah! It will be interesting to see how prices compare when some competitors to XFX come along....
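As a back-of-the-envelope check on that quoted clock difference, the raw memory bandwidth gap works out to about 10% (this sketch assumes the 128-bit memory bus of the reference 6600GT design; the "900MHz"/"1GHz" figures are effective DDR data rates):

```python
# Theoretical peak memory bandwidth at the two quoted memory clocks.
# Assumes the 128-bit bus of the reference 6600GT design.

def bandwidth_gbps(effective_clock_mhz, bus_width_bits=128):
    """Peak memory bandwidth in GB/s for a given effective clock."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

agp = bandwidth_gbps(900)     # AGP version: 14.4 GB/s
pcie = bandwidth_gbps(1000)   # PCIe version: 16.0 GB/s
deficit = (pcie - agp) / pcie # AGP has ~10% less raw bandwidth

print(f"AGP: {agp:.1f} GB/s, PCIe: {pcie:.1f} GB/s, deficit: {deficit:.0%}")
```

A 10% raw bandwidth deficit showing up as only a 0-5% frame-rate difference is plausible, since games are rarely pure memory-bandwidth tests.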

Similar Messages

  • Any performance difference between these two 6600gt agp cards?

    Are there any performance or spec differences between the two 6600GT AGP cards, the NX6600GT-VTD128 and the NX6600GT-VTD128SP?
    The only difference I see is that one comes with Riddick and one with XIII. If there's no difference, which game is better? :p
    Also, are there any big known bugs with this card? I'm currently running an AMD XP2800 on an MSI K7N2 mobo.

    Pipo,
    The difference is in the memory speed: 900MHz vs. 1000MHz on the SP.
    Take Care,
    Richard

  • MSI KT6 and new MSI 6600GT AGP will not boot up!

      My Girlfriend's computer.
    I put this new video card in today, an MSI 6600GT AGP. For the 2nd time.
    It freezes up at startup on this screen....
    "GO TO SET UP ...Press F11, F10, Tab: LOGO Checking NVRAM..."
    I RMA'd the 1st one. Now the 2nd 6600GT card does the same thing: freezing up too.
    This time I put it in my PC, an MSI K8T Neo with an Athlon 64 3000+ (Socket 754).
    It worked great!!!
    The 6600GT power cord is hooked up.
    So it must be the motherboard?
    The minimum system requirements on the 6600GT card's MSI box
    DON'T SAY you might need to flash your BIOS!!!!
    I've never flashed a BIOS. Too SCARY!!!
    MSI KT6 Delta-FISR ....5.10 bios
    AMD Barton 2500XP
    512 2700 333
    True Blue 430 PS
    CD burner
    floppy
    40 gig
    I love MSI. I've built 15 PCs, ALL with MSI motherboards!
    Thanks
    G

    Hello!
    Don't think you have to flash the BIOS. Nor do I think there is a problem with the mobo.
    Two main reasons for boot problems are:
    A PSU without enough grunt - please say something about yours.
    Memory timings that are too tight - don't use any special or ultra settings.
    The power cable not attached to the graphics card - I don't know for certain that it is needed, but I think so.
    Please note that a weak PSU might not reveal itself until the power cable is connected.
    About flashing the BIOS: if you use the LiveUpdate program that came on the MSI install CD, it is no more trouble than upgrading any driver. But please make sure you get the right BIOS, and that it solves the issue you have.

  • K7N2G-ILSR COMPATIBILITY WITH NEW HARD DRIVES/VIDEO AGP CARDS?

    Will the K7N2G-ILSR work with the new SATA WD 74GB Raptors? Also, will this board support the 300GB DiamondMax or the Hitachi 400GB PATA or SATA drives? I would also like to know if anyone has used a 9800 AIW or the new X800 XT AIW on this board. I would appreciate the input before I purchase. Thanks in advance.
    K7N2G-ILSR
    KINGSTON 512 MEG RAM IN DUAL MODE
    2500+ BARTON CORE
    ONBOARD SOUND
    ONBOARD GFX
    ENERMAX 300 WATT PS
    TDK BURNER
    WIN XP PRO SP2

    Sorry guys, I didn't explain that I am a "techie" (my fault). Yes, the power supply is a no-brainer for the AIW alone; a Cooler Master 500 watt is already on the list, as well as Plextor's newest dual-layer DVD burner, an upgrade to 1GB of RAM, and a Barton-core 333FSB 3000+ XP. I am just looking for actual "hands on" testing of the hardware on this board. I have already done the research on the specs and what should work, but for someone to have experienced it first hand and report their findings would be a great plus. And yes, I have the most current BIOS from MSI (2.0).

  • HELP ! System freezes now with new MSI GeForce 6600GT AGP

    Any/All help is greatly appreciated... this is driving me crazy.
    Just installed a new MSI 6600GT AGP card. Windows/apps work fine, but the system keeps freezing (disabling the keyboard as well) when running certain (3D?) games like Medal of Honor, Prince of Persia, and Splinter Cell. It also freezes on the 3DMark01SE benchmark. What's strange is that 3DMark2K5 ran fine and returned a respectable score (3078). I've tried uninstalling/re-installing the latest MSI drivers (71.22), NVIDIA's 66.93, and removing NVIDIA's GART drivers. I've also disconnected other hardware to alleviate PSU constraints. I've reseated the 6600GT card and the Molex power connector on the card. My system specs are below:
    Motherboard = ASUS Focus A7N8X-LA, BIOS=American Megatrend 3.07, Chipset = nVidia nForce2
    MSI GeForce 6600GT 128 AGP
    1Gb Ram
    200Gb Maxtor  8mb buffer HD
    ASUS E616P1  DVD-ROM
    RICOH  MP5240A  DVD+RW
    Enermax EG375P-VE SFMA PSU  (has separate 12V rails)
    Any suggestions as to what else to try?  Getting ready to RMA this card.

    Quote from: andysue on 23-February-05, 07:34:53
    Thanks for that Crull.
    I have just found this on a site it may help someone, I hope to get my replacement tomorrow I shall let you know how it goes.
    Question
    I just got a K8 series MB and your A6600GT 128MB VGA. Every time I quit a 3D game,
    the system gives me a blue screen and then restarts. I have already updated the VGA driver and the MB
    driver, but no luck. What can I do?
    Answer
    If your system is WinXP + SP2, we suggest you disable the DEP function.
    Please follow the steps below to disable DEP:
    1. First, select "Advanced" under "System Properties".
    2. Choose "Settings" under "Startup and Recovery".
    3. Click "Edit"; you can then edit the "boot.ini" file.
    4. Change "/NoExecute=OptIn" to "/execute".
    5. Save the boot.ini file and restart the system.
    6. After rebooting, the DEP function will be disabled.
    For other information, please see the links below:
    http://support.microsoft.com/kb/875352/en-us
    http://www.sysinternals.com/ntw2k/info/bootini.shtml
    http://www.microsoft.com/technet/prodtechnol/winxppro/maintain/sp2mempr.mspx
    • /NOEXECUTE=OPTIN Enables DEP for core system images and those specified in the DEP
    configuration dialog.
    • /NOEXECUTE=OPTOUT Enables DEP for all images except those specified in the DEP
    configuration dialog.
    • /NOEXECUTE=ALWAYSON Enables DEP on all images.
    • /NOEXECUTE=ALWAYSOFF Disables DEP.
    Configuration descriptions:
    • OptIn (the default configuration): On systems with processors capable of hardware-enforced DEP, DEP is enabled by default for limited system binaries and applications that "opt in." With this option, only Windows system binaries are covered by DEP by default.
    • OptOut: DEP is enabled by default for all processes. Users can manually create a list of specific applications which do not have DEP applied, using System in Control Panel. IT pros and Independent Software Vendors (ISVs) can use the Application Compatibility Toolkit to opt one or more applications out of DEP protection. System Compatibility Fixes ("shims") for DEP do take effect.
    • AlwaysOn: This provides full DEP coverage for the entire system. All processes always run with DEP applied. The exceptions list for exempting specific applications from DEP protection is not available. System Compatibility Fixes ("shims") for DEP do not take effect. Applications which have been opted out using the Application Compatibility Toolkit run with DEP applied.
    • AlwaysOff: This does not provide any DEP coverage for any part of the system, regardless of hardware DEP support. The processor does not run in PAE mode unless the /PAE option is present in the boot entry.
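    For reference, the edit described in step 4 above amounts to changing one switch on the OS load line of boot.ini; a minimal before/after sketch (the partition path is illustrative and will differ per machine):

```ini
; Before (DEP opt-in, the XP SP2 default):
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /NoExecute=OptIn

; After (DEP disabled):
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /execute
```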
    Yes let us know how it goes. Hopefully well.

  • Will the ATI Radeon 7000 AGP card run two Studio 21 Displays?

    I just bought a G4 MDD 1.4 GHZ Mac that has what I assume was the stock video card, which appears to be for LCD displays, since the cables from my Apple 21" Studio displays (CRTs) don't match up to it (those are called VGA monitors, right?).
    On my present Mac (G4 Quicksilver) I am running both of these 21" Studio Displays, one plugged into the stock AGP card and the other connected to a PCI video card. As I prepare to switch over to my "new" MDD Mac now, I need to figure out how best to run the two monitors with this G4. (I will leave the video cards in the old Quicksilver and give it to a relative who also wants to run two monitors with it).
    I notice a new Radeon 7000 AGP card being offered on Ebay which has one VGA port, and they provide an adapter so that the second (DVI) port can also run a VGA monitor. If that card would run both my 21" Studio Displays independently as claimed, it would be the ideal solution, if the performance is adequate.
    I do lots of Photoshop(CS2)adjustment of digital camera images and also make video movies with Final Cut Express, iDVD, etc., with both monitors set to 1152 X 870 at 75 Hertz. Would such a card as this one be up to the task with both monitors?
    Every time I try to post the link to the Ebay auction for this card, everything in this text box disappears. So I'll just copy and paste the description of the card from the auction: "ATI Radeon 7000 64MB DDR Dual Monitor capable AGP Video Card with VGA, DVI-I, and S-Video Out and a DVI to VGA adapter. The video card has ATI's latest firmware (v226) with the latest ATI Displays support. It does not have any sleep issues. The card is capable of dual true independent/simultaneous monitor support with extended desktop. The adapter allows for dual VGA operation. This model has been tested in a G4 Tower, G4 Cube, and G5 with several versions of OS X including OS 10.4/Tiger. This card is guaranteed to work in any Macintosh G4 Tower, G4 Cube, or G5 Tower with OS 9.2.2 or later that has an available AGP slot."
    Sounds good---maybe too good to be true. Any drawbacks to such a setup as this?
    Any advice most appreciated.
    Tom
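    As a rough sanity check on the memory side of Tom's question: two desktops at 1152 x 870 need only a few megabytes of the card's 64MB (this sketch assumes 32-bit color and ignores any 3D/acceleration buffers, which need more):

```python
# Rough framebuffer memory check for two displays on a 64MB card.
# Assumes 32-bit color (4 bytes per pixel).

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Memory for one desktop framebuffer, in MiB."""
    return width * height * bytes_per_pixel / 2**20

per_display = framebuffer_mb(1152, 870)  # ~3.8 MiB each
total = 2 * per_display                  # ~7.6 MiB for both displays

print(f"{total:.1f} MiB of 64 MiB used for desktop framebuffers")
```

    So VRAM is not the constraint for 2D desktop work at that resolution; the practical questions are analog output quality and driver support for true independent dual displays.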

    Thanks for the info, Malcolm. I checked out those links you provided, and it appears that buying two adapters for the presently-installed video card, which would allow it to run two VGA monitors, looks like a workable solution, all right. But it would be more expensive than just buying that Radeon AGP card to run both monitors, if the latter would work (that's what I'm still wondering).
    Also, I notice that the DVI to VGA adapter sold by Apple has some poor reviews and seems to have glitches. It is reported by several people to be unreliable and seems to turn a lot of displays magenta, and in other cases requires a lot of wiggling or careful positioning to make it work right. Sounds like Apple needs to work on the design.
    I appreciate the info, but I'm still favoring the Radeon 7000 AGP card if someone can tell me it might work.
    As to which card my Mac has in it right now, I can't tell just by looking at it (why don't they put an identifying sticker or something on the things?) and I can't connect any of my VGA displays to it yet to let the Mac report what it has in it, so it's still a mystery.
    Thanks again,
    Tom

  • MSI 6600GT AGP not working on Epox mobo?

    I have this mobo (http://www.epox.nl/products/view.php?product_id=420).
    I bought an MSI 6600GT AGP card and installed it in this PC. When I tried to boot up the PC, all the fans, including the card's, were spinning.
    But nothing else seemed to get power; the DVD drives and hard drives didn't get any power.
    Now somebody told me that the card needs 0.8 volts and that my mobo only supports 1.5 volts.
    I'm not sure if this is true, so that's why I'm posting here, and I hope someone can help me out with this.
    Thanks in advance

    Quote from: polokus on 20-November-05, 06:18:08
    My old one is a 5600 FX and that one works.
    I changed the power supply to a 400 watt because I thought that was the problem; no change.
    We plugged the card into another machine and there it boots. Someone told me that my AGP slot is not giving the right amount of voltage.
    greetz
    It may very well still be the power supply. Saying you're using a 400W power supply means absolutely nothing these days: Power Supply Guide
    And, as Overvolt suggested, it might be a good idea to look into the BIOS for your motherboard....
    So after you read the link I gave, let's have the specs on that power supply you tried, and what was in the other machine?

  • I modded my MSI 6600GT AGP

    I took my MSI 6600GT AGP all apart to mod it. I thought I would share what I found out in case anyone wants to mod theirs also.
    I didn't want to flash the bios for the temperature, so I attached this probe to the GPU.
    http://www.bestbyteinc.com/prodinfo.asp?number=SEN-CNU-001
    The bridge chip uses a thin foil thermal pad, which I replaced with Arctic Silver Ceramique. The GPU heatsink uses what I think is a thick thermal compound; it didn't come off like a pad. It might also have some silver content, because of its grayish color. There are four approx. 1/16" white thermal pads for the memory.
    I don't think either of the heatsinks is copper: when I removed the heatsink I could see into the threads where the support arms screw in, and it wasn't the same color as copper. I think they are copper-colored anodized aluminum, or maybe even copper-covered aluminum if that is even possible. My best guess is they are not copper; the color is slightly wrong to be true copper.
    To take the main heatsink off.
    You need to press down slightly on one side of the back support bar and swing it off of the post. If you look at the support bar, both ends are different. One side has little metal nubs that need to clear the post to swing it off. The other end is shaped like a C and just rides in the groove on the other post.
    After you take off that bar, there is another flat bar that has a pad in the middle to protect the components. That bar just lifts straight up and off the posts. Remember which way you took it off; it is very important. The components make little dents in the pad, so putting it back on the exact same way fits the components into the same dents they made when the bar was first installed.
    Once you have both support bars removed, you need a flat screwdriver to unscrew both posts from the heatsink. At this point I don't believe the heatsink is attached to the board anymore, even with the posts still attached. Once you take off the posts, flip the card over, holding the heatsink as you do so. I gently twisted the heatsink and it came right off. You need to unplug the fan at this point.
    I took four copper slugs around the size of a penny and thinned them down to match the height of the pads. The pads were removed to be replaced by the copper slugs. It took a very long time to do this. I would rest one on the memory, then use a small straight-edge ruler to test the height against the GPU. The copper slugs have to be at the exact same height as, or slightly lower than, the GPU, or the heatsink will not make proper contact with the GPU.
    What I found by doing this is that, without the support bar on the video card, the GPU is not on a flat plane with the memory chips, so thinning the slugs down to the right height was hard to do.
    Once I got the slugs down to the right height, I used Arctic Silver thermal epoxy to glue each one to the memory. After that I used the Arctic Silver Ceramique on the GPU and on the copper slugs. Then I attached the thermal probe, with a little dab of Arctic Silver Ceramique on the tip, to contact the side of the GPU. I used double-sided thermal tape to stick the probe to the board and to the area on the side of the GPU.
    I then re-attached the heatsink in the reverse order I removed it. I checked under the heatsink with a light and everything looked like it was making good contact.
    My idle temp with the probe is around 34C, stressing the card with benchmarks the highest I have seen it is around 55C.
    The copper slugs didn't really gain me all that much when overclocking the memory, but that could just be the memory reaching its limits. The copper slugs will probably help the memory last a little longer than the pads would. I wouldn't recommend the slugs to anyone, though. It is just too much work and a lot of risk, and it doesn't really gain you anything. 

    Quote from: akeer on 16-February-05, 18:17:49
    fw off, agp set to 4x.... many times tried this with various drivers (also setting the 2d and 3d speed to the same and many other trix...)
    also tried to unplug devices... no help.... as I said if I help the cooling with really biiiiiiig fan underneath the card, it will not freeze... but 12cm fan at full speed generates looooooots of noise ))
    it seems that aftermarket cooling (e.g. Zalman vf700cu) is the only solution
    thanx and have a nice day
    Here is something I just thought of. There are a lot of components on the board. There is a possibility that one of them is not working very well when it heats up.
    The GPU could also be overheating, but you have to understand something here.
    The 6600GT has automatic core slowdown if it gets too hot, so how could you have damaged it? If you didn't put on enough thermal compound, or the heatsink wasn't tight enough, the core would slow down to protect itself from heat damage.
    If it was damaged by heat, going by everything I know so far, it would probably only have been damaged in the first few seconds right after you turn the computer on, and even then that would probably only happen if there was no heatsink attached at all.
    Another thing to consider: I have an idle temperature of around 34C using the probe; the internal temperature would be slightly higher, so let's say it's around 40C. The highest load temperature I have seen has been around 55C with the probe, so let's say it's around 60C inside.
    Those temperatures using the stock heatsink are very good. So what I am saying is, even if you get a new heatsink, it might lower the temperature a bit on the card... but I doubt it will help with any heat-related problems you're describing.
    If that stock heatsink is on correctly, it should be more than enough to keep the card cool.
    Maybe you should take the chance and flash the BIOS so you can see the actual internal temperature of the card. At this point it really doesn't matter, because you can't return it anyway, but there is always a slight risk in flashing a BIOS.
    There is a forum here that will explain flashing the bios if you decide to do it.
    http://www.mvktech.net

  • MSI NX6600 AGP Card & MSI 7008, VIA PT880 Motherboard Compatible?

    Hello all!
    Need a little help today.
    I have an MSI NX6600 AGP card & an MSI 7008, VIA PT880 motherboard that are giving me some headaches. If I try any NVIDIA driver above version 71.25, the graphics (2D and 3D) are very badly distorted. The desktop screen is unreadable, and any software I try to run looks warped and garbled. I am using an Antec case with a 350W Antec PSU. Both the motherboard and video card are from MSI. This is the second video card I've tried; I RMA'd the first one. If I use an older AGP card (Ti4200), all graphics display perfectly. Is there something I can do to fix the problem? Is there a compatibility problem between the VIA chip and the NVIDIA 6600 chip? 
    I'd appreciate a push in the right direction!
    Thanks!

    Hi!
    There seem to be some problems with certain setups, see: http://forums.nvidia.com/index.php?showtopic=3612
    I had these 2D issues with a 6600GT and RMA'd the card; I hope I'll get a newer revision... none of the fixes helped.
    AMD Athlon XP 2800+
    MSI KT6V-LSR
    1024MB DDR400
    XPSP2

  • Dual ADC AGP card supplier?

    Hi,
    I'm trying to assist a user who is looking to replace a dead dual-ADC graphics card (RADEON blah blah) in one of the first PowerMac G5s. Can someone recommend a suitable dual-ADC card and a UK supplier? [Cheers]
    I seem to recall that you have to be careful when sourcing AGP cards for Macs, i.e. you need to get the 'Mac Edition' in the case of ATI, but I'm not sure about Nvidia etc. Am I barking up the wrong tree in thinking that the cards in question have extra pins (for power)?
    TIA
    PowerBook G4   Mac OS X (10.4)  

    No graphics board has more than one ADC port. There is an adapter to use an ADC display with a DVI port.
    <http://store.apple.com/Apple/WebObjects/ukstore.woa/wa/RSLID?mco=CC1BA768&nplm=M8661>
    You will need a Mac G5 version of whatever board you get. The only new ADC G5 board being made is the
    ATI Radeon® X800 XT MAC Edition RoHS 256MB AGP
    <http://shop.ati.com/product.asp?sku=3170810>
    There are some other boards available from Apple as spare parts from
    <http://www.welovemacs.com/g5video.html>
    <http://www.welovemacs.com/agpg5vica.html>
    Not all have ADC ports, but those without can be used with the DVI to ADC adapter.

  • Can a PC ATI X1600 XT AGP card work with my MDD..

    Can a PC ATI X1600 XT AGP card work with my MDD?

    I have a friend who made the mistake of buying a (new) non-Mac-edition ATI card. Not the card you've specified; this was a few years ago, but everything worked for him with OS X. His biggest issue was the loss of the ADC connection on the card, getting DVI connectors only instead. That means a trip to Apple to get an ADC-DVI converter; mine was about £100 for my second display. My friend paid slightly less. At the end of the day, it would have been cheaper for him to pay the £30 extra for the Mac edition.

  • G5 Still Holding up - AGP Card NOT!

    I was recently relieved, after bringing my G5 (June 2004?) in for repair, to learn that it's not the motherboard or the hard drive but the ATI Radeon 9800 Pro 256MB video card (I use this machine primarily for video editing, motion graphics, etc.). After hearing the crushing news that my Mac is "vintage" and my local store can't offer any cards because they don't make them anymore, I scoured the interwebs. I found some on eBay and a few other sites.
    A few questions: I'm seeing the exact card I need on sites like eBay, but most of them are NEW CONDITION, bought in bulk from ATI and not in retail boxes. Has anyone purchased a card this way? It's the first time I've had to do this, so I'm a little hesitant, but since I have an older Mac I know my options are limited.
    Also, since I have to get a new card, what are my options with regard to an upgraded card beyond the ATI card I'm currently replacing? I'm seeing ATI Radeon 9800 X8 SE cards (X8 referring to the AGP slot?) and 9800 XT cards. Since these are all outdated cards, what are my options here?
    Thanks all,
    William

    Hi William,
    See japamac's posts here on the best AGP cards for G5s...
    http://discussions.apple.com/thread.jspa?messageID=10460940&#10460940
    http://discussions.apple.com/thread.jspa?messageID=10319750&#10319750
    http://discussions.apple.com/thread.jspa?messageID=11182739#111
    http://www.jcsenterprises.com/Japamacs_Page/Blog/4B4B7BA2-7ABB-47F1-87AC-B03D37942BEE.html
    I suspect he'll drop in later also with more details if needed.

  • Overclocking my PT880 got best results with a PCI video card instead of a AGP card. Why?

    Hi guys,
    I posted a complete review of my PT880 OC experience in the Overclocking section of the forum (thread: "PT880 OC results"): https://forum-en.msi.com/index.php?threadid=48006&sid=&threadview=0&hilight=&hilightuser=0&page=2
    After some days of testing with my new PT880 board, I was able to run my P4 (B?) 2.4GHz at 18 x 154 = 2772MHz. I got this speed using a cheap PCI card, an XFX GeForce4 MX440 from my old P3 computer. I also got the system to work with higher FSBs, but with some unstable sound card issues that I related in the link above.
    This week I decided to change the card to another equally cheap XFX GeForce4 MX440, but now the AGP 8x version. To my surprise, the system started to hang after just 20~30 seconds of use. Setting the front side bus to 133 fixed the stability issue. Playing a bit more, I could OC using a 144 FSB, getting a final speed of 2592MHz. Trying higher FSBs resulted in an unstable system again.
    This makes me wonder if the AGP/PCI clock is really locked. At 144 FSB, an unlocked AGP port would supposedly be running at 72MHz, which is the limit for some AGP cards... My BIOS setup is using the AGP clock option at the default of 66MHz. Is it really working at this speed?  
    If the AGP/PCI bus speed is really locked on this VIA PT880 MSI motherboard, why this difference in OC results when I change from a PCI to an AGP card (same manufacturer, same NVIDIA chipset)? Could the northbridge be more loaded now?  
    My PC setup now:
    - MSI PT880 LSR with Bios upgraded to 1.4
    - Pentium 4(B? 533Mhz) 2.4Ghz
    - 512Mb DDR PC3200 NonECC ADATA Memory (just one module, so just single channel)
    - XFX GeForce4 MX440 AGP8x
    - Seagate 120Gb 7200RPM ATA100
    - Seagate 80Gb 7200RPM ATA100
    - Prolink PlayTV HD (conexant cx88)
    - Sound Blaster Live! 4.1
    - LG DVD-RW 4040b
    - 450W generic PSU
    That's all for now  
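    The worry about an unlocked AGP clock can be made concrete. A small sketch of the arithmetic (the divide-by-2 ratio is the usual one for a 133MHz-FSB setup, but whether this board really derives the AGP clock from the FSB is exactly the open question):

```python
# If the AGP clock tracks the FSB with a fixed divider instead of being
# locked at 66MHz, raising the FSB overclocks the AGP bus as well.
# The divider value here is an assumption for illustration.

AGP_SPEC_MHZ = 66.0

def agp_clock(fsb_mhz, divider=2):
    """AGP clock in MHz if it is derived from the FSB (i.e. unlocked)."""
    return fsb_mhz / divider

for fsb in (133, 144, 154):
    clk = agp_clock(fsb)
    pct_over = (clk / AGP_SPEC_MHZ - 1) * 100
    print(f"FSB {fsb}MHz -> AGP {clk:.1f}MHz ({pct_over:+.1f}% over spec)")
```

    At 144 FSB an unlocked bus would indeed sit at 72MHz, roughly 9% over spec, which matches the poster's figure and could plausibly destabilize a marginal AGP card.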

    Your AGP bus uses more bandwidth than the PCI bus, which makes the overclock more dependent on the AGP bus. Fragile as AGP can be, it will not tolerate the errors that come with higher clock speeds: more MHz means less stability. Note: PCI runs at 33MHz while AGP runs at 66/133MHz, so there is less margin for error. While you overclock, the CPU will make tolerable errors: no problem on PCI, but AGP will be pushed to the max. [I hope this reply is no nonsense]

  • MSI 6600GT AGP + Win XP

    Hey guys,
    Just bought an MSI 6600GT AGP and it is awesome...
    I'm having a bit of trouble with the D.O.T. software.
    It seems that I have enabled it: I try to change the core and memory speeds and
    save the settings. I log out of XP and log back in to check if the settings have saved,
    and they have. But every time I restart the computer, it just goes back to the
    default settings. Is this normal?
    The other thing is, if I use o/c tools like Coolbits, can I still enable D.O.T. so that my
    card is not overclocked when I'm on my desktop surfing?
    Thanks in advance,
    Pooki3

    oops..
    sorry, I didn't make myself clear. I also unticked the D.O.T. feature and bumped the
    core and memory, saved it, and restarted the computer again...
    It just goes back to defaults again...
    Anyone got the same issue?
    Pookie

  • MSI 6600GT AGP running HOT!

    Hi guys,
    I did the "enable temp monitoring" BIOS mod on my MSI 6600GT AGP video card and it worked fine. But now I'm a little bit concerned about the high temp readings I get. Idle temps are around ~55°C, and ~81°C while playing games. Look at the attached pic.
    Is this too high? What are your MSI 6600GT AGP temps?
    BTW: The system is rock stable, no crashes and no visual artifacts. I've used RivaTuner to log the temps: http://www.guru3d.com/rivatuner/

    Here's my max temp for the MSI NX6600GT AGP.
    Yours seems a little high. Maybe you need more ventilation. I removed my HSF and applied Arctic Silver 5. I also removed the thermal pads from the mem chips; they caused the HSF to not sit flat on the GPU.
