MSI NX6600GT-TD128E GeForce 6600GT 128MB other models

I've heard a lot about this VGA and it's supposed to be good, but there are different models at different places with the same name. If you follow the buy-now link for this card it takes you to Zoom as the cheapest, but if you go to Newegg the models are different, so does anybody know what the deal is? I have also heard that there are some heating problems with the older model; does anybody know anything about that?

Try this link for the review: http://www.nvnews.net/reviews/msi_geforce_6600gt_sli/index.shtml, or Google it for overheating problems. DOK

Similar Messages

  • MSI NX6600GT-TD128E SLI Help

    Hi all
    I have two Problems.
    1. I can't overclock my MSI NX6600GT cards in SLI.
    2. If I run a fast game like Colin McRae Rally 04 or Colin McRae Rally 05, after 3-5 minutes of playing the PC screen splits in two and the bottom half of the screen goes all funny, and then the PC locks up.
    Any help?
    PS: I'm not getting a very good score in AquaMark3, only 57,000, but I was getting 63,000.
    AMD Winchester 939 3200+
    1GB of Kingston PC3200 (512M modules)
    Asus A8N-SLI
    MSI NX6600GT-TD128E x 2 (PCI-Express)
    600W power supply
    1x LED fan
    2x exhaust fans

    If the game is using a profile that uses SFR, the screen is split in two horizontally; please see below.
    Single frame rendering
    Single frame rendering (SFR) mode in NVIDIA SLI works similarly to Metabyte’s PGC in the sense that the graphics card splits the workload horizontally across the screen. One card takes the upper portion while the second card takes the lower segment. The frame buffer data is then combined and sent to the monitor.
    It’s important to note that SFR doesn’t necessarily split the screen directly down the middle; in some scenes the lower portion of the screen may be more complex than the upper portion, or vice versa. NVIDIA has developed custom load-balancing algorithms that are designed to take this into account, and split the screen appropriately – if one GPU takes longer to render, no problem, the driver just gives that GPU less work to do.
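    (A rough Python sketch of the load-balancing idea described above; the split fraction, per-frame render times, and adjustment step are invented for illustration and are not NVIDIA's actual algorithm.)

    def rebalance_split(split, t_top, t_bottom, step=0.02):
        """Nudge the SFR split line toward the GPU that finished faster.

        split    -- fraction of the screen height given to GPU 0 (top half)
        t_top    -- time GPU 0 took to render its portion last frame
        t_bottom -- time GPU 1 took to render its portion last frame
        """
        if t_top > t_bottom:
            split -= step          # top GPU was slower: give it fewer lines
        elif t_bottom > t_top:
            split += step          # bottom GPU was slower: give it fewer lines
        return min(max(split, 0.1), 0.9)   # keep the split within sane bounds

    # Example: the top half of the scene is more complex, so the split drifts upward.
    split = 0.5
    for t_top, t_bottom in [(9.0, 5.0), (8.2, 5.6), (7.1, 6.3)]:
        split = rebalance_split(split, t_top, t_bottom)
        print(f"top GPU renders {split:.0%} of the screen")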
    Alternate frame rendering
    Up to now, ATI's Rage Fury MAXX was arguably the best-known card to take advantage of alternate frame rendering. Rather than splitting the work up within every frame as in SLI, with AFR each graphics core handles alternate frames: graphics core one handles everything in frame 1, while graphics core two handles frame 2. Each chip renders every other frame instead of alternating lines in the same frame, as 3dfx had done with scan-line interleave. If one GPU doesn't finish drawing its frame, the remainder is pushed to the second GPU. This is the same concept NVIDIA has employed for its SLI technology, only we're dealing with two distinct graphics cards rather than two graphics cores on the same card.
    The most well known downside with AFR has been perceived “lag” that may be felt by some twitch players in very fast-paced first-person shooters such as Quake 1 or Quake 3. The argument is that user inputs, such as a quick flick of your wrist to nail your opponent in mid-air with your rail gun may feel lagged. In theory, this could happen when the second AFR scenario we described occurs – GPU1 doesn’t finish its frame and the remainder is passed to GPU2 – there could be a lag between the key being pressed and the output being shown on the screen. Ironically enough, NVIDIA used this same argument to tout their GeForce 256 over Rage Fury MAXX five years ago.
    We played a little Half-Life 2 (which uses AFR by default) to test out this theory, but didn’t perceive any additional lag during our “testing” sessions. Buffering kicks in when data is split between GPUs, and it appears to work well. Also keep in mind that today’s shooters aren’t as fast-paced as they were a few years ago.
    Currently, AFR is NVIDIA's preferred mode for SLI. In general, AFR incurs less communication overhead (there is essentially no inter-GPU traffic as long as each frame is contained to a single GPU), allowing it to scale better than SFR. NVIDIA also says that applications with heavy vertex loads benefit less from SFR.
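    (And a minimal sketch of how AFR hands frames out, assuming a simple two-GPU round-robin; the GPU count and frame loop are hypothetical, just to show why almost no inter-GPU communication is needed.)

    NUM_GPUS = 2

    def gpu_for_frame(frame_index):
        # Alternate frame rendering: even frames go to GPU 0, odd frames to GPU 1.
        # Each frame is self-contained, so the GPUs barely need to talk to each
        # other; only the finished images are handed back for display.
        return frame_index % NUM_GPUS

    for frame in range(6):
        print(f"frame {frame} -> GPU {gpu_for_frame(frame)}")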

  • MSI RX800-TD128E PCI Express 128MB Two Versions?

    Hello,
    I just bought a MSI RX800-TD128E with DDR on zipzoom: http://www.zipzoomfly.com/jsp/ProductDetail.jsp?ProductCode=321949
    I found this review with some help for a similar card: http://www.digit-life.com/articles2/video/x800-2.html
    I looked on newegg and they had the same card with DDR3 see: http://www.newegg.com/Product/Product.asp?Item=N82E16814127176
    Could anyone help with more information? I bought it because of the price, and this card should be slightly better than a 6600GT.

    Thanks for the response.
    The article states that some cards are being made with DDR1, not DDR3. It appears that when you go to the two MSI websites there are two versions of this card.
    I can't believe that MSI would list the wrong memory. Go to ZipZoomFly and click through to the manufacturer's website.
    I'm just trying to figure out if this is true and whether anyone has more links about the X800.

  • MSI NX6600GT-TD128E

    Hi,
    Will this graphics card work with a 300-watt power supply?
    My system is:
    CPU: Intel Pentium 4E, 3000 MHz, Socket LGA775
    Power supply: 300 Watt
    Motherboard: Gigabyte GA-8I915G-MF
    Motherboard chipset: Intel Grantsdale-G i915G
    System memory: 480 MB
    Graphics card: Intel(R) 82915G/GV/910GL Express Chipset Family (128 MB)
    3D accelerator: Intel GMA 900
    Monitor: Acer AL1721 [17" LCD]
    Hard drive: Seagate Barracuda 7200.7, S-ATA NCQ, 7,200 RPM, 8MB cache, 160GB

    The power supply in my computer is a Huntley 6300HP, 300 Watt, with a maximum load of 15A at 12V.
    This MSI NX6600GT-TD128E will not be used for overclocking! I tried the suggested wattage calculator and ended up with around 300 Watt, although 6600 cards are not listed!
    Maybe I should stick with the MSI PCX5750-TD128E, as I'm not a game freak and just need a better card in my system than Intel's built-in graphics.
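    (For what it's worth, a quick back-of-the-envelope check of that 300 W unit's 12 V rail, sketched in Python; the card, CPU, and drive draw figures are rough guesses, not measured values.)

    RAIL_12V_AMPS = 15                 # from the PSU label: max 15 A at 12 V
    rail_watts = RAIL_12V_AMPS * 12    # = 180 W available on the 12 V rail

    # Very rough 12 V draw estimates (assumed, not measured):
    estimated_draw = {
        "6600GT":      50,   # figure commonly quoted for the card under load
        "Pentium 4E":  90,   # 3.0 GHz Prescott under load
        "drives/fans": 25,
    }

    used = sum(estimated_draw.values())
    print(f"12 V rail: {used} W of {rail_watts} W used, {rail_watts - used} W headroom")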

  • [nVidia] MSI NX6600GT TD128E vGPU Mod

    I finally found a pic of the volt mod that I did to my MSI 6600GT card.
    Replacing the standard cooling system with something more efficient, for example with a water-cooling solution, and pulling up the voltages are the traditional means of increasing the overclockability of a graphics card.
    The graphics card in question has GDDR3 memory chips from Samsung, known for their low overclockability, which doesn't improve much even after a voltage increase. So I didn't tamper with the memory voltage on the reference card at all; it wasn't worth the trouble.
    The overclockability of the graphics processor is another matter, as we deal with NVIDIA’s first GPU made with the 0.11-micron tech process. The thinner tech process and the smaller transistor count in comparison to the GeForce 6800 series should render this chip capable of working at higher clock rates.
    Let's check it out. The core voltage regulator is based on the ISL6534 chip from Intersil. It is a dual-channel pulse-width controller capable of also driving a linear regulator. One of the channels supplies power to the GPU. Unlike the regulators on GeForce FX 5950 Ultra or FX 5900 Ultra cards, which have digital inputs for setting the output voltage level and are capable of adjusting the output voltage "on the fly", this chip uses a resistor divider to set the output voltage.
    Curiously enough, the NVIDIA engineers made this regulator change the output voltage "on the fly", too. Receiving control signals from the GPU's registers, two transistors attach resistors with preset resistances to the divider, thus adjusting the voltage value. Voltage regulators on NVIDIA GeForce 5900 XT cards employ the same idea, by the way.
    In order to lift the voltage of the graphics processor, you only need to reduce the resistance of one of the divider's resistors. That's exactly what I did:
    You can see the controller chip of the voltage regulator in the top left corner of the snapshot, while the output voltage control circuitry and the two resistors of the divider are in the center. The arrows point to the spots I soldered an additional variable resistor to. If you want to test the vmod before you make it permanent, just scribble a little #2 pencil lead (graphite) over the resistor. This will lower the resistance and basically do the same thing that the potentiometer is doing. However, it's not nearly as easy to control the voltage that way.
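    (To get a feel for why shaving resistance off the lower divider resistor raises vGPU, here is the standard feedback-divider relation sketched in Python; the reference voltage and resistor values are made-up examples, not the actual figures for the ISL6534 circuit on this card.)

    V_REF = 0.8   # feedback reference voltage (example value only)

    def vout(r_top, r_bottom):
        # Classic feedback divider: Vout = Vref * (1 + R_top / R_bottom).
        # Lowering R_bottom (what the pencil/variable-resistor mod does) makes
        # the divider report a smaller fraction of Vout, so the regulator
        # compensates by raising the real output voltage.
        return V_REF * (1 + r_top / r_bottom)

    print(vout(1000, 1600))   # stock:  ~1.30 V
    print(vout(1000, 1300))   # modded: ~1.42 V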
    This pic shows where you check the core voltage:
    By doing this, I was able to increase my core speed up to 680 MHz!

    To do the pencil mod, scribble over the resistor that has the two wires added in the first picture. The second picture is where you can measure your vgpu voltage.
    Edit: The two pictures are actually of two different cards, as I couldn't find a vMod for an MSI 6600GT anywhere on the 'net. That's why one pic is green and one is red. But you are correct: the pencil scribble goes on the resistor between the two arrows.

  • 3D Experience and MSI NX6600GT TD128E / K8N SLI Platinum

    I just finished installing all the drivers, updates, etc. for my new system.
    I installed the 3D Turbo Experience program that came on the CD with my graphics card. The program runs at startup, but when I go to open the main window I get the following error:
    MSIVGA.ocx initialization error
    How can I fix this and get it to work?

    Read the stickies; there's a thread "Alternative to MSI drivers and Overclocking too" with a link to the DOT page at MSI. It's pretty straightforward.
    MSI's DOT is a dynamic overclocking utility that takes most of the guesswork out of it and is supposedly safer than manual overclocking.

  • MSI NX6600GT RIP

    MSI NX6600GT-VTD128 GeForce 6600GT 128MB 128-bit GDDR3 AGP
    Purchased 1/21/2005 4:14:00 PM
    Last night, 3/25/2008, it died. My daughter was playing a game and said it smelled like something was burning.
    I shut it down quickly. The card was hot and the fan was dangling from the card.
    gary

      And another 6600 bites the big one. There have been a lot of those, and 6800s as well.

  • GeForce 6600GT AGP 4x/8x Compatibility Question

    Hi, I have received an MSI NX6600GT-VTD128 GeForce 6600GT 128MB DDR3 AGP 4X/8X video card as a gift. I also have a media center PC with an e-Home TV tuner. I was wondering, if I replaced my current GeForce4 MX440 with this new MSI card, would there be any compatibility issues with the TV tuner card, or does this MSI card support all TV tuner cards?
    Thanks,
     ~Snikkle

    Snikkle,
    My friend's car is having problems; the car has 4 wheels. What do you think is the problem?
    I would greatly appreciate any help . . . thanks in advance.
    Richard
    P.S. If you have trouble helping me with my problem, think about how much trouble I am having helping you with yours, especially when I do not have a clue what the specs of your hardware, OS, and power supply are.

  • [GeForce PCIe] MSI NX6600GT PCI-e?

    Hi there,
    I have two MSI NX6600GT 128 cards running in SLI mode. When I go to use MSI Turbo Experience it comes up with a screen that says:
    "MSIVGA.ocx initialization error". I have the 71.84 drivers and all other drivers are installed. It also happened with the 66.93 drivers that came on the CD, so I updated. I contacted MSI and they told me to read the forums here, as there would most likely be a solution. I built this system from the ground up myself and have had no trouble with it. I can play games like MVP 2005 and Doom 3 with no problems. But sometimes when I go to load a saved game in Silent Hunter 3, it starts to initialize and then drops to the desktop (this has happened very rarely). I haven't overclocked anything and I won't be doing that. I only have a few games installed, as I am taking things slowly. My GPU temps are 48°C idle and 60°C after playing Doom 3 for about an hour. The CPU is 35°C and the case temp is 43°C. I have a 120mm fan blowing in from the front and two 80mm fans exhausting. Any advice would be helpful. Thanks in advance.
    System specs:
    MSI Neo4 Platinum SLI mobo
    AMD64 3500+ 512K 90nm CPU
    2x MSI NX6600GT TD128E vid cards (SLI mode)
    4x 512MB Kingmax SuperRam, dual channel (2GB)
    450-watt PSU, 22 amps on the 12V rail
    Super Talent dual-fan memory cooler
    2x 200GB WD SE 8MB HDDs (IDE)
    2x Memorex 16x DVD burners
    2x CoolDrive 3 for the HDDs
    19" Sceptre LCD monitor
    Sorry if I forgot anything.
     

    Kick MSI Turdo Experience to the curb (get rid of it)
    It's more of a sales gimmick than genuinely useful in most cases.
    If it continues, you may want to start over from scratch and get the latest drivers here: nVidia Drivers.
    I always avoid the drivers on the CD at all costs, and the manufacturer's drivers as well, unless of course there is a known problem or issue that mandates using the manufacturer's drivers. Otherwise, always get the latest drivers from the chipset vendor.
    Also, though a bit tedious, trawling the forums for clues about what everyone else is using without issue will at least give you an educated head start.

  • Is my NX6600GT-TD128E faulty?

    I bought the MSI NX6600GT-TD128E in February of 2005. I haven't touched it since I first put it into my computer and it has been working fine for the past two years.
    For the past month or so, whenever I run something that uses accelerated graphics there is a ton of corruption on the screen. I have tried dozens of different configurations and tried it on multiple clean installs of Windows Vista, Ubuntu Edgy Eft, and Ubuntu Feisty Fawn (the last two are Linux distributions). Obviously I doubt it is a software problem, since it just randomly stopped working and I tested it on multiple configurations.
    I don't know if there is anything I can do about it and I was wondering if I should (or can) get it fixed under warranty? If I do, does it usually take a long time and how much will it cost?
    I have tons of screenshots depicting the problem which I will attach.

    Quote from: compotatoj on 16-March-07, 14:22:15
    Oh my gosh! You are right! Wow you are so smart. The fan popped off halfway from the heat sink!
    Quote from: compotatoj on 16-March-07, 14:36:13
    Is this under warranty? I can't believe it is still running!
    Well, I don't know about the warranty since it's two years old, and it looks like the cooler is heavily damaged.
    You can ask your reseller about the warranty, or see "How to contact MSI" for more information.
    An easy way to fix it is to just get and install an aftermarket VGA cooler. (NOTE: make sure it will be compatible with your VGA model.)

  • NX6600GT TD128E - Freezes, then goes on

    Hello everyone,
    I've had this card for a few years now and have always had the same problem: when I'm playing a game (and only when playing, most often Counter-Strike: Source or Guitar Hero III), my computer sometimes randomly freezes for a moment (one or two seconds) and then carries on.
    I don't understand why. I've upgraded my motherboard BIOS and all my drivers are up to date. I've tried both the NVIDIA and MSI drivers (which one is supposed to be better? For now I'm running the latest NVIDIA one).
    My temperatures seem normal: around 44°C when idle and 65°C when playing (measured with ExperTool).
    I've never dealt with this problem before because I was away at school, boarding with my laptop, so I never really had the time to play.
    There is my configuration:
    ASUS A8N-SLI
    MSI NX6600GT TD128E (Bios version : 5.43.02.69.00)
    AMD Athlon 64 3200+ (never goes above 55°C)
    Windows XP SP3
    1024MB RAM
    Audigy 4
    Maybe upgrade my BIOS?
    Or run some more advanced tests? But which ones?
    Thanks for your help.

    I think you should try with a decent PSU.
    The PSU you're using is a cheap no-name, low-quality unit.
    See here: https://forum-en.msi.com/index.php?topic=112039.msg836246#msg836246
    And here:
    >>> PSU's---2 x 12v---The Dual Rail Myth <<<
    >>> Is your problem caused by your PSU? <<<
    Other tips:
    Also do a fresh OS installation.
    Test the VGA in another machine.

  • HELP ! System freezes now with new MSI GeForce 6600GT AGP

    Any/All help is greatly appreciated... this is driving me crazy.
    I just installed a new MSI 6600GT AGP card. Windows and applications work fine, but the system keeps freezing (disabling the keyboard as well) when running certain (3D?) games like Medal of Honor, Prince of Persia, and Splinter Cell. It also freezes on the 3DMark01 SE benchmark. What's strange is that 3DMark05 ran fine and returned a respectable score (3078). I've tried uninstalling/reinstalling the latest MSI drivers (71.22) and NVIDIA's 66.93, and removing NVIDIA's GART drivers. I've also disconnected other hardware to alleviate PSU constraints, and I've uninstalled/reinstalled the 6600GT card and the Molex power connector on the card. My system specs are below:
    Motherboard: ASUS Focus A7N8X-LA, BIOS: American Megatrends 3.07, Chipset: nVidia nForce2
    MSI GeForce 6600GT 128 AGP
    1GB RAM
    200GB Maxtor HD, 8MB buffer
    ASUS E616P1  DVD-ROM
    RICOH  MP5240A  DVD+RW
    Enermax EG375P-VE SFMA PSU  (has separate 12V rails)
    Any suggestions as to what else to try?  Getting ready to RMA this card.

    Quote from: andysue on 23-February-05, 07:34:53
    Thanks for that, Crull.
    I have just found this on a site; it may help someone. I hope to get my replacement tomorrow and I shall let you know how it goes.
    Question
    I just got a K8 series MB and your A6600GT 128MB VGA. Every time I quit a 3D game, the system gives me a blue screen and then restarts. I have already updated the VGA driver and the MB driver, but no luck. What can I do?
    Answer
    If your system is WinXP + SP2, we suggest you disable the DEP function.
    Please follow the steps below to disable DEP (see the picture below).
    1. First, select "Advanced" under "System Properties".
    2. Choose "Settings" under "Startup and Recovery".
    3. Click "Edit"; you can then edit the boot.ini file.
    4. Change "/NoExecute=OptIn" to "/execute" (see the red square in step 4 below).
    5. Save the boot.ini file and restart the system.
    6. After rebooting, the DEP function will be disabled.
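    (As a hedged example of what step 4 looks like in practice: the disk/partition path below is just a typical XP default and may differ on your system. The relevant boot.ini line changes from something like

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /NoExecute=OptIn

    to

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /execute

    and DEP is then off after the next reboot.)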
    For other information, please see the links below:
    http://support.microsoft.com/kb/875352/en-us
    http://www.sysinternals.com/ntw2k/info/bootini.shtml
    http://www.microsoft.com/technet/prodtechnol/winxppro/maintain/sp2mempr.mspx
    • /NOEXECUTE=OPTIN  Enables DEP for core system images and those specified in the DEP configuration dialog.
    • /NOEXECUTE=OPTOUT  Enables DEP for all images except those specified in the DEP configuration dialog.
    • /NOEXECUTE=ALWAYSON  Enables DEP on all images.
    • /NOEXECUTE=ALWAYSOFF  Disables DEP.
    Configuration descriptions:
    OptIn (default configuration): On systems with processors capable of hardware-enforced DEP, DEP is enabled by default for limited system binaries and applications that "opt in." With this option, only Windows system binaries are covered by DEP by default.
    OptOut: DEP is enabled by default for all processes. Users can manually create a list of specific applications which do not have DEP applied, using System in Control Panel. IT Pros and Independent Software Vendors (ISVs) can use the Application Compatibility Toolkit to opt out one or more applications from DEP protection. System Compatibility Fixes ("shims") for DEP do take effect.
    AlwaysOn: This provides full DEP coverage for the entire system. All processes always run with DEP applied. The exceptions list for exempting specific applications from DEP protection is not available. System Compatibility Fixes ("shims") for DEP do not take effect. Applications which have been opted out using the Application Compatibility Toolkit run with DEP applied.
    AlwaysOff: This does not provide any DEP coverage for any part of the system, regardless of hardware DEP support. The processor does not run in PAE mode unless the /PAE option is present in the boot entry.
    Yes let us know how it goes. Hopefully well.

  • GeForce 6600GT » NX6600GT-VTD128 Query

    Hi
    I would like to buy the GeForce 6600GT » NX6600GT-VTD128 graphics card from the retailer Pixmania, but I have a query: they (Pixmania) advertise the GeForce 6600GT » NX6600GT-VTD128 as MSI NX6600GT-VTD128M - 128 Mo.
    http://www.pixmania.co.uk/uk/uk/130210/art/msi/nx6600gt-vtd128m-128-mo-t.html
    May I ask what the M stands for after VTD128? I've always known the card to be described as NX6600GT-VTD128, without the M.
    And could you please confirm that the graphics card on Pixmania is the same as the one on the official MSI site:
    http://msicomputer.co.uk/Products.aspx?product_id=703575&cat_id=78
    Thanks, all.

    Ruud00,
    I think M stands for Typo.
    Take Care,
    Richard
    P.S. MSI only makes one AGP 6600GT - NX6600GT-VTD128 

  • MSI GeForce 6600GT AGP problems

    Hi all,
    A couple of days ago I bought the 6600GT 128MB AGP graphics card and I have the following problems.
    I have a Gigabyte motherboard with a Socket 370, a 4X AGP slot, and a 350W PSU. I tried three games and got major graphics problems in two of them. Doom 3 seems OK, but in Painkiller and Half-Life 2 the polygons don't look right to me (they are corrupted).
    Also, the "MSI Information" tab in the display properties shows the wrong information about my graphics card: the memory type is wrong (it displays SDRAM) and AGP 4X in the "AGP/PCI Information" section is not ticked. I checked my BIOS options and it is set to 4X, so I don't know what's wrong. I also tried both the 66.93 NVIDIA drivers and the 71.22 drivers from MSI, but nothing changed; the polygon corruption in the aforementioned games remains.
    Could someone please help me? Does the 6600GT work on AGP 4X?
    Thanks in advance,
    Yannis

    I ran "dxdiag" to perform some DirectX diagnostics: when testing the card's Direct3D capabilities, the test fails on the hardware-accelerated Direct3D 8 and 9 interfaces but passes with Direct3D 7. I cannot figure out the problem. I have tried everything I could so far, but still nothing. Can someone help me? The AGP slot of my motherboard supports 4X (it is AGP 2.0 compliant). Could this be caused by a damaged graphics card?

  • MSI GeForce 6200 128MB running DVI 1680x1050

    Hello,
    My VGA is MSI GeForce 6200 128MB. VGA driver is GeForce 175.19
    Monitor is Samsung T200 with native resolution 1680x1050 and DVI input.
    When I set the resolution to 1680x1050 using the DVI output, the desktop is stretched wider than the screen and I have to scroll down to see the taskbar or scroll right to see the clock, for example. Setting the resolution to 1680x1050 using the analog output, it looks fine except that the desktop icons are slightly stretched vertically.
    How can I get a correct picture over DVI? Is it a driver problem, and which driver version is recommended?
    Thanks&Regards

    Alright, thanks, I'll see what I can do. A friend told me that older drivers might work better for the 6200 and he mentioned 91.31, so I might go with that.
    Also, browsing other forums I found the solution below; they said it works well with static screens but not so well when playing games. This might be the last resort if I don't find a solution with drivers.
    Quote
    Go to the nvidia control panel.
    Select - Manage Custom Resolutions.
    Select your widescreen/DVI connected monitor from the list.
    Uncheck "Treat as HDTV"
    Click Create
    Go to Advanced
    Boxes:
    Custom Display Mode Values
    Horizontal desktop pixels: 1680
    Vertical desktop pixels: 1050
    GDI refresh rate: 60
    Bits Per Pixel: 8
    Back-end Parameters:
    Timing Standard: Manual
    Desired Refresh Rate: 60
    Horizontal Front Porch: 48
    Active horizontal pixels: 1680
    Horizontal total: 1840
    Horizontal sync width: 32
    Horizontal sync polarity: -
    Vertical front porch: 3
    Active vertical lines: 1050
    Vertical Total: 1080
    Vertical sync width: 6
    Vertical sync polarity: +
    Front-end parameters:
    Scaling type: fixed aspect
    Active Horizontal Pixels: 1680
    Active Vertical Lines: 1050
    Hit OK.
    Go to change resolution on the main nvidia control panel sidebar.
    Choose 1680x1050.
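    (For reference, those timing numbers pin down the pixel clock the card has to generate; a quick Python check using only the horizontal total, vertical total, and refresh rate listed above.)

    h_total, v_total, refresh = 1840, 1080, 60
    pixel_clock_mhz = h_total * v_total * refresh / 1e6
    # ~119.2 MHz, comfortably within single-link DVI's ~165 MHz limit
    print(f"required pixel clock: {pixel_clock_mhz:.1f} MHz")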
