GPU temperature problem - Nvidia Go 7300

Hello,
I have a problem with my N200 (T8300, 2.4 GHz).
For the past few months the GPU temperature was fine at around 54 °C all day, even working hard with Pro Tools and Vegas. No games!
For the last two weeks the temperature has been 70 °C - 80 °C at idle!
Please help.
keine_ahnung

If it is still under warranty, get it checked out.
Regards,
Jin Li
May this year, be the year of 'DO'!
I am a volunteer, and not a paid staff of Lenovo or Microsoft

Similar Messages

  • Graphics Problems - Nvidia Geforce 7300 GT & Safari

    I have been having some weird graphics issues with Safari for the past year or so and it's really starting to bother me.
    Occasionally, on certain (and seemingly random) web pages in Safari, only half of the page will be displayed. The entire page will load, but some images and text will not be displayed until...
    A. I "command" + "tab" back and forth from another open window and force it to display the page
    or
    B. I click and drag the mouse over the entire contents of the page highlighting everything and forcing it to display what I have selected.
    It's a very odd issue and doesn't happen all of the time on all web pages.
    Another oddity is when composing a new message in Gmail. My typing isn't displayed until I scroll down and then back up forcing it to display my text.
    I have searched for resetting or updating the firmware for the GeForce 7300 GT, but the Nvidia website says that OS X is not supported... only Windows. And the firmware updater I found says that I do not need the update.
    Here are the specs as found in "System Profiler":
    NVIDIA GeForce 7300 GT:
    Chipset Model: NVIDIA GeForce 7300 GT
    Type: GPU
    Bus: PCIe
    Slot: Slot-1
    PCIe Lane Width: x16
    VRAM (Total): 0 MB
    Vendor: NVIDIA (0x10de)
    Device ID: 0x0393
    Revision ID: 0x00a1
    ROM Revision: 3008
    Displays:
    Display Connector:
    Status: No Display Connected
    Acer B223W:
    Resolution: 1680 x 1050 @ 60 Hz
    Pixel Depth: 32-Bit Color (ARGB8888)
    Main Display: Yes
    Mirror: Off
    Online: Yes
    Rotation: Supported

    Not moved, not resolved. While it might seem helpful, people post wherever they want. One general forum with MORE sub-topics would make more sense to me: Storage, RAM, Graphics, RAID, as well as Using, Networking, Display, and Upgrade/Expanding.
    Some people have found 7300 GT cards with bulging or blown capacitors.
    2008 and later = Apple brought UEFI to the Mac Pro
    2009 = Nehalem and DDR3
    "Pre-2009" then only means FB-DIMMs, which doesn't make a whole lot of sense for a category.
    Try to rule out Safari with a new user account. Sometimes, to troubleshoot, you really need a "spare" graphics card to determine what is at fault.

  • GPU Temperature Problems: Faulty Chip or SMC?

    Hello Community
    I am looking for some backup on my theories regarding my MBP's GPU overheating - or whether the SMC has cracked it.
    Timeline:
    Running Windows 7 for video games, start games, very bad performance. Reboot. Fan is at full. Leave it to cool. Try again. Same problem. Try another game. same problem. Try OS X. Same problem.
    Run Tech Tool Pro 5. No faults detected. As it's a graphics issue, I hammer the VRAM test. No faults found. Given the 9400M shares system memory, I ran Memtest86 and found stacks of addressing errors. Popped the RAM out, thinking it was faulty, and popped the old RAM in. Memtest finds errors and no performance increase.
    On suggestion of someone else, checked temperatures. Hello super high GPU temps. I have seen sustained and constant temps of between 90 (OS X) to 105 (Windows) degrees Celsius from running animated loading screens for video games.
    Using a few utilities on the Windows side (which has more available) I have found an unsurprising link between GPU usage (and thus performance) and temperature. Beyond 85 degrees or so, things start to go cactus.
    I did an SMC reset, and a PRAM reset. I've had a Genius do the same. I had one Genius explain to me that GPU cooling requires hardcore nitrogen setups and that 90 degrees for a GPU is OK.
    There is obviously a problem. When I force the fans to max out, the temp stays around 90 °C max (and performance is still bad), but if the SMC were working correctly, it should be maxing the fans at that temp instead of me using 3rd-party tools to kick them up.
    To end my rant, do people think it's the SMC or the GPU? Either way it's a new logic board; I just wanted some input on what people think it is.

    That does sound like it could be from overheating. Try blowing out the fans and vents. You can buy cans of compressed air at most electronics stores. Use it to blow out the vents while the computer is off and unplugged. This can greatly reduce the temperature!
    - Peter

  • Problems with 3D Hardware acceleration on nVidia geForce 7300

    I have a system with a Core 2 Duo CPU and an nVidia GeForce 7300 graphics card. I am trying to run a 3D robotics simulation program (Webots 7.0.3) and its performance is abysmal. I never had graphics performance issues before---but then again, I never tried to run this particularly intensive simulation software. Other simulation software compiled from source (e.g. Player/Stage) runs smoothly.
    Where can I start to look to find out where the problem is?
    Here is what I've been able to find out so far:
    1. Hardware acceleration seems to be enabled:
    [stefano@polus]$ glxinfo | grep -n5 "direct rendering"
    1-name of display: :0
    2-display: :0 screen: 0
    3:direct rendering: Yes
    4-server glx vendor string: NVIDIA Corporation
    5-server glx version string: 1.4
    6-server glx extensions:
    7- GLX_EXT_visual_info, GLX_EXT_visual_rating, GLX_SGIX_fbconfig,
    8- GLX_SGIX_pbuffer, GLX_SGI_video_sync, GLX_SGI_swap_control,
    2. glxgears reports very poor performance: about 60 FPS. The software manufacturer says their product needs around 2000 FPS to run smoothly
    [stefano@polus]$ glxgears
    Running synchronized to the vertical refresh.  The framerate should be
    approximately the same as the monitor refresh rate.
    301 frames in 5.0 seconds = 60.157 FPS
    299 frames in 5.0 seconds = 59.781 FPS
    3. The software (i.e. Webots) is proprietary, and their Linux version is built against libjpeg version 6.2. Arch Linux currently runs libjpeg 8, so I had to manually install the older version.
    4. The software runs satisfactorily on my Kubuntu laptop, an old Lenovo T61 which has similar (or similarly aged) hardware. In other words, the CPU seems to be adequate and the problem lies with the graphics setup in my Arch Linux system, I believe.
    5. I read the nvidia pages on the Arch website, the nvidia-linux pages and the Xorg pages on how to set up xorg.conf. It did not help.
    Some info on my system are appended below.
    Any hint greatly appreciated.
    Stefano
    Hardware
    CPU Version: Intel(R) Core(TM)2 Duo CPU E6550 @ 2.33GHz
    Graphics card: nVidia GeForce 7300 GT/PCI/SSE2
    RAM: 4 GB
    Double monitor setup with Xinerama enabled
    SW:
    Arch Linux, 64-bit
    Kernel: Linux 3.6.11-1-ARCH
    Graphics driver: nvidia304xx

    chris_l wrote:
    stefano wrote:
    2. glxgears reports very poor performance: about 60 FPS. The software manufacturer says their product needs around 2000 FPS to run smoothly
    Running synchronized to the vertical refresh. The framerate should be
    approximately the same as the monitor refresh rate.
    301 frames in 5.0 seconds = 60.157 FPS
    Why do you get 60 FPS? You have your answer right there.
    I noticed that and was wondering about it. I am not familiar with glxgears at all, and I am not sure about the exact meaning of the statement about the framerate. But one of the developers of the software I am trying to run sent me this glxgears output, obtained on a recent Ubuntu system running under VMware (on a Windows host, I presume):
    $ glxgears
    11030 frames in 5.0 seconds = 2205.791 FPS
    11492 frames in 5.0 seconds = 2298.289 FPS
    What gives? Are they running a different version of glxgears, or is there some hidden switch in glxgears that unties it from the monitor refresh rate and makes it work as a benchmark tool (man glxgears did not report any)?
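    For what it's worth, the 60 FPS ceiling is almost certainly vsync rather than a slow card: glxgears synchronizes to the monitor refresh by default, so it is not usable as a benchmark in that state. A hedged way to confirm this (assuming the proprietary nvidia driver listed above) is to disable sync-to-vblank for a single run:
    $ __GL_SYNC_TO_VBLANK=0 glxgears     # NVIDIA proprietary driver environment variable
    $ vblank_mode=0 glxgears             # equivalent setting for Mesa/open-source drivers
    If the frame rate then jumps far past 60 FPS, the card and driver are fine and the poor Webots performance has some other cause; the 2000+ FPS figure from the developer's VMware guest simply reflects a GL stack that does not sync to a display refresh.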

  • HT4664: I have a problem running Final Cut Pro X because of my NVIDIA GeForce 7300 GT graphics card. I live in a remote area; what are my options to fix this?

    I have a problem running Final Cut Pro X because of my NVIDIA GeForce 7300 GT graphics card. I live in a remote area; what are my options to fix this?

    This is the list of Apple-approved graphics cards, excluding the cards for laptops. Replace your card with one of these. Get the best you can afford.
    NVIDIA
    GeForce GT 120
    GeForce GT 130
    GeForce GTX 285
    GeForce 8800 GT
    GeForce 8800 GS
    Quadro FX 4800
    Quadro FX 5600
    ATI
    Radeon HD 4670
    Radeon HD 4850
    Radeon HD 4870
    Radeon HD 5670
    Radeon HD 5750
    Radeon HD 5770
    Radeon HD 5870

  • W700: max GPU temperature for Quadro FX3700M?

    I've recently noticed a freeze/stall problem in Direct3D applications (mostly games) with the Nvidia Quadro FX3700M GPU of my W700 (2757) under Win XP x64 and Win 7 Pro x64. The system was running rock solid for more than 3 years, and after a lot of testing with fresh Windows installations and different Nvidia driver versions, I decided to take a closer look at GPU temperatures.
    It turned out that in idle mode the GPU temps are already at 65 °C and climb over 96 °C under full load in the GPU testing tool from MSI (MSI Kombustor). FurMark brings it even over 100 °C (I stopped it after a few seconds).
    Are these temperatures normal for the FX3700M? I hear the GPU fan kicking in, but it may be necessary to apply fresh thermal compound for optimal cooling.
    I'd appreciate any advice.
    ThinkPad W700, W701, T40p, T420, X200S, TPT2, Twist, ThinkStation D20, ThinkCentre M90z, M92p, M93p

    I downloaded TPFanControl and it is working well on my W700ds so far. It shows all the temperatures and allows you to control the fan speed at different temperatures.
    Before I blew the air through, I downloaded the Intel Processor Diagnostic Tool from Intel. It tests the processor and loads it up pretty hard. It gives a differential temperature from the maximum recommended temperature. At the start of the test my processor indicated 42 degrees below max, and at the end 21 degrees below max. I'm not sure what the max temp is, but when I first turned on TPFanControl it indicated my GPU was running between 78 °C and 81 °C. Since blowing air back through the fans, the highest GPU temp I saw, with TPFanControl running in its as-received setup, was about 60 °C. I played with the .ini file for TPFanControl and have mine running at fan level 4 at 57 °C. That would be without heavy use.
    Geophyte1
    W700ds 2757-CTO
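    For anyone curious about the .ini edit mentioned above: TPFanControl's smart mode is driven by plain temperature/fan-level pairs in TPFanControl.ini. The lines below are only an illustrative sketch matching the 57 °C / level 4 setting described here; the exact keywords and defaults are documented in the comments of the .ini file that ships with the tool, so check those rather than copying this verbatim:
    Level=50 2
    Level=57 4
    Level=65 7
    Level=75 64
    Each entry roughly means "at or above this temperature, run the fan at this level", with 64 conventionally standing for full speed.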

  • OEM Apple nVidia GeForce 7300 GT is on Adobe's list of unsupported GPUs for Open GL rendering-

    My Mac Pro's Radeon X1900 XT is retired to my computer graveyard in my basement, but going back to the OEM Apple nVidia GeForce 7300 GT, which shipped with my Mac Pro, brought some encouraging results.
    When I first got the 2007 Mac Pro, in 2009, I was running it on Mac OS X 10.4.11 Tiger, and Photoshop CS4 (11.0.0) would disable OpenGL rendering when it detected the stock nVidia GeForce 7300 GT card, which was on Adobe's list of unsupported graphics GPUs. So, I purchased a refurbished Apple Radeon X1900 XT. This card allowed Photoshop CS4 and Bridge CS4 to enable OpenGL rendering. Over the last year, the Radeon X1900 XT graphics card has been creating stripes on my screen when it is running hot, and sometimes the dual displays would just shut off while I was working. Also, the computer was doing hardware freezes about once a day. This morning the Mac Pro shut down its dual displays while I was working, and I had to do another cold shut-off in order to restart. I then manually shut the computer down and did a manual boot-up so that Snow Leopard would run disk maintenance.
    I shut the computer off and pulled the Radeon X1900 XT card and re-inserted the OEM Apple nVidia GeForce 7300 GT card.  Now I'm running Mac OS 10.6.8 Snow Leopard with Photoshop CS4 (11.0.2) instead of Photoshop CS4 (11.0.0).
    Now, the updated Photoshop CS4, under the newer OS, is enabling OpenGL rendering instead of disabling it. This graphics card's limitation appears to have been lifted, so I see no reason to buy the ATI Radeon HD 5770 Graphics Upgrade Kit for my Mac Pro:
    http://eshop.macsales.com/item/Apple/6615718/
    Some who have purchased the Apple ATI Radeon HD 5770 say they still got the striping on their display when the Radeon HD 5770 card was running hot, just like with the Radeon X1900 XT.
    So far, the OEM Apple nVidia GeForce 7300 GT is working fine and I'm not getting any striping on my displays with no display shut-offs or hardware freezes.
    I'm glad to have Adobe's OpenGL rendering enabled with an OEM graphics GPU that runs cool and requires no embedded cooling fan. Apple's OEM nVidia GeForce 7300 GT for the Mac Pro is on Adobe's list of unsupported GPUs for OpenGL rendering under Photoshop CS4, so why is it supported now? Is that just one of the benefits of the Photoshop CS4 11.0.2 update?

    Chris, thanks for your info, but you are talking about Photoshop CS5, which does not support OpenGL drawing on the Mac nVidia 7300 or the Mac Radeon X1900. As long as I am using Photoshop CS4 and have not yet upgraded to CS5, I can continue to use the Mac nVidia 7300 and get limited OpenGL drawing support from Photoshop. And it is working fine.
    Today I found out that Adobe has updated their OpenGL support information, and apparently when I upgraded my Mac from Tiger to Leopard and later, I gained Photoshop's OpenGL support for the Apple nVidia GeForce 7300 GT:
    Supported video cards (Mac OS)
    Intel-based Macintosh, tested on Mac OS X 10.4.11 and 10.5.4
    17-inch iMac x1600, 128 MB
    ATI HD 2600, 256 MB
    MacBook Air Intel GMA X3100
    Nvidia Quadro FX 4500, 512 MB
    Radeon x1900, 512 MB
    Intel-based Macintosh, tested on Mac OS X 10.5.4 only
    8800 GT, 512 MB
    iMac 8800 GS, 512 MB
    Nvidia 8600M, 256 MB
    The following Power PC cards work in Photoshop CS4: Nvidia 7800 (256 MB), Nvidia GeForce 7800 GT, and the Nvidia Quadro FX 4500.
    Note: OpenGL is enabled for the GeForce 7300GT, but Advanced Drawing and 3D Acceleration are disabled.
    So, when I upgrade to Photoshop CS5, I will need to replace my video card with something compatible with Photoshop CS5's OpenGL drawing engine. Do you think the ATI Radeon HD 5770 Graphics Upgrade Kit for my Mac Pro will get Photoshop CS5 to make OpenGL drawing available and active?
    I appreciate an Adobe employee, like yourself, providing the information I need to make the right hardware choices when upgrading to Adobe CS5.

  • Kernel Panic caused by Nvidia GeForce 7300 GT after Upgrade

    Kernel Panic caused by Nvidia GeForce 7300 GT after Upgrade
    Mac Pro 2.66 GHz Dual-Core Intel Xeon
    driving two Apple Cinema Displays, a 23" (out of order since the update) and a 30"
    running Mac OS X Lion 10.7.2
    Hi, I'm Phil from Germany,
    and I have the same problem as many others here:  https://discussions.apple.com/thread/1541589?tstart=0
    caused by the Nvidia GeForce 7300 GT after a combined update from Apple.
    It started with a frozen desktop and ended with a permanent kernel panic after a reboot.
    - Safe mode does not boot
    - The system CD does not boot
    - Apple Hardware Test is OK.
    - Tried the newest 7300 firmware update
    - I have double-checked the RAM and HDD, and disconnected all external equipment several times.
    The system.log at startup ends with the following permanent problem:
    Previous Shutdown Cause:3
    NVDANV40HALG7xxx loaded and registered.
    DSMOS has arrived
    After running my Mac Pro in Target Disk Mode (FireWire connection with my MacBook Pro) I found the problem folder of the 7300 GT:
    File:  HD/SYSTEM/LIBRARY/EXTENSIONS/NVDANV40HalG7xxx  inside my library, and erased it......
    NOW MY SYSTEM IS RUNNING AGAIN, but I can't run my system CDs without a kernel panic,
    Time Machine runs without the star-field background, and my second Cinema Display still doesn't work any more!
    After spending several hours in front of my Mac Pro I'm sure that all of my hardware works fine.
    NOW I THINK THE PROBLEM IS: I have to change the ROM version on my Nvidia GeForce 7300 GT and install firmware 3008.
    So how does that work, and where can I get this firmware? Or how can I run a downgrade from 3011 to 3008?
    ANY IDEAS ....or similar cases?
    WOULD BE GLAD TO GET SOME HELP!!!!!
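    For anyone retracing the kext removal described above, the same thing can be done from Terminal on the second Mac while the Mac Pro's disk is mounted over Target Disk Mode. This is only a hedged sketch: the volume name and the exact .kext bundle name are assumptions based on the path quoted above, and moving the bundle aside (rather than deleting it) keeps the step reversible:
    $ sudo mv /Volumes/HD/System/Library/Extensions/NVDANV40HalG7xxx.kext ~/Desktop/
    $ sudo touch /Volumes/HD/System/Library/Extensions
    Touching the Extensions folder updates its modification date so the kext cache is rebuilt on the next boot of that volume.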

    Thanks, Hatter.
    I'm sure that an ATI 5770 would solve my problem... and maybe I could save a lot of my time?
    But why should I spend €250 on a new graphics board when the old one isn't broken?
    Especially when the problem wasn't caused by me.
    I need the old firmware, because my system runs with my 7300 GT after removing the software part listed above. Only one of the firmware components that was updated doesn't work with the newer OS X versions.
    I'm also sure that the problem must be on the graphics board, because my system CD won't boot while holding the "c" key.

  • Radeon HD 5770 vs. NVIDIA GeForce 7300 GT

    I am running a first-gen Mac Pro 1,1, 2 x GHz Dual-Core, with 18 GB RAM.
    I have two 30" Cinema Displays connected to two NVIDIA GeForce 7300 GTs.
    Will I notice more speed by upgrading to a single Radeon HD 5770?
    I know that this upgrade would also cost me a Mini DisplayPort to Dual-Link DVI adapter.
    I am mainly working with Apple Logic Pro.
    Regards
    Philipp

    Visit Bare Feats and look for their various GPU benchmarks from when the 5770 was first released. The 5770 will perform better than the 7300, but it will not achieve its maximum capabilities in your model because your Mac Pro only has PCIe 1.0 slots.

  • ATI 4870 AND NVIDIA GeForce 7300 GT together ?

    Hello. I have a Mac Pro 1,1 with the stock NVIDIA GeForce 7300 GT video card. I will receive a new ATI Radeon HD 4870 card today and was wondering if it is possible to have both cards installed and if this would provide a performance boost (i.e. will rendering videos utilize the power of both combined, etc.). Also, I would like the option of having 3 displays. So, 2 questions:
    1. can i have both cards installed at once?
    2. and if yes, will it provide benefits or problems/conflicts ?
    thanks
    Mac Pro 1,1, 5 GB RAM, latest Snow Leopard, 20ish" Apple display, and a Sony Bravia 52" as a display using a DVI-HDMI cable.

    I've run the 4870 together with the 7300 GT; there were no problems whatsoever. You'll be able to run 4 monitors; other than that, there are no other benefits. This is under OS X.
    Note, though, that if you're running Windows under Boot Camp with both cards installed, you will likely run into driver conflicts. Best to keep to the same "clan" (ATI or Nvidia) when booting into Windows.

  • Black Screen (source and preview) when using GPU acceleration (CUDA) [NVidia GTX 780M]

    Hi guys!
    I am running Adobe Premiere Pro CS6 on Windows 7 (Ult) on a machine equipped with NVidia GTX 780M (4 GB VRAM) graphics card. The source and the preview monitors are both black if I select GPU accelerated rendering. There is a yellow bar above the timeline clips, I can hear the sound tracks, but there is no image. If I choose the Software Only rendering (CPU) then I do see the image as well, but the real-time playback and rendering times are awful!
    At first, the application would not allow me to use the GPU acceleration using NVidia’s CUDA processing. Once I entered my video card name in the “cuda_supported_cards.txt” file in the Adobe Premiere Pro CS6 folder I was able to choose between Mercury Playback Engine Software Only and Mercury Playback Engine GPU Acceleration CUDA.
    I tried downgrading the video card driver to the previous stable version and reinstalling Pr – no go. How do I make Adobe Premiere Pro CS6 work with my NVidia GTX 780M video card?
    Your insight is appreciated!
    P.S. I used to work on a different machine equipped with NVidia GTX 670M card (which is also missing from the officially supported Pr CS6 cards list) and I never had any issues of this nature with that laptop.
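    (Side note on the whitelist edit described above: cuda_supported_cards.txt is just a plain text file with one card name per line, so the change amounts to appending the exact name the NVIDIA driver reports. A hedged one-liner from an elevated command prompt might look like this; the install path is an assumption for a default CS6 install:
    > echo GeForce GTX 780M>>"C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt"
    If the name doesn't match exactly, including spacing, the CUDA option remains unavailable.)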

    Hi there,
    As stated before, I had the same problem. Apparently the solution for me was to check the Nvidia settings. My setup has both an onboard GPU and an Nvidia GPU (on purpose, specifically for video editing and gaming). Normally the system should switch to the Nvidia GPU automatically. However, the Nvidia settings for Premiere were set to favor the onboard GPU. After changing this, everything works fine again!
    Kind regards, Bart

  • K7N2 Temperature Problems

    I'm having some small temperature problems with this board.
    Here are the temps
    CPU:        45-60 °C (normal-load)
    Case:       38-45 °C
    Chipset:    42-47 °C
    The CPU isn't too bad, although I do see some strange fluctuations; for example, while writing this it went up to 57 and then the next reading dropped to 47???
    I have a laser digital thermometer which obviously can only scan temps external to the heatsinks, but it is accurate, and it does report similar temps.
    My main worry here is the case temperature.
    Here is my setup
    K7N2 Delta 2
    Athlon 2.5 @ 2.2 GHz
    1GB corsair (512*2)
    Nvidia 6800GT 256MB DDR3
    DVDRW
    DVDROM/CDRW
    120GB SATA HDD
    500 W PSU (which is heating up a little and could be a cause of this)
    3 case fans
    thermaltake volcano 12+
    I've tried not overclocking, but there is little difference; room temps are at 20-25 °C.
    The heatsinks near the chipset also seem to be getting rather hot.
    It doesn't have any stability problems. It's fine with games, no slowdowns; it's just the temps.
    Can someone please compare my temps with the same board, just so I can get an idea.
    Thanks in advance

    I definitely need something black...
    Otherwise it'd be a little out of place!
    In all honesty, I wish I could buy things from Newegg; they seem to have a great range and a nice system.
    In the UK I'm relegated to buying from Ebuyer (despite their poor customer service!); their prices are the cheapest, although much dearer than the US...
    http://www.ebuyer.com/customer/products/index.html?rb=3549119343&action=c2hvd19wcm9kdWN0X292ZXJ2aWV3&product_uid=54146
    It's way out of my price range. I have £50ish set aside for it, and I'm looking at these at the moment, all from Overclockers (they seem to have the best range, not the best prices):
    Sunbeam Samurai:
    http://www.overclockers.co.uk/acatalog/Sunbeam_cases.html
    Coolermaster Centurion 5:
    http://www.overclockers.co.uk/acatalog/Online_Catalogue_Coolermaster_Cases_126.html
    Antec SLK3700BQE Black Quiet Midi
    http://www.overclockers.co.uk/acatalog/Antec_Cases.html
    If anyone from the UK can recommend somewhere, that would be sweet!
    I'm looking at:
    Dabs
    Ebuyer
    Eclipse Computers
    Redstore
    Simply Computers
    Aria.co.uk
    Please I need more!

  • GPU Temperature

    I have a Mac Mini from 2012 that has a high GPU temperature. I'm using a monitoring program (Temperature Gauge) that is warning about a high GPU temperature (95-97 °C).
    The Mac Mini is placed on top of a TV cabinet, so it has a lot of air around it. I've used compressed air to remove dust but still get the high GPU temperature. Is this a common problem, and what can I do about it?

    While it may not be recommended by some here, I use Lobotomo Software's MoofMenu (aka Fan Control) on my 2010 Mac Mini to keep it from cooking.
    Right now I'm just surfing, listening to iTunes, and it's cool in the office, so the fan is only running at its default speed.
    But notice what happens when I play a game or watch video: as the temp increases so does the fan speed, and so it never goes much above 70 °C no matter what I do.
    I use smcFanControl on my iMacs and Fan Control on my Mac Minis and have never had a fan or graphics failure. Personally, the only drawback is that one needs to clean the air intake(s) more often, but that's a small price compared to having a melted Mac.

  • How to monitor GPU temperature?

    Hi dears.
    Recently I replaced my HD3870's fan with the Arctic Cooling Accelero S1 Rev. 2 silent cooler. This is much quieter than the stock fan installed on the video card.
    So the question is: how do I monitor the GPU temperature to check that all is OK? If the GPU gets too hot, can it cause problems for the whole system or only for the GPU?
    Thanks a lot.
    Regards.

    I use this app to monitor my iMac's overall temperature: http://www.macupdate.com/app/mac/23049/smcfancontrol
    If the computer appears to be over 120 °F/130 °F, you can use the slide bars to increase the fan speed. Mac computers can still pretty easily run at around 210 °F, but the cooler you keep it, the better. Personally, I keep my iMac at roughly 100°.
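    If a command-line readout is ever useful, both vendors' Linux drivers expose the same temperature sensor that the desktop tools read. A hedged sketch, assuming the proprietary fglrx or nvidia driver is installed (on Windows, tools like GPU-Z or the vendor control panel show the same sensor):
    $ aticonfig --odgt               # AMD/ATI fglrx: prints the GPU core temperature
    $ nvidia-smi -q -d TEMPERATURE   # NVIDIA: temperature section of the driver's query output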

  • About GPU temperature

    I checked my GPU temperature and it is around 58 °C as soon as I log on to Windows.
    Is that normal?

    No.. I bought an X3100-equipped T61. I had a T61 with Nvidia before, but I swapped it with a friend for an Intel integrated X3100 T61.
    Display quality is the same, but the graphics performance when it comes to complex CAD and games is obviously different.
    The T61 1440x900 14.1-inch widescreen, a T61 with a 14.1-inch SXGA+ standard screen, and a T61 15.4-inch with WSXGA (my friend swapped the OEM WXGA for a WSXGA before swapping with my Nvidia-equipped T61) <-- these are the ones that I currently own....
    Regards,
    Jin Li
    May this year, be the year of 'DO'!
    I am a volunteer, and not a paid staff of Lenovo or Microsoft
