Got a 65nm G200? Getting SC2? Might want to upgrade.

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Looks like SC2 code isn't playing well with 65nm G200 cards at all. In particular, there's a rash of suddenly "checkerboarding" (dead) GTX 280s and a few 192-shader GTX 260s among beta players. Apparently EVGA is claiming board microfractures from the cards operating at much higher temperatures than normal. SC2 is the only known game to cause widespread G200 death.

http://forums.battle.net/thread.html?topicId=24038430815&sid=5000

So if you have one and plan to play a lot of SC2 you may wish to upgrade sooner rather than later.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
What in the game is so demanding? It looks like Warcraft 3 with slightly nicer animations. Nothing revolutionary or, imo, crushing for a modern GPU.
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
Hrmm, nothing about SC2 is demanding at all really. It can be run reasonably well on some really old tech. A 512MB 8800GT does remarkably well in the beta right now, for example. If true, the OP's statement is a surprise to me.


*EDIT* actually just googled the issue. It's not SC2, but the 196.75 beta drivers from Nvidia. While running SC2, and a few other games, those drivers will randomly turn off the fan. Just don't use those drivers.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
v8envy said:
Looks like SC2 code isn't playing well with 65nm G200 cards at all. In particular, there's a rash of suddenly "checkerboarding" (dead) GTX 280s and a few 192-shader GTX 260s among beta players. Apparently EVGA is claiming board microfractures from the cards operating at much higher temperatures than normal. SC2 is the only known game to cause widespread G200 death.

http://forums.battle.net/thread.html?topicId=24038430815&sid=5000

So if you have one and plan to play a lot of SC2 you may wish to upgrade sooner rather than later.
The game is fine, and 65nm G200s are fine.

Read the last post on page 2; a person with a Radeon 5850 suffers from the exact same problem.

People often don't realize that a) thermal paste dries out and needs to be replaced unless you were using something fancy to begin with; the paste that comes with the GPU and/or CPU sucks. b) Dust does get stuck inside the video card and needs to be cleaned out. c) The video card needs external cooling too, not just the GPU and memory themselves.

Most people only care about the temperature shown on screen without actually feeling the temperature around the hardware itself. A system with correct airflow will not have this problem, but that isn't the usual case. A GPU at 70C is considered low, but 70C surrounding the card is considered very high. This happens when the video card's ventilation is blocked. It only mildly increases the reported GPU temp, since the fan simply blows harder, but because the heatsink isn't being cooled properly, other parts of the card may heat up that were never designed to withstand that heat.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Clearly I fail at linking. Fixed. Anyhoo, it's *one* person with a 5 series card vs a rash of G200 deaths. Quoting a bliz developer from that thread:

A video card overheating on WoW's login/char select screen, NCSoft's Aion character select screen or in any other game, really, shows that your card has design flaws, is manufactured incorrectly or your particular setup has issues. An entire GPU family tends to die for the first and second thing so when you see an outbreak, it's that. We've seen it once in recent times with no programs in particular.

People are running all sorts of drivers in that thread, not just the "g92 card killer" drivers. It's not driver specific.

As far as graphically demanding: eve-o used to overheat high end GPUs and laptops as well because the game had no frame cap. Cards ran at well over 400 fps on some hardware simply because the game wasn't graphically demanding. It's fine to render at that frame rate for a login screen because typically it's not displayed for long. MMO sessions go on long enough to bork some hardware.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
How would running a game at whatever FPS the GPU can handle be bad for the GPU?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
You tell me. Running other games my 8800 was topping out in high 50s and very low 60s. Running eve at hundreds (thousands?) of FPS I was looking at mid to high 80s and a faint but definite squealing noise coming from either the card or the PSU (or both). Google for "eve interval one" for more juicy details.

Since most early 8800GTs idled in the low 70s and loaded to 100s for some people I could see an additional 20-30C being a big problem for them.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Genx87 said:
How would running a game at whatever FPS the GPU can handle be bad for the GPU?
For laptops in particular it can be a problem since they often have 1-2 coolers tasked with cooling the entire system in the name of less noise and lower weight. A game that can drive up the CPU and GPU simultaneously can generate more heat than the laptop is really capable of dealing with, leading to issues.

However with discrete cards this is rarely an issue. Those cards have their own dedicated cooling system which is closely tied to GPU temperature, and unless they're completely cut off from airflow can simply ramp up to suck in more air.
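To a first approximation that cooling logic is just a curve mapping GPU temperature to fan duty cycle. A toy sketch of the idea in C++ (the set points and function name here are made up for illustration, not any vendor's actual BIOS table):

[code]
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative fan curve: map GPU temperature (C) to fan duty cycle (%)
// by linear interpolation between set points. Points are hypothetical.
int fan_duty_from_temp(double temp_c) {
    static const std::vector<std::pair<double, int>> curve = {
        {40.0, 30}, {60.0, 40}, {75.0, 60}, {85.0, 85}, {95.0, 100}};
    if (temp_c <= curve.front().first) return curve.front().second;
    if (temp_c >= curve.back().first) return curve.back().second;
    for (std::size_t i = 1; i < curve.size(); ++i) {
        if (temp_c < curve[i].first) {
            double t0 = curve[i - 1].first, t1 = curve[i].first;
            int d0 = curve[i - 1].second, d1 = curve[i].second;
            // Interpolate linearly between the two surrounding set points.
            return static_cast<int>(d0 + (temp_c - t0) / (t1 - t0) * (d1 - d0));
        }
    }
    return curve.back().second;  // not reached
}
[/code]

The point being: as long as some air actually reaches the card, rising temps just push the duty cycle up the curve.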

It's an interesting thread, but it doesn't make much sense. The temperatures listed are fine for a GT200 card. It's possible that it's localized heating though - since the game isn't very demanding it would be the ROPs getting a workout if there's no frame cap.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
From reading that forum the problems seem to be coming from the main screen. The main screen is really not intensive and you can see it rendering at hundreds of FPS. Apparently that is what is damaging the nvidia cards.
 

Athadeus

Senior member
Feb 29, 2004
587
0
76
Actually, I have recently noticed that my HD 4770 with a non-reference cooler, which is normally silent even when gaming, makes noise when I'm between matches in SC2, and it instantly goes quiet when I alt-tab. I should check my temps and load and post them, just for the heck of it.
 

SHAQ

Senior member
Aug 5, 2002
738
0
76
Maybe it is running at 1000FPS and heating up the cards. Perhaps they can put in a frame rate limiter like Unreal Engine 3 does.
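For what it's worth, a frame limiter is a tiny amount of code: render, then sleep off whatever is left of the frame budget. A minimal sketch in C++ (the loop structure and names are just for illustration, not how Blizzard or Epic actually implement it):

[code]
#include <chrono>
#include <thread>

// Minimal frame cap: after rendering, sleep until the next frame is due
// instead of spinning the GPU at uncapped speed on trivial scenes.
void run_capped(void (*render_frame)(), double target_fps) {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));
    auto next = clock::now();
    for (;;) {
        render_frame();                       // e.g. the menu/login screen
        next += budget;                       // when the next frame is due
        std::this_thread::sleep_until(next);  // idle instead of re-rendering
    }
}
[/code]

On a menu screen that sleep is what keeps the card from redrawing the same frame a thousand times a second.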
 

blanketyblank

Golden Member
Jan 23, 2007
1,149
0
0
I think they'll probably fix this by the time they release the actual game.
However, what it does mean is that if you have a 65nm G200, you should never play the beta for this game.
 

Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
Funny, because playing this game @ Ultra settings made my 9600 GT Low Power start to artifact. I could hear the fan winding up while playing (maybe it was maxed out, not sure). When I exited the game my Windows desktop was still messing up!! Instead of messing with it, I threw in my XXX 8800 GT and played with that instead. No more artifacts, but man, my fan sounds like it's about to take off. It seems the graphics are not that great, but the game keeps your video card running @ pretty high usage constantly.
 

alcoholbob

Diamond Member
May 24, 2005
6,390
470
126
Hah, Neverwinter Nights 2 used to do this to my GTX 285 FTW. In Crysis I was getting maybe 85C at 1080p, max DX10 settings with 8xAA...

With Neverwinter Nights 2 at 1080p, full settings, no AA, the card ran up to 110C and the clock speed started throttling at full fan speed. And that game looks very pedestrian.

Glad I got a 5870 instead; it tops out at 70C no matter what the settings are on either game.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Did they manage to duplicate the FurMark code for the main screen?

I always thought it was pretty stupid for people to think it's okay for a video card to fail furmark and have the attitude that a real game would never cause temperatures like that. The simplest rebuttal is: What if a game uses the same kind of visual effect that causes those temperatures? A video card needs to be able to run any kind of application without any danger of failure just like a CPU can, otherwise it is defective and should be returned.
 

Piotrsama

Senior member
Feb 7, 2010
357
0
76
Face2Face said:
Funny, because playing this game @ Ultra settings made my 9600 GT Low Power start to artifact. I could hear the fan winding up while playing (maybe it was maxed out, not sure). When I exited the game my Windows desktop was still messing up!! Instead of messing with it, I threw in my XXX 8800 GT and played with that instead. No more artifacts, but man, my fan sounds like it's about to take off. It seems the graphics are not that great, but the game keeps your video card running @ pretty high usage constantly.
(Not directed at you, nice to read your experience)

If a game isn't optimized... people complain.
Now that Blizzard has optimized their engine to fully utilize the Nvidia architecture (causing overheating in the process)... people will also complain??

That's a new one!! :p
 
Dec 30, 2004
12,553
2
76
v8envy said:
Looks like SC2 code isn't playing well with 65nm G200 cards at all. In particular, there's a rash of suddenly "checkerboarding" (dead) GTX 280s and a few 192-shader GTX 260s among beta players. Apparently EVGA is claiming board microfractures from the cards operating at much higher temperatures than normal. SC2 is the only known game to cause widespread G200 death.

http://forums.battle.net/thread.html?topicId=24038430815&sid=5000

So if you have one and plan to play a lot of SC2 you may wish to upgrade sooner rather than later.

Every Nvidia card I have owned has run super hot.
 

thelanx

Diamond Member
Jul 3, 2000
3,299
0
0
Dark4ng3l said:
From reading that forum the problems seem to be coming from the main screen. The main screen is really not intensive and you can see it rendering at hundreds of FPS. Apparently that is what is damaging the nvidia cards.

SHAQ said:
Maybe it is running at 1000FPS and heating up the cards. Perhaps they can put in a frame rate limiter like Unreal Engine 3 does.

v8envy said:
You tell me. Running other games my 8800 was topping out in high 50s and very low 60s. Running eve at hundreds (thousands?) of FPS I was looking at mid to high 80s and a faint but definite squealing noise coming from either the card or the PSU (or both). Google for "eve interval one" for more juicy details.

Since most early 8800GTs idled in the low 70s and loaded to 100s for some people I could see an additional 20-30C being a big problem for them.

I am wondering if the high fps is stressing out other components on the card, such as the capacitors (which are likely responsible for the whine noise). Those fractures may be the result of overheating of certain components, or maybe the high-frequency vibrations generated by those components? Not having an EE or any other engineering background, this is just an idle theory.