
What kills graphics cards the fastest?

AmongthechosenX

Senior member
Just randomly thought about this, but what kills graphics cards the fastest? Which dies first, the GPU or the memory?

How does this happen most of the time? Gaming? Folding? Overheating?

Just thought about it, so I felt like asking lol
 
I've never had a video card die on me, even after heavy usage (24/7 folding, heavy overclocking, heating it up to 100 °C).

That is, none ever died on me before I stopped wanting to use it and upgraded. I usually upgrade every 1.5-2 years.
 
From what I've heard (although it never happened to me), usually the memory goes first, for various reasons: heat, unstable clocks, or just because it wants to die for no reason at all. 😉
 
It usually starts with artifacts. That would usually mean memory, but nowadays the core can also cause artifacts.

Memory should last a lifetime. I've got memory sticks that are over 10 years old and still working. Either they work or they don't.

I think what's been killing memory on graphics cards in recent years is that manufacturers use the same heatsink to cool the memory that also cools the GPU core. Cooling semi-warm memory with a hot heatsink running at 70-90 °C is a stupid idea.
 
Originally posted by: Azn
Cooling semi-warm memory with a hot heatsink running at 70-90 °C is a stupid idea.

Yeah, I hate that. You have a lot of heat transferred from the GPU to the memory chips, which would otherwise stay very cool.
 
Originally posted by: error8
Originally posted by: Azn
Cooling semi-warm memory with a hot heatsink running at 70-90 °C is a stupid idea.

Yeah, I hate that. You have a lot of heat transferred from the GPU to the memory chips, which would otherwise stay very cool.

Agreed... those little heatsinks are pointless once you connect them to the GPU cooling block.

 
A bullet to the GPU...

Mmm, no wait... lightning...

But you probably mean in normal use, which would be voltage, followed by heat.
 
Bad games

Get pissed that you spent $500 on a video card only to get 20 fps in Crysis, so you smash it against a wall :laugh:


(Actually, I like Crysis.)
 
A quick check of the roshambo rulebook shows that both rock and scissors trump graphics card.

Graphics card trumps paper, though, so that counts for something, doesn't it?
 
Heat is the number one enemy of all electronics.

With that being said, the memory tends to go first. The VRAM on my 6800 GT has been acting funny for a few years now (random artifacting, usually in 2D environments), and seeing as the cooling on those chips can be considered inadequate for such a warm-running card, it's not surprising.
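Since heat keeps coming up as the main killer here, a simple way to stay ahead of it is to watch GPU temperatures and flag anything running hot. Below is a rough Python sketch that parses the kind of per-GPU output you get from `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader`; the 85 °C threshold is just an example value I picked, not a vendor spec, and the sample string stands in for real tool output.

```python
# Sketch of a GPU temperature watchdog. The input format assumes the
# one-temperature-per-line CSV output of:
#   nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader
# The 85 °C limit below is an arbitrary example threshold, not a spec.

def parse_temps(csv_output: str) -> list[int]:
    """Parse one temperature (in °C) per line, skipping blank lines."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def too_hot(temps: list[int], limit: int = 85) -> list[int]:
    """Return the indices of GPUs at or above the limit."""
    return [i for i, t in enumerate(temps) if t >= limit]

# Canned example output for a two-GPU system: GPU 0 at 71 °C, GPU 1 at 93 °C.
sample = "71\n93\n"
hot = too_hot(parse_temps(sample))
for idx in hot:
    print(f"GPU {idx} is running at or above 85 °C, check the cooling!")
```

In a real script you would feed `parse_temps` the output of a `subprocess.run` call to `nvidia-smi` on a timer, but the parsing and threshold logic is the whole idea.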
 