Question: Diablo 4 causing GPUs to die

Ranulf

Platinum Member
Jul 18, 2001
2,384
1,262
136
Just a heads up for anyone trying the Diablo 4 beta this weekend. The game is apparently bricking 3080 Ti cards, mostly Gigabyte ones. There are reports of it hitting other cards too, including AMD.


While Diablo IV's lenient PC spec requirements suggest a well-optimized game, some users are sharing troubling reports of their expensive graphics cards failing during gameplay. There have been multiple reports of NVIDIA RTX 3080 Ti GPUs failing while playing the Diablo IV early access beta, with symptoms like GPU fan speed skyrocketing to 100% followed by an outright hardware shutdown.

Blizz forum post on it:


Jayz2c video:

 

Saylick

Diamond Member
Sep 10, 2012
3,216
6,579
136
Thanks for the PSA. My brother has a 3080 Ti and he loves Blizzard games, so I immediately told him NOT to touch the D4 beta. He doesn't have a Gigabyte version (it's a Founders Edition), but you can't be too safe. I ain't tryna pay another $650 to replace it with something equivalent.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sounds like more people who think frame caps are a joke and that they should be able to run any game they want at unlocked frame rates. Because, as we all know, in this PvE game higher FPS means you kill demons way faster.

I just do not understand people who think it's fine to run the GPU at 100% load with uncapped FPS all the time.
 

Ranulf

Platinum Member
Jul 18, 2001
2,384
1,262
136
Sigh. Under-engineered cards FTL.

Maybe. It could also be bad programming in some games. Total War: Warhammer 3 has a known problem with many cards, pushing the GPU to 100% on the campaign map. It means my 2060S runs its fans at 2500rpm no matter the graphics settings, unless I cap the frames to 30fps in the Nvidia Control Panel.
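For anyone wondering what that kind of frame cap is actually doing under the hood, here's a minimal sketch in plain Python. `render()` is a hypothetical stand-in for the real per-frame GPU work; the point is just that sleeping off the leftover frame budget gives the hardware idle time instead of letting it redraw flat-out:

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms frame budget

def render():
    # stand-in for the real per-frame GPU work
    pass

def run_capped(num_frames):
    for _ in range(num_frames):
        start = time.perf_counter()
        render()
        # sleep off the rest of the frame budget, so the GPU sits idle
        # instead of immediately starting on the next frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)

t0 = time.perf_counter()
run_capped(15)
print(f"15 frames in {time.perf_counter() - t0:.2f}s")
```

With trivial `render()` work, 15 capped frames take roughly half a second; uncapped, the same loop would spin as fast as the hardware allows.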
 

VirtualLarry

No Lifer
Aug 25, 2001
56,397
10,080
126
Maybe. It could also be bad programming in some games. Total War: Warhammer 3 has a known problem with many cards, pushing the GPU to 100% on the campaign map. It means my 2060S runs its fans at 2500rpm no matter the graphics settings, unless I cap the frames to 30fps in the Nvidia Control Panel.
Could do what miners do, power-limit in AB and lower core clock.
 

Saylick

Diamond Member
Sep 10, 2012
3,216
6,579
136
If a card can't handle that, then it's faulty defective design.
In a vacuum, I think this is the correct take. If a card can't do 100% sustained load for an extended period of time that is analogous to a long gaming session, the card wasn't built right. This naturally includes the power circuitry along with the cooling system. However, I also do think that game developers should be cognizant that certain portions of their games should not be pegging the GPU at 100% load, e.g. menus and pre-scripted animations. It shouldn't require an fps cap that is set by the user to resolve this.
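To make that concrete, here's a toy sketch of a game-side, per-state cap (state names and numbers are hypothetical; plain Python). The idea is simply that menus and cutscenes ask for a much lower cap than gameplay, without the user having to set anything:

```python
# Hypothetical per-state frame caps: the game requests a low cap in
# menus and cutscenes, and defers to the user only during gameplay.
STATE_FPS_CAP = {
    "menu": 30,        # nothing moves fast; no reason to render 1000s of fps
    "cutscene": 60,    # pre-scripted pacing
    "gameplay": None,  # defer to the user's setting / monitor refresh
}

def frame_budget(state, user_cap=None):
    """Seconds the renderer should spend per frame in a given state."""
    cap = STATE_FPS_CAP[state] or user_cap
    return 1.0 / cap if cap else 0.0  # 0.0 means uncapped

print(frame_budget("menu"))           # ~0.0333 s per frame
print(frame_budget("gameplay", 144))  # ~0.00694 s per frame
```

So even a user who runs gameplay uncapped never has the menu spinning at four-digit frame rates.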
 

Ranulf

Platinum Member
Jul 18, 2001
2,384
1,262
136
Could do what miners do, power-limit in AB and lower core clock.

Yup, that's the other solution: lower clocks etc. to maybe 75-85%. It's just easier to limit it to 30fps, though that sucks for the RTS battles. It is clearly the game, though. The recommended settings for my system put almost everything at high, and tweaking most settings does nothing except, I think, turning off AA completely. I've kinda given up on the game though and gone back to WH2.
 

Ranulf

Platinum Member
Jul 18, 2001
2,384
1,262
136
100% is not "pushing".
GPUs have to be able to handle that.

Sure, what I mean is that the graphics on screen should not be pegging a GPU at 100% given what is displayed. If it does, fine, but it's sloppy coding by the devs when the game is on a cutscene or menu screen, or when the campaign map in WH3, which is not objectively better looking than WH2's, pushes the card harder than WH2, which the same system can run at ultra settings at over 80fps with vsync off. A game where the RTS battles, with hundreds of little units fighting on screen, stress the card less than an animated campaign map.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
If a card can't handle that, then it's faulty defective design.

While I believe that's likely the case here, there is nuance to it.

100% utilization with static load is one thing.

100% utilization with sudden changes in load is another. In the case of New World, the sudden changes in load caused massive current spikes. The utilization stayed 100%, but the type of load would suddenly change.

When you have chips that suffer from crazy transient issues like the 3000 series does, these fluctuations in the load make it significantly worse.

My comment was more about my annoyance anytime I run across somebody who absolutely has to run their games with uncapped FPS. Absolutely nothing is gained from it in an action RPG like Diablo. If you want to run the game at a high refresh rate, that's one thing: 120Hz, 144Hz, etc. The game is very well optimized, so that's not hard to do. But if somebody kills their card because they were running hundreds, and in some cases thousands, of FPS, I have zero pity for them.
 

ZGR

Platinum Member
Oct 26, 2012
2,054
661
136
I’m pretty sure this game would eventually kill my 3080 if I left it at stock. Stock fan curve at stock voltage sees my junction temps go well above 105C and power consumption is over 300W sustained. I like to stay below 95C and well under 0.9V

I'm not gonna test this. I like my GPU. I think this is another wake-up call to undervolt cheaply made RTX 3000 GPUs. Performance scaling is awful past 0.85V anyways.
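The V²f math backs that up. A rough dynamic-power sketch (P ≈ C·V²·f for CMOS logic; the voltage/frequency numbers below are purely illustrative, not real 3080 V/F curve points):

```python
# Rough dynamic-power model for CMOS: P is proportional to V^2 * f.
# Illustrative numbers only: pushing from 0.85 V to 1.00 V for a
# modest clock bump costs disproportionately more power.
def rel_power(v, f, v0=0.85, f0=1800):
    """Power relative to a 0.85 V / 1800 MHz baseline."""
    return (v / v0) ** 2 * (f / f0)

print(f"{rel_power(1.00, 1950):.2f}x power for {1950 / 1800:.2f}x clocks")
# -> 1.50x power for 1.08x clocks
```

That's why undervolting gives up so little performance: the last few percent of clocks ride the steep part of the V² curve.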
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I’m pretty sure this game would eventually kill my 3080 if I left it at stock. Stock fan curve at stock voltage sees my junction temps go well above 105C and power consumption is over 300W sustained. I like to stay below 95C and well under 0.9V

I'm not gonna test this. I like my GPU. I think this is another wake-up call to undervolt cheaply made RTX 3000 GPUs. Performance scaling is awful past 0.85V anyways.

And what happens if you turn on vsync?

Software should not be able to kill hardware operating in 'normal' conditions.
Imagine if CPUs were this finicky and sensitive.

Right, but running without a frame cap isn't "normal" conditions.
 

gdansk

Platinum Member
Feb 8, 2011
2,212
2,835
136
Right, but running without a frame cap isn't "normal" conditions.
No, it totally is a normal condition. If the GPU will fry itself at, say, 5000fps, then Nvidia should put a hard lock in the Graphics System Processor firmware to stop it. Like how CPUs have handled overheating for decades.

This trash tier hardware support that blames the users is something only gamers accept. Everyone else knows better.

Except for software specifically written to be destructive, it should be impossible to run a piece of software that bricks your GPU. CPUs, despite all their errata, have been doing this pretty reliably for decades. If it is a result of Gigabyte's factory OC, then they should be on the hook for warranty replacement.
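A toy sketch of the kind of firmware-level self-protection being described (all thresholds and numbers hypothetical; plain Python). The point is that the clamp lives below the game and the user, like a CPU's thermal throttle:

```python
# Hypothetical firmware guard: clamp clocks whenever temperature or
# frame rate runs away, regardless of what the game or user requests.
def protect(clock_mhz, temp_c, fps, temp_limit=95, fps_limit=5000):
    if temp_c > temp_limit or fps > fps_limit:
        return int(clock_mhz * 0.8)  # throttle 20%, CPU-style
    return clock_mhz

print(protect(1900, 104, 800))   # over-temp -> throttled to 1520
print(protect(1900, 70, 9000))   # runaway fps -> throttled to 1520
print(protect(1900, 70, 240))    # normal -> stays at 1900
```

No software running above that layer could then talk the card into cooking itself.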
 
Last edited:

GodisanAtheist

Diamond Member
Nov 16, 2006
6,927
7,334
136
I've said it before and I'll say it again: I wonder how many corners got cut on 2021 run graphics cards.

2020 cards should be fine since they were made with quality parts fabbed in 2019 and early 2020, but stuff built in 2021 and even early 2022 was likely squeezed out with substandard components subbed in to ensure orders were fulfilled.

Now that supply chain issues are largely resolved, I wonder if manufacturers will keep using substandard parts and pocket the savings (I mean, hey, no one in that supply chain benefits from using 20yr parts rather than 7-10yr parts).
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
However, I also do think that game developers should be cognizant that certain portions of their games should not be pegging the GPU at 100% load, e.g. menus and pre-scripted animations.
Nonsense. It's absolutely not a game developer's problem if a GPU dies under a certain workload. The billions nVidia invests in R&D and testing are supposed to include proper safeguards for their products.

It shouldn't require an fps cap that is set by the user to resolve this.
I agree. If we're relying on developers putting framerate caps in the menu then the product is a catastrophic design failure, and nVidia should be paying compensation to all affected parties.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
100% utilization with sudden changes in load is another. In the case of New World, the sudden changes in load caused massive current spikes. The utilization stayed 100%, but the type of load would suddenly change.
It's absolutely not the developer's problem to load-balance hardware in order to prevent physical failure.

When you have chips that suffer from crazy transient issues like the 3000 series does, these fluctuations in the load make it significantly worse.
Sure, which further highlights their design flaws. That's nVidia's problem, not the developers'.

My comment was more about my annoyance anytime I run across somebody who absolutely has to run their games with uncapped FPS. Absolutely nothing is gained from it in an action RPG like Diablo. If you want to run the game at a high refresh rate, that's one thing: 120Hz, 144Hz, etc. The game is very well optimized, so that's not hard to do. But if somebody kills their card because they were running hundreds, and in some cases thousands, of FPS, I have zero pity for them.
But this is nothing more than your opinion. Who decides which game is "allowed" a given FPS? Why shouldn't a 500Hz monitor owner be allowed to play Diablo 4 at 500 FPS, for example?

I personally cap my 2070 @ 60 FPS for many reasons, but I have the absolute right to run it uncapped full-bore 24/7 and expect it to last the duration of the warranty.

Right, but running without a frame cap isn't "normal" conditions.
Which nVidia reviewer's guide says to cap the framerate to ensure correct operation? And what if a reviewer wants to test Diablo 4?

Also where does it say on nVidia's GPU boxes "warning, product not designed to run at uncapped framerate, doing so may damage the hardware and void the warranty, do so at your own risk"?
 
Last edited: