Intel processors crashing Unreal engine games (and others)


Hans Gruber

Platinum Member
Dec 23, 2006
2,337
1,237
136
4C/4T, no matter how high clocked (within reasonable contemporary limits) just isn't where it's at anymore.
It has no issues with CS2. I am bringing up older games and their performance here. There is definitely an issue with Unreal Engine games. I have never seen an i5 Intel processor do the weird things I saw this morning in PUBG. 4-core Intel processors started to have issues with games like Battlefield 5; that game meant the end of the 4-core gaming processor. The FPS would be good, but there was stuttering and lag.

I know PUBG is not too much of a game for a 7600K and a 1660 Super. I was getting over 100 FPS in that Unreal Engine game at 1080p.
 
Jul 27, 2020
20,565
14,288
146
I know PUBG is not too much of a game for a 7600K and a 1660 Super. I was getting over 100 FPS in that Unreal Engine game at 1080p.
If a game update caused the issue, it's more an issue caused by the developers than by Intel. Of course, playing on Windows 11 with the memory integrity feature turned on and all the Meltdown/Spectre mitigations will only make the poor CPU lag behind even more. Maybe try an older OS with older drivers and see if the issue persists.

Another option is trying it on Linux, where the mitigations can be turned off.
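For anyone who wants to see which mitigations are actually active before experimenting, here is a minimal sketch, assuming a Linux kernel that exposes the standard sysfs vulnerabilities directory (the exact entries vary by kernel version):

```python
#!/usr/bin/env python3
# Print the kernel's reported CPU vulnerability/mitigation status.
# Assumes a reasonably recent Linux kernel; this directory does not exist on Windows.
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

if not VULN_DIR.is_dir():
    raise SystemExit("vulnerabilities directory not found (not Linux, or kernel too old)")

for entry in sorted(VULN_DIR.iterdir()):
    print(f"{entry.name:24s} {entry.read_text().strip()}")

# Disabling mitigations for an A/B test is done at boot time with the kernel
# parameter `mitigations=off`; it cannot be toggled from a running system.
```

On a 7600K the mitigation overhead in games is generally reported to be small, but booting once with `mitigations=off` makes for a clean A/B test against the Windows install.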
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,337
1,237
136
How did you isolate the problem to the CPU?
I was playing with a rando in PUBG. Ironically, he had an i3-13100 CPU. He said, "hold on, I'm lagging in game." PUBG is a broken game, but it works, and I know what the game is capable of with all its quirks. I would drive across the map at high speed in a vehicle, and for no reason the game would bog down like a monster truck at the end of a tractor pull, "that's all she's got" type of thing. A 7600K should have no problems in a 7-year-old game like PUBG. I know people who play on Intel, and they never had problems with PUBG in the past. It's not a game that would require a CPU with 6 or more cores.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,337
1,237
136
If a game update caused the issue, it's more an issue caused by the developers than by Intel. Of course, playing on Windows 11 with the memory integrity feature turned on and all the Meltdown/Spectre mitigations will only make the poor CPU lag behind even more. Maybe try an older OS with older drivers and see if the issue persists.

Another option is trying it on Linux, where the mitigations can be turned off.
The computer I was using has Windows 10 Pro, 16GB of RAM, a 7600K, and a 1660 Super.
 

MrPickins

Diamond Member
May 24, 2003
9,112
760
126
I remember the days when AMD were ruthlessly mocked for their 220W FX-9590, the last hurrah of Piledriver. "Space heater", people called it. And now this is just a normal and acceptable power limit for Intel's main CPU? What the hell happened?

In my mind, I equate it with Prescott. Sure, it's competitive, but at what cost (power and thermal)?
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
In my mind, I equate it with Prescott. Sure, it's competitive, but at what cost (power and thermal)?
At a base power of 125W (look it up on Ark), the 14900K uses 91W on average across 41 apps, while the 7950X at default uses 128W. That's 41% more power that Ryzen needs to get 3.5% more performance than Intel. Just because it is overclockable and Ryzen is not doesn't mean it's inefficient. And base power isn't any eco mode either; it's the official TDP.

TechPowerUp power-consumption chart (attached image)
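For what it's worth, a quick sanity check of the arithmetic behind that claim, using only the averages quoted above (a back-of-the-envelope sketch, not new measurements):

```python
# Back-of-the-envelope check of the quoted TechPowerUp averages.
# The inputs are just the numbers cited above, not independent measurements.
intel_watts, amd_watts = 91.0, 128.0   # average application power draw (W)
amd_perf_vs_intel = 1.035              # 7950X roughly 3.5% faster in that average

extra_power = amd_watts / intel_watts - 1                       # ~0.41 -> "41% more power"
amd_perf_per_watt_ratio = (amd_perf_vs_intel / amd_watts) * intel_watts

print(f"Extra power drawn by the 7950X: {extra_power:.0%}")
print(f"7950X perf-per-watt vs 14900K:  {amd_perf_per_watt_ratio:.2f}x")
```

Whether the base-power average is the fair comparison point is a separate argument, but the quoted numbers are at least internally consistent.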
 

Rigg

Senior member
May 6, 2020
547
1,294
136
At a base power of 125W (look it up on Ark), the 14900K uses 91W on average across 41 apps, while the 7950X at default uses 128W. That's 41% more power that Ryzen needs to get 3.5% more performance than Intel. Just because it is overclockable and Ryzen is not doesn't mean it's inefficient. And base power isn't any eco mode either; it's the official TDP.

TechPowerUp power-consumption chart (attached image)
How's it look against the 7950X3D there?

Official TDP on an Intel K SKU...

 

Thunder 57

Diamond Member
Aug 19, 2007
3,043
4,763
136
At a base power of 125W (look it up on Ark), the 14900K uses 91W on average across 41 apps, while the 7950X at default uses 128W. That's 41% more power that Ryzen needs to get 3.5% more performance than Intel. Just because it is overclockable and Ryzen is not doesn't mean it's inefficient. And base power isn't any eco mode either; it's the official TDP.

TechPowerUp power-consumption chart (attached image)

I'm not sure what you are trying to prove, other than that you have an agenda. Also, no one is going to try to decipher two images cut and pasted next to each other. If you want to make an argument, make it easy to follow.
 

eek2121

Diamond Member
Aug 2, 2005
3,118
4,454
136
I remember the days when AMD were ruthlessly mocked for their 220W FX-9590, the last hurrah of Piledriver. "Space heater", people called it. And now this is just a normal and acceptable power limit for Intel's main CPU? What the hell happened?
Both AMD and Intel chips consume significantly more power than a Pentium 4, which was also known for being a space heater. Not saying that in a good or bad way; it just gives me a chuckle.
4C/4T, no matter how high clocked (within reasonable contemporary limits) just isn't where it's at anymore.
Steam Deck says hi. (Yes I know it is 4C/8T)

EDIT: Many developers of both games and engines are now optimizing for the Steam Deck, so funny enough, gaming with a quad core may improve.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,337
1,237
136
Both AMD and Intel chips consume significantly more power than a Pentium 4, which was also known for being a space heater. Not saying that in a good or bad way; it just gives me a chuckle.

Steam Deck says hi. (Yes I know it is 4C/8T)

EDIT: Many developers of both games and engines are now optimizing for the Steam Deck, so funny enough, gaming with a quad core may improve.
The older Intel processors still rip on basic games. The Unreal Engine issue is a new phenomenon that Intel should address. A 6-core PC is required for games like Battlefield 5 and later, and the new Call of Duty games need more cores. With the advent of games geared towards 6 cores, older Intel CPUs prior to the 8th generation fell behind; that is the generation where the 6-core i5 and 8-core i7 became the new standard.
 

Wolverine2349

Senior member
Oct 9, 2022
437
139
86
I remember the days when AMD were ruthlessly mocked for their 220W FX-9590, the last hurrah of Piledriver. "Space heater", people called it. And now this is just a normal and acceptable power limit for Intel's main CPU? What the hell happened?
Regarding power consumption, yes, but to be fair to Intel, the performance is on par with AMD or even slightly better overall despite the insane power draw.

Whereas the 220W AMD FX was an insane furnace of a heater and had significantly worse performance than the Intel chips of the time despite double the core count. Those pre-Ryzen FX chips had stability issues as well, and their IPC and performance were just embarrassingly bad; even the best Piledriver had clock-normalized IPC almost half that of the Skylake derivatives. The original Ryzen was a game changer that caught AMD up so much to Intel precisely because of how bad the FX chips were in IPC: it was much more power efficient and only about 15% behind Intel's latest offerings in IPC seven years ago. That massive leap from the horrible FX chips laid the groundwork for where AMD has been from mid-2019 to the present with Zen 2, and especially from late 2020 to the present with Zen 3.

I'm not going to defend Intel's insane power draw (not that I am that concerned about climate change or obsessed with energy efficiency for its own sake, and the increase in the electricity bill is negligible), but heating up the room and the inside of the PC case makes having a quiet system and a comfortable space so hard, and that is the part I care about. So I am all in on AMD: similar performance and much less heat output in a case that already has one power-hungry RTX 4090, which allows a much quieter system. You can at least rationalize insane GPU power consumption, since a GPU is a big device whose heat can be dissipated and cooled far more easily than a consumer-sized LGA 1700 or AM5/AM4 CPU.
 

bononos

Diamond Member
Aug 21, 2011
3,911
172
106
Regarding power consumption, yes, but to be fair to Intel, the performance is on par with AMD or even slightly better overall despite the insane power draw.

Whereas the 220W AMD FX was an insane furnace of a heater and had significantly worse performance than the Intel chips of the time despite double the core count. Those pre-Ryzen FX chips had stability issues as well, and their IPC and performance were just embarrassingly bad; even the best Piledriver had clock-normalized IPC almost half that of the Skylake derivatives. The original Ryzen was a game changer that caught AMD up so much to Intel precisely because of how bad the FX chips were in IPC: it was much more power efficient and only about 15% behind Intel's latest offerings in IPC seven years ago. That massive leap from the horrible FX chips laid the groundwork for where AMD has been from mid-2019 to the present with Zen 2, and especially from late 2020 to the present with Zen 3.

I'm not going to defend Intel's insane power draw (not that I am that concerned about climate change or obsessed with energy efficiency for its own sake, and the increase in the electricity bill is negligible), but heating up the room and the inside of the PC case makes having a quiet system and a comfortable space so hard, and that is the part I care about. So I am all in on AMD: similar performance and much less heat output in a case that already has one power-hungry RTX 4090, which allows a much quieter system. You can at least rationalize insane GPU power consumption, since a GPU is a big device whose heat can be dissipated and cooled far more easily than a consumer-sized LGA 1700 or AM5/AM4 CPU.
The FX-9590 was spec'd and measured at 220W TDP/power draw, whereas the current Intel CPUs are drawing close to double their rated TDP just to win some benchmarks, and they have been cheating like this for the past few generations.
 

SolidQ

Senior member
Jul 13, 2023
578
706
96

Saylick

Diamond Member
Sep 10, 2012
3,622
8,148
136
Inb4 Intel makes a statement that it's not the processor's fault, it's the mobo manufacturers who decide to juice up the chips beyond stability.

Meanwhile, if you ask why doesn't Intel enforce voltage/power limits on mobo makers, you get crickets.
 
Jul 27, 2020
20,565
14,288
146
Meanwhile, if you ask why doesn't Intel enforce voltage/power limits on mobo makers, you get crickets.
It's a symbiotic relationship. Mobo makers get to give extreme performance to their users with great benchmark scores. Intel gets the reassurance that its CPUs won't last 10 years due to active degradation from running beyond limits hence more future sales. WIN-WIN!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,783
25,302
146
Gamers Nexus and some of the other big tech reviewers should be all over this.

They could reimburse users experiencing the crashes to send in their boards and CPUs for testing. Maybe they could determine what the cause is.
It's a symbiotic relationship. Mobo makers get to give extreme performance to their users with great benchmark scores. Intel gets the reassurance that its CPUs won't last 10 years due to active degradation from running beyond limits hence more future sales. WIN-WIN!
That only works if the CPU lasts until the warranty is up, my friend. Above-normal RMA replacements on the most expensive parts is taking the L.
 

H433x0n

Golden Member
Mar 15, 2023
1,222
1,600
96
Gamers Nexus and some of the other big tech reviewers should be all over this.

They could reimburse users experiencing the crashes to send in their boards and CPUs for testing. Maybe they could determine what the cause is.

That only works if the CPU lasts until the warranty is up, my friend. Above-normal RMA replacements on the most expensive parts is taking the L.
It's going to happen. He'll do a 3-episode series on it, filled with moral outrage and righteous indignation, in true Gamers Nexus style.
 

H433x0n

Golden Member
Mar 15, 2023
1,222
1,600
96
Inb4 Intel makes a statement that it's not the processor's fault, it's the mobo manufacturers who decide to juice up the chips beyond stability.

Meanwhile, if you ask why doesn't Intel enforce voltage/power limits on mobo makers, you get crickets.
I mean, it sort of is the mobo manufacturers. Intel's real failure is that it doesn't actively enforce sane limits.

Even from a pure self-interest standpoint, I don't get what Intel is doing. The juice isn't worth the squeeze; that extra 3-5% in Cinebench and Blender is not worth it. It's such a simple trade-off: enforce the 253W limit as default, and they retain their scores in gaming and 1T benchmarks while losing a measly 3-5% in a handful of nT benchmarks. As a bonus, normie users will get a better experience too.
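As a rough illustration of what "enforce the 253W limit" could look like in practice, here is a minimal sketch using the Linux intel_rapl powercap interface; the sysfs paths are the usual ones on recent kernels but should be verified on your own system, and board firmware can still override whatever RAPL is set to:

```python
#!/usr/bin/env python3
# Read, and optionally cap, the CPU package power limits via the Linux
# intel_rapl powercap interface. A sketch only: paths assume a recent kernel,
# writing requires root, and motherboard firmware may not honor the values.
from pathlib import Path
import sys

PKG = Path("/sys/class/powercap/intel-rapl:0")  # package-0 RAPL domain

def read_watts(name: str) -> float:
    return int((PKG / name).read_text()) / 1_000_000  # sysfs values are in microwatts

if not PKG.is_dir():
    raise SystemExit("intel-rapl powercap domain not found (not Linux, or no RAPL support)")

pl1 = read_watts("constraint_0_power_limit_uw")  # long-term (PL1) limit
pl2 = read_watts("constraint_1_power_limit_uw")  # short-term (PL2) limit
print(f"PL1 = {pl1:.0f} W, PL2 = {pl2:.0f} W")

# Example: cap PL2 at 253 W, Intel's published maximum turbo power for the 14900K.
if "--cap" in sys.argv:
    (PKG / "constraint_1_power_limit_uw").write_text(str(253 * 1_000_000))
    print("PL2 capped at 253 W")
```

On Windows the equivalent knobs are the PL1/PL2 settings exposed by the board firmware, which is exactly the layer Intel has so far left to the motherboard vendors.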
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,337
1,237
136
Inb4 Intel makes a statement that it's not the processor's fault, it's the mobo manufacturers who decide to juice up the chips beyond stability.

Meanwhile, if you ask why doesn't Intel enforce voltage/power limits on mobo makers, you get crickets.
That is 100% false information, if that is what Intel believes. I have a 2017 B250 motherboard with a 7600K. I know what I saw in PUBG, which has never been a problem in the past for the millions of PUBG players. They need to fix their CPUs. From what it sounds like, it affects their processors up through the 13th and 14th generations.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
29,783
25,302
146
It's going to happen. He'll do a 3-episode series on it, filled with moral outrage and righteous indignation, in true Gamers Nexus style.
I certainly hope so. They have been at the forefront of consumer advocacy for the DIY community for years now. And I'll laugh every time they use the "Thanks Steve" "Back to you Steve" clips they took from that cringefest Intel presentation.