Question: If the 3700x, 3800x, and 3900x have very similar game performance (FPS measurement), does that mean they are consuming the same amount of power?

leepox

Junior Member
Jul 21, 2020
9
3
36
In theory, to reach the same level of gaming performance, all of the above should consume a similar-ish amount of power, right? Maybe apart from the 3900x, which has two chiplets.

I'm sure it would be a different story in Cinebench and the like, when all cores are synthetically maxed out. But for gaming purposes, I assume the difference would be negligible? It's just that I'm worried my current cooling set-up is inadequate (I currently have a 3700x that doesn't boost beyond 4 GHz, so I'm selling it), and that upgrading to a 3900x would be a big mistake in terms of thermals and power.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
There's going to be variance from game to game, but they're all going to draw well under the figures from typical CPU stress tests. Reviewers should be testing power consumption under more scenarios, in my opinion. Since they don't, the data is limited, but here is some:

CPU only test measurement -

[Chart: power-hitman2_r5-3600-review.png]


System measurement -

[Chart: power-gaming.png]
 

leepox

Junior Member
Jul 21, 2020
9
3
36
I swear I've been looking for benchmarks on the net; how you found them, I have no idea... I must really suck at Google :oops:

Thanks for these, exactly what I wanted to see. By the looks of it, about 20 watts more than the 3700x under normal gaming scenarios. I guess that's the penalty for better 1% minimum fps, and maybe an extra few frames every now and then. Perhaps higher sustained boost clocks as well?

Surprised it's on par with the 1700x (which I also had). I guess that answers the question: my 240mm AIO should be enough to keep it cool for gaming. I'm quite interested in seeing how much improvement it will give for video encoding, though I suspect even that won't max out all of the cores. I really need to buy a power plug meter.
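(If you're on Linux, you can get a ballpark package-power figure without a plug meter: recent kernels expose an energy counter through the powercap/RAPL interface. A minimal sketch, assuming the intel-rapl powercap path exists on your system — it's standard on Intel, and Zen 2 support only landed in newer kernels, so the path may be missing; this reports CPU package power only, not wall power.)

```python
# Rough CPU package power estimate from the Linux powercap (RAPL) interface.
# Assumes /sys/class/powercap/intel-rapl:0 exists (Intel, or AMD Zen 2 on a
# recent kernel). Reading energy_uj may require root on newer kernels.
import time

RAPL = "/sys/class/powercap/intel-rapl:0"

def read_uj(path):
    with open(path) as f:
        return int(f.read())

def package_watts(interval=1.0):
    max_uj = read_uj(f"{RAPL}/max_energy_range_uj")
    e0 = read_uj(f"{RAPL}/energy_uj")
    time.sleep(interval)
    e1 = read_uj(f"{RAPL}/energy_uj")
    delta = (e1 - e0) % max_uj          # handle counter wrap-around
    return delta / 1e6 / interval       # microjoules -> watts

if __name__ == "__main__":
    while True:
        print(f"package: {package_watts():6.1f} W")
```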
 

ondma

Platinum Member
Mar 18, 2018
2,719
1,279
136
There's going to be variance from game to game, but they're all going to draw well under the figures from typical CPU stress tests. Reviewers should be testing power consumption under more scenarios, in my opinion. Since they don't, the data is limited, but here is some:

CPU only test measurement -

[Chart: power-hitman2_r5-3600-review.png]


System measurement -

[Chart: power-gaming.png]
Interesting that Intel's much-maligned high power consumption is essentially the same as Ryzen's in actual real-life gaming.
 

Hitman928

Diamond Member
Apr 15, 2012
5,236
7,785
136
That is whole-system power, most likely including a 2080 Ti, unlike the 10900k, which can use 250 watts all by itself.

While the 10900k can use a lot of power when running flat out, there's no game that I know of that will tax it anywhere near full load. Most games today still really only use about 4 cores / 8 threads, although 6/12 is becoming more common among modern engines. When only about half the cores are loaded, and even then with a lot of up-and-down usage, the average power use of the 10900k drops dramatically. The same goes for the AMD chips. This cuts both ways, though: if you're buying a 10900k just for gaming, you're wasting money. An overclocked 10600k will actually outperform a stock 10900k in today's games and basically match it when both are overclocked. So what's the point of buying the 10900k if all you care about is playing games today?

If we start to see games utilizing 8+ cores plus SMT in the future, I would expect more separation between CPUs in power consumption, and obviously a much larger performance lead for CPUs with that many cores or more.
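As a back-of-envelope illustration of why partial, bursty loading pulls average power down so far, here's a rough model. Every number in it is invented for illustration (per-core watts, idle draw, and duty cycle all vary with the chip, clocks, and game), so treat it as a sketch of the reasoning, not a measurement:

```python
# Toy model: average package power under a partial, bursty game load.
# All numbers are illustrative assumptions, not measurements.

def avg_package_watts(active_cores, watts_per_core, duty_cycle, idle_watts):
    """Average power when `active_cores` are busy only `duty_cycle` of the time."""
    return idle_watts + active_cores * watts_per_core * duty_cycle

# Hypothetical 10-core chip: all-core stress test vs. a 6-thread game
# (games also load each core more lightly, hence lower per-core watts).
stress = avg_package_watts(active_cores=10, watts_per_core=20, duty_cycle=1.0, idle_watts=15)
gaming = avg_package_watts(active_cores=6,  watts_per_core=12, duty_cycle=0.6, idle_watts=15)

print(f"stress test ~{stress:.0f} W, gaming ~{gaming:.0f} W")
# With these made-up numbers: ~215 W under stress vs. ~58 W while gaming.
```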
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,495
136
While the 10900k can use a lot of power when running flat out, there's no game that I know of that will tax it anywhere near full load. Most games today still really only use about 4 cores / 8 threads, although 6/12 is becoming more common among modern engines. When only about half the cores are loaded, and even then with a lot of up-and-down usage, the average power use of the 10900k drops dramatically. The same goes for the AMD chips. This cuts both ways, though: if you're buying a 10900k just for gaming, you're wasting money. An overclocked 10600k will actually outperform a stock 10900k in today's games and basically match it when both are overclocked. So what's the point of buying the 10900k if all you care about is playing games today?

If we start to see games utilizing 8+ cores plus SMT in the future, I would expect more separation between CPUs in power consumption, and obviously a much larger performance lead for CPUs with that many cores or more.
What I was trying to get through to ondma was that a 10900k will use way more than 72 watts when gaming at full tilt. I'll guess 150?
 

Hitman928

Diamond Member
Apr 15, 2012
5,236
7,785
136
What I was trying to get through to ondma was that a 10900k will use way more than 72 watts when gaming at full tilt. I'll guess 150?

I would actually be surprised if it's 150 W. Most games do not stress the CPU very much. There are a few that might start to get close to 150 W (speculation) but it wouldn't surprise me if most don't.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,495
136
I would actually be surprised if it's 150 W. Most games do not stress the CPU very much. There are a few that might start to get close to 150 W (speculation) but it wouldn't surprise me if most don't.
Maybe 110-120? I just know it will be considerably more than 72.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
What I was trying to get through to ondma was that a 10900k will use way more than 72 watts when gaming at full tilt. I'll guess 150?
You should rather be trying to answer his question of why the 3900x is consuming 30W+ more power while being slower in gaming, since the GPU is obviously being pushed harder in the 9900k system and so should consume more power as a result.
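Worth remembering how much a wall measurement folds together, though. A quick back-of-envelope; every component figure below is invented purely to illustrate the point, since actual GPU draw, rest-of-system draw, and PSU efficiency can't be read off the chart:

```python
# Illustration: a whole-system (wall) power delta is not a CPU power delta.
# All component figures are invented for the example.

psu_efficiency = 0.90          # assume a ~90% efficient PSU at this load
system_a_wall  = 385           # e.g. the 3900x system from the chart (W at the wall)
system_b_wall  = 356           # e.g. the 9900k system from the chart (W at the wall)

# Convert wall power to DC power actually delivered to components.
dc_a = system_a_wall * psu_efficiency
dc_b = system_b_wall * psu_efficiency

# If the faster system drives the GPU 10 W harder (made-up number),
# the CPU-side difference is not the same as the raw 29 W wall delta.
gpu_extra_on_b = 10
cpu_side_delta = (dc_a - dc_b) + gpu_extra_on_b

print(f"wall delta: {system_a_wall - system_b_wall} W")
print(f"rough CPU-side delta: {cpu_side_delta:.0f} W")   # ~36 W with these assumptions
```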
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,495
136
You should rather be trying to answer his question of why the 3900x is consuming 30W+ more power while being slower in gaming, since the GPU is obviously being pushed harder in the 9900k system and so should consume more power as a result.
He didn't ask that question. Also, no 9900k was mentioned, and the 9700k is shown at 21 watts less, but with no benchmark to show whether it was slower or faster.
 

leepox

Junior Member
Jul 21, 2020
9
3
36
So by the sounds of it, stick to the cheaper option, i.e. the 3800x, since games don't really make use of the extra cores?

At least I can put the money saved towards the 4000 series, as this is pretty much a stopgap to last me until next summer.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
He didn't ask that question. Also, no 9900k was mentioned, and the 9700k is shown at 21 watts less, but with no benchmark to show whether it was slower or faster.
The 9900k is second on the list in the second chart at 356W, against the 3900x at 385W.

Do you mean you haven't called the 9900k "hot," among other less desirable adjectives?

Edit: Less desirable adjectives related to power consumption?
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,540
14,495
136
The 9900k is second on the list in the second chart at 356W, against the 3900x at 385W.

Do you mean you haven't called the 9900k "hot," among other less desirable adjectives?

Edit: Less desirable adjectives related to power consumption?
I missed that. Still, the question is: what do the benchmarks show from that test? The 3900x does win some.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Not at any resolution in Witcher 3, which is what the power consumption graph above is based on.
 

DrMrLordX

Lifer
Apr 27, 2000
21,612
10,819
136
That is whole-system power, most likely including a 2080 Ti, unlike the 10900k, which can use 250 watts all by itself.

CPUs aren't going to be doing very much when running a game, compared to something like Blender. I suspect that Ryzen CPUs might actually use a bit more power in games just due to the interconnect, though I'm skeptical of the linked chart showing a 9900k using less than an 8700k. Really?
 

rbk123

Senior member
Aug 22, 2006
743
345
136
So by the sounds of it, stick to the cheaper option, i.e. the 3800x, since games don't really make use of the extra cores?

At least I can say I'll save that money towards the 4000 series as this is pretty much a stop gap for me to last until next summer.
You could get the 3600 for gaming and save even more $$ towards the 4000 series. The newer batches seem to just keep getting faster.
 

leepox

Junior Member
Jul 21, 2020
9
3
36
I've got a really good offer from a friend who is selling his 3900x, unopened box, for the price of a 3800x because he's downgrading his build after losing his part-time job. So I'm really torn about this, especially as the 4000 series is around the corner.

Having said that, I tend to upgrade a year into production of new chips, mainly because manufacturing processes improve over time, so you get more bang for the buck in terms of chip quality. The full line-up doesn't really come to fruition until then either, for the same reason, once they can bin chips better. So I'm not looking to upgrade again until at least this time next year.
 

DrMrLordX

Lifer
Apr 27, 2000
21,612
10,819
136
Because they ran the 9900k at stock (95W), where the 9900k is very efficient indeed.

That isn't stock. You have to manually adjust the UEFI on nearly any Z390 motherboard to get a 9900k to run that way. If they did the same thing to an 8700k, it would use less power. 9900ks are slightly more refined, but there's no way the 8-core 9900k is using less power than the 8700k unless a) the load taxes at most 6 cores (HT included) and b) there isn't enough load on the CPU for the 9900k's higher boost clocks to come into play.
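For context on what "stock" means here: Intel's turbo power management works off a sustained limit (PL1, the rated TDP), a short-term limit (PL2), and a time window (Tau), and most Z390 boards ship with those limits effectively removed, which is the MCE-style behaviour being debated. A very simplified sketch of the mechanism; the PL2 and Tau values below are just the commonly cited reference numbers for a 95 W part, and real boards often differ:

```python
# Simplified view of Intel's PL1/PL2/Tau turbo power limiting.
# Reference-style numbers for a 95 W part; real boards often raise or remove them.

PL1 = 95.0    # sustained package power limit (W) -- the rated TDP
PL2 = 119.0   # short-term boost limit (W), commonly cited as ~1.25 * PL1
TAU = 28.0    # seconds the chip may average above PL1 before pulling back

def allowed_package_watts(seconds_under_heavy_load):
    """Very simplified: PL2 is allowed for roughly Tau seconds, then PL1."""
    return PL2 if seconds_under_heavy_load < TAU else PL1

for t in (5, 20, 60, 300):
    print(f"t={t:>3}s  limit ~{allowed_package_watts(t):.0f} W")
# With the limits removed ("MCE"-style), the chip just keeps boosting
# indefinitely, which is where the very high stress-test numbers come from.
```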
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
That isn't stock. You have to manually adjust the UEFI on nearly any Z390 motherboard to get a 9900k to run that way. If they did the same thing to an 8700k, it would use less power. 9900ks are slightly more refined, but there's no way the 8-core 9900k is using less power than the 8700k unless a) the load taxes at most 6 cores (HT included) and b) there isn't enough load on the CPU for the 9900k's higher boost clocks to come into play.
Wow, I didn't know a 9900k with MCE on could be so energy efficient.

[Image: 1595527655972.png]

Do you realize the 3700x is drawing more power than the 9900k in the multithreaded CB chart in my previous post?
 

Keljian

Member
Jun 16, 2004
85
16
71
Wow, I didn't know a 9900k with MCE on could be so energy efficient.

[Image attachment 26585]

Do you realize the 3700x is drawing more power than the 9900k in the multithreaded CB chart in my previous post?

You can actually tune it to be even more efficient: enable Speed Shift and HWP, drop the voltage offset, make sure you have all C-states enabled, and enable core parking (they stealth-fixed it). Cinebench doesn't hit the cores as hard as Blender, though.

Intel and motherboard makers typically don't enable this stuff by default, as it can (rarely) reduce benchmark scores. I haven't found that to be the case on mine.

That said, I have tuned mine heavily for power savings and noise reduction, even with 5 GHz all-core peaks (tempted to go further).

As for why the 9900k in the above chart is consuming less power, it's probably the power-hungry X570 chipset.
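If anyone wants to sanity-check from inside the OS that those settings actually took effect, here's a minimal sketch for Linux; it assumes the intel_pstate driver and the usual cpuidle sysfs paths are present (on Windows you'd look at the equivalents in HWiNFO or the power plan settings instead):

```python
# Quick Linux-side check that HWP/Speed Shift is active and C-states are enabled.
# Assumes the intel_pstate driver and standard cpuidle sysfs layout are present.
from pathlib import Path

def read(p):
    return Path(p).read_text().strip()

# "active" means intel_pstate (and HWP, if the CPU supports it) is managing P-states.
print("intel_pstate status:", read("/sys/devices/system/cpu/intel_pstate/status"))

# List each C-state for cpu0 and whether it has been disabled.
for state in sorted(Path("/sys/devices/system/cpu/cpu0/cpuidle").glob("state*")):
    name = read(state / "name")
    disabled = read(state / "disable") == "1"
    print(f"{state.name}: {name:>6}  {'DISABLED' if disabled else 'enabled'}")
```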
 

DrMrLordX

Lifer
Apr 27, 2000
21,612
10,819
136
Do you realize the 3700x is drawing more power than the 9900k in the multithreaded CB chart in my previous post?

Is MCE on? Doesn't look like it. That definitely doesn't gel with other reviews of the 9900k power-wise:


No way is the 8700k so close power-wise to the 9900k unless the 8700k has MCE on while the 9900k (for whatever reason) doesn't. Also, those TPU power numbers for the 2700x look suspect.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Is MCE on? Doesn't look like it. That definitely doesn't gel with other reviews of the 9900k power-wise:


No way is the 8700k so close power-wise to the 9900k unless the 8700k has MCE on while the 9900k (for whatever reason) doesn't. Also, those TPU power numbers for the 2700x look suspect.
But that is peak power draw while the chip is opportunistically boosting all cores to the max. Gaming power draw is different. The 8700k will push its 6 cores harder to attain the same fps, and as a result should be less efficient.
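That last point is really about the voltage-frequency curve: dynamic power scales roughly with f x V^2, and voltage has to rise to hold higher clocks, so squeezing the same throughput out of fewer, faster cores costs disproportionately more power. A toy illustration, with the clock, voltage, and capacitance numbers entirely made up:

```python
# Toy illustration of why fewer cores at higher clocks cost more power for the
# same throughput: dynamic power ~ C * V^2 * f, and V rises with f.
# All clock/voltage/capacitance numbers are invented for the example.

def core_power(freq_ghz, volts, cap=10.0):
    return cap * volts**2 * freq_ghz     # arbitrary units

# Same total "throughput" (core-GHz) reached two ways:
six_fast   = 6 * core_power(freq_ghz=4.8, volts=1.30)   # 6 cores pushed hard
eight_slow = 8 * core_power(freq_ghz=3.6, volts=1.05)   # 8 cores running easier

print(f"6 x 4.8 GHz: {6*4.8:.1f} core-GHz at ~{six_fast:.0f} units")
print(f"8 x 3.6 GHz: {8*3.6:.1f} core-GHz at ~{eight_slow:.0f} units")
# 28.8 core-GHz both ways, but the 6-core configuration burns noticeably more.
```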