Discussion Anand, the last tech reviewer who still does it properly

Intrepid3D

Junior Member
Jun 20, 2017
9
1
41
For whatever reason, a lot of tech reviewers these days make no effort to find performance differences between the CPUs they test when it comes to gaming.

There is a trend now to publish bar charts that are a solid block of near-equal-length bars across as many as 20 different CPUs. In the most extreme case I have seen, a 3300X delivers 95% of the performance of a 10900K or 5800X at 1080p. Really?

I'm not going to point fingers at individual publishers, but it seems pretty prevalent now: they have no interest in separating the CPUs that are good at gaming from those that are mediocre or bad. It's as if they all want to reach a predefined conclusion, the one they always make: "they are all the same."
No winners, no losers, everyone gets a prize. Well, that's nice, but for someone researching their next hardware it tells me absolutely nothing. Am I really supposed to believe that a 3300X is just as good as a 5800X?

It's obvious to me, but only because I understand a bit about it, that they are all publishing charts where the GPU, not the CPU, is the limitation. I'm looking at a performance test of the GPU they used, not of the CPUs.

It's only when you come to AnandTech that you see there is actually a tangible difference between these CPUs. For example: https://www.anandtech.com/show/16535/intel-core-i7-11700k-review-blasting-off-with-rocket-lake

With that, I would like to extend my thanks to AnandTech for still doing it properly. Thank you.
 

Iron Woode

Elite Member
Super Moderator
Oct 10, 1999
30,876
12,383
136
Just an FYI, but Anand sold the company and has been gone for a long, long time (7+ years now).

I think the last review he did was for Haswell CPUs. It has been Ian Cutress since then.
He may have meant AnandTech itself rather than Anand himself.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
I suppose looking at those results can be useful, but they usually aren't because most gamers will wind up GPU-bound before the CPU limits performance.

Not too many people will pair a Celeron with a Titan, but assuming they did, it might not matter if they're gaming at 4K with maxed settings. In that case the Titan buckles even before the lowly CPU.

It's interesting in an academic sense, but is it practical for the average person? Probably not, but this isn't a forum for the average person, so I do appreciate that AT takes the time to explore different aspects of performance like this, even though they won't apply to me.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
I suppose low-res benchmarks at high framerates make sense for a twitch shooter, but I also don't really get the push to manufacture a difference between CPUs by turning settings down to 720p low when they all test the same at 1080p high.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Some sites do benchmark Civilization turn times if you look for them: https://www.overclock3d.net/reviews/cpu_mainboard/amd_ryzen_5_5600x_review/17

It doesn't really matter what you have, since the newest CPUs all come in at around 30 seconds. Going from a 6-core Zen 3 CPU to a 16-core Zen 3 CPU shaves off 5 seconds, unless you overclock both CPUs, which for some reason causes the 6-core CPU to run 0.8 seconds faster. It seems like there's a clear limit to how well it scales with more threads.
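
For what it's worth, that flat scaling is roughly what Amdahl's law predicts when a decent chunk of the turn computation is serial. Here's a minimal sketch in Python, assuming a completely made-up 70% parallelizable fraction and a 60-second single-core baseline, just to illustrate the shape of the curve:

```python
# Rough Amdahl's-law illustration; every number here is invented, not measured.
def turn_time(base_seconds, parallel_fraction, cores):
    """Serial portion stays fixed; parallel portion divides by core count."""
    serial = base_seconds * (1 - parallel_fraction)
    parallel = base_seconds * parallel_fraction
    return serial + parallel / cores

base = 60.0  # hypothetical single-core turn time in seconds
p = 0.70     # assumed parallelizable fraction of the turn computation
for cores in (6, 12, 16):
    print(f"{cores} cores: {turn_time(base, p, cores):.1f} s")
# 6 cores: 25.0 s, 12 cores: 21.5 s, 16 cores: 20.6 s -- diminishing returns
```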

StarCraft II isn't benchmarked all that frequently anymore, but you can find results if you look for them. Most games actually take advantage of multiple cores now, so it isn't particularly useful to test something that's pretty much bound by single-core performance.
 
  • Like
Reactions: Tlh97 and Ranulf

Dave2150

Senior member
Jan 20, 2015
639
178
116
For whatever reason, a lot of tech reviewers these days make no effort to find performance differences between the CPUs they test when it comes to gaming.

There is a trend now to publish bar charts that are a solid block of near-equal-length bars across as many as 20 different CPUs. In the most extreme case I have seen, a 3300X delivers 95% of the performance of a 10900K or 5800X at 1080p. Really?

I'm not going to point fingers at individual publishers, but it seems pretty prevalent now: they have no interest in separating the CPUs that are good at gaming from those that are mediocre or bad. It's as if they all want to reach a predefined conclusion, the one they always make: "they are all the same."
No winners, no losers, everyone gets a prize. Well, that's nice, but for someone researching their next hardware it tells me absolutely nothing. Am I really supposed to believe that a 3300X is just as good as a 5800X?

It's obvious to me, but only because I understand a bit about it, that they are all publishing charts where the GPU, not the CPU, is the limitation. I'm looking at a performance test of the GPU they used, not of the CPUs.

It's only when you come to AnandTech that you see there is actually a tangible difference between these CPUs. For example: https://www.anandtech.com/show/16535/intel-core-i7-11700k-review-blasting-off-with-rocket-lake

With that, I would like to extend my thanks to AnandTech for still doing it properly. Thank you.

What's your opinion on AnandTech's timely review of the Nvidia 3080/3090?
 
  • Like
Reactions: DooKey

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
Why is nobody benchmarking Civilization turn times, StarCraft 2, and the like in CPU reviews? Games that would actually be CPU-limited, instead of artificially creating bottlenecks in FPS.

SC2 was never that great of a test. It's limited to two cores and probably really comes down to max single-core speed, favors Intel CPUs, and was not that well optimized by Blizzard to begin with.
 

scineram

Senior member
Nov 1, 2020
361
283
106
SC2 was never that great of a test. It's limited to two cores and probably really comes down to max single-core speed, favors Intel CPUs, and was not that well optimized by Blizzard to begin with.
That's exactly why it would be a good benchmark. Also, why would it favor Intel?
 

Timorous

Golden Member
Oct 27, 2008
1,606
2,747
136
SC2 was never that great of a test. It's limited to two cores and probably really comes down to max single-core speed, favors Intel CPUs, and was not that well optimized by Blizzard to begin with.

If it favours Intel, it favours Intel; that is useful data. I would like to see tick-rate tests for late-game Paradox grand strategy games and turn times for other 4X titles like Gal Civ, Age of Wonders, Endless Space and so on. It would also be good to see some high-population sprawling cities in Cities: Skylines and other city builders.

I play Civ 6 GS at 4K on a 2200G because that works perfectly fine with the strategy map. The issue I have is more related to turn times and general hitching when panning the map late game.

These are the kinds of games where the CPU matters more than the GPU, but they are so rarely tested, which is a real shame.
 
  • Like
Reactions: Tlh97 and scineram

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
I would like to see Quake Champions, but also older games like Star Wars Empire at War with mods.
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
I'm finally getting to compare a 4.85 GHz 5800X with DDR4-3600 CL18 against my i7-5775C. I'm able to test old games that websites don't use anymore, and the differences are far larger than AnandTech Bench or GN's benchmarks show.

In ARMA 3, the 5800X gets consistently 2x higher minimum fps than my i7 and an R5 3600 in CPU-limited missions with hundreds of AI. Or in Sins of a Solar Empire 1.93 with SoTP 0.88.5, the Flood AI crush my i7 down to 30 fps with severe stuttering, while the 5800X maintains over 90 with smooth frametimes. That performance boost makes upgrading an obvious choice.

I still want to see old-game benchmarks on modern hardware. I would love to see the 5800X vs the i9-10900K in these older games.
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
For whatever reason, a lot of tech reviewers these days make no effort to find performance differences between the CPUs they test when it comes to gaming.

There is a trend now to publish bar charts that are a solid block of near-equal-length bars across as many as 20 different CPUs. In the most extreme case I have seen, a 3300X delivers 95% of the performance of a 10900K or 5800X at 1080p. Really?

I'm not going to point fingers at individual publishers, but it seems pretty prevalent now: they have no interest in separating the CPUs that are good at gaming from those that are mediocre or bad. It's as if they all want to reach a predefined conclusion, the one they always make: "they are all the same."
No winners, no losers, everyone gets a prize. Well, that's nice, but for someone researching their next hardware it tells me absolutely nothing. Am I really supposed to believe that a 3300X is just as good as a 5800X?

It's obvious to me, but only because I understand a bit about it, that they are all publishing charts where the GPU, not the CPU, is the limitation. I'm looking at a performance test of the GPU they used, not of the CPUs.

It's only when you come to AnandTech that you see there is actually a tangible difference between these CPUs. For example: https://www.anandtech.com/show/16535/intel-core-i7-11700k-review-blasting-off-with-rocket-lake

With that, I would like to extend my thanks to AnandTech for still doing it properly. Thank you.
Yeah, those 320p tests AnandTech is doing are really relevant. AnandTech has a very... unique set of resolutions and quality settings for their gaming tests. Personally I am not a fan, but to each his own. They also tend to use less-than-top-of-the-line GPUs.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Yeah, those 320p tests AnandTech is doing are really relevant. AnandTech has a very... unique set of resolutions and quality settings for their gaming tests. Personally I am not a fan, but to each his own. They also tend to use less-than-top-of-the-line GPUs.

They include a lot of resolutions and settings, but the 320p (or similarly low-resolution) tests you're describing are there for testing CPUs, not the GPU. The purpose of dropping the resolution that low is to move the performance bottleneck off the GPU and onto the CPU, to see where it starts to limit performance. The GPU reviews test three main resolutions at max/high/ultra settings for a game and don't include low resolutions like the CPU reviews do.
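
To put rough numbers on that reasoning, here's a minimal toy model in Python. It just treats frame time as whichever of the CPU or GPU is slower per frame and scales the GPU cost with pixel count; every figure in it is invented for illustration, not taken from any review:

```python
# Toy bottleneck model (all numbers made up). GPU cost scales with pixel count;
# CPU cost per frame is roughly resolution-independent.
RESOLUTIONS = {"320p": 568 * 320, "720p": 1280 * 720, "1080p": 1920 * 1080, "4K": 3840 * 2160}

def fps(cpu_ms, gpu_ms_per_mpixel, pixels):
    gpu_ms = gpu_ms_per_mpixel * pixels / 1e6  # GPU time grows with resolution
    return 1000 / max(cpu_ms, gpu_ms)          # the slower component sets the pace

for name, px in RESOLUTIONS.items():
    slow = fps(cpu_ms=10.0, gpu_ms_per_mpixel=4.0, pixels=px)  # hypothetical "100 fps" CPU
    fast = fps(cpu_ms=6.0, gpu_ms_per_mpixel=4.0, pixels=px)   # hypothetical "167 fps" CPU
    print(f"{name:>5}: slow CPU {slow:5.0f} fps, fast CPU {fast:5.0f} fps")
# At 320p/720p the two CPUs separate cleanly; at 4K both collapse to the same
# GPU-limited number, which is exactly the wall of equal bars the OP describes.
```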

I believe they're still using a 2080 Ti, which until recently was a top-of-the-line GPU, and given the difficulty of obtaining a 3080 or 3090 right now, effectively still is for many people.
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
They include a lot of resolutions and settings, but the 320p (or similarly low-resolution) tests you're describing are there for testing CPUs, not the GPU. The purpose of dropping the resolution that low is to move the performance bottleneck off the GPU and onto the CPU, to see where it starts to limit performance. The GPU reviews test three main resolutions at max/high/ultra settings for a game and don't include low resolutions like the CPU reviews do.

I believe they're still using a 2080 Ti, which until recently was a top-of-the-line GPU, and given the difficulty of obtaining a 3080 or 3090 right now, effectively still is for many people.
Well, I am talking about the CPU reviews. I understand the rationale for the low-resolution tests, but 320p seems really extreme. I mean, come on, people used to criticize 720p as not being relevant, much less 320p. The 3080 and 3090 have been out for six months, BTW.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
Well, I am talking about the CPU reviews. I understand the rationale for the low-resolution tests, but 320p seems really extreme. I mean, come on, people used to criticize 720p as not being relevant, much less 320p. The 3080 and 3090 have been out for six months, BTW.

For a CPU test you'd want to use the lowest resolution possible. In a lot of games that's likely 720p, but a few let you go lower, and I wouldn't be surprised if they're chosen for that reason. The first page of the gaming tests mentions using Deus Ex because it's a more CPU-intensive title, even though the game is a little older.

Not using a 3090 may just be a case of wanting to make it easier to include historical results without having to do a lot of retesting. It may also be due to the fires out in Oregon last fall; Ryan, who does the GPU reviews, had to evacuate. If they don't have a 3090 anymore, it's probably quite hard to get one right now.
 

naukkis

Senior member
Jun 5, 2002
705
576
136
Well, I am talking about the CPU reviews. I understand the rationale for the low-resolution tests, but 320p seems really extreme. I mean, come on, people used to criticize 720p as not being relevant, much less 320p. The 3080 and 3090 have been out for six months, BTW.

If you're benchmarking the CPU, not the GPU, the GPU used doesn't matter at all as long as the resolution is low enough that the GPU won't be a bottleneck. Yes, 720p is not relevant for GPU testing, but for a CPU test it might still be too high a resolution if the GPU can't render pixels at the rate the CPU prepares frames.

And those low-resolution benchmarks are relevant for measuring the CPU: they show the max framerate the CPU is capable of when the GPU isn't the bottleneck. From AnandTech's review we see that Zen 3 isn't just a little faster in gaming than Intel's current CPUs; it has a massive advantage, 30-50% faster in many games. That's also relevant at higher resolutions once those CPUs are paired with faster GPUs than what is available today.
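
That last point can be made concrete with a small hypothetical (every number here is invented): two CPUs whose low-resolution ceilings differ look identical behind today's GPU, and the gap only shows up once a faster GPU arrives.

```python
# Hypothetical projection of low-res CPU ceilings onto a future, faster GPU.
def delivered_fps(cpu_ceiling_fps, gpu_fps_at_res):
    """You get whichever is lower: the CPU's frame-rate ceiling or the GPU's."""
    return min(cpu_ceiling_fps, gpu_fps_at_res)

cpu_a, cpu_b = 120, 170  # ceilings measured in a low-resolution, CPU-bound test
for gpu_label, gpu_fps in [("today's GPU @ 1440p", 110), ("next-gen GPU @ 1440p", 200)]:
    print(f"{gpu_label}: CPU A {delivered_fps(cpu_a, gpu_fps)} fps, "
          f"CPU B {delivered_fps(cpu_b, gpu_fps)} fps")
# today's GPU: both CPUs show 110 fps; next-gen GPU: 120 fps vs 170 fps.
```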
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
If you're benchmarking the CPU, not the GPU, the GPU used doesn't matter at all as long as the resolution is low enough that the GPU won't be a bottleneck. Yes, 720p is not relevant for GPU testing, but for a CPU test it might still be too high a resolution if the GPU can't render pixels at the rate the CPU prepares frames.

And those low-resolution benchmarks are relevant for measuring the CPU: they show the max framerate the CPU is capable of when the GPU isn't the bottleneck. From AnandTech's review we see that Zen 3 isn't just a little faster in gaming than Intel's current CPUs; it has a massive advantage, 30-50% faster in many games. That's also relevant at higher resolutions once those CPUs are paired with faster GPUs than what is available today.
Even if one accepts the assumption that benchmarks run at low resolution today (I mean *extremely* low, like 320p) are directly indicative of performance in future games on different engines with much more powerful GPUs (most likely a different architecture), it is likely that by the time those hugely more powerful GPUs are available, no one demanding cutting-edge performance would use a CPU from today, because much more powerful CPUs will be available as well.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
It's a pretty good look at how the CPU will do five or so years down the line. There was a recent HUB video about Nvidia having driver overhead issues in more recent DX12 titles that spurred a bit of investigation, and regardless of whether you use an AMD or Nvidia GPU, for some titles the CPU will hold you back at 1080p. An original Zen CPU can bottleneck enough that a 3080 is no better than a 1080 in terms of FPS.

Not everyone upgrades every time a new generation comes out. Trying to future-proof when you know you'll be holding on to a system for 5+ years is certainly worthwhile. However, if you're just going to game at 4K ultra, the CPU mostly doesn't matter at all. You could buy a Celeron and it would do almost as well as a top-of-the-line i9.