That was one of the reasons I liked his review. Everyone tests for science. But few own a 4090, and when even a $1,000+ GPU is the bottleneck most of the time at 1080p and 1440p max settings, it provides some perspective.
That said, Leo is testing the games, not playing them. I know for a fact you can better expose some of the CPUs on that list in games like Cyberpunk 2077, the Spider-Man games, MS Flight Sim, Hogwarts Legacy, etc., particularly with RT on.
Exactly. 0.1% lows get nerfed for all kinds of dumb reasons, like when you die in CS2. They rarely say anything useful on their own, because the longer you play, the higher they tend to be. Some games start out rough and get smoother over time. Takeaway: games are a mess, and all of the benchmarketers' info is of limited utility. For those of us who play the games for many hours, the experience is often different from what their results would lead you to believe.
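To make that concrete, here's a minimal sketch of why session length inflates the 0.1% low. The frametime numbers are hypothetical (not from any real capture): a fixed batch of rough frames early on, like first-run shader compilation, dominates a short benchmark pass but gets diluted out of the slowest-0.1% bucket over a long play session.

```python
def low_0p1_fps(frametimes_ms):
    """0.1% low = FPS of the average of the slowest 0.1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 1000)               # slowest 0.1% of frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

stutters = [40.0] * 20                  # fixed number of rough frames up front
short_run = stutters + [7.0] * 2_000    # ~15 s benchmark pass
long_run  = stutters + [7.0] * 200_000  # ~25 min of actual play

# Same 20 stutters in both runs, but the long run's 0.1% bucket is
# mostly smooth 7 ms frames, so the reported "low" looks much better:
print(low_0p1_fps(short_run))  # ~25 fps
print(low_0p1_fps(long_run))   # ~97 fps
```

Same hardware, same hitches, wildly different 0.1% number depending on how long the capture ran, which is why a 30-second canned run and a two-hour session disagree.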
My overclocked 4770K looked good in the Assassin's Creed Odyssey benchmarks. But I could make it crap the bed in big fights with mercs, soldiers, and citizens all going ham. Even a PBO-boosted Ryzen 2600X did not manage to be buttery smooth all of the time. It took a Ryzen 3600 to completely eliminate the frame-pacing issues; thanks, Denuvo!
Bryan is a good guy, but his degree is in biz/finance, iirc. Props for being one of the first to call out the Raptor Lake ring-bus issues and swap back to a 10th-gen chip. But like the rest, he usually benchmarks games instead of playing them.
I liked the comment in the Zen 5 thread that posited AMD is separating work and play more: vanilla parts for budget builds and work, 3D V-Cache for serious gamers.