New game engines are increasingly better threaded. This is not a novel idea. That does not invalidate low res CPU benchmarks. Regardless of whether you think they're important or not, there are still a lot of people to whom those benchmarks are relevant. The best compromise would be if the reviewers just included both.
Yes it does invalidate low res benchmarks when comparing an 8/16 processor vs a 4/8 processor and erroneously suggesting to consumers that such benchmarks predict future performance, THEY DON'T. You have been shown why this is the case; if you feel strongly otherwise, feel free to offer a good explanation, preferably with a valid example.

Re-read my last two posts. You're asking me to recreate an Intel version of his flawed analysis and commit exactly the same fallacy that I was criticizing the video for.
New game engines are increasingly better threaded. This is not a novel idea. That does not invalidate low res CPU benchmarks. Regardless of whether you think they're important or not, there are still a lot of people to whom those benchmarks are relevant. The best compromise would be if the reviewers just included both.
I'm making an argument against GPU-limited CPU benchmarks in CPU reviews, and against the idea that CPU-limited CPU benchmarks have no value.
Technically there would be more future headroom for a more powerful GPU, or it could demonstrate 240fps+ competitive gaming, provided you're comparing two CPUs with the same thread count.

Well, why do they have value? What do they show?
It would seem all they show is current performance of the CPU on a current GPU on the current version of a game.
They don't show future performance of the same CPU on the same GPU on a future version (patch) of the game. Or a future GPU.
You can't base a full CPU review on the basis that consumers are going to be playing the very same game 4 years later! :/

No it doesn't. If X game doesn't utilize more than 4/8 threads, X game is currently GPU bottlenecked, and you still intend to play X game in 2-4 years' time when GPU horsepower has doubled, then they absolutely do help predict the CPU power reserves you might have in that event. Again I'm not using this as an argument against Ryzen specifically -- right at this point its ST performance is pretty competitive -- but this has played out countless times in the past (games like Skyrim, Crysis).
You're so fixated on the core count of the CPU, and so confident in your assumption that everyone is going to be playing exclusively highly threaded titles from the future, in the future, that you've repeatedly missed the point I'm making, despite my having spelled it out in the simplest possible terms.
As I've said from the start, it all comes down to the individual consumer's priorities. You don't help consumers by hiding the potential differences between CPUs by benchmarking them in a quasi-real-world GPU bound scenario.
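To put that bottleneck argument in concrete terms, here is a minimal back-of-the-envelope sketch. All the numbers are hypothetical, and the simple min() model obviously glosses over how real engines behave; it's only meant to show why a GPU-bound test hides a CPU gap that a faster future GPU would expose.

# Rough sketch of the bottleneck argument above (all numbers are made up
# for illustration; this is not benchmark data).
#
# Observed frame rate is roughly capped by whichever side is slower:
#   fps ~= min(cpu_fps_ceiling, gpu_fps_ceiling)

def observed_fps(cpu_ceiling, gpu_ceiling):
    """Frame rate is limited by the slower of the CPU and the GPU."""
    return min(cpu_ceiling, gpu_ceiling)

# Hypothetical CPU ceilings, i.e. roughly what a low-res (CPU-bound) test estimates.
cpu_a = 160  # faster per-thread CPU
cpu_b = 110  # slower per-thread CPU

gpu_today = 90    # GPU-bound ceiling at high settings on today's card
gpu_future = 180  # same settings after GPU horsepower roughly doubles

for name, cpu in (("CPU A", cpu_a), ("CPU B", cpu_b)):
    print(name,
          "| today:", observed_fps(cpu, gpu_today),       # both read ~90, look identical
          "| future GPU:", observed_fps(cpu, gpu_future))  # 160 vs 110, the gap appears

With these made-up figures, both CPUs show an identical 90fps in the GPU-bound test today, while the low-res numbers (160 vs 110) are the ones that tell you what happens once the GPU ceiling moves out of the way.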
You are mixing things up here so this will be my last comment on the subject.

Have you noticed how popular Skyrim and GTA 5 still are? And you're not "basing the full CPU review on that". No one will ever answer me this, and I think I've already asked it twice in this thread: what is more useless -- contrived low res CPU bound CPU benchmarks that give you some idea of the relationship between two parts, or flatline GPU bound CPU benchmarks that tell you absolutely nothing?
Yep. But this argument is only valid as long as you run an 8350 that couldn't get the most out of the games when it launched. It was too slow out of the gate for even 60fps gaming.

No it doesn't. If X game doesn't utilize more than 4/8 threads, X game is currently GPU bottlenecked, and you still intend to play X game in 2-4 years' time when GPU horsepower has doubled, then they absolutely do help predict the CPU power reserves you might have in that event. Again I'm not using this as an argument against Ryzen specifically -- right at this point its ST performance is pretty competitive -- but this has played out countless times in the past (games like Skyrim, Crysis).
You're so fixated on the core count of the CPU, and so confident in your assumption that everyone is going to be playing exclusively highly threaded titles from the future, in the future, that you've repeatedly missed the point I'm making, despite my having spelled it out in the simplest possible terms.
As I've said from the start, it all comes down to the individual consumer's priorities. You don't help consumers by hiding the potential differences between CPUs by benchmarking them in a quasi-real-world GPU bound scenario.
No, that's what the so-called "real world" GPU bound benchmarks show, and that's precisely why they're useless.
They can show what performance you might be able to attain with a future GPU.
Jesus wept, is it possible that you guys could do a more spectacular job of missing the point? You won't be getting those "crazy frame rates" if you're CPU bound.
Incidentally, a GTX 780 couldn't sustain 60fps in Crysis at 1920x1080 with 4xAA.
I think Adored showed that thread count is the one metric that does indicate longevity.

Predicting future CPU gaming performance longevity using past performance trends seems silly at best. It's probably best to use some imagination and try to predict (based on current trends) where it's headed in the end.
Sadly some bigger players in the market tend to sandbag the evolution process.
Based on the past, Ryzen should have been hot, power hungry, undesirable, and lacking in performance! This philosophy could explain why it kind of looks like the MB makers got caught with their pants down. AMD delivered a competitive product, which seemed impossible... if that makes sense.
The past doesn't dictate the future. In the end it only shows what happened then and doesn't really reflect on what will happen tomorrow, next month, year, etc.
They can show what performance you might be able to attain with a future GPU.
Yes you did (I missed that), but it doesn't change my responses.

And I said they predict it in more ways than one. He said low res benches are useless; I raised an objection to that. Don't try to tell me he was referring exclusively to Ryzen vs current Intel, because the majority of his video dealt with legacy hardware.
I introduced that "corner case" clearly in my first post (future gains for current games), so if you want to dismiss my argument based on that, why didn't you do it then? Oh, you didn't expect 2-4 years? Give me a break, that's hardly an outlandish upgrade cycle for a GPU.
Yes, and that's why the CPU is even in question at all. If the game wasn't GPU bound at the time, there would be no dilemma in choosing the CPU because you'd just choose the fastest one.
Who said anything about Ryzen? Again, this is not an anti-Ryzen argument.
Gears of War on PC is purely DirectX 12.
Sorta. Kinda.
Leaving aside game patches for a minute, it shows very little of relevance.
Unless moving from 90 FPS to 150 FPS really floats your boat?