You are saying both of his videos are rigged? He redid the tests with a GTX 670, and in some games the FX8350 is faster. Are you saying it's not possible for an FX8350 to win in some games? The original reviews done on the Bulldozer platform also didn't have all the Windows patches, etc. And if you notice, it's not as if the FX8350 is winning everything; it loses in many titles as well. You seem to dismiss any results where AMD CPUs win as "inaccurate" or "biased benchmarking" because they're not what you are used to seeing.
What about this testing at 1080p with AA and a GTX 670 in 12 games? It's pretty clear that with a single $400 GPU, most games played at 1080p with AA are going to be GPU-limited.
I find it interesting that whenever an AMD CPU wins benchmarks, those results are invariably scrutinized and dismissed, or the benchmarks are written off as "irrelevant".
I've not quoted your whole post, since much of what you said concerns things other than gaming and is thus not immediately relevant to this thread or my answer.
Have you taken the time to actually look at those results? I've said it already and I'll say it again: they are full of highly unlikely numbers that make no sense.
Let me analyze them for you:
Crysis Warhead (7:03)
- The difference is larger at 1440p than at 1080p. The higher the resolution, the more GPU-bound the test becomes, so CPU differences should shrink, not grow. Makes no sense.
- Massive GPU bottleneck with Intel (fps drop), yet OC still yields an fps boost. If the GPU is the limit, a CPU overclock shouldn't raise fps.
- OC at 1080p yields a higher boost with Intel than with AMD, even though both CPUs are overclocked by the same 25% over their guaranteed base clock. Makes no sense.
Arma 2 (7:13)
- OC yields a higher boost at 1440p than at 1080p (AMD). Makes no sense.
- Massive GPU bottleneck with Intel (fps drop), yet massive differences between Intel and AMD (indicating a CPU bottleneck). You can't have both bottlenecks this pronounced at once; makes no sense (see the sketch after this game-by-game list).
Far Cry 3 (7:23)
- Again a quite strong GPU bottleneck according to AMD vs AMD@OC, and yet vastly different CPU results between AMD and Intel. Makes no sense.
Metro 2033 (7:42)
- GPU bottleneck, yet OC yields an fps boost with Intel at 1080p. Questionable.
- Differences between AMD and Intel increase with resolution. Makes no sense.
Natural Selection 2 (7:57)
- 43% more fps from overclocking the 3570K by 25% (3.6 GHz is guaranteed even for all threads; 4.5/3.6 = 1.25). Even in a fully CPU-bound scenario, fps scales at best linearly with clock, so +25% is the ceiling. Wow! Yeah, that makes absolutely no sense.
Skyrim (8:13)
- 32% more fps from a 25% OC on the Intel at 1080p. Same problem: the fps gain exceeds the clock gain. Yeah... don't think so.
Trine 2 (8:55)
- GPU bottleneck, looks ok.
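To make the pattern explicit, here is the back-of-the-envelope model I'm applying. The fps numbers are made up for illustration (not taken from the video): in any given scene, fps is roughly min(CPU-limited fps, GPU-limited fps), and a CPU overclock scales only the CPU side, at best linearly with clock.

```python
# Illustrative bottleneck model; the fps numbers are made up, not from the video.
# In a given scene: fps ~= min(cpu_fps, gpu_fps).
# A CPU overclock scales only the CPU side, at best linearly with clock.

def fps(cpu_fps, gpu_fps, cpu_clock_factor=1.0):
    return min(cpu_fps * cpu_clock_factor, gpu_fps)

oc = 4.5 / 3.6  # 3570K: 3.6 GHz guaranteed all-core -> 4.5 GHz, i.e. at most +25%

# GPU-bound scene (GPU caps at 60 fps, CPU could deliver 90):
print(fps(90, 60), "->", fps(90, 60, oc))    # stays at 60: OC gains nothing

# Fully CPU-bound scene (GPU could deliver 200 fps):
print(fps(90, 200), "->", fps(90, 200, oc))  # 90 -> 112.5: +25% is the ceiling
```

So a single result can't show a hard GPU cap and a large CPU-driven gap at the same time, and an fps gain above the clock gain (43% from a 25% OC) is a red flag either way.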
And finally:
In absolutely no review that I have seen (and I've seen a lot!) does the 8350 win a single game against the 3570K (aside from maybe BF3, which was not tested here). The Arma 2 and Far Cry 3 results especially are astronomically off compared to anything I've ever seen.
So you see, this review is full of errors, which calls into question either the ability or the integrity of TEK Syndicate. But good that you criticized me first.
Edit:
About the pctuning link:
First, it depends on the games that you play. Most reviews test only mainstream games but leave out RTS and simulations.
Second, most reviews have a flawed testing methodology, using integrated benchmarks that often produce a lower CPU load than real gameplay. Very few test with the savegame method, i.e. actually playing carefully selected parts of the game that are representative of CPU-demanding scenes. Among those few are PCGH.de, computerbase.de, ht4u.net and hardware.fr.
Third, GPU bottlenecks aren't necessarily relevant. For example, the GPU in the review is bottlenecking at 45 fps, but you need 60 fps for enjoyable gameplay. Wouldn't you like to know whether CPU A could achieve that but CPU B couldn't? No one is forced to play at the exact same settings the reviewer conducted his/her tests at. It is a bad idea to assume what fps is sufficient for someone by treating those settings and numbers as set in stone.
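A hypothetical illustration with made-up numbers (same min() idea as the sketch above): say CPU A is good for 70 fps and CPU B for 50 fps in the same scene.

```python
# Hypothetical numbers: how a GPU-bound review hides a CPU gap that matters.
cpu_a, cpu_b = 70, 50   # CPU-limited fps of two CPUs in the same scene

gpu_maxed = 45          # GPU-limited fps at the reviewer's maxed-out settings
print(min(cpu_a, gpu_maxed), min(cpu_b, gpu_maxed))      # 45 45: CPUs look equal

gpu_reduced = 80        # the reader turns down AA/settings at home
print(min(cpu_a, gpu_reduced), min(cpu_b, gpu_reduced))  # 70 50: only A reaches 60
```

At the reviewer's settings both CPUs show the same 45 fps, but the moment you relax the GPU load, only CPU A gets you to 60 fps.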