8700K vs 2700X with 2080 Ti [ComputerBase]

Abwx

Lifer
Apr 2, 2011
Typical viral marketing from Volker Riska. He published the results even though it's obvious that they don't correlate with their own previous reviews using the same games; word is that it's the Nvidia drivers which are the culprit...

Anyway, better to check deeper before talking about those 30% that were 11% not so long ago...

https://www.computerbase.de/2018-04/amd-ryzen-2000-test/4/

So what happened?...
 

Muhammed

Senior member
Jul 8, 2009
No information on actual FPS, if anything was overclocked, and on what game? There is almost no information to back this up.
The games are literally posted in the first paragraph of the article, with the settings. The FPS figures are posted at the end of the article. All CPUs are at stock clocks.

All the info you need is in the article, if only you could click the link and read!
 

Muhammed

Senior member
Jul 8, 2009
Anyway, better to check deeper before talking about those 30% that were 11% not so long ago...

https://www.computerbase.de/2018-04/amd-ryzen-2000-test/4/

So what happened?...
It's because:
He used a more powerful GPU
He used 6 new, different games
He also used savegames instead of on-rails benchmarks

The CPU test system relies on the first driver for the RTX 2080 (Ti), the GeForce 411.51, and the eleven games tested have been updated to include new savegames and other test scenarios, which makes them incompatible with previous tests.
https://www.computerbase.de/2018-09...e-rtx-2080-ti/#diagramm-performancerating-fps
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
The games are literally posted in the first paragraph of the article, with the settings. The FPS figures are posted at the end of the article. All CPUs are at stock clocks.

All the info you need is in the article, if only you could click the link and read!
I don't read German, or whatever language that is.
 

Abwx

Lifer
Apr 2, 2011
It's because:
He used a more powerful GPU
He used 6 new, different games
He also used savegames instead of on-rails benchmarks


https://www.computerbase.de/2018-09...e-rtx-2080-ti/#diagramm-performancerating-fps

That's not the point, and you missed the fact that he was questioned in their forum and has so far given no explanation as to why the difference increased compared with previous GPUs; it was an 11% difference with a 1080 Ti here:

https://www.computerbase.de/2018-04/amd-ryzen-2000-test/4/

And in this new, and dubious, review it's 26% with the same 1080 Ti and the same games...

https://www.computerbase.de/forum/a...7-ryzen-5_7-mit-geforce-rtx-im-ve-png.711938/

Btw, the games have nothing to do with it, since he used them previously. Besides, the results are not even consistent for Intel: the 8700K only adds HT compared to the i5s, yet the difference is big. It should be big as well between the 2600X and the 2700X, since the latter benefits from two extra cores with far higher throughput for additional threads than the 2600X's SMT, but that's not the case at all...

Edit: The main difference between the two reviews is the Nvidia drivers:

GeForce 391.35

https://www.computerbase.de/2018-04/amd-ryzen-2000-test/3/


GeForce 411.51

https://www.computerbase.de/2018-09/nvidia-geforce-rtx-2080-ti-test/2/
 

Muhammed

Senior member
Jul 8, 2009
Edit: The main difference between the two reviews is the Nvidia drivers:
Not only the driver: the new tests used savegames instead of built-in benchmarks, and they also used several new CPU-heavy games.
Besides, the results are not even consistent for Intel: the 8700K only adds HT compared to the i5s, yet the difference is big.
The 8400 is clocked way lower than the 8700K.
It should be big as well between the 2600X and the 2700X, since the latter benefits from two extra cores with far higher throughput for additional threads than the 2600X's SMT,
No, it shouldn't; most games can't even max out 4 cores, let alone 6 or 8. The 2600X results are completely normal.
 

beginner99

Diamond Member
Jun 2, 2009
Typical viral marketing from Volker Riska. He published the results even though it's obvious that they don't correlate with their own previous reviews using the same games; word is that it's the Nvidia drivers which are the culprit...

Anyway, better to check deeper before talking about those 30% that were 11% not so long ago...

https://www.computerbase.de/2018-04/amd-ryzen-2000-test/4/

So what happened?...

Different games and a faster GPU. It's possible that at 1080p, in some games, the benchmark was still GPU-limited with the 8700K rather than CPU-limited.

But in the end it doesn't matter, as the number of people playing at 1080p with a 2080 Ti will be close to zero worldwide. If we look at the actual test, we see that at 2K and 4K it doesn't really matter which of these two CPUs you have. So nothing really changes. If you want high-FPS 1080p gaming, get the Intel CPU and OC it to 5 GHz. If you game at 2K or higher, invest your money in a FreeSync/G-Sync display and a faster GPU.
 

SPBHM

Diamond Member
Sep 12, 2012
It seems rather logical that the 8700K would be that much faster in CPU-bound gaming...

Right now, gaming almost never benefits from more than 6 cores; the 8700K has cores that are more capable and run at a higher clock, and it doesn't suffer from the latency issues that Ryzen might...

It's fair to say that being this CPU-limited is unrealistic for most players, because most people will be pushing more pixels or using slower cards, but it's a relevant test,

especially because testing in games is always flawed: they can never test the entire game, and most reviewers don't even play the game enough to find the ideal, most representative spots.

And when the "3080 Ti" is released, it might perform at 4K or 1440p like the 2080 Ti performs at 1080p, so there is something there...

Still, it's good to have context and let people know that 4K or even 1440p will be far less CPU-limited, but... the difference is potentially there...

I don't think anyone would disagree that purely for gaming the 8700K is clearly the superior solution when gaming is CPU-limited, and since they cost comparable amounts...
 

Brunnis

Senior member
Nov 15, 2004
Yeah, I don't get this. Buying the most expensive everything, then running it on a potato monitor.... Having had a 4k monitor for a while I'd never go back to 1080p.
If you were in the market for a CPU and wanted the fastest, would you prefer a benchmark where the reviewer had gone out of their way to create a GPU bottleneck or would you prefer one where they tried to lower the GPU's effect to instead isolate CPU performance? Which test would be more useful to you? As an embedded systems engineer, it wouldn't cross my mind to try to gain useful info about a SoC's CPU performance by giving it an excessive GPU load.

It all depends on what you're trying to show. If you intend to show complete system performance in modern games, you'd benchmark at the most popular settings the target demographic use (which may be 4K and "Ultra" settings, in this case). However, if you want to benchmark an individual component, creating a bottleneck that hides the performance of the component in question is simply not a viable test methodology. It's the definition of a botched test.

The fact that the performance difference is smaller or non-existent at 4K does not mean that those two CPUs suddenly got equally fast; it just means that one of them is hamstrung. In the future, if you were to buy a significantly faster GPU, that 8700K system would realize its potential even at 4K, providing faster performance than the 2700X. It's also possible that the difference would show up at 4K and "Ultra" settings already today, given a game with sufficiently high CPU load and low GPU load.

Note: I'm not saying I agree with the test in the OP (I haven't analyzed it). Just providing some thoughts regarding benchmarking in general, something which I've actually worked with professionally.
 

Gideon

Golden Member
Nov 27, 2007
Not only the driver: the new tests used savegames instead of built-in benchmarks, and they also used several new CPU-heavy games.

Yes, that's true. What @Abwx was getting at was that even in the games included in both reviews, the AMD results were significantly worse than before. Even if they used savegames instead of time-demos, it's still a bit odd, considering how beefy the 2700X is on resources. I've yet to see it tank so much, relative to the competition, in actual gameplay versus time-demos. One would expect the difference to be about the same, give or take a little.


Case in point: Performance diff 8700K vs 2700X

New Article (1080 Ti results only @ 1080p):
Assassins Creed Origins - Intel: 92.6 FPS, AMD: 77.7 FPS, Diff: 19%
Project Cars 2 - Intel: 116.6 FPS, AMD: 81.5 FPS, Diff: 43%
Kingdom Come Deliverance - Intel: 79.4 FPS, AMD: 64.7 FPS, Diff: 23%
Total War Warhammer - Intel: 62.8 FPS, AMD: 49.1 FPS, Diff: 28%

Old Review (Asus GeForce GTX 1080 Ti Strix @ 1080p):
Assassins Creed Origins - Intel: 98.5 FPS, AMD: 89.6 FPS, Diff: 10%
Project Cars 2 - Intel: 107.2 FPS, AMD: 93.6 FPS, Diff: 15%
Kingdom Come Deliverance - Intel: 78.9 FPS, AMD: 71.0 FPS, Diff: 11%
Total War Warhammer - Intel: 54.7 FPS, AMD: 49.2 FPS, Diff: 11%

AMD has lost ~10 FPS in 3 of 4 games, while Intel has gained 10 FPS in 2 out of 4 games, stayed even in one, and lost 6 FPS in only one title.
I don't doubt that these are the results they got, but IMO such a profound performance loss on one side, against a similar net gain for the other vendor, at least merits a deeper look as to why. Bear in mind the GPU, resolution, and games are the same; the only differences are the savegames and the drivers.
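For anyone checking the arithmetic, the Diff column above is just the 8700K result divided by the 2700X result, minus one. A minimal Python sketch (FPS values copied from the two quoted ComputerBase charts) reproduces those percentages:

```python
# Reproduce the "Diff" column as diff = intel_fps / amd_fps - 1.
# FPS values are copied from the two ComputerBase charts quoted above.
fps = {
    "Assassins Creed Origins":  {"new": (92.6, 77.7),  "old": (98.5, 89.6)},
    "Project Cars 2":           {"new": (116.6, 81.5), "old": (107.2, 93.6)},
    "Kingdom Come Deliverance": {"new": (79.4, 64.7),  "old": (78.9, 71.0)},
    "Total War Warhammer":      {"new": (62.8, 49.1),  "old": (54.7, 49.2)},
}

for game, runs in fps.items():
    for article, (intel, amd) in runs.items():
        diff = intel / amd - 1  # 8700K advantage over the 2700X
        print(f"{game:26} {article}: Intel {intel:6.1f}  AMD {amd:5.1f}  diff {diff:+.0%}")
```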

The most obvious things to test would be:
1) run the old time-demos on the new rigs (and drivers), to see if there is a difference
2) run the same savegames with the old drivers, to see if there is a difference
 

DrMrLordX

Lifer
Apr 27, 2000
If you were in the market for a CPU and wanted the fastest, would you prefer a benchmark where the reviewer had gone out of their way to create a GPU bottleneck or would you prefer one where they tried to lower the GPU's effect to instead isolate CPU performance?

Neither, I want a review where someone tests the settings that I would use, or something close to it.

Obvious GPU bottleneck is obvious, such as when every CPU tested produces the same framerate @ 4k or whatever. So clearly such reviews provide little in the way of useful data, other than perhaps that every "good" CPU from Intel and AMD can get you 60 fps @ 4k with a strong enough GPU in most games.

He should test 1080p 4 teh lulz, then do 2K and 4K. The 1080p and 2K tests should be done on 144 Hz gaming monitors to show those who care about such things how close the rigs can get to the screen's refresh rate limit.
 

Brunnis

Senior member
Nov 15, 2004
Neither, I want a review where someone tests the settings that I would use, or something close to it.
Like I said: Then you're not interested in a CPU test, you're interested in a system test. That's fine, but pretty meaningless if you want to know the performance difference between the CPUs themselves. Just looking at such data could end up with you buying a CPU that's a good deal slower than it appears, which may become apparent if you try to run games with higher CPU requirements (or future games).

The whole point with my post is that there are very obvious reasons for testing at lower resolutions. Honestly, I can't understand how this still seems hard to grasp or "controversial" among some people.
 

scannall

Golden Member
Jan 1, 2012
Like I said: Then you're not interested in a CPU test, you're interested in a system test. That's fine, but pretty meaningless if you want to know the performance difference between the CPUs themselves. Just looking at such data could end up with you buying a CPU that's a good deal slower than it appears, which may become apparent if you try to run games with higher CPU requirements (or future games).

The whole point with my post is that there are very obvious reasons for testing at lower resolutions. Honestly, I can't understand how this still seems hard to grasp or "controversial" among some people.
It isn't hard to grasp or understand. It just isn't relevant to my use case, is all. Gaming is secondary to me; my main use requires as many cores/threads as I can get. A Threadripper 2950X would be ideal for me, even if it doesn't game as well as an 8700K, for example.

I guess my point is simply that 1080p gaming really isn't relevant to quite a few people. If gaming is all you do, then great, these kinds of benchmarks are relevant to you. They don't mean much, though, to folks who do a lot of heavy lifting on their systems that isn't gaming-related.
 

Abwx

Lifer
Apr 2, 2011
Like I said: Then you're not interested in a CPU test, you're interested in a system test. That's fine, but pretty meaningless if you want to know the performance difference between the CPUs themselves. Just looking at such data could end up with you buying a CPU that's a good deal slower than it appears, which may become apparent if you try to run games with higher CPU requirements (or future games).

The whole point with my post is that there are very obvious reasons for testing at lower resolutions. Honestly, I can't understand how this still seems hard to grasp or "controversial" among some people.

At a lower resolution, that is 720p, and in the most demanding scenes of the tested games, there's not as much difference as what ComputerBase got at 1080p...

http://www.pcgameshardware.de/Ryzen...2600X-Review-Benchmark-Release-Preis-1254720/

PCGH, 720p, 1080 Ti:
ACO: 2700X to 8700K = +4%
Kingdom Come Deliverance: 2700X to 8700K = +13%
Wolfenstein 2: 2700X to 8700K = +11%

Computerbase, 1080p, 1080 Ti:
ACO: 2700X to 8700K = +19%
Kingdom Come Deliverance: 2700X to 8700K = +23%
Wolfenstein 2: 2700X to 8700K = +27%

(Courtesy of DonL_ at the Computerbase forum.)

CPU limit, and hard to grasp, you said..?
 

JoeRambo

Golden Member
Jun 13, 2013
Typical AMD-versus-Intel comments aside, one has to admire how powerful the 8700K is. Being 25-35% faster in frametimes than the same-architecture, but lower-clocked, 8400 is amazing. It seems that 12 MB of L3 and the extra clock help a lot.

Overall this does put the future 9900K in a great position: even more L3 and even more clock. It will take Ryzen 2 to properly challenge CFL and CFL-R.
 

VirtualLarry

No Lifer
Aug 25, 2001
IMO such a profound performance loss on one side, against a similar net gain for the other vendor, at least merits a deeper look as to why. Bear in mind the GPU, resolution, and games are the same; the only differences are the savegames and the drivers.
Did they transplant the same HDD/SSD between platforms to do their testing? What about the Windows HPET settings? I've read some coverage from AT about the fact that Intel and AMD platforms respond differently to HPET, and that this can rear its ugly head in terms of affecting the outcome of benchmarks between the two.

Edit: Also, the power plan settings: whether they used the Ryzen power plan, versus High Performance, versus Balanced.
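If anyone wants to rule those factors out on their own rig, a quick pre-benchmark check is to dump the boot-timer setting and the active power plan on both platforms before testing. A minimal sketch, assuming a Windows test rig and an elevated prompt (this is not something ComputerBase documented, just an illustration):

```python
# Hypothetical pre-benchmark check: dump the Windows boot-timer and power-plan
# state so runs on the Intel and AMD platforms share the same baseline.
import subprocess

# "useplatformclock    Yes" in this output means HPET is being forced as the timer.
subprocess.run(["bcdedit", "/enum", "{current}"], check=True)

# Shows whether Balanced, High performance, or the AMD Ryzen Balanced plan is active.
subprocess.run(["powercfg", "/getactivescheme"], check=True)
```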
 

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
If low-resolution testing were the determinant of CPU capabilities, why not go the whole hog and go for the lowest possible resolution?
Test a really old game that only came with a terrible resolution setting.
It would tell me nothing about the price of fish in Soweto, but at least I'd know that an 8700K could get 554785211588444 fps, whilst a 2600X could get 488524862288553 fps. Meanwhile, my 60/144 Hz monitor, displaying at 1080p/4K resolution, still wouldn't care what CPU was being used, only what GPU it was being fed from.
Low-res competitive eSports gamers are the only folks who notice any difference in top-end CPU performance.