8700K vs 2700X with 2080Ti [computerbase]


happy medium

Lifer
Jun 8, 2003
14,387
480
126
and, when the "3080 Ti" is released it might perform at 4K or 1440P like the 2080 Ti performs at 1080P, so there is something there...

Imagine what a Ryzen 1600 looks like with a 2080ti! It will be CPU limited at 1440p.

People complained about the 8700K price, but you only pay once. With AMD CPUs you pay $250 for the 1600, then $250 more for the 2600, then $350 more for Zen 2, just to get the performance of an 8700K at 5.0GHz when you upgrade your GPU at the high end.

Do you think a 2600 will push a 3080 Ti that's 35% faster next year at 1440p like an 8700K will? I don't think there's any way, and how about the 1600X? Not a chance.

1080p scores DO MATTER! They tell us how your CPU will perform when you upgrade your GPU.
Even 1440p is starting to matter more now that we have the 2080 Ti.

I hope we finally get some CPU reviews that take a good look at how the 1600 and 2600 compare to the 8700K with a 2080 Ti, because next year's 7nm RTX 3070 is this year's 2080 Ti.
 
Last edited:

rainy

Senior member
Jul 17, 2013
505
424
136
with AMD cpu's you pay $300 for the 1600, then $300 more for the 2600, then $350 more for Zen 2

Hard level of trolling detected - since when was the price of the Ryzen 1600 300 dollars?
Initially it was 219 dollars, and it dropped below 200 after the Coffee Lake release last year.
https://en.wikipedia.org/wiki/Ryzen#CPUs:_Summit_Ridge_/_Whitehaven

Btw, you can always sell your older CPU to minimize the cost of an upgrade.

Personal attacks are not allowed
nathanddrews
AnandTech Moderator
 
Last edited by a moderator:

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
People complained about the 8700K price, but you only pay once. With AMD CPUs you pay $300 for the 1600, then $300 more for the 2600, then $350 more for Zen 2, just to get the performance of an 8700K at 5.0GHz when you upgrade your GPU at the high end.
Yeah, every Intel customer who bought the i5 7500 just before AMD introduced this Ryzen 1600 milking scheme must be swimming in joy right now. /s

BTW, it's $200 for the 1600/2600.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Imagine what a Ryzen 1600 looks like with a 2080ti! It will be CPU limited at 1440p.

It probably would be. That's one of the reasons why I think 1080p testing is, shall we say, obsolete? It's time to toss it in the dustbin except maybe for iGPU gaming, or in the case of cheaper video cards.

If you really want to test a high-end gaming CPU (and let's face it, the 8700K is the fastest gaming CPU on the market right now, even if the linked benchmarks may be skewed), throw the highest-end graphics system you can on that computer and then test multiple resolutions to try to show actual gamers what is going to happen when they buy and use the CPU. Simply lowering resolution to remove the GPU bottleneck is going to skew results away from driver overhead/draw calls and towards other parts of game logic. It will produce results that few readers can actually use.

There are always going to be outlier games where other factors will come into play, like time-between-turns in a Total War game or what have you. Such titles may merit their own benchmark methodologies.
 

Brunnis

Senior member
Nov 15, 2004
506
71
91
Simply lowering resolution to remove the GPU bottleneck is going to skew results away from driver overhead/draw calls and towards other parts of game logic.
Granted, I'm no expert on GPU drivers or graphics pipelines, but I didn't think draw calls or driver overhead had anything to do with resolution. Are you sure about that?
 
  • Like
Reactions: ryan20fun

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Granted, I'm no expert on GPU drivers or graphics pipelines, but I didn't think draw calls or driver overhead had anything to do with resolution. Are you sure about that?

Higher rez should (overall) increase driver overhead. Actually I would like to see something like this:

https://www.tomshardware.com/reviews/gfxbench-3-graphics-performance,3743-6.html

run on PC hardware to see exactly what the relationship is between driver overhead/draw calls and resolution. Sadly, that one seems more aimed at mobile products, though there was a draw call bench thread or two around here somewhere (albeit one aimed at Gamebryo, ugh). I'm sure @MajinCry could add some salient commentary here.
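
Something along those lines can be thrown together on the desktop; here is a rough sketch assuming the third-party moderngl package and a driver that can provide a standalone OpenGL 3.3 context. It only times raw draw-call submission against offscreen framebuffers of different sizes, so it is a driver-overhead probe rather than anything resembling a real game workload.

```python
import struct
import time

import moderngl  # third-party: pip install moderngl

# Fixed scene and fixed draw-call count; only the framebuffer resolution changes.
ctx = moderngl.create_standalone_context()

prog = ctx.program(
    vertex_shader="""
        #version 330
        in vec2 in_vert;
        void main() { gl_Position = vec4(in_vert, 0.0, 1.0); }
    """,
    fragment_shader="""
        #version 330
        out vec4 color;
        void main() { color = vec4(1.0); }
    """,
)

# One tiny triangle drawn many times, so per-draw-call cost dominates fill cost.
vbo = ctx.buffer(struct.pack("6f", -0.01, -0.01, 0.01, -0.01, 0.0, 0.01))
vao = ctx.vertex_array(prog, [(vbo, "2f", "in_vert")])

DRAW_CALLS = 20_000
for w, h in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    fbo = ctx.simple_framebuffer((w, h))
    fbo.use()
    fbo.clear(0.0, 0.0, 0.0)
    start = time.perf_counter()
    for _ in range(DRAW_CALLS):
        vao.render(moderngl.TRIANGLES)
    ctx.finish()  # wait for the GPU so queued work isn't mistaken for free work
    elapsed = time.perf_counter() - start
    print(f"{w}x{h}: {DRAW_CALLS / elapsed:,.0f} draw calls per second")
```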
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Driver overhead almost always refers to draw calls, and they're independent of resolution. What affects draw calls is the position of the camera. In other words, changing the rendered scene will change the number of draw calls being issued.

That being said, different aspect ratios could indeed change the number of draw calls being issued, due to frustum culling (culling objects outside of the camera's field of view). A 21:9 resolution could render more objects than a 16:9 resolution, and a 16:9 resolution more than 5:4.

What can also affect the number of objects being rendered is the camera FOV itself being changed. A smaller FOV = a smaller vertical & horizontal extent of the view = a smaller scene being rendered.
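
To put some rough numbers on that (just a toy sketch with made-up helper names, not code from any real engine): only the aspect ratio and the FOV enter a frustum test, never the pixel count, so 1080p and 1440p at the same 16:9 / 60° settings cull exactly the same set of objects.

```python
import math
import random

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    """Standard projection math: derive horizontal FOV from vertical FOV and aspect ratio."""
    v = math.radians(vertical_fov_deg)
    return 2.0 * math.atan(math.tan(v / 2.0) * aspect_ratio)

def visible_count(objects, vertical_fov_deg, aspect_ratio):
    """Count objects whose direction from a camera at the origin (looking down +Z)
    falls inside the horizontal half-angle of the frustum (vertical test omitted)."""
    half_h = horizontal_fov(vertical_fov_deg, aspect_ratio) / 2.0
    return sum(1 for x, z in objects if z > 0 and abs(math.atan2(x, z)) <= half_h)

random.seed(0)
objects = [(random.uniform(-50, 50), random.uniform(1, 100)) for _ in range(10_000)]

# Wider aspect ratio -> wider horizontal FOV -> more objects survive culling.
for aspect in (5 / 4, 16 / 9, 21 / 9):
    print(f"aspect {aspect:.2f}: {visible_count(objects, 60, aspect)} objects -> draw calls")

# 1080p vs 1440p at the same 16:9 aspect and 60 degree FOV would call
# visible_count() with identical arguments, so the count is unchanged.
```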
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Imagine what a Ryzen 1600 looks like with a 2080ti! It will be CPU limited at 1440p.

People complained about the 8700K price, but you only pay once. With AMD CPUs you pay $300 for the 1600, then $300 more for the 2600, then $350 more for Zen 2, just to get the performance of an 8700K at 5.0GHz when you upgrade your GPU at the high end.

Do you think a 2600 will push a 3080 Ti that's 35% faster next year at 1440p like an 8700K will? I don't think there's any way, and how about the 1600X? Not a chance.

1080p scores DO MATTER! They tell us how your CPU will perform when you upgrade your GPU.
Even 1440p is starting to matter more now that we have the 2080 Ti.

I hope we finally get some CPU reviews that take a good look at how the 1600 and 2600 compare to the 8700K with a 2080 Ti, because next year's 7nm RTX 3070 is this year's 2080 Ti.

Prices aside (as mentioned above), that's actually a good point and why I went with an 8700K over an 8600K or Ryzen.

To put it into perspective, the 8700K price is about 1/3 that of a high-end GPU like a 2080 Ti, so in the grand scheme of things, it doesn't change the overall cost of a high-end gaming system much to invest in a better CPU. While it may be 'overkill' in today's games, an 8700K will outlast a first or 2nd gen Ryzen chip in terms of gaming usefulness, so you can most likely squeeze out an extra generation or two of GPU upgrades before the CPU becomes the bottleneck.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Do you think a 2600 will push a 3080 Ti that's 35% faster next year at 1440p like an 8700K will? I don't think there's any way, and how about the 1600X? Not a chance.

Definitely, with RT ON even at 1080p :D

1080p scores DO MATTER! They tell us how your CPU will perform when you upgrade your GPU.
Even 1440p is starting to matter more now that we have the 2080 Ti.

New games always increase image quality and GPU performance requirements; what you see today is not how the RTX 2080 Ti will run 2019-2020 games at 1440p.

I hope we finally get some CPU reviews that take a good look at how the 1600 and 2600 compare to the 8700K with a 2080 Ti, because next year's 7nm RTX 3070 is this year's 2080 Ti.

How many gamers will buy a $600-800 RTX 3070 next year? And how many of those who buy an RTX 3070/3080/3080 Ti next year will have an R5 1600/2600 or an i5 8400? Next year, those people with an R5 1600/2600 or Core i5 8400/8600 will be happy with a used Vega 56/64 or GTX 1080/Ti, a new GTX 2060, or a new $250-400 AMD card.

Also to point out here: just because a CPU will not get 100% of the fps a GPU would be able to provide, it doesn't mean you will not get higher fps when upgrading from your older dGPU.
For example, even if an R5 2600X will not get the same fps at 1080p with an RTX 2080 Ti as you would get with a Core i7 8700K @ 5GHz, that doesn't mean you will not get a performance uplift if you upgrade to a 3080 Ti.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Prices aside (as mentioned above), that's actually a good point and why I went with an 8700K over an 8600K or Ryzen.

To put it into perspective, the 8700K price is about 1/3 that of a high-end GPU like a 2080 Ti, so in the grand scheme of things, it doesn't change the overall cost of a high-end gaming system much to invest in a better CPU. While it may be 'overkill' in today's games, an 8700K will outlast a first or 2nd gen Ryzen chip in terms of gaming usefulness, so you can most likely squeeze out an extra generation or two of GPU upgrades before the CPU becomes the bottleneck.
This was my point, thanks.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Driver overhead almost always refers to draw calls, and they're independent of resolution. What affects draw calls is the position of the camera. In other words, changing the rendered scene will change the number of draw calls being issued.

That being said, different aspect ratios could indeed change the number of draw calls being issued, due to frustum culling (culling objects outside of the camera's field of view). A 21:9 resolution could render more objects than a 16:9 resolution, and a 16:9 resolution more than 5:4.

What can also affect the number of objects being rendered is the camera FOV itself being changed. A smaller FOV = a smaller vertical & horizontal extent of the view = a smaller scene being rendered.

So given a hypothetical infinitely-powerful GPU that can always hit some arbitrary framerate at any resolution with any quality setting, would you expect CPU usage to change at different resolutions, given the same FoV and camera angle?
 
  • Like
Reactions: ryan20fun

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Last edited:

Hans Gruber

Platinum Member
Dec 23, 2006
2,092
1,065
136
Let me just point out: I have an Intel in my main system right now. The 2700 was $220 on Amazon yesterday, and the 2600 has been $150. In defense of the 2600 and 2700, from what I have read both chips can OC to 4GHz without any problem. The previous generation 1600/1700 were always stuck around 3.8GHz unless you really cranked up the voltage.

AMD made crap for 10 years, and suddenly Ryzen changed that well-deserved reputation.
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
So given a hypothetical infinitely-powerful GPU that can always hit some arbitrary framerate at any resolution with any quality setting, would you expect CPU usage to change at different resolutions, given the same FoV and camera angle?

The only possible case where CPU usage would change depending on resolution is if there's an occlusion check based on the final render size of the object. The only game that I know of with this feature is OpenMW, the open-source engine re-implementation of Morrowind, and even then, it could very well be slower (in terms of CPU performance) to check the object's render size in pixels than to just draw the object.
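
Roughly what that kind of check looks like (a sketch of the idea only, with made-up names, not OpenMW's actual code): because the projected size is measured in pixels, the same distant object can be culled at 1080p but drawn at 1440p, so the CPU-side draw list really can differ with resolution.

```python
import math

def projected_height_px(world_radius, distance, vertical_fov_deg, screen_height_px):
    """Approximate on-screen height, in pixels, of a sphere of the given radius at the given distance."""
    v = math.radians(vertical_fov_deg)
    frustum_height = 2.0 * distance * math.tan(v / 2.0)  # view height at that distance, world units
    return (2.0 * world_radius / frustum_height) * screen_height_px

def should_draw(world_radius, distance, vertical_fov_deg, screen_height_px, min_px=4.0):
    """Cull anything whose projected size falls below a pixel threshold."""
    return projected_height_px(world_radius, distance, vertical_fov_deg, screen_height_px) >= min_px

# The same small, distant object is culled at 1080p but drawn at 1440p:
for height in (1080, 1440):
    keep = should_draw(world_radius=0.75, distance=400.0, vertical_fov_deg=60.0, screen_height_px=height)
    print(f"{height}p: {'draw' if keep else 'cull'}")
```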
 
  • Like
Reactions: ryan20fun

Abwx

Lifer
Apr 2, 2011
10,847
3,297
136
Possibly the reviewer running a poorly-configured 2700x system. DDR4-2133 or similar.

Faulty BIOS on the AMD MB and out-of-the-box overclocking on the Intel MB. We'll soon know if the "reviewer" will take into account some observations that were raised in their forum and elsewhere; in the meantime he has published nothing about the exact settings, as if it was a paid review...
 
Last edited:
  • Like
Reactions: IEC and scannall

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
At lower res and high fps the benchmark is generally dictated by the memory subsystem, as can also be seen from cache differences.

As the years go by, the benchmark becomes bound by throughput.

The idea of evaluating CPU longevity by testing at lower res has been debunked. It's not that simple.
 

TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
I mean, I don't really know what you're getting at here. Testing a CPU at some resolution I will not use does not help me. I will still want to know what differences there will be at my chosen resolution. Maybe there will be none at all.
Sure it does help you (maybe not you, but people in general). If one CPU is x% faster, it means that at the same FPS (due to resolution, a weaker GPU, or any other reason) the faster CPU will have x% more CPU sitting idle, able to be used for background tasks or streaming or whatever, which seems to be a big deal on this forum.
Or, if that doesn't matter to you, it at least tells you that you can go with x% "less CPU" from the same vendor and arch and still achieve the same FPS, even with the overpowered card, not to mention with weaker cards. Yup, the same logic people use to justify getting a Ryzen can be used for lower-end Intels as well.

The RAM speeds aren't in there at all. What is troublesome is seeing what has been shown many times as a 12-14 percent deficit all of a sudden become 30%, with limited information on the test beds. It seems more like clickbait than anything to me. Intel currently enjoys a 7-ish percent advantage in IPC, a small disadvantage in SMT, plus a large clock speed advantage. To sum it up, this 30% is an outlier, by quite a bit, with too little information to dig into it.
Yup, ~7% IPC and about 20% (if not more) clock speed at all-core turbo, but 30% is an outlier...
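
For what it's worth, taking those two quoted figures at face value and assuming a fully CPU-bound scenario where they multiply, you land close to that number anyway:

```python
# Back-of-the-envelope with the figures quoted above (~7% IPC, ~20% all-core clock).
ipc_advantage = 1.07
clock_advantage = 1.20
print(f"combined advantage: {ipc_advantage * clock_advantage - 1:.1%}")  # ~28.4%
```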
 
  • Like
Reactions: Muhammed

PhonakV30

Senior member
Oct 26, 2009
987
378
136
I find it very strange:

Test A:
https://www.computerbase.de/thema/prozessor/rangliste/
Date: 17.9.2018, Time: 10:30

Test B:
https://www.computerbase.de/2018-09...080-ti/#diagramm-performancerating-frametimes
Date: 26.9.2018, Time: 14:00

Now go to Test A and choose only Assassin's Creed, Kingdom Come, Project Cars 2, and Total War: Warhammer, so 4 games.
Now do it again with Test B. The results would be:

Test A:
Core i7 8700K: 98.8 FPS
Ryzen 2700X: 85.9 FPS
Ryzen 2600X: 82.9 FPS

Test B:
Core i7 8700K: 98.3 FPS
Ryzen 2700X: 74.7 FPS
Ryzen 2600X: 53.5 FPS

So nothing changed for the 8700K, but there's about a 13% loss on the Ryzen 2700X and about a 35% loss on the Ryzen 2600X.
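
For anyone who wants to double-check, the drops recompute straightforwardly from the four-game averages quoted above:

```python
# Recompute the drops from the Test A / Test B four-game averages.
test_a = {"Core i7 8700K": 98.8, "Ryzen 2700X": 85.9, "Ryzen 2600X": 82.9}
test_b = {"Core i7 8700K": 98.3, "Ryzen 2700X": 74.7, "Ryzen 2600X": 53.5}

for cpu in test_a:
    loss = 1.0 - test_b[cpu] / test_a[cpu]
    print(f"{cpu}: {loss:.1%} lower in Test B")
# -> ~0.5% for the 8700K, ~13.0% for the 2700X, ~35.5% for the 2600X
```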
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
At lower res and high fps the benchmark is generally dictated by the memory subsystem, as can also be seen from cache differences.

As the years go by, the benchmark becomes bound by throughput.

The idea of evaluating CPU longevity by testing at lower res has been debunked. It's not that simple.
Got a recent link to those claims?
Something that has an 8700K vs a 1700X vs a 2700X with a 2080 Ti and a 2080 at 1440p? No need to use 1080p, really.
My guess is the 8700K will still be pulling away from the AMD CPUs, and next year the next-gen GPUs will look even worse for AMD, maybe even at 4K.
See the pattern?
1080p looked bad for the 1700X vs the 8700K last year with the 1080 Ti, 1440p looks bad for the 2700X vs the 8700K this year with the RTX 2080, and maybe next year with the RTX 3080 the 2700X will look even worse vs the 8700K at 5.0GHz at 4K.

That's the way it's looking.
The 9900K/9700K will pull away even more in games that use many threads.
 

PhonakV30

Senior member
Oct 26, 2009
987
378
136
Got a recent link to those claims?
Something that has an 8700K vs a 1700X vs a 2700X with a 2080 Ti and a 2080 at 1440p? No need to use 1080p, really.
My guess is the 8700K will still be pulling away from the AMD CPUs, and next year the next-gen GPUs will look even worse for AMD, maybe even at 4K.
See the pattern?
1080p looked bad for the 1700X vs the 8700K last year with the 1080 Ti, 1440p looks bad for the 2700X vs the 8700K this year with the RTX 2080, and maybe next year with the RTX 3080 the 2700X will look even worse vs the 8700K at 5.0GHz at 4K.

That's the way it's looking.
The 9900K/9700K will pull away even more in games that use many threads.

I guess you only saw this pattern in the latest CB test.
 
  • Like
Reactions: kawi6rr
Aug 11, 2008
10,451
642
126
At lower res and high fps the benchmark is generally dictated by the memory subsystem, as can also be seen from cache differences.

The idea of evaluating CPU longevity by testing at lower res has been debunked. It's not that simple.

By whom? Got a link to anybody but your own opinion?
I think that is an open question either way. It might, but nobody really knows for sure.