Speculation: i9-9900K is Intel's last hurrah in gaming


Will Intel lose its gaming CPU lead in 2019?



Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
You can't argue that larger L3 doesn't reduce the number of trips to memory required. The only question is by how much they are reduced. That's why headline figures mean diddly squat.

Seconded.

In the graphs at the link below we can clearly see the influence of L3 cache size: the 2400G has a higher clock but only 4MB of L3 (compared with the R5 1400/1500X).

In games, at roughly 10% lower frequency, the R5 1400 (8MB) is only 3% behind, while the 1500X (16MB) is 10% ahead despite its lower frequency; those percentages extend even further in the applications graph.


https://www.hardware.fr/articles/973-22/indices-performance-cpu.html
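
For what it's worth, here's a quick back-of-envelope normalization of those numbers (my own sketch: the perf and clock ratios are the figures quoted above, the 1500X clock ratio is approximate, and linear frequency scaling is assumed):

```python
# Back-of-envelope: normalize the hardware.fr gaming indices by clock speed,
# so the leftover delta can be attributed to L3 size (and other uarch effects).
# Assumes performance scales linearly with frequency, which is an idealization.

chips = {
    # name: (gaming perf vs 2400G, clock vs 2400G, L3 in MB)
    "2400G":    (1.00, 1.00, 4),
    "R5 1400":  (0.97, 0.90, 8),   # ~10% lower clock, only 3% behind
    "R5 1500X": (1.10, 0.95, 16),  # lower clock (approx.), yet 10% ahead
}

for name, (perf, clock, l3) in chips.items():
    print(f"{name:9s} L3={l3:2d}MB  perf/clock vs 2400G: {perf / clock:.2f}x")
```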
 

B-Riz

Golden Member
Feb 15, 2011
1,482
612
136
Sure, you can make the argument that the 3900X has the 'gaming crown' with an outlier review.

But there are 20 others out there with a differing view.

At the end of the day, I think both the 3900X and 9900K are overpriced as strictly 'gaming only' CPUs.

You can easily save $200 compared to either and still get comparable gaming performance.

But on the topic of this thread, I think the results are definitive enough to say that the 9900K hasn't been 'dethroned' as the fastest gaming CPU, although in many cases its lead over the 3900X is small enough to be trivial.

The 9900K is Intel's overall best CPU right now.

I think it is *better* to compare the 3900X to the 9920X, as, oddly, the 9900K is, for the most part, Intel's best workstation CPU also, up to a certain point.

The 3900X puts a lot of pressure on X299, just like 1800X did to X99. Then Threadripper happened...

The Intel product stack is too "fat" right now; the i7 9700K ($410 !!! at Newegg) is a bad buy next to the 9900K / 9900KF.

And any i5 is a bad buy next to the i7 8700K.

I think the 9900K is just ~$50-$75 too expensive for a desktop processor.

Hmm, look what is on sale right now at Newegg... (might have to sell the 8700k...)

As predicted, Intel has *sales* (price drops) but is still a few $ more than the new competition. 3800X is $400.

Dang, it is a good time to be a PC enthusiast.

[Newegg price screenshots]
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
The 9900K is Intel's overall best CPU right now.

I think it is *better* to compare the 3900X to the 9920X, as, oddly, the 9900K is, for the most part, Intel's best workstation CPU also, up to a certain point.

The 3900X puts a lot of pressure on X299, just like 1800X did to X99. Then Threadripper happened...

The Intel product stack is too "fat" right now; the i7 9700K ($410 !!! at Newegg) is a bad buy next to the 9900K / 9900KF.

And any i5 is a bad buy next to the i7 8700K.

I think the 9900K is just ~$50-$75 too expensive for a desktop processor.

Hmm, look what is on sale right now at Newegg... (might have to sell the 8700k...)

As predicted, Intel has *sales* (price drops) but is still a few $ more than the new competition. 3800X is $400.

Dang, it is a good time to be a PC enthusiast.

[Newegg price screenshots]
Well, Intel didn't lose this battle of our time on the 8C front.
As much as I would like it, they won't reduce prices by much (if anything).
Or maybe when they release the 9900KickFromChuck model they'll reduce the 9900K to 389EUR and the 9900KF to 339EUR.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
You mean the leaked slides which put Rocket Lake in 2022? Those are likely to be outdated or fake.

Nope.

There have been numerous other sources, and don't get hung up on 2022; 2021 is bad enough.

You effectively seem to be ruling out the possibility that Zen 3 will get here before Intel 10nm and/or that Zen 3 could be a better gaming CPU than the 9900K.
 
  • Like
Reactions: Tlh97 and epsilon84

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Vega 64: 1.274GHz, 4096 cores, 14nm, 295W
Vega 7: 1.400GHz, 3840 cores, 7nm, 300W

The 7 doubled the HBM bus from 2048 to 4096 bits and a little more than doubled memory bandwidth, which is basically where all of the performance increase came from (e.g. downclock and undervolt the Vega 7 to Vega 64 levels and performance doesn't drop much from stock, but remains above the 64 due to the 2x+ memory speed).

Vega 7 has 30% higher power efficiency than Vega 64. This is only due to process. The power efficiency gain would have been even higher if AMD had decided to go iso-clock: with a 10% higher clock you lose part of the power savings.
Now looking at Zen 2, there is hardly any frequency increase, so the whole process gain goes into power efficiency.
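
To put rough numbers on that last point, here's a toy model (my sketch; it assumes the usual dynamic-power relation P ∝ f·V² with voltage tracking frequency near the top of the V/f curve, both simplifications):

```python
# Toy model of why a ~10% clock bump eats into a node's efficiency gain.
# Dynamic power scales roughly as f * V^2; near the top of the V/f curve,
# assume voltage rises in step with frequency (a simplification).

f_bump = 1.10                        # Vega 7 clocked ~10% above Vega 64

power_cost = f_bump * f_bump ** 2    # ~1.33x power for 10% more performance
perf_per_watt = f_bump / power_cost  # ~0.83x perf/W relative to iso-clock

print(f"power: {power_cost:.2f}x, perf/W vs iso-clock: {perf_per_watt:.2f}x")
```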
 
  • Like
Reactions: Tlh97

tamz_msc

Diamond Member
Jan 5, 2017
3,772
3,595
136
...from Intel.

Of course they are going to cherry pick their best case scenarios to highlight a point. It is well known that Ryzen suffers in FC5 and GTA V.
I'd like to see similar graphs relating to Zen+ and Zen 2. I've asked for such before.
If we look at CS:GO, we can see that even Intel sees big uplifts from L3 hits, which suggests that Zen 2's larger L3 should hit even more. Guess what? Zen 2 now wins in CS:GO.
What matters is how many fewer requests go to memory as a result of the increased L3, as each L3 hit is a net 50ns+ improvement over Zen+. My own calculations indicate only 25% fewer trips to memory are needed as a result of the larger L3 for Zen 2 to have the same real-world latency as Intel.
You can't argue that larger L3 doesn't reduce the number of trips to memory required. The only question is by how much they are reduced. That's why headline figures mean diddly squat.
The relationship between hit rate and cache size is not linear: as a rule of thumb, doubling the cache size only cuts the miss rate by a factor of about √2. There is no doubt that Zen 2's larger L3 reduces trips to memory, but it is not enough to close the memory-access gap with Skylake, as access to main memory is much slower on Zen 2.
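
A minimal sketch of that arithmetic (the baseline miss rate and the L3 hit latency are illustrative assumptions; the DRAM latencies are loosely anchored to the figures discussed below):

```python
from math import sqrt

# Square-root rule of thumb: miss rate scales as 1/sqrt(cache size),
# so doubling L3 cuts the miss rate by ~29%, not by half.

def amat(l3_hit_ns, mem_ns, miss_rate):
    """Average access time past L2: L3 hits vs trips to main memory."""
    return (1 - miss_rate) * l3_hit_ns + miss_rate * mem_ns

base_miss = 0.10                  # assumed Zen+ L3 miss rate (illustrative)
zen2_miss = base_miss / sqrt(2)   # Zen 2 doubles L3 per CCX

print(f"Zen+    AMAT: {amat(10, 70, base_miss):.1f} ns")  # ~16.0 ns
print(f"Zen 2   AMAT: {amat(10, 66, zen2_miss):.1f} ns")  # ~14.0 ns
print(f"Skylake AMAT: {amat(10, 45, base_miss):.1f} ns")  # ~13.5 ns
```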
 
  • Like
Reactions: Tlh97 and guachi

tamz_msc

Diamond Member
Jan 5, 2017
3,772
3,595
136
Where are you getting this from? AFAIK we're talking 43-45ns on Skylake vs. 65-67ns on Zen 2.
From the AT review.
[Memory latency plots (log scale) for the 9900K and 3900X, from the AT review]
 
  • Like
Reactions: Tlh97

tamz_msc

Diamond Member
Jan 5, 2017
3,772
3,595
136
Fair enough, that's 2X according to this testing methodology. Where did your 2.5X come from?
This graph is logarithmic; I was eyeballing the linear graph, where the curves are all compressed, so it appeared to be about 2.5x. You can ignore that.
 

B-Riz

Golden Member
Feb 15, 2011
1,482
612
136
Probably Adored; the guy was constantly throwing BS against the wall.

Nothing would stop them from putting the Zen 2 design on 12nm or 14nm; it would just mean bigger dies, higher power consumption, and lower power/heat density. Perhaps even higher clock headroom. Look at the Zen 2 ICs under the IHS; there's loads of room.

7nm's primary advantage seems to be how many dies per wafer, thus profit, as long as they keep fab costs under control.

IOW, I think the good stuff on Zen2 is all design, not really anything special to do with 7nm.

We saw similar things with Vega to Vega 7. 7nm made nearly zero difference.

No, I don't follow FUD and hot air.

I think you are discounting the benefit of a smaller manufacturing process for x86 CPUs.


A smaller process means more transistors on the package; the extra L3 is there because of the inherent design of Zen and the need to reduce the memory latency issue of Zen and Zen+.
 

Abwx

Lifer
Apr 2, 2011
10,939
3,440
136
The relationship between hit rate and cache size is not linear: as a rule of thumb, doubling the cache size only cuts the miss rate by a factor of about √2. There is no doubt that Zen 2's larger L3 reduces trips to memory, but it is not enough to close the memory-access gap with Skylake, as access to main memory is much slower on Zen 2.

That doesn't hold if we compare a 3700X (4.4GHz peak) to a 9900K (5GHz peak): despite a 13% higher frequency, the latter is only 3% and 2% ahead in average FPS and minimum framerate respectively (at 1080p):

https://www.computerbase.de/2019-07...mm-test-performancerating-fuer-spiele-fps-fhd

The gap is more than closed; they don't even need to reach 5GHz...
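
The implied per-clock picture from those numbers (a quick sketch using only the figures cited above):

```python
# Per-clock comparison implied by the computerbase.de numbers above:
# a 9900K at 5.0GHz ends up ~3% ahead of a 3700X at 4.4GHz in average FPS.

freq_9900k, freq_3700x = 5.0, 4.4
fps_lead = 1.03                          # 9900K average-FPS lead at 1080p

clock_ratio = freq_9900k / freq_3700x    # ~1.14x clock advantage
perf_per_clock = fps_lead / clock_ratio  # below 1.0 means Zen 2 leads per clock

print(f"9900K perf/clock vs 3700X: {perf_per_clock:.2f}x")  # ~0.91x
```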
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Vega 64: 1.274GHz, 4096 cores, 14nm, 295W
Vega 7: 1.400GHz, 3840 cores, 7nm, 300W

Slightly off-topic, but where are you getting those clockspeeds?

The gap is more than closed; they don't even need to reach 5GHz...

More than a few cores on Matisse need to hit around 4.6 GHz (or better) to close the gap with (or take a lead over) the 9900K. The process may just not allow such a thing to happen without phenomenal ambient cooling.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
From the AT review.
[Memory latency plots (log scale) for the 9900K and 3900X, from the AT review]

How do we know there isn't going to be some L4 in the IOD next round, if it's on 7nm+?

With the lower cost and power, and the higher performance and density, of EUV 7nm+, adding a usefully sized L4 seems like an obvious next step in debottlenecking.
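
Sketching what that could do, with purely made-up numbers (this just extends the average-access-time arithmetic from earlier in the thread; nothing here is a known spec):

```python
# Speculative: what a hypothetical L4 in the IOD could do to average latency.
# Assumed: 10ns L3 hit, 40ns L4 hit, 70ns DRAM, 10% L3 miss rate,
# and an L4 big enough to catch half of the L3 misses.

def amat_with_l4(l3_hit, l4_hit, mem, l3_miss, l4_miss):
    """Average access time past L2 with an L4 between L3 and DRAM."""
    return (1 - l3_miss) * l3_hit + l3_miss * (
        (1 - l4_miss) * l4_hit + l4_miss * mem)

no_l4 = (1 - 0.10) * 10 + 0.10 * 70             # 16.0 ns
with_l4 = amat_with_l4(10, 40, 70, 0.10, 0.50)  # 14.5 ns

print(f"no L4: {no_l4:.1f} ns, with L4: {with_l4:.1f} ns")
```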
 
  • Like
Reactions: Tlh97 and epsilon84

TheGiant

Senior member
Jun 12, 2017
748
353
106
That doesn't hold if we compare a 3700X (4.4GHz peak) to a 9900K (5GHz peak): despite a 13% higher frequency, the latter is only 3% and 2% ahead in average FPS and minimum framerate respectively (at 1080p):

https://www.computerbase.de/2019-07...mm-test-performancerating-fuer-spiele-fps-fhd

The gap is more than closed; they don't even need to reach 5GHz...
That is an outlier review.
Look at:

starts here https://techreport.com/review/34672/amd-ryzen-7-3700x-and-ryzen-9-3900x-cpus-reviewed/8/
https://www.techspot.com/review/1869-amd-ryzen-3900x-ryzen-3700x/
https://www.sweclockers.com/test/27760-amd-ryzen-9-3900x-och-7-3700x-matisse/1#content
https://www.tomshardware.com/reviews/ryzen-9-3900x-7-3700x-review,6214-7.html
https://www.purepc.pl/procesory/test_procesora_amd_ryzen_7_3700x_premiera_architektury_zen_2


No, despite the fact that I can finally recommend the Ryzen 3000 line to gamers, it is not quite there.

Maybe now the people even here realize how stupid it was to compare the Ryzen 2000 lineup.
Ryzen 3000 is competitive with Skylake; Ryzen 2000 wasn't.
 

exquisitechar

Senior member
Apr 18, 2017
657
871
136
Has any site done an actual clock-for-clock comparison in gaming? I honestly think Ryzen is pretty close now, if not on par in that regard. I only know of that video where the 8700K and 3600X trade blows while being close in clock speed with the same core/thread count. It would be interesting to see a proper comparison.
 
  • Like
Reactions: Charlie22911

Charlie22911

Senior member
Mar 19, 2005
614
228
116
Couldn’t some on-package HBM/eDRAM conceivably be used as a pseudo-L4 to further diminish the impact of latency?
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Has any site done an actual clock-for-clock comparison in gaming? I honestly think Ryzen is pretty close now, if not on par in that regard. I only know of that video where the 8700K and 3600X trade blows while being close in clock speed with the same core/thread count. It would be interesting to see a proper comparison.

It's not by design a clock-for-clock comparison, but the 3600 overclocked to 4.3GHz just so happens to match the clockspeed of an 8700K:
https://www.google.com/url?sa=t&sou...Vaw38npTBOa9wjcKyYE0kgSld&cshid=1562674292092
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Couldn’t some on-package HBM/eDRAM conceivably be used as a pseudo-L4 to further diminish the impact of latency?

Not sure about HBM. I do not think eDRAM integrated into the I/O die would be a better solution than just hitting the memory controller. Could be wrong though.
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
Computerbase an outlier?

Techreport? See this difference between the 1800X and the 2700X, among others:

[Deus Ex: Mankind Divided average-FPS and 99th-percentile charts]


And at KitGuru:

[KitGuru Deus Ex 1080p chart]




https://www.kitguru.net/components/...en-9-3900x-ryzen-7-3700x-zen-2-cpu-review/10/

Undoubtedly you know which ones are the "serious" sites...
https://www.kitguru.net/components/...en-9-3900x-ryzen-7-3700x-zen-2-cpu-review/10/
That's a much higher resolution at KitGuru, which reduces the CPU dependency vs 1080p.

Here, on the same site: https://www.kitguru.net/components/cpu/luke-hill/amd-ryzen-9-3900x-ryzen-7-3700x-zen-2-cpu-review/8/

1080p as the better CPU testing resolution.

Yeah, it depends on the site...
 
  • Like
Reactions: Zucker2k

misuspita

Senior member
Jul 15, 2006
400
438
136
At the end of the day, I think both the 3900X and 9900K are overpriced as strictly 'gaming only' CPUs.
But there is a catch. If someone wanted a top-of-the-line productivity setup and a gaming setup, until recently that meant two separate systems. Now you don't have to make that distinction: a 3900X, and probably a 3950X too, can make "the best" productivity and gaming system. Two in one.
 

misuspita

Senior member
Jul 15, 2006
400
438
136
I don't see how waiting until 2022 is sustainable, so I imagine that if Intel's process is still unviable they will go to a foundry, but who knows what date it would come out then.
I don't think that's an option for Intel; they tailor their fabs and silicon to their designs, which is how they are able to put out 5GHz CPUs. I don't think TSMC or any other fab can do this now.

Plus, let's not forget how gigantic Intel's needs are. They would choke all of TSMC's 7nm capacity if they moved all their products over :)
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
You mean the ones like Tom's, or Anand's, or TPU? Guess I forgot to read them all so I could cherry-pick the "right" one.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
But there is a catch. If someone wanted a top-of-the-line productivity setup and a gaming setup, until recently that meant two separate systems. Now you don't have to make that distinction: a 3900X, and probably a 3950X too, can make "the best" productivity and gaming system. Two in one.

No argument from me there; for mixed gaming/productivity workloads the 3900X trumps the 9900K, unless the software isn't heavily threaded, as with Photoshop.

Like others have said, the 9900K (at its current price) makes very little sense for most people, unless you're in that 0.1% that NEEDS 5.0GHz for whatever reason; maybe an SLI 2080 Ti setup?!