Speculation: AMD's response to Intel's 8-core i9-9900K


How will AMD respond to the release of Intel's 8-core processor?

  • Ride it out with the current line-up until 7nm in 2019
    Votes: 129 (72.1%)

  • Release Ryzen 7 2800X, using harvested chips based on the current version of the die
    Votes: 30 (16.8%)

  • Release Ryzen 7 2800X, based on a revision of the die, taking full advantage of the 12LP process
    Votes: 17 (9.5%)

  • Something else (specify below)
    Votes: 3 (1.7%)

  Total voters: 179

coercitiv

Diamond Member
Jan 24, 2014
Let me explain,
If you want the message to get across, you need to keep it simpler than that :)

The 9900K beats the 7820X in professional workloads because it has a higher TDP ceiling. Set the same high power limits on the 7820X, let it run at max clocks on all cores, and the difference will evaporate. This will become even clearer once the i7-9800X hits the shelves; all praise the sTIM and the updated 165W TDP!

Also, for whoever wants to understand the massive difference stock power settings induce, here's a video from today's selection:
 

dlerious

Platinum Member
Mar 4, 2004
In the vast majority of cases the 9900K is irrelevant. AtenRa is right, since he lives in the EU: 678€ for a 9900K, 650€ for a 1920X, and 750€ for the 1950X.

The only thing he's not accurate about is that it's not only the 1920X that is price competitive with Intel's 8C/16T; the 1950X is a much better buy as well.

https://geizhals.de/intel-core-i9-9900k-bx80684i99900k-a1870092.html#offerlist

https://geizhals.de/?cat=cpuamdam4&xf=12099_Desktop~820_TR4

Now, if your raison d'être is to be milked by Intel, that's another story...
Your prices are high over there. Last I checked here, the 1920X was around $420, the 1950X $680, and the 9900K $580 (excluding Micro Center sales). Are the 20 series RTX cards inflated too?
 

Abwx

Lifer
Apr 2, 2011
Your prices are high over there. Last I checked here, the 1920X was around $420, the 1950X $680, and the 9900K $580 (excluding Micro Center sales). Are the 20 series RTX cards inflated too?

Dunno about graphics cards, but for CPUs the relative prices are the same, that is, the deltas are the same percentage-wise. German prices also include 19% VAT; with that applied, the 9900K of your example would be about $690, which is comparable to 678€, since Micro Center is generally quite cheap for the US...
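For illustration, here is that VAT arithmetic spelled out (a minimal sketch; the prices are simply the ones quoted in this exchange):

```python
# German listings include 19% VAT, while US prices are quoted pre-tax,
# so a fair comparison adds VAT to the US price first.
VAT_RATE = 0.19

def with_vat(pre_tax_price: float, rate: float = VAT_RATE) -> float:
    """Return a pre-tax price with VAT added."""
    return pre_tax_price * (1 + rate)

us_9900k = 580  # USD, the US street price quoted above
print(f"9900K incl. 19% VAT: ${with_vat(us_9900k):.0f}")  # ~$690, vs. the 678 EUR German listing
```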
 

PeterScott

Platinum Member
Jul 7, 2017
Every review is using only a single application, and everyone is taking this as a general power usage indication for each CPU. Now suddenly this power usage only matters if we build a Pov-Ray farm??

Exactly when did Perf/watt become the primary measure of importance for high performance desktop computers? Goalpost shifting because the performance numbers turned out so good on the 9900K?

Perf/Watt is a big deal for mobile, and when you build a render farm, but not so much for high performance desktops. It's a nice bonus when all else is equal, but it takes a back seat to performance for high performance desktop computers.

As an all round high performance desktop CPU, the 9900K is essentially the top performer.
 

Abwx

Lifer
Apr 2, 2011
Exactly when did Perf/watt become the primary measure of importance for high performance desktop computers? Goalpost shifting because the performance numbers turned out so good on the 9900K?


You are just digging your lost case deeper; actually it's not only perf/watt but also absolute perf. AtenRa posted both metrics.

So, AMD's TR2 2920X @ $649 is currently a direct competitor even if the 9900K is "using a mainstream platform".


And this is why the i9 9900K is not suited for workstation/professional workloads. It is better to invest in TR2 or Intel 2066 socket.

From the AT TR2 2920X review:
https://www.anandtech.com/show/13516/the-amd-threadripper-2-cpu-review-pt2-2970wx-2920x

Pov-Ray perf/watt

Core i9 9900K = 5542 / 168.48 W = 32.89
Core i9 7920X = 5861 / 140.02 W = 41.85

The 7920X has 27% higher perf/watt than the 9900K in this application. It also shows that, with those aggressive turbo clocks, the i9 9900K draws well beyond its 95W TDP, and even beyond the 140W TDP class. This CPU is clearly not a mainstream SKU; you cannot get the same performance using H310/B360 versus Z390 motherboards.
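For illustration, the arithmetic behind those figures (a minimal sketch in Python; the scores and wattages are the AnandTech numbers quoted above):

```python
# Perf/watt = benchmark score divided by measured package power.
# Figures are the AnandTech POV-Ray numbers quoted above.
cpus = {
    "Core i9-9900K": (5542, 168.48),  # (POV-Ray score, watts)
    "Core i9-7920X": (5861, 140.02),
}

ppw = {name: score / watts for name, (score, watts) in cpus.items()}
for name, value in ppw.items():
    print(f"{name}: {value:.2f} points/W")

advantage = ppw["Core i9-7920X"] / ppw["Core i9-9900K"] - 1
print(f"7920X perf/watt advantage: {advantage:.0%}")  # ~27%
```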

[AnandTech charts: POV-Ray performance and power consumption]
 

PeterScott

Platinum Member
Jul 7, 2017
You are just digging your lost case deeper; actually it's not only perf/watt but also absolute perf. AtenRa posted both metrics.

You really need to reconsider who is digging deeper after reposting his post, which made me look a bit closer.

It highlights a couple of things. First, the extreme bias and partisanship that always sees things drastically in the chosen side's favor, regardless of the reality of the situation; and of course how those graphs don't really tell the story he is selling and you're backing.

Naturally, he of course picked one of the potential worst cases for the 9900K vs the 2920X: an embarrassingly parallel rendering task, which should be a weakness of the lower core count part.

Yet with 50% more cores, the 2920X was only 2% faster! (5661/5542)

And after that huge diatribe about perf/watt:

PS2: Oh, and between you and me, the 12-core TR2 2920X completely destroys the 9900K in perf/watt in highly threaded applications. This perf/watt and high efficiency is something the Intel boys were touting for the last 4-5 years, but it seems recently it's not that important anymore.

Ahem: From the graphs you highlighted again:
Pov-Ray perf/watt

Core i9 9900K = 5542 / 168.48 W = 32.89
TR 2920X = 5661 / 179.11 W = 31.61

2920X actually has slightly worse perf/watt...
 

Abwx

Lifer
Apr 2, 2011
Yet with 50% more cores, the 2920X was only 2% faster! (5661/5542)

And after that huge diatribe about perf/watt:



Ahem: From the graphs you highlighted again:
Pov-Ray perf/watt

Core i9 9900K = 5542 / 168.48 W = 32.89
TR 2920X = 5661 / 179.11 W = 31.61

2920X actually has slightly worse perf/watt...

Granted, but that's the most favourable rendering case for the 9900K; all the other renderers are not that ICL-friendly, including the Intel-developed Embree engine within the Corona renderer...

https://www.anandtech.com/show/13516/the-amd-threadripper-2-cpu-review-pt2-2970wx-2920x/6

You can compare POV-Ray to the rest of the bunch; only Cinema 4D is missing, lol...
 

mattiasnyc

Senior member
Mar 30, 2017
I have to agree with Peter.

From what I can see, the decisions in at least media content creation are a) OSX or Windows/Linux, and b) desktop or HEDT (or server/render farms).

If the decision is to get a desktop platform CPU, then power just isn't an issue; processing time is. Faster work = more work = more profit. It's that simple. If the decision, on the other hand, is HEDT, then of course that's a different matter. Now, if the 9900K is on par with the 2920X in performance and the price is the same, then, if nothing special is needed (i.e. Thunderbolt), X399 wins out simply because of connectivity (minus TB).
 

Zucker2k

Golden Member
Feb 15, 2006
PS2: Oh, and between you and me, the 12-core TR2 2920X completely destroys the 9900K in perf/watt in highly threaded applications. This perf/watt and high efficiency is something the Intel boys were touting for the last 4-5 years, but it seems recently it's not that important anymore.
I don't know if you're really paying attention to the charts. Do you remember the 7980XE? Check out its performance per watt against the AMD counterparts. The Intel HEDT chips, which were the laughing stock of some on this forum, actually look great compared to AMD's offerings. On the performance chart, the picture is a mixed bag. On the power consumption chart, the AMD chips are all at the bottom, save for one, the 1900X. Yeah, you didn't pay close attention to the charts.
 

Zucker2k

Golden Member
Feb 15, 2006
Granted, but that's the most favourable rendering case for the 9900K; all the other renderers are not that ICL-friendly, including the Intel-developed Embree engine within the Corona renderer...

https://www.anandtech.com/show/13516/the-amd-threadripper-2-cpu-review-pt2-2970wx-2920x/6

You can compare POV-Ray to the rest of the bunch; only Cinema 4D is missing, lol...
But not performance per watt, which is his original argument. Plus, the 9900K held its own; it even won more than a couple of the benchmarks in the HEDT section. Well, it's not only Cinema 4D that is missing. I may be wrong, but I didn't see any DAW/audio tests either...
 

Abwx

Lifer
Apr 2, 2011
But not performance per watt, which is his original argument. Plus, the 9900K held its own; it even won more than a couple of the benchmarks in the HEDT section. Well, it's not only Cinema 4D that is missing. I may be wrong, but I didn't see any DAW/audio tests either...

With such setups I would be more comfortable with something that I could later upgrade hugely for cheap. I mean, a 24C will be the same price as the current Summit Ridge 12C (410€...) at some point, or even a 16C, since the 1950X is only 10% pricier than the 9900K and should be counted as a potential contender.

As for putting DAW/audio in the debate, that's pure deflection. If this software doesn't perform adequately with such a CPU, then it means that something is broken in the software and that it's a limitation even for Intel CPUs. In such a case I would choose the software that has the best performance across the board, and obviously this is not the one, because what is benched is not only the CPU...
 

mattiasnyc

Senior member
Mar 30, 2017
As for putting DAW/audio in the debate, that's pure deflection. If this software doesn't perform adequately with such a CPU, then it means that something is broken in the software and that it's a limitation even for Intel CPUs

Except testing bears out that you're wrong about that. Intel CPUs perform better in certain DAW/Audio tasks. It's as simple as that. I'll never buy Intel for myself unless there's a change in corporate attitude toward customers, but despite being an AMD fanboy I can't ignore that Intel performs better per clock cycle than AMD. It's just the way it is.
 

Abwx

Lifer
Apr 2, 2011
Except testing bears out that you're wrong about that. Intel CPUs perform better in certain DAW/Audio tasks.

It's not a matter of performing better; the software is bugged, or at least very badly coded. No competently designed software can produce results such as the ones below; they do not make sense by any metric. IPC, RAM bandwidth, and latency seem to have no bearing on performance; the only thing that matters is whether it's an Intel CPU or not:

https://techreport.com/review/34214/amd-ryzen-threadripper-2920x-cpu-reviewed/7

So what's your take: what is the parameter that makes the difference? A CPU dispatcher baked in at compile time?

Also from TechReport, about this nicely designed software:

Apologies for the lack of results at 96 KHz and a buffer depth of 96 here. Thanks to something in the chain of Reaper, Windows 10, and our ASIO driver, our many-core CPUs couldn't run the 96-96 test at all—we got popping and crackling from the get-go.
 

Zucker2k

Golden Member
Feb 15, 2006
It's not a matter of performing better; the software is bugged, or at least very badly coded. No competently designed software can produce results such as the ones below; they do not make sense by any metric. IPC, RAM bandwidth, and latency seem to have no bearing on performance; the only thing that matters is whether it's an Intel CPU or not:

https://techreport.com/review/34214/amd-ryzen-threadripper-2920x-cpu-reviewed/7

So what's your take: what is the parameter that makes the difference? A CPU dispatcher baked in at compile time?

Also from TechReport, about this nicely designed software:

Apologies for the lack of results at 96 KHz and a buffer depth of 96 here. Thanks to something in the chain of Reaper, Windows 10, and our ASIO driver, our many-core CPUs couldn't run the 96-96 test at all—we got popping and crackling from the get-go.
That's as real world as it gets. Plus, latency is critical in DAW work, and we know how Threadripper does in that category. Other factors are hard page faults and CPU stalls, etc., which all relate to the cache and memory subsystem.
Edit: Plus, from the AnandTech review, Intel handles legacy code better than AMD.
 

mattiasnyc

Senior member
Mar 30, 2017
It's not a matter of performing better; the software is bugged, or at least very badly coded. No competently designed software can produce results such as the ones below; they do not make sense by any metric. IPC, RAM bandwidth, and latency seem to have no bearing on performance; the only thing that matters is whether it's an Intel CPU or not:

https://techreport.com/review/34214/amd-ryzen-threadripper-2920x-cpu-reviewed/7

So what's your take: what is the parameter that makes the difference? A CPU dispatcher baked in at compile time?

Also from TechReport, about this nicely designed software:

Apologies for the lack of results at 96 KHz and a buffer depth of 96 here. Thanks to something in the chain of Reaper, Windows 10, and our ASIO driver, our many-core CPUs couldn't run the 96-96 test at all—we got popping and crackling from the get-go.

TechReport did test at 96 kHz and 96 samples before. In the 9900K review here they tested Threadripper CPUs just fine. So, since that was done a while ago, it points to the problem being with updated components and not the Reaper test suite, which as far as I know has not been updated. So between the DAW, the plugins, the OS, and potential BIOS/firmware updates, it's just not correct to point the finger at Reaper.

Furthermore, the common denominator is actually NOT the software "Reaper"; it is Intel CPUs vs the Zen architecture. TechReport isn't the only tester that uses DAWbench for DAW testing; there are at least a couple of system builders that do this regularly as well. (You can read through tests at scanproaudio.info if you want to spend the time.)

So, we can speculate all day long about why this is happening, but Zen has consistently been behind Intel in this one specific test case, regardless of the reviewer, etc.
 

AtenRa

Lifer
Feb 2, 2009
You really need to reconsider who is digging deeper after reposting his post, which made me look a bit closer.

It highlights a couple of things. First, the extreme bias and partisanship that always sees things drastically in the chosen side's favor, regardless of the reality of the situation; and of course how those graphs don't really tell the story he is selling and you're backing.

Naturally, he of course picked one of the potential worst cases for the 9900K vs the 2920X: an embarrassingly parallel rendering task, which should be a weakness of the lower core count part.

Yet with 50% more cores, the 2920X was only 2% faster! (5661/5542)

And after that huge diatribe about perf/watt:



Ahem: From the graphs you highlighted again:
Pov-Ray perf/watt

Core i9 9900K = 5542 / 168.48 W = 32.89
TR 2920X = 5661 / 179.11 W = 31.61

2920X actually has slightly worse perf/watt...


Measuring perf/watt, you have to take a few things into consideration: you either take perf/watt at the same (or close to the same) performance, or at the same (or close to the same) power/energy consumption (Wh).

Since we don't do the testing, we only have the absolute highest performance of each product taken from the reviews. So in the end we can only measure the absolute highest performance divided by the wattage they measure. And each review only measures one or two applications, so we have to see more reviews in order to have a more complete idea of the perf/watt.

In the AT review, I compared the one-year-old Core i9 7920X because it has the same or higher performance than the 9900K but also lower wattage, making its absolute performance per watt higher than the 9900K's. The 7920X was chosen to highlight that the 9900K is not a good product for professional use because of its worse perf/watt, limited platform features, and high price.

As for the TR2 2920X: yes, in the POV-Ray benchmark in the AT review its perf/watt is not better than the 9900K's, but when you look at more reviews the picture gets clearer.

For example, from the TechSpot review:
https://www.techspot.com/review/1737-amd-threadripper-2970wx-2920x/

The TR2 2920X finishes the Blender workload in 13% less time (faster) than the Core i9 9900K while using only 6% more power. If you have both processors finish the same Blender work and measure the energy used, you will see that the 2920X is considerably more efficient.

[TechSpot charts: Blender render time and power consumption]


Another example, from PCper
https://www.pcper.com/reviews/Proce...per-2920X-and-2970WX-Review/Synthetic-Testing

Here, in Cinebench R15 MT, the TR2 2920X is 26.4% faster than the Core i9 9900K while using only 20.8% more power. Again, if you have both processors finish the same benchmark and compare the energy used, the TR2 2920X comes out well ahead of the 9900K.
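To make the "finish the same work, compare the energy" point concrete, here is a minimal sketch of the reasoning (the ratios are the TechSpot and PCPer figures quoted above; for a fixed workload, energy = power * time, and time scales inversely with speed):

```python
# Energy to complete a fixed workload: E = P * t, where run time t is
# inversely proportional to throughput. Ratios are relative to the 9900K.
def relative_energy(power_ratio: float, speed_ratio: float) -> float:
    """Energy used by the 2920X relative to the 9900K (1.0 = equal)."""
    return power_ratio / speed_ratio

# TechSpot Blender: 13% less time (so 1/0.87 the speed) at 6% more power.
print(f"Blender:   {relative_energy(1.06, 1 / 0.87):.2f}x the energy")  # ~0.92x
# PCPer Cinebench R15 MT: 26.4% faster at 20.8% more power.
print(f"Cinebench: {relative_energy(1.208, 1.264):.2f}x the energy")    # ~0.96x
```

Both ratios land below 1.0, which is the sense in which the 2920X is the more efficient chip for the same job.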

[PCPer charts: Cinebench R15 multi-threaded score and power consumption]


Of course, there are applications where the perf/watt will be close, but when the application or workload demands a high thread count (high throughput), the TR2 2920X will have the efficiency advantage over the 9900K.
 

AtenRa

Lifer
Feb 2, 2009
I don't know if you're really paying attention to the charts. Do you remember the 7980XE? Check out its performance per watt against the AMD counterparts. The Intel HEDT chips, which were the laughing stock of some on this forum, actually look great compared to AMD's offerings. On the performance chart, the picture is a mixed bag. On the power consumption chart, the AMD chips are all at the bottom, save for one, the 1900X. Yeah, you didn't pay close attention to the charts.

Read my post above about perf/watt.
 

coercitiv

Diamond Member
Jan 24, 2014
The Intel HEDT chips, which were the laughing stock of some on this forum, actually look great compared to AMD's offerings.
The HEDT chips were the laughing stock of some on these forums for two reasons:
  • they bombed in games, removing any hope for a platform that can excel at both productivity and gaming
  • they dropped solder, which greatly limited their sustained clock potential above stock values
The desire for having a "master of everything" halo chip is the same driving factor behind some people here indirectly pushing the 9900K as a better alternative over Intel's own HEDT in productivity, which leads to this interesting behavior of avoiding comparisons between Intel products: all we see are heated debates of 9900K vs. TR2 products or HEDT vs. Threadripper, but everybody consistently avoids discussing price / performance / features for 9900K vs. HEDT.

We will soon have an i7 9800X with solder, full PCIe lane benefits, an updated 165W TDP profile, and L3 cache increased by 50% over the equally priced 7820X. That's when the schizophrenia will reach its peak.
 

Zucker2k

Golden Member
Feb 15, 2006
The HEDT chips were the laughing stock of some on these forums for two reasons:
  • they bombed in games, removing any hope for a platform that can excel at both productivity and gaming
  • they dropped solder, which greatly limited their sustained clock potential above stock values
The desire for having a "master of everything" halo chip is the same driving factor behind some people here indirectly pushing the 9900K as a better alternative over Intel's own HEDT in productivity, which leads to this interesting behavior of avoiding comparisons between Intel products: all we see are heated debates of 9900K vs. TR2 products or HEDT vs. Threadripper, but everybody consistently avoids discussing price / performance / features for 9900K vs. HEDT.

We will soon have an i7 9800X with solder, full PCIe lane benefits, an updated 165W TDP profile, and L3 cache increased by 50% over the equally priced 7820X. That's when the schizophrenia will reach its peak.
You forgot power consumption. Funny that it's off the table all of a sudden. The fact that both TR and TR2 are power hogs is being quietly brushed aside. The 9900K is an amazing chip given what it can do with only 8 cores. Look at the 2700X and 1900X compared to the 9900K. Of course it'll consume more where it needs to, but it shows where that energy is going by dominating in overall performance, even compared to the HEDT chips.
 

coercitiv

Diamond Member
Jan 24, 2014
You forgot power consumption. Funny that it's off the table all of a sudden. The fact that both TR and TR2 are power hogs is being quietly brushed aside. The 9900K is an amazing chip given what it can do with only 8 cores. Look at the 2700X and 1900X compared to the 9900K.
Listen to yourself: I talk about the drawbacks Skylake-X brought in comparison to Intel's own product line, and all you can do is keep bringing AMD back into the comparison so you have something to speak badly of without hurting your favorite brand in the crossfire.

Of course it'll consume more where it needs to, but it shows where that energy is going by dominating in overall performance, even compared to the HEDT chips.
As I said, you're conveniently ignoring that HEDT has a lower power ceiling in stock configuration than the 9900K on Z390. Bring the power ceiling on the 7820X up to 165W / 210W just like on the 9900K, let the clocks go up on multi-core loads, and we'll see how that domination holds.

Basin Falls cannot come soon enough; the logical disconnect is glaring.
 

LTC8K6

Lifer
Mar 10, 2004
The HEDT chips were the laughing stock of some on these forums for two reasons:
  • they bombed in games, removing any hope for a platform that can excel at both productivity and gaming
  • they dropped solder, which greatly limited their sustained clock potential above stock values
The desire for having a "master of everything" halo chip is the same driving factor behind some people here indirectly pushing the 9900K as a better alternative over Intel's own HEDT in productivity, which leads to this interesting behavior of avoiding comparisons between Intel products: all we see are heated debates of 9900K vs. TR2 products or HEDT vs. Threadripper, but everybody consistently avoids discussing price / performance / features for 9900K vs. HEDT.

We will soon have an i7 9800X with solder, full PCIe lane benefits, an updated 165W TDP profile, and L3 cache increased by 50% over the equally priced 7820X. That's when the schizophrenia will reach its peak.
Is the extra cache on the 9800X really going to make a big difference? Because other than that, the 7820X and 9800X look quite similar.
 

Zucker2k

Golden Member
Feb 15, 2006
Listen to yourself: I talk about the drawbacks Skylake-X brought in comparison to Intel's own product line, and all you can do is keep bringing AMD back into the comparison so you have something to speak badly of without hurting your favorite brand in the crossfire.


As I said, you're conveniently ignoring that HEDT has a lower power ceiling in stock configuration than the 9900K on Z390. Bring the power ceiling on the 7820X up to 165W / 210W just like on the 9900K, let the clocks go up on multi-core loads, and we'll see how that domination holds.

Basin Falls cannot come soon enough; the logical disconnect is glaring.
8 cores and 16 threads at 4.7GHz is not max-efficiency territory, so yeah, let's see the 9800X match those clocks and see what it consumes.
 

LTC8K6

Lifer
Mar 10, 2004
If the 9900K beats the 7820X, it seems like it would also beat the 9800X?
Didn't the 8700K keep up with, or beat, the 7820X in gaming?
 

mattiasnyc

Senior member
Mar 30, 2017
I still don't understand why people talk about power consumption or perf/watt if we're looking at current CPUs that aren't either low-power/mobile or server/render farms.

Any content creator that invests in either a high-end mainstream platform (9900K/2700X) or an HEDT X299/X399 platform will generally be less concerned with power usage than with pure performance. These businesses (or people) buy high-end chips for the performance. Saving time saves more money than saving power.

So again; why the talk about power consumption relative to performance?
 

coercitiv

Diamond Member
Jan 24, 2014
I still don't understand why people talk about power consumption or perf/watt if we're looking at current CPUs that aren't either low-power/mobile or server/render farms.
So again; why the talk about power consumption relative to performance?
It's not about power saving, it's about power as a normalization factor.

If saving time is more valuable than energy, why bench the 7820X @ 140W versus 9900K @ 165W+? Why not let both CPUs work with the same power limits, especially considering they share the same pedigree?

Think about it the other way around: how would this discussion look if Intel had enforced a 95W PL1 and 125W PL2 on the 9900K at stock?
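For anyone following along: PL1 is the sustained package power limit, and PL2 is the short-term boost limit that applies for a window of roughly tau seconds before the chip falls back to PL1. Here is a toy sketch of how such limits gate sustained load (the 95W/125W figures are the hypothetical stock limits from the question above; the 28-second window is a typical default and is firmware/board dependent):

```python
# Toy model of Intel's PL1/PL2 power limiting: under sustained load the
# package may draw up to PL2 for roughly TAU seconds, then is clamped to PL1.
PL1_WATTS = 95    # hypothetical sustained limit (from the question above)
PL2_WATTS = 125   # hypothetical short-term boost limit
TAU_SECONDS = 28  # typical boost window; the real value varies by board/firmware

def power_ceiling(seconds_loaded: float) -> int:
    """Package power ceiling in effect after a stretch of sustained load."""
    return PL2_WATTS if seconds_loaded < TAU_SECONDS else PL1_WATTS

for t in (5, 27, 29, 300):
    print(f"after {t:>3}s of load: {power_ceiling(t)}W ceiling")
# A ~95W sustained ceiling, versus the ~168W draw measured in the reviews
# cited above, would pull all-core clocks and benchmark results down with it.
```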