AMD X399 !!!!!


Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
True but >$1700 is still pretty high if you want to sell actual volume.

They want to be careful not to price it too cheap.

Otherwise they'll undercut their pukka Opteron line.

This X399 will have to have some limitations relative to Opteron. It's a delicate balancing act.


Intel HEDT CPUs are identical to their Xeon parts. This hypothetical Ryzen chip is not a Naples reject; it is a separate design, an MCM of 2x Ryzen. Not a completely new design, but there is certainly some work involved to get it up and running.

It is my understanding that this will be based on the SP3 socket and is indeed essentially a rebadge of the 16C Opteron (which is supposed to come in 16C and 32C variants).



My guess is the top 16-core version will slot in at $1300. A bit more than 2x the 1800X, but not so much that a 2x 1800X rig becomes preferable. The slowest 16-core will then be $999, and the fastest 12-core obviously less, probably $849.

Possibly. But definitely not as low as $700 for the 12C and $1000 for the 16C.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
So you will want to sell enough volume to make that investment worth it. >$1000 is too much for enthusiasts. That is the territory of buyers who make more money from having a faster CPU.
Are we assuming that enthusiasts and prosumers are the same or different? In my mind, the former would buy a 1080Ti for $699 after the latter already bought the Titan X for $1199 months earlier. Of course, Intel thinks the "prosumer" wants $1,700 CPUs and NVIDIA is marketing the $8,000 GP100 as a "prosumer" card as well.

If AMD is offering a ~3GHz 16C/32T for the retail channel, its primary competition is the $1,700 10C/20T 6950X and maybe the $1,800 16C/32T Xeon E5-2683 V4, which runs at a weak 2.1GHz base. Intel also has a $3,000 base 2.6GHz 16C/32T Xeon E5-2697A V4. AMD could price their mystery chip at $2,000 and STILL have a solid multi-core perf/$ win over Intel.
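To put rough numbers on that claim, here's a back-of-the-envelope perf/$ sketch using cores x base clock as a crude throughput proxy (it ignores IPC, turbo, and memory differences; prices are the ones quoted above, and the Ryzen entry is the rumored part):

```python
# Crude multi-core perf/$ comparison: cores x base GHz per $1000.
# Prices and clocks are the ones quoted in the post; the Ryzen
# entry is the rumored chip, not a confirmed SKU.
chips = {
    "6950X (10C/20T @ 3.0 GHz)":        (10, 3.0, 1700),
    "E5-2683 v4 (16C/32T @ 2.1 GHz)":   (16, 2.1, 1800),
    "E5-2697A v4 (16C/32T @ 2.6 GHz)":  (16, 2.6, 3000),
    "16C/32T Ryzen @ ~3.0 GHz (rumor)": (16, 3.0, 2000),
}

core_ghz_per_kusd = {
    name: cores * ghz / price * 1000
    for name, (cores, ghz, price) in chips.items()
}

for name, value in sorted(core_ghz_per_kusd.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {value:.1f} core-GHz per $1000")
```

By this crude measure the rumored chip leads even at $2,000; real multi-threaded benchmarks would shift the exact numbers.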
 

french toast

Senior member
Feb 22, 2017
988
825
136
True but >$1700 is still pretty high if you want to sell actual volume. Intel HEDT CPUs are identical to their Xeon parts. This hypothetical Ryzen chip is not a Naples reject; it is a separate design, an MCM of 2x Ryzen. Not a completely new design, but there is certainly some work involved to get it up and running. So you will want to sell enough volume to make that investment worth it. >$1000 is too much for enthusiasts. That is the territory of buyers who make more money from having a faster CPU. My guess is the top 16-core version will slot in at $1300. A bit more than 2x the 1800X, but not so much that a 2x 1800X rig becomes preferable. The slowest 16-core will then be $999, and the fastest 12-core obviously less, probably $849.
This thread has gone crazy with polar extremes of expectation; you make the most sense and most closely resemble my thinking :)

Remember the platform as a whole is higher-end; expecting pricing to scale linearly from the mainstream dual-channel platform is daft.
As someone else said, they have Naples server pricing in that scale, and that is where the real money is and what Zen is optimised for; else why bother with CCXs and Infinity Fabric? No, it makes perfect sense to follow the market leader's pricing segmentation and structure, but to continue AMD's value proposition of offering more for less.

So something like this;
12C/24T @ 3.0/3.2GHz @ $699
12C/24T @ 3.4/3.6GHz @ $799-850
16C/32T @ 3.2/3.4GHz @ $999
16C/32T @ 3.4/3.6GHz @ $1199

With a standout-from-the-crowd, outrageous name like Threadripper, that lineup would sell like hot cakes, it really would. Intel just couldn't compete with that value proposition. As long as the platform is solid from launch and can carry 3200 RAM, it will rip any competing processor in its price range to threads! (intentional :) )
 

OrangeKhrush

Senior member
Feb 11, 2017
220
343
96
My understanding of this platform, and early knowledge of Pinnacle Ridge, is that AMD will not outright win a performance war, but the expectations should never have gotten that high. Haswell was about where the marker landed and where most realistically predicted performance would be.

That being said, not beating Intel is not the issue. The issue is that AMD has given a rather diverse PC experience to the end user, with enough performance to be competitive with Intel.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
My understanding of this platform, and early knowledge of Pinnacle Ridge, is that AMD will not outright win a performance war, but the expectations should never have gotten that high. Haswell was about where the marker landed and where most realistically predicted performance would be.

That being said, not beating Intel is not the issue. The issue is that AMD has given a rather diverse PC experience to the end user, with enough performance to be competitive with Intel.
I don't see how they wouldn't win an outright performance war with another iteration, given their already high IPC and a more performance-oriented node to clock higher. Unless they really don't see ways to increase IPC by 5-7% for Zen 2, or switching to an HP node wouldn't increase clock speeds.

Intel is stuck with Skylake for god knows how long; they can only increase efficiency and clock speeds. 2018-2019 is an amazing window for AMD.
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
If AMD is offering a ~3GHz 16C/32T for the retail channel, its primary competition is the $1,700 10C/20T 6950X and maybe the $1,800 16C/32T Xeon E5-2683 V4, which runs at a weak 2.1GHz base. Intel also has a $3,000 base 2.6GHz 16C/32T Xeon E5-2697A V4. AMD could price their mystery chip at $2,000 and STILL have a solid multi-core perf/$ win over Intel.

You are not wrong. Yet AMD's strategy, rightly so, seems to be to increase market share. They will only get there if they offer something exceptional that buyers can't say no to. When in doubt, buyers will go with the proven thing in that market. So getting a 16-core part for less than half of Intel's price seems completely reasonable (see R7). They need to gain market share so software is optimized for their uArch. They need market share simply for economies of scale. They need market share because of the WSA. Having a 16-core CPU at <$1500 that beats the $3000 Xeon E5-2697A V4 is exactly what could increase their market share.

At work I'm doing some heavy data work and looking into getting our own server/workstation for it. A 16-core with 3GHz+ base clocks for less than $2000 sounds like an excellent value proposition. Sadly, some of the workload is still single-threaded, so low-clocked Xeons really suck for that.
 

dnavas

Senior member
Feb 25, 2017
355
190
116
Prosumers would jump at the chance to buy a 16C processor running at >3.0 GHz for under $2K.

I would not, and that's not what I'm hearing in my circle either. Most of these people already have a 6900 or 6950 (and the only reason I don't, is I'm cheap, and I hate the thought of the next processor being on a different socket, necessitating a new motherboard, and more money to Microsoft). Unlike the consumer space, which hasn't felt the need to be on the upgrade cycle, prosumers have little choice -- we need to follow the performance curve.

What I've heard, specifically, is that it would be a waste to buy a $1200 16-core that doesn't OC up to 4.2G, because they already have a 6950 that can OC to 4.4/4.5 on water. If Ryzen is going to sell into this space, they can't just be an upgrade path for folks who are two generations behind; they need to be competitive with Skylake-X for the next upgrade round. This is a much harder sell.

As for ST vs. MT, and the idea that no one buys these processors for ST: that is absolutely wrong. ST performance can matter a great deal. Speaking to the space I know, not all videography work scales. You'd think it would, but it doesn't. H264 encode/decode is single-threaded per slice, and when you split your picture, you prevent optimizations across those boundaries. And you bet it matters. My 1800X can't decode single-slice 4k60p video in real-time in my NLE. It *just* misses managing dual-slice. Fortunately my camera outputs 4 slices, but that means a 16-core is less appealing to me. Encoding only uses ~70% of the CPU for similar reasons.

Stabilization passes are often similarly single-threaded.
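The slice limit described above can be sketched as a toy scheduling model (purely illustrative numbers, not measurements from any real decoder):

```python
import heapq

def frame_decode_time(slice_times, cores):
    """Wall time to decode one frame when each slice is strictly
    single-threaded, using a greedy longest-first schedule: with
    enough cores the frame is gated by its slowest slice, and a
    single-slice stream gains nothing from extra cores."""
    loads = [0.0] * min(cores, len(slice_times))
    heapq.heapify(loads)
    for t in sorted(slice_times, reverse=True):
        least = heapq.heappop(loads)      # least-loaded core
        heapq.heappush(loads, least + t)  # assign this slice to it
    return max(loads)

# One big slice: 16 cores decode no faster than 1 core.
print(frame_decode_time([16.0], 16))                # 16.0
# Four equal slices: tops out at 4x, cores beyond 4 sit idle.
print(frame_decode_time([4.0, 4.0, 4.0, 4.0], 16))  # 4.0
```

A toy model only: real decoders overlap frames and other pipeline stages, but the per-frame latency cap imposed by slice count is the point.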

I have very little interest in going significantly lower than the 3.9G OC I'm currently running at, and if AMD decides to take the price of the 1800X (which I think we can all appreciate is just a little high for the family) and apply an overcharge factor from there ... well, I don't think it'll go over very well.

But I know a very small slice of this market. I don't, for example, know anyone buying the Xeons :shrug:

Edit: and I should point out, I'm not a prosumer myself, in that I don't make money off this stuff, but the people I know in this space often do. My day job is Spark, so I likely am using large-core-count CPUs, it's just behind AWS. The 70% encoding is for 1080p; 4k encodes use nearly 90% (both are likely gated on the 4k decode, which I should fix by doing a single decode pass, but don't because I don't have 10GbE to my RAID yet). I use 2-pass encodes, too, so mileage varies. Intel didn't make the 8-core purchase any easier given the QSV hardware on the 4-cores. And I'd be interested in 16 cores, because once you throw a colorspace curve or two on the thing, add scaling, add a touch of sharpening, the video is no longer RT, and it could be with more cores. Just, not more cores at something insanely low like 3G. :shrug:
 
Last edited:

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Yet AMD's strategy, rightly so, seems to be to increase market share. They will only get there if they offer something exceptional that buyers can't say no to. *snip* Having a 16-core CPU at <$1500 that beats the $3000 Xeon E5-2697A V4 is exactly what could increase their market share.
Completely agree. I would love to see AMD bring back the $999 top-SKU pricing model of yesteryear, like Intel used to do. I mean, AMD undercut NVIDIA's Titan Z by more than 50% with the 295X2, so they're often out for blood when pricing their products.

Sadly, some of the workload is still single-threaded, so low-clocked Xeons really suck for that.
Most of those low-base-clock Xeons boost up to 3.4-3.6GHz for single-core, before overclocking where possible. Ryzen with an all-core base clock at or near Intel's single-core boost? I don't even want to get excited about that prospect until I see some data to support it... :D
 

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
But I know a very small slice of this market. I don't, for example, know anyone buying the Xeons :shrug:

It's a fair point.

If the software doesn't scale (and let's face it, 16 cores is well up the road on Amdahl's law for a lot of commercial code), then 16 cores is not necessarily 2x better than 8.
If the workload doesn't scale, then 16 cores definitely isn't better than 8.

At the minute, and probably over the prosumer lifespan of this CPU generation, the jump from 8C to 16C is one that in many cases won't yield a linear return.
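For concreteness, Amdahl's law puts numbers on that non-linear return (a generic model, not tied to any particular application; the parallel fractions are illustrative):

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n cores when a fraction p of the
    work is parallelizable and the rest is strictly serial."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.50, 0.80, 0.95):
    s8, s16 = amdahl_speedup(p, 8), amdahl_speedup(p, 16)
    print(f"parallel fraction {p:.0%}: 8C = {s8:.2f}x, "
          f"16C = {s16:.2f}x, 16C over 8C = {s16 / s8:.2f}x")
```

Even a 95%-parallel workload gains only about 1.54x from doubling 8 cores to 16, so the 8C-to-16C jump is far from 2x for most code.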

In which case my projections for the costs are probably too high. Excellent! :D:D
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I would not, and that's not what I'm hearing in my circle either. Most of these people already have a 6900 or 6950 (and the only reason I don't, is I'm cheap, and I hate the thought of the next processor being on a different socket, necessitating a new motherboard, and more money to Microsoft). Unlike the consumer space, which hasn't felt the need to be on the upgrade cycle, prosumers have little choice -- we need to follow the performance curve.

What I've heard, specifically, is that it would be a waste to buy a $1200 16-core that doesn't OC up to 4.2G, because they already have a 6950 that can OC to 4.4/4.5 on water. If Ryzen is going to sell into this space, they can't just be an upgrade path for folks who are two generations behind; they need to be competitive with Skylake-X for the next upgrade round. This is a much harder sell.

As for ST vs. MT, and the idea that no one buys these processors for ST: that is absolutely wrong. ST performance can matter a great deal. Speaking to the space I know, not all videography work scales. You'd think it would, but it doesn't. H264 encode/decode is single-threaded per slice, and when you split your picture, you prevent optimizations across those boundaries. And you bet it matters. My 1800X can't decode single-slice 4k60p video in real-time in my NLE. It *just* misses managing dual-slice. Fortunately my camera outputs 4 slices, but that means a 16-core is less appealing to me. Encoding only uses ~70% of the CPU for similar reasons.

Stabilization passes are often similarly single-threaded.

I have very little interest in going significantly lower than the 3.9G OC I'm currently running at, and if AMD decides to take the price of the 1800X (which I think we can all appreciate is just a little high for the family) and apply an overcharge factor from there ... well, I don't think it'll go over very well.

But I know a very small slice of this market. I don't, for example, know anyone buying the Xeons :shrug:
I haven't heard of too many firms that provide overclocked workstations to their employees to work with... not saying there's none, but I haven't personally heard of a single one yet
 

dnavas

Senior member
Feb 25, 2017
355
190
116
I haven't heard of too many firms that provide overclocked workstations to their employees to work with... not saying there's none, but I haven't personally heard of a single one yet

Different spaces. I don't know too many people who make money off videography that work at companies that give workstations to employees :)
[Once upon a time I was given a peek inside Foundation Imaging, but that was a very long time ago, and I haven't talked to those folks in decades.]
 

Timmah!

Golden Member
Jul 24, 2010
1,419
631
136
Are we assuming that enthusiasts and prosumers are the same or different? In my mind, the former would buy a 1080Ti for $699 after the latter already bought the Titan X for $1199 months earlier. Of course, Intel thinks the "prosumer" wants $1,700 CPUs and NVIDIA is marketing the $8,000 GP100 as a "prosumer" card as well.

If AMD is offering a ~3GHz 16T/32C for the retail channel, it's primary competition is the $1,700 10C/20T 6950X and maybe the $1,800 16C/32T Xeon E5-2683 V4 which runs at a weak base 2.1GHz. Intel also has a $3,000 base 2.6GHz 16C/32T Xeon E5-2697A V4. AMD could price their mystery chip at $2,000 and STILL have a solid multi-core perf/$ win over Intel.

It's the other way around: it's the enthusiast who bought Titans, because they wanted them, not needed them. Meanwhile prosumers either waited for the Ti or simply bought the 1080, because it has 2/3-3/4 of the performance at half the price. Two 1080s have 5120 CUDA cores; one Titan has 3584 (and at lower clocks, disregarding a watercooled Titan).

Similarly, as a prosumer, you can hardly justify buying the 6950X if it offers 1.25x the cores at 1.7x the price of the 6900K.
 

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
Similarly, as a prosumer, you can hardly justify buying the 6950X if it offers 1.25x the cores at 1.7x the price of the 6900K.

?

For the prosumer, time is typically much more valuable than a few hundred $$/€€/££/etc.

I think you might have your categories back to front!
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
?

For the prosumer, time is typically much more valuable than a few hundred $$/€€/££/etc.

I think you might have your categories back to front!

I know I'm brand new here, so please be gentle ;-)

But I think you're mixing up the categories and that Timmah is right. A "prosumer" is pretty much someone who is part "consumer" and part "professional". I would agree with you if you said "professional". In that category, time definitely is money, and the volume of paid work is sufficient to make it worthwhile getting the latest and greatest. With the "consumer", money is an issue. The "enthusiast" spends as much as he wants and can. And what we're left with is the "prosumer", which in my opinion and experience is someone who straddles both sides of the consumer/professional fence: they need enough performance to be productive, but aren't making enough money from the product to warrant a large expense.

I produce content on my personal machine, but it's far from 100% of my revenue. I look at myself as a "prosumer" in the sense above. Even though I benefit from saving time, it doesn't necessarily outweigh the costs. It might, it might not, and so I'm cautious when it comes to spending, even if it's for something that will generate maybe 10-20% of my revenue.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
This platform isn't addressing the existing market segment as Intel had no desire to build it.
It's addressing a bunch of folks:
- people that buy a 2-socket Xeon platform with $400 octa-cores
- people that waste money on 6-10 core Intel HEDT
- folks that were not aware of the Xeon option
- anyone else that finds appeal in more cores, memory BW, PCIe lanes, and other IO.
AMD needs to create the market with prices that make sense, or investing in the platform is a waste of time.

As for cannibalizing server parts, that's not at all an issue, as this is single socket and server SKUs won't cost more; they will just be clocked a bit lower for higher efficiency, with maybe a few extra features enabled.
There is a big difference between how one prices single-socket SKUs vs. multi-socket, and AMD, by using an MCM, doesn't incur the extra costs that are unavoidable with a larger die.

Intel's idiotic Kaby Lake-X, as well as the rebranding from E to X, only exists to shift folks to the platform and try to upsell, as their focus is only on margins. AMD can do a lot more, and their interest is to build the market and the ecosystem, and maximize profits. Where Intel is is utterly irrelevant, as that's not what generates the most $ for AMD.
 

Timmah!

Golden Member
Jul 24, 2010
1,419
631
136
@mattiasnyc



Great first post, I would not have written it better myself. I guess it's true that different people have different needs, so you can't really generalize easily, but as I see it: the enthusiast buys a Titan, the prosumer a 10x0 (Ti) - whichever has the best price/performance ratio - and the professional gets a Tesla/Quadro :)
 

OrangeKhrush

Senior member
Feb 11, 2017
220
343
96
I don't see how they wouldn't win an outright performance war with another iteration, given their already high IPC and a more performance-oriented node to clock higher. Unless they really don't see ways to increase IPC by 5-7% for Zen 2, or switching to an HP node wouldn't increase clock speeds.

Intel is stuck with Skylake for god knows how long; they can only increase efficiency and clock speeds. 2018-2019 is an amazing window for AMD.

Over-optimism is not good. AMD with Ryzen delivered what most with reasonable expectations were expecting. I was setting my marker around Haswell, and it pretty much hovers thereabouts. On the broad spectrum, in gaming it performs at worst around Ivy/Ivy-E level, nominally around Haswell, and in the odd best case somewhere around Broadwell level; in some instances it mirrors Haswell-E and Broadwell-E.

IPC, or single thread, is still about 10-11% off Kaby Lake, and if PR is a fair improvement that gap will close, but I can assure you that AMD mainstream or HEDT will not beat Kaby Lake in gaming with Pinnacle Ridge.

This is not my issue anyway. AMD have opened doors long since closed by underperformance; AMD is showing developers and major game studios that they are serious about the high end again, and that will bear fruit over the coming years.

Forget beating Intel and be happy that AMD actually has a solid offering on the market for a very good price.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
Over-optimism is not good. AMD with Ryzen delivered what most with reasonable expectations were expecting. I was setting my marker around Haswell, and it pretty much hovers thereabouts. On the broad spectrum, in gaming it performs at worst around Ivy/Ivy-E level, nominally around Haswell, and in the odd best case somewhere around Broadwell level; in some instances it mirrors Haswell-E and Broadwell-E.

IPC, or single thread, is still about 10-11% off Kaby Lake, and if PR is a fair improvement that gap will close, but I can assure you that AMD mainstream or HEDT will not beat Kaby Lake in gaming with Pinnacle Ridge.

This is not my issue anyway. AMD have opened doors long since closed by underperformance; AMD is showing developers and major game studios that they are serious about the high end again, and that will bear fruit over the coming years.

Forget beating Intel and be happy that AMD actually has a solid offering on the market for a very good price.
Dunno, it just seems that AMD is so close to the top that it's within reach in a generation. Maybe Zen really is so balanced that they can't improve upon it without sacrifices, but I somehow doubt that lol
 

Ajay

Lifer
Jan 8, 2001
15,458
7,862
136
Well, if Pinnacle Ridge is due out in a year, I would think it has already taped out. Any news on this? I've looked around as much as I can and haven't seen anything suggesting GloFo has anything better than the pretty mature 14LPP process - so no easy bump in performance. Some design changes in PR could boost clocks a bit (if there are clock-limiting circuit designs in the critical path), and a 4-7% bump in IPC seems like a no-brainer. So a performance bump of ~10% is reasonable. Coffee Lake will probably get a 5-10% performance bump from Intel's 14nm++. So, unless AMD or GloFo drops a bomb with more significant improvements, Intel will stay on top in pure ST performance. AMD will stay the perf/dollar leader (with excellent performance this time around).
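The ~10% estimate can be sanity-checked quickly; since performance scales roughly as IPC times clock, independent gains compound rather than add (a first-order model with illustrative percentages, not measured numbers):

```python
def combined_gain(ipc_gain, clock_gain):
    # Performance ~ IPC x clock, so independent fractional gains
    # multiply: (1 + a)(1 + b) - 1, slightly more than a + b.
    return (1 + ipc_gain) * (1 + clock_gain) - 1

# e.g. a 5% IPC bump plus a 4% clock bump:
print(f"{combined_gain(0.05, 0.04):.1%}")  # 9.2%
```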

One of the things to remember is that Pinnacle Ridge was designed before AMD had engineering samples back for Summit Ridge (Keller was still at AMD, IIRC). There are some tweaks that AMD may have made based on SR engineering samples before masks were made for PR. After that there could be one or two respins on PR before retail. Right now, I'm really curious to see what AMD does with the B2 stepping (which I expect will focus mainly on critical issues for Naples, but hopefully some IMC improvements so Ryzen can benefit from more bandwidth).

The largest gains for Zen over the next year will probably come from BIOS (stability) & software improvements, and early results indicate that there are substantial gains available for some high-performance apps like games. Interesting times, though :)
 
Last edited:

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
Anyone any the wiser on X399 - either specs or possible release dates?

My old 1100T really started to flake out last night. It's getting to the point where I can't depend on it. :-/
 

hrga225

Member
Jan 15, 2016
81
6
11
Anyone any the wiser on X399 - either specs or possible release dates?

My old 1100T really started to flake out last night. It's getting to the point where I can't depend on it. :-/
The new version of AIDA64 added preliminary support for 12- and 16-core AMD CPUs. You can find it in their version documentation.
 

Ajay

Lifer
Jan 8, 2001
15,458
7,862
136
Pretty much crickets. AMD is holding its cards close to the vest.
 

dnavas

Senior member
Feb 25, 2017
355
190
116
Pretty much crickets. AMD is holding its cards close to the vest.

I expect most leaks are purposeful. It doesn't make sense to interrupt the R5 marketing, so I don't expect leaks on mini-Naples. Maybe after R5 they'll spin out a tease or two.
 

Atari2600

Golden Member
Nov 22, 2016
1,409
1,655
136
https://www.reddit.com/r/Amd/comments/67lpkb/amd_msi_press_event_x399vega_reveal/

So AMD and MSI are having a joint press event in the coming days, and a whole bunch of tech journalists have been invited, similar to the Ryzen launch reveal. They are saying it is going to be related to Ryzen, but who knows. That could be to throw us off for all we know. This could possibly be a Vega launch reveal, or if it really is related to Ryzen it could be the reveal of their High End Desktop platform with the release of their 12 and 16 core Ryzen chips on the X399 platform. Or it could be something much more boring, but I don't know why they would be flying in tons of tech journalists for anything less. And remember that AMD teased Vega with the Prey preview event, which releases in 10 days.

What do you guys think?

Interesting. Here's hoping we get a bit more visibility into what's going on.
 
Last edited: