> Prosumers would jump at the chance to buy a 16C processor running at >3.0 GHz for under $2K.
I would not, and that's not what I'm hearing in my circle either. Most of these people already have a 6900 or 6950 (and the only reason I don't is that I'm cheap, and I hate the thought of the next processor being on a different socket, which means a new motherboard and more money to Microsoft). Unlike the consumer space, which hasn't felt the need to stay on the upgrade cycle, prosumers have little choice -- we need to follow the performance curve.
What I've heard, specifically, is that it would be a waste to buy a $1200 16-core that doesn't OC up to 4.2G, because they already have a 6950 that can OC to 4.4/4.5 on water. If Ryzen is going to sell into this space, it can't just be an upgrade path for folks who are two generations behind; it needs to be competitive with Skylake-X for the next upgrade round. That's a much harder sell.
As for ST/MT, and the idea that no one buys these processors for ST: that is absolutely wrong. ST performance can matter a great deal. Speaking to the space I know, not all videography work scales. You'd think it would, but it doesn't. H.264 encode/decode is single-threaded per slice, and when you split your picture into slices, you give up prediction across those boundaries. And you bet it matters: my 1800X can't decode single-slice 4k60p video in real time in my NLE. It *just* misses managing dual-slice. Fortunately my camera outputs 4 slices, but that means a 16-core is less appealing to me. Encoding only uses ~70% of the CPU for similar reasons.
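The slice bottleneck can be sketched as a back-of-envelope model. All the numbers and the linear-scaling assumption here are illustrative, not measurements from my machine -- the point is only that decode parallelism is capped by slice count, not core count:

```python
# Toy model of slice-limited H.264 decode: each slice can decode on its own
# thread, so parallelism is min(slices, cores) -- not the core count alone.
def effective_decode_fps(per_slice_fps, slices, cores):
    """Decode rate assuming (optimistically) linear scaling across slices."""
    return per_slice_fps * min(slices, cores)

# Hypothetical per-slice throughput of ~28 fps on a hypothetical 8-core:
print(effective_decode_fps(28, 1, 8) >= 60)  # False -- single slice misses 4k60p
print(effective_decode_fps(28, 2, 8) >= 60)  # False -- dual slice *just* misses
print(effective_decode_fps(28, 4, 8) >= 60)  # True  -- 4 slices clears real-time
```

Note that cores beyond the slice count contribute nothing in this model, which is exactly why a 16-core looks less appealing for a 4-slice bitstream.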
Stabilization passes are often similarly single-threaded.
I have very little interest in going significantly lower than the 3.9G OC I'm currently running at, and if AMD decides to take the price of the 1800X (which I think we can all appreciate is just a little high for the family) and apply an overcharge factor from there ... well, I don't think it'll go over very well.
But I know a very small slice of this market. I don't, for example, know anyone buying the Xeons :shrug:
ed: and I should point out, I'm not a prosumer myself, in that I don't make money off this stuff, but the people I know in this space often do. My day job is Spark, so I likely am using large-core-count CPUs, it's just behind AWS. The 70% encoding figure is for 1080p; 4k encodes use nearly 90% (both are likely gated on the 4k decode, which I should fix by doing a single decode pass, but don't because I don't have 10GbE to my RAID yet). I use 2-pass encodes, too, so mileage varies. Intel didn't make the 8-core purchase any easier given the QSV hardware on the 4-cores. And I'd be interested in 16 cores, because once you throw a colorspace curve or two on the thing, add scaling, add a touch of sharpening, the video is no longer RT, and it could be with more cores. Just, not more cores at something insanely low like 3G. :shrug:
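The "no longer RT once you stack filters" point is just frame-budget arithmetic; here's a minimal sketch, with per-filter costs that are entirely made up for illustration:

```python
# At 60 fps the frame budget is ~16.7 ms; each filter's per-frame cost eats it.
def is_realtime(filter_costs_ms, fps=60.0):
    """True if the summed per-frame filter cost fits in one frame period."""
    return sum(filter_costs_ms) <= 1000.0 / fps

# Hypothetical single-core costs (ms/frame): colorspace curve, scale, sharpen.
print(is_realtime([6.0]))            # a curve alone still fits the budget
print(is_realtime([6.0, 7.0, 8.0]))  # 21 ms > ~16.7 ms -- the stack drops below RT
```

More cores help here because independent frames can be filtered in parallel, which is why a higher core count could pull the full filter stack back to RT -- as long as the clocks don't crater.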