I will walk back agreeing with JHH in many ways. He said some really stupid things. Sorry, but running at a lower resolution and upscaling is not "crushing" anything. And your RTX performance hasn't exactly been "crushing" anything but your stock value. And RTX isn't the only way to do ray tracing (AMD showed Vega 20 doing ray tracing; granted, not the half-baked real-time gaming ray tracing consumers are getting, which non-RTX hardware will be able to do as well, but still ray tracing, on a slide; I think it was at the New Horizons event, or maybe CES). And Vega 20 has AI feature support (considering that was apparently the focus of Vega 20's development, it just makes me scratch my head at what he's talking about; the only thing I can guess is he's trying to act like Tensor Cores are the only way of doing AI, or it's just his usual trash talk). And Radeon VII will likely clobber everything but Titan V (and some older Titans and AMD cards) in DP/FP64 performance.
The same reason people believe conspiracy theories: jumping to conclusions with no evidence, just like you did. The FACTS are:
1) AMD has NEVER done a pro only GPU chip.
2) AMD is the king of squeezing every niche out of a single die. Just look at Ryzen: everything from lowly quad cores to 32-core enterprise parts on a single die.
3) Lisa Su has stated publicly that it was always intended to be a consumer gaming GPU, you know, like every GPU AMD has ever made.
OTOH, we have "feelings": because AMD reps didn't state the obvious, that their GPU chip would also be used in a gaming card like every single other GPU chip they've ever made, people assumed that for some unknown reason they would limit this chip to a narrow niche and never release it as a gaming card.
I just realized you're ignoring several examples of chips that AMD didn't use in multiple markets. I'll even ignore consoles (GameCube, Wii, 360, and Wii U all got custom dGPU chips not offered anywhere else). There's Vega M, and Vega 12 (which I believe is only in the Pro 20 and 16; it was likely just repurposed Vega Mobile, but apparently the only taker was Apple for the MacBook Pro, and I have a hunch that HBM pushing up the overall cost was a major contributing factor to its limited uptake). And Polaris 30 isn't in a Pro product. Maybe it will be, since consumer-then-pro is the usual path for AMD GPUs; Vega 20 is by far the biggest exception they've ever made to that, which I think kinda nullifies your "AMD ALWAYS DOES THIS!!!" argument. Interestingly, the only other exception I can remember was Vega 10, with rumors that AMD seriously considered not releasing it as a consumer product because its high cost and mediocre performance inhibited their ability to adjust pricing in the event of poor sales. Now why does that sound familiar?
CPUs are not even comparable. By that logic, what happened to Navi being AMD's "scalable" multi-chiplet GPU working as one monolithic one, which was all the talk a few years back? Also, uh, by that logic, you're basically saying AMD is going to cover the entirety of the GPU market with a single die? But they've NEVER done that. So much for the "AMD gets every niche out of a single die" argument, huh?
And? She'd also only EVER mentioned it as a datacenter GPU prior to that, and AMD has, to the best of my knowledge, NEVER released a GPU as an HPC/Pro product before consumer. So this whole line of speculation on your part has as many caveats as the one you're arguing against.
Another issue, and why I think this "Vega 20 was always a gaming chip" line is just spin, is that she also says it's not just a shrink of Vega 10. That's technically true, but for the most part it isn't. For the GPU design itself, very little (if anything) changed from Vega 10. The main changes are that they got it fully validated for ECC and added two memory controllers. They also added some new ops (which I believe is just expanded software support; Vega 10 could likely do those as well). Those are not major changes. So yes, she's technically right that it's not a literal shrink of Vega 10, but it is still largely just Vega 10 on 7nm. It does go to show how strong Vega is in compute, which is why it could largely be Vega 10 with more bandwidth. It unfortunately also highlights how mediocre it is for graphics.
As opposed to you ignoring how AMD has never released a GPU for the Pro market first? A limited niche that likely sells more units than the other limited niche (gamers shelling out $700)? Seriously, the high-end gaming market is a more limited niche for AMD than the pro market.
It's not that expensive unless you believe recent nonsense about HBM pricing.
While the cost per area of silicon increases at each process node shrink, the cost per transistor DOES NOT. In fact, it generally continues downward.
So with a transistor count similar to TU104, it should have similar costs or lower, especially as time goes by.
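To put the cost-per-area vs cost-per-transistor distinction in concrete terms, here's a minimal sketch with purely illustrative numbers (these are made up for the example, not real foundry pricing): even if the cost per mm² rises ~30% at a new node, roughly doubling the transistor density still pushes the cost per transistor down.

```python
# Illustrative only: made-up numbers, not actual foundry pricing.
def cost_per_transistor(cost_per_mm2, transistors_per_mm2):
    return cost_per_mm2 / transistors_per_mm2

# Hypothetical old node: baseline $/mm^2, 25M transistors per mm^2
old = cost_per_transistor(1.00, 25e6)
# Hypothetical new node: ~30% pricier per mm^2, but ~2x the density
new = cost_per_transistor(1.30, 50e6)

print(f"old node: {old:.2e} $/transistor")
print(f"new node: {new:.2e} $/transistor")
assert new < old  # cost/area went up, cost/transistor still went down
```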
IF HBM is really that expensive, they could do a cheaper 12GB or 8GB model (though with a loss of memory bandwidth) by simply leaving out an HBM stack or two. A two-stack Radeon VII shouldn't really cost much more to build than a Vega 64.
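The bandwidth trade-off of dropping stacks is easy to quantify from public HBM2 specs (each stack has a 1024-bit interface; the per-pin data rates below are the published figures for these cards, rounded):

```python
# Per-stack HBM2 bandwidth: 1024-bit bus * data rate (Gbps per pin) / 8 bits per byte
def hbm2_bandwidth_gbs(stacks, gbps_per_pin):
    return stacks * 1024 * gbps_per_pin / 8

print(hbm2_bandwidth_gbs(4, 2.0))    # Radeon VII, four stacks: 1024.0 GB/s
print(hbm2_bandwidth_gbs(2, 2.0))    # hypothetical two-stack version: 512.0 GB/s
print(hbm2_bandwidth_gbs(2, 1.89))   # Vega 64, two stacks: ~484 GB/s
```

So a two-stack Vega 20 card would land right around Vega 64's bandwidth, i.e. half of Radeon VII's.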
It's not that expensive, yet it's not used in any products outside of enthusiast tier and higher? The cheapest product it's been on was probably Vega M, and that was a single stack in what appears to be a pretty tailored product (frankly, likely mostly a prototype showing that Intel could make something like it) that was not exactly high volume. Nvidia, who probably has as much if not more need for it in their GPUs, doesn't use it on anything outside of HPC products. AMD could certainly use it in a variety of products where its power savings and compact size would prove especially beneficial.
If you're just looking purely at the literal cost of fabbing it. But that ignores the drastically increasing cost per transistor of designing and engineering the chip, something you cannot ignore when talking about a company's cost of making chips.
Maybe, but that doesn't matter. I'd 100% guarantee that the finished products Vega 20 is on cost more because of HBM2 memory. Plus, that's a very bizarre argument. Are you trying to say that similar price points should indicate that the costs of both are about equal?
Except I think that would have severely limited its performance improvement over Vega 10 products, which would make it hard to sell customers on a much more expensive card with very little performance gain, so they'd have had to price it lower (margins might be about the same, but then it'd probably be competing with maybe the 2070, and AMD would look more foolish in that their "high end" 7nm GPU is competing with Nvidia's lower "mid" tier). I personally think the Vega 20 die likely costs significantly more, and then with twice as much HBM2, which required a new interposer, the costs would've added up. I wouldn't be surprised if Vega 20 products currently cost close to twice as much to produce as Vega 10 ones (for the whole card). Don't forget full ECC costs more as well.
AMD has been selling HBM consumer cards since 2015, Vega 56 and Vega 64 had launch prices of $399 and $499.
If AMD can make money on an HBM consumer card with two stacks of HBM at $399, they can certainly make it on a $699 card with 4 stacks, especially since the cost of HBM is almost certainly declining as time goes by.
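The scaling argument can be sketched with a toy bill-of-materials calculation. Every dollar figure here is hypothetical (real BOM costs are not public; the HBM-per-stack and "rest of card" numbers are assumptions for illustration only); the point is just that doubling the stacks while the price jumps $300 leaves more headroom, not less.

```python
# Purely hypothetical BOM figures; real component costs are not public.
HBM_PER_STACK = 25.0  # assumed $/stack of HBM2

cards = {
    # "rest" = assumed die + interposer + board + cooler, etc.
    "Vega 64":    {"price": 399, "stacks": 2, "rest": 250.0},
    "Radeon VII": {"price": 699, "stacks": 4, "rest": 350.0},  # pricier 7nm die
}

for name, c in cards.items():
    headroom = c["price"] - c["stacks"] * HBM_PER_STACK - c["rest"]
    print(f"{name}: ${headroom:.0f} headroom over assumed BOM")
```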
I don't think that's in serious dispute at this point. But I also don't know what that has to do with anything. We know AMD is willing to take lower margins than Nvidia. We also know that costs severely hampered AMD's ability to adjust Vega products' pricing to better fit their relative competitive place in the market, to the point that they questioned releasing Vega 10 to consumers/gamers because its cost coupled with its performance meant much lower margins than AMD would like. But then mining took off and raked in money for AMD (to the point that they got criticized for trying to help people get Vega cards at MSRP instead of the ultra-gouged mining prices). Vega 20 didn't radically change that, and while mining demand has flatlined, Nvidia's pricing gave AMD some leeway. Now, I don't doubt that AMD can drop the price some and still make money. Personally, I doubt they'll make much money from gamers on Radeon VII; I think most sales will be to users who can use the compute capability.
I kinda have to agree with Peter here though.
First of all, the quote isn't a quote of what AMD said officially, but rather what a reporter thought AMD was planning (or not planning). Huge difference. And the other article talks about the cards, not the actual GPUs.
So Peter has a point here when he says AMD has been pretty consistent recently with reusing the same architecture up and down the product lines, from server/compute down to consumer. Because of that consistency it's not a far-fetched guess that they'd eventually roll out 7nm Vega "below" the Instinct line.
Of course it's also possible that it is as (ironically) I think Jim @ AdoredTV said...
AMD repeatedly mentioned Vega 20 as datacenter. It was literally the only language they used around it until CES this year. I've looked but haven't been able to find the articles I read where they were specifically asked about it being a gaming chip and AMD said no, that Navi would be their 7nm chip for gamers. There were rumors about Vega II, but most of that was back in 2017, I think, when things pointed to 7nm Vega; then AMD completely reworked their roadmaps, and from then on Vega 20 only showed up with regard to Instinct cards. If I recall, for all of 2018 Vega 20 was only listed for Instinct.
Except they've always released it as a consumer product before pro. There have been two exceptions that I'm aware of: Vega 20 and Vega 10. I'm not sure Vega 10 counts, since the Frontier Edition was "prosumer", but I think it does offer some insight. Vega 10 had issues.
Show me one single quote where AMD said this.
You are just looking at some blog's interpretation of AMD's Computex events and assuming there would never be a gaming card.
This is exactly how the meme got started. Everyone read interpretations of AMD's silence, and assumed. There was no statement.
So Anandtech is just a blog now? Because the GPU writer for Anandtech got that impression from being at AMD events. Why did everything AMD stated about Vega 20 mention only Instinct until CES this year? It's not that they said it'd be datacenter first; they literally talked about nothing but datacenter with regard to it, other than, I think, a couple of mentions that it'd show up as Radeon Pro as well (which, uh, where is that?). Why did they say "we'll have 7nm gaming cards" right around when they were talking about Vega 20, but deliberately not mention Vega 20? Why is Vega 20 the complete opposite of every other AMD GPU release, going to the HPC/enterprise market first (which, if I'm not mistaken, was generally the last iteration they'd do of a given GPU)? Frankly, I think that might be smart, but it also tells me that gaming was not really considered until later. And they can spin it any number of ways. I actually think they might have said Vega 20 was just for datacenter in 2017, when they first adjusted their roadmaps and removed the 14+/12nm Vega versions.