Just noticed I had 3TB/s for NAND, which, uh, is nowhere close, right? They're pushing more like 3GB/s right now, right? Plus, what are Infinity Fabric links rated for? Kinda curious how feasible my memory chip bridge would be, as I think that could be a really interesting idea, but not sure it'd be worth it if you couldn't get anywhere close to HBM2's bandwidth over IF.
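A quick sanity check on those numbers. The figures below are ballpark assumptions, not vendor-verified specs: ~3 GB/s for a fast NVMe drive, ~100 GB/s per xGMI (Infinity Fabric) link (AMD quoted up to ~200 GB/s peer-to-peer over two links for the MI60), and 4 stacks x ~256 GB/s for Vega 20's HBM2:

```python
# All figures are assumed/ballpark, not vendor-verified.
nand_gbs = 3            # GB/s, fast NVMe sequential read (assumed)
if_link_gbs = 100       # GB/s per xGMI link (assumed from MI60 marketing)
hbm2_gbs = 4 * 256      # GB/s, 4 HBM2 stacks at ~256 GB/s each

print(f"HBM2 total: {hbm2_gbs} GB/s")
print(f"HBM2 vs one IF link: {hbm2_gbs / if_link_gbs:.1f}x")
print(f"HBM2 vs NVMe NAND:   {hbm2_gbs // nand_gbs}x")
```

So even under generous assumptions, a single IF link is roughly an order of magnitude short of local HBM2 bandwidth, which is the core problem with the bridge idea.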
So you think the CEO of AMD lies in public about their intentions?
I believe her, and FWIW, as soon as I saw original story, I assumed there would be a consumer version.
It's common practice for AMD and Nvidia to release higher end chips into Pro products before consumer products, and when they release the Pro product they are usually silent about consumer offshoots.
But both prefer to reuse dies for both markets whenever they can. AMD in particular is VERY big on die reuse. Can you name one AMD die, CPU or GPU, that was pro-only in the last decade?
Yes, I'm sure she's told at least one lie about their intentions. I don't think it was necessarily malicious (no, I'm not trying to incite some negative response or say she needs to be investigated with a class action lawsuit or anything; I just want it called out so that mealy-mouthed half truths don't become the norm from them), and it likely wasn't even a lie at the time she said it. Heck, she could spin this as: Vega 10 was a gaming chip, and Vega 20 is heavily based on Vega 10, so duh, Vega was always for gaming! Not that she did that.
Absolutely true, well, mostly. AMD tends to be pretty clear on there being a consumer version (as they were with Vega 10), and often their Pro versions come after the consumer ones (as we saw with Polaris, and I think Fury and Hawaii, and most of their other chips for a while there, largely because the extra testing and validation for the Pro stuff likely took longer). The issue is that the cost of making Vega 20 based cards, for the relative performance increase, likely meant the card would have to be sold at a loss in the consumer gaming market. But then Nvidia upped pricing substantially, so AMD saw an opportunity to make such a card and still make money.
For sure, and that makes sense. So why did people come away feeling that they were deliberately making it clear that Vega 20 was not for gamers? That's the issue. I couldn't care less whether they did or didn't make it a consumer card (honestly, if they're making money from it, awesome for them, and great for the people this is a good product for, including well-to-do gamers who could use the extra performance but didn't want to go with Nvidia for whatever reason).
I'm not saying CEOs in general are always honest, just Lisa Su in this instance. She seems very much like a straight shooter, and I watched the video, where she didn't seem the least bit evasive or weaselly. She almost laughed at the suggestion that it was only for the datacenter; laughed in the sense of how some of the media gets carried away on the tiniest hint of something.
I wonder if AMD is looking to fork Vega off into a compute only card akin to the Tesla series while a "scalable" (remember that slide?) Navi arch trimmed of all the compute related fat brings up the consumer space.
AMD likely does not have the resources to run two separate lines simultaneously, but it's an interesting thought nonetheless...
I agree, she's a great CEO and I especially like how she talks. She's like a JHH with less ego: she's enthused about the product and the company and enjoys talking through the technical aspects (compared to, say, someone like Steve Jobs, who seemed to hate the technical stuff and instead focused on "well, I'm saying it's better" type stuff with a lot of ambiguity).
I don't think she was being evasive (she even kinda laughs it off, and I think she was just having a bit of fun with the situation; perhaps a lot of it was people misinterpreting what she said last year, but several people got that impression), but I do not believe this was originally intended for a retail gaming card release. I'm sure things changed. If so, then just fess up. If it turns out that it is a limited run and they aren't clear about that, then that's where I'd start to take issue.
The thing is, I don't think she's being nefarious, but she's definitely not being totally open either. And that's expected but still somewhat disappointing.
I don't really get that comment (about running separate lines at the same time; haven't they been doing exactly that with Polaris and Vega, and with Fiji and Hawaii before then?). It's simply about profitability. Prior to Nvidia's RTX cards, I think AMD figured pricing would be prohibitive to them selling Vega 20 for a profit in the consumer gaming market. I do not think Vega 20 is a cheap card to make, and pushing it into other markets requires a certain amount of support that also costs money (so sometimes it's easier not to sell in a market when the costs outweigh what you'd gain from it, even if the cost wouldn't outright be higher than what you'd sell it for). Plus they've had other specialty cards: there was that Polaris card with the NAND onboard, and they attempted to sell the Fury X2 to VR content creators after it flopped with consumers and they ended up with excess stock that wasn't selling (or actually I don't even remember exactly what happened; I just recall they tried pushing the Fury X2, probably because it didn't make a good gaming card given the price and the lack of worthwhile performance for that price).
So if NV cannot sell its high end GPUs, Vega II at $699 will have a real hard time. Nice, because both of them will have to lower their prices.
Yeah, and if those benchmarks putting it below the 1080 Ti turn out to be largely representative, then I'll seriously question people buying this over RTX cards (and I've been quite critical of those for perf/$).
AMD wants and needs to sell high margin products, and Vega II at $699 is one of them. Cutting down supply because the market is not willing to buy will not make things better for them. There is no infinite demand for datacenter GPUs; if you don't sell in both the gaming and datacenter markets, you don't increase revenue, and thus you don't make profit. Simple as that.
Is it? Seems like you're doing what you're taking issue with others doing, which is just making up numbers to support why you think that's true. I have a hunch it's somewhere in the middle. I doubt the margins are great, but I doubt they're so razor thin that $699 is the only price they can make money at.
Er, and pushing stuff into the market that doesn't sell will benefit them? Does that make any sense to you? If it doesn't sell, then you'd literally only be losing money by putting it into the market. Now, maybe you're arguing that since they already made them, it's better than not selling at all, but that's not really the discussion unless they're flopping almost totally in the intended market, so they've got a glut of them and can probably get some money back by selling much cheaper to consumers.
That's true (no infinite market), but they could probably sell the exact same Radeon VII cards to datacenters for less than the MI50/60 and still make more money than they will from selling to gamers. Why not do a Frontier Edition version that, again, will have some demand, and sell it for more? Or sell it as a FirePro? There are other markets they could be selling this in, where the demand will probably be equal or higher and the selling price will almost definitely be higher. Now, perhaps this is for those markets, and they just decided to sell it cheaply to maximize demand without losing money on any single card. I could see that being a possibility (AMD has intentionally aimed to disrupt pricing before). If so, that would be awesome (provided it has quality support for those customers; I'd be interested in its 3D modeling capabilities, and might be able to make the case for getting one for that usage if it does very well there). I am skeptical of that, though, since it would be weird for them to keep selling a more expensive but less capable Vega 10 based card there.
You say that, but I don't think the consumer gaming market is going to matter much at all for the profitability of Vega 20. If it's profitable in the datacenter then it'll be profitable; if not, I doubt the gaming market would make it profitable, and at that point it'd probably just be about keeping from taking a total wash on making it.
I never keep track of 3DMark scores, so I have no idea if those are good or not. And while I believe FFXV favors Nvidia, that doesn't look very good. I really won't be surprised if performance on this card is all over the place: sometimes really good, other times barely better than Vega 64.
Radeon VII is basically a loss-leader to keep the brand fresh in the consumer market. AMD knows that Polaris isn't going to be enough on its own until Navi launches. Being an AMD board partner right now must be an agonizing affair.
And I will remind you that at least the initial run of Radeon VII amounts to maybe 20k cards. If you think the pro market won't absorb an allocation like that in the blink of an eye then I have a bridge in Brooklyn to sell you.
Grain of salt taken. Yes, I have heard that the HBM2 costs of Radeon VII would be $320 alone, which seems a bit . . . much. You would think that after all these years of HBM production that production costs would come down. Still, even if the cost of HBM2 on Radeon VII were only $200-$250, that would leave little room for profit after taking into account all the other costs associated with production and sales of Radeon VII.
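To put rough numbers on that "little room for profit" point, here's a back-of-envelope sketch using the rumored memory-plus-interposer costs mentioned above. These figures are assumed/illustrative, and the retail price also has to cover the GPU die, board, cooler, board partner and retailer cuts, and support:

```python
# All cost figures are rumored/assumed, purely illustrative.
msrp = 699
for mem_cost in (320, 285, 250):  # rumored HBM2 + interposer estimates
    left_over = msrp - mem_cost
    print(f"memory+interposer ${mem_cost}: ${left_over} left for everything else")
```

Even in the cheapest case, less than $450 has to cover everything besides the memory subsystem, which is why the thin-margin argument is at least plausible.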
That would be my guess. There were also rumors about it being to commemorate the retirement of some longtime AMD person. Which is neat, but reminds me of their special edition (or whatever) attempt at moving Fury X2 cards.
I think this chip is serving its purpose. It's a pipe cleaner for AMD to get experience with 7nm, and it's targeted at a market that will take any performance increase it can get, even at high prices. That they can sell it to consumers is nice, but that doesn't magically make it a great product worth its asking price there, even if it is a lot less expensive than in its intended market.
And yet that's the number most-often repeated. The 5k number has been effectively debunked. That's about it.
Interesting, but irrelevant.
How much money do you think AMD poured into developing Vega 20, in total? Certainly there is the expectation of profit to be gained from the exercise. How many MI50s and MI60s do you think they would have to sell to make it worth the company's while?
. . . really? Why?
https://www.extremetech.com/extreme/283957-amd-claims-it-has-enough-radeon-viis-to-meet-demand
This article indicates that Radeon VII's memory + interposer cost would have been $285 in early 2018. You really think the cost of that arrangement would have dropped to $150 or lower just a year later?
Honestly? I don't think that much (beyond what they needed to spend to learn 7nm anyway). It really seems like Vega 10 on 7nm with some very minor tweaks, newer HBM2, and improved software. Most of the cost was just the engineering for 7nm (and that was needed for the rest of their products, so being able to get a worthwhile product out of it is a good thing, especially if it means their other stuff ends up better and isn't hampered by delays from learning 7nm). But I think they targeted HPC/enterprise because they wanted to make sure they would make money on it. Hopefully their yields and HBM2 related costs are low enough that they're getting decent margins on Radeon VII too. If Navi works out well, an HBM2 version for mobile (and the likes of the MacBook Pro) could be nice.
Nice find. Yeah, I'd be surprised if it's not around $250 at best.
It's always been plagued with production issues and has never really ramped up to the kind of high volume production that's necessary to drive prices down through economies of scale.
There are also a smaller number of companies producing it which means less competitive pressures on price.
HBM2 costs certainly haven't dropped enough to make it really worthwhile, or else I think we'd have seen Nvidia go that route on either mobile RTX or the Titan and 2080 Ti chips, as I'm sure the bandwidth would've been worth it (they were up to around 1.2TB/s using the 307GB/s stacks, which would almost double the RTX bandwidth). And it would help make their high prices a little more worth it, but the costs were probably high enough that Nvidia would either have had to give up margins or push their prices even higher.