I think the argument is rather that subsidies are wasted on Intel. After all, as you said, Intel didn't have the right people in the right places, and for a long time it preferred to spend its own money on buybacks instead of on solving its long-term issues (like getting and retaining the right people). Intel's issue has never been money, so why should it receive subsidies from taxpayers?

I think the argument that Intel wasted all its money on buybacks is specious. Yes, they bought back a ton of stock. But they also invested plenty in building fabs and developing new processes. Lack of investment is not the reason they fell behind; they spent more than enough to keep up. They just didn't have the right people in the right places.
Going for an epeen win, price be darned, is a strategy I really don't think AMD is going to play into.
Honestly, I do not think Zen3d will be their halo product for long, if at all. Zen4 should be faster. In fact, I expect that Zen4 (and/or a refresh of Zen4) would be faster than Raptor Lake.
Most of the estimates are for an October launch for Alder Lake.
With DDR5, new motherboards, and Windows 11 all having to be available, I don't think Alder Lake will be a huge seller in Q4. Just too many prerequisites that all have to come true at once.
But I think it is as much a battle for mind share as it is for market share.
Yeah, some info will start trickling out...
The Hyperscalers are the customers AMD is having the most success with, as far as market penetration, and other segments are much slower to move. The other segments don't have discounts as deep as hyperscalers.
AMD kept throwing silicon at the EPYC chips as long as performance kept scaling well and until the power limit was hit. AMD didn't decide to randomly stop half way.
BTW, the cost of that SRAM silicon per mm2 is most likely lower than that of the IO die silicon.
I don't think yields will be an issue at all.
TSMC is getting excellent yields on N7, even better on N6. SRAM will get as close to 100% as you can get. Only good die are entering assembly, and there will most likely be a way to isolate and disable bad layers.
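As a rough sanity check on why a small SRAM die should yield so well, a simple Poisson defect model shows how sharply yield depends on die area. The defect density below is an illustrative assumption for a mature node; TSMC's real numbers aren't public.

```python
import math

def die_yield(area_mm2: float, d0_per_mm2: float) -> float:
    """Poisson yield model: probability that a die of the given area has zero defects."""
    return math.exp(-area_mm2 * d0_per_mm2)

# Illustrative defect density (assumed value, not TSMC's actual number).
d0 = 0.001  # defects per mm^2 (0.1 per cm^2)

print(f"36 mm^2 cache chiplet:   {die_yield(36, d0):.1%}")
print(f"600 mm^2 monolithic die: {die_yield(600, d0):.1%}")
```

At these assumed numbers the small die yields in the high 90s while a large monolithic die loses nearly half its candidates, which is the intuition behind both the chiplet approach and the near-100% SRAM claim.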
Having problems yielding good dies means you don't have a lot of silicon to throw at the problem.
It has been only a year since AMD became very profitable. A lot of the decisions about product line up and risk profile of products were made when AMD was much poorer, and the market share in more lucrative segments was very low. So AMD did not have a lot of options as far as buying up risk production at TSMC.
AMD is still quite conservative and risk averse about future products.
What most people can't seem to get is that V-Cache is a very low risk, high reward move. AMD can get N5 or even N3 level performance product out of TSMC N7 node.
Nvidia is charging $1,500 for a gaming graphics card and successfully selling it at that price. Nvidia is going to ridiculous lengths to get this type of product.
The trick is to have a product.
The Halo ADL product with DDR5, released in 2021 is going to destroy Zen 3.
And Zen 4 is more than a year away from today.
Or to keep ASPs from crashing.
Luckily, AMD has a tool in its tool chest with which performance can be increased gradually, in $6 increments, up to the level needed to beat ADL.
If your halo product is going to look embarrassing compared to the competition that's one thing, but if your halo product can win, you should always release it... and you should always aim to win. Jensen Huang's pathological need to win has made Nvidia what it is today. On the other side of the coin, AMD's old "sweet spot" strategy single-handedly lost them a generation when they had a nearly 2x perf/mm2 advantage.
Honestly, I do not think Zen3d will be their halo product for long, if at all. Zen4 should be faster. In fact, I expect that Zen4 (and/or a refresh of Zen4) would be faster than Raptor Lake.
Not sure that they'll ever release a Zen 3 with this cache; probably it's only a test vehicle, since they still had no Zen 4 at hand. In all likelihood it will be commercially inaugurated with the latter.
Update: June 1st:
In a call with AMD, we have confirmed the following:
- This technology will be productized with 7nm Zen 3-based Ryzen processors
I think Lisa straight up said products with it will be out Q4 this year. Unless Zen 4 is WAY ahead of schedule, its gonna be Zen 3.
See the bottom of the linked AnandTech Forums thread.
For Intel, launch and availability are two different things. If it launches in October, you are looking at December-January for availability.
Another way to look at it, what value does producing the fastest halo product have? Would it be worth it if AMD lost money on each one sold? How much is that 'mind share' worth? At what cost does it stop making sense to make a halo product just to say you have a product that is the best at gaming?
If the US is trying to attract others to build fabs in the US, like TSMC, it can't very well exclude Intel from the same subsidies. After all, Intel has fabs in other countries and if they get subsidies from them they would simply build fabs overseas instead of in the US. Multinationals like Intel don't care about the strategic needs of the US, they care only about what is most profitable.
I agree that subsidies targeted at Intel or that would only be available to Intel may not be wise. But if you want TSMC and Samsung to invest in US based fabs, it is pretty hard to write laws that would exclude Intel from those same subsidies.
It just might be worth it, but it's obviously not the situation where AMD would come close to losing money.
During the RV770/G200 generation, Nvidia's top die was 2.25x larger than AMD's, and beat it by ~10%. Prior to that generation, commanding a 2x perf/mm2 advantage guaranteed market domination. R300 single-handedly made ATI the market leader. Ditto G80 for Nvidia. And neither of those examples had anything like a ~2x perf/mm2 advantage. Nvidia won that gen because they made the fastest product they could. AMD lost that gen because of people who thought the same way you are thinking now.
If you don't see that mindshare matters a hell of a lot, you haven't been paying attention.
You didn't answer my question. What is the value of this mind share, and how much cost should AMD sink into getting and maintaining it? I agree mind share is worth something; I just don't think it's worth going for the top spot, no matter the cost.
R300 may have made ATI the market leader in performance, but not in sales, not even close. The legendary 9000 series wasn't even able to do that.
When the Radeon HD 7000 series launched, it was the bigger die, and the OC models (later released by AMD as official GHz models) beat Nvidia's 680 at the time, all for... not much in the end in terms of market share.
To say that Nvidia's market leadership was due to having a 10% lead with a bigger die is, I think, naive. There's far more to it than that, especially with GPUs. IMO, Nvidia's dominance has more to do with ATI/AMD's past failures than with making sure they hold that last little 10% lead at the top.
All of this, though, is really a tangent as the GPU and CPU markets, even for gaming, are different beasts.
If it weren't for the real world example of Nvidia, which always goes for the top spot no matter the cost, and is now a 500B dollar company, I might agree.
Yes it did. You're misreading the chart. When the X800 series launched, ATI had close to 60% market share. After the launch, ATI slowly lost market share since GeForce 6 was more impressive, but remained the leader via momentum.
Keep in mind when reading the chart that only the start of each generation is labelled, and a lot of movements correlate with mid-generation refreshes.
Admittedly, it wasn't R300 or R400 that made ATI the market leader at the time, it was Half Life 2. Without that, ATI might have ended the R300 generation around 50% share (maybe a bit lower), and then immediately lost their co-leadership position after the R400/Geforce6 launch.
The GHz model didn't beat the 680. It jockeyed for position while consuming much more power with (crucially) a much worse fan profile. If the 7970 (GHz or otherwise) had convincingly beat (or even convincingly tied) the 680, Nvidia had GK100 waiting in the wings.
At that point in time, the one and only failure was R600 relative to G80. And GT200 would have been a greater failure relative to a hypothetical big-RV770 than that was.
If anything, the psychological aspect is more important in the current CPU market than it was in the historical GPU market. You don't topple a behemoth by allowing it to retake mindshare when you're able to blunt it from doing so.
No they didn't. If they wanted to throw more silicon at Epyc they easily could have. Each chiplet is only ~80 mm2 and there is room left on the interposer, they easily could have made each CCD bigger but they chose not to.
Do you have any calculations for this?
I'm not talking about the yield of the dice themselves but the yield of the stacking process. We don't know what that yield is, but whatever it is, it will be significantly worse the more layers are stacked relative to a single stack.
No, it just means your costs go up to get working silicon, which is the whole point. Companies could keep adding silicon to increase performance, but they have to take into account not only the cost of the added silicon but also the hit in yields as the dice grow bigger. That's the whole reason AMD went with chiplets in the first place. Stacking happens because continuing to grow horizontally on the substrate stops being practical at some point, both in power and cost, so stacking becomes the superior option, but it will still be lower yielding than not stacking, and stacking 4-hi will be lower yielding than 1-hi, etc.
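The compounding effect being argued about here can be sketched with a toy model: if every bonding step succeeds independently with some probability, the assembled part's yield is the base die yield multiplied by the per-bond yield raised to the number of layers. The 98% base yield and 99% per-bond figures below are purely assumptions for illustration.

```python
def stacked_yield(base_yield: float, bond_yield: float, layers: int) -> float:
    """Yield of a stacked assembly: the base die and every bonded layer must all be good."""
    return base_yield * bond_yield ** layers

# Assumed numbers: 98% base die yield, 99% per-bond success rate.
for layers in (1, 2, 4, 8):
    print(f"{layers}-hi stack: {stacked_yield(0.98, 0.99, layers):.1%}")
```

Each added layer multiplies in another chance of failure, so 4-hi is necessarily lower yielding than 1-hi; the open question is only how close to 1.0 the per-bond yield really is, and whether bad layers can be disabled rather than scrapping the whole part.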
AMD has technically been profitable since the beginning of 2018, but that notwithstanding, they still could have bought into risk production and just paid their debts with all of their income from chip sales. What you're saying, however, is that buying into risk production would have cut into their profits despite giving them a superior product. Doesn't sound like a very good business decision, does it?
Nvidia also sold a $3,000 Titan graphics card, and it was an absolute failure in terms of consumer success.
Even Nvidia has limits on what it can charge. We also have no idea how the 3090 would have sold if it weren't for the current market conditions and miners, who are much less price sensitive than regular consumers. You can't compare what graphics cards sell for now to pretty much any other computer part in history (except other GPUs during mining runs); it's a unique situation, and these cards are being driven up in price by people who aren't buying them as the target consumer market but as money-making machines.
I doubt it will destroy Zen 3, but I guess that depends on your definition of destroy. I also said more than a year away from when ADL launches (actual launch, not announcement).
Again, I'd like to see your math behind this number.
Based upon your posts in this forum, I just don't think we will see eye to eye on how AMD will/should operate their business. We'll just have to wait and see how far AMD is willing to go and how much they are willing to spend to keep their 'mind share' as the gaming leader despite having only won this mind share less than a year ago and doing just fine without it.
This perfectly explains why the Wii was such a commercial failure. If you don't have the biggest and best product you'll be quickly relegated to the pages of history.
If you think that's what AMD has to do in order to be successful you're sorely misguided in your thinking.