Intel, is that you?

> As long as gamers still buy their GPUs in record numbers, why should they improve? It's a winning formula for NV to grow profits.
You forgot OEMs not actually being able to test any cards before launch, which led to crashing issues that had to be resolved with a driver change.

> How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance.
> Cheaper (?) node with apparently poor characteristics.
> Unbalanced design with too many compute units relative to the rest.
> Supply side disaster.
> Terrible memory options. Too much or too little, choose.
It was a winning formula, but will it continue? I think NV knew that they had a problem once SS 7EUV got botched. They needed to get something out and regroup, trading on their great mind share. Like Intel, they can be on their back foot this generation; so long as the follow-up (Hopper) delivers, they won't lose much momentum. It took three years of stagnation before Intel's bottom line started to be affected. No doubt the more nimble culture at NV will adjust quickly. Interesting times for gamers anyway, and little short-term downside for NV's financials.

> As long as gamers still buy their GPUs in record numbers, why should they improve? It's a winning formula for NV to grow profits.
I guess Nvidia just isn't used to having this much competition at the high end. AMD threw a wrench into their system with RDNA 2, and now they are scrambling to figure out a revised line-up to stay competitive while also trying to buoy their profit margins. JHH would like another Ferrari, but something has to give when AMD is squeezing his plums with a GPU that likely has better yields, is likely easier to supply, has better perf/W, and comes with more RAM. That "something" is AIB profit margins, along with their relationship with the AIBs, and it's not exactly a wise move since they are screwing over their own distributors. As the saying goes: don't crap where you eat.

> How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance.
> Cheaper (?) node with apparently poor characteristics.
> Unbalanced design with too many compute units relative to the rest.
> Supply side disaster.
> Terrible memory options. Too much or too little, choose.
I think waiting would actually have been better. What's the point of an early launch if you have no inventory to sell? Anyone into a new GPU will soon also see the new AMD ones on the same chart and go from there; it's not like NV will avoid comparisons with AMD's cards. Being a bit later with a mediocre product is better than pissing off your customer base: the ones that could buy will be annoyed if a refresh with more VRAM happens, and the ones that couldn't buy got pissed at the lack of inventory.

> One of the reasons nVidia's launches normally go so well is they can basically release at their leisure. Except this year, where they have real competition. But it's looking like they were just way too early.
Unfortunately for NV, it does seem like RTG has got their act together and been given the funding they needed to excel. Early leaks for RDNA3 have it at another >50% perf/W leap vs RDNA2. If NV didn't wake up when they saw RDNA1's potential, they will be in a world of hurt by the time of RDNA3 vs Hopper. I.e., for all we know, Hopper could be Volta-esque, destined for datacenters only, with NV relying on Ampere to carry them while expecting no competition from RTG.

Intel is toast until '22, when the Keller-influenced new CPU designs materialize as products.

> It was a winning formula, but will it continue? I think NV knew that they had a problem once SS 7EUV got botched. They needed to get something out and regroup, trading on their great mind share. Like Intel, they can be on their back foot this generation; so long as the follow-up (Hopper) delivers, they won't lose much momentum. It took three years of stagnation before Intel's bottom line started to be affected. No doubt the more nimble culture at NV will adjust quickly. Interesting times for gamers anyway, and little short-term downside for NV's financials.
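For what the rumored RDNA3 perf/W leap above would compound to, here is a back-of-envelope check. Both percentages are AMD's marketing claim (RDNA2) and rumor (RDNA3), used purely for illustration:

```python
# Compounding generational perf/W gains; both factors are
# claimed/rumored figures, not measurements.
rdna1 = 1.0           # baseline perf/W
rdna2 = rdna1 * 1.50  # AMD's claimed ~+50% for RDNA2 over RDNA1
rdna3 = rdna2 * 1.50  # rumored ~+50% again for RDNA3

print(f"RDNA3 vs RDNA1: {rdna3 / rdna1:.2f}x")  # prints "RDNA3 vs RDNA1: 2.25x"
```

In other words, if both generations really deliver +50%, RDNA3 would land at more than double RDNA1's perf/W, which is why a Volta-esque, datacenter-only Hopper would leave Ampere badly exposed.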
And here I was hoping my* launch-day 6800 XT would be the bee's knees for at least two years!

> Early leaks for RDNA3 have it at another >50% perf/W leap vs RDNA2.
I don't think he wanted this.

> Better for gamers isn't better for Jensen though.
In GPUs, Xe might still be behind RDNA2/Ampere, never mind RDNA3 and Hopper, in uarch.

> Intel is likely toast till HVM of their 7nm products hits in 2023 - and they won't catch up till 5nm, if that node doesn't get delayed also.
Intel doesn't need to compete with the high end, or even the upper mid-range. The VAST majority of GPUs sold are low- to mid-range cards; that's where Intel needs to compete. And they don't even have to compete with cards sold at retail: most GPUs are sold via OEMs, which Intel already has strong ties with.

> In GPUs, Xe might still be behind RDNA2/Ampere, never mind RDNA3 and Hopper, in uarch.
They are going to need another substantial leap in perf/W, and possibly perf/mm², before having any chance of competing with the two, and that is if they have process parity! Competing on price with an inferior uarch is a losing battle.

> Intel doesn't need to compete with the high end, or even the upper mid-range. The VAST majority of GPUs sold are low- to mid-range cards; that's where Intel needs to compete. And they don't even have to compete with cards sold at retail: most GPUs are sold via OEMs, which Intel already has strong ties with.
The last time Intel tried entering a new market and competing on cost by using their strong OEM ties, it ended with billions in losses and a complete retreat. And that was during the last stage of their golden era, when their brand image was intact; "Intel inside" meant a lot more back then than it does now.

> And they don't even have to compete with cards sold at retail: most GPUs are sold via OEMs, which Intel already has strong ties with.
Especially for Intel, who has grown to expect fat margins, fatter than its leaner competitors'.

And as the third vendor, they have to be above everyone else just so they get everyone's attention and have a shot at taking any market share.

> Competing on price with an inferior uarch is a losing battle.
I didn't say competing on price. I meant competing on performance, just in low- to mid-range GPUs, not the high-end market. Trying to shovel an inferior product just on price won't work. But if they come up with a GPU that competed with, say, an RTX 3050, where performance was equal, they could push AMD and nVidia out of some OEM deals, specifically in the mobile space.

> Competing on price with an inferior uarch is a losing battle.
Intel doesn't sell any discrete cards right now, so they don't have any margins. While I'm sure they'd like them to be similar to their CPUs', as long as they make a profit and can sustain the cost of developing their next generation of GPUs, they're making more money than they would be by not doing so.

> Especially for Intel, who has grown to expect fat margins, fatter than its leaner competitors'.
Inferior tech means higher production costs to deliver the same result, and then you have to charge less than the competition to sell it, so that's a double hit to margins. Then factor in that the competitors run leaner margins to start with, and it's a triple hit.

How long will Intel suffer lean margins before giving up?
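The double/triple hit above can be put into rough numbers. Every figure below is a made-up assumption, just to show the shape of the squeeze:

```python
# Hypothetical margin math for the "triple hit" described above.
# Suppose an incumbent builds an equivalent card for $100 and sells it for $150.
incumbent_cost = 100.0
incumbent_price = 150.0
incumbent_margin = (incumbent_price - incumbent_cost) / incumbent_price

# Hit 1: inferior tech needs more silicon/power for the same result -> higher cost.
entrant_cost = incumbent_cost * 1.15    # assume +15% production cost
# Hit 2: an unproven product has to undercut on price to move units at all.
entrant_price = incumbent_price * 0.90  # assume -10% selling price
entrant_margin = (entrant_price - entrant_cost) / entrant_price

# Hit 3: the incumbents here (AMD/NV) already run leaner margins than Intel's
# CPU business, so even matching them would feel thin by Intel's standards.
print(f"incumbent: {incumbent_margin:.0%}, entrant: {entrant_margin:.0%}")
# prints "incumbent: 33%, entrant: 15%"
```

So under these assumed numbers the entrant earns less than half the incumbent's gross margin per card, before even comparing against Intel's accustomed CPU margins.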
I am skeptical Intel GPUs will ever really matter to game-playing consumers, but I would be happy if they surprised to the contrary. More competition would be nice, and it's actually most needed in the mid-range, where prices are slow to move.
What you say makes sense, but it won't go over well in the boardroom or at the quarterly earnings report, I guarantee you that.

> Intel doesn't sell any discrete cards right now, so they don't have any margins. While I'm sure they'd like them to be similar to their CPUs', as long as they make a profit and can sustain the cost of developing their next generation of GPUs, they're making more money than they would be by not doing so.
If their product is only good as a bottom-basement option, then it's better for them to take that on slim margins than to not sell anything at all. If they had the capability to release a killer product that takes the crown right out of the gate, they would have had far better APU graphics for years. As long as they can cover their costs until they can get better, it's fine for them not to be the best.

Are the board members so stupid that they'd turn down an increase in net profit? I don't doubt they have some idiots, given the current state of Intel, but if Intel can make their graphics division self-sustaining, they'd be utter fools to kill it for no other reason than that it can't hit the margin levels of their long-standing core business, one that didn't have any real competition for half a decade.

> What you say makes sense, but it won't go over well in the boardroom or at the quarterly earnings report, I guarantee you that.
Making money at a place like Intel doesn't mean anything; they make more money than they know what to do with. What the boardroom and stockholders want to see is them making more money at high profitability, so that they can share in the profits in the form of dividends and share price.

> Are the board members so stupid that they'd turn down an increase in net profit? I don't doubt they have some idiots, given the current state of Intel, but if Intel can make their graphics division self-sustaining, they'd be utter fools to kill it for no other reason than that it can't hit the margin levels of their long-standing core business, one that didn't have any real competition for half a decade.
Yeah, it's like the directive inside NV was to defy expectations and everyone in the company interpreted it the wrong way.

> How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance.
> Cheaper (?) node with apparently poor characteristics.
> Unbalanced design with too many compute units relative to the rest.
> Supply side disaster.
> Terrible memory options. Too much or too little, choose.