Giving back sounds stupid to me.

The principle isn't stupid. The way price hikes traditionally work is by yielding bigger overall profits even if the number of customers decreases (per price elasticity). This is perfectly reasonable from the company's PoV: they should always seek the optimal point where they extract the most profit. At the same time, though, the company must always be mindful of the effects of price changes on its customer base, or even on the health of the entire market as a whole (assuming low competition).
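The price-elasticity tradeoff above can be sketched with a toy linear-demand model. All the numbers here are hypothetical, just to show that raising price sheds customers yet can still lift profit up to an optimum:

```python
# Toy model: demand q(p) = a - b*p, unit cost c (all numbers made up).
# Profit (p - c) * q(p) is a downward parabola in p, maximized at
# p* = (a/b + c) / 2 -- the "optimal point" the post talks about.
a, b, c = 1000.0, 2.0, 100.0   # assumed demand intercept, slope, unit cost

def profit(p: float) -> float:
    return (p - c) * max(a - b * p, 0.0)

p_star = (a / b + c) / 2  # = 300.0 with these numbers
print(profit(p_star))     # optimum: fewer customers than at p = 150...
print(profit(150.0))      # ...but more than double the profit
```

Past p*, further hikes lose more from shrinking volume than they gain per unit, which is why a company in a healthy market can't just raise prices forever.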
If PC gaming is at the core of your business revenue, then your business is invested in the success and overall health of the ecosystem. Simply maximizing revenue isn't going to work.
All of this leads us to the real problem: Nvidia is no longer reliant on the PC gaming market. They no longer care about where this market is heading in the long term. Their crop rotation is somewhere else now.
The real issue is the weakness of the competition.

My entire post was based on the assumption of low/fake competition.
I'd argue that a $200 4060 would be all it would take right now to keep the ground fertile.
Even screwing over AIBs, nVidia would be losing money.
Look, things are going to get even worse. N2 looks like it's going to be decently worse $ per transistor. And who knows how much GDDR7 will cost. Performance gains are still possible but you will have to pay for it.
There are people here who know what the margins really are on these products. AFAIK, they aren't commenting in this thread. Or any threads here.
Speculation on margins of products mass produced & assembled overseas is folly without insider knowledge.
This is ground we've covered before, so I won't mention it further here.
I argue that if this were true, AMD would have responded strongly to the Super cards with significant price cuts.

Are we talking about the AMD who is now fully invested in datacenter GPUs, to the point where they scrapped some consumer RDNA 4 designs to focus on CDNA? AMD has shown us again and again that they will only undercut Nvidia by as little as possible.

I argue that if this were true, AMD would have responded strongly to the Super cards with significant price cuts.

They don't, because there's a duopoly, so AMD just rides NV's pricing structure, with slight undercuts where needed. This isn't some kind of shocking revelation.
Will the Supers stop the bleeding? We'll see. Or have many gamers indeed crossed Nvidia off the shopping list? I know I have. Voting with our wallets is how we reach them. They have enjoyed the "Shut up and take my money!" effect for far too long, and PC gaming has suffered for it.
Exactly. Now whether they get squeezed in the datacenter and are forced to look around for less bloody water markets remains to be seen, PCs could maybe matter again to them some day.
The AI hype train will derail soon enough, when everyone realizes the cost isn't worth the gain in most cases. Also, once dedicated AI inference hardware becomes available, that market will be gone too, because such hardware will be cheaper and, more importantly, use a fraction of the power compared to a "GPU".
Some of the posts here really are taking the proverbial, aren't they? Guys, guys, it's clearly all inflation. NV operates razor-thin margins, yo!
Nvidia generates up to 1,000% profit for each H100 GPU sold
For every H100 GPU accelerator sold, Nvidia appears to be making a remarkable profit, with reported margins reaching 1,000 percent of production costs. Tae Kim, a senior... (www.techspot.com)
Even screwing over AIBs, nVidia would be losing money.
I don't think any company that is R&D-heavy wants to deal with such poor margins on a product; they would rather not sell the product at all. They would rather maintain prices and let the product sit on shelves. AMD CPUs are a far worse culprit when it comes to fat margins: even with awful sales, AMD does not really want to drop prices much.
Firstly, we know their overall margins.
And we know the die sizes.
So, as for the claim that a $200 4060 would mean losing money:
AD107 is 159 mm² according to TPU's database:
so that's 358 potential dies per wafer (0.12 mm scribe line is the calculator's default). I took the square root of 159 mm² as the die edge; from TPU's die shots in their reviews, AD107 is pretty square.
Die Yield Calculator - iSine
Use this online calculator to figure out die yield using Murphy's model. You'll need to know the die size, wafer diameter, and defect density. (isine.com)
Yields are another thing. TSMC's defect rate is often quoted at 0.07 defects per cm², so I used that, giving 320 defect-free dies. Defect-free doesn't mean they will all run at the correct frequency/voltage, but GPUs usually have parts that can be fused off, so 89.4% is probably the worst case.
So, how much is a 5nm/4nm TSMC wafer? Well, $20,000 sounds like the upper end, and that would make each good die a bit over $62.
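Putting those numbers together, here's a quick sketch of the arithmetic, using Murphy's yield model with the usual per-cm² defect-density convention and the 358 candidate dies the iSine calculator reports (wafer price is the $20,000 worst case assumed above):

```python
import math

def murphy_yield(die_area_mm2: float, defect_density_per_cm2: float) -> float:
    """Murphy's yield model: Y = ((1 - e^(-A*D0)) / (A*D0))^2."""
    ad = (die_area_mm2 / 100.0) * defect_density_per_cm2  # mm² -> cm²
    return ((1.0 - math.exp(-ad)) / ad) ** 2

dies_per_wafer = 358        # from the iSine calculator (159 mm², 0.12 mm scribe)
wafer_cost = 20_000         # assumed worst-case N5/N4 wafer price

y = murphy_yield(159, 0.07) # ~89-90% for an AD107-sized die
good_dies = int(dies_per_wafer * y)
cost_per_die = wafer_cost / good_dies
print(f"yield = {y:.1%}, good dies = {good_dies}, cost = ${cost_per_die:.2f}")
```

This reproduces the figures in the post: roughly 320 good dies and a bit over $62 of silicon per card, before packaging, memory, and the rest of the BOM.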
IMO, $200 should then be doable even with decent margins, but we'd have to know more about the BOM for the rest of the card. AD107 is a GPU designed down to a price, though, with a 128-bit bus, only 8GB of VRAM, etc.
Now, we know Nvidia have no intention of selling the RTX 4060 for $200, but I propose they could and still enjoy healthy margins.