NV 4060 / 4060 Ti reviews


SteveGrabowski

Diamond Member
Oct 20, 2014
6,865
5,803
136
https://preview.redd.it/hg3ffwbyan1b1.jpeg?width=2400&format=pjpg&auto=webp&v=enabled&s=405304d7eec0b76410640c8d7cc2ee4f51b96774


Again, as I said in one of the previous posts, by every traditional metric such as CUDA cores and memory bus width, Nvidia upshifted every die into the tier above. The 4080 should have been the 4070, the 4070 should have been the 4060, and the 4060 Ti should have been the 4050. And on top of that they also raised prices for each tier: a 4080 which is really a 70-class card at $1200? A 4070 which is really a 60-class card at $600? GTFO.

Vote with your wallet, it's the only thing that will change this.
Ouch, I was comparing this to the crap GTX 960 when it's much closer to the crap GTX 950 by this metric.
 
  • Like
Reactions: Tlh97 and IEC

SteveGrabowski

Diamond Member
Oct 20, 2014
6,865
5,803
136
As some of us have been saying for years: this is what happens when a company dominates the market. Let's hope this is the turning point: that gamers keep their wallets shut, or spend their money on AMD or Intel, to punish Nvidia the only way they understand.

EVGA looks wiser every day.
Gotta start buying AMD to really tell Nvidia to piss off. Hopefully AMD comes out with a nice replacement for the 6700 XT to claim this $400 price point with something strong this gen.
 

KompuKare

Golden Member
Jul 28, 2009
1,014
925
136
AD106 is too expensive to do something like that. More likely they'd just end production. Nvidia probably should have left AD106 and AD107 as mobile-only... or maybe OEM-only on desktop.
I have to say that I am with @BFG10K here: show us your workings!
Please show us evidence of the profit on each 4060 Ti. I'd like to see these "razor thin margins" that poor, poor NV has to endure.


Nah, what you're seeing is a company with a monopoly in the GPU market trying to maintain miner margins despite mining being dead.

Exactly like Intel giving us 4 cores on 14nm++++++ for seven years.
While we can only speculate on wafer prices, we do have some concrete facts:
  1. AD106 is 190mm²
  2. TSMC's 4nm yields should be similar to 7nm at this stage, so a defect density of 0.07 defects/cm² is reasonable.
  3. From the TPU dissection I make the die aspect ratio to be about 1:1.2, i.e. AD106 is roughly 12.5mm by 15.2mm.
Taking those figures to https://isine.com/resources/die-yield-calculator/ gives me:
[screenshot of the die yield calculator output]

So ignoring partial dies (if there are many of these, Nvidia might eventually release a plain 4060 based on AD106 instead of AD107), we get 259 good dies.
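
For anyone who wants to sanity-check that without the calculator, here is a rough sketch of the same estimate. The 300 mm wafer size, the standard dies-per-wafer approximation, and the simple Poisson yield model are my assumptions; the isine tool also handles edge exclusion and scribe lanes, so this won't land on exactly 259 good dies, but it ends up in the same ballpark.

```python
import math

# Napkin estimate of good AD106 dies per wafer.
# Assumptions (not from the calculator): 300 mm wafer, no edge exclusion
# or scribe lanes, Poisson yield model.
WAFER_DIAMETER_MM = 300.0
DIE_W_MM, DIE_H_MM = 12.5, 15.2        # AD106 estimate from the TPU dissection
DEFECT_DENSITY_PER_CM2 = 0.07          # assumed 4nm defect density

die_area_mm2 = DIE_W_MM * DIE_H_MM     # ~190 mm^2

# Classic dies-per-wafer approximation: pi*d^2/(4*A) - pi*d/sqrt(2*A)
gross_dies = (math.pi * WAFER_DIAMETER_MM**2 / (4 * die_area_mm2)
              - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

# Poisson yield: Y = exp(-A * D0), with the die area converted to cm^2
yield_fraction = math.exp(-(die_area_mm2 / 100.0) * DEFECT_DENSITY_PER_CM2)

good_dies = int(gross_dies * yield_fraction)
print(f"gross dies ~{int(gross_dies)}, yield ~{yield_fraction:.1%}, good dies ~{good_dies}")
```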

Like I said, we have no idea what TSMC charges for a 4nm wafer, but at 259 good dies per wafer:
at $10,000 per wafer it would be around $39
at $15,000 per wafer it would be around $58
at $20,000 per wafer it would be around $77
at $30,000 per wafer it would be around $116
$30k is likely far too high, but let's take the $20k price: under $80 per die. Nvidia then sells the die plus 8GB of GDDR6 to the OEMs. A quick search shows GDDR6 at a spot price of around $3.50 per GB, so $28 or so for 8GB (I think DRAMeXchange prices per GB, and Nvidia buys in huge volumes so would pay a lot less than spot prices).

So at worst we are at roughly $108 for the die plus memory. Plenty of scope to cut prices from $400 IMO. And lower-end parts should be lower-margin parts that make up for it with volume.
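
The same arithmetic as a quick script, taking the calculator's 259 good dies at face value; the wafer prices and the $3.50/GB spot figure are the guesses above, not published numbers.

```python
# Per-die and die+VRAM cost at a few guessed wafer prices.
GOOD_DIES_PER_WAFER = 259      # from the yield calculator above
GDDR6_PRICE_PER_GB = 3.50      # rough spot price; Nvidia's contract price is lower
VRAM_GB = 8

for wafer_price in (10_000, 15_000, 20_000, 30_000):
    die_cost = wafer_price / GOOD_DIES_PER_WAFER
    bom = die_cost + GDDR6_PRICE_PER_GB * VRAM_GB
    print(f"${wafer_price:,} wafer -> ~${die_cost:.0f} per die, "
          f"~${bom:.0f} with {VRAM_GB}GB of GDDR6")
```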

Which is immediately followed by Huang Scaling: as transistors get smaller, Nvidia's margins stay constant.
Or maybe it is even worse than that: if we ignore their largely self-inflicted mining-boom overstocking and the huge corrections they had to declare against their recent results, then Nvidia's margins have actually gone up as the cost per transistor has gone up. So much for Jensen moaning like O'Leary about the cost of doing business (in Jensen's case the cost of wafers; in the case of the Ryanair boss, the cost of just about anything).
I'm not joking when I say AMD could easily pull out of the dGPU business if things don't improve.
IMO AMD's problem recently has been that while their fixed costs have gone up (leading edge GPUs in 2023 are really really expensive to design, write drivers for etc.), they are not willing to aggressively go for volume.

High fixed costs spread over a low volume = high overheads per item sold. Little profit overall.

High fixed costs spread over high volumes = lower overheads per item sold. More overall profit?

Profit = (per-unit margin × volume) less fixed costs, where the per-unit margin is price minus unit cost.
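
A toy example with made-up numbers (the $500M fixed cost and $100 per-card margin below are purely illustrative) shows how hard volume moves the needle:

```python
# Hypothetical figures only: how fixed-cost amortization changes with volume.
FIXED_COSTS = 500e6       # assumed design/driver/mask costs for a GPU generation
UNIT_MARGIN = 100.0       # assumed contribution per card sold (price minus unit cost)

for volume in (2_000_000, 10_000_000):
    overhead_per_unit = FIXED_COSTS / volume
    profit = UNIT_MARGIN * volume - FIXED_COSTS
    print(f"{volume:>10,} units: ${overhead_per_unit:.0f} overhead per unit, "
          f"net {profit / 1e6:+.0f}M")
```

At 2M units the hypothetical fixed costs swamp the per-unit margin; at 10M units the same design is comfortably profitable.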
 
  • Like
Reactions: Tlh97 and Aapje

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,455
20,468
146
Gotta start buying AMD to really tell Nvidia to piss off.
No I don't, I just bought the Arc A750 LE for $199. ;) I haven't been this excited about a GPU purchase in a good while. The lack of polish compared to the competition makes it spicy. It's going to be fun having to learn a new user experience, from the driver downloads to the UI, too. In the last couple of years I've had 6 RDNA2 cards, 4 GTX 10 series, 2 RTX 20 series, and 2 RTX 30 series cards; time to mix it up.

Also, it's nigh impossible to get any tech tuber to test the games I want at the settings I want, so Imma do it myself.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
I have to say that I am with @BFG10K here: show us your workings!

While we can only speculate on wafer prices, we do have some concrete facts:
  1. AD106 is 190mm²
  2. TSMC's 4nm yields should be similar to 7nm at this stage, so a defect density of 0.07 defects/cm² is reasonable.
  3. From the TPU dissection I make the die aspect ratio to be about 1:1.2, i.e. AD106 is roughly 12.5mm by 15.2mm.
Taking those figures to https://isine.com/resources/die-yield-calculator/ gives me:
[screenshot of the die yield calculator output]

So ignoring partial dies (if there are many of these, Nvidia might eventually release a plain 4060 based on AD106 instead of AD107), we get 259 good dies.

Like I said, we have no idea what TSMC charges for a 4nm wafer, but at 259 good dies per wafer:
at $10,000 per wafer it would be around $39
at $15,000 per wafer it would be around $58
at $20,000 per wafer it would be around $77
at $30,000 per wafer it would be around $116
$30k is likely far too high, but let's take the $20k price: under $80 per die. Nvidia then sells the die plus 8GB of GDDR6 to the OEMs. A quick search shows GDDR6 at a spot price of around $3.50 per GB, so $28 or so for 8GB (I think DRAMeXchange prices per GB, and Nvidia buys in huge volumes so would pay a lot less than spot prices).

So at worst we are at roughly $108 for the die plus memory. Plenty of scope to cut prices from $400 IMO. And lower-end parts should be lower-margin parts that make up for it with volume.


Or maybe it is even worse than that: if we ignore their largely self-inflicted mining-boom overstocking and the huge corrections they had to declare against their recent results, then Nvidia's margins have actually gone up as the cost per transistor has gone up. So much for Jensen moaning like O'Leary about the cost of doing business (in Jensen's case the cost of wafers; in the case of the Ryanair boss, the cost of just about anything).

IMO AMD's problem recently has been that while their fixed costs have gone up (leading edge GPUs in 2023 are really really expensive to design, write drivers for etc.), they are not willing to aggressively go for volume.

High fixed costs spread over a low volume = high overheads per item sold. Little profit overall.

High fixed costs spread over high volumes = lower overheads per item sold. More overall profit?

Profit = (per-unit margin × volume) less fixed costs, where the per-unit margin is price minus unit cost.
Fantasy spin unbound. Good luck getting an answer. I never got one & it used to drive me crazy, until I realized the truth. A few weeks ago we were told the cost of 8 x 2GB memory chips was $200. Same old story being repeated with no attempt to explain it.

Now the latest claim: they might be close to stopping selling GPUs, at least anything below the extreme top end.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I think we can stop pretending this is an honest launch. The only honest launch was the 4090. My hypothesis of what's actually happening is that Nvidia is in the middle of a huge AI hardware boom and they don't want to waste wafer space on a bunch of stingy, broke ass gamers. They release cards they know won't sell and allocate the space to AI hardware and rake it in. They figure gamers will always be around to buy their gaming cards when they finally decide to sell real gaming cards again. These launches are designed to fail. They have to be.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
Ugh. Not often you can just flat out call an NV launch bad, but here we are. At least the 2xxx series was bringing newfangled RTX and DLSS to the table to make up for the lack of perf/$$$ movement, but even that sheen is gone now.

The unspoken tragedy of this launch is that 3070s and 3060 Tis etc. have no reason to drop in price now. Maybe there will be a little downward pressure in the used market as the ignorant gen-on-gen upgraders unload their cards, but I don't expect it.

I was really hoping the 4060 Ti/4060 would crush 3070s down into the $100-200 price range, but no such luck. They'll keep selling at their inflated prices and budget gaming will take another kick in the nuts.

Been shopping around and it's startling how $100 is basically the floor on anything but the most antiquated used GPUs at this point. 2060 Supers and such are still going for $120; it's nuts.
 

Saylick

Diamond Member
Sep 10, 2012
3,127
6,304
136
I think we can stop pretending this is an honest launch. The only honest launch was the 4090. My hypothesis of what's actually happening is that Nvidia is in the middle of a huge AI hardware boom and they don't want to waste wafer space on a bunch of stingy, broke ass gamers. They release cards they know won't sell and allocate the space to AI hardware and rake it in. They figure gamers will always be around to buy their gaming cards when they finally decide to sell real gaming cards again. These launches are designed to fail. They have to be.
I think Nvidia recognizes that competing in the consumer GPU market is not viable for them long-term: AMD and Intel will eventually make inroads, and the gaming market is already very mature and not exactly growing like gangbusters. Therefore, they would rather focus as much of their financial and engineering resources as possible on the AI market, with whatever paltry amount is left going to maintaining their consumer segment. Lovelace as a generation isn't supposed to be appealing because they'd much rather have no one buy consumer GPUs, thus maintaining the status quo, so that they can allocate the wafers to AI. Once the AI market takes off and becomes a long-term sustainable revenue source for Nvidia, I suspect Jensen will be content to let his consumer GPU market share drop if it means they get to enjoy much higher margins in the consumer space. They will essentially be the "Apple of consumer GPUs": Nvidia won't ship the most consumer GPUs of the three vendors, but they will make the most profit. Then whatever wafer allocation used to go to consumer GPUs would be allocated to enterprise products.

I mean, we're already seeing some of this happen with respect to mobile GPUs. Nvidia is getting pushed out of the lower end and frankly, I don't think they care.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
I think Nvidia recognizes that competing in the consumer GPU market is not viable for them long-term: AMD and Intel will eventually make inroads, and the gaming market is already very mature and not exactly growing like gangbusters. Therefore, they would rather focus as much of their financial and engineering resources as possible on the AI market, with whatever paltry amount is left going to maintaining their consumer segment. Lovelace as a generation isn't supposed to be appealing because they'd much rather have no one buy consumer GPUs, thus maintaining the status quo, so that they can allocate the wafers to AI. Once the AI market takes off and becomes a long-term sustainable revenue source for Nvidia, I suspect Jensen will be content to let his consumer GPU market share drop if it means they get to enjoy much higher margins in the consumer space. They will essentially be the "Apple of consumer GPUs": Nvidia won't ship the most consumer GPUs of the three vendors, but they will make the most profit. Then whatever wafer allocation used to go to consumer GPUs would be allocated to enterprise products.

I mean, we're already seeing some of this happen with respect to mobile GPUs. Nvidia is getting pushed out of the lower end and frankly, I don't think they care.
Maybe they need to spin off the gaming GPU division. One problem is that Wall St valuations are affected by present & future margins. A new consumer-oriented company licensing technology from the main company would allow separate accounting and could sell lower-margin products without dragging down the main margin powerhouse.
 
Jul 27, 2020
16,165
10,240
106
Stinky Nvidia can go away for all I care. What have they REALLY given us since COVID that's worth talking about?

3090 (badly cooled RAM chips)
3090 Ti (30% hungrier while spitting out only 10% more fps)
4090 (too big, bad connector)

If Nvidia goes all in with AI and exits the consumer GPU market, I can't say I will miss them.
 
  • Like
Reactions: Tlh97 and Saylick

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
It actually looks like Nvidia is no longer concerned with earning our business. If they don't need us or want to sell to gamers anymore, then OK. I understand that. They have grown a lot as a company and have honestly moved into far more sophisticated things than gaming. Nvidia has to maintain similar margins in gaming as they do in AI to justify selling gaming cards I suppose. I guess it's going to be Intel and maybe AMD then moving forward. It just is what it is. I'm not bitter about it. I get it.
 

Aapje

Golden Member
Mar 21, 2022
1,379
1,856
106
The sad part is that Intel is getting chance after chance to profit and yet they can't.

The optimistic part of me dreams of a 2024 where Battlemage is actually good, AMD decides they do want market share in the GPU market and Nvidia gets so much competition in the AI market that their growth in that market falters and they need gamers to prop up their profits & stock prices, so they suddenly all start to compete for our business.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,327
10,035
126
The optimistic part of me dreams of a 2024 where Battlemage is actually good, AMD decides they do want market share in the GPU market and Nvidia gets so much competition in the AI market that their growth in that market falters and they need gamers to prop up their profits & stock prices, so they suddenly all start to compete for our business.
That's also about when Crypto makes a roaring comeback.
 

Saylick

Diamond Member
Sep 10, 2012
3,127
6,304
136
That's also about when Crypto makes a roaring comeback.
Hahaha, if there's one thing Jensen is good at, it's making sure Nvidia is riding the next big hype wave. First, it was HPC, then crypto, then the metaverse, and now it's AI and LLMs. That stock price isn't going to prop itself up!

In all seriousness, I think Jensen will hype self-driving cars next.
 

Coalfax

Senior member
Nov 22, 2002
395
71
91
I am curious as to what the 7800 and 7700 cards from AMD will actually be like. Odd that they went from the 7900 XT all the way down to the 7600...
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
I am curious as to what the 7800 and 7700 cards from AMD will actually be like. Odd that they went from the 7900 XT all the way down to the 7600...

-Starting to wonder if we'll even get them at this point. N32 is 100% AWOL at the moment.

Almost wonder if AMD is gonna write the midsection of this gen off entirely. Who knows what ends up happening with N32: limited to a professional-only launch (like the W7800), held up for a mid-gen refresh, or outright sandbagged for the 8xxx series launch.
 

PJVol

Senior member
May 25, 2020
533
446
106
Also, Tom's Hardware can screw themselves. They keep simping for Nvidia and have been since their "Just buy it" crap. I used to think Jarred was fairly neutral but I don't think that's the case anymore.
Just read the last page of the Guru3D review and it feels like the conclusion was written by the NV PR team (or even by ChatGPT).
What a sad state of tech journalism we're in.
 
Last edited:

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,330
4,914
136
Verdict: Skip this generation. If you must upgrade, buy a gently used graphics card with >= 12GB of VRAM from a friend.

I sold my 6800 XT to a friend for $420.69. It had 2 years of warranty left as well, so he's set for a good long time (he upgraded from a GTX 1070) until his next upgrade. 16GB of VRAM will be plenty for years to come.

I am curious as to what the 7800 and 7700 cards from AMD will actually be like. Odd that they went from the 7900 XT all the way down to the 7600...

Well, conservatively they had 90-100 days of RDNA2 inventory backlog. That was before demand for GPUs dropped off a cliff, so who knows how long it will take to clear out the old stock...
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
I am curious as to what the 7800 and 7700 cards from AMD will actually be like. Odd that they went from the 7900 XT all the way down to the 7600...

I've read this a few times now and don't understand why it is so surprising. It is uncommon but has been done before: AMD launched the HD 7900 series, followed it up with the 7700 series, then went back and filled in the middle with the 7800 variants. It isn't unprecedented. I do wonder why they are doing it this time around, though.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
I've read this a few times now and don't understand why it is so surprising. It is uncommon but has been done before: AMD launched the HD 7900 series, followed it up with the 7700 series, then went back and filled in the middle with the 7800 variants. It isn't unprecedented. I do wonder why they are doing it this time around, though.

While it has been done before, it's still kind of strange, and a bit worrying, when AMD skips segments within a generation, so I think some are concerned that N32 is never coming.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
While it has been done before, it's still kind of strange, and a bit worrying, when AMD skips segments within a generation, so I think some are concerned that N32 is never coming.

I hope that is not the case, both for their own well-being and for the competition that lowers prices.