4070 Ti reviews thread


BFG10K

Lifer
Aug 14, 2000
22,706
2,956
126

First official performance leak. A perfect opportunity to highlight the fraud pushed by nVidia with DLSS.

Performance looks fantastic, right? Look again at the small print: the 4070 is using frame generation.

I said exactly this would happen as soon as DLSS 3.0 was announced. 2560x1440 is a lie given it's actually rendering at ~1080p due to legacy DLSS upscaling. Since nVidia got away with that, the next step is frame interpolation lies. It's "faster", yo!

Adjusting for frame generation (i.e. halving the 4070's bars), it's actually barely faster than the 3080 in MFS, and slower in Warhammer 40K. All this from a card that will likely cost more.

Customers who don't see the small print and/or don't understand DLSS are having systematic fraud perpetrated on them.
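For clarity, here's that adjustment spelled out as a minimal Python sketch. The FPS figures are hypothetical, and it assumes frame generation doubles the reported frame count, per the reasoning above:

```python
# Illustrative only: adjust marketed FPS figures per the reasoning above.
# Assumes frame generation doubles the reported frame count; example numbers are hypothetical.

def effective_fps(reported_fps: float, frame_generation: bool) -> float:
    """Halve the reported figure when frame generation is on, so interpolated
    frames aren't counted as rendered frames."""
    return reported_fps / 2 if frame_generation else reported_fps

print(effective_fps(120, frame_generation=True))   # 60.0 "real" frames per second
print(effective_fps(95, frame_generation=False))   # 95.0
```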


Reviews



 

jpiniero

Lifer
Oct 1, 2010
14,578
5,203
136
Also for what it's worth:

3090 (GA102) - 28.3 billion transistors - $1499
4090 (AD102) - 76.3 billion transistors - $1599

~2.7x transistors for ~6.7% more money.

Very deep cut versus essentially the full die. That makes a big difference. Plus they have plenty of 9k+ Quadros they are selling, and the 4090 Ti will likely be 2 grand or more.
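A quick back-of-the-envelope check of those ratios, using only the MSRPs and transistor counts quoted above (retail dollars, not wafer cost):

```python
# Back-of-the-envelope: transistor and MSRP ratios as quoted above.
GA102_XTORS, GA102_MSRP = 28.3e9, 1499   # RTX 3090
AD102_XTORS, AD102_MSRP = 76.3e9, 1599   # RTX 4090

xtor_ratio = AD102_XTORS / GA102_XTORS                     # ~2.70x transistors
price_increase = AD102_MSRP / GA102_MSRP - 1               # ~6.7% more money
xtors_per_dollar_gain = (AD102_XTORS / AD102_MSRP) / (GA102_XTORS / GA102_MSRP)

print(f"{xtor_ratio:.2f}x transistors, {price_increase:.1%} more money, "
      f"{xtors_per_dollar_gain:.2f}x transistors per MSRP dollar")
# -> 2.70x transistors, 6.7% more money, 2.53x transistors per MSRP dollar
```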
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
Also for what it's worth:

3090 (GA102) - 28.3 billion transistors - $1499
4090 (AD102) - 76.3 billion transistors - $1599

~2.7x transistors for ~6.7% more money.

Even if we use the idiotic and twisted definition of Moore's law that's being pushed, JHH is still wrong even by that definition. I'll grant that the Samsung node used for Ampere wasn't quite as good as what TSMC had at the time, but it's not going to substantially change the results here.
What the ........... I keep reading that transistors/$ is flat or falling. Who's shoveling that pile?
 

Saylick

Diamond Member
Sep 10, 2012
3,123
6,286
136
Also for what it's worth:

3090 (GA102) - 28.3 billion transistors - $1499
4090 (AD102) - 76.3 billion transistors - $1599

~2.7x transistors for ~6.7% more money.

Even if we use the idiotic and twisted definition of Moore's law that's being pushed, JHH is still wrong even by that definition. I'll grant that the Samsung node used for Ampere wasn't quite as good as what TSMC had at the time, but it's not going to substantially change the results here.
Ehhh, what Nvidia charges to consumers vs. what they actually pay to the foundry are two separate things. MSRPs and $/xtor are correlated, but loosely.

I'd argue that Nvidia just got good prices on their Samsung wafers (allegedly only paying for good dies rather than the conventional approach of paying per wafer), as evidenced by the $699 MSRP of the RTX 3080. The 3090 and 3090 Ti were just stupidly overpriced and I suspect the margins on those SKUs were far above what's typical for Nvidia. The 4090 being "only" $100 more expensive than the MSRP of the 3090 is just a return to Nvidia's historically "normal" gross margins, which is still pretty high to begin with.

JHH's excuse of using rising manufacturing costs as justification for slotting Lovelace into the inflated price structure of flagship Ampere lets them:
1) Maintain traditional gross margins, even with higher wafer prices
2) Protect Ampere sales so that they can clear out channel inventory
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
Ehhh, what Nvidia charges to consumers vs. what they actually pay to the foundry are two separate things. MSRPs and $/xtor are correlated, but loosely.

I'd argue that Nvidia just got good prices on their Samsung wafers (allegedly only paying for good dies rather than the conventional approach of paying per wafer), as evidenced by the $699 MSRP of the RTX 3080. The 3090 and 3090 Ti were just stupidly overpriced and I suspect the margins on those SKUs were far above what's typical for Nvidia. The 4090 being "only" $100 more expensive than the MSRP of the 3090 is just a return to Nvidia's historically "normal" gross margins, which is still pretty high to begin with.

JHH's excuse of using rising manufacturing costs as justification for slotting Lovelace into the inflated price structure of flagship Ampere lets them:
1) Maintain traditional gross margins, even with higher wafer prices
2) Protect Ampere sales so that they can clear out channel inventory
OK, but a few here are selling higher prices as inevitable due to Moore's Law dying. Shut up and pay your dues, gamer; don't you know we're actually doing you a favor?

The next few years are going to be so interesting, in a Chinese sort of way.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,324
10,034
126
The next few years are going to be so interesting, in a Chinese sort of way.
Are you talking about JHH adopting the CCP policy, "Here's a choice - you get to pick US, no other choices"?

Or was that a reference to one of the several "home-grown" Chinese GPU makers that are fresh to market, selling them here in the USA?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,780
7,107
136
My guess is it splits the difference between the 3090 and 3090 Ti when a broad range of games is tested. It will look more like a 3090 Ti at the 5-10 game review sites.
 

Mopetar

Diamond Member
Jan 31, 2011
7,830
5,977
136
Very deep cut versus essentially the full die. That makes a big difference. Plus they have plenty of 9k+ Quadros they are selling, and the 4090 Ti will likely be 2 grand or more.

Not sure I'd call 88% a "very deep" cut, but to each their own. The 3090 Ti only had an additional 2 SMs but had a $2,000 MSRP. If the 4090 Ti is priced similarly, you don't really have an argument.

I wouldn't expect to see it anytime soon though. AMD doesn't come nearly as close this generation, and Nvidia can sell those chips to data center customers for a lot more.

JHH's excuse of using rising manufacturing costs as justification for slotting Lovelace into the inflated price structure of flagship Ampere lets them:
1) Maintain traditional gross margins, even with higher wafer prices
2) Protect Ampere sales so that they can clear out channel inventory

I didn't say that he was lying for no reason. He's not stupid.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
Not sure I'd call 88% a "very deep" cut, but to each their own. The 3090 Ti only had an additional 2 SMs but had a $2,000 MSRP. If the 4090 Ti is priced similarly, you don't really have an argument.

I wouldn't expect to see it anytime soon though. AMD doesn't come nearly as close this generation, and Nvidia can sell those chips to data center customers for a lot more.

It's definitely a deeper cut than even what the 3080 Ti went through: 128/144 SMs is already a deeper bin than 80/84 SMs proportionally, and that's before taking into account that the 4090 only gets 72 MB of the 96 MB of L2 cache possible in AD102.

...Not that it matters much when AMD's closest competitor, performance-wise, to the full AD104 (80 ROPs, 192-bit memory) appears to be the 7900 XT with 192 ROPs and a 320-bit bus.
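Putting those enabled-unit fractions side by side (a minimal Python sketch, using only the SM and L2 figures quoted in the posts above):

```python
# Enabled fraction of the full die for each configuration quoted above.
configs = {
    "RTX 4090 SMs (AD102)":          (128, 144),
    "RTX 3080 Ti SMs (GA102)":       (80, 84),
    "RTX 4090 L2 cache, MB (AD102)": (72, 96),
}
for name, (enabled, full) in configs.items():
    print(f"{name}: {enabled}/{full} = {enabled / full:.1%}")
# -> roughly 88.9%, 95.2%, and 75.0% respectively
```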
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,852
5,795
136
What a joke: it matches a 3090, not a 3090 Ti, at 4K, and has worse price-to-performance than the 3080 10GB.

[Chart: relative performance at 3840x2160]
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Reader's Digest version of the reviews: the 7900 XT pretty much destroys this card in most cases, which just underlines how bad this card would have looked had it been named the 4080.

NOTE: There is no FE version of the 4070 Ti; all the other cards on this chart are reference versions.
[Attached performance comparison charts]
 

Timorous

Golden Member
Oct 27, 2008
1,605
2,742
136
Place your bets time. Do we think an N32-based 7800 XT can match this level of performance? That would be a 20% uplift from the 6800 XT, or a 41% uplift from the 60 CU 6800.

With high enough clocks I think yes, or very close to it.
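A quick sanity check that the two uplift figures are mutually consistent (a minimal sketch using the round numbers from the post above, not benchmark data):

```python
# The poster's round numbers, relative to the 4070 Ti's performance level.
uplift_vs_6800xt = 1.20   # ~20% over the 6800 XT
uplift_vs_6800 = 1.41     # ~41% over the 60 CU 6800

# If both hold, the 6800 XT would sit this far ahead of the 6800:
implied_gap = uplift_vs_6800 / uplift_vs_6800xt - 1
print(f"Implied 6800 XT lead over the 6800: {implied_gap:.1%}")   # -> ~17.5%
```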