Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


insertcarehere

Senior member
Jan 17, 2013
639
607
136
Hell, you don't even need to wait. You can buy an MSI 6700XT on Newegg right now for $350 with two free games and get basically 3070 performance.
Friendly reminder: this is a global forum and not everyone lives in the U.S. Newegg is not an option in my country, and RX 6000 series GPUs at anything near MSRP have basically been out of stock here for months now. It's far easier to get Ada/Ampere parts at reasonable pricing here.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
740
136
I don’t use it because my personal gaming PC is Radeon and I’ve only had short stints with a 1070 recently, and before that a 7900 GTO. Also, I refuse to log in to get the control panel, so I haven’t seen it for a bit. I imagine in the last few years it has had a chance to get better.

That said, I guess so?

@DeathReborn that will never, ever get old. As Jarred Walton is at Tom’s now, does it count if he revisits it there? :)

It would count if he used the AT benches and charts; otherwise it's just another Tom's review. I know people who use the GFE optimiser religiously, but with every AMD/Nvidia card I have never used their "optimisers" myself; I just go into the settings and do things like 16x AF, as that's a free IQ improvement. I do log in to GFE, but only out of laziness about driver updates.
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
Supposed images of the 4090 ti leaked. It looks even more comically large than the 4090. The AIO cards are a much better design than these huge brick coolers.
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
Supposed images of the 4090 ti leaked. It looks even more comically large than the 4090. The AIO cards are a much better design than these huge brick coolers.

IIRC Kimi said that version was cancelled. Probably because even with the faster memory it's still not enough to need that high of a clock speed.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
Supposed images of the 4090 ti leaked. It looks even more comically large than the 4090. The AIO cards are a much better design than these huge brick coolers.
That cooler is really weird looking. In that third picture with the QR code, that surface looks like it's the top of the cooler, but why is there so much going on there? It almost looks like there are raised spaces for VRMs or something.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,108
1,260
126
Looking at the IO, makes me think the PCB is horizontal? It’s wide enough that it would probably work.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
Looking at the IO, makes me think the PCB is horizontal? It’s wide enough that it would probably work.
Yeah, it's just really weird. If that's the case, then the PCB would be right beside and parallel to the motherboard, and you'd need some kind of 90° adapter to get the card edge connector into the PCIe slot.
Even outside that, how do you get heat from the vapor chamber to the fins? A bunch of heatpipes, like a tower cooler?

It'd have massive cooling capacity, since the whole thing would be flow-through, but it would need really good coupling from the whole PCB to the cooler, because there would be no airflow at all over the PCB.
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,980
136
So we are going to see a 4090titi as well? Awesum!

I'd doubt it. Any full die is going to data center customers, who pay prices that make what cards went for during the mining boom look like a pittance.

Unless AMD has something that can beat Nvidia on some technical performance aspect, they don't need to put out a full-die consumer card.
 

biostud

Lifer
Feb 27, 2003
18,237
4,755
136
Goodbye to getting 30% more performance per dollar each generation :(
This is a "new tech" generation, just like the 2xxx series, so there's very little increase in performance/$. The performance/$ improvements will come with the 5xxx series.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,848
106
This is a "new tech" generation, just like the 2xxx so very little increase in performance/$. The performance/$ improvements will come with 5xxx again.
I don't see why 50 series using 3nm should provide better perf/$, unless they move to chiplets.

I just checked the RTX 4070Ti vs RTX 3090Ti, and I was pretty surprised to see AD104 having 7.5 billion (+26.5%) more transistors.
That's a big difference, yet it still loses by 10% at 4K.
Yes, you can say it's because of the 48MB L2, but GA102 is better spec-wise, and N21 is also a lot smaller with 128MB IC.
 

biostud

Lifer
Feb 27, 2003
18,237
4,755
136
I don't see why 50 series using 3nm should provide better perf/$, unless they move to chiplets.

I just checked the RTX 4070Ti vs RTX 3090Ti, and I was pretty surprised to see AD104 having 7.5 billion (+26.5%) more transistors.
That's a big difference, yet it still loses by 10% at 4K.
Yes, you can say it's because of the 48MB L2, but GA102 is better spec-wise, and N21 is also a lot smaller with 128MB IC.
There's no technological reason for it; it's their market strategy. If there is no improvement in price/performance then nobody will buy your product, unless you can deliver something faster than the current gen. But I really don't believe the $799+ GPU market can carry a whole generation.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,848
106
There's no technological reason for it, it is their market strategy. If there is no improvement in price/performance then nobody will buy your product, except if you can deliver something that is faster than current gen. But I really don't believe the +$799 GPU market can carry a whole generation.
You said it yourself: there is no reason why it can't cost less. If they set prices this high with the RTX 40 series, why do you expect they will change their strategy next gen?
Because of low sales volumes? Did they lower the prices of AD104, 103 or 102 because of bad sales?
 

biostud

Lifer
Feb 27, 2003
18,237
4,755
136
You said it yourself: there is no reason why it can't cost less. If they set prices this high with the RTX 40 series, why do you expect they will change their strategy next gen?
Because of low sales volumes? Did they lower the prices of AD104, 103 or 102 because of bad sales?
No, because people are apparently willing to pay that amount of money. The market for $999+ video cards has become a large enough segment to justify the 4080 and 4090 existing, and if you compare these cards with the 3090 or 3090 Ti, or with COVID-era GPU prices, then performance/$ has increased.

But if performance/$ does not increase next gen, then the price will have to be $2,400 for a 4090 owner to upgrade to a 5090 just to get 50% better performance, and those with a 4080 are not going to buy a 5080 with 4090 performance at the same price as a 4090. At some point they can't keep raising the price ceiling the way they have with the current generation. I also seriously doubt it will be easy for them to sell new video cards in the sub-$600 market if performance/$ does not increase.
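The upgrade math above can be sketched quickly, assuming a $1,600 RTX 4090 MSRP and a hypothetical 5090 that is 50% faster (both figures are illustrative assumptions):

```python
# If perf/$ stays flat across generations, price scales linearly with performance.
price_4090 = 1600          # USD, assumed MSRP
perf_4090 = 1.0            # normalized performance

perf_5090 = perf_4090 * 1.5                      # hypothetical 50% faster 5090
price_5090 = price_4090 * (perf_5090 / perf_4090)

print(price_5090)          # 2400.0, the figure from the post
```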
 

Aapje

Golden Member
Mar 21, 2022
1,376
1,853
106
Because of low sales volumes?

Yes. They need a lot of volume.

Did they lower the prices of AD104,103 or 102 because of bad sales?

The generation has just been released, so this is a silly objection. As I've already explained before, the impact of the lower prices will mostly be felt later in the life cycle and at lower tiers.

So logically, pressure will increase more and more to lower the prices. I suspect that with these prices, sales will be so low that they will give in eventually. But it might take a long time.

If they set prices this high with the RTX 40 series, why do you expect they will change their strategy next gen?

Again, as I've explained, every generation there will be fewer and fewer people that still benefit from the price/perf improvements of the 3000/2000/etc series. Plus we still have catch-up demand after the mining boom.

So if they don't improve price/perf, next gen sales will be even lower.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
I don't see why 50 series using 3nm should provide better perf/$, unless they move to chiplets.

I just checked the RTX 4070Ti vs RTX 3090Ti, and I was pretty surprised to see AD104 having 7.5 billion (+26.5%) more transistors.
That's a big difference, yet it still loses by 10% at 4K.
Yes, you can say it's because of the 48MB L2, but GA102 is better spec-wise, and N21 is also a lot smaller with 128MB IC.

Memory BW is obviously why the 3090 Ti pulls ahead at 4K.

The 4070 Ti has literally only half the memory BW, and that is going to be most impactful at the highest resolution. It would be more surprising if this weren't the case.

A lot of those extra transistors will be attempting to compensate for the lost BW, but they fall short; some of the other transistors might be boosting RT, and more transistors can be used to boost clock speed as well...
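The bandwidth gap is easy to verify from the published specs (192-bit bus at 21 Gbps GDDR6X for the 4070 Ti vs 384-bit at 21 Gbps for the 3090 Ti); a minimal sketch:

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

bw_4070_ti = bandwidth_gb_s(192, 21)   # 504.0 GB/s
bw_3090_ti = bandwidth_gb_s(384, 21)   # 1008.0 GB/s
print(bw_4070_ti / bw_3090_ti)         # 0.5, i.e. exactly half
```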
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,848
106
Memory BW is obviously why the 3090 Ti pulls ahead at 4K.

The 4070 Ti has literally only half the memory BW, and that is going to be most impactful at the highest resolution. It would be more surprising if this weren't the case.

A lot of those extra transistors will be attempting to compensate for the lost BW, but they fall short; some of the other transistors might be boosting RT, and more transistors can be used to boost clock speed as well...
OK, I should have compared only at 1080p; at that resolution it's a bit faster than the RTX 3090Ti.

What you said is true, but don't forget GA102 has more of everything: SMs, CUDA cores, ROPs, TMUs, memory bus width, etc.
Here is maybe a better comparison.
| | Transistors | SM | CUDA | TMU | ROP | RT cores | Tensor cores | Bus width | L2 cache | Base frequency | Boost frequency |
| GA104 | 17.4 | 48 | 6144 | 192 | 96 | 48 | 192 | 256-bit | 4 MB | 1580 MHz | 1770 MHz |
| AD104 | 35.8 (+106%) | 60 (+25%) | 7680 (+25%) | 240 (+25%) | 80 (-17%) | 60 (+25%) | 240 (+25%) | 192-bit (-25%) | 48 MB (+1100%) | 2310 MHz (+46%) | 2610 MHz (+47%) |
Performance is only 41-43% better, but the transistor count is >100% higher.
I wonder how many transistors were used for the extra cache.
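A rough answer to that last question: a plain 6T SRAM bit cell uses six transistors per bit, so the extra 44 MB of L2 (48 MB minus GA104's 4 MB) can be estimated as a lower bound (this ignores tags, ECC, sense amps and routing):

```python
BITS_PER_BYTE = 8
TRANSISTORS_PER_6T_CELL = 6        # classic six-transistor SRAM bit cell

def sram_transistors(megabytes: int) -> int:
    """Lower-bound transistor count for an SRAM array of the given size."""
    return megabytes * 1024 * 1024 * BITS_PER_BYTE * TRANSISTORS_PER_6T_CELL

extra_l2 = sram_transistors(48 - 4)
print(extra_l2 / 1e9)              # ~2.21 billion transistors
```

So the larger L2 alone would account for only about 2.2 of the roughly 18.4 billion extra transistors in AD104; the rest went elsewhere.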
 