Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation


DavidC1

Member
Dec 29, 2023
170
233
76
Where would Nvidia give you 4070 Ti Super-level performance for only $499? They will ask $799 for it now, and even if they release the next gen, you will be happy if they sell that level of performance for $599, let alone $499, and even that would be 33% better perf/$.
Nvidia won't, but they don't have to.

The RTX 3060 sells way more than the A750/A770 ever will, despite the latter two offering superior value.

Battlemage is said to have RTX 4070 Ti performance. The RTX 4070 Super is pretty much at that level, and considering how low-level details and execution can easily swing it by 5-10%, that's extremely competitive: the 4070 Super will be out in days at $599, versus Battlemage in Q3 or even Q4.

Intel would need to execute in a stellar way, which means offering 4080-level, or at least 4070 Ti Super-level, performance at 225W and $499. My point is that I doubt they can do this.

A non-Super 4070 Ti in late 2024 is nothing amazing. Nvidia can easily counter that because it's so far away!
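For what it's worth, the 33% figure in the quote is just price arithmetic; a rough sketch, using the hypothetical prices from the quote (nothing confirmed):

```python
# Rough perf/$ arithmetic for the quoted scenario: the same tier of
# performance (4070 Ti Super level) sold at two hypothetical prices.
price_now = 799       # what Nvidia asks for that performance today (per the quote)
price_next_gen = 599  # hypothetical next-gen price for the same performance

# With performance held constant, perf/$ scales inversely with price.
improvement = price_now / price_next_gen - 1
print(f"perf/$ improvement: {improvement:.0%}")  # ~33%
```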
 
Last edited:

poke01

Senior member
Mar 8, 2022
715
685
106
That remains to be seen in benchmarks. I doubt that company will let a 70-series card trample all over an 80-series card in the same generation.
I think it will get close, maybe 10-15% slower, but with the 4070 TiS you can actually make use of the 16GB. Like you said, the proof will be in the benchmarks.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
Only idiots or people with more money than brains would have done that.

And if they did, they probably would have got it for close to the new $999 MSRP anyway. It's not like the Super is significantly faster.

Newsflash: any GPU you buy today will be superseded in fairly short time.

I bought an RTX 4070 in late November. I'm not sweating better deals on Super cards after that.

I'm just going to enjoy what I have for many years.
 

MrTeal

Diamond Member
Dec 7, 2003
3,568
1,696
136
I think it will get close, maybe 10-15% slower, but with the 4070 TiS you can actually make use of the 16GB. Like you said, the proof will be in the benchmarks.
That doesn't seem unreasonable at all. The 2070S was within 5-10% of the 2080, so it's not like it's unprecedented, and TPU already had the 4080 only 14% faster than the 4070 Ti at 1080p. That increased to 19% at 1440p and 24% at 4K; some of that is CPU limitations, but some will be the low relative memory bandwidth of the 4070 Ti. Add 10% more shaders with the Super, give it a third more memory bandwidth, and it will come pretty close to the 4080.
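A rough back-of-envelope check of those deltas, using the publicly listed specs (core counts, bus widths, and memory speeds quoted from memory, so treat the figures as approximate):

```python
# Back-of-envelope: shader and memory-bandwidth deltas between the 4070 Ti,
# 4070 Ti Super and 4080, using publicly listed specs (approximate).
cards = {
    #                (CUDA cores, bus width in bits, memory speed in Gbps)
    "4070 Ti":       (7680, 192, 21.0),
    "4070 Ti Super": (8448, 256, 21.0),
    "4080":          (9728, 256, 22.4),
}

def bandwidth_gbs(bus_bits, gbps):
    # bus width (bits) * data rate (Gbps) / 8 bits per byte = GB/s
    return bus_bits * gbps / 8

base_cores, base_bus, base_speed = cards["4070 Ti"]
base_bw = bandwidth_gbs(base_bus, base_speed)
for name, (cores, bus, speed) in cards.items():
    bw = bandwidth_gbs(bus, speed)
    print(f"{name:15s} cores {cores / base_cores - 1:+.0%}  "
          f"bandwidth {bw:.0f} GB/s ({bw / base_bw - 1:+.0%})")
```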

Of those, the most useless card is the 4070. I'm sure they'll come down more (there's a Zotac at Newegg already for $535), but at only $50 less MSRP than the 4070 Super it's a non-factor. Either spend the extra on the Super, or wait to see if there's a 4060 Ti refresh.
 
  • Like
Reactions: Tlh97 and Executor_

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
Oh no, are you going to be another one that starts lecturing us about the "right settings"?

EDIT, Damn, TESKATLIPOKA beat me to it. Well done.

I'm not lecturing, just responding to the nonsense that my card is somehow going to be suddenly useless soon.

I literally only had an 8800GT for about 15 years. If I can cope with nursing that card along for that long, I'm pretty sure I can get 5+ years out of the 4070.
 
  • Like
Reactions: Rigg and psolord

Thunder 57

Platinum Member
Aug 19, 2007
2,669
3,786
136
I'm not lecturing, just responding to the nonsense that my card is somehow going to be suddenly useless soon.

I literally only had an 8800GT for about 15 years. If I can cope with nursing that card along for that long, I'm pretty sure I can get 5+ years out of the 4070.

The 8800GT was legendary. The 4070 is not. But yes, you should get 5 years out of it.
 

Ranulf

Platinum Member
Jul 18, 2001
2,345
1,164
136
I literally only had an 8800GT for about 15 years.

Heh, that got me thinking about my old cheapo Radeon HD 5450 tester card from 2010 or so. I have a Radeon WX2100 that sort of replaced it as a test card that runs only off mobo power. I can't think of a card that has been my daily gamer card, as it were, for more than 3 years. Maybe the 2060S now. That, or the 5850 from 2010/11 to 2014.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,226
136
The 8800GT was legendary.

I feel like I should build a shrine to my 8800GT for serving so well, for so long. I felt that my 9700 Pro was equally legendary, but it died right after the warranty, while my 8800GT just kept going.

I remember at one point the fan bearings got noisy; I just peeled back the sticker, injected some oil into it, and it kept trucking.
 

Ranulf

Platinum Member
Jul 18, 2001
2,345
1,164
136
I like how the tie guy flips back to Ti there briefly. More amusing is the FG (frame generation) being used to sell the new Super-duper performance jumps.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,845
106
I'm not lecturing, just responding to the nonsense that my card is somehow going to be suddenly useless soon.

I literally only had an 8800GT for about 15 years. If I can cope with nursing that card along for that long, I'm pretty sure I can get 5+ years out of the 4070.
And who said your 4070 will suddenly become useless?
With the correct settings you can use it even for >5 years, but that doesn't change the fact that 12GB of VRAM is not that much even today.

I can agree that there weren't many better options if you want RT, but still.

If the RTX 4060 Ti 16GB were $399, then I would probably choose that instead, but I still don't understand why Nvidia used a cut-down chip even for the 16GB version.
The other thing I am wondering about is why there is still no GPU using Samsung's 24 Gbps memory.
 
Last edited:

Aapje

Golden Member
Mar 21, 2022
1,370
1,836
106
If the RTX 4060 Ti 16GB were $399, then I would probably choose that instead, but I still don't understand why Nvidia used a cut-down chip even for the 16GB version.
If they had used the full chip, then the 4060 Ti would have the exact same number of cores as the 4070 laptop. So it would expose how they are lying to consumers.

The old 4080 has the same number of cores as the 4090 laptop, and the new 4080 Super actually has more, but I guess they don't think that 4090 laptop buyers have any functioning brain cells, so it doesn't matter there.
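Putting rough numbers on that (CUDA core counts as publicly listed, quoted from memory, so double-check before relying on them):

```python
# Publicly listed CUDA core counts (approximate, from memory) illustrating
# the desktop-vs-laptop naming overlap discussed above.
cuda_cores = {
    "RTX 4060 Ti desktop (cut-down AD106)": 4352,
    "RTX 4070 Laptop (full AD106)":         4608,
    "RTX 4080 desktop (cut-down AD103)":    9728,
    "RTX 4090 Laptop (AD103)":              9728,
    "RTX 4080 Super (full AD103)":          10240,
}
for name, cores in cuda_cores.items():
    print(f"{name:38s} {cores:>6,}")
```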
 
  • Like
Reactions: igor_kavinski

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,845
106
If they had used the full chip, then the 4060 Ti would have the exact same number of cores as the 4070 laptop. So it would expose how they are lying to consumers.

The old 4080 has the same number of cores as the 4090 laptop, and the new 4080 Super actually has more, but I guess they don't think that 4090 laptop buyers have any functioning brain cells, so it doesn't matter there.
This is no news. They also used different names in the previous generation, and AMD does it too.
So this doesn't explain why it has fewer cores. A full chip + 16GB of 20 Gbps memory could have been nice, although not for $499.
 
  • Like
Reactions: MoogleW

Aapje

Golden Member
Mar 21, 2022
1,370
1,836
106
This is no news. They also used different names in the previous generation, and AMD does it too.
This is not true, though. They used GA107 for the 3050 desktop and laptop, GA106 for the 3060 desktop and laptop, and GA104 for the 3070 desktop and laptop. The cuts were not the same, but they were the same chips.

Ada is the first generation where they consistently used a lower-tier chip for the laptops than for the similarly named desktop cards.