
Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 198

MrTeal

Platinum Member
Dec 7, 2003
2,939
596
136
39% of the SM of the 3070. The 1660 Ti is 35.3% of the SM of the 2080 Ti. Outside other variables like memory, expect something between a 1660 Ti and 1070 Ti for 1080p performance?
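That back-of-envelope ratio can be sketched quickly. The 18-SM figure for the 3050 is the rumor under discussion; the other SM counts are public specs:

```python
# SM counts: 3070, 1660 Ti, and 2080 Ti from public specs; the 18-SM
# figure for the RTX 3050 is the rumor this thread is discussing.
sm_counts = {
    "RTX 3050 (rumored)": 18,
    "RTX 3070": 46,
    "GTX 1660 Ti": 24,
    "RTX 2080 Ti": 68,
}

def sm_ratio(part, flagship):
    """Fraction of the bigger part's SMs that the smaller part has."""
    return sm_counts[part] / sm_counts[flagship]

print(f"3050 / 3070:       {sm_ratio('RTX 3050 (rumored)', 'RTX 3070'):.1%}")   # 39.1%
print(f"1660 Ti / 2080 Ti: {sm_ratio('GTX 1660 Ti', 'RTX 2080 Ti'):.1%}")       # 35.3%
```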
 

Bouowmx

Golden Member
Nov 13, 2016
1,059
443
146
39% of the SM of the 3070. The 1660 Ti is 35.3% of the SM of the 2080 Ti. Outside other variables like memory, expect something between a 1660 Ti and 1070 Ti for 1080p performance?
I think so too.
RTX 3050 ~ GTX 1660 Ti
Maybe stretching to RTX 2060
 

Glo.

Diamond Member
Apr 25, 2015
4,589
3,190
136
Since the 3050 with 18 SMs is 90W TGP, we should not expect the 3050 Ti to be less than 140W with 28 SMs.

Geez that 8 nm process is dogcrap.
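The scaling implied here is a naive linear per-SM proportion; treat it as a rough floor, since real boards carry fixed costs (memory, VRM, fan) that don't scale with SM count, and both SM counts are rumors:

```python
tgp_3050_w, sms_3050 = 90, 18    # rumored RTX 3050: 90 W TGP, 18 SMs
sms_3050ti = 28                  # rumored RTX 3050 Ti SM count

# Hold power-per-SM constant and scale linearly with SM count.
estimate_w = tgp_3050_w * sms_3050ti / sms_3050
print(f"3050 Ti TGP floor: {estimate_w:.0f} W")  # 140 W
```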
 
  • Like
Reactions: Mopetar

Glo.

Diamond Member
Apr 25, 2015
4,589
3,190
136
Guys, even if Kopite says that he can be wrong, it's Kopite who is the source for this ;).

So I'd say there is a huge chance that this is actually correct ;).
 

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,552
136
3060 with 12GB?? After the 3080 launched with 10?? I'll believe it when I see it.
Would it be worse for them to make it a 6 GB product that will probably hit a massive wall on some games due to memory limits, or to bump it up to 12 GB so that it's a good card that pisses in the face of everyone who bought a 3070 or 3080?

I don't know what production at Micron is looking like, but it may honestly have been better for NVidia to hold off releasing until next year if there would have been 2 GB GDDR6X modules available.

Even though AMD could have claimed the crown, NVidia customers would wait 3-4 months because they would believe NVidia would be better, because that's just what NVidia does.
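The 6-vs-12 GB binary falls out of the 3060's rumored 192-bit bus: each GDDR6 module sits on a 32-bit channel, and modules come in 1 GB or 2 GB densities, so there is nothing between 6 GB and 12 GB. A quick sketch:

```python
bus_width_bits = 192              # rumored RTX 3060 memory bus
modules = bus_width_bits // 32    # one GDDR6 module per 32-bit channel
# Only 1 GB and 2 GB GDDR6 densities exist, so the capacity options are:
print([modules * density_gb for density_gb in (1, 2)])  # [6, 12]
```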
 

lobz

Golden Member
Feb 10, 2017
1,503
1,922
106
Would it be worse for them to make it a 6 GB product that will probably hit a massive wall on some games due to memory limits, or to bump it up to 12 GB so that it's a good card that pisses in the face of everyone who bought a 3070 or 3080?

I don't know what production at Micron is looking like, but it may honestly have been better for NVidia to hold off releasing until next year if there would have been 2 GB GDDR6X modules available.

Even though AMD could have claimed the crown, NVidia customers would wait 3-4 months because they would believe NVidia would be better, because that's just what NVidia does.
In retrospect, practically any solution would have been better than what Jensen decided to do then, and has continuously decided to do ever since :D
 

beginner99

Diamond Member
Jun 2, 2009
4,667
1,078
136
I don't know what production at Micron is looking like, but it may honestly have been better for NVidia to hold off releasing until next year if there would have been 2 GB GDDR6X modules available.
Instead of esoteric overclocked RAM they should have gone with a wider bus; then using half the number of modules would have worked. A 512-bit bus for 16 GB, and so forth...
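The module arithmetic behind that suggestion, as a sketch (32-bit channel per GDDR6/GDDR6X module, 1 GB or 2 GB densities; the bus widths are illustrative):

```python
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    """Possible VRAM sizes (GB) per module density for a given bus width."""
    modules = bus_width_bits // 32  # one memory module per 32-bit channel
    return {d: modules * d for d in densities_gb}

for bus in (192, 256, 320, 512):
    print(bus, vram_options(bus))
# A 512-bit bus means 16 modules: 16 GB even at 1 GB density, as the post
# suggests, while the 3080's 320-bit bus only allows 10 GB or 20 GB.
```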
 

Pinstripe

Member
Jun 17, 2014
196
4
81
A 12GB 3060 is a smart move, since the endless bitching about another 6GB card would have backfired too hard. Besides, GDDR6 modules with 16Gb density should finally be in volume production.

I suppose all the other GA104-based cards using 8GB VRAM will get a SUPER refresh with 16GB VRAM by next year.
 

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
A 12GB 3060 is a smart move, since the endless bitching about another 6GB card would have backfired too hard. Besides, GDDR6 modules with 16Gb density should finally be in volume production.

I suppose all the other GA104-based cards using 8GB VRAM will get a SUPER refresh with 16GB VRAM by next year.
2GB GDDR6 modules have been in production for years. It's the 2GB GDDR6X modules that aren't in production. The 3060 should use regular GDDR6, not the expensive X variant.
 
  • Like
Reactions: Mopetar

Pinstripe

Member
Jun 17, 2014
196
4
81
2GB GDDR6 modules have been in production for years. It's the 2GB GDDR6X modules that aren't in production. The 3060 should use regular GDDR6, not the expensive X variant.
Not the faster 16 Gbps modules though. AMD uses those on the upcoming RX 6000 series too, so I assume Nvidia saw an opportunity now for the RTX 3060 to double VRAM size without exploding prices.
 
  • Like
Reactions: Stuka87

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,552
136
Instead of esoteric OCed RAM they should have gone with a wider bus, then using half amount of modules would have worked. 512-bit bus for 16gb and so forth...
I don't know if this is true, but when I proposed something similar, another poster here said that a 512-bit bus isn't possible with GDDR6X for technical reasons to do with signaling. Again, I'm not sure how true that is, but a bus that large would also eat up a considerable amount of power on top of a chip that's already gobbling down plenty of it, so I could see NVidia avoiding it even if it were actually possible for them to make a wider bus.
 

ozzy702

Golden Member
Nov 1, 2011
1,045
438
136
I don't know if this is true, but when I proposed something similar, another poster here said that a 512-bit bus isn't possible with GDDR6X for technical reasons to do with signaling. Again, I'm not sure how true that is, but a bus that large would also eat up a considerable amount of power on top of a chip that's already gobbling down plenty of it, so I could see NVidia avoiding it even if it were actually possible for them to make a wider bus.
Wouldn't a 512-bit bus and the fastest-spec'd GDDR6 have provided plenty of bandwidth? I wonder if signaling is still an issue with GDDR6, assuming it's a problem with GDDR6X. Did NVIDIA really just underestimate AMD that badly? I mean, Samsung not delivering is something that maybe NVIDIA couldn't have foreseen, but memory issues? Did they just think Micron would have 2GB memory modules earlier in the game?

Just seems like Ampere is a fantastic design massively handicapped by memory and process limitations.
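For the bandwidth question: peak theoretical bandwidth is bus width divided by 8, times the per-pin data rate. On that arithmetic, a hypothetical 512-bit GDDR6 card at 16 Gbps would out-run the 3080's actual 320-bit GDDR6X at 19 Gbps (a rough sketch, ignoring real-world efficiency):

```python
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    # Peak bandwidth in GB/s: total bits per second across the bus, / 8.
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(512, 16))  # hypothetical 512-bit GDDR6 @ 16 Gbps -> 1024.0
print(bandwidth_gb_s(320, 19))  # actual RTX 3080, GDDR6X @ 19 Gbps    -> 760.0
```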
 

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
Apparently nVidia's partners aren't too happy with Ampere cards. And they say they can't actually build 3060-class cards at the price point nVidia is dictating.

 

maddie

Diamond Member
Jul 18, 2010
3,363
2,277
136
Apparently nVidia's partners aren't too happy with Ampere cards. And they say they can't actually build 3060-class cards at the price point nVidia is dictating.

How could Nvidia have screwed up so many things in so many different areas so badly with this launch? This would have been impossible to predict based on past performance.

Cheaper (?) node with apparently poor characteristics.
Unbalanced design with too many compute units relative to rest.
Supply side disaster.
Terrible memory options. Too much or too little, choose.
 

PhoBoChai

Member
Oct 10, 2017
114
363
106
How could Nvidia have screwed up so many things in so many different areas so badly with this launch? This would have been impossible to predict based on past performance.

Cheaper (?) node with apparently poor characteristics.
Unbalanced design with too many compute units relative to rest.
Supply side disaster.
Terrible memory options. Too much or too little, choose.
As long as gamers still buy their GPUs in record numbers, why should they improve? It's a winning formula for NV to grow profits.
 

exquisitechar

Senior member
Apr 18, 2017
443
502
106
Terrible memory options. Too much or too little, choose.
This one is pretty simple. They are scrambling not to get wrecked by AMD on VRAM at similar tiers, but it's not so simple due to bus width and GDDR6X.
Supply side disaster.
If you believe kopite7kimi, yields are bad and there will be a new stepping to improve things in this regard. It also seems the launch was rushed.
 

beginner99

Diamond Member
Jun 2, 2009
4,667
1,078
136
This one is pretty simple. They are scrambling not to get wrecked by AMD in VRAM in similar tiers, but it's not so simple due to bus width and GDDR6X.
This alone makes the Infinity Cache actually a very good deal. You get a 256-bit bus with a reasonable VRAM size and can scale down accordingly. Above 256-bit you face the problem of having too little or way too much VRAM, i.e. exactly NV's problem. The 3080 should have had the full bus for 12 GB of VRAM, and it would already be far less of an issue. Yields must be terrible, or else the smaller bus really makes no sense.
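The "full bus" point in numbers: GA102's full memory interface is 384-bit (12 modules) versus the 320-bit cut used on the 3080; same 32-bit-channel-per-module arithmetic as before:

```python
for label, bus_bits in (("RTX 3080 (320-bit)", 320), ("full GA102 (384-bit)", 384)):
    modules = bus_bits // 32  # one memory module per 32-bit channel
    print(f"{label}: {modules * 1} GB or {modules * 2} GB")
# 320-bit: 10 GB or 20 GB; 384-bit: 12 GB or 24 GB -- the 12 GB option
# is the configuration the post argues the 3080 should have had.
```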
 
  • Like
Reactions: lightmanek

beginner99

Diamond Member
Jun 2, 2009
4,667
1,078
136
I don't know if this is true, but when I proposed something similar, another poster here said that a 512-bit bus isn't possible with GDDR6X for technical reasons to do with signaling.
Of course it would only be GDDR6 then, but yeah, I read that too, and it was about GDDR6. I could not find any confirmation with a Google search, although it's hard to find relevant hits.
 
