I think so too. It has 39% of the SM count of the 3070; the 1660 Ti has 35.3% of the SM count of the 2080 Ti. Setting aside other variables like memory, expect something between a 1660 Ti and a 1070 Ti for 1080p performance?
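For reference, those percentages fall out of the published SM counts; a quick sketch, noting that the 3050's 18-SM configuration is still only a rumor at this point:

```python
# SM-count ratios behind the 39% / 35.3% figures above.
# The RTX 3050 count is a rumored spec; the other three are shipping configs.
sm = {
    "RTX 3050 (rumored)": 18,
    "RTX 3070": 46,
    "GTX 1660 Ti": 24,
    "RTX 2080 Ti": 68,
}

r_3050 = sm["RTX 3050 (rumored)"] / sm["RTX 3070"]
r_1660 = sm["GTX 1660 Ti"] / sm["RTX 2080 Ti"]

print(f"3050 vs 3070:       {r_3050:.1%}")  # 39.1%
print(f"1660 Ti vs 2080 Ti: {r_1660:.1%}")  # 35.3%
```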
> I think so too.
Nope. Since it will have a 128-bit GDDR6 bus, this should be around a 1660 Ti, at best.
RTX 3050 ~ GTX 1660 Ti
Maybe stretching to RTX 2060
> RTX 3060 will have 12GB GDDR6, RTX 3050 Ti will get 6GB.
YES! 6 GB on the RTX 3050 Ti.
> 3060 with 12GB?? After the 3080 launched with 10?? I'll believe it when I see it.
It cannot be 12 GB. Nvidia is not that generous. Even AMD isn't that generous.
> 3060 with 12GB?? After the 3080 launched with 10?? I'll believe it when I see it.
Would it be worse for them to make it a 6 GB product that will probably hit a massive wall in some games due to memory limits, or to bump it up to 12 GB so that it's a good card that pisses in the face of everyone who bought a 3070 or 3080?
A 12 GB 3060 would be the ultimate spit in the face of everyone who waited 2+ months for a 3080 after somehow successfully ordering one within minutes of its launch back in September.
> Would it be worse for them to make it a 6 GB product that will probably hit a massive wall in some games due to memory limits, or to bump it up to 12 GB so that it's a good card that pisses in the face of everyone who bought a 3070 or 3080?
In retrospect, practically every solution would have been better than what Jensen decided to do, and he has kept making the same kind of decisions ever since.
I don't know what production at Micron is looking like, but it may honestly have been better for Nvidia to hold off releasing until next year, if 2 GB GDDR6X modules would have been available by then.
Even though AMD could have claimed the crown, Nvidia customers would have waited the 3-4 months, believing Nvidia would be better, because that's just what Nvidia does.
> I don't know what production at Micron is looking like, but it may honestly have been better for Nvidia to hold off releasing until next year, if 2 GB GDDR6X modules would have been available by then.
Instead of esoteric overclocked RAM they should have gone with a wider bus; then using half the number of modules would have worked. A 512-bit bus for 16 GB, and so forth...
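The module math behind that suggestion is simple: each GDDR6 package sits on its own 32-bit channel, so total capacity is (bus width / 32) × per-module density. A minimal sketch with the configurations under discussion (the 192-bit 3060 layout is still a rumor):

```python
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    """Total VRAM given one GDDR6 module per 32-bit channel."""
    modules = bus_width_bits // 32
    return modules * module_gb

# A wider bus with cheap 1 GB modules vs. 2 GB modules on a narrower bus:
print(vram_gb(512, 1))  # 16 GB from sixteen 1 GB modules
print(vram_gb(256, 2))  # 16 GB from eight 2 GB modules
print(vram_gb(192, 2))  # 12 GB -- the rumored RTX 3060 layout
```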
> A 12GB 3060 is a smart move, since the endless bitching about another 6GB card would have backfired too hard. Besides, GDDR6 modules with 16Gb density should finally be in volume production.
2 GB GDDR6 modules have been in production for years. It's the 2 GB GDDR6X modules that aren't in production. The 3060 should use regular GDDR6, not the expensive X variant.
I suppose all the other GA104-based cards using 8 GB of VRAM will get a SUPER refresh with 16 GB by next year.
> 2GB GDDR6 modules have been in production for years. Its the 2GB GDDR6X that aren't in production. The 3060 should use regular GDDR6, not the expensive X variant.
Not the faster 16 Gbps modules, though. AMD uses those on the upcoming RX 6000 series too, so I assume Nvidia saw an opportunity now for the RTX 3060 to double the VRAM size without exploding prices.
> Instead of esoteric OCed RAM they should have gone with a wider bus, then using half amount of modules would have worked. 512-bit bus for 16gb and so forth...
I don't know if this is true, but I heard from another poster here, when I proposed something similar, that a 512-bit bus isn't possible with GDDR6X for technical reasons having something to do with signaling. Again, I'm not sure how true that is, but a bus that large would also eat up a considerable amount of power on top of a chip that's already gobbling down plenty of it, so I could see Nvidia avoiding it even if a wider bus were actually possible.
> In retrospect, practically every solution would have been better than what Jensen decided to do and has been continuously deciding to do so ever since then
Better for gamers isn't better for Jensen, though.
> I don't know if this is true, but I heard it from another poster here when I proposed something similar and that with GDDR6X a 512-bit bus isn't possible due to technical reasons having something to do with signaling. Again, I'm not sure how true that is, but having a bus that large would also eat up a considerable bit of power on top of a chip that's already gobbling down plenty of it, so I could see NVidia avoiding it even if it were actually possible for them to make a wider bus.
Wouldn't a 512-bit bus and the fastest spec'd GDDR6 have had plenty of bandwidth? I wonder if signaling is still an issue with GDDR6, assuming it's a problem with GDDR6X. Did Nvidia really just underestimate AMD that badly? I mean, Samsung not delivering is something that maybe Nvidia couldn't have foreseen, but memory issues? Did they just think Micron would have 2 GB modules earlier in the game?
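On the bandwidth question: on paper, yes. Peak bandwidth is just bus width in bytes times the per-pin data rate, and a hypothetical 512-bit GDDR6 setup at 16 Gbps would out-run the 3080's actual 320-bit GDDR6X configuration:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(512, 16))  # 1024.0 GB/s -- hypothetical 512-bit GDDR6 @ 16 Gbps
print(bandwidth_gbs(320, 19))  # 760.0 GB/s  -- RTX 3080: 320-bit GDDR6X @ 19 Gbps
```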
> Apparently Nvidia's partners aren't too happy with Ampere cards, and they say they can't actually build the 3060-class cards for what Nvidia is telling them to build them for.
How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance.
> How could Nvidia have screwed up so many things in so many different areas so badly this launch? This would have been impossible to predict based on past performance
As long as gamers still buy their GPUs in record numbers, why should they improve? It's a winning formula for NV to grow profits.
Cheaper (?) node with apparently poor characteristics.
Unbalanced design with too many compute units relative to rest.
Supply side disaster.
Terrible memory options. Too much or too little, choose.
> Terrible memory options. Too much or too little, choose.
This one is pretty simple. They are scrambling not to get wrecked by AMD on VRAM in similar tiers, but it's not so simple due to bus width and GDDR6X.
> Supply side disaster.
If you believe kopite7kimi, yields are bad and there will be a new stepping to improve things in that regard. It also seems the launch was rushed.
> This one is pretty simple. They are scrambling not to get wrecked by AMD in VRAM in similar tiers, but it's not so simple due to bus width and GDDR6X.
This alone makes the Infinity Cache a very good deal: you get a 256-bit bus with a reasonable VRAM size and can scale down accordingly. Above 256-bit you face the problem of having either too little or way too much VRAM, i.e. exactly NV's problem. The 3080 should have had the full bus for 12 GB of VRAM, and it would already be far less of an issue. Yields must be terrible, or else the smaller bus really makes no sense.
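The "too little or way too much" dilemma follows directly from the channel math: with only 1 GB and 2 GB module densities to pick from, each bus width offers exactly two capacities. A quick enumeration of the bus widths in play:

```python
# Capacity options per bus width, given only 1 GB or 2 GB GDDR6(X) modules
# (one module per 32-bit channel).
for bus in (256, 320, 384):
    channels = bus // 32
    low, high = channels * 1, channels * 2
    print(f"{bus}-bit: {low} GB or {high} GB")
# 256-bit: 8 GB or 16 GB
# 320-bit: 10 GB or 20 GB  (the 3080's spot: 10 is tight, 20 is costly)
# 384-bit: 12 GB or 24 GB  (full GA102 bus: 12 GB as the middle ground)
```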
> I don't know if this is true, but I heard it from another poster here when I proposed something similar and that with GDDR6X a 512-bit bus isn't possible due to technical reasons having something to do with signaling.
Of course it would only be GDDR6 then, but yeah, I read that too, and it was about GDDR6. I could not find any confirmation with a Google search, although it's hard to find relevant hits.