Question [Videocardz] nVidia planning 2060 12 GB


jpiniero

Lifer
Oct 1, 2010
14,595
5,215
136

Could be released by January. Here's your answer for what they plan on doing with the Turing wafers that were meant for CMP.
 

jpiniero

Lifer
Oct 1, 2010
14,595
5,215
136
The only products AMD has that use GDDR6 are made on TSMC 7 nm.

Ethereum's price crashing is certainly possible but who knows if that will actually happen. As I sort of mentioned this could just be a temporary release to fulfill the Turing wafers they bought for CMP. But it could be extended if they think they can continue to get $400+ for a card that came out 2 years ago.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,340
10,044
126
Honestly, depending on price, there could be a LOT of demand for an entry-level / mid-range gaming card like the 2060 / Super with 12GB of GDDR6 on it. It would also be a really desirable mining card, at the appropriate pricing.

Edit: I don't know if NV could make the 2060 refresh an "LHR" card or not. It depends on how much of it is silicon "hooks", and how much is BIOS and drivers. They could certainly TRY, but if it's mostly BIOS/drivers, it could be bypassed a lot more easily than if it were based in the actual GPU silicon.

That said, I'm a bit unconvinced that even the LHR v2 on the 30-series is really silicon-based. Quite frankly, because the limiter starts high and then slowly ramps down to the lower limit, I suspect it's based on SMM handlers in the VBIOS checking performance counters in the GPU to analyse what kind of algorithm is executing on the GPU clusters; if it matches the pattern of Ethash, it ramps down the memory clock, inserts wait-states, or otherwise throttles the GPU.
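Purely as a thought experiment, the counter-based limiter speculated about above could look something like this sketch. Everything here is hypothetical (the signature values, the function names, the throttle step); nothing reflects NVIDIA's actual firmware.

```python
# Speculative sketch of pattern-based hashrate limiting: sample GPU
# performance counters, compare against an Ethash-like signature, and
# ramp the memory clock down gradually toward a floor, the way LHR
# appears to behave. All names and numbers are made up for illustration.

ETHASH_SIGNATURE = {"mem_read_ratio": 0.9, "int_alu_ratio": 0.7}

def looks_like_ethash(counters: dict, tolerance: float = 0.1) -> bool:
    """True if the sampled counters sit within tolerance of the signature."""
    return all(abs(counters.get(k, 0.0) - v) <= tolerance
               for k, v in ETHASH_SIGNATURE.items())

def throttle_step(mem_clock: int, floor: int = 5000, step: int = 250) -> int:
    """Ramp the memory clock down one step, never below the lower limit."""
    return max(floor, mem_clock - step)

# Example: a workload whose counters match the signature gets throttled.
clock = 7000
samples = [{"mem_read_ratio": 0.92, "int_alu_ratio": 0.68}] * 4
for s in samples:
    if looks_like_ethash(s):
        clock = throttle_step(clock)
print(clock)  # 6000 after four throttle steps
```

The "starts high, ramps down slowly" behaviour VirtualLarry describes falls out naturally from a stepwise limiter like this, which is the whole point of the speculation.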
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,812
7,168
136
12GB? It won't be a competitive price then.

-Indeed. It'll be competing for all the same GDDR6 as everything else on the market and it'll continue to be relevant to miners.

Shoulda slapped 6GB on it to keep the price down and called it a day, but no such luck.

OTOH, just introducing as much mining supply as possible should push up the difficulty just a bit more, and every little bit counts if GPUs are ever going to stop printing money.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
The 2060 uses an older process than the 3060, so those production lines may have additional capacity available for Nvidia. Also, the 2060 was specced with slower GDDR6; those bins may be more plentiful and cheaper to source.

Making a 12GB 2060 neatly plugs the hole in Nvidia's product line between the 3050 and the 3060: the 3050, while nearly as fast as the 2060 in most low-to-mid-quality benchmarks, suffers at higher quality levels due to its limited memory pool. Just rehashing the 2060 avoids the need for a new "1660" class of product, the full stack keeps ray tracing, and Nvidia doesn't have to spend a penny on development. In addition, the 12GB memory pool makes it useful for direct VRAM transfers in DX12.

It makes a lot of sense, and lines up with the rumors of how AMD's next gen stack is going to look.
 
  • Like
Reactions: Tlh97

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
no LHR on 2060.
I think the way LHR works is that it checks the device ID and then goes down a different route in the drivers. The new 2060 will have a new ID, so I would guess it would be trivial for LHR to be added if they want.

I suspect this is primarily for OEMs who want something that will stay in stock and is better/newer than the 1660s. Hence putting 12GB on it, as that looks good on the stickers: "Nvidia, 12GB, RT, DLSS".
 

KompuKare

Golden Member
Jul 28, 2009
1,015
930
136
These will get absolutely clobbered by miners and scalpers.
A quick Google (via whattomine.com) suggests that a 2060 gets 30 MH/s @ 120W while a 3060 Ti gets 58 MH/s @ 130W.
Not saying that makes it impossible that miners would snap these up, but it should make them a bit less appealing.
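Using the whattomine.com figures quoted above (the hashrates and wattages are the post's numbers, not independently verified), the efficiency gap works out to roughly 1.8x in the 3060 Ti's favour:

```python
# Back-of-the-envelope mining efficiency from the figures quoted above.
cards = {"RTX 2060": (30, 120), "RTX 3060 Ti": (58, 130)}  # (MH/s, watts)

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per watt")
# → RTX 2060: 0.250 MH/s per watt
# → RTX 3060 Ti: 0.446 MH/s per watt (roughly 1.8x the 2060)
```

Which is the point: per watt, the 2060 is a noticeably worse miner, even if raw hashrate per card still attracts buyers.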
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
A lot of small-time miners don't pay for the electricity they use. If it hashes, they won't care. Look at the rooms filled with laptops propped up on their sides, mining away, that get posted on the internet.
 
  • Like
Reactions: DAPUNISHER

jpiniero

Lifer
Oct 1, 2010
14,595
5,215
136
A quick Google (via whattomine.com) suggests that a 2060 gets 30 MH/s @ 120W while a 3060 Ti gets 58 MH/s @ 130W.
Not saying that makes it impossible that miners would snap these up, but it should make them a bit less appealing.

Used 2060 6 GB are going for over $400 right now so it doesn't matter.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
definitely nowhere near these prices i hope...

[Attachment: chart of rumored RTX 4000-series prices]

lolz... the 4000 series is where Nvidia intends to sodomize us with a large long stick, if those prices are true.

Every one of the cards above would sell out before release day at those prices. People would buy them by the pallet. They'd be asking their friends, "how many pallets are you getting this time?" The prices shown are too low and in no way reflect the reality of the market. Tell me a 4060 wouldn't sell out instantly at $800? Of course it would, with or without miners in the picture.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
At this point, I'm essentially waiting for a mining conglomerate to just outright buy an AIB manufacturer, bulk buy chips, design mining-focused cards, and then use them for themselves. Then, once the next generation of chips comes out, sell off their existing cards for more than they could ever mine back with them and repeat the process. Kind of like how the development of Bitcoin ASICs went.
 
  • Wow
Reactions: VirtualLarry

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
At this point, I'm essentially waiting for a mining conglomerate to just outright buy an AIB manufacturer, bulk buy chips, design mining-focused cards, and then use them for themselves. Then, once the next generation of chips comes out, sell off their existing cards for more than they could ever mine back with them and repeat the process. Kind of like how the development of Bitcoin ASICs went.

Sadly, that doesn't seem so far out of the question anymore.
 
  • Wow
Reactions: VirtualLarry

jpiniero

Lifer
Oct 1, 2010
14,595
5,215
136

Gigabyte is doing a couple models. Videocardz seems to think the announcement is Dec 7.
 

Gideon

Golden Member
Nov 27, 2007
1,639
3,674
136
I wonder if Nvidia produced a new, smaller die?

If these are still the good old TU106s, it's such a waste. These could just as easily be RTX 2060 Supers or even RTX 2070s (it's the exact same die, after all).

Nvidia castrates these GPUs on purpose, as the RTX 2060 and 2070 both have essentially the same performance as an RTX 3060 (less than 10% difference, even @ 4K) and would eat its sales if priced any cheaper.

Rather than waste scarce chip resources putting double the memory on 12GB cards, they should just re-release the 8GB RTX 2060 Super or RTX 2070.
If they're afraid of cannibalization, just sell them at ~90% of the 3060's price. The stock is nonexistent anyway.
 
  • Like
Reactions: Tlh97 and Thibsie

Spikke

Member
Jun 23, 2007
30
3
71
I don't understand the 12GB of VRAM. I imagine that in the next few years 12GB might become common and more necessary, but at that point the 2060 will probably be near obsolete and unable to run AAA games at a respectable framerate anyway.
 

jpiniero

Lifer
Oct 1, 2010
14,595
5,215
136
Nvidia castrates these GPUs on purpose, as the RTX 2060 and 2070 both have essentially the same performance as an RTX 3060 (less than 10% difference, even @ 4K) and would eat its sales if priced any cheaper.

There is enough of a gap between the 2060 non-Super and the 2060S/2070/3060 to make a difference. As for why 12 GB: six 2 GB chips might be cheaper than eight 1 GB chips now. But that's really a guess.
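For what it's worth, the six-chip figure isn't a guess: GDDR6 chips have a 32-bit interface, so the 2060's 192-bit bus fixes the count at six, and only the chip density is a choice. A quick sanity check (bus widths are the cards' known specs; the relative pricing of chip densities remains the speculation above):

```python
# GDDR6 chips present a 32-bit interface, so bus width fixes the chip
# count, and chip density (1 GB vs 2 GB per chip) fixes total VRAM.

def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32       # one 32-bit chip per channel
    return chips * chip_density_gb

print(vram_gb(192, 2))  # 2060's 192-bit bus with 2 GB chips -> 12 GB
print(vram_gb(192, 1))  # same bus with 1 GB chips -> 6 GB (original 2060)
print(vram_gb(256, 1))  # eight 1 GB chips would need a 256-bit bus -> 8 GB
```

So on a 192-bit bus the only step up from 6 GB is a straight doubling to 12 GB; an 8 GB 2060 was never an option without the Super's wider bus.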
 
  • Like
Reactions: Tlh97 and Gideon