Question [Videocardz] nVidia planning 2060 12 GB


jpiniero

Lifer
Oct 1, 2010
14,585
5,209
136

Could be released by January. Here's your answer for what they plan on doing with the Turing wafers that were meant for CMP.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
Yeah I was kinda talking about that. Though afaik, only the 3070Ti and higher use GDDR6X, the others use GDDR6.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
The only option for Turing that I know of was the 15.5 Gbps GDDR6 on the 2080 Super. I'm assuming they had the option to go that way with the 2060 12GB if they wanted.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,499
146
Hmm, Nvidia has been making some really weird product choices lately. Less memory, then more memory, then less, then more, plus the whole LHR issue; it boggles the mind :/
With record revenue, they are experiencing the Cartman effect -


On a more serious note: LTT has a speculation vid about it. TLDW - the ram probably lets partners use the same or similar PCBs that they currently produce. The chip will yield more per wafer than something with a bigger die like a 2070Super. Tapping production that is not already fully booked is a smart move?
 
Reactions: Tlh97 and blckgrffn

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,499
146
Well, Nvidia are very good at capitalism, so more to sell at high profits (for an old card) sounds just about right. The enthusiast gamer market has about as much strength as a fart in the wind.
Worse yet, most are paying the big markup and bragging about it. r/PCMR constantly has posts of the younger crowd thinking they scored because they got their card from a Best Buy drop or MicroCenter. That undoubtedly has retail execs twirling their evil mustaches.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,062
136
www.teamjuchems.com
With record revenue, they are experiencing the Cartman effect -


On a more serious note: LTT has a speculation vid about it. TLDW - the ram probably lets partners use the same or similar PCBs that they currently produce. The chip will yield more per wafer than something with a bigger die like a 2070Super. Tapping production that is not already fully booked is a smart move?

Exactly. If they can use the 192-bit 2060 PCBs that seemingly never stopped production for OEMs and just juice it up with the full core and higher density RAM, then it's a win for them.

The die size seems to be the same from 2060 thru 2070S at least...


The 2060S might be the sweet spot for yield and power usage, especially on what are probably cheaper PCBs.


OK, more research and it looks like the TU106 was more commonly used for 2060 cards, even if there seems to have been a run of cut TU104's used for 2060 cards as well. My bad.

TU104 is about 25% larger than TU106.
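
If anyone wants to sanity check the yield point, here's the napkin math, using the commonly quoted die areas (~445 mm² for TU106, ~545 mm² for TU104). The wafer size and the gross-dies-per-wafer formula are just the usual rough approximations (no defect density, no scribe lines), not anything Nvidia has published:

```python
import math

# Classic gross-dies-per-wafer approximation: ignores defect density and scribe lines.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

tu106_mm2, tu104_mm2 = 445.0, 545.0   # commonly quoted Turing die areas
print(dies_per_wafer(tu106_mm2))      # ~127 candidate dies per 300mm wafer
print(dies_per_wafer(tu104_mm2))      # ~101 candidate dies per 300mm wafer
print(tu104_mm2 / tu106_mm2)          # ~1.22, i.e. TU104 is roughly a quarter larger
```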
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,499
146
OK, more research and it looks like the TU106 was more commonly used for 2060 cards, even if there seems to have been a run of cut TU104's used for 2060 cards as well. My bad.

TU104 is about 25% larger than TU106.
Yup, that was LTT's speculation on the reason for the choice. If accurate, it is smart biz. They also covered a point I have made. It should mean driver support for these cards for much longer than would originally be expected. Otherwise they will be abandoning owners that bought the product only a few years ago. Which they might; the Cartman effect is real.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,062
136
www.teamjuchems.com
Yup, that was LTT's speculation on the reason for the choice. If accurate, it is smart biz. They also covered a point I have made. It should mean driver support for these cards for much longer than would originally be expected. Otherwise they will be abandoning owners that bought the product only a few years ago. Which they might; the Cartman effect is real.

To be fair, I was more excited about the 2060S core vs the stock 2060 core, paired with 12GB of RAM for some longevity. You know, if it was slotted in UNDER the 3060. You know, like $280 MSRP and like ~$560 street value (of course). The 2060S core is a solid piece of kit and what I think the 2060 should have essentially always been. It was always a little too cut down IMO. Which everyone knows is very important :p

But in a world where value is determined by hashrate vs performance and feature set, well... it's not anything I need to pay attention to.

Even when the dust settles I will probably avoid the 2060 12GB under the assumption that it was mined on and has extra hundreds/thousands of hours on its fans, etc.
 
Reactions: DAPUNISHER

Ajay

Lifer
Jan 8, 2001
15,433
7,849
136
Worse yet, most are paying the big markup and bragging about it. r/PCMR constantly has posts of the younger crowd thinking they scored because they got their card from a Best Buy drop or MicroCenter. That undoubtedly has retail execs twirling their evil mustaches.
I'll have to take a look. I know my 26 y/o nephew was *happy* to get a 2080Ti from Micro-Center for ~$1800 o_O. That's more than twice as much as I paid for an older SLI system, when that was a thing. I had some hope for buying a new card sometime in 2023 - but the hashrate/watt is probably going to double so - IDK. I'll be buying games from 2020 or earlier for the next couple of years, just hoping Bitcoin implodes for at least a few months.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,499
146
I'll have to take a look. I know my 26 y/o nephew was *happy* to get a 2080Ti from Micro-Center for ~$1800 o_O. That's more than twice as much as I paid for an older SLI system, when that was a thing. I had some hope for buying a new card sometime in 2023 - but the hashrate/watt is probably going to double so - IDK. I'll be buying games from 2020 or earlier for the next couple of years, just hoping Bitcoin implodes for at least a few months.
I play a lot of games on Ryzen APUs, not because I have to, they just don't need any more power than that.
 
Reactions: Zepp

maddie

Diamond Member
Jul 18, 2010
4,739
4,668
136
I'll have to take a look. I know my 26 y/o nephew was *happy* to get a 2080Ti from Micro-Center for ~$1800 o_O. That's more than twice as much as I paid for an older SLI system, when that was a thing. I had some hope for buying a new card sometime in 2023 - but the hashrate/watt is probably going to double so - IDK. I'll be buying games from 2020 or earlier for the next couple of years, just hoping Bitcoin implodes for at least a few months.
Can't remember if I ever made this observation, but here goes. Shouldn't the whole gaming ecosystem start to suffer from this shortage/high price situation, if it continues? A few years of this and we're in new territory.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
What made the 2060 Super desirable was that it had the same memory bandwidth as the 2070 series, with the 2070 chip just cut down a bit internally. It could be made to boost well enough to keep close enough to the 2070 to be worth it.

I would think that miners would rather have the old 8GB 2060 Super, as mining seems to be memory bandwidth constrained once you are past 5.5GB. 12GB at 192 bits should make for worse mining than 8GB at 256 bits. However, the one bench I saw for this showed the opposite.
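
Quick back-of-the-envelope on the bandwidth, assuming 14 Gbps GDDR6 on both configurations (which is what the stock 2060/2060 Super shipped with):

```python
# GDDR6 bandwidth = per-pin speed (Gbps) * bus width (bits) / 8
def bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(14, 256))   # 2060 Super, 8GB @ 256-bit: 448.0 GB/s
print(bandwidth_gbs(14, 192))   # 2060 12GB @ 192-bit: 336.0 GB/s, ~25% less for the miner
```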
 

Ajay

Lifer
Jan 8, 2001
15,433
7,849
136
Can't remember if I ever made this observation, but here goes. Shouldn't the whole gaming ecosystem start to suffer from this shortage/high price situation, if it continues? A few years of this and we're in new territory.
Yes, I do wonder. Then again, it seems like they are selling well, even to gamers. I do wonder if those who stick with lower or middle tier cards are just... I typically buy around the $400-500 range. I'm pretty cheesed off right now and wonder how many are in the same boat. I could probably bump that up by $200 hoping to get 5 years out of it. I bought my current GPU ~4 years ago for $450, IIRC.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
Worse yet, most are paying the big markup and bragging about it. r/PCMR constantly has posts of the younger crowd thinking they scored because they got their card from a Best Buy drop or MicroCenter. That undoubtedly has retail execs twirling their evil mustaches.

- No kidding. So many "I scored a 3070 for $1100 MSRP!" posts on there, as though MSRP is the sticker price when the card is sitting on the shelf. FOMO is in maximum overdrive right now and people are paying absurd prices on hardware to play the next cookie cutter Ubisoft open world garbage.

If this keeps up I'll be sitting out of the rat race for a good long while.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,327
10,036
126
At least as long as APU prices don't explode ...
As long as it's pointless to "mine" on an APU's iGPU portion, I don't think you'll have anything to worry about. A 128-bit DDR4 memory bus (same thing as the GT 1030 "sucktastic edition") isn't really amenable to mining.

We'll see once DDR5 and DDR5-utilizing APUs (RDNA2-based) come out, whether those are suitable for mining at all. Better pray that they aren't.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
They'll be useless, or as close as possible to it, for mining as long as they can't break even with respect to purchase cost, energy usage, and mining output. APUs, unless they get some sort of integrated memory that is larger than the DAG, will not have sufficient memory bandwidth to compete with any dGPU, so long as their memory bandwidth is below 200GB/sec. The only APU that theoretically might be in danger is the M1 Max, and that's trapped in the Apple ecosystem. Infinity Cache won't really put APUs in danger either.

On top of all of that, you have to deal with the cost of thermal management. Using an APU for mining will generate a lot of heat, which will need a solid heat sink as well, which will also add to break even cost.

AMD, if they chose to, could make an APU that was essentially a 5800X with 64MB of stacked cache to operate as an Infinity Cache, a 20 CU RDNA2 iGPU, and a quad-channel DDR5 memory controller with only four actual DIMM slots to keep board costs down, and sell every one that they make at $200 over what they charge for the 5800X now. That would be a perfectly competent solution for high quality 1080p gaming for 90% of the market. They won't, of course, but they could. It wouldn't even require developing any new IP, save for a socket. And, as a bonus, even after all of that, it wouldn't be interesting to miners because, even with quad-channel DDR5, it would still suck at mining.
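
To put rough numbers on why even that hypothetical APU still falls under the ~200GB/sec line above (DDR5-4800 is just an assumed baseline speed here):

```python
# Peak DRAM bandwidth = transfer rate (MT/s) * total bus width (bytes); DDR5-4800 assumed
def dram_bandwidth_gbs(mt_per_s: float, bus_width_bits: int) -> float:
    return mt_per_s * 1e6 * (bus_width_bits / 8) / 1e9

print(dram_bandwidth_gbs(4800, 128))   # dual channel (128-bit): ~76.8 GB/s
print(dram_bandwidth_gbs(4800, 256))   # quad channel (256-bit): ~153.6 GB/s, still well under 200 GB/s
```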
 
Feb 4, 2009
34,555
15,771
136
Hear me out here,
The main reason to own a current $500 or greater card is games. I know there is VR and rendering and so on too, but those seem like either work things that generate money, like mining does, or a luxury like VR.
As of now it appears to me that people making games should code so all resources are used effectively, and focus more on "fun" vs "eye candy". There are many games you can play on an old card and have a very good experience. Basically, make games more fun and less pretty.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,465
20,499
146
The solution to all of this is decent desktop APUs...
Counter opinion - AMD already makes good desktop APUs. I can play well over 100 games in my personal library with a 5600G or 5700G, many of those with maxed out or high settings at 1080p. Forza Horizon 5 just came out and you can do better than a locked 30 at 1440p low with either APU.

If someone demands to be able to play Warzone, BF 2042, Tarkov, or the big AAA's at higher settings and refresh? I say pony up or shut up. Particularly applicable to our crowd i.e. financially secure, older, and evidently, salty. :p When most of the members here complain, I check it off as old man screams at clouds. "It's the principle of the thing! They ain't extorting me! I never paid more than X amount, and I ain't doing it now! I'll buy a console! I'll stop gaming, and they'll be sorry! 60fps is minimum! I'm a high refresh gamer! I play at 4K!" or whatever fist shaking at the clouds complaint is made. For me it translates to "Do you have any Grey Poupon?" Pay up if that is your bag, Mr. Fancy Pants. :D

Myself? I just started Halo Infinite. It is way too hardware demanding for the way it looks. I suspect it will be like MSFS and get a performance patch later that improves things quite a bit. But my 2070 Super is good enough. If all I had was my 5700G though, I'd make it work. Or I'd skip it and play older games I missed or was wanting to play again. I played the Master Chief Collection on the 5700G just fine, to get hyped for Infinite. There are many more. GTAV is still one of the most played games, and a 5700G can do 1080p high at above 60fps. I will stop there, since I think that illustrates my point well enough.

TLDR - If you don't think the 5 series APUs are decent, then it is my considered opinion that the problem is you, not the APUs. And if your contention revolves around hardware crushing modern games? My response is - pony up or shut up. Life ain't fair, and you are old enough to know that having caviar taste on a tuna budget is a character flaw. ;)

And this is not targeted at anyone in particular, despite the quote. I am strongly opinionated on this subject, that's all.