
'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for; I'm just interested in the forum members' thoughts.
 
I'd guess $1,200 MSRP. It's basically a 3090-capable chip without the need for memory modules on both sides of the board.

It will be interesting to see if they change anything else like going with faster RAM or running the clocks closer to (or maybe above) 3080 levels.
 
This design makes more sense given the memory shortages, but would be less attractive than the 20GB version they talked about before. The 6900XT would remain competitive versus this.

There are actually some current games that use 12-16GB at 4K (the latest MSI Afterburner shows actual memory usage), especially ones with large texture packs. In VR, 14GB or so is typical, and some games like HL Alyx hit 20-22GB. It's surprising how memory-hungry VR is in general, but for a card costing this much I don't think 12GB is enough.
 
That's the option I expected from the start as it makes more sense. It's less money for memory, but higher bandwidth, and you don't need VRAM on both sides of the PCB.

You do need to use a chip that almost certainly could have gone into a 3090 though. If GDDR6X supply is the bottleneck then less VRAM and a better chip makes sense. They could still do the 20 GB model later when/if 2 GB chips arrive.
 
This will be even harder to buy than the 3080, since it'll be the perfect mining card as you'll save 50+ watts off the 3090 by not having the extra 12 GB of memory.

Videocardz has an article about Chinese mining farms buying Ampere laptops now. Those chips are better binned than the 3070 desktop I bet.
 
Videocardz has an article about Chinese mining farms buying Ampere laptops now. Those chips are better binned than the 3070 desktop I bet.
Ampere... laptops... for mining??? GEEZUS, this is CRAZY-town we're in. Mining crazy-town.

(I'm still using an assortment of RX 5700XT and RX 5600XT, and GTX 1660 ti cards for mining, you know, all the ones that the gamers shirked. WFM.)
 
You do need to use a chip that almost certainly could have gone into a 3090 though. If GDDR6X supply is the bottleneck then less VRAM and a better chip makes sense. They could still do the 20 GB model later when/if 2 GB chips arrive.

Or a chip that could have gone into a 3080. Better to use 12 memory chips than 20.
 
Or a chip that could have gone into a 3080. Better to use 12 memory chips than 20.

Given the price difference between the 3080 and 3090, it probably doesn't make any difference to Nvidia or it actually works out better to sell a 3090.

The old rumors for the 3080 Ti pointed to a chip with disabled memory controllers, so it would have been a 3080 anyway, but one using twice as much memory. That's obviously not ideal, even at a higher price, if it limits the total number of cards that can be produced.

Now it's basically a 3090, since none of the memory controllers are disabled. Whatever shaders are disabled may have been turned off simply to allow for higher clocks. However, it does use half the memory, so at $1,200 Nvidia makes more money just because more cards can be sold.

The really interesting possibility is that it uses the faster 21 Gbps modules that Micron has. Even if they run hotter, having them on only a single side of the board makes them vastly easier to cool. Just give the card the same power budget as the 3090 and, even at $1,200, people will line up to buy it.
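The gain from those faster modules is straightforward to estimate. A minimal sketch, assuming a 384-bit bus like the 3090's and Micron's quoted per-pin data rates:

```python
def gddr6x_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width times per-pin rate, over 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

# 384-bit bus: current 19.5 Gbps modules vs the faster 21 Gbps parts
print(gddr6x_bandwidth(384, 19.5))  # 936.0 GB/s (3090-class)
print(gddr6x_bandwidth(384, 21.0))  # 1008.0 GB/s
```

That's roughly an 8% bandwidth bump from the module swap alone.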
 

Mining is so popular in Iran it's causing power outages. Has several pics of rows of Ampere cards and boxes.
 

Mining is so popular in Iran it's causing power outages. Has several pics of rows of Ampere cards and boxes.

It's probably also popular because trade sanctions imposed by other countries don't affect miners at all and any economic idiocy that the Iranian government might do itself won't hurt it much either. The cheap electricity is just icing on the cake.
 
So that's where all the video cards have gone. At this point mining is dominated by players who can do it at scale like this. If you can only get one or two cards, it's not worth the trouble, unless you're buying the cards for games and mining on the side. It's also why the 3090 is somewhat easier to find these days than the cheaper cards: the price makes it less attractive for miners.
 
Now it's basically a 3090, since none of the memory controllers are disabled. Whatever shaders are disabled may have been turned off simply to allow for higher clocks. However, it does use half the memory, so at $1,200 Nvidia makes more money just because more cards can be sold.

I would bet that the majority of 3080s right now have ZERO faulty memory controllers.

Most cut-down parts are not faulty; they are market segmentation. That segmentation does let you recover the few parts with genuinely defective sections, but most of them are fully functional.

So yeah, a 12GB 3080 Ti could likely be a full 3090, but so could most 10GB 3080s.
 
I would bet that the majority of 3080s right now have ZERO faulty memory controllers.

That's certainly possible, but it really depends a lot on the defect density of the Samsung node and how much space the memory controllers take up, which I'd guess is around 15% of the area.

There was some sentiment that the Samsung node wasn't as mature as people had hoped. That could mean the defect density isn't as low as TSMC's, or it could mean the node's characteristics weren't letting Nvidia push clocks as much as they wanted. Given the size of GA102, it's possible that over half of the dies have at least one defect. Of those, we'd expect about 1 in 6 to have a defective memory controller.
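Those estimates can be sketched with a simple Poisson yield model. The die area is roughly public (GA102 is about 628 mm²), but the defect density and the 15% memory-controller area share are assumptions for illustration only:

```python
import math

die_area_cm2 = 6.28        # GA102 is roughly 628 mm^2
defect_density = 0.12      # defects per cm^2 (hypothetical for Samsung's node)
mc_area_fraction = 0.15    # memory controllers ~15% of die, per the guess above

# Poisson model: probability a die escapes with zero defects
clean = math.exp(-die_area_cm2 * defect_density)
defective = 1 - clean      # ~53%: "over half the dies have at least one defect"

# If defects strike uniformly across the die, the share of defective dies
# whose defect landed in a memory controller matches the area fraction: ~1 in 6-7
print(f"dies with a defect: {defective:.0%}; of those, bad MC: {mc_area_fraction:.0%}")
```

With these placeholder inputs the model lands close to the estimates in the post; a different assumed defect density shifts the numbers accordingly.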

Of course, there's also the performance of functional silicon. Not everything that works performs within specification, but that's far more difficult to gauge without more information. Many 3080s with no faulty memory controllers probably still don't have enough SMs that function at all, or that perform well enough, for the card to be sold as a 3090, so it's probably not quite that simple.

And while normally that is true, right now you'd want to make as many of the high-end part as possible because you'll sell anything you've got, even well above MSRP. In that case, exhaust the part of the market that will pay for a $1500 GPU before you start disabling hardware to better align with what the market will typically bear.
 
Many 3080s with no faulty memory controllers probably still don't have enough SMs that function at all, or that perform well enough, for the card to be sold as a 3090, so it's probably not quite that simple.

No reason to expect that level of issues with SMs either. People believe it works like this but typically it doesn't.

You have companies like Apple that make hundreds of millions of chips on leading-edge processes, and yet they seldom do any segmentation. They aren't tossing most of their wafers in the garbage.

The majority of parts are good. Segmentation with disabled units is primarily marketing, and as a bonus lets you recover the minority of parts with faulty units.
 
Apple's SoCs are much smaller, so the percentage of defective dies is much lower. Their last 7nm SoC was around the same size as one of AMD's Zen 3 chiplets. If you look at the dies, around half of the space is taken up by fixed-function units rather than CPU or GPU cores. If any of that is defective, they have to throw the chip out, because it wouldn't be able to decode video or perform whatever function the damaged block handled.

Since the cost per unit is so low and they make so much on their hardware, there's little pressure to salvage parts with defects elsewhere in the chip either. Sure, they could, but it would be a small percentage of the chips they make, and they'd have to disable a lot of cores to cover every case: some defective chips would have all CPU cores working but a non-functional GPU core, and others the opposite. The resulting salvage bin would need to account for a defect anywhere in the non-fixed-function hardware. And since their products all tend to be high volume, they'd have to disable perfectly functional chips just to have enough of the crippled ones to sell that product. It's probably not worth the hassle.
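The die-size argument can use the same Poisson sketch: at an identical (placeholder) defect density, a small SoC yields far more clean dies than a large GPU. The areas below are rough public figures, and the defect density is a stand-in:

```python
import math

def clean_yield(area_mm2: float, d0_per_cm2: float = 0.12) -> float:
    """Poisson fraction of dies with zero defects at defect density d0."""
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

small_soc = clean_yield(98.5)   # Apple A13-class die, ~Zen 3 chiplet size
big_gpu   = clean_yield(628.0)  # GA102-class die
print(f"small die clean: {small_soc:.0%}, large die clean: {big_gpu:.0%}")
```

Roughly 89% vs 47% clean dies under the same assumptions, which is why salvage segmentation matters far more for big GPUs than for phone SoCs.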
 

Another Videocardz post about laptop mining, with more pictures of another farm.
 

Another Videocardz post about laptop mining, with more pictures of another farm.
I expected to see the underside access panel removed to reduce throttling. Would be a pain to re-install them for resale, I guess.
 
They should be since it's supposed to be releasing today. No idea exactly what time the embargo drops, but I'd suspect sometime within the next few hours.

Edit: Looks like they're starting to drop. TPU has just posted a few different cards:
https://www.techpowerup.com/review/msi-geforce-rtx-3060-gaming-x-trio/
https://www.techpowerup.com/review/zotac-geforce-rtx-3060-amp-white-edition/
https://www.techpowerup.com/review/evga-geforce-rtx-3060-xc/
https://www.techpowerup.com/review/palit-geforce-rtx-3060-dual-oc/

GN video is out as well:

 
So I don't want to start one because I don't care enough to maintain it, but I'm really surprised there's so little interest in the 3060 that no one's even bothered to make a launch/review thread.
 
They should be since it's supposed to be releasing today. No idea exactly what time the embargo drops, but I'd suspect sometime within the next few hours.

Edit: Looks like they're starting to drop. TPU has just posted a few different cards:
https://www.techpowerup.com/review/msi-geforce-rtx-3060-gaming-x-trio/
https://www.techpowerup.com/review/zotac-geforce-rtx-3060-amp-white-edition/
https://www.techpowerup.com/review/evga-geforce-rtx-3060-xc/
https://www.techpowerup.com/review/palit-geforce-rtx-3060-dual-oc/

GN video is out as well:


After reading/watching reviews it seems fairly lackluster. About RTX 2070 performance.

If cards were selling at MSRP, the 3060 Ti would deliver better FPS/$. Usually lower-end cards deliver better FPS/$ than higher-end cards, so in normal times this would have been something of a letdown.

But crazy times, and if you can get one near MSRP it would be seen as a victory.
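The FPS/$ comparison is just a ratio. The MSRPs below are the real launch list prices, but the average-FPS figures are hypothetical stand-ins, since actual numbers vary by review and game suite:

```python
# MSRPs are launch list prices; the FPS numbers are hypothetical placeholders.
cards = {
    "RTX 3060":    {"msrp": 329, "avg_fps": 60.0},
    "RTX 3060 Ti": {"msrp": 399, "avg_fps": 78.0},
}

for name, c in cards.items():
    value = c["avg_fps"] / c["msrp"] * 100  # FPS per $100 of MSRP
    print(f"{name}: {value:.1f} FPS per $100")
```

With these placeholder numbers the 3060 Ti comes out ahead on FPS/$, the reversal of the usual cheaper-card-wins pattern described above.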
 