Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, or at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

dr1337

Senior member
May 25, 2020
The post you quoted already gives one reason: TAA.
DLSS looking better than TAA isn't saying much. I've yet to see a good comparison of DLSS quality vs. real native 4K without any TAA/smeary AA filters. I will say 100% that DLSS is better than TAA in every comparison I've seen, but I'm really not sure that it's actually better than native. TAA at 4K is a bit silly anyway, since you need less AA as the resolution increases.

And if DLSS isn't as good as native, that raises the question: would it not be better to have more shaders instead of tensor cores?
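
If anyone ever does that comparison properly, here is a minimal sketch of how I'd score it - assuming two pixel-aligned captures of the same frame (the filenames are made up):

Code:
# Rough sketch of the objective DLSS-vs-native comparison I'd like to see.
# Assumes two pixel-aligned screenshots of the same frame; "native_4k.png"
# and "dlss_4k.png" are made-up filenames.
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

native = np.asarray(Image.open("native_4k.png").convert("RGB"))
dlss = np.asarray(Image.open("dlss_4k.png").convert("RGB"))

# PSNR measures raw pixel error; SSIM measures perceived structural
# similarity (1.0 = identical).
print("PSNR:", peak_signal_noise_ratio(native, dlss))
print("SSIM:", structural_similarity(native, dlss, channel_axis=-1))

The catch: both metrics score deviation from the native frame, so they can only show "different", never "better" - which is exactly why the better-than-native claim is so hard to pin down.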
 

alcoholbob

Diamond Member
May 24, 2005
The numbers are marketing. TSMC 12nm was 16nm. GlobalFoundries 12nm was 14nm.

NV being on 8nm would be 1nm behind AMD.

And the density of 12/14/16nm was pretty close to 20nm. Funnily enough, this process of minor improvements and made-up node numbers managed to actually leapfrog the more honest Intel.
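
To put rough numbers on it (approximate public logic-density figures, quoted from memory - treat them as ballpark, not gospel):

Code:
# Approximate logic densities in MTr/mm^2, from memory -- ballpark only,
# but they show how little the "nm" labels track actual density.
# (TSMC 20nm was roughly the same density as 16/12nm, per the point above.)
density = {
    "TSMC 16/12nm": 28.9,
    "Intel 14nm": 37.5,
    "Samsung 8nm": 61.2,
    "TSMC 7nm": 91.2,
    "Intel 10nm": 100.8,
}
base = density["TSMC 16/12nm"]
for node, mtr in density.items():
    print(f"{node:12s} {mtr:6.1f} MTr/mm^2 ({mtr / base:.2f}x vs 16/12nm)")

Note how Intel's 10nm comes out denser than TSMC's "7nm" - that's the naming mismatch I mean.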
 

Veradun

Senior member
Jul 29, 2016
The same could have been said of their CPU division pre-Zen launch

Actually, the CPU division at that time was in much worse shape than the GPU division is now.


Remember, the internet is an echo chamber. I got a 5700 almost at launch, zero issues in the games I play
I got a 5700 with no issues too, replacing a 580, again with no issues. My self-imposed upper bound is 350 bucks though, so I had to wait for a couple of months :>

I doubt it. It would only be suitable for smartphone SoCs, not 600mm²+ GPUs. High-end GPUs don't come close to the density limit of the node they use.
This density might explain the furnaces they are going to be, though. Maybe pushed too far? No clue.
 

kurosaki

Senior member
Feb 7, 2019
...original recording. Feed an algorithm enough content and it can guess what a lower-quality image should have looked like. This includes improving models and textures over the original.

The only picture that comes to mind is:
[Image: dogs.png]

I know they are doing their best, but in reality they are guessing what the artist's work should look like, in order to present better performance figures.
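
To be concrete about the "guessing": it is supervised super-resolution training, roughly like the toy sketch below. This is my own illustration, not Nvidia's actual pipeline - real DLSS 2.0 also consumes motion vectors and temporal history:

Code:
# Toy super-resolution training loop in PyTorch -- the generic "learn to
# guess high-res detail from low-res input" idea, NOT Nvidia's actual
# DLSS pipeline (which also feeds in motion vectors / temporal data).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySR(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.scale = scale
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, lr):
        # Upscale naively first, then let the network add the "guessed" detail.
        up = F.interpolate(lr, scale_factor=self.scale, mode="bilinear",
                           align_corners=False)
        return up + self.body(up)

model = TinySR()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    hr = torch.rand(8, 3, 64, 64)        # stand-in for real high-res frames
    lr_img = F.avg_pool2d(hr, 2)         # fake the low-res render
    loss = F.l1_loss(model(lr_img), hr)  # reward reconstructing the target
    opt.zero_grad()
    loss.backward()
    opt.step()

The network can only reproduce whatever its training data rewarded - which is my point: the output is a statistical guess at the artist's work, not the artist's work itself.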
 

Thala

Golden Member
Nov 12, 2014
But it will never look as good; the trade-off is going to be a janky ride, from almost-nice upscaling to quite bad. We are cheating ourselves to higher framerates, but it looks good on paper though... "4K with RTX and DLSS 3.0", WOWZA! But all I hear is a fancy phrase for "1440p upscaled with a crappy image and higher FPS", while we make it look like it's all great features and clearly superior 4K performance vs. the competition.

That's the thing - DLSS 2.0 looks close to, or sometimes better than, native! It is not just "on paper" and there is not much trade-off. This was analyzed in detail by Digital Foundry, and it is an understatement to call the technology impressive. They call the 540p->1080p reconstruction "shockingly similar" to native 1080p, for example, and 720p->1440p is even better.
You cannot just compare it with other upscaling methods, where you are losing significant detail.
 

kurosaki

Senior member
Feb 7, 2019
That's the thing - DLSS 2.0 looks close to, or sometimes better than, native! It is not just "on paper" and there is not much trade-off. This was analyzed in detail by Digital Foundry, and it is an understatement to call the technology impressive. They call the 540p->1080p reconstruction "shockingly similar" to native 1080p, for example, and 720p->1440p is even better.
You cannot just compare it with other upscaling methods, where you are losing significant detail.
No, instead Nvidia takes the liberty of letting its algorithms decide the details. I can't come up with a reason to turn a feature like that on.
 

Thala

Golden Member
Nov 12, 2014
No, instead Nvidia takes the liberty of letting its algorithms decide the details. I can't come up with a reason to turn a feature like that on.

Interesting - you cannot come up with a reason to switch on a feature that drastically improves the framerate with literally no IQ degradation, but you can surely come up with a reason to buy a new GPU to increase the framerate... makes sense.
 

Stuka87

Diamond Member
Dec 10, 2010
... with literally no IQ degradation

This is false. I am not going to go and find all the screenshots and videos to compare. Sure, DLSS 2 looks a lot better than the original. But the images displayed with DLSS are DIFFERENT from what you would get running at native resolution without DLSS. Which means you CANNOT claim there is no IQ degradation, because it's NOT the image the game makers intended you to see. Instead it's an interpretation that nVidia gives you, one that you, and the developer, have zero control over.
 

Thala

Golden Member
Nov 12, 2014
Which means you CANNOT claim there is no IQ degradation, because it's NOT the image the game makers intended you to see. Instead it's an interpretation that nVidia gives you, one that you, and the developer, have zero control over.

So even if the reconstructed image has more subpixel detail than the native image, there would be IQ degradation based on the argument that the developer did not want you to see that subpixel detail? OK... a very strange interpretation of IQ... but whatever floats your boat.

Besides, it is not an "interpretation that nVidia gives" but the output of an algorithm trained on very high resolution, detailed images - it does not add details at random; it draws on its knowledge of very high resolution content.
 

kurosaki

Senior member
Feb 7, 2019
[Attached screenshots: dlss2.JPG, dlss.JPG]
Interesting - you cannot come up with a reason to switch on a feature that drastically improves the framerate with literally no IQ degradation, but you can surely come up with a reason to buy a new GPU to increase the framerate... makes sense.
DLSS 2.0 is full of high-contrast sharpening. It might look sharper than the original to eyes that have not spent hours working in Photoshop, but... The best way I can explain it: it's as if you took a photo and, before posting it on Instagram, went into the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges, which become jagged; it's a type of AA that makes the edges look like they were stitched together with pluses (+x+x+x+x+).
For me, it's not worth the trade-off, but to each their own, I guess!
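
The halo effect I mean is classic unsharp-mask overshoot. A tiny sketch of my own showing it on a step edge (obviously not Nvidia's actual filter, just the generic mechanism):

Code:
# Generic 1-D unsharp mask on a hard edge -- my own toy example, not
# DLSS's actual filter -- showing the overshoot ("ringing") that reads
# as dark/bright halos around high-contrast edges.
import numpy as np
from scipy.ndimage import gaussian_filter1d

edge = np.concatenate([np.zeros(20), np.ones(20)])  # hard dark->bright edge
blurred = gaussian_filter1d(edge, sigma=2)
sharpened = edge + 2.0 * (edge - blurred)           # unsharp mask, amount = 2

# Anything below 0 or above 1 is overshoot: the fringes you see at edges.
print("min:", sharpened.min(), "max:", sharpened.max())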
 


Thala

Golden Member
Nov 12, 2014
1,355
653
136
View attachment 28992

DLSS 2.0 is full of high-contrast sharpening. It might look sharper than the original to eyes that have not spent hours working in Photoshop, but... The best way I can explain it: it's as if you took a photo and, before posting it on Instagram, went into the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges, which become jagged; it's a type of AA that makes the edges look like they were stitched together with pluses (+x+x+x+x+).
For me, it's not worth the trade-off, but to each their own, I guess!

Sure, you bring one negative example, which by the way is highly zoomed, while totally ignoring the rest of the comparison video, where you see greater detail with DLSS 2.0 and minimal ringing/sharpening artifacts - I would call that a very selective argument.
On the bright side, not many of the GPU-buying crowd will share this selective view.
 