Question: 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed-up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90 fps?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Veradun

Senior member
Jul 29, 2016
The same could have been said of their CPU division pre-Zen launch

Actually, the CPU division at that time was in much worse shape than the GPU division is now.


Remember the internet is an echo chamber; I got a 5700 almost at launch, zero issues in the games I play
I got a 5700 with no issues too, replacing a 580, again with no issues. My self-imposed upper bound is 350 bucks though, so I had to wait a couple of months :>

I doubt it. It would only be suitable for smartphone SoCs, not 600mm²+ GPUs. High-end GPUs don't come close to the density limit of the node they use.
This density might explain the furnaces they are going to be, though. Maybe pushed too far? No clue.
 

kurosaki

Senior member
Feb 7, 2019
original recording. Feed an algorithm enough content and it can guess what a lower quality image should have looked like. This includes improving models and textures over the original.

The only picture that comes to mind is:
[attached image: dogs.png]

I know they are doing their best, but in reality they are guessing what the artist's work should look like, in order to present better performance figures.
 

Thala

Golden Member
Nov 12, 2014
But it will never look as good; the tradeoff is going to be a janky ride, from almost-nice upscaling to quite bad. We are cheating ourselves to higher framerates, but it looks good on paper though... "4K with RTX and DLSS 3.0" WOWZA! But all I hear is a fancy phrase for "1440p upscaled with a crappy image and higher FPS", while we make it look like it's all great features and clearly superior 4K perf. compared to the competition.

That's the thing - DLSS 2.0 looks close to or sometimes better than native! It is not just "on paper" and there is not much trade-off. This was analyzed in detail by Digital Foundry, and it is an understatement to call the technology impressive. They call the 540p->1080p reconstruction "shockingly similar" to native 1080p, for example, and 720p->1440p is even better.
You cannot just compare it with other upscaling methods, where you are losing significant detail.
 

kurosaki

Senior member
Feb 7, 2019
That's the thing - DLSS 2.0 looks close to or sometimes better than native! It is not just "on paper" and there is not much trade-off. This was analyzed in detail by Digital Foundry, and it is an understatement to call the technology impressive. They call the 540p->1080p reconstruction "shockingly similar" to native 1080p, for example, and 720p->1440p is even better.
You cannot just compare it with other upscaling methods, where you are losing significant detail.
No, instead Nvidia takes the liberty of letting their algorithms decide the details. I can't come up with a reason to turn a feature like that on.
 

Thala

Golden Member
Nov 12, 2014
No, instead Nvidia takes the liberty of letting their algorithms decide the details. I can't come up with a reason to turn a feature like that on.

Interesting - you cannot come up with a reason to switch on a feature which drastically improves the framerate with literally no IQ degradation - but you surely come up with a reason to buy a new GPU to increase the framerate... makes sense.
 

Stuka87

Diamond Member
Dec 10, 2010
... with literally no IQ degradation

This is false. I am not going to go and find all the screenshots and video to compare. Sure, DLSS 2 looks a lot better than the original. But the images displayed with DLSS are DIFFERENT from what you would see running at native resolution without DLSS. Which means you CANNOT claim there is no IQ degradation, because it's NOT the image the game makers intended you to see. Instead it's an interpretation that nVidia gives you, that you, and the developer, have zero control over.
 

Thala

Golden Member
Nov 12, 2014
Which means you CANNOT claim there is no IQ degradation, because it's NOT the image the game makers intended you to see. Instead it's an interpretation that nVidia gives you, that you, and the developer, have zero control over.

So even if the reconstructed image has more subpixel detail than the native image, there would be IQ degradation based on the argument that the developer did not want you to see that subpixel detail? Ok... very strange interpretation of IQ... but whatever floats your boat...

Besides, it is not an "interpretation that nVidia gives" but an interpretation by an algorithm trained on very high resolution, detailed images - the algorithm does not randomly add details; it draws on its knowledge of very high resolution content.
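
Roughly, that training works like the toy sketch below (my own simplification, not Nvidia's actual pipeline - real DLSS 2.0 also feeds in motion vectors and temporal history): take native high-resolution frames, downscale them, and train a network to reconstruct the original, so any "extra" detail at runtime comes from what the network learned from high-resolution content.

# Toy super-resolution trainer (a sketch under my own assumptions, using PyTorch).
# Not DLSS: no motion vectors, no temporal accumulation - just the core idea of
# learning to reconstruct native-resolution frames from downscaled ones.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 channels per colour = 2x2 upscale
            nn.PixelShuffle(2),                  # rearranges channels into 2x resolution
        )

    def forward(self, x):
        return self.net(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    hi = torch.rand(4, 3, 128, 128)  # stand-in for ground-truth "native" frames
    lo = F.interpolate(hi, scale_factor=0.5, mode="bilinear", align_corners=False)
    loss = F.l1_loss(model(lo), hi)  # learn to recover the native frame from the low-res one
    opt.zero_grad()
    loss.backward()
    opt.step()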
 

kurosaki

Senior member
Feb 7, 2019
[attached screenshots: dlss.JPG, dlss2.JPG]
Interesting - you cannot come up with a reason to switch on a feature which drastically improves the framerate with literally no IQ degradation - but you surely come up with a reason to buy a new GPU to increase the framerate... makes sense.
DLSS 2.0 is full of high-contrast sharpening. To eyes that have not spent hours working in Photoshop it might appear sharper than the original, but... The best way I can explain it is this: it's as if you took a photo and, before posting it on Instagram, went into the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges that become jagged; it's a type of AA that makes it look like the edges were stitched together with pluses (+x+x+x+x+).
For me, it's not worth the trade-off, but to each their own I guess!
 


Thala

Golden Member
Nov 12, 2014
View attachment 28992

DLSS 2.0 is full of high-contrast sharpening. To eyes that have not spent hours working in Photoshop it might appear sharper than the original, but... The best way I can explain it is this: it's as if you took a photo and, before posting it on Instagram, went into the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges that become jagged; it's a type of AA that makes it look like the edges were stitched together with pluses (+x+x+x+x+).
For me, it's not worth the trade-off, but to each their own I guess!

Sure, if you need to bring up a negative example, which by the way is highly zoomed, and totally ignore the rest of the comparison video, where you see greater detail with DLSS 2.0 and minimal ringing/sharpening artifacts, I would call that a very selective argument.
On the bright side, not many of the GPU-buying crowd will have this selective view.
 

Stuka87

Diamond Member
Dec 10, 2010
Sure, if you need to bring up a negative example, which by the way is highly zoomed, and totally ignore the rest of the comparison video, where you see greater detail with DLSS 2.0 and minimal ringing/sharpening artifacts, I would call that a very selective argument.
On the bright side, not many of the GPU-buying crowd will have this selective view.

You just posted about how there is no loss in quality. Somebody posts loss of quality, and you say people should ignore that? That "super zoomed in" shot is actually not that zoomed in. The text on the wall is huge and the issue can be seen when you are standing quite a ways from it (in game). Text issues are all over the place when DLSS is on.

EDIT: Fixed typo, forgot to add the word 'not' before 'that zoomed in'.
 

maddie

Diamond Member
Jul 18, 2010
So even if the reconstructed image has more subpixel detail than the native image, there would be IQ degradation based on the argument that the developer did not want you to see that subpixel detail? Ok... very strange interpretation of IQ... but whatever floats your boat...

Besides, it is not an "interpretation that nVidia gives" but an interpretation by an algorithm trained on very high resolution, detailed images - the algorithm does not randomly add details; it draws on its knowledge of very high resolution content.
The algorithm 'knows' better? More subpixel details? It's all imaginary. I suppose that artistic forgeries are better than the original also? Tell that to the artist.

I might get banned if I express what I really think.

Many years ago, as a hobby, I was exploring a 'sort of compression' that used AI to produce images from very little data. Sort of like telling someone to draw an image from a detailed description: a white plane with 2 engines on the wings, 40 windows per side, a T-tail, etc., etc. Each attempt produced a different image. It was roughly based on how human storytelling works and what we imagine internally.
 

Dribble

Platinum Member
Aug 9, 2005
You just posted about how there is no loss in quality. Somebody posts loss of quality, and you say people should ignore that? That "super zoomed in" shot is actually that zoomed in. The text on the wall is huge and the issue can be seen when you are standing quite a ways from it (in game). Text issues are all over the place when DLSS is on.
You did pick 1080p to 4K, which is a huge upscale with a correspondingly huge performance uplift.
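
For scale (quick arithmetic, nothing controversial): 1080p is only a quarter of the pixels of 4K, and even 1440p is well under half, which is where the big uplift comes from.

# Pixel counts behind the upscaling factors being discussed.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)  # 8,294,400 px
p1080     = pixels(1920, 1080)  # 2,073,600 px
p1440     = pixels(2560, 1440)  # 3,686,400 px

print(p1080 / native_4k)  # 0.25  -> 1080p renders 25% of a 4K frame's pixels
print(p1440 / native_4k)  # ~0.44 -> 1440p renders ~44% of a 4K frame's pixels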
 

USER8000

Golden Member
Jun 23, 2012
View attachment 28994
View attachment 28992

DLSS 2.0 is full of high-contrast sharpening. To eyes that have not spent hours working in Photoshop it might appear sharper than the original, but... The best way I can explain it is this: it's as if you took a photo and, before posting it on Instagram, went into the settings menu and cranked up the HDR and the contrast. There is also a pretty weird thing around edges that become jagged; it's a type of AA that makes it look like the edges were stitched together with pluses (+x+x+x+x+).
For me, it's not worth the trade-off, but to each their own I guess!

Yeah, I see the same. People who do not work with Photoshop etc. can't see the obvious problems. The sharpening is on edges, which makes it more appealing to the eye (like unsharp mask in Photoshop). These so-called reviewers are not proficient with image manipulation and cannot see this.
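
For anyone who hasn't used it, "unsharp mask" just adds back a scaled difference between the image and a blurred copy of itself, which boosts local contrast at edges and makes a picture read as "sharper" at first glance. A minimal sketch with Pillow/NumPy (only the classic filter being referred to, not anything DLSS actually does internally; the filename is just an example):

from PIL import Image, ImageFilter
import numpy as np

def unsharp_mask(img, radius=2.0, amount=1.0):
    # Classic unsharp mask: out = original + amount * (original - blurred)
    blurred = img.filter(ImageFilter.GaussianBlur(radius))
    orig = np.asarray(img).astype(np.float32)
    blur = np.asarray(blurred).astype(np.float32)
    out = np.clip(orig + amount * (orig - blur), 0, 255).astype(np.uint8)
    return Image.fromarray(out)

# Crank "amount" up and edges pick up the halo-like, high-contrast look described above.
# (example filename only)
# sharpened = unsharp_mask(Image.open("screenshot.png"), radius=2.0, amount=1.5)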
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
Gainward specs have leaked...

RTX 3090 Phoenix
  • CUDA cores: 5248
  • Clock speed: 1695 MHz (Boost)
  • Memory: 24GB GDDR6X
  • Memory clock: 9750 MHz
  • Bandwidth: 936 GB/s
  • PCIe: Gen 4
  • Max power consumption: 350W
  • Output: HDMI 2.1, DisplayPort 1.4a
RTX 3080 Phoenix
  • CUDA cores: 4352
  • Clock speed: 1710 MHz (boost)
  • Memory: 10GB GDDR6X
  • Memory clock: 9500 MHz
  • Bandwidth: 760 GB/s
  • PCIe: Gen 4
  • Max power consumption: 320W
  • Output: HDMI 2.1, DisplayPort 1.4a

I'm seriously LMFAOing at that 3090's GPU memory. 24GB on a single card, with 350W....
I'm also wondering when we'll start seeing more DP 1.4a monitors... we are given so many DP ports, yet almost none of them can do HDR over DP unless it's 1.4.
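
For what it's worth, the leaked bandwidth figures line up with the listed memory clocks if you assume a 384-bit bus on the 3090, a 320-bit bus on the 3080, and an effective data rate of twice the listed clock (the bus widths and rate multiplier are my assumptions, not part of the leak):

# Quick sanity check of the leaked bandwidth numbers.
def bandwidth_gb_s(mem_clock_mhz, bus_bits, rate_multiplier=2):
    data_rate_mt_s = mem_clock_mhz * rate_multiplier  # transfers per second per pin (MT/s)
    return data_rate_mt_s * bus_bits / 8 / 1000       # GB/s

print(bandwidth_gb_s(9750, 384))  # 936.0 -> matches the 3090's 936 GB/s
print(bandwidth_gb_s(9500, 320))  # 760.0 -> matches the 3080's 760 GB/s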
 

Saylick

Diamond Member
Sep 10, 2012
I'm still surprised at the mere 30W TDP difference between the two SKUs, even though they are clocked about the same, yet the 3090 has 20% more cores and more memory chips. I feel like the real TDP of the 3090 has to be closer to 375W...
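
A very crude back-of-the-envelope points the same way (every number in it is my assumption: GPU power scaling roughly with core count at these near-identical clocks, and a couple of watts per GDDR6X package):

# Crude estimate, not a power model.
cores_3080, cores_3090 = 4352, 5248
watts_per_mem_chip = 2.0                            # assumed per GDDR6X package
gpu_w_3080 = 320 - 10 * watts_per_mem_chip          # ~300W left for the GPU itself
gpu_w_3090 = gpu_w_3080 * cores_3090 / cores_3080   # scale the GPU share by core count
est_3090 = gpu_w_3090 + 24 * watts_per_mem_chip
print(round(est_3090))                              # ~410W by this crude scaling, vs the listed 350W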
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
I'm still surprised at the mere 30W TDP difference between the two SKUs, even though they are clocked about the same, yet the 3090 has 20% more cores and more memory chips. I feel like the real TDP of the 3090 has to be closer to 375W...

Yeah, that's why I was saying LMFAO... I also do not believe the 3090 can be 350W with more than double the RAM and almost 1,000 more CUDA cores.
 

CakeMonster

Golden Member
Nov 22, 2012
HDMI 2.1 with all its features had better be supported on all upcoming monitors, because without DP 2.0 support, HDMI will have to do for the long-awaited better monitor specs. I'd love to see above 120Hz on 4K-8K monitors soon. The DP 2.0 spec has been finished for a long while; I wonder why they couldn't put it on this gen...
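
Rough numbers on why HDMI 2.1 has to carry this for now (ballpark only: blanking overhead is ignored, and the link payload figures are the usual quoted ones - roughly 25.9 Gbps usable on DP 1.4a HBR3 after 8b/10b encoding, roughly 42.7 Gbps usable on HDMI 2.1 48G FRL after 16b/18b):

# Approximate uncompressed video bandwidth, ignoring blanking overhead.
def video_gbps(width, height, refresh_hz, bits_per_pixel=30):  # 30 = 10-bit RGB
    return width * height * refresh_hz * bits_per_pixel / 1e9

dp14a_payload  = 25.92  # Gbps usable on DP 1.4a (HBR3, 8b/10b)
hdmi21_payload = 42.67  # Gbps usable on HDMI 2.1 (48G FRL, 16b/18b)

print(video_gbps(3840, 2160, 120))  # ~29.9 Gbps -> beyond DP 1.4a without DSC, fine on HDMI 2.1
print(video_gbps(3840, 2160, 144))  # ~35.8 Gbps -> still fits HDMI 2.1 uncompressed
print(video_gbps(7680, 4320, 60))   # ~59.7 Gbps -> needs DSC (or DP 2.0) even over HDMI 2.1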