Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
559
292
136
How much gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if imprudent/uncalled for, just interested in the forum members' thoughts.
 

Ajay

Lifer
Jan 8, 2001
15,428
7,847
136
Oh, and I think things will get better by this summer. Chinese New Year will have passed, new COVID cases should be down, and sales tend to be slower in the summer (unless Ethereum prices skyrocket).
 

uzzi38

Platinum Member
Oct 16, 2019
2,620
5,870
146
Man, you guys are all worried right now, I'm absolutely frightened for next gen. If rumours hold true we'll have AMD and Nvidia both using N5 for GPUs, AMD using it for CPUs and Apple still using the node as well.
 

Ajay

Lifer
Jan 8, 2001
15,428
7,847
136
Man, you guys are all worried right now, I'm absolutely frightened for next gen. If rumours hold true we'll have AMD and Nvidia both using N5 for GPUs, AMD using it for CPUs and Apple still using the node as well.
Well, Apple will be fine - they are the only company willing to buy a lot of risk wafers, essentially underwriting TSMC's process development efforts. So, they get whatever they need. The rest, well, unless Samsung can get its act together, it's going to be a lean couple of years. I really do wonder if Intel and Samsung will get together to boost each other's fortunes. Separately, they just can't catch TSMC.
 

reb0rn

Senior member
Dec 31, 2009
221
58
101
All 3080 and 3090 cards are defective under any real memory load; HWiNFO now shows GDDR6X temperature, and under heavy load it goes over 100°C.

The info was hidden and not disclosed, and no review site even tried to test memory-OC'd cards! Just imagine how defective and throttle-prone they'll be in summer!
 

Mopetar

Diamond Member
Jan 31, 2011
7,831
5,979
136
High temperatures don't mean something is defective. Unless it gets so hot it starts to melt the solder (really unlikely) or cause other issues like warping the board, it's not really a problem.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,112
136
I am beginning to hate GPUs. We had the 2017 mining boom that drove prices insane, followed by the extremely expensive Turing cards, followed by the unavailable and heavily scalped Ampere/RDNA2 cards, and now we are getting another mining boom...

I am starting to think Indie gaming on iGPUs is the way forward for PC gaming. If you want AAA gaming, get a console.

- I think I'm going to be replacing my 980ti with an RTX 6030/RX 9300XT at this rate if I want to get any kind of performance bump without blasting a Hiroshima-sized hole in my wallet.

980Ti 4 LYFE. Seriously, it's starting to look like the last GPU I'll ever purchase at this rate.
 
  • Like
Reactions: lightmanek

reb0rn

Senior member
Dec 31, 2009
221
58
101
High temperatures don't mean something is defective. Unless it gets so hot it starts to melt the solder (really unlikely) or cause other issues like warping the board, it's not really a problem.

The cards hit that temp all the time under heavy use; the memory chips will start degrading in ~1-2 years, and most will degrade and die after 3+.
SO it IS DEFECTIVE :D
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
- I think I'm going to be replacing my 980ti with an RTX 6030/RX 9300XT at this rate if I want to get any kind of performance bump without blasting a Hiroshima-sized hole in my wallet.

980Ti 4 LYFE. Seriously, it's starting to look like the last GPU I'll ever purchase at this rate.

So true, speaking from a 290X... I'm really glad I saw this coming 4-5 years ago when I bought a 1080p 144Hz display instead of going 2K or 4K. GPU pricing was my reasoning. With 1080p, mid-range to low-end cards will be fine in the future, and for older or more efficient games you get the 144Hz benefit. 2K at 120Hz is already about as taxing as 4K at 60Hz, so "high refresh gaming" is still really only viable at 1080p if one doesn't want to spend $1000 on a GPU every 2-3 years.
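As a rough sanity check on that comparison, here's a back-of-the-envelope pixel-throughput sketch (assuming plain 16:9 resolutions and the refresh rates mentioned above; actual GPU load obviously depends on the game and settings):

Code:
# Back-of-the-envelope pixel throughput for the modes discussed above.
# Real GPU load depends on the game and settings; this is only pixel math.
modes = {
    "1080p @ 144 Hz": (1920, 1080, 144),
    "1440p @ 120 Hz": (2560, 1440, 120),
    "2160p @ 60 Hz": (3840, 2160, 60),
}

for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} megapixels/second")

By that crude measure, 1440p at 120Hz (~440 MP/s) sits close to 4K at 60Hz (~500 MP/s), while 1080p at 144Hz (~300 MP/s) is much lighter.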
 
  • Like
Reactions: GodisanAtheist

JujuFish

Lifer
Feb 3, 2005
11,003
735
136
So true, speaking from a 290X... I'm really glad I saw this coming 4-5 years ago when I bought a 1080p 144Hz display instead of going 2K or 4K. GPU pricing was my reasoning. With 1080p, mid-range to low-end cards will be fine in the future, and for older or more efficient games you get the 144Hz benefit. 2K at 120Hz is already about as taxing as 4K at 60Hz, so "high refresh gaming" is still really only viable at 1080p if one doesn't want to spend $1000 on a GPU every 2-3 years.
2K is 1080p.
 
  • Like
Reactions: Heartbreaker

DeathReborn

Platinum Member
Oct 11, 2005
2,746
740
136
I was "2K" with CRT's @ 2304x1440 (Sony Trinitron FW900 that I still have), then spent nearly 15 years on "1K" 1080p before going "4K" 2160p & "2K" 1440p last year. DCI 4K resolution is true 4K to me (4096x2160).
 
  • Like
Reactions: lightmanek

sze5003

Lifer
Aug 18, 2012
14,182
625
126
I gave up looking for a new card. Sticking with the 1080ti and my 3440x1440 monitor for now. My goal is to buy a house/condo in the next 6 months, so I don't need to be spending as freely as I have been. Nor would I ever spend more than MSRP either way, seeing how GPU prices have increased already. I've got both next-gen consoles, and that should do nicely until stock levels return to normal.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
2K is 1080p.
Using 1080p/2K/4K for 1920x1080, 2560x1440 and 3840x2160 is incredibly common, even if there is no official single definition for 2K and 4K. It's obviously weird to round 3840 up to 4K but not 1920 up to 2K, but personally I don't actually mind it that much, because while it's a crappy way of describing horizontal resolution, 1440p has roughly 2x the pixels of FHD and 2160p has 4x.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,112
136
Using 1080p/2K/4K for 1920x1080, 2560x1440 and 3840x2160 is incredibly common, even if there is no official single definition for 2K and 4K. It's obviously weird to round 3840 up to 4K but not 1920 up to 2K, but personally I don't actually mind it that much, because while it's a crappy way of describing horizontal resolution, 1440p has roughly 2x the pixels of FHD and 2160p has 4x.

- Maybe it would be more accurate to have the unofficial nomenclature follow a 1x/2x/4x standard.

If you could put out the word to get the Secret Society of Computer Terminology Geeks together we can hash it out over some brews :D
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
Using 1080p/2K/4K for 1920x1080, 2560x1440 and 3840x2160 is incredibly common, even if there is no official single definition for 2K and 4K. It's obviously weird to round 3840 up to 4K but not 1920 up to 2K, but personally I don't actually mind it that much, because while it's a crappy way of describing horizontal resolution, 1440p has roughly 2x the pixels of FHD and 2160p has 4x.

People commonly get things wrong. That doesn't make them right, even when they have odd rationalizations to go with it.

2K = ~2000 horizontal pixels
4K = ~4000 horizontal pixels
8K = ~8000 horizontal pixels

That isn't even remotely challenging to get correct.

It would probably be better to refer to vertical resolution directly, as in 720p, 1080p, 1440p, 2160p, 4320p... or the whole resolution if it's an oddball one like 3440x1440.

If you want to use "K" as a lazy designation with 2560x1440 displays, it should be 2.5K.
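Reading "K" literally as approximate thousands of horizontal pixels, here's a quick sketch of where the common panels land (the name-to-resolution mapping is the usual one; the rounding noted in the comment reflects the rule of thumb above, not any official standard):

Code:
# "K" read as approximate thousands of horizontal pixels, per the rule of
# thumb above. Marketing labels then round: 1.9 -> "2K", 2.6 -> "2.5K"-ish,
# 3.8 -> "4K", 7.7 -> "8K".
resolutions = {
    "FHD (1920x1080)": 1920,
    "DCI 2K (2048x1080)": 2048,
    "QHD (2560x1440)": 2560,
    "UWQHD (3440x1440)": 3440,
    "UHD (3840x2160)": 3840,
    "DCI 4K (4096x2160)": 4096,
    "8K UHD (7680x4320)": 7680,
}

for name, width in resolutions.items():
    print(f"{name}: about {width / 1000:.1f}K horizontal")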
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
People commonly get things wrong. That doesn't make them right, even when they have odd rationalizations to go with it.

2K = ~2000 horizontal pixels
4K = ~4000 horizontal pixels
8K = ~8000 horizontal pixels

That isn't even remotely challenging to get correct.

It would probably be better to refer to vertical resolution directly, as in 720p, 1080p, 1440p, 2160p, 4320p... or the whole resolution if it's an oddball one like 3440x1440.

If you want to use "K" as a lazy designation with 2560x1440 displays, it should be 2.5K.
¯\_(ツ)_/¯
DCI doesn't consider 1920x1080 to be 2K either, even though DCI 2K has only a few percent more pixels than FHD. Ultimately, if you're going into a store asking for a 2K display, or typing "2K display" into Google and getting taken to 1440p displays, you're being pedantic saying "Acktually, 1920x1080 is closer to 2K than 2560x1440 is." It doesn't matter how many times you try to tell Gramma you're playing a Sega Genesis; if she tells you to get off your Nintendo and go mow the lawn, you best stop playing Sonic.

Edit: But you are right, 1080p and 1440p are the better way to describe them if given the choice. 2160p/4K is tough, because 4K is far and away the accepted nomenclature there. As for 8K, I'd imagine even most technically minded people couldn't give the actual resolution off the top of their head without doing the math.
 
Last edited:
  • Like
Reactions: beginner99

JujuFish

Lifer
Feb 3, 2005
11,003
735
136
Using 1080p/2K/4K for 1920x1080, 2560x1440 and 3840x2160 is incredibly common, even if there is no official single definition for 2K and 4K. It's obviously weird to round 3840 up to 4K but not 1920 up to 2K, but personally I don't actually mind it that much, because while it's a crappy way of describing horizontal resolution, 1440p has roughly 2x the pixels of FHD and 2160p has 4x.
You can call it common, but that doesn't make it not a mistake. 2K and 4K refer to the approximate number of horizontal pixels.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
740
136
People commonly get things wrong. That doesn't make them right, even when they have odd rationalizations to go with it.

2K = ~2000 horizontal pixels
4K = ~4000 horizontal pixels
8K = ~8000 horizontal pixels

That isn't even remotely challenging to get correct.

It would probably be better to refer to vertical resolution directly, as in 720p, 1080p, 1440p, 2160p, 4320p... or the whole resolution if it's an oddball one like 3440x1440.

If you want to use "K" as a lazy designation with 2560x1440 displays, it should be 2.5K.

Steve from Gamers Nexus agrees with us: 8K as we buy displays is marketing only (4:03 in the video).

 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
8K is only relevant for top-end TVs, but 5K-7K or so (6460x3240) comes up in VR. The 3090 is substantially faster than a 3080 at such resolutions (as discussed here), but it's still not fast enough for many VR games. There will be a role for even faster cards whenever they come out.
 

Tup3x

Senior member
Dec 31, 2016
959
942
136
HD, FHD, QHD and UHD. xxxxp doesn't tell you anything about width, so it's not necessarily accurate enough (unless we agree that it only applies to a 16:9 aspect ratio).
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
¯\_(ツ)_/¯
DCI doesn't consider 1920x1080 to be 2K either,

DCI doesn't consider 3840x2160 to be officially 4K either.

It doesn't change the point that the intent of all the "K" designations, from 1K to 8K, is the approximate number of horizontal pixels.

A lot of dumb people thinking that 2560 fits that pattern doesn't make them right. It just makes for a lot of dumb people.

I think there is ample evidence that we have an overabundance of dumb people, but that is not sufficient reason to yield to their incorrect view of reality.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
HD, FHD, QHD and UHD. xxxxp doesn't tell you anything about width, so it's not necessarily accurate enough (unless we agree that it only applies to a 16:9 aspect ratio).

Absolutely.

That way I can say I downgraded my resolution from a lower-quality WQXGA monitor to a higher-quality WUXGA one because of panel issues, but I plan on getting a WQHD monitor next, and everyone will know exactly what I am talking about. It's so intuitive. ;)

Maybe we could use megapixels, as is often done for digital cameras. It would actually be more useful from a GPU-load perspective, as it compares the number of actual pixels, not one horizontal or vertical measure.

I went from 4MP (WQXGA) to 2.3MP (WUXGA) and next up is likely 3.7MP (WQHD), but I think 8.3MP is overkill and don't get me started on 33MP.
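For reference, here's the same pixel arithmetic for the panel classes mentioned in this thread; a minimal sketch where the ratio column is just a naive pixel-count comparison versus FHD, not a measured benchmark:

Code:
# Megapixel counts for common panel classes, plus a naive pixel-count
# ratio versus 1080p. Pixel count is only a first-order proxy for GPU load.
panels = {
    "FHD (1920x1080)": (1920, 1080),
    "WUXGA (1920x1200)": (1920, 1200),
    "WQHD (2560x1440)": (2560, 1440),
    "WQXGA (2560x1600)": (2560, 1600),
    "UHD (3840x2160)": (3840, 2160),
    "8K UHD (7680x4320)": (7680, 4320),
}

fhd = 1920 * 1080
for name, (w, h) in panels.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {px / fhd:.1f}x FHD")

That lines up with the numbers above: WQXGA is ~4.1 MP, WUXGA ~2.3 MP, WQHD ~3.7 MP, UHD ~8.3 MP, and 8K a hefty ~33 MP.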
 
Last edited: