
Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 160

Mopetar

Diamond Member
Jan 31, 2011
6,119
2,939
136
Guess you guys will have to see how it pans out in the end.

Seems odd to call the lowest bin that only gets 30% of the dies the "average" bin.

If they're going to piss on my face and tell me it's raining they may as well call the bins "amazing", "better than best", and "holy @!$& this is impossibly good!"

I am kind of curious what the golden 3090 cards will sell for and how much a bump they'll offer. Maybe those are the ones that just get water blocks and pushed as far past unreasonable as the laws of physics allow for.

I'm not in the market for such a thing, but I do always like looking through the shop window at such parts.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,502
2,244
136
Kinda of article at this point. All chips will meet the specs Nvidia has put out will be discarded (or saved for a future lower spec card).
Did you leave some words out?

We'll have to see how it pans out once the reviewers and general public gets their hands on them.
 

Aikouka

Lifer
Nov 27, 2001
29,966
620
126
I've been considering the idea of an upgrade to the 30-series cards, but given that I have a 2080 Ti, it's likely that the 3080 or 3090 would be the only decent upgrade. ...and I'm sure some would raise a slight objection about losing some VRAM going from the 2080 Ti to the 3080. Although, it's hard to ignore that the 3090 doesn't really offer much over the 3080... outside of the much larger VRAM pool. I ran basic numbers on the jump in performance/hardware from the 3080 to the 3090 (via a table on Wikipedia), and the difference was only an average of +23.05%. (Removing the memory increase dropped that to 13.05%... ouch.) On the flip side, the jump from the 3070 to the 3080 was 36.24%, which was far closer to the increase in price of 40.08%.

Does anyone have some interesting napkin math or other guesstimates to performance between these cards?
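In the napkin-math spirit of the question above, here's one way to sketch it: average the relative jump across a few published launch specs. The spec values below are the public launch figures, but the choice of columns and the simple "average the deltas" method are my own assumptions, so the averages won't necessarily match the percentages quoted above.

```python
# Published launch specs (assumed columns: CUDA cores, VRAM in GB,
# memory bandwidth in GB/s, launch MSRP in $).
specs = {
    "3070": (5888, 8, 448, 499),
    "3080": (8704, 10, 760, 699),
    "3090": (10496, 24, 936, 1499),
}

def pct_jump(a: str, b: str) -> list[float]:
    """Percent increase in each spec column going from card a to card b."""
    return [100.0 * (hi - lo) / lo for lo, hi in zip(specs[a], specs[b])]

jumps = pct_jump("3070", "3080")
print([round(j, 2) for j in jumps])   # per-spec deltas; last entry is the price jump
print(round(sum(jumps[:-1]) / 3, 2))  # naive average over the hardware columns
```

The price jump from the 3070 to the 3080 comes out to about 40.08%, matching the figure in the post; the hardware average depends entirely on which columns you include.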
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
21,500
9,557
136
Newegg now has a couple of pre-release listings advertised; you can pick auto-notify, but no prices.
 
  • Like
Reactions: FatherMurphy

sze5003

Lifer
Aug 18, 2012
13,502
355
126
I've been considering the idea of an upgrade to the 30-series cards, but given that I have a 2080 Ti, it's likely that the 3080 or 3090 would be the only decent upgrade. ...and I'm sure some would raise a slight objection about losing some VRAM going from the 2080 Ti to the 3080. Although, it's hard to ignore that the 3090 doesn't really offer much over the 3080... outside of the much larger VRAM pool. I ran basic numbers on the jump in performance/hardware from the 3080 to the 3090 (via a table on Wikipedia), and the difference was only an average of +23.05%. (Removing the memory increase dropped that to 13.05%... ouch.) On the flip side, the jump from the 3070 to the 3080 was 36.24%, which was far closer to the increase in price of 40.08%.

Does anyone have some interesting napkin math or other guesstimates to performance between these cards?
The only reason I'm not considering a 3080 is for the vram being lower than my 1080ti. I am considering the 3090 because I've kept my current card for about 3 years now and would want to do the same with whatever card I upgrade to.
 

jpiniero

Diamond Member
Oct 1, 2010
9,939
2,282
136

Videocardz has some benchmarks.
 

exquisitechar

Senior member
Apr 18, 2017
490
564
136

Videocardz has some benchmarks.
Looks a bit weak in these games, but it's only a sample of two. Anyway, the performance/watt of Ampere is disappointing.
 

eek2121

Golden Member
Aug 2, 2005
1,228
1,285
136

Videocardz has some benchmarks.
Looks a bit weak in these games, but it's only a sample of two. Anyway, the performance/watt of Ampere is disappointing.
Of course. DLSS off and the performance advantage goes away. Somehow the card has landed 30-40% faster than the 2080ti. I am *shocked*. Not really. Wait for final benchmarks, of course.
 

JoeRambo

Golden Member
Jun 13, 2013
1,207
1,105
136
That the gap is lower with DLSS+RT on in SOTR is a little disturbing.
The sad reality is that we are bound by CPU power; look at Far Cry New Dawn - the CPU simply can't pump out more than ~130 frames per second.
The same could be happening with the DLSS tests: if there's a CPU bottleneck, DLSS can't increase framerates because the CPU isn't readying frames fast enough.

I expect 1440P to become the new 1080P with this new generation of cards.
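The CPU-bound point above can be expressed as a toy model: the delivered framerate is roughly the minimum of what the CPU and GPU can each sustain, so DLSS (which only raises the GPU-side rate) can't push past a CPU cap. The numbers below are illustrative, not measurements.

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames actually shown per second: the slower pipeline wins."""
    return min(cpu_fps, gpu_fps)

# Illustrative Far Cry New Dawn-style scenario: CPU tops out near ~130 fps.
cpu_cap = 130.0
print(delivered_fps(cpu_cap, 110.0))  # GPU-bound: 110.0
print(delivered_fps(cpu_cap, 180.0))  # DLSS lifts the GPU rate, but the CPU caps it: 130.0
```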
 

exquisitechar

Senior member
Apr 18, 2017
490
564
136
The sad reality is that we are bound by CPU power; look at Far Cry New Dawn - the CPU simply can't pump out more than ~130 frames per second.
The same could be happening with the DLSS tests: if there's a CPU bottleneck, DLSS can't increase framerates because the CPU isn't readying frames fast enough.
I think so too.
I expect 1440P to become the new 1080P with this new generation of cards.
And with DLSS, it's more like 4k is the new 1080p.
 
  • Haha
Reactions: GodisanAtheist

Aikouka

Lifer
Nov 27, 2001
29,966
620
126
The only reason I'm not considering a 3080 is for the vram being lower than my 1080ti. I am considering the 3090 because I've kept my current card for about 3 years now and would want to do the same with whatever card I upgrade to.
Yeah, the VRAM is what makes me shy away from the 3080 as well given that it's 10GB vs 11GB in my 2080 Ti. Albeit, I am mostly gaming at 2560x1440 (144Hz), but I can still use high-res texture packs. I remember trying out Final Fantasy XV with its texture pack, and it was routinely hitting up against the 2080 Ti's 11GB of VRAM. Now, as others have alluded to, issues like that can be a problem with the game's engine itself, and we can't be too sure how well FFXV was coded. (I'm sure most can recall the game's woolly cow framerate debacle.) Given that the 3080 has around +40% over the 3070, it seems like it should've really had 12GB of VRAM.

In regard to the 3090, if we ignore its pricing for the moment, I really don't need its insane amount of VRAM. It's sort of like a Goldilocks situation where 10GB is just a little too low, and 24GB is just too unnecessary. (I think something like 8GB, 12GB, and 16GB would've made a lot more sense.) It almost seems more like an awkwardly rebranded Titan. The one thing that I do worry about with the 3090 is that we'll have another price change situation on our hands. I can't remember which card it was, but Nvidia priced it a bit high... until AMD released a competitive card, and Nvidia lowered the MSRP. If Big Navi pans out as well as some think, could we end up with a similar situation?
 

IntelUser2000

Elite Member
Oct 14, 2003
7,586
2,437
136
If it is good at mining I might just pick one up. Who cares if it's a power hog that eats up 300W when I can just run it 24/7 instead of turning on the heat.
There's no way it'll be 3x the performance of RTX 2080 in Eth when memory bandwidth only improves by 70%. The best is still Radeon VII.

The sad reality is that we are bound by CPU power, look at Far Cry New Dawn - CPU simply can't pump more than ~130 frames per second.
Exactly. New GPUs are about increasing frame rates in situations where they weren't so good. So the overall frame rate should also be more stable.

Compare the 3080 to 2080 in Quake 1 and the gains might be zero.

I hate miners for what they have done to the video card industry.
The fault lies with the massive mining farms, not with individuals buying a few at most.
 
Last edited:

sze5003

Lifer
Aug 18, 2012
13,502
355
126
Yeah, the VRAM is what makes me shy away from the 3080 as well given that it's 10GB vs 11GB in my 2080 Ti. Albeit, I am mostly gaming at 2560x1440 (144Hz), but I can still use high-res texture packs. I remember trying out Final Fantasy XV with its texture pack, and it was routinely hitting up against the 2080 Ti's 11GB of VRAM. Now, as others have alluded to, issues like that can be a problem with the game's engine itself, and we can't be too sure how well FFXV was coded. (I'm sure most can recall the game's woolly cow framerate debacle.) Given that the 3080 has around +40% over the 3070, it seems like it should've really had 12GB of VRAM.

In regard to the 3090, if we ignore its pricing for the moment, I really don't need its insane amount of VRAM. It's sort of like a Goldilocks situation where 10GB is just a little too low, and 24GB is just too unnecessary. (I think something like 8GB, 12GB, and 16GB would've made a lot more sense.) It almost seems more like an awkwardly rebranded Titan. The one thing that I do worry about with the 3090 is that we'll have another price change situation on our hands. I can't remember which card it was, but Nvidia priced it a bit high... until AMD released a competitive card, and Nvidia lowered the MSRP. If Big Navi pans out as well as some think, could we end up with a similar situation?
I'm thinking if that happens, the price could change too depending on what AMD brings out. I have to stick with Nvidia for my G-Sync monitor. But regardless, if I'm buying now, I want to stick with the card for as long as I did my 1080 Ti.

I'm not expecting the 3090 to be a crazy amount better than the 3080 apart from the VRAM, but I use my VR headset a lot, and the apps I use eat up a lot of my VRAM.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,323
758
136
Console hardware has always been sold at a loss because that's how the business works. Just like razors, you make your money on the blades. MS/Sony/Nintendo make money on every game that gets sold. I don't know why people keep comparing the price of consoles to PC hardware; they're not comparable.
Of course they're comparable. No one is going to buy a GPU that costs three times as much as a shipping console and performs a third as well.

Yes, you need to make an allowance for the console ecosystem, and you can't expect PC hardware to sell at a loss. That makes the comparison somewhat muddy. It certainly doesn't invalidate it.

For example, when the PS4 and XBone launched, you could get a GPU with comparable performance (eg. an HD 7850) for around half the price of either one. The console CPUs were weak, and storage was unimpressive. Consoles gave a good value in terms of aggregate hardware, sure, but they weren't embarrassing PCs in that area.

That's night and day versus this coming generation, where the console CPUs are superior to the vast majority of user systems, where the SSDs are extremely impressive (and in the case of the PS5, superior to anything currently available to consumers), and a roughly equivalent GPU is set to cost the same as an entire console.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,586
2,437
136
ETH speed is FUD
but I expect:
3080 ~80 MH/s
3090 near 100 MH/s
3060 Ti 60 MH/s
60 is too high for the 3060 Ti. It has the same memory bus width as the Polaris series, but with GDDR6 at 14 GT/s. The highest you can get with Polaris is ~32 MH/s, so at 14 GT/s with absolutely optimal timings and overclocks you'd get 56 MH/s. I've seen super-high settings result in lower rates at the pool, so when you factor that in you'd end up around 50 MH/s.

The 3060 Ti actually seems like a good gaming card. Seems on the surface like a 2080, but with updated features.
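The scaling argument above can be sketched as a linear-in-bandwidth extrapolation from a known card. The Polaris baseline (~32 MH/s at 8 GT/s) and the roughly 10% pool-side haircut are the poster's figures; the linear model itself is an assumption (Ethash being memory-bound makes it a reasonable first-order one).

```python
def eth_hashrate_estimate(base_mhs: float, base_gtps: float,
                          target_gtps: float, pool_efficiency: float = 1.0) -> float:
    """Ethash is memory-bound, so scale a known hashrate by memory data rate."""
    return base_mhs * (target_gtps / base_gtps) * pool_efficiency

# Polaris does ~32 MH/s at 8 GT/s; a same-width bus at 14 GT/s:
optimal = eth_hashrate_estimate(32, 8, 14)          # best case: 56.0 MH/s
realistic = eth_hashrate_estimate(32, 8, 14, 0.9)   # ~50 MH/s after pool-side losses
print(optimal, round(realistic))
```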
 

jpiniero

Diamond Member
Oct 1, 2010
9,939
2,282
136

Videocardz says 3080 reviews on Wednesday, and the 3070's release is October 15th.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
Of course they're comparable. No one is going to buy a GPU that costs three times as much as a shipping console and performs a third as well.

Yes, you need to make an allowance for the console ecosystem, and you can't expect PC hardware to sell at a loss. That makes the comparison somewhat muddy. It certainly doesn't invalidate it.

For example, when the PS4 and XBone launched, you could get a GPU with comparable performance (eg. an HD 7850) for around half the price of either one. The console CPUs were weak, and storage was unimpressive. Consoles gave a good value in terms of aggregate hardware, sure, but they weren't embarrassing PCs in that area.

That's night and day versus this coming generation, where the console CPUs are superior to the vast majority of user systems, where the SSDs are extremely impressive (and in the case of the PS5, superior to anything currently available to consumers), and a roughly equivalent GPU is set to cost the same as an entire console.
You can't just make wild statements like "a GPU that costs three times as much as a shipping console and performs a third as well." An RTX 3070 is neither of those. Also, back in the PS4/XBone days, things were very different: online services were nowhere near the moneymakers they are today, consoles had a bigger lock on game exclusivity, and the hardware at the time was not that great. AMD was in bad shape with both their CPUs and GPUs.

As you say yourself, and as I previously broke down in the cost of the new consoles, they're pretty good! So obviously MS/Sony are betting that they can still make money offering such a cheap console. Looking at the Series S and the bundled online package, you can get a console for $25/month! Unless you think all the components - CPU, memory, storage, etc. - should just magically drop 50% (or more) in price, it's not logical to make the price comparison, because something else is making the console so cheap. If you're going to complain about a $500 NV GPU, then what about a Ryzen 3700X that's $300 on its own when a slightly slower version is in the $300 Series S?

The console hardware is just there to get you locked into their ecosystem for the next 5-7 years. You may think it's a good deal, but add up the cost over time and compare it to PC gaming; it might not be as good as you think. Also, with a PC, there's no walled ecosystem: I can run any program that's compiled; I don't need MS to certify what I can and cannot run. The price of freedom is well worth the cost. A $500 GPU from NV or AMD at around 2080 Ti performance is awesome. Expecting them to be $200-$300 is unrealistic.
 

Accord99

Platinum Member
Jul 2, 2001
2,207
86
91
If it is good at mining I might just pick one up. Who cares if it's a power hog that eats up 300W when I can just run it 24/7 instead of turning on the heat.
You probably won't need to run it at full power; Ethereum is so bandwidth-limited that you'll probably be able to run it at extremely low core clock and voltage.

But I agree with the other estimates, 80 MHash/s is probably the ceiling for the 3080 and only with a noticeable memory overclock.
 
  • Like
Reactions: Shmee
