nVidia 3080 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Written:


Video:

 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
NVidia claims to have achieved a 90% perf/watt increase over Turing, and even showed a graph on how they did it.

But these kinds of self-marketing statements are usually meaningless, because they are made by essentially taking the new generation's more powerful GPU and limiting its performance to the old generation's model, so it runs at MUCH lower clocks, in a part of the curve where power drops way off.

Yup. 2020 is the year of hardware hype.
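
To put rough numbers on that criticism, here's a toy sketch of the effect (every figure below is made up; it just assumes performance scales linearly with clock, board power is similar at the same clock, and dynamic power grows roughly with clock cubed once voltage has to rise with frequency):

```python
# Toy model only: all numbers are invented to illustrate why an
# iso-performance comparison flatters a new GPU's perf/watt figure.

def power(clock_ghz, ref_clock=1.8, ref_power=250.0):
    # Assume P ~ f * V^2 with V rising roughly linearly with f near the
    # top of the V/f curve, i.e. P ~ f^3.
    return ref_power * (clock_ghz / ref_clock) ** 3

old_clock, old_perf = 1.8, 100.0   # old-gen card: 100 "perf units" at 1.8 GHz
new_perf_full = 130.0              # new-gen card: 30% faster at the same clock
                                   # and (for simplicity) similar board power

old_ppw = old_perf / power(old_clock)
new_ppw_full = new_perf_full / power(old_clock)

# Marketing method: downclock the new card until it merely matches the old
# card's performance, then measure power at that efficient point on the curve.
matched_clock = old_clock * old_perf / new_perf_full    # ~1.38 GHz
new_ppw_matched = old_perf / power(matched_clock)

print(f"old card:             {old_ppw:.3f} perf/W")
print(f"new card, full clock: {new_ppw_full:.3f} perf/W   (~1.3x the old card)")
print(f"new card, matched:    {new_ppw_matched:.3f} perf/W   (~2.2x the old card)")
```

Crank the clocks back up and the ~2.2x headline collapses to ~1.3x, which is basically the complaint above.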
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
That's what I'm doing already; I have a CX48 hooked up to my PC. 90 cm deep desk, and it's doable. We're in the extreme minority though, I believe :D

I think it will become more common in the future, especially since fewer people use desktops for work or web browsing anymore. I have mine on a desk too but am in the process of replacing it with a mobile TV stand and a smaller table to hold the mouse/KB, and want to step up to a 77" model with HDMI 2.1.
 

Timorous

Golden Member
Oct 27, 2008
1,532
2,535
136
The RX 590 -> 5700 XT uplift was aided by a ~25% increase in clocks from the 12nm to the 7nm process (1.6 to 2.0 GHz). I highly doubt that gain is replicable with RDNA 1 -> 2 on the same process without blowing up their power budget.

I would have thought the same, but then a 2.23 GHz PS5 was announced with 36 CUs in a console power envelope, and a 1.825 GHz Xbox Series X was announced with 52 CUs running on a 315W power supply.

NVidia claims to have achieved a 90% perf/watt increase over Turing, and even showed a graph on how they did it.

But these kinds of self-marketing statements are usually meaningless, because they are made by essentially taking the new generation's more powerful GPU and limiting its performance to the old generation's model, so it runs at MUCH lower clocks, in a part of the curve where power drops way off. Crank back up to normal clocks to get the performance you are looking for out of the part and those gains are mostly gone.

It remains to be seen what AMD did, but it seems it would have been safer to go a little more than double the size, so they could be more relaxed on clocks.

AMD made the same claim with RDNA over GCN, and apart from Vega 56 to 5700 XT (which is a 49% perf/watt gain) they matched or exceeded their target.
 

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136
A McDonald's combo meal is $8; pretty sure it was only $6 a few years ago. Have you looked at the price of housing? Cell phones? Not to get political, but the official inflation number is BS, it's so much more expensive for many things.
If you're experiencing +114% inflation in 3.5 years across a broad spectrum of items, it's time to stop buying them in Zimbabwean Dollars.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136

Did the whitepaper get posted here? The 1.9x perf/w claim was for Control versus a 2080 Super.

And nVidia reiterates their claim that the 3070 is faster than the 2080 Ti.

First time I saw the WP link, and I was looking for it. I am very dubious the 3070 will match the 2080 Ti. The 3080 is barely 30% faster than the 2080 Ti, and much more than 30% ahead of a 3070, particularly in memory BW.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
I would have thought the same, but then a 2.23 GHz PS5 was announced with 36 CUs in a console power envelope, and a 1.825 GHz Xbox Series X was announced with 52 CUs running on a 315W power supply.

The PS5 design looks to be seriously pushing clocks and power limits itself. Between the large footprint (esp vs XSX), rumors of poor yields, PC-esque "boost clocks", and an apparently custom/expensive cooling solution, I am guessing they are pushing the GPU hard to get to 2.2 GHz, so I doubt we'd see RDNA 2 GPUs going much past this point clock-wise, especially if they want to get near the 50% perf/watt claims.
 

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136

Did the whitepaper get posted here? The 1.9x perf/w claim was for Control versus a 2080 Super.

And nVidia reiterates their claim that the 3070 is faster than the 2080 Ti.
It's going to be interesting to see. TPU has the 3080 as 23.4% faster than a 2080 Ti at 1440p and 32% at 4K. With the 3070 being 2/3rds of a 3080, it seems a stretch for it to match the 2080 Ti, especially at 1440p, which is where its sweet spot will probably be.
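
Quick sanity check on that math (the 2/3 figure is raw unit count; the 0.8x upper bound is my own assumption for sub-linear scaling, so treat the range loosely):

```python
# Back-of-envelope: can a 3070 match a 2080 Ti? Uses the TPU deltas quoted
# above; the 3070-vs-3080 scaling range is an assumption on my part.

gain_3080_over_2080ti = {"1440p": 1.234, "4K": 1.32}

scale_low, scale_high = 2 / 3, 0.80   # 3070 as a fraction of a 3080

for res, gain in gain_3080_over_2080ti.items():
    lo = gain * scale_low
    hi = gain * scale_high
    print(f"{res}: 3070 lands at {lo:.2f}x to {hi:.2f}x of a 2080 Ti")

# 1440p: ~0.82x to ~0.99x -> matching the 2080 Ti looks like a stretch
# 4K:    ~0.88x to ~1.06x -> closer, if memory bandwidth doesn't get in the way
```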
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
The PS5 design looks to be seriously pushing clocks and power limits itself. Between the large footprint (esp vs XSX), rumors of poor yields, PC-esque "boost clocks", and an apparently custom/expensive cooling solution, I am guessing they are pushing the GPU hard to get to 2.2 GHz, so I doubt we'd see RDNA 2 GPUs going much past this point clock-wise.
Again, opinion here, but we're all assuming that the MS & Sony chips are the same, as in layout, grid, allocation of units, etc. Sony did not anticipate 2.2 GHz, that much is fairly certain. The chip was not initially designed for that target. To assume that the speed reflects a PC GPU chip is silly. The only thing we know is that 2.2 GHz is possible with Sony's GPU block.
 

Timorous

Golden Member
Oct 27, 2008
1,532
2,535
136
The PS5 design looks to be seriously pushing clocks and power limits itself. Between the large footprint (esp vs XSX), rumors of poor yields, PC-esque "boost clocks", and an apparently custom/expensive cooling solution, I am guessing they are pushing the GPU hard to get to 2.2 GHz, so I doubt we'd see RDNA 2 GPUs going much past this point clock-wise, especially if they want to get near the 50% perf/watt claims.

Sony has denied those rumours.
 

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136
Sony has denied those rumours.
Which doesn't say much, as confirming "Yeah, we got caught with our pants down and have to overclock the crap out of our SoC" doesn't sound like the kind of thing the marketing department would likely do.

However, the large footprint and expensive cooling solution aren't going to be something made as a reaction to the MS console. Those designs would have been locked down a long time ago.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
Which doesn't say much, as confirming "Yeah, we got caught with our pants down and have to overclock the crap out of our SoC" doesn't sound like the kind of thing the marketing department would likely do.

However, the large footprint and expensive cooling solution aren't going to be something made as a reaction to the MS console. Those designs would have been locked down a long time ago.
Yeah, Sony's statement only says that they are still aiming for the same production numbers as before. How difficult it is for production to hit those numbers is still anyone's guess.
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
Again, opinion here, but we're all assuming that the MS & Sony chips are the same, as in layout, grid, allocation of units, etc. Sony did not anticipate 2.2 GHz, that much is fairly certain. The chip was not initially designed for that target. To assume that the speed reflects a PC GPU chip is silly. The only thing we know is that 2.2 GHz is possible with Sony's GPU block.

I swear, didn't one of those stupid rumor sites suggest a 2.4 or 2.8 GHz clock speed for Ampere?
 

Timorous

Golden Member
Oct 27, 2008
1,532
2,535
136
Again, opinion here, but we're all assuming that the MS & Sony chips are the same, as in layout, grid, allocation of units, etc. Sony did not anticipate 2.2 GHz, that much is fairly certain. The chip was not initially designed for that target. To assume that the speed reflects a PC GPU chip is silly. The only thing we know is that 2.2 GHz is possible with Sony's GPU block.

What we do know is that the Series X is powering a 3.6 GHz 8c/16t Zen 2 + 1.825 GHz 52 CU RDNA2 + 1 TB NVMe SSD + optical drive + 16 GB GDDR6 RAM and ancillaries on a 315W power supply.

I doubt that would be possible if RDNA2 were power hungry. MS also said that they were trying to keep SoC power in line with the One X, which is around 180W, so all signs are pointing to a decent power decrease.
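
As a rough illustration of what that envelope implies (the TFLOPS figures are derived from the public console specs, not from this thread, and the ~180W SoC number is the one mentioned above, so this is approximate at best):

```python
# Very rough FLOPS-per-watt comparison at the SoC level. TFLOPS are derived
# from public console specs (my addition); the ~180 W SoC figure is the one
# quoted above. CPU and memory differences make this approximate at best.

one_x_tflops    = 40 * 1.172 * 2 * 64 / 1000   # One X: 40 CUs @ 1.172 GHz ~ 6.0 TF
series_x_tflops = 52 * 1.825 * 2 * 64 / 1000   # Series X: 52 CUs @ 1.825 GHz ~ 12.15 TF
soc_power_w = 180.0                            # assumed similar for both SoCs

print(f"One X:    {one_x_tflops / soc_power_w * 1000:.0f} GFLOPS/W")
print(f"Series X: {series_x_tflops / soc_power_w * 1000:.0f} GFLOPS/W")
# Roughly double the shader throughput in a similar SoC power budget, which is
# consistent with a sizeable perf/W improvement for RDNA2.
```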
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
First time I saw the WP link, and I was looking for it. I am very dubious the 3070 will match the 2080 Ti. The 3080 is barely 30% faster than the 2080 Ti, and much more than 30% ahead of a 3070, particularly in memory BW.

- We know a ton of 3080 overhead is being tied up at "lower" resolutions like 1080p and 1440p, which is muddling a clean comparison of the two archs because game engines and CPUs are just not in a place to feed the 3080 fast enough. As the bottleneck moves off the CPU to the GPU, we see the 3080 start extending its lead. It will be interesting to see if we can get a native-res 8K benchmark between the two at some point here.

I wouldn't be surprised to see a bit of a performance traffic jam around the 2080 Ti (and the nearly as fast 2080S) at those "lower" resolutions, with the 3070 squeaking out wins against the 2080 Ti but never extending its lead, or even seeing that lead shrink as the resolution increases.
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
I would have thought the same, but then a 2.23 GHz PS5 was announced with 36 CUs in a console power envelope, and a 1.825 GHz Xbox Series X was announced with 52 CUs running on a 315W power supply.



AMD made the same claim with RDNA over GCN, and apart from Vega 56 to 5700 XT (which is a 49% perf/watt gain) they matched or exceeded their target.

When you combine a new node and a new architecture with a very inefficient baseline like Vega 64, it is easy to reach those targets.

Hence the criticism Nvidia is getting at the moment. We should be seeing a way bigger increase in performance per watt.

AMD will be trying to reach this 50% increase in performance per watt with a refinement of its current architecture, without a new node. And on top of that, without real reviews, we still don't know how much power the Series X is using of that 315 watt power supply. Considering AMD has typically missed its performance-per-watt targets with their GPUs, the odds are stacked against them when you add in what I previously mentioned (not a new architecture, not a new node).
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
Games don't really seem to be ROP-limited as much these last few years. In every other way that counts, the 3070 is cut down to roughly 2/3rds of a 3080.

Their boost clocks look to be almost identical at this point too.

Not quite: according to the whitepaper, the RTX 3070 retains 6 GPCs like the 3080, but each GPC feeds 8 SMs in GA104 vs 12 SMs in GA102. So, similar to how the gaming performance delta from the 2080 to the 2080 Ti is smaller than the compute delta suggests, we should see the same from the 3070 to the 3080 as well.
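
Putting rough numbers on that (the SM counts are from the published Ampere specs rather than this thread, so double-check against the whitepaper):

```python
# Compute ratio vs. raster front-end ratio, 3070 relative to 3080.
# SM counts are from Nvidia's published specs (my addition); the 6-GPC
# figure for both cards is the whitepaper detail quoted above.

sm_3070, sm_3080   = 46, 68    # enabled SMs
gpc_3070, gpc_3080 = 6, 6      # enabled GPCs (raster front-ends)

print(f"SM ratio  (3070/3080): {sm_3070 / sm_3080:.2f}")    # ~0.68 of the shaders
print(f"GPC ratio (3070/3080): {gpc_3070 / gpc_3080:.2f}")  # 1.00 of the front-end

# Same front-end width feeding fewer SMs per GPC, so at raster- or
# geometry-bound (lower-resolution) settings the 3070 should give up less
# than the raw shader-count gap implies -- same pattern as 2080 vs 2080 Ti.
```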
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
What we do know is that the Series X is powering a 3.6 GHz 8c/16t Zen 2 + 1.825 GHz 52 CU RDNA2 + 1 TB NVMe SSD + optical drive + 16 GB GDDR6 RAM and ancillaries on a 315W power supply.

I doubt that would be possible if RDNA2 were power hungry. MS also said that they were trying to keep SoC power in line with the One X, which is around 180W, so all signs are pointing to a decent power decrease.
Yes, most certainly. I don't believe RDNA2 will be awful. I think that many of the SKUs will resemble Nvidia's. They might not have a 3090 equivalent, but in practice, who cares.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,107
1,260
126

It's not so bad if you look at it price point to price point. The 3080 costs about what the 2080S did and is almost 60% faster. The 3090 is at the same price point as the 2080 Ti; will it be about 50% faster? I have no idea.

If you're moving within the same price tier, you're getting a real upgrade for the same money. Of course not for less money, and of course somehow the price tiers are now $1200 US for the flagship, $700 for the next one down, and so on.

If AMD pulled on GPUs what they did to Intel with CPUs, we'd probably see what happened to the price of 8+ core chips happen to Nvidia GPUs. Until then, Nvidia is going to make us pay. I couldn't stomach the 2080 Ti at its price coming from the Pascal Titan. I can live with that price for the 3090 this time; it's a big speed increase from the card I have.