nVidia 3080 reviews thread

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

BFG10K

Lifer
Aug 14, 2000
22,654
2,773
126
Written:


Video:

 

Lonyo

Lifer
Aug 10, 2002
21,939
6
81
A full 3 years on from my 1080 Ti, I can pay the same again to sometimes double my frame rate, while using more power and getting less RAM.

Disappointing if you look at the reality of it and ignore the 2080 cards. More than 3 years to give us barely double the performance in maxed-out cases, at the cost of more power and the same money in dollar terms.

Feels like improvement rates have really dropped.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Any review with power consumption numbers on a system with Intel 10th Gen 10C 20T OC + RTX3080 OC ???

Not that I have seen. TPU didn't even bother posting power consumption while OC'ed. But my guess is up around the FurMark draw of 370W.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
The 3080 is 31% faster than the 2080 Ti and 67% faster than the 2080 according to TPU (4K). IMO the disappointing aspect is power efficiency: perf/watt is just 17% better than the 2080 Ti at 4K. All in all, definitely not the best jump ever, but what was expected. Perf/$ is much better than Turing, obviously.

IMO AMD can definitely be competitive with the 3080.

I hope! But realistically I see them coming in around 75-80% faster than the 5700 XT @ 4K.
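A quick sanity check on the arithmetic above. The 31% and 67% figures are TPU's 4K numbers from the post; the $699 and $1,199 launch MSRPs are assumptions plugged in for the perf-per-dollar comparison:

```python
# Relative 4K performance, normalized to the RTX 2080 = 1.0
perf_2080 = 1.0
perf_3080 = 1.67                  # "67% faster than the 2080" (TPU, 4K)
perf_2080ti = perf_3080 / 1.31    # "31% faster than the 2080 Ti"

# Implied 2080 Ti lead over the 2080 at 4K
print(f"2080 Ti vs 2080: {perf_2080ti / perf_2080 - 1:.0%}")  # ~27%

# Perf per dollar at launch MSRP (assumed: $699 3080, $1,199 2080 Ti)
ppd_3080 = perf_3080 / 699
ppd_2080ti = perf_2080ti / 1199
print(f"3080 perf/$ advantage over 2080 Ti: {ppd_3080 / ppd_2080ti - 1:.0%}")
```

So TPU's two figures imply the 2080 Ti was about 27% ahead of the 2080 at 4K, and the 3080's perf/$ edge over the 2080 Ti comes almost entirely from the price cut.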
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
Yeah... about the card. Just to show you what ***** NVIDIA was in the pricing of the 2000-series cards. Anyway... do we ever get reviews with more common CPUs, like the 3600 or Intel's equivalent?
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,248
136
do we ever get reviews with more common CPUs, like the 3600 or Intel's equivalent?


Not quite the mortal man's CPU, but this review, which is in the original post, used a 3900X as the test rig. You should be able to get a rough idea of the performance.

Didn't see any PCIe bandwidth testing done. Would have liked to see the effect, if any.

 

CP5670

Diamond Member
Jun 24, 2004
5,506
583
126
It looks like a great card for 4K at 120Hz, often over double the fps of the 1080 Ti. The power usage is high, but I only game a few hours a day and am not concerned about it as long as the card is quiet (the 3-fan AIB cards usually are). Power usage is more important for servers that run continuously.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It looks like a great card for 4K at 120Hz, often over double the fps of the 1080 Ti. The power usage is high, but I only game a few hours a day and am not concerned about it as long as the card is quiet (the 3-fan AIB cards usually are). Power usage is more important for servers that run continuously.

The 3-fan AIB cards are going to be dumping 350W into your case, which means everything else will run hotter, which means all the other fans will spin up. This card uses 160W more than a 1080 Ti.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Didn't see any PCIe bandwidth testing done. Would have liked to see the effect, if any.
I think part of the problem is that Intel CPUs are just flat-out faster at gaming, so right now you're better off with a high-end Intel and PCIe 3 than with any AMD and PCIe 4.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,248
136

I think part of the problem is that Intel CPUs are just flat-out faster at gaming, so right now you're better off with a high-end Intel and PCIe 3 than with any AMD and PCIe 4.

Not much effect at all yet. At least he's got a rig ready for Zen 3's launch. Most likely that's the real reason he switched over... Guess we'll have an idea soon enough if Intel's gaming dominance falls.
 

Guru

Senior member
May 5, 2017
830
361
106
Completely NORMAL performance gains generation over generation, just as I expected. Remember that the GTX 1080 Ti cost $700, and I think the GTX 980 Ti cost $650; the RTX 2080 Ti should have cost $700. It's just that Nvidia pulled a fast one on the average ignorant consumer and priced the RTX 2080 at $700+, which in any previous generation would have cost $500 or less, and then priced the RTX 2080 Ti at an insane $1,200.

Again, if we normalise prices like they were before the 2000 series, the RTX 3080 is about 30% faster than the RTX 2080 Ti, which is in line with the GTX 1080 Ti over the GTX 980 Ti, and the GTX 980 Ti over the GTX 780 Ti.

Again, it's a good performance increase; a 30% uplift generation over generation is good. It's just completely normal, what we've been getting for the past 6+ years. It's nothing unseen or unheard of. At 1440p it seems to be only 20% faster than the RTX 2080 Ti FE, and around 15% faster than an overclocked 2080 Ti.

Even at 4K, the difference between a well-overclocked RTX 2080 Ti and an RTX 3080 is about 25%.

I think it's overall a move in the right direction, and a good product no doubt, but all of the hype was wrong. Again, this is just a standard, normal, expected, historical generation-over-generation increase: around 30-35% at 4K over the 2080 Ti FE, and about 25-30% over an overclocked 2080 Ti.
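For what it's worth, a steady ~30% per flagship generation compounds quickly. Purely illustrative, assuming a uniform 30% uplift at each step as the post describes:

```python
# Illustrative only: compound a uniform ~30% per-generation uplift
# across recent flagship-class cards (the 30% figure is assumed).
uplift = 1.30
cards = ["GTX 780 Ti", "GTX 980 Ti", "GTX 1080 Ti", "RTX 2080 Ti", "RTX 3080"]
perf = {cards[0]: 1.0}
for prev, cur in zip(cards, cards[1:]):
    perf[cur] = perf[prev] * uplift
for card, p in perf.items():
    print(f"{card}: {p:.2f}x the 780 Ti")
```

Four such steps stack to roughly 2.9x the 780 Ti, which is why "only 30%" per generation still adds up over time.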
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
As expected, the "double 2080 performance" claim was utter nonsense. More power efficient than Turing at 4K, but below that it's comparable. Much better performance per dollar, though. The FE cooler seems pretty decent actually, but I'll probably go for an Asus Strix.
 

Wall Street

Senior member
Mar 28, 2012
691
44
91
Any review with power consumption numbers on a system with Intel 10th Gen 10C 20T OC + RTX3080 OC ???

This is what I want to see as well. I previously used a system which drew ~400 watts from the wall while gaming. That was a while ago, with a 7970 and an overclocked first-gen Core i7.

With a 320-watt GPU, a 150-watt OC CPU, and power supply (in)efficiencies, 3080 / 9900K OC rigs could put 500-550 watts of heat into your room.

In my old system you could feel the room temperature increase after 2-3 hours of gaming at 100% load. I imagine with the 3090 you'll be able to 'feel the power', literally, much more quickly.
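The wall-draw estimate above is easy to sketch. Every figure here is an assumption in the spirit of the post, not a measurement; the PSU efficiency in particular varies with load and unit:

```python
# Rough wall-draw estimate for a 3080 gaming rig. All figures are
# assumptions, not measurements.
gpu_w = 320            # RTX 3080 rated board power
cpu_w = 150            # overclocked high-core-count CPU under game load
rest_w = 30            # assumed: motherboard, RAM, drives, fans
psu_efficiency = 0.90  # assumed: a decent 80 Plus Gold unit at this load

dc_load = gpu_w + cpu_w + rest_w      # watts drawn by the components
wall_draw = dc_load / psu_efficiency  # what the wall meter (and room) sees
print(f"DC load: {dc_load} W, at the wall: {wall_draw:.0f} W")
```

With these numbers the wall draw lands around 550-560 W, which matches the 500-550 W ballpark in the post; essentially all of it ends up as heat in the room.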
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,219
5,218
136
Again, it's a good performance increase; a 30% uplift generation over generation is good. It's just completely normal, what we've been getting for the past 6+ years. It's nothing unseen or unheard of. At 1440p it seems to be only 20% faster than the RTX 2080 Ti FE, and around 15% faster than an overclocked 2080 Ti.

It's not normal. Pascal was an outlier, among the best releases ever, and this is close to matching Pascal, putting them both near the pinnacle of generational increases. The last time anything approached this level of gains before Pascal was the mighty 8800 GTX.

Also, it's not a 30% generational upgrade. The generational comparison is NOT 3080 vs 2080 Ti.

It would be 3080 vs 2080, and those gains are around 70% at 4K. A very Pascal-type gain, and well beyond the typical release.
 

JustMe21

Senior member
Sep 8, 2011
324
49
91
It just doesn't seem like it's doing as well as it should be, though. The number of CUDA cores doubled on the 3080 from the 2080 Ti, which itself only had about a 1.2x increase from the 1080 Ti, yet we only see about 30% performance gains.
 

CP5670

Diamond Member
Jun 24, 2004
5,506
583
126
The 3-fan AIB cards are going to be dumping 350W into your case, which means everything else will run hotter, which means all the other fans will spin up. This card uses 160W more than a 1080 Ti.

That's true, but some of the existing AIB cards already get up that high and stay quiet. I set my 1080 Ti (default 280W) to a 117% power limit, so about 330W, and it's audible but fairly quiet in games, much quieter than most blower or 2-fan cards. It also runs passively at idle. It will be interesting to see what the custom 3080/3090 cards are like.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
It's not normal. Pascal was an outlier, among the best releases ever, and this is close to matching Pascal, putting them both near the pinnacle of generational increases. The last time anything approached this level of gains before Pascal was the mighty 8800 GTX.

Also, it's not a 30% generational upgrade. The generational comparison is NOT 3080 vs 2080 Ti.

It would be 3080 vs 2080, and those gains are around 70% at 4K. A very Pascal-type gain, and well beyond the typical release.

That's just because of the naming changes this generation; it's also why the 3080 is so much faster than the 3070 and only slightly slower than the 3090. Both the 3080 and the 2080 Ti are cut-down 102-class dies, whereas the 2080 was a 104.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
A full 3 years on from my 1080 Ti, I can pay the same again to sometimes double my frame rate, while using more power and getting less RAM.

Disappointing if you look at the reality of it and ignore the 2080 cards. More than 3 years to give us barely double the performance in maxed-out cases, at the cost of more power and the same money in dollar terms.

Feels like improvement rates have really dropped.


This is to be expected, as people didn't buy AMD when they had a similarly performing, or better, card. The Nvidia mindshare is just too strong: too many influencers on the payroll, too much money in the R&D and driver departments for AMD to compete with.

Expect prices to increase and generational gains to erode year on year, until it's mostly old men hunched over the chairs at their desks playing PC games. The majority, young people, will be on next-generation consoles, playing optimised 4K titles on gorgeous 4K OLED TVs.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
On a side note, why are AnandTech reviews always late? Are they struggling financially?

Before Anand sold the site, it used to feel much more alive, responsive and meaningful.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
It just doesn't seem like it's doing as well as it should be, though. The number of CUDA cores doubled on the 3080 from the 2080 Ti, which itself only had about a 1.2x increase from the 1080 Ti, yet we only see about 30% performance gains.

Only the FP32 units are doubled, and games use a mix of FP32 and INT. The doubled count is misleading for games; the SM counts are equal.
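A toy model of why doubled FP32 doesn't double game performance. The per-SM lane counts follow the published Turing/Ampere SM layouts (64 FP32 + 64 INT32 vs 64 FP32 + 64 FP32-or-INT32); the INT fractions are assumptions, and the model ignores memory, clocks and everything else:

```python
def sm_cycles_turing(int_frac):
    # Turing SM: 64 FP32 lanes plus 64 separate INT32 lanes, issuing
    # concurrently, so time is set by whichever pipe is busier.
    return max(1 - int_frac, int_frac) / 64

def sm_cycles_ampere(int_frac):
    # Ampere SM: 64 dedicated FP32 lanes plus 64 lanes that do either
    # FP32 or INT32. INT work occupies the shared lanes first.
    return max(1 / 128, int_frac / 64)

for i in (0.0, 0.25, 0.35):
    speedup = sm_cycles_turing(i) / sm_cycles_ampere(i)
    print(f"INT fraction {i:.0%}: per-SM speedup {speedup:.2f}x")
```

At 0% INT the doubled FP32 gives the full 2x per SM, but at the 25-35% INT mix typical shaders are often said to carry, the shared lanes are busy with INT and the per-SM gain shrinks to roughly 1.3-1.5x, which is much closer to what the benchmarks show.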
 

CP5670

Diamond Member
Jun 24, 2004
5,506
583
126
Expect prices to increase and generational gains to erode year on year, until it's mostly old men hunched over the chairs at their desks playing PC games. The majority, young people, will be on next-generation consoles, playing optimised 4K titles on gorgeous 4K OLED TVs.

You can connect your PC to a 4K OLED TV and play from a distance, like I do. It's the best PC gaming experience at this point. I do expect high-end PC gaming with top hardware (like the 3090) to become more niche, though, with few games that really justify it over cheaper hardware.