Question nVidia 3070 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
VRAM test proving 8GB isn't enough in Minecraft and Wolfenstein: https://www.pcgameshardware.de/Gefo...747/Tests/8-GB-vs-16-GB-Benchmarks-1360672/2/

The 3070 is the first card that actually interests me from the Ampere line.
 
Last edited:
- I personally enjoy buying a higher end system so I can play older games at super maxed out settings, not so much playing current and next gen games at maxed out settings.

For example, my 980 Ti has served me really well playing X360 and early-gen XBO titles at 1440p/144Hz, or even with DSR cranked up for that buttery-smooth look, but I haven't attempted anything too new (outside of Doom 2016). Games like Dishonored 2, DX: Mankind Divided, and newer are waiting for an upgrade before I get to enjoy them totally maxed out.

Additionally, many newer technologies need a fair amount of time before their benefits are baked into game engines. We've had cheap, plentiful cores since around when I bought my system, but my CPU is only now starting to be a major bottleneck in the newest titles. With the next-gen consoles stamped out using current-gen tech (and some NVMe direct-access magic), I don't anticipate things like DDR5 affecting the gaming experience substantially.

My goal would be to get set-up with a solid 8c/16t CPU and a platform that's PCI-E 4.0 ready and hold on to that baby for the next 4-8 years. Slap an Ampere or RDNA2 card in there and baby you got yourself a stew goin'.
Your system's Achilles' heel is the i5. Upgrade it to an i7 without changing mobo. Get faster SSD and maybe supercharge it with Optane in conjunction with Enmotus's caching software. Upgrade your GPU and save the rest of your cash for when you really need it. That's what I would do. Incremental upgrades make more sense until they are no longer possible. You have plenty of upgrading possibilities at your disposal without breaking the bank.
 
  • Like
Reactions: Shmee

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
Your system's Achilles' heel is the i5. Upgrade it to an i7 without changing mobo. Get faster SSD and maybe supercharge it with Optane in conjunction with Enmotus's caching software. Upgrade your GPU and save the rest of your cash for when you really need it. That's what I would do. Incremental upgrades make more sense until they are no longer possible. You have plenty of upgrading possibilities at your disposal without breaking the bank.

- Issue with that is prices which have gone crazy in 2020 (and were never particularly good for prior gen CPUs). A 7700K or 6700K still go for $150 - $200 used, which I would not consider a deal for a 4c/8t processor in this day and age.

I'd much rather get an 8c/16t CPU + a PCI-E 4.0 board brand new (My DDR4 memory would carry forward) for ~$350 than spend $150 now, then $350 anyway in a couple years as engines start to lean on 8+ cores.

Putting money toward a "faster SSD" at this point is a huge waste of money IMO. For gaming, there is already little tangible difference between a SATA SSD and an NVMe SSD, and the differences between NVMe SSDs outside of benchmarks are non-existent in practice.

Anyhow, we are way off the beaten path for this topic, DM me if you want to continue the discussion I guess.
 

undertaker101

Banned
Apr 9, 2006
301
195
116
Pretty underwhelmed tbh. Hopefully the 2080 Tis fall to $600 soon and are available while all eyes are on the 3xxx series. Unless 3080s start growing on trees next month...
 

Hitman928

Diamond Member
Apr 15, 2012
5,177
7,628
136
They themselves say it's an issue on their end. Quote:
Witcher III was the one and only title we had some perf issues with. We ran the test several times with a couple of new driver installs as well. We'll consider this to be an anomaly on our side for now; however, we always report what we measure.

They said they saw a performance issue, not that they had an issue with their setup. Maybe there is an issue, maybe not, but you can't just dismiss a result because it was unexpected. Other reviewers have also showed bad performance in certain games with certain settings so I'm not going to dismiss their results unless they can investigate it and figure out why it is happening or if a reinstall or something resolves it.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
They said they saw a performance issue, not that they had an issue with their setup. Maybe there is an issue, maybe not, but you can't just dismiss a result because it was unexpected. Other reviewers have also showed bad performance in certain games with certain settings so I'm not going to dismiss their results unless they can investigate it and figure out why it is happening or if a reinstall or something resolves it.
I'm not sure why you're arguing with what the reviewer himself said. He can't explain the low performance, but he decided to post the result anyway, which I don't blame him because he stated the issue. And I linked another review with Witcher 3 with no performance anomaly, it was slightly slower than the 2080ti.
 

Hitman928

Diamond Member
Apr 15, 2012
5,177
7,628
136
I'm not sure why you're arguing with what the reviewer himself said. He can't explain the low performance, but he decided to post the result anyway, which I don't blame him because he stated the issue. And I linked another review with Witcher 3 with no performance anomaly, it was slightly slower than the 2080ti.

Ok, you said that there was a problem on the reviewer's end implying an issue with their setup. Please show me where the reviewer said they had an issue with their setup.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
As time goes on you will even see edge cases where the 2080 Ti's 1% lows beat a 3080 due to the additional 1 GB of memory. Remember the 3.5 GB 970? :)
The slow 0.5 GB on the 970 never actually had any effect in games outside of some extreme, unrealistic examples. It was a complete non-issue as far as performance was concerned.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
[attached chart: Doom benchmark results]

Granted, Doom is pushing really dense textures at this setting, but this is max settings in a game that's available today. With raster performance equal to a 2080 Ti in most games, we are seeing the limitations of VRAM capacity. Settings can always be turned down, but this is a game that is already slightly limited by a $500 component purchased today; what will it look like in a year or two? If we look at some of the greats, like the Radeon 290X or 1080 Ti, cards can and really should be viable for 4 years or more. An 8GB card will surely be limited in 4K titles.

Efficiency does look better compared to the larger Ampere cards but that's not hard considering how massively power hungry they are.

[attached chart: power consumption comparison]

Looking at these power numbers... NVIDIA managed to match the 2080 Ti while shaving off about 60 watts. Impressive, but not really groundbreaking for a full node jump and a revised architecture. They raised high-end performance by pushing power. What could Turing have achieved if given 350 watts?

 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The slow 0.5gb on the 970 never actually had any effect in games outside of some extreme non realistic examples. It was a complete non issue as far as performance was concerned.

This is wrong. It did cause issues in games. nVidia made a driver change to prevent the card from using the last 512MB unless it absolutely had to. Initially the card would put whatever memory it wanted there, and it did cause hitching issues before the driver changes. There is a reason nVidia ended up paying out in the class-action lawsuit over it.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Granted Doom is pushing really dense textures at this setting, but this is max settings in a game that's available today. With raster performance equal to a 2080 Ti in most games, we are seeing the limitations of VRAM capacity. Settings can always be turned down, but this is a game that is already slightly limited by a $500 component purchased today, what will it look like in a year or two? If we look at things some of the greats like the Radeon 290x or 1080 Ti, cards can and really should be viable for 4 years or more. An 8GB card will be for sure limited in 4k titles.

Some perspective:

3070 is still averaging 120 FPS (94 FPS 1% low) at 4K extreme settings in one game, on a lower-tier ($500) card aimed at 1440p.

Meanwhile, go look at what the 2080 Ti did in Metro Exodus or Red Dead Redemption 2 at max settings: FPS in the 40s, for the top card of its generation aimed at 4K, in current games of its generation. A $1200 card doing sub-50 FPS at its target mission.

Lower-tier 1440p card getting 120 FPS = end of the world because it's slightly slowed by memory capacity at 4K max.
Top-tier 4K card getting under 50 FPS = total non-issue...

There is no future proofing. Your cards will choke on absolute performance limits before memory capacity limits, because it's already happening today.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Some perspective:

3070 is still averaging 120 FPS (94 FPS 1% low) at 4K extreme settings in one game, on a lower-tier ($500) card aimed at 1440p.

Meanwhile, go look at what the 2080 Ti did in Metro Exodus or Red Dead Redemption 2 at max settings: FPS in the 40s, for the top card of its generation aimed at 4K, in current games of its generation. A $1200 card doing sub-50 FPS at its target mission.

Lower-tier 1440p card getting 120 FPS = end of the world because it's slightly slowed by memory capacity at 4K max.
Top-tier 4K card getting under 50 FPS = total non-issue...

There is no future proofing. Your cards will choke on absolute performance limits before memory capacity limits, because it's already happening today.

This is all very valid. So I guess a potential takeaway is the opportunity cost of choosing a 3070 with 8GB. If AMD releases a card tomorrow that equals the 3070 in performance but with 16GB, then the case can be made for the latter.

The only thing I will take exception to is the idea that cards have a real "mission." AMD/NVIDIA like to tell you which resolution they are "aimed" at, but it's just happenstance. People upgrade monitors over the life of a video card, or cards get sold to other people with different-resolution monitors.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
There is no future proofing. Your cards will choke sooner on absolute performance issues, before memory capacity issues, because it's already happening today.

This! People complain about memory issues at settings the GPU is not targeted at anyway. And this argument holds even more in the future, when not only games' memory footprints but also their overall performance demands will increase.
Anyway, I could not agree more.
 
Last edited:

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
Those complaining about the lower VRAM should make sure to buy the 16GB AMD card, because I think that is the only advantage it's going to have. With DLSS, ideally you should not need more than 8GB of memory.
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
8GB VRAM test added to OP.

Sorry if the answer is obvious, but why can't I see the promised RT performance uplift (Ampere vs. Turing) in the reviews here?
Because it looks like it doesn't exist, or any architectural improvement doesn't make a difference. Hardware Unboxed showed RTX performance is exactly where you'd expect because the card is simply faster overall, not because of any specific improvements.

Some perspective:

3070 is still averaging 120 FPS (94 FPS 1% low) at 4K extreme settings in one game, on a lower-tier ($500) card aimed at 1440p.

Meanwhile, go look at what the 2080 Ti did in Metro Exodus or Red Dead Redemption 2 at max settings: FPS in the 40s, for the top card of its generation aimed at 4K, in current games of its generation. A $1200 card doing sub-50 FPS at its target mission.

Lower-tier 1440p card getting 120 FPS = end of the world because it's slightly slowed by memory capacity at 4K max.
Top-tier 4K card getting under 50 FPS = total non-issue...

There is no future proofing. Your cards will choke on absolute performance limits before memory capacity limits, because it's already happening today.
But this is clearly false: https://www.pcgameshardware.de/Gefo...747/Tests/8-GB-vs-16-GB-Benchmarks-1360672/2/

At 1440p, witness the massive frame spikes in Minecraft and Wolfenstein on the 8GB 2080S/3070 compared to the 11GB 2080TI, which delivers a stable framerate.

Also, the 2080TI is twice as fast in Minecraft and 3.5x faster in Wolfenstein than the 8GB cards in average framerate, far in excess of the memory bandwidth difference.

I've been saying for months 8GB isn't enough for 2080S performance levels or higher. Growing evidence continues to prove this.

I had 8GB for $379 on a 1070; now almost five years later I'm being asked to pay $500 for the same 8GB. That's not right, folks.
 

amenx

Diamond Member
Dec 17, 2004
3,847
2,013
136
Not sure why ppl ignore the elephant in the room. Another difference between the 2080 Ti and the 3070 is memory bandwidth:

2080 Ti = 616 GB/s
3070 = 448 GB/s

THAT will have a greater impact on FPS at higher resolutions. If a game plays choppy and stuttery, THEN it may be due to insufficient VRAM. If a game shows lower FPS but smooth gameplay, then it is not VRAM insufficiency but bandwidth and/or other uarch limitations. The only way to know for sure is testing the exact same card with different VRAM capacities, NOT entirely different cards with differing specs.
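As a quick back-of-the-envelope check on those spec-sheet numbers (a sketch of the arithmetic only, not a performance model):

```python
# Spec-sheet memory bandwidth, as quoted above.
bw_2080ti = 616  # GB/s
bw_3070 = 448    # GB/s

ratio = bw_2080ti / bw_3070
print(f"2080 Ti bandwidth advantage: {ratio:.2f}x ({(ratio - 1) * 100:.1f}% more)")
```

So the 2080 Ti's raw bandwidth edge is about 1.38x, i.e. roughly 38% more; whether that alone can explain much larger FPS gaps is the point being argued in this thread.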
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Not sure why ppl ignore the elephant in the room. The differences between 2080ti and 3070 is also memory bandwidth:

2080ti = 616gb/s
3070 = 448gb/s
That's 38% more bandwidth for the 2080TI. In the link above Minecraft is 100% faster and Wolfenstein is 3.8x faster on the TI (avg FPS). A 38% increase in bandwidth doesn't cause that.

Besides, memory bandwidth is almost never the primary limitation. 2060S has the same 448GB/sec as the 3070, yet the 3070 is 60% faster overall.

If a game plays choppy and stuttery, THEN it may be due to insufficient VRAM.
That's exactly what we're seeing in the frametime analysis. We're also seeing the 2080S/3070 clumped together - despite the 2080S having more bandwidth than the 3070 - while the TI delivers consistent framerate. 8GB size clearly causes the clumping.
 
Last edited:
  • Like
Reactions: Tlh97 and Mopetar

amenx

Diamond Member
Dec 17, 2004
3,847
2,013
136
That's 38% more bandwidth for the 2080TI. In the link above Minecraft is 100% faster and Wolfenstein is 3.8x faster on the TI. A 38% increase in bandwidth doesn't cause that.

Besides, memory bandwidth tends to be vastly overblown anyway, and is almost never the primary limitation. 2060S has the same 448GB/sec as the 3070, yet the 3070 is 60% faster overall.


That's exactly what we're seeing in the frametime analysis. We're also seeing the 2080S/3070 clumped together - despite the 2080S having more bandwidth than the 3070 - while the TI delivers consistent framerate. 8GB size clearly causes the clumping.
Again, you are ignoring other factors and variables that may play a part, namely that these are DIFFERENT cards with different specs. Frametime limitations would have to be drastically off the chart for VRAM insufficiency, and reviewers would have to report the choppiness and exaggerated stuttering that would produce. Minor frametime differences are not something to build a case on, nor is the singular focus on VRAM capacity as the only variable capable of producing the results you see. Only the SAME card with different VRAM amounts will settle the issue imo.
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Frametime limitations would have to be drastically off the chart for VRAM insufficiency, and reviewers would have to report the choppiness and exaggerated stuttering that would produce.
You mean like this?

[attached chart: frametime/FPS graph]


Minor frametime differences are not something to build a case on, nor is the singular focus on VRAM capacity as the only variable capable of producing the results you see.
350ms spikes on the 3070 are "minor"? Lulz.

The problem here is that even with repeated objective examples from multiple games, some people simply refuse to believe (tm).
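To make the 350 ms figure concrete, here is a toy sketch (my own made-up example and function names, not any reviewer's tooling) of why a frametime spike of that size reads as hard stutter:

```python
# Toy example: flag frames slower than a threshold in a frametime log.
def flag_spikes(frametimes_ms, threshold_ms=50.0):
    """Return indices of frames whose frametime exceeds threshold_ms."""
    return [i for i, ft in enumerate(frametimes_ms) if ft > threshold_ms]

# Made-up capture: mostly ~8.3 ms frames (~120 FPS) with a single 350 ms hitch.
sample = [8.3] * 5 + [350.0] + [8.3] * 5
print(flag_spikes(sample))  # -> [5], the index of the spike
print(f"{1000 / 350:.1f} FPS during the spike")  # -> 2.9 FPS during the spike
```

A 350 ms frame is under 3 FPS for that instant, which is why even a handful of such spikes is felt as stutter despite a healthy average FPS.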
 

Ranulf

Platinum Member
Jul 18, 2001
2,331
1,138
136
Some perspective:

3070 is still averaging 120 FPS (94 FPS 1% low) at 4K extreme settings in one game, on a lower-tier ($500) card aimed at 1440p.

Meanwhile, go look at what the 2080 Ti did in Metro Exodus or Red Dead Redemption 2 at max settings: FPS in the 40s, for the top card of its generation aimed at 4K, in current games of its generation. A $1200 card doing sub-50 FPS at its target mission.

Lower-tier 1440p card getting 120 FPS = end of the world because it's slightly slowed by memory capacity at 4K max.
Top-tier 4K card getting under 50 FPS = total non-issue...

There is no future proofing. Your cards will choke on absolute performance limits before memory capacity limits, because it's already happening today.

The problem with this comparison is that it takes the two extremes of this generation's games. Doom 2020 and 2016 are well-optimized game engines running Vulkan that can run fairly well on 5-8 year old video cards depending on resolution and settings. RDR2 is your standard Rockstar game that will hopefully be optimized over 4-5 years of patches but really will just be brute-forced by stronger hardware. Even then, with the right graphics settings I can get RDR2 to run on my FX-8350 with an RX 570 4GB at 45 FPS average at 1080p. The only thing that looks bad is the water in some situations. Never mind all the other bugs in the game that affect all hardware brands.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Guru3D updated their results:

"Note: we have updated the Witcher III charts; the initial test results showed performance that was odd. After reinstalling the game and applying the original configuration, the performance normalized. From the looks of things a configuration error somehow kicked in, which is now taking effect properly."

The 3070 and the 2080 Ti are now neck and neck.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
This is wrong. It did cause issues in games. nVidia made a driver change to prevent the card from using the last 512MB unless it absolutely had to. Initially the card would put whatever memory it wanted there, and it did cause hitching issues before the driver changes. There is a reason nVidia lost the class action lawsuit over it.
It worked fine before anyone knew it existed and continued to work fine after everyone found out; the lawsuit was because they weren't truthful in the specs, not because it affected performance. But don't let that stop you from continuing to beat this long-dead horse if that's your thing.