Question nVidia 3070 reviews thread


BFG10K

Lifer
Aug 14, 2000
22,709
2,956
126

VRAM test proving 8GB isn't enough in Minecraft and Wolfenstein: https://www.pcgameshardware.de/Gefo...747/Tests/8-GB-vs-16-GB-Benchmarks-1360672/2/

The 3070 is the first card that actually interests me from the Ampere line.
 

aleader

Senior member
Oct 28, 2013
502
150
116
Seems like the best card to get if you don't have a 4k monitor. Power consumption is surprisingly good. Wonder if we will see a price drop if AMD offers better performance with their equivalent 2nd tier card?

That's what I'm hoping for, a price drop. If the rumours are true (they have been so far with Ampere), the AMD card will be faster than the 3070 and have double the VRAM, so the price should come down on the 3070. The flipside is that we know there will actually be no 3070s available in 2 days, so AMD doesn't have any incentive with their launch to keep prices low, other than caring for customers :D I love how Newegg keeps advertising the 3080 on its 'Featured' page every day...OUT OF STOCK.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
Witcher 3 was one I wanted to see, as it's a game I most want to play next when I upgrade my system, and it always seemed to be heavily swayed by memory BW.

2080Ti walks away from the 3070 in Witcher 3:


2080Ti is ~22% faster than 3070 at 1440p.
They (Guru3D) think the problem is on their end. And TechPowerUp also shows it running close to the 2080 Ti.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,227
136
Sorry, that's the way it sounded. I'm actually enjoying playing my older games (like Combat Mission) with my peasant 1060 right now while I wait for the GPU nonsense to subside. I'm not sure why you mentioned the 2080ti then? Would you actually overpay for that card just to play Witcher 3?

Did you miss that Witcher 3 is getting a ray tracing upgrade as well? And again, it's not the only game I'll play over the time I use the card, which might be 5+ years.
 

Hitman928

Diamond Member
Apr 15, 2012
5,239
7,785
136
They (Guru3D) think the problem is on their end. And TechPowerUp also shows it running close to the 2080 Ti.

Could be different settings, different areas of the game tested, etc. That doesn't mean there's a "problem" with Guru3d's results.

Edit: I'm not saying there may not be an issue in the results, but you can't just assume so based upon a single result from another website which probably is testing under different conditions.
 

PJVol

Senior member
May 25, 2020
532
446
106
Sorry if the answer is obvious, but why can't I see the promised RT performance uplift (Ampere vs. Turing) in the reviews here?
 

Mopetar

Diamond Member
Jan 31, 2011
7,830
5,979
136
Impressive perf/Watt and even perf/$, where nVidia rarely does well on their upper models. Still expensive though for a midrange card at $500.
Now the countdown begins for Big Navi.

Even though it's called the 3070, it's pretty much the full die GA-104 part, which is normally used for the xx80 card. The mid-range cards have been the 106 die or the cut down 104 die parts which haven't been announced yet. If you base the discussion around the die rather than what NVidia has chosen to call it, then Ampere represents a return to the Pascal prices where GP-102 (1080 Ti) was $700 and GP-104 (1080) was $500.

We'll likely see GA-106 come in at $300 and a cut down GA-104 chip going for somewhere around $400. Everything makes a lot more sense if you ignore the musical chairs that NVidia is playing with the card names and base all of the analysis off the different dies and what they've been used for historically.

Seems like the best card to get if you don't have a 4k monitor. Power consumption is surprisingly good. Wonder if we will see a price drop if AMD offers better performance with their equivalent 2nd tier card?

It's a pretty decent card if you do have a 4K monitor as well. The 3080 is 31% better at 4K (using TPU's overall average here), but the 3070 still gets above 60 FPS at 4K in most titles. In the few titles where it doesn't, the 3080 either falls short as well (e.g., Control) or just scrapes over the 60 FPS mark (e.g., AC: Odyssey), so unless there's a specific title where you see a larger than average difference, the 3070 is going to be more than serviceable as a 4K card for current titles.
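If you want to run that kind of check against your own target frame rate, the arithmetic is trivial; here's a rough sketch (the FPS figures are made-up placeholders, not TPU's actual data):

Python:
# Rough sketch: compare two cards per title and check a 60 FPS floor.
# The FPS figures below are made-up placeholders, not review data.
results_4k = {
    "Title A": {"3070": 72, "3080": 95},
    "Title B": {"3070": 55, "3080": 71},
}

for title, fps in results_4k.items():
    gap = (fps["3080"] / fps["3070"] - 1) * 100
    floor = "clears" if fps["3070"] >= 60 else "misses"
    print(f"{title}: 3080 is {gap:.0f}% faster, 3070 {floor} 60 FPS")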

Also, AMD isn't going to go for some kind of massive undercut on price against NVidia. We've already seen with RDNA1 that they're fine with positioning their pricing relative to what NVidia is charging. As several others have pointed out, there's no incentive for AMD to lower their prices when they're almost assuredly going to have low supply, with consoles and Zen 3 all coming out and competing for wafers. And if they did have excess supply for their RDNA2 cards, they could just shift to making more Zen 3 chiplets, which have a much better margin than any of the consumer GPUs do.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,227
136
Also, AMD isn't going to go for some kind of massive undercut on price against NVidia. We've already seen with RDNA1 that they're fine with positioning their pricing relative to what NVidia is charging. As several others have pointed out, there's no incentive for AMD to lower their prices when they're almost assuredly going to have low supply, with consoles and Zen 3 all coming out and competing for wafers. And if they did have excess supply for their RDNA2 cards, they could just shift to making more Zen 3 chiplets, which have a much better margin than any of the consumer GPUs do.

Yes. Even ignoring supply issues, AMD won't be interested in a price war. They want healthy margins as well. Just look at the price increases on Ryzen 5000: the new CPUs cost more than Ryzen 3000, even though the chiplet is about the same size, the I/O die is reused, and the process is more mature, so yields are higher. Prices are jacked up just to get more profit. It's how the world works.

NVidia started this cycle with more reasonable prices, leaving AMD little room to cut if they want healthy margins.
 

linkgoron

Platinum Member
Mar 9, 2005
2,293
814
136
The 3070 is actually better than I expected. I thought that it would end up ~5% slower, but it's essentially identical to the 2080 Ti, and perf/watt is also ~10% better than the 3080/3090.
The memory size isn't great, but I assume that for 1440p it should be fine, maybe lower settings here and there - but it's not TOTL, so some compromises are probably expected. It's weird that RT performance is only marginally better than the 2080 Ti's (the difference is less than 5%); maybe it's the bandwidth that's limiting the card.
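The raw numbers back that suspicion up: both cards use 14 Gbps GDDR6 and only the bus width differs, so peak bandwidth works out like this (quick sketch):

Python:
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * data rate per pin
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 14.0))  # RTX 3070 (256-bit):    448.0 GB/s
print(bandwidth_gb_s(352, 14.0))  # RTX 2080 Ti (352-bit): 616.0 GB/s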

I don't get why people are saying that it's such a great value card. $/perf is decent, but not more than that IMO for a 70 card.

I'm hopeful that AMD will show us something that will beat the 3070 on all fronts tomorrow. More memory, better performance, better perf/watt (willing to sacrifice this) for the same or $50-$100 less.
 

Mopetar

Diamond Member
Jan 31, 2011
7,830
5,979
136
If you ignore the name changes, the prices that NVidia is charging are pretty much in line with what they were going all the way back to Kepler: $700 for the flagship card and $500 for the card below that. The only difference is that they were called the 680 Ti and 680 instead of the 3080 and 3070.

For anyone who gets caught up on the names NVidia has used, ask yourself if you'd feel like you got some kind of amazing deal if the 3070 had been called the 3080 Ti and what are currently the 3080 and 3090 had been given some other names (like 3090 and Titan-A), with no other changes to the prices. Sure, there's no longer any performance increase from the 2080 Ti to the 3080 Ti, but you get an xx80 Ti for $500, which is less than it's ever been at any point in history!
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Looks like almost an excellent $500 card, especially for 1080p/1440p. But I feel like this should've been the 10GB card and the 3080 12-16GB; in 1-2 years 8GB could be seriously limiting even for 1440p (if max settings are desired).

1-2 years??? Try today:


Minecraft-DSR.png


Wolfenstein-Young-Blood-DSR.png
 

DJinPrime

Member
Sep 9, 2020
87
89
51
Yes. Even ignoring supply issues, AMD won't be interested in a price war. They want healthy margins as well. Just look at the price increases on Ryzen 5000: the new CPUs cost more than Ryzen 3000, even though the chiplet is about the same size, the I/O die is reused, and the process is more mature, so yields are higher. Prices are jacked up just to get more profit. It's how the world works.

NVidia started this cycle with more reasonable prices, leaving AMD little room to cut if they want healthy margins.
Oh, I don't expect AMD to undercut on price. I expect Navi to be priced at the performance level it lands at. NV, however, is the type to do a price cut if the 6800 comes in slightly faster than the 3070.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,227
136
Oh, I don't expect AMD to undercut on price. I expect Navi to be priced at the performance level it lands at. NV, however, is the type to do a price cut if the 6800 comes in slightly faster than the 3070.

NVidia won't cut price unless the difference is very significant, and AMD likely won't make it very significant.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
1-2 years??? Try today:


Minecraft-DSR.png


Wolfenstein-Young-Blood-DSR.png
For Wolfenstein, they're setting the streaming buffer to ultra. So, in addition to the textures that get stored in VRAM, there's also a streaming buffer. In this case, they're just setting the buffer higher than what the card can handle. You can run the game with ultra textures but a more reasonable streaming setting.
Serious Sam 4 has a similar setting. If you turn off streaming completely, the game actually performs better than with it on for some people (me included).
I can't tell you not to be concerned about 8GB, but personally I'm not.
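To be clear about what I mean, here's a toy sketch of the idea (nothing to do with the actual engine code, and all numbers are invented): the streaming pool is reserved on top of everything else resident in VRAM, so an oversized pool can blow the budget even when the textures themselves would fit.

Python:
# Toy model: does a streaming pool of a given size fit alongside the rest of
# the frame's VRAM footprint? All numbers are invented for illustration.
VRAM_GB = 8.0

def fits_in_vram(resident_gb: float, streaming_pool_gb: float) -> bool:
    return resident_gb + streaming_pool_gb <= VRAM_GB

resident = 5.5  # framebuffers, geometry, resident textures, etc. (made up)
print(fits_in_vram(resident, 3.5))  # "ultra" pool -> False: spills over, stutters
print(fits_in_vram(resident, 2.0))  # smaller pool, same textures -> True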
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,780
7,108
136
Essentially matching the 2080 Ti in performance is solid. Disappointed by the amount of RAM, but as someone on a 980 Ti for the last 4 years, the card would be an across-the-board upgrade for someone like me in every metric.

Let's wait and see what AMD has in the bag, but I'm thinking this is the generation to do a core system rebuild (CPU/GPU) to a Zen 3 core and an Ampere/RDNA 2 card.
 
Jul 27, 2020
16,144
10,231
106
Essentially matching the 2080 Ti in performance is solid. Disappointed by the amount of RAM, but as someone on a 980 Ti for the last 4 years, the card would be an across-the-board upgrade for someone like me in every metric.

Let's wait and see what AMD has in the bag, but I'm thinking this is the generation to do a core system rebuild (CPU/GPU) to a Zen 3 core and an Ampere/RDNA 2 card.
Could be a mistake with DDR5 hardly a year away. It could make DDR4 systems look really old. Think carefully unless you know something about DDR5's performance that most of us don't.
 

Mopetar

Diamond Member
Jan 31, 2011
7,830
5,979
136
Could be a mistake with DDR5 hardly a year away. It could make DDR4 systems look really old. Think carefully unless you know something about DDR5's performance that most of us don't.

Do you think it will make that big of a difference? Maybe for people chasing the highest frame rates at 1080p, but for 4K gaming I don't think it will change much.
 

linkgoron

Platinum Member
Mar 9, 2005
2,293
814
136
Could be a mistake with DDR5 hardly a year away. It could make DDR4 systems look really old. Think carefully unless you know something about DDR5's performance that most of us don't.

There's always something. DDR5 might be extremely expensive, and might have shortages. You also don't know if AM5 (or the Intel equivalent) will have teething issues. We might only see expensive X670 boards and no 650 boards. In a year you might say "wait for PCIe 5.0" or whatever. There's always something.

In ST, Ryzen 5000 is finally a real and complete upgrade over Skylake in every way (MT performance has already been there).
 
Jul 27, 2020
16,144
10,231
106
There's always something. DDR5 might be extremely expensive, and might have shortages. You also don't know if AM5 (or the Intel equivalent) will have teething issues. We might only see expensive X670 boards and no 650 boards. In a year you might say "wait for PCIe 5.0" or whatever. There's always something.

In ST, Ryzen 5000 is finally a real and complete upgrade over Skylake in every way (MT performance has already been there).
Memory bandwidth is crucial. I just hate the idea of crippling my brand new system with a memory technology on its way out. That's all. But that's just me.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,780
7,108
136
Could be a mistake with DDR5 hardly a year away. It could make DDR4 systems look really old. Think carefully unless you know something about DDR5's performance that most of us don't.

- I personally enjoy buying a higher end system so I can play older games at super maxed out settings, not so much playing current and next gen games at maxed out settings.

For example, my 980 Ti has served me really well playing X360 and early-gen XBO titles at 1440p/144Hz, or even with DSR cranked up for that buttery smooth look, but I haven't attempted anything too new (outside of Doom 2016) otherwise. Games like Dishonored 2, DX: Mankind Divided, and newer are waiting for an upgrade before I get to enjoy them totally maxed out.

Additionally, many newer technologies need a fair amount of time before their benefits are baked into game engines. We've had cheap, plentiful cores since around when I bought my system, but my CPU is only really starting to be a major bottleneck in the newest titles. With the next-gen consoles stamped out using current-gen tech (and some NVMe direct-access magic), I don't anticipate things like DDR5 are going to affect the gaming experience substantially.

My goal would be to get set up with a solid 8c/16t CPU and a platform that's PCIe 4.0 ready and hold on to that baby for the next 4-8 years. Slap an Ampere or RDNA2 card in there and baby you got yourself a stew goin'.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Witcher 3 was one I wanted to see, as it's a game I most want to play next when I upgrade my system, and it always seemed to be heavily swayed by memory BW.

2080Ti walks away from the 3070 in Witcher 3:


2080Ti is ~22% faster than 3070 at 1440p.

The 2080 TI is already noticeably faster at 1080p in the game according to those benchmarks. And at 4k, the 3070 closes the gap somewhat.

This is not a VRAM problem.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Definitely nice to see the power draw much more under control with this one.

Makes you slightly wonder why GA102 got so power hungry, actually.

It's only 5-6% more efficient than the RTX 3080. Power draw is down primarily because the card is noticeably slower than the 3080. Which leads me to my next point - this is the largest gap between the x70 and x80 cards in perhaps the history of Nvidia's naming scheme. I fully expect an RTX 3070 Ti to be coming early next year.

Nvidia shoehorned themselves with the VRAM on their products. The RTX 3080 should have been 384-bit / 12GB, and a 3070 Ti 320-bit / 10GB.
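Those pairings follow from how the memory hangs off the die: each GDDR6/GDDR6X package sits on a 32-bit channel, and the densities on the market at launch were 1GB or 2GB per package. A quick sketch of the options (clamshell mounting, which doubles capacity, is ignored here):

Python:
# Capacity options for a given bus width: one memory package per 32-bit channel,
# with 1GB or 2GB per package (clamshell doubling ignored for simplicity).
def capacity_options_gb(bus_width_bits: int) -> list[int]:
    packages = bus_width_bits // 32
    return [packages * density_gb for density_gb in (1, 2)]

print(capacity_options_gb(384))  # [12, 24] -> a 384-bit 3080 lands on 12GB
print(capacity_options_gb(320))  # [10, 20] -> the actual 320-bit / 10GB 3080
print(capacity_options_gb(256))  # [8, 16]  -> the 256-bit / 8GB 3070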
 

DJinPrime

Member
Sep 9, 2020
87
89
51
Could be different settings, different areas of the game tested, etc. That doesn't mean there's a "problem" with Guru3d's results.

Edit: I'm not saying there may not be an issue in the results, but you can't just assume so based upon a single result from another website which probably is testing under different conditions.
They themselves say it's an issue on their end. Quote:
Witcher III was the one and only title we had some perf issues with. We ran the test several times with a couple of new driver installs as well. We'll consider this to be an anomaly on our side for now; however, we always report what we measure.