
Question nVidia 3070 reviews thread


psolord

Golden Member
Sep 16, 2009
1,308
352
136
Witcher 3 was one I wanted to see, as it's a game I most want to play next when I upgrade my system, and it always seemed to be heavily swayed by memory BW.

2080Ti walks away from the 3070 in Witcher 3:


2080Ti is ~22% faster than 3070 at 1440p.
Just a heads-up, they say it was fixed after a re-installation of the game! :S

The 3070 is now just as fast!
 
  • Like
Reactions: Mopetar

BarkingGhostar

Diamond Member
Nov 20, 2009
7,667
913
126
Witcher 3 was one I wanted to see, as it's a game I most want to play next when I upgrade my system, and it always seemed to be heavily swayed by memory BW.

2080Ti walks away from the 3070 in Witcher 3:


2080Ti is ~22% faster than 3070 at 1440p.
And 40-100% more expensive. Sure, I bet paying 50-100% more will yield better results in just about any situation. Most places I looked, 2080 Tis were going for $900-1500. YMMV.
 

amenx

Platinum Member
Dec 17, 2004
2,740
494
126
You mean like this?




350ms spikes on the 3070 are "minor"? Lulz.

The problem here is that even with repeated objective examples from multiple games, some people simply refuse to believe (tm).
Nah, the problem is that too many variables can affect what you observe, variables that arise from different cards with different specs. VRAM capacity is not the only thing that can cause rough frame times or spikes. You are basically treading uncharted territory with assumptions (as am I), and as stated, the ONLY way to be absolutely certain is to remove the variables, meaning identical cards with different VRAM amounts is the only way to go about it. You may be right in the end, but certainly not based on what you've argued or shown.
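For what it's worth, the "rough frame times or spikes" part is at least easy to quantify from a captured frame-time log (e.g. PresentMon-style output). A minimal sketch; the function name, the 50 ms spike threshold, and the sample trace are all my own illustrative choices, not anything from the reviews:

```python
# Hypothetical sketch: summarize a frame-time log (one entry per frame, in ms).
# The 50 ms spike threshold and all names here are illustrative choices.

def frame_stats(frametimes_ms, threshold_ms=50.0):
    """Return (avg_ms, worst_1pct_ms, spike_count) for a frame-time trace."""
    ordered = sorted(frametimes_ms)
    avg = sum(ordered) / len(ordered)
    # 1%-low style metric: the frame time at the 99th percentile
    # (integer arithmetic avoids float truncation surprises).
    worst_1pct = ordered[len(ordered) * 99 // 100]
    spikes = sum(1 for t in frametimes_ms if t > threshold_ms)
    return avg, worst_1pct, spikes

# Example: 99 smooth ~60 fps frames plus one 350 ms hitch. The average
# barely moves, but the percentile and spike count expose the stutter.
trace = [16.7] * 99 + [350.0]
avg, worst, spikes = frame_stats(trace)
```

Of course, comparing these numbers between two cards is only meaningful if, as argued above, everything else about the test setup is held constant.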
 

guidryp

Senior member
Apr 3, 2006
589
465
136
But this is clearly false: https://www.pcgameshardware.de/Geforce-RTX-3070-Grafikkarte-276747/Tests/8-GB-vs-16-GB-Benchmarks-1360672/2/

At 1440p, witness the massive frame spikes in Minecraft and Wolfenstein on the 8GB 2080S/3070 compared to the 11GB 2080TI, which delivers a stable framerate.

Also, the 2080TI is twice as fast in Minecraft and 3.5x faster in Wolfenstein than the 8GB cards in average framerate, far in excess of the memory bandwidth difference.

I've been saying for months 8GB isn't enough for 2080S performance levels or higher. Growing evidence continues to prove this.

I had 8GB for $379 on a 1070; now almost five years later I'm being asked to pay $500 for the same 8GB. That's not right, folks.
Great, so we have a new poster child for this FUD.

This is in no way representative of how other games' render pipelines work.

Minecraft RTX is beta software, and it defaults to 8 chunks when you turn on RTX. What he does is turn the chunk count up to the max with RTX on to choke off the VRAM.

It's an oddball corner case, not a sign of things to come.

A great example of the yellow "journalism" of our times, though.
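One point both sides here can check with simple arithmetic: the published memory bandwidths alone can't produce gaps of 2x-3.5x. A quick back-of-envelope sketch (448 GB/s and 616 GB/s are the two cards' spec-sheet bandwidths; the gap figures are the ones quoted above):

```python
# Spec-sheet memory bandwidths in GB/s.
bw_3070 = 448.0    # RTX 3070, 8 GB GDDR6
bw_2080ti = 616.0  # RTX 2080 Ti, 11 GB GDDR6

bw_ratio = bw_2080ti / bw_3070  # 1.375: the 2080 Ti has ~37.5% more bandwidth

# Average-framerate gaps quoted in the post above.
reported_gaps = {"Minecraft RTX": 2.0, "Wolfenstein": 3.5}

# Gap per unit of bandwidth advantage; anything well above 1.0 means
# bandwidth alone can't be the limiter.
excess = {game: gap / bw_ratio for game, gap in reported_gaps.items()}
```

Whether the excess comes from VRAM capacity or from an unrepresentative settings choice, as argued here, is exactly what the raw numbers can't settle on their own.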
 

Stuka87

Diamond Member
Dec 10, 2010
5,216
985
126
It worked fine before anyone knew it existed and continued to work fine after everyone found out. The lawsuit was because they weren't truthful in the specs, not because it affected performance. But don't let that stop you from continuing to beat this long-dead horse, if that's your thing.
To be clear, you are the one who brought it up, and then made a false claim that nobody was affected by it and that everything worked fine.

Games that didn't use much video memory did work ok the majority of the time. But the issue happened often enough that, like I said, nVidia put in a driver change to mitigate it. If it wasn't a real issue, there would not have been a driver change.

Guru3D updated their results:

"Note: we have updated the Witcher III charts, the initial test results showed a performance that was odd. After reinstalling the game and applying the original configuration, the performance normalized. From the looks of things a configuration error somehow kicked in, which now is taking properly. "

The 3070 and the 2080 Ti are now neck and neck.
Figured they had something weird since other reviews did not show the issue. Although it makes me wonder what setting tanked performance so much.
 

SKORPI0

Lifer
Jan 18, 2000
16,788
1,029
126
I'll wait for the release and review of the AMD Radeon™ RX 6000 Series. See how specs compare. But I'll still get a RTX 3080 to replace the eVGA 1080Ti FTW3.
 

JoeRambo

Senior member
Jun 13, 2013
913
648
136
I'll wait for the release and review of the AMD Radeon™ RX 6000 Series. See how specs compare. But I'll still get a RTX 3080 to replace the eVGA 1080Ti FTW3.
Same. With Cyberpunk delayed, I have plenty of time before the game and a few patches hit and I actually need the card. Hopefully RX 6000 is very strong and forces NV to release a 3080 Ti with 12GB of RAM; that would be the perfect upgrade for my 1080 Ti.
 

DJinPrime

Member
Sep 9, 2020
72
80
51
Ok, you said that there was a problem on the reviewer's end implying an issue with their setup. Please show me where the reviewer said they had an issue with their setup.
Lol, guess you didn't read the review, or my quote from the review.

NVidia won't cut price unless the difference is very significant, and AMD likely won't make it very significant.
Dang it on the 6800 price. Probably no surprise price change for the 3070 tomorrow...
 

Hitman928

Diamond Member
Apr 15, 2012
3,200
3,209
136
Lol, guess you didn't read the review, or my quote from the review.
Yes I did. You still made an assumption, which is what I was pointing out. That assumption turned out to be correct, which I'm fine with, but that doesn't mean you should always exclude a result just because it's not what you expected. Otherwise you could ignore all bad results and come to very wrong conclusions.
 

BarkingGhostar

Diamond Member
Nov 20, 2009
7,667
913
126
8GB VRAM test added to OP.


Because it looks like it doesn't exist, or any architectural improvement doesn't make a difference. Hardware Unboxed showed RTX performance is exactly where you expect because the card is simply faster overall, not because of any specific improvements.


But this is clearly false: https://www.pcgameshardware.de/Geforce-RTX-3070-Grafikkarte-276747/Tests/8-GB-vs-16-GB-Benchmarks-1360672/2/

At 1440p, witness the massive frame spikes in Minecraft and Wolfenstein on the 8GB 2080S/3070 compared to the 11GB 2080TI, which delivers a stable framerate.

Also, the 2080TI is twice as fast in Minecraft and 3.5x faster in Wolfenstein than the 8GB cards in average framerate, far in excess of the memory bandwidth difference.

I've been saying for months 8GB isn't enough for 2080S performance levels or higher. Growing evidence continues to prove this.

I had 8GB for $379 on a 1070; now almost five years later I'm being asked to pay $500 for the same 8GB. That's not right, folks.
And it's also more than twice the price. Some of you honestly act like everyone is rich or something. You come into a thread for a mid-range card and whine about its performance, saying people need to get something a lot more expensive just because you can. People seriously looking at this card are looking for $500 cards (3070), not $1200 cards (2080Ti/11GB). BTW, I reference nVidia's own website for pricing, new for new.

I guess I am the only one not rich in here? :p
 
  • Like
Reactions: lightmanek

Hitman928

Diamond Member
Apr 15, 2012
3,200
3,209
136
And it's also more than twice the price. Some of you honestly act like everyone is rich or something. You come into a thread for a mid-range card and whine about its performance, saying people need to get something a lot more expensive just because you can. People seriously looking at this card are looking for $500 cards (3070), not $1200 cards (2080Ti/11GB). BTW, I reference nVidia's own website for pricing, new for new.

I guess I am the only one not rich in here? :p
You have a point about the price difference, but I think everyone is critical of the VRAM amount because GPUs at this performance level can push graphics hard enough to use more than 8GB of VRAM, and last gen you got the same performance with more VRAM. Additionally, AMD just announced a competitor that's 16% more expensive but faster by a similar margin (if AMD's claims are confirmed) and has twice the VRAM. So having 8GB of VRAM makes the card less competitive overall, and thus of less comparative value.
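The value argument above can be made concrete with launch MSRPs. A rough sketch; the 1.16 relative-performance figure for the 6800 is an assumption taken from AMD's own (then-unverified) launch claims, not a measured result:

```python
# Launch MSRPs; rel_perf is normalized to the 3070.
cards = {
    "RTX 3070 (8 GB)": {"price": 499, "rel_perf": 1.00},
    # Assumption: AMD's launch claims put the RX 6800 roughly 16% ahead.
    "RX 6800 (16 GB)": {"price": 579, "rel_perf": 1.16},
}

# How much more the 6800 costs up front (~0.16, i.e. ~16%).
price_premium = cards["RX 6800 (16 GB)"]["price"] / cards["RTX 3070 (8 GB)"]["price"] - 1

def dollars_per_perf(card):
    """Price divided by relative performance: lower is better value."""
    return card["price"] / card["rel_perf"]
```

On those assumptions the two cards cost roughly the same per unit of performance, which makes the extra 8GB of VRAM the effective tie-breaker.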
 

DJinPrime

Member
Sep 9, 2020
72
80
51
Yes I did. You still made an assumption, which is what I was pointing out. That assumption turned out to be correct, which I'm fine with, but that doesn't mean you should always exclude a result just because it's not what you expected. Otherwise you could ignore all bad results and come to very wrong conclusions.

Ok, you said that there was a problem on the reviewer's end implying an issue with their setup. Please show me where the reviewer said they had an issue with their setup.
They themselves say it's an issue on their end. Quote:
Wither III was the one and the only title we had some perf issues with. We ran the test several times with a couple of new driver installs as well. We'll consider this to be an anomaly on our side for now, however, we always report what we measure.
Guess you missed the entire part after I said "Quote". The "assumption" wasn't mine; I was just pointing out what the reviewer stated.

I'm not sure why you're arguing with what the reviewer himself said. He can't explain the low performance, but he decided to post the result anyway, which I don't blame him for, because he stated the issue. And I linked another review of Witcher 3 with no performance anomaly; the 3070 was slightly slower than the 2080 Ti.
If you actually read the review and my quote, which was a direct copy-and-paste from the review, you'd see they admit there's an anomaly. I don't know how you still think it's my assumption. It's not about bad results; when a result looks funny, it should be treated with caution. And the danger with posting a suspect result is that people can draw the wrong conclusion just by looking at the graph and not reading the comments that go with it.
 

Hitman928

Diamond Member
Apr 15, 2012
3,200
3,209
136
Guess you missed the entire part after I said "Quote". The "assumption" wasn't mine; I was just pointing out what the reviewer stated.



If you actually read the review and my quote, which was a direct copy-and-paste from the review, you'd see they admit there's an anomaly. I don't know how you still think it's my assumption. It's not about bad results; when a result looks funny, it should be treated with caution. And the danger with posting a suspect result is that people can draw the wrong conclusion just by looking at the graph and not reading the comments that go with it.
I read everything. If you had said, "hey, it looks to be an anomaly, so wait and see what happens as they look into it," I would have totally agreed with you. The only point I was trying to make is that you can't just dismiss the result; there have been several cases in the past where anomalous results could have been dismissed because others weren't reporting them, but they turned out to be real issues. They had even tried some troubleshooting with no success, so there were no obvious problems in the setup. In this case, it was a software issue solved by a reinstall. Great, but I'd still rather see the result they got, with the understanding that it may need more investigation.

This is akin to the Doom Eternal VRAM issue where people were dismissing HWB's results as an anomaly because other reviewers weren't getting the same result, but HWB was able to narrow it down to specific settings and another reviewer also showed the same thing when matching settings. So I am always in favor of transparency in sharing data and if there ends up being a problem with the data from a setup perspective, well, that's what corrections are for.
 

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,550
136
Dang it on the 6800 price. Probably no surprise price change for the 3070 tomorrow...
AMD isn't going to price the salvage part from a ~500 mm^2 die to compete with a smaller die on price when it has more memory and beats it on performance. If they made that card more competitive, people would want to buy more of them, and every die AMD artificially bins down to a 6800 is a die that can no longer be sold as a 6800 XT. The 6800 exists just to catch dies too defective to be sold as anything better.

We'll have to wait for AMD to launch the smaller Navi die before they have something that's more in-line with the 3070. Unless they have a 48 CU chip, I think there's going to be a really odd performance gap for AMD around the 3070 where they have a card that clearly beats it, but they'll also have a ~$400 card that's clearly not quite at the 3070's level either. I don't know if $500 is an important segment of the market, but the 3070 could well be NVidia's saving grace this generation.
 

guidryp

Senior member
Apr 3, 2006
589
465
136
Unless they have a 48 CU chip, I think there's going to be a really odd performance gap for AMD around the 3070 where they have a card that clearly beats it, but they'll also have a ~$400 card that's clearly not quite at the 3070's level either. I don't know if $500 is an important segment of the market, but the 3070 could well be NVidia's saving grace this generation.
I think they'd both kind of prefer it that way. They each have their pricing niche and don't have to compete directly as much.
 
  • Like
Reactions: VirtualLarry

LightningZ71

Senior member
Mar 10, 2017
571
511
106
They should have a little RDNA2 Navi that's roughly half of Big Navi. It's probably an 8GB card with half the Infinity Cache. Even in that config, it's likely a 1440p-capable card and a 1080p monster. They need something for the *600/*700 market. If they're smart, they will segment: lower-capacity GDDR6X for the 700, and slower, low-end GDDR6, but still at 8GB, for the 600.
 

cmdrdredd

Lifer
Dec 12, 2001
26,707
249
106
Since when does a game have to be good in order to become a graphics benchmark? ;)
For me? I have no reason to buy a card for a game I'm not specifically upgrading for. I'm way past the "it's tough on hardware, therefore it's the standard" viewpoint these days.
 

Zstream

Diamond Member
Oct 24, 2005
3,397
277
136
I went to Best Buy's website 30 minutes before launch. I refreshed the screen, added the item to the cart, and clicked order (PayPal was an option as well), and as soon as I clicked it, it said the item was unavailable.

Complete BS. If companies don't fix this crap, more and more people will head to consoles. I'm not putting up with this for a $500 video card.
 
  • Like
Reactions: Tlh97 and Edrick

SKORPI0

Lifer
Jan 18, 2000
16,788
1,029
126
Had an Nvidia RTX 3070 in my cart from the Micro Center web store. $554 after tax & shipping, but I really want a Gigabyte RTX 3080.
The cart was deleted after I waited too long.

 

richierich1212

Platinum Member
Jul 5, 2002
2,710
340
126
I went to Best Buy's website 30 minutes before launch. I refreshed the screen, added the item to the cart, and clicked order (PayPal was an option as well), and as soon as I clicked it, it said the item was unavailable.

Complete BS. If companies don't fix this crap, more and more people will head to consoles. I'm not putting up with this for a $500 video card.
That’s only if you can even get a new console. Freaking scalpers.
 

aleader

Senior member
Oct 28, 2013
219
50
101
And it's also more than twice the price. Some of you honestly act like everyone is rich or something. You come into a thread for a mid-range card and whine about its performance, saying people need to get something a lot more expensive just because you can. People seriously looking at this card are looking for $500 cards (3070), not $1200 cards (2080Ti/11GB). BTW, I reference nVidia's own website for pricing, new for new.

I guess I am the only one not rich in here? :p
Yeah, this nonsense is getting tiring in every forum. One thing I hate is the condescending "if you can afford it" line. A lot of people don't seem to get that even if we CAN afford it, some of us don't WANT to spend that much cash on a computer part to play games. The people spending $700 to $1,500 USD just in anticipation of a SINGLE game (Cyberpunk) that isn't even out yet, and may be a mediocre pile of crap, boggle my mind.

I also couldn't care less about nerd bragging rights, and as I've said before, I'd be embarrassed to tell the other adults in my circle (family and coworkers) that I spent that much to play games. I would rather put the cash toward something else. Then again, I don't know anyone who even has a PC these days, so that could be part of it...
 

aleader

Senior member
Oct 28, 2013
219
50
101
I went to Best Buy's website 30 minutes before launch. I refreshed the screen, added the item to the cart, and clicked order (PayPal was an option as well), and as soon as I clicked it, it said the item was unavailable.

Complete BS. If companies don't fix this crap, more and more people will head to consoles. I'm not putting up with this for a $500 video card.
Yes you will, and you'll keep coming back for more no matter what they do :D With the unhinged buying frenzy going on right now, they could sell them without coolers and they'd still sell out.
 
  • Haha
Reactions: Zstream

Edrick

Golden Member
Feb 18, 2010
1,909
196
106
I went to Best Buy's website 30 minutes before launch. I refreshed the screen, added the item to the cart, and clicked order (PayPal was an option as well), and as soon as I clicked it, it said the item was unavailable.

Complete BS. If companies don't fix this crap, more and more people will head to consoles. I'm not putting up with this for a $500 video card.
Same thing happened to me. Best Buy's system is a joke.
 
