nVidia: "We expected more from the 7970"


MrK6

Diamond Member
Aug 9, 2004
I've read that before, and I read over it again just now. The last few sentences summarize the article perfectly:

Nvidia was able to integrate multi-threaded rendering into their drivers, substantially improving the performance of the game. You said that "it has been proven that Nvidia coded Civ V to run better on its GPUs," whereas that isn't really the case, is it? It has never been documented or discussed that Civilization V itself was coded to run specifically better on Nvidia's hardware. Nvidia improved their drivers by incorporating a DX11 feature that AMD's hardware should also be capable of supporting.

So do you still think it is unfair to show benchmarks of Civilization 5 when comparing Nvidia and AMD hardware, because Nvidia's driver team has enabled a DX11 feature that is, and should be, available on ALL DX11 hardware? Or do you still think there are very specific enhancements within Civilization 5 itself that take advantage of Nvidia hardware and exclude AMD hardware?
o_O I honestly don't see what's so difficult a concept here. If the game is coded to run better in multi-threaded DX11, and NVIDIA's drivers support multi-threaded DX11 while AMD's don't, then the game is automatically going to run better on NVIDIA's cards. Therefore, it's no longer a comparison of the cards, but of the drivers. It'd be like running a single-card, triple-monitor benchmark and then declaring AMD the winner. No wonder: NVIDIA doesn't support that.

Furthermore, you didn't read Ryan Smith's write-up, as he stated that it's optional. Once more programs take advantage of it, it will probably be in AMD's drivers as well.
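For the curious, any DX11 app can ask the runtime whether the driver supports this natively or whether it's being emulated in software. A minimal sketch using the standard D3D11 feature query (error handling trimmed; not from any shipping game):

```cpp
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level;
    // Create a device on the default adapter; no swap chain is needed for a query.
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, &level, nullptr)))
        return 1;

    // Ask the runtime whether the driver natively supports the two
    // DX11 threading features, or whether they fall back to software.
    D3D11_FEATURE_DATA_THREADING threading = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                &threading, sizeof(threading));

    printf("Concurrent resource creation: %s\n",
           threading.DriverConcurrentCreates ? "native" : "emulated");
    printf("Driver command lists:         %s\n",
           threading.DriverCommandLists ? "native" : "emulated");

    device->Release();
    return 0;
}
```

When `DriverCommandLists` comes back FALSE, the runtime still accepts multi-threaded submissions; it just serializes them in software, which is why the feature "works" everywhere but only gives a speedup where the driver supports it natively.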
But, um, couldn't you just put a 7970 under water and also get that same 57% overclock? Any DX11 high-end GPU with a 57% overclock in SLI/CrossFire is going to trounce a stock 7970.
Honestly, don't feed the troll. He comes into every thread about a 7970 and cries about his GTX 470s, makes up performance claims, and then never owns up to them, despite getting destroyed over and over: http://forums.anandtech.com/showthread.php?t=2221009&page=2 .
 

bryanW1995

Lifer
May 22, 2007
The 7970 is only slightly less disappointing than Bulldozer? Welcome to my ever-growing list of people whose opinions I give very little weight.

Yeah, sorry ocguy, but that statement was a bit much. Bulldozer can't beat 4-year-old Intel CPUs, it uses twice the power for the same work, heck, it can't even beat 2-year-old AMD CPU technology. The 7970 is the fastest single GPU on the planet, runs relatively cool, and overclocks like mad; it's just priced a bit higher than most of us think it should be. You see any difference between those two products now?
 

Arzachel

Senior member
Apr 7, 2011
So do you still think it is unfair to show benchmarks of Civilization 5 when comparing Nvidia and AMD hardware, because Nvidia's driver team has enabled a DX11 feature that is, and should be, available on ALL DX11 hardware? Or do you still think there are very specific enhancements within Civilization 5 itself that take advantage of Nvidia hardware and exclude AMD hardware?

Do you think that it's unfair to show benchmarks where the 6970 and 6950 maul the GTX 580 and GTX 570 due to VRAM limitations? Why are most reviewers doing their best to avoid such situations?
 

MrTeal

Diamond Member
Dec 7, 2003
I just told you, so let me try again.

The picture wasn't meant for you.

It was meant for MrTeal, who said a 900 core was crazy, which is why 950 was shown: 950 is quite a bit crazier than 900, so 900 isn't so crazy.

Perhaps I should have put the picture before saying that, to avoid confusion. But no, a 950 core in SLI on reference air with 470s is something I can't imagine being possible unless you were playing outside in the winter.

I'd maintain that a 900MHz core with the 470s is a crazy OC. That doesn't mean it's impossible, just that it's probably not achievable for most people regardless of temps or voltage, and even for you it takes water cooling to get there. Most people aren't willing to have their computer sitting in open air with a WC system.

Even with your system, with the stock cooler and with SLI, you reached 810MHz with one GPU dumping its heat into the water loop, and while the other was partially blocked by the 9800, it was also in open air and had another case fan blowing right on it. It shows just how efficient your water system is that you can add an extra 150MHz on top of what you got with the setup you pictured.

I'd actually be really interested to see what kind of temps and OC you could get with those same cards and their stock coolers in a well-ventilated case.
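As a quick sanity check on the percentages being thrown around, here's how the clocks quoted in this thread map against the GTX 470's 607MHz reference core clock (a trivial arithmetic sketch; the 810/900/950 figures are the ones from the posts above):

```cpp
#include <cstdio>

int main() {
    // GTX 470 reference core clock, and the overclocks quoted in this thread.
    const double base = 607.0;
    const double clocks[] = { 810.0, 900.0, 950.0 };
    for (double c : clocks)
        printf("%4.0f MHz = +%.1f%% over %.0f MHz\n",
               c, (c - base) / base * 100.0, base);
    return 0;
}
// Prints: 810 -> +33.4%, 900 -> +48.3%, 950 -> +56.5% (the "57%" figure above)
```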
 

96Firebird

Diamond Member
Nov 8, 2010
Nvidia cards should not be tested at 2560x resolutions! :colbert::p

Come on, this is getting petty. So a game shouldn't be reviewed because it shows one brand in a favorable light? People read reviews to see how a card will perform in the games and at the settings they play; take those games or settings away and it fails as a review.
 

SirPauly

Diamond Member
Apr 28, 2009
Do you think that it's unfair to show benchmarks where the 6970 and 6950 maul the GTX 580 and GTX 570 due to VRAM limitations? Why are most reviewers doing their best to avoid such situations?

I think it is fair to show where the memory limitations may be. I thought it was great to see Kyle's take on this in his multi-monitor articles, because the awareness creates better choices for gamers, and nVidia's partners offered double-RAM SKUs.

Offering constructive criticism is a good thing to me.
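For anyone who wants to know what they're working with, the dedicated VRAM each adapter reports is a quick DXGI query away. A minimal sketch using the standard DXGI enumeration (note it reports capacity only, not how much a given game actually uses):

```cpp
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    // Walk every adapter in the system and print its dedicated VRAM.
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"%ls: %llu MB dedicated VRAM\n", desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```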
 

Arzachel

Senior member
Apr 7, 2011
Nvidia cards should not be tested at 2560x resolutions! :colbert::p

Come on, this is getting petty. So a game shouldn't be reviewed because it shows one brand in a favorable light? People read reviews to see how a card will perform in the games and at the settings they play; take those games or settings away and it fails as a review.

Yes, because a review should provide a more or less objective performance comparison. That's why you don't choose CPU-limited games for GPU benches, and why you avoid VRAM bottlenecks.

SirPauly said:
I think it is fair to show where the memory limitations may be. I thought it was great to see Kyle's take on this in his multi-monitor articles, because the awareness creates better choices for gamers, and nVidia's partners offered double-RAM SKUs.

Offering constructive criticism is a good thing to me.

I agree that you shouldn't ignore features one product has over the other, but outliers aren't a good measure of overall performance.
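On the CPU-limited point: a common sanity check is to bench the same card at two resolutions; if the frame rate barely moves when the GPU load goes up, the result is telling you about the CPU, not the GPU. A toy sketch of that heuristic (the 10% threshold and the fps numbers are made up purely for illustration):

```cpp
#include <cstdio>

// If average fps barely drops when resolution (GPU load) goes up, the game is
// CPU-bound at those settings and the result says little about the GPU.
// The 10% threshold is arbitrary, chosen only for illustration.
bool LooksCpuBound(double fpsLowRes, double fpsHighRes) {
    return (fpsLowRes - fpsHighRes) / fpsLowRes < 0.10;
}

int main() {
    // Hypothetical numbers: the same card at 1680x1050 vs. 2560x1600.
    printf("%s\n", LooksCpuBound(92.0, 88.0) ? "CPU-bound" : "GPU-bound");
    return 0;
}
```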
 

bryanW1995

Lifer
May 22, 2007
So first to market = win? Unfortunately, AMD's retention of the market share it gained with the first DX11 cards was poor.

Generally speaking, at least with GPUs, first to market really does win. 9700 Pro = huge win. 8800 GTX = huge win. The 5870 dominated Fermi, at first because there was nothing but wood screws, and later because the original 480s/470s were too loud, hot, and power-hungry. Interestingly enough, the 580/570 fixed most of the Fermi I issues (and also came out first), smacking the 69x0. And now the wheel turns again: the 7970 is out first but not as fast as we wanted it to be.

I'll say the same thing now that I said two years ago: the longer NV takes to bring out their next-gen part, the "more better" than AMD's offering it needs to be. So if we see a reasonably fast Kepler (say 10-20% faster than the 7970) in 6 weeks or so, then we'll probably be happy with NV again, assuming power/heat/noise remain under control as they were with the 580/570. In fact, NV would get a pass in this instance because many people expect a much bigger improvement over the GTX 580 than that. Three months from now, we would need a 20-30% improvement. Six months from now, all bets are off, because AMD will already be able to start talking about their refresh part. Remember, part of NV's recent success with Fermi was due to AMD's 15-month refresh cycle from the 5870 to the 6970, while NV really pulled out all the stops by taking only 7 or 8 months to get a 20% performance improvement and a light-years-better showing in power/heat/noise.
 

destrekor

Lifer
Nov 18, 2005
Am I the only one who believes Kepler will make the 7970 essentially a $550 mid-range card?

Honestly, I'll be disappointed if they don't end up completely smashing AMD's best this generation. I think nVidia's midrange card will rival the 7970, and likely come in under $400.

The reason I say this is that this new architecture is quite a departure from what AMD used to do, while nVidia has worked the kinks out of Fermi and likely has a far better understanding of that style of architecture:

Nvidia takes Fermi (400 series), refines it, and releases Fermi 1.5 (500 series). They get a fair bit of improvement going that route.

This style of architecture is fairly foreign to AMD, and the best they are able to get right now is just a little bit over Nvidia's top Fermi product.

Nvidia should have no problem matching and/or besting the 7970.
 

SirPauly

Diamond Member
Apr 28, 2009
I agree that you shouldn't ignore features one product has over the other, but outliers aren't a good measure of overall performance.

I don't disagree on the overall context, but one should investigate hardware limitations. Personally, I enjoy really tough reviews with a lot of investigation.
 

96Firebird

Diamond Member
Nov 8, 2010
Yes, because a review should provide a more or less objective performance comparison. That's why you don't choose CPU-limited games for GPU benches, and why you avoid VRAM bottlenecks.

I disagree. I think we may be talking about different situations... I am not talking about the graphs that show all games tested at a certain resolution; those I don't like very much. But if someone is looking to see how Civ V performs on a GTX 570, they should be able to see that in a review. There is no reason not to test it just because it doesn't conform to the usual stacking of things. Just as a person should see when a game is CPU-starved: knowledge is power.
 

Arzachel

Senior member
Apr 7, 2011
I disagree. I think we may be talking about different situations... I am not talking about the graphs that show all games tested at a certain resolution; those I don't like very much. But if someone is looking to see how Civ V performs on a GTX 570, they should be able to see that in a review. There is no reason not to test it just because it doesn't conform to the usual stacking of things. Just as a person should see when a game is CPU-starved: knowledge is power.

I agree that the more games benched the better, but given the constraints on reviewers' time and sanity, I'd rather see them bench and analyze games where the performance isn't so predictable. ;)
 

bryanW1995

Lifer
May 22, 2007
I'd prefer to hear nVidia talking about their own tech. To me, that would signify that a card worthy of mention is in the works at their labs.

But this is nVidia; what do you expect from their reps? We've seen this kind of behavior from the green team before, when they are beaten and without a competing product on the shelves.

This sort of statement is ridiculous. Any marketing rep worth a crap would be trying to rain on AMD's parade right now. What are they supposed to say, "we suck, so everybody should buy a 7970"? AMD does this as well; remember when Barcelona was supposed to be 40% faster than Clovertown?
 

tviceman

Diamond Member
Mar 25, 2008
Do you think that it's unfair to show benchmarks where the 6970 and 6950 maul the GTX 580 and GTX 570 due to VRAM limitations?

Not at all, as long as the HD 6970 and HD 6950 themselves produce playable frame rates in the given situations. If not, then it is only a waste of time to see who is faster at settings no one would want to play with.
 

bryanW1995

Lifer
May 22, 2007
I agree with him. The performance was there, but EVERYTHING else was a joke: it was too expensive, too hot, basically a power-guzzling dust-buster piece of junk. Obviously NV went on to perfect it with the 500 series, and the GTX 580 was and still is one of the best GPUs ever. But IMO the 400 series was a disappointment, and many reviewers felt the same way at release.

The main takeaway from this is how Nvidia boasted that the GTX 480 would be 2-3 times faster than the 5870, and what happened? Six months later, it was 10-15% faster. Guess that was their viral marketing then as well :p If we've learned anything from history, it usually takes Nvidia two tries to perfect a new architecture.

The 8800 GTX didn't take two tries; it was a monster from the word go. 280->285 and 480->580 were just iterative improvements, in the same vein as 4870->4890 and 5870->6970, and both camps did it. Even the 9800 Pro was quite a bit faster than ATI's previous best of the best, the 9700 Pro.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
I'd actually be really interested to see what kind of temps and OC you could get with those same cards and their stock coolers in a well ventilated case.

Sadly... well, not really. I don't own a computer case.

I'd rather cut myself than listen to blower fans again; it's water, or wait until I can go water with the card I want.

I did the best I could with the hardware I could afford; I bought decent hardware during sales and found rebates and discounts whenever I could.

I wouldn't have bought 470s at $350; they weren't worth that much to me. I would have gotten a $180 460 instead, but as luck would have it, Black Friday gave me a 470 for $185.

I mean, who here doesn't enjoy overclocking? I'm not ragging on the 7970 just because; this setup was cheaper than a 580 and outperforms one. I said the same thing then as I say about the 7970: it's not enough for the cost.

[attached screenshot]


I might own two 7970s someday, but I won't pay $550 for one, no sir.
 

bryanW1995

Lifer
May 22, 2007
I'd agree future-proofing is stupid, but having a product that takes you into the future much more gracefully isn't stupid. Nor is keeping a product longer because it held up better. The 470s, when they came out, were priced below the 5870, but as you can see they outperform it considerably in modern titles, even with their drastically low factory clock speed of 607 MHz.

6% isn't far less, and it's been out for over a year?

Why would you go with AMD if what you've really wanted all along was the 580?

Maybe if your only experience is with AMD; I can't say on that front. I've had a great time with my SLI setup. I probably won't even upgrade to Kepler, at least not at first; I don't want to be in a situation where I've downgraded performance for more money, which is what the 7970 is currently offering.

I remember how happy RussianSensation was when he got his 2 x GTX 470s for ~$205 each back in the day... until he actually started using them. How disappointed was he? I haven't seen him around here lately to ask him myself, but I know that he researched reapplying the TIM and bought at least one high-flow exhaust bracket (which he later gave to me for my 480). If I'd had a choice between a 5870 in Sept 2009 and a GTX 470 in March 2010, there really would have been no choice at all; the 5870 was just a much, much better buy all around. Fortunately for me, NV figured out their Fermi power/heat/noise woes and manufactured some 480s that didn't suck so much in those respects, and I got one for less than $200 during that crazy sale last year. I'm incredibly sensitive to noise (I came from a GTX 460), and I'm still wildly happy with my GTX 480 because they fixed their issues. But if I'd gotten a GTX 480/470 at launch prices with all their issues, I probably wouldn't have stayed with them and would have switched to something from AMD.
 

tviceman

Diamond Member
Mar 25, 2008
o_O I honestly don't see what's so difficult a concept here. If the game is coded to run better in multi-threaded DX11, and NVIDIA's drivers support multi-threaded DX11 while AMD's don't, then the game is automatically going to run better on NVIDIA's cards. Therefore, it's no longer a comparison of the cards, but of the drivers. It'd be like running a single-card, triple-monitor benchmark and then declaring AMD the winner. No wonder: NVIDIA doesn't support that.

So your logic is: "If either company sucks at something, then it shouldn't be compared. AMD had a 6-month lead in DX11 hardware, yet has fallen behind Nvidia in DX11 support, and because of this, no future games with multi-threaded rendering should be used to compare Nvidia and AMD GPUs until AMD codes in said industry-standard feature."

And if AMD never codes in said industry-standard feature, then what?

On the other hand, Nvidia does not support 3+ monitor setups with one card. Therefore, no multi-monitor setups should ever be used when benchmarking Nvidia and AMD hardware, because Nvidia can't do it, so it can't possibly amount to useful information.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
I remember how happy RussianSensation was when he got his 2 x GTX 470s for ~$205 each back in the day... until he actually started using them. How disappointed was he? I haven't seen him around here lately to ask him myself, but I know that he researched reapplying the TIM and bought at least one high-flow exhaust bracket (which he later gave to me for my 480). If I'd had a choice between a 5870 in Sept 2009 and a GTX 470 in March 2010, there really would have been no choice at all; the 5870 was just a much, much better buy all around. Fortunately for me, NV figured out their Fermi power/heat/noise woes and manufactured some 480s that didn't suck so much in those respects, and I got one for less than $200 during that crazy sale last year. I'm incredibly sensitive to noise (I came from a GTX 460), and I'm still wildly happy with my GTX 480 because they fixed their issues. But if I'd gotten a GTX 480/470 at launch prices with all their issues, I probably wouldn't have stayed with them and would have switched to something from AMD.

I wasn't aware Nvidia refined GF100 :hmm:

All my cards were bought after the 5 series came out, and all of them have been nothing short of fantastic, with none of the problems others talked about. Water didn't solve any problems; it just took some beast clockers and made them stupid beast clockers lol.
 

Lepton87

Platinum Member
Jul 28, 2009
The 8800 GTX didn't take two tries; it was a monster from the word go. 280->285 and 480->580 were just iterative improvements, in the same vein as 4870->4890 and 5870->6970, and both camps did it. Even the 9800 Pro was quite a bit faster than ATI's previous best of the best, the 9700 Pro.

5870->6970 wasn't just an iterative improvement; it was a new architecture. I know the end result looks like an iterative improvement, but under the hood it's a pretty significant change.
 

exar333

Diamond Member
Feb 7, 2004
Honestly, I'll be disappointed if they don't end up completely smashing AMD's best this generation. I think nVidia's midrange card will rival the 7970, and likely come in under $400.

The reason I say this is that this new architecture is quite a departure from what AMD used to do, while nVidia has worked the kinks out of Fermi and likely has a far better understanding of that style of architecture:

Nvidia takes Fermi (400 series), refines it, and releases Fermi 1.5 (500 series). They get a fair bit of improvement going that route.

This style of architecture is fairly foreign to AMD, and the best they are able to get right now is just a little bit over Nvidia's top Fermi product.

Nvidia should have no problem matching and/or besting the 7970.

I hope it goes like this. AMD has gotten lazy, re-releasing the same stuff (which NV did a few years back) and jacking up prices. Hopefully good competition from Kepler helps with this. AMD left a lot on the table with the 7xxx series because of the lack of competition.
 

bryanW1995

Lifer
May 22, 2007
...words...
It's been proven that NVIDIA actually had Civ5 coded to run better on its GPUs alone. The fact that Anandtech still uses it in its suite is a black mark.
...more words...

What are you talking about? Ryan Smith actually posted an in-depth description of this issue almost a year ago, here in the VC&G forums. What actually happened is that Civ 5 implemented proper multi-threaded rendering. Nvidia GPUs can take advantage of this, leading to better performance in Civ 5 versus AMD GPUs, whose drivers STILL TO THIS DAY, EVEN WITH THE 7970, DON'T IMPLEMENT MULTI-THREADED RENDERING. Civ 5 is one of my favorite games, and I generally prefer AMD cards, but guess what I've been using for the past few years?

Here's Ryan's original thread post on this:

http://forums.anandtech.com/showpost.php?p=31520674&postcount=28

And Ryan mentioned during the 7970 review that AMD still hasn't implemented multi-threaded rendering. Hopefully, if enough of us bitch/moan/complain about it, they'll eventually get around to it.

Anandtech's own Ryan Smith wrote a great analysis of the situation: http://forums.anandtech.com/showpost.php?p=31520674&postcount=28 . It's great to see what kind of performance multi-threaded rendering in the drivers can lend, but it's pointless to use as a comparison until both companies are on the same page. If you're a Civ 5 buff, then it's good to see the tests regardless, but if you're designing a limited benchmarking suite, there are better games out there. It's just weak if you're going to use any form of scientific process to analyze the cards.

Wow, that's funny, you ninja'd your own criticism!

Multi-threaded rendering isn't some esoteric BS 3DMark 2006 application with no basis in reality; it's used in one of the most popular games out there and offers a huge boost in performance for any game that properly implements it. It's like the anti-PhysX, because it has a real-world performance boost and a fantastically positive impact on the gaming experience for those who use it. Why isn't that a reasonable comparison to make? Maybe if reviewers keep using Civ 5, AMD will eventually pull their heads out of their asses and implement this DX11 feature in their drivers.
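For anyone wondering what multi-threaded rendering actually looks like at the API level: D3D11 lets worker threads record draw calls into deferred contexts and hand finished command lists back to the render thread for playback. A minimal sketch of the pattern (standard D3D11 calls; this is not Civ 5's actual code, and 'device', 'rtv', and 'vertexCount' are placeholders for whatever the caller has):

```cpp
#include <d3d11.h>

// Record a chunk of draw work on a deferred context; safe to call from a
// worker thread. Error handling trimmed for brevity.
ID3D11CommandList* RecordChunk(ID3D11Device* device,
                               ID3D11RenderTargetView* rtv,
                               UINT vertexCount) {
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return nullptr;

    deferred->OMSetRenderTargets(1, &rtv, nullptr);
    deferred->Draw(vertexCount, 0);               // ...any amount of work...

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList); // FALSE: don't restore state
    deferred->Release();
    return cmdList;                               // caller must Release()
}

// Back on the render thread, replay what the workers recorded:
//   immediateContext->ExecuteCommandList(cmdList, TRUE);
//   cmdList->Release();
```

The catch, per the discussion above, is that the driver has to support command lists natively for the gains to materialize; otherwise the runtime falls back to software emulation and the speedup mostly evaporates.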
 

Lepton87

Platinum Member
Jul 28, 2009
o_O I honestly don't see what's so difficult a concept here. If the game is coded to run better in multi-threaded DX11, and NVIDIA's drivers support multi-threaded DX11 while AMD's don't, then the game is automatically going to run better on NVIDIA's cards. Therefore, it's no longer a comparison of the cards, but of the drivers. It'd be like running a single-card, triple-monitor benchmark and then declaring AMD the winner. No wonder: NVIDIA doesn't support that.


And yet you had no reservations about comparing the 6950 to the 7970 in Crysis 2 and, based on that (and some other games where the 7970 does very well), saying that it is 100% faster. And somehow you completely ignored games like Metro 2033, where the performance difference is much smaller. We all know that Crysis 2 employs ridiculous levels of tessellation not to improve visuals but to improve Nvidia's performance relative to Radeons.
 

bryanW1995

Lifer
May 22, 2007
I've read that before, and I read over it again just now. The last few sentences summarize the article perfectly:



Nvidia was able to integrate multi-threaded rendering into their drivers, substantially improving the performance of the game. You said that "it has been proven that Nvidia coded Civ V to run better on its GPUs," whereas that isn't really the case, is it? It has never been documented or discussed that Civilization V itself was coded to run specifically better on Nvidia's hardware. Nvidia improved their drivers by incorporating a DX11 feature that AMD's hardware should also be capable of supporting.

So do you still think it is unfair to show benchmarks of Civilization 5 when comparing Nvidia and AMD hardware, because Nvidia's driver team has enabled a DX11 feature that is, and should be, available on ALL DX11 hardware? Or do you still think there are very specific enhancements within Civilization 5 itself that take advantage of Nvidia hardware and exclude AMD hardware?

Excellent post.

And we can't say that NV pulled a Batman-type shenanigan either, because we'd all be incredibly pissed off by it (as would AMD). No, this is completely a driver-writing fail on AMD's part, and it's definitely cost them money over the past few years. Hopefully they'll take some of the money they saved by firing half the company and hire a few more competent programmers.
 