AMD "Never Settle" 12.11 Driver - benchmarks are in!


taltamir

Lifer
Mar 21, 2004
13,576
6
76
Welcome to VC&G, surely you've been here a while, right? ;)

Yea, you would think I would expect it by now. I think I need a break though.

This is exactly the same verbal sewage I get when arguing politics.
You tell someone you are on the same side as (or at least neutral on the subject) that they shouldn't use strawmen and unfair demonization of opponents because it is unfair to them and/or only hurts their/our argument, and suddenly you are a monster.

You would think people could at least discuss something as unimportant as a video card without doing the same. Unlike serious issues, I have no vested interest here (AMD vs. nVidia? Who cares?)
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yes it's the top GPU for this generation, however that will mean absolutely nothing in short order, so why is it still commanding such a large price premium? Now I know I'll catch flack for this because the 680 is still there as well, but my point is that I haven't bought a 680 and AMD is not in the same position as Nvidia. That's why they're offering these deals, they need sales and personally I don't think they've justified it - at least for me.

I get your point now. It's like MS selling Xbox360 for $300-400 with game / Kinect bundles or Sony offering a $269 PS3 bundle despite those consoles being old/near EOL. You would have much rather liked to see HD7970 at $299 and HD7950 at $199 than a 3 game bundle for $399. In your view this entire 28nm generation is overpriced relative to what you expected from NV (i.e., the real GK100/110) and AMD. However, NV beat AMD last 4 generations and it more or less was AMD's strategy to price their cards at $269-299/369-379. I doubt HD8970 will be $369-379 unless NV crushes it with GTX780. Maybe we are back to the previous high prices of $500-600 flagship GPUs that NV frankly never left. Only AMD stopped competing for high-end GPUs during 2900-6000 eras.

I guess you can just skip this generation then and either go for GTX700 or buy current gen cards when they drop to $170-200 like GTX480 did. I don't think AMD can afford to sell HD7970 at $299 or below and NV isn't going to lower prices on its cards as long as they keep selling. Your best bet is just to wait until 2013 or maybe until Maxwell to upgrade. GTX470/480 were also a special case, I feel, since once HD6950 came out at $299 and fell to $250, it was basically impossible for NV to sell GTX470/480 above $250. If you look at how past GPU generations played out, ATI and NV would both launch flagships at $500-600 and prices would drop over time, but still to $300-350 levels. It's only since the small die strategy that we had amazing deals, and it forced NV to be more price competitive as well. This generation, NV even raised GTX460's successor's price from $199-229 to $299 with GTX660Ti.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I meant power usage :)

My bad. Legit Reviews shows 14W more power consumption on an HD7970 Vapor-X in 4 games. Not sure how much more the 7970 GE uses specifically in BF3 with this new driver. Per Legit Reviews, the average power consumption of the GTX680 TOP system is 364W in games vs. 391W for the HD7970 GE Vapor-X. The GTX680 TOP costs $518 with shipping and has no free games. Newegg has a Gigabyte 1.1ghz 7970 which costs $450 with 3 free games and is going to be 7-10% faster on average than the GTX680 TOP (if not 15% at 1600P). The power consumption difference between the 680 and 7970 GE is practically immaterial in the context of the free game bundle, the price difference between the 2 cards, and the performance and overclocking advantage the 7970 GE holds. Even without the free games bundle, HD7970 / 7970 GE cards cost less than GTX680s and perform as fast or faster. The game bundle puts the GTX680 completely out of the running, and if you consider overclocking, unless you get a GTX680 Lightning, the GTX680 isn't even on the map without a hard volt mod. Even if you care that much about power consumption, feel free to reduce the clocks of your HD7970 to 1Ghz and drop the voltage to 1.05V and have power consumption at GTX680 levels with similar performance and still 3 free games.

Keep in mind, AMD ships HD7970 GE cards with a conservative BIOS that cranks the GPU to 1.212-1.256V at load for just 1.05ghz. That's simply not necessary, and quick adjustments in MSI AB can take care of that, bringing a stock HD7970 GE 1.05ghz very close to the 680's power consumption. Alternatively you can probably overclock that HD7970 Vapor-X they tested to 1150-1180mhz without even touching the voltage. I would expect gamers who are into overclocking and power consumption to find some balance by manually tweaking their overclocks and undervolting rather than sticking to AMD's 1.25V BIOS settings. The power consumption angle continues to be blown way out of proportion on an enthusiast forum where quick tweaks can resolve a lot of these issues, and for starters an enthusiast gaming system with a GTX680 is already drawing well above 350W.

I know I keep repeating it but it's simply not necessary to crank 1.25V into an HD7970 for 1.05ghz settings:
hd79701200mhz109vavg.jpg


Max GPU voltage is about 1.182V. I think it's fair to say that even if a person has no intention of overclocking, HD7970 / GE can operate at well below 1.25V at 1.05Ghz if the user cares to squeeze every ounce of power consumption efficiency.

And again, with the HD7970 1.1ghz Gigabyte card going for $450 with 3 free games, GTX680 is clearly overpriced now even if you take into account the 30W extra power usage at load, don't want to use MSI AB, and just look at out-of-the-box price/performance and cost per FPS including electricity costs.
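To put that power delta in perspective, here's a quick back-of-the-envelope sketch. The wattage figures are the Legit Reviews system numbers quoted above; the gaming hours and electricity rate are assumed values for illustration only:

```python
# Annual electricity-cost difference for the ~27W system power delta
# quoted above (364W GTX680 TOP vs. 391W HD7970 GE Vapor-X systems).
# hours_per_week and usd_per_kwh are assumptions, not measured data.
def annual_cost_delta(watts_extra, hours_per_week=20, usd_per_kwh=0.12):
    kwh_per_year = watts_extra / 1000 * hours_per_week * 52
    return kwh_per_year * usd_per_kwh

print(round(annual_cost_delta(391 - 364), 2))  # prints 3.37
```

At those assumed figures the difference works out to a few dollars a year, which is why it barely registers next to a $50+ price gap and a game bundle.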
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I agree.


Have you actually read my posts?

I said that I think those improvements ARE worth it to me and ARE pretty nifty.

What I took umbrage at was the reductio ad absurdum and strawman fallacy used to attack those who make a perfectly legitimate complaint. A complaint I do not share with them. Nor do I insist on the "facts" they provided being true.

Whether it takes 14w or is totally free is IRRELEVANT to everything I said because all I said was that a specific argument is unfair and made up of fallacies.
The argument in question, the one getting all those pluses from people did NOT claim that certain "facts" were false data aka "no it does not take 14w extra, that claim is false".
It instead dismissed (using both a strawman and a reductio ad absurdum fallacy) complaining about it as unacceptable.
Effectively saying "I accept your claim that it takes 14w extra, and the fact you complain about it makes you a fanboy"

I don't see why pointing out that this is what it did should make me the target of so much hate and such attacks, as well as of continued strawmen attributing to me nonsense that I never said and that is in fact the exact opposite of my own statements and opinions.
Then I am sorry you feel this way, I meant no ill will. My post was referencing the fact that you identified a fallacious argument used to attack the original assertion (that is, that the increased performance would cost extra power consumption), except that the original assertion was already based on a fallacious argument (sweeping generalization) which you continued (either intentionally or not) in that post. In either case, I think we seem to be arguing a facet of the same point.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Welp, I just read this thread, and for those new to it, a summary:
  • AMD released/will-release new drivers that improve performance ranging from 5 to 20%+
  • There are future bundles with games, and a coupon, if you like games and coupons, cheer now.
  • No price cuts
  • 14 watt increase in certain games, this may or may not burn your house down - you've been warned.
  • AMD loses more money due to bundle, nVidia is laughing all the way to the bank
  • 3x GTX 470 are still NOT obsolete, be warned about watts and burning one's house down.
  • Due to power increase, this performance upgrade isn't free, please keep an eye on your electricity bills
Haha, the bipolar nature of these threads is most amusing.
EDIT: This just in, they've been released! Fire-proof your pets, NAO!

http://support.amd.com/us/gpudownload/windows/Pages/radeonaiw_vista64.aspx
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Receiving more performance and bundles is obviously a good thing for potential or prospective gamers. This is a good thing for PC gaming overall: improved value for AMD customers, and it may take sales away from nVidia, which should help improve gamer price/performance/value for the GeForce brand too. Potentially a win-win for PC gaming and the ever-improving 28nm price/performance/value.

What kind of gamer doesn't desire more performance, savings, value and bundles?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
What kind of gamer doesn't desire more performance, savings, value and bundles?

Judging by this thread, nVidia gamers ;) I'm just messing haha - you set it up :)

anyways, this made me smile:

wow_1920_1200.gif


Will download and install IF I ever get home tonight :( haha.
 

Zxian

Senior member
May 26, 2011
579
0
0
For the people complaining about power usage, you're really arguing to spend more money for a card that will save you a dollar or two per year in utility costs? Really? Big picture must be lost on some people.

I'm looking forward to the extra power output. Winter is coming....better heat up the house while I'm gaming. :D /tongueincheek
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Judging by this thread, nVidia gamers ;) I'm just messing haha - you set it up :)

anyways, this made me smile:

wow_1920_1200.gif


Will download and install IF I ever get home tonight :( haha.

That's a nice increase and welcomed. On the surface, one may think that World of WarCraft isn't a performance hog -- but when one adds features like transparency AA (considering the huge amounts of alpha tests) and stereo 3D, one needs a nice GPU and raw performance.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Then I am sorry you feel this way, I meant no ill will.
Alright, no problem. And thank you.
I should clarify that the part at the end about being attacked should have been properly separated from the earlier argument. As your post was mild... I just quoted you because your post actually had something worth replying to.

My post was referencing the fact that you identified a fallacious argument used to attack the original assertion (that is, that the increased performance would cost extra power consumption), except that the original assertion was already based on a fallacious argument (sweeping generalization)
Ah, but pointing out one fallacy does not mean I support a fallacy by "the other side". Which you do appear to understand as you continue with:
which you continued (either intentionally or not) in that post. In either case, I think we seem to be arguing a facet of the same point.
It was not my intent to do so, and I do not think what I was saying actually came off as supporting the original fallacy. I was merely not touching upon it, because the fallacious reply I was focusing on did not actually attempt to refute it, and I have yet to sufficiently examine it to see whether it was fallacious or not (since it's not as immediately obvious as a strawman merged with a reductio ad absurdum attack).
 

moonbogg

Lifer
Jan 8, 2011
10,734
3,454
136
I'm actually feeling for all the software engineers who were probably pulling 15 hour days with no weekends in order to pull this epic driver package out of their collective hats, all the while wondering if Nvidia would hire them if their magic trick doesn't save their company. I want AMD to pull through, but this is a desperate move and this level of enthusiasm should have been present long ago. Maybe then AMD would not have the reputation of having bad drivers and wouldn't have to try so hard for a miracle.
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
lol please don't :D

There is no car analogy that is acceptable.


At the rate I am going I'll probably get blasted out of here.

I actually can't wait to see some actual user input on these drivers. I'm thinking about picking up a 7970 and running some tests to see how much of an improvement there is. I know others mention how smooth Nvidia is compared to AMD in some games (particularly BF3, I believe); I wonder if the experience is now the same between both cards. I know heavy gunfire and explosions brought my 7970 and 7950 to their knees, and 4x MSAA was practically unplayable in 64 player maps.


EDIT: If anyone wouldn't mind testing BF3 on a 64 player map 4x MSAA, Ultra graphics setting, it would be appreciated :)
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Welcome to VC&G, surely you've been here a while, right? ;)

Do you do anything but troll? You haven't added a single iota to the overall discussion. You're literally just trying to post stuff to get people arguing with each other.

Not only that, you do it with a smug "holier than thou" attitude, like you're superior to the people around here. Get over yourself.
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
First results: :eek:
My previous testing was done when the 7970 first came out and was using the 11.12s, which were pretty much the worst drivers for the 7970. Anywho, at 2560x1600, Ultra settings, 4xAA, no FXAA, I got 59.817 FPS average in my benchmark sequence (which consists of the first 60 playable seconds of the game). With the 12.11 beta 3's that same exact sequence got 70.117 FPS, or a 17.2% improvement. Not Earth-shattering, but pretty remarkable. I imagine changes from even the 12.2's and definitely the 12.9 betas will be less. Here's a link to the data and I also threw it on the graph (in purple): https://docs.google.com/spreadsheet...lmUGVQMUZReHI0bFg4czR1Z3AwdXc&hl=en_US#gid=12
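For reference, the 17.2% figure follows directly from the two averages quoted above:

```python
# Percent improvement between the 11.12 and 12.11 beta 3 averages.
before, after = 59.817, 70.117
gain = (after - before) / before * 100
print(f"{gain:.1f}%")  # prints 17.2%
```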

I'm going to try some BF3 64-player MP now :).
 

Rvenger

Elite Member | Super Moderator | Video Cards
Apr 6, 2004
6,283
5
81
I play everyday like that and it runs perfect.


I know the 670 runs fine that was not my question.



First results: :eek:
My previous testing was done when the 7970 first came out and was using the 11.12s, which were pretty much the worst drivers for the 7970. Anywho, at 2560x1600, Ultra settings, 4xAA, no FXAA, I got 59.817 FPS average in my benchmark sequence (which consists of the first 60 playable seconds of the game). With the 12.11 beta 3's that same exact sequence got 70.117 FPS, or a 17.2% improvement. Not Earth-shattering, but pretty remarkable. I imagine changes from even the 12.2's and definitely the 12.9 betas will be less. Here's a link to the data and I also threw it on the graph (in purple): https://docs.google.com/spreadsheet/...l=en_US#gid=12

I'm going to try some BF3 64-player MP now :).


MrK6 - Looks promising so far :)
 

Majcric

Golden Member
May 3, 2011
1,409
65
91
Great job, AMD. If I were buying now, I'd have my first AMD (ATI) card since the ATI 9250.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Totally subjective analysis of 64-player MP. I tried out Firestorm 64-player, a full server with all vehicles, playing at 2560x1600, Ultra settings, 4xMSAA and no FXAA. Clocks are 1325/1650. There's a remarkable improvement. To be noticeable to me I'd say that's at least 10%. Looking over the whole map at the initial screen I'm getting 60FPS; averages I'd say are in the 70FPS range, and sometimes climbing a mountain it'll peak over 100 FPS. During some really large terrain scenes or some explosions I'll drop below 60 FPS, but it's rare. Looks promising guys!
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I can also confirm a fairly large improvement in BF3. Upon loading the current level I'm playing, on the 12.9 driver I was seeing 66fps. Close out, install the 12.11 beta, restart the computer and then BF3, and I get 74fps. Not a very thorough test but very repeatable. I did each test 2 times.

This was @ 2560x1600