Spin Off: AT's Testing Methods & Uber Mode

Status
Not open for further replies.

BoFox

Senior member
May 10, 2008
689
0
0
Personally, I think that this whole "Quiet mode" and "Uber mode" is a mess. A huge mess.

AMD should've allowed AIBs to release aftermarket versions at launch (and actually send them to reviewers), while also offering reference versions to those who must have exhaust coolers.

That way, it would've removed the need for a "quiet mode", while the minority of buyers with reference exhaust versions could simply reduce the fan profile themselves (and suffer somewhat reduced clocks). Not that many would've bought reference exhaust versions of such massive GPUs if there were already aftermarket versions, after all, but it's always good to offer options (unlike the 7970 GE, which only retailed with aftermarket heatsinks).

Then it would be more like what has happened in the past - like Nvidia quietly flooding the retail channels with substantially downclocked versions of the GTS 250 months after the reference launch, without changing the specs on their website, etc.
 

Anand Lal Shimpi

Boss Emeritus
Staff member
Oct 9, 1999
663
1
0
So I've got some thoughts here, and perhaps we'll turn these into a post on the main site but I wanted to get in this thread as you guys are honestly the source/inspiration for any such post.

Let's start with why we want to define a clear framework for how general performance/power/sound testing goes. Not only does it allow for fair comparisons between products, it also helps us deal with the inevitable situation where a manufacturer submits a ringer for review (e.g. factory overclocked card). I don't think there's much argument against this point - we all want a level playing field.

Similarly, it should be obvious why we'd want to include in such a framework the idea of testing a card at default settings. Having a strict policy there prevents a situation where AMD/Intel/NVIDIA show up and say hey we're selling the card in configuration x because of yields/experience/someothervalidreason, but it's really quite awesome and can run in configuration x+50% and that's how you should test it and btw we rule the world if you test it like that. This makes a lot of sense particularly when talking about encouraging factory overclocked comparisons.

The close relationship between fan speed and performance sort of throws a wrench in all of this. When Intel first started introducing aggressive turbo modes back in Lynnfield I was worried that it would completely corrupt our ability to reliably test CPUs. It turns out that wasn't the case. With graphics however the situation is a bit different, and with the 290/290X we're beginning to get a feel for exactly why that is.
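The coupling Anand describes can be sketched with a toy steady-state model (my own illustration with made-up coefficients, not anything from AT's methodology): the fan cap limits how much heat the cooler can remove, which in turn caps the clock the card can sustain, so a "quiet" vs. "uber" fan limit changes benchmark results directly.

```python
# Toy model (assumption, not AT's methodology): a GPU boosts its clock until
# the heat the cooler removes at the temperature limit balances power draw.
# All coefficients below are invented for illustration, not measured values.

def sustained_clock(fan_rpm_pct, base_mhz=727, max_boost_mhz=1000,
                    watts_per_mhz=0.25, watts_per_fan_pct=4.0):
    """Return the steady-state clock (MHz) a given fan cap can sustain."""
    # Maximum power the cooler removes at this fan speed.
    dissipation_w = fan_rpm_pct * watts_per_fan_pct
    # Clock the card can hold without exceeding its thermal limit.
    thermally_limited = dissipation_w / watts_per_mhz
    # Clamp between the base clock and the maximum boost bin.
    return max(base_mhz, min(max_boost_mhz, thermally_limited))

# Two fan caps, two different "stock" performance levels from the same chip:
for fan_pct in (40, 55):
    print(fan_pct, sustained_clock(fan_pct))
```

Under this (invented) model the 40% cap leaves the card pinned at its base clock while 55% sustains a much higher one, which is why a reviewer can't treat the fan profile as separable from performance.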

I originally assumed the reason this was a problem now (and is going to be in the future) is because we're stuck on 28nm trying to get more performance without a good process tech solution until 14/16nm FinFET in 2015. Now I'm feeling like this is just going to be a part of the reality going forward, so we need a real solution.

AMD's Uber mode in my eyes isn't the same as a factory overclocked card. At the same time, it's not the same as what we've done in the past - which is test a totally stock configuration (reference clocks and fan speed). I personally believe in the whole living document philosophy when it comes to things like constitutions or review policies, but here's where we can get into trouble. In the case of the 290X, AMD has two modes and you can make a good argument for why you should test both. Let's now take it one step further: what happens if NVIDIA shows up next round with 3 modes? Do we test all 3? Which modes do we then compare against AMD modes, particularly if they only line up along one vector (e.g. performance or acoustics, not both). What if AMD responds the next round with 4 modes, etc... It can quickly get out of hand.

What I'd like to do here is define a good policy for what to do if this turns into a fan speed arms race. Dealing with the 290X is simple: Ryan tested both quiet and uber modes, and I can totally appreciate the argument for including analysis based on both.

What Ryan is concerned about is the future. This isn't a matter of him being lazy (me being the person he reports to, I can tell you that's definitely not the case - he's kept up an insane work schedule over these past several weeks in order to get everything done as best as possible. The launches aren't done yet for the year; add in short NDA windows, issues with cards/drivers and of course any travel, and the pace you have to keep in order to put out these reviews is insane). The precedent we set here today will directly impact what manufacturers attempt to do with their review programs in the future.

The safe bet is to stick with testing in default configurations. I am (and I assume Ryan is too) more than willing to expand/change/redefine that, but the question is how? Let's look beyond the present 290X situation and think about what happens next. If acoustics and performance become even more tightly coupled in future GPU designs, and multiple optimization points exist for each card (with 1 default setting obviously), how should we deal with that going forward?

If things get crazy, we could be in a situation where there would even have to be a tradeoff in terms of review depth vs. card configuration combinations. E.g. would you be willing to give up a resolution setting across all games tested in order to get another operating mode included? What does this do to the complexity of graphs?
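The depth-vs-combinations tradeoff at the end can be made concrete with back-of-the-envelope arithmetic (all numbers below are invented for illustration; this is not AT's actual test matrix):

```python
# Hypothetical test-matrix arithmetic: benchmark runs grow multiplicatively
# with each operating mode a vendor adds. Illustrative numbers only.
GAMES = 10          # titles in the benchmark suite
RESOLUTIONS = 3     # e.g. 1080p / 1440p / 4K
RUNS_PER_TEST = 3   # repeats for run-to-run consistency

def total_runs(cards: int, modes_per_card: int) -> int:
    """Benchmark runs needed for one round-up review."""
    return cards * modes_per_card * GAMES * RESOLUTIONS * RUNS_PER_TEST

print(total_runs(cards=8, modes_per_card=1))  # classic one-mode testing
print(total_runs(cards=8, modes_per_card=2))  # every card grows a switch
```

Doubling the modes doubles the run count outright, so with a fixed NDA window the only lever left is cutting a game, a resolution, or a card from the suite.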

I don't know that I've got the answer/a solution here, but this is the discussion I'd love to have.

As I mentioned earlier, this is a discussion that we might take to the main site at some point. We felt like we owed it to you guys to start it here given the time/effort you guys have put into it already. We're here to listen and will obviously take your input into account (e.g. the 290 fan noise update was a direct result of your feedback). All that I'd ask is please be respectful of Ryan in your discussions of his work. He really puts a ton of time and effort into this stuff and takes all of your feedback very seriously. Obviously you're free to post/say whatever (as long as it doesn't violate our ToS), but I've always been a fan of the golden rule :)

Thank you all for reading the site and for caring enough to engage in hundreds of comments on the forums and on the site itself. I'm off to bed for now but I'll check back tomorrow.

Take care,
Anand
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The real problem with not doing the test including uber mode...

[Image: nv-amd.png]


If the comparison was done in uber mode, these would have been the results. These are AT's numbers for the 780 Ti and the 290X in uber mode compared. Overall, virtually dead even except for price and noise.

What's the decision you are left with? Do you want to spend an additional $150 for a superior cooler that will be surpassed for far less money by every AIB partner? This completely changes the dynamic between the cards. Using these numbers, how can the conclusion be that the 780 Ti is the fastest GPU in the world?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Let's start with why we want to define a clear framework for how general performance/power/sound testing goes. Not only does it allow for fair comparisons between products, it also helps us deal with the inevitable situation where a manufacturer submits a ringer for review (e.g. factory overclocked card). I don't think there's much argument against this point - we all want a level playing field.

Exactly. The fairness argument was brought up several times in this thread, but some people just don't want a level playing field when it comes to AMD and NVidia. :rolleyes:
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
If the comparison was done in uber mode, these would have been the results. These are AT's numbers for the 780 Ti and the 290X in uber mode compared. Overall, virtually dead even except for price and noise.

How about you raise the fan speed on the 780 Ti so that it gets maximum boost clock speed at all times with no down clocking?

That's basically what "uber mode" is on the 290x..
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
AMD's Uber mode in my eyes isn't the same as a factory overclocked card. At the same time, it's not the same as what we've done in the past - which is test a totally stock configuration (reference clocks and fan speed). I personally believe in the whole living document philosophy when it comes to things like constitutions or review policies, but here's where we can get into trouble. In the case of the 290X, AMD has two modes and you can make a good argument for why you should test both. Let's now take it one step further: what happens if NVIDIA shows up next round with 3 modes? Do we test all 3? Which modes do we then compare against AMD modes, particularly if they only line up along one vector (e.g. performance or acoustics, not both). What if AMD responds the next round with 4 modes, etc... It can quickly get out of hand.

Let's say you have 1 card with 100 modes.

One of them will be the lowest performance and one will be the highest performance.

Everything else will be in between.

Took me 1 second to realize this.

So obviously you test the lowest performance mode and the highest performance mode and note that there is no time to review all the other modes.

Of course this is all grasping at straws.

What is the point of different SKUs if one card has wildly variable modes?

Cards being released with 4 BIOSes?
Yeah, right.

And what is this silly talk of aligning vectors?
We buy cards to play games - the vectors that matter are the ability to play games, IQ, performance and price.
All the others are minor points that we take into account after the first big 4.

In reviews you align the ability to play games (fortunately these days cards generally work fine), IQ, and then measure performance.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Exactly. The fairness argument was brought up several times in this thread, but some people just don't want a level playing field when it comes to AMD and NVidia. :rolleyes:

So because AMD has developed a more advanced turbo it's unfair on Nvidia to use it? It's not unfair on AMD not to use it? :confused:

I'm not sure why any company would have 3 or 4 different modes, but if that happened then surely picking the top and bottom one would be sensible and the readers could figure out what was happening in-between?
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
163
106
How about you raise the fan speed on the 780 Ti so that it gets maximum boost clock speed at all times with no down clocking?

That's basically what "uber mode" is on the 290x..
So what you're saying is that if one is getting less performance due to the card throttling on his 780 Ti at, say, 40% fan speed, he'll let it be & not try to up the fan speed manually o_O

The uber mode is for games, & anyone gaming would use it so long as the noise is not unbearable (the noise issue is subjective anyway) & his temps are not too high.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So because AMD has developed a more advanced turbo it's unfair on Nvidia to use it? It's not unfair on AMD not to use it? :confused:

I'm not sure why any company would have 3 or 4 different modes, but if that happened then surely picking the top and bottom one would be sensible and the readers could figure out what was happening in-between?

Yeah, AMD has such an advanced turbo mode, that you lose up to 20% due to throttling :awe:

NVidia GPU boost 2.0 is so superior, it's not even funny.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So what you're saying is that if one is getting less performance due to the card throttling on his 780 Ti at, say, 40% fan speed, he'll let it be & not try to up the fan speed manually o_O

Throttling is not an issue on Kepler, especially the GPU boost 2.0 variants. I said that to emphasize fair play..

If you allow a fan speed increase on the 290x to prevent throttling, then why not on NVidia cards? :colbert:
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
163
106
Throttling is not an issue on Kepler, especially the GPU boost 2.0 variants. I said that to emphasize fair play..

If you allow a fan speed increase on the 290x to prevent throttling, then why not on NVidia cards? :colbert:
Sure, why not, but then the argument that the stock Nvidia cooler is cool'n'quiet would be thrown out of the window, & it's not like flicking a hardware switch is cheating, is it?
 

ICDP

Senior member
Nov 15, 2012
707
0
0
The most common sense thing would be do the review in regular mode against the stock of the other card. Then, at the end of the review do an additional review with the other feature with the caveat that this is not the default mode and this is why the scores are listed separate and alone, and maybe those scores should be best compared to OC 780 models.

That would be fine as long as both modes are tested and consumers can decide for themselves what mode they prefer. A reviewer simply refusing to test or evaluating performance in both modes is deliberately hiding information that is highly relevant.

It is pretty obvious that AMD was targeting the 780 in quiet mode and the Titan in Uber Mode...trying to kill two birds with one stone (which is why they tried to trump it up as the Titan Killer). The problem is Nvidia dropped the price of the 780, negating the advantage, so AMD then updated the driver of the 290 to try to put heat on the 780, because in its regular profile it (the 290) was up against the $329 770, which would have made it overpriced.

Basically, Nvidia just outplayed AMD with this, and even though AMD does have some great cards at killer price points...they both come with very big drawbacks, as the cards were made to operate at their threshold by default rather than run cooler with more potential to overclock.

I cannot emphasize enough that AMD fans trying to nitpick AnandTech and Ryan about this is one of the worst cases of sour grapes I've ever seen. Just admit that while AMD has some great cards, they just got outplayed by Nvidia because they did not have the foresight to put a better cooler on their card after deciding to allow it to run at its peak threshold.

This is all irrelevant to this discussion. It isn't about politics no matter how much you want it to be. AMD did not get outplayed by Nvidia; Nvidia scrambled to reduce prices because AMD finally showed up with some competition. Review sites are about giving people accurate, unbiased information with objective conclusions. They should not be a vehicle for a reviewer to vent about what he personally considers unacceptable noise levels. Tell us the performance and noise level from both settings and let us decide if we think it's acceptable.

Ryan is fine breaking his "out of the box" rule when he switched Titan to DP in the driver CP, or installing PrecisionX or MSI Afterburner to test overclocking. So refusing to flick a switch is a double standard and hypocritical.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Yeah, AMD has such an advanced turbo mode, that you lose up to 20% due to throttling :awe:

NVidia GPU boost 2.0 is so superior, it's not even funny.

And yet it's still just as fast. Lose 20% from what? Clocks don't matter, performance does. Looks to me like PowerTune is better at squeezing as much performance out of a situation as possible. I'm just gonna go grab myself a Windforce 3 290 and enjoy a lower price and better performance. You can keep your "superior" GPU Boost 2.0.

PowerTune isn't the problem; everyone (but you, apparently) will agree PowerTune is more impressive. The cooler is the issue here, and because I know better, I will just get an aftermarket card that will be faster and quieter. Like you did, it seems. Looks like we agree.

Throttling is not an issue on Kepler, especially the GPU boost 2.0 variants. I said that to emphasize fair play..

If you allow a fan speed increase on the 290x to prevent throttling, then why not on NVidia cards? :colbert:

It's not a Factory option. Maybe nvidia should have thought of that too.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Sure, why not, but then the argument that the stock Nvidia cooler is cool'n'quiet would be thrown out of the window, & it's not like flicking a hardware switch is cheating, is it?

The aftermarket coolers are going to offer superior cooling, with less noise. Same for the 290x.

However, the 780 Ti has more potential gain from an increase in fan speed, as it uses less power and runs cooler, so its core clock can be boosted further than the 290X's. The reference model had a max boost clock of 1020MHz, but I wouldn't be surprised to see aftermarket versions boosting to 1100MHz and greater.

And if flicking a hardware switch isn't cheating, then setting the fan speed to manual rather than auto isn't cheating either. Both cards have the same capability, just that one does it through hardware and the other through software.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
The aftermarket coolers are going to offer superior cooling, with less noise. Same for the 290x.

However, the 780 Ti has more potential gain from an increase in fan speed, as it uses less power and runs cooler, so its core clock can be boosted further than the 290X's. The reference model had a max boost clock of 1020MHz, but I wouldn't be surprised to see aftermarket versions boosting to 1100MHz and greater.

And if flicking a hardware switch isn't cheating, then setting the fan speed to manual rather than auto isn't cheating either. Both cards have the same capability, just that one does it through hardware and the other through software.

780Ti doesn't use less power.
 

Edgy

Senior member
Sep 21, 2000
366
20
81
Anticipating and setting a policy out of fear of potential future issues that may lead to complications (multiple modes from multiple GPU vendors) may be a bit premature at this point.

As you say, Anand, there were concerns with Intel's Lynnfield and its turbo modes, but the data and analysis were done for both standard and turbo modes then, and those concerns were alleviated as time passed.

Certainly, as you say, this GPU case may be wholly different from the CPU case, but preparing a solid policy in anticipation of potential issues is not the same as implementing said policy based solely on predictions of problems that may arise.

Excluding coverage of Uber mode for these new AMD cards would be like excluding the Turbo mode details for Lynnfield, in my mind - premature.

As somewhat of a long-time reader here at AnandTech, I value all the efforts of its staff and their work, but without citing specific examples I do have to say that the reviews here in general have some undercurrent of expanded "enthusiasm" toward some vendors over others, which seems unavoidable, as all review sites I know of have these tendencies.

For example, I consider AnandTech's coverage and reviews to be more Apple/Intel/Nvidia friendly. Nothing is blatant, but words/phrases here and there, a few inconsistencies, and general article tones lead to my impressions.

Having said that, I still think AnandTech is by far the most objective and useful in its coverage, comparatively speaking, and with room for improvement.

And that's why I keep coming back here.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,528
136
What I'd like to do here is define a good policy for what to do if this turns into a fan speed arms race.

I don't see much chance of that happening given the events of the last few weeks. AMD shot itself in the foot with the noise issue, and Nvidia has wisely capitalized on it. Had AMD invested in a quality cooler they likely could have walked away with this round.

I think it's more likely that open air coolers will begin finding their way onto reference designs. That would be the easiest, cheapest way to push GPUs into higher power bands without sacrificing too much in terms of acoustics.

Ryan is fine breaking his "out of the box" rule when he switched Titan to DP in the driver CP, or installing PrecisionX or MSI Afterburner to test overclocking. So refusing to flick a switch is a double standard and hypocritical.

I get the argument, and while you're technically correct it's not exactly the same. There's a substantial number who purchase Titans for (relative to Tesla at least) cheap CUDA development, with that being the major if not singular use of the card. While the uber switch changes the noise profile of the card, the DP switch in Titan changes its entire purpose from gaming to compute. If you're running compute on a Titan, you're always going to have it in DP mode. On a 290X, you'll only run it in uber mode if you can stand the acoustics, regardless of usage.

Yeah, AMD has such an advanced turbo mode, that you lose up to 20% due to throttling :awe:

NVidia GPU boost 2.0 is so superior, it's not even funny.

AMD's new turbo is obviously more advanced to any technically competent observer in terms of using as much available headroom as possible. Whether or not it's being used and marketed in a way that benefits the consumer is of course hotly contested. Before the 290X launch AMD made a fuss about deterministic vs. non-deterministic boost, and although their stance was completely PR driven they had some valid points. At the very least, AMD deserves to be called out for the hypocrisy of it all.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
I get the argument, and while you're technically correct it's not exactly the same. There's a substantial number who purchase Titans for (relative to Tesla at least) cheap CUDA development, with that being the major if not singular use of the card. While the uber switch changes the noise profile of the card, the DP switch in Titan changes the entire purpose of the card from gaming to compute. If you're running compute on a Titan, you're always going to have it in DP mode.

Are you saying most people will run the 290X in silent mode?
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
However, the 780 Ti has more potential gain from an increase in fan speed as it uses less power and runs cooler, so the core speed can be increased further with boost than the 290x. The reference model had a max boost clock of 1020mhz, but I wouldn't be surprised to see aftermarket versions boosting to 1100 and greater.

The 780 Ti runs cooler because of the better cooler not because it consumes significantly less power.

In fact a 290X with a better cooler will consume less power than the reference 290X.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Yeah, AMD has such an advanced turbo mode, that you lose up to 20% due to throttling :awe:

NVidia GPU boost 2.0 is so superior, it's not even funny.

Lol? The 290 boosts 43% with a crappy reference cooler while the 780 Ti boosts 17% with the Titan reference cooler.
 

Z15CAM

Platinum Member
Nov 20, 2010
2,184
64
91
www.flickr.com
OK, here's input - not saying it will be the same tomorrow:

I can see why Anand is concerned about discussing testing procedures for review policies, as these new GPUs with variable aggressive turbo modes & thermal regulation do indeed throw a wrench into present procedures.

Somehow I doubt the average gamer is overly concerned about extreme OCs, but more or less expects a fair and quiet OC. This matter didn't affect my decision to buy a particular brand, other than having confidence in AMD and nVidia to stand behind their latest high-end GPUs.

I just shelled out for a Gigabyte Radeon R9 290X and an Accelero Xtreme III. Should I be concerned about this issue severely hampering the performance of my new R9 290X?
 

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,528
136
Are you saying most people will run the 290X silent mode?

A significant number of consumers, especially of the reference design, aren't ever going to experiment with the switch to begin with.

I can't speak for the more technically inclined, but if it was me then yeah, I'd run it in quiet mode. I have a reference 7970 that I run at stock clocks in order to keep the noise down.

Like Ryan I primarily use open back headphones. I have a fully balanced HD-800/LCD-3 setup with a Smyth Realiser, m903, and Audio-gd Phoenix... you could say that audio quality is important to me! Even when I'm gaming I prefer to keep my noise floor relatively low.
 

Abwx

Lifer
Apr 2, 2011
11,837
4,790
136
How about you raise the fan speed on the 780 Ti so that it gets maximum boost clock speed at all times with no down clocking?

That's basically what "uber mode" is on the 290x..

Throttling is not an issue on Kepler, especially the GPU boost 2.0 variants. I said that to emphasize fair play..

If you allow a fan speed increase on the 290x to prevent throttling, then why not on NVidia cards?
:colbert:

And this is the case, it's just that it's the other way around...

Actually, the 780 Ti's fan has a higher RPM setting than the Radeons', hence the GeForces are effectively allowed to spin their fans faster in the tests.

In such a situation we can say that a card is noisier, but we couldn't use the noise argument as a basis to equalize the tests, since one card gets a higher fan speed at equal noise, and that would be a blatant way of doctoring the results. So much for Ryan's test protocols if they follow such an unscientific path.

http://www.hardware.fr/articles/912-4/bruit-temperatures.html
 