New GPU Test Methodology

sm625

Diamond Member
May 6, 2011
8,172
137
106
This is a plea to the hardware tech review sites. When it comes to measuring GPU performance, I would like to see it more closely coupled with power consumption. What I propose is a measurement in terms of frames per watt and, for notebooks, frames per watt-hour.

If a notebook can run Crysis at 30 fps @ 720p for 68 minutes before the battery dies, and the battery capacity is 45 Wh, then the benchmark would be 30 fps * 60 s/min * 68 min / 45 Wh = 2720 frames/Wh.

In this way we would have one single, easy-to-comprehend number that tells us pretty much everything we need to know about how well a particular system can handle mobile gaming. Take 10 common titles and blend the results into one single score. If the same notebook only manages 15 fps, the score becomes 15 fps * 60 s/min * 68 min / 45 Wh = 1360 frames/Wh.
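
To make the arithmetic concrete, here is a minimal sketch of the proposed score as code (just the formula above spelled out in Python; the numbers are the same example, not real measurements):

```python
def frames_per_wh(avg_fps, runtime_minutes, battery_wh):
    """Total frames rendered on one battery charge, per watt-hour of capacity."""
    total_frames = avg_fps * 60 * runtime_minutes  # fps * seconds = frames
    return total_frames / battery_wh

# The example above: 30 fps for 68 minutes on a 45 Wh battery.
print(frames_per_wh(30, 68, 45))   # 2720.0 frames/Wh
# The same notebook and runtime at only 15 fps.
print(frames_per_wh(15, 68, 45))   # 1360.0 frames/Wh
```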

With variable TDP just around the corner, and with the discrepancy in power consumption between mobile Trinity and HD 4000 GPUs, it is clear we need something more. With some benchmarks in frames per Wh, we might even be able to tune our TDP to get the best possible combination of battery life and performance.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Your metric would be meaningless.

A laptop with a 45Wh battery that produced 1 fps and ran for 2040 minutes would have the same exact rating as your theoretical example. What has it told us exactly?


That, and laptop gaming benchmarks are typically done plugged in anyway (because they don't run at their peak on battery power).
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
What a horrible idea. Just one example: a notebook with an SSD would get a different rating than the same notebook with an HDD, simply because the SSD draws less power.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
It would be interesting to see a power consumption curve overlaid on a frames-per-second curve, to see how the two change together over the course of gameplay.

I know some review sites use an FPS-over-time chart so you can see how the framerate rises and falls during gameplay, but I never see that level of detail for power consumption.

I don't think it would be too hard either, as long as you know where to monitor the voltage/current/wattage of the card. I think some fancy cards even have ports you can probe with a voltmeter. Surely there is a program that can log the readout over time and convert it to power over time to line up with the FPS chart.
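
For what it's worth, here is a rough sketch of the logging half, assuming an NVIDIA card whose driver exposes board power through nvidia-smi's power.draw query (the interval, duration, and file name are just placeholders). The resulting CSV could then be lined up against a FRAPS-style FPS-over-time log from the same run:

```python
import csv
import subprocess
import time

def log_power_draw(outfile="power_log.csv", interval_s=1.0, duration_s=600):
    """Poll GPU board power once per interval and write timestamped samples,
    so the curve can later be overlaid on an FPS-over-time chart."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "power_w"])
        start = time.time()
        while time.time() - start < duration_s:
            # power.draw is reported in watts on cards that support the query.
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=power.draw",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True)
            watts = float(out.stdout.strip().splitlines()[0])  # first GPU only
            writer.writerow([round(time.time() - start, 1), watts])
            time.sleep(interval_s)

if __name__ == "__main__":
    log_power_draw()
```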
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I would rather see them measure latency and per-frame render time to show consistency, and treat power consumption as a separate topic. If a system doesn't perform consistently well, no one will be gaming on it anyway, which makes the power consumption data pointless.
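
If it helps, a minimal sketch of what "consistency" could look like as numbers, assuming you already have a per-frame render-time log (the sample list at the bottom is made up):

```python
import statistics

def frame_time_stats(frame_times_ms):
    """Summarize per-frame render times: mean, median, 99th percentile,
    and the number of spike frames taking more than twice the median."""
    times = sorted(frame_times_ms)
    median = statistics.median(times)
    p99 = times[int(0.99 * (len(times) - 1))]  # nearest-rank 99th percentile
    spikes = sum(1 for t in times if t > 2 * median)
    return {"mean_ms": statistics.mean(times),
            "median_ms": median,
            "p99_ms": p99,
            "spike_frames": spikes}

# Made-up example: mostly ~16.7 ms frames (60 fps) with a few stutters.
print(frame_time_stats([16.7] * 95 + [45.0, 50.0, 55.0, 60.0, 70.0]))
```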
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
Meh, there would have to be a lot more unplugged laptop gamers for this to even make sense.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
TechPowerUp already has a "performance per watt" page in all of their newer GPU reviews.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Your metric would be meaningless.

A laptop with a 45Wh battery that produced 1 fps and ran for 2040 minutes would have the same exact rating as your theoretical example. What has it told us exactly?

That would certainly be an intriguing piece of hardware. You realize you are using an impossible example, which is a logical fallacy whose exact name I cannot recall. There is no notebook, nor will there ever be one, that gets precisely 1 fps and runs for 2040 minutes. If we ever do get a notebook with that run time, it will be at 0 fps, or 0.01 fps at most, which would totally wreck the score.

As the world moves more and more toward mobile devices, battery life becomes a critical component of GPU performance, yet it is being almost totally ignored. Anyone with a first-generation iPad who has tried to play Infinity Blade has surely realized this.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
OK, fine. If it will make you feel better: 15 fps that runs for 136 minutes would score the same in your metric.

So again, what value is that supposed to tell us? Surely nothing about suitability for gaming (because it gives you absolutely no gaming performance indication). Surely nothing about battery life (I can't tell the battery life from this).

So what's it telling me?
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
In that case it's not telling you anything, because they are equal. If you have two machines:

30 fps * 60 s/min * 68 min / 45 Wh = 2720 frames/Wh
15 fps * 60 s/min * 136 min / 45 Wh = 2720 frames/Wh

They have the same score in frames per Wh. They're equal. I have no problem with them being rated as equal, because they are. They both render about 120 thousand frames before the battery runs out.

Now if you made a change to one and its frames/Wh score doubled as a result, then I would instantly know it was better. Doesn't even matter if I know what changed. Could be a battery slice was added. Could be a change from 40 nm to 22 nm. Does it really matter?
 

chimaxi83

Diamond Member
May 18, 2003
5,649
61
101
There are too many variables for this to even remotely make sense. All in all, this is pretty pointless beyond what is already provided. Laptops don't get serious gaming reviews (unless it's a niche gaming model) for a reason.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
I think it's more that he's taken two meaningful metrics (battery life and gaming framerate) and combined them in a way that strips out all of their inherent value, so the result ends up telling you absolutely nothing about either framerate or battery life.


I think he has a future creating dashboards in corporate America.
 

felang

Senior member
Feb 17, 2007
594
1
81
In that case it's not telling you anything, because they are equal. If you have two machines:

30 fps * 60 s/min * 68 min / 45 Wh = 2720 frames/Wh
15 fps * 60 s/min * 136 min / 45 Wh = 2720 frames/Wh

They have the same score in frames per Wh. They're equal. I have no problem with them being rated as equal, because they are. They both render about 120 thousand frames before the battery runs out.

Now if you made a change to one and its frames/Wh score doubled as a result, then I would instantly know it was better. Doesn't even matter if I know what changed. Could be a battery slice was added. Could be a change from 40 nm to 22 nm. Does it really matter?

They are not equal... one has twice the performance and half the battery life of the other.

I'll take one that renders at 60 fps and plug it in :)
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Make sure to multiply by screen size and divide by weight and base temperature. I want to know how many frames per watt-hour per inch per pound per pint of lap-sweat I'm getting.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
we need to know about how well a particular system can handle mobile gaming.

If you are very serious about mobile gaming, get a Nintendo 3DS or a PS Vita.

If you are very serious about mobile PC gaming, then you should be plugging your gaming laptop into an outlet, since any modern GPU it has will eat through the battery in under 2 hours of hardcore gaming.

Based on the above, the test you propose would add little to no value for the hardcore mobile gaming crowd. Something like Trinity might have better performance per watt, but what does that have to do with playing Crysis 1 or 2, Metro 2033, or The Witcher 2 maxed out?

Furthermore, if you are that hardcore about mobile PC gaming, you'd buy 2-3 spare batteries.

Unless you play PC games on a 1.5-hour train ride to work in Tokyo every morning, I don't see how this test is relevant. Not to mention that on most laptops a discrete GPU will not even run in its full performance mode on battery alone.
 