A while back I got an idea and a question.
The Idea: Find ways to show driver overhead more specifically, find ways to show GPU usage more specifically, and show that no dGPU company is consistently moving things forward.
The Question: Why is there performance variability across all the GPUs, and within each GPU, in the Thief benchmark? (Or any benchmark, for that matter?)
Is it drivers? Maybe. APIs? Maybe. Why is this happening?
This is not an AMD vs Nvidia thread. It could be, but it could just as easily be an Nvidia vs Nvidia thread, or an AMD vs AMD thread. So please avoid simplistic arguments and stay on topic. Thank you!
As you can see, each card has a min FPS and a max FPS; the spread between them is the variability.
"% of Variability" is the hardware-side measure: the higher it is, the worse the hardware is being utilized at any one moment.
"FPS Variability" is the absolute spread: the higher it is, the more likely the gaming experience will be affected. (See the sketch after the table for how these could be computed.)
From best to worst in hardware utilization:
Rank | GPU | % of Variability | FPS Variability
1st | GTX 970 Stock | 19.34% | 14.10 fps
2nd | GTX 970 OC | 19.46% | 15.10 fps
3rd | R9 290X Uber Mantle | 20.35% | 14.20 fps
4th | R9 290 D3D | 22.96% | 14.90 fps
5th | GTX 980 | 23.15% | 19.10 fps
6th | R9 280X D3D | 25.19% | 13.20 fps
7th | GTX 780 | 25.65% | 15.70 fps
8th | R9 290X Uber D3D | 27.41% | 18.50 fps
9th | GTX 770 | 27.79% | 14.20 fps
10th | R9 290X D3D | 27.89% | 18.60 fps
11th | R9 285 OC D3D | 34.26% | 19.25 fps
12th | R9 285 D3D | 37.38% | 20.30 fps
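To make the two metrics concrete, here is a minimal sketch of how they could be derived from a run's min and max FPS. The exact formula behind the table isn't stated, so the percentage below assumes (max − min) / max × 100, and the example numbers are made up rather than taken from the table.

```python
# Minimal sketch: deriving the two variability metrics from one run's
# min/max FPS. ASSUMPTION: "% of Variability" = (max - min) / max * 100;
# the table's actual formula is not stated.

def variability_metrics(min_fps, max_fps):
    """Return (fps_variability, pct_variability) for one benchmark run."""
    fps_variability = max_fps - min_fps            # absolute spread in fps
    pct_variability = fps_variability / max_fps * 100.0
    return fps_variability, pct_variability

# Example with made-up numbers (not taken from the table above):
fps_var, pct_var = variability_metrics(min_fps=58.0, max_fps=73.0)
print(f"FPS Variability: {fps_var:.2f} fps, % of Variability: {pct_var:.2f}%")
```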
As the numbers above show, performance varies considerably within a single run.
Even the top three GPUs vary by up to roughly 20% over the course of the benchmark!
The GTX 970 is utilized more consistently than the GTX 980.
The GTX 980 sits in 5th place (and is about as bad as an R9 285 in "FPS Variability"!).
Using Mantle cut the 290X Uber's % of Variability by over 7 percentage points and its FPS Variability by over 4 fps. (Quite an improvement!)
The R9 285 is clearly not well optimized here, in any way.
I would like to know what people think of this, how we can make better tests, and anything else related to the topic.
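On the question of better tests, one possible direction (a suggestion only, not what the benchmark above does) is to log per-frame render times and summarize them with percentiles instead of just min/max FPS, since a single min/max pair hides how often the slow frames happen. The sketch below assumes a plain text log with one frame time in milliseconds per line; the file name and format are placeholders, not the output of any specific tool.

```python
import statistics

def frame_time_stats(path):
    """Summarize a per-frame-time log: one frame time in milliseconds per line."""
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]

    frame_ms.sort()
    # 99th-percentile frame time: only 1% of frames take longer than this.
    p99_ms = frame_ms[int(0.99 * (len(frame_ms) - 1))]
    avg_ms = statistics.mean(frame_ms)

    return {
        "avg_fps": 1000.0 / avg_ms,
        "p99_frame_time_ms": p99_ms,
        # "1% low" FPS: frame rate implied by the worst 1% of frames.
        "1pct_low_fps": 1000.0 / p99_ms,
        "frame_time_stdev_ms": statistics.stdev(frame_ms),
    }

# Usage (hypothetical log file name):
# print(frame_time_stats("thief_frametimes.txt"))
```

Percentile-based numbers like the 99th-percentile frame time (or its "1% low" FPS equivalent) are much harder for a single spike to distort than a raw minimum, so they would say more about where the variability actually comes from.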