Well, I deleted my post because I thought I was mistaken, but it turns out I am not.

PCPer is not normalizing their data. Relative variance is far greater at High than at Ultra on the 980. However, this is not visible on the percentile frametime chart because there is no normalization to performance.
Essentially, for a distribution of fixed shape, the spread scales with the magnitude. Double the value of every point in the data set and you double the absolute frame-to-frame variance (as PCPer measures it, in milliseconds), yet it is the same data set, just scaled. Therefore the 980, with a 20-25% advantage over the 970, will show roughly 20-25% lower absolute frame time variance simply because it is faster.
Think of it this way: 980 vs. 970 on Ultra, the 980 is 25% ahead at 25 fps vs. 20 fps. On High the 980 is still 25% ahead, at 50 fps vs. 40 fps. The gap between the two cards has widened in absolute fps yet has remained constant in relative performance.
The variance graph PCPer gives us is only the absolute frame time variance, not the relative (performance-normalized) variance.
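To make the scaling argument concrete, here is a minimal sketch with hypothetical frame times (not PCPer's actual data or their exact variance formula): two cards with identical relative jitter show different absolute ms-level variance simply because one is faster, while a performance-normalized measure comes out identical.

```python
import random
import statistics

random.seed(0)

# Hypothetical numbers: both cards share the exact same relative frame-time
# jitter; only the magnitude differs (the faster card has smaller frame times).
jitter = [random.gauss(1.0, 0.05) for _ in range(1000)]
frametimes_970 = [50.0 * j for j in jitter]  # ~20 fps -> ~50 ms average frame time
frametimes_980 = [40.0 * j for j in jitter]  # ~25 fps -> ~40 ms (25% faster in fps)

def absolute_variance(frametimes):
    """Spread in milliseconds: mean frame-to-frame delta (absolute, un-normalized)."""
    deltas = [abs(b - a) for a, b in zip(frametimes, frametimes[1:])]
    return statistics.mean(deltas)

def relative_variance(frametimes):
    """Same spread, but normalized to the card's own mean frame time."""
    return absolute_variance(frametimes) / statistics.mean(frametimes)

for name, ft in [("970", frametimes_970), ("980", frametimes_980)]:
    print(f"{name}: absolute = {absolute_variance(ft):.3f} ms, "
          f"relative = {relative_variance(ft):.4f}")

# The 980's absolute ms variance comes out ~20% lower than the 970's purely
# because it is faster, while the normalized (relative) variance is identical.
```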
(The higher variance on High vs. Ultra settings is likely an effect of driver granularity and game engine execution, e.g. the player triggering some animation in the background by crossing a line.)
Likewise, the BF4 tests are pretty useless:
- The 970 comes out better than the 980 at 150% resolution scaling.
- A single spike on the 970 at 130% is meaningless, especially when the spike at 140% is smaller.
You can look at the rest of the charts, but a couple of stray spikes throw them right off, which is bad because the spikes appear to be completely random.
The variance charts also do not normalize to performance.