Originally posted by: Idontcare
Originally posted by: CTho9305
That article is crap. Only an idiot would publish
this without an explanation. A bunch of the results are clearly noise - they probably just ran each benchmark once. Maybe during that Sandra run, Windows synchronized the clock or something. A lot of what they say is just plain wrong.
Hey I just thought they were trying to say G2 was the shiznit
But you have a good point that their analysis was too cursory and didn't pay enough attention to the details. Maybe it was intern day?
For something like that you really should get yourself, say, 10 retail samples of each stepping and generate some probability distributions before attempting to speak to the significance of the mean power-consumption value of any given stepping.
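The sampling approach above could be sketched roughly like this — all the wattage numbers and sample sizes here are invented for illustration, not real stepping data; the point is just that with ~10 chips per stepping you can compute a Welch's t statistic and see whether a mean difference is actually distinguishable from unit-to-unit noise:

```python
# Hypothetical sketch: comparing mean power draw across two steppings.
# All numbers are made up; only the methodology is the point.
import random
import statistics

random.seed(0)

def sample_power(mean_w, spread_w, n=10):
    """Simulate measuring n retail chips of one stepping (watts under load)."""
    return [random.gauss(mean_w, spread_w) for _ in range(n)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

f3 = sample_power(65.0, 4.0)  # pretend 90nm F3 stepping
g2 = sample_power(61.0, 4.0)  # pretend 65nm G2 stepping

t = welch_t(f3, g2)
print(f"F3 mean {statistics.mean(f3):.1f} W, "
      f"G2 mean {statistics.mean(g2):.1f} W, t = {t:.2f}")
```

A single benchmark run per chip, like THG apparently did, gives you one point from each distribution and no way to tell signal from noise.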
BUT what doesn't help is that AMD doesn't do this themselves (i.e. publish actual power consumption numbers by stepping), and instead the task of generating this kind of comparison gets left up to the least scientific of efforts by the THG folks.
You know AMD has the data; why not just put it out there and at least reduce the chances of misleading articles doing even more harm?
But something did happen to 65nm...maybe CIT worked too well and the final iterations of 90nm ended up having 95% of the 65nm node's goodies in them already?