Originally posted by: lopri
Originally posted by: Zenoth
Even if the E8500 were $40 more, I'd still buy it over the previous 65nm models. It consumes less electricity, which over the long term saves the purchaser money and saves energy overall; that alone, in my book, is reason enough to go with the 45nm models, better performance or not.
Nothing personal, and yes, I'd pick the E8400 over the E6850 as well. But I'd like to point out that the whole performance-per-watt thingy is an 'ideology' the Industry has been pushing ever since they figured out they can't increase performance forever. Not that it's completely baseless; it is a very important factor for large computing environments (e.g. server farms) as well as portables. But not for desktops.
Even there, for server farms and home users alike, the bigger issue is heat and heat-related problems. An ordinary business machine uses less than $10 of electricity per year. Adding a discrete graphics card pushes that cost up, but I can hardly imagine more than $20-30 per PC unless sub-zero cooling is involved. Straight from the horse's mouth:
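The "less than $10 per year" figure can be sanity-checked with back-of-the-envelope arithmetic. Every input below (average draw, usage hours, electricity rate) is an assumed example value, not a figure from this thread:

```python
# Back-of-the-envelope annual electricity cost for an office desktop's CPU.
# All inputs are assumed example values, not measurements.
avg_cpu_power_w = 30     # assumed average CPU draw under light office use, watts
hours_per_day = 8        # assumed business-hours usage
days_per_year = 250      # assumed working days per year
rate_per_kwh = 0.10      # assumed electricity price, $/kWh

kwh_per_year = avg_cpu_power_w * hours_per_day * days_per_year / 1000
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")  # 60 kWh/year -> $6.00/year
```

With those assumptions the CPU alone lands around $6/year, consistent with the sub-$10 claim for a whole light-duty business machine.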
Energy Efficient Performance 2.0 (.pdf)
IMO, performance-per-watt is an overrated measure, especially in the enthusiast communities. For home users, switching off one light bulb might save more money than picking CPUs based on electricity usage.
I completely agree.
But when thinking on a large scale...
Try to imagine (and not just you, everyone should try to imagine it): if every 130nm, 90nm, and 65nm CPU from Intel/AMD were gone tomorrow and all of those consumers were running the power-efficient 45nm revision instead, how much energy per state, province, or country would be saved? How many PCs based on old CPU architectures are still in use today? A lot. In fact, almost certainly the majority.
Heck, the office computers at my job, used by my supervisors and boss, are mostly Intel P4C's, and some are old A64's. On a scale of hundreds of millions of consumers, moving from older architectures to the most recent one is always better than staying on the "last" generation while the new one is already available, better, and sometimes more affordable.
However, when people buy PC hardware such as CPUs, they often think only of "the moment", the present day. But suppose they thought: "OK, this one is newer and 20 bucks more expensive, but if I buy it I'll still save money in the long term, because it consumes less energy and my power bills will be lower, even if only to a negligible extent... it's still better." If consumers thought that way, they'd save money for themselves, which is the first important point, I presume. But if they also thought of the "collective" (let's all become Borg for a moment, shall we), they'd realize they're helping reduce power demand for the whole country, and, taken to the extreme, why not, for the whole planet.
I know it might sound far-fetched, but hell... it's how I see it.
And anyway, for me it's a matter of both. I wanted a quad core at first, but then I realized that the virtual world didn't need one more e-penis in the crowd, so I decided to stay with duals. That saved money. And with two fewer cores I save even more power, which means even more money...