The day I have to start trying to calculate performance per effing watt of electricity to base my CPU purchase is the day I throw away my keyboard and go live in a cave.
I live eight miles from a nuclear reactor for cripes sake.
I get it if it's for a company-wide, tons-of-seats deal, but I quit doing that stuff for a reason.
The price thing is harder. $20 is insignificant to an end user, right?
Is $30? $50? $10? Where do you draw the line? I've had a lot of times where
ten bucks mattered when I was younger (and married, and had a kid, lol).
It's more significant to the seller, since that $10 or whatever starts to add up over quantity.
I can tell you that when I started with a clean slate looking at AMD vs Intel a year or so ago, the "budget" I decided I should spend on a CPU was two hundred bucks or so, right at what the 2500K and the 8350 were going for at the time, as I recall.

The motherboards were what killed me: not only did Intel have what seemed like 47 different sockets and CPUs, but the number of boards spread across those different sockets and chipsets was depressing. To an OCD guy like myself, it was far, far easier to sort out and buy AMD.

If I'd seen any evidence of dramatic performance differences, I might have gone a different route. But I didn't, and I still don't. I just built another 9590/990FX box for work, and my former 955BE setup was a serious upgrade over some old ass Intel stuff the boss was running.