I don't really care about performance per watt; I doubt most enthusiasts do.
The total power used by the CPU does kind of make a difference, because it affects the cooling and thus the noise the system produces. There is also a very real limit to how much heat you can dissipate on air and then on water, and I doubt anyone wants to move to noisy, unreliable phase-change cooling just to run a modern CPU. So I think they had to stop increasing power consumption, but I don't particularly care about performance per watt as a metric in itself, only about the impact it has on a system and its cooling. I'd say there is a ceiling around 150 W that I don't want the CPU to exceed by default. Beyond that it doesn't matter to me, and I doubt it matters to many others.
However, I do care about performance. I have programs today that take tens of minutes to compile due to their size and complexity. If someone gave me a CPU with 100x as many cores or 100x the frequency, I could put it to work immediately. What I can't get working is 100 machines on the same job: it doesn't scale well because of the ratio of the size of the data to the size of the calculation, and the overhead of the network dominates. I need more performance, something like a million times more at least.
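Roughly, the scaling problem looks like this. This is only a toy cost model with made-up numbers (the data size, compute time, and link speed below are hypothetical, not measurements from my setup):

    # Toy model: farming the job out only pays off if the compute time saved
    # exceeds the time spent shipping the data over the network.
    data_gb      = 50.0    # hypothetical size of the data to distribute
    compute_s    = 600.0   # hypothetical single-machine compute time (10 minutes)
    net_gb_per_s = 0.125   # roughly a 1 Gbit/s link
    machines     = 100

    transfer_s = data_gb / net_gb_per_s             # 400 s just moving data around
    parallel_s = compute_s / machines + transfer_s  # 6 s of compute + 400 s of transfer
    print("speedup from 100 machines: %.1fx" % (compute_s / parallel_s))  # ~1.5x

When the transfer term dominates like that, throwing 100 machines at the problem buys you almost nothing; a single much faster CPU would.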
If Intel were to continue pushing on with 10% increases, it would take 8 product releases before they even doubled performance. In the 1995 to 2005 period we could have relied on getting that within a single product release. To get the 1000x we enjoyed over that decade, it would now take 73 product releases. 73! SB, IB and Haswell are all somewhat disappointing products from a computing perspective. We can already see the impact: software has moved predominantly onto the server side, into massive clusters of machines.
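The 8 and 73 are just straightforward compounding; a quick sanity check, assuming a flat 10% gain per release:

    import math
    per_release = 1.10  # assumed flat 10% improvement per release
    print(math.ceil(math.log(2)    / math.log(per_release)))   # 8 releases to double
    print(math.ceil(math.log(1000) / math.log(per_release)))   # 73 releases for 1000x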