Off the top of my head, if I'd bought an i3 instead of an i5 back in 2013, and replaced it with a brand new i3 two years later, I'd still have a slower system than what I have now, and I'd have spent more money. (Two ~$130 CPUs instead of a single $200 CPU, two motherboards instead of one, etc.) Heck, for what the two i3 rigs cost, I could have bought an i7 and been REALLY happy. 😀
So there are definitely some circumstances where "go big or go home" is fiscally responsible.
Or maybe the question is whether to get a new low end system every 3-4 years vs. a high end one every 7-8 years?
Would a Duron in 2002 -> E6300 in 2006 -> i3-540 in 2010 -> i3-4330 in 2014 be better than an OC'd Athlon Barton in 2002 -> Q6600 in 2008 -> i5-4670K in 2014? The extra platform and RAM upgrades would mostly eat up the savings on CPUs, I think.
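Just to put rough numbers on that: here's a back-of-envelope sketch of the two cadences, where every platform swap means paying for a CPU, a motherboard, and RAM. All the prices below are made-up ballpark figures for illustration, not historical data.

```python
# Back-of-envelope comparison of two upgrade cadences.
# Each upgrade step costs (cpu, board, ram); all prices are rough
# guesses for illustration, NOT historical figures.

def total_cost(upgrades):
    """Sum the cost of every upgrade step (CPU + motherboard + RAM)."""
    return sum(cpu + board + ram for cpu, board, ram in upgrades)

# Low-end path: four cheap CPUs, but four platform/RAM buys.
low_end = [
    (60, 80, 50),    # Duron, 2002
    (180, 90, 80),   # E6300, 2006
    (130, 90, 60),   # i3-540, 2010
    (140, 100, 70),  # i3-4330, 2014
]

# High-end path: pricier CPUs, but one fewer platform swap.
high_end = [
    (130, 100, 90),  # Athlon XP Barton, 2002
    (280, 120, 90),  # Q6600, 2008
    (240, 130, 80),  # i5-4670K, 2014
]

print("low end total: ", total_cost(low_end))
print("high end total:", total_cost(high_end))
```

With numbers anywhere in that ballpark, the totals land within ~10-15% of each other, which is the point: the extra motherboards and RAM on the low-end path eat most of what you saved on the CPUs.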
Thing is, the high-end consumer CPUs (consumer i7s) are about 4x as powerful as the lowest-end Pentium/Celeron desktop chips, and IIRC that's been a pretty consistent spread for a while. (At least as long as there have been consumer quads.) And we've been seeing relatively minor bumps in speed every generation for a while now. So you have to go quite a few generations back to find a high-end CPU that the current low end can really embarrass.