I remember new CPUs being introduced back in the '80s and '90s; the previous generation almost instantly became junk. Wait two generations for an upgrade and your old $2000 PC was relegated to a job as a doorstop or footstool.
Now we wait 5 years and argue endlessly about minutiae over a ~20% IPC gain. And we don't even get the full 20%, because the new chips can't clock as high as the old chips. Edit: And that's not even a real IPC increase; it's mostly due to cache re-work. To wit, on TPU's 1080p aggregate performance with a 2080 Ti, the difference between a 5800X and an i3-10300 is 5.4%. That is not noticeable to most humans, and all of it can be attributed to the clock speed difference between those two chips (i3 @ 4.4 GHz single-core turbo vs 5800X @ 4.7 GHz single-core turbo).
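A quick back-of-the-envelope check of that claim, using the turbo clocks quoted above (the variable names here are mine, not from any benchmark tool):

```python
# Single-core turbo clocks as quoted in the post.
i3_10300_turbo_ghz = 4.4   # Intel Core i3-10300
r7_5800x_turbo_ghz = 4.7   # AMD Ryzen 7 5800X

# Relative clock advantage of the 5800X.
clock_advantage = r7_5800x_turbo_ghz / i3_10300_turbo_ghz - 1
print(f"Clock advantage: {clock_advantage:.1%}")  # ~6.8%

# Observed gap: TPU's 1080p aggregate with a 2080 Ti.
observed_gap = 0.054

# The clock delta alone is larger than the measured gap,
# so clock speed can plausibly account for all of it.
print(clock_advantage >= observed_gap)
```

Since a ~6.8% clock advantage exceeds the 5.4% measured gap, the numbers are at least consistent with clocks, not IPC, explaining the difference.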
It's real clear to me that the future of desktop performance increases is going to be just like it has been for mobile phone SoCs: specialized circuitry, with developer libraries and compilers to make use of it for common algorithms. Apple gets that, and Intel is quickly moving that way.
General-purpose compute is pretty much done; it's a very 1990s concept at this point.