Just upgraded from a 2500k to an 8600k. For sure there's a considerable difference.
In games, with a 1070 @ 1080p, I have seen gains ranging from 30% up to 100%+ (in AotS, average CPU framerate, all batches @ high).
Since this thread is about the 2500k though, I would say it is still a very decent performer for the average user. Only if you are into 120/144fps gaming or heavy productivity apps will you face a problem. For office apps it's a champ.
Set it at 4.3GHz so you don't stretch it too much for long-term use, and you can play anything at 60fps with a decent graphics card. It's the only CPU from recent history that has driven such a wide range of video cards, from the 570 to the 1070. I believe the only one that will surpass it is the 8600k.
Just a quick example to put things in perspective: 2500k @ 4.8GHz vs 8600k @ 5GHz, Ultra settings, 1080p.
As you can see, the differences are not huge. That is because the game is primarily GPU-limited. What is worth noting here, however, is that the 2500k managed that 70fps while almost maxed out, whereas the 8600k was at around 50% overall utilization. So there's still a lot of headroom left on the 8600k. Throw a 2080ti into these two systems next year, do the same run at 1080p Ultra, and the difference will be huge.
See you in six years on a similar thread for the 8600k!