A 16 percent clock speed advantage will not translate into a drastically different experience in an era of stagnant IPC, increasingly multithreaded games, and the accelerating adoption of higher-resolution displays.
1440p has almost 80 percent more pixels than FHD, so a GPU would have to be roughly 80 percent faster than a 1080 Ti before the CPU becomes the bottleneck at 1440p. That means a GPU roughly two generations into the future. You'll be incredibly lucky to see that 16 percent in games, and that's before considering whether you'll have upgraded your monitor by then.
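For what it's worth, the ~80 percent figure is just pixel arithmetic. A quick Python sketch, using the standard 1920x1080 and 2560x1440 resolutions:

    fhd = 1920 * 1080   # 2,073,600 pixels
    qhd = 2560 * 1440   # 3,686,400 pixels
    print(qhd / fhd)    # ~1.78, i.e. 1440p pushes ~78% more pixels than FHD
    # To a first approximation, a GPU needs ~78% more throughput at 1440p
    # to reach the frame rates it hits at 1080p before the CPU limits it again.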
At 1440p, the 1080 Ti is actually close to 80% faster than a 980 Ti. Well, 75% if you want to be exact, according to the TPU charts:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/30.html
If a hypothetical GTX 2080 Ti (or whatever the 2018 flagship is called) has anywhere close to this gain in performance (or even a ~50% increase), then yes, we would definitely see an increase in fps between a stock 8700 and an 8700K @ 5 GHz. Even certain games today (not just ARMA, btw; Project CARS, for example) show higher performance on an 8700K @ 5 GHz vs. stock; in fact, performance scales linearly with clock speed in that game. Yes, I'm aware this is the exception rather than the norm, but my point is that certain games CAN take advantage of the additional clock speed.

Therefore, if I were building a high-end gaming rig for myself and my overall budget was ~$1500, I would most likely get the 8700K over a stock 8700. I may not even need to overclock the 8700K with current GPUs, depending on the games I play, but the additional headroom will definitely come in handy one day, as I said previously.
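To put a rough number on the headroom argument, here's a back-of-the-envelope Python sketch assuming perfectly linear clock scaling (which, as noted, only games like Project CARS actually approach). The ~4.3 GHz all-core stock figure, the 5.0 GHz overclock, and the 100 fps baseline are all my assumptions, purely for illustration:

    stock_ghz = 4.3       # assumed all-core turbo at stock
    oc_ghz = 5.0          # assumed all-core overclock on the 8700K
    fps_stock = 100.0     # hypothetical CPU-bound frame rate at stock
    fps_oc = fps_stock * (oc_ghz / stock_ghz)  # best case: fps tracks clock 1:1
    print(round(fps_oc))  # ~116 fps, i.e. the ~16 percent ceiling mentioned above

In a GPU-bound scenario you'd see less than that, which is exactly why the extra headroom matters more as GPUs get faster.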