Across different applications (games, video editing, etc.), which ones benefit more from cache and which ones benefit more from higher clock speeds? Are both important for games?
Once they're a few years old, CPUs with small caches tend to fall behind much more than equally-clocked CPUs with larger ones. Also, Intel was known to reduce cache performance in the process of reducing its size, which was one of the reasons many older Intel CPUs faltered with larger data sets. Chopping the cache in half was bad, but chopping it in half (or even to a quarter!) by reducing the number of ways is what made it bad enough to be worth spending more money on a better CPU. As that became less of an issue, they made sure the memory interface was crippled instead (C2D era).
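To see why losing ways hurts more than losing raw capacity, here's a rough C sketch (my own illustration; the stride, line count, and rep count are arbitrary assumptions, not figures for any particular CPU). It walks a handful of cache lines that all alias to the same set, then the same number of lines spread across different sets; on a low-associativity cache the first pattern thrashes even though the hot data is tiny:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define STRIDE (64 * 1024)   /* power-of-two stride: every line aliases to the same set */
#define LINES  32            /* more hot lines than the ways of a typical L1/L2 */
#define REPS   (1 << 24)

static double time_walk(volatile char *buf, size_t stride, int lines)
{
    clock_t start = clock();
    for (long r = 0; r < REPS; r++)
        buf[(size_t)(r % lines) * stride] += 1;   /* touch one cache line per iteration */
    return (double)(clock() - start) / CLOCKS_PER_SEC;
}

int main(void)
{
    size_t size = (size_t)STRIDE * LINES + 4096;
    volatile char *buf = calloc(size, 1);
    if (!buf) return 1;

    double same_set = time_walk(buf, STRIDE, LINES);      /* all 32 lines land in one set   */
    double spread   = time_walk(buf, STRIDE + 64, LINES); /* same 32 lines, different sets  */

    printf("same-set walk: %.3fs   spread walk: %.3fs\n", same_set, spread);
    free((void *)buf);
    return 0;
}
```

The hot data is only 2KB either way; the only thing that changes is how many ways each set has to absorb it, which is why cutting ways was so much nastier than cutting raw capacity.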
AMD was different, but not by much in practice. A Sempron at the same clock speed as an Athlon 64, for instance, came close in performance, but a few years down the road it feels sluggish like a Celeron, while the A64s hang in there.
Given the needs of today's games and content-creation applications, I think we've reached a point where most CPUs have enough cache to feed them, so it's not really a concern outside of servers. A truly crippled cache subsystem on an otherwise fast CPU would be a stupid corner to cut compared to other options, since too many things would be too slow without enough cache.
IoW, your i5-2500 might need that extra 3MB to feed two more real cores...but it has it. Meanwhile, the i3-2320 only has to share its cache between two cores, so in the cases where it could use more cache, it could probably also use more real cores. Since the smaller-cache CPU also has fewer cores and lower clock speeds, it tends to balance out.
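That "falls behind with larger data sets" effect is easy to see for yourself. Here's a second rough C sketch (again my own, with arbitrary sizes, nothing measured from those specific chips): it pointer-chases a single random cycle of growing size, and the nanoseconds per access jump each time the working set outgrows L1, then L2, then L3. Where those jumps land is basically the difference you feel between the small-cache and big-cache parts a few years later, once games and apps carry bigger working sets.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define STEPS (1 << 24)   /* dependent loads per measurement; takes a few seconds total */

/* Build a single random cycle (Sattolo's algorithm) and chase it, so every
 * load depends on the previous one and the whole working set gets touched. */
static double chase_ns(size_t n)
{
    size_t *next = malloc(n * sizeof *next);
    size_t i, j, tmp;
    if (!next) return -1.0;

    for (i = 0; i < n; i++) next[i] = i;
    for (i = n - 1; i > 0; i--) {
        j = (((size_t)rand() << 15) | (rand() & 0x7fff)) % i;  /* 0 <= j < i */
        tmp = next[i]; next[i] = next[j]; next[j] = tmp;
    }

    clock_t start = clock();
    size_t p = 0;
    for (i = 0; i < STEPS; i++) p = next[p];    /* serialized, cache-bound loads */
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;

    free(next);
    if (p == (size_t)-1) secs = 0.0;   /* keep p live so the loop isn't optimized away */
    return secs * 1e9 / STEPS;
}

int main(void)
{
    for (size_t kib = 16; kib <= 32 * 1024; kib *= 2)
        printf("%6zu KiB working set: %6.2f ns/access\n",
               kib, chase_ns(kib * 1024 / sizeof(size_t)));
    return 0;
}
```

Run something like that on a small-cache and a big-cache CPU at the same clock and the positions of the steps tell the whole story: same speed while the data fits, very different speed once it doesn't.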