At this point, infinite. A Core2 still surpasses the general needs of 95% of the population. The industry has compensated with increasingly inefficient software.
Core2? Nah. Not unless you bought toward the end of the line with one of the top-end Core2 models, especially a top-end Core2Quad.
Bought a first-gen i7, or better yet, a Sandy Bridge (or later) i5 or i7? Good to go for a decade, most likely.
Software isn't getting less efficient; rather, efficient threading is becoming a must, because peak single-thread performance hasn't been improving at anywhere near the rate that overall computing has. What we're seeing is single-threaded software making even somewhat recent models, especially all those Core2-series parts, look like absolute shit.
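To put numbers behind that, here's a minimal C++11 sketch - my own toy example, not from any real engine or benchmark - of why threading matters: the same workload split across all cores finishes in roughly a fraction of the single-core time, and a CPU stuck on one thread gets none of that benefit.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for per-item work (hash-ish math, purely illustrative).
static uint64_t crunch(uint64_t x) { return (x * 2654435761ULL) % 1000003ULL; }

int main() {
    const size_t n = 50000000;
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<uint64_t> partial(cores, 0);
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < cores; ++t) {
        pool.emplace_back([&partial, t, n, cores] {
            // Each thread strides through its own share; no locks needed.
            for (size_t i = t; i < n; i += cores) partial[t] += crunch(i);
        });
    }
    for (std::thread& th : pool) th.join();

    uint64_t total = 0;
    for (uint64_t p : partial) total += p;
    std::printf("threads=%u checksum=%llu\n", cores, (unsigned long long)total);
}
```

Run it pinned to one core versus all cores and the wall-clock difference is the whole story: software written like the single-threaded loop can't see the extra cores at all.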
Point being: if you barely use the full capabilities of a computer, as in, you really just use it for simple tasks, late C2Qs and any i5/i7 will probably work perfectly for a long while, especially as Windows gets more efficient with every release.
If you DO stretch those computing legs with resource-intensive software, whether regularly or just on occasion, even Core2Quads are getting long in the tooth, if they aren't already... unless you're patient, that is. I'm not known for patience with resource-intensive software unless I know even top-end systems still take time on the task: loading thousands of RAW files, editing multiple large RAW/TIFF files, encoding/transcoding, rendering if that's your line of work, and so on.
I say that because Core2Quads are a very old design, and while I said single-thread performance hasn't been changing significantly, there was a rather drastic jump between Core2 and, at the very least, the second-gen Core series (Sandy Bridge).
Some aspects of computing don't seem to have followed the hardware - but indirectly, they have. Depending on the application, something may be sampled more often, the resolution is higher, the bit rate is higher, more processing is going on at the same time so less-used features respond immediately instead of after a delay... some software always seems slow because it just keeps demanding more and more, now that it's finally capable of more and more.
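A quick back-of-envelope sketch of what I mean - the configs and numbers here are hypothetical, just to show how "a bit more" sampling, resolution, and channel count multiplies out:

```cpp
#include <cstdio>

// Back-of-envelope: raising sample rate, bit depth, and channel count
// quietly multiplies the raw data pushed around per second.
// Numbers are illustrative, not pulled from any particular program.
struct Cfg { const char* name; int rate_hz; int bits; int channels; };

int main() {
    const Cfg cfgs[] = {
        {"CD-era stereo", 44100, 16, 2},
        {"modern 7.1",    48000, 32, 8},
    };
    for (const Cfg& c : cfgs) {
        double mb_per_sec = (double)c.rate_hz * (c.bits / 8) * c.channels / 1e6;
        std::printf("%-15s %.2f MB/s of raw PCM\n", c.name, mb_per_sec);
    }
}
```

That's roughly 0.18 MB/s versus 1.54 MB/s - nearly 9x the raw data from changes that each sound small on their own, and every byte of it gets touched by the CPU somewhere.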
And games HAVE been improving. Some publishers/developers are taking advantage of the market with yearly releases on the same ol' tired engines, yes... but some developers are constantly upgrading their engines.
I mean, hell, The Witcher 2 wasn't even utilizing the latest APIs and brought systems to their knees because it looked so good. Granted, recent APIs could have introduced more efficiency - so the devs could have done a better job on that front... but I digress.
I haven't seen BF4 in a good light yet (top settings, good driver, etc.), so I can't comment on how much that has improved, but some recent engines have introduced MUCH better multi-threading (multi-core) efficiency.
Sometimes the raw number of polygons and whatnot isn't drastically changing, but behind the scenes many things have improved. When the Windows kernel audio stack changed (with NT 6.0, aka Vista), the old direct-hardware path went away, and no new form of direct hardware access ever caught on - EAX was huge, but these days you rarely, if ever, see games directly support the X-Fi or other audio hardware. So most game audio has been pushed entirely into software. Surround is a commonly supported feature now, and that's still software-driven too. That requires CPU time, more than one might expect as the audio quality goes up.
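To give a feel for why software audio eats CPU, here's a toy mixer loop - my own sketch, not code from any shipping engine - where every active voice touches every output sample, so cost scales with voice count times sample rate before resampling, surround panning, or effects even enter the picture:

```cpp
#include <cstdio>
#include <vector>

// One voice of audio: mono source samples plus per-channel gains.
// Software "surround" ends up being per-channel gain/filter math like this.
struct Voice {
    const float* samples;  // mono source data
    size_t pos;            // playback cursor
    float gainL, gainR;    // stereo panning gains
};

// Toy mixer: every active voice touches every output sample.
void mix_block(std::vector<Voice>& voices, float* out, size_t frames) {
    for (size_t f = 0; f < 2 * frames; ++f) out[f] = 0.0f;
    for (Voice& v : voices) {
        for (size_t f = 0; f < frames; ++f) {
            float s = v.samples[v.pos++];
            out[2 * f]     += s * v.gainL;
            out[2 * f + 1] += s * v.gainR;
        }
    }
}

int main() {
    std::vector<float> src(48000, 0.1f);                    // 1 s of dummy mono audio
    std::vector<Voice> voices(64, Voice{src.data(), 0, 0.7f, 0.7f});
    std::vector<float> out(2 * 480);                        // one 10 ms stereo block
    mix_block(voices, out.data(), 480);
    // 64 voices at 48 kHz means ~3 million inner-loop iterations per second.
    std::printf("mixed %zu voices, first sample %.3f\n", voices.size(), out[0]);
}
```

Bump the sample rate, add more simultaneous voices, or layer on effects, and that inner loop count climbs accordingly - all of it on the CPU now that dedicated audio hardware is out of the loop.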
Add to that shadow/lighting calculation and shadow quality (also CPU), better physics (CPU)... and the increased particle counts and modeling requirements that better physics brings (remember, more CPU time) necessarily demand more polygons and, in general, more from the GPU.
The more things in the virtual environment the user can interact with, especially the more destructible the environment becomes, the more both the CPU and GPU get taxed - and the overall visual appeal of the engine might not even change that much.
Add a higher-quality sound engine, better networking code (multiplayer), and better lighting and shadows, and you might require more CPU and GPU capability without ever changing the polygon count or textures. Increase the quality and amount of "physics" present, and you DO force more polygons to be drawn (which the developer obviously has to support, mind you).
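Here's a rough sketch of why "more physics" costs CPU faster than it looks - a naive pairwise collision check (real engines use smarter broadphases, but the pressure is the same), where the pair count grows quadratically with object count:

```cpp
#include <cstdio>
#include <vector>

struct Body { float x, y, z, radius; };

// Naive broadphase: test every pair of objects. Pairs grow as n*(n-1)/2,
// so doubling the debris roughly quadruples the CPU work.
size_t count_overlaps(const std::vector<Body>& bodies) {
    size_t hits = 0;
    for (size_t i = 0; i < bodies.size(); ++i)
        for (size_t j = i + 1; j < bodies.size(); ++j) {
            float dx = bodies[i].x - bodies[j].x;
            float dy = bodies[i].y - bodies[j].y;
            float dz = bodies[i].z - bodies[j].z;
            float r  = bodies[i].radius + bodies[j].radius;
            if (dx * dx + dy * dy + dz * dz < r * r) ++hits;
        }
    return hits;
}

int main() {
    for (size_t n : {500u, 1000u, 2000u}) {  // 2x the objects ~= 4x the pairs
        std::vector<Body> b(n, Body{0.0f, 0.0f, 0.0f, 0.5f});
        std::printf("n=%zu -> %zu pair tests\n", n, n * (n - 1) / 2);
        (void)count_overlaps(b);  // the actual work a physics step would do
    }
}
```

500 objects is about 125k pair tests; 2000 objects is about 2 million - every frame. That's why blowing a wall into rubble hits the CPU long before it hits your eyes.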
Include more demanding peripherals in the equation as well. Some gaming peripherals use 1000 Hz polling. It might not seem like much, but the less capable your CPU, the more you'll notice the impact as you add peripherals or up the ante in some way.
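Quick sketch of the arithmetic (illustrative numbers, nothing measured): at 1000 Hz, every device hands the CPU a report every millisecond, moving or not, and they stack:

```cpp
#include <cstdio>

int main() {
    const int poll_hz = 1000;  // common "gaming" polling rate: one report per ms
    for (int devices = 1; devices <= 4; ++devices) {
        int reports_per_sec = poll_hz * devices;
        // Each report is a USB transfer the CPU has to service. Cheap
        // individually, but it's fixed overhead stolen from every frame,
        // and a slower CPU feels it first.
        std::printf("%d device(s): %d reports/sec, one every %.2f ms\n",
                    devices, reports_per_sec, 1000.0 / reports_per_sec);
    }
}
```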
There are so, SO MANY things being added to game engines that you really have to stop and look at the details to see (and hear) the less-than-obvious differences.
There are many iterative games, many rehashes of the same tired idea... and many games that utilize the same engine developed what seems like forever ago, with perhaps minimal additional flourishes. I get that.
But there are plenty of golden examples to pick out of that mess that keep upping the ante in so many ways.