I wouldn't really consider games to be "general use"
I'd consider "general use" to be Excel / Word / Powerpoint / web browser / media consumption.
For that kind of usage, I think it's hard to argue that HT is worth an extra $100ish for an i7 vs. i5.
Even if you consider future-proofing... I just don't see "general" applications ever requiring massive threading. Word and Firefox just aren't gonna need 8 threads to run well. I know absolutes in this industry are a no-no, but I just don't see it happening.
-----
Gaming depends on your intended games and GPU.
This AT article:
http://www.anandtech.com/show/6934/choosing-a-gaming-cpu-single-multigpu-at-1440p/5
demonstrated how little the CPU matters when games run at high settings. Most CPU benchmarks are run at very low settings in order to highlight differences in CPU performance, but what really matters in gaming is the settings you actually use, and the reality is that most people crank up the settings until the GPU is the bottleneck and the CPU barely matters.
I think a lot of people don't realize how little the CPU matters in real-world gaming. It's how AMD can set up a rig side by side with an Intel CPU and people won't be able to tell the difference. At graphics settings people actually use, most games (not all) are pretty GPU-limited, and CPU threading, etc. barely matters.
Occasionally you get a game like GTA IV that's ported by monkeys and the CPU is an issue as a result. Skyrim was the same way before it got patched to fix the bad port. Occasionally you have games like Civ 5 or the Total War series where there's potentially so much unit AI that the CPU becomes a significant issue, or Starcraft, where unit AI is an issue because the engine is old and can only handle 2 threads.
On the other hand:
I think the stagnation we see on the CPU side makes a better argument than ever for spending an extra $100ish on an i7 and having it carry you an extra year or two before a system upgrade. I've been able to hang on to my OCed i3-530 for about two years longer than I ever expected when I bought it. If I had sprung for an i5-750 instead, I'd probably be good for one or two more years instead of looking at Haswell or potentially an outgoing Ivy.
The direction the industry is moving really has me... Mr. budget-CPU-and-OC-the-hell-out-of-it... Mr. haven't-spent-more-than-~$130-on-a-CPU-since-a-Pentium-Pro-150 (1996, I think)... looking at a $300+ CPU as a legitimately good value.
Let's face it, the days of upgrading every generation are gone. It doesn't make sense anymore. SB purchasers are potentially looking at 4 generations before a significant upgrade is worthwhile; if they had bought an i7, they might be able to stretch that to 5 or 6. It's a completely different landscape unless you're a 1%er who runs multiple GPUs and plays on the bleeding edge. Buying high end for "future proofing" didn't make sense in the past, because the next generation's OCed low-to-midrange CPUs would be faster than last gen's big dogs. Not so anymore. The big dogs aren't learning new tricks with new generations, so you can keep them around a while.