Even these days I think games can't yet fully utilize all cores of a CPU, especially when it's more than 4. Higher clock speed/performance has always been the priority in gaming over multiple cores. For example, a dual-core 4 GHz CPU would be significantly faster than a quad-core at 3 GHz in games...
This is doubly wrong.
First, it is wrong because modern CPUs have Turbo Boost, which will, as needed, park idle cores and clock the remaining cores higher. This means that in a quad-optimized game you use it as a quad core, while in a game limited to one or two threads you use it as a faster dual or single core.
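You can actually watch this happen. Here is a minimal Python sketch, assuming a Linux box that exposes per-core clocks through the cpufreq sysfs files: it pins a busy loop to core 0 and prints every core's current clock, and the loaded core should boost noticeably above the idle ones.

```python
# Minimal sketch: observe per-core clocks under a single-threaded load.
# Assumes Linux with the cpufreq sysfs interface (scaling_cur_freq).
import glob
import multiprocessing
import os
import time

def busy_loop():
    os.sched_setaffinity(0, {0})  # pin the load to core 0 (Linux-only call)
    while True:
        pass

def read_freqs_mhz():
    freqs = {}
    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
        core = path.split("/")[5]  # e.g. "cpu0"
        with open(path) as f:
            freqs[core] = int(f.read()) // 1000  # kHz -> MHz
    return freqs

if __name__ == "__main__":
    worker = multiprocessing.Process(target=busy_loop, daemon=True)
    worker.start()
    for _ in range(5):
        time.sleep(1)
        print(read_freqs_mhz())  # the loaded core should clock highest
```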
Second, it is wrong because the assertion that games are still better off on one or two cores (and thus that Turbo Boost will actually park your cores and overclock the rest rather than run in quad-core mode) no longer holds.
In outdated games with relatively light (by modern standards) CPU demands, a dual core at a higher clock would indeed be "superior", but only in very artificial benchmarks measuring hundreds of FPS, far more than your display can show (60 Hz, or 120 Hz if you have such a nice expensive monitor).
In games that require heftier CPUs, the developers had to make them quad-core capable out of necessity, because they NEEDED the CPU performance.
Those games have long since been optimized to run on quad cores and will do far better there than on a dual core.
It doesn't even need to be developer optimization... Most developers license their engine and build the game on top of it.
The most common engine in games is Unreal Engine:
http://en.wikipedia.org/wiki/List_of_Unreal_Engine_games#Unreal_Engine_3
Check out that list of games. Modern Unreal Engine 3 games will:
1. Use up all 4 cores effectively (see the sketch below).
2. Likely stutter on a 2-core CPU.
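To be concrete about what "use up all 4 cores" means in practice: engines split per-frame work (physics, AI, animation) across a pool of workers, one per core. Below is a minimal, engine-agnostic Python sketch of that work-splitting pattern; the "physics step" is a made-up stand-in workload, not any real engine's code.

```python
# Minimal sketch of the job-system pattern: split a CPU-bound workload
# across every available core instead of running it on one thread.
import multiprocessing
import time

def simulate_chunk(objects):
    # stand-in for per-object physics/AI work (pure busywork)
    acc = 0.0
    for obj in objects:
        for _ in range(1000):
            acc += (obj * 1.0001) % 7.0
    return acc

def split(data, parts):
    # divide the object list into roughly equal chunks, one per core
    step = (len(data) + parts - 1) // parts
    return [data[i:i + step] for i in range(0, len(data), step)]

if __name__ == "__main__":
    objects = list(range(8000))
    cores = multiprocessing.cpu_count()

    start = time.perf_counter()
    simulate_chunk(objects)  # single-core baseline
    single = time.perf_counter() - start

    start = time.perf_counter()
    with multiprocessing.Pool(cores) as pool:
        pool.map(simulate_chunk, split(objects, cores))
    multi = time.perf_counter() - start

    print(f"{cores} cores: {single:.2f}s single vs {multi:.2f}s parallel")
```

On a real quad core the parallel run should land at roughly a quarter of the single-core time, and that is exactly the headroom a dual core lacks.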
I empirically tested and demonstrated microstutter due to CPU limitations in Mass Effect 1 (I had a thread and a discussion about it back in the day) using an Intel E8400 Wolfdale dual core: 100% CPU utilization at minimal resolution and graphics settings, tested with both NVIDIA and AMD cards that were more than beefy enough for it at minimum quality. I overcame it by switching to a Q6600: low CPU utilization, and the microstutter was gone, as empirically verified via a FRAPS frame dump.
(Interestingly, just 8 months prior I had sold my Q6600 to buy that E8400, after writing up why quads make no sense for a gamer. I don't believe I was wrong per se; quads didn't make sense at the time, and they made sense 8 months later. I later upgraded my Q6600 to a Q9400 and then an i7.)
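For anyone who wants to reproduce that kind of test: FRAPS's benchmark mode can dump a frametimes CSV, one cumulative timestamp in milliseconds per frame. Here is a minimal Python sketch of the analysis, assuming the usual two-column Frame/Time layout; the 50%-of-mean jump threshold is an arbitrary illustration, not a standard definition of microstutter.

```python
# Minimal sketch: detect microstutter in a FRAPS frametimes CSV.
# Assumes rows of "frame number, cumulative time in ms" after a header.
import csv
import statistics
import sys

def load_timestamps(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        return [float(row[1]) for row in reader if len(row) >= 2]

def analyze(path):
    ts = load_timestamps(path)
    frame_times = [b - a for a, b in zip(ts, ts[1:])]
    mean = statistics.mean(frame_times)
    # jump between adjacent frames, relative to the mean frame time
    jumps = [abs(b - a) / mean for a, b in zip(frame_times, frame_times[1:])]
    stuttery = sum(1 for j in jumps if j > 0.5)  # arbitrary threshold
    print(f"avg frame time: {mean:.2f} ms ({1000 / mean:.1f} FPS)")
    print(f"frame-to-frame jumps > 50% of mean: {stuttery} of {len(jumps)}")

if __name__ == "__main__":
    analyze(sys.argv[1])  # e.g. python stutter.py frametimes.csv
```

A smooth run shows near-uniform frame times; microstutter shows up as many large frame-to-frame jumps even when the average FPS looks perfectly fine.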