With GPUs, comparing cards from ATI and nVidia across various generations, I've found that the performance difference seems to come down to the memory interface: 256-bit seems to be the minimum. (It could be that a 256-bit interface merely correlates with some other, more important feature, but at the very least it seems to be a good predictor of GPU performance.)
I haven't figured out anything similar for CPUs. Back in the Celeron/P2 days it was easy: on-die L2 cache was best, followed by off-die cache, then no L2 cache at all. Core count isn't much use either, since there are hardly any single-core CPUs left. Is there something I can look for in CPU specs that is a good indicator of CPU gaming performance?