I'm extremely curious how an Intel chip at, say, 3.2GHz would run a given battle compared to an AMD chip with an equivalent rating, i.e. a 3200+ of some sort. I would almost guarantee that the chip actually running at 3.2GHz (the Intel chip) would run the game better, but I want to see that backed up with testing.
R:TW's real-time battles are a great gaming stress test for CPU comparisons since they really show which CPU can crunch through the thousands of calculations faster for a given battle.
I remember a year or two ago when the game was used for Time Commanders (TV show in the UK), they were running it on some crazy-fast machines just to make it run smoothly for the show. (Point being it's a known CPU hog of a game.)
Currently my Barton 2500+ (oc'd to 2700+ speeds) crawls in big battles. My 9800Pro sits there twiddling its virtual thumbs waiting for the CPU. I'm considering upgrading my system, but I have yet to see R:TW used in any CPU comparison, and I'm very curious whether raw clock speed matters more than fancy instruction routing and caching for this game (and other CPU-intensive ones).
Thoughts?