They use only one graphics card at a high resolution, so they could easily end up GPU-bound rather than CPU-bound in the test. That pretty much makes the article and the results somewhat pointless.
		
It's not pointless at all. It proves yet again that somebody gaming at realistic settings for their hardware is almost always going to find they're GPU-bottlenecked, while their CPU makes little to no difference.
 
I've seen this time and time again, both in real-world gaming scenarios and in benchmarking. The fact is, the importance of the CPU is vastly overstated on the internet.
 
It's more than reasonable to expect a 5870/GTX470 owner to be gaming at 1080p or higher. That's the whole point of purchasing a high-end graphics card. I have a GTX470 and I generally don't game at anything less than 1920x1200 with 2xAA.
 
If anything, the article was biased towards the CPU because they didn't use AA, which, again, is something owners of such cards can reasonably be expected to enable.
 
The test should have been done with Tri-Fire or tri-SLI, at a lower resolution and lower graphics settings.
	 
Uh, what? Do you seriously think tri-GPU owners run at lower settings than a single-GPU owner? What, then, is the point of such systems?
 
Tell me, do you really expect somebody who drops $1K on an Intel hex-core and another $1K on a tri-GPU setup to be the kind of person who plays games at 1280x1024 with no AA?
 
Seriously, if you want to run at such prehistoric settings, then buy yourself a cheap, passively cooled HTPC GPU.