Originally posted by: cusideabelincoln
He has an E6300, based on the old 65nm Core 2 architecture; it has only 2 MB of L2 cache and is clocked at just 2.8 GHz. I'm afraid that's bottlenecking his GTX 280 even at 1080p+ resolutions. Here's the proof:
http://www.pcgameshardware.com...ticle_id=663794&page=4
Compare the E4400 to the E8600 and there's a huge difference (20% in COD4) from cache size alone at 1920x1200. Now add the higher clock speed of an overclocked E8x00 on top of that, and he'll have all the CPU power his GTX 280 needs.
Originally posted by: cusideabelincoln
....
First, in CoD4 the difference is 10 frames per second. You can't go by your own experience because you're using an 8800 GTS and he's using a much more powerful GTX 280. And for the statisticians: in CoD4 the performance increase is about 20%, and in GRID it's about 15%, which is definitely noticeable in game. That's like the difference between an HD4850 and an HD4870.
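The percentages being argued over here are just relative fps gains, which anyone can check. A minimal sketch; the fps numbers below are illustrative stand-ins, not taken verbatim from the linked charts:

```python
def pct_gain(slow_fps, fast_fps):
    """Percentage frame-rate gain of the faster setup over the slower one."""
    return (fast_fps - slow_fps) / slow_fps * 100

# Illustrative numbers in the spirit of the linked benchmarks:
# a 10 fps gap on a ~50 fps baseline works out to a ~20% uplift.
print(round(pct_gain(50, 60)))  # -> 20
```

So whether a "whole 6 fps" or "10 fps" gap matters depends entirely on the baseline it sits on.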
Second, the much higher clock speed of an overclocked E8x00 will definitely increase his performance over a measly 2.8 GHz E6300.
And I hate the "stop buying the same old architecture" argument that is so rampant around here. Why use it? For him, i7 is going to be much more expensive than even a complete Core 2 upgrade, and through overclocking he could easily and more cheaply build a Core 2 rig that's just as fast as a Core i7 rig, if he really wanted an overhaul.
Originally posted by: aclim
yea lol, trust me, that CPU is not a big bottleneck. And that cache, in my opinion, is not noticeable in games. At least I don't think so. When I went from an E6300 to my current CPU, there was no difference at all. His money is much, much better spent on more RAM and Vista 64-bit to support it and play DX10.
LOL, and after looking at that article, it's a whole 6 fps max difference between 2 MB and 6 MB of cache at the same clock speed. That is a joke. I dunno where you get that it's a bottleneck. LOL.
Originally posted by: jaredpace
azn, the OP already has the 2nd fastest graphics card: GTX 280. Why would a CPU upgrade not benefit him? LOL
If there's a 15-20% increase just going from 65nm 4MB --> 45nm 6MB, then there will be about another 20% going from 2.8GHz to 4.2GHz. Also, those tests are done on 22" and 26" monitors with 4xAA and maximum settings.
It's either $400 to SLI, or $150 for an E8400. or $550 for both.
edit: or $300 for a nice 45nm quad or $1000 for an i7
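The "about another 20%" projection above is a ballpark, and can be sketched as follows. Note the 0.4 scaling factor is purely an assumption for illustration (games rarely realize the full raw clock increase):

```python
def projected_gain(base_ghz, oc_ghz, scaling=0.4):
    """Rough projected fps gain from a clock bump, assuming games
    realize only `scaling` of the raw clock-speed increase."""
    raw = (oc_ghz - base_ghz) / base_ghz  # 2.8 -> 4.2 GHz is a +50% clock bump
    return raw * scaling * 100

print(round(projected_gain(2.8, 4.2)))  # -> 20
```

With a scaling factor anywhere near 0.4, a 2.8 -> 4.2 GHz overclock lands in the ~20% fps-gain range jaredpace estimates; the real factor depends on how CPU-bound each game is.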
Originally posted by: jaredpace
yah, i'm not saying to keep the same architecture...
going from 65nm 4mb --> 45nm 6mb
going from 2.8ghz to 4.2ghz.
Those benches ARE 1920 x 1200 WITH GTX 280 and 4XAA?
Originally posted by: jaredpace
Well I guess it goes from Conroe to Penryn. Do those terms sound familiar? Please explain architecture to me
4xAA:
http://www.pcgameshardware.com...ticle_id=663794&page=4
Originally posted by: jaredpace
It's not just "another name," Azn.
Read these articles to find out more:
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=2972
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=3069
http://www.anandtech.com/cpuch...el/showdoc.aspx?i=3137
Here's a graph measuring CPU speed in MHz with 4xAA, using the OP's graphics card, and that's without even counting all the improvements covered in the three previous links:
http://www.pcgameshardware.com...f_the_Geforce_GTX_280/
You linked to graphs out of this article earlier, but I'm not sure you are seeing the differences.