http://www.insidehw.com/Review...de-By-Side/Page-2.html
They basically took a QX9650 and clocked it down to Q9300 speeds, so both processors were running at 2.5 GHz. After looking at that chart, all I can say is WTF? Is there an error or something?
The average FPS they got in FEAR with the Q9300 is 299; with the QX9650, 400 FPS. Similarly, in World in Conflict they got 166 FPS with the 9300 and 199 FPS with the 9650. Furthermore, the max FPS they got in FEAR is almost 200 higher, and about 100 higher for WiC.
People have said that cache is not that important in real-life applications and that it won't be noticeable to the average user. However, if in gaming I can get 200 MORE FPS just from double the cache, then that is pretty damn important.
Is that test flawed? I've been considering either the Q9400 or the Q9450. They're identical except for cache size, and I'm thinking the extra ~$70 for the 9450 might be worth it for 100-200 more FPS.