[Techspot] Q6600 a decade later - does it still game?


Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
I have a close friend who games a lot more than I do on considerably less hardware. I always sell my older gear to him cheap or give it away, and he's never complained. He's currently running a Q6600 G0 @ 3GHz with 8GB DDR3, a small SSD, and some spinning rust for bulk storage. The last thing I gave him was a 2GB Asus 750 Ti, and he loves it, having upgraded from a 512MB 4870. He plays Fallout 4 and multiplayer BF4 very regularly and never complains. Plus he normally gets a higher score than me when we game together online. He has a simple way of dealing with the lack of performance: if a game "runs a little choppy" at native resolution on his 10-year-old 42" 1080p Panny plasma, he drops to 720p and games away. As long as his frames stay around 30 he's happy with his old potato.

I think the key is having low expectations combined with a general ignorance of modern PC gaming performance. I have a few other friends that I built APU systems for and they also happily play similar games with settings turned down if needed.

I have a decent upgrade planned for the Q6600 guy, as it pains me to watch him play like that, and he just got BF1, which may be the game that breaks him :) (an FX-8350 at a stable 4.5GHz, 16GB RAM, Gigabyte UD5 990FX, 256GB SSD, 8GB RX 480). With his level of performance expectations I bet it'll easily last him a good 5-7 years, possibly longer if DX12/Vulkan ever fully take off.
 
Last edited:
  • Like
Reactions: VirtualLarry

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Madpacket said:
I have a close friend who games a lot more than I do on considerably less hardware. I always sell my older gear to him cheap or give it away, and he's never complained. He's currently running a Q6600 G0 @ 3GHz with 8GB DDR3, a small SSD, and some spinning rust for bulk storage. The last thing I gave him was a 2GB Asus 750 Ti, and he loves it, having upgraded from a 512MB 4870. He plays Fallout 4 and multiplayer BF4 very regularly and never complains. Plus he normally gets a higher score than me when we game together online. He has a simple way of dealing with the lack of performance: if a game "runs a little choppy" at native resolution on his 10-year-old 42" 1080p Panny plasma, he drops to 720p and games away. As long as his frames stay around 30 he's happy with his old potato.

I think the key is having low expectations combined with a general ignorance of modern PC gaming performance. I have a few other friends that I built APU systems for and they also happily play similar games with settings turned down if needed.

I have a decent upgrade planned for the Q6600 guy, as it pains me to watch him play like that, and he just got BF1, which may be the game that breaks him :) (an FX-8350 at a stable 4.5GHz, 16GB RAM, Gigabyte UD5 990FX, 256GB SSD, 8GB RX 480). With his level of performance expectations I bet it'll easily last him a good 5-7 years, possibly longer if DX12/Vulkan ever fully take off.

The kids have it good with hand-me-downs these days. I would rather get stuck with 2008/09 C2Q tech today than the Socket 7 garbage back in 2002: no AGP, no DIMMs, and 2D-only video cards that struggled to run games even for their era.
 

Marburg U

Junior Member
Jan 20, 2017
5
1
81
I remember playing the demo of BF2142 in 2006 with a GeForce 6600 and an Athlon XP 2400+ from 2002. It took 25 minutes to build the shader cache. Nowadays a 5-year-old CPU is still worth a bunch of money on the secondary market.
 

jihe

Senior member
Nov 6, 2009
747
97
91
Just to put things in perspective: the CPU from 10 years before the Q6600 was the Pentium II at 300 MHz. I wonder how it would've handled a game back in 2007.
 

bigboxes

Lifer
Apr 6, 2002
38,574
11,968
146
jihe said:
Just to put things in perspective: the CPU from 10 years before the Q6600 was the Pentium II at 300 MHz. I wonder how it would've handled a game back in 2007.

They're lasting a lot longer than they used to, but ten years is a long time in computer years.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I bet a 7700K or 6950X would still game plenty fine in 2027.

That wouldn't be remotely surprising; the Q6600 is already 10 years old, yet quad-core and better CPUs still haven't broken 50% ownership on Steam.
 

Suijin

Junior Member
Aug 19, 2015
20
0
16
My Q6600 is still the family desktop computer and it was my only gaming computer till last summer. I really only played Final Fantasy XIV on it at that point though.

Still a great general-purpose internet and office computer, though.

It's funny: my in-laws bought a Q6600 Dell at about the same time, and when we visited my wife asked why theirs was so slow (just for internet and that sort of thing). I think the biggest issue was that theirs was starved for RAM, along with the crapware.
 

WhoBeDaPlaya

Diamond Member
Sep 15, 2000
7,414
401
126
I bet the personal computer industry will be dead by that date, if they don't find some technology to motivate the push for more power.
I could use more power right now - would speed up those dang BD (or God forbid 4K) encodes ...

dual_x5675.png
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
I bet the personal computer industry will be dead by that date, if they don't find some technology to motivate the push for more power.

I'm a bit more optimistic. I think as long as the PC gaming industry keeps gaining market share it'll be just fine, driven by new technology like 4K HDR @ 120+Hz and, eventually, 8K gaming. VR may help as well, but headsets need much better optics and refresh rates before I'll let my wallet loose on one. I'm sure the next version of the Vive or Rift will have many more pixels to push. These two use cases alone require much, much faster processing. AMD finally looks to have competitive CPUs and GPUs coming soon, so that should light the fire necessary to wake up the competition. We also have Intel's 3D XPoint memory/SSD DIMMs coming later this year. I expect to see big advances in computing over the next 2-3 years. Computing can always be faster, and rasterized graphics will only take us so far before we need something new to render things more realistically.
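
To put a rough number on "much, much faster": raw pixel throughput scales with resolution × refresh rate, so as a back-of-envelope sketch (my own arithmetic, ignoring per-pixel shading cost, HDR, etc.) 4K @ 120Hz pushes about 8x the pixels per second of 1080p60, and 8K60 about 16x:

Code:
# Rough pixel-throughput comparison (back-of-envelope only, not a benchmark)
targets = {
    "1080p @ 60 Hz": (1920, 1080, 60),
    "4K @ 120 Hz": (3840, 2160, 120),
    "8K @ 60 Hz": (7680, 4320, 60),
}
baseline = 1920 * 1080 * 60  # pixels per second at 1080p60

for name, (w, h, hz) in targets.items():
    pps = w * h * hz
    print(f"{name}: {pps / 1e6:.0f} Mpixels/s ({pps / baseline:.0f}x 1080p60)")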
 
  • Like
Reactions: bigboxes

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
Madpacket said:
VR may help as well, but headsets need much better optics and refresh rates before I'll let my wallet loose on one. I'm sure the next version of the Vive or Rift will have many more pixels to push.
Foveated rendering actually helps VR push fewer pixels. AMD said we need 16K resolution for optimal VR, but didn't mention that you only need that density in a comparatively small area, because eye tracking + foveated rendering limit how much of the frame actually has to be rendered at 16K detail.

tftcentral said:
Visual acuity (VA) is a measure of the acuteness of foveal vision. The fovea is the part of the retina with the highest visual acuity. It sees only the central 2° of our vision and comprises less than 1% of the total retina, but more than half of the information processed by the visual cortex in the brain comes from that small area of the retina. To put that 2° angle in perspective: if you look straight ahead, your total horizontal field of view (peripheral vision) is about 190° and your binocular vision (seen by both eyes) is about 130°. Not just the acuity, but also the accuracy of colour vision is highest in the fovea.
tftcentral said:
In fact that 2° angle is still optimistic, because only the central region of the fovea, the foveola, can achieve maximum acuity. The foveola measures only about 350 µm in diameter and comprises only the central 1.2° of vision. Cone cell density outside the fovea is very low. The relation between visual acuity and cone density is near linear for normal lighting conditions. Human visual acuity is diffraction limited by the pupil at around 2.5 (20/8 vision), but anything higher than 1.8 (roughly 20/11) is very rare.
The article talks about how, at reasonable viewing distances, a super-high pixel count is overkill for TV. It also points to how foveated rendering can deliver high quality while greatly reducing the pixel-pushing required. Maybe gaming monitors will have eye tracking at some point to take advantage of foveated rendering.
tftcentral said:
If you consider that you only have maximum visual acuity in the foveola, 8K seems quite a waste; you can only see a circle with a 69 pixel diameter with maximum accuracy at a time out of 7680x4320.
Of course, this changes with VR because the distance between the screen and the eye is so much smaller.
tftcentral said:
Even if you move your focus by moving your eyes or turning your head, you still can't see the maximum detail of resolutions wider than 3438 pixels if your head is straight in front of the centre of the screen (VA = 1.0). Even for VA = 1.6 that's still only 5500 pixels. For resolutions wider than that you'd really have to move your head sideways to see all the detail. To see all the detail of 8K by just rotating your head and not moving laterally or axially, you would need near-perfect vision of VA = 2.23 (roughly 20/9 vision).

http://www.tftcentral.co.uk/articles/visual_acuity.htm

Once we have foveated rendering perfected we will eventually hit a wall where, even for people with exceptionally high visual acuity, no more pixels will ever need to be added. The pixel density race will come to a close.

Going back to your comment: what VR needs most in terms of visuals is a wider field of view (available on a non-consumer headset recently licensed to IMAX) and good eye tracking. These matter more right now than simply increasing the pixel count, although eye tracking will, of course, enable foveated rendering, which increases the effective pixel count and, more importantly, how efficiently those pixels are used.
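
As a quick back-of-envelope on those numbers (my own rule-of-thumb constants, not the article's exact figures: VA = 1.0 taken as resolving about 1 arcminute, i.e. roughly 60 pixels per degree, and the foveola as the central ~1.2°), the area that actually needs full detail is a tiny fraction of an 8K frame:

Code:
import math

# Back-of-envelope: how little of an 8K frame the foveola actually covers.
# Assumed rule-of-thumb constants (not the article's exact figures):
#   VA = 1.0 ~ resolving 1 arcminute ~ about 60 pixels per degree
#   foveola ~ central 1.2 degrees of vision
px_per_degree = 60
foveola_deg = 1.2

foveola_px = foveola_deg * px_per_degree      # ~72 px diameter (article: ~69)
frame_w, frame_h = 7680, 4320                 # 8K frame

foveola_area = math.pi * (foveola_px / 2) ** 2
frame_area = frame_w * frame_h

print(f"Foveola diameter: ~{foveola_px:.0f} px")
print(f"Full-acuity area: {100 * foveola_area / frame_area:.3f}% of an 8K frame")

Even with a generous margin around the gaze point, that's why eye tracking plus foveated rendering can cut the rendering load so dramatically.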
 
Last edited:
  • Like
Reactions: Madpacket

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
superstition said:
Foveated rendering actually helps VR push fewer pixels. AMD said we need 16K resolution for optimal VR, but didn't mention that you only need that density in a comparatively small area, because eye tracking + foveated rendering limit how much of the frame actually has to be rendered at 16K detail.

The article talks about how, at reasonable viewing distances, a super-high pixel count is overkill for TV. It also points to how foveated rendering can deliver high quality while greatly reducing the pixel-pushing required. Maybe gaming monitors will have eye tracking at some point to take advantage of foveated rendering.

Of course, this changes with VR because the distance between the screen and the eye is so much smaller.

http://www.tftcentral.co.uk/articles/visual_acuity.htm

Once we have foveated rendering perfected we will eventually hit a wall where, even for people with exceptionally high visual acuity, no more pixels will ever need to be added. The pixel density race will come to a close.

Going back to your comment: what VR needs most in terms of visuals is a wider field of view (available on a non-consumer headset recently licensed to IMAX) and good eye tracking. These matter more right now than simply increasing the pixel count, although eye tracking will, of course, enable foveated rendering, which increases the effective pixel count and, more importantly, how efficiently those pixels are used.

Interesting, and thanks for sharing this info. It looks like this technology could be a useful bridge until the rendering power to do things natively becomes widely available. Ultimately, eye tracking combined with foveated rendering is a neat hack, but I don't think it's the type of technology that can ever be perfected (there will always be a latency penalty, and some people have eye issues that limit calibration). It would also only work for very personal viewing experiences, so VR is a good testbed.

Display technology has come a long way but still needs a lot more attention; contrast ratios and pixel response times can always be improved. I'm sure this is being worked on, and it's part of why I'm optimistic we'll continue to get some big CPU/GPU upgrades, perhaps accelerated by real competition.
 

x26

Senior member
Sep 17, 2007
734
15
81
"We set out to discover if the decade old Core 2 Quad Q6600 could cut the mustard in 2017, and the answer is a resounding no. Of course, this won't surprise many of you -- testing 10-year-old computer tech is a bit like comparing steam trains against maglevs.

Out of the box, the Q6600 is only good for a Cinebench R15 multi-threaded score of around 250pts. By comparison, a $120 Core i3-6100 scored around 400pts and the i7-6700K hit over 900pts. Overclocking the Q6600 to 3.1GHz only boosted the Cinebench scores to around 320pts."

http://www.techspot.com/article/1313-intel-q6600-ten-years-later/page3.html
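
Putting those quoted numbers side by side (just the ratios from the figures above, nothing more):

Code:
# Relative Cinebench R15 multi-threaded scores, using the approximate
# figures quoted from the Techspot article above
scores = {
    "Q6600 (stock)": 250,
    "Q6600 @ 3.1GHz": 320,
    "Core i3-6100": 400,
    "Core i7-6700K": 900,
}
baseline = scores["Q6600 (stock)"]
for cpu, pts in scores.items():
    print(f"{cpu}: ~{pts} pts ({pts / baseline:.1f}x stock Q6600)")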

The simple answer is: HA!

Remember when CPU development was actually exciting?

That chip is my Most Hated Intel CPU Purchase Ever!!! And my first was the Intel Celeron 266 @ 400, back in 1998!! :D
 

pantsaregood

Senior member
Feb 13, 2011
993
37
91
I think the takeaway here should be that 30 FPS gaming is possible on a modestly overclocked 10-year-old CPU. Had you asked if a 10-year-old CPU could game in 2007, you would've been laughed at.
 
  • Like
Reactions: Madpacket