[Techspot] Q6600 a decade later - does it still game?


Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I may be thinking of Penryn regularly hitting 4.0 GHz. Still, 3.6 GHz is 16% faster than the 3.1 GHz tested, and 3.8 GHz is 22% faster. The point still stands that this comparison did not take into account significant headroom that most Q6600s had.
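Quick sanity check on those percentages (a rough Python sketch, assuming performance scales linearly with clock, which it obviously doesn't do perfectly):

```python
# Relative speedup from the overclock, assuming roughly linear scaling with frequency.
baseline_ghz = 3.1                      # the clock used in the article's test
for oc_ghz in (3.6, 3.8):
    gain = (oc_ghz / baseline_ghz - 1) * 100
    print(f"{oc_ghz} GHz is ~{gain:.1f}% faster than {baseline_ghz} GHz")
# prints ~16.1% and ~22.6%
```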

It also fails to account for the memory bandwidth issue someone above mentioned.

EDIT: I'd really like to see someone take a Core 2 Quad QX6850, Core 2 Quad QX9770, Core i7 875K, Xeon X5672 (in dual channel), Core i7 2700K, Core i7 3770K, Core i7 4770K, Core i7 5775C (L4 disabled), Core i7 6700K, and Core i7 7700K, clock them all to 4 GHz with 1866 MHz DDR3 (or 2133/2400 if you can push the Core 2s to do so), and benchmark them against each other. This would be a fantastic means to compare the IPC gains from the last ten years.
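If anyone ever ran that lineup, working out relative IPC would be trivial since every chip would be at the same 4 GHz; something like this (the scores below are made-up placeholders just to show the math, not real results):

```python
# Hypothetical single-threaded scores with every CPU locked at 4 GHz and identical memory.
# NOTE: placeholder numbers only, NOT measured benchmark results.
scores_at_4ghz = {
    "QX6850 (Kentsfield)": 100,
    "QX9770 (Yorkfield)": 108,
    "i7-2700K (Sandy Bridge)": 140,
    "i7-7700K (Kaby Lake)": 175,
}
base = scores_at_4ghz["QX6850 (Kentsfield)"]
for cpu, score in scores_at_4ghz.items():
    print(f"{cpu}: {score / base:.2f}x Kentsfield IPC")  # score ratio = IPC ratio at equal clock
```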

I picked up a Q6600 B3 shortly after it was released, and it capped out at 3.0 GHz. Later I traded it for a G0-revision Q6600 at no cost, and that one capped out at 3.2 GHz. 3.6 GHz was a good OC that not all chips were capable of.
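For context, the Q6600's multiplier tops out at 9x, so those ceilings are really FSB ceilings. Roughly (Python, just the arithmetic):

```python
# Q6600 core clock = FSB x 9 (multiplier is capped at 9x; stock is 266 MHz FSB = 2.4 GHz).
multiplier = 9
for fsb_mhz, note in [(266, "stock"), (333, "my B3's wall"), (356, "my G0's wall"), (400, "a strong G0")]:
    print(f"{fsb_mhz} MHz FSB x {multiplier} = {fsb_mhz * multiplier / 1000:.1f} GHz ({note})")
# 2.4, 3.0, 3.2 and 3.6 GHz respectively
```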
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
3.2 GHz is what I maxed out with mine as well... on a Rampage Formula from '07 or '08. Hardware I can't even give away at this point.
 

Hi-Fi Man

Senior member
Oct 19, 2013
601
120
106
It also fails to account for the memory bandwidth issue someone above mentioned.

Core 2s didn't benefit from DDR3 because of the FSB bottleneck and their relatively large, fast L2 caches (a design choice made precisely because of the FSB). I remember using an Intel X38 board with DDR3 support (one of the first boards to have it) with my Q6600 and measuring no difference in performance compared to my Gigabyte P45 DDR2 board.
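The peak numbers show why (rough Python sketch, theoretical maximums only):

```python
# The Q6600's 1066 MT/s FSB is 64 bits (8 bytes) wide, and every byte the cores touch crosses it,
# so the bus itself is the ceiling regardless of how fast the DIMMs behind the northbridge are.
fsb_ceiling  = 1066e6 * 8 / 1e9       # ~8.5 GB/s, shared by all four cores
ddr2_800_x2  = 2 * 800e6 * 8 / 1e9    # ~12.8 GB/s dual-channel DDR2-800
ddr3_1333_x2 = 2 * 1333e6 * 8 / 1e9   # ~21.3 GB/s dual-channel DDR3-1333
print(f"FSB {fsb_ceiling:.1f} GB/s vs DDR2-800 {ddr2_800_x2:.1f} GB/s vs DDR3-1333 {ddr3_1333_x2:.1f} GB/s")
```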
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
There was an X38 board with DDR3 support? I'd believe X48 maybe but that's going back a while...
 

jimbob200521

Diamond Member
Apr 15, 2005
4,108
29
91
This can be looked at in one of two ways: first, can a Q6600 game at today's resolutions/settings? No, it's not going to be pushing 4K with max details. The second way you can look at it is: can it game at more realistic settings, 720p/1080p at medium with ample memory, a decent video card, an SSD, etc.? Sure, most of the time. Don't get me wrong, I wouldn't recommend anyone build a system around a Q6600 today, but if you've got one in the closet with a decent video card to throw in it for a 2nd/backup gaming rig, why the hell not, what have you got to lose? Also, there are plenty of other uses for a Q6600 today; they are not dead in the water yet. Home-built router, file server, decent Plex server if you don't get too crazy with transcoding, web surfing rig, office tasks like Word, Excel, etc. Still very usable today (I say this as I type on a C2D work machine I wish were a Q6600), but certainly not a high-end gaming CPU.
 

Denithor

Diamond Member
Apr 11, 2004
6,300
23
81
C2D/Q needed their large L2 cache because they would have otherwise been very, very bandwidth starved. While AMD had integrated their memory controller (two?) generations earlier, Intel didn't make this change until Nehalem. http://www.anandtech.com/show/2594/11

AMD chips saw almost no advantage at all from the DDR -> DDR2 transition thanks to their IMC. DDR2 was heavily pushed by Intel because they needed the bandwidth for the P4 series and into the C2D days (although, with smaller nodes, they were able to add more cache, which helped alleviate the problem).
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
I had to upgrade from my C2Q (all the way to Haswell) to play BF3 reasonably, so modern games would completely crush it.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
This can be looked at in one of two ways: first, can a Q6600 game at today's resolutions/settings? No, it's not going to be pushing 4K with max details. The second way you can look at it is: can it game at more realistic settings, 720p/1080p at medium with ample memory, a decent video card, an SSD, etc.? Sure, most of the time. Don't get me wrong, I wouldn't recommend anyone build a system around a Q6600 today, but if you've got one in the closet with a decent video card to throw in it for a 2nd/backup gaming rig, why the hell not, what have you got to lose? Also, there are plenty of other uses for a Q6600 today; they are not dead in the water yet. Home-built router, file server, decent Plex server if you don't get too crazy with transcoding, web surfing rig, office tasks like Word, Excel, etc. Still very usable today (I say this as I type on a C2D work machine I wish were a Q6600), but certainly not a high-end gaming CPU.

Resolution has little to nothing to do with CPU load, and neither do most graphical settings. If a Q6600 can only deliver 25 fps at 4K / max settings in a game, it's still only going to deliver around 25 fps at 720p lowest settings, regardless of what GPU you pair with it. I ended up replacing the Q6600 in my wife's PC with an i3 for this very reason: GW2 was a slideshow at times on the older C2Q, and that game is already 5 years old.
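Put another way, the frame rate you actually get is roughly whichever limit is lower, which is why dropping resolution doesn't rescue a CPU-bound Q6600 (a simplification, of course):

```python
# Delivered fps is roughly capped by the slower of the CPU limit and the GPU limit.
def delivered_fps(cpu_limit, gpu_limit):
    return min(cpu_limit, gpu_limit)

print(delivered_fps(cpu_limit=25, gpu_limit=30))    # 4K / max settings: 25 fps
print(delivered_fps(cpu_limit=25, gpu_limit=200))   # 720p / low: still 25 fps, the CPU is the wall
```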
 

rchunter

Senior member
Feb 26, 2015
933
72
91
I still use a Q6600 with a GTX 580 in my metal shop/office. It's mostly used for internet browsing, though. It's not overclocked anymore; I put it back down to stock speeds.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,327
10,035
126
I had a Q6600 in my unRAID server for a number of years, until the PSU died and I decided to rebuild it with an FM2 board. (Both the Gigabyte P35 board and the replacement A85X had eight SATA ports, which was the important feature for me.)
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
I think mine got replaced with an i7 920 D0 and then I sold it off. It eventually made its way back though... might as well hold onto it since it's worth basically nothing.
 

[DHT]Osiris

Lifer
Dec 15, 2015
14,079
12,173
146
I had a Q6800 in the GF's computer until like a year and a half ago? Maybe two? Paired with a GTX 275 if I remember right. It was fine in stuff like Diablo 3, WoW, Guild Wars 2, and whatever other MMOs/games we had at the time. It would have been all at 1080p, probably middling settings for some games, but it got the job done.

Hell, more remarkable for me was that MY system had an i7 950 / GTX 470(?) until Xmas '15, when I upgraded to Skylake. It played anything I threw at it like a champ.
 

daxzy

Senior member
Dec 22, 2013
393
77
101
I may be thinking of Penryn regularly hitting 4.0 GHz. Still, 3.6 GHz is 16% faster than the 3.1 GHz tested, and 3.8 GHz is 22% faster. The point still stands that this comparison did not take into account significant headroom that most Q6600s had.

I don't think Penryns could hit 4.0 GHz without expensive cooling and raising voltages by >0.2V. I had a Q9450 (20 x 133 @ 2.66 GHz stock) with a Zalman CNPS10X, and while I could easily raise the FSB to 166 (for 3.33 GHz) with just +0.05V, hitting 3.66 GHz and beyond wasn't feasible without raising the voltage by >0.25V.

I know a lot of folks ran Penryn and G0 Q6600s way beyond the specified VID range for the extra speed, but what happens is that the CPU slowly dies. You can search this forum for people who ran their Penryn/G0 Q6600s overclocked and overvolted for prolonged periods; they slowly degraded to the point where they couldn't even hold their stock clocks at default voltage.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Yup, the Q6600 is the one and only CPU I have managed to kill. It was a terrible chip all around.
 

nickmania

Member
Aug 11, 2016
47
13
81
I am using one in my main computer, paired with an RX460 and an SSD. I use Cinema 4D, Photoshop, music sequencers, VST instruments... I also get the latest codecs thanks to the RX460. YouTube gives me 5-15% max CPU use.

I have it clocked at 2.5 GHz max, and aside from games from the last two years, I haven't found a task where this CPU is too slow to be usable.

The transition from Win XP to Win 10 has also been amazing, along with the feeling that the computer got faster every year thanks to the multicore programming of the OS and newer programs, plus the SSD.

Without a doubt the best build of my life.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
EDIT: To be clear, I'm not claiming that a Q6600 at 4.0 GHz paired with DDR3 at 1600 MHz can offer a premium gaming experience, only that it can offer a playable 30 FPS experience. We are discussing a 10 year old CPU. Can anyone imagine discussing running Crysis on a Pentium II in 2007?

This was a top-tier dual 1.4 GHz P3 rig in 2002 with the fastest AGP card ever made, and its performance in HL2 only 3 years later is... just acceptable: https://www.youtube.com/watch?v=R1p7tqyfaB8

Even HL1 on a 1997 P2 was doing worse than 2017 games do on a Q6600: https://youtu.be/2fM9OfB4Jvk?t=798
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
When I first got into custom computing in the early 2000s I was using an Athlon XP... so the Q6600 was a godsend for gaming.

I can't believe how bad HL2 looks now; I used to think that game looked great when maxed out back then. 6800GT alllllll day.
 

bigboxes

Lifer
Apr 6, 2002
38,576
11,968
146
When I first got into custom computing in the early 2000s I was using an Athlon XP... so the Q6600 was a godsend for gaming.

I can't believe how bad HL2 looks now; I used to think that game looked great when maxed out back then. 6800GT alllllll day.

Nothing wrong with an Athlon XP when it came out; it was competitive with the Pentium 4. Maybe your Athlon had aged by the time you got the Q6600. I'm sure it was quite the upgrade. I think I had at least six Athlons before I made the move to Intel (Nehalem).
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
I probably still have the 3200+ lying around somewhere with my Abit NF7. I remember applying paste to the little die and spreading it around with a plastic stick.

OC'd to a whopping 2.4 GHz on a 3rd-party BIOS and I could still barely play BioShock. It did work though, so slow clap I suppose.
 

MongGrel

Lifer
Dec 3, 2013
38,751
3,068
121
4.0 GHz was VERY uncommon (I might even say rare) for a Q6600. I got mine to 3.6 GHz, which was considered a realistic max for those CPUs back in the day. I don't know who started the 4.0 GHz rumor. There was a person on these forums claiming that he got to 4.0 on his, but I don't think most people fully believed him.

Edit: And if you thought the Skylake CPUs were bandwidth-limited, you should try a quad-core on an FSB like the Q6600! It's way more memory-bandwidth-limited than Skylake, especially on scientific compute tasks.

Mine ran just fine at 4.0.