nVidia 8 series and CPU scaling


ArchAngel777

Diamond Member
Dec 24, 2000
I think people are just tired of hearing that you need an OC'd 3.2GHz Core 2 to make use of an 8800GTX. I think that is bull schmidt and I am tired of hearing it.

You might not make full use of the 8800 GTX with an X2 4400, but who said you have to make full use of a GPU? Where is this unwritten rule that the bottleneck must always be the video card? I guess that is what many people are tired of.

The bottom line is this: does it make sense to upgrade the GPU or the CPU? As stated previously, in most circumstances you will get more mileage out of a GPU upgrade.

Case in point - when I purchased my system 2 years ago, I bought the fastest card available, the 7800GTX. I had an A64 3500 @ 2.53GHz. Now, when it came time two years later, what made more sense, a GPU upgrade or a CPU upgrade? I think the answer is obvious: the GPU! Of course I still did upgrade my CPU (luckily that was an option without having to replace MB+memory), but my current one was more than sufficient to power this 8800GTS. In fact, it can be argued strongly that the move from a 2.53GHz A64 to an X2 A64 of the same clock speed is a simple lateral move in most games... Again, my system has no problem working this GTS into the ground. And hey, I run a relatively low resolution, 1280x768... No regrets.

Another point I wanted to make is that it has taken over two years to double processor performance, and it can be argued that it has not doubled; they just added more cores (the cheater's method). So compare, say, an A64 @ 2.6GHz to an Intel Core 2 @ 3.0GHz in a single-threaded app: have we doubled the speed? No, we have not. Only in multi-threaded situations do we double, and that is obvious - we added another core, or three for that matter. So again, even in games, does a QC @ 3.0GHz double the speed of an A64 3200 in most games? No, not really... But in two years what has changed on the GPU front? Quite a bit. One year after the 7800GTX came the 8800GTX, which has a performance lead of as much as 400% in some games and at LEAST a 250% lead universally. Again, this in ONE year. One year, and they tripled the performance of the previous year... When does a CPU do that? Like never... With G92 around the corner, you can increase that performance further and directly compare it to the life cycle of a CPU.
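To put rough numbers on that, here is a quick Python sketch using Amdahl's law - the parallel fractions and core counts are made-up, illustrative assumptions, not benchmarks of any real game:

```python
# Back-of-the-envelope: why extra cores don't double game performance
# the way a wider GPU does. All numbers here are illustrative guesses.

def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup when only part of the work can use all cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assume a game where only ~30% of per-frame work is parallel (a guess
# for 2007-era engines), versus rendering work that is ~95% parallel.
for cores in (1, 2, 4):
    print(f"{cores} core(s): game-like {amdahl_speedup(0.30, cores):.2f}x, "
          f"GPU-like {amdahl_speedup(0.95, cores):.2f}x")

# With 4 cores: game-like ~1.29x, GPU-like ~3.48x. More cores alone
# don't double game speed unless the code actually scales across them.
```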

So currently the CPU has more longevity for the gamer, and it is the GPU that should be upgraded more regularly to see performance gains. GPU is GREATER than CPU in gaming, and yes, that is a blanket statement that you can take to the bank.
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
have i been saying that too much?
:confused:

and i was always careful to say "full use of" a high-end GPU, usually in answer to a question in a post. After all, it is the truth.

i also tend to upgrade my GPUs much more often than my CPUs ... let's see, i got my P4 2.80c @ 3.31GHz 3-1/2 years ago and started with a r8500 and went thru 9800xt/x850xt/x1950p, and then upgraded to my P4 EE because there was a mismatch and FEAR was running too slow for the minimums. i would have kept it except my old MB died and it was cheaper to just redo the whole thing ...

but you are right ... on my current e4300 i have already had a x1950p and a 2900xt ... i will probably go for xfire and then the next GPU to match Penryn if my plan holds out.
 

ArchAngel777

Diamond Member
Dec 24, 2000
I dunno, I am not directing that at anyone. As long as people include "full use", I am fine. I just don't like it when people say it is a waste to have an 8800GTX on anything less than a Core 2 @ 2.8+ GHz... Anyway... That was MY rant for the day :D
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
:cool:

and it DOES seem like it is being said often - i am in the CPU forum a lot now - and all i see is the hysteria for C2D upgrades and especially QuadCore - which i just can't quite figure out ... as they are hot, expensive, overclock poorly compared to DC ... and are just "interim" for the 45nm intel CPUs ... and barely used by any games
... they can't all be video encoders, can they?

and it ISN'T a waste to have an 8800GTX paired with a mid-level CPU ... you do get to max everything ... but IF the CPU is too slow - like an A64 3000+ - it would make just as much sense to get a GTS and save money
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: ArchAngel777
Another point I wanted to make is that it has taken over two years to double processor performance, and it can be argued that it has not doubled; they just added more cores (the cheater's method). So compare, say, an A64 @ 2.6GHz to an Intel Core 2 @ 3.0GHz in a single-threaded app: have we doubled the speed? No, we have not. Only in multi-threaded situations do we double, and that is obvious - we added another core, or three for that matter. So again, even in games, does a QC @ 3.0GHz double the speed of an A64 3200 in most games? No, not really... But in two years what has changed on the GPU front? Quite a bit. One year after the 7800GTX came the 8800GTX, which has a performance lead of as much as 400% in some games and at LEAST a 250% lead universally. Again, this in ONE year. One year, and they tripled the performance of the previous year... When does a CPU do that? Like never... With G92 around the corner, you can increase that performance further and directly compare it to the life cycle of a CPU.

So currently the CPU has more longevity for the gamer, and it is the GPU that should be upgraded more regularly to see performance gains. GPU is GREATER than CPU in gaming, and yes, that is a blanket statement that you can take to the bank.

Then graphics processing has always been "cheating" since the advent of the Voodoo 2; they have been adding more cores since the very beginning if I recall: 1-2-4-8-16-etc...
On the graphics front they are just called "pipelines". They recently turned to this method on the CPU front because it is becoming extremely difficult to extract more single-threaded performance...

Could you imagine how slowly graphics performance would increase if we just stayed on 1 pixel pipeline and tried to make it more efficient, or up the clock rate?

Until we can write multithreaded code that scales as well as graphics processing does, it will still be slower going on the CPU front for overall performance; it cannot be helped.

7800 GTX to 8800 GTX was almost a year and a half, and if you haven't noticed, most of the performance increase comes from the fact that it has many more mini-cores than the previous generation. Even assuming it takes 2 scalar processors to make the equivalent of 1 pipeline, you have about double the execution units, as well as a clockspeed bump and higher efficiency due to their general-purpose nature now.

So CPUs are now going the way that graphics processing has been going all along, and now have more potential to increase performance at a faster rate than recent history has allowed.
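A toy throughput model shows the difference - the unit counts and clocks below are invented for illustration and don't correspond to any real part:

```python
# Toy model: peak throughput as (parallel units) x (clock). GPUs have
# historically grown by widening; a clock bump alone scales far slower.

def peak_ops_per_sec(units, clock_mhz):
    """Crude proxy: each unit retires one op per cycle."""
    return units * clock_mhz * 1_000_000

base = peak_ops_per_sec(1, 200)        # a single-pipeline part
clock_only = peak_ops_per_sec(1, 400)  # double the clock, same width
widened = peak_ops_per_sec(16, 250)    # 16 pipelines + modest clock bump

print(f"clock-only scaling: {clock_only / base:.0f}x")   # 2x
print(f"widened + clock:    {widened / base:.0f}x")      # 20x
```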



 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: betasub
Originally posted by: coldpower27
Blanket statements suck on the whole

Sig worthy, although I wish you'd gone for the full irony of something along the lines of:

"blanket statements always suck"

:laugh:

I didn't even notice that; now I do see the irony. How to word this again without making it sound ironic...

Blanket statements don't paint the complete picture; with some digging you will very easily find examples that contradict them.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: apoppin
:cool:

and it DOES seem like it is being said often - i am in the CPU forum a lot now - and all i see is the hysteria for C2D upgrades and especially QuadCore - which i just can't quite figure out ... as they are hot, expensive, overclock poorly compared to DC ... and are just "interim" for the 45nm intel CPUs ... and barely used by any games
... they can't all be video encoders, can they?

and it ISN'T a waste to have an 8800GTX paired with a mid-level CPU ... you do get to max everything ... but IF the CPU is too slow - like an A64 3000+ - it would make just as much sense to get a GTS and save money

Considering the rate of performance increases before on the CPU side, the Dual Core and Quad Core race is exciting compared to the dull period when we had minuscule performance increases.

Quad Core is just cool these days, because you feel like you have twice the performance (even if it's still limited to multi-threaded scenarios). I believe it was the same with AMD Dual Cores. What is most interesting right now is obviously the $266 Q6600, which is cheap in the relative sense; it was only about 2 years ago, in August 2005, that a cheap Dual Core was introduced by AMD.

 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: apoppin
have i been saying that too much?
:confused:

and i was always careful to say "full use of" a high-end GPU, usually in answer to a question in a post. After all, it is the truth.

i also tend to upgrade my GPUs much more often than my CPUs ... let's see, i got my P4 2.80c @ 3.31GHz 3-1/2 years ago and started with a r8500 and went thru 9800xt/x850xt/x1950p, and then upgraded to my P4 EE because there was a mismatch and FEAR was running too slow for the minimums. i would have kept it except my old MB died and it was cheaper to just redo the whole thing ...

but you are right ... on my current e4300 i have already had a x1950p and a 2900xt ... i will probably go for xfire and then the next GPU to match Penryn if my plan holds out.

What you did makes sense; architectural overhauls happened very often on the GPU front, as opposed to the CPU front - Intel stayed with NetBurst for about 5.5 years, AMD with K8 for a bit over 4 years.

After the Pentium 4 3.0GHz milestone, single-threaded performance increases sort of dialed down; it was obvious that a new way was needed to bring performance up, hence the utilization of additional cores.

 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: Piuc2020
Did you just quadruple post?

I guess I did; it never occurred to me that you could address 4 people in 1 post. I have always addressed a single post per posting.
 

ArchAngel777

Diamond Member
Dec 24, 2000
Originally posted by: coldpower27

Then graphics processing has always been "cheating" since the advent of the Voodoo 2; they have been adding more cores since the very beginning if I recall: 1-2-4-8-16-etc...
On the graphics front they are just called "pipelines". They recently turned to this method on the CPU front because it is becoming extremely difficult to extract more single-threaded performance...

Could you imagine how slowly graphics performance would increase if we just stayed on 1 pixel pipeline and tried to make it more efficient, or up the clock rate?

Until we can write multithreaded code that scales as well as graphics processing does, it will still be slower going on the CPU front for overall performance; it cannot be helped.

7800 GTX to 8800 GTX was almost a year and a half, and if you haven't noticed, most of the performance increase comes from the fact that it has many more mini-cores than the previous generation. Even assuming it takes 2 scalar processors to make the equivalent of 1 pipeline, you have about double the execution units, as well as a clockspeed bump and higher efficiency due to their general-purpose nature now.

So CPUs are now going the way that graphics processing has been going all along, and now have more potential to increase performance at a faster rate than recent history has allowed.

The data you posted I do not contest, but I do contest its relevance to this thread. While it is true that GPUs have been "cheating", they do it and it WORKS. A QC CPU is wasted on all but ONE game! So, you know, sure, go ahead and buy that QuadCore, but it won't do you any good in the games currently available. When the GPUs do it, they extract performance out of it. When CPUs do it, we don't get jack except in a few specialized applications. Will that change? Yes, probably. But RIGHT NOW, that doesn't apply.

I, too, am very excited about the advent of DC, QC, OC, etc... But for the future, not for the now. Because who really cares right now? Unless my sole purpose is to rip DVDs, I just don't care.

I agree with Apop 100% when he said "they can't all be video encoders"... People are buying QC because it is cool... Kinda like a 4-tipped exhaust... And hey, I do not blame them! I think it would be cool to have a QC, but it is not economically viable when I can O/C an E4400 to ~3GHz, extract the same performance in all but a few applications, and save power while doing it.

I suppose that is why they call us enthusiasts, and maybe I am not so much of this crowd anymore, because now I don't have money I can just spend at will... It won't get past my wife :-(

So I suppose life goes in stages... When I was 14-21 I spent every dime on computer performance - I used to upgrade my machine every 6 months and always had the top-end video card. Now that I am 26, with family and kids, I look at performance per dollar... Do I want a QC rig? Yes... Is it worth the money to me? No... That is where practicality meets reality. :D
 

RussianSensation

Elite Member
Sep 5, 2003
ArchAngel777, but man, you gotta get 8800GTX SLI. It's the equivalent of spinners for your PC! You know how many girls guys pick up with SLI? hehe

Yes, definitely, the practicality factor starts to matter once you realize money is better spent on other things than gaming 24/7 - unless of course an amazing deal comes up and you cannot resist the urge :)
 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
Originally posted by: coldpower27
Originally posted by: Piuc2020
Did you just quadruple post?

I guess I did; it never occurred to me that you could address 4 people in 1 post. I have always addressed a single post per posting.

the cool thing is that they were made minutes apart - not one right after another ... which shows the answer to each poster was considered. Unfortunately, no one posted in between

as to QuadCore being "cool" ...

--so is Apple :p

 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: ArchAngel777

The data you posted I do not contest, but I do contest its relevance to this thread. While it is true that GPUs have been "cheating", they do it and it WORKS. A QC CPU is wasted on all but ONE game! So, you know, sure, go ahead and buy that QuadCore, but it won't do you any good in the games currently available. When the GPUs do it, they extract performance out of it. When CPUs do it, we don't get jack except in a few specialized applications. Will that change? Yes, probably. But RIGHT NOW, that doesn't apply.

I, too, am very excited about the advent of DC, QC, OC, etc... But for the future, not for the now. Because who really cares right now? Unless my sole purpose is to rip DVDs, I just don't care.

I agree with Apop 100% when he said "they can't all be video encoders"... People are buying QC because it is cool... Kinda like a 4-tipped exhaust... And hey, I do not blame them! I think it would be cool to have a QC, but it is not economically viable when I can O/C an E4400 to ~3GHz, extract the same performance in all but a few applications, and save power while doing it.

I suppose that is why they call us enthusiasts, and maybe I am not so much of this crowd anymore, because now I don't have money I can just spend at will... It won't get past my wife :-(

So I suppose life goes in stages... When I was 14-21 I spent every dime on computer performance - I used to upgrade my machine every 6 months and always had the top-end video card. Now that I am 26, with family and kids, I look at performance per dollar... Do I want a QC rig? Yes... Is it worth the money to me? No... That is where practicality meets reality. :D

Quad Core is similar to 64-bit processing in a way: you can't just plug it in and take advantage of it (with regards to gaming), and that is one of its greatest weaknesses - it needs software support in order to be taken advantage of. But you also have to think about the chicken-and-egg scenario: without the hardware infrastructure in place, developers won't program for it either. It's comparable to programming DX10 features; it needs the infrastructure in place to take advantage of it.

Be that as it may, we are going this route because the other way has been shown not to yield enough performance benefits by itself, so the only way to extract more performance is to use methods where you previously had untapped potential, such as multiprocessing.

Quad Core is still at the performance-mainstream level; it's barely been around for a year, so of course it's going to cost more relative to Dual Core. The inflection point where Quad carries no premium over Dual on the desktop hasn't been reached yet, though interestingly enough it has already been reached in the mid-range 2P Intel server SKUs as well as certain Xeon UPs. If you're looking for price/performance, Quad is not the product for you if you're a gamer, but I never argued that.

To be hot and chic you spend money on whatever the latest gizmo is; that has always been how it works. Quad Core, as explained, is not about economic value - since when has the high end been about that anyway? High-end products have never been about "practical".

CPUs have always had the problem that you could easily reach high-end performance through overclocking, since there were no "cores" to disable when there was only 1; as we move to multi-threaded programs, though, we will get situations where giving up cores does cost you dramatic performance.

Quad Core at $266 is indeed "economically viable"; what I think you're trying to say is that it's not as cheap as an "overclocked" Dual Core, which it was never designed to be. At that price it is quite affordable by many people. I guess you're saying it's not economically viable for you.




 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: coldpower27
Quad Core at $266 is indeed "economically viable"; what I think you're trying to say is that it's not as cheap as an "overclocked" Dual Core, which it was never designed to be. At that price it is quite affordable by many people. I guess you're saying it's not economically viable for you.
and yet it costs more than double ... for zero gaming performance benefit over an overclocked DC ... and it uses more energy, runs hotter and OCs worse percentage-wise than DC.

-From the practical standpoint of this gamer - NO way! ... Penryn will simply blow it away - at least 10-15% faster at the same clock while wasting less energy, running cooler and OCing better ... and maybe by then there will be more than one game that uses more than 2 cores.

i'd rather put my money toward a faster GPU solution ... like CrossFire :p
--Now why did i want to upgrade my CPU twice?
:confused:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: apoppin
and yet it costs more than double ... for zero gaming performance benefit over an overclocked DC ... and it uses more energy, runs hotter and OCs worse percentage-wise than DC.

-From the practical standpoint of this gamer - NO way! ... Penryn will simply blow it away - at least 10-15% faster at the same clock while wasting less energy, running cooler and OCing better ... and maybe by then there will be more than one game that uses more than 2 cores.

i'd rather put my money toward a faster GPU solution ... like CrossFire :p
--Now why did i want to upgrade my CPU twice?
:confused:
I'm of the belief that the highest clockspeed wins in today's CPU market. You're pretty much limited by the speed of your fastest 'core' in most applications.

That said, the Intel Xeon 3210 is very tempting. It's server-grade, cut from the centre of the wafer, and only costs $250 for 4 cores that will clock to 3.5GHz.

If you play a lot of Supreme Commander, it could actually be worth it. I know I'm addicted to C&C3, so there must be some SC addicts out there. :light:
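Here's the "fastest core wins" idea as a toy model in Python - the work figures are made up for illustration, not real engine numbers:

```python
# Toy model: frame rate when a fixed chunk of per-frame work is stuck
# on one "main" thread. Work units are invented for illustration.

def fps(serial_work_per_frame, core_speed):
    """core_speed = work units one core finishes per second."""
    return core_speed / serial_work_per_frame

MAIN_THREAD_WORK = 10.0  # serial work units per frame (assumed)

print(fps(MAIN_THREAD_WORK, 300.0))  # fast dual core:   30.0 fps
print(fps(MAIN_THREAD_WORK, 240.0))  # slower quad core: 24.0 fps
# Core count never appears in the equation: while games stay mostly
# serial, the speed of your fastest single core sets the frame rate.
```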
 

coldpower27

Golden Member
Jul 18, 2004
Originally posted by: apoppin
and yet it costs more than double ... for zero gaming performance benefit over an overclocked DC ... and it uses more energy, runs hotter and OCs worse percentage-wise than DC.

-From the practical standpoint of this gamer - NO way! ... Penryn will simply blow it away - at least 10-15% faster at the same clock while wasting less energy, running cooler and OCing better ... and maybe by then there will be more than one game that uses more than 2 cores.

i'd rather put my money toward a faster GPU solution ... like CrossFire :p
--Now why did i want to upgrade my CPU twice?
:confused:

Quad Core is not about value either, as already explained, or anything economical. Quads are targeted at enthusiasts, and gamers are only one segment of that group.

The first revision of something usually isn't all that amazing, but considering the competition doesn't even have Quad Core to begin with, the Q6600 is very nice. Kentsfield is just helping build the infrastructure now rather than later; Yorkfield, the Penryn derivative, will help it along even more...

I wasn't arguing performance per dollar, as I already explained before.

For what it's worth, I am not bothering with a Quad at this point either, because I have an E4300 that runs just fine for what I do. The next item on my upgrade list would still be the video card, as the one I am using is not all that amazing, but I am waiting for 2nd-generation DX10 hardware; at 100W+, the current-gen cards are just consuming way too much energy.






 

apoppin

Lifer
Mar 9, 2000
alienbabeltech.com
OK, i see your point, but i am not so sure you see mine


if you are an enthusiast - a GAMER - QC only offers disadvantages imo - except for one game

-QC gives poorer overall gaming performance since it doesn't OC as far without extreme measures ... and has worse thermal characteristics [which affect case temps]
-forget my price arguments ... just talk gaming performance ... QC is worse than useless - AtM [not "shortly"]
-on the OTHER hand, IF i played SC - exclusively - or did video encoding, i'd agree 100% and i'd already have QC

and when another game that i like comes out that effectively uses QC over DC - i will make the jump - BUT with Penryn [because i prefer low-heat/high-OC, unlike the current "fad" CPU's characteristics]
... and CPUs don't exhaust hot air from my case like GPUs do.
 

deadseasquirrel

Golden Member
Nov 20, 2001
One problem is that the future of 2007 quad-core gaming is uncertain at the moment. We have Crytek claiming that Crysis will take advantage of quads (but we've heard this song and dance from developers about multi-core usage before, only to discover the benefits just weren't there)... we have the Unreal Engine 3 claiming support for quads, but I read something recently where it was stated that UT3 will not take advantage of quads (wish I could find that link). Alan Wake won't be out until well into 2008, and I haven't heard a thing about BioShock and quads. (Just naming a few of the games that interest me.)

There may not be too many games out now (or until sometime in '08) that utilize quads like SC does. But a quad could still help the gamer who occasionally finds himself or herself doing another CPU-intensive task while gaming. I'm still on a single core, for crying out loud. I save my encoding for the middle of the night, or just make sure I set the priority to "idle" so that I don't eat up all my CPU cycles while I'm actually using the system for other things. Having the flexibility to use 2 cores for gaming (and more games are now benefiting from dual cores) and the other 2 (or even 1) for an encoding job would be beneficial to me. Though, granted, most of my interest lies in the question of whether Crysis will show improvements from quads - and, if so, at what settings... cuz I don't play at 1024x768.

The way I see it, I have 2 choices when I upgrade this Oct/Nov. I can get the quad for $2xx or get an e4300 for $1xx. Saving that $100+ on the cheaper CPU might at first seem like the best choice... but, seeing as I am still running an A64 3000+, I don't upgrade my CPU very often. I will very likely spend that extra $100+ on the quad, knowing that the flexibility to use those cores across different apps will help me, even though there might not be a single game that really uses all 4 cores to the extent of making it worth the price increase for a "gamer". And hopefully developers will deliver and we'll see some good quad usage in at least 1 or 2 more titles this year. But there is no question that if the Q6600 had not dropped to the price that it is now, I wouldn't even be considering it.
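For the curious, the "set the priority to idle" trick I mentioned can be scripted - a rough sketch using the third-party psutil package; the encoder command line is a placeholder, and the 2+2 core split is just an example:

```python
# Sketch: start a background encode, drop it to the lowest priority,
# and pin it to 2 of 4 cores so the game keeps the other 2.
# Needs the third-party psutil package; command line is a placeholder.
import subprocess

import psutil

encode = subprocess.Popen(["encoder.exe", "input.avi", "output.mp4"])
job = psutil.Process(encode.pid)

if psutil.WINDOWS:
    job.nice(psutil.IDLE_PRIORITY_CLASS)  # Windows "idle" priority class
else:
    job.nice(19)                          # Unix: lowest niceness

job.cpu_affinity([2, 3])  # leave cores 0-1 free for the game
```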
 

RussianSensation

Elite Member
Sep 5, 2003
I certainly understand your guys' stance on the Q6600 for gaming today if you already have a good CPU, but consider it from the perspective of someone buying a system today:

1) With the E6750 2.66GHz at $220 and the E6850 at $310 on Newegg, it costs less than $100 to double your number of cores relative to the E6750, and $0 relative to the E6850 - when in January a Quad cost $850, a price fall of $540!

2) Once you consider overclocking, a Quad at 3.4GHz is plenty fast relative to a 3.7-3.8GHz E6850. You won't gain much in games at such speeds today, but you can never make up for the additional 2 cores. In a sense you are paying ~40% extra for the potential to get 75%+ performance over the E6750 down the line (rough numbers sketched after this list). We know how this turned out with the A64 2.4GHz 4000+ vs. the X2 3800+ @ 2.0GHz...

3) A Quad would imply the smoothest experience in Vista, with no compromises like having to hold off on Ad-Aware, anti-virus, Folding@Home or SETI@home while gaming, etc.

4) If you intend to keep your system for longer than a year without upgrading, spending $80-100 today for 2 extra cores is probably a better value;

5) It'll be well into next year before Penryn is sold at reasonable prices, since it appears this year they are only releasing the 3.3GHz XE version for $1G. This means buying a Quad today provides at least a 6-month head start for those who can benefit from it;

6) The final factor that tipped me over is resale value. I know I can probably sell a Q6600 @ 3.2GHz to an average Joe in 1 year for $400 or maybe even more, based on the prices that Quads sell for today ($1,000+ for a 3.0GHz Quad). However, in 1-2 years it's going to be impossible to sell an E6850 dual core for anywhere near breakeven. How do I know? Because when I was selling my E6400 @ 3.4GHz, I showed the buyer that a 2.93GHz C2D costs $1,000US, so he was getting a faster C2D for far less than $1,000, while I still got back more than I had paid for the processor. Considering a 3.0GHz Quad costs $1G today, an overclocked Q6600 has a lot more cushion to lose value over time, while the most expensive dual core today is only $320, so you wouldn't even be able to reasonably assess a fair resale value.

For these reasons, it still does make sense to get a quad core today (but not for everyone).
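Here are the rough numbers behind point 2 as a quick Python sketch - the prices are the Newegg figures quoted in point 1, and cores x clock is a deliberately crude best-case throughput proxy that only fully multi-threaded software would ever approach:

```python
# Rough numbers for point 2: Q6600 vs. E6750, both overclocked.
# "Potential" = cores x clock, a crude best case for threaded software.
e6750 = {"price": 220, "cores": 2, "oc_ghz": 3.7}
q6600 = {"price": 266, "cores": 4, "oc_ghz": 3.4}

extra_cost = q6600["price"] / e6750["price"] - 1
potential = (q6600["cores"] * q6600["oc_ghz"]
             / (e6750["cores"] * e6750["oc_ghz"])) - 1

print(f"extra cost at list price: {extra_cost:+.0%}")  # about +21%
print(f"potential throughput:     {potential:+.0%}")   # about +84%
# List prices alone give ~21% extra cost for ~84% potential upside
# (the "40% / 75%+" above is the same idea in rounder figures), but
# the upside only materializes when games actually use all 4 cores.
```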
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: SickBeast
That said, the Intel Xeon 3210 is very tempting. It's server-grade, cut from the centre of the wafer, and only costs $250 for 4 cores that will clock to 3.5GHz.

How do you know it would OC so well? I'm also tempted by this CPU.
 

SickBeast

Lifer
Jul 21, 2000
Originally posted by: thilan29
Originally posted by: SickBeast
That said, the Intel Xeon 3210 is very tempting. It's server-grade, cut from the centre of the wafer, and only costs $250 for 4 cores that will clock to 3.5GHz.

How do you know it would OC so well? I'm also tempted by this CPU.
Everything I've read about it online suggests it overclocks better than the Q6600. It's like the Opteron of the Intel world. :light: