E7200 vs. E8400 vs. Q6600 vs. Q9450 Cost/Performance

ccubed

Member
Jul 4, 2008
75
0
0
I'm currently looking to build a new system (mostly for gaming, outputting movies to an HDTV, and general internet action.) I am on a somewhat limited budget, so I was confining myself to the E8400 (or even falling back to the E7200) due to price, but I am now thinking about the long term effectiveness of a quad processor. I will be doing some overclocking on any of them using the Xigmatek HDT-S1283 CPU cooler.

Prices (as of 8-2-08 in California):
E7200 at $120.
E8400 at $150.
Q6600 at $180.
Q9450 at $280.

The Q9450 costs much more than I'd really like to pay right now, but there's a chance the price will drop on Aug 10th when the new chipset officially launches (of course, since it's end-of-line, the cost might not change much). From a cost-to-performance standpoint on these processors, and assuming a 2-year life span, which would be the better chip?

I can wait until Aug 10th, but that's stretching my already thin patience. The Q9450 might also force me to cut back on other things, such as a cheaper motherboard and a different brand of RAM. I might also have to go with a Samsung LCD instead of the LG I covet.

So which gets the best bang for the buck for the next couple of years?

Thank you for your time.
 

solog

Member
Apr 18, 2008
145
0
0
Best bang for the buck out of what you listed would be the E7200. Overall, it would likely be the E2180. You are not gonna see any difference in your movies or internet use with a quad core. Some current games will benefit from the quad. Maybe you should also consider the soon-to-be-launched E5200?

 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
It looks like gaming is the most CPU-intensive thing you do, so an E7200 clocked to near 4GHz would do you just fine (unless you play FSX or Sup Com, which benefit greatly from quads), and it would be the most 'cost effective' option from your list. That said, any of the CPUs bar the Q9450 would be great value when overclocked.
 

Mango1970

Member
Aug 26, 2006
195
0
76
Everyone will tell you something different, as all those CPUs have their merits. I have a dual-core E6850 OC'ed to 3.7GHz and a quad Q6600 G0 OC'ed to 3.2. I always end up going back to my quad. I never thought I would be maxing out all four cores, and for much of what I do I don't, but it still feels snappier. When I have 20 things going, I tend to be able to switch between apps way faster. I also encode quite a bit and use it to watch movies, which all need converting, and honestly, for what you can get a G0 Q6600 for at some places, it's hard to beat.
 

tallman45

Golden Member
May 27, 2003
1,463
0
0

For $30 over the E7200, the E8400 offers more cache and some better features, plus it's easier to OC.

In 2 years it would still perform quite well.
 

Drsignguy

Platinum Member
Mar 24, 2002
2,264
0
76
This has been one of the most asked questions here. As of now, speaking long term, the best price/performance is still the Q6600. If you will upgrade in 6 months or so, then the E7200 is the better value. You can get a great OC with this chip, and the difference in performance is not far off compared to the E8400. Personally, I would go with the quad core, as applications are becoming more and more multithreaded.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
I think it will be at least 3-5 years before a 3.6GHz G0 Q6600 outperforms a 4.0GHz E8400 in >50% of benchmarks across the board. But it's a surefire bet that the 4GHz E8400 will be faster in games between now and then.

As for price/performance: an E7200, E5xxx, or E2xxx is the smarter choice.

 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
I'm going to have to agree with Mango1970. As far as gaming goes, and assuming you're overclocking, you're really not going to notice a huge performance difference among any of the listed processors because games are mostly GPU-bound. However, if you went with the cheap quad core, like the Q6600 (which is a steal at $180), you will most likely see better system responsiveness, not to mention there's no way to really know how (or how many) future games are going to take advantage of multiple cores. If some of Anandtech's Unreal Tournament 3 benchmarks are any indication, quad core will certainly offer better longevity. So I think the Q6600 would be the best value for you.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
here is a little blurb from xbit about dual vs. quad:

E8600 Review:
" In 2-3 weeks Intel will start shipping new dual-core processors with new E0 stepping. There will a new CPU among them: the today?s highest frequency model ? Core 2 Duo E8600. Today?s article will talk about this new processor and the features of all upcoming CPUs with the new processor stepping.
What CPUs suit best for contemporary applications ? dual- or quad-core ones? It is very hard to answer this question, so no wonder that the adherents of both concepts are constantly engaged into long fierce debates about what?s best. While top quad-core processors work at the same frequencies as the dual-core ones, there are not that many applications out there that could really use their entire potential. On the other hand, dual-core CPUs overclock better, boast more favorable thermal characteristics, and the most important thing ? cost considerably less than their quad-core counterparts. That is why many enthusiasts do not hurry to spend their money on Core 2 Quad and Core 2 Extreme processors just yet. "

link:
http://www.xbitlabs.com/articl...ay/core2duo-e8600.html
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
I'm in the same boat as the OP and I'm thinking Q6600. Feel free to convince me otherwise if I'm making a bad decision (which I don't think I am).
 

tallman45

Golden Member
May 27, 2003
1,463
0
0
Originally posted by: Cheex
I'm in the same boat as the OP and I'm thinking Q6600. Feel free to convince me otherwise if I'm making a bad decision (which I don't think I am).

Energy cost over those 2 years means you'd end up paying more for the Q6600 than for a Q9550.

 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
Originally posted by: tallman45
Originally posted by: Cheex
I'm in the same boat as the OP and I'm thinking Q6600. Feel free to convince me otherwise if I'm making a bad decision (which I don't think I am).

Energy cost over those 2 years means you'd end up paying more for the Q6600 than for a Q9550.

How did you figure this? I did a quick look at some of Anandtech's articles, and in one of them the Q6600 seemed to use 23W more than a Q9300 at idle and 56W more at load. The Q9550 would use more power than a Q9300, and I don't know which Q6600 Anand tested (the G0 stepping uses less power than the older versions), but we'll just use these numbers.

The average price of electricity in the US was about 10 cents/kWh in 2006. Let's just use 15 cents for fun and for higher-cost areas. Assume the computer runs 24 hours a day: it idles for 20 hours and is heavily used (gaming or other CPU-intensive tasks) for 4 hours. So in one day it would use 684 Wh more. In one year it would use about 250 kWh more. At 15 cents per kWh, that means in one year the Q6600 would cost him about $37.50 more than a Q9300 would. In two years that's about $75. The Q6600 is about $200, while the Q9550 is probably going to be over $300. How is he really going to be saving money? Let's face it: he would save way, way more money if he just turned off his computer or put it on standby. My math should be correct, and I think my assumptions are fair for the average gamer. I even think the numbers I used way overshoot the Q6600's power consumption, because in another Anandtech article the Q6600 only used 4W more than an E8400 at idle and 21W more at load during a WMV encode.
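If anyone wants to poke at my math, here it is as a few lines of Python - the power deltas are Anand's numbers from above, but the 20/4 hour usage split and the 15-cent rate are my own assumptions, not measurements:

Code:
# Extra running cost of a Q6600 vs. a Q9300, using the Anandtech deltas above.
# The 20h idle / 4h load split and the $0.15/kWh rate are assumptions.
idle_delta_w, load_delta_w = 23, 56     # extra watts at idle / under load
idle_hours, load_hours = 20, 4          # assumed daily usage pattern
price_per_kwh = 0.15                    # above the ~$0.10 2006 US average

extra_wh_per_day = idle_delta_w * idle_hours + load_delta_w * load_hours  # 684 Wh
extra_kwh_per_year = extra_wh_per_day * 365 / 1000                        # ~250 kWh
print(round(extra_kwh_per_year * price_per_kwh, 2))  # ~$37.45/yr, ~$75 over 2 years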
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Those prices are astounding. NewEgg and TankGuys can't touch them.

For $280, I'd go for the Q9450. I just don't see the sense in getting a 65nm Q6600 at this point. But that's just me.
 

tallman45

Golden Member
May 27, 2003
1,463
0
0
Originally posted by: cusideabelincoln
Originally posted by: tallman45
Originally posted by: Cheex
I'm in the same boat as the OP and I'm thinking Q6600. Feel free to convince me otherwise if I'm making a bad decision (which I don't think I am).

Energy cost over those 2 years means you'd end up paying more for the Q6600 than for a Q9550.

How did you figure this? I did a quick look at some of Anandtech's articles, and in one of them the Q6600 seemed to use 23W more than a Q9300 at idle and 56W more at load. The Q9550 would use more power than a Q9300, and I don't know which Q6600 Anand tested (the G0 stepping uses less power than the older versions), but we'll just use these numbers.

The average price of electricity in the US was about 10 cents/kWh in 2006. Let's just use 15 cents for fun and for higher-cost areas. Assume the computer runs 24 hours a day: it idles for 20 hours and is heavily used (gaming or other CPU-intensive tasks) for 4 hours. So in one day it would use 684 Wh more. In one year it would use about 250 kWh more. At 15 cents per kWh, that means in one year the Q6600 would cost him about $37.50 more than a Q9300 would. In two years that's about $75. The Q6600 is about $200, while the Q9550 is probably going to be over $300. How is he really going to be saving money? Let's face it: he would save way, way more money if he just turned off his computer or put it on standby. My math should be correct, and I think my assumptions are fair for the average gamer. I even think the numbers I used way overshoot the Q6600's power consumption, because in another Anandtech article the Q6600 only used 4W more than an E8400 at idle and 21W more at load during a WMV encode.


There are more factors involved.

1) The Q6600 produces more heat, especially if one overclocks; it will cost $$ to keep the room cooler as a result.

2) If speeds are left at stock (which keeps heat and power usage lower), the Q6600 runs at 2.4GHz while a Q9450 runs at 2.66GHz. The difference, while small at ~266MHz, is x4 for the 4 cores, which means that for less power and less heat you have roughly 1GHz more available processing power all the time. Work involving reads and writes to the HDD subsystem and memory will also finish faster, so those components return to idle sooner doing the same work on a Q9450 than on a Q6600, which means the other system components generate less heat and draw less power.

3) Lastly, factor in the time the user saves over the year with a machine that can always process work ~1GHz faster.

There is a lot more than just the processor to factor into system resource usage. What one needs to look at is the time charts in reviews: tasks completed in fewer seconds also mean that the rest of the system is finished and the entire system returns to idle, not just the processor.

For example, if a system performs just 60 tasks in an hour and a Q9450 performs each of them just one second faster, then in that hour the Q9450 system returns to idle a full minute sooner than the Q6600, which is still running at peak for that minute.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
Originally posted by: tallman45
There are more factors involved.

1) The Q6600 produces more heat, especially if one overclocks; it will cost $$ to keep the room cooler as a result.

2) If speeds are left at stock (which keeps heat and power usage lower), the Q6600 runs at 2.4GHz while a Q9450 runs at 2.66GHz. The difference, while small at ~266MHz, is x4 for the 4 cores, which means that for less power and less heat you have roughly 1GHz more available processing power all the time. Work involving reads and writes to the HDD subsystem and memory will also finish faster, so those components return to idle sooner doing the same work on a Q9450 than on a Q6600, which means the other system components generate less heat and draw less power.

3) Lastly, factor in the time the user saves over the year with a machine that can always process work ~1GHz faster.

There is a lot more than just the processor to factor into system resource usage. What one needs to look at is the time charts in reviews: tasks completed in fewer seconds also mean that the rest of the system is finished and the entire system returns to idle, not just the processor.

For example, if a system performs just 60 tasks in an hour and a Q9450 performs each of them just one second faster, then in that hour the Q9450 system returns to idle a full minute sooner than the Q6600, which is still running at peak for that minute.
The benchmarks I used were for complete system draw, not just the processor. What you're describing is extremely minimal - far smaller than the raw power-draw difference between a Q6600 and a Q9450.

First, in the fall or winter the "extra" heat produced by the Q6600 will help "save money" on the electric bill. Not to mention, this "extra heat" is insignificant if the computer is turned off when not in use. Go take a thermodynamics or basic chemistry class, please.

Second, just because the Q6600 is slower doesn't mean everything takes longer to do. When he's playing a game or watching a movie, it's not going to take 10% (the clock speed difference between the processors) longer. They will take the same amount of time.

Third, you can't multiply the clock difference by the number of cores and magically claim the Q9450 is 1GHz faster than a Q6600. That is by far the most asinine statement I've ever read, and it would probably make baby Anand Lal Shimpi cry at night.

Stop trying to make a case for your argument. Over two years' (even three years') time, assuming the US doesn't go into a catastrophic economic collapse, he isn't going to save money by buying the Q9450 or Q9550. And if the US does collapse, then we have better things to worry about than debating the efficiency of Core 2 Quad processors. Oh, and most importantly, I overstated the power draw of the Q6600 relative to the Q9300.

ccubed has already stated the Q9450 is really stretching his budget right now. RIGHT NOW. So there's nothing wrong with the Q6600.

I just found some good power benchmarks here:
http://techgage.com/article/in...2_quad_q9450_266ghz/12

This shows the Q6600 drawing only 14W more under full load than the Q9450. Assume the computer runs at full load 24/7, 365 days a year, at $0.18 per kWh, and that still only comes to an extra $45 total on the electric bill over two years.
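Same sort of sketch for this worst case - the 14W delta is from the techgage link, and running at full load 24/7 is deliberately pessimistic:

Code:
# Worst case: 14W extra at full load, 24/7 for two years, at $0.18/kWh.
extra_kwh = 14 * 24 * 365 * 2 / 1000   # ~245 kWh over two years
print(round(extra_kwh * 0.18, 2))      # ~$44 total on the bill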
 

Billy Idol

Member
Jan 31, 2005
40
0
0
Knowing what I know now, I'd probably drop back to an E7200. My 8400 + S1283 combo performs satisfactorily at stock clock for the heat of summer. I had intended to run at 3.6 or 3.8 for daily operation, but I haven't seen a need. Everything in the desktop environment happens immediately, and fps in games is almost all video card anyway. Save the money and go with a 7200 - or, if you sleep in the room with the computer (as I do) on 24/7, an 8400 at barely above ambient with fans on low ain't so bad either.
 

Shortass

Senior member
May 13, 2004
908
0
76
Originally posted by: cusideabelincoln
Originally posted by: tallman45
There are more factors involved.

1) The Q6600 produces more heat, especially if one overclocks; it will cost $$ to keep the room cooler as a result.

2) If speeds are left at stock (which keeps heat and power usage lower), the Q6600 runs at 2.4GHz while a Q9450 runs at 2.66GHz. The difference, while small at ~266MHz, is x4 for the 4 cores, which means that for less power and less heat you have roughly 1GHz more available processing power all the time. Work involving reads and writes to the HDD subsystem and memory will also finish faster, so those components return to idle sooner doing the same work on a Q9450 than on a Q6600, which means the other system components generate less heat and draw less power.

3) Lastly, factor in the time the user saves over the year with a machine that can always process work ~1GHz faster.

There is a lot more than just the processor to factor into system resource usage. What one needs to look at is the time charts in reviews: tasks completed in fewer seconds also mean that the rest of the system is finished and the entire system returns to idle, not just the processor.

For example, if a system performs just 60 tasks in an hour and a Q9450 performs each of them just one second faster, then in that hour the Q9450 system returns to idle a full minute sooner than the Q6600, which is still running at peak for that minute.
The benchmarks I used were for complete system draw, not just the processor. What you're describing is extremely minimal - far smaller than the raw power-draw difference between a Q6600 and a Q9450.

First, in the fall or winter the "extra" heat produced by the Q6600 will help "save money" on the electric bill. Not to mention, this "extra heat" is insignificant if the computer is turned off when not in use. Go take a thermodynamics or basic chemistry class, please.

Second, just because the Q6600 is slower doesn't mean everything takes longer to do. When he's playing a game or watching a movie, it's not going to take 10% (the clock speed difference between the processors) longer. They will take the same amount of time.

Third, you can't multiply the clock difference by the number of cores and magically claim the Q9450 is 1GHz faster than a Q6600. That is by far the most asinine statement I've ever read, and it would probably make baby Anand Lal Shimpi cry at night.

Stop trying to make a case for your argument. Over two years' (even three years') time, assuming the US doesn't go into a catastrophic economic collapse, he isn't going to save money by buying the Q9450 or Q9550. And if the US does collapse, then we have better things to worry about than debating the efficiency of Core 2 Quad processors. Oh, and most importantly, I overstated the power draw of the Q6600 relative to the Q9300.

ccubed has already stated the Q9450 is really stretching his budget right now. RIGHT NOW. So there's nothing wrong with the Q6600.

I just found some good power benchmarks here:
http://techgage.com/article/in...2_quad_q9450_266ghz/12

This shows the Q6600 drawing only 14W more under full load than the Q9450. Assume the computer runs at full load 24/7, 365 days a year, at $0.18 per kWh, and that still only comes to an extra $45 total on the electric bill over two years.

Agreed. I was quite set on getting a Q9550 just for the bit of extra speed and lower power consumption - quite important when Folding - but decided the price difference is just too great to justify the investment. The arguments being made for gaming are ridiculous... there will be no difference between a 3.2GHz Q6600 and a 3.6GHz Q9550 (or a dual).

Unless you require SSE4.1, lower 24/7 full-load power consumption, or simply want the newest process, go for the Q6600. If you're simply gaming, just get the E7200 and be done with it... you will be quite pleased.
 

OLpal

Member
Feb 12, 2008
188
0
0
One thing you didn't factor in is heat (it affects the comfort level of the room and the electric bill!).
Also, after they get done OC'ing the Q6600 for performance equal to a Q9550, it'd be using a lot more energy than you've factored in here!

I'd take the E8400 over them all for a gaming upgrade at this time!

Ol'Pal


Originally posted by: cusideabelincoln
Originally posted by: tallman45
Originally posted by: Cheex
I'm in the same boat as the OP and I'm thinking Q6600. Feel free to convince me otherwise if I'm making a bad decision (which I don't think I am).

Energy cost over those 2 years means you'd end up paying more for the Q6600 than for a Q9550.

How did you figure this? I did a quick look at some of Anandtech's articles, and in one of them the Q6600 seemed to use 23W more than a Q9300 at idle and 56W more at load. The Q9550 would use more power than a Q9300, and I don't know which Q6600 Anand tested (the G0 stepping uses less power than the older versions), but we'll just use these numbers.

The average price of electricity in the US was about 10 cents/kWh in 2006. Let's just use 15 cents for fun and for higher-cost areas. Assume the computer runs 24 hours a day: it idles for 20 hours and is heavily used (gaming or other CPU-intensive tasks) for 4 hours. So in one day it would use 684 Wh more. In one year it would use about 250 kWh more. At 15 cents per kWh, that means in one year the Q6600 would cost him about $37.50 more than a Q9300 would. In two years that's about $75. The Q6600 is about $200, while the Q9550 is probably going to be over $300. How is he really going to be saving money? Let's face it: he would save way, way more money if he just turned off his computer or put it on standby. My math should be correct, and I think my assumptions are fair for the average gamer. I even think the numbers I used way overshoot the Q6600's power consumption, because in another Anandtech article the Q6600 only used 4W more than an E8400 at idle and 21W more at load during a WMV encode.

 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
Do people not even read what I wrote?? The heat output difference is INSIGNIFICANT! And the "extra heat" has different effects depending on where you live. Do the world a favor and turn off your PC, or put it in standby, when you're not using it. Doing this will make heat output a non-factor. But if people insist on bringing this up, I'll try a quick estimate...

q = m * c * deltaT

The amount of heat (q) required to raise the room temperature from 22.22C to 22.77C (a 1 degree F increase) works out like this: the density of air is about 1.2 kg/m^3, so an 8m x 4m x 4m room holds about 153.6 kg of air. In a 6-room house, that's 921.6 kg.

q = (153,600 g) * (1.012 J/g-K) * (0.55 K) ≈ 85,500 J per room, x6 ≈ 513,000 J for the whole house

Raising the temp of the house 1 degree F over the span of an hour (time the computer is turned on) would require about 142W; over half a day, about 12W.

The TDP of a Q9450 is 95W and the Q6600 is 105W. That's a 10W, or 10 J/s, difference. So over about half a day, the average time the computer would be on or in use, the Q6600 puts out roughly enough extra heat to raise the temp of the house 1 degree F more than the Q9450 would. (There are 43,200 seconds in 12 hours; 10W * 43,200 s = 432,000 J more output from the Q6600 than from the Q9450. Since it takes roughly 500,000 J to raise the temp of the house one degree, the Q6600 would need about half a day to raise the house one degree more than a Q9450 would under the same operating conditions.)

If you google it, most energy-saving tips say that if you turn your AC up (or heat down) by one degree, you'll save an average of 2% on your bill, assuming the units run 24/7. Since I'm only assuming the computer stays on half the day, we can estimate that the Q6600, raising the temp in the summer one degree more than a Q9450 would, costs you 1% more on your bill. Say your bill is fairly high at $100/mo; you would spend an extra $1/mo, or $12/yr. That's $24 over 2 years if you live in a place that uses AC year-round. If you live in a place that gets cold, the Q6600 wouldn't hurt your electric bill over the course of a year, since you'll run AC for half the year and heat for the other half, and the extra heat from the Q6600 would cut the heater's required output.

All of these numbers are estimates, and I think my assumptions fairly over-estimate standard operating practices and costs.
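And since people apparently won't take my word for it, here's the whole estimate in a few lines of Python - the room size, number of rooms, and the 12-hour on-time are my assumptions, not measurements:

Code:
# Back-of-the-envelope q = m*c*deltaT estimate from the post above.
# Room dimensions, the 6-room house, and air properties are assumptions.
room_volume_m3 = 8 * 4 * 4                 # one 8m x 4m x 4m room
air_mass_kg = 1.2 * room_volume_m3 * 6     # ~921.6 kg of air in a 6-room house
c_air = 1012                               # J/(kg*K), specific heat of air
delta_t_k = 0.55                           # roughly 1 degree F

q_house = air_mass_kg * c_air * delta_t_k  # ~513,000 J to warm the house 1F
tdp_delta_w = 105 - 95                     # Q6600 TDP minus Q9450 TDP, in watts
print(q_house / tdp_delta_w / 3600)        # ~14 hours - i.e. about half a day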
 

tallman45

Golden Member
May 27, 2003
1,463
0
0
Originally posted by: cusideabelincoln
Do people not even read what I wrote?? The heat output difference is INSIGNIFICANT! Do the world a favor and turn off your PC, or put it in standby, when you're not using it. Doing this will make heat output a non-factor. But if people insist on bringing this up, I'll try a quick estimate...

q = m * c * deltaT

The amount of heat (q) required to raise the room temperature from 22.22C to 22.77C (a 1 degree F increase) works out like this: the density of air is about 1.2 kg/m^3, so an 8m x 4m x 4m room holds about 153.6 kg of air. In a 6-room house, that's 921.6 kg.

q = (153,600 g) * (1.012 J/g-K) * (0.55 K) ≈ 85,500 J per room, x6 ≈ 513,000 J for the whole house

Raising the temp of the house 1 degree F over the span of an hour (time the computer is turned on) would require about 142W; over half a day, about 12W.

The TDP of a Q9450 is 95W and the Q6600 is 105W. That's a 10W, or 10 J/s, difference. So over about half a day, the average time the computer would be on or in use, the Q6600 puts out roughly enough extra heat to raise the temp of the house 1 degree F more than the Q9450 would.

If you google it, most energy-saving tips say that if you turn your AC up (or heat down) by one degree, you'll save an average of 2% on your bill, assuming the units run 24/7. Since I'm only assuming the computer stays on half the day, we can estimate that the Q6600, raising the temp in the summer one degree more than a Q9450 would, costs you 1% more on your bill. Say your bill is fairly high at $100/mo; you would spend an extra $1/mo, or $12/yr. That's $24 over 2 years if you live in a place that uses AC year-round. If you live in a place that gets cold, the Q6600 wouldn't hurt your electric bill over the course of a year, since you'll run AC for half the year and heat for the other half, and the extra heat from the Q6600 would cut the heater's required output.

All of these numbers are estimates, and I think my assumptions fairly over-estimate standard operating practices and costs.

Also, the Q6600 can overclock well at stock voltages. At least, it can overclock well enough to provide satisfactory performance.

Not much credibility in your words, sad to say, but as you get older you will learn.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,268
11
81
tallman45, you have absolutely NO PROOF WHATSOEVER. Did you even go to school? You have provided zero evidence for anything you have said.
 

Cheex

Diamond Member
Jul 18, 2006
3,123
0
0
cusideabelincoln, stop bashing on tallman45 and let's keep this discussion civil.

Moving on...

In my case the energy cost isn't so much the concern; it's the immediate purchase cost and what will be better over time (price/performance).
Since the first announcements I've wanted a Q9450; however, at this point, with Nehalem on our heels, the Q6600 seems the best interim upgrade option for me. At least until a Nehalem platform becomes affordable.
 

Drsignguy

Platinum Member
Mar 24, 2002
2,264
0
76
Originally posted by: Cheex
cusideabelincoln, stop bashing on tallman45 and let's keep this discussion civil.

Moving on...

In my case the energy cost isn't so much the concern; it's the immediate purchase cost and what will be better over time (price/performance).
Since the first announcements I've wanted a Q9450; however, at this point, with Nehalem on our heels, the Q6600 seems the best interim upgrade option for me. At least until a Nehalem platform becomes affordable.



Good call, Cheex. I too would wait. Both chips are very good, but the better value of the quads is the Q6600. Btw, the quad core will help your crunching... :)
 

toadeater

Senior member
Jul 16, 2007
488
0
0
Originally posted by: cusideabelincoln
The TDP of a Q9450 is 95W and the Q6600 is 105W. That's a 10W, or 10 J/s, difference. So over about half a day, the average time the computer would be on or in use, the Q6600 puts out roughly enough extra heat to raise the temp of the house 1 degree F more than the Q9450 would.

Only if you live in a hobbit house with extremely poor ventilation.

I was going to say something about generating 300W in my pants while browsing image***, but I decided this probably isn't the best place for these types of comments.

EDIT: I didn't realize the Q6600 was so cheap now. There's no question I would buy that over a Q9450. The minor energy savings are insignificant unless you run at 100% load all the time.