Core 2 Duo E6850 vs. E8500

hallmarksucks

Junior Member
Jan 21, 2008
2
0
0
The Intel E6850 has a 4MB shared L2 cache, while the E8500 has 6MB (also shared).
The E6850 runs at 3.0 GHz and the E8500 at 3.16 GHz (no overclocking).

What's your suggestion: should I get an E6850 or an E8500?
I couldn't find any comparison charts to look at them side by side.

Thanks,
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Originally posted by: jaredpace
Which one's cheaper?

Even if the E8500 were $40 more, I'd still buy it over the previous 65nm models. It consumes less electricity, which over the long term saves the buyer money and saves energy overall. That alone, in my book, is already a good reason to go with the 45nm models, better performance or not.
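
For a rough sense of scale, here's a back-of-the-envelope sketch in Python; the 15 W average draw difference, the 8 hours/day of use, the $0.10/kWh rate and the $40 premium are all my own assumptions, not measured figures:

```python
# Rough payback estimate for a lower-power CPU.
# Every input below is an assumption for illustration, not a measurement.

watts_saved = 15        # assumed difference in average draw, 65nm vs. 45nm (W)
hours_per_day = 8       # assumed daily usage (hours)
price_per_kwh = 0.10    # assumed electricity rate ($/kWh)
price_premium = 40      # assumed extra cost of the newer chip ($)

kwh_per_year = watts_saved * hours_per_day * 365 / 1000
savings_per_year = kwh_per_year * price_per_kwh
payback_years = price_premium / savings_per_year

print(f"~{kwh_per_year:.0f} kWh/year saved, ~${savings_per_year:.2f}/year, "
      f"payback in ~{payback_years:.1f} years")
# With these numbers: ~44 kWh/year, ~$4.38/year, payback in ~9.1 years.
```

Whether the premium pays for itself obviously depends on the numbers you plug in and on how long you keep the chip.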
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Originally posted by: Saiyukimot
Get an E6750 and overclock it :)

He could also get an E8400 or E8500 and overclock it, so what's your point?
 

DSF

Diamond Member
Oct 6, 2007
4,902
0
71
Originally posted by: Saiyukimot
Get an E6750 and overclock it :)

As Zenoth was subtly suggesting, there's no compelling reason to buy a 6-series Core 2 Duo anymore.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Originally posted by: Zenoth
Even if the E8500 were $40 more, I'd still buy it over the previous 65nm models. It consumes less electricity, which over the long term saves the buyer money and saves energy overall. That alone, in my book, is already a good reason to go with the 45nm models, better performance or not.
Nothing personal, and yes, I'd pick the E8400 over the E6850 as well. But I'd like to point out that the whole performance-per-watt thing is an 'ideology' the industry has been pushing ever since it figured out that it can't increase performance forever. Not that it's completely baseless - it is a very important factor for large computing environments (e.g. server farms) as well as portables. But not for desktops.

Even then, for server farms and home users alike, the bigger issue is heat and the problems that come with it. An ordinary business machine uses less than $10 of electricity per year. Add a discrete graphics card and the cost goes up, but I can hardly imagine more than $20-30 per PC unless sub-zero cooling is involved. Straight from the horse's mouth:

Energy Efficient Performance 2.0 (.pdf)

IMO, performance-per-watt is an overrated measure, especially in enthusiast communities. For home users, turning off one light bulb might save more money than picking a CPU based on its electricity usage.
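
To put the light-bulb comparison in rough numbers (the wattages, hours and electricity rate below are my own assumptions, not figures from the Intel paper):

```python
# Rough comparison: one incandescent bulb vs. the CPU power delta.
# All figures are assumptions for illustration only.

PRICE_PER_KWH = 0.10                 # assumed electricity rate ($/kWh)
bulb_watts, bulb_hours = 60, 5       # one 60 W bulb left on 5 h/day (assumed)
cpu_delta_watts, cpu_hours = 15, 8   # assumed CPU draw difference over 8 h/day

def yearly_cost(watts, hours_per_day):
    """Annual electricity cost in dollars for a constant load."""
    return watts * hours_per_day * 365 / 1000 * PRICE_PER_KWH

print(f"Bulb: ~${yearly_cost(bulb_watts, bulb_hours):.2f}/yr, "
      f"CPU delta: ~${yearly_cost(cpu_delta_watts, cpu_hours):.2f}/yr")
# With these numbers: bulb ~$10.95/yr vs. CPU delta ~$4.38/yr.
```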
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Originally posted by: lopri
Originally posted by: Zenoth
Even if the E8500 were $40 more, I'd still buy it over the previous 65nm models. It consumes less electricity, which over the long term saves the buyer money and saves energy overall. That alone, in my book, is already a good reason to go with the 45nm models, better performance or not.
Nothing personal, and yes, I'd pick the E8400 over the E6850 as well. But I'd like to point out that the whole performance-per-watt thing is an 'ideology' the industry has been pushing ever since it figured out that it can't increase performance forever. Not that it's completely baseless - it is a very important factor for large computing environments (e.g. server farms) as well as portables. But not for desktops.

Even then, for server farms and home users alike, the bigger issue is heat and the problems that come with it. An ordinary business machine uses less than $10 of electricity per year. Add a discrete graphics card and the cost goes up, but I can hardly imagine more than $20-30 per PC unless sub-zero cooling is involved. Straight from the horse's mouth:

Energy Efficient Performance 2.0 (.pdf)

IMO, performance-per-watt is an overrated measure, especially in enthusiast communities. For home users, turning off one light bulb might save more money than picking a CPU based on its electricity usage.

I completely agree.

But when thinking on a large scale...

Try to imagine (and not just you, everyone should try to imagine it): if all the 130nm, 90nm and 65nm CPUs from Intel/AMD were gone tomorrow and every one of those consumers were using the power-efficient 45nm revision instead, how much energy would be saved per state, province or country? How many PCs based on old CPU architectures are still in use today? A lot. Almost certainly the majority.
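
Just to put some (entirely made-up) numbers on that scale argument; the installed-base count, per-machine savings and usage hours below are assumptions, not statistics:

```python
# Hypothetical aggregate savings if a large installed base of older CPUs
# were swapped for more efficient parts. Every input is an assumption.

machines = 100_000_000   # assumed number of older PCs still in use
watts_saved = 15         # assumed average draw reduction per machine (W)
hours_per_day = 8        # assumed daily usage (hours)

wh_per_year = machines * watts_saved * hours_per_day * 365
gwh_per_year = wh_per_year / 1e9   # Wh -> GWh

print(f"~{gwh_per_year:,.0f} GWh/year")
# With these numbers: roughly 4,380 GWh/year across the whole installed base.
```

Even with conservative inputs it adds up, which is the whole point of the scale argument.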

Heck, the office computers my supervisors and boss use at my job are mostly Intel P4Cs, and some are old A64s. On a scale of hundreds of millions of consumers, moving from an older architecture to the most recent one is always better than staying on the previous generation while the new one is already available, better, and sometimes more affordable.

However, when people buy PC hardware such as CPUs, they often think only of "the moment", the present day. If they instead thought, "OK, this one is newer and 20 bucks more expensive, but if I buy it I'll still save money in the long term, because it consumes less energy and my power bills will be lower, even if only to a negligible extent", they'd be helping themselves save money first of all. And if they also thought of the "collective" (let's all become Borg for a moment, shall we), they'd realize they're helping to ease the power demand of the whole country, and, to take it to the extreme, why not, of the whole planet.

I know it might sound far-fetched, but hell... it's how I see it.

And anyway, for me it's a matter of both. I wanted a quad-core at first, but then I realized the virtual world didn't need one more e-penis in the crowd, so I decided to stay with duals. That saved me money. And with two fewer cores I save even more power, which means even more money...
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Zenoth: The issue I'm taking is that a lot of reviewers/editors push this 'performance-per-watt' angle to an extreme without thinking twice, which in turn helps spread this heavily exaggerated metric through the enthusiast community. I'm not sure how much I can blame them, since they're 'geeks' rather than 'socio-economists', but take this G80 debut article, for instance:

http://www.anandtech.com/video/showdoc.aspx?i=2870

Mr. Anand and Derek, who have a long-standing reputation of favoring NV over ATI, went as far as measuring 'performance-per-watt' in every single game test. Obviously one can imagine heavy influence from NV's PR at the time, but to me that was really a joke. (Don't get me wrong, G80 is a work of engineering art, IMO, but that's an entirely different matter.) Thankfully that practice quickly disappeared (they now just measure whole-system power consumption with a 3DMark loop, I think), but it remains an example of how 'paradigm shifts' occur in this volatile market.

Anyway, to be clear, same performance at lower power cost is definitely a good thing. Better performance at lower power is even better. I don't (and can't) dispute that 'fact'. But it is always interesting to observe who's saying what/when/where. OK, no more digressing. :p
 

cuti7399

Platinum Member
Jul 9, 2003
2,583
0
76
How about the E8400 vs. the Q6600?
I got the Q6600 for $200 and I'm debating whether to keep it or buy the E8400.
 

Zenoth

Diamond Member
Jan 29, 2005
5,190
185
106
Originally posted by: cuti7399
May I ask why you're debating?

Because the E8400 is 45nm and the Q6600 is 65nm, and I'm not a gamer but I do lots of video editing/production.

Then get a 45nm quad when they're available (I mean the more affordable ones, not the QX9650).
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: cuti7399
May I ask why you're debating?

Because the E8400 is 45nm and the Q6600 is 65nm, and I'm not a gamer but I do lots of video editing/production.
You're just throwing money away if you upgrade before Nehalem at this point. The one caveat is if you have a good ($120+ P35 or newer) mobo and a crappy B3 Q6600 that won't break 3 GHz with good aftermarket air cooling. In that case you'd probably see a big enough bump from a Q9450 to justify the added expense; otherwise, be patient, grasshopper.
 

hallmarksucks

Junior Member
Jan 21, 2008
2
0
0
Thanks for your feedback. I wasn't concerned about power consumption so much as the fact that the E8400/E8500 are priced lower than the E6850. Heating the planet might be an issue, though! Considering the industry's trends over the past few years, this price point is quite a surprise. What's the catch? More performance for less money? That doesn't sound right.

As always, Intel doesn't offer much more: a 6MB cache, 45nm technology - nothing big for the end user. Or is it?
 

dingetje

Member
Nov 12, 2005
187
0
0
Originally posted by: hans007
Well, the E8400 is better than the E6850... so... the E8500 is even better than that.

I agree! (And I would even prefer an E8200 over an E6850.)
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
Originally posted by: hallmarksucks
Thanks for your feedback. I wasn't concerned about power consumption so much as the fact that the E8400/E8500 are priced lower than the E6850. Heating the planet might be an issue, though! Considering the industry's trends over the past few years, this price point is quite a surprise. What's the catch? More performance for less money? That doesn't sound right.

As always, Intel doesn't offer much more: a 6MB cache, 45nm technology - nothing big for the end user. Or is it?

There is no catch. The 45nm E8x00 line is essentially taking over the 65nm E6x50 line.

The E8500 replaces the E6850, the E8400 replaces the E6750, and the E8200 replaces the E6550. Basically you get more cache and MHz for the dollar; it's that simple.

 

coldpower27

Golden Member
Jul 18, 2004
1,677
0
76
Originally posted by: hallmarksucks
Thanks for your feedback. I wasn't concerned about power consumption so much as the fact that the E8400/E8500 are priced lower than the E6850. Heating the planet might be an issue, though! Considering the industry's trends over the past few years, this price point is quite a surprise. What's the catch? More performance for less money? That doesn't sound right.

As always, Intel doesn't offer much more: a 6MB cache, 45nm technology - nothing big for the end user. Or is it?

This makes Intel's entire E8xxx lineup faster than AMD's dual-core lineup, so it puts pressure on, and effectively caps, what AMD can charge for their dual-cores.

Hey, more performance for the same price, or the same performance for less, is nothing to complain about. :p
 

Conroy9

Senior member
Jan 28, 2000
611
0
0
Originally posted by: hallmarksucks
Thanks for your feedback. I wasn't concerned about power consumption so much as the fact that the E8400/E8500 are priced lower than the E6850. Heating the planet might be an issue, though! Considering the industry's trends over the past few years, this price point is quite a surprise. What's the catch? More performance for less money? That doesn't sound right.

It does sound weird, but it's in line with what they've been doing recently when improved CPUs come out - check out the price difference between the E6600 and the E6750, for example.