Dollar-wise, Intel CPUs do *NOT* cost much more to operate than a comparable AMD. If you disagree, let's see some facts:

FelixDeCat

Lifer
Aug 4, 2000
People who claim that you need to spend more than a dollar or two per month to run the current batch of Intel CPUs are not being truthful, or are basing their claims on faulty assumptions.


I did a quick search for the price per kWh and came up with this from the Nebraska Power Utility District. The page was copyrighted 2004 and states that "These figures are based on an electric price of 8.14¢ per kWh."


In the middle of the table you'll notice it says "Computer w/Monitor, Printer: 77.6¢/week". Prescott was available in 2004. And that includes a computer *with monitor*.


It doesn't state which CPU was being used, but you have to figure this probably included a CRT. So if the total for all items was 77 cents per week, and assuming that was an AMD AND A CRT MONITOR, we can call that a baseline.

There is no way a Prescott or an 8xx/9xx could be using more than 25 cents/week more, so I'll at least give you that.

Now, $0.25 × 52 = $13 a year!

$13 is not $20, let alone $200.

I know it costs more to run a Prescott or a Pentium D, but wow, a whole $13. :roll:

Don't get me wrong, I'm open to a good argument otherwise, with links to back up the facts, so we can at least put this to rest.

:)
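For the math-minded, here is a minimal sketch of the arithmetic above. The 8.14¢/kWh rate and the 25¢/week premium are the figures from this post; the 25 hours of use per week is an assumed basis for the utility's estimate, not a quoted number.

```python
# Sketch of the estimate above (assumptions noted inline).
RATE_PER_KWH = 0.0814         # $/kWh, the quoted 2004 Nebraska rate
premium_per_week = 0.25       # assumed extra weekly cost for a Prescott/8xx/9xx

print(f"Extra per year: ${premium_per_week * 52:.2f}")    # -> $13.00

# What extra draw would 25 cents/week buy at this rate, assuming
# ~25 hours of use per week (a guess at the utility's basis)?
hours_per_week = 25
extra_kw = premium_per_week / RATE_PER_KWH / hours_per_week
print(f"Implied extra draw: {extra_kw * 1000:.0f} W")     # -> ~123 W
```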
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
I don't have links. I have my electric bill, and figures from websites on the power usage of the 8xx-series CPUs. I've posted my numbers in many threads, and you can read them just like everyone else.
From Anand's article (non-overclocked): "If that seems like a lot of power initially, it actually only works out to $32-$42 per year, running both systems 24/7."

Now, I think that is low, and that's without overclocking. If you were running a 50% OC at just under 4 GHz (as many claim they can), it would be $63/year more for the 805 over an X2 3800+, both overclocked and running 24/7.
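As a sanity check on those figures, here is a back-of-envelope sketch of the constant power differential each annual cost would imply at the OP's 8.14¢/kWh rate, running 24/7 (actual PG&E rates would be higher, which would shrink the implied wattage).

```python
# Back-of-envelope: implied constant power differential for each
# annual-cost figure, at an assumed 8.14 cents/kWh, running 24/7.
RATE = 0.0814                 # $/kWh
HOURS_PER_YEAR = 24 * 365     # 8760

for annual_cost in (32, 42, 63):
    watts = annual_cost / (HOURS_PER_YEAR * RATE) * 1000
    print(f"${annual_cost}/year -> ~{watts:.0f} W differential")
# -> ~45 W, ~59 W, and ~88 W respectively
```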

No matter what, they DO cost more to run, whichever number you pick.

/thread
 

Duvie

Elite Member
Feb 5, 2001
That 77.6 cents per week is bunk... flat-out wrong!!!! That's like a Pentium 75 MHz, not a modern CPU by any means...

Here are some more..

http://michaelbluejay.com/electricity/computers.html

This guy breaks it down for the math-challenged...

http://pmdb.cadmusdev.com/powermanagement/quickCalc.html

here is a calculator for you...

http://www.maximumpcguides.com/?p=34

here is another one that makes your math lame!!!

*********************************** HERE YOU GO ********************************

http://www.hardwareinreview.com/cms/content/view/33/1/

pay attention to the Pentium D versus 3800+ column.....



You would think you could use Google a bit more than just finding some obscure power company... find some real data.
 

stevty2889

Diamond Member
Dec 13, 2003
Since I started running Folding@home 24/7, my electric bill has doubled from ~$75 per month to around $150 per month. That's a lot more than 77.6 cents per week...
 

dmens

Platinum Member
Mar 18, 2005
Oh yeah, and where are your numbers, Mark? The only things you've posted regarding this topic in the past two years are repeated assertions of expense and some snarky remarks about air-conditioning requirements in the summer. That does not constitute proof, LOL.

PG&E does not break down your bill into separate rooms or computers. Nor do any reviews I've read give the isolated power draw of the CPU, only the system total. On top of that, since you have multiple computers, your bill is totally useless as a method of comparison.

On the other hand, I've done the math with PG&E as the provider, and at 100% usage 24/7 for a month, the difference is about $80 a year with a generous power estimate against the P-D (feel free to do the math yourself, if you can). So it's more expensive, but only if the user runs crap all day (which is probably less than 0.01% of all computer users in the US, at most).

As for the remarks about air conditioning in the summer, with the same niche usage model (24/7), the P4-Ds dissipate no more extra heat than a couple of 60W light bulbs over an X2. Which of course makes your argument meaningless. It's like bitching about two light bulbs overheating the kitchen when the oven is on.

Edit: Using the OP's power rate, 8.14¢/kWh, that works out to a $100-a-year difference if the two platforms have a constant 150W power differential, which is too high to begin with. Just illustrating the point.
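That math checks out; a minimal sketch, assuming a constant 150W differential around the clock:

```python
# 150 W constant differential, 24/7, at the OP's 8.14 cents/kWh rate.
RATE = 0.0814                  # $/kWh
diff_kw = 0.150                # assumed constant power differential
hours_per_year = 24 * 365      # 8760

print(f"${diff_kw * hours_per_year * RATE:.2f} per year")  # -> $106.96
```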
 

FelixDeCat

Lifer
Aug 4, 2000
Originally posted by: stevty2889
Since I started running Folding@home 24/7, my electric bill has doubled from ~$75 per month to around $150 per month. That's a lot more than 77.6 cents per week...

When did you start this endeavor? The cost of power continues to rise. And thankfully, most people don't run their computers 24/7 (672 hours a month). I use mine about 6 hours a day, or 180 hours a month. Certainly my bill hasn't doubled.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
Originally posted by: dmens
Oh yeah, and where are your numbers, Mark? The only things you've posted regarding this topic in the past two years are repeated assertions of expense and some snarky remarks about air-conditioning requirements in the summer. That does not constitute proof, LOL.

PG&E does not break down your bill into separate rooms or computers. Nor do any reviews I've read give the isolated power draw of the CPU, only the system total. On top of that, since you have multiple computers, your bill is totally useless as a method of comparison.

On the other hand, I've done the math with PG&E as the provider, and at 100% usage 24/7 for a month, the difference is about $80 a year with a generous power estimate against the P-D (feel free to do the math yourself, if you can). So it's more expensive, but only if the user runs crap all day (which is probably less than 0.01% of all computer users in the US, at most).

As for the remarks about air conditioning in the summer, with the same niche usage model (24/7), the P4-Ds dissipate no more extra heat than a couple of 60W light bulbs over an X2. Which of course makes your argument meaningless. It's like bitching about two light bulbs overheating the kitchen when the oven is on.

Edit: Using the OP's power rate, 8.14¢/kWh, that works out to a $100-a-year difference if the two platforms have a constant 150W power differential, which is too high to begin with. Just illustrating the point.


I know how hot it is in my house with and without the Intel systems running; Duvie has been over here. I know what my AC bill is, and what it costs in the winter. And I don't need to prove it. You don't believe me? Fine, that's your problem. Duvie's links prove the power use, and I dare you to come over here, put your hand on the 805 case or the 820 case, and tell me they only put out the same heat as two 60-watt bulbs.

My bill was over $200 one month running all the computers. I turned most of them off, same ambient temps, and the bill dropped to $100. In cooler months, with all the computers on, it's $120, so AC DOES make a difference. Why do you think computer rooms in big companies require so much cooling? Ever been in one? I have.
 

FelixDeCat

Lifer
Aug 4, 2000
Originally posted by: dmens

Edit: Using the OP's power rate, 8.14¢/kWh, that works out to a $100-a-year difference if the two platforms have a constant 150W power differential, which is too high to begin with. Just illustrating the point.

How did you come up with 150W? And how do you figure it's too high? And is this based on 672+ hours a month, or 180 a month like my typical usage?

This is enlightening. :light:

 

FelixDeCat

Lifer
Aug 4, 2000
If you think about it, since the extra $100 per year is based on 24/7 usage with no sleep or hibernation mode (which I also employ), my typical 6-hour-a-day / 180-hour-a-month usage would work out to about a quarter of that number: $100 / 4 = $25 a year, or $25 / 52 ≈ 48 cents a week. Less than the 77 cents a week quoted by the Nebraska utility. :shocked:
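A minimal sketch of that scaling, assuming the ~$100/year 24/7 figure from the previous post:

```python
# Scale the ~$100/year (24/7) figure down to 180 hours/month of use.
annual_247 = 100.0        # $/year at 24/7, from the post above
hours_247 = 672           # ~hours per month running 24/7 (24 x 28)
hours_typical = 180       # 6 hours/day

annual_scaled = annual_247 * hours_typical / hours_247
print(f"${annual_scaled:.2f}/year, {annual_scaled / 52 * 100:.0f} cents/week")
# -> about $26.79/year, ~52 cents/week (close to the 48 cents above)
```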
 

dmens

Platinum Member
Mar 18, 2005
Originally posted by: Markfw900
I know how hot it is in my house with and without the Intel systems running; Duvie has been over here. I know what my AC bill is, and what it costs in the winter. And I don't need to prove it. You don't believe me? Fine, that's your problem. Duvie's links prove the power use, and I dare you to come over here, put your hand on the 805 case or the 820 case, and tell me they only put out the same heat as two 60-watt bulbs.

My bill was over $200 one month running all the computers. I turned most of them off, same ambient temps, and the bill dropped to $100. In cooler months, with all the computers on, it's $120, so AC DOES make a difference. Why do you think computer rooms in big companies require so much cooling? Ever been in one? I have.

Yeah, that really helps your argument. "It feels hotter". LOL! Sometimes my G4 PowerBook feels faster than the X2, but only when I'm ripping two discs at once, HAHAHA. Also, learn how to read: I said the 8xx dissipates about two 60W bulbs' worth of extra heat over the X2.

It is painfully obvious the case of an 8xx machine is hot. But since you're not cramming all your computers into a 4x4 closet with ****** circulation, it's not going to make a damn difference. Even a 20°C difference in case temperature results in a minute change in a decently sized room. Plus, since your AC is so powerful (as demonstrated by your bill), that argument is even more stupid.

So big surprise, your bill dropped $100 with all your machines turned off. Duh. But with the number of machines in your sig, and the fact that all the premium hardware goes into other machines (at least I hope so), the 8xx processor would only be responsible for a fraction of that. Maybe 10%, at most. Look, my numbers work out yet again, LOL.

Look dude, if I don't believe you, it's because you've provided absolutely zero proof worth anything to the argument. Yeah, my problem is that I'm a skeptical guy who actually looks at data instead of grandiose statements.
 

dmens

Platinum Member
Mar 18, 2005
Originally posted by: FelixDeKat
How did you come up with 150W? And how do you figure it's too high? And is this based on 672+ hours a month, or 180 a month like my typical usage?

This is enlightening. :light:

24/7 usage. Factor out the platforms: let's say the X2 uses 60W at full load (kinda low, but whatever); tack on 150W and the P4-D would have to draw 210W at full load. That is higher than every in-house power-virus test result I've seen, except on dual-core EEs and Paxvilles.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
So nobody here can read? Duvie's links? Where they used a power meter on almost identical systems and came to the same conclusions that Anand and I have come to? Not enough proof?

You can lead a horse to water, but you can't make him drink.......
 

dmens

Platinum Member
Mar 18, 2005
I know I can read, but you obviously can't, because those links prove my point and my approximate calculations. LOL!
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
Originally posted by: Matrix21
http://www.pcstats.com/articleview.cfm?articleid=1918&page=3

"When both of the Intel Pentium D 840 cores are under load, the system draws upwards of 240W of power. On the flip side, the when the dual cores of the AMD Athlon64 FX-60 are under load, that system draws just 196W - almost 45W less than the Intel processor-based system. "

And they should have been comparing the 840 to something that performs at the same level, like the 3800 X2.
 

FelixDeCat

Lifer
Aug 4, 2000
Originally posted by: Matrix21
http://www.pcstats.com/articleview.cfm?articleid=1918&page=3

"When both of the Intel Pentium D 840 cores are under load, the system draws upwards of 240W of power. On the flip side, the when the dual cores of the AMD Athlon64 FX-60 are under load, that system draws just 196W - almost 45W less than the Intel processor-based system. "


There's some useful information, and a good link to boot. There's no disagreement that the current generation of Intel CPUs uses more power: in your example, 45W more.

So over the course of a year, assuming 180 hours a month, it won't put much of a dent in the average PC user's wallet.
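For what it's worth, here is that calculation spelled out, assuming the 45W load differential from the quote, 180 hours a month at full load (a worst case, since real use mixes idle and load), and the OP's 8.14¢/kWh rate:

```python
# 45 W system-level load differential at 180 hours/month, 8.14 cents/kWh.
RATE = 0.0814            # $/kWh, the OP's quoted rate
diff_kw = 0.045          # load differential from the PCStats measurement
hours_per_month = 180

monthly = diff_kw * hours_per_month * RATE
print(f"${monthly:.2f}/month, ${monthly * 12:.2f}/year")  # -> $0.66, $7.91
```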
 

robertk2012

Platinum Member
Dec 14, 2004
Originally posted by: FelixDeKat
People who claim that you need to spend more than a dollar or two per month to run the current batch of Intel CPUs are not being truthful, or are basing their claims on faulty assumptions.


I did a quick search for the price per kWh and came up with this from the Nebraska Power Utility District. The page was copyrighted 2004 and states that "These figures are based on an electric price of 8.14¢ per kWh."


In the middle of the table you'll notice it says "Computer w/Monitor, Printer: 77.6¢/week". Prescott was available in 2004. And that includes a computer *with monitor*.


It doesn't state which CPU was being used, but you have to figure this probably included a CRT. So if the total for all items was 77 cents per week, and assuming that was an AMD AND A CRT MONITOR, we can call that a baseline.

There is no way a Prescott or an 8xx/9xx could be using more than 25 cents/week more, so I'll at least give you that.

Now, $0.25 × 52 = $13 a year!

$13 is not $20, let alone $200.

I know it costs more to run a Prescott or a Pentium D, but wow, a whole $13. :roll:

Don't get me wrong, I'm open to a good argument otherwise, with links to back up the facts, so we can at least put this to rest.

:)

I know for a fact my computer costs more than 77 cents a week to run.
 

Amaroque

Platinum Member
Jan 2, 2005
Felix, your stats are way off. My electric bill is up about $50 a month just from running 4 AMD machines 24/7 (2 AXP, 1 A64, and 1 AX2). Three of the machines almost always have the monitors off.

I can provide scans of my electric bills....
 

madrad

Junior Member
May 11, 2006
Originally posted by: Markfw900


And they should have been comparing the 840 to something that performs at the same level, like the 3800 X2.

Yeah, that makes sense... compare the highest-end 8XX-series processor to the lowest-end X2... good thinking.
 

skooma

Senior member
Apr 13, 2006
Originally posted by: dmens
I know I can read, but you obviously can't, because those links prove my point and my approximate calculations. LOL!
:laugh:

Do you ever get tired of destroying markfw's arguments? It's almost painful to watch at times.

Almost....

LMAO
 

dmens

Platinum Member
Mar 18, 2005
Originally posted by: Amaroque
Felix, your stats are way off. My electric bill is up about $50 a month just from running 4 AMD machines 24/7 (2 AXP, 1 A64, and 1 AX2). Three of the machines almost always have the monitors off.

I can provide scans of my electric bills....

That number is for average usage (something like 4 hours a day, plus lots of idling), not 24/7 at load. Actually, $50 a month is pretty low for 4 machines.
 

FelixDeCat

Lifer
Aug 4, 2000
Originally posted by: Amaroque
Felix, your stats are way off. My electric bill is up about $50 a month just from running 4 AMD machines 24/7 (2 AXP, 1 A64, and 1 AX2). Three of the machines almost always have the monitors off.

I can provide scans of my electric bills....

Well, it has been pointed out several times above that the utility company's estimates are probably based on average use like mine: 6 hrs/day, or 180 hours a month.

If you are running the machines 24/7, that translates to 672 hours a month, nearly quadruple what I'm figuring.

Not only that, you're also running FOUR MACHINES! :shocked:

No wonder your power bill is up: you're running nearly four times the average hours, on four machines! :beer:
 

Matrix21

Member
May 26, 2005
Originally posted by: Markfw900
Originally posted by: Matrix21
http://www.pcstats.com/articleview.cfm?articleid=1918&page=3

"When both of the Intel Pentium D 840 cores are under load, the system draws upwards of 240W of power. On the flip side, the when the dual cores of the AMD Athlon64 FX-60 are under load, that system draws just 196W - almost 45W less than the Intel processor-based system. "

And they should have been comparing the 840 to something that performs at the same level, like the 3800 X2.



True, the 3800+ X2 is about 65W compared to 110W for the FX, so the difference would probably be more like 90W.
 

robertk2012

Platinum Member
Dec 14, 2004
Originally posted by: madrad
Originally posted by: Markfw900


And they should have been comparing the 840 to something that performs at the same level, like the 3800 X2.

Yeah, that makes sense... compare the highest-end 8XX-series processor to the lowest-end X2... good thinking.


Well, the 3800+ would perform at least as well.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
Originally posted by: skooma
Originally posted by: dmens
I know I can read, but you obviously can't, because those links prove my point and my approximate calculations. LOL!
:laugh:

Do you ever get tired of destroying markfw's arguments? It's almost painful to watch at times.

Almost....

LMAO

Speaking of funny, his arguments make no sense at all, and you think he destroyed my arguments? Mine agree with AnandTech and at least 5 other sites, and his are "approximations" based on garbage. See the other 10 or so posters that agree with me. You are just burying yourself deeper.

LOL