Dollar-wise, Intel CPUs do *NOT* cost much more to operate than a comparable AMD. If you disagree, let's see some facts:


dmens

Platinum Member
Mar 18, 2005
2,275
965
136
Originally posted by: Markfw900
Speaking of funny, his arguments make no sense at all, and you think he destroyed my arguments? Mine agree with AnandTech and at least 5 other sites, and his are "approximations" based on garbage. See the other 10 or so posters that agree with me. You are just burying yourself deeper.

LOL

You made arguments? All I saw were a bunch of isolated statements. LOL.

So basically, you are:
1. Incapable of understanding basic arithmetic (kWh -> $)
2. Narrow-minded (cannot even comprehend the fact that people actually turn off their computers, per the OP's premise of average usage times, rather than your own usage pattern)
3. Functionally illiterate (unable to read my posts and respond properly, instead preferring meaningless accusations of garbage)
4. Engineering illiterate (basing your point on feelings rather than actual engineering principles such as heat dissipation and airflow, with regard to case temperatures)
5. Incompetent (you referenced links which further my point instead of yours, nice one)
6. Incapable of understanding basic utilization of the scientific process (use TDP to estimate isolated CPU draw, assume worst case and do the math, see point 1)
7. Delusional (see quote above)

But hey, you have a high post count and you can overclock stuff, guess that justifies your proclamations! Rock on! :D
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Pretty bad case scenario: assume the difference is 110W.

At 8.14 cents per kilowatt-hour:

0.00814 cents per watt-hour x 110 W = 0.8954 cents per hour extra

= $6.45 extra per month, or about $13 if you account for cooling.

About $17 if you also account for power supply inefficiency.

This is of course assuming you're at 100% load at all times. It also assumes one month of operation and a single computer system.
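For anyone who wants to rerun that, here's the same arithmetic as a quick Python sketch; the 110 W delta and 8.14 cents/kWh come from the post above, while the 2x cooling factor and 80% PSU efficiency are rough guesses picked to land near the $13 and $17 figures:

```python
# Worst-case monthly cost of a 110 W draw difference at 8.14 cents/kWh,
# assuming 100% load 24/7, as in the post above.
WATT_DELTA = 110              # extra watts at the wall (assumed worst case)
RATE_CENTS_PER_KWH = 8.14     # electricity rate (assumed)
HOURS_PER_MONTH = 24 * 30

cents_per_hour = WATT_DELTA / 1000 * RATE_CENTS_PER_KWH      # ~0.8954 cents/hour
dollars_per_month = cents_per_hour * HOURS_PER_MONTH / 100   # ~$6.45/month

# Rough add-ons: the ~2x cooling factor and ~80% PSU efficiency are my guesses,
# chosen to land near the $13 and $17 figures above.
with_cooling = dollars_per_month * 2
with_cooling_and_psu = with_cooling / 0.80

print(f"${dollars_per_month:.2f}/month base, ~${with_cooling:.0f} with cooling, "
      f"~${with_cooling_and_psu:.0f} with cooling + PSU losses")
```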

I don't think electric bills are a good indicator of the power usage of computer systems; that would only work if you meticulously monitored all the other electricity use in the residence and accounted for it. There is just too great a possibility for error.

Though for me this isn't a large issue, as my electricity price is something like 5 cents CAD per kilowatt-hour.

 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
Except it's not that extreme:

http://techreport.com/reviews/2006q2/core-duo/index.x?pg=15

System power draw (idle/load) for a high-end system with a 7800 GTX:

Core Duo T2600 106W/125W

FX60 189W/225W

XE955 (B Revision) 189W/286W

XE965 (C Revision) 156W/225W

The differential between a high-end X2 and an XE Presler was about 60W at load. MANY people on this forum have stated that the electricity savings outweighed the initial purchase savings. By that logic, everyone should buy a Core Duo + AOpen i975 board. Even though the initial expense is large, you're going to be saving about 100W.
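To put rough yearly dollar figures on those deltas, here's a small sketch using the load numbers above and the 8.14 cents/kWh rate from earlier in the thread; the 8 hours/day at full load is my assumption, not something from the review:

```python
# Yearly electricity cost for the full-load system draws quoted above,
# at the 8.14 cents/kWh rate used earlier in the thread.
# The 8 hours/day at load is an assumption for illustration only.
RATE_PER_KWH = 0.0814
HOURS_PER_YEAR = 8 * 365

systems_load_watts = {
    "Core Duo T2600": 125,
    "FX60": 225,
    "XE955 (B rev)": 286,
    "XE965 (C rev)": 225,
}

for name, watts in systems_load_watts.items():
    dollars = watts / 1000 * HOURS_PER_YEAR * RATE_PER_KWH
    print(f"{name:15s} {watts:3d} W -> ${dollars:6.2f}/year")

# The ~60 W X2-vs-Presler delta over the same duty cycle:
print(f"60 W delta: ${60 / 1000 * HOURS_PER_YEAR * RATE_PER_KWH:.2f}/year")
```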
 

Absolute0

Senior member
Nov 9, 2005
714
21
81
Ja, an 8xx series at max overclock running 100% load 24/7 doesn't describe the typical user...

And if you're my roommate, you live in a dorm, where we have 6 computers in the room and we try to suck as much free power from the University as possible :)


I think that since Intel released a cheap dual core, people have run out of arguments against it, and now nitpicking power differences is used to attack it. :/
 

eastvillager

Senior member
Mar 27, 2003
519
0
0
Whatever the number ultimately is, it adds up quickly when you start doing datacenter planning. Sun actually has a nifty tool for just that (sure, it's half marketing, half solid facts, but it gives you a ballpark) where you can set up a datacenter and drag and drop racks of servers à la Visio, and it'll calculate total wattage for the servers, total wattage needed for cooling, and the cost to operate at a given energy rate.

When you see power consumption and heat dissipation as part of the marketing for "our processor vs. their processor" or "our server vs. theirs", you're not really looking at material targeted at the typical home user; you're seeing material targeted at the person who has to maintain a datacenter. When you're that person and you find out that you have plenty of floor space left but no additional AC or power capacity, you start paying attention to Sun's and AMD's Opteron marketing.
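For a rough idea of the math such a planning tool does under the hood, here's a toy sketch; every input in it (rack counts, wattages, cooling overhead, electricity rate) is an illustrative placeholder, not a figure from Sun's tool:

```python
# Toy version of a datacenter power/cooling cost estimate, in the spirit of
# Sun's planning tool. Every input below is an illustrative placeholder.
racks = {                       # name: (rack count, watts per rack)
    "web":     (10, 4000),
    "db":      (4, 6000),
    "storage": (2, 3000),
}
COOLING_PER_IT_WATT = 0.7       # watts of cooling per watt of IT load (assumed)
RATE_PER_KWH = 0.10             # dollars per kWh (assumed)
HOURS_PER_YEAR = 24 * 365

it_watts = sum(count * watts for count, watts in racks.values())
cooling_watts = it_watts * COOLING_PER_IT_WATT
total_kw = (it_watts + cooling_watts) / 1000
yearly_cost = total_kw * HOURS_PER_YEAR * RATE_PER_KWH

print(f"IT load {it_watts / 1000:.0f} kW + cooling {cooling_watts / 1000:.0f} kW "
      f"= ~${yearly_cost:,.0f}/year at ${RATE_PER_KWH}/kWh")
```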
 

Duvie

Elite Member
Feb 5, 2001
16,215
0
71
Fact is, for this thread....

FELIX was WRONG AGAIN!!!! It was far worse than his shoddy math....

Enough said...

The rest is nitpicking....



Dexvx... The Presler is known to be better than the Smithfields... rev C Preslers were much better, and obviously Conroe on the 65nm process is about to swing the power-usage advantage back to Intel. This argument won't fly much longer.

Let's not nitpick over dollars; suffice it to say Felix was trolling with some bad math... like usual!!!

People who claim that you need to spend more than a dollar or two per month to run the current batch of Intel CPUs are not being truthful or are basing their assumptions incorrectly.

This statement, even for non-24/7 power-folding users, has still been proven wrong.... Most may not use the CPU at 100% for 24 hours, but most do not shut off the PC, and thus there is still some power draw....
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: Amaroque
Felix, your stats are way off. My electric bill is elevated about $50 a month just from running 4 AMD machines 24/7 (2 AXP, 1 A64, and 1 AX2). Three of the machines almost always have the monitors off.

I can provide scans of my electric bills....
Your experience isn't contrary to Felix's numbers. The key is that his figures assume average powered-on hours, not 24/7 operation. The average home computer is probably only on for a few hours a day.

$0.77/week x 4 weeks x 4 computers x 5x the average daily runtime = $61.60
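Spelled out as a sketch, for anyone checking the arithmetic; the $0.77/week figure and the 5x duty-cycle ratio are the estimates above, not measurements:

```python
# Reconciling a ~$50/month bill increase with a per-machine average-usage estimate.
# $0.77/week and the 5x duty-cycle ratio are the poster's estimates, not measurements.
cost_per_week_avg_usage = 0.77   # dollars, one machine at average daily on-hours
weeks_per_month = 4
machines = 4
duty_cycle_ratio = 5             # 24/7 operation vs. a few hours/day average

monthly = cost_per_week_avg_usage * weeks_per_month * machines * duty_cycle_ratio
print(f"~${monthly:.2f}/month")  # ~$61.60, in the ballpark of the reported ~$50
```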

 

z42

Senior member
Apr 22, 2006
465
0
0
I'm envious of all of you. I go over my base kWh every month. I have no idea how much processor xxx draws compared to processor yyy, but I do know that I pay a lot more than $0.0813/kWh when I ADD to my current usage. It's closer to $0.25/kWh since I'm over the baseline usage. One of the downsides to living in CA, I guess.
 

FelixDeCat

Lifer
Aug 4, 2000
30,959
2,670
126
Originally posted by: z42
I'm envious of all of you. I go over my base kWh every month. I have no idea how much processor xxx draws compared to processor yyy, but I do know that I pay a lot more than $0.0813/kWh when I ADD to my current usage. It's closer to $0.25/kWh since I'm over the baseline usage. One of the downsides to living in CA, I guess.

A lot of these numbers come from 2004, before prices really started to skyrocket.
 

FelixDeCat

Lifer
Aug 4, 2000
30,959
2,670
126
Originally posted by: Duvie
Fact is, for this thread....

FELIX was WRONG AGAIN!!!! It was far worse than his shoddy math....

Enough said...

The rest is nitpicking....

Dexvx... The Presler is known to be better than the Smithfields... rev C Preslers were much better, and obviously Conroe on the 65nm process is about to swing the power-usage advantage back to Intel. This argument won't fly much longer.

Let's not nitpick over dollars; suffice it to say Felix was trolling with some bad math... like usual!!!

People who claim that you need to spend more than a dollar or two per month to run the current batch of Intel CPUs are not being truthful or are basing their assumptions incorrectly.

This statement, even for non-24/7 power-folding users, has still been proven wrong.... Most may not use the CPU at 100% for 24 hours, but most do not shut off the PC, and thus there is still some power draw....


Stop trolling, Duvie. You've proven nothing other than that you know nothing. :thumbsup:

 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Keep in mind that a decent portion of the increased electricity consumption will be from the AC needed to keep the area around the computer livable for humans. :p
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
And y'all realize the '45 watt' delta is not exactly appropriate once you crank both the voltage and the clock speed of the 8XX, right? Once you crank the clock on that bad boy up 50%, expect a similar increase in power used. Increase the voltage, and even more of the same.
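For reference, dynamic CPU power scales roughly with frequency times voltage squared, so a back-of-the-envelope scaling sketch (all numbers hypothetical, not measurements of an 8xx chip) looks like this:

```python
# Rough dynamic-power scaling: P is roughly proportional to f * V^2
# (the capacitance term cancels out in the ratio). All numbers are hypothetical.
def scaled_power(p_base, f_base, f_oc, v_base, v_oc):
    return p_base * (f_oc / f_base) * (v_oc / v_base) ** 2

# e.g. a chip drawing 100 W at 2.66 GHz / 1.30 V, pushed ~50% on clock to 4.0 GHz
# with a bump to 1.45 V (hypothetical figures):
print(f"{scaled_power(100, 2.66, 4.0, 1.30, 1.45):.0f} W")  # ~187 W, nearly double
```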

Also remember that this delta will be multiplied by the PSU not being 100% efficient. So if the CPU draws 100 more watts (for ease of math) and you have an 80% efficient PSU, you're pulling ~125 more watts out of the wall. Add a bit more for losses in the house wiring past your meter. =)

That being said, a 70 watt delta (estimated) works out to about 9 cents for 14 hours of running the box at full tilt; call it 160 hours for a buck. That's a full month of 40-hour work weeks for $1, or a year for $12, assuming 100% load on both cores.

That's not a huge amount. If you use your machine at 100% for 8 hours a day, 5 days a week, it'd take about 4 years to add up to the price difference between an 805 and an X2 3800+. Under more typical usage, which only runs one core at 100% for demanding applications (games) and is idle most of the time, the payback period is probably on the order of a decade or two.

Folding 24x7 is not typical usage, and for that I agree a more efficient Opteron is the way to fly.
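The same payback arithmetic as a small sketch; the 70 W delta and 80% PSU efficiency come from the post above, while the ~$50 price gap and 8.14 cents/kWh rate are assumptions for illustration:

```python
# Years for a power-draw difference to pay back a CPU price difference.
# The 70 W delta and 80% PSU efficiency come from the post above; the ~$50
# price gap and 8.14 cents/kWh rate are assumptions for illustration.
def payback_years(price_diff, watt_delta, hours_per_week, rate_per_kwh, psu_eff=0.80):
    wall_watts = watt_delta / psu_eff                    # PSU losses inflate the wall draw
    kwh_per_year = wall_watts / 1000 * hours_per_week * 52
    return price_diff / (kwh_per_year * rate_per_kwh)

print(f"{payback_years(50, 70, 40, 0.0814):.1f} years")  # roughly 3-4 years at 40 h/week full load
```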


 

Absolute0

Senior member
Nov 9, 2005
714
21
81
Under typical load a computer is basically idling as someone surfs the internet, and one core goes to 100% during gaming.
Everyone has to keep in mind that the only way you're going to get sustained 100% dual-core load for hours is from folding, and if you bought an Intel you probably didn't buy it for folding! lol... More likely a cheap dual core for multimedia needs.
 

dexvx

Diamond Member
Feb 2, 2000
3,899
0
0
Originally posted by: v8envy
Folding 24x7 is not typical usage, and for that I agree a more efficient Opteron is the way to fly.

No, if you were folding 24x7 (depending on the application, because it varies quite a bit between CPUs), Core Duos are the way to go. Their performance/watt is undisputed, and you will save money in the long term despite the higher upfront cost.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
I stand corrected. Previous discussion was re: x3800 vs 805, and that's the box I limited myself to.

Gotta wonder if a Geode- or VIA-CPU-based cluster would do even better. Probably not.
 

o1die

Diamond Member
Jul 8, 2001
4,785
0
71
You can almost double that $13. My rates just went up to 14.5 cents/kWh.
 

Thorny

Golden Member
May 8, 2005
1,122
0
0
Originally posted by: v8envy
And y'all realize the '45 watt' delta is not exactly appropriate once you crank both the voltage and the clock speed of the 8XX, right? Once you crank the clock on that bad boy up 50%, expect a similar increase in power used. Increase the voltage, and even more of the same.

Also remember that this delta will be multiplied by the PSU not being 100% efficient. So if the CPU draws 100 more watts (for ease of math) and you have an 80% efficient PSU, you're pulling ~125 more watts out of the wall. Add a bit more for losses in the house wiring past your meter. =)

That being said, a 70 watt delta (estimated) works out to about 9 cents for 14 hours of running the box at full tilt; call it 160 hours for a buck. That's a full month of 40-hour work weeks for $1, or a year for $12, assuming 100% load on both cores.

That's not a huge amount. If you use your machine at 100% for 8 hours a day, 5 days a week, it'd take about 4 years to add up to the price difference between an 805 and an X2 3800+. Under more typical usage, which only runs one core at 100% for demanding applications (games) and is idle most of the time, the payback period is probably on the order of a decade or two.

Folding 24x7 is not typical usage, and for that I agree a more efficient Opteron is the way to fly.


Let's also not forget that power supplies reach maximum efficiency at higher power draws. When your power draw is lower, your efficiency is lower, and vice versa. If your power draw increased by 100 watts, your efficiency could go up by as much as 20%, depending on your supply. After seeing all the people incapable of doing math in this thread, I'm not going to delve into it, but you can draw your own conclusions.

v8envy is right; the cost isn't worth worrying about unless you've got a room full of racks running 24/7.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Originally posted by: Thorny
Let's also not forget that power supplies reach maximum efficiency at higher power draws. When your power draw is lower, your efficiency is lower, and vice versa. If your power draw increased by 100 watts, your efficiency could go up by as much as 20%, depending on your supply. After seeing all the people incapable of doing math in this thread, I'm not going to delve into it, but you can draw your own conclusions.

v8envy is right; the cost isn't worth worrying about unless you've got a room full of racks running 24/7.

Whoa! Where the hell are these numbers coming from?

1) Max efficiency is generally found at around 80% of load; between roughly 20% and 80% the curve is fairly flat. For most decent PSUs you're only looking at a few percent of difference.

2) Efficiency going up 20%? That's unlikely, to say the least. Only by going from the ultra-low range to the flat part of the curve do you get anywhere near that much difference. And for those stupid enough to buy a 600W PSU for a 200W computer, I don't think they will care.
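To make the flat-curve point concrete, here's a toy efficiency curve; the numbers are purely illustrative, not measurements of any real unit, but they show how little the wall draw changes across the 20-80% load range:

```python
# Toy PSU efficiency curve: poor at very light load, roughly flat from ~20% to ~80%.
# The curve values are illustrative only, not measurements of any real unit.
def efficiency(load_fraction):
    if load_fraction < 0.20:
        return 0.65                                   # deep low-load region
    return 0.80 + 0.03 * min(load_fraction, 0.80)     # ~80.6-82.4% across the flat region

def wall_watts(dc_watts, psu_rating_watts):
    return dc_watts / efficiency(dc_watts / psu_rating_watts)

for dc in (100, 200, 300, 400):                       # DC-side loads on a 500 W unit
    print(f"{dc} W DC -> {wall_watts(dc, 500):.0f} W at the wall")
```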