If you're waiting for those 45W Athlon II X3s and X4s

netxzero64

Senior member
May 16, 2009
538
0
71
woah that's cool coz it'll hurt you in your utilities bill for having the 140w 965... :)
 

LoneNinja

Senior member
Jan 5, 2009
825
0
0
Originally posted by: netxzero64
woah that's cool coz it'll hurt you in your utilities bill for having the 140w 965... :)

The 965 only consumes a few more watts than the 955 under full load; if you don't believe me, look at some of the reviews of it.

Personally, I'm waiting to see how the Athlon II X4 630 turns out for price/performance and energy efficiency. I'm interested in replacing my Athlon X2 7750 with a quad, but I'm on a rather tight budget.
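For rough perspective, here's a back-of-the-envelope sketch of what a small full-load difference actually costs on the bill; every number in it (a hypothetical 15 W gap, 8 hours a day at full load, $0.11/kWh) is a placeholder, not a measurement from any review:

# Back-of-the-envelope yearly cost of a small power-draw difference.
# All inputs are assumptions for illustration, not measured values.
extra_watts = 15         # hypothetical full-load gap between two CPUs
hours_per_day = 8        # assumed time spent at full load each day
price_per_kwh = 0.11     # assumed electricity price in $/kWh

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh

print(f"~{extra_kwh_per_year:.0f} kWh/year, about ${extra_cost_per_year:.2f}/year")
# -> ~44 kWh/year, about $4.82/year

At that scale, a few extra watts are noise on a utility bill.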
 

frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
Originally posted by: LoneNinja
Originally posted by: netxzero64
woah that's cool coz it'll hurt you in your utilities bill for having the 140w 965... :)

The 965 only consumes a few more watts than the 955 under full load; if you don't believe me, look at some of the reviews of it.

Personally, I'm waiting to see how the Athlon II X4 630 turns out for price/performance and energy efficiency. I'm interested in replacing my Athlon X2 7750 with a quad, but I'm on a rather tight budget.
They should hopefully be much cheaper and more energy efficient than the Phenom IIs thanks to the smaller cache. How much cache do they have, anyway? IIRC it's 512KB of L2 per core and no L3. I wonder what kind of performance hit they'll take from that. Back in the socket 754 and 939 days, L2 cache size didn't really have a significant effect on performance (perhaps because the integrated memory controller already provided relatively low-latency access to memory, or maybe software back then simply couldn't make use of large on-die caches). I don't know if that's still the case today, though.

I've been very disappointed with the performance per watt of the Phenom IIs, so hopefully these lower-cache quads will offer comparable performance per clock at a much lower TDP.
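Just to be clear about what I mean by performance per watt: nothing fancier than a benchmark score divided by measured load power. The figures below are invented purely to show the arithmetic, not taken from any real benchmark:

# Rough performance-per-watt comparison; every number here is made up
# to illustrate the calculation, not a real benchmark result.
cpus = {
    "hypothetical quad A": {"score": 100, "load_watts": 125},
    "hypothetical quad B": {"score": 90, "load_watts": 95},
}

for name, d in cpus.items():
    perf_per_watt = d["score"] / d["load_watts"]
    print(f"{name}: {perf_per_watt:.2f} points per watt")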
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: LoneNinja
Originally posted by: netxzero64
woah that's cool coz it'll hurt you in your utilities bill for having the 140w 965... :)

The 965 only consumes a few more watts than the 955 under full load; if you don't believe me, look at some of the reviews of it.

Personally, I'm waiting to see how the Athlon II X4 630 turns out for price/performance and energy efficiency. I'm interested in replacing my Athlon X2 7750 with a quad, but I'm on a rather tight budget.

that is because TDP stands for "thermal design power"... i.e., how much HEAT the cooling solution has to be able to dissipate, and it gets rounded up into set classes to accommodate specific HSF designs. So if worst-case draw goes from 120W to 130W, the label jumps from the 125W class to the 140W class... But that is a cooling spec with headroom built in, not the actual power drawn from the wall.
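To illustrate the rounding-up part, here's a rough sketch of the binning. The bin values are the familiar AMD desktop TDP classes of this era; the example wattages are made up:

# Sketch of how a measured worst-case draw gets rounded up to a TDP class.
# Bin values mirror common AMD desktop TDP classes; example inputs are invented.
TDP_BINS = [45, 65, 95, 125, 140]  # watts

def tdp_label(worst_case_watts):
    """Return the smallest standard TDP class that covers the measured draw."""
    for bin_w in TDP_BINS:
        if worst_case_watts <= bin_w:
            return bin_w
    raise ValueError("draw exceeds every defined TDP class")

print(tdp_label(120))  # 125 -> a chip peaking at 120W still ships as 125W TDP
print(tdp_label(130))  # 140 -> a 10W bump pushes it into the next class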

http://www.lostcircuits.com/ma...k=view&id=44&Itemid=42
TDP: From Design Guide to Marketing Hype

The actual power draw of the CPU became a design specification called TDP and, depending on whose numbers were posted, it stood for typical design power or thermal design power. Semantics aside, in the single-core processor environment dominating at the time, the TDP was usually considered the absolute maximum power consumption that any CPU could face under worst-case conditions. Suffice it to say that in an overwhelming number of cases, it was not possible to even get close to these numbers using commercially available software. In short, the two reasons why a TDP rating was created in the first place were the tendencies to cut corners on the motherboard (remember those dreadful single-phase VRMs used by MSI on a number of boards?) as well as with respect to OEM heatsink solutions. In other words, as soon as there was a standard, the entire infrastructure could be tested and approved against this standard, with the major benefit of improved motherboard and heatsink designs in the PC space.

The big change came with the increased awareness of global warming. Suddenly, what was originally conceived to force third-party manufacturers to have some headroom in their designs became a negative attribute. In short, the conventional wisdom did not differentiate between maximum power consumption under worst-case conditions and typical power consumption. Hence a CPU that carried a high TDP, often chosen to force enthusiast-segment motherboard manufacturers to provide the overhead necessary for even some insane overclocking, was branded a power hog, even if under normal operating conditions power consumption anywhere near the TDP could never be reached. At this point, TDP became a marketing tool: the lower, the better.