Is AMD FX Bulldozer really that bad for Gaming?


pelov

Diamond Member
Dec 6, 2011
So 3.6GHz > 4.2GHz on stock volts? What are stock volts for that chip anyway?

The power consumption is rising on that 2600K. From 3GHz > 4GHz the power consumption jumped ~30-40W with an unchanging vcore. Not much, right? Well, the chip has a 95W TDP, so that's a 30%+ bump in power consumption. The watts consumed go up and they're absolutely noticeable. I'm not sure that graph is helping your cause here, dude.
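For the arithmetic behind that 30%+ figure (a quick check of the quoted ~30-40W jump against the chip's rated budget):

```latex
\frac{30\,\mathrm{W}}{95\,\mathrm{W}} \approx 32\%, \qquad \frac{40\,\mathrm{W}}{95\,\mathrm{W}} \approx 42\%
```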

And that graph is linear, like I said. Look at the vcore adjustments and you'll see they're exponential =P
 

Phynaz

Lifer
Mar 13, 2006
Actually, wattage will go up by raising the voltage (W = V * I), not the frequency; that is why OCing with default voltage doesn't raise the power usage a lot. ;)

From IDC's post:

http://forums.anandtech.com/showthread.php?t=2195927

See how the power consumption stays low as frequency rises using the same Vcc (red) vs. green.

Reading the same chart as you (red line), I see CPU power consumption double between 1.5GHz and 3.0GHz, on an undervolted CPU. The blue line (nominal voltage) shows the same behavior.

IDC's finding was that power consumption scales linearly with frequency. That's not the same as "OCing with default voltage doesn't raise the power usage a lot".
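For reference, the standard first-order approximation for CMOS dynamic power is P ≈ C·V²·f: linear in frequency, quadratic in voltage. A minimal sketch of what that model predicts; the capacitance constant below is made up for illustration, not measured from any chip in this thread:

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# C_EFF is an arbitrary effective-capacitance constant chosen so the
# outputs look watt-like; it is not a measured value for any real chip.
def dynamic_power(c_eff, vcore, freq_ghz):
    return c_eff * vcore ** 2 * freq_ghz

C_EFF = 20.0

# Double the frequency at fixed voltage: power doubles (linear).
print(dynamic_power(C_EFF, 1.0, 1.5))  # 30.0
print(dynamic_power(C_EFF, 1.0, 3.0))  # 60.0

# Raise the voltage ~10% at fixed frequency: power rises ~21% (quadratic).
print(dynamic_power(C_EFF, 1.1, 3.0))  # ~72.6
```

Leakage and static power are ignored here, which is part of why real whole-system measurements deviate from the straight line.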
 

MentalIlness

Platinum Member
Nov 22, 2009
So 3.6GHz > 4.2GHz on stock volts? What are stock volts for that chip anyway?

These volts here, I have never changed them. Stock is 3.1GHz and it went to 4.0GHz with no voltage increase. I have not tried to go higher, nor have I increased any volts.

In the UEFI BIOS, it is set to normal mode. I have not changed any advanced settings.

Are the stock volts different for the 4100 or 6100?



[Attached screenshot: voltage readings]
 

AtenRa

Lifer
Feb 2, 2009
Reading the same chart as you (red line), I see CPU power consumption double between 1.5GHz and 3.0GHz, on an undervolted CPU. The blue line (nominal voltage) shows the same behavior.

IDC's finding was that power consumption scales linearly with frequency. That's not the same as "OCing with default voltage doesn't raise the power usage a lot".

I said "not a lot" because of the small OC, from 3.6GHz to 4.2GHz.

[Chart: i7-2600K Vcc/clock speed versus power consumption]


With Vcc 0.972 (red) at 1.5GHz the power usage is close to 155W. At 2.5GHz (a 1GHz increase) the power usage is close to 170W.

With Vcc 1.165 (blue) at 1.5GHz the power usage is close to 175W. At 2.5GHz the power usage is close to 190W.

For a 1GHz increase in frequency we only see a ~15-25W increase in power usage.

Now look what happens when you keep the same frequency and raise the voltage.

[Chart: i7-2600K Vcc versus dynamic power consumption]
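As a rough consistency check on the red-curve readings above, you can assume the wall number is a fixed system baseline plus a C·V²·f term and back out both constants from the two readings (inferred, not measured directly):

```python
# Fit P_wall = baseline + k * V^2 * f to two readings at the same Vcc
# (the approximate red-curve values quoted above).
v = 0.972            # Vcc for the red curve, in volts
p1, f1 = 155.0, 1.5  # watts at 1.5 GHz
p2, f2 = 170.0, 2.5  # watts at 2.5 GHz

k = (p2 - p1) / (v ** 2 * (f2 - f1))  # ~15.9 W per (V^2 * GHz)
baseline = p1 - k * v ** 2 * f1       # ~132.5 W that never scales with f
print(round(k, 1), round(baseline, 1))
```

That ~130W of non-scaling draw is why the total-system curves look so flat: the CPU's dynamic power is only a slice of the wall reading.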
 

pelov

Diamond Member
Dec 6, 2011
But that's exactly what I said. Clock speed bumps increase power consumption linearly, and voltage increases are quadratic (power rises with the square of the voltage).

With higher voltages the clock speed increases also show a larger increase in power consumption, albeit still linear. So assuming you go from 3.6GHz > 4.2GHz on something like 1.2 > 1.3 vcore, you'll still get a significant hit on power consumption percentage-wise. You were arguing that the increase in power is negligible when it's quite clearly not.
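Plugging those hypothetical numbers (both the clocks and the vcore values are assumed, not measured) into the usual C·V²·f approximation:

```latex
\frac{P_2}{P_1} \approx \frac{f_2}{f_1}\left(\frac{V_2}{V_1}\right)^{2}
= \frac{4.2}{3.6}\left(\frac{1.3}{1.2}\right)^{2} \approx 1.167 \times 1.174 \approx 1.37
```

That's roughly 37% more dynamic power, which is anything but negligible.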
 

AtenRa

Lifer
Feb 2, 2009
pelov, yes, that's what you said; I don't disagree with you.

I still don't believe the OC from 3.6GHz to 4.2GHz will raise the power usage more than 15-20W at full load for the 4-threaded FX4100.
 

AtenRa

Lifer
Feb 2, 2009
OK here we go,

Running the TrueCrypt 500MB benchmark. Maximum reading from a Kill A Watt meter:

FX8150 with 1.425V at 3.6GHz = 312W
FX8150 with 1.425V at 4.2GHz = 342W

A 30W difference. That was the maximum reading; the actual power usage difference over the entire benchmark run would be smaller.
 

Hypertag

Member
Oct 12, 2011
I have the Corsair H80 on my FX 8120, and at idle at exactly 4.0GHz I have never seen the temps higher than 33°C. A few days ago, during Crysis 2, I monitored the load temps while playing the game for right around 2 1/2 hours, and the highest temp recorded was 47°C.

No clue about power usage, since I really don't care about that anyway, so I don't check or monitor it. It was at stock voltage as well.

And I am not agreeing with anyone here; this is just what I have found with my 8120.

Well, there we have it. My Phenom II X4 955 must be better than Sandy Bridge too, since it maxes out at 56 degrees Celsius and Sandy Bridge's core temperature is never that low at load. I mean, how dare Intel make processors that run so HOT. It is just crap. The 1090T idles at 14 degrees Celsius according to this review: http://www.guru3d.com/article/phenom-ii-x6-1055t-1090t-review/5 . AMD certainly makes a lot of sub-ambient processors! Intel couldn't dream of matching that.

I wonder what the 8-bit Zilog Z80 in my TI-84 runs at? That must be the most powerful processor ever made, since it runs so cold.
 

Hypertag

Member
Oct 12, 2011
OK here we go,

Running the TrueCrypt 500MB benchmark. Maximum reading from a Kill A Watt meter:

FX8150 with 1.425V at 3.6GHz = 312W
FX8150 with 1.425V at 4.2GHz = 342W

A 30W difference. That was the maximum reading; the actual power usage difference over the entire benchmark run would be smaller.


You increased the frequency by 16.6%, and your processor's power usage increased by approximately the same proportion. The reason the wall reading didn't move by exactly that percentage is that you are measuring total system power.
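Under that assumption (only the CPU's frequency-scaled power changed between the two readings, at fixed Vcore), the 30W delta lets you back out roughly how the wall power splits:

```python
# Infer the frequency-scaled (CPU dynamic) share of the wall reading,
# assuming P_cpu scales linearly with clock at fixed voltage.
p1, p2 = 312.0, 342.0  # Kill A Watt readings, total system watts
f1, f2 = 3.6, 4.2      # GHz

p_cpu = (p2 - p1) / (f2 / f1 - 1.0)  # ~180 W scaled with frequency
p_rest = p1 - p_cpu                  # ~132 W of everything else
print(round(p_cpu), round(p_rest))
```

So roughly 180W scaled with the clock and the rest of the system made up the remainder, which is consistent with the wall delta being ~10% while the CPU's own increase was ~16.6%.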
 

lifeblood

Senior member
Oct 17, 2001
I think the issue with your arguments here is that you don't know what you're talking about. If I'm being a bit belligerent, you'll have to excuse me, but your arguments either haven't made any sense or they're exactly what I, and others, have been saying, yet you keep quoting them and assuming we disagree with you. If you quote something that states nearly the same thing you're saying, you're being redundant.

Perhaps I wasn't clear enough. My point is that GF deserves part of the blame for failing to deliver the speeds necessary for BD to succeed. You place the entire blame on AMD. I don't think AMD would have gone down the clockspeed path if GF hadn't said it was possible.

And yes, you're being belligerent and should probably take a few deep breaths.
 

Smoblikat

Diamond Member
Nov 19, 2011
I can guarantee you that every FX CPU will OC to 4.2GHz. You know why? Because they're already selling a 4.2GHz FX CPU (the FX-4170).

;)

No you can't. You can guarantee that every FX-4170 can hit at least 4.2GHz, no more, no less.
 

Smoblikat

Diamond Member
Nov 19, 2011
OK here we go,

Running the TrueCrypt 500MB benchmark. Maximum reading from a Kill A Watt meter:

FX8150 with 1.425V at 3.6GHz = 312W
FX8150 with 1.425V at 4.2GHz = 342W

A 30W difference. That was the maximum reading; the actual power usage difference over the entire benchmark run would be smaller.

Power scales linearly with clocks, but with the square of the voltage.
 
Feb 6, 2007
Regarding all this talk of overclocking: you can't compare overclocked to stock; it's not reasonable. Someone who is willing to overclock an AMD chip will be just as willing to overclock an Intel chip, all other things being equal. You're guaranteeing overclocked results and comparing them to the competition at stock, and that's just wrong. Even more telling, you say that you can get certain clocks out of specific AMD chips, yet you're running a watercooling rig on your own OCed chip (as is MentalIlness, the other member posting his OCed AMD results). How much does a watercooling set-up run? $100 or so? Now you're comparing comparably priced chips but adding a $100 watercooler and OCing one while running the other at stock clocks/voltages/cooling. If a $300 watercooled rig isn't 50% faster than a $200 rig, isn't the price/performance ratio STILL in favor of Intel?

I'm not a fanboy of any hardware, but from all objective standpoints, Intel chips are superior for use in gaming computers. They use less power, they are more efficient and they deliver higher FPS in every comparable scenario (OC to non-OC does not count as comparable as it's a broken comparison to begin with). Couple that with the fact that they are virtually indistinguishable in price, and there is really no good reason to choose an AMD chip right now in a gaming machine (unless you get a killer deal like the Microcenter deal, which they also offered for the 2600K, a far superior gaming chip to any of AMD's offerings).

Supporting a company is fine. But don't be dishonest about why you're doing it. If you like AMD more than Intel as a company, it's a subjective judgment, and it shouldn't affect which chip is objectively better. For the purposes of building a gaming rig, there's a reason every reputable tech site on the internet is currently suggesting Intel as the CPU, and it's not because they're biased.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
OK here we go,

Running the TrueCrypt 500MB benchmark. Maximum reading from a Kill A Watt meter:

FX8150 with 1.425V at 3.6GHz = 312W
FX8150 with 1.425V at 4.2GHz = 342W

A 30W difference. That was the maximum reading; the actual power usage difference over the entire benchmark run would be smaller.

How much did you pay for your FX-8150?
 

Lepton87

Platinum Member
Jul 28, 2009
Regarding all this talk of overclocking: you can't compare overclocked to stock; it's not reasonable. Someone who is willing to overclock an AMD chip will be just as willing to overclock an Intel chip, all other things being equal.

So please, compare an overclocked FX4120 to an overclocked i3-2100. I'm waiting with bated breath.

I think Bulldozer is a failure of truly epic proportions, but if I were faced with choosing either an i3-2100 or an FX4200 for my own rig, I would reluctantly go with the FX. I know the difference in power consumption would be truly massive, but as you can infer from my sig, I'm not big on power saving. The point is that Intel has nothing appealing to enthusiasts under $200.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
So please, compare an overclocked FX4120 to an overclocked i3-2100. I'm waiting with bated breath.

I think Bulldozer is a failure of truly epic proportions, but if I were faced with choosing either an i3-2100 or an FX4200 for my own rig, I would reluctantly go with the FX. I know the difference in power consumption would be truly massive, but as you can infer from my sig, I'm not big on power saving. The point is that Intel has nothing appealing to enthusiasts under $200.

You'd go for the FX that consumes 2x more power than the i3 for a 15-20% performance improvement in multi-threaded workloads, even though you'd need to spend $40 more on the motherboard and $30 on an aftermarket CPU cooler.

Makes a lot of sense. I mean, it's obvious why you'd go for this instead of an i5-2400 and an H61 board, even if the 4100 overclocked to its max will be slower than a stock 2400.
 

Lepton87

Platinum Member
Jul 28, 2009
You'd go for the FX that consumes 2x more power than the i3 for a 15-20% performance improvement in multi-threaded workloads, even though you'd need to spend $40 more on the motherboard and $30 on an aftermarket CPU cooler.

Makes a lot of sense. I mean, it's obvious why you'd go for this instead of an i5-2400 and an H61 board, even if the 4100 overclocked to its max will be slower than a stock 2400.

Where did I talk about total platform cost etc.? I just compared two CPUs without consideration for other factors.