Is AMD FX Bulldozer really that bad for Gaming?


lifeblood

Senior member
Oct 17, 2001
999
88
91
I don't think we need to vilify AMD any further for BD. It sucks for almost everything and we all know it :) Hell, even AMD's new executives seem to agree considering they're taking a completely different approach with Steamroller. The chase for clock speeds at the cost of IPC and pipeline length is like the search for the fountain of youth. You'll only end up with malaria and die along the way.
I disagree that the search for higher clock speed is like the search for the fountain of youth, unless you're using Global Foundries as your guide. Then you're right, it's malaria city. Seeing how AMD is paying GF $700 million to get out of its APU exclusivity agreement with GF, I guess they figured that out as well.
 

blckgrffn

Diamond Member
May 1, 2003
9,686
4,345
136
www.teamjuchems.com
The Denebs pull quite a bit of power. I have a 955 at 3.6GHz that I've been able to get up to 4GHz on a 770 ASRock board but no further. They absolutely suck at FSB overclocks :(

I don't think we need to vilify AMD any further for BD. It sucks for almost everything and we all know it :) Hell, even AMD's new executives seem to agree considering they're taking a completely different approach with Steamroller. The chase for clock speeds at the cost of IPC and pipeline length is like the search for the fountain of youth. You'll only end up with malaria and die along the way.

Here's to hoping they pull the shiny bits out of BD (FPU, Turbo, Memory Controller, etc.) and incorporate them into a future product that is more competitive overall.

How the blazes they'll keep up with Intel on the manufacturing side is anyone's guess. At least they are free to pick and choose from fabrication partners now; I, for one, wouldn't be surprised if they try to leverage Samsung or other players in the future.
 

MentalIlness

Platinum Member
Nov 22, 2009
2,383
11
76
I ended up with an FX-8120, but only because at the time of purchase at Microcenter the AMD X6 CPUs and the 2500Ks were all out of stock. So I went with the 8120 and a 990FX Sabertooth. I only ended up paying $80 for the motherboard.

Well worth it for me, coming from an E7200 Core 2. And yes, it is plenty fast for what I do. May not be the best, but for the 8120 and a 990FX ASUS, how can I complain?

 

Hatisherrif

Senior member
May 10, 2009
226
0
0
I'd rather have an i5-750 for gaming than any current or past AMD CPU. Heck, a Q9xxx is faster than Phenom II, which is in turn faster than Bulldozer...

In number crunching the Core 2 architecture may be better. In real-life tasks, a Phenom II is miles ahead, and I've had both. There is an AnandTech article discussing the huge microstutter encountered with the Core 2 Q9550 while testing a CrossFire config. For gaming, a Phenom is miles ahead, and in everything else it either ties or beats the quad.

I'm allergic to people who still claim Core2 is better than Phenom II. It simply isn't. The sad thing is AMD doesn't have a competitive product versus the Core i architecture, but the Core2s are easily outclassed in real-world performance.
 

pauldun170

Diamond Member
Sep 26, 2011
9,490
5,699
136
In number crunching the Core 2 architecture may be better. In real-life tasks, a Phenom II is miles ahead, and I've had both. There is an AnandTech article discussing the huge microstutter encountered with the Core 2 Q9550 while testing a CrossFire config. For gaming, a Phenom is miles ahead, and in everything else it either ties or beats the quad.

I'm allergic to people who still claim Core2 is better than Phenom II. It simply isn't. The sad thing is AMD doesn't have a competitive product versus the Core i architecture, but the Core2s are easily outclassed in real-world performance.

What was the root cause of the microstutter? Was it repeatable across multiple driver sets for both the chipset and video card(s)? Was a separate Intel platform also tried, in case it was a voltage issue on the test board?

Do you remember when the article was published?
 

Gheris

Senior member
Oct 24, 2005
305
0
0
I seriously just don't get some people. The proof is in the results on numerous sites, so I have no idea what you are trying to prove here. The Bulldozer platform is a wash for gaming. It's a well-known fact. No matter how you spin it, the chips are not up to par for gaming. Please stop trying to push something that has already been proven by people not posting anonymously on a forum.

I myself was an AMD guy for years. Bulldozer did not deliver for me. Do I still like AMD? Sure, but I am not a fanboy and I will not support what I consider to be a subpar product. Bang for buck, as far as gaming is concerned you're better off with an i3 SB processor than a Bulldozer processor. It's that simple.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
I disagree that the search for higher clock speed is like the search for the fountain of youth, unless you're using Global Foundries as your guide. Then you're right, it's malaria city. Seeing how AMD is paying GF $700 million to get out of its APU exclusivity agreement with GF, I guess they figured that out as well.

The only chip that I can think of that pushed clock speeds up high where it sort of worked was POWER6, but mainly because IBM didn't really compete with anyone at that time. I guess POWER7 is another one, but the clocks don't reach the same speeds due to the sheer number of cores/threads.

IBM is also part of The Fishkill Alliance and one of the only fabs that still deploys SOI HKMG just like GloFo does, so I'm not sure where you're going with the GloFo statement :\ People have been blaming GloFo for Bulldozer's performance for months but the only issues GloFo had with AMD's chips was the transistor density of the Llano APUs. The yields during the early part of last year were very poor but they managed to ramp those up and actually exceed their figures by year's end. BD was an architectural mistake from the beginning and not a GloFo problem.

On the x86 side of things there have been two chips in recent memory that went for clock speeds, and both failed: NetBurst and Bulldozer. You'll have to excuse me and many others who automatically cringe at the thought of any chip architecture basing its design on clock speed as its main goal. You can safely say it's had a bad rap, but it's been well deserved.


Here's to hoping they pull the shiny bits out of BD (FPU, Turbo, Memory Controller, etc.) and incorporate them into a future product that is more competitive overall.

How the blazes they'll keep up with Intel on the manufacturing side is anyone's guess. At least they are free to pick and choose from fabrication partners now; I, for one, wouldn't be surprised if they try to leverage Samsung or other players in the future.

The FPU performs well, particularly if the Trinity benchmarks prove to be true. Half the FPUs yet some great performance, so there might be something worth salvaging there. The Turbo Core features are better than what Intel has, not just in terms of clock speeds but implementation as well, so that's a major bonus. The IMC still needs a LOT of work. Though it might be slightly better than Thuban's, it's still well behind what Intel's got. That's kind of shameful considering AMD were the ones to really spark that race. The power gating AMD concocted is also fantastic and works really well to keep idle power consumption very low; the load figures are another matter entirely.

They won't be able to keep up with Intel on the fab side. Nobody can :p It's going to depend on out-thinking them and leveraging those APUs and GPGPU, as that's something they still have a very large lead in. We'll see how things turn out with Haswell, but by then we'll have Kaveri, so it's almost certain AMD will stay ahead there. Unfortunately for AMD, software support has never been their strong suit. If AMD starts working more closely with developers in pushing HSA then they've got a great shot, but that'll require a complete change in direction for them -- essentially taking a page out of nVidia's book.
 

bononos

Diamond Member
Aug 21, 2011
3,938
190
106
.......
I moved two Intel G620 based systems while sitting on the AMD system. As for the on-board GPU... it was wholly irrelevant to the sales of the systems. Both systems (or all three once you count the two G620 systems I sold while I had the FX-4100) were comparably priced and outfitted as identically as possible (4GB DDR3-1333 Crucial Ballistix RAM, 500GB SATA Samsung Spinpoint HDDs, 22X Samsung dual-layer DVD/RW optical drives, Win 7 Pro 64-bit on all of them)... and the AMD system was almost universally rejected, as the Intel systems were described as "snappier", "smoother", and "quieter", among other adjectives.
.....
What sort of apps did your customers test your computers with that made them feel the 4100 was less snappy/smooth?
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
IBM is also part of The Fishkill Alliance and one of the only fabs that still deploys SOI HKMG just like GloFo does, so I'm not sure where you're going with the GloFo statement :\ People have been blaming GloFo for Bulldozer's performance for months but the only issues GloFo had with AMD's chips was the transistor density of the Llano APUs. The yields during the early part of last year were very poor but they managed to ramp those up and actually exceed their figures by year's end. BD was an architectural mistake from the beginning and not a GloFo problem.
GloFo did have problems; just look at Llano vs. Phenom II.
Llanos at 32nm, with the IGP disabled, could barely reach 3.6GHz in an overclock...
while at 45nm there's the Phenom II at 3.7GHz stock!

The 32nm yields were so bad that the 45nm wafers were actually better.

Sure, Bulldozer has many problems, AND GloFo's issues made things even worse.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Let's see,

The FX-4100, when OCed, is faster in multithreaded apps than any dual-core/4-thread Intel SB CPU, and it costs less. Yes, it has higher power consumption. Wins in 2 out of 3.

The FX-6100 is faster in multithreaded apps than any dual-core/4-thread Intel SB CPU, and when OCed it is faster and cheaper than any Core i5 up to the Core i5 2400. Yes, it has higher power consumption. Wins in 2 out of 3.

The FX-8120, when OCed, is faster and cheaper than any Core i5 in multithreaded apps. Yes, it has higher power consumption. Wins in 2 out of 3.

On price/performance in multithreaded apps the FX series is faster and cheaper. If you don't OC and you want to save on electricity, buy Intel. But at the same or lower price points the FX CPUs can give you more performance in multithreaded apps. If you only game and you can afford it, then buy the new Intel 3570K or 3770K. But if you are on a low budget and you are willing to OC your components, the FX can be very price/performance competitive.
I love how you state arbitrary overclocking values as if they were a given fact.

Can you please direct me where AMD guarantees a given Bulldozer will overclock to a given clock, guarantees 100% accurate operation in such situations, and will honor all RMAs as a result of failure thereof? Thanks.

You haven't seen any FX-6100 @ 4.5GHz benchmarks or in-game power usage and yet you claim that FX sucks for games.
A 4.5 GHz Bulldozer doesn’t exist. You’re basically saying “yeah, if I run the AMD processor way out of spec it’s the same speed as a stock Intel CPU costing the same price, while using twice the power. See, it’s competitive with Intel!”

Can you not see how ridiculous your reasoning is?

BD's thermals are much lower than SB's and IB's, not to mention that you can buy the FX-6100 plus an aftermarket heatsink for the same price an Intel Core i5 2300 costs.
BD uses more power (watts) than Lynnfield/SB/IB, especially when overclocked. Those watts all become heat that needs to be dissipated.

So it's still chewing more power and making more heat even with an after-market heatsink.

You keep saying that BD costs more, but that is only valid for the FX-8150 vs. the Core i5 2500K.
http://www.newegg.com/Product/Produc...82E16819115078

http://www.newegg.com/Product/Produc...82E16819103962

SB is $15 cheaper while being faster and using less power at the same time. Heck, you can see the 65W vs 95W TDP right there in the links.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
On the x86 side of things there have been two chips in recent memory that went for clock speeds, and both failed: NetBurst and Bulldozer. You'll have to excuse me and many others who automatically cringe at the thought of any chip architecture basing its design on clock speed as its main goal. You can safely say it's had a bad rap, but it's been well deserved.
Actually, every chip has tried to increase clock speed. Look at history: how fast was the 8086? Now what are we at? 4GHz+ at the high end? Yes, Intel took a step back when it went from NetBurst to Core, but it has gone back to slowly ramping speed over time. If clock speed didn't matter then why do so many of us overclock? What's the difference between the i5-2400 and 2500? Clock speed.

Where Intel and AMD failed was in trying to push speed faster than the process allowed. AMD should have realized that if Intel couldn't do it with NetBurst and their excellent (well-funded) fabs, then how realistic was it that GF could do it? I'm sure GF was telling AMD they could, but clearly they were wrong.

BD is a very forward-looking chip. It supports functions that software does not yet take advantage of. They had to give up IPC to include them, so they had to increase clock speed to counter the loss of IPC. Unfortunately GF couldn't deliver.
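To put rough numbers on that trade-off (the figures here are purely illustrative, not AMD's actual IPC data): performance scales roughly as IPC times clock speed, so a design that gives up IPC needs proportionally higher clocks just to stay even.

# Performance scales roughly as IPC x clock speed.
# The IPC figures are made up for illustration only.
def required_clock(base_clock_ghz, base_ipc, new_ipc):
    # Clock the new design needs just to match the old one's throughput
    return base_clock_ghz * (base_ipc / new_ipc)

base_clock = 3.3   # GHz, hypothetical older-generation part
base_ipc   = 1.00  # normalized IPC of the old design
new_ipc    = 0.90  # assume ~10% lower IPC, the figure quoted later in this thread

print(required_clock(base_clock, base_ipc, new_ipc))  # ~3.67 GHz just to break even

If the fab can't deliver those extra clocks, the design loses outright, which is the GF point above.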

That brings us to the OP's question. BD is "as good as" an Intel CPU in many new games designed to take advantage of DX11 and new functions, but falls behind in older or "unoptimized" games. So the question is, why get a CPU that's "as good as or worse" when you can get an Intel CPU and not worry about whether the game runs well on it? Maybe as software starts to take advantage of BD's capabilities a BD will be superior to Intel, but we're not there yet.
 

Don Karnage

Platinum Member
Oct 11, 2011
2,865
0
0
I love how you state arbitrary overclocking values as if they were a given fact.

Can you please direct me where AMD guarantees a given Bulldozer will overclock to a given clock, guarantees 100% accurate operation in such situations, and will honor all RMAs as a result of failure thereof? Thanks.


A 4.5 GHz Bulldozer doesn’t exist. You’re basically saying “yeah, if I run the AMD processor way out of spec it’s the same speed as a stock Intel CPU costing the same price, while using twice the power. See, it’s competitive with Intel!”

Can you not see how ridiculous your reasoning is?


BD uses more power (watts) than Lynnfield/SB/IB, especially when overclocked. Those watts all become heat that needs to be dissipated.

So it's still chewing more power and making more heat even with an after-market heatsink.


http://www.newegg.com/Product/Produc...82E16819115078

http://www.newegg.com/Product/Produc...82E16819103962

SB is $15 cheaper while being faster and using less power at the same time. Heck, you can see the 65W vs 95W TDP right there in the links.

A guy on eBay sells BNIB 2130s for $119.99 ;)
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Actually, every chip has tried to increase clock speed. Look at history: how fast was the 8086? Now what are we at? 4GHz+ at the high end? Yes, Intel took a step back when it went from NetBurst to Core, but it has gone back to slowly ramping speed over time. If clock speed didn't matter then why do so many of us overclock? What's the difference between the i5-2400 and 2500? Clock speed.

[image: "You don't say?" Nicolas Cage meme]


I hope you realize that every single time you've quoted me and then responded to that quote, you've skipped over the fact that the remainder of my post actually agrees with your statement. If you want to argue with a brick wall, then might I suggest an actual brick wall.

Or, if you're going to attempt to prove a point -- namely one about how clock speed gains are a natural progression (no shit) -- then don't say this just a couple of lines later:

BD is a very forward-looking chip. It supports functions that software does not yet take advantage of. They had to give up IPC to include them, so they had to increase clock speed to counter the loss of IPC. Unfortunately GF couldn't deliver.

They chased clock speeds because they've been losing the IPC battle. Simple. I think the issue with your arguments here is that you don't know what you're talking about. If I'm being a bit belligerent then you'll have to excuse me, but your arguments either haven't made any sense or they're the exact things I, and others, have already been saying -- yet you're still parroting them, quoting them, and assuming we disagree with you. If you quote something that states nearly the exact same thing you're saying, then you're being redundant.

Read this Kanter article about Bulldozer. It explains the microarchitecture in great detail, and he goes on about what their goals were and why they went in this direction. He also explains the potential pitfalls (which were spot on) and just what will have to go right in order for it to succeed. Bear in mind, they missed IPC by ~10% or so, which is quite a massive step backwards if you consider the meager 3%-6% gains we've seen in IB and how much of an impact they have at stock clocks.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Of course it's Nic Cage!!

http://knowyourmeme.com/memes/you-dont-say

With all of the negative comments about Bulldozer, I think it's important to keep in mind that one can argue that Bulldozer is AMD's first home-grown microarchitecture. Unlike previous architectures, this one is an AMD creation first and foremost. That they were able to match Thuban is good news, but when you realize that you're only matching your previous chips, which were built on a larger node... well, that doesn't look so good :p
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I love how you state arbitrary overclocking values as if they were a given fact.

Can you please direct me where AMD guarantees a given Bulldozer will overclock to a given clock, guarantees 100% accurate operation in such situations, and will honor all RMAs as a result of failure thereof? Thanks.

Just because we cannot OC the Core i3, do we have to guarantee the FX overclock?? I can guarantee you that every FX CPU will OC to 4.2GHz. You know why??? Because they are already selling a 4.2GHz FX CPU (the FX-4170).

Could you tell me why Intel's Core i5 2500K is the best-selling retail CPU??? I will tell you why: because it is multiplier-unlocked and 99% of the people that bought it OCed the hell out of it.

I don't know about you, but if the manufacturer (AMD, Intel) sells an unlocked CPU then it is giving me the tools to overclock it. If it fails within the 3-year guarantee given by the manufacturer, then I will RMA it and get a replacement. Actually, I have been overclocking since the days of the Pentium 166MHz (non-MMX) and I have never had a CPU fail. I don't play with LN2 and I never overdo it with voltage on air/water cooling.


A 4.5 GHz Bulldozer doesn’t exist. You’re basically saying “yeah, if I run the AMD processor way out of spec it’s the same speed as a stock Intel CPU costing the same price, while using twice the power. See, it’s competitive with Intel!”

Can you not see how ridiculous your reasoning is?

I don't need to OC the BD to 4.5GHz to match the stock Intel CPU. I have asked you to tell me what Intel CPU at $149.99 will be faster than an FX-6100 at 4.5GHz.

I'm still waiting for an answer to that.



BD uses more power (watts) than Lynnfield/SB/IB, especially when overclocked. Those watts all become heat that needs to be dissipated.

So it's still chewing more power and making more heat even with an after-market heatsink.

You said that at 4.5GHz the BD will be an inferno; I have simply said that the CPU temps will be lower than IB's. I know that BD uses more power than SB and clearly more than IB, but the operational core temperature of BD is lower. I have demonstrated with my review that the FX-4100 can be OCed at the default voltage and heatsink to 4-4.2GHz. The power usage with this OC is almost the same as at stock and yet it produces the same fps as a higher-priced Core i3.

http://www.newegg.com/Product/Produc...82E16819115078

http://www.newegg.com/Product/Produc...82E16819103962

SB is $15 cheaper while being faster and using less power at the same time. Heck, you can see the 65W vs 95W TDP right there in the links.

I'll tell you this: I'll run the same games from my review plus Crysis 2 DX11 with the HD 6950 at 1920x1080 with filters on (the same settings most people will game at). I'll use my FX-8150 with one module disabled to simulate the FX-6100.

Then I'll ask you if you still believe that the Core i3 is better than the FX-6100. ;)
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
but the operational core temperature of BD is lower

It isn't lower; in fact, it's likely higher. AMD puts their temp sensors in different places and has adjusted their TJmax accordingly. This is one of the many reasons why comparing temperatures across manufacturers, and even across architectures, doesn't make sense.

The power usage with this OC is almost the same as at stock and yet it produces the same fps as a higher-priced Core i3.

I'd wager it isn't. Power consumption goes up with clock speeds regardless of vcore stagnation. It probably isn't as high as it would be with a vcore bump but believe me it's higher and certainly not almost the same.
 

Hypertag

Member
Oct 12, 2011
148
0
0
Hey look, this guy got an idle temperature minimum on Bulldozer of 6 degrees Celsius when the ambient temperature was 21 degrees Celsius! Bulldozer runs so cool that it acts as an air conditioner when idle! TAKE THAT INTEL! YOUR DEVIL CHIPS ARE TOAST! Oh wait, they are likely too hot for the devil since they don't run 15 degrees below ambient while idle!

http://www.legitreviews.com/article/1741/18/
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I'd wager it isn't. Power consumption goes up with clock speeds regardless of vcore stagnation. It probably isn't as high as it would be with a vcore bump but believe me it's higher and certainly not almost the same.

I said almost the same wattage because I know it will rise with frequency. But the OC is too small to make a substantial difference.

I'll measure them and tell you the outcome.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Hey look, this guy got an idle temperature minimum on Bulldozer of 6 degrees Celsius when the ambient temperature was 21 degrees Celsius! Bulldozer runs so cool that it acts as an air conditioner when idle! TAKE THAT INTEL! YOUR DEVIL CHIPS ARE TOAST! Oh wait, they are likely too hot for the devil since they don't run 15 degrees below ambient while idle!

http://www.legitreviews.com/article/1741/18/

It is funny to make a joke out of it but we all know that this reading is not correct ;)
 

Hypertag

Member
Oct 12, 2011
148
0
0
It is funny to make a joke out of it but we all know that this reading is not correct ;)

Hence, since Bulldozer's non-core temperatures are not correct and not remotely comparable to Intel's core temperatures, could we stop pretending that Bulldozer runs cooler than Intel's chips?
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
I said almost the same wattage because I know it will rise with frequency. But the OC is too small to make a substantial difference.

I'll measure them and tell you the outcome.

You don't have to, but generally speaking power consumption goes up linearly with clock speed. Either way it should have a noticeable effect as you ramp it up a few hundred MHz.

For example, the Pentium 4 2.8 GHz has 68.4 W typical thermal power and 85 W maximum thermal power. When the CPU is idle, it will draw far less than the typical thermal power. The power consumed by a CPU is approximately proportional to CPU frequency, and to the square of the CPU voltage.

http://en.wikipedia.org/wiki/CPU_power_dissipation

That actually answers both questions/statements here. Bulldozer consumes more power and thus creates more heat. It's that simple, really :p Now, whether it runs hotter than another chip -- on a different process, with different die sizes, with temp sensors placed in different locations, produced by a different manufacturer -- is a completely different story altogether, and one that neither I, nor you, nor anybody else can claim to know unless we accurately define what we mean by "hotter". In this case, where the chips are so different, it's impossible. Note the difference between "produces more heat" and "runs hotter than chip X".
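As a minimal back-of-the-envelope sketch of that relation (P is roughly C * V^2 * f; the capacitance constant, voltages and clocks below are made-up placeholders, not measurements of any real FX or SB chip), here is why a frequency bump alone adds power modestly while a frequency-plus-voltage bump adds a lot more:

# Dynamic CPU power scales roughly as P = C * V^2 * f.
# All numbers are illustrative placeholders, not real FX/SB figures.
def dynamic_power(c_eff, vcore, freq_ghz):
    # Very simplified dynamic-power estimate in watts
    return c_eff * vcore**2 * freq_ghz

C_EFF = 10.0  # arbitrary effective-capacitance constant
stock     = dynamic_power(C_EFF, 1.30, 3.6)  # stock clock and vcore
oc_same_v = dynamic_power(C_EFF, 1.30, 4.2)  # +600MHz at the same vcore
oc_more_v = dynamic_power(C_EFF, 1.45, 4.2)  # +600MHz with a vcore bump

print(stock, oc_same_v, oc_more_v)  # ~60.8 W -> ~71.0 W (+17%) -> ~88.3 W (+45%)

Which is really the point both sides are circling around: at a fixed vcore the increase is real but modest; once you start raising the voltage it balloons.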
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
You don't have to but generally speaking the power consumption goes up linearly with clock speed. Either way it should have a noticeable effect as you ramp it up a few hundred mhz.

Actually, wattage will go up from raising the voltage (W = V * I), not the frequency; that is why OCing at the default voltage doesn't raise the power usage a lot. ;)

From IDC's post

http://forums.anandtech.com/showthread.php?t=2195927

See how the power consumption stays low as frequency rises using the same Vcc (red) vs. green.
[chart: i7-2600K Vcc and clock speed versus power consumption]
 

MentalIlness

Platinum Member
Nov 22, 2009
2,383
11
76
I have the Corsair H80 on my FX-8120, and at idle at exactly 4.0GHz I have never seen the temps higher than 33C. A few days ago, during Crysis 2... I monitored the load temps while playing the game for right around 2 1/2 hours, and the highest temp recorded was 47C.

No clue about power usage, since I really do not care about that anyway, so I don't check or monitor it. It was at stock voltage as well.

And I am not agreeing with anyone here, just sharing what I have found with my 8120.