Why is the response to Bulldozer so overwhelmingly negative?


Concillian

Diamond Member
May 26, 2004
3,751
8
81
Considering total cost of ownership (CPU, motherboard, and power consumed), anyone in CA who would choose AMD is either incompetent at this kind of math or ignorant of their power costs.

When I looked at the cost of running at 30% on-time, the difference in power cost over a year or two was as much as or more than the CPU itself. Up-front costs are almost insignificant if you live in CA, which also means the running cost of additional cores is significantly more than their up-front cost. I need an i3-2xxxk upgrade to not cost an arm and a leg; a quad hurts a lot more than its up-front cost. Ivy needs an OC'able dual, though perhaps the additional power features will let a quad be as efficient as my dual, or more so.
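To put rough numbers on that back-of-the-envelope claim, here is a minimal Python sketch. The 60W load delta and the $0.30/kWh rate are illustrative assumptions, not measured figures; real CA tiered rates and CPU wattage deltas vary:

```python
# Back-of-the-envelope CA power-cost delta for a machine that is on
# 30% of the time. The watt delta and rate are illustrative assumptions.
HOURS_PER_YEAR = 24 * 365      # 8760
duty_cycle = 0.30              # on 30% of the time
watt_delta = 60                # assumed extra draw of the AMD build, in watts
rate_per_kwh = 0.30            # assumed upper-tier CA residential rate, $/kWh

kwh_per_year = watt_delta / 1000 * HOURS_PER_YEAR * duty_cycle
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr -> ${cost_per_year:.0f}/yr, "
      f"${2 * cost_per_year:.0f} over two years")
```

With those assumed numbers the delta works out to roughly $47 a year, so over two years it is indeed on the order of a mid-range CPU's purchase price, which is the poster's point.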
 

Belegost

Golden Member
Feb 20, 2001
1,807
19
81
Yeah, I know, but MCMC is a small enough part of my workload that it's not worth dumping the money into a dedicated GPU computing system (yet - though I have a few grants out that might change things if I get them). The vast majority of what I do is the genomics work.

Who needs a dedicated GPU computing system? I run my GPU loads on GTX 460s I picked up for like $80 apiece. The performance is not really in the same league as dedicated GPGPU hardware, but it's way above CPU performance.
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
meh.. i have no complaints. i'm interested in its distributed computing power vs the i5. i got one to test out that should be in my office tomorrow. pity i won't be in to play with it until friday though :(

if it performs great, then GREAT! if not, then whatever. i'll continue to buy the athlon x2 for my office computers until they start forcing you onto BD, assuming the price is right.
 

86waterpumper

Senior member
Jan 18, 2010
378
0
0
Has anyone done any hard drive tests yet? I had wondered whether any of the new chipsets or motherboards have been able to equal Sandy Bridge systems on transfer speed. I know that has been an issue all along, especially with the 6Gb/s drives...
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
As the prices align now, the FX-8150 is not a particularly compelling value, unless you primarily use heavily-multithreaded applications.

Hence the disdain. For almost all home and business use the dozer family is out of gas. It's a design that only shines in some niche uses like servers and scientific computing.

Poor single-threaded performance, poor power consumption, weaker for gaming.

If you look at the Sandy Bridge line which now starts at $40, you'll find a better choice at most any price point, and decent motherboards start at around $50.
 

paperwastage

Golden Member
May 25, 2010
1,848
2
76
My complaints with Bulldozer:

Thermals
Performance/Watt
Very little performance increase per thread vs Phenom II

this

BD is worse than their existing Phenom II product

with inferior compute per core and so many single-threaded programs still out there, BD doesn't look good
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Hence the disdain. For almost all home and business use the dozer family is out of gas. It's a design that only shines in some niche uses like servers and scientific computing.

And niches are just what AMD needs. They don't have the manufacturing volume to compete head to head in a price war. They want a niche that's low-ish volume and high profit margin so they can get their books back in order, while giving Glo Flo breathing room during the transition to making their GPUs on a 28nm process. If they can pick up some enthusiasts or desktop users, they'll be happy to, but it was clear a long time ago that the typical enthusiast wasn't the target for BD.

Many people are approaching this from the standpoint of a typical enthusiast usage profile. Who said that AMD was even considering the typical user with an FX chip? They seem happy to leave that to Intel while seeking out specific (and profitable) markets to increase their market share and, most importantly, profits. Many of their recent GPU and CPU advances seem to specifically target the scientific modeling segment, a segment with gigantic profit margins that AMD wants a piece of. They know they need stable high-margin revenue to bring their share price up. The average price-fickle consumer is much less of a concern, I bet.
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
Why is the response to Bulldozer so overwhelmingly negative?

I think everyone was expecting Bulldozer to be "a step forward" and be an overall improvement over Phenom II, not two steps back (single thread, power) and one step forward (multi thread).

Will there be people who benefit from Bulldozer? Sure. My guess is servers or workstations that run a lot of parallel stuff. And I agree that I'd like to see some Johan server reviews of Bulldozer. If I remember correctly, Bulldozer was designed for servers, because servers supposedly do a lot more int calculations than float calculations.

But for everyone else, including home desktop users, single- and lightly-threaded performance matters more, and Bulldozer is a worse choice for them than even a Phenom II.
 

Deleted member 4644

If the FX-8150 were priced the same as the i5-2500K, it would be a better value unless you were primarily a single-threaded application user. It's a harder sell at its current pricing. For gaming, it would be a moot point since anyone using these chips is gaming at 1080P and is GPU-bound, not CPU-bound. ...

OK. So you are a scientist and think logically. That's where you are failing.

For almost all realistic gaming situations (and general personal home use situations), we are GPU-bound today. You are very right about that. Now, to answer your question...

1) People are mad because AMD hyped it, and the hype was BS

2) People are worried because current games are GPU bound with 6xxx and 5xx GPUs, but that may not be true for the next gen, and people generally upgrade CPU less often than GPU

3) People are mad, see 1.

4) People can't read charts, so they don't understand GPU-bound.

5) People are mad, see 1.

6) It suggests that AMD can't get their shit together, which is bad for competition, and makes people mad, see 1.

7) For a small number of new games, CPU is more important than in most games for the last 10 years or so (WoW, SC2, other strategy games). People who play WoW are always really really mad, see 1.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
This thread was posted simply to point out that for all its flaws - which are numerous - it has strengths, too. CPUs are becoming increasingly niche-specific. I'm sorry, but I do not understand why so many people dismiss CPUs built for other people as 'failures.'

I think people consider a general purpose CPU as a failure when it fails at competing in general purpose situations.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Things are negative because AMD has lost its effing mind. AMD is generally known for competitive pricing and performance while not being the fastest out there. Pricing followed performance and was generally a cheaper, acceptable trade-off versus an Intel rig. The 8150 would have to be set to the 2500K price to even be considered as an alternative. I don't know where AMD gets off setting the 8150 $60 higher than a 2500K when it can at best only beat it in a few tests. On top of that, AMD's Phenoms match or beat the 8150 in several tests. The 8150 should probably be $200 or maybe a bit lower.

I have a 990FX board and was waiting for BD. I might be able to stomach buying an 8120 to overclock, but the power it uses when overclocked at load is unreal. I am sure that will get fixed in later revisions; I am just not willing to wait for, or subsidize, AMD's mistake. I'm going to sell my 990FX board, go to Intel this round, and likely stick with them through IB. I'll re-evaluate then.

Bulldozer's power draw when overclocked at load reminds me of Clark Griswold plugging in his house in National Lampoon's Christmas Vacation.

http://www.youtube.com/watch?v=rqxe9-xPxng&feature=youtu.be&t=1m34s
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
You shouldn't buy an FX-8150. You shouldn't buy an i5-2500K, either. Without knowing what games you play, based on what else you describe, you would be fine with a $60 AMD Athlon II X2 250.

Again, why would anyone think "Bulldozer is a failure because its capabilities are not what I would take advantage of"? That seems to be the overwhelming sentiment here and elsewhere. That's like saying semi-trailers suck because all you do is run to the grocery store a few times a week.

BD is terrible because it is the 'future' of CPUs for AMD. Do you see much possibility to leverage the BD design for laptops (which you bring up as the future of computers) with its poor IPC and power-hungry nature? No.

Low-voltage Intel CPUs will stomp on any BD derivative's throat in performance. Llano and Bobcat are poor performers, but will have to be the 'future' of mobile for AMD. Watch out if BD cores make it into mobile parts; they will be terrible.
 

HNNstyle

Senior member
Oct 6, 2011
469
0
0
My response is negative because I wanted an upgrade to something comparable to an i7. I'm running a phenom x4 right now.
 

tomoyo

Senior member
Oct 5, 2005
418
0
0
I actually feel sad right now. Bulldozer is so bad, for so many of the reasons everyone has already stated. I was really hoping for the next Athlon/Athlon 64; instead we got basically the next AMD K5. If you remember that chip, it just wasn't competitive, and it took the K6 for AMD to finally be able to fight.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
BD is terrible because it is the 'future' of CPUs for AMD. Do you see much possibility to leverage the BD design for laptops (which you bring up as the future of computers) with its poor IPC and power-hungry nature? No.

Low-voltage Intel CPUs will stomp on any BD derivative's throat in performance. Llano and Bobcat are poor performers, but will have to be the 'future' of mobile for AMD. Watch out if BD cores make it into mobile parts; they will be terrible.

This. Lots of people are comparing this to Intel's "Prescott debacle" but really it would be more appropriate to compare it to their Willamette launch.

Willamette had lower IPC and higher power consumption than its predecessor, and its release was intentionally delayed internally until Intel could get the clocks high enough to avoid total embarrassment (but not high enough to escape some embarrassment).

With this in mind, project out where Bulldozer is headed on the Piledriver -> Steamroller -> Excavator roadmap and draw the parallels to Intel's own NetBurst roadmap.

Has anyone done the calculation of what an iGPU-less Llano chip would look like given a 2B-transistor budget and a 315mm^2 silicon footprint? Would we be looking at an 8-10 core Thuban beast?
 

busydude

Diamond Member
Feb 5, 2010
8,793
5
76
Has anyone done the calculation of what an iGPU-less Llano chip would look like given a 2B-transistor budget and a 315mm^2 silicon footprint? Would we be looking at an 8-10 core Thuban beast?

And they managed to milk a little more IPC for Llano WITHOUT having to use L3$.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
This. Lots of people are comparing this to Intel's "Prescott debacle" but really it would be more appropriate to compare it to their Willamette launch.

Willamette had lower IPC and higher power consumption than its predecessor, and its release was intentionally delayed internally until Intel could get the clocks high enough to avoid total embarrassment (but not high enough to escape some embarrassment).

With this in mind, project out where Bulldozer is headed on the Piledriver -> Steamroller -> Excavator roadmap and draw the parallels to Intel's own NetBurst roadmap.

Has anyone done the calculation of what an iGPU-less Llano chip would look like given a 2B-transistor budget and a 315mm^2 silicon footprint? Would we be looking at an 8-10 core Thuban beast?

Great point. They already have 8-12C Opterons today; they could cut down the transistor footprint somewhat by reducing cache and memory channels, and 32nm production helps too. Those already have pretty decent thermals on 45nm. Why they didn't release something like this baffles me.

The mobile sector is a huge issue with BD. They will have to release a 2M/4C chip at a ridiculous clock speed of 800MHz to keep power consumption acceptable.

It's even more sad when you realize that SB desktop CPUs include a GPU in their transistor budget while BD does not. Twice the transistors and twice the cores for only sometimes-faster results? That's very disappointing.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I'm really puzzled as to why AMD wouldn't have worked on CMT as an offshoot mobile CPU. They know their rocky history with new architectures on new nodes better than anyone. A GPU-less Llano with L3 and up to 8 cores would look pretty good right now for the AM3+ socket.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
With this in mind, project out where Bulldozer is headed with this Piledriver->Steamroller->excavator future roadmap and draw the parallels to Intel's own netburst roadmap.

If anyone has the "We will hit 20GHz in 2020" PowerPoint slide, please post it.

The Intel 10GHz-in-10-years slide still makes me LOL.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Would these same people, who would in theory purchase Bulldozer... would they purchase an ATI graphics card just because it had 4000 SPs, even if it was slower than a previous-gen card with only 2000 SPs, because of a new architecture?

Or is the 7xxx going to make a fail trifecta, if GCN blows goats too?
 

Kevmanw430

Senior member
Mar 11, 2011
279
0
76
You know, if they HAD shrunk K10 to 32nm (basically used Llano), added L3 cache, and used the die footprint of BD, you'd have a chip with about 10% better IPC than Bulldozer (based on Llano being about 4% faster than Phenom II without L3 cache), and, like, 10 cores. You may not get the clock speed above 4.6GHz on water, but the IPC increase would rock....
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Has anyone done the calculation of what kind of iGPU-less Llano chip would look like if given a 2B xtor budget and a 315mm^2 Si footprint budget? Would we be looking at a 8-10core thuban beast?

Thuban is less than 1B transistors, so a 2B budget gets you 12 cores. More realistic would be moving to 8, using the extra thermal budget for clocks and the extra space to re-arrange the layout and reduce some of the cache latency.
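That scaling claim can be sketched as a quick Python check. Thuban's ~904M transistor count is cited from memory, and core count is assumed to scale linearly with the transistor budget, which overstates things a bit since the uncore and L3 don't scale per-core:

```python
# Armchair scaling: how many Thuban-style cores fit in Bulldozer's
# reported transistor budget? Assumes cores scale linearly with
# transistors, which the uncore/L3 make slightly optimistic.
thuban_transistors = 0.904e9   # Phenom II X6 (Thuban), ~904M transistors
thuban_cores = 6
bulldozer_budget = 2.0e9       # the ~2B transistor budget quoted above

cores_in_budget = thuban_cores * bulldozer_budget / thuban_transistors
print(f"~{cores_in_budget:.1f} Thuban-style cores")  # ~13.3, near the 12 quoted above
```

The naive answer comes out slightly above 12 because Thuban is a bit under 1B transistors; either way, the ballpark supports the 8-core-plus-clocks argument.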

Check out the Llano die:
http://techreport.com/r.x/llano/die-shot.jpg

Remove the iGPU and you have room for 2 more cores. Grow the L×W to get to 315mm^2 and you have room for another set of cores on the horizontal. Separate the cores between the L2s and put the L3 in between. Add a John Madden BOOM! and you've got your basic armchair engineering. Actually I am a process engineer, just not in semi.

But all this assumes that they actually wanted to create a CPU that has similar goals to their previous CPUs. I'm reasonably certain this is precisely what they did not want to do. They know by now that they can't compete directly with Intel. I think it's pretty obvious they've made a business decision to target specific high profit margin niches and design for those specific niches. They seem to be happy to leave the enthusiast consumers to Intel, for now at least.