Why is the response to Bulldozer so overwhelmingly negative?


Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
Please pay attention to what I said. Torque is a spec for cars just like cores and cache are for CPUs. AGAIN, you were comparing car specs to CPU performance. If you want a sensible analogy, then you would talk about 0-60 times or handling, just like you were talking about whether the CPU benchmarks would be noticeable in the real world.

HP = ALU
Torque = FPU

;)
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
The HW sites analyze synthetic benches and technical data and post the results in graph form.

Most of those graphs will never be useful for 99.9% of PC users, but the HW enthusiasts take them as gospel. A slightly shorter bar is a disaster. The dudes never question whether they really need that longer bar; they just see the obvious difference and act on it like a Pavlovian conditioned reflex.

Then the HW enthusiasts write on the forums, convincing people without the knowledge that they really need that Intel CPU or that other piece of HW that's so much better. I've seen countless gamers on a budget convinced to buy an expensive CPU (see the charts!) and left without enough money for a decent video card.

Insert Anand or other HW site benchmark here


The HW sites themselves are proud to be highly technical and give no hint about the usefulness of those benches and results in real life. I doubt they even know it themselves.

For instance, in the MS Excel Monte Carlo simulation the Phenom II 955 has a score of 20. 20 what? The 2500K has a score of 15.4. Is 20 a disaster? Is it good enough? Will my Excel table stall and crash my PC?
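For anyone wondering what that bench even does: a Monte Carlo simulation is just a big randomized number-crunching loop. A minimal Python sketch of the general idea, not the actual Excel workload:

import random, time

def monte_carlo_pi(samples):
    # Estimate pi by sampling random points in the unit square.
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:  # point lands inside the quarter circle
            inside += 1
    return 4.0 * inside / samples

start = time.perf_counter()
estimate = monte_carlo_pi(1_000_000)
print(f"pi is roughly {estimate:.4f}, took {time.perf_counter() - start:.2f}s")

The "score" for a bench like this is usually just the elapsed time.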

The most ludicrous thing is that they have to tell you whether lower is better or higher is better, proof that we don't even know what they're talking about. But do they? I have yet to see a HW site that tells you:

- CPU "X" is slower in this bench than CPU "Y", but don't sweat it. You will never need that extra performance anyway. Actually, a score of 20 in the MS Excel Monte Carlo simulation is like 10x the performance you will ever need. Scratch that, forget it.



The thing is, any modern quad can be used perfectly well for gaming, especially at the new standard 1080p resolution, and the differences between an expensive CPU and a cheap quad are minimal once you get a decent video card and run at that resolution. But no, in benchmarkland the Phenom II or Bulldozer are crap. They can't even play flash games, see those tiny bars?

Umm, you don't understand what the reviews indicate, so... they are wrong? Yes, any quad would be fine for gaming, but people don't come to AnandTech to build something "fine for gaming"; they want the best bang for the buck for gaming, which is the 2500K at this point. None of the AMD CPUs are strong at gaming; in two years' time they will be limiting the graphics cards of 2013, whereas Sandy Bridge will still be a decent match. Besides, it's not like you need to understand every single benchmark, just those that are relevant to you. If you don't know what Cinebench etc. indicates, then it's likely not all that relevant to you; I don't know what a good Cinebench score indicates, all I care about is the gaming benches. But it's still nice to be able to tell which CPU does better in Cinebench, hence the "higher/lower is better" labels.

People who bought a Phenom I, for example: it would give similar benches to a Core 2 Quad back in 2007 for gaming at 1080p. Today it's another story; the Phenom was weaker at gaming and would bottleneck, say, a 560 Ti, whereas people with the Q6600 or Q9X will still be okay because those were stronger gaming CPUs.

Only low-settings, low-res gaming tests show this. Testing at everyday resolutions shows nothing; it does not show which CPU will last for years and which will last a year if it's lucky. The GPU does all the work at everyday settings, so only the GPU is really being benched, not the CPU.
 

mosox

Senior member
Oct 22, 2010
434
0
0
Please pay attention to what I said. Torque is a spec for cars just like cores and cache are for CPUs. AGAIN, you were comparing car specs to CPU performance. If you want a sensible analogy, then you would talk about 0-60 times or handling, just like you were talking about whether the CPU benchmarks would be noticeable in the real world.

I was talking about the abstract CPU benches that nobody cares to explain.

That's why I picked torque as the car analogy, because few people know what that is, and not the 0-60 time that any idiot can understand.




And the FX-8150 is not the only Bulldozer CPU released by AMD.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Because those capabilities are niche, and a comparable Intel CPU is good enough there and much faster at everything else. I don't think most people using a desktop are using it exclusively for one or two specific tasks.

And not only that, but even in the tasks where BD has an advantage, its performance/watt is still lower. There is nothing good to say about BD outside of ORNL or perhaps WinZip HQ.
 

Kyanzes

Golden Member
Aug 26, 2005
1,082
0
76
It's okay if you have a lab, or tend to play Deep Fritz, or perhaps want to run Hyper-V or something.
 

mosox

Senior member
Oct 22, 2010
434
0
0
Umm, you don't understand what the reviews indicate, so... they are wrong?

No, I said they should explain them and, most of all, they should tell us what performance is "good enough" for each of them. Not for the gaming benches; the fps numbers are self-explanatory.

Only low-settings, low-res gaming tests show this. Testing at everyday resolutions shows nothing; it does not show which CPU will last for years and which will last a year if it's lucky. The GPU does all the work at everyday settings, so only the GPU is really being benched, not the CPU.

I get this. Now they should add some real-world gaming benches; it shouldn't be so hard to change the resolution and bench again once you already have the game up and running.

In the CPU reviews they test at low resolution for the bottleneck thingy; in video card reviews they bench using only top-of-the-line CPUs. How the heck can one tell whether his budget CPU is good or not for this or that new game at 1080p? The info is not in the CPU review and it's not in the video card review.

The reviews lack the most important thing: real-world benches.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Hell, at 1920x1200 I noticed much better smoothness going from a PhII 955BE @ 3.8GHz to my 2500K @ 4.5GHz, and this is with an XFX 2GB 6950. Games that really seemed to get a boost were The Witcher 2 and SC2. I loved the PhII, it wasn't bad at all, but I definitely noticed the change.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
No, I said they should explain them and, most of all, they should tell us what performance is "good enough" for each of them. Not for the gaming benches; the fps numbers are self-explanatory.

I get this. Now they should add some real-world gaming benches; it shouldn't be so hard to change the resolution and bench again once you already have the game up and running.

In the CPU reviews they test at low resolution for the bottleneck thingy; in video card reviews they bench using only top-of-the-line CPUs. How the heck can one tell whether his budget CPU is good or not for this or that new game at 1080p? The info is not in the CPU review and it's not in the video card review.

The reviews lack the most important thing: real-world benches.

The more you post, the more IQ points we lose reading them.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Epic freaking post, Russian... EPIC.

Thanks! But if you want to see an EPIC post, read this 2600K @ 5.0GHz overclocking with Noctua NH-D14 vs. Corsair H100 lapping comparison by IDC!! :thumbsup:

I think if AMD can increase performance per core by 15-20%, raise clock speeds 20-25%, shift to 22nm, and halve the power consumption, then an 8-core BD will become an awesome CPU. At the current time, IMHO, AMD needs to drop prices: FX-8120 @ $179.99 and FX-8150 @ $199.99.
 
Feb 19, 2009
10,457
10
76
Hell, at 1920x1200 I noticed much better smoothness going from a PhII 955BE @ 3.8GHz to my 2500K @ 4.5GHz, and this is with an XFX 2GB 6950. Games that really seemed to get a boost were The Witcher 2 and SC2. I loved the PhII, it wasn't bad at all, but I definitely noticed the change.

Dunno about W2, but SC2 and WoW hate AMD CPUs and Radeons.

If you had an equivalently overclocked Intel Q8400 or above, it would be fine for your single card.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
Dunno about W2, but SC2 and WoW hate AMD CPUs and Radeons.

If you had an equivalently overclocked Intel Q8400 or above, it would be fine for your single card.

I've found I'm much more sensitive to minimums than to maximum framerate. In my decidedly unscientific experience, there are basically zero spots in the titles I play at my native 19x12 that get laggy, which was the case with my PhII from time to time. I only have a 60Hz LCD anyway :( But yeah, nothing drives me battier than a sudden chug into a low framerate, even if it's gone relatively quickly.
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
I've found I'm much more sensitive to minimums than to maximum framerate. In my decidedly unscientific experience, there are basically zero spots in the titles I play at my native 19x12 that get laggy, which was the case with my PhII from time to time. I only have a 60Hz LCD anyway :( But yeah, nothing drives me battier than a sudden chug into a low framerate, even if it's gone relatively quickly.

That just makes you normal. It's quite jarring to have a smooth, immersive environment suddenly go all herky-jerky. Besides, after a certain point (and it's a large debate as to exactly where) the human eye (or your monitor) can't perceive/display any more, so it's pretty much impossible to tell the difference between 60 fps and 70 fps on the average LCD (unless you have a 120Hz model, or you're rocking a CRT and have exceptional visual perception). All in all, minimum frame rate is, in practice, the most important measure (assuming the average is above your perception level).
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
No, I said they should explain them and, most of all, they should tell us what performance is "good enough" for each of them. Not for the gaming benches; the fps numbers are self-explanatory.

People who read these reviews know this stuff already, and not many care about the "good enough for everyone" part, because the average Joe who just wants to surf the web will buy a Dell and never even realize it can be upgraded.


I get this. Now they should add some real-world gaming benches; it shouldn't be so hard to change the resolution and bench again once you already have the game up and running.

In the CPU reviews they test at low resolution for the bottleneck thingy; in video card reviews they bench using only top-of-the-line CPUs. How the heck can one tell whether his budget CPU is good or not for this or that new game at 1080p? The info is not in the CPU review and it's not in the video card review.

The reviews lack the most important thing: real-world benches.

Real-world benches at 1080p etc. show nothing new. Sandy Bridge thrashes Bulldozer in the low-res, low-settings benches, but if the GPU is holding the game back at, say, 20 fps, then it won't matter whether you have a Sandy Bridge, a Bulldozer, or a Core 2 Duo from years ago... It's a poor showing of CPU performance. When you upgrade that GPU, it will matter a lot.

If you want real-world benches, check out a GPU review.
 

thilanliyan

Lifer
Jun 21, 2005
12,065
2,278
126
Hell, at 1920x1200 I noticed much better smoothness going from a PhII 955BE @ 3.8GHz to my 2500K @ 4.5GHz, and this is with an XFX 2GB 6950. Games that really seemed to get a boost were The Witcher 2 and SC2. I loved the PhII, it wasn't bad at all, but I definitely noticed the change.

What do you mean by smoothness?

My 1055 is at 3.8GHz, I have the same card as you, and at 1920x1200 The Witcher 2 is really smooth.

What were you experiencing before you got your 2500K? If it was actually stuttering, that's not the CPU; it's something else.
 

frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
I get tired of seeing the "good enough" argument. I like AMD, but they FAILED. It ISN'T good enough.

I have a gaming rig to do just that: play games. That means I expect it to run any game I throw at it. Fine, maybe 90 percent of the games I throw at it don't use all of its resources. But what about the other 10 percent?

Example: FSX. I'd love to see a benchmark using FSX comparing a stock 2500K and FX-8150. There are other games out there that rely heavily on single-threaded CPU performance, where it DOES make a difference. And yes, I want to be able to play them on max settings, so even taking the horrifying (for a modern CPU) power consumption out of the picture, BD fails here. It is NOT a gaming CPU. It works well for multi-threaded apps; great for the people who use 'em, enjoy.

But it really isn't good enough. Factor in the heat, the huge power draw, and a cost that is MORE than Intel's, and it makes zero sense for a gamer to buy one. None. You're paying more for less performance.

The "good enough" argument falls flat on its face when the chip is more expensive than its competitor and doesn't even perform better in the majority of situations. The FX-8150 only outperforms the 2500K in 16 of the 50 benchmarks AnandTech ran, and it does so while costing more and consuming quite a bit more power, especially when overclocked. There's really no defending this. Yes, an FX-8150 would be fast enough for most people, but if you're spending $245+ on a CPU, it makes no sense to get it over the $220 2500K IMO (barring a few niche uses where the FX-8150 really excels).
 

Kyanzes

Golden Member
Aug 26, 2005
1,082
0
76
Because it falls short of most people's expectations, the rumors, the hopes for a better future, etc.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
What do you mean by smoothness?

My 1055 is at 3.8GHz, I have the same card as you, and at 1920x1200 The Witcher 2 is really smooth.

What were you experiencing before you got your 2500K? If it was actually stuttering, that's not the CPU; it's something else.

It never quite got to what I would call stuttering, but the framerate dropped into the high 20s a few times, depending on how much was happening. Just my estimate, based on years of FPS gaming :)

I think the X6 is faster than the X4 in W2?
 

mosox

Senior member
Oct 22, 2010
434
0
0
If you want real-world benches, check out a GPU review.

They use top-end Intel CPUs in those. Can you find me anywhere on the Internet the results for, say, an Athlon II X4 at 1080p in Mafia II? Or for the Phenom II 955 in some other newer game?

We have the Monte Carlo simulation stuff and lots of meaningless benches, but where IS the info we need? Does nobody bother to actually do some meaningful tests?
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
They use top-end Intel CPUs in those. Can you find me anywhere on the Internet the results for, say, an Athlon II X4 at 1080p in Mafia II? Or for the Phenom II 955 in some other newer game?

We have the Monte Carlo simulation stuff and lots of meaningless benches, but where IS the info we need? Does nobody bother to actually do some meaningful tests?

Well, those would not be meaningful tests.

The benches you propose would show that the CPUs are being held back by the GPUs, but they wouldn't show how much they are being held back. Running the same game at lower settings removes any GPU bottleneck and shows a CPU's true potential.
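A toy model of why that works, as a minimal Python sketch (all the numbers are made up, not from any review): each frame is ready when the slower of the CPU and the GPU finishes, so at heavy GPU settings every CPU posts the same fps.

def fps(cpu_ms, gpu_ms):
    # Crude model: a frame is done when the slower of CPU and GPU is done.
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs for a fast CPU and a slow CPU.
fast_cpu, slow_cpu = 8.0, 14.0

print(fps(fast_cpu, gpu_ms=40.0), fps(slow_cpu, gpu_ms=40.0))  # maxed out: 25.0 vs 25.0 fps
print(fps(fast_cpu, gpu_ms=5.0), fps(slow_cpu, gpu_ms=5.0))    # low settings: 125.0 vs ~71.4 fps

At 40 ms of GPU work per frame both CPUs read 25 fps; drop the GPU load to 5 ms and the gap appears.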
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
The HW sites analyze synthetic benches and technical data and post the results in graph form.

The whole point of benchmarks and reviews is to help users decide what the best-value purchase is for their needs. We can look at:

a) Performance
b) Power consumption
c) Price

If BD does not offer a compelling combination of those factors for most workloads, the fact that it is "good enough" for most users does not make it a better-value purchase than the Intel offerings. Benchmarks differentiate between these two commodity pieces of hardware. CPUs can't really be compared to cars, because cars have a lot of intangibles (feel, brand perception, reliability) that are hard to measure in a meaningful way, unlike a CPU performing calculations.

If you have to choose between BD and SB from a value perspective, the vast majority of people are going with SB; that is why people are disappointed with BD.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
What? Bulldozer's 8-core module design isn't able to process a program with 1, 2, 3, or 4 threads faster than a fast 4-core CPU. Basically, the design sacrifices per-core performance in favour of situations where you may use 6-8 threads (rare with today's programs). My point is that with most of today's programs you want a fast 4-core CPU, not a slow 8-core CPU, which is why single-threaded (i.e., per-core) throughput is more important when discussing quad- vs. hexa- vs. octa-core CPUs. The main exception is if you run programs that actually use 6-8 threads well, in which case the quad may be much slower (but in the case of BD, a single SB core is at least 45-50% faster, so this is also pretty much negated).

What I mean by this is that there are architectural decisions that can affect how a program (with multiple threads) is executed. You strictly mentioned cores-to-threads, but AMD specifically cut Bulldozer's floating-point capability in half (there's one FPU per module). This means that even though there are 8 cores, 8 threads performing floating-point operations won't each run anywhere near the speed of a single thread, because pairs of threads share an FPU.
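To see what that kind of contention would look like, here's a minimal Python sketch using multiprocessing (the workload and worker counts are purely illustrative; this isn't how any reviewer tested):

import math
import multiprocessing as mp
import time

def fp_work(n):
    # Floating-point-heavy loop; on Bulldozer, two of these scheduled onto
    # the same module would be sharing a single FPU.
    acc = 0.0
    for i in range(1, n):
        acc += math.sqrt(i) / (i + 1.0)
    return acc

def bench(workers, n=2_000_000):
    # Time how long it takes `workers` processes to finish the same loop.
    start = time.perf_counter()
    with mp.Pool(workers) as pool:
        pool.map(fp_work, [n] * workers)
    return time.perf_counter() - start

if __name__ == "__main__":
    for w in (1, 2, 4, 8):
        print(f"{w} workers: {bench(w):.2f}s")

If 8 workers take disproportionately longer than 4, shared per-module resources like the FPU would be a prime suspect.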

Although, I don't think it was AMD's intent to produce an architecture that ended up underperforming. It makes me wonder how the design process differs between AMD and Intel. Intel seems to refine smaller aspects of their processors (the branch predictor, etc.), which as a whole add up to better performance. We haven't really seen any crazy departures from them since the introduction of the Core series (if you ignore the addition of an integrated GPU), right?

I do agree that most people really won't use all of the available resources once you start providing so many cores. Grandma doesn't need eight cores to check her e-mail (two are fine in that case).
 

mosox

Senior member
Oct 22, 2010
434
0
0
Well, those would not be meaningful tests.

The benches you propose would show that the CPUs are being held back by the GPUs, but they wouldn't show how much they are being held back. Running the same game at lower settings removes any GPU bottleneck and shows a CPU's true potential.

I understand this. Now I want to know how, say, my Phenom II X3 720 will do in that new game at 1920x1080. Do I need to upgrade or not? Where can I get that info?

crickets

What I would like is this:

Take 50 CPUs and 50 video cards and test them at 1920x1080, every single CPU with every single video card, in 10 games.

Make a page with a dropdown: "Select CPU" (50 options).

Add a second dropdown: "Select video card" (50 options).

Click the "Result" button.

There you go: 10 games tested at 1080p on that particular CPU and video card.

A site like this would be the Holy Grail for gamers; the data behind it could be as simple as the sketch below.
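A hypothetical Python sketch of the lookup (the names are examples and the value is a placeholder, not a measured result):

# One fps table per (CPU, video card) pair. Placeholder values only.
results = {
    ("Phenom II X3 720", "HD 6950"): {"Mafia II": 0.0},  # fill in measured fps
}

def lookup(cpu, gpu):
    # What the "Result" button would return for the two dropdown picks.
    return results.get((cpu, gpu), {})

print(lookup("Phenom II X3 720", "HD 6950"))

The hard part is gathering the 50 x 50 x 10 measurements, not building the page.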