[OFFICIAL] Bulldozer Reviews Thread - AnandTech Review Posted

Page 27

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
"The first one is that there is a new stepping coming, SemiAccurate is hearing mid- to late Q1/2012 for the next rev. That rev is said to bump performance, specifically integer performance, up by quite a bit, and possibly improve clocks too. Either way, it looks like that stepping is the one to keep an eye out for. It isnʼt a Barcelona type fiasco, but it isnʼt an HD4870 launch either."

it seems that charlie is right (again)

bulldozer B3 is on the way....link
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
"The first one is that there is a new stepping coming, SemiAccurate is hearing mid- to late Q1/2012 for the next rev. That rev is said to bump performance, specifically integer performance, up by quite a bit, and possibly improve clocks too. Either way, it looks like that stepping is the one to keep an eye out for. It isnʼt a Barcelona type fiasco, but it isnʼt an HD4870 launch either."
it seems that charlie is right (again)
Olikan said:
bulldozer B3 is on the way....link

Then AMD should just have waited and launched the B3 to begin with.
 
Last edited:

BlueBlazer

Senior member
Nov 25, 2008
555
0
76
These guys explain AMD's handling of Bulldozer >>> AMD's new chip: what a load of Bulldozer...
Like many we've been finding AMD's Bulldozer to be underwhelming... and it is. Just don't expect a magic bullet to fix it.

Over the past few weeks we've had AMD’s new FX-8150 CPU in the labs, running a variety of benchmarks to get our head around the performance delivered by AMD’s new Bulldozer architecture. Like many other reviewers we've been finding the performance of the CPU underwhelming, benching below Intel’s Core i7-2600K and even consistently dipping below the Core i5-2500K.

You may not be aware but there is a simmering dissent in some corners of the internet over Bulldozer reviews. These cover off everything from suspected caching bugs to AMD deliberately sending out a lower performing ASUS motherboard with the review kit (that one in particular has us scratching our heads). But no matter how much our memories of AMD’s greatness want us to subscribe to the crazy theories we just can't do it – it really just seems like AMD painted itself into a corner with years of pre-launch hype over the Bulldozer architecture.

We had initially expected AMD to launch Bulldozer at Computex this year, alongside AMD’s new 990X chipset. Instead we ended up with samples of motherboards and the suggestion we test them using a Phenom II X6 – the implication being that it would be months before Bulldozer arrived. The hot rumour floating around Taipei during Computex was that AMD was struggling to make Bulldozer a competitive performer, and that it was reaching out to overclockers for help tweaking the CPU.

Ever since then there were several rumoured release dates for Bulldozer, but months went by without news. As each rumoured release passed it seemed more and more like AMD was searching for some sort of magic bullet and we suspect that it ended up a case where AMD had to put up or shut up, hence last week’s launch. We wouldn’t be surprised if the fact that Intel is set to launch Sandy Bridge-E before the end of the year played a part in AMD’s timing.

Bulldozer was accompanied by none of the confident bravado that came alongside the launch of Brazos and Llano APUs earlier this year. If anything AMD has subtly shifted gears in recent months to begin hyping up Trinity – next year’s APU – which will combine the Bulldozer-based Piledriver CPU with a 7000 series Radeon GPU. Generally speaking manufacturers tend to shy away from hyping the next generation of product until the current generation launches.

Even during the review process we were being hit with more and more outlandish suggestions, largely revolving around performance in nefarious ‘future workloads’. The implication was that a flood of heavily threaded applications is on the way, the kind of promise that has been made ever since the first multicore CPUs hit the market.

Some of the suggestions for testing applications practically flew in the face of good benchmarking methodology. One suggestion was Battlefield 3, despite the fact that we were at the time hours from the end of beta, despite the fact that the beta version featured old code and was lacking high end assets. Even more importantly it meant testing in an online multiplayer environment, which is about the worst thing you can do for replicability.

Then there were suggestions we use Windows 8, whose task scheduler is apparently more advanced and better leverages the multiprocessing-focused Bulldozer. Of course, there were no associated drivers and the only copy of Windows 8 available is the strange UI-focused developer preview build.

But by that point we had moved well into ridiculous territory. Speculation about the future is well and good but if we need to be testing with unreleased operating systems just to see improvements we aren’t actually reviewing a product in the real world anymore. No matter how outlandish the conspiracy theories, the hard reality is that if you went out and bought an FX-8150 today you’d get less value for your money than if you bought a Core i5-2500K (or even a Phenom II X6 1100T).

We've asked AMD to see if we can get some lower end models of Bulldozer, where its traditional price/performance advantages might be seen. We're also testing with motherboards other than the Crosshair V Formula to see what, if any, performance benefits may arise. But if you are one of those AMD fans desperately hanging out to hear that a simple tweak will make Bulldozer awesome, then you should probably just accept the fact that while it’s a perfectly cromulent CPU, AMD has lost this round of the performance battle to its far more embiggened opponent.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
Keysplayr said:
Then AMD should just have waited and launched the B3 to begin with.

Well, AMD said the BD CPU would debut in 1H11, then 2Q11, and then 3Q11. If AMD suddenly said that it would be out in late 1Q12, the effects on their stock and internal politics would probably have been even more significant.

At least now they have something to sell for their efforts. Consumers, most of whom know very little about the internal workings of their systems, will see desktops advertised as having an '8 core Bulldozer processor!' - that sounds good and will lead to some Christmas sales so long as GloFo can deliver.

As far as enthusiasts/fanbois go, since B3 won't be Piledriver**, we probably won't even see an average 10% gain in performance (unless GloFo pulls a rabbit out of the hat and saves AMD's @ss - though this is unlikely). BD B3 will be up against IB, and will likely fare even worse than it did against SB. The only hope for BD B3 is that Intel takes advantage of the situation and jacks up the prices of IB by 25%, just because they can. Then again, Intel could lower the prices of SB and run a squeeze play on AMD.

Whatever the case, the road map** for BD through 2014 suXors - beyond that, it will take a new and brilliant CPU design to change things.


**
zambezi-slide-10.jpg
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Ajay said:
As far as enthusiasts/fanbois go, since B3 won't be Piledriver**, we probably won't even see an average 10% gain in performance (unless GloFo pulls a rabbit out of the hat and saves AMD's @ss - though this is unlikely). BD B3 will be up against IB, and will likely fare even worse than it did against SB. The only hope for BD B3 is that Intel takes advantage of the situation and jacks up the prices of IB by 25%, just because they can. Then again, Intel could lower the prices of SB and run a squeeze play on AMD.

**
zambezi-slide-10.jpg

I have seen people make the same mistake over and over again all over the web (even Anand did it).

The above AMD roadmap chart says that they (AMD) will have an increase in Performance per Watt of 10-15% every year.

That doesn't show us the performance increase over the previous year's design, simply because we could have the same performance as before with 10-15% lower power usage (increased Performance per Watt).

So, saying that AMD will raise performance by 10-15% every year is simply wrong; we have no idea how much the performance increase will be, if any ;)
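To make the distinction concrete, here is a toy calculation (the numbers are purely illustrative, not taken from the chart):

```python
# Illustrative only: a chip can hit a ~15% perf/W gain with zero speedup,
# simply by doing the same work at lower power.
perf_old, power_old = 100.0, 125.0          # hypothetical score and watts
perf_new, power_new = 100.0, 125.0 * 0.87   # same score, 13% less power

ppw_gain = (perf_new / power_new) / (perf_old / power_old) - 1
print(f"{ppw_gain:.1%}")  # → 14.9%
```

Same benchmark score, yet the perf/W line on a chart like that one still goes up ~15%.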
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
AtenRa said:
I have seen people make the same mistake over and over again all over the web (even Anand did it).

The above AMD roadmap chart says that they (AMD) will have an increase in Performance per Watt of 10-15% every year.

That doesn't show us the performance increase over the previous year's design, simply because we could have the same performance as before with 10-15% lower power usage (increased Performance per Watt).

So, saying that AMD will raise performance by 10-15% every year is simply wrong; we have no idea how much the performance increase will be, if any ;)

Thanks, never noticed that the left axis was labeled. In that case, yes, this is a fairly useless chart, like the similar one that Nvidia uses :rolleyes:
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
Aren't we supposed to get a 100% increase in performance/watt every 18-24 months?

Sandy Bridge is faster than Nehalem and uses less power.

Bulldozer is slower than Phenom and uses more power.

urdoinitrong
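For scale, "100% every 18-24 months" works out to roughly 41-59% per year, while the roadmap's 10-15% per year compounds to nowhere near that (illustrative arithmetic only):

```python
# Compounding the roadmap's 10-15%/yr perf/W figure over two years,
# versus the "double every 18-24 months" rule of thumb (factor 2.0).
low  = 1.10 ** 2   # 10%/yr for two years
high = 1.15 ** 2   # 15%/yr for two years
print(round(low, 2), round(high, 2))  # → 1.21 1.32 (vs 2.0)
```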
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
drizek said:
Aren't we supposed to get a 100% increase in performance/watt every 18-24 months?

Sandy Bridge is faster than Nehalem and uses less power.

Bulldozer is slower than Phenom and uses more power.

urdoinitrong

Not according to Anand's review: Bulldozer is faster than the Phenom II X6 1100T in multithreaded apps, with a little more power usage.

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/9

41715.png


41697.png


Bulldozer is 13.65% faster in x264 than the Phenom II X6 1100T.
Bulldozer uses 14.5% more power in x264 than the Phenom II X6 1100T.


Performance per Watt of Phenom II X6 1100T = 0.157
Performance per Watt of FX8150 = 0.156

In this test (x264 second pass) they have the same Performance per Watt, but the FX is faster.
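As a quick sanity check of those figures, using only the two percentages quoted above (the raw fps and wattage numbers are in the charts):

```python
# Sanity check: relative performance per watt of the FX-8150 vs the
# Phenom II X6 1100T, derived only from the two percentages above.
speedup = 1.1365      # FX-8150 is 13.65% faster in x264
power_ratio = 1.145   # ...while drawing 14.5% more power

rel_perf_per_watt = speedup / power_ratio
print(round(rel_perf_per_watt, 3))  # → 0.993, i.e. effectively a wash
```

That agrees with the 0.156 vs 0.157 figures: the two chips land within about 1% of each other in perf/W on this test.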
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
AtenRa said:
Not according to Anand's review: Bulldozer is faster than the Phenom II X6 1100T in multithreaded apps, with a little more power usage.

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/9

41715.png


41697.png


Bulldozer is 13.65% faster in x264 than the Phenom II X6 1100T.
Bulldozer uses 14.5% more power in x264 than the Phenom II X6 1100T.


Performance per Watt of Phenom II X6 1100T = 0.157
Performance per Watt of FX8150 = 0.156

In this test (x264 second pass) they have the same Performance per Watt, but the FX is faster.

If they were adjusted to the same clocks, BD isn't any faster. It's actually slower in benches I've seen where they've done this (this is with all 8 cores). They didn't do power measurements, though, so BD might very well not be using more power than Thuban.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
What's the deal with this review?

http://www.hardwaresecrets.com/arti...i5-2500K-and-Core-i7-2600K-CPU-Review/1402/12

BD beats i5-2500k in all games tested, nearly beating i7-2600k in SC2 and Lost Planet 2, while taking the lead over all in Dirt 3 and Deus Ex.

Is there something fishy going on? Is this just the perfect set of games to showcase BD's potential advantages? I don't think so, because BD does so poorly in SC2 in AT's review. So what is going on here?
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
Chiropteran said:
What's the deal with this review?

http://www.hardwaresecrets.com/arti...i5-2500K-and-Core-i7-2600K-CPU-Review/1402/12

BD beats i5-2500k in all games tested, nearly beating i7-2600k in SC2 and Lost Planet 2, while taking the lead over all in Dirt 3 and Deus Ex.

Is there something fishy going on? Is this just the perfect set of games to showcase BD's potential advantages? I don't think so, because BD does so poorly in SC2 in AT's review. So what is going on here?

if you spend enough time with enough hardware/software/driver/resolution configurations, you're bound to find a sweet spot that shows BD winning.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
AtenRa said:
That doesn't show us the performance increase over the previous year's design, simply because we could have the same performance as before with 10-15% lower power usage (increased Performance per Watt).


but it's not actually impossible; actually, it's "easy".

just get rid of that massive L2 cache, use 512KB or 1MB per core (not shared), and just from that a ~5% performance gain and some energy reduction appears; add some better clocks, and there are your 10-15% performance/watt.

that may occur with trinity: since it probably won't have L3 cache, amd will have to work on the L2
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,811
1,290
136
Chiropteran said:
What's the deal with this review?

http://www.hardwaresecrets.com/arti...i5-2500K-and-Core-i7-2600K-CPU-Review/1402/12

BD beats i5-2500k in all games tested, nearly beating i7-2600k in SC2 and Lost Planet 2, while taking the lead over all in Dirt 3 and Deus Ex.

Is there something fishy going on? Is this just the perfect set of games to showcase BD's potential advantages? I don't think so, because BD does so poorly in SC2 in AT's review. So what is going on here?

Anandtech's Rig

  • Motherboards: ASUS P8Z68-V Pro (Intel Z68), ASUS Crosshair V Formula (AMD 990FX)
  • Hard Disks: Intel X25-M SSD (80GB), Crucial RealSSD C300
  • Memory: 2 x 4GB G.Skill Ripjaws X DDR3-1600 9-9-9-20
  • Video Card: ATI Radeon HD 5870 (Windows 7)
  • Video Drivers: AMD Catalyst 11.10 Beta (Windows 7)
  • Desktop Resolution: 1920 x 1200
  • OS: Windows 7 x64



Hardware Secrets Rigs


Operating System Configuration

  • Windows 7 Ultimate 64-bit
  • NTFS
  • Video resolution: 2560x1600 60 Hz
Driver Versions

  • AMD video driver version: Catalyst 11.10
  • AMD chipset driver version: 3.0.816.0
  • Intel Inf chipset driver version: 9.2.0.1030
41703.png


Anand: Starcraft 2 has traditionally done very well on Intel architectures and Bulldozer is no exception to that rule.

---
Hardware Secrets

We tested this game at 1920x1200. The quality of the game was set to the “low” preset, disabling both anti-aliasing and anisotropic filtering. We then used FRAPS to collect the frame rate of a replay on the “Unit Testing” custom map. We used a battle between very large armies to stress the video cards.

http://www.hardwaresecrets.com/article/AMD-FX-8150-vs-Core-i5-2500K-and-Core-i7-2600K-CPU-Review/1402/12

On StarCraft II, the Core i7-2600K, the Core i5-2500K, and the AMD FX-8150 achieved the same performance level, with the new AMD processor being 12% faster than the Phenom II X6 1100T.
 
Last edited:

BlueBlazer

Senior member
Nov 25, 2008
555
0
76
Wow! What's up at AMD? Where are the top reviewers like Anandtech, Tom's Hardware and Tech Report? Not to mention oldies like Guru3D, TechSpot, Bittech, Hexus, TweakTown, Overclockers Club and Xbit? LostCircuits fell out of favor too? :hmm:

And in AMD's list, they picked Computer Shopper?! Anyone see the TechwareLabs review? Hi Tech Legion's graphs are so ridiculous (reminds me of DonanimHaber). :D
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
I'm not surprised Anandtech is not there, as Anand likes to use what I think of as non-real-world hardware and software to test hardware. Anand gets the absolute best of everything else to find the true differences between products. This is best seen in GPU testing, where he'll use high end hardware to test low end GPUs. Yet nobody in the real world uses an HD 6350 with an i7 2600K. His reasoning is absolutely correct: he wants to show the tested hardware unlimited by other components, stressing each individual part of the item he is testing. Testing CPUs with Excel's Monte Carlo simulation is useful to see how the CPU handles it, but very few people ever use anything like Monte Carlo in real life. For those of us who understand what he's doing, it is absolutely valid. To some guy off the street who doesn't understand, it will be deceiving.

I read a few of the others, and they seemed to use more real-world hardware setups and tests, which tend to show BD in a more forgiving light. BD is only two FPS slower than an Intel CPU in their tests, as opposed to being totally thrashed in Anand's tests.

So AMD wants you to see BD in more real-world scenarios where it doesn't look as bad? Ok, but it still doesn't change the final verdict. Two minutes slower encoding a movie is still two minutes. And two FPS in a game is still two FPS. Why buy BD when I can get an Intel CPU for the same cost, plus 2 FPS or 2 minutes as well?
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Shame they didn't talk like this in their blogs pre-launch, would have softened the criticism a bit I think. Then again some internet posters are hate machines.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Just adding a frustrated caveat about how they've been doing their mini-ITX power testing with huge-wattage PSUs. A big site like Anandtech should be able to get their hands on a 300W 80 Plus Bronze or Silver unit, even if they *shudder* have to go to Best Buy.

Still doing it too: http://www.anandtech.com/show/4833/asus-f1a75i-deluxe-review-llano-and-miniitx/4

A 1000W power supply, really?

lifeblood said:
I'm not surprised Anandtech is not there, as Anand likes to use what I think of as non-real-world hardware and software to test hardware. Anand gets the absolute best of everything else to find the true differences between products. This is best seen in GPU testing, where he'll use high end hardware to test low end GPUs. Yet nobody in the real world uses an HD 6350 with an i7 2600K. His reasoning is absolutely correct: he wants to show the tested hardware unlimited by other components, stressing each individual part of the item he is testing. Testing CPUs with Excel's Monte Carlo simulation is useful to see how the CPU handles it, but very few people ever use anything like Monte Carlo in real life. For those of us who understand what he's doing, it is absolutely valid. To some guy off the street who doesn't understand, it will be deceiving.

I read a few of the others, and they seemed to use more real-world hardware setups and tests, which tend to show BD in a more forgiving light. BD is only two FPS slower than an Intel CPU in their tests, as opposed to being totally thrashed in Anand's tests.

So AMD wants you to see BD in more real-world scenarios where it doesn't look as bad? Ok, but it still doesn't change the final verdict. Two minutes slower encoding a movie is still two minutes. And two FPS in a game is still two FPS. Why buy BD when I can get an Intel CPU for the same cost, plus 2 FPS or 2 minutes as well?
 

BlueBlazer

Senior member
Nov 25, 2008
555
0
76
Both Guru3D and Overclockers Club are notorious for testing games in often GPU-limited scenarios (where everything is maxed out). Bjorn3D and TweakTown often do the same thing as well. And yet they all got left out of the list?! :p

So what gives? Is it some game they used (I have a feeling that one of them is "Far Cry 2") or some application they don't approve of (for example, Overclockers Club used Excel and the ancient ScienceMark, etc.)? :hmm:

Edit: In Bjorn3D's review, Lost Planet didn't favor the AMD FX

LostPlanet2.jpg


whereas (AMD-approved) Icrontic's did...

Bulldozer_Lost_Planet_2_0xAA.png
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
That is pretty funny. My review made AMD's list but didn't make it here as an "approved" review for the ATF CPU forum
;)
Well, if you had sent me a PM or something saying you had a review... I had no idea.:p