
PileDriver Performance only 10%-15% (maybe)

AMD may very well never beat Intel in all-out performance again; it is difficult to outdo a company that spends many times what yours does on R&D. First, AMD needs to get competitive in areas where it can actually compete.

They do have a product that sells very well and fares better than Intel's, which is why they haven't announced anything after Vishera for the desktop enthusiast. I believe their original plan was to go full APU after AM3+ sees its last CPU later this year.

If Bulldozer to Piledriver is similar to Phenom I to Phenom II, I will be happy. I think, hope and pray that the AMD engineers and, most importantly, the front office realize that Bulldozer was hardly a home run with CPU enthusiasts.

There are 2 very big differences between the Phenom I > Phenom II and BD>PD releases: AMD actually had a bug to fix in Phenom I that affected its performance. Secondly, AMD also had the luxury of using a smaller node for Phenom II. These are 2 advantages (or disadvantages) that they won't have now.

It looks as though the performance is getting a nice bump if you believe the rumors. We'll likely know more from even more leaked benchmarks and data as the chips will be released in Q2 and OEMs have or will be getting their hands on them soon. I wouldn't expect Ivy Bridge level performance but I'd expect something that's respectable and graphical performance that kicks IVB in the teeth.
 
From what I heard, Piledriver is also gonna be a new socket. Which means even if you have an AM3+ board, you're gonna have to upgrade the mobo as well if you want a Piledriver.

The APU version is a different socket. FX Piledriver should be compatible with current AM3+ boards. Unless I overlooked something somewhere.....
 
AMD actually had a bug to fix in Phenom I that affected its performance.

Actually the initial 'fix' for the bug is what affected performance, NOT the bug itself. Still, no one in the real world was able to duplicate the TLB bug anyway. Majorly overblown.
 
Yeah, not sure why Intel's chipset issue hasn't lingered as much...

Not that I think it should; I think the sheer volume and length of time spent talking about Phenom I's TLB issue was quite silly.
 
There are 2 very big differences between the Phenom I > Phenom II and BD>PD releases: AMD actually had a bug to fix in Phenom I that affected its performance. Secondly, AMD also had the luxury of using a smaller node for Phenom II. These are 2 advantages (or disadvantages) that they won't have now.

The bug in Phenom I was fixed with the B3 stepping, models like the 9850 never had that bug. The improvements from Phenom II were primarily based on a die shrink. Piledriver isn't getting any known improvements to the manufacturing process, at least we know it's not a die shrink, it's relying entirely on architectural tweaks.
 
For highly threaded applications the Bulldozer arch does make a good bit of sense. Look up Adobe Lightroom benchmarks if you want to see.
 
Also, the TLB bug was fixed within Phenom I's lifetime: the B3 stepping fixed it, well before Phenom II.

Edit: Ninja beat me to it
 
IMO they should use Excavator to take the Phenom II out of the ground so that they can die shrink it and enhance it. Hopefully Piledriver didn't pound it too deep. 🙂
 
I've said it once and I'll say it again. AMD has no chance against Intel. Their quad cores can't even keep up with Intel's dual cores.
 
So, Bulldozer, Piledriver, and Steamroller. Take away the second part of each word, we have
Bull, Pile, and Steam.

Now we put Steam first, because it shares a name with an awesome gaming service
Steam Bull Pile

Now we insert some fun words
Steaming Pile of Bullshit!

Anyway, back to your regular programming. If they can keep to 10-15% per year, then maybe they will eventually catch up to Intel. For PileDriver, they need to increase IPC by 10-15%, increase clock speed headroom by 10-15%, while simultaneously decreasing power drain by 10-15%, even at the elevated clock speeds.
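The catch-up arithmetic can be sketched quickly. These numbers are hypothetical, not from the thread: assume AMD starts 40% behind per core, gains 15% a year, and Intel gains 10% a year.

```python
import math

# All three figures are assumptions for illustration only.
gap = 1.40          # Intel perf / AMD perf today (assumed)
amd_rate = 1.15     # AMD yearly gain (upper end of 10-15%)
intel_rate = 1.10   # Intel yearly gain (assumed)

# The relative gap shrinks by amd_rate/intel_rate each year.
years = math.log(gap) / math.log(amd_rate / intel_rate)
print(f"Years to close the gap: {years:.1f}")  # roughly 7.6 under these assumptions
```

Even under these generous assumptions it takes the better part of a decade, which is why "10-15% per year" alone doesn't close anything unless Intel stalls.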

😀
 
15% is definitely not worth returning my A6 laptop I bought 10 days ago... I figured I should return it and get the Trinity, but it won't be out for another month or two... and for all I know the price will be like $600 vs the $430 I paid for mine... and if it's really only 15%, then that's what... 5 more fps?

Or will the gaming performance really be 50% like they said... I wish we had some real benchmarks... grr
 
Rumor is it will be 384 VLIW4 800MHz SPs versus current 400 VLIW5 600MHz SPs. Virtual napkin calculations would say 30-50% GPU performance. That's top end desktop though, not sure how the notebook chips will square off.
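For what it's worth, the raw-throughput half of that napkin math works out like this (VLIW4's better unit utilization versus VLIW5 would come on top, which is presumably where the 50% upper bound comes from):

```python
# Raw shader throughput from the rumored specs: SP count x clock.
trinity = 384 * 800e6   # rumored: 384 VLIW4 SPs at 800 MHz
llano   = 400 * 600e6   # current: 400 VLIW5 SPs at 600 MHz

raw_gain = trinity / llano - 1
print(f"Raw throughput gain: {raw_gain:.0%}")  # prints "Raw throughput gain: 28%"
```

So ~28% from clocks and unit count alone lands right at the bottom of the 30-50% range.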
 
Rumor is it will be 384 VLIW4 800MHz SPs versus current 400 VLIW5 600MHz SPs. Virtual napkin calculations would say 30-50% GPU performance. That's top end desktop though, not sure how the notebook chips will square off.

Die sizes correlate well to performance for GPUs. The Trinity GPU is 20-30% larger than the Llano one.
 
I love how people focus only on AMD's cpu performance and not the pretty damn stellar GPU performance that AMD cranks out.

AMD's exploiting their strengths -- they own ATI! I know if I wanted a laptop that could game but not break the bank, I'd go with a Llano/Trinity. HD 3000 just doesn't cut it for me.
 
Die sizes correlate well to performance for GPUs. The Trinity GPU is 20-30% larger than the Llano one.

The extra die size vs Llano is 99% from the iGPU; the two PD modules are almost equal in size to the 4 Stars cores in the Llano die.

Edit: Ehm, i read your GPU as CPU, sorry for that 😉
 
The extra die size vs Llano is 99% from the iGPU; the two PD modules are almost equal in size to the 4 Stars cores in the Llano die.

If this is true, and if PD per watt is better than Llano on a 32nm process, then I don't see how anybody could say the BD idea is a fail.
 
If this is true, and if PD per watt is better than Llano on a 32nm process, then I don't see how anybody could say the BD idea is a fail.

BD is considered a failure because it underperforms the previous CPU architecture.
Because people would rather put a Phenom II in their machine than a BD CPU.
Because only by overclocking can it keep up with the previous architecture, at the cost of substantial power consumption and heat.
Because it's not a true X8 CPU, though it was advertised as one.
Because the resale value of a BD CPU requires one to sell at a loss @For Sale and Trade.
Because it is to this day still overpriced.
And because a Pentium G30 SB 65W low-end dual-core CPU outperforms the FX-4100, -6100, and X8 -8120 at stock clocks, as shown here.
http://www.tomshardware.com/reviews/gaming-cpu-review-overclock,3106-2.html

I'm sure others could come up with more if they tried.

That being said, BD is a value CPU if it is accompanied by a free motherboard, like at Microcenter, or if you're running tons of virtual machines. So I would definitely consider a high-clocked Piledriver if it came with a free motherboard.
 
BD is considered a failure because in the worst-case scenario it underperforms the previous CPU architecture.
Because some people would rather put a Phenom II in their machine than a BD CPU.
Because only by overclocking can it keep up with the previous architecture in some things (in the rest of the tests it's much better), at the cost of substantial power consumption and heat.
Because it's not a true X8 CPU according to traditional ideas (even though graphics cards are allowed to have many sub-"cores"), though it was advertised as one.
Because the resale value of a BD CPU requires one to sell at a loss @For Sale and Trade (I'm not even going to touch this one... it's THAT good).
Because it is to this day still overpriced, when all you do is play games.
And because in games a Pentium G30 SB 65W low-end dual-core CPU outperforms the FX-4100, -6100, and X8 -8120 at stock clocks, as shown here.
http://www.tomshardware.com/reviews/...ck,3106-2.html

I'm running an AM3+ board without Bulldozer, so obviously I don't think it's a top design. But I thought I'd correct your points anyway; I'm sure these people prefer to hear from someone capable of reading reviews.
 
If this is true, and if PD per watt is better than Llano on a 32nm process, then I don't see how anybody could say the BD idea is a fail.

Area breakdown:

Without L2

Llano core: 11.3mm2
Trinity module: 19.7mm2

With L2

Llano core: 18.9mm2
Trinity module: 33.8mm2

Bulldozer does give you 2x the cores at less than 2x the core area. The only problem is that the performance deficit versus 2x full cores seems almost proportional to the die space saved.

On the CPU side, Piledriver is only a few % faster per clock than Bulldozer; the rest is all clocks. Its gains basically match the 2700K-to-3770K gains, so we end up at the same relative level as now.
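Running the numbers from the breakdown above shows how far under 2x the module actually is, per pair of cores versus a single Stars core:

```python
# Area figures quoted in the post above, in mm^2.
module_no_l2, core_no_l2 = 19.7, 11.3    # without L2
module_l2,    core_l2    = 33.8, 18.9    # with L2

# One Piledriver module holds 2 cores; one Llano figure is 1 Stars core.
print(f"Without L2: {module_no_l2 / core_no_l2:.2f}x the area for 2 cores")  # 1.74x
print(f"With L2:    {module_l2 / core_l2:.2f}x the area for 2 cores")        # 1.79x
```

So a module buys the second core for roughly 75-80% extra area rather than 100%, which is exactly the trade-off being weighed against the per-core performance loss.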
 
15% is definitely not worth returning my A6 laptop I bought 10 days ago... I figured I should return it and get the Trinity, but it won't be out for another month or two... and for all I know the price will be like $600 vs the $430 I paid for mine... and if it's really only 15%, then that's what... 5 more fps?

Or will the gaming performance really be 50% like they said... I wish we had some real benchmarks... grr

Llano can't play DiRT 3 at 1366x768, high settings, 2xAA....

Trinity can.... http://www.youtube.com/watch?feature=player_embedded&v=WD6GaFEpfC4
 
I've said it once and I'll say it again. AMD has no chance against Intel. Their quad cores can't even keep up with Intel's dual cores.

I've said it once and i'll say it again. Intel has no chance against AMD. I want to transcode movies/run a ton of virtual machines/game on a laptop on the cheap and my dollar goes much farther with AMD than Intel.

You see, AMD has to stay competitive with a tenth of the funds Intel has. Expecting them to compete on IPC is sheer madness, because there is a thing called diminishing returns and Intel can sink a ton more funds into R&D to stay ahead. To stay alive, AMD has to find its niches where Intel is weaker: iGPU and heavily multi-threaded software. If those don't interest you, then fine, but proclaiming that AMD has lost because they're not tilting at windmills is silly.
 
If we're to believe the rumors, then Trinity looks like it has exceeded early expectations: 30-50%+ GPU improvement along with ~10-30% compute performance over equivalent Llanos at equal TDP. I'm just hoping the 30% compute isn't tied to instruction sets. The GPU will be awesome; I don't think anyone doubts that.

As the poster above said, I'd much rather have a Trinity laptop than an IB one. I'm not going to be using the laptop for anything CPU intensive and the most challenging task it would face would be the occasional game. If price, battery life and GPU performance are great and the CPU is good enough then I'm sold. If it comes in 1080p and crossfire works (unlike Llano where it was almost useless for months) then I may just replace my desktop 😛
 