
Bulldozer may not provide dramatic performance increase

I'll be satisfied if BD comes out in 1H2011 and keeps or improves AMD's price/performance ratio. Although, I am hoping that they manage to get some significant IPC improvements out of BD. It's a shame AMD has nothing to compete with at the very high end but that's not the majority of the market.

AMD earns a lot of credit in the affordable computing department. I have an Athlon II X3 440 with the 4th core unlocked, running at 3.4GHz, which came bundled with a 785G motherboard for $80. Intel really isn't offering anything in that everyman bracket.
 
Um, Dirk Meyer announced a few weeks ago that BD wouldn't be out until 2H 2011, so get ready for disappointment.

$80 for an X4 @ 3.4GHz is ridiculous, btw. How many of us will even need to upgrade in the next year or two? The current gen is sufficient for 99%+ of users anyway. Of course, now that I think about it, 8 BD cores should be faster than my i7 920... hmmm...
 
That's why I keep saying the Socket 1366/2011 offerings with 6 and 8 cores are not important. The benefits come from a new architecture. If 6 cores are brought to the mainstream socket, it might be only a little better.

The larger shared caches offered by higher-core-count CPUs are more relevant to most people than the actual number of cores. Of course, even that has detrimental effects, like higher latency.

Same is true for AMD. But when they say they have 8 cores, they also mean there are 8 threads. Hyperthreading is one thing that makes CPUs with more than 4 cores less important.

OK, I agree, I'll accept that the number of cores, specifically 6 vs 8, is not important. But I'll ask a similar question: why should the average Joe care about SB or BD? If they don't care about graphics, they won't see the added benefit of having the GPU integrated; if they did care about graphics, they would have a discrete card.

My take is that SB/BD is actually targeted at the mobile folks who care about graphics, having the option to offload some of the less intensive graphics compute onto the GPU to increase battery life, and that's one of the top items for a mobile platform.

Aside from that, I don't see what the big deal is with integrating the GPU on-die. I know I'm missing something with this new architecture that should get me excited about it.
 

First gen bulldozer has no integrated GPU, neither will the high end Sandy Bridge.
 
Well, I just read the report. All I could see about Bulldozer is "we are on track for 2011 launches". It does seem highly likely, however, that a first- or second-quarter launch would have elicited something more like "...for Q1 2011" or "...for 1H 2011".

edit: here's the report: http://seekingalpha.com/article/214781-advanced-micro-devices-inc-q2-2010-earnings-conference-call

More here in the Q&A; specifically, this is where Dirk said that Llano would be delayed by a few months:
http://seekingalpha.com/article/214...c-q2-2010-earnings-conference-call?part=qanda
 

There are a lot of people who will get the low-end chips like the Core i3 540, and market data shows that chips all the way up to $160 or so are the most popular segment. Would the i3 540 be preferable, or the Sandy Bridge based i3?

You can do a dumb integration which merely puts the GPU core next to the CPU core, or a tighter integration that shares resources. For instance, Sandy Bridge is said to share caches between the CPU and GPU. The significantly faster communication between the two allows them to do things that were previously not practical.
 
So you open PDFs that require 16 cores often?
I don't think he means just the reader. In any case, Handbrake is a program I use fairly often which is multithreaded, and many other programs I use - browsers, MyDefrag, iTunes, Photoshop Elements, to name just a few - I wish had multithreading capability.
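The kind of parallelism being wished for here boils down to a worker-pool pattern. Here's a minimal, hypothetical sketch in Python; the chunking and function names are made up for illustration, and none of the programs named above necessarily work this way:

```python
from concurrent.futures import ThreadPoolExecutor
import os

def process_chunk(chunk):
    # Stand-in for heavy work, e.g. encoding one video segment.
    return sum(x * x for x in chunk)

def process_all(chunks, workers=None):
    # Spread independent chunks across a pool of workers.
    # Note: CPython threads only overlap I/O-bound work; for truly
    # CPU-bound work you would swap in ProcessPoolExecutor, which is
    # what lets a tool like Handbrake scale with core count.
    workers = workers or os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_chunk, chunks))
```

A single-threaded app leaves all but one core idle no matter how many the CPU has, which is why core counts alone don't help the programs listed above.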
 
Well, I can use 6, 8, 12, 16... 64 cores. I'm a "regular Joe". I just happen to 3D model as a hobby. If I can get an 8 core/8 thread AMD processor that will outperform a 4 core/8 thread Intel for less money (very likely) then it's a big win in my book. If it will plug into a 2 year old mobo and I can continue to use the same RAM, even bigger win.
 

JF has explained this before. They give increasingly more detailed launch windows as we get closer. So Magny-Cours was originally "2010", then they announced "Q1" then "March" then the launch day. Bulldozer is the same. The official statement, in full, is: "2011, and not December 31st". You can't draw any conclusions from that.

Llano is delayed only because of the process. It is not a design issue. Bulldozer was scheduled to come later anyway, so it is unaffected and on track.

--

As for Bulldozer's performance, let's look at Intel's highest end chip next year: a 10-core Westmere-EX. That's right, NOT a Sandy Bridge.

Since a 12-core MC is competitive with a 6-core Westmere, a "50 percent more performance" BD will be competitive with a 10-core Westmere-EX. With the same power consumption.

The "50 percent more performance" claim refers only to server apps going from 12-core MC to 16-core BD. Without clockspeeds and more benches, you can't deduce how that will affect desktop performance at all.

Idontcare is right though: you can't buy it, so it doesn't matter yet.
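The back-of-envelope comparison above can be written out explicitly. All the numbers below are the post's own assumptions (today's parity and the "50 percent" server claim), plus a linear-core-scaling assumption for Westmere; none of them are measurements:

```python
# Treat per-socket throughput in arbitrary units.
mc_12core = 1.0                  # baseline: 12-core Magny-Cours
westmere_6core = 1.0             # stated to be competitive with MC today
bd_16core = mc_12core * 1.5      # AMD's "50 percent more performance" claim

# Assume Westmere throughput scales roughly linearly with core count:
westmere_ex_10core = westmere_6core * (10 / 6)

# 1.50 vs ~1.67: close enough to call "competitive", but clock
# speeds and per-core IPC could move either number.
print(bd_16core, round(westmere_ex_10core, 2))  # 1.5 1.67
```

The gap between 1.50 and 1.67 is well inside the uncertainty of the premises, which is why the post only claims BD "will be competitive", not faster.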
 
Which track? I remember when BD was a 2010 CPU.

lol, serious but funny anecdote. You know how TI was the fab for Sun's chips for some 15-20 years, right? Well, in 2004 Sun cancelled a product they internally codenamed the "Millennium" chip (externally the chip was referred to as UltraSPARC V).

So why was it called the Millennium chip if it was being cancelled in 2004, some four years after the millennium had turned? A rational explanation might be to assume they must have started designing it in the year 2000... but no. It was codenamed the Millennium chip because it was supposed to come to market in the year 2000.

Not only was the chip 4 years behind schedule when they finally decided to cancel it, but when they cancelled it they estimated it had cost them some $5B in R&D that never resulted in a sellable product. Ouch.

AMD may not be tracking to their original internal release timeline for Bulldozer, but it's not like they are forging new ground in this industry when it comes to project management and milestone pushouts.

History shows us it could be worse, a lot worse. Even with Intel's wealth, look how delayed Montecito was, or for that matter, look at how delayed Tukwila is currently.
 
Which track? I remember when BD was a 2010 CPU.

It was a 2009 CPU, announced in 2007. The roadmap was torn up by the time of the 2008 Analyst Day, where they said it would be 2011. They have at least stuck to that since.

JF did say that this Bulldozer is not the same design: the 2009 one would have been 45nm and the core design would be different.
 

I recently worked on a program that spent over $200B in R&D before it was canceled. (It was planned to cost $350B, so at least it didn't go over budget before it was canceled.)
 
I just skimmed over the first page, but in case this hasn't been mentioned, didn't AMD say that it would have 33% fewer ALUs?
 

Intel can afford to be late because their competition for the girl has missed the party entirely.
 