AMD sheds light on Bulldozer, Bobcat, desktop, laptop plans


Idontcare

Elite Member
Oct 10, 1999
I hope this is not much of a derailment, but since it is about AMD's supposed "leadership" being only a myth, it does have some relevance in a thread about their most ambitious roadmap to date...

IDC, thanks to the links you sent pointing me to BSN, Fudzilla, and INQ, I ran across this article about AMD. How factual is this? Despite the AMD-bashing, it does sound well-researched and rather "factual", but I do not have enough knowledge about the matter to form an educated opinion.

If that article arguing that AMD has always played second fiddle to Intel is absolutely true, and there's no such thing as Intel and AMD leap-frogging each other because Intel has always been on top, then there does seem to be little reason to think Bulldozer/Bobcat would suddenly be any different?

Intel's desktop segment definitely got its proverbial ass handed to it on a platter with Prescott. Not in the financial sense (some would argue that is where being an effective monopoly helps), but in the technological sense.

What made the train-wreck that was Prescott all the more painful to watch was that, despite having a whole node shrink going for it (130nm -> 90nm) and more than twice the transistor budget (55M for the P4C -> 125M for Prescott), the resultant architecture and process-technology advantages weren't leveraged in a way that could compete even with Intel's own older, lower-transistor-count Northwood architecture (let alone AMD's).

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=1956&p=22
(note that the P4 EEs in the charts in that article are all 130nm Northwood-based EEs... yes, even with the node shrink, Prescott at the time of its debut could not trump Northwood's clock speeds, or its IPC)

This is why you'll see me make comments from time to time along the lines of "It is not that AMD took two steps forward with K8 and trumped Intel, it is that AMD took their expected one step forward with K8 while Intel inexplicably elected to take one step back with Prescott, creating a gap where none should have existed...".

I owned a K6-2, then a PII, then a K7, then a P4C Northwood. Prescott wasn't even an option, and because of that the K8 X2 prices were just exorbitantly high and I never bought one. I held onto that 130nm P4C all the way until the 65nm Kentsfields came out.

The opportunity exists for Intel to take another step back, but the probability of Intel making the same mistake twice is all the smaller.
 

jvroig

Platinum Member
Nov 4, 2009
Well, that's three posters in a row with what seem to be very well-thought-out replies. Thanks. I guess AMD leapfrogging Intel only happened once, and mostly due to Intel taking a step back.

So, who's the "Phil" person who's supposed to respond as well, according to IntelUser2000?
 

Fox5

Diamond Member
Jan 31, 2005
I think it's a bit unfair to AMD to say their lead came mostly from Intel being lazy or incompetent.
The Athlon core was massively complex for a desktop design at the time, bearing a strong resemblance to the high-end RISC cores of the era. Granted, the P4 was an even more complex core, but they each represented a possible future of chip design. The P4's design even appeared to be the superior choice until its heat issues were hit. The communication protocol (along with DDR memory support) was rather forward-thinking as well.

The Athlon era probably also saw the smallest gap between AMD's and Intel's manufacturing that has ever existed.
 

IntelUser2000

Elite Member
Oct 14, 2003
If the Pentium III had come with actual architectural enhancements, it would have had a much better chance against the Athlon; instead, AMD continuously brought improvements while Intel went from PII to PIII with essentially nothing new. The Pentium III brought absolutely NO improvements that would benefit existing applications (which is why SSE doesn't count). It was even a step backwards from the mobile Pentium II, which already featured 256KB of on-die cache; granted, Coppermine's cache performed better, but on-die cache was introduced with the Pentium II.

It's like assuming that if Intel had never released Nehalem, AMD would be beating Intel even in desktops by now.

Regarding manufacturing process, it looks like we have two contenders for the closest ever. The first 0.18 micron Intel chip was the mobile Pentium II. The Coppermine-based Pentium III released over five months later.

Pentium II mobile 0.18u - 6/15/1999
Pentium III Coppermine 0.18u - 10/25/1999
AMD Thunderbird 0.18u - 11/29/1999
AMD K6-3 0.18u - 4/18/2000

The 90nm Prescott, with its delay of six months or so, also pushed back the entire process schedule. It was supposed to be out by the summer of 2003, not February of 2004.

Mobile Athlon 64 processors were the first AMD CPUs to reach 90nm, in August 2004, with the rest of the family following in November.

Intel seems to have increased its process-technology lead over everyone else since then.

"Phil" seems to be my mistake. It's supposed to be IDC(surprise, and sorry if I accidentally revealed it if its true).
 

jvroig

Platinum Member
Nov 4, 2009
"Phil" seems to be my mistake. It's supposed to be IDC(surprise, and sorry if I accidentally revealed it if its true).
Ah. Ok, got it.

The history lessons are very much appreciated. That more than satisfies the first part of my question.

As for the second part, about whether to expect Bulldozer/Bobcat to one-up Intel, nobody seems to have expressed an opinion on how likely that is. How about it? It seems that the historical trend (among other things, I am sure) is in Intel's favor.
 

Idontcare

Elite Member
Oct 10, 1999
"Phil" seems to be my mistake. It's supposed to be IDC(surprise, and sorry if I accidentally revealed it if its true).

Oh, actually that is true too, my name is Phil. It's just that I am so used to being referred to as IDC around here that my immediate assumption when I read your post was that there was some other Phil you had in mind who might/should/ought to be responding, as a joke or something, given the smiley emoticon you used.

So, who's the "Phil" person who's supposed to respond as well, according to IntelUser2000?

Mystery solved ;)
 

jvroig

Platinum Member
Nov 4, 2009
Oh, actually that is true too, my name is Phil. It's just that I am so used to being referred to as IDC around here that my immediate assumption when I read your post was that there was some other Phil you had in mind who might/should/ought to be responding, as a joke or something, given the smiley emoticon you used.

I was about to make a joke (in hindsight, I'd say a very poor one) about how you couldn't possibly be a "Phil", but decided against it. In light of your latest post, I am relieved I didn't. I am sorry it even crossed my mind, Mr. IDC/Phil. Mystery solved indeed... which leads to the other mystery of how IntelUser2000 guessed your name out of a million possible names? :D
 

Idontcare

Elite Member
Oct 10, 1999
I was about to make a joke (in hindsight, I'd say a very poor one) about how you couldn't possibly be a "Phil", but decided against it. In light of your latest post, I am relieved I didn't. I am sorry it even crossed my mind, Mr. IDC/Phil. Mystery solved indeed... which leads to the other mystery of how IntelUser2000 guessed your name out of a million possible names? :D

Trust me when I say I wouldn't have held it against you; if you knew my family, then you'd know there is no way possible you could find a new way to use my name to offend me :p

IntelUser2000 and I go back a ways; he knows more of the less-public details about me by way of PM, etc. A few people here do, but not many. At any rate, it was the first time he's referred to me by my name, hence the confusion on my part. No confusion when someone refers to "IDC" though ;)

Anyways, enough about me, let's get back to bitching about Intel's suckage in this AMD Bulldozer thread :p lol ;)
 

evolucion8

Platinum Member
Jun 17, 2005
It's like assuming that if Intel had never released Nehalem, AMD would be beating Intel even in desktops by now.

Not beating it, but probably so close that only pricing would make the difference between the two, with the slight edge in performance going to Intel. The Phenom II X4 is quite competitive with the Kentsfield and Penryn quads. I'm quite sure that if AMD had been able to get Phenom II out in 2006, Nehalem would be much faster today than it is now.
 

JFAMD

Senior member
May 16, 2009
As for the second part, about whether to expect Bulldozer/Bobcat to one-up Intel, nobody seems to have expressed an opinion on how likely that is. How about it? It seems that the historical trend (among other things, I am sure) is in Intel's favor.

Much of Intel's market position has less to do with having better products and more to do with the way they used to do business.

This has changed.

More importantly, you are asking about 2 architectures for 2011 and asking people to extrapolate, based on what they know today, whether one will be better than the other, without explaining HOW one is supposed to be better than the other. What is better? Is it price? performance? power efficiency? virtualization?

Intel fans will tell you Intel is better. AMD fans will tell you AMD is better. Because you are already predisposed to say that Intel has the advantage, I think you already have your answer.

I personally believe that 2011 will be an interesting year, but I only have half of the data. However, I would propose that half of the data may be more than many of the others giving opinions have.

Instead, I recommend digging into what you are going to use the processor for and how the architecture will aid or inhibit that. Then you will have a clearer answer as to which is the better architecture.
 

jvroig

Platinum Member
Nov 4, 2009
Instead, I recommend digging into what you are going to use the processor for and how the architecture will aid or inhibit that. Then you will have a clearer answer as to which is the better architecture.

Hello, Mr. Fruehe. When I put forth the question, rather haphazardly, I completely forgot you were here. I am not sure how to respond, save for being a bit embarrassed. You put a lot of time and thought into your reply, so I am thankful. I hope I can return the favor.

Let me try to put my question in better context. Until very recently, I was uninterested in hardware matters. I am a software guy at heart. I create programs and frameworks and "enterprise" modules, chat with accountants and executives, manage a few IT people, etc. You get the picture. I am not completely detached from hardware, for obvious reasons, but I am not too deeply interested in it. So it doesn't come as a surprise that I am (was) one of the people who had the idea that Intel and AMD were on a virtual racetrack, overtaking each other every now and then.

When I started to get more deeply interested in hardware, thanks to some of my software R&D interests that are intrinsically bound to hardware, I eventually learned of the discrepancy between the financial states of AMD and Intel (circa 2008-2009). From there, I wondered how AMD could catch up, considering the chasm between the two companies' R&D budgets.

And then, yesterday, imagine my surprise when I read the article I linked to, which outlined in very specific detail how the whole "leap-frogging" thing between AMD and Intel is a myth, and how Intel is pretty much just competing with itself. I am not familiar with such history, so I threw it out here and asked for feedback. Additionally, I asked what that could mean for the future. My question was merely a generic one: if AMD, over a span of years upon years, was never able to overtake Intel except when Intel got lazy, is it possible for AMD to still eventually overtake Intel? I guess everything is possible, but is it likely?

If anything, I am probably just seeking some reassurance that in the high-tech race between AMD/Intel, although history has shown mostly an Intel triumph, nothing is etched in stone, R&D budgets notwithstanding.

Much of Intel's market position has less to do with having better products and more to do with the way they used to do business.

This has changed.
Yes, and I do hope this makes a lot of difference. I am outside of the US. As an AMD proponent of sorts since the Athlon X2, the biggest obstacle I find is the difficulty of sourcing AMD parts. One of the biggest (if not the biggest) PC store chains here is 100% Intel-only. While the rest of the popular PC chains do carry AMD products, sometimes the listings are laughable: a deluge of Intel parts compared to a trickle of AMD ones.

(As a side note, things have improved a little. Although we still generally can't find, for example, Phenom II 965's here, the cheap Athlon II X4 surprisingly made its way here rather fast, and tends to fly off the shelves rather quickly in my estimation)

So I do hope that AMD's latest agreement with Intel regarding business practices becomes something worth much more than the $1.25 billion in cash from Intel.

Intel fans will tell you Intel is better. AMD fans will tell you AMD is better. Because you are already predisposed to say that Intel has the advantage, I think you already have your answer.
I would love to be told, with good reasoning behind it, that AMD will definitely "kick ass", as JHH would put it. I still recommend AMD parts whenever the corporations I consult for need to roll out low-cost PCs. I can justify it thanks to the value AMD parts give right now, especially the latest Athlon II X4 620. I can't seem to order/recommend enough of it.


I personally believe that 2011 will be an interesting year, but I only have half of the data. However, I would propose that half of the data may be more than many of the others giving opinions have.
If you can share any of it, I would very much appreciate it.
 

JFAMD

Senior member
May 16, 2009
The general problem that I see in comparing architectures is that there are 2 general methodologies:

1. Run a few client-based programs that run almost completely out of cache or rely on video card configurations and then declare a "winner"

2. Actually figure out what you need and run your software with your data.

The "leapfrogging" happens in #2. As to synthetic benchmarks that do not reflect actual server loads (I am a server guy, I won't even attempt to say what happens on the client side), generally speaking, they are a waste of time on servers.

Typically I find that when people evaluate servers they are looking at the following:

1. Value - What am I going to spend, and what am I going to get for that? Top-bin-speed processors tend to garner ~1-5% of total shipments. Why is this? Because the extra performance does not justify the cost. With 95%+ of buyers looking at something other than peak performance, it is a shame that so much of the time in reviews is spent focusing on performance. My guess is that this is mainly because numbers are easy to quantify and "value" becomes far more difficult.

2. Power - And not just because power is a cost. The server guys generally do not pay for power; that falls to facilities. Maybe they get hit with a charge-back. The real reason is that they are running out of power capacity in the data center, so they need to focus on getting more systems in under the same power budget. Not an easy thing to do. More performance at the same power level is very appealing.

3. Concurrency - How well does the system multi-task? Server workloads are almost always multi-threaded, and there are plenty of things going on at any given second. This will only get worse.

4. Manageability - With a data center full of servers, IT folks spend most of their time dealing with software, not hardware. Hardware is robust; it doesn't generally present problems. How many system images are there, how often do you update drivers, and what is the schedule and cadence of new releases?

5. Virtualization - Most workloads are being virtualized today, so virtualization efficiency will be critical. NOT virtualization performance. Customers have, on average, 5-10 VMs on a single server, with generally 4-8GB of memory per VM. Currently VMmark, the most-used virtualization benchmark, shows configurations of 80-100 VMs per server with maybe 512MB-750MB of memory per VM. That is not a realistic way to gauge performance.
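
As a rough back-of-the-envelope illustration of why those two profiles differ, here is a minimal Python sketch; the VM counts and per-VM memory sizes are simply the ranges quoted above, and the comparison is only illustrative:

```python
# Back-of-the-envelope comparison of the two virtualization profiles described
# above. Figures are just the ranges quoted in the post, nothing more.

typical_customer = {"vms": (5, 10), "mem_gb_per_vm": (4.0, 8.0)}     # real-world deployments
vmmark_style     = {"vms": (80, 100), "mem_gb_per_vm": (0.5, 0.75)}  # benchmark-style setup

def total_memory_gb(cfg):
    """Return (low, high) total guest memory in GB for a configuration."""
    low = cfg["vms"][0] * cfg["mem_gb_per_vm"][0]
    high = cfg["vms"][1] * cfg["mem_gb_per_vm"][1]
    return low, high

print("typical customer:", total_memory_gb(typical_customer), "GB")  # (20.0, 80.0) GB
print("VMmark-style    :", total_memory_gb(vmmark_style), "GB")      # (40.0, 75.0) GB
```

The total memory footprints land in roughly the same ballpark, but the benchmark-style setup spreads that memory across roughly ten times as many, much smaller VMs, which is exactly why it says little about how a handful of 4-8GB guests will actually behave.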

If you look at where Bulldozer will be in 2011, you'll see 16-core server processors optimized for power efficiency, concurrency, and virtualization. And AMD has always been the value choice for processors and platforms.

If you can strip out the hype and get down to what really matters to a server customer, you find that the business is not about leapfrogging and not about raw benchmark performance, it is about who can drive the most efficiency and value in their products.

If you look at AMD today, you will see that. If you look at AMD in 2010 with Magny-Cours and Lisbon, you will continue to see that. And in 2011, when the Bulldozer architecture is here (it plugs directly into the Magny-Cours and Lisbon sockets), you will see us take the ideas of value and efficiency to new levels. Performance may ping-pong back and forth, but true value has been consistently on AMD's side. And I believe it will continue.
 

jvroig

Platinum Member
Nov 4, 2009
Apologies. You are a server guy, talking about servers. I am just a regular guy, talking about desktops. I guess I made this mistake when I failed to distinguish between the desktop and server markets and more or less issued an unnecessary blanket statement.

I have no doubt AMD's position is a lot better in servers compared to their position on the desktop. In fact, I just read the latest IT article on AnandTech by Johan; he still has a lot of praise for AMD.
 

DrMrLordX

Lifer
Apr 27, 2000
I often get confused about that as well. The slides from AMD seem to clearly indicate 1-10W, or so I thought, but upon reviewing those that I can find, all I found was a phrase saying "scale as low as 1W", and then in a different slide it was billed as "sub-1W capable". If AMD can't get it straight, no surprise we can't either (see the two images below; the first says "as low as 1 watt" while the second says "sub 1 watt")

Indeed.

I might add that seemingly-floating preliminary specs like that make me worry that we're looking at a part that will be delayed and/or short of expectations at release. Granted, it may be that AnandTech has done more to foster confusion through the use of old slides than anyone else, but...

Oh well! All will be answered once review samples get out or once ES chips leak. Knowing AMD, we probably won't see many informative leaks like we do with Intel ES chips.
 

zsdersw

Lifer
Oct 29, 2003
Performance may ping-pong back and forth, but true value has been consistently on AMD's side.

It has not. Performance per dollar per watt across a broad range of server workloads has not always been in AMD's favor.

I realize you're a marketing guy for AMD, and all, but really...
 

cbn

Lifer
Mar 27, 2009
I just hope AMD can start regularly beating Intel to smaller manufacturing processes. An efficient microarchitecture is one thing, but smaller nodes really help a lot.

Maybe AMD will be able to popularize smaller desktop CPUs (that scale well with voltage) for gaming, since they have Eyefinity and LCD prices are dropping so quickly.

Is the AMD Mini-DTX form factor (which can support CrossFire) the smart money or not? (I am not aware of all the details of the specification.)

With multiple high-resolution 60 Hz LCDs, is the CPU really as important as it once was back in the single-core, high-refresh-rate CRT days?
 

Idontcare

Elite Member
Oct 10, 1999
I just hope AMD can start regularly beating Intel to smaller manufacturing processes. An efficient microarchitecture is one thing, but smaller nodes really help a lot.

That would be interesting, and it definitely would benefit AMD if it happened. Has it ever happened, though? Ever?
 

jvroig

Platinum Member
Nov 4, 2009
I am under the impression that Intel's microprocessor R&D and fabs are unmatched. If so, then there's no way AMD can start regularly beating Intel to smaller nodes. Beating them in architecture, perhaps. But on manufacturing tech itself, it seems unlikely to me.
 

tommo123

Platinum Member
Sep 25, 2005
Well, I imagine that if IBM's 'alliance' can't keep up with Intel, then it's unlikely.

Would IBM, Samsung, etc. combined even be enough to keep up with Intel?
 

jvroig

Platinum Member
Nov 4, 2009
If Intel keeps on earning tons of cash as they always have, and barring any tremendous misstep on their part, it seems unlikely.

IBM and Samsung are both major players... it's just that, unlike Intel, Samsung isn't 100% focused on chips, and IBM has turned part (or the majority, I don't know the extent) of its business into an Accenture-like consulting firm.
 

JFAMD

Senior member
May 16, 2009
There was an earlier thread that stated that only 10% of Intel's desktop business is quad-core. So isn't all of this "who's beating whom" mainly a philosophical discussion that doesn't really reflect the market?

Does it matter who gets to the next node faster if 90% of the market is still buying the old node products?