You think the desktop price/perf of Phenom is bad?

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Long story short,

http://www.phoronix.com/scan.p...tem=opteron_2356&num=8

As far as pricing goes, a single AMD Opteron 2356 will set you back $670 USD. On the Intel side, their 2.33GHz Xeon E5410 Harpertown only costs $284.99 and their 2.66GHz part will run you $500 USD.

Who on earth does AMD expect to buy this stuff? I mean, on the desktop you have people going 'meh, close enough, I don't mind throwing AMD a few more bucks for a bit less performance.' But people buying enterprise servers are DEFINITELY expected to do some objective measurements.

A 360MHz clock disadvantage vs. a 45nm Intel quad, combined with a whopping 35% price premium for the 2.3GHz 2356 vs. the 2.66GHz Xeon? Plus a performance/watt disadvantage?

Does AMD just want to shrink their server market share to 0 ASAP? What am I missing here? Do they expect to only sell 4 way quads?
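The arithmetic behind those premiums, as a quick sketch using the list prices quoted above:

```python
# Price-premium math on the USD list prices quoted above.
opteron_2356 = 670.00  # 2.3 GHz Barcelona
xeon_e5410 = 284.99    # 2.33 GHz Harpertown
xeon_266 = 500.00      # 2.66 GHz Harpertown

for name, price in [("Xeon E5410", xeon_e5410), ("2.66 GHz Xeon", xeon_266)]:
    premium = (opteron_2356 / price - 1) * 100
    print(f"Opteron 2356 premium over {name}: {premium:.0f}%")
```

Against the 2.66GHz part that's the ~35% premium; against the E5410 it's well over 100%.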
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Performance per watt disadvantage?
Possibly, but possibly not as much as you think, given the Intel systems use FB-DIMMs, which suck juice.
Also, the E5410 is rated at 80W TDP and the 2356 at 75W TDP. So I really don't know what you're on about.
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
That's a good point about the FB-DIMMs, but he's still right about the overall cost of the CPUs versus their performance. 5W TDP difference per processor isn't much ...

~MiSfit
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Lonyo
Performance per watt disadvantage?
Possibly, but possibly not as much as you think, given the Intel systems use FB-DIMMs, which suck juice.
Also, the E5410 is rated at 80W TDP and the 2356 at 75W TDP. So I really don't know what you're on about.

OP was talking about chips; you're talking about the whole system. Even then, you're only addressing one of the three grounds for his conclusion (price, performance, performance/watt).
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Well, comparing just specs, we see the E5440 Xeon (a 2.83GHz part) is also 80 watts. Considering it's got a 23% clock rate advantage, even if we disregard the IPC advantage of the 45nm Intel quad, AMD has some ground to make up in the computation/watt department. Intel seems to be rating a very broad range of chips with that 80 watt label.
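A crude sketch of that computation/watt gap, using only rated clocks and TDPs (this ignores IPC entirely, so it understates the 45nm Xeon's actual lead):

```python
# Clock-per-watt proxy from the rated specs above. Ignores IPC, so it is
# a lower bound on the Xeon's efficiency edge, not a real benchmark.
parts = {
    "Xeon E5440 (2.83 GHz, 80 W)": (2.83, 80),
    "Opteron 2356 (2.30 GHz, 75 W)": (2.30, 75),
}
for name, (ghz, watts) in parts.items():
    print(f"{name}: {ghz * 1000 / watts:.1f} MHz/W")
```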

I'm also taking a hint from equivalent desktop performance. I've seen several benchmarks where a 9850 sucks up to 50 more watts at load than a Q6600, never mind the Q9450. I think something strange is afoot at the Circle K. I remember something about upcoming chip ratings using a 'typical TDP' as opposed to a maximum. Time to dig further...

Edit: 5 watts/CPU is a *huge* amount! If your data center has hundreds of CPUs, it could add up to capacity for several more servers on your existing power infrastructure... or not! Depending on how much revenue each system generates or supports (low or high, it depends) compared to the cost of expanding or upgrading your facility (huge), this is actually enough of a difference to matter.

Edit2: The 2.5GHz Harpertown for $349 and the $479 E5430 are also 80-watt-rated parts. Let's discuss performance/watt and performance/watt/$ using the E5430, as I originally intended.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,131
3,667
126
Are we comparing this to the $179.99 Q6600? Cuz at that price, there isn't any Phenom that's worth it.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
No, aigo, we're not. I'm genuinely curious about what I'm overlooking. What made the AMD marketing guys think that a Barcelona chip can command a 100% price premium over an equivalent Xeon? About the only thing I can think of is that they're pricing vs. Tigertons, not Harpertowns. But those would be the 8XXX series chips, which will presumably have a different price structure once released.

It can't be the FB-DIMM requirement. Tyan's dual-socket 771 nForce 3600 Professional and ServerWorks boards support run-of-the-mill registered ECC DDR2 just fine -- the exact same RAM they support on their Barcelona boards.

I can't come up with any explanation for this pricing structure other than 'we want out of the server business.' I was hoping YOU guys were smarter.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
You're right: based on price/performance, Intel bears away the bell this time. However, I have some more thoughts.

First, I doubt 5W/CPU makes much of a difference in the data center context.

Second, you know that for all intents and purposes AMD is the incumbent provider for many, many server OEMs and data centers. They established a huge beachhead in this area with their previous Opteron generation and gained a lot of goodwill thereby. I also think AMD's IMC provides a significant advantage over Intel in many server applications where access to arrays of data in memory is at a premium.

Benchmarks can't always reflect the benefits of a given platform. I say this as the author of a rather old book called The Benchmark Book, which discussed the old Linpack, SPEC and other scientific benchmarks in detail. Everybody in that business, as a coder or marketer, knows that benchmarks don't tell the whole story. Most of the time, even the best benchies are simply marketing tools and notoriously inaccurate for predicting real-world performance.

That being said, AMD's new Opterons, while grievously late and horribly launched, were something that a lot of OEMs were waiting for. For most data center applications, the Barcelona is More Than Good Enough even if Intel squeezes out 5% more Linpacks.

OTOH, Intel's use of FB-DIMMs, while expensive and power-hungry, does provide a specific benefit: isolation of memory errors. When a financial firm runs these things in data centers, they HAVE TO STAY UP 24/7/365. No downtime is acceptable. FB-DIMMs promote better system stability in mission-critical environments. I'm not familiar with how the AMD side deals with this issue, but there must be a competitive solution over there or AMD wouldn't have made so much progress in that market in the recent past.

I would personally consider a Barcelona platform for VMware work now that it's out in volume. For home? Nah. The positions are decisively reversed there, and I'm a very happy Intel user now.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: v8envy
Who on earth does AMD expect to buy this stuff? I mean, on the desktop you have people going 'meh, close enough, I don't mind throwing AMD a few more bucks for a bit less performance.' But people buying enterprise servers are DEFINITELY expected to do some objective measurements.

A 360MHz clock disadvantage vs. a 45nm Intel quad, combined with a whopping 35% price premium for the 2.3GHz 2356 vs. the 2.66GHz Xeon? Plus a performance/watt disadvantage?

Does AMD just want to shrink their server market share to 0 ASAP? What am I missing here? Do they expect to only sell 4 way quads?

My suspicion is that they are charging higher prices because AMD has relatively low supply of these parts at the moment. No sense selling 100 parts (just an example) for $200 if there are customers out there willing to pay $600 for the privilege of owning them.

As volume cranks up and supply increases, I'd naturally expect AMD to drop prices in line with the competitive landscape.

In the meantime, yes, this does mean AMD's overall market share in this segment is declining, as they only have 100 parts (for example) to sell at any price at the moment.
 

Owls

Senior member
Feb 22, 2006
735
0
76
Originally posted by: Dadofamunky
You're right: based on price/performance, Intel bears away the bell this time. However, I have some more thoughts.

First, I doubt 5W/CPU makes much of a difference in the data center context.

Second, you know that for all intents and purposes AMD is the incumbent provider for many, many server OEMs and data centers. They established a huge beachhead in this area with their previous Opteron generation and gained a lot of goodwill thereby. I also think AMD's IMC provides a significant advantage over Intel in many server applications where access to arrays of data in memory is at a premium.

Benchmarks can't always reflect the benefits of a given platform. I say this as the author of a rather old book called The Benchmark Book, which discussed the old Linpack, SPEC and other scientific benchmarks in detail. Everybody in that business, as a coder or marketer, knows that benchmarks don't tell the whole story. Most of the time, even the best benchies are simply marketing tools and notoriously inaccurate for predicting real-world performance.

That being said, AMD's new Opterons, while grievously late and horribly launched, were something that a lot of OEMs were waiting for. For most data center applications, the Barcelona is More Than Good Enough even if Intel squeezes out 5% more Linpacks.

OTOH, Intel's use of FB-DIMMs, while expensive and power-hungry, does provide a specific benefit: isolation of memory errors. When a financial firm runs these things in data centers, they HAVE TO STAY UP 24/7/365. No downtime is acceptable. FB-DIMMs promote better system stability in mission-critical environments. I'm not familiar with how the AMD side deals with this issue, but there must be a competitive solution over there or AMD wouldn't have made so much progress in that market in the recent past.

I would personally consider a Barcelona platform for VMware work now that it's out in volume. For home? Nah. The positions are decisively reversed there, and I'm a very happy Intel user now.

[Tom Cruise]
You don't know the history of datacenters. I do.
[/Tom Cruise]

All joking aside, 5W is indeed a big difference. I managed a DC a few years back and we looked closely at these things. Multiply 5W x 500 and that's how many extra watts you have to account for.
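A quick sketch of that arithmetic (the per-server draw is an assumed figure for illustration, not from the thread):

```python
cpus = 500           # CPU count from the 5W x 500 example above
delta_w = 5          # per-CPU TDP difference
server_draw_w = 250  # hypothetical average draw of one extra server

headroom_w = cpus * delta_w
extra_servers = headroom_w // server_draw_w
print(f"{headroom_w} W of headroom -> room for ~{extra_servers} "
      f"more servers on the same power infrastructure")
```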
 

Scotteq

Diamond Member
Apr 10, 2008
5,276
5
0
The actual answer depends on what you want the chips to do.

For general purposes, it's hard to ignore Intel's advantage in raw computing performance.

However, if the task at hand involves a lot of virtualization (or other memory intensive function), then an AMD chip's integrated memory controller and capacity for higher memory bandwidth is VERY hard to ignore.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
IDC, I think you've nailed it. They've probably done a yield vs. demand analysis and figured out they'd have to price their chips at zero margin, so they were better off selling far fewer chips at a much higher margin -- moving all the way to the right on the supply/demand curve.

As far as synthetic benchmarks sometimes painting a misleading picture -- very true. That's why you look at as many benchmarks as possible to get a general idea of performance. When the general benchmark trend is 'underperform,' it's reasonable to expect that from your applications as well, and unreasonable to expect the opposite. Nobody is going to buy strictly off a few benchmarks on the web -- but I expect analysis companies will do exhaustive research, including the apps prospective workstation and server CPU customers are interested in. I haven't seen any server load benchmarks where Barcelona outperforms Xeon by 20-100%+, have you?
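One standard way to boil 'as many benchmarks as possible' down to one number is the geometric mean of the per-benchmark ratios; a sketch with made-up ratios (NOT real Barcelona/Harpertown results):

```python
from math import prod

def geomean(ratios):
    """Geometric mean -- the usual aggregate for benchmark speedup ratios,
    since it treats a 2x win and a 2x loss symmetrically."""
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical per-benchmark ratios (Opteron score / Xeon score):
print(f"overall ratio: {geomean([0.85, 0.92, 1.05, 0.78]):.2f}")
```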

And goodwill is not something you risk your position for. You create a folder of documentation supporting your purchasing decision and go. Standing in front of a board of directors explaining why you took a chance on an as-yet-unproven AMD platform at twice the price, with 'I'm a fan of AMD' as the only justification, is not something rational people want to risk.

Incumbency is no big deal. If companies were progressive enough to ditch Intel and switch to AMD, they'll be just as progressive and proactive in doing the cost/benefit analysis to switch right back. They weren't progressive 2-4 years ago only to turn staunchly conservative now; they'll be just as interested in any competitive advantage now as they were then. In other words, their existing customers are the worst possible ones to expect irrational brand loyalty from.

 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: Scotteq
The actual answer depends on what you want the chips to do.

For general purposes, it's hard to ignore Intel's advantage in raw computing performance.

However, if the task at hand involves a lot of virtualization (or other memory intensive function), then an AMD chip's integrated memory controller and capacity for higher memory bandwidth is VERY hard to ignore.

Virtualization is not significantly more memory intensive than the apps running in the VMs, which is no different from running the apps natively; you simply gain isolation and hardware abstraction.

Run a VM with an idle machine instance inside -- no OS work being done -- and you'll notice hardly any page faults in the host system and nearly zero CPU load. Compare the number of page faults and the CPU load of the same applications running natively and inside a VM and you'll see at most a 1-5% difference.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
If I had to guess it's a combination of supply limitations and that AMD's chips have some clear advantages in the business server arena.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Originally posted by: Owls

[Tom Cruise]
You don't know the history of datacenters. I do.
[/Tom Cruise]

All joking aside, 5W is indeed a big difference. I managed a DC a few years back and we closely looked at these things. Multiply 5Wx500 and that's how many extra watts more you have to account for.

LOL. Yeah, you're right, that can add up. For some places, I think AMD's "platform" approach (i.e. drop-in upgrades) for Barcelona makes sense. But AMD's power ratings are screwy, or so the grapevine has it.
 

Pelu

Golden Member
Mar 3, 2008
1,208
0
0
Yeah... sounds like having a Phenom is no big deal... cheaper because a store was closing and selling everything for less, lol...

What I don't like about Intel is that from now on it's kinda tough to get an Intel board that's really good and supports CrossFire... AMD and ATI conspiring... and whatever you save on an Intel processor, you'll toss away on the GPU... those Nvidia cards are kinda more expensive...
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
On the server front AMD still has a fighting chance. New chips from Intel will be hard to beat, so the fight is definitely on.

I'm extremely happy with my CPU; it got to 2.9GHz on stock vcore, and that's a nice improvement for Phenom. Things are shaping up pretty well. Competition is great for us.
 

crimson117

Platinum Member
Aug 25, 2001
2,094
0
76
Originally posted by: v8envy
What made the AMD marketing guys think that a Barcelona chip can command a 100% price premium over an equivalent Xeon?
Because if you already have an Opteron-based server, it's easier to drop in a new Barcelona chip than to replace nearly the entire server with Intel parts.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: crimson117
Originally posted by: v8envy
What made the AMD marketing guys think that a Barcelona chip can command a 100% price premium over an equivalent Xeon?
Because if you already have an Opteron-based server, it's easier to drop in a new Barcelona chip than to replace nearly the entire server with Intel parts.

Good thing socket F doesn't have the same upgrade problems that AM2 had then.

In reality, how often does this happen? I mean, how many people build their own servers vs. lease or buy with support? Would, e.g., IBM still support your server if you replaced the CPUs? I wouldn't risk it; if I had software problems, the vendors would immediately start pointing fingers at the non-qualified hardware platform.

So it looks like availability/yields is still the front-runner theory for why Intel has the price/performance crown in spades. Things look favorable for the more mainstream TPC benchmarks as well, judging from MySQL performance.

And holy crapski, JRockit is nearly 50% faster than Sun's JVM on 64-bit for middle-tier-ish work! I may have to give that JVM another shot.