Ars: AMD may be irrelevant


pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Why do you think the K series aren't enthusiast CPUs?

Because they have huge useless on-die graphics that no enthusiast uses? Beyond being capable of playing 1080p video, there's no reason for big on-die graphics for enthusiasts with discrete GPUs -- although even that can be handled by today's very efficient discrete GPUs. Intel's 1155 line is laptop and low-end first: it offers great perf-per-watt and good on-die graphics performance, and for good reason -- Intel sells far more Sandy and Ivy laptops than it does desktops.

Most enthusiasts don't care about better on-die graphics until it can replace their discrete GPU, and they sure as hell don't care about perf-per-watt and power efficiency if it means lower clock speeds and fewer threads/cores for the sake of a crappy on-die GPU. People are complaining that Trinity's graphics can't play 1080p games at high settings; they're not complaining that "it's too efficient."

It isn't just Intel, though. AMD sells APUs primarily meant for mobile, and their AM3+ chips are just the server chips rebranded (minus the MCM).
 
Aug 11, 2008
10,451
642
126
Vishera? You've got 2-module to 4-module parts on AM3+. Not that I think it's going to be a good processor. In fact I think it's going to underperform just like Bulldozer (minus the horrendous power consumption, which Vishera should improve upon), but that doesn't matter.

I think a lot of you need to realize that neither Intel nor AMD are making enthusiast processors anymore. The notion that if either one of them fails on the desktop it will somehow be run out of business is quite funny. How many people buy desktops nowadays? And how many of them update their hardware frequently? How is that small percentage of users going to impact their sales, exactly? AMD still holds 30% market share on the desktop and they haven't been able to compete since the Athlon 64 X2. That should tell you something...

Whether AMD can make it out of their slump depends on whether they can improve upon Bobcat and produce ULV chips in the 17W and under range. It has nothing to do with IPC, clock speed or cores.

You are correct about the "desktop" market if you mean desktops for individual consumer use. However, even though it may not be growing much, there is a huge market for servers and enterprise computers for business use. Perhaps one day those will all be ARM and low power, but if it happens, I think it will be quite a few years away. The problem for AMD (at least one of them) is that they have lost a huge amount of market share in the server market, where the most money is made.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
The problem for AMD (at least one of them) is that they have lost a huge amount of market share in the server market, where the most money is made

Oh yeah, they've actually dipped below ARM there -- or they might be tied. Either way, they're far too close, at roughly 5% or less of the overall server market. But efficiency and perf-per-watt are king in servers just as they are in mobile, so in that sense their focus should remain the same as far as those two markets are concerned. Granted, Intel has been chasing the same thing with Haswell and Ivy.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Because they have huge useless on-die graphics that no enthusiast uses? Beyond being capable of playing 1080p video, there's no reason for big on-die graphics for enthusiasts with discrete GPUs

Not this argument again.

Just disable the iGPU. Problem solved. Sheesh.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Not this argument again.

Just disable the iGPU. Problem solved. Sheesh.

The point is that it's taking up valuable die space and TDP that could be used for CPU improvements rather than for a crappy on-die GPU most people here don't want or care about. Would you rather have a crappy on-die HD4000 that can't game and is overkill for 1080p video, or a cheaper processor? Maybe a higher-clocked chip as well?

Quit twisting the argument. I'm not the fanboy here. I'm just pointing out the hypocrisy in your argument.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The point is that it's taking up valuable die space and TDP that could be used for CPU improvements rather than for a crappy on-die GPU most people here don't want or care about. Would you rather have a crappy on-die HD4000 that can't game and is overkill for 1080p video, or a cheaper processor? Maybe a higher-clocked chip as well?

Quit twisting the argument. I'm not the fanboy here. I'm just pointing out the hypocrisy in your argument.

Disable it and it's not taking any power; it will all be available to the CPU. What do you think would be used in that die space instead?

And the "cheaper processor" is a non-starter. Any cost savings that may be saved by not having a GPU will be eaten up - and then some - by having a special low volume "enthusiast" sku.

You have zero evidence that having an on-die GPU is in some way preventing CPU improvements from occurring. At least Intel isn't doing an AMD and chopping half the cores off to make room for the GPU.
 

Makaveli

Diamond Member
Feb 8, 2002
4,976
1,571
136
The point is that it's taking up valuable die space and TDP that could be used for CPU improvements rather than for a crappy on-die GPU most people here don't want or care about.

Question about this point!

Why do they need to use up die space for CPU improvements when their middle- and low-tier offerings are already giving the competition enough trouble?

I think everyone knows AMD has Intel beat in the GPU area for now because of buying ATI. The better question is how long that lead will last. Intel has enough money and brains to eventually get a good GPU out.

Even with a GPU on die you will need a CPU to push it. And this is one of the reasons why, in their GPU reviews, they ask reviewers to use Intel CPUs: to show their chip in its best light.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Disable it and it's not taking any power; it will all be available to the CPU. What do you think would be used in that die space instead?

And the "cheaper processor" is a non-starter. Any cost savings from not having a GPU would be eaten up - and then some - by having to produce a special low-volume "enthusiast" SKU.

You have zero evidence that having an on-die GPU is in some way preventing CPU improvements from occurring. At least Intel isn't doing an AMD and chopping half the cores off to make room for the GPU.

There are two sockets, FM2 and AM3+. FM2 is where people who want an integrated solution buy their products; AM3+ is CPU-only.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
You have zero evidence that having an on-die GPU is in some way preventing CPU improvements from occurring. At least Intel isn't doing an AMD and chopping half the cores off to make room for the GPU.

That's exactly what they're doing...

[Image: Core i7 LGA 2011 die shot]


And it's not like it suffers from a lower clock speed. More L3, same IPC, better IMC. Larger die, yeah, but it's not like they're only making chips with on-die graphics, is it? Why are you buying the mobile-first chips? Don't you want the enthusiast-level 2011 monsters?

Having no on-die GPU means more dies per wafer, more lenient binning (the GPU is the most transistor-dense part of the chip) and lower prices passed on to you and me, without the dead weight.
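As a rough back-of-the-envelope sketch of that point (the die area and the iGPU's share of it below are assumed round numbers, not Intel's actual figures), the textbook dies-per-wafer approximation shows how much a CPU-only die would gain per 300 mm wafer:

[CODE]
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Textbook approximation: wafer area over die area, minus an
    edge-loss term proportional to the wafer circumference."""
    wafer_area = math.pi * (wafer_diameter_mm / 2.0) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Assumed, illustrative numbers only: a ~216 mm^2 quad-core die where the
# iGPU takes roughly 20% of the area. Round figures, not official specs.
full_die = 216.0
cpu_only_die = full_die * 0.80

with_gpu = dies_per_wafer(full_die)
without_gpu = dies_per_wafer(cpu_only_die)
print("Dies per 300 mm wafer with iGPU:   ", with_gpu)
print("Dies per 300 mm wafer without iGPU:", without_gpu)
print("Extra candidate dies per wafer:    ", without_gpu - with_gpu)
[/CODE]

Under those assumptions you get roughly a quarter more candidate dies per wafer before yield and binning even enter the picture, which is the cost lever I'm talking about.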

Why do they need to use up die space for CPU improvements when their middle- and low-tier offerings are already giving the competition enough trouble?

I think everyone knows AMD has Intel beat in the GPU area for now because of buying ATI. The better question is how long that lead will last. Intel has enough money and brains to eventually get a good GPU out.

Even with a GPU on die you will need a CPU to push it. And this is one of the reasons why, in their GPU reviews, they ask reviewers to use Intel CPUs: to show their chip in its best light.

I agree wholeheartedly. Intel is playing catch-up on the GPU side and will likely match and even surpass AMD, particularly at the lower end in mobile, 17W and under and perhaps 35W too, depending on how they decide to spread around their GT4 SKUs. Haswell looks to be an efficiency monster. On the desktop, Haswell's on-die GPU will still fall well short of discrete-level gaming. Where AMD's on-die GPUs are more impressive at higher TDPs, Intel fares much better in lower-TDP mobile.

My point was to highlight the hypocrisy of his posts, and those of others like him, who scold AMD for making an on-die GPU that's "not good enough for 1080p gaming" but completely ignore the crappiness that is HD3000/HD4000 and the ever-increasing size of that crappy on-die GPU. And those people who claim the desktop enthusiast segment still drives the market... oh boy.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
Why do you think the K series aren't enthusiast CPUs?

While unlocked, they are targeted as mainstream. Here is your answer:

[Image: AMD 2012 CPU roadmap slide]


PS: For the same money you spent on your i5, I could get an A10 plus an SSD and truly get my work done faster... wanna pit them against each other?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
While unlocked, they are targeted as mainstream. Here is your answer:

[Image: AMD 2012 CPU roadmap slide]

1. I wasn't talking about AMD. They haven't had an enthusiast CPU in years.
2. I don't pay attention to marketing slides. They are created to deceive the uninformed.

For the same money you spent on your i5, I could get an A10 plus an SSD and truly get my work done faster... wanna pit them against each other?

Well, I don't have an i5, I have an i7.

But yes, let's have a little contest. Let's edit and author 60 minutes of raw AVCHD cam footage into a DVD. I'll bet I'm done and having a beer while you're still waiting for the render to complete.

But I'll even run it on my kid's i5 -- he has a single spinner -- and it will still beat your SSD-equipped A10.

What other "work" would you like to do? Compile an Eclipse project? I'll smoke your A10 in that too. Publish a technical document to web? I get to go home early. You on the other hand are working overtime without pay.

Are you a game dev? Let's import and render a game level in Maya. Or we could compile a BSP tree. Since time = $$, I'm actually ahead of you with my so-called more expensive investment.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Correct me where I'm wrong:

Intel only makes two chips (at least in the desktop space). They have a quad-core that starts as an i7 and can have Hyper-Threading and cache disabled to become an i5, and they have a dual-core, which starts as an i3 and can be cut down to a Pentium or Celeron chip.

AMD builds two chips as well. They have their FX chips, which are all natively 4-module; the FX-4xxx chips have two of those modules disabled, the FX-6xxx only one, but otherwise AMD leaves all features intact. They also have their APUs, in this case Trinity, a native 2-module chip with varying amounts of graphics hardware (and cache?) disabled.

The fewer production lines, the better. The FX-4100 wouldn't be cheaper if AMD made it natively a 2-module chip with no iGPU, because they'd have to run a completely different production line?

So yeah I'll take an i5 K chip with an iGPU if it's cheaper than one at the same clocks without the GPU.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
My point was to highlight the hypocrisy of his posts, and those of others like him, who scold AMD for making an on-die GPU that's "not good enough for 1080p gaming" but completely ignore the crappiness that is HD3000/HD4000 and the ever-increasing size of that crappy on-die GPU. And those people who claim the desktop enthusiast segment still drives the market... oh boy.

Do people here make the Intel GPU out to be the greatest thing ever? Does Intel market their CPUs almost exclusively on their GPU performance? No, the Intel GPUs are so bad nobody talks about them.

And who in this thread has ever said the enthusiast drives the market? If you mean me, do you not know what I do for a living?
 

sequoia464

Senior member
Feb 12, 2003
870
0
71
Just re-read that article... you're correct, it's kind of meaningless without a source.

Do you take your coffee black?
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Do people here make the Intel GPU out to be the greatest thing ever? Does Intel market their CPUs almost exclusively on their GPU performance? No, the Intel GPUs are so bad nobody talks about them.

And who in this thread has ever said the enthusiast drives the market? If you mean me, do you not know what I do for a living?

Their GPU works well enough to power netbooks... seems like something to be happy with. It also works well with the idea of switching between an iGPU and a discrete GPU.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
1. I wasn't talking about AMD. They haven't had an enthusiast CPU in years.
2. I don't pay attention to marketing slides. They are created to deceive the uninformed.



Well, I don't have an i5, I have an i7.

But yes, let's have a little contest. Let's edit and author 60 minutes of raw AVCHD cam footage into a DVD. I'll bet I'm done and having a beer while you're still waiting for the render to complete.

But I'll even run it on my kid's i5 -- he has a single spinner -- and it will still beat your SSD-equipped A10.

What other "work" would you like to do? Compile an Eclipse project? I'll smoke your A10 in that too. Publish a technical document to the web? I get to go home early. You, on the other hand, are working overtime without pay.

Are you a game dev? Let's import and render a game level in Maya. Or we could compile a BSP tree. Since time = $$, I'm actually ahead of you with my so-called more expensive investment.

Sounds good. I am sure you are not against Corel VideoStudio Pro X5 or ArcSoft Media Converter for the AVCHD conversion, as they feature OpenCL, and that Radeon HD7950 can be put to good use. You don't think the HD7950 might be scared of the i7's floating-point performance, do you? I'll be drunk by the time you open your first beer... Wait, that will be a tie, as you just *finally* got an HD7850. Remember, it is a system, and the difference in price between the CPUs allows room for improvements in other areas. Even without the HD7950, the A10 can leverage OpenCL.

Yeah, let's publish a document to the web. PDF-XChange does it quite a bit faster than Acrobat, and it is multithreaded, so quad-core CPUs from both brands are very close. But wait, you need to grab all the data before you leave to make a minor change in the layout. How about merging the files? How is that spinner doing?

No, I don't do game development, but how about working with a ton of technical specs? You know, something as trivial as opening them, reading something and making some little calculations that get dumped into another spreadsheet. Save, open, save... you get the point. How is that spinner doing? Merging those files not so hot?

Going home early? Wait, we need to verify some dimensional data in the assembly models' math data. Launch Unigraphics. Especially in this one, how is that spinner doing? I'll tell your family that you'll be really late.

For every example you give me of a "CPU-demanding app," I can give you two that are more dependent on data throughput. At the end of the day, the SSD makes a more perceptible impact on performance... and the SSD and better video card can be enjoyed after work. Ideally, we could have the very best of every component, but how often is that true?
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Why do you think the K series aren't enthusiast CPUs?

Because the 1155 processors aren't enthusiast CPUs. Just because you unlock the multiplier doesn't make it an enthusiast CPU. It just makes it a laptop chip with an unlocked multiplier that enthusiasts will buy :colbert:
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
A perfect summary of A10 was done by SA:
Today is a special day, it's only a few times a year that we get to see new processors try and carve out a place for themselves in the desktop market. In our Trinity Preview last week we found that AMD's mainstream wonder chip has enough GPU horsepower to provide a reasonably high quality 1080P gaming experience. Today we're aiming to find out if the CPU side of Trinity can keep pace with Intel's Ivy Bridge. The short story is that it can't; it's not even close. The slightly longer story is that it doesn't matter, in the real world, Trinity still ties with Ivy Bridge at everything. The things that were so fast they just happened are still so fast that they just happen. The things that are so slow that you can get up and grab a cup of coffee are still so slow that you can get a cup of coffee. This is true for both CPUs, but Trinity costs far less and has vastly better graphics too.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Because the 1155 processors aren't enthusiast CPUs. Just because you unlock the multiplier doesn't make it an enthusiast CPU. It just makes it a laptop chip with an unlocked multiplier that enthusiasts will buy :colbert:

So what makes a CPU an "enthusiast" CPU? The one you pay the most money for? Or the one you get the most money's worth out of?
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
So what makes a CPU an "enthusiast" CPU? The one you pay the most money for? Or the one you get the most money's worth out of?

One with high IPC, high clock speed and preferably more cores at a higher TDP, one that doesn't sacrifice any of those for the sake of on-die graphics.

Would you rather have a cheaper 2500K with no on-die graphics or a more expensive fatter 2500K with HD3000?

Or the other route...

Would you rather have an extra two cores and a higher clock speed, thanks to the more lenient TDP ceiling you'd get because the HD3000 isn't there?

An unlocked multiplier does not an enthusiast processor make


AMD's AM3+ chips aren't "enthusiast processors" just because they're all unlocked. They're server chips, just like their APUs are laptop chips. Same with Intel: 1155 is mobile-first while 2011 is workstation/server, although the 2011 platform fits that "enthusiast" mold much better than any of the others.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
I can't understand why you're going down this road when all you're doing is projecting your personal opinion.
To actually answer your question, I'd choose the Intel chip with integrated graphics for one sole reason: so that if I had to, I could use it, for whatever reason -- in between discrete graphics card purchases, a failed discrete GPU, an RMA.
It's my opinion, just as you are displaying yours, but don't go on about it like your opinion is the only way and the "right" way. It might be for you, but that's as far as it goes, aside from the few who may share your outlook. K?