With the current rate of Intel CPU performance increases, could AMD be catching up?


Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
AtenRa just wants to keep the spotlight away from the fact that the FX CPUs are 140W CPUs instead of 125W.

Wasn't there some AMD mantra about perf/watt...once...it really mattered....not so much now I guess ;)

Funny how all sorts of new "fancy" metrics get swapped out like socks when the prime metric fails to reach the crown:

Performance.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Wasn't there some AMD mantra about perf/watt...once...it really mattered....not so much now I guess ;)

Funny how all sorts of new "fancy" metrics get swapped out like socks when the prime metric fails to reach the crown:

Performance.

There is something called Hubris and Nemesis, from one of the many memorable days in the AMD Marketing department:
http://shintai.ambition.cz/amd.pdf

But good idea, maybe the FX CPUs are 125W ACP ;)
 

Piroko

Senior member
Jan 10, 2013
905
79
91
If it consumes more than 77W/125W then it will be dissipating more than 77W/125W of heat. Any type of energy emitted by the CPU in other forms like electromagnetic radiation is small enough to be negligible. I don't know if you're arguing against that point but it sounds like you are.
When you turn on a 45W light bulb it will momentarily draw more than 1A at 230V, well over its rated wattage. It's still called a 45W light bulb because, averaged over a timespan, it is dissipating no more than 45W.
TDP is likewise averaged over a timespan; the actual power draw can still vary wildly as long as the average stays at or below the TDP.
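To put numbers on that averaging point, here is a minimal Python sketch with made-up per-second samples (all values hypothetical): the momentary peak blows past a 125W rating while the average stays under it.

    # Hypothetical per-second power samples, in watts.
    samples_w = [90, 180, 110, 60, 160, 95, 140, 70]

    peak_w = max(samples_w)                  # 180 W: a momentary spike
    avg_w = sum(samples_w) / len(samples_w)  # ~113 W: the averaged figure

    print(f"peak {peak_w} W, average {avg_w:.1f} W")
    # A 125 W figure is honored on average even though instants exceed it.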
Sure, a CPU can use more power than its rated TDP and still be within its specified temperature. But that brings into question what the TDP figure actually means.
The TDP was meant as a guideline for cooling designs. It does correlate with what you'll pay for running a PC at full load, but that's it.

AMD could have sold their processors at a higher price. Supply and demand and all.
In my opinion, the largest damage was in the form of mindshare. The bigger companies were buying Dells, HPs, IBMs and Fujitsu-Siemens PCs. None of them sold AMD PCs; all were carrying "Intel P4" stickers on their fronts. That made people sceptical about the quality of AMD PCs, because "companies only buy good stuff".
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
In my opinion, the largest damage was in the form of mindshare. The bigger companies were buying Dells, HPs, IBMs and Fujitsu-Siemens PCs. None of them sold AMD PCs; all were carrying "Intel P4" stickers on their fronts. That made people sceptical about the quality of AMD PCs, because "companies only buy good stuff".

AMD did themselves no favors there though. I owned AMD chips at every major CPU release from the 386 up to the X2, and trying to get non-buggy chipsets and non-buggy drivers was a real challenge.

It wasn't enough to deter me, but it was not something one could easily ignore either.

But when DELL did start selling AMD chips I told my dad it was OK to buy a DELL AMD rig...big mistake. That thing would routinely crash, the drivers were crap, and my parents never asked me for computer advice again. (which turned out to be a blessing in disguise :D)

But yeah, AMD's notorious product-quality at the time was definitely not a myth, it cost me my "computer-guru" reputation with my parents. Whether that was AMD's fault, DELL's fault, the mobo-maker's fault...doesn't matter, if you were a customer of AMD it was your reality after you bought the box and that made it AMD's mindshare problem.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
AMD did themselves no favors there though. I owned AMD chips at every major CPU release from the 386 up to the X2, and trying to get non-buggy chipsets and non-buggy drivers was a real challenge.

It wasn't enough to deter me, but it was not something one could easily ignore either.

But when DELL did start selling AMD chips I told my dad it was OK to buy a DELL AMD rig...big mistake. That thing would routinely crash, the drivers were crap, and my parents never asked me for computer advice again. (which turned out to be a blessing in disguise :D)

But yeah, AMD's notorious product-quality at the time was definitely not a myth, it cost me my "computer-guru" reputation with my parents. Whether that was AMD's fault, DELL's fault, the mobo-maker's fault...doesn't matter, if you were a customer of AMD it was your reality after you bought the box and that made it AMD's mindshare problem.

How about the Intel Graphics drivers?

And the Intel chipset bugs: SATA problems, USB problems, etc.

Not sure if Intel really is any better in this regard.

Also, if you were buying pre-built computers from major computer builders such as Dell/HP/Acer/etc, in most cases they come preloaded with unstable crapware and in some cases funky special driver versions. I've seen severe software and hardware problems on pre-built computers that my friends and relatives have bought, using both AMD and Intel CPUs (and related chipsets). So I'm not sure it's actually AMD/Intel causing the problems in many cases. I would suspect the root cause often lies elsewhere.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
When you turn on a 45W light bulb it will momentarily draw more than 1A at 230V, well over its rated wattage. It's still called a 45W light bulb because, averaged over a timespan, it is dissipating no more than 45W.
TDP is likewise averaged over a timespan; the actual power draw can still vary wildly as long as the average stays at or below the TDP.

I've been very careful to refer to sustained power consumption of a realistic load. I'm not talking about power-on transients or other momentary spikes, so what you're referring to isn't relevant. When IDC did his power consumption measurements, I don't think he was looking at maximums over short periods of time either.

The TDP was meant as a guideline for cooling designs. It does correlate with what you'll pay for running a PC at full load, but that's it.

It doesn't matter if the TDP is specified as a cooling requirement, although I do want to stress that: a requirement, not a guideline. It's not supposed to be some fuzzy metric that you can ignore entirely. If all other criteria are met (ambient temperature, junction temperature, etc.) the CPU is not supposed to expose more than this amount of heat to the HSF while running a realistic workload.

The only ambiguity is what qualifies as a realistic workload. Requiring the absolute maximum sustainable draw here would demand too much slack. I would argue that any software designed to accomplish useful work, and not carefully engineered as a power virus, should qualify. In the FX-8350's case I don't think you really need to cherry-pick software to find something that will make it greatly exceed 125W under proper cooling conditions; it's not exactly straddling the edge.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
It doesn't matter if the TDP is specified as a cooling requirement, although I do want to stress that: a requirement, not a guideline. It's not supposed to be some fuzzy metric that you can ignore entirely. If all other criteria are met (ambient temperature, junction temperature, etc.) the CPU is not supposed to expose more than this amount of heat to the HSF while running a realistic workload.

It's true. If it weren't, then CPU manufacturers like AMD and Intel would not bother specifying a TDP value for their processors in the first place.

Take the FX-8350. That 125W TDP rating is supposed to mean something. If the rating needs to be 130W or 120W then that is what the rating is supposed to be; the specific number selected is supposed to be adhered to, otherwise a different number should have been selected.

As far as MSI is concerned, the FX-8350 should have been spec'ed as having a 140W TDP, not a 125W TDP. It is easy to see why this might be the case from any number of data points.

For starters, the HSF that AMD bundles with the FX-8350 is a beast; it bests even the HSF they used to bundle with their prior 140W TDP processors. Second, it is easy to measure the processor's actual power consumption, as MSI's engineers did.

And third, it is easy for us lay-people to measure power at the wall and compare between systems; if the FX-8350 truly were a 125W TDP processor, then my 3770K really ought to be spec'ed as a 45W TDP processor.
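As a rough back-of-envelope of that wall-measurement approach, here is a minimal Python sketch; the wall readings and the PSU efficiency are hypothetical assumptions, and the idle-to-load delta scaled by efficiency only approximates the extra DC power the CPU pulls under load.

    # Hypothetical readings from a Kill-A-Watt style meter, in watts.
    idle_wall_w = 75.0     # whole system at idle
    load_wall_w = 245.0    # whole system under an all-core load
    psu_efficiency = 0.88  # assumed PSU efficiency at this load point

    # Extra DC power delivered under load; most of it feeds the CPU and
    # its VRMs, so it is only a rough stand-in for package power.
    cpu_delta_w = (load_wall_w - idle_wall_w) * psu_efficiency
    print(f"~{cpu_delta_w:.0f} W extra under load")  # ~150 W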

The thing is, and this is where AMD is clever in ways we can all recognize with our own eyes, a TDP spec means nothing if there is no accompanying temperature spec. And AMD was keen to see that they never published a temperature spec for the Piledriver processors.

AMD could spec the TDP as 95W (heck, why not 77W and just be absurdly dishonest about it), and so long as they don't spec a maximum allowed operating temperature they technically have not done anything wrong.

Right now what AMD is saying is "3 + Red = Fish". It's nonsense, gibberish, and they know it. The temperature spec is intentionally absent, and it must be absent, so long as they continue to ship processors that are spec'ed as 125W TDP but can consume 140W running routine rendering apps. So long as they leave that part of the spec blank, they can continue to do what they please with the TDP situation.

(for the folks who follow discrete GPUs, it was no different than when AMD decided to ship the 6990 GPU and violate the 300W PCIe spec for power consumption...AMD just did it anyway, and when people asked them to comment on it they muttered a collective "meh")
 

Piroko

Senior member
Jan 10, 2013
905
79
91
AMD did themselves no favors there though. I owned AMD chips at every major CPU release from the 386 up to the X2, and trying to get non-buggy chipsets and non-buggy drivers was a real challenge.

It wasn't enough to deter me, but it was not something one could easily ignore either.

But when DELL did start selling AMD chips I told my dad it was OK to buy a DELL AMD rig...big mistake. That thing would routinely crash, the drivers were crap, and my parents never asked me for computer advice again. (which turned out to be a blessing in disguise :D)

But yeah, AMD's notorious product-quality at the time was definitely not a myth, it cost me my "computer-guru" reputation with my parents. Whether that was AMD's fault, DELL's fault, the mobo-maker's fault...doesn't matter, if you were a customer of AMD it was your reality after you bought the box and that made it AMD's mindshare problem.
I built and bought about a dozen AMD rigs and none had problems. I was a bit lucky with picking the components, though. There were some serious issues with AMD PCs, probably mostly related to the later VIA and SiS chipsets.

But that's also part of the problem chain: they weren't able to get high-profile wins (which would have resulted in a lot of identical boards sold and thus better validation for said boards) and in turn had to deal with what SiS and VIA threw into the retail market. Oddly enough, Nvidia was a blessing with their nForce 2 & 3 boards back then. I think I was one of the few at that time who never had any form of data loss or resume bugs with hibernate.

I've been very careful to refer to sustained power consumption of a realistic load. I'm not talking about power-on transients or other momentary spikes, so what you're referring to isn't relevant. When IDC did his power consumption measurements, I don't think he was looking at maximums over short periods of time either.
Just noted it because many people keep mixing up momentary power draw (enough to trip fuses), average power draw (the sampling window of a Kill-A-Watt), and TDP while arguing.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
How about the Intel Graphics drivers?

And the Intel chipset bugs: SATA problems, USB problems, etc.

Not sure if Intel really is any better in this regard.

Also, if you were buying pre-built computers from major computer builders such as Dell/HP/Acer/etc, in most cases they come preloaded with unstable crapware and in some cases funky special driver versions. I've seen severe software and hardware problems on pre-built computers that my friends and relatives have bought, using both AMD and Intel CPUs (and related chipsets). So I'm not sure it's actually AMD/Intel causing the problems in many cases. I would suspect the root cause often lies elsewhere.

Just to clarify, it wasn't me that was buying pre-builts, it was my folks who bought the DELL and suffered the consequences.

I bought AMD and quality components, with known-good driver teams, and never suffered any worse stability than I observed with my Intel-based computers.

The truth is back then just about everything was crap. Remember when you couldn't open/close/re-open MS Word more than six times before it crashed and you had to reboot Win95? Nvidia chipsets were notoriously crappy, etc.

We live in a golden age of consumer-PC uptime and stability. My FX-8350 is every bit as stable as my 3770K, no question there.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I built and bought about a dozen AMD rigs and none had problems. I was a bit lucky with picking the components, though. There were some serious issues with AMD PCs, probably mostly related to the later VIA and SiS chipsets.

But that's also part of the problem chain: they weren't able to get high-profile wins (which would have resulted in a lot of identical boards sold and thus better validation for said boards) and in turn had to deal with what SiS and VIA threw into the retail market. Oddly enough, Nvidia was a blessing with their nForce 2 & 3 boards back then. I think I was one of the few at that time who never had any form of data loss or resume bugs with hibernate.

You make a good point here which is that AMD was kinda forced to work with the bargain-basement questionably-competent 3rd party system integrators because Intel was able to keep the top-tier board makers "pre-occupied" with serving Intel's needs as priority #1.

The only way around that is to become a Samsung and be vertically integrated top-to-bottom and control quality at every level. Samsung is the master of their own fate, AMD never had that luxury.
 

scannall

Golden Member
Jan 1, 2012
1,960
1,678
136
I keep seeing these AMD vs Intel threads, and I don't get them. Just buy whatever part you feel best fills your needs and be done with it. And don't worry about what other people think about it.

I have AMD in my HTPC and my laptop. Both work great. No complaints. And Intel in my gaming rig. No complaints there either. Computer component manufacturers are not sports teams, so stop treating them as such.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
(for the folks who follow discrete GPUs, it was no different than when AMD decided to ship the 6990 GPU and violate the 300W PCIe spec for power consumption...AMD just did it anyway, and when people asked them to comment on it they muttered a collective "meh")
I think that article is a bit dishonest. The stock GTX 480 was already suspected of blowing the PCIe spec; the pre-overclocked versions most certainly did, as did those of the GTX 580 and the GTX 590 (which was released only a week later and was already known to blow the spec).
It wasn't a good move by AMD, but people had already "okayed" it.
You make a good point here which is that AMD was kinda forced to work with the bargain-basement questionably-competent 3rd party system integrators because Intel was able to keep the top-tier board makers "pre-occupied" with serving Intel's needs as priority #1.

The only way around that is to become a Samsung and be vertically integrated top-to-bottom and control quality at every level. Samsung is the master of their own fate, AMD never had that luxury.
IMHO they could have avoided some of their problems if they had gone at least partially integrated back then. Their own server chipsets were fine for the most part.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
I keep seeing these AMD vs Intel threads, and I don't get them. Just buy whatever part you feel best fills your needs and be done with it. And don't worry about what other people think about it.

I have AMD in my HTPC and my laptop. Both work great. No complaints. And Intel in my gaming rig. No complaints there either. Computer component manufacturers are not sports teams, so stop treating them as such.
If only it were that simple...
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
You make a good point here which is that AMD was kinda forced to work with the bargain-basement questionably-competent 3rd party system integrators because Intel was able to keep the top-tier board makers "pre-occupied" with serving Intel's needs as priority #1.

The only way around that is to become a Samsung and be vertically integrated top-to-bottom and control quality at every level. Samsung is the master of their own fate, AMD never had that luxury.

Yes, that's a great point.

AMD could have worked on becoming more vertically integrated, which was something deserving of substantial consideration. Instead they came up with this silly plan of selling their own internal GFX group, selling their fabs, and buying ATI for crazy money. The politics of AMD are mind-blowing, and Dr. Evil, AKA Hector Ruiz, made things even more complicated.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
You make a good point here which is that AMD was kinda forced to work with the bargain-basement questionably-competent 3rd party system integrators because Intel was able to keep the top-tier board makers "pre-occupied" with serving Intel's needs as priority #1.

The only way around that is to become a Samsung and be vertically integrated top-to-bottom and control quality at every level. Samsung is the master of their own fate, AMD never had that luxury.
Too bad Samsung doesn't take full advantage of that.

I don't know what your opinion on Apple is, but I hope it isn't in line with the general sentiment of the enthusiast community. Too many people are blinded by their PC snobbery and Apple's litigiousness to observe the company's dealings objectively.

Apple, while not owning as many rungs on the vertical integration ladder as Samsung, seemingly extracts far more tangible advantages for the consumer's benefit. Their battery life, for instance, is in a class of its own.

Now they're a full-blown ARM dev (and have surpassed Samsung in this regard, in that they have custom, Qualcomm-esque cores), and I believe they pretty much have the capital to run their own fab business.

It's a shame there isn't an OEM like Apple on the PC side of things... I don't know if Intel or Microsoft will ever shape up the PC OEMs, but man, would the world be better off if they did.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I own several Apple products (MacBook Air, iPhone 5, 3rd gen iPad, Apple TV), and while their products are very easy to use, integrate well, and are head and shoulders above the competition WHEN INITIALLY RELEASED, that trend does not continue throughout a product's life cycle. When the competition catches up, Apple does just enough to keep their customer base loyal and keep them from making an ecosystem switch.

Battery life on the iPhone 5 is average.
Other ultrabooks on the market also easily meet and in some cases exceed the MBA on multiple metrics, including battery life and portability.

While Apple products certainly deserve more credit than some [ignorant] PC loyalists would like to give them, they are, at the same time, nowhere near as infallible as the Apple fans think they are.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Apple is genius. They can't always be genius on each and every iteration of a product within the family, but overall they are geniuses.

Case in point to both Homeles and 2is: my iPod Nano 6G. Genius. It's tiny, the touchscreen is functional and amazing, and battery life is crazy good. There is nothing not to like about it. Even the price was great.

But then they had to go and make it all huge form-factor wise for the Nano 7G. WTF!? I wanted a Nano iPod, not a Micro iPod :(
