
Impressed with FX-8350 and the new article at Anand

Page 9

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
And they are replaced fast due to power inefficiency, simply because the running cost is too high.

Quite right. In my research I frequently read about supercomputers that are broken down after just a few years, not because they are no longer useful, but because newer machines are more efficient.

Furthermore, saying supercomputers are inefficient because they use a lot of power is off-base; it's performance per watt, not total watts, that matters.

The top machine on the Green500 list right now -- which is due to be updated -- is pushing 2.5 GFLOPS/W. Think your PC CPU can pull that? Best of luck.

[Chart: Green500 efficiency evolution over time]


That's a log scale.
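
For a rough sense of scale, here's a back-of-envelope sketch in Python. The desktop figures (clock, core count, FLOPs per cycle, TDP) are illustrative assumptions roughly in line with a quad-core desktop chip, not measurements, and the Green500 number is a measured LINPACK result on whole-system power, which makes the comparison even less flattering for the desktop part:

  # Back-of-envelope comparison of the quoted 2.5 GFLOPS/W against a
  # hypothetical desktop CPU. All desktop numbers below are assumptions.
  ghz = 3.4             # assumed clock
  cores = 4             # assumed core count
  flops_per_cycle = 8   # assumed double-precision FLOPs/cycle/core (AVX)
  tdp_watts = 77        # assumed package TDP

  peak_gflops = ghz * cores * flops_per_cycle   # ~109 GFLOPS theoretical peak
  print(f"{peak_gflops:.0f} GFLOPS, {peak_gflops / tdp_watts:.2f} GFLOPS/W")
  # ~1.4 GFLOPS/W theoretical peak, well short of 2.5 GFLOPS/W measured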
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
It's not one size fits all. There is a place for a CPU that is fast at what you need it for, even if it costs a bit more per year in electricity than a more efficient CPU in the same budget that happens to be a bit slower at your most demanding tasks.

If initial price was not a factor there would be a lot less variety in what is sold. There are definitely builds where an AMD processor will make more sense than an Intel processor even taking into account power use. This doesn't somehow diminish the fact that the K series i5 and i7 are great CPUs.

Side note: What do supercomputers have to do with the FX CPU series?
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Power consumption is not just about a few extra watts being used by the computer. It's also about:

  • The extra watts you have to pay to air condition the room the computer is in, which applies to most people. Or the extra discomfort you have being in a room with a hot computer that's not air conditioned.
  • The additional noise from extra fans that have to run. Many people prefer quiet operation.
  • The extra cost of beefed up coolers to deal with the heat.
  • The stress that heat puts on hardware -- not just the CPU but other components as well, which reduces their longevity and increases the chance of having to waste time and money fixing them.
  • The environment. Some of you may not have noticed that we now live in an era where people try to be environmentally responsible. People care about wasted power, the fuel burned to generate it, the energy spent removing the heat, and the CO2 generated by the entire process.

I prefer to use devices that are efficient, not ones that brute-force their way around their design flaws by maxing out power limits. If you prefer to use inferior technology to save a few bucks, go for it. There's a reason why AMD can only compete on a performance-per-dollar basis, and it's not because their CPUs are great.

It's an interesting argument, but completely false in my real-world experience.

As far as total heat produced, you are half correct, but the cost to cool the difference is insignificant. The heat difference produced is also insignificant. If the difference between an AMD and an Intel CPU is enough to push you into using AC instead of not using it, that means with the Intel CPU you were already sitting on the edge, waiting for an excuse to turn the AC on; the difference really is that tiny.

The noise, the cost of coolers, and the "stress" are all patently false in my experience. My FX-8120s work fine with stock coolers, and actually run at a lower temperature than my i7-7770. The cost of the OEM heatsink and fan is already included in the cost of the CPU, so it's rather dishonest to claim it as some extra cost that applies to AMD, given that AMD CPUs are already cheaper than their Intel counterparts.

As for noise, I've never noticed it on either system, as the noisy video card coolers drown out any noise from the CPU fan. I'll give you that it might be slightly louder than Intel on an absolute scale, but even then the AMD coolers are fairly quiet compared to other computer components.

Your "stress" argument though is an absolute joke. AMD CPU are designed to run at 65C or less, while the Intel equivalents seem to be fine up into 80C or so. I've seen my i7-7770 run that hot, and I've never seen my FX-8120 even reach 70C. Between the two, if you are going to seriously call temperature stress a big deal, you need to realize the Intel runs hotter despite it's lower power consumption. It's just one of the bonuses of having a smaller die size I guess.

The environment, really? If you really care, use an ARM-based Chromebook or Windows RT device. Anything else uses an order of magnitude more power. The idea that using an Intel system and cutting your power usage by 15% is going to make a huge difference is absurd when there are other options that will cut it by 95%.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
234
106
Furthermore, saying supercomputers are inefficient because they use a lot of power is off-base; it's performance per watt, not total watts, that matters.
Today it's not only performance per watt; it's also total watts and price. Today's fastest supercomputer is built around AMD's Opteron 6274 processor, which isn't exactly the best or most power-efficient processor out there.

Titan has AMD Opteron CPUs in conjunction with Nvidia Tesla GPUs to improve energy efficiency while providing an order of magnitude increase in computational power over Jaguar. It uses 18,688 CPUs paired with an equal number of GPUs to perform at a theoretical peak of 27 petaFLOPS; however, in the LINPACK benchmark used to rank supercomputers' speed, it performed at 17.59 petaFLOPS. This was enough to take first place in the November 2012 list by the TOP500 organisation.
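
As a quick cross-check of the efficiency angle, here is a short sketch using the LINPACK figure quoted above; the ~8.2 MW system power draw is an assumption (the commonly cited figure for Titan), not something stated in the quote:

  # Titan's efficiency from the quoted 17.59 petaFLOPS LINPACK result,
  # assuming ~8.2 MW of system power (not stated in the quote above).
  linpack_pflops = 17.59
  power_mw = 8.2  # assumed
  gflops_per_watt = (linpack_pflops * 1e6) / (power_mw * 1e6)
  print(f"~{gflops_per_watt:.1f} GFLOPS/W")  # ~2.1 GFLOPS/W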

Intel has the best technology but it isn't used in Titan, because performance per dollar matters even more :cool:
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Your "stress" argument though is an absolute joke. AMD CPU are designed to run at 65C or less, while the Intel equivalents seem to be fine up into 80C or so. I've seen my i7-7770 run that hot, and I've never seen my FX-8120 even reach 70C. Between the two, if you are going to seriously call temperature stress a big deal, you need to realize the Intel runs hotter despite it's lower power consumption. It's just one of the bonuses of having a smaller die size I guess.

You do know they measure it differently, right? (Well, that's a leading question, since you don't.) The FX cooler isn't at 55-60°C by accident when the sensor says the CPU is at 60-65°C.
http://www.alcpu.com/forums/viewtopic.php?f=63&t=892
"Why is the temperature of my FX, Phenom, Athlon based processor lower than the ambient temperature?
Starting with the Phenoms, AMD's digital sensor no longer reports an absolute temperature value anymore, but a reading with a certain offset, which is unknown. It is estimated that this offset is between 10 - 20c."

It's an interesting argument, but completely false in my real-world experience.

Your sense of the real world is based on a subset of Americans in a minority market.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Today it's not only performance per watt; it's also total watts and price. Today's fastest supercomputer is built around AMD's Opteron 6274 processor, which isn't exactly the best or most power-efficient processor out there.



Intel has the best technology but it isn't used in Titan, because performance per dollar matters even more :cool:

Opterons are no longer used. Cray was foolish enough to sign a 3-year contract with AMD. That contract is up and nobody uses Opterons anymore. Hence the 3-4% market share, and dropping.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
As far as total heat produced, you are half correct, but the cost to cool the difference is insignificant. The heat difference produced is also insignificant.

This isn't a counterargument, it's just handwaving. If you want to dismiss the importance of power consumption, feel free, but this is not about your personal preferences. The point of my post was to highlight some of the reasons why millions of other people do care about efficiency, beyond the simple additional cost of the electricity.

It's also worth reiterating that there are people paying 20, 30, and 40¢/kWh. Not everyone has cheap power.

Intel has the best technology but it isn't used in Titan, because performance per dollar matters even more

That's true, but it's also more evidence that AMD can only compete by cutting prices to the point where customers are sufficiently enticed to purchase inferior technology.

On a performance per dollar basis, sure, AMD's chips are great. Anything can be successful in the marketplace if you sell it cheaply enough.

I highly doubt that AMD designed Bulldozer with the thought that several years later their flagship product would be going for $199.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
This isn't a counterargument, it's just handwaving. If you want to dismiss the importance of power consumption, feel free, but this is not about your personal preferences. The point of my post was to highlight some of the reasons why millions of other people do care about efficiency, beyond the simple additional cost of the electricity.

It's also worth reiterating that there are people paying 20, 30, and 40¢/kWh. Not everyone has cheap power.

If you are paying 40¢/kWh, you have three options: AMD, Intel, or ARM. You can argue all you want that the ~15W average saved by switching from AMD to Intel is going to make a huge difference, but if you seriously care about power usage, it's far more logical to switch to an ARM device that uses a tenth or less of the power of your Intel equivalent.
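
To put that ~15W figure in perspective, here is a minimal sketch of the yearly cost at a few rates; the hours of use per day are an assumption for illustration:

  # Yearly cost of a ~15 W average difference at a few electricity rates.
  # Hours of daily use is an illustrative assumption.
  watts_saved = 15
  hours_per_day = 8   # assumed
  kwh_per_year = watts_saved * hours_per_day * 365 / 1000   # ~43.8 kWh
  for rate in (0.12, 0.20, 0.40):   # $/kWh
      print(f"${kwh_per_year * rate:.2f}/year at {rate * 100:.0f} cents/kWh")
  # roughly $5, $9, and $18 per year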

Intel is in an interesting place, but it's actually losing on both measures: lowest performance per dollar, and second-worst power usage. If you care about performance per dollar there is AMD, and if you care about power usage you can use ARM.

You seem to be trying to argue some odd situation where the tiny power savings of switching to Intel is super important, but somehow the much greater power savings of switching to ARM isn't worthwhile. What real-life scenario makes this the case?
 
Last edited:

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Starting with the Phenoms, AMD's digital sensor no longer reports an absolute temperature value anymore, but a reading with a certain offset, which is unknown. It is estimated that this offset is between 10 - 20c."

So in other words, it's just a wild guess. That doesn't disprove my point at all. Do you have any actual proof, or just some website's guess?

You may well be correct, but I'd like to see real evidence, not just some wildly random guess.

Also, even if the offset is 10°C, my FX-8120 still runs cooler than my i7-7770. I was disputing the claim that AMD runs hotter; your post doesn't do anything to prove *that* point, you are just nitpicking one part of my argument.

Your sense of the real world is based on a subset of Americans in a minority market.

Ah, so CPUs run differently in your country? Please explain, I'm really curious.

Oh right, you are just trying to brag about the insane prices you pay for power. It's not America that is the minority, it's you. Just because you choose to live in a place where the power company can rape and pillage your wallet every month doesn't mean the rest of the world does the same.

http://en.wikipedia.org/wiki/Electricity_pricing

Except for a few islands, you are the only ones paying more than 31¢ per kWh. You are the minority.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
So in other words, it's just a wild guess. That doesn't disprove my point at all. Do you have any actual proof, or just some website's guess?

You may well be correct, but I'd like to see real evidence, not just some wildly random guess.

Also, even if the offset is 10°C, my FX-8120 still runs cooler than my i7-7770. I was disputing the claim that AMD runs hotter; your post doesn't do anything to prove *that* point, you are just nitpicking one part of my argument.



Ah, so CPUs run differently in your country? Please explain, I'm really curious.

Oh right, you are just trying to brag about the insane prices you pay for power. It's not America that is the minority, it's you. Just because you choose to live in a place where the power company can rape and pillage your wallet every month doesn't mean the rest of the world does the same.

http://en.wikipedia.org/wiki/Electricity_pricing

Except for a few islands, you are the only ones paying more than 31¢ per kWh. You are the minority.

That post on a random wild website is by the creator of CoreTemp.

And you missed the rest of the post completely. But I assume that's because you don't care at all and never will.

And your ability to write i7-7770 is amazing.

Even with your low electricity prices and my high ones, I bet your utility bill is still way beyond mine, simply because you don't understand the basic economics behind it.
 
Last edited:

galego

Golden Member
Apr 10, 2013
1,091
0
0
Power consumption is not just about a few extra watts being used by the computer. It's also about:

  • The extra watts you have to pay to air condition the room the computer is in, which applies to most people. Or the extra discomfort you have being in a room with a hot computer that's not air conditioned.

AMD owners have a standard reply to that: "I can play games and warm my room in winter, saving some bucks."

  • The additional noise from extra fans that have to run. Many people prefer quiet operation.

The stock coolers are no noisier than Intel's, and with the money saved you can buy a much better cooler.

  • The extra cost of beefed up coolers to deal with the heat.

See above.

  • The stress that heat puts on hardware -- not just the CPU but other components as well, which reduces their longevity and increases the chance of having to waste time and money fixing them.

Even accepting this as true, hardware will be obsolete long before it fails.

  • The environment. Some of you may not have noticed that we now live in an era where people try to be environmentally responsible. People care about wasted power, the fuel burned to generate it, the energy spent removing the heat, and the CO2 generated by the entire process.

And CPUs/APUs are not the problem. The European Union has plans to cut down on high-end graphics cards with its energy eco-law.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
If you care about performance per dollar there is AMD, and if you care about power usage you can use ARM.

You seem to be trying to argue some odd situation where the tiny power savings of switching to Intel is super important, but somehow the much greater power savings of switching to ARM isn't worthwhile. What real-life scenario makes this the case?

You need to remember software compatibility.

AMD has it with Intel, ARM does not.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Opterons are no longer used. Cray was foolish enough to sign a 3-year contract with AMD. That contract is up and nobody uses Opterons anymore. Hence the 3-4% market share, and dropping.

Cray announced the XC30 cluster supercomputer after Intel bought part of Cray for $140M.

But Cray continues to use AMD in existing clusters and in future models:

Barry Bolding, VP of Storage and Data Management and corporate marketing at Cray told The INQUIRER that the firm's use of Intel Xeon chips in its Cascade XC30 cluster doesn't mean that Cray has completely shut the door on AMD.

Bolding said Cray will still sell AMD's chips for its existing clusters, and noted that XE6 clusters can be upgraded using AMD 6300 series Opteron chips.

Apart from upgrades to the new 6300 series, my bet is that Cray will use the future ARM-based Opterons in a year or two.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
The power difference is relevant only up to a point. I wouldn't toss out a Nehalem system for an Ivy Bridge system just because I'd save $20 a year in electricity; my main motivation would be performance gains first and using less electricity second.

The degree to which some argue the power draw matters is being overstated, imo. Is an i3 a good option vs FX 6300 when looking to maximize performance? No. Is it a good option when both choices exceed the performance needed? Yes.

Is the FX-83xx a good choice vs i5-3570K/i7-3770K in a gaming box with everything else in the PC being equal? Not really. Is it a good choice if picking the FX-83xx lets you jump up a tier in GPU? Absolutely.
 
Last edited:
frozentundra123456

Aug 11, 2008
10,451
642
126
No matter how many times it is repeated, and by how many AMD fans, it is a false argument to say that a small initial savings is significant while the same cost spread over a few years can be totally ignored.

If the primary use is gaming, there is very little case for picking the 8350 over the 3570K. The Intel chip has superior stock performance in the vast majority of games, sometimes by a wide margin, while the FX is equal or faster in very, very few games, and then only by a small margin or not at all, depending on the benchmark.

The slightly higher initial cost of the Intel chip is offset by its power savings within a few years, while it also offers more overclocking headroom and uses much less power when overclocked.
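
To make the "offset within a few years" claim concrete, here is a minimal payback sketch; the price gap, power delta, usage, and rate are all illustrative assumptions, not figures from this thread:

  # Break-even estimate: extra up-front cost divided by yearly power savings.
  # Every number here is an illustrative assumption.
  price_gap = 30        # assumed extra cost of the Intel chip, $
  watts_saved = 50      # assumed load-power difference, W
  hours_per_day = 4     # assumed hours at load per day
  rate = 0.12           # assumed $/kWh
  savings_per_year = watts_saved * hours_per_day * 365 / 1000 * rate  # ~$8.76
  print(f"break-even after ~{price_gap / savings_per_year:.1f} years")  # ~3.4 years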

I don't think anyone is arguing that the fx is not a good choice if your primary concern is some of the productivity apps it is best at. Nobody is saying it is terrible or inadequate either. It is just that the small initial cost savings is more than offset by the advantages of the 3570k.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
The power difference is relevant only up to a point. I wouldn't toss out a Nehalem system for an Ivy Bridge system just because I'd save $20 a year in electricity; my main motivation would be performance gains first and using less electricity second.

Very well said, Vesku. Many here are hanging on to their first-generation i7s mainly because the performance gap isn't that large; power consumption takes a distant place in the decision making. I'm happily using my i7 860 @ 4.0 GHz and I don't care one bit about its power consumption, because it gives me great performance.

This power consumption stuff is the only thing Intel fanboys have left to paint an AMD purchase as a dumb choice. Funny, because not too long ago it was slower single-thread performance. Keep it up, guys; it's clear to everyone now what agendas you have.
 
Last edited:

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I don't think anyone is arguing that the fx is not a good choice if your primary concern is some of the productivity apps it is best at. Nobody is saying it is terrible or inadequate either.

Frozentundra123456, you're right that when it comes to games alone the FX loses to the competition, but don't fool yourself: people might not say it out loud, but the only reason they're posting here is to call the FX a piece of crap. There are guys who just can't leave AMD threads alone; they have to post the usual nonsense: "but Intel pays for itself, it generates $$ not heat".
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
9 Pages of worthless AMD arguments.

If you got an 8350 instead of a 3570K, either for gaming at stock or for anything when overclocked, then you are mentally unable and/or unwilling to process logic in a useful form.

The most hilarious arguments are from people using AMD CPUs with AMD GPUs.

AMD GPUs have horrendous drivers compared to Nvidia's, and have ALWAYS been more CPU-bottlenecked than Nvidia GPUs, since time immemorial.

If you aren't using an Nvidia GPU with your hugely bottlenecking AMD CPU, then you are doubly mentally unable and/or unwilling to process logic in a useful way.

Personal attacks are not acceptable
-ViRGE
 
Last edited by a moderator:

galego

Golden Member
Apr 10, 2013
1,091
0
0
You need to remember software compatibility.

AMD has it with Intel, ARM does not.

This can be a problem in the closed-source Windows world, but not in the open-source Linux world, where some distros offer ARM support.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
AMD owners have a standard reply to that: "I can play games and warm my room in winter, saving some bucks."
No, it'll cost you more, since heat from electricity is typically more expensive than heat from conventional heating.
The stock coolers are no noisier than Intel's, and with the money saved you can buy a much better cooler.
The money you save by going with an 8350 instead of an i5-3570K is not enough to buy a good-quality cooler.
And CPUs/APUs are not the problem. The European Union has plans to cut down on high-end graphics cards with its energy eco-law.
...and what do high-end graphics cards being cut down have to do with the 8350 consuming more watts? This is bad news for AMD CPUs, not good news.
Cray announced the XC30 cluster supercomputer after Intel bought part of Cray for $140M. But Cray continues to use AMD in existing clusters and in future models:
Because the most valuable and difficult-to-design part of a supercomputer is the interconnects between the processing units, and so they are locked into the same CPUs unless they want to redesign all the interconnects.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
AMD GPUs have horrendous drivers compared to Nvidia's, and have ALWAYS been more CPU-bottlenecked than Nvidia GPUs, since time immemorial.

Did you think that logic all the way through? If AMD GPUs are *more* CPU-bottlenecked than Nvidia GPUs, aren't you saying the AMD GPU is more powerful/higher-performance than the Nvidia one? I don't know if you meant that, but when you criticize other people so harshly for poor logic, you should watch out for making similar errors in logic yourself.