AMD set to slash FX CPU pricing on September 1


DrMrLordX

Lifer
Apr 27, 2000
23,112
13,215
136
Yes, we are saving a lot of money running a few FX chips in the mix -- although, for 24/7 running on World Community Grid, most of us are going in the other direction: undervolting and underclocking. My undervolted FX-8320 is only pulling 27 more watts than my 3770K. So far, I haven't been able to get my i7 stable when undervolted, although I suspect that has more to do with the cheap motherboard. But unlocked chips from either company really open up a ton of options. If you're running a variant of Ubuntu, an FX performs much like an equivalent Ivy, whereas Windows 7 appears to throttle an FX down to Nehalem levels. The bang for the buck of an FX is insanely good if you plan on being a Linux user.

See, I can understand wanting to use an FX if you know you are going to be loading all modules 24/7, in an environment where the compilers aren't working against you, and in a situation where reducing power draw lets you run another machine to produce more output (in this case, WCG points). That's why I specifically mentioned your case. What you are doing with the FX makes sense, and there are probably ~$60 motherboards out there with 4+1 phase power setups that would be just fine for undervolting. Now THAT is a value proposition, especially after the price cuts come Sept. 1st. You could get by with modest cooling on a cheap motherboard, significantly reducing the upfront cost of acquisition.

My main problem is that some of my workloads use two cores or less, for which anything over a quad is overkill. There are some occasions on which I could use the full eight-core power of an 8xxx/9xxx FX, yes, but that does not represent every use case. Furthermore, I can run my tri-core not-exactly-Deneb, which cost me $30 off eBay, at 3.9GHz with a nice NB overclock. According to what I've seen from Piledriver vs Stars comparisons, I'd have to push Piledriver to around 4.4GHz or higher to get consistently better per-core performance than I can get from my Stars chip. Sure, I'd have a much greater ability to handle multiple threads, but I'd be sad if I made a significant investment in new hardware only to barely beat my Stars chip on some everyday workloads.

I use this board. It is a bit cheaper than the Sabertooth and a good chunk cheaper than the Crosshair. I've never used either of the Asus boards, so I can't say anything confidently, but I wouldn't be surprised if the ASRock was every bit as good. I am benchmark-stable above 5.3GHz with enough voltage to pull almost 400 watts on the CPU alone. But I generally don't run it that way; I'm having more fun undervolting these days.

I have seen mixed reviews of the Extreme9. Some claim it is the top-performing 990FX board out there today, while others say that it is an inferior overclocker compared to the Sabertooth. The cheapest Sabertooth I can find is $170 shipped, so the price advantage of the Extreme9 is somewhat moot. Based on your feedback, I would be tempted to get the Extreme9 at that price point, but I know I wouldn't be saving any money getting it.

There is no argument that the FX CPUs use more power than their Intel counterparts. But I think this is something that is blown out of proportion for the average user. For server farms and people who will have their CPU loaded often for extended periods of time, I get it, that makes sense. For someone who games even for hours at a time, it just doesn't really matter. (That is, if you live where electricity is reasonably priced.)

I agree when it comes to utility costs, somewhat. The up-front costs associated with running these FX CPUs come from delivering power to them and cooling them. Demanding users who want the most from their CPU, bar none, are really going to have to put something like an NH-D15 or some kind of water cooling on these things. What do you use to keep up your 5.3GHz overclock?

Also, if you live in a cooler climate, that extra heat isn't wasted. I live in WI; any extra wattage that comes off my radiator enters the environment in my home. For ~7-8 months a year, that means the electricity I'm paying for is producing heat I would otherwise have to generate anyway. I don't know how it compares in efficiency or cost to my natural gas furnace doing its job. But the point is, if you heat your home, that extra heat energy isn't a wasted cost. No one takes this into account when they compare costs. How much of the electricity I pay for my CPU is keeping my furnace off longer for the majority of the year? And how much does that in turn save me in natural gas costs?
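To put rough numbers on it (all the rates below are made-up examples, not my actual utility prices -- plug in your own):

Code:
# Rough comparison: cost of 1 kWh of heat from a CPU (resistive electric
# heating) vs. from a natural gas furnace. All rates are hypothetical
# examples; substitute your own utility prices.

ELECTRIC_RATE = 0.13       # $/kWh (assumed example rate)
GAS_RATE = 1.00            # $/therm (assumed example rate)
FURNACE_EFFICIENCY = 0.95  # typical high-efficiency gas furnace
KWH_PER_THERM = 29.3       # 1 therm of gas = 29.3 kWh of heat energy

# A CPU is effectively a 100%-efficient space heater: every watt drawn
# ends up as heat in the room.
cpu_heat_cost = ELECTRIC_RATE

# Gas heat: a therm costs GAS_RATE, but only FURNACE_EFFICIENCY of it
# actually reaches the living space.
gas_heat_cost = GAS_RATE / (KWH_PER_THERM * FURNACE_EFFICIENCY)

print(f"CPU heat: ${cpu_heat_cost:.3f}/kWh")  # ~$0.130/kWh
print(f"Gas heat: ${gas_heat_cost:.3f}/kWh")  # ~$0.036/kWh

With example rates like those, the CPU's heat is a partial rebate rather than free heating: it displaces furnace runtime, but each kWh of electric heat costs roughly 3-4x what the gas equivalent would.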

I have been using CPUs as heating units since my 1.4GHz Tbird. That works great in the winter, but in TN, it is pure pain in the summer.

I kind of wish they would shrink them. If they could chop the power use down and keep performance as is, they could probably sell a few Opterons.

As do I, though the time for that has passed. The die is cast.

First of all, there is more than a single reason to get the FX CPUs, and certainly OCing to FX-9590 levels is not the only one.

While this is true (the WCG situation MiddleOfTheRoad mentions is one example), consider where people might be coming from when buying a chip like an 8320E or 8370E: you've got a lot of AMD fans still sitting on 3.8-4GHz Deneb chips, Thuban chips, and the like. In my case, I have an x2 with pretensions of being an x3 that can push 3 cores at 3.9GHz. Is a 4.4GHz FX going to beat that x2 in every use case? I have some situations, such as compiling Java in Eclipse, where only one core is consistently utilized. I would like some more cores AND some more IPC for running emulators/VMs. I am not confident that the FX is going to give me both until I start pushing 9590-like speeds.

I have run the x2 as an x2 before, and I know that a significant boost in IPC would help me in most if not all of my use cases, even if I didn't get any more cores. So there's this little thing called the G3258 out there serving as a big reason why a discounted FX might not be in the cards. There's an upcoming $100 CPU/board combo for the G3258 which is supposedly 4.7GHz-capable with an aftermarket cooler (and I just happen to have one available) and would absolutely annihilate my faux Deneb in everything. Why would I spend ~$220 or more on an FX that will be better some of the time, when I can get a G3258 that will be better all of the time?

(Sure, the FX will be better still in heavily-threaded cases, but c'mon . . . we're talking budget OC here!)

The Vishera sweet spot is 4.4GHz with Turbo off. At that frequency you get very acceptable single-threaded performance and really good MT performance at almost the same power usage as an FX-8350 at default.

Acceptable on what basis? Is it going to beat a 3.9GHz Stars chip per core?

Turbo uses way too much voltage, elevating the power usage of the entire platform. Overclock an FX-8320 to 4.2GHz (default heatsink) with Turbo off and a voltage lower than the default 1.425V, and you get a much faster CPU at lower power consumption than an FX-8350 at default. Reviews haven't shown that because they only run the CPU at default settings.

. . . and that would be wonderful if I spent all my time using all of those cores.

The high-priced motherboard for OC is a myth; my FX-8350 was working at 4.7GHz stable on my ASUS M5A97 R2.0. There was no throttling at all.

I looked at that board. If you hit 4.7GHz with no throttling, then you are quite fortunate. Cheaping out on VRMs just seems like a bad, bad idea when it comes to AM3+.

Use the default heatsink and OC to 4.2GHz with Turbo off; for $249 you have a very nice system with acceptable ST and very nice MT performance, 6x SATA III, and USB 3.0.
Add in a nice GPU like an R9 280 or Tonga and you can play every game available today, and even upcoming releases, for at least 2-3 years.

For a budget system, and for users who want/need that CPU performance, it is still good to go. Now, with even lower prices, they will become even better.

At $250 for board + CPU, we're starting to get outside of budget territory. An FX at 4.2GHz would be even less likely to clearly beat my "Deneb" chip per core. I would much rather try out an overclocked A8-7600 on the ASUS A88X-Plus (or maybe something even cheaper, if I'm feeling lucky). Sure, it's only two modules, but at least I wouldn't have to spend anything on a dGPU.

Or for even less money, there's the G3258.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
See, I can understand wanting to use an FX if you know you are going to be loading all modules 24/7, in an environment where the compilers aren't working against you, and in a situation where reducing power draw lets you run another machine to produce more output (in this case, WCG points). That's why I specifically mentioned your case. What you are doing with the FX makes sense, and there are probably ~$60 motherboards out there with 4+1 phase power setups that would be just fine for undervolting. Now THAT is a value proposition, especially after the price cuts come Sept. 1st. You could get by with modest cooling on a cheap motherboard, significantly reducing the upfront cost of acquisition.

Our team quickly discovered that running several 6-8 thread CPUs underclocked generates a ton more points than one super-expensive CPU running overclocked.

Not only that -- people are practically giving away used Visheras on eBay because of the Windows stigma... Yet these CPUs deliver around Ivy Bridge performance in Linux. By underclocking/undervolting our boxes, we can easily run 2-3 desktops within the power envelope of a single modestly overclocked desktop (even if that one is running a vastly superior CPU like a Haswell i7 or Xeon).
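The arithmetic behind that is easy to sketch (a toy model with made-up numbers, assuming dynamic power scales roughly with f x V^2 and grid output scales with boxes x clock):

Code:
# Toy model: several underclocked/undervolted boxes vs. one overclocked
# box. Dynamic CPU power scales roughly with frequency * voltage^2; all
# frequencies and voltages below are illustrative, not measurements.

def relative_power(freq_ghz, volts, base_freq=4.0, base_volts=1.35):
    """Dynamic power relative to a 4.0GHz / 1.35V baseline (P ~ f * V^2)."""
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

oc_power = relative_power(4.8, 1.50)      # one overclocked box: ~1.48x baseline
uv_power = 3 * relative_power(3.2, 1.10)  # three undervolted boxes: ~1.59x total

oc_points = 4.8       # per-box output proxy: same cores, so clock stands in
uv_points = 3 * 3.2   # three boxes at 3.2GHz

print(f"OC box:   {oc_points / oc_power:.2f} points per unit power")  # ~3.2
print(f"UV boxes: {uv_points / uv_power:.2f} points per unit power")  # ~6.0

# With these numbers, the three slow boxes deliver twice the total output
# for roughly the same total power draw as the single fast box.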

Our multiple "dumpster dive" FXs can mop the floor with those guys when working as a team -- and I think the Xeon guys are only now starting to catch on to what we've been doing.

We're a team of two people -- we managed to go from being ranked around #30,000 to #7,300 in about 10 months. We are cheapskates -- but we understand the Grid. FX with Linux has been our "secret weapon."
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I have seen mixed reviews of the Extreme9. Some claim it is the top-performing 990FX board out there today, while others say that it is an inferior overclocker compared to the Sabertooth. The cheapest Sabertooth I can find is $170 shipped, so the price advantage of the Extreme9 is somewhat moot. Based on your feedback, I would be tempted to get the Extreme9 at that price point, but I know I wouldn't be saving any money getting it.



I agree when it comes to utility costs, somewhat. The up-front costs associated with running these FX CPUs come from delivering power to them and cooling them. Demanding users who want the most from their CPU, bar none, are really going to have to put something like an NH-D15 or some kind of water cooling on these things. What do you use to keep up your 5.3GHz overclock?


I haven't used the Asus boards (and I'm not buying a third AM3+ board! :awe: ), so I can't say much about them. They have a solid reputation, though. I also have an MSI 990FX-GD80V2; I had terrible voltage droop above 5GHz, so I got the ASRock and a Rosewill Lightning-1300. To keep things cool I am using custom water (Swiftech MCP655 pump, Koolance CPU-380A water block, this Fluidyne cooler - 18"x9"x1" - and four 200mm Cooler Master fans in a push/pull setup on the radiator, two of these and two of these).

It was a hobbyist build, definitely for the fun of building something different, and my first go at water cooling. I learned a lot about liquid cooling setups (will never put my reservoir lower than the tubes again!). I'm very happy with the performance. As much fun as finding how hard I can push it has been, I am going the other way with it now, planning on finding a real sweet spot for power use and a profile for ~5.15GHz operation (5.3GHz+ is too close to the edge, and the power use is painful...!)

I'm going to reuse the MSI board for my son. I have a Tagan BZ 900W and plenty of DDR3-1600. Just got to decide: FX-8320, FX-6300, or maybe one of these new CPUs.
 

rancherlee

Senior member
Jul 9, 2000
707
18
81
I got about the best price/performance combo from Microcenter a few weeks back: for $210, the 8320 + M5A99FX Pro is a fun combo to tinker with. Mine is a week 11 of '14 chip, and it's happy running 4.4GHz @ stock voltage (1.33V under Prime) with just a bit of tweaking. I had it all the way down to 1.15V at stock speed (didn't try lower), which puts it at a calculated 86W TDP. As far as speed against my Phenom 960T (6 cores at 3.9GHz), it's roughly the same per clock on many synthetic tests but noticeably faster in games, even ones that only use 1-4 cores. I originally planned on grabbing an i7 combo but couldn't quite shake the AMD out of me. I really need a better cooler; my CNPS 9700 is just enough to keep it happy at 4.4GHz but will not support more speed.
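For anyone wondering where a "calculated 86W TDP" comes from: dynamic power scales roughly with the square of voltage at a fixed clock, so the estimate looks something like this (a sketch; the stock voltage below is an assumption, and real chips vary):

Code:
# Rough estimate of power after undervolting at stock clocks. Dynamic
# power scales approximately with voltage^2 at constant frequency. The
# stock voltage below is an assumed example; actual VIDs vary per chip.

RATED_TDP = 125.0   # W, FX-8320 rated TDP
STOCK_V = 1.3875    # V, assumed stock VID (varies chip to chip)
UNDERVOLT_V = 1.15  # V, achieved undervolt at stock speed

estimated_power = RATED_TDP * (UNDERVOLT_V / STOCK_V) ** 2
print(f"Estimated power at {UNDERVOLT_V}V: {estimated_power:.0f}W")  # ~86W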
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I got about the best price/performance combo from Microcenter a few weeks back: for $210, the 8320 + M5A99FX Pro is a fun combo to tinker with. Mine is a week 11 of '14 chip, and it's happy running 4.4GHz @ stock voltage (1.33V under Prime) with just a bit of tweaking. I had it all the way down to 1.15V at stock speed (didn't try lower), which puts it at a calculated 86W TDP. As far as speed against my Phenom 960T (6 cores at 3.9GHz), it's roughly the same per clock on many synthetic tests but noticeably faster in games, even ones that only use 1-4 cores. I originally planned on grabbing an i7 combo but couldn't quite shake the AMD out of me. I really need a better cooler; my CNPS 9700 is just enough to keep it happy at 4.4GHz but will not support more speed.


I am going to mess around more this weekend if I can, but I've been able to run my FX at 1.225V at 4.4GHz / 4.7GHz turbo. Keep going; the manufacturing process for these CPUs has to be quite mature by now.
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
Regarding Stars vs Steamroller: unless one runs specific workloads that are heavy on the FP unit, Kaveri is pretty much on par with the 32nm Stars core where IPC is concerned, maybe even a tad better. It also clocks ~30% higher than the 32nm Stars core within the same power envelope, and consumes less power under full load.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
The FX-8320E at 95W should be great. OC to 4.1GHz and keep the 95W TDP like the FX-8370E, but at only $139. :thumbsup:
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
http://www.xbitlabs.com/news/cpu/di..._Other_FX_Processors_New_Prices_Revealed.html
[attached image: amd_fx_prices_spt_1.png -- the new FX price list]
 

bononos

Diamond Member
Aug 21, 2011
3,940
190
106
AMD gave away some extra bits to the server/workstation folks on their website:

http://products.amd.com/en-us/Opter...=1000&f6=G34&f7=C0&f8=32nm&f9=&f10=6400&f11=&

Here, for example, they disclose the maximum temperature of the chip, so indeed FX works at lower temperatures compared to Intel chips. And yet they still refuse to provide the thermal datasheet, even for Opteron processors.

Again, how would we know if the "lower" temps are really what is being reported by the sensor, given the wide variance of temps reported? No one's doing thermocouple/heatsink tests anymore. AMD FX CPUs need beefier heatsinks and put out a lot of heat, so it's reasonable to say that AMD CPUs run hotter than their competition.

So as I see it, AMD's specs on their temps don't mean much, just like their published TDP without accompanying thermal/current specs, which allows AMD to get away with running their CPUs out of spec.
 
Aug 11, 2008
10,451
642
126
Hope someone does a direct comparison (performance and power usage) of the 125W vs 95W TDP 8370s to see what the actual difference is. It would be interesting to see whether it does in fact use 30 watts less at the same sustained performance level, or if it's just a "creative" TDP rating and/or spending less time at max turbo.
 

Jovec

Senior member
Feb 24, 2008
579
2
81
Again, how would we know if the "lower" temps are really what is being reported by the sensor, given the wide variance of temps reported? No one's doing thermocouple/heatsink tests anymore. AMD FX CPUs need beefier heatsinks and put out a lot of heat, so it's reasonable to say that AMD CPUs run hotter than their competition.

So as I see it, AMD's specs on their temps don't mean much, just like their published TDP without accompanying thermal/current specs, which allows AMD to get away with running their CPUs out of spec.

They don't run hotter; they are less tolerant of heat. 70C is the max temp according to AMD Overdrive (at least for my 8350), as reported by the Thermal Margin value (the distance between 70C and the current temp). For example, when Coretemp reports 40C, OD reports the margin as 30C.
 

Abwx

Lifer
Apr 2, 2011
11,935
4,909
136
AMD gave away some extra bits to the server/workstation folks on their website:

http://products.amd.com/en-us/OpteronCPUDetail.aspx?id=845&f1=AMD+Opteron%E2%84%A2+6300+Series+Processor&f2=&f3=Yes&f4=&f5=1000&f6=G34&f7=C0&f8=32nm&f9=&f10=6400&f11=&

Here, for example, they disclose the maximum temperature of the chip, so indeed FX works at lower temperatures compared to Intel chips. And yet they still refuse to provide the thermal datasheet, even for Opteron processors.

It is written right there: 69°C and a 99W TDP. So how are they refusing to provide the thermal data?

What more does an engineer need than those two values to calculate an adequate heatsinking solution?

If they say so, it means that if your case has a 30°C ambient temperature, you only have 39°C of headroom to evacuate those 99W. This means your heatsink must have a 39/99 ≈ 0.4°C/W thermal resistance if you want your chip to be full-speed capable on a continuous basis; otherwise it will throttle according to your heatsink's heat-evacuation capability.

By the same principle, one can deduce the max temp of a 125W chip if the same design allows 99W at 69°C. It's a basic rule of three once you know the ambient temperature used as the basis for their datasheet, and hence the thermal resistance of the reference design used for the calculations.

Generally 27°C is assumed, because it is about 300 K, a convenient number, and it is the default value in the SPICE simulators I'm aware of. So their reference heatsink Rth is probably 0.425°C/W, which points to an 80°C max temp for a 125W chip.

Edit: I'll add that this assumes they use the same reference cooler for a 99W and a 125W chip, of course...
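In snippet form, the rule of three above works out like this (same assumptions: a 27°C datasheet ambient and the same reference cooler across TDP classes):

Code:
# Heatsink sizing from the two published numbers: max temperature and TDP.
# The 27C datasheet ambient is an assumption (roughly 300 K), as above.

T_MAX = 69.0    # C, AMD's published max temperature for this Opteron
TDP = 99.0      # W, published TDP
AMBIENT = 27.0  # C, assumed datasheet ambient

# Thermal resistance the reference cooler must achieve to hold full
# speed continuously at the published TDP:
r_th = (T_MAX - AMBIENT) / TDP
print(f"Reference cooler Rth: {r_th:.3f} C/W")  # ~0.424 C/W

# Rule of three: with the same cooler, a 125W part's implied max temp:
t_max_125 = AMBIENT + r_th * 125.0
print(f"Implied max temp for a 125W chip: {t_max_125:.0f} C")  # ~80 C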
 

DrMrLordX

Lifer
Apr 27, 2000
23,112
13,215
136
I got about the best price/performance combo from Microcenter a few weeks back: for $210, the 8320 + M5A99FX Pro is a fun combo to tinker with. Mine is a week 11 of '14 chip, and it's happy running 4.4GHz @ stock voltage (1.33V under Prime) with just a bit of tweaking. I had it all the way down to 1.15V at stock speed (didn't try lower), which puts it at a calculated 86W TDP. As far as speed against my Phenom 960T (6 cores at 3.9GHz), it's roughly the same per clock on many synthetic tests but noticeably faster in games, even ones that only use 1-4 cores. I originally planned on grabbing an i7 combo but couldn't quite shake the AMD out of me. I really need a better cooler; my CNPS 9700 is just enough to keep it happy at 4.4GHz but will not support more speed.

Combos like that may be the saving grace of FX post-price-cut, especially if the new set of boards that are supposedly coming out to complement the cut can deliver 6+2 phase power designs at an acceptable price (I'm thinking $60-$80 would be awesome, but I'm not holding my breath).

Regarding Stars vs Steamroller: unless one runs specific workloads that are heavy on the FP unit, Kaveri is pretty much on par with the 32nm Stars core where IPC is concerned, maybe even a tad better. It also clocks ~30% higher than the 32nm Stars core within the same power envelope, and consumes less power under full load.

That is one of the reasons why I am more interested in Kaveri than a discounted FX. I know I can whip my K10.5 with an A8-7600, provided I can do a bclk-based overclock. Assuming full access to the turbo multipliers (which is something I can't necessarily take for granted), one should be able to hit 4180MHz in AHCI mode (110 x 38) or 4902MHz in IDE mode (129 x 38). Of course, there's no way it's going to boot at that speed on air.
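(For reference, those figures are just bclk x multiplier; a trivial sketch, with the AHCI/IDE bclk ceilings being assumptions about typical platform limits:)

Code:
# Effective core clock is simply the reference clock (bclk) times the
# CPU multiplier. The bclk ceilings below are assumed typical limits:
# AHCI-attached drives tend to cap bclk lower, while IDE mode tolerates
# a higher reference clock at the cost of disk performance.

def effective_clock_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

print(effective_clock_mhz(110, 38))  # 4180 MHz, AHCI mode
print(effective_clock_mhz(129, 38))  # 4902 MHz, IDE mode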
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
They don't run hotter; they are less tolerant of heat. 70C is the max temp according to AMD Overdrive (at least for my 8350), as reported by the Thermal Margin value (the distance between 70C and the current temp). For example, when Coretemp reports 40C, OD reports the margin as 30C.

The sensor is not located in the core, and it's not calibrated either.

http://help.argusmonitor.com/index.html?TemperaturemeasurementforAMDCPUs.html

We can't say for sure unless it's measured from the core, but I would be surprised if AMD's CPUs didn't throttle in the 100C area like Intel's. It's just two vastly different ways of measuring it. The "less heat tolerant" claim is utter bogus, though; I have seen heatsinks hotter than the reported temp.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Why not?? You have the same hardware; you use the same voltage and OC to the FX-8370E frequency. You have the same TDP at 4.1GHz at a lower price, simple as that. ;)

No you don't, because they are not binned equally. You may be able to stay below 95W; you may also end up above it.

Keeping the same voltage is not some kind of guarantee against higher power draw. As frequency goes up, so do the amps.

So it's utter rubbish and misleading to claim you can just OC and have the same TDP.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
No you don't, because they are not binned equally. You may be able to stay below 95W; you may also end up above it.

Keeping the same voltage is not some kind of guarantee against higher power draw. As frequency goes up, so do the amps.

So it's utter rubbish to claim you can just OC and have the same TDP.

Yes you can; TDP is not power consumption.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
So as I see it, AMD's specs on their temps don't mean much, just like their published TDP without accompanying thermal/current specs, which allows AMD to get away with running their CPUs out of spec.

It's one thing not to believe their TDP claims. That's fair here, because they didn't provide the rest of the variables used to calculate TDP. It's another thing to doubt the maximum operating temperature they have committed to on the spec sheet. Regardless of whether you are able to measure it, AMD chips must somehow measure and control it in order to activate their thermal protections. This metric doesn't allow for cheating; it doesn't depend on other variables.

So yes, I think AMD chips run cooler than Intel's. But no, I don't think that has any meaning beyond operating at lower temperatures than Intel chips. It is still an extremely inefficient chip, it is still a power hog, it still demands a lot of noisy cooling, and it doesn't have a reliability or life span edge over Intel chips.

I also don't believe AMD's TDP claims; the 8350 breached its TDP spec by at least 10%, and I think that is the explanation for the 95W 8370E: it is in reality a "95W" chip, much like the 8370 is a "125W" chip, except when talking to an AMD reseller. But I really can't see them fudging their maximum operating temperature; they have no reason to do so.
 

Jovec

Senior member
Feb 24, 2008
579
2
81
The sensor is not located in the core, and it's not calibrated either.

http://help.argusmonitor.com/index.html?TemperaturemeasurementforAMDCPUs.html

We can't say for sure unless it's measured from the core, but I would be surprised if AMD's CPUs didn't throttle in the 100C area like Intel's. It's just two vastly different ways of measuring it. The "less heat tolerant" claim is utter bogus, though; I have seen heatsinks hotter than the reported temp.

I'll clarify my lax comment.

Single sensor is true (the same temp is reported for all cores). Heatsink temperature is irrelevant. Socket temperature is irrelevant.

AMD sets TCTL_MAX to 70 for lidded CPUs. AMD CPU "temperature" gets reported as TCTL. What monitoring programs report as the AMD thermal margin is really TCTL_MAX - TCTL. When TCTL >= TCTL_MAX, the CPU will employ throttling and other techniques to reduce TCTL. Yes, these values do not necessarily correspond to actual temperatures. I don't care if my FX CPU is 5C or 500C in actual degrees; I care about how close it is to TCTL_MAX. Common vernacular has equated these values with core temps in degrees Celsius, so we use them interchangeably to very little detriment.
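In code terms, the convention described above boils down to something like this (a sketch of the reporting arithmetic only, not AMD's actual firmware logic):

Code:
# Sketch of the Tctl reporting convention described above. This is just
# the arithmetic monitoring tools use, not AMD's actual firmware logic.

TCTL_MAX = 70.0  # throttle point for lidded FX CPUs, per AMD Overdrive

def thermal_margin(tctl):
    """Headroom before the throttle point (what AMD Overdrive displays)."""
    return TCTL_MAX - tctl

def should_throttle(tctl):
    """The CPU engages throttling and other measures once Tctl hits max."""
    return tctl >= TCTL_MAX

print(thermal_margin(40.0))   # 30.0 -- matches the 40C / 30C example above
print(should_throttle(80.0))  # True -- a reported 80 on AMD means trouble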

We do the same thing with Intel and Tcase and Tjunction. My "less tolerant to heat" comment was poorly worded. Intel's TJmax is higher than TCTL_MAX, so monitoring apps can report higher values for Intel CPUs. A reported value of 80 for Intel is well within margins; 80 for AMD will have the system implementing safety measures.

How close the reported Intel and AMD temps are to the actual values matters little when we talk about the effects of overclocking and cooling. They are, in effect, the CPU temperatures that both we, and the CPU itself, base decisions on.
 

monkeydelmagico

Diamond Member
Nov 16, 2011
3,961
145
106
Hope someone does a direct comparison (performance and power usage) of the 125W vs 95W TDP 8370s to see what the actual difference is. It would be interesting to see whether it does in fact use 30 watts less at the same sustained performance level, or if it's just a "creative" TDP rating and/or spending less time at max turbo.

In a recent TH review of the A10-7800, they underclocked a 7850K to compare. It's a stretch, but I'm thinking the 95W 8370 part will just have a few tweaks like you mention, such as less time at turbo, binning for lower voltage, etc. Even slight changes to operating parameters can yield some impressive efficiency gains.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
So yes, I think AMD chips run cooler than Intel's. But no, I don't think that has any meaning beyond operating at lower temperatures than Intel chips. It is still an extremely inefficient chip, it is still a power hog, it still demands a lot of noisy cooling, and it doesn't have a reliability or life span edge over Intel chips.

Life span or reliability over Intel chips? I've had AMD chips (as well as Intel) since the 486 days -- one brand isn't any less reliable than the other. I've actually had to replace more Intel chips under warranty over the years, to be honest... even though we had a pretty even split of Intel/AMD at the non-profit.
 