
AMD set to slash FX CPU pricing on September 1


AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
From an enthusiast's point-of-view, the only reason to get an FX is to overclock it to 9590 levels of performance so as to mitigate Piledriver's single-threaded performance problems. So, that means getting an 8320 or 8350 (or 8320E/8370E after Sept. 1) and pushing it to at least 4.5-4.7GHz, depending on what the silicon lottery allows.

Now you have to deal with the fact that only a few motherboards out there reliably support the kind of power draw that Vishera can muster at such speeds. Assuming your overclocked chip is pulling over 200W (which is not outside the realm of possibility), you do not want to be caught with a 4+1 phase board, period.

If you are going to stop at the 4.5-4.7GHz range, the Gigabyte UD3 is probably the sweet spot with its 6+2 phase design. There are reportedly some folks who have run the 9590 (often with a bit of undervolting) on this board.

If you want your FX to shine in all its glory by shooting for 5 ghz or higher, then it's pretty much the Sabertooth. There are other options, but most of the good ones cost more.

So, just looking at the board, you can expect to pay at least ~$110 for the UD3, or $170 (or more) for the Sabertooth. And that doesn't even include cooling for the chip, which will almost certainly have to be a strong aftermarket HSF or better. Some of us are fortunate enough to already have big HSFs sitting around, but even those might not be enough for the 5GHz club.

And since neither the UD3 nor Sabertooth have onboard video, you're going to need a vid card too.

Alternatively, if you are just looking for WCG points, you could probably get a cheap 4+1 board and run an 8320 undervolted (or maybe one of the new 8370Es) for much, much less money.

But, I digress.

For those of us determined to make the most of the FX in a wide variety of tasks, overclocking is a must, and when overclocking comes into play, the need for excellent cooling and solid boards rears its ugly head.

I realize that Kaveri does not offer the kind of multithreaded power of an overclocked FX, but for my money, I would much rather see an A8-7600 on an Asus A88X Plus. Allegedly, the Asus Plus and Pro boards can be made stable at 129MHz BCLK in IDE mode, allowing the 7600 to hit 4257MHz. You probably wouldn't need much of an HSF to pull that off, either, AND you wouldn't even need a dGPU unless you really wanted one.
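For what it's worth, the implied multiplier behind those numbers is easy to check; both figures come straight from the claim above, nothing else is assumed:

```python
# Implied multiplier for the Kaveri BCLK claim above.
BCLK_MHZ = 129      # reported stable BCLK on the Asus Plus/Pro boards
TARGET_MHZ = 4257   # quoted resulting clock

print(TARGET_MHZ / BCLK_MHZ)  # 33.0 -> a 33x multiplier at 129 MHz BCLK
```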

And all that for maybe $200-$220 for CPU + board.

FX ownership is expensive even before taking utility bills into account.

If board costs can come down somehow, then the prospect of AM3+ being a discount overclocking platform might seem more realistic. As it stands, there are too many other interesting budget overclock options going on right now (G3258, maybe the A8-7600, the 860k) for octal-core FX to fit in, even with a healthy price cut.
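To put rough numbers on the quote above, here's a quick tally; the FX-8320 price comes from later in this thread, while the $35 aftermarket HSF is an assumed placeholder:

```python
# Rough platform-cost tally from the figures in the post above.
# The $35 HSF is an assumption; the other prices are quoted in this thread.
fx_build = {"FX-8320": 154, "Gigabyte UD3": 110, "aftermarket HSF": 35}
kaveri_build = {"A8-7600 + Asus A88X Plus": 210}  # midpoint of the $200-$220 quote

print(sum(fx_build.values()), "vs", sum(kaveri_build.values()))
# ~$299 vs ~$210, and the FX build still needs a video card on top
```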

First of all, there is more than one reason to get the FX CPUs, and OCing to FX-9590 levels is certainly not the only one.
The Vishera sweet spot is 4.4GHz with Turbo off. At that frequency you get very acceptable single-thread performance and really, really good MT performance at almost the same power usage as an FX-8350 at default.

Turbo uses way too much voltage, elevating the power usage of the entire platform. Overclock an FX-8320 to 4.2GHz (on the default heatsink) with Turbo off and a voltage lower than the default 1.425V, and you get a much faster CPU at lower power consumption than an FX-8350 at default. Reviews haven't shown that because they only run the CPU at default settings.
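A rough dynamic-power sketch shows why this can work. The stock clock and the 1.425V default are from the post above; the 1.30V undervolt is an assumed example:

```python
# Back-of-the-envelope CMOS dynamic power: P ~ C * V^2 * f (leakage ignored).
# The undervolt figure below is an assumption for illustration.

def relative_power(f_base, v_base, f_oc, v_oc):
    """Dynamic power of the OC'd setting relative to stock (stock = 1.0)."""
    return (f_oc / f_base) * (v_oc / v_base) ** 2

# FX-8320 stock: 3.5 GHz at the default 1.425 V mentioned above.
# Undervolted OC: 4.2 GHz at an assumed 1.30 V.
print(relative_power(3.5, 1.425, 4.2, 1.30))  # ~1.0: 20% more clock, same power
```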

The high-priced motherboard for OC is a myth; my FX-8350 was working stable at 4.7GHz on my ASUS M5A97 R2.0. There was no throttling at all.

FX-8320 = $153.50 on Amazon

ASUS M5A97 R2.0 = $95.24 on Amazon

Use the default heatsink and OC to 4.2GHz with Turbo off, and for ~$249 you have a very nice system with acceptable ST and very nice MT performance, 6x SATA III and USB 3.0.
Add in a nice GPU like an R9 280 or Tonga and you can play every game available today, and even upcoming releases, for at least 2-3 years.

For a budget system, and for users who want/need that CPU performance, it is still good to go. Now with even lower prices they will become even better.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Meh, I'm happy with their decision to be honest. Instead of spending their limited resources on a processor based on a flawed architecture, they are putting it towards their next generation product. Hopefully the new x86 architecture will be a return to competitiveness.

They already have Steamroller and Excavator ready on 28nm; at 20nm those designs would be way better, and available today rather than in 2016+.
I'm pleased they are designing a new uArch, but leaving the FX line at 32nm for 4+ years was, in my opinion, a mistake.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
No problems with heat? There are some who complain that their Haswells aren't stable running Prime95 28.5 at stock. While hardly scandalous, it isn't entirely accurate to say that Intel chips are always running cooler thanks to their lower TDPs.



What you are saying sort of plays into one of my previous comments in this thread (and serves to highlight why FX price cuts alone may not be enough).

From an enthusiast's point-of-view, the only reason to get an FX is to overclock it to 9590 levels of performance so as to mitigate Piledriver's single-threaded performance problems. So, that means getting an 8320 or 8350 (or 8320E/8370E after Sept. 1) and pushing it to at least 4.5-4.7GHz, depending on what the silicon lottery allows.

(snip -- full post quoted above)


I use this board. It is a bit cheaper than the Sabertooth and a good chunk cheaper than the Crosshair. I've never used either of the Asus boards, so I can't say anything confidently, but I wouldn't be surprised if the ASRock was every bit as good. I am benchmark-stable above 5.3GHz with enough voltage to pull almost 400 watts on the CPU alone. But I generally don't run it that way; I'm having more fun undervolting these days.

There is no argument that FX CPUs use more power than their Intel counterparts. But I think this is something that is blown out of proportion for the average user. For server farms and people who will have their CPU loaded often for extended periods of time, I get it; that makes sense. For someone who games, even for hours at a time, it just doesn't really matter. (That is, if you live where electricity is reasonably priced.)

Also, if you live in a cooler climate, that extra heat isn't wasted. I live in WI; any extra wattage that passes off my radiator enters the environment in my home. For ~7-8 months a year, that means I'm paying whatever the electricity cost would be for that heat. I don't know how it would compare in efficiency or cost to my natural gas furnace doing its job. But the point is, if you heat your home, that extra heat energy isn't wasted cost. No one takes this into account when they compare costs. How much of the electricity I paid for my CPU use is keeping my furnace off longer for the majority of the year? And how much does that in turn save me in natural gas costs?

I'm not trying to justify the FX as a better CPU than what Intel builds. But I see the running costs of these machines compared a lot, and no one takes into account that for many homes the extra heat energy isn't necessarily wasted money.
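Out of curiosity, the furnace comparison can be penciled out. Every rate and efficiency below is an assumption, so substitute local utility numbers:

```python
# Cost per kWh of heat delivered into the house: PC waste heat vs. gas furnace.
# All rates and efficiencies are assumptions; plug in your own utility prices.

ELEC_RATE = 0.12       # $/kWh (assumed)
GAS_RATE = 1.00        # $/therm (assumed)
FURNACE_EFF = 0.95     # assumed condensing-furnace efficiency
KWH_PER_THERM = 29.3   # energy content: 1 therm = 29.3 kWh

elec_heat = ELEC_RATE                               # resistive heat is ~100% efficient
gas_heat = GAS_RATE / (KWH_PER_THERM * FURNACE_EFF)

print(f"electric heat: ${elec_heat:.3f}/kWh, gas heat: ${gas_heat:.3f}/kWh")
# ~3x more per kWh from electricity: the CPU's heat offsets the furnace,
# but at these rates it doesn't fully pay for itself.
```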



Meh, I'm happy with their decision to be honest. Instead of spending their limited resources on a processor based on a flawed architecture, they are putting it towards their next generation product. Hopefully the new x86 architecture will be a return to competitiveness.


I kind of wish they would shrink them. If they could chop the power use down and keep performance as-is, they could probably sell a few Opterons. As I said above, I don't think power use matters much for average users and gamers, but for data centers it does. That said, I don't know how aged the platform for Vishera-based Opterons is; that might be a deal killer anyway.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
It is nice that I can use ECC on AM3+ boards. I suppose I could upgrade the one I have that has a 1090T if I feel like it. I am wondering why anybody would get the 8370 over the 8370E. If the only difference is TDP, why would anybody get the higher one, especially since both cost the same? Maybe there is something else.
 

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
Also, about power usage.

FX-8350 + HD7950 BF3 MT

Core i7 3770K + HD7950 BF3 MT

The difference in real gaming is close to 70W, do your calculations for 2 hours of game play per day with your electricity rates.

That seems about right. If you look at Legit Reviews and PCGamesHardware, the difference between the FX-8350 and IB/Haswell Core i5 and Core i7 CPUs is between 35W and 100W at the wall when gaming.

It will be interesting to see how much power the 95W FX-8370E consumes, though.
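For reference, here's the calculation the quoted post suggests; the electricity rate is an assumed placeholder:

```python
# Yearly cost of the ~70 W gaming delta quoted above, at 2 hours/day.
DELTA_W = 70
HOURS_PER_DAY = 2
RATE = 0.12  # $/kWh (assumed; use your own rate)

kwh_per_year = DELTA_W / 1000 * HOURS_PER_DAY * 365
print(f"{kwh_per_year:.1f} kWh/year -> ${kwh_per_year * RATE:.2f}/year")
# ~51 kWh/year, i.e. roughly $6/year at $0.12/kWh
```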
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Also, about power usage.

FX-8350 + HD7950 BF3 MT

Core i7 3770K + HD7950 BF3 MT

The difference in real gaming is close to 70W, do your calculations for 2 hours of game play per day with your electricity rates.

Those figures aren't set in stone.

I'm undervolting my FX 8320 and it only draws 27 watts more than my i7 3770K. Pretty much any unlocked processor can be run at whatever power consumption / performance level the user desires. 27 watts is pretty insignificant.... If I go and pop a bag of popcorn now in my 1100 watt microwave -- I'll use way more electricity in a scant 3 minutes. 27 watts is not even half the draw of an average light bulb.

Many people don't even realize how much more power overclocks are responsible for. I remember my buddy Dave is like -- "I bought my Haswell i7 for the lower power consumption. You've got to dump that FX 8320 because it is so power hungry." I hooked up my Kill A Watt to both desktops: his overclocked Haswell was pulling more than twice as much electricity as my FX does under load. Power consumption depends on your settings, and the graphics card weighs in a lot, too. Some of the high end video cards pull more juice than the CPU.
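The popcorn comparison holds up as a quick sanity check; both figures come from the post above:

```python
# Energy sanity check: one bag of microwave popcorn vs. the 27 W delta.
microwave_wh = 1100 * (3 / 60)   # 1100 W microwave running for 3 minutes
delta_w = 27                     # extra draw of the undervolted FX 8320

print(f"popcorn: {microwave_wh:.0f} Wh")
print(f"that energy runs the 27 W delta for {microwave_wh / delta_w:.1f} hours")
# ~55 Wh -- about two hours' worth of the FX's extra draw
```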
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
I doubt it. In a perfect world, even if AMD had caught up in IPC to Intel's next-generation architecture, they would still be behind Intel's world-leading manufacturing process. It's really not possible for anyone to compete with Intel on performance or price/performance in the mid-range and high-end CPU space. For AMD to have any chance, TSMC/GloFo/Samsung/IBM would need a high-performance, class-leading fabrication process -- they don't! But even the first part, AMD matching Intel in IPC, is hardly possible given how far behind they are and how much less AMD has to spend on R&D and world-class engineers. AMD would need to pull a Pentium 4 --> Core 2 Duo type of architectural leap to overcome a 65-70% IPC disadvantage, which will become almost 90-100% with Skylake/Cannonlake.

The only timeframe in the history of AMD vs. Intel when AMD was seriously competitive was mostly the result of Intel's failed architecture (Pentium 4/D), not of anything extraordinary from AMD's CPU division. Had Intel launched the Pentium M (Banias) on the desktop, it would have owned the Athlon 64/X2 as well.

Because AMD is going to be constantly behind Intel in fabrication process, even if their CPUs were just as fast for a similar price, AMD's CPUs would use more power. Even if AMD's next architecture manages to match Skylake/Cannonlake in IPC and price/performance, Intel can just lower a 6-core Skylake-E to $299-340, and it's not going to be possible for AMD to compete on the high end. I think it's only really possible to compete with Intel in the high end if HSA takes over and the graphics component starts taking over various aspects of general-purpose computing/PC tasks. Otherwise, AMD will compete in the lower segments where their APUs and FX-6300 currently reside.
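As a quick check of how the quoted gap could compound, here's a sketch; the per-generation uplifts are purely assumed for illustration:

```python
# Compounding the quoted single-thread gap across future generations.
haswell_vs_piledriver = 1.65   # ~65% gap, per the quote above
assumed_gains = [1.05, 1.10]   # hypothetical Broadwell and Skylake uplifts

gap = haswell_vs_piledriver
for g in assumed_gains:
    gap *= g
print(f"projected gap: {(gap - 1) * 100:.0f}%")  # ~91%, near the quoted 90-100%
```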

The very fact that you bring up such an argument would be a success for AMD. AMD's next FX in 2016 will be a CPU sporting at least 8 big cores (16 threads) on a Samsung/GF 14nm LPP process. AMD needs a high performance core which is competitive in perf/sq mm and perf/watt with Skylake. Even being marginally behind (<= 10%) won't hurt AMD, as they can still compete on price/perf. Intel is loving their record gross margins while AMD is struggling to turn a profit; a competitive core will definitely improve AMD's sales and financials.

AMD can also look at bundling high performance CPUs and flagship GPUs for gaming system integrators, along with high performance DDR4 Radeon-branded memory. AMD can do a lot of aggressive bundling; their strategy with Radeon-branded memory, SSDs, etc. points to them laying the foundations for complete AMD gaming bundles in 2016. I foresee AMD doing the same with their HBM-based FinFET APUs too.

AMD has all the IP except a world class high performance core. Once they have anywhere close to a competitive core, it's going to be a much better situation for AMD than today. Beyond a point, Intel will not want to compete on price (especially when marginally ahead) and hurt their gross margins. AMD has no such problem, as right now they have almost 0% market share in the high end desktop. An 8-core, 16-thread FX CPU competitive with an 8-core Skylake (<10% perf gap) selling for USD 350-400 is a distinct possibility; for AMD any sale is a gain, since right now it's zero.

BTW, TSMC and Samsung are closing the process gap aggressively. Intel's lead at 14nm is roughly 9-12 months, and future nodes like 10nm could see that go down to as low as 6 months.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Meanwhile, Haswell's single-thread performance is 65% higher than Piledriver's, based on my tests.

Yet many Ivy Bridge owners (like myself) feel Haswell is still a huge disappointment. It probably ain't worth the upgrade for most Sandy / Ivy owners.

I think Maximum PC sums it up best:
"So, Haswell runs a bit hotter, takes some voltage control out of your hands, eliminates the non-K overclocks, doesn&#8217;t give enthusiasts access to the large L4 cache version, doesn&#8217;t have TSX in the K parts, and, well, requires a new motherboard, too. You&#8217;re probably wondering just where the hell the good news is for enthusiasts with Haswell."
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Yet many Ivy Bridge owners (like myself) feel Haswell is still a huge disappointment. It probably ain't worth it for most Sandy / Ivy owners.

I think Maximum PC sums it up best:
"So, Haswell runs a bit hotter, takes some voltage control out of your hands, eliminates the non-K overclocks, doesn’t give enthusiasts access to the large L4 cache version, doesn’t have TSX in the K parts, and, well, requires a new motherboard, too. You’re probably wondering just where the hell the good news is for enthusiasts with Haswell."

That is if you already own Sandy or Ivy, for someone with Core 2/Nehalem/Westmere CPUs, Haswell is exceptional. ;)
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Yet many Ivy Bridge owners (like myself) feel Haswell is still a huge disappointment. It probably ain't worth it for most Sandy / Ivy owners.

I think Maximum PC sums it up best:
"So, Haswell runs a bit hotter, takes some voltage control out of your hands, eliminates the non-K overclocks, doesn’t give enthusiasts access to the large L4 cache version, doesn’t have TSX in the K parts, and, well, requires a new motherboard, too. You’re probably wondering just where the hell the good news is for enthusiasts with Haswell."


Not to change the subject, but what voltage control do you lose with Haswell? I know each CPU has its own VID, but I was under the impression that you could still adjust voltage.
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
No problem there: Phenom II is faster than the Zambezi/Vishera clock for clock.

It's faster than Zambezi but not Vishera. AMD's stated goal was to get IPC similar to Phenom II from Bulldozer. They met that goal with Piledriver (which is really what Bulldozer should have been, and Piledriver should have been Steamroller).
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Hmm. Well, whatever their reasons for the price cut, for the sake of AMD and the buying public, let us hope they have something interesting to take over the flagship spot from the aging FX CPUs.

You do raise an interesting point; in fact, much of what I have been doing lately has been in Xubuntu. However, one of the major headaches involved with the octal-core FX chips in particular is the insane thermals/power draw, no matter what your operating system. Sure, the FX lineup will be dropping in price, but you still have to pay for:

Even with the price drop, are you saving any money versus going Intel by getting an 8350 and overclocking it to, say, 4.5GHz (let's be conservative)?

Yes, we are saving a lot of money running a few FX chips in the mix -- although, for 24/7 running on World Community Grid, most of us are going in the other direction: undervolting and underclocking. My undervolted FX 8320 is only pulling 27 more watts than my 3770K. So far, I haven't been able to get my i7 stable when undervolted, although I suspect that has more to do with the cheap motherboard. But unlocked chips from either company really open up a ton of options. If you're running a variant of Ubuntu, an FX performs much like an equivalent Ivy, whereas Windows 7 appears to hold an FX down to Nehalem levels. The bang for the buck of an FX is insanely good if you plan on being a Linux user.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
It's faster than Zambezi but not Vishera. AMD's stated goal was to get IPC similar to Phenom II from Bulldozer. They met that goal with Piledriver (which is really what Bulldozer should have been, and Piledriver should have been Steamroller).

:(

When I underclocked an FX-6300 to 2.8GHz it was still a bit slower than the AMD Phenom II X6 1055T we were testing it against. Both were running the same memory speed -- so I dunno.... In the real world, I still think the Stars core has a little more pop at the same clock speed. The Stars core hits a brick wall faster when overclocking, though -- so Vishera owns once you get up past 4GHz.
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
:(

When I underclocked an FX-6300 to 2.8GHz it was still a bit slower than the AMD Phenom II X6 1055T we were testing it against. Both were running the same memory speed -- so I dunno.... In the real world, I still think the Stars core has a little more pop at the same clock speed. The Stars core hits a brick wall faster when overclocking, though -- so Vishera owns once you get up past 4GHz.

When one thread has a Piledriver module all to itself, it's about as fast as a Phenom II core. When there are two threads on one module, that's when they become slower than two Phenom II cores -- about a 15% decrease in per-thread performance.
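A tiny throughput model of that behavior, using the 15% figure above and treating one thread alone on a module as 1.0 Phenom II core-equivalents:

```python
# Throughput model for CMT module sharing, per the description above.
MODULES = 4      # FX-8350: four Piledriver modules
PENALTY = 0.15   # per-thread slowdown with two threads on one module

four_threads = MODULES * 1.0                    # one thread per module -> 4.0
eight_threads = MODULES * 2 * (1.0 - PENALTY)   # fully loaded -> 6.8

print(four_threads, eight_threads)
# A fully loaded 8-thread FX behaves like ~6.8 Phenom II cores in this model.
```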
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Definitely agree. I would hope an enthusiast would have moved on from those antiques by now.

Ironic, considering a 2008 Core i7 920 @ 4.2GHz or a 2009 i7 860 @ 4.0GHz would beat a 5GHz 9590 in just about any game. That's been my counter-argument against every $200+ AMD CPU in the last 5-6 years. You save some $$$ upfront, but long term you end up upgrading a lot sooner and suffer lower performance in games. For example, look at the i7 2600K @ 4.7-5GHz. Those systems from early 2011 are still top-of-the-line for games, and by January 2015 they will be 4 years old. There has been no point in upgrading an Intel CPU since SB came out. If you amortize the cost of ownership of an i7 vs. an FX and account for electricity costs, the AMD system is barely more cost effective, and if you are running DC projects like SETI it is going to be a disaster for electricity costs.

With the 5820K, Intel is moving their 6-core to $400 or less, and soon DDR4 prices will fall. But despite how good the 5820K might be, the fact that the 2600K is still top notch nearly 4 years later shows just how little risk there is in buying an Intel CPU when you don't know exactly what your workload will be over the next 4-5 years, because they are good at both single-threaded and multi-threaded tasks.
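Here's a rough amortization sketch of that argument. The CPU prices, the ~70W delta, and the electricity rate are all assumptions drawn from figures quoted elsewhere in this thread:

```python
# Back-of-the-envelope 4-year cost of ownership. All inputs are assumptions:
# CPU prices and the ~70 W delta come from figures quoted in this thread.

def tco(cpu_cost, extra_watts, hours_per_day, years=4, rate=0.12):
    kwh = extra_watts / 1000 * hours_per_day * 365 * years
    return cpu_cost + kwh * rate

print(f"FX-8320, 2 h/day gaming: ${tco(154, 70, 2):.0f}")    # ~$179
print(f"FX-8320, 24/7 DC load:   ${tco(154, 70, 24):.0f}")   # ~$448
print(f"i7 reference (no delta): ${tco(340, 0, 2):.0f}")     # ~$340
# Light use: the FX's upfront saving wins. Under 24/7 load, the ~70 W delta
# alone costs ~$294 over 4 years, eroding most of the price advantage.
```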
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
Regarding power consumption, let's not overlook the fact that at idle Vishera is very frugal, and unless you are running CFD simulations or similar all day long, the processors will come back to lower power states. In many cases, even during normal operation, they stay at the lower speeds. The A10-7850K in the Mrs.' PC rarely speeds up even with a ton of Facebook tabs open ;)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Ironic, considering a 2008 Core i7 920 @ 4.2GHz or a 2009 i7 860 @ 4.0GHz would beat a 5GHz 9590 in just about any game. That's been my counter-argument against every $200+ AMD CPU in the last 5-6 years. You save some $$$ upfront, but long term you end up upgrading a lot sooner and suffer lower performance in games. For example, look at the i7 2600K @ 4.7-5GHz. Those systems from early 2011 are still top-of-the-line for games, and by January 2015 they will be 4 years old. There has been no point in upgrading an Intel CPU since SB came out. If you amortize the cost of ownership of an i7 vs. an FX and account for electricity costs, the AMD system is barely more cost effective, and if you are running DC projects like SETI it is going to be a disaster for electricity costs.

With the 5820K, Intel is moving their 6-core to $400 or less, and soon DDR4 prices will fall. But despite how good the 5820K might be, the fact that the 2600K is still top notch nearly 4 years later shows just how little risk there is in buying an Intel CPU when you don't know exactly what your workload will be over the next 4-5 years, because they are good at both single-threaded and multi-threaded tasks.

You will not need to upgrade a 4.4GHz 8-core Vishera for at least another 3-4 years. Modern game engines can utilize more than 4 threads. All upcoming games using Frostbite 3, CryEngine 3, and Unreal Engine 4 will be just fine with a 4.4GHz Vishera.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
AM3+ is not anywhere near frugal at idle unless you are comparing it to the E platforms. FM2+ is pretty frugal.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Ironic, considering a 2008 Core i7 920 @ 4.2GHz or a 2009 i7 860 @ 4.0GHz would beat a 5GHz 9590 in just about any game. That's been my counter-argument against every $200+ AMD CPU in the last 5-6 years. You save some $$$ upfront, but long term you end up upgrading a lot sooner and suffer lower performance in games. For example, look at the i7 2600K @ 4.7-5GHz. Those systems from early 2011 are still top-of-the-line for games, and by January 2015 they will be 4 years old. There has been no point in upgrading an Intel CPU since SB came out. If you amortize the cost of ownership of an i7 vs. an FX and account for electricity costs, the AMD system is barely more cost effective, and if you are running DC projects like SETI it is going to be a disaster for electricity costs.

I guess you haven't read the last two pages of this thread. Power consumption is up to user preference for any unlocked processor -- regardless of manufacturer. All FXs happen to be unlocked.

I mean, are you serious? You just complained about the power consumption of AMD chips... then picked an i7 920 -- a chip that gobbles more than twice as much electricity as my undervolted FX 8320. Dude, seriously? My FX desktop is pulling 168 watts under full load, compared to 417 watts for an i7 920 system (which, BTW, Tom's clocked slower than the speed you're suggesting to run it at). At 4.2GHz, that i7 920 is probably pulling around 450 watts. So not only did you pay more for that Intel CPU, now you're paying considerably more in electricity. My FX averages the same scores as my 3770K on BOINC -- and the extra 27 watts the FX draws compared to my i7 ain't breaking the bank. What you're advocating doesn't add up at all. Running an old Intel chip is likely going to cost the user a lot more in electricity.

[Image: psu_load_power.png -- PSU load power comparison chart]
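For scale, the wall-draw gap claimed in that post works out as follows; the electricity rate is assumed:

```python
# Yearly cost of the claimed full-load gap: 417 W (i7 920) vs 168 W (FX 8320),
# if both systems crunched 24/7. The rate is an assumption.
GAP_W = 417 - 168
RATE = 0.12  # $/kWh

kwh = GAP_W / 1000 * 24 * 365
print(f"{kwh:.0f} kWh/year -> ${kwh * RATE:.0f}/year")  # ~2181 kWh, ~$262/year
```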
 