Is there any reason to use FX CPUs right now?


mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Saying "far from meaningful" is really an out-of-touch-with-reality statement; the deal with Verizon is very meaningful, and 1&1 potentially is too...

AMD's entire CPU business is circa $300-350M per quarter, and servers are about 5-10% of that, while Intel's server business is $4.1 billion per quarter. Given the discrepancy between the numbers, two orders of magnitude to be more precise, I'd say that in the big scheme of things Verizon and 1&1 buying a few AMD servers are largely irrelevant. Those deals are too small; had AMD a healthy server business, these deals wouldn't even be worth mentioning, but AMD's server business has sunk to levels where anyone ordering AMD servers at all becomes news.
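A quick back-of-the-envelope sketch of that gap, using only the rough per-quarter figures above (midpoints assumed; these are not official numbers):

```python
# Sanity check of the "two orders of magnitude" claim, using the rough
# per-quarter figures quoted above (midpoints assumed, not official numbers).
amd_cpu_revenue = 325e6        # "circa $300-350M per quarter"
amd_server_share = 0.075       # "about 5-10% of that"
intel_server_revenue = 4.1e9   # "$4.1 billion per quarter"

amd_server_revenue = amd_cpu_revenue * amd_server_share
ratio = intel_server_revenue / amd_server_revenue
print(f"AMD server revenue: ~${amd_server_revenue / 1e6:.0f}M per quarter")
print(f"Intel's server business is ~{ratio:.0f}x larger")
# -> ~$24M vs $4.1B, about 168x, i.e. roughly two orders of magnitude
```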

By the way, the goodwill AMD wrote down last quarter is related to the SeaMicro business. It seems that AMD again had to admit it paid too much for an acquisition, and that microservers are a much more limited market prospect than they once thought.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Nope.

The SeaMicro SM15000 uses both Intel and AMD chips.



http://www.amd.com/en-us/press-releases/Pages/verizon-selects-amd-2013oct7.aspx

Then it turns out that 3/4 of the actual chips are Intel Xeon E3s. AMD provides the interconnects and software, but the meat of the machine is Sandy Bridge.

http://www.extremetech.com/computin...rver-win-was-actually-a-massive-win-for-intel

Nobody said Verizon only uses AMD Opterons, but the latest deal was Opterons only.

http://www.moorinsightsstrategy.com/whos-processors-are-used-in-the-giant-verizon-cloud/

The answer is both: we are using Intel Xeon-class processors, and a bunch of our infrastructure is using Intel. Recently we switched to AMD Opterons, and as to why, one of the things we are looking for is increasing the memory per host. The Opterons allow us, in a single-socket configuration, to address more memory, so all of the new deployments we are putting out there are carrying 64GB per host and the 8-core Opteron processors.
 

DrMrLordX

Lifer
Apr 27, 2000
22,768
12,776
136
Somebody asked whether or not an FX chip would make a good server. It sort of went from there . . .
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
I was an AMD user from the Barton XP 1700 days (early 2000s) right up to Phenom II. With the Bulldozer release fiasco in 2011 (more power consumption and less performance than Phenom II? What kind of release is that?), I just switched to a Sandy Bridge 2500K and never looked back. It doesn't really matter for desktops anymore anyway; now you buy a desktop CPU and it'll last you forever (all optimizations have focused on mobile platforms and power efficiency). I only upgraded to Haswell because I got a killer deal on a 4790K ($300 and no tax, as I was a tourist in Japan). Otherwise the Sandy Bridge was just fine.
 

schmuckley

Platinum Member
Aug 18, 2011
2,335
1
0
I was an AMD user from the Barton XP 1700 days (early 2000s) right up to Phenom II. With the Bulldozer release fiasco in 2011 (more power consumption and less performance than Phenom II? What kind of release is that?), I just switched to a Sandy Bridge 2500K and never looked back. It doesn't really matter for desktops anymore anyway; now you buy a desktop CPU and it'll last you forever (all optimizations have focused on mobile platforms and power efficiency). I only upgraded to Haswell because I got a killer deal on a 4790K ($300 and no tax, as I was a tourist in Japan). Otherwise the Sandy Bridge was just fine.

real stuff
 
Apr 20, 2008
10,067
990
126
I had no other reason to upgrade my overclocked Core 2 Quad besides a single basketball game and StarCraft II. An FX-8350 for $125 back in September was a no-brainer. Shocking to everyone here, it plays every game I own very well and never gets hot enough to break 56°C, even with Prime95. It's been a near-silent (Arctic Freezer 7 Pro) CPU/HSF combo that offers entry-level i5/i7 performance at an i3 price for what I use it for.

Different strokes, but there's a big market for FX, especially the $90-115 FX-8310.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I had no other reason to upgrade my overclocked Core 2 Quad besides a single basketball game and StarCraft II. An FX-8350 for $125 back in September was a no-brainer. Shocking to everyone here, it plays every game I own very well and never gets hot enough to break 56°C, even with Prime95. It's been a near-silent (Arctic Freezer 7 Pro) CPU/HSF combo that offers entry-level i5/i7 performance at an i3 price for what I use it for.

Different strokes, but there's a big market for FX, especially the $90-115 FX-8310.

If there were such a big market, FX wouldn't be on life support and AMD would actually care about updating AM3+ instead of just focusing on APUs.
 
Apr 20, 2008
10,067
990
126
If there were such a big market, FX wouldn't be on life support and AMD would actually care about updating AM3+ instead of just focusing on APUs.

Considering 25% of the active gaming market share (Steam systems) is AMD, and we know how lackluster their laptop sales have been, where are these numbers coming from?
 
Aug 11, 2008
10,451
642
126
What, at the outside, 10 million Steam users? 25% of that is 2.5 million users. In the overall PC landscape that is a very tiny amount.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
In the past, AMD was more competitive, and a lot of Steam users have aging systems. I'd be willing to bet a lot of those AMD chips are Phenom IIs. And, with regard to servers, I can see those deals being a big deal to AMD even if they don't mean much in the grand scheme.
 
Aug 11, 2008
10,451
642
126
Steam is more like 80-90 million users.

However, it's hard to verify his claim, or the type of CPUs used, as the survey is currently having problems.

But when it works again we can see here:
http://store.steampowered.com/hwsurvey/processormfg/

How many of those participate in the hardware survey? In any case, it's still a minuscule number compared to the total installed PC base, only slightly more than *new* PC sales for one quarter.
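To put rough numbers on the scale argument (a sketch only; the Steam size and AMD share are the thread's own estimates, and the installed-base figure is a commonly cited ballpark, not a measured value):

```python
# Rough scale comparison using the figures tossed around in this thread.
# All inputs are estimates: Steam size and AMD share come from the posts above,
# and the ~1.5B installed-base figure is a commonly cited ballpark (assumption).
steam_users = 85e6            # "more like 80-90 million users"
amd_share = 0.25              # "25% of the active gaming marketshare (steam systems)"
installed_pc_base = 1.5e9     # assumed global PCs in use, order-of-magnitude only

amd_steam_systems = steam_users * amd_share
print(f"AMD systems on Steam: ~{amd_steam_systems / 1e6:.0f}M")
print(f"Share of installed base: ~{amd_steam_systems / installed_pc_base:.1%}")
# -> ~21M systems, roughly 1-2% of all PCs in use
```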
 

funboy6942

Lifer
Nov 13, 2001
15,362
416
126
I use the 8320 OC'd to 4.2GHz with a GTX 970, and I have no problems gaming with it. What I don't understand is why you have to have a billion FPS. My eye doesn't mind as long as it's over 30fps with all the eye candy turned on and the resolution all the way up, so why do I need more? And with my setup I can have 14 tabs open, run a movie, make a slideshow, burn it, watch YouTube, and stream music all at the same time without a single problem. Isn't that what it's all about? I'm on a tight budget and can't afford more; if I could, I would have gotten the 8350. But I'm a fanboy alright, a HUGE fanboy: a fanboy of my money and how it's spent! AMD, Intel, I don't care; if Intel made a chip that did the same as my AMD chip does for me at the same PRICE or better, then yes, I would buy Intel, but they don't. Do I need to run my games at 5 billion frames per second? No, I just need them to run over 30, and as long as all the eye candy is turned up, my eye can't see any difference between 30 and 400 fps at 60Hz.

So I'm not seeing what the hubbub is all about: screw AMD, spend more, and buy Intel so you can run your game at 400fps instead of 350, but you have to spend $200 more on the Intel chip to do it. Just buy what works for you. Be a fanboy of your cash, do some research, and buy what works for you and your situation. All my games run at 2560x1080 with all the eye candy turned on, just fine with no slowdown I can see, on a GTX 970 and a lousy little FX-8320. With my old GTX 770 it was struggling a little bit, but now with the 970 I'm back in the game, running all my new games just fine to me. Maybe not at 40 trillion fps like on an Intel chip, but just fine to me :D
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
30fps is setting your expectations pretty low. I'm OK with framerates in the 30s in some games (Civ5?) but find it very distracting in others.

Anyway, performance is performance. Regardless of what you use your computer for, I'd expect you'd want the best CPU for your uses. If an i3 will be faster in your games, and gaming is your primary use, why get an FX over an i3 if they're the same price? On the other hand, if you don't do much gaming but regularly multitask and run heavily parallel programs (an i3 would handle your example quite well, FYI), an FX might be a better choice. Right chip for the job.


EDIT:

Many of you probably haven't used a modern i3. I'm sure much of the same can be said about an FX chip, but in day-to-day use, it's indistinguishable from an i7. Chrome with a ton of tabs + Minecraft + Skype + WMP + burning a DVD runs without a hitch. Given that, there aren't many excuses to have 8 cores for basic Windows desktop use.
 
Last edited:

funboy6942

Lifer
Nov 13, 2001
15,362
416
126
Why do I need more than 30? Will my eyes orgasm? I'm not saying my games will only run at 30fps, but I can't tell the difference between the ~25 they ran at with my 770 and the 50+ they run at now with my 970, other than that I can now turn the eye candy all the way up on all my newer games, which I couldn't do before with the 770.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
You don't. The human eye+brain will perceive 24 fps video as seamless.

Evidence: Hollywood

24fps was chosen arbitrarily, years ago. Movies only look seamless because they have motion blur. Without that, they'd be impossible to watch at that framerate.

So now, the argument for FX chips is that 24fps is plenty.
 

janeuner

Member
May 27, 2014
70
0
0
24fps was chosen arbitrarily, years ago.

And 90 years later, we are used to it. I play games at 30-60 fps without motion blur, and I never notice the framerate unless I am specifically looking for it. The game is still fun, so who cares.

So now, the argument for FX chips is that 24fps is plenty.

No, the argument is like this:

Many of you probably haven't used a modern FX. I'm sure much of the same can be said about an i3 chip, but in day-to-day use, it's indistinguishable from an i7. Chrome with a ton of tabs + Minecraft + Skype + WMP + burning a DVD runs without a hitch. Given that, there aren't many excuses to have hyperthreading for basic Windows desktop use.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
24fps was chosen arbitrarily, years ago. Movies only look seamless because they have motion blur. Without that, they'd be impossible to watch at that framerate.

So now, the argument for FX chips is that 24fps is plenty.

Yep. Even so, the effect is jarring during large scene pans.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
And 90 years later, we are used to it. I play games at 20-60 fps without motion blur, and I never notice the framerate unless I am specifically looking for it. The game is still fun, so who cares.

I care, because unlike you, I immediately notice when my frame rate drops from 60 to 20. It's not a subtle difference either. If you ignore it, that's one thing. If you say you don't notice it, I'd have to seriously question your honesty and/or eyesight.
 

janeuner

Member
May 27, 2014
70
0
0
I care, because unlike you, I immediately notice when my frame rate drops from 60 to 20. It's not a subtle difference either. If you ignore it that's one thing. If you say you don't notice it, I'd have to seriously question your honesty and/or eye sight.

Oops, 30. "I play games at 30-60 fps without motion blur...."

20 fps is noticeable.

I am certain I can't perceive the difference between 30 and 60 fps. Elite: Dangerous has a configuration option that lets you limit the framerate to 30, 45, or 60 fps; I cannot distinguish between them.
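For what it's worth, the frame-time gaps being argued over here are just arithmetic (nothing specific to Elite: Dangerous or any particular hardware):

```python
# Frame times for the cap settings mentioned above; plain arithmetic,
# nothing specific to Elite: Dangerous or any particular GPU/CPU.
for fps in (24, 30, 45, 60):
    frame_time_ms = 1000 / fps
    print(f"{fps:>2} fps -> {frame_time_ms:5.1f} ms per frame")
# 24 -> 41.7 ms, 30 -> 33.3 ms, 45 -> 22.2 ms, 60 -> 16.7 ms
# Going from 30 to 60 fps halves the frame time (16.7 ms saved); going from
# 45 to 60 saves only ~5.6 ms, which is part of why the higher caps are harder
# to tell apart than a drop from 60 down to 20.
```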
 
Last edited: