Does the AMD FX line make sense?


FlanK3r

Senior member
Sep 15, 2009
Not to mention that Intel CPUs run within their specs, unlike the FX series. And they are much, much more power efficient, with higher overclocking headroom.

The highest OC headroom I've seen is with Sandy Bridge-DT and Vishera FX. On air I got a 5700+ MHz validation with my Vishera FX at 1.565 V, plus 5 GHz stable (my first CPU stable at 5 GHz). Cinebench runs at over 5300 MHz with 1.55 V, games at around 5200 MHz.
My Sandy Bridge can do 4940 MHz stable and a 54xx MHz validation, with Cinebench at around 5 GHz. With Ivy Bridge you can't get that without sub-ambient cooling, so those are the two chips I like most. I know there were some Sandy Bridges that could do 5.1 GHz stable and around 5600 MHz validation!

But the average is probably around 4800 MHz for SB-DT, 4800 MHz for Vishera FX, and 4600 MHz for Bulldozer FX, SB-E, and IB-DT.
 

ShintaiDK

Lifer
Apr 22, 2012
Consoles do have a say in how many threads are optimal. If both consoles end up with 8-core CPUs, then you are going to want 8 cores. But since a PC still needs one powerful thread, you will probably still want an Intel. If AMD weren't totally stupid, they would have paired one super powerful core with 4-8 Bobcat-type cores and labeled that as FX: one core with something like 8 INT clusters, two pairs of 256-bit FPUs, a massive OoO engine and register file, 2 MB of cache, and all the other good stuff, and then 7-8 two-issue cores, each with a single 256-bit FPU and only 512 KB of cache. Such a hypothetical CPU would crush single-threaded tasks and also put out great multithreaded numbers. And after a couple of years it would even have proper Windows support, lol.

The previous consoles, however, didn't make it so on the PC platform.
People assume that all cores will be used for gaming, yet the Xbox 360/PS3 generation proved otherwise.
 

Charles Kozierok

Elite Member
May 14, 2012
Power has barely been mentioned in this discussion. If you use your machine a lot, over the course of a year or two that difference can eat up a big chunk of the sticker-price gap between an FX and an equivalent Core iMumble chip.
 

crashtech

Lifer
Jan 4, 2013
FX makes sense if you already have a board that will accept it, and your electricity is fairly cheap.

I was all AMD during the time Intel foisted that awful Netburst crap on us. Those were the days! I look forward to AMD being truly competitive again some day.
 

Piroko

Senior member
Jan 10, 2013
Power has barely been mentioned in this discussion. If you use your machine a lot, over the course of a year or two that difference can eat up a big chunk of the sticker-price gap between an FX and an equivalent Core iMumble chip.
Only if you actually load it (games, video encoding). The idle difference between them is a lot smaller, and surfing or watching movies is a low-load scenario.
 

AtenRa

Lifer
Feb 2, 2009
The Intel 3770K is 20% to 50% faster at running games today than an AMD FX-8350. Was the Q8200 that much weaker at the time?

I'm not willing to give up that much performance for the next 5 years hoping the 8350 catches up and surpasses it. The AMD cores are starting at too much of a disadvantage.

Talking about misinformation :whistle:

[attached benchmark charts: CPU.png, CPU3.png, CPU_01.png, CPU1.png, proz.jpg]
 

ShintaiDK

Lifer
Apr 22, 2012
Talking about misinformation :whistle:

Once again I see manipulation with GPU bottlenecks and a single cherry-picked Crysis 3 benchmark, while ignoring the two others, simply to cover up the lackluster performance of the product you keep trying to push on people.

[attached benchmark charts: CPU clock.png, 51140.png, 51141.png, 51139.png, 51138.png, 51123.png]


And the FX-8150 still uses twice the wattage, if not more.
 

FlanK3r

Senior member
Sep 15, 2009
You cannot ignore the good rendering, compression, and video encoding performance. There is no doubt about Vishera's performance there (in some video encoding it is higher than the 3770K). Not everyone is a hardcore gamer. The "problem" with Vishera is not in most action games like FPS titles; some lag shows up in MMORPGs or free-to-play games (WoT). For daily users and normal gamers (for example, 3 h of work plus 3 h of gaming per day), Vishera is no problem.
 

Charles Kozierok

Elite Member
May 14, 2012
Only if you actually load it (games, video encoding). The idle difference between them is a lot smaller, and surfing or watching movies is a low-load scenario.

Well, if you're not taxing the CPU, you don't need the FX anyway.

It really doesn't have a lot to recommend it, I'm afraid. For power users, Intel is better, and for casual use, either Intel or the A line is better.
 

Piroko

Senior member
Jan 10, 2013
Well, if you're not taxing the CPU, you don't need the FX anyway.

It really doesn't have a lot to recommend it, I'm afraid. For power users, Intel is better, and for casual use, either Intel or the A line is better.
Just because you don't tax it often doesn't mean you won't tax it at all. I still agree with your conclusion, though.
 

Charles Kozierok

Elite Member
May 14, 2012
Even if the difference is 70 W while gaming...

Suppose a serious gamer plays four hours a day. So that's 28 hours a week.

Over the course of a year, that's about 100 kWh of extra energy consumed. If you're in a warm area, you also have extra electrical costs for part of the year to remove the heat produced.

If you keep your system for three years, and you pay $0.12/kWh (the average rate in the US), the extra cost of ownership is about $40, not including additional cooling costs.

If you live somewhere like Denmark, where the rate is in the $0.40/kWh range, you're talking roughly $120 extra in power costs over three years.

I don't know the current difference in prices between FX and comparable Intel chips, but my guess is that, depending on where you live, you're going to give a lot of those savings back, in addition to having what is, in most respects, an inferior system.
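For anyone who wants to plug in their own numbers, here's a minimal sketch of that back-of-the-envelope math (the 70 W delta, 4 h/day, 3-year lifespan, and the two electricity rates are the assumptions above, not measurements):

```python
# Back-of-the-envelope extra electricity cost of a higher-draw CPU.
# Assumptions from the post: ~70 W extra under load, 4 h of gaming per day,
# a 3-year system lifespan, and the local electricity price in USD per kWh.
def extra_power_cost(delta_watts=70.0, hours_per_day=4.0,
                     years=3.0, usd_per_kwh=0.12):
    kwh_per_year = delta_watts * hours_per_day * 365 / 1000  # ~102 kWh/year
    return kwh_per_year * years * usd_per_kwh

print(f"US (~$0.12/kWh):      ${extra_power_cost(usd_per_kwh=0.12):.0f}")  # ~$37
print(f"Denmark (~$0.40/kWh): ${extra_power_cost(usd_per_kwh=0.40):.0f}")  # ~$123
```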
 

skipsneeky2

Diamond Member
May 21, 2011


The Unreal Engine and this game scale pretty well with cores. I notice with my 7850 and my 3930K how much smoother two extra cores make the gameplay, and the 7850 puts out such a huge fps count at 1920x1200 that it's virtually bottlenecked by the CPU. Overclocking it past 4 GHz relieves most of that, but at stock I see my GPU usage drop as low as 74%... clocking the chip up helps big time.

Of course, recommending any 6- or 8-core for such an old title is not my goal, since a budget quad core like an overclocked Q6600 or an X4 955 is all someone would need to sustain 60+ fps in this game; it's simply an observation I wanted to mention.
 

Gikaseixas

Platinum Member
Jul 1, 2004
It's not me trying to deceive people into buying FX CPUs based on GPU bottlenecks.

So people have to lower their resolution from 1080p to whatever resolution you think Intel excels at, because otherwise they're bottlenecking their GPUs? Most gamers here game at 1080p, ya know?

There isn't a game my FX-8350 can't run at 1440p (amazingly well), not one.
 

Magic Carpet

Diamond Member
Oct 2, 2011
Even if the difference is 70 W while gaming...

Suppose a serious gamer plays four hours a day. So that's 28 hours a week.

Over the course of a year, that's about 100 kWh of extra energy consumed. If you're in a warm area, you also have extra electrical costs for part of the year to remove the heat produced.

If you keep your system for three years, and you pay $0.12/kWh (the average rate in the US), the extra cost of ownership is about $40, not including additional cooling costs.

If you live somewhere like Denmark, where the rate is in the $0.40/kWh range, you're talking roughly $120 extra in power costs over three years.

I don't know the current difference in prices between FX and comparable Intel chips, but my guess is that, depending on where you live, you're going to give a lot of those savings back, in addition to having what is, in most respects, an inferior system.
Buy a PS Vita; it beats anything out there, performance-per-watt wise. Long term? One could probably go on a cruise every other year or so :awe:
 

guskline

Diamond Member
Apr 17, 2006
Consider using the search function before the next time you decide to incite nuclear holocaust.

I agree. I love the movie "Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb."

Every time one of these Intel vs AMD threads gets hot and heavy, it reminds me of the zany line from Peter Sellers playing the President, when he observes the Russian Ambassador and the US General tussling on the ground in a secret room in the Pentagon. The line is "Gentlemen, you can't fight in here! This is the War Room!"

This thread sure makes that line come alive.
 

crashtech

Lifer
Jan 4, 2013
The problem with posting GPU bottlenecked charts is that they are not relevant to CPU performance. I think most people here understand this even if they don't want to admit it. This is a CPU thread, so graphs which do not accurately represent relative CPU performance have no place; they are irrelevant at best and intentionally misleading at worst.
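To illustrate (with made-up numbers, not benchmark data): the frame rate you measure is roughly the minimum of what the CPU and the GPU can each deliver, so a GPU-bound test pins two very different CPUs to the same score.

```python
# Toy model: measured fps is capped by the slower of the CPU and the GPU.
def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# GPU-bound settings hide the CPU gap: both chips read 60 fps...
print(observed_fps(cpu_fps=90, gpu_fps=60), observed_fps(cpu_fps=140, gpu_fps=60))
# ...while removing the GPU limit (low res / faster GPU) exposes it: 90 vs 140.
print(observed_fps(cpu_fps=90, gpu_fps=300), observed_fps(cpu_fps=140, gpu_fps=300))
```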
 

cbrunny

Diamond Member
Oct 12, 2007
I agree. I love the movie "Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb."

Every time one of these Intel vs AMD threads gets hot and heavy, it reminds me of the zany line from Peter Sellers playing the President, when he observes the Russian Ambassador and the US General tussling on the ground in a secret room in the Pentagon. The line is "Gentlemen, you can't fight in here! This is the War Room!"

This thread sure makes that line come alive.

All I was looking for was an estimate of whether PC gaming will become increasingly multicore, or whether 4 cores/8 threads will continue to be the way to go. I wasn't looking for an AMD vs. Intel argument...

It is entertaining though.
 

ShintaiDK

Lifer
Apr 22, 2012
All I was looking for was an estimate of whether PC gaming will become increasingly multicore, or whether 4 cores/8 threads will continue to be the way to go. I wasn't looking for an AMD vs. Intel argument...

It is entertaining though.

It's not that easy to answer, because it depends on the speed per core and on how well a given application scales.

Amdahl's law comes to mind:
[chart: Amdahl's law speedup curves, AmdahlsLaw.png]

And besides encoding, compression, and rendering, workloads rarely scale well on the desktop.
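To make that concrete, here's a minimal sketch of the formula that chart plots (the parallel fractions p below are illustrative, not measurements of any real game):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction of the work and n the core count.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even a fairly well-threaded workload (p = 0.75) gains little past 4 cores,
# while a highly parallel one (p = 0.95) keeps scaling much longer.
for cores in (2, 4, 8):
    print(f"{cores} cores: {amdahl_speedup(0.75, cores):.2f}x at p=0.75, "
          f"{amdahl_speedup(0.95, cores):.2f}x at p=0.95")
```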
 

AtenRa

Lifer
Feb 2, 2009
The problem with posting GPU bottlenecked charts is that they are not relevant to CPU performance. I think most people here understand this even if they don't want to admit it. This is a CPU thread, so graphs which do not accurately represent relative CPU performance have no place; they are irrelevant at best and intentionally misleading at worst.

So now it is misleading to show how the CPU (and the system in general) will perform at 1080p, the resolution the majority of gamers play at.

But I guess 1024x768 is more relevant :rolleyes:

PS: of all the slides I posted, only Max Payne 3 is GPU limited, and even there the majority of the CPUs can produce more than 60 fps.
 

AnonymouseUser

Diamond Member
May 14, 2003
The problem with posting GPU bottlenecked charts is that they are not relevant to CPU performance. I think most people here understand this even if they don't want to admit it. This is a CPU thread, so graphs which do not accurately represent relative CPU performance have no place; they are irrelevant at best and intentionally misleading at worst.

The problem with posting low-resolution, CPU-bottlenecked charts is that they are not relevant to real-world performance. At the resolutions people actually play their games at (with a few exceptions), the performance differences are negligible (as shown by AtenRa's charts).
 