Does the AMD FX line make sense

Status
Not open for further replies.

inf64

Diamond Member
Mar 11, 2011
3,703
4,034
136
AMD made a mistake labeling it as an 8-core chip, that's true (even though it really does have 8 integer cores). They would have been better off marketing it as a 4M/8T or simply an 8-thread-capable CPU.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
AMD made a mistake labeling it as an 8-core chip, that's true (even though it really does have 8 integer cores). They would have been better off marketing it as a 4M/8T or simply an 8-thread-capable CPU.

I agree. Their marketing team should be taken out and shot. Even BD wouldn't have looked so bad if they'd brought it out at a lower price and sold the chips as dual, tri, and quad cores with higher thread capability. The 8150 would then have squared off against the 2500K at a lower price, maybe hitting the blocks at $159 or so. I imagine the bad press would have been far less damaging. The 2500K would still have handily beaten the 8150 in most gaming situations, but the benches would have been split more evenly, particularly with the 8150 striking wins in encoding and scientific apps.

Ah well. At least PD is a decent step forward, and not priced so bad.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I want to believe, but I've had enough experiences where my i5-2500K at over 5 GHz would bottleneck my 470s at sub-60 fps in titles such as SC2.

I look at games like BF3 and Crysis 3 and go "yeah AMD, I can't wait to get back!" Then I remember JF, and how I play a lot of games that don't use cores the way those games do, and sink back into my chair :(


My Newegg cart will forever have an 8320 and a 7950 in it with a Gigabyte board /:(\
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
What board is it? If it has at least a 16x PCI-E 2.0 slot for the GPU, and is 8xx- or 9xx-based, I'd say it's worth considering the FX 6300/8300 series if they're compatible. If it only runs BD, I'd hesitate to recommend it. What level of GPU would you match with it?

Are you mainly going to play SC2? SC2 is ridiculously badly coded; if that's your only game, then even an i3 is gold. Serious competitive multiplayer SC2 = Intel only. But for casual play, and if you want to play lots of other games, PD should be decent, and saving money by reusing an existing board is very attractive.

Yes, it's the 890GX board from Asus (ASUS m4a89gtd pro usb3), which is listed as AM3 only (not AM3+). But apparently someone got the FX-8350 running on that board by updating the BIOS to a beta version. Even the Asus website fails to list the 8350 as supported (ASUS only lists the beta BIOS as supporting the 8150), so I'm just trusting this random internet guy that he's telling the truth about support for the 8350.

I plan to upgrade to a 7950 video card. I think the most CPU-demanding game I have is SC2. I hope the Heart of the Swarm update will add support for more than 2 cores, but if not, I'm still OK, because I play 1v1 SC2, where CPU performance isn't as much of an issue as in the big 4v4 games. I'd also plan to run the 7950 in Eyefinity at 4800x1200.

Anyway, my question is about the FX line itself. The FX-8350 seems priced extra high (well, relative to other AMD chips) because it's the top-of-the-line chip. But if I get, say, the FX-8320 and overclock it, will I be able to get about the same performance as an overclocked FX-8350?

Put another way, do the 8350 and 8320 provide equivalent performance at equivalent clock speeds? Is that the same for other chips in the FX line, with fewer cores? Can the FX chips be "unlocked" somehow, taking a 6-core to an 8-core?

I'm just ignorant about the "enthusiast" aspects of the FX line that are not specifically advertised, but very relevant to someone who will tinker with them and cause them to deviate from the listed specs.

After thinking about it, I would probably stick with the FX-8*** series to get the most cores; games seem to be getting better at supporting multiple cores, so it would make sense to have that support, and either way I wouldn't have to buy a new motherboard. Assuming you can't unlock cores, that is.

But does it make sense to spend the premium on the very top of the line 8350, or get a lower chip and overclock? Is the 8320 the only other option, or would an even lower chip make sense, with fewer cores, if they typically have higher overclocking headroom than an FX chip with more cores?
 
Last edited:

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
You're in luck if the ASUS m4a89gtd pro does work with the 8350, since that board is rated for 140W processors. However, if you want to keep the cost down, get a 6300 and overclock it. Overclocked, the 6300 will hold its own against the 8350, and you'll save some money as well. I don't know that I would recommend overclocking the 8350 on that board.
 

inf64

Diamond Member
Mar 11, 2011
3,703
4,034
136
Yep, I agree with AnonymouseUser: FX-6300 + OC is the best price/perf choice for you with that board, provided the "beta" BIOS works with BD/PD, that is. You can get ~4.5 GHz out of the 6300 fairly easily on a good air cooler, or around ~4.2 GHz (maybe with Turbo on top) on the stock cooler. At that clock it will make a very good gaming (and general-purpose) system.
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
I've seen rumours going around about a Vishera revision this year, either an FX-8370 or FX-8390. I can't find any actual press or solid evidence for this, but the rumours are out there. Does anyone have any better intel than this useless piece of information/misinformation?
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
So it seems that at equal overclocks (say, 4.5 GHz), the FX-6300 is very close to the FX-8350 in gaming performance. In other words, they scale comparably as you overclock? I guess the FX-8320 falls somewhere between them too, at that overclock.

Also, any idea what wattage the FX-8350 pulls when overclocked? It seems the 6300 uses WAY less power at full load, but I didn't see how they stack up against each other when overclocked.
 

inf64

Diamond Member
Mar 11, 2011
3,703
4,034
136
So it seems that at equal overclocks (say, 4.5 GHz), the FX-6300 is very close to the FX-8350 in gaming performance. In other words, they scale comparably as you overclock? I guess the FX-8320 falls somewhere between them too, at that overclock.

Also, any idea what wattage the FX-8350 pulls when overclocked? It seems the 6300 uses WAY less power at full load, but I didn't see how they stack up against each other when overclocked.
The 6300 still uses less power when OCed vs the 8350/8320 OCed. There is just one less module active, which comes to roughly 0.75x, or 75%, of the 8350's power draw when pushed to similar volts and clocks (a rough estimate of course; it all depends on the individual specimens).
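Back-of-envelope version of that estimate, assuming dynamic power scales roughly with active modules × clock × Vcore² (a simplification, obviously; the clock and voltage values here are just placeholders):

```python
def fx_power_estimate(active_modules, freq_ghz, vcore, k=1.0):
    """Very rough dynamic-power model: power scales with the number of
    active modules, the clock, and the square of the core voltage.
    k is an arbitrary scaling constant; only ratios are meaningful."""
    return k * active_modules * freq_ghz * vcore ** 2

# FX-6300 (3 modules) vs FX-8350 (4 modules) at the same volts and clocks
ratio = fx_power_estimate(3, 4.5, 1.4) / fx_power_estimate(4, 4.5, 1.4)
print(ratio)  # 0.75, i.e. ~75% of the 8350's draw
```

Since frequency and voltage cancel out at identical settings, the ratio is just the module count, 3/4.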
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
I wonder if an FX-8350, overclocked, would exceed the 140W power rating of the motherboard?

For example, let's say it does, and so I'd have to scale back the overclock of the FX-8350 to stay below 140 W. So maybe I get it up to 4.4 GHz safely, even though the chip could go higher if my mobo could pump more wattage.

But with the FX-6300, what if I could get it up to its full overclock potential, say 4.8 GHz, and still stay under the 140 W limit? Wouldn't that overclocked chip perform better in gaming than the FX-8350, just due to the faster clock and the fact that most games simply can't take advantage of 8 cores versus 6?

Now I'm just guessing at the wattages. Who knows, maybe my motherboard could pump out more watts and enable the same high overclock on either chip. Or maybe the 6300 just can't overclock as high as the 8350, though I doubt that's the case.
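To put toy numbers on that tradeoff (a totally made-up model and hypothetical clocks, just to frame the question):

```python
def effective_throughput(cores, clock_ghz, game_threads):
    """Toy model (my own, not a benchmark): a game can only load as many
    cores as it has heavy threads, so cores beyond that sit idle."""
    return min(cores, game_threads) * clock_ghz

# Hypothetical clocks: FX-6300 at 4.8 GHz vs FX-8350 held back to 4.4 GHz
for threads in (4, 8):
    six = effective_throughput(6, 4.8, threads)
    eight = effective_throughput(8, 4.4, threads)
    print(threads, six, eight)
# In a 4-thread game the higher-clocked 6300 wins (19.2 vs 17.6);
# in an 8-thread workload the 8350 pulls ahead (35.2 vs 28.8).
```

So under this crude model, whichever chip wins depends entirely on how many threads the game can actually use.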
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
I wonder if an FX-8350, overclocked, would exceed the 140W power rating of the motherboard?

For example, let's say it does, and so I'd have to scale back the overclock of the FX-8350 to stay below 140 W. So maybe I get it up to 4.4 GHz safely, even though the chip could go higher if my mobo could pump more wattage.

But with the FX-6300, what if I could get it up to its full overclock potential, say 4.8 GHz, and still stay under the 140 W limit? Wouldn't that overclocked chip perform better in gaming than the FX-8350, just due to the faster clock and the fact that most games simply can't take advantage of 8 cores versus 6?

Now I'm just guessing at the wattages. Who knows, maybe my motherboard could pump out more watts and enable the same high overclock on either chip. Or maybe the 6300 just can't overclock as high as the 8350, though I doubt that's the case.

The 8350 has been known to exceed its 125W envelope at stock, so it can easily exceed 140W overclocked. That board may very well be fine when overclocking the 8350, but since it wasn't designed for the 8350 to begin with, I wouldn't trust it.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
They didn't make any sense at all, but since BF3 and Crysis 3 came around, things are starting to change, at least in those two games. They still aren't as good as Intel chips, but they aren't garbage for those two games. Again, for those TWO games. Those TWO (2) (II) (5-3) games only.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
So, now it is misleading to show how the CPU (and the system in general) will perform at 1080p, the resolution that the majority of gamers will play.

But I guess 1024x768 is more relevant :rolleyes:

Who are these majority you speak of?

If we can assume that the Steam hardware survey has a sufficient sample size to represent the entire gaming market, then your statement is wrong. The majority of gamers are not playing at 1080p. That would only be 29% of them.
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
Who are these majority you speak of?

If we can assume that the Steam hardware survey has a sufficient sample size to represent the entire gaming market, then your statement is wrong. The majority of gamers are not playing at 1080p. That would only be 29% of them.

That is the most popular single-monitor resolution by a long shot, but you are correct, it's not the majority. What is true is that most gamers play at resolutions higher than 1024x768 (4.06%) or even 1280x1024 (8.97%). After 1920x1080 (28.9%), 1366x768 is the next highest (20.68%), followed by 1680x1050 (8.59%), 1600x900 (7.51%), 1440x900 (6.81%), and 1920x1200 (3.04%). So the total of all gamers playing at 1366x768 or higher is at least 75.53%.

Even if we disregard 1366x768 (since most of those are laptops, and the FX isn't in laptops), there are still more than 54.85% gaming at higher than 1366x768. We haven't even touched multi-monitor setups yet, and we know they are higher than 1024x768.

So ultimately, it is disingenuous to benchmark these CPUs at 1024x768 and not offer benchmarks at 1920x1080 as well. It's also disingenuous to assert that Intel is 20-50% faster in most games based on a few 1024x768 benchmarks.
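For anyone checking the math, the quoted survey shares add up like this:

```python
# Steam hardware survey shares quoted above (percent of gamers)
shares = {
    "1920x1080": 28.90,
    "1366x768": 20.68,
    "1680x1050": 8.59,
    "1600x900": 7.51,
    "1440x900": 6.81,
    "1920x1200": 3.04,
}

total = sum(shares.values())                     # 75.53
without_laptop_res = total - shares["1366x768"]  # 54.85
print(round(total, 2), round(without_laptop_res, 2))
```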
 
Aug 11, 2008
10,451
642
126
That is the most popular single-monitor resolution by a long shot, but you are correct, it's not the majority. What is true is that most gamers play at resolutions higher than 1024x768 (4.06%) or even 1280x1024 (8.97%). After 1920x1080 (28.9%), 1366x768 is the next highest (20.68%), followed by 1680x1050 (8.59%), 1600x900 (7.51%), 1440x900 (6.81%), and 1920x1200 (3.04%). So the total of all gamers playing at 1366x768 or higher is at least 75.53%.

Even if we disregard 1366x768 (since most of those are laptops, and the FX isn't in laptops), there are still more than 54.85% gaming at higher than 1366x768. We haven't even touched multi-monitor setups yet, and we know they are higher than 1024x768.

So ultimately, it is disingenuous to benchmark these CPUs at 1024x768 and not offer benchmarks at 1920x1080 as well. It's also disingenuous to assert that Intel is 20-50% faster in most games based on a few 1024x768 benchmarks.

If it is "disingenuous" to use low-res benchmarks, then almost every review site is guilty. It is just as "disingenuous" to show a GPU-limited benchmark and claim it proves two CPUs are equal. Kind of like saying a Prius is as fast as a Corvette if you put 60 mph speed-limited tires on both.
 

videogames101

Diamond Member
Aug 24, 2005
6,777
19
81
The point of using 1024x768 is to make the test CPU-limited; how is that disingenuous for testing CPUs?

This has been done forever.
 

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106
If it is "disingenuous" to use low-res benchmarks, then almost every review site is guilty. It is just as "disingenuous" to show a GPU-limited benchmark and claim it proves two CPUs are equal. Kind of like saying a Prius is as fast as a Corvette if you put 60 mph speed-limited tires on both.

I didn't say that benchmarking at low resolutions was disingenuous; I said that not showing both low and high resolutions was. Compare Anandtech's review to Tom's Hardware's review, and Tom's is the more honest of the two (did I really just say that?).

The point of using 1024x768 is to make the test CPU-limited; how is that disingenuous for testing CPUs?

This has been done forever.

Yes, it has been done forever, but it doesn't show the whole picture with regard to gaming.
 

youshotwhointhe

Junior Member
Aug 23, 2012
11
0
0
I agree. It makes sense to use a low resolution to compare the general performance of CPUs (for the type of processing games need), but you need to put that in context. If I can get 95% of the performance for 50% of the price at the resolution I actually run, that's infinitely more relevant to my purchasing decision.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Last edited:

AnonymouseUser

Diamond Member
May 14, 2003
9,943
107
106

Piledriver not found.

Yes, Bulldozer sucked hard, but Piledriver doesn't. It's time to move on.

http://techreport.com/review/23750/amd-fx-8350-processor-reviewed

Still, with the FX-8350, AMD has returned to a formula that has endeared it to PC enthusiasts time and time again: offering more performance per dollar than you'd get with the other guys, right in that sub-$200 sweet spot. That's the sort of progress we can endorse.
 