Impressed with FX-8350 and the new article at Anand

Aug 11, 2008
10,451
642
126
I think the success or failure of Intel or AMD is only partially related to the quality and capabilities of their products. Largely, consumers buy computers and care more about the size of the monitor and the brand than anything else. The existence of Atom-based desktops should be enough to show that many consumers just don't care at all about what CPU they get. Intel does a much better job at selling its products to the OEMs, which results in lots of computer buyers ending up with Intel systems.



I assumed you were talking about AMD with your real work comment, because this has been gone over repeatedly already and you didn't address any of the points.

Total cost of ownership, think about that. Depending on your source of computers, you may save $100 or more buying AMD. You admit that an AMD computer would work, you just don't like the increased power usage.

If you rationally look at the big picture, you might find that the extra $8 per year you pay for your power-hungry AMD CPU still lets you save money overall, because you paid $100 less upfront and you aren't going to use the computer for more than 10 years. This is the whole point that has been made throughout the thread. It's only with silly corner-case examples, like $0.40/kWh electricity, that Intel starts to look like the better value.
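
To put rough numbers on that claim, here is a minimal back-of-the-envelope sketch. The wattage gap, daily usage hours, electricity price, and purchase-price gap are illustrative assumptions, not measurements from any review:

```python
# Rough total-cost-of-ownership sketch for the power-vs-price argument above.
# All inputs are assumed, illustrative values, not measured data.
extra_watts = 50        # assumed extra load draw of the AMD chip vs. the Intel chip
hours_per_day = 4       # assumed hours per day spent at that load
price_per_kwh = 0.12    # assumed electricity price, $/kWh
upfront_savings = 100   # assumed price difference at purchase, $
years_owned = 5

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_per_year = extra_kwh_per_year * price_per_kwh     # ~$8.76/year with these inputs
lifetime_extra = extra_cost_per_year * years_owned           # ~$43.80 over 5 years

print(f"Extra power cost: ${extra_cost_per_year:.2f}/year, "
      f"${lifetime_extra:.2f} over {years_owned} years, "
      f"versus ${upfront_savings} saved up front")
```

With these assumed numbers, the break-even electricity price over five years works out to roughly $0.27/kWh, which is why only corner-case rates like $0.40/kWh tip the balance the other way.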

Again, I ask for an actual real-world example, I don't know why you refuse to answer me.

Does "work" means excel, word, web browsing, and powerpoint? If so, your power usage difference is going to be insignificant and/or possibly even favor AMD, as those leave the CPU largely idle. Does your work mean 24/7 video encoding? Then you have one of the corner cases where the Intel power usage makes a huge difference. I'd argue that such cases are rare though, and certainly not a standard consideration for the average CPU buyer.

The primary case is gaming, which in the overall scheme of things is a "corner case" as well I suppose, but not to users on these forums.


Disclaimer: my reasoning refers to gaming.

1. The 3570K is not $100 more expensive than the 8350. More like $20.
2. Except for a few "corner cases" as you call them, 8350 performs worse in gaming than 3570.
3. Small initial cost advantage of 8350 disappears over the life of the processor due to increased power cost.

Most people are not as hung up on power savings as AMD fans think they are. If the 8350 offered clearly superior gaming performance across a wide spectrum of games, I could accept the increased power consumption, as I expect most other users would as well. I have a quad core CPU and a discrete card. That certainly uses more power than an i3 with no discrete card. I accept that for the vastly superior gaming performance.

Problem is, with the 8350, you get generally worse performance AND higher power consumption, and it is only slightly cheaper initially. If you overclock, the comparison becomes even worse.

It's like buying a car that is slower but also uses more gas.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
There are hardware and software differences between the two reviews. For instance, the Techspot review used only W7 SP1. The German review also installed the FX hotfixes provided by Microsoft. Those hotfixes correct bad scheduler behaviour on Windows that hurts the performance of the FX chips.

Techspot overclocked both chips to the same speed, giving the i7 an extra 0.5 GHz advantage, but in general the FX chips overclock further than the i7. Note that the FX holds the world record for overclocking. Moreover, the Intel chips ran at stock memory speed but the FX ran with memory below stock, which again affected performance and overclocking results.
Again, it probably comes down to different areas. My 2500K goes from 75-100% depending on where I am in the game. That's right, even at 1920x1080 I am hitting well over 90% and even up to 100% CPU usage at times with a 2500K at 4.4 GHz. I will no longer recommend a 4-thread CPU for those wanting to build a gaming PC with higher end cards. The 4770K will be my recommendation when Haswell comes out.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
The Linux benchmarks were also generally concluded to show a victory for the i7. Read the conclusion (emphasis added):

Nowhere does the reviewer say "the i7 wins" or anything like that. You also omitted important parts of his conclusions, which explain that some of the weak performance is not due to the chip but to a problem with the version of the compiler used, which treats the FX chip as an old Bulldozer and ignores the further Piledriver improvements. His conclusions were (bold face in the original):

From the initial testing of the brand new AMD FX-8350 "Vishera", the performance was admirable, especially compared to last year's bit of a troubled start with the AMD FX Bulldozer processors.

For many of the Linux computational benchmarks carried out in this article, the AMD FX-8350 proved to be competitive with the Intel i7 3770K "Ivy Bridge" processor. Seeing the FX-8350 compete with the i7-3770K at stock speeds in so many benchmarks was rather a surprise since the Windows-focused AMD marketing crew was more expecting this new high-end processor to be head-to-head competition for the Intel Core i5 3570K on Microsoft's operating system.

The slated retail price at launch for the FX-8350 is $195 USD. The Core i5 3570K is presently retailing for around $230 and the Intel Core i7 3770K is around $330. In other words, the AMD FX-8350 is offered at a rather competitive value for fairly high-end desktops and workstations against Intel's latest Ivy Bridge offerings -- if you're commonly engaging in a workload where AMD CPUs do well.

In not all of the Linux benchmarks did the Piledriver-based FX-8350 do well. For some Linux programs, AMD CPUs simply don't perform well and the 2012 FX CPU was even beaten out by older Core i5 and i7 CPUs. We can hopefully see improvements here later on through compiler optimizations and other software enhancements. As shown in my earlier AMD Piledriver compiler tuning tests from the A10-5800K Trinity, with the current GCC release there isn't much improvement out of the "bdver2" optimizations for this processor that should expose the CPU's BMI, TBM, F16C, and FMA3 capabilities over the original AMD Bulldozer processors. I hope that we will see further compiler improvements out of AMD to close some of these performance gaps.
More modern versions of the software already exist, but I don't know whether the problems have been corrected in them.

Exophase did a neat summary of the benchmark results in chart form:

Thanks for quoting that. In my previous posts above I said "And those are linux benchmarks where the FX outperforms the i7 (with HT enabled) even by large margins such as 30% in some of them", but as Exophase points out the FX can be 41.6% faster in some of them.

My point stands: the FX can match the i7 and can destroy it in some benchmarks, even with HT enabled.

Delivering >90% of the average performance of an i7 (recall the artificial disadvantage due to the compiler) for a fraction of the cost is really impressive, especially when one adds AMD's financial problems and its tiny resources (it is about a tenth of Intel's size, right?) to the equation. And Steamroller is expected to be a big improvement over the FX-8350.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
I think the success or failure of Intel or AMD is only partially related to the quality and capabilities of their products.

If by "partially" you mean "not entirely", then I agree. If by that you mean "only a little bit", I do not.

It's not a coincidence that AMD's best days financially corresponded to the brief period when it was beating Intel technically.

I assumed you were talking about AMD with your real work comment, because this has been gone over repeatedly already and you didn't address any of the points.

That's nonsense. I said before, you drew a false dichotomy by suggesting that my two best options were AMD -- which I've already acknowledged is the best choice for a budget build, and which I considered using for the machine I just built my son -- and ARM. Suggesting that ARM was even among my choices was just foolishness and you know it.

You admit that an AMD computer would work, you just don't like the increased power usage.

Of course it "would work". Almost anything "would work". The question is how well it would work, and what makes the most sense.

If you rationally look at the big picture, you might find that the extra $8 per year you pay for your power-hungry AMD CPU still allows you to save money overall, because you paid $100 less upfront and you aren't going to use the computer for more than 10 years.

The entire point of my post that you replied to yesterday was to look beyond the pure power cost of the extra wattage of the AMD chips. I'm the one actually looking at the big picture, by considering all of the factors that go into a selection. You're ignoring all of them and fixating on your invented "$8 a year" scenario.

Does "work" means excel, word, web browsing, and powerpoint? If so, your power usage difference is going to be insignificant and/or possibly even favor AMD, as those leave the CPU largely idle. Does your work mean 24/7 video encoding? Then you have one of the corner cases where the Intel power usage makes a huge difference. I'd argue that such cases are rare though, and certainly not a standard consideration for the average CPU buyer.

I use my machine for everything from routine browsing to number crunching to photo editing and large panorama stitching to OCR to gaming, and pretty much everything in between. I'm still on an i7-920, and it's really getting a bit slow, but I need to wait at least another year before building a new box. When I get to that point, I'll evaluate what both Intel and AMD have to offer, and make my decision based on all of the factors that matter.

Performance per dollar will probably not be near the top of the list. I usually keep my machines for three or four years. I do not care about saving a few pennies a day on a cheaper CPU whose slower speed I have to put up with every day.

And this is not about me. It's about discussing the factors that a variety of people take into account. For some, it's just about whatever's cheapest, and for them, AMD is a fine option; I never said otherwise. For those who care about all aspects of performance, IMO it is usually not the best choice.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Those are great results: they show that the FX-8350 is in the same performance envelope as an Intel Core i7-2600 Sandy Bridge and 10% slower than an i7-3770 Ivy Bridge, all for $100 less than the Intel CPUs. This alone makes it a great CPU for Linux.

Yes. Just note that, as the reviewer noted in his review (see the quotes in my post above), some of the bad results for the FX were due to a software problem that ignored performance enhancements of the Piledriver chip. The real gap between the FX-8350 and the i7-3770K is less than 10%.

See also my signature. It is from another poster who also notes that AMD is in the same performance envelope as Intel on CPUs, although I think he was referring to performance under Windows.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
1. The 3570K is not $100 more expensive than the 8350. More like $20.

3770 is the CPU we have been talking about through this thread. If you want to switch to the cheaper 3570k, that is fine, but now you are comparing a CPU that is actually slower than the 8350 in several benchmarks.

The 8350 doesn't need to cost less to justify itself against an inferior CPU.

Except for a few "corner cases" as you call them, 8350 performs worse in gaming than 3570.

Not in the real world. When you jack down the settings and use a $900 video card to put the entire bottleneck on the CPU, as some reviews do, the 8350 loses a few FPS in some games, but in the vast majority of real world gaming with realistic settings and mid-range video cards, the 8350 is fine.

The decision to get an 8350 or a 3570 should depend on other applications, where there is actually a noticeable difference. The 8350 wins many, the 3570 wins a few, but it really comes down to what you actually do at that point.
 
Aug 11, 2008
10,451
642
126
3770 is the CPU we have been talking about through this thread. If you want to switch to the cheaper 3570k, that is fine, but now you are comparing a CPU that is actually slower than the 8350 in several benchmarks.

The 8350 doesn't need to cost less to justify itself against an inferior CPU.



Not in the real world. When you jack down the settings and use a $900 video card to put the entire bottleneck on the CPU, as some reviews do, the 8350 loses a few FPS in some games, but in the vast majority of real world gaming with realistic settings and mid-range video cards, the 8350 is fine.

The decision to get an 8350 or a 3570 should depend on other applications, where there is actually a noticeable difference. The 8350 wins many, the 3570 wins a few, but it really comes down to what you actually do at that point.

Yes, and the 3570 is just as "fine" in the MT benchmarks that AMD fans love to cite as the 8350 is in Skyrim, WoW, SC2, Civ 5, or any of a host of other games that run much faster on Intel at realistic settings. I did, however, state up front that I was talking about a computer primarily used for gaming.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Yes, and the 3570 is just as "fine" in the MT benchmarks that AMD fans love to cite as the 8350 is in Skyrim, WoW, SC2, Civ 5, or any of a host of other games that run much faster on Intel at realistic settings. I did, however, state up front that I was talking about a computer primarily used for gaming.


The vast majority of gamers use LCD monitors with a 60 Hz refresh rate.

Looking at the Anandtech Bench, http://www.anandtech.com/bench/Product/701?vs=697 , while the 3570 is "faster", in 3 of the 4 games the 8350 is well above 60 fps anyway, so the extra performance is simply wasted.

The last game test, Starcraft 2, is run at an absurd 1024x768 resolution to make Intel look better. Does anyone really play at 1024x768? No, nobody does; it's a joke of a benchmark, and I don't understand how reviews that use such settings are taken seriously. In the real world you will play at your monitor's native resolution, your video card will largely be the bottleneck rather than your CPU, and the 8350 will perform nearly the same as the 3570K.
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
As a general rule of thumb, older software is single- or dual-threaded and only uses about 12.5-25% of an eight-core FX chip, whereas it uses about 25-50% of a four-core i5/i7.
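
A quick sketch of where those percentages come from, assuming threads map one-to-one onto cores and ignoring Hyper-Threading and Turbo:

```python
# Illustrates the rule of thumb above: the fraction of the chip kept busy by a
# single- or dual-threaded program, assuming one thread fully occupies one core.
chips = {"FX-8350 (8 cores)": 8, "i5/i7 (4 cores)": 4}

for threads in (1, 2):
    for name, cores in chips.items():
        print(f"{threads} thread(s) on {name}: {threads / cores:.1%} of the chip busy")
```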

Many review sites run a single benchmark at a time, and this is an unrealistic load for users who multitask. Under multitasking loads all the cores are being used even if a single application does not use them all.

Indeed, lots of users select an FX CPU because of its superior multitasking capabilities. You can find users who game while their computer is doing a background encoding task.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
As a general rule of thumb, older software is single- or dual-threaded and only uses about 12.5-25% of an eight-core FX chip, whereas it uses about 25-50% of a four-core i5/i7.

Many review sites run a single benchmark at a time, and this is an unrealistic load for users who multitask. Under multitasking loads all the cores are being used even if a single application does not use them all.

Indeed, lots of users select an FX CPU because of its superior multitasking capabilities. You can find users who game while their computer is doing a background encoding task.

techreport did some "multitasking" tests, http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/9

but to be honest, I don't think most people try to run heavy tasks like encoding and (heavy) gaming at the same time, I don't see how running just the game is unrealistic


The vast majority of gamers use LCD monitors with a 60 Hz refresh rate.

That's irrelevant, because there are many occasions where the FX will drop the FPS way below 60 (even for the i5, it is not realistic to think that in CPU-bottlenecked cases you will easily get 60).

random example, this is a new MMO
[chart: CPU-Perf.png]

for this game, an overclocked i5 would be the perfect choice for anything near "60".

Also, if you like lower latency you will avoid vsync; then, within a single refresh of your display, you get more than a single GPU-rendered frame displayed (or at least parts of them), so at 100 FPS you could have an advantage over 60 even if your display runs at 60 Hz. We could discuss tearing, but just as some people don't care about higher latency, tearing is very hard to notice on many occasions, because there isn't a massive difference between the different frames presented in the same refresh.
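
As a rough illustration of the latency point, here is a simplified sketch that only compares frame intervals; it ignores render queues, input sampling, and display lag, so take the numbers as illustrative only:

```python
# Compares how "fresh" the newest frame is when the GPU renders at 60 FPS versus
# 100 FPS with vsync off on a 60 Hz display. Purely a back-of-the-envelope model.
refresh_hz = 60

for fps in (60, 100):
    frame_time_ms = 1000 / fps            # a new frame arrives every N milliseconds
    avg_frame_age_ms = frame_time_ms / 2  # on average the newest frame is half a frame old
    print(f"{fps} FPS on a {refresh_hz} Hz display: new frame every {frame_time_ms:.1f} ms, "
          f"newest frame ~{avg_frame_age_ms:.1f} ms old on average")
```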



Anyway, my point in posting here is that the FX is not impressive as a gaming CPU, simply because the i5 does better for the same cost. That's not to say the FX-8350 is not good for most games, because it is.
Also, the FX has an advantage for MT when you can utilize the 8 cores perfectly, but if you consider the other characteristics, like the much lower default clock, I don't know... OC both to their limits, 4.5 GHz or whatever, and the situation looks better for the i5: a little slower for 4-8 thread usage, a lot faster for up to 4 threads (while using maybe 100 W less). For most software I use, the i5 would make more sense; I think there are more uses for (faster) 2-4 cores now than for (not as fast) 8 cores for an average user, gamer or whatever.

Obviously for some people the FX can make more sense for the same money, when you are really pushing the cores with the usual MT-friendly workloads (like rendering), but realistically most of that use is related to making money, where getting an even faster CPU (like the 3930K) could make sense.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
but to be honest, I don't think most people try to run heavy tasks like encoding and (heavy) gaming at the same time, I don't see how running just the game is unrealistic

If you really *need* 8+ threads you shouldn't be looking at mainstream desktop CPUs; you should buy SNB-EP Xeon processors or at least SNB-E.

Because if you think you need 8+ threads and are counting beans deciding between a 3770K and an FX-8350, you probably don't need 8+ threads as much as you think you do.
 

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
random example, this is a new MMO
[chart: CPU-Perf.png]

for this game, an overclocked i5 would be the perfect choice for anything near "60".

It's not hard to pick an example to fit your argument. My argument was based on the 4 games all tested in the anandtech CPU bench to represent "average" situations.

Also, that particular test is done with a Radeon 7970. My whole point has been when you run a balanced video card and CPU combination, in most games the bottleneck will be your video card, not your CPU. Of course if you use a $400 video card paired with your cheap $200 CPU the CPU becomes more of a limiting factor.

Does everyone have a 7970? I think a lot of frugal gamers are using lesser cards, and wouldn't be hurt in the slightest by using an 8350.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
It's not hard to pick an example to fit your argument. My argument was based on the 4 games all tested in the anandtech CPU bench to represent "average" situations.

Also, that particular test is done with a Radeon 7970. My whole point has been when you run a balanced video card and CPU combination, in most games the bottleneck will be your video card, not your CPU. Of course if you use a $400 video card paired with your cheap $200 CPU the CPU becomes more of a limiting factor.

Does everyone have a 7970? I think a lot of frugal gamers are using lesser cards, and wouldn't be hurt in the slightest by using an 8350.

As you well know, if you bitcoin/litecoin mine, your budget for a $150-200 video card can (and should) be fairly easily allocated into a $380-410 7970 for dual use. This is also the most cost effective way to support AMD if you're into that.

Also, nice moving the goalposts........ again.

"You should get 8350 for games that aren't cpu intensive" is a terrible argument for using 8350 over 3570k in gaming.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
It's not hard to pick an example to fit your argument. My argument was based on the 4 games all tested in the anandtech CPU bench to represent "average" situations.

Also, that particular test is done with a Radeon 7970. My whole point has been when you run a balanced video card and CPU combination, in most games the bottleneck will be your video card, not your CPU. Of course if you use a $400 video card paired with your cheap $200 CPU the CPU becomes more of a limiting factor.

Does everyone have a 7970? I think a lot of frugal gamers are using lesser cards, and wouldn't be hurt in the slightest by using an 8350.

If you read my other posts, I referred to more tests (all from new or newly updated games, a wider variety compared to what you are basing your argument on). Also, I chose this graph for obvious reasons: it is a CPU bottleneck, which is when the CPU is important, not when you are running a lighter (for the CPU) game or game situation. The Anandtech test had a selection of games that poorly represents how taxing games can be for the CPU, imo.

If you look at the graph, they are running on medium settings; it's just a CPU bottleneck, so a slower card like a 650 Ti wouldn't change anything.
http://media.bestofmicro.com/H/H/382949/original/Mid-1920.png

a faster CPU, or overclocking would.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
erm...what is it? I had an AMD 8088, and built my own amd 386sx40 back when.

You're "older school" than I am. My first homebuild was a 430HX Pentium system.

You got me curious so I cracked the box and took off the heatsink. It's a K6-III 400 MHz. The system is built into the case from a Gateway 486/66 that has to be 20 years old. The case is a 24" high full tower AT monstrosity! The motherboard doesn't even reach the bottom of the internal drive bays. All metal, too... those were the days. :) Hasn't been turned on in years but I think the power supply still works.
 

Charles Kozierok

Elite Member
May 14, 2012
6,762
1
0
Let me see if I can guess this correctly.

So are you in the 28-35 age range? ;)

Heh. I wish. :) I only got into this stuff in my late 20s.

Was that the Triton chipset? That was legendary back in the day. Probably the first chipset that Intel marketed heavily, from what I understand.

That was the Triton II. Well, one of the chipsets called "Triton II". The 430VX was also technically Triton II (confusing naming isn't anything new for Intel, lol) but everyone called it Triton III. The 430FX was the original Triton.

The 430HX was very popular and also possibly the last Intel mainstream chipset to provide parity checking.

The 440BX was also famous, but came years later.

Sorry for the diversion. We should really get back to flaming each other over benchmarks. ;)
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
techreport did some "multitasking" tests, http://techreport.com/review/23750/amd-fx-8350-processor-reviewed/9

but to be honest, I don't think most people try to run heavy tasks like encoding and (heavy) gaming at the same time, I don't see how running just the game is unrealistic

The techreport review was not entirely fair to the FX chip. They used an Entertainment-series memory kit for AMD with a speed below the stock speed of the FX memory subsystem, whereas they selected a high-performance Vengeance kit for Intel, including quad-channel configurations for some Intel chips. They used W7 SP1 with the manual hotfixes, which reduce the performance of the FX chips and increase their power consumption due to a bug.

In the same link that you give, they selected just two of the half-dozen games where the FX does badly: Skyrim and Civ 5. A curious selection, especially when some people claim that Civ 5 on the FX chip benefits a lot from faster RAM.

Despite all that, they write in the review:

The overall performance scatter offers some good news for AMD fans: the FX-8350 outperforms both the Core i5-3470 and the 3570K in our nicely multithreaded test suite. As a result, the FX-8350 will give you more performance for your dollar than the Core i5-3570K, and it at least rivals our value favorite from Intel, the Core i5-3470.

Thus the FX is a good option even when its potential is not completely used.

Anyway, my point in posting here is that the FX is not impressive as a gaming CPU, simply because the i5 does better for the same cost. That's not to say the FX-8350 is not good for most games, because it is.

For older games: i5.
For modern games: either FX or i5.
For future games: FX.
 

Mallibu

Senior member
Jun 20, 2011
243
0
0
The amount of misinformation by the AMD armada zealots is annoying.
Windows 7/8 or Linux, gaming and/or multitasking, idle or fully loaded, the i7/i5 have better all-around performance, lower power consumption, and more overclocking headroom and gains, and that's all it takes.

There's a reason AMD is constantly lowering the prices, and losing market share.

Is FX8350 a good cpu? Yes.
Does it game well? Yes it does.
Is it worth the money? Yes.
Is it better than i5/i7? No.

The denial of some people is absolutely mind-blowing. They prefer to blame test methodologies, scheduler patches that give 1%, operating systems, RAM, games used, and compiler conspiracies, rather than accept the truth.
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
The amount of misinformation by the AMD armada zealots is annoying.
Windows 7/8 or Linux, gaming and/or multitasking, idle or fully loaded, the i7/i5 have better all-around performance, lower power consumption, and more overclocking headroom and gains, and that's all it takes.

There's a reason AMD is constantly lowering the prices, and losing market share.

Is FX8350 a good cpu? Yes.
Does it game well? Yes it does.
Is it worth the money? Yes.
Is it better than i5/i7? No.

The denial of some people is absolutely mind-blowing. They prefer to blame test methodologies, scheduler patches that give 1%, operating systems, RAM, games used, and compiler conspiracies, rather than accept the truth.

"And thats the bottom line... 'cause Stone Cold said so!"
 