Do high end users use AMD instead of Intel?


TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Ramses, you are confusing "top-of-the-line" with "high-end." The FX-8 is top-of-the-line of the FX line, where the i3 is bottom-of-the-line, so to speak, of the Core i line, but mid-of-the-line of the whole Haswell lineup.

But when computer people talk about high end, they don't talk about each company separately; everything any company makes gets dumped into the big wide pot and you choose what performs the best.

And in that regard, the i3 and the FX-8 are both somewhere in the middle: mid range.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I'm comparing them because that's what's out there in the market. Not because I "want to," but because that's what's being sold. You don't want to compare them because the results aren't what you'd like them to be.

You said Intel claimed the i3 was low end; that was false. Then you said it was implied; that theory got shot down by the existence of Pentiums and Celerons. Now it's about how you "feel," which isn't relevant.

Also, plenty of "low end" laptops are running Pentiums, which further waters down your argument.

Even if we adopt your own naming conventions, which are neither explicitly stated nor implied by Intel, all it boils down to is: Intel's low end > AMD's high end.

If you need to be right you can be; I'm OK with it. You can even have the last word, but you'll have to post again. I stand by everything I posted, and will probably post as much again if the occasion comes up.

I'll even throw you a good-natured bone: Intel's low end today is often better than AMD's high end from a few years ago.

I'd still buy another FX if my i7 dies tomorrow. Crazy world, ain't it?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What else you got?

I just posted the latest AAA title (Witcher III) to showcase that new game engines are becoming more and more multithreaded.

Have a look at the latest AAA titles: even without Mantle, the 8-core FX is faster than the Haswell Core i3.

[gamegpu.ru CPU benchmark charts: Grand Theft Auto V, Watch Dogs, Dragon Age: Inquisition]


And one more thing: there are millions of people currently playing games that support Mantle, like BF4, BF Hardline, Civilization: Beyond Earth, etc.
Also, DX12 and Vulkan are derived from Mantle, and they will both exhibit the same characteristics as Mantle.

Make no mistake, when Windows 10 and DX12 games arrive, the Core i3 will be at a disadvantage against any quad core and above.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
Ramses, you are confusing "top-of-the-line" with "high-end." The FX-8 is top-of-the-line of the FX line, where the i3 is bottom-of-the-line, so to speak, of the Core i line, but mid-of-the-line of the whole Haswell lineup.

But when computer people talk about high end, they don't talk about each company separately; everything any company makes gets dumped into the big wide pot and you choose what performs the best.

And in that regard, the i3 and the FX-8 are both somewhere in the middle: mid range.

That's a good way to define the two things.
I don't completely disagree, and if the FX line were still fresh, I'd not disagree at all. Its age is very important, IMO, and it adds a huge "but..." to any comparison with anything current from Intel. I can see why someone would say tuff nookie to AMD and compare what they are still selling to what Intel has currently, but it's basically a waste of time. The FX was fighting an uphill battle a couple of years ago; it's not going to get any better while Intel is still upgrading.

To me, for a comparison across brands to be worth making, they both need to start out from a somewhat equal footing in both price and technology, and that equal footing does not exist now and has not for a long time. That's why I think the comparison is not productive at this point. I was having a perfectly good experience with a 9590 six or eight months ago, not because it was super awesome, but because we are largely beyond needing more CPU. This whole discussion wouldn't have happened five or ten years ago, when two years was an eternity of progress. If anything, the fact that the FX8 is still in the same ballpark as anything new is mighty telling, and not in a good way.

I'd be OK with amending myself to say the FX8 was the top-of-the-line AMD chip, and the i3 is a lower-mid-range Intel chip (the i5 being mid-range, the i7 being high-end, in consumer circles). Does that come across better?
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
And one more thing: there are millions of people currently playing games that support Mantle, like BF4, BF Hardline, Civilization: Beyond Earth, etc.
Also, DX12 and Vulkan are derived from Mantle, and they will both exhibit the same characteristics as Mantle.

Make no mistake, when Windows 10 and DX12 games arrive, the Core i3 will be at a disadvantage against any quad core and above.

Interesting, and it further reinforces my "fast enough" theory: per those benches at 1080p, I'd be perfectly happy playing any of those games on my old 8350. Groovy.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
The issue here is twofold: a lack of a consistent frame of reference, plus ambiguous/arbitrary labeling.

Ignoring what AMD and Intel think about their own products, the FX-8xxx and the i3 both occupy approximately the same price bracket in the market (the FX is usually more when not on sale), and they are sold alongside each other. AMD still produces FX chips, and has even (recently-ish) released revisions of them. The FX-8xxx's primary competitor, price-wise, is the i3. It's therefore accurate to say that AMD's FX chips compete with Intel's ~$100-150 CPUs, however we label them. If we call the i3 Intel's "low end," AMD is competing with Intel's low end with their FX chip. Not everyone would agree that an i3 is low end, but I don't think anyone is arguing that they're not competitors.

I think you'll find quite a few who disagree with you, Ramses, that the FX chips are "done, gone, and over." AtenRa, for instance, still recommends them to people. There aren't too many niches that FX chips are a great choice for, but people are still buying them, new.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
I think you'll find quite a few who disagree with you, Ramses, that the FX chips are "done, gone, and over." AtenRa, for instance, still recommends them to people. There aren't too many niches that FX chips are a great choice for, but people are still buying them, new.

I mentioned a couple of times that if my i7 died, I'd replace it with another (my 3rd) FX. :) I'm a fan and have had good service from 'em. The ones I've sold went quickly for really good money too, which is also telling. I quite literally sold my 9590 and its accompanying bits to buy the Z97/4790K setup just to see how the other half lived.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I think you'll find quite a few who disagree with you, Ramses, that the FX chips are "done, gone, and over." AtenRa, for instance, still recommends them to people. There aren't too many niches that FX chips are a great choice for, but people are still buying them, new.

Make no mistake, I still recommend them because of their current low pricing. Plus, a few good new motherboard releases have made the Piledriver 8-core FX a nice, cheap, multipurpose CPU recommendation.

But I don't know for how long; after Skylake, things may change unless Intel continues to sell dual cores at $70 and the Core i3 at more than $100.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,293
146
Now that we have seen AMD's FX8 compared to an i3, all I can advocate is polite silence, lest we see a comparo to a Skylake Pentium in a few months.
 
Aug 11, 2008
10,451
642
126
I just posted the latest AAA title (Witcher III) to showcase that new game engines are becoming more and more multithreaded.

Have a look at the latest AAA titles: even without Mantle, the 8-core FX is faster than the Haswell Core i3.

[gamegpu.ru CPU benchmark charts: Grand Theft Auto V, Watch Dogs, Dragon Age: Inquisition]


And one more thing: there are millions of people currently playing games that support Mantle, like BF4, BF Hardline, Civilization: Beyond Earth, etc.
Also, DX12 and Vulkan are derived from Mantle, and they will both exhibit the same characteristics as Mantle.

Make no mistake, when Windows 10 and DX12 games arrive, the Core i3 will be at a disadvantage against any quad core and above.

Wow, those results are with GTX 980 SLI. I bet with a lot of single cards you would be, wait for it, in your favorite condition for evaluating games: *GPU limited*. In any case, I would agree an i3 does not do well in GTA V and Witcher 3. But come on, Watch Dogs and DA:I are pretty much within the margin of error for average frame rate, although the minimums are more in favor of the FX. And there are plenty of recent games in which the i3 does considerably better than the 8350, including AAA ones such as Advanced Warfare and AC: Unity.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Wow, those results are with GTX 980 SLI. I bet with a lot of single cards you would be, wait for it, in your favorite condition for evaluating games: *GPU limited*. In any case, I would agree an i3 does not do well in GTA V and Witcher 3. But come on, Watch Dogs and DA:I are pretty much within the margin of error for average frame rate, although the minimums are more in favor of the FX. And there are plenty of recent games in which the i3 does considerably better than the 8350, including AAA ones such as Advanced Warfare and AC: Unity.

People were saying the same things about the FX for years, but "you" people didn't take it seriously; now games are demanding more and more threads and the Haswell Core i3 is starting to lag behind.

I don't see the Haswell i3 doing considerably better than a 4.4GHz 8-core FX in Advanced Warfare and AC: Unity.

[gamegpu.ru CPU benchmark charts: Call of Duty: Advanced Warfare, Assassin's Creed: Unity]

Even in Far Cry 4, the FX at 4.4GHz is very close.

[gamegpu.ru CPU benchmark chart: Far Cry 4]
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
In any case, I would agree an i3 does not do well in GTAV and Witcher 3.
You are looking at the Haswell part, right? For The Witcher, that's like a 12fps difference in minimums (from 39 to 51) with 6 more cores and 500MHz higher clocks. Multithreaded... my donkey; it's 5 fps for a 400MHz bump (FX-9370).

In GTA V the difference is 40 to 43 fps minimums, 3 whole frames (with 6 more cores and 500MHz higher clocks).
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
I don't see the Haswell i3 doing considerably better than a 4.4GHz 8-core FX in Advanced Warfare and AC: Unity.
It doesn't have to; as long as it is close, it's a good deal, and not only is it close, it is in the lead, despite having only two physical cores.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
People were saying the same things about the FX for years, but "you" people didn't take it seriously; now games are demanding more and more threads and the Haswell Core i3 is starting to lag behind.

I don't see the Haswell i3 doing considerably better than a 4.4GHz 8-core FX in Advanced Warfare and AC: Unity.

Even in Far Cry 4, the FX at 4.4GHz is very close.

A 4.4GHz, $210, 220W octo-core FX that can just barely beat a 3.5GHz, $140, 53W dual core. It's getting more and more desperate, isn't it? Even AMD doesn't use the FX crap. The Quantum is a prime example.
 

DrMrLordX

Lifer
Apr 27, 2000
22,932
13,015
136
The price difference of a Socket 2011-v3 + 6-core Core i7 + DDR4 versus AM3+ + FX-8320E is more than an R9 290/X alone. That means you can have 4K gaming with an FX 8-core + CF 290s for less, and you will not even feel any performance difference, even in minimum fps.

If a quad-core Kaveri at DEFAULT is enough for dual 290Xs at 4K, then an overclocked 8-core FX is as well, or even better. And this is in DX11; in Mantle, the difference in CPU performance between the 8-core FX and the 6-core Core i7 will be negligible.

[AnandTech benchmark charts: 74901.png, 74902.png]

With all due respect, those minimums are pretty bad by "high end" standards. If I'm going to drop a fat wad on an SLI/Crossfire gaming machine, I want a minimum of 60 fps, or at LEAST a minimum of 30 fps in titles that just push modern hardware too hard for me to reach 60 with all the bells and whistles.

Nope. If you had a 6-core Phenom II at 4-4.1GHz, you would be fine with an FX at 4.6-4.8GHz using the same cooling setup.

You don't need custom water to hit 4.8 GHz with an FX, especially not an 8320E or 8370E. I'm talking about something a "high-end" user might try to use... you know, 5.3-5.7 GHz. At 4.8 and lower, the FX gets beaten pretty badly by an overclocked 5820K or 5930K. At 5.7 GHz, which is about the wall for the "new" FX chips, they are absolute heat monsters. They will still lose to the Haswell-E, I'm thinking, but it'll be closer.

Also, I'm not sure why you brought a 4930K into this. Nobody buying high end today will want one of those. Regardless, the numbers for both the 4930K and the 8350 looked pretty bad.

This has gone on too long.

Probably. Most threads do these days. People are getting stir-crazy from lack of compelling hardware updates.

A 4.4GHz, $210, 220W octo-core FX that can just barely beat a 3.5GHz, $140, 53W dual core. It's getting more and more desperate, isn't it? Even AMD doesn't use the FX crap. The Quantum is a prime example.

FX chips aren't 220W @ 4.4 GHz. Here's a 5 GHz 8320E chewing up ~190W (slightly less):

http://www.xtremesystems.org/forums...-next-months&p=5253289&viewfull=1#post5253289
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yes, cherry picking when it's best. Plus, the 8320E is a complete turd if not overclocked.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Yes, cherry picking when it's best. Plus, the 8320E is a complete turd if not overclocked.

Cherry picking??? I posted 4-5 games where the FX is faster and another 2-3 games where the FX is very close.
Well, it's unlocked (unlike the Core i3) and ready to be overclocked to beat any Intel CPU at the same price.
 

CriticalOne

Member
Apr 17, 2015
26
0
16
Hello everyone. It's that neighborhood i3 user here, ready to give my $.02.

I was in the process of planning my build when I discovered that Intel made a new CPU as part of the Haswell Refresh line: the i3-4370. It comes with a base clock of 3.8GHz and 4MB of L3 cache, which makes it the fastest i3 yet. I saw the benchmarks of lesser processors (the i3-4330 and i3-4360) and then just went to Microcenter and picked it up for $140. No questions asked, no second thoughts, no regrets.

I haven't been disappointed at all by this processor. All of the benchmarks are accurate, and there's no bait and switch going on. I almost exclusively game on it, and every game I have runs smoothly. Everything from Skyrim to BF4 runs without huge FPS drops. I can browse with as many tabs as I want, provided that Flash decides to work that day. For a dual core, things like processing videos are reasonably fast.

Most people also lack any understanding of what Hyper-Threading actually does and just parrot stuff like "HT doesn't work in games." Hyper-Threading works by allowing each core to execute two threads, letting the physical core use execution resources that would have otherwise remained idle. It's handled at the hardware level, so it's essentially transparent: games and most programs can't tell which "CPU" is a "fake" core and which is a "real" one. There are rare applications in which Hyper-Threading does not help, but gaming is not one of them, given that games can use 4 threads to begin with.
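
One easy way to see that the OS simply exposes HT siblings as ordinary logical CPUs is to compare logical and physical core counts. A minimal sketch in Python, assuming the third-party psutil package is installed (purely illustrative, not something from this thread):

```python
# Compare logical CPUs (what software normally sees) with physical cores.
# Requires: pip install psutil
import os
import psutil

logical = os.cpu_count()                    # logical CPUs, HT siblings included
physical = psutil.cpu_count(logical=False)  # physical cores only

print(f"Logical CPUs  : {logical}")
print(f"Physical cores: {physical}")
if logical and physical and logical > physical:
    print(f"SMT/Hyper-Threading enabled: {logical // physical} threads per core")
```

On a Haswell i3 like the i3-4370 this reports 4 logical CPUs backed by 2 physical cores; a game's thread pool simply sees 4 schedulable CPUs.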

The "quad core brigade" isn't doing anyone favors by constantly crapping on the i3 series. Its okay to dislike the i3, but at least do so by its merits. Just because something is a quad core doesn't automatically make it good or superior to my i3. I bet most of the "quad core brigade" wouldn't use an quad core Atom instead of an i3. If it performs satisfactorily to your needs in benchmarks, then that's all there is to it. I don't get how people on here can be so scientific and go strictly by the numbers in these tons of FX vs i5/i7 threads but try their hardest to tarnish the i3 because a dual core performing at that level makes them uncomfortable.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Cherry picking??? I posted 4-5 games where the FX is faster and another 2-3 games where the FX is very close.
Well, it's unlocked (unlike the Core i3) and ready to be overclocked to beat any Intel CPU at the same price.

Yes, a 220W, $210 FX vs. a 53W, $140 i3.
 
Aug 11, 2008
10,451
642
126
People were saying the same things about the FX for years, but "you" people didn't take it seriously; now games are demanding more and more threads and the Haswell Core i3 is starting to lag behind.

I don't see the Haswell i3 doing considerably better than a 4.4GHz 8-core FX in Advanced Warfare and AC: Unity.

[gamegpu.ru CPU benchmark charts: Call of Duty: Advanced Warfare, Assassin's Creed: Unity]

Even in Far Cry 4, the FX at 4.4GHz is very close.

[gamegpu.ru CPU benchmark chart: Far Cry 4]

The i3 is 18% faster in AC: Unity and 8% faster in Advanced Warfare, which is "not much faster," but you claim GTA at 14% faster for the FX proves the FX is far superior. OK, if you say so. I guess the criterion for a significant difference changes depending on which company wins. In any case, it is pretty pathetic that we are arguing about whether a 125 watt top-end processor from AMD is better or not compared to a 53 watt mid-range processor from Intel. And to use the argument that so many AMD fans use, I bet one would be very hard pressed to see any of those differences in an actual gaming session.
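
For reference, the percentages being argued over here are just relative deltas of average frame rates read off the charts. A minimal sketch of that arithmetic, using made-up placeholder fps values rather than figures from the linked benchmarks:

```python
# Relative performance delta between two CPUs from average frame rates.
# The fps values below are hypothetical placeholders, not data from the charts.
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster A is than B, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

i3_fps, fx_fps = 59.0, 50.0  # hypothetical averages for one title
print(f"i3 relative to FX: {percent_faster(i3_fps, fx_fps):+.0f}%")  # +18%
print(f"FX relative to i3: {percent_faster(fx_fps, i3_fps):+.0f}%")  # -15%
```

Note that the two directions are not symmetric (+18% one way is only -15% the other), which is partly why the same gap can be spun differently depending on which chip is treated as the baseline.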