Do high-end users use AMD instead of Intel?


AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I think that possibly this is a case of you and me not understanding each other because of language 'issues'. To phrase it another way, he quoted you because he was ridiculing you. He was ridiculing you because of your cherry-picking.

It seems that people don't know how to hold a conversation. When you quote someone, you have to address his/her opinion.
If you quote him/her and you change the subject/context of his original opinion, then you had better stop posting.
It's as if you started a topic about Mantle and I quoted you but only talked about DX9, because, you know, I thought you were cherry-picking and wanted to ridicule you and your opinion.

Way to go,
:p
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Yes, but other people are, because as mentioned that's what real "heavy gamers" in the real world own - a large collection of them - and they typically want the highest and most consistent performance. How many DX12 games do you own? "You don't want to talk about DX9-11 games that form 99% of the PC gaming market because that's just silly". You are simply talking around people with empty fluff, not to them, then tying yourself up in knots when they don't share the same arbitrary benchmarking restrictions as your imaginary "Mr. 5-game 'power' gamer"... ;)

If YOU want to talk about something other than what I was talking about, you are free to do so, but next time don't quote me and change the subject. You could present your own point of view that older games are faster with the Core i3 without quoting me, and I wouldn't even try to dispute it.
But when you quote me, it means you have to say something about my opinion, in this case about Mantle/DX12 and the latest AAA games.

That is how we hold a conversation; if everyone talks about different things, we had better stop right now.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
So we are back to "something in the future" making the FX good as the new goalpost.

Just keep waiting and waiting. Unfortunately, history has shown us that FX CPUs age terribly.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Why are we arguing about an 8320e at 4.4 GHz? Yeah, it's cheap, can be obtained even on some questionable 4+1 boards (and definitely on 6+2/8+2 boards), and doesn't suck up that much power compared to the 8350 or other, older FX chips.

Is that high-end? No. Why is that even in this discussion? No 8320e is going to push the kind of frames a true high-end user is going to want, certainly not in the latest game titles.

8320es with mild overclocks are for budget rigs. They can do "okay" in games. You can probably push one hard enough to get your minimum frames above 30 often enough to be respectable. But that ain't high-end.

Do you consider a 1080p 60Hz gamer a high-end user??? Because 1080p 60Hz gaming is mainstream today. A high-end gamer TODAY is at 1080p 120/144Hz, 1440/1600p, and 4K.
As I have said before, AND provided three reviews (which it seems you didn't even look at), for high-end resolution gaming the FX8350 at DEFAULT is almost as fast as any Intel 6-core Ivy/Haswell. That means even the FX8320E at 4.4GHz will be enough for high-resolution gaming today.

Also, to remind everyone, I haven't said that the FX8320E is a high-end product, but I have clearly said that high-end users/gamers are using it, at least for high-end gaming.
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
There are still A LOT of games that use DX9, and a lot of games also have an option to switch between DX9 and DX10/11. DX9 typically looks awful but runs better on low- to mid-range systems; I had to switch to DX9 in a couple of games on my GTX750Ti (like Aion, Savage Lands and ArcheAge).
Why are we arguing about an 8320e at 4.4 GHz? Yeah, it's cheap, can be obtained even on some questionable 4+1 boards (and definitely on 6+2/8+2 boards), and doesn't suck up that much power compared to the 8350 or other, older FX chips.
I had the 8320E up to 4.2GHz on a 970A-D3 rev3 but wouldn't go past that, otherwise it'd boil the VRM.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
If you quote him/her and you change the subject/context of his original opinion, then you had better stop posting.

If YOU want to talk about something other than what I was talking about, you are free to do so, but next time don't quote me and change the subject.
Are you serious? The thread title is "high end" desktop gaming. The OP was even asking about "triple monitor" setups on the very first page, along with "maximum performance efficiency" and "I don't have a budget". He was specific about "no video render, editing, recording, etc...". You jumped in on page 3 with a post about Kaveri notebook idle times, and followed up with x264 encoding, then x265, then got moderated for insulting on page 4. On page 9 you posted about Cinebench & more video encoding, page 10 is about Quicksync & Handbrake, and on page 15 about A8-7600 low-end APUs. Since then that's morphed into $140 CPU budgets and "Mantle only" (which the OP didn't even mention), and now arguing about arguing.

And you're complaining about other people "changing the subject"? o_O A great deal of self-reflection is needed, I think. I'm really not going to go round in circles with this rubbish any longer. Have a good evening. ;)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
You jumped in on page 3 with a post about Kaveri notebook idle times, and followed up with x264 encoding, then x265, then got moderated for insulting on page 4. On page 9 you posted about Cinebench & more video encoding, page 10 is about Quicksync & Handbrake, and on page 15 about A8-7600 low-end APUs. Since then that's morphed into $140 CPU budgets and "Mantle only" (which the OP didn't even mention), and now arguing about arguing.

Every one of my posts is debating the opinion of the person I quoted; as you can see, I haven't changed the subject of his original post.

For example,

http://forums.anandtech.com/showpost.php?p=37461914&postcount=61

http://forums.anandtech.com/showpost.php?p=37462183&postcount=81

http://forums.anandtech.com/editpost.php?do=editpost&p=37462184

And you're complaining about other people "changing the subject"?

Did you actually read what I said?? I'm not complaining about people changing the subject; I'm complaining because YOU quoted something I said and then changed the subject to suit your agenda.

I'm really not going to go round in circles with this rubbish any longer. Have a good evening. ;)

You too ;)
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Do you consider a 1080p 60Hz gamer a high-end user??? Because 1080p 60Hz gaming is mainstream today. A high-end gamer TODAY is at 1080p 120/144Hz, 1440/1600p, and 4K.
As I have said before, AND provided three reviews (which it seems you didn't even look at), for high-end resolution gaming the FX8350 at DEFAULT is almost as fast as any Intel 6-core Ivy/Haswell. That means even the FX8320E at 4.4GHz will be enough for high-resolution gaming today.
Did you read the reviews you posted?
What kind of high-end gamer is going to play with 5-6fps minimums?
And that is not the only game with a ridiculous minimum; in fact, out of the four games, all but one have sub-20fps minimums.
Metro only gives a 20fps average and Sleeping Dogs 40fps.
A high-end gamer wouldn't be caught dead with that kind of FPS.
[Chart: Radeon R9 290X CF, AMD AM3 vs Intel LGA 1150 vs LGA 2011 at 4K]
 

DrMrLordX

Lifer
Apr 27, 2000
22,934
13,021
136
Do you consider a 1080p 60Hz gamer a high-end user???

No, not even close. A high-end user is trying to push 60+ minimum fps at 1440p and/or moving to 4K.

I had the 8320E up to 4.2GHz on a 970A-D3 rev3 but wouldn't go past that, otherwise it'd boil the VRM.

Sounds about right. Some people get luckier and can hit 4.4-4.5, though I think they may be playing with fire. That's still better than situations where 8350s, at stock, would overwhelm 4+1 boards.

A high-end gamer wouldn't be caught dead with that kind of FPS.

Ding ding ding, we have winnar.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
So we are back to "something in the future" making the FX good as the new goalpost.

Just keep waiting and waiting. Unfortunately, history has shown us that FX CPUs age terribly.

The CPUs aged fine. The platform is another thing.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
IF FX is so good, why is AMD going back to the drawing board for Zen? They are tacitly admitting FX is a failure and distancing themselves from it.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Did you read the reviews you posted?
What kind of high-end gamer is going to play with 5-6fps minimums?
And that is not the only game with a ridiculous minimum; in fact, out of the four games, all but one have sub-20fps minimums.
Metro only gives a 20fps average and Sleeping Dogs 40fps.
A high-end gamer wouldn't be caught dead with that kind of FPS.
[Chart: Radeon R9 290X CF, AMD AM3 vs Intel LGA 1150 vs LGA 2011 at 4K]

Why do you always have to cherry-pick a single graph out of three reviews???

Here is an FX8350 against a Core i7 4930K at 4K, with GTX 780 SLI and GTX 980 SLI.
One is $140, with a $120 990FX motherboard that has an M.2 slot. The other one is more than $1,000.

http://www.tweaktown.com/tweakipedi...g-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html

[Five benchmark charts: AMD FX-8350 powering GTX 780 SLI vs GTX 980 SLI at 4K]


That answers the OP's question magnificently:

Do high-end users use AMD instead of Intel??? YES, they do.

/end of thread
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Why do you always have to cherry-pick a single graph out of three reviews???

Here is an FX8350 against a Core i7 4930K at 4K, with GTX 780 SLI and GTX 980 SLI.
One is $140, with a $120 990FX motherboard that has an M.2 slot. The other one is more than $1,000.
I posted one very representative pic and commented on the rest.
Even with the pics you posted now, the FPS are crap and the minimum FPS are absolute crap, and those are all old games. You have gone from defending the position that a "high-end gamer will only play the latest AAA games like Witcher 3" to posting graphs of 4-5 year old games, because once again you found a situation where the VGA slows down any CPU; they could have put a Celeron in this benchmark and it too would have the same framerates.
(And that would only be like $100 total.)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I posted one very representative pic and commented on the rest.
Even with the pics you posted now, the FPS are crap and the minimum FPS are absolute crap, and those are all old games. You have gone from defending the position that a "high-end gamer will only play the latest AAA games like Witcher 3" to posting graphs of 4-5 year old games, because once again you found a situation where the VGA slows down any CPU; they could have put a Celeron in this benchmark and it too would have the same framerates.
(And that would only be like $100 total.)

I have never said high-end gamers only play the latest AAA games; stop putting words in my mouth.

You know what is funny? The only one here supporting what he is saying by providing reviews and graphs is me. I haven't seen you or anyone else post a single review or graph to support your point of view that high-end USERS only use Intel, and why.

Those are 4K gaming benchmarks with high or even very high image quality settings. If you don't like the minimums, you can always dial down some settings to increase performance. But most of those games are perfectly playable as tested in the slides above.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
IF FX is so good, why is AMD going back to the drawing board for Zen? They are tacitly admitting FX is a failure and distancing themselves from it.

Not to mention the AMD Quantum...with an Intel CPU.



It's pretty amazing to see someone defend FX CPUs when even the company making them won't use them. That's really out of touch with reality.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I'm sure some gamers do pair FX chips with high-end video cards, but many gamers who are looking for 1080p120/1440p120 or 144Hz are upgrading from FX chips now because they're badly CPU-limited when trying to hit those framerates. 4K60 is another matter: if you crank up the settings until you're getting 30fps or less, you're definitely not likely to see a big difference between an FX and an Intel CPU.

I definitely wouldn't drop $$$$ on a setup like that and be content with low framerates; I'd drop graphical settings, and once you do that, you'll notice the Intel setup pull away from the AMD.
 

DrMrLordX

Lifer
Apr 27, 2000
22,934
13,021
136
Why do you always have to cherry-pick a single graph out of three reviews???

Here is an FX8350 against a Core i7 4930K at 4K, with GTX 780 SLI and GTX 980 SLI.
One is $140, with a $120 990FX motherboard that has an M.2 slot. The other one is more than $1,000.

Again, the minimum frames are too low on everything except GRID and DiRT. Some of the minimum frames are horrifyingly low, such as those in Metro: Last Light.

A high-end user, today, isn't going to use an 8350 or a 4930K. I also expect they're going to want something a little better than 980s in SLI...
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Again, the minimum frames are too low on everything except GRID and DiRT. Some of the minimum frames are horrifyingly low, such as those in Metro: Last Light.

As I have explained here in post 19, those minimum fps in Metro Last Light occur only once in the entire benchmark run, and they don't pose an alarming threat to playability.

You can see below that the lowest minimum of 13.32fps was recorded at frame 8, but in the rest of the run the fps is almost always above 30.

[Chart: Metro Last Light per-frame fps; the 13.32fps minimum occurs once, at frame 8]



A high-end user, today, isn't going to use an 8350 or a 4930K. I also expect they're going to want something a little better than 980s in SLI...

Well, even if you use GTX 980 Tis in SLI you will still be GPU-limited at 4K in the vast majority of games, especially the latest AAA titles. The FX 8-core CPUs are more than capable of the job; you don't need to spend $400-$1,000 on the CPU alone if you want to game at 4K, but you do have to spend $1,000-$2,000 on the GPUs.
 

crashtech

Lifer
Jan 4, 2013
10,695
2,293
146
It seems like one dropped frame could safely be ignored, but it's not always easy to tell if it is an outlier or not unless there is a graph.
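This is the standard argument for percentile-based "lows": one hitch can own the raw minimum while saying almost nothing about the rest of the run. Below is a minimal sketch of the idea in Python; all of the frame times are invented for illustration, not taken from any review in this thread.

```python
# Sketch: raw minimum fps vs a percentile-based "1% low".
# All numbers here are hypothetical, not measurements from this thread.

def fps_stats(frame_times_ms):
    """Average, raw minimum, and ~1st-percentile fps from per-frame times (ms)."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # instantaneous fps, ascending
    n = len(fps)
    return {
        "avg fps": n * 1000.0 / sum(frame_times_ms),
        "min fps": fps[0],            # dominated by the single worst frame
        "1% low fps": fps[n // 100],  # value near the 1st percentile; outlier-resistant
    }

# 1000 smooth frames at ~16.7 ms (~60 fps) plus one 75 ms hitch (~13.3 fps),
# mimicking a run whose recorded minimum occurs exactly once:
times = [16.7] * 1000 + [75.0]
for name, value in fps_stats(times).items():
    print(f"{name}: {value:.1f}")
# -> avg ~59.7, min ~13.3, 1% low ~59.9: the raw minimum screams "unplayable",
#    while the percentile view shows the run was smooth almost throughout.
```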
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I'm sure some gamers do pair FX chips with high-end video cards, but many gamers who are looking for 1080p120/1440p120 or 144Hz are upgrading from FX chips now because they're badly CPU-limited when trying to hit those framerates.

Most games, and especially the latest AAA titles from 2013 onwards, are GPU-limited at 120-144Hz, not CPU-limited. Especially in Mantle titles like BF4 and Hardline, shooters where you want 120-144fps, I am GPU-limited with an FX8350 overclocked to 4.5GHz and an HD7950 at 1GHz, and cannot increase the image quality any higher. The CPU at those settings can output more than 160fps, but once I increase the image quality the GPU becomes the bottleneck and performance drops below 100fps. (video below)

https://www.youtube.com/watch?v=ADaALZ0RWDA&index=7&list=PLPPlscE2CXdFD3sh6m6sNiPB9GdtPU0wU


I definitely wouldn't drop $$$$ on a setup like that and be content with low framerates; I'd drop graphical settings, and once you do that, you'll notice the Intel setup pull away from the AMD.

No it wouldn't; you are getting low fps at 4K because you are GPU-limited. Decrease the image quality settings at 4K and you will still be GPU-limited, but your GPUs will be able to run faster. The CPU still won't break a sweat keeping up with the GPUs.
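For what it's worth, the disagreement in the last two posts reduces to a simple bottleneck model: delivered framerate is roughly the minimum of what the CPU and the GPU can each sustain at the chosen settings. A toy sketch follows; every figure in it is an illustrative assumption, not a measured result.

```python
# Toy bottleneck model: delivered fps ~= min(CPU ceiling, GPU ceiling).
# All figures below are invented for illustration, not measurements.

def delivered_fps(cpu_ceiling, gpu_ceiling):
    # Whichever component sustains fewer frames per second sets the pace.
    return min(cpu_ceiling, gpu_ceiling)

fx_ceiling, i7_ceiling = 90.0, 160.0  # hypothetical CPU-limited framerates

# At 4K ultra the GPU ceiling is low, so both CPUs deliver the same fps:
gpu_4k_ultra = 45.0
print(delivered_fps(fx_ceiling, gpu_4k_ultra))   # 45.0
print(delivered_fps(i7_ceiling, gpu_4k_ultra))   # 45.0

# Drop the settings until the GPU can sustain 120 fps and the CPUs diverge:
gpu_4k_medium = 120.0
print(delivered_fps(fx_ceiling, gpu_4k_medium))  # 90.0  (the FX is now the bottleneck)
print(delivered_fps(i7_ceiling, gpu_4k_medium))  # 120.0 (the Intel setup pulls ahead)
```

Whether lowering settings actually exposes a CPU difference depends on where the GPU ceiling lands relative to each CPU's ceiling, which is exactly what is being disputed here.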
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It seems like one dropped frame could safely be ignored, but it's not always easy to tell if it is an outlier or not unless there is a graph.

Yes, I agree; that is why I keep every Metro Last Light benchmark run I make.
 
Aug 11, 2008
10,451
642
126
As I have explained here in post 19, those minimum fps in Metro Last Light occur only once in the entire benchmark run, and they don't pose an alarming threat to playability.

You can see below that the lowest minimum of 13.32fps was recorded at frame 8, but in the rest of the run the fps is almost always above 30.

[Chart: Metro Last Light per-frame fps; the 13.32fps minimum occurs once, at frame 8]





Well, even if you use GTX 980 Tis in SLI you will still be GPU-limited at 4K in the vast majority of games, especially the latest AAA titles. The FX 8-core CPUs are more than capable of the job; you don't need to spend $400-$1,000 on the CPU alone if you want to game at 4K, but you do have to spend $1,000-$2,000 on the GPUs.

Do you really realize how absurd this argument is? First of all, a 4790K isn't $400, much less $1,000. In any case, a 4690K is almost as fast for gaming. It will also be faster than the 8350 in pretty much every game should you decide to turn down the resolution or settings to get a better framerate. The price difference is only about $100 compared to the 8350. So are you seriously saying that someone who spends $1,000 to $1,300 on graphics cards, $100 or more on a PSU to drive them, probably 16 or 32GB of RAM, an expensive monitor, etc., etc. ($2,000 or more for the system) should limit his flexibility to get the best framerate at other (non-GPU-limited) settings by saving $100 on the CPU? This argument is so illogical it makes my head want to explode. One can make a reasonable argument for using an AMD CPU in a few low/mid-range price classes, but in a high-end $1,000-plus system it is just absurd.
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
Actually, a 4790K is $410 and an 8350 is $230 here in Canada...

And I'd still buy an i7 4790K.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
So we are back to "something in the future" making the FX good as the new goalpost.

Not new, just recycled. AMD advocates have been waiting for "something in the future" to make their processors shine since the original Phenom. It's just excuse after excuse, followed by something "in the future" that will let them finally say "I told you so". First it was programs needing to be better optimized, then the Windows 7 scheduler couldn't handle the Bulldozer architecture, then they thought the consoles would save them (that one is my personal favorite, by the way), then there was Mantle, then/now DX12, and now we've added 4K to the mix.

The sheer number of excuses and concessions that some people are willing to make for what can only be described as an underperforming processor is comedic on the surface, but also a bit sad if you think about it.