Do high-end users use AMD instead of Intel?


DrMrLordX

Lifer
Apr 27, 2000
I have made a point that High-End users are using AMD FX 8-core CPUs for High-Resolution Gaming. I have presented numerous slides and review links to back up my claims. But nobody, and I repeat nobody, has posted higher-than-1080p slides except me.

The problem here is that:

a). the benches you're posting do not feature the fastest CPUs produced by AMD or Intel. There are no 9590s or 5960X/5930k benchmarks there. I see the 8350 and 4930k . . .

b). the framerates you're showing in those benchmarks are almost universally bad, certainly not within the range of what a "high-end" user would tolerate.

All you've done is make the case against 4K gaming, period, by showing it in a poor light. Do you really think that a guy with an overclocked 5930K and Titan X SLI is going to have those problems? How about quad SLI?

I have looked at some 4k TitanX SLI benches, and I see a mixed bag. First check out this one:

http://hexus.net/tech/reviews/graphics/81892-nvidia-geforce-gtx-titan-x-sli/?page=4

65 fps minimum with Titan X SLI at 4K with all the bells and whistles. And that's on a 4770K. Then there's this one, admittedly from a test system whose specs are not listed (boo):

http://www.pcgamer.com/benchmarks-gtx-titan-x-in-sli/

Here we see Titan X SLI hitting 16 fps minimums (56 fps average) in the same game at 4K (settings unknown, though they probably couldn't have used higher settings than what Hexus used in their review).

What can we conclude from this?

1). If you want to make 4k look bad on everything, there are sites out there that have the data you want, like PC Gamer. If you are (somehow) putting up genuinely-awful framerates such as 16 minimum/56 average with TitanX SLI in a game that has been around for a little while, chances are you can put up those less-than-stellar numbers with any number of different CPUs. That doesn't make the CPUs equal in any sense - it just means the benchmark settings are off, or there was a problem with the driver at the time, or something.

2). If you do whatever it was that Hexus did to get a minimum of 65 fps with a 4770k, then you have to open up to the possibility that changing the CPU in that situation might actually affect your framerates. The GPUs were obviously able to handle the work of running Bioshock Infinite at 4K for Hexus with all the bells and whistles, including AA, which many gamers turn off at that res anyway (it doesn't help much, or so they say). Do you really think an FX chip at 4.4 GHz is gonna break 60 fps minimum in that situation, when many benchmarks show 9590s not breaking 60 fps minimum in games at 1440p? I think not.

He doesn't know what he's talking about.

This is what is commonly known as projection. See here for further info: https://en.wikipedia.org/wiki/Psychological_projection

I'd at least like to give him a chance to explain his intimations rather than just leaving them as vague comments. I can think of nothing that points to fundamental design flaws in the X99 platform that makes systems based on it inherently unstable. Most, if not all users of the rather-expensive tech have reported that it's the bee's knees.

One of the SKUs has a low voltage margin, 9.8% actually, which is at the limit of the 10% minimum requirement.


That's not enough to guarantee stability under heavy load; this was "solved" by increasing the motherboard voltages, which many users have pointed to as an error by the manufacturers, while it is not...





http://www.hardware.fr/articles/924-6/overclocking.html
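For what it's worth, the quoted 9.8% figure presumably comes from arithmetic along these lines. This is only a rough sketch in Python; the voltages are invented placeholders, not hardware.fr's actual measurements:

# Minimal sketch of the undervolt-margin arithmetic described in the quote above.
# Both voltages are hypothetical placeholders, not measured values.
stock_voltage = 1.02        # factory-set load voltage for the SKU, in volts (assumed)
min_stable_voltage = 0.92   # lowest voltage that remained stable under load (assumed)

margin_pct = (stock_voltage - min_stable_voltage) / stock_voltage * 100
print(f"Undervolt margin: {margin_pct:.1f}%")   # ~9.8% with these placeholder numbers

REQUIRED_MARGIN_PCT = 10.0  # the "10% minimal requirement" cited above
print("Within the guard band" if margin_pct >= REQUIRED_MARGIN_PCT
      else "Below the 10% guard band")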

So I read the article via Google translate, and found nothing in there where the article's author claims:

a). that the platform is inherently unstable
b). that "wrong" voltages are being set by the UEFI or in CPU microcode

All he did was overclock some Haswell-Es and hit clockspeed walls at 1.3v that weren't all that great. In other news, people on this forum have gotten better results out of their Haswell-Es. The author of the article DID have this to say:

Although it is always difficult to draw conclusions on the basis of a limited number of processors, we still found fairly low limits on our three chips, at 4.1 to 4.2 GHz. Where 4.5 GHz was attainable with Sandy Bridge-E at reasonable power consumption, we had to settle for 200 to 300 MHz less with Ivy Bridge-E, and Haswell-E does not seem to do any better, or even a little worse. As consolation, given the relatively low base frequency of the i7-5960X, the gain is still significant.

In other words, the benchmark author did not win the silicon lottery. Oh well.
 

Abwx

Lifer
Apr 2, 2011
Old and tired Vishera with crappy IPC and high power consumption doesn't belong in a high-end system, so no, real high-end users own an LGA2011 system and have been using Intel for a few years.

Lol, because LGA2011 doesn't consume a lot of power? It didn't take you long to slide into a double standard...

That said, I will concede that it's high-end given that it's a proven platform; generally Intel throws the experiments to the enthusiasts and then releases the server variants once the whole thing is debugged enough to suit the professionals...
 

2is

Diamond Member
Apr 8, 2012
Undervolting is not to check how much power you could save but to check the voltage margin implemented by the manufacturers...

For the record, the 4790K is also quite mediocre in this respect. Good configurations go up to the 8370 for AMD and the 4770 for Intel; all the rest are experimental builds that certainly suit the enthusiast but not the professional.

It's not unstable. Period. You don't have a clue what you're talking about, just as I said. Prove me wrong instead of right next time.
 

crashtech

Lifer
Jan 4, 2013
Undervolting is not to check how much power you could save but to check the voltage margin implemented by the manufacturers...

For the record, the 4790K is also quite mediocre in this respect. Good configurations go up to the 8370 for AMD and the 4770 for Intel; all the rest are experimental builds that certainly suit the enthusiast but not the professional.
Your assertions seem drawn from thin air. Link to a respected professional site to back up your dubious claims that thousands of retail CPUs are actually "experiments" in disguise.
 

BSim500

Golden Member
Jun 5, 2013
I have made a point that High-End users are using AMD FX 8-core CPUs for High-Resolution Gaming. I have presented numerous slides and review links to back up my claims. But nobody, and I repeat nobody, has posted higher-than-1080p slides except me.
Many people have explained the issue numerous times: no one cares about building a $2,000 high-end rig only to "enjoy" 22fps averages with 5-8fps slowdowns for "high end" gaming. Just because someone buys a 4K monitor doesn't mean they throw all common sense in the trash.

It's like being back in the era of old Tom's Hardware Guide 640x480 benchmarks (in an age of 1280x1024 monitors) used to artificially skew CPU comparisons and "prove" that one chip is +60% faster than another. Except now it's inverted, i.e., the "objective" appears to be to artificially cripple any CPU that's faster overall than an FX-8320 in most games at most resolutions, simply so you can call budget CPUs "high end" under fringe GPU-overload testing conditions, whilst continuing with the red herring of comparing only Intel's most expensive chips against AMD's cheapest and ignoring everything in between.

In reality, even "high end gamers with 4K screens" are going to turn settings or resolution down if it avoids stupid single-digit slowdowns that are well into the "4K playability cliff", where min fps falls away exponentially faster than avg fps compared to lower resolutions, due to the same massive GPU bottleneck that temporarily makes weak CPUs look relatively better than they are (and even then only in a subset of AAA games on certain settings). If you're going to intentionally cripple frame rates with massive GPU bottlenecks in CPU benchmarks, then the only thing those charts show is that you could stick literally ANY CPU in there and call it "high end" too if you crippled the GPUs enough... Likewise, in half those benchmarks the 4770K is faster than the 4930K, but of course your comparisons 'look better' when you only compare FX-8320 vs 4930K prices, right? :sneaky:

The problem is that, after a while, when new GFX cards come out, early charts that show less than a 3% difference between an FX-9590 and an X4 760K gradually morph into this. It's a pattern that gets repeated every time there's a "resolution jump": AMD benefits most during year one, when it's something like "27 vs 28fps"; then one or two years later a big GFX upgrade comes along, the result is now more like "30 vs 55fps", and the illusion of "high-end equality" falls away. On top of that, the same underlying issue still stands: FX chips with weak per-core performance are still far more inconsistent and "perfect thread scaling dependent" from one game to another at ANY resolution. 4K GPU overloads of cherry-picked games merely try to give the illusion of "closing the gap", not by raising AMD's numbers up but by temporarily nerfing Intel's down with glorified frame-rate caps. As soon as GFX speeds pick up (or you add a third card in SLI, etc.), "the gap" widens again...

What you're ultimately doing isn't really selling or arguing over genuinely "high end" chips, but rather promoting budget ones to take advantage of the short lag between people buying a 4K monitor and GPUs becoming fast enough that 4K is no longer a hard GPU bottleneck, and hoping people won't notice the difference between high end and budget if they use stupid settings that give them miserable gameplay. ;)
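To see why a hard GPU bottleneck flattens CPU differences, here's a toy model in Python (a rough sketch only; all the frame rates are invented for illustration, not taken from any benchmark):

# Toy model: delivered fps is roughly capped by whichever of the CPU or GPU
# is the slower limit for a given scene. Numbers are invented for illustration.
def delivered_fps(cpu_limit_fps: float, gpu_limit_fps: float) -> float:
    """Frame rate is bounded by the slower of the two limits."""
    return min(cpu_limit_fps, gpu_limit_fps)

budget_cpu, fast_cpu = 60.0, 110.0   # hypothetical CPU-limited frame rates

# Heavily GPU-bound (e.g. 4K, max settings): both CPUs look "equal".
print(delivered_fps(budget_cpu, 28.0), delivered_fps(fast_cpu, 28.0))    # 28.0 28.0

# After a big GPU upgrade (or at lower settings), the gap reappears.
print(delivered_fps(budget_cpu, 120.0), delivered_fps(fast_cpu, 120.0))  # 60.0 110.0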
 

Enigmoid

Platinum Member
Sep 27, 2012
A CPU needs to be able to hit 60 fps averages without terrible minimums before it can be called sufficient. Ideally it should be able to hold 60 fps minimums. Barring that, it should at least be at or near the top of its tier. Getting 40 fps average and 30 fps minimums in any game is irrelevant, because anyone who spends that much on hardware is going to want 60 fps.

The idea of a balanced build is also important. There is no sense throwing $1,000 at GPUs and only $200 at the CPU when the CPU has the potential to be the longest-lived component in a modern PC.
 

YBS1

Golden Member
May 14, 2000
Lol, because LGA2011 doesn't consume a lot of power? It didn't take you long to slide into a double standard...

Lol, now you're just being intentionally obtuse. There is a massive difference between using a lot of power to provide the fastest experience you can buy and using a lot of power to provide middle-of-the-road performance at best. To be fair, I couldn't care less how much power a CPU/video card uses provided it performs accordingly. Let AMD bring out a 625W-TDP CPU/video card that lays waste to its peers and I'll be in line to buy it.
 

Man I Suck

Member
Apr 21, 2015
Does this forum also do Smackdown Of The Year threads where people submit and vote on the thread with the most epic smackdowns? Or does this kind of smackdown happen all the time here?
 

Pariah

Elite Member
Apr 16, 2000
AMD uses Intel CPUs in their high-end systems. Why would any consumer use AMD in their high-end system, when even AMD won't?
 

ShintaiDK

Lifer
Apr 22, 2012
AMD uses Intel CPUs in their high-end systems. Why would any consumer use AMD in their high-end system, when even AMD won't?

That's the question you won't get an answer to from the people telling you FX is high end.
 
Aug 11, 2008
Hmm, so you are saying a high-end user uses FX, but FX is not a high-end CPU? So then, when he is using FX, is he temporarily not a high-end "user"? The quibbling over semantics and shifting of goalposts in this thread is hilarious, but if one really tries to understand it logically, his head could explode.
 

mrmt

Diamond Member
Aug 18, 2012
Hmm, so you are saying a high-end user uses FX, but FX is not a high-end CPU? So then, when he is using FX, is he temporarily not a high-end "user"? The quibbling over semantics and shifting of goalposts in this thread is hilarious, but if one really tries to understand it logically, his head could explode.

1) Thou shalt not badmouth the products that feed thy mouth.
 

AtenRa

Lifer
Feb 2, 2009
Hmm, so you are saying a high-end user uses FX, but FX is not a high-end CPU? So then, when he is using FX, is he temporarily not a high-end "user"? The quibbling over semantics and shifting of goalposts in this thread is hilarious, but if one really tries to understand it logically, his head could explode.

That was my first and only opinion regarding the OP's question; I haven't changed it since then and I will not in the future.

"Do high end user use AMD instead of Intel?"

Well, the answer to the OP's question is YES: there are many high-end users who use AMD instead of Intel. There are many High-End Gamers with High-End AMD/NVIDIA graphics cards using AMD CPUs.

All those below are regarded as high-end gamers:

High-End 4K Gamer with FX 8-core + Titan
High-End 4K Gamer with FX 8-core + 2x CrossFire R9 290/X
High-End 4K Gamer with FX 8-core + 2x SLI GTX980/70
High-End 4K Gamer with FX 8-core + Fury

High-End 3/5x Eyefinity Gamer with FX 8-core + Titan
High-End 3/5x Eyefinity Gamer with FX 8-core + 2x CrossFire R9 290/X
High-End 3/5x Eyefinity Gamer with FX 8-core + 2x SLI GTX980/70
High-End 3/5x Eyefinity Gamer with FX 8-core + Fury

High-End 1440/1600p Gamer with FX 8-core + Titan
High-End 1440/1600p Gamer with FX 8-core + 2x CrossFire R9 290/X
High-End 1440/1600p Gamer with FX 8-core + 2x SLI GTX980/70
High-End 1440/1600p Gamer with FX 8-core + Fury

High-End 1080p 120/144Hz Gamer with FX 8-core + Titan
High-End 1080p 120/144Hz Gamer with FX 8-core + 2x CrossFire R9 290/X
High-End 1080p 120/144Hz Gamer with FX 8-core + 2x SLI GTX980/70
High-End 1080p 120/144Hz Gamer with FX 8-core + Fury
 

Phynaz

Lifer
Mar 13, 2006
That was my first and only opinion regarding the OP's question; I haven't changed it since then and I will not in the future.

Not changing your opinion in the face of facts indicates that your beliefs about AMD are too closely tied to your sense of self. It's not your fault though, it's human nature.

Kelly Garrett and Brian Weeks did a study on misinformation:
At first, it appeared as though the correction did cause some people to change their false beliefs. But, when the researchers took a closer look, they found that the only people who had changed their views were those who were ideologically predisposed to disbelieve the fact in question. If someone held a contrary attitude, the correction not only didn’t work—it made the subject more distrustful of the source.

You are not ideologically predisposed to think of AMD in anything other than the best terms. Therefore it follows that you will not change your opinion of AMD or its products. Or Intel for that matter.

Consider this quote from John Maynard Keynes:
When events change, I change my mind. What do you do?
When the facts change, I change my mind. What do you do, sir?
When my information changes, I alter my conclusions. What do you do, sir?
When someone persuades me that I am wrong, I change my mind. What do you do?

This is the logical way to address facts.
 

Yuriman

Diamond Member
Jun 25, 2004
I used to be a "high-end gamer" with low-end hardware. I did competitive stuff on a Geforce 4 MX440 + AMD Duron.

However, I'm pretty certain that what you're saying is not in the spirit of what OP was asking.
 

Zucker2k

Golden Member
Feb 15, 2006
Question: "Do high end user use AMD instead of Intel?"

Short, general answer: Rarely (with emphasis on users).

Short, logical answer: No, it's Intel all the way (with emphasis on performance).

So, in both instances, Intel comes out on top.
 