AMD FX-8350 powering GTX 780 SLI vs GTX 980 SLI at 4K


MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Not sure why you felt the need to break down into full-on AMD apologetics mode. AMD is "good enough" for 60Hz; that's certainly a turd you can keep on polishing with your AMD apologetics.

Well, with a quote like that -- it's clear that you're just an Intel shill who can't handle the fact that AMD seems to work better at 4K resolutions for half to a third of the price of their rival's CPU.

Insults are not allowed here -- Markfw900

I own both Intel and AMD -- and AMD games just as well at 120 Hz. You clearly haven't touched anything they've made in years.

It is not just one source saying AMD FX CPUs are a better choice at 4K -- I've counted at least three: Tweaktown, Forbes, and Kotaku. I wish there were more benchmarks at those high resolutions; maybe Anandtech could post some to confirm the performance observed by these others.
 
Last edited by a moderator:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
All my CPUs are Intel (as they are the best at making CPUs)

I just find it funny that when an AMD chip wins a benchmark, something is wrong with the benchmark.

Just let them have the win; they don't win very many these days.


It's not a matter of letting "them" have the win (I have AMD systems myself), but the 3DMark graphics suite should score about the same regardless of processor. I can find no logical reason why an FX-8 would outperform an Ivy Bridge hex with Hyper-Threading by far more than the margin of error when both should be scoring the same as a Pentium or Kaveri.

Well, with a quote like that -- it's clear that you're just an Intel shill who can't handle the fact that AMD seems to work better at 4K resolutions for half to a third of the price of their rival's CPU.

I own both Intel and AMD -- and AMD games just as well at 120 Hz. You clearly haven't touched anything they've made in years.

It is not just one source saying AMD FX CPUs are a better choice at 4K -- I've counted at least three: Tweaktown, Forbes, and Kotaku. I wish there were more benchmarks at those high resolutions; maybe Anandtech could post some to confirm the performance observed by these others.


I haven't researched 4K specifically, and I agree that we now need more 4K processor benchmarks, given this data that doesn't fit with our current knowledge. As I understand it, scaling resolution should increase GPU load and do nothing for CPU load, so a CPU lead by Intel chips at low resolution shouldn't be reversed at high resolution, unless there's something going on we don't know about.

As for 120FPS on FX chips, it depends on your game. I know few people care about Guild Wars 2 anymore, but neither AMD nor Intel can provide 120FPS in it at present:

[Image: CPU-Cores.png]


FX chips make a nice showing in Battlefield, but ultimately you're still better off with an i7, especially if you plan to overclock:

[Image: Battlefield 4: China Rising CPU benchmark (bf_4_proz.jpg, gamegpu.ru)]
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Well, with a quote like that -- it's clear that you're just an Intel shill who can't handle the fact that AMD seems to work better at 4K resolutions for half to a third of the price of their rival's CPU.

I own both Intel and AMD -- and AMD games just as well at 120 Hz. You clearly haven't touched anything they've made in years.

It is not just one source saying AMD FX CPUs are a better choice at 4K -- I've counted at least three: Tweaktown, Forbes, and Kotaku. I wish there were more benchmarks at those high resolutions; maybe Anandtech could post some to confirm the performance observed by these others.

SEEMS

There is something wrong with that "SEEMS", especially when you consider that resolution should be CPU-independent.

IMO there appear to be large systematic errors in that test.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Well, with a quote like that -- it's clear that you're just an Intel shill who can't handle the fact that AMD seems to work better at 4K resolutions for half to a third of the price of their rival's CPU.
If you read all of my posts in the thread, you can see that:

1. The specific post you targeted was part of an off-tangent discussion about 120Hz, not 4K.

2. I wouldn't be surprised if AMD were just as good as Intel for most situations at 4K, because the resolution is very GPU-limited; it just doesn't make sense for one CPU to be faster than another in such a GPU-limited scenario (implying something else is likely going on).

3. If there is anything I "can't handle", it is the test's lack of comprehensiveness. I want to know what is producing the results: is it Intel that is at fault? Is there something about the AMD architecture/platform that gives it an advantage? Is there an error in the testing? Perhaps some combination of the above?


I own both Intel and AMD -- and AMD games just as well at 120 Hz. You clearly haven't touched anything they've made in years.
I'll fully admit I haven't owned anything AMD since the 5850 (I actually still have that card, and will probably never get rid of it for nostalgia's sake), but that doesn't mean I don't have experience with their parts from building/repairing/testing rigs for friends, etc. And I could just as easily flip the baseless accusation around by claiming it's clear you haven't tried pushing for 120Hz, and/or simply have far less demanding standards when doing so, because from my experience that certainly appears to be the case.

But since you have both AMD and Intel at your fingertips, could you specify which parts, which 120+Hz monitor, which games, and which settings lead you to find AMD just as good as Intel? Maybe I've been doing it wrong all the times I've worked with the hardware (experience that is only reinforced by the results and third-party experiences of the majority of my online encounters; claims like yours and AtenRa's are the minority).

It is not just one source saying AMD FX CPUs are a better choice at 4K -- I've counted at least three: Tweaktown, Forbes, and Kotaku. I wish there were more benchmarks at those high resolutions; maybe Anandtech could post some to confirm the performance observed by these others.
The number of people who claim something has no bearing on whether that thing is true; we need evidence to back up those claims, and thus far only the Tweaktown article has any results, as you have yet to post this Kotaku source. A search for "kotaku + amd + 4k" turned up this article, which isn't even hearsay (unlike the Forbes article, where the author is actually called out in the comments section and has since failed to produce anything I could find), because it doesn't even try to claim AMD is as good as, better than, or worse than Intel (or nVidia in the case of GPUs), just that it works; the entire premise of the article is simply to explore what's going on with AMD hardware and what it might be capable of. Maybe there's some other Kotaku article with tests between AMD and Intel @ 4K? I have yet to find it...

Extraordinary claims require extraordinary proof, and AMD being faster than Intel's HEDT at anything is pretty extraordinary. Science relies on reproducibility, i.e. others should be able to run these tests (hopefully in a more comprehensive manner, to help shed light on what might be going on) and achieve the same (or very similar) results. Since we have only the one set of results, maybe now you understand why I'm still skeptical?

I 100% agree that I would love to see more tests from a site as reputable and comprehensive as Anandtech, so that we might better understand what's going on in the hardware if these results are accurate.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
But since you have both AMD and Intel at your fingertips, could you specify which parts, which 120+Hz monitor, which games, and which settings lead you to find AMD just as good as Intel? Maybe I've been doing it wrong all the times I've worked with the hardware (experience that is only reinforced by the results and third-party experiences of the majority of my online encounters; claims like yours and AtenRa's are the minority).

First off -- people who buy CPUs from both vendors are an extreme minority to start with. Those who actually take an unbiased view of their respective performance whittle that down to probably three people total on this entire forum.

In real-world gaming, my i7-3770K and FX-8320 provide virtually identical performance -- any difference in frames per second is generally academic at best. Only once you edge up beyond a 3770K does Intel start to have a clear advantage in processing power, and that advantage entirely evaporates every time the GPU bottlenecks the system... which is pretty much EVERY 4K game. This really isn't rocket science.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
First off -- people who buy CPUs from both vendors are an extreme minority to start with. Those who actually take an unbiased view of their respective performance whittle that down to probably three people total on this entire forum.

In real-world gaming, my i7-3770K and FX-8320 provide virtually identical performance -- any difference in frames per second is generally academic at best. Only once you edge up beyond a 3770K does Intel start to have a clear advantage in processing power, and that advantage entirely evaporates every time the GPU bottlenecks the system... which is pretty much EVERY 4K game. This really isn't rocket science.

What isn't rocket science is that this tangent you initially took issue with was a side discussion about 120Hz, not 4K; hopefully I don't have to clear that up yet again.

Also, I've always maintained that 4K should show little difference because it is so GPU-limited, which is precisely why I was suspicious of the numbers: neither CPU should be consistently faster than the other, let alone the AMD CPU, which is otherwise slower in almost every scenario other than 4K gaming.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
I would like to point out a common fallacy: a CPU bottleneck does not evaporate at 4K. The GPU simply has to work much harder to maintain the same framerate.

For instance, let's say you have a 2500K capable of 45fps minimums and a 70fps average in a given game, and an HD7970 that can exceed that at 1080P, making the CPU your bottleneck. Bump it up to 4K and the 7970 becomes the bottleneck, but moving to quadfire 290s still won't get you past 45/70, as the CPU ceiling remains. You will never get a higher framerate than your CPU can deliver, regardless of resolution.

EDIT: In the case of the Battlefield 4 bench I linked above, a 2500K delivered a 55FPS minimum in their benchmark. Quadfire 290s @ 4K would likely also show a 55FPS minimum -> CPU bottleneck.
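
A toy Python sketch of that logic, if it helps. The GPU-side numbers are invented for illustration; only the ~70fps CPU average comes from the example above:

# Delivered fps is capped by the slower of the two parts.
def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_avg = 70  # what the 2500K can feed, roughly resolution-independent
for setup, gpu_avg in [("HD7970 @ 1080p", 90),   # GPU has headroom -> CPU-bound at 70
                       ("HD7970 @ 4K", 30),      # GPU becomes the bottleneck
                       ("4x 290 @ 4K", 110)]:    # GPU headroom returns -> same CPU cap
    print(setup, "->", effective_fps(cpu_avg, gpu_avg), "fps")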
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
All my CPUs are Intel (as they are the best at making CPUs)

I just find it funny that when an AMD chip wins a benchmark, something is wrong with the benchmark.

Just let them have the win; they don't win very many these days.

You keep ignoring the elephant in the room. It is very likely the benchmark or some other part of the procedure IS ACTUALLY WRONG.

What's your theory as to how a processor that is slower in literally every other test is somehow magically faster only when paired with certain cards? Please explain why your theory is more likely than driver error or other procedural error in testing.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
You keep ignoring the elephant in the room. It is very likely the benchmark or some other part of the procedure IS ACTUALLY WRONG.

What's your theory as to how a processor that is slower in literally every other test is somehow magically faster only when paired with certain cards? Please explain why your theory is more likely than driver error or other procedural error in testing.


I tried searching for GTX 780 Ti 4K 3DMark Fire Strike Extreme benches, but can't find much. I agree that it is surprising that the FX is faster; I have to wonder if something is going on with the driver. It is an odd result, but I think it is fair to want to see more tests before outright dismissing the benchmarks.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I tried searching for GTX 780 Ti 4K 3DMark Fire Strike Extreme benches, but can't find much. I agree that it is surprising that the FX is faster; I have to wonder if something is going on with the driver. It is an odd result, but I think it is fair to want to see more tests before outright dismissing the benchmarks.

this

It's folly to flat-out dismiss something even if it is likely wrong.

On the one hand, it could be intentionally misleading due to an agenda, and dismissing it would actually be OK in that situation. But before we know that to be the case, it's also possible they made an error that other X79 users would want to watch out for so they don't make the same mistake... or maybe ASUS needs to update a BIOS (perhaps one of the cards is running with a fraction of the available PCIe bandwidth)... or maybe there's even something flawed in the architecture.
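
For scale on that PCIe point, a quick back-of-envelope Python sketch (assuming PCIe 3.0, which X79 boards provide with Ivy Bridge-E; the lane counts are just illustrative):

per_lane_gb_s = 8e9 * (128 / 130) / 8 / 1e9   # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{lanes * per_lane_gb_s:.1f} GB/s")
# x16: ~15.8 GB/s, x8: ~7.9 GB/s, x4: ~3.9 GB/s usable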
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
No, you cannot. This might be true for 60Hz, but not for 120. Again, even a heavily overclocked 6-8 core i7 is often still a bottleneck for 120+Hz, let alone an AMD chip. I think you're making the mistake of forgetting (or not realizing) that most games aren't benchmarked professionally because they're not conducive to it (e.g. multiplayer-based games), and many of these games will have AMD falling far short when it comes to 120+Hz.

Not sure why you felt the need to break down into full-on AMD apologetics mode. AMD is "good enough" for 60Hz; that's certainly a turd you can keep on polishing with your AMD apologetics.

Almost 10 minutes of BF4 multiplayer with a 120fps cap on an FX-8350 @ 4.5GHz and an HD7950 @ 1GHz:

https://www.youtube.com/watch?v=ADaALZ0RWDA

just for you, njoy ;)
 

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
It would be interesting to know more about what looks like a somewhat anomalous result. What is it about this test that allows Vishera to win so convincingly, and can it be repeated by other testers? If so, the FX may have found a nice niche, but like others I have to question the results, since I can recall no other instance where Vishera was able to defeat a 3930K, let alone a 4930K. Pretty cool if true though.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
What isn't rocket science is that this tangent you initially took issue with was a side discussion about 120Hz, not 4K; hopefully I don't have to clear that up yet again.

Personally, I don't even understand why you would waste the time debating 120Hz. I'm already gaming at 120Hz and was never particularly impressed with the improvement over 60. 120Hz monitors really aren't worth it IMO -- and they make movie playback look like complete crap. For the same money, I'll take 60Hz at 2560 x 1440 over 120Hz @ 1920 x 1080 every time.

....and I'd much rather be playing at 4K -- but don't have enough GPU to get me there yet.
 

master_shake_

Diamond Member
May 22, 2012
6,425
291
121
It's not a matter of letting "them" have the win (I have AMD systems myself), but the 3DMark graphics suite should score about the same regardless of processor. I can find no logical reason why an FX-8 would outperform an Ivy Bridge hex with Hyper-Threading by far more than the margin of error when both should be scoring the same as a Pentium or Kaveri.




I haven't researched 4K specifically, and I agree that we now need more 4K processor benchmarks, given this data that doesn't fit with our current knowledge. As I understand it, scaling resolution should increase GPU load and do nothing for CPU load, so a CPU lead by Intel chips at low resolution shouldn't be reversed at high resolution, unless there's something going on we don't know about.

As for 120FPS on FX chips, it depends on your game. I know few people care about Guild Wars 2 anymore, but neither AMD nor Intel can provide 120FPS in it at present:

[Image: CPU-Cores.png]


FX chips make a nice showing in Battlefield, but ultimately you're still better off with an i7, especially if you plan to overclock:

[Image: Battlefield 4: China Rising CPU benchmark (bf_4_proz.jpg, gamegpu.ru)]

Thought the review said 4K, not 1920x1080 or 1280x1024.
 

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
Personally, I don't even understand why you would waste the time debating 120Hz. I'm already gaming at 120Hz and was never particularly impressed with the improvement over 60. 120Hz monitors really aren't worth it IMO -- and they make movie playback look like complete crap. For the same money, I'll take 60Hz at 2560 x 1440 over 120Hz @ 1920 x 1080 every time.

....and I'd much rather be playing at 4K -- but don't have enough GPU to get me there yet.
You know I am totally with you there, but... back in the days of CRTs, 60Hz flicker used to drive me insane, yet I found that not everyone was bothered by it, or could even detect it at all. Based on this, I have to wonder if there is a subset of gamers who really do need those higher frame rates to avoid eye strain. I can't detect the difference, but with my waning senses I can't say whether that means it really doesn't matter.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Personally, I don't even understand why you would waste the time debating 120Hz.
I don't see trying to educate people as a waste of time. Even if I don't get through to the person I'm directly in contact with, maybe other observers will pick up something useful.

I'm already gaming at 120Hz and was never particularly impressed with the improvement over 60.
If that's true, then I'm surprised, and you would be in the minority of those I've seen with experience of both. In my experience, those who downplay 120Hz generally do so because they don't like the idea of paying so much more for a monitor they might even consider inferior to one they already own (IPS vs. TN); they generally never end up getting any experience with it and thus cannot make an honest judgement.

120Hz monitors really aren't worth it IMO -- and they make movie playback look like complete crap.
Now I'm starting to doubt you actually have any experience with a 120Hz monitor; if you have any, it's with 120Hz frame interpolation (likely from a TV). A 24fps movie should actually look better on a 120Hz monitor (120 is evenly divisible by 24; 60 is not).
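
A small Python sketch of the cadence math, for anyone curious (the cadence helper is just illustrative):

# How many refreshes each 24fps film frame is held for on a given display.
def cadence(refresh_hz, fps=24, frames=6):
    shown, pattern = 0, []
    for i in range(1, frames + 1):
        total = (i * refresh_hz) // fps   # refreshes elapsed after frame i
        pattern.append(total - shown)
        shown = total
    return pattern

print(cadence(60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2-style pulldown (judder)
print(cadence(120))  # [5, 5, 5, 5, 5, 5] -> every frame held equally long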

For the same money, I'll take 60Hz at 2560 x 1440 over 120Hz @ 1920 x 1080 every time.
For the money, I'll take fast over slow; I'm not to the point where my own physical capabilities are that bad.

....and I'd much rather be playing at 4K -- but don't have enough GPU to get me there yet.
I would also rather be playing at 4K, but not if it comes at the cost of speed. Right now 4K is limited to 60Hz because we do not yet have the bandwidth for 4K120 over DP1.2.
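
The rough Python arithmetic behind that claim, ignoring blanking overhead (which only makes it worse):

dp12_gbit = 4 * 5.4 * (8 / 10)               # HBR2: 4 lanes x 5.4 Gbit/s, 8b/10b encoding
needed_gbit = 3840 * 2160 * 120 * 24 / 1e9   # pixels per second x 24-bit color
print(f"4K120 needs ~{needed_gbit:.1f} Gbit/s vs DP1.2's ~{dp12_gbit:.2f} Gbit/s payload")
# ~23.9 vs ~17.28 -> 4K @ 120Hz doesn't fit over DP 1.2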

Although even if we do get a 4K120 monitor within the next 12 months, I'm not even sure I'd want to mess with it unless it could do a perfect downscale to 1080p, because I doubt even 2x GM200s could push enough frames @ 4K; I'd likely have to settle for G-Sync instead of ULMB.


You know I am totally with you there, but... back in the days of CRTs, 60Hz flicker used to drive me insane, yet I found that not everyone was bothered by it, or could even detect it at all. Based on this, I have to wonder if there is a subset of gamers who really do need those higher frame rates to avoid eye strain. I can't detect the difference, but with my waning senses I can't say whether that means it really doesn't matter.

It's not for eyestrain so much as to increase overall motion accuracy and input response.

Although low-motion-blur (LMB) backlight strobing will actually reintroduce flicker to LCDs, so a higher refresh rate is necessary to reduce eyestrain.

It's nowhere near a majority, but it's a large enough niche, and one that cares enough, to have produced communities around it:

http://www.blurbusters.com/
http://120hz.net/
 

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
Had a chuckle at that myself

Thread about AMD being better than Intel at 4K resolutions - someone posts benchmarks for 1280x1024 :rolleyes:
It does raise the question of what exactly is happening here. The expected result is for framerates to get crammed into a very narrow range when the test is massively GPU-limited, as it should be at 4K. For a CPU like the 4930K, which is fairly powerful, to be beaten decisively by the FX in this way is interesting and unusual.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
It does raise the question of what exactly is happening here. The expected result is for framerates to get crammed into a very narrow range when the test is massively GPU-limited, as it should be at 4K. For a CPU like the 4930K, which is fairly powerful, to be beaten decisively by the FX in this way is interesting and unusual.


You're only going to be as fast as your slowest part. If your CPU is good for 100FPS and your GPU is good for 50FPS in a given situation, you're only going to get 50FPS. I imagine that at 4K, even with potent cards in 2x SLI like the GTX780 / GTX980, the GPUs are likely to be the limiting factor in this bench. So why does the CPU change give such a difference in score? I bet both of these CPUs are fast enough not to be the limiting factor in this bench.

The results are a bit unexpected, but it is possible that there is a genuine performance benefit on the FX for some reason. Or it could be an odd driver bug that is artificially holding back performance on the i7. It could be an error in testing, maybe throttling on the i7 rig. Who knows. But I don't think it is fair to dismiss the results outright (*edit - not that you are, just a general comment) because they aren't in line with the expected outcome. I'd like to see this looked into, to see whether the results can be duplicated or this is just an unexplained anomaly. Unfortunately, finding Fire Strike Extreme benches with SLI GTX780s / GTX980s at 4K on an i7 is challenging enough. Good luck finding that setup on an FX! (Though if anyone wants to lend me their 4K monitor and two GTX780s / GTX980s, I'm OK with that. :D )
 
Last edited:

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
I don't want to come off as dismissive. I think anyone would have to admit that the result is unusual. If it bears out, I will be buying an FX when I buy my 4K monitor.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I don't want to come off as dismissive. I think anyone would have to admit that the result is unusual. If it bears out, I will be buying an FX when I buy my 4K monitor.

There's also the problem that the 4930K is effectively an obsolete CPU now that X99 and Haswell-E are available (particularly with the 5820K coming in at a lower price point not too far from the 4790K).

We need more comprehensive testing, as the aberration could very well be the fault of the X79 platform and not the CPU itself, and the newer CPU/platform may fix the problem (although I'd still be inclined to think it's a user-generated anomaly and/or perhaps specific to the motherboard, such as incorrectly configured PCIe slots).
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
This is like the glass half empty/half full thing. The Intel fan sees performance hampered by some software problem; the AMD fan sees surprisingly good performance for such an aged chip.

edit: isn't the 4930K barely a year old?
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
This is like the glass half empty/half full thing. The Intel fan sees performance hampered by some software problem; the AMD fan sees surprisingly good performance for such an aged chip.

edit: isn't the 4930K barely a year old?

Then there's the reasoned skeptic, who sees this one test with results that go against everything we've seen before and thus remains skeptical, especially when this one test is far from comprehensive.
 

Ramses

Platinum Member
Apr 26, 2000
2,871
4
81
My skepticism is directly related to how important a thing is; in this case, slightly for the former and not much for the latter. It's interesting casual forum reading, for sure, and I'd like to see further testing as well. At this point, as a realistic AMD guy, I'm happy to see them on the board at all, let alone doing well, lol.