AMD FX-8350 powering GTX 780 SLI vs GTX 980 SLI at 4K


Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
OK, I was trying not to stir the pot too much, but now I'll just be blunt. Testing a generation-old, more expensive Intel CPU instead of a newer, cheaper, and faster one just seems like stacking the deck in favor of AMD, especially when price is considered, which it inevitably will be. I'm not saying a 4790K would have been faster, but it would have been a more logical comparison.

IMO this doesn't matter so much.

I'm shocked that the AMD system is showing 20% better framerates in 100% GPU-bound scenarios.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
OK, I was trying not to stir the pot too much, but now I'll just be blunt. Testing a generation-old, more expensive Intel CPU instead of a newer, cheaper, and faster one just seems like stacking the deck in favor of AMD, especially when price is considered, which it inevitably will be. I'm not saying a 4790K would have been faster, but it would have been a more logical comparison.

? The 4930K is Ivy Bridge-E, which is at worst the same generation as the Piledriver core in the 8350 (albeit a thoroughly superior CPU; even the 9590 fails to close the gap enough to pose any real competition)

that being said, I'm pretty skeptical of these results:

without CPU-only tests, we can't be sure the 4930K system is running optimally

without single-GPU tests, it's hard to tell if there might be an SLI problem

then there's the fact they only ran these things stock (seriously, how many people are going to run a 4930K or 8350 stock? The 4790K or the 9370/9590 I can understand leaving "stock")

I'm also disappointed they didn't run any 290Xs (it's entirely possible that nVidia's drivers are working some sort of voodoo)

4K is clearly a situation where the performance bottleneck shifts almost entirely onto the GPU(s), so it isn't surprising to see results where the otherwise superior 4930K appears to offer no advantage over a weaker CPU like the 8350. That is reflected quite often in the AT Bench comparisons, particularly the single-770 results of the 9590 vs. 4930K, where many of the tests end up effectively tied. But these results seem to suggest that the 8350 is actually faster than the 4930K by more than margin of error, when at worst they should be tied... unless something is wrong (and it could very well be that something is wrong with the s2011/X79 platform itself)
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
There is definitely something wrong, that's not even a question. How on earth is an FX-8350 going to give 1,200 more graphics points running 3DMark Fire Strike Extreme @ 4K?
There's a similar but smaller problem with the 980s; the difference is way too large to be normal, even for SLI, and it also doesn't make sense for the gap to be smaller with the 980s and bigger with the 780s.
 
Last edited:

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
There is definitely something wrong, that's not even a question. How on earth is an FX-8350 going to give 1,200 more graphics points running 3DMark Fire Strike Extreme @ 4K?
There's a similar but smaller problem with the 980s; the difference is way too large to be normal, even for SLI, and it also doesn't make sense for the gap to be smaller with the 980s and bigger with the 780s.

The only conclusion I can draw from this is that the drivers are possibly more mature (having been developed for longer) for the GK110-based 780 cards compared to the brand-new GM204 980s.

I'm guessing that as future driver releases come out, the gap will reverse itself: greater with the 980 and smaller with the 780.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
OK, I was trying not to stir the pot too much, but now I'll just be blunt. Testing a generation-old, more expensive Intel CPU instead of a newer, cheaper, and faster one just seems like stacking the deck in favor of AMD, especially when price is considered, which it inevitably will be. I'm not saying a 4790K would have been faster, but it would have been a more logical comparison.


Sorry, I'm not as familiar with Intel stuff; I sometimes forget their use of similar numbering on CPUs that have different generation cores. My mistake. Yeah, it would have been nice to see a Haswell tested as well, fair point. But Ivy is no slouch, and AMD did well here. Even if it wasn't the fastest Intel CPU, I think the takeaway is that even if it were Sandy, it's kind of surprising to see AMD pull ahead, even if it isn't a huge advantage.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Sorry, I'm not as familiar with Intel stuff; I sometimes forget their use of the same numbering on CPUs that have different generation cores. My mistake. Yeah, it would have been nice to see a Haswell tested as well, fair point. But Ivy is no slouch, and AMD did well here. Even if it wasn't the fastest Intel CPU, I think the takeaway is that even if it were Sandy, it's kind of surprising to see AMD pull ahead, even if it isn't a huge advantage.

In some of the tests where it's pulling way ahead, the CPU shouldn't matter at all. The 3DMark graphics score, for instance, should have been the same even with a Pentium, yet the FX is 20% faster.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
AMD wins benchmark test = test must be wrong :thumbsup:

Yes. Because basic, rudimentary logic dictates that when two things are compared in 100 different ways, one is superior in 99 of them, and the 100th test is highly similar to the previous 99, there is a high likelihood that the 100th test is faulty.

I don't cheer for the 3 lb weight to be heavier than the 5 lb weight the 100th time I weigh it when it was lighter the previous 99 times
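The reasoning above can be sketched as a toy Bayes calculation; every probability below is invented purely for illustration, not taken from any actual benchmark data:

```python
# Toy Bayes sketch (all numbers hypothetical): if a test setup is rarely
# faulty but a faulty setup readily produces upsets, then a single upset
# after 99 consistent results still points at a fault.
p_fault = 0.05          # prior: chance this particular run is misconfigured
p_upset_if_fault = 0.5  # a faulty run produces arbitrary winners
p_upset_if_ok = 0.01    # a sound run almost never flips a 99/100 pattern

# Bayes' rule: P(fault | upset)
posterior = (p_upset_if_fault * p_fault) / (
    p_upset_if_fault * p_fault + p_upset_if_ok * (1 - p_fault)
)
print(round(posterior, 3))  # ~0.725: the "fault" explanation dominates
```

Under these made-up numbers, one surprising result shifts the odds toward "something is wrong with the test" rather than "the slower part got faster", which is the weighing-scale intuition in a formula.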
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
AMD wins benchmark test = test must be wrong :thumbsup:

I think you are conflating some sort of AMD hatred that you feel so many have with the simple belief that the results do NOT cohere with other data. It has nothing to do with AMD or Intel; it's simply that there are differences where there should be none.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,907
3,521
136
if NV's drivers are really well threaded and use lots of integer work, that would explain a lot. Bulldozer-derived parts are very good at integer, but they are still only two ALUs wide, so you need good threading to get the throughput high.
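The point about threading and narrow integer cores can be sketched with a toy example; the `integer_work` kernel and the worker count below are invented stand-ins, not anything from an actual driver:

```python
# Hypothetical sketch: a tight integer loop standing in for driver-style
# integer work. One process alone is limited by how wide one core is;
# spreading identical work across processes raises aggregate throughput,
# which is the "good threading" the post describes.
import multiprocessing as mp

def integer_work(n: int) -> int:
    """Cheap ALU-only operations: xor, shift, mask, add."""
    acc = 0
    for i in range(n):
        acc += (i ^ (i >> 3)) & 0xFFFF
    return acc

if __name__ == "__main__":
    N = 200_000
    workers = 4  # pretend: one process per module
    with mp.Pool(workers) as pool:
        results = pool.map(integer_work, [N] * workers)
    # Every worker does identical work, so all partial results match.
    assert len(set(results)) == 1
```

The design point is that nothing here needs high per-thread IPC; it only needs enough independent threads, which is exactly the shape of workload where eight narrow cores can keep pace.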
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
There should be none according to what?
The same old urban legends, relentlessly rehashed?

According to that, the 3DMark FS Extreme graphics score should not be different?

There should be little to no difference; less than 2% between a Pentium and a 5960X.
 

DrMrLordX

Lifer
Apr 27, 2000
21,991
11,542
136
That comparison would have been more useful if they had included some scaling tests with different card configs (single and SLI) to show how clockspeed affected performance for each CPU.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
Wow, nice showing for the FX-8350. I've said it countless times and I'll say it again just to reinforce it: you won't be able to tell if a gaming rig is being powered by an Intel or an AMD FX... OK, maybe if you really need to have 120 fps, but again, that's just in very few titles.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Wow, nice showing for the FX-8350. I've said it countless times and I'll say it again just to reinforce it: you won't be able to tell if a gaming rig is being powered by an Intel or an AMD FX... OK, maybe if you really need to have 120 fps
ding ding ding! AMD is not a viable option for trying to push 120+fps (even a heavily overclocked i7 often leaves one wanting in some games, especially if they aren't well optimized)

when games are GPU limited, you can typically turn down settings and/or add more GPU

when games are CPU limited, there usually isn't much you can do other than get a faster CPU, as most settings adjustments have little to no effect on CPU load, and there are still way too many games that require high IPC to get comfortably above 60fps (even if they have a respectable amount of multi-thread optimization)

but again, that's just in very few titles.
PC games that have a hard frame cap @ 60fps (or less) are not the majority, unless you're placing too much focus on indie-style games that don't require more than APU-class power, let alone a heavily overclocked i7 + 780Ti/980 SLI

maybe you mean it's not a very large niche and thus not very relevant, because the majority of PC gamers still run 60Hz, even amongst PC enthusiasts (granted, the same could be said of 4K)
 

Jovec

Senior member
Feb 24, 2008
579
2
81
Rather than max frames, I find Intel gives more consistent minimums with less hitching. FX may be different, since I stopped gaming on AMD after my (3.8GHz) 1090T and moved to a (stock) 2600K, but it was noticeable back then. Almost like an SSD today, where you forget it's there after a while unless you switch to a computer without one.
 
Last edited:

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Rather than max frames, I find Intel gives more consistent minimums with less hitching. FX may be different, since I stopped gaming on AMD after my (3.8GHz) 1090T and moved to a (stock) 2600K, but it was noticeable back then. Almost like an SSD today, where you forget it's there after a while unless you switch to a computer without one.

I do believe the FX is indeed different. My old Thuban was annoyingly stuttery for me -- but my FX-8320 seems a hell of a lot more fluid. I also concur with what many others are saying on this forum: an FX 6- or 8-core at around 3.8GHz+ is nearly impossible to discern from an Intel CPU with the naked eye. Sure, the benchmarks might show an extra 5-10 fps, but it really isn't enough to change the overall experience.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Tom's, AnandTech, and PCPer have all done CPU tests in the last year and a half, and gamegpu.ru runs CPU tests on every game. None of them ever has the 8350 winning by 10%.

Here are the 2600K and the 8350 compared over 101 games on gamegpu.ru; you can see the 8350 wins once, by about 10%. All the other times it's behind, running at less than half the speed in some cases.

[Chart: 2600k v 8350 frame times across 101 games]


A more limited comparison of the 4770K against the 2600K, to give an idea of the performance improvement it brings at realistic gaming resolutions.

[Chart: 2600k v 4770k]


So, based on what I know, I believe the data in this review is inconsistent with the majority of data out there.
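The kind of cross-check being described can be sketched as follows; the per-game fps numbers below are invented, shaped only to match the pattern described above (one win of roughly 10%, a worst case around half speed):

```python
# Hypothetical sketch: given per-game average fps for two CPUs (numbers
# invented for illustration), compute the 8350/2600K ratio per game and
# see how often the 8350 actually wins, and by how much it can trail.
fps_2600k = {"GameA": 95, "GameB": 120, "GameC": 60, "GameD": 144}
fps_8350  = {"GameA": 70, "GameB": 131, "GameC": 33, "GameD": 120}

ratios = {g: fps_8350[g] / fps_2600k[g] for g in fps_2600k}
wins = sum(r > 1.0 for r in ratios.values())  # games where the 8350 leads
worst = min(ratios.values())                  # biggest deficit

print(wins)             # 1  -- the 8350 wins a single title
print(round(worst, 2))  # 0.55 -- and can fall to roughly half the speed
```

Run over a real 101-game dataset, a table like this is exactly what makes a lone 10-20% 8350 win at 4K stand out as an outlier worth double-checking.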
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
I'd like to see 4k data though. I'm not standing behind these results, but I think it is unfair to rule them out just because the outcome is unexpected. More tests like this will be done in time, we'll see what happens. Either way, I would not be afraid to use my CPU with two R9 290/290x's, two GTX780Ti's, or two GTX970/980's and 4k. :)
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
I'd like to see 4k data though. I'm not standing behind these results, but I think it is unfair to rule them out just because the outcome is unexpected. More tests like this will be done in time, we'll see what happens. Either way, I would not be afraid to use my CPU with two R9 290/290x's, two GTX780Ti's, or two GTX970/980's and 4k. :)

It is theoretically possible that games not optimized for Hyper-Threading would be faster on an FX. My FX-8320 at stock clocks performs nearly identically to my i7-3770K under Linux, although it trails the i7 under Windows 7.

I've noticed that other articles have picked AMD FX processors OVER Intel i7s for 4K gaming. According to Forbes, "Where gaming is concerned, the FX-8350 edges out Intel's Core i5 and i7 processors." So there may be some merit to the idea that the FX's multi-thread brute-force approach makes it better suited than Hyper-Threading at extremely high resolutions.

http://www.forbes.com/sites/jasonev...00-ultra-hd-monitor-ssd-and-windows-included/
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
ding ding ding! AMD is not a viable option for trying to push 120+fps (a heavily overclocked i7 often leaves one wanting in some games, especially if they aren't optimized enough)

I can get 110+ fps in BF4 MP at 1080p Low settings with an FX-8150 @ 4.4GHz and an HD 6950 2GB, and the GPU is still the limiting factor.

You can have 120+ fps in the majority of today's games with an AMD CPU if your GPU has the performance to do it.

Also, there is not a single new game that is unplayable with any AMD FX CPU or quad-core APU, so having 300+ fps at lower resolutions/quality settings doesn't mean a thing. In most systems except high-end, the GPU is the limiting factor in the majority of AAA games, not the CPU. There are a few games and MMOs that need more IPC and will definitely run way faster on Intel CPUs, but those are getting fewer all the time.
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
It is theoretically possible that games not optimized for Hyper-Threading would be faster on an FX. My FX-8320 at stock clocks performs nearly identically to my i7-3770K under Linux, although it trails the i7 under Windows 7.

I've noticed that other articles have picked AMD FX processors OVER Intel i7s for 4K gaming. According to Forbes, "Where gaming is concerned, the FX-8350 edges out Intel's Core i5 and i7 processors." So there may be some merit to the idea that the FX's multi-thread brute-force approach makes it better suited than Hyper-Threading at extremely high resolutions.

http://www.forbes.com/sites/jasonev...00-ultra-hd-monitor-ssd-and-windows-included/

1. HT can be disabled, and from what I've seen it rarely causes a negative impact on gaming performance (on the contrary, it's what keeps the i3 relevant).

2. this isn't Piledriver winning against an i5 or even a quad-core i7; the 4930K is six Intel cores, so HT should be irrelevant given Intel's IPC advantage against AMD's 8 "cores"

3. the Forbes article is hearsay: it offers no evidence or sources for any of its claims about CPU performance, and citing it is an argument from authority, i.e. just because it's from Forbes doesn't automatically mean it's true. Then there's the fact that the article is almost a year old, and its original focus was building a 4K rig around a set budget ($2,500, with the vast majority going towards the monitor and GPU, as it should), not "build the best 4K rig possible"


I can get 110+ fps in BF4 MP at 1080p Low settings with an FX-8150 @ 4.4GHz and an HD 6950 2GB, and the GPU is still the limiting factor.
Let me know when you can "get" 120fps in the average MMO or RTS, let alone minimums in the 120+ range necessary for LMB. PS2 is still too much for my 4.7GHz 3930K, even with all their massive efforts to get it multi-thread optimized for the PS4.

You can have 120+ fps in the majority of today's games with an AMD CPU if your GPU has the performance to do it.
no, you cannot. This might be true for 60Hz, but not for 120. Again, even a heavily overclocked 6-8 core i7 is often still a bottleneck at 120+Hz, let alone an AMD chip. I think you're forgetting (or not realizing) that most games aren't benchmarked professionally because they're not conducive to it (e.g. multiplayer-based games), and many of those games have AMD falling far short when it comes to 120+Hz

Also, there is not a single new game that is unplayable with any AMD FX CPU or quad-core APU, so having 300+ fps at lower resolutions/quality settings doesn't mean a thing. In most systems except high-end, the GPU is the limiting factor in the majority of AAA games, not the CPU. There are a few games and MMOs that need more IPC and will definitely run way faster on Intel CPUs, but those are getting fewer all the time.

Never said anything about games being unplayable (which is subjective), only that AMD really isn't an option for trying to push 120Hz, and that Intel is often not a great option either... although when that's the case, it's typically 33-100% better than the AMD option.

Not sure why you felt the need to go into full-on AMD apologetics mode. AMD is "good enough" for 60Hz; that's certainly a turd you can keep on polishing.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
Release of the i7-4930K: 10 September 2013.
Release of the FX-8350: 23 October 2012.

That's for the ones claiming it's unfair because Intel's chip was older.
 

davie jambo

Senior member
Feb 13, 2014
380
1
0
I think you are conflating some sort of AMD hatred that you feel so many have with the simple belief that the results do NOT cohere with other data. It has nothing to do with AMD or Intel; it's simply that there are differences where there should be none.

All my CPUs are Intel (as they are the best at making CPUs)

I just find it funny that when an AMD chip wins a benchmark, something must be wrong with the benchmark

Just let them have the win; they don't win very many these days