[Techno-Kitchen] i7 4790K@4.7GHz vs. i7 5930K@4.7GHz + GTX980Ti@1300MHz SLI in 4K!

Page 2

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
In any event, I said before that for someone who has access to MicroCenter, i7 5820K was a no brainer over the i7 4790K.

RS--

I recently did a build for a buddy at MicroCenter on the 5820k. Got a Fractal Define S after you had suggested it as a good watercooling case. That thing is fantastic. Easily one of the best cases I've ever worked with. We put a H110i GTX with 4 140mms in push-pull on the front of the Define S. We're still working on the OC but he's currently at 4.4 @ 1.27v and the hottest core doesn't go above 53-54 in OCCT. And it's dead silent. Thanks for that bit of advice broski

Could not agree more on the 5820k upgrade over the 4790k.
 

B-Riz

Golden Member
Feb 15, 2011
1,594
756
136
They should have tested a 5820k to see if the 28 vs 40 pcie lanes made a difference.

I am under the impression that the extra PCI-E lanes are the big plus.

My passing on X99 is due to not wanting to re-buy RAM.
 

smartypnt4

Junior Member
Mar 10, 2015
1
0
66
I'm very interested, as has been said before in this thread, in why the VRAM utilization is so much higher in Crysis 3...

Is the same amount of system RAM installed in both systems? Other than that, the differences are the PCIe configuration, system RAM speed, and number of cores. Those have collectively been shown in the past to have little, if any, bearing on framerate.

So, RussianSensation, I would agree with your assessment that this is a very surprising result. I'm wondering if this holds true in CrossFire for Fury X's as well, and what of the above is causing the discrepancy.

However, if you get a 3-5% increase from each of the three factors above in a sort of perfect-storm case (setting aside the possible increased RAM capacity), you would end up with a 9.3-15.8% combined increase in performance, so maybe TW3's result is reasonable based on those combined factors... A 7-10% average increase from each (which is largely unheard of) would produce the sort of difference seen in Crysis 3.
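The compounding arithmetic in the estimate above can be sketched quickly (a rough check only, assuming the three factors contribute independent multiplicative gains, which is itself an assumption):

```python
# Rough compounding check: if each factor (PCIe config, RAM speed, core
# count) independently adds a small multiplicative gain, the combined
# uplift is the product of the individual multipliers, not their sum.

def combined_gain(per_factor_gains):
    """Return the total percentage uplift from independent multiplicative gains."""
    total = 1.0
    for g in per_factor_gains:
        total *= 1.0 + g
    return (total - 1.0) * 100.0

# Three factors at 3% each vs. three factors at 5% each:
low = combined_gain([0.03, 0.03, 0.03])   # ~9.3%
high = combined_gain([0.05, 0.05, 0.05])  # ~15.8%
print(f"{low:.1f}% to {high:.1f}%")
```

So three stacked 3-5% gains land in roughly the 9-16% range, which is where the TW3 delta falls; the Crysis 3 delta would need each factor to contribute far more than has typically been measured.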

I mention total system RAM in each because I seem to recall a Tom's Hardware article exploring CrossFire Scaling with the 5800 series and total system RAM. The conclusion I seem to recall is that 16GB fared noticeably better than 8GB (or 6GB vs 12GB or something), but I can't for the life of me find the article...



Sidenote: can we get average/min/max framerate info for each of the 3 games tested? It'd be easier to see than a video.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
What is wrong with just posting a chart? Is it really necessary to consume 200MB of bandwidth and force people to fish around a stupid video to get data?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
What is wrong with just posting a chart? Is it really necessary to consume 200MB of bandwidth and force people to fish around a stupid video to get data?
YouTube generation won't look at charts... Dead serious.
I can post tons of charts proving a point and a noob will post a YouTube video of a side by side comparison and guesstimate the fps... No joke.

Lots of gamers who aren't into gpus like we are prefer a YouTube video even if it's longer and less informative.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
Very interesting video. However, I doubt the difference lies with the extra cores, as a few others have mentioned. The additional PCIe lanes, which even the playing field between the GPUs, look to be the more likely contributing factor.

I do wish they had published the CPU utilization figures by core. If the 4790K isn't maxed, the extra cores aren't doing anything. The additional bandwidth, however, with that many textures flying around (5.5GB of VRAM in GTA V, really Rockstar?), will make quite a bit of difference, I'm sure.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Very interesting video. However, I doubt the difference lies with the extra cores, as a few others have mentioned. The additional PCIe lanes, which even the playing field between the GPUs, look to be the more likely contributing factor.

I do wish they had published the CPU utilization figures by core. If the 4790K isn't maxed, the extra cores aren't doing anything. The additional bandwidth, however, with that many textures flying around (5.5GB of VRAM in GTA V, really Rockstar?), will make quite a bit of difference, I'm sure.
So you want a real analysis and not just a quick YouTube video... which is why I hate YouTube: it just dumbs down information for things like this.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
YouTube generation won't look at charts... Dead serious.
I can post tons of charts proving a point and a noob will post a YouTube video of a side by side comparison and guesstimate the fps... No joke.

Lots of gamers who aren't into gpus like we are prefer a YouTube video even if it's longer and less informative.

And that's Eurogamer's Face-Off claim to success. Barely anyone cares about the logistics, they just want to see pretty pictures in motion.

I personally can't stand these new YouTubers, more so because I have to listen to them talk. (Haha.) I read faster than these guys can yammer on about their opinions, and for heaven's sake, the ones that go "ummm... umm... ummm" just shoot me.

You turned something I could read and absorb the data from in under a few minutes into a 5+ minute video. EDIT: And worse, I have to watch the goddamn video in fear, because when you do flash any sense of actual data, it's merely for a fraction of a second. Then I find myself pausing, rewinding, trying to find that exact moment, only to get exasperated and just close the video.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,348
642
121
And that's Eurogamer's Face-Off claim to success. Barely anyone cares about the logistics, they just want to see pretty pictures in motion.

I personally can't stand these new YouTubers, more so because I have to listen to them talk. (Haha.) I read faster than these guys can yammer on about their opinions, and for heaven's sake, the ones that go "ummm... umm... ummm" just shoot me.

You turned something I could read and absorb the data from in under a few minutes into a 5+ minute video. EDIT: And worse, I have to watch the goddamn video in fear, because when you do flash any sense of actual data, it's merely for a fraction of a second. Then I find myself pausing, rewinding, trying to find that exact moment, only to get exasperated and just close the video.

Always. I want lots of graphs, charts, and text. Old school Anand style from back when I first browsed the site in 2003.
Exactly why I hate youtube. I honestly wish it wasn't invented a lot of times.

Which is why I don't view these vids and wait for screencaptures or a summary. I'm not promoting these guys by adding to their view count. I want this medium to die not to flourish.

Sadly though I'm thinking of starting my own lol.
Might as well get paid for my opinion.
I bet you there is someone out there reposting Russian sensation posts and making money. Lol, I could write a blog on his posts alone probably and have enough content lol.
 

alcoholbob

Diamond Member
May 24, 2005
6,295
342
126
Most of the difference between PCIe lane configurations (16x vs. 8x) shows itself at lower resolutions; at 4K the difference is nonexistent. A lot of games are still CPU-bound, though. Particularly in open-world games, there's a noticeable gap in performance between quad- and six-core when there are multiple units on the screen.

However, on the flip side, a lot of DX12 benchmarks are showing no scaling past 4 cores. So it may be that DX12 has solved the draw-call problem that would otherwise necessitate hex- or octa-cores.
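To put the 16x vs. 8x debate in numbers, here is a back-of-the-envelope PCIe 3.0 bandwidth calculation (assuming 8 GT/s per lane and 128b/130b line coding, and ignoring packet/protocol overhead beyond line coding):

```python
# Approximate usable PCIe 3.0 bandwidth per link width.
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding,
# so ~98.5% of the raw bit rate carries data.

def pcie3_bandwidth_gbps(lanes):
    """Approximate usable PCIe 3.0 bandwidth in GB/s for a given lane count."""
    transfer_rate = 8e9        # 8 GT/s per lane (1 bit per transfer)
    encoding = 128 / 130       # 128b/130b line-coding efficiency
    bits_per_byte = 8
    return lanes * transfer_rate * encoding / bits_per_byte / 1e9

print(f"x8:  {pcie3_bandwidth_gbps(8):.2f} GB/s")   # ~7.88 GB/s
print(f"x16: {pcie3_bandwidth_gbps(16):.2f} GB/s")  # ~15.75 GB/s
```

Whether the extra ~7.9 GB/s ever matters depends on how much the game shuffles data over the bus per frame, which is consistent with the observation that the gap shows up at some resolutions and not others.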
 

rchunter

Senior member
Feb 26, 2015
933
72
91
Looks like I'll be sticking with my hexcore x58 system for a few more years yet.

I got one too, an X5660 @ 4.2GHz. I love it. Rarely gets over 60C at full load.
I'm going to try to hold out for Skylake-E. I'm hoping it will bring 8- and 10-core processors with it. I'm saving up for the 10-core chip. :biggrin:
I definitely see good activity on all 6 cores playing GTA V. Hopefully more games will follow suit.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I got one too, an X5660 @ 4.2GHz. I love it. Rarely gets over 60C at full load.
I'm going to try to hold out for Skylake-E. I'm hoping it will bring 8- and 10-core processors with it. I'm saving up for the 10-core chip. :biggrin:
I definitely see good activity on all 6 cores playing GTA V. Hopefully more games will follow suit.

yep :biggrin:
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
https://www.youtube.com/watch?v=ZABt8bHgDHo

Crysis 3
The Witcher 3
GTA V

4K All MAX

4790K@4.7
5930K@4.7
@ 1300 980Ti SLI
Windows 7
Drivers and patches current as of July 15, 2015

WOW, for all those people who have dismissed X99 platform as a waste of $ and labelled it purely a workstation platform......this is pure domination for the 5930K! Unless something is seriously wrong with their i7 4790K platform, 980Ti SLI is bottlenecked like crazy by the quad-core i7. I can't believe how large the differences can get!

:D

Skylake-S might actually be a downgrade vs. a 4.5Ghz+ X99 i7 platform.

Can't wait for Skylake-E. Hopefully more games start using 6 cores. :thumbsup:

At no point in that video did it dominate anything. Only the first game showed a 10-15fps advantage, one game out of three. lol The other two were around 5fps more... You really do boggle the mind when you post.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
What is wrong with just posting a chart? Is it really necessary to consume 200MB of bandwidth and force people to fish around a stupid video to get data?

Since when is 200MB a lot of data? Yeah, YouTube videos for GPU tests are terrible... but it's still only 200MB of data.
 

Ranulf

Platinum Member
Jul 18, 2001
2,509
1,571
136
WOW, for all those people who have dismissed X99 platform as a waste of $ and labelled it purely a workstation platform......this is pure domination for the 5930K! Unless something is seriously wrong with their i7 4790K platform, 980Ti SLI is bottlenecked like crazy by the quad-core i7. I can't believe how large the differences can get!

Meh, I'd still prefer to see the numbers from an actual 5820k at 4.7ghz over the 5930k.

One as you mentioned here and in the other cpu thread, it all depends on if you have a microcenter nearby.

Two... its SLI. RS, haven't you yourself said new AAA games are broken too often on release? Let alone SLI/X-fire not working right out the gate on new games. Might as well wait 6-12 months to buy a game and upgrade.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Meh, I'd still prefer to see the numbers from an actual 5820k at 4.7ghz over the 5930k.

One as you mentioned here and in the other cpu thread, it all depends on if you have a microcenter nearby.

Two... its SLI. RS, haven't you yourself said new AAA games are broken too often on release? Let alone SLI/X-fire not working right out the gate on new games. Might as well wait 6-12 months to buy a game and upgrade.
Ya rs isn't a fan of multi gpu. I love multigpu because I play years after release date. I wish I had 2 gpus for the Witcher 2 right now but the noise would be annoying at this point.

I haven't felt like I've missed something playing games years after release. I think I enjoy them in their full glory, since I crank every setting, play with full DLCs and mods, and get the full experience of the game rather than simply getting to play it first.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
At no point in that video did it dominate anything. Only the first game showed a 10-15fps advantage, one game out of three. lol The other two were around 5fps more... You really do boggle the mind when you post.
You didn't watch the same video as the rest of us then. Also that 15FPS was 33%, which is insane considering they were at the same clock speed.

Anywho, I've been saying this for a year - it's short-sighted to go with a quad core for a gaming rig. If budget is not an issue you should push for the hex core.

Great post RS :thumbsup:
 

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
Hex core goodness is the way to go.

I'm DooKey and I endorse this post.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Lol @ all the hexcore buyers finally getting some justification for spending stupid amounts of money for 10-15% more performance. Oh wait, how many of you have dual 980 Tis and 40-lane hex cores to make this relevant?
 

rchunter

Senior member
Feb 26, 2015
933
72
91
Lol @ all the hexcore buyers finally getting some justification for spending stupid amounts of money for 10-15% more performance. Oh wait, how many of you have dual 980 Tis and 40-lane hex cores to make this relevant?

I paid $90 for mine. And by the time I do upgrade more and more games should be supporting 6 or more cores. I use mine for way more than just silly games anyway. Games are just a bonus.
 

alcoholbob

Diamond Member
May 24, 2005
6,295
342
126
I paid $90 for mine. And by the time I do upgrade more and more games should be supporting 6 or more cores. I use mine for way more than just silly games anyway. Games are just a bonus.

$90?!? Are you on the Intel Retail Edge program or something?
 

alcoholbob

Diamond Member
May 24, 2005
6,295
342
126
Lol @ all the hexcore buyers finally getting some justification for spending stupid amounts of money for 10-15% more performance. Oh wait, how many of you have dual 980 Tis and 40-lane hex cores to make this relevant?

You do realize that regardless of whether you have a 5820K, 5930K, or 5960X, the third X99 PCIe slot (which is the same distance from the first X99 slot as the first and second Z97 slots are from each other) is always 8x, right?

Most flagship cards will not even fit into the 2nd X99 slot in 2-way SLI setups, so 16x is a moot point.

The difference between quad and hex is entirely due to the number of cores. PCIe and DDR4 don't even have anything to do with it (in fact, the 4960X outbenches the 5820K/5930K in several memory tests).
 
Last edited:

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
You didn't watch the same video as the rest of us then. Also that 15FPS was 33%, which is insane considering they were at the same clock speed.

Anywho, I've been saying this for a year - it's short-sighted to go with a quad core for a gaming rig. If budget is not an issue you should push for the hex core.

Great post RS :thumbsup:

Again, it was 3 games, and the one it "dominated" in was an older game. And as for "the rest of us," that does not make a majority.