Thinking about Crossfire - should I go for i7?


2is

Diamond Member
Apr 8, 2012
I contacted Ryan Smith and asked specifically about the 290s and here's what he had to say:

The dual GTX 770 LE cards (or 7970s for AMD) used in their Haswell Refresh review are not quite high-end cards, but they are close.

And it seems like I finally got a number to expect for high-end GPUs: 10-15%. So not hugely significant, but certainly notable in an era of small incremental CPU IPC increases. I have a 144 Hz 1080p monitor right now, but I mentioned that I'm thinking about going either 4K or 120 Hz 1440p when those come out in greater numbers.

As expected from a senior GPU editor, Mr. Smith gave a very informed answer. I'm still not totally sold on whether it's worth it, but considering how much I prize my high frame rates (meaning I'd be more likely to jump to 1440p at 120 Hz than to 4K in the immediate future), the i7 just got a boost in my eyes. At the same time, low IPC increases notwithstanding, 10-15% is not huge; I thought I'd be bottlenecked by 20-30% or so.

Well, the thread is more or less over since I did get my answer, albeit via email :D

It could serve as a reference for people with similar concerns in the future, since it seemed to be pretty hard to get a straight, data-driven answer from people (i.e., nobody really seemed to know for sure what the performance delta was).

Not really saying much above and beyond what's been said here. IF your GPUs are being bottlenecked AND the game can use HT, you'll see 10-15%.

That's a low number, and it comes with stipulations most games will not meet, which is what was said here. You made it sound like you were looking for charts that plotted the actual performance, which you didn't get from the thread or the email.
 

BrightCandle

Diamond Member
Mar 15, 2007
We all know the rough benefits of hyperthreading when it works well. But how many games actually benefit?

I went through all of gamegpu.ru's games from Guild Wars 2 up to Company of Heroes 2 a while ago in an attempt to answer some of these sorts of questions. There are 52 games in total in my data set. Most of them were run on a GTX 690, but some of the more recent ones were on a Titan. They use realistic high/ultra settings, so it's fairly representative of what someone with that hardware would actually be running.

They don't/didn't have Haswell hardware, but one of the comparisons I did was a 2600k vs. a 2500k, and I also captured the 3930k versus the 2600k. So I have several useful comparisons that relate to CPU choice: how much and how often does hyperthreading help, and how often do the additional cores help (I also have some 8350 data). The basic summaries are as follows:

3930k v 2600k
Average improvement 1%
Best improvement 67%
25% quartile -1%
75% quartile 2%
Minimum -12%

So overall the 3930k is on average the same speed as a 2600k; it shows a benefit in some games, but in 75% of games it shows none. The games where it shows benefits beyond 10% are the following: Project CARS DX11, Medal of Honor Warfighter, Hitman Absolution, Crysis 3, and Metro Last Light (67%). Games that lost out by more than 10% were MechWarrior Online and Carrier Command Gaea.

2600k v 2500k
Average 5%
Maximum 22%
25% quartile 4%
75% quartile 7%
Minimum -5%

On average not much gain, certainly below 10% for 75% of the games, and the upside is also lower compared to having real cores. The games that gained by more than 10% were Medal of Honor Warfighter (22%), Battlefield 3 End Game, and Far Cry 3 Blood Dragon. Just bear in mind that there is a 100 MHz clock speed difference between these two CPUs, as well as some cache, so it's not a pure hyperthreading comparison; it can only show the real CPUs against each other. I did try to correct for the clock speed in my spreadsheet, but that makes the equally bad assumption that the benefit from the extra clock speed is linear. Regardless, these are the real benefits of going from a real i5 to a real i7, with all the warts of not really comparing HT in isolation included.

What I think is most interesting about this is that the list of games that benefit from HT is different from the list that benefits from more cores on top of HT. The 3930k vs. 2500k comparison shows a bit more improvement from six cores than the 2600k comparison does, but it isn't really interesting.

By and large, what I determined from all that data is that the bigger games tend to show more gains from hyperthreading and from more cores than the smaller ones do, but it's far from universally true. For hyperthreading compared to an i5, just 6% of games showed more than 10%, and just one showed 22%. For six cores, 9% of games showed improvement, and the best improvement was 67%. There isn't even an obvious trend toward the more recent games, but I haven't updated the data in months.
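
For anyone who wants to sanity-check the kind of summary I posted above (average, quartiles, min), the per-game percent deltas and cut points are easy to reproduce with Python's statistics module. The FPS numbers below are made up for illustration only, not from the gamegpu.ru data set; the real per-game numbers are in my spreadsheet.

```python
# Sketch of the summary statistics used above. The FPS values are
# hypothetical; they just show the shape of the calculation.
from statistics import mean, quantiles

# Hypothetical per-game average FPS for two CPUs (same games, same order).
fps_2500k = [88, 120, 64, 45, 97, 110, 72, 55]
fps_2600k = [92, 121, 71, 55, 96, 112, 74, 58]

# Percent improvement of the second CPU over the first, per game.
deltas = [100 * (b - a) / a for a, b in zip(fps_2500k, fps_2600k)]

# quantiles(n=4) returns the 25th, 50th, and 75th percentile cut points.
q1, median, q3 = quantiles(deltas, n=4)

print(f"Average: {mean(deltas):.1f}%")
print(f"25% quartile: {q1:.1f}%  75% quartile: {q3:.1f}%")
print(f"Min: {min(deltas):.1f}%  Max: {max(deltas):.1f}%")
```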

I hope that answers the question as well as it can be answered from the data we have, which admittedly is SLI rather than Crossfire, but it's a similar performance level and should give indicative results. Here's the link to the spreadsheet if you want to go through the data yourself: https://dl.dropboxusercontent.com/u/3638175/GameGPU CPU performance.ods
 

Elfear

Diamond Member
May 30, 2004
2. I'd just remind you that I'd be using Crossfire, which is why I was a bit surprised that you left your GPU setup out of your comment. Did you use a single GPU or several?

Sorry, I should have been more explicit in my reply. I was using the pair of overclocked 290s in my sig.

I went through all of gamegpu.ru's games from Guild wars 2 up to Company of Heroes 2 a while ago in an attempt to answer some of these sorts of questions.

Snip...

Interesting results. Thanks for taking the time to run the numbers. :thumbsup: