Skylake VS Sandy-E 980TI SLI Bench Battle

Page 2 - AnandTech Forums

Deders

Platinum Member
Oct 14, 2012
#26
Tomb Raider is very light on the CPU.
 

Deders

Platinum Member
Oct 14, 2012
#28
At least with their built-in benchmark; there's a 400% or more difference in CPU load between actual gameplay and the benchmark.
Even during the game it barely uses a core's worth, at least when I was playing it with SSAA and TressFX.
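Deders' "a core's worth" reading can be made concrete: on an N-core chip, a whole-system CPU % figure converts to equivalent fully loaded cores. A minimal sketch; the function name and example numbers are mine, not from the thread:

```python
def cores_worth(total_cpu_percent: float, n_cores: int) -> float:
    """Convert a whole-system CPU % reading into equivalent fully loaded cores."""
    return total_cpu_percent * n_cores / 100.0

# e.g. a 4-core i5-750 showing 22% overall load is under one core's worth:
print(cores_worth(22.0, 4))  # 0.88
```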
 

moonbogg

Diamond Member
Jan 8, 2011
#29
I would think that any included benchmark would not use the CPU nearly as much as actually playing the game, right? If it's a scripted thing, I would think it would be easier on the CPU. Also, that's a lot of games for me to buy, lol. I'm thinking probably not. I'm also waiting for him to complete the first round of benchmarks for our comparison. It's important to have relative data from reliable benchmarks before going into games, IMO. The benchmarks I completed show multi-core and single-core performance, GPU performance, etc.
 
Apr 22, 2012
#30
Yep, pre-scripted benchmarks are only good for GPU testing.
 
Mar 10, 2006
#31
You should also add some of these benchmarks:

Metro Last Light
Hitman Absolution (Very CPU limited)
Bioshock Infinite
Tomb Raider
Shadow of Mordor


All of these games have built-in benchmarks, so testing will be much easier.
I have Metro Last Light and Bioshock Infinite. I would be willing to purchase more fun games in the name of "research", though...

Anyway, I'll post up a bunch of results after I'm done with work today.
 

bystander36

Diamond Member
Apr 1, 2013
#32
Deders said:
Tomb raider is very light on the CPU.
I play Tomb Raider with 3D Vision, and unlike many games in 3D Vision, this one renders all the physics for both eyes, best I can tell. With 3D Vision, my FPS can drop into the 40s in a few spots and it's CPU-bottlenecked, but it never drops below 80 without it.

I have a very different view of Tomb Raider than most people: I view it as one of my most CPU-intensive games. Of course, it's also one of the best games to play in 3D Vision, so turning that off isn't really an option. I do turn "level of detail" down to High, which fixes the CPU-intensive areas, and I can't tell a difference unless comparing them side by side; even then it's tough.
 

Deders

Platinum Member
Oct 14, 2012
#33
bystander36 said:
I play Tomb Raider with 3D Vision, and unlike many games in 3D Vision, this one renders all the physics for both eyes the best I can tell. With 3D Vision, my FPS can drop to the 40's in a few spots, and it's CPU bottlenecked, but it never drops below 80 without it.

I have a very different view of Tomb Raider than most people. I view it as one of my most CPU intensive games. Of course it's also one of the best games to play in 3D Vision, so turning it off isn't really an option. I do turn down "level of detail" to high, and it fixes the CPU intense areas, and I can't tell a difference unless looking at them side by side, and still it's tough.
Would have thought the level of detail would have affected the GPU more.

Plus I don't recall that much physics. Nothing really intense at least. Most of it was scripted.

My readings were from a RivaTuner Statistics Server graph on my keyboard's display. This was on my old i5-750. It rarely went over a single core's worth. Granted, I was always GPU-limited with SSAA and TressFX, but the game was designed with the PS3 and Xbox 360 in mind.

Still might be worth a try with less intense settings.

I guess SSAA should be disabled for Metro too.
 

bystander36

Diamond Member
Apr 1, 2013
#34
Deders said:
Would have thought the level of detail would have affected the GPU more.

Plus I don't recall that much physics. Nothing really intense at least. Most of it was scripted.

My readings were from a Rivatuner statistics server graph on my keyboards display. This was on my old i5-750. It rarely went over a single core's worth. Granted I was always GPU limited with SSAA and TressFX but the game was designed with the PS3 and Xbox 360 in mind.

Still might be worth a try with less intense settings.

I guess SSAA should be disabled for Metro too.
Like I said, this was more of a 3D Vision issue. I just found it funny that while others view the game as CPU-light, with 3D Vision it was CPU-heavy. And it's not a GPU issue unless I leave TressFX on. I use TressFX if I turn 3D Vision off, but the game looks really good in 3D Vision.

Level of Detail isn't what you would think it is. It only affects the detail of things at range; less stuff is tracked at long range when set to High compared to Ultra.

I can tell whether it is GPU- or CPU-bound via MSI Afterburner. I have it shown on my Logitech G13 LCD at all times.
 
Jun 15, 2015
#36
Looking forward to the results. Thanks for doing this, the both of you!
 

moonbogg

Diamond Member
Jan 8, 2011
#38
Agreed, get in there Gus!
No doubt about that. No excuses, Guskline. Well, except for the whole ridiculous cost of the card and imminent obsolescence by future hardware just a few months from now. If he was hardcore, he'd do it anyway. :D HARDCORE GUS!
 

el etro

Golden Member
Jul 21, 2013
#39
If I could choose, I would get no more than a 6700K OC.
 

MrTeal

Platinum Member
Dec 7, 2003
#40
If you guys run the FO4 benchmark, I'd be really interested in a few runs at different cache frequencies. The cache on the Haswell-E chips runs much slower than the cache on a 6700k. Memory scaling has a big effect on FO4 benchmarks, it'd be interesting to see cache scaling as well.
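Quantifying the cache (or memory) scaling MrTeal describes just means comparing FPS between two runs at different frequencies. A tiny helper for that comparison; the example numbers are hypothetical, not measured results from this thread:

```python
def scaling_percent(fps_low: float, fps_high: float) -> float:
    """Percentage FPS gain going from the low-frequency run to the high-frequency run."""
    return (fps_high - fps_low) / fps_low * 100.0

# hypothetical: 92 fps at the lower cache clock, 100 fps at the higher one
print(round(scaling_percent(92.0, 100.0), 1))  # 8.7
```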
 

moonbogg

Diamond Member
Jan 8, 2011
#42
MrTeal said:
If you guys run the FO4 benchmark, I'd be really interested in a few runs at different cache frequencies. The cache on the Haswell-E chips runs much slower than the cache on a 6700k. Memory scaling has a big effect on FO4 benchmarks, it'd be interesting to see cache scaling as well.
Sorry, I don't have FO4.
 

bystander36

Diamond Member
Apr 1, 2013
#44
Either there is something wrong with your 980 Tis, or you two are not at the same clocks. His graphics score is much higher.
 
Mar 10, 2006
#45
bystander36 said:
Either there is something wrong with your 980ti's, or you two are not at the same clocks. His Graphics score is much higher.
Agreed. I'm going to run the benchmarks and have GPU-Z log the boost clocks to a file.

I'm also getting a much lower Unigine Heaven score than his.

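Averaging the boost clocks out of a sensor log like GPU-Z's can be scripted. A sketch assuming a CSV-style log; the exact column header varies by GPU-Z version, so `CLOCK_COL` and the sample rows are assumptions, not real log data:

```python
import csv
import io

# Adjust CLOCK_COL to match the header in your actual sensor log.
CLOCK_COL = "GPU Clock [MHz]"

sample_log = """Date, GPU Clock [MHz], GPU Temperature [C]
2015-08-20 18:00:01, 1404.5, 71
2015-08-20 18:00:02, 1392.2, 74
2015-08-20 18:00:03, 1404.5, 72
"""

def average_clock(text: str, col: str = CLOCK_COL) -> float:
    """Mean of the clock column across all logged samples."""
    reader = csv.DictReader(io.StringIO(text), skipinitialspace=True)
    clocks = [float(row[col]) for row in reader]
    return sum(clocks) / len(clocks)

print(round(average_clock(sample_log), 1))  # 1400.4
```

Comparing the two systems' averages (and minimums) would show whether boost behavior, not the CPUs, explains a score gap.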
 
Mar 10, 2006
#46
OK, so I cranked up the fan speed on the 980 Tis to 100%, and this led to a dramatic improvement in my Unigine Heaven score.



I will now re-run 3DMark.
 

Majcric

Golden Member
May 3, 2011
#47
Nice thread guys. I enjoy looking at the results. Any chance you have Watch Dogs?
 
Mar 10, 2006
#48
My 3DMark Fire Strike Extreme score went up a bunch after turning up the fan speeds.

 

moonbogg

Diamond Member
Jan 8, 2011
#49
I failed to disclose my GPU clocks. We didn't match them. Sigh... It's not a big deal. If you tell me what you are turboing to, I will match it exactly on my cards. My tests were done at 1450 core / 7900 memory. I will match yours and re-run the tests. The cards have to be at the same clocks or this CPU test doesn't mean crap.
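A quick sanity check for a clock mismatch: in a GPU-bound test, graphics scores tend to scale roughly with core clock, so you can estimate how much of a score gap the clock difference alone explains. A rough rule-of-thumb sketch; all numbers here are hypothetical:

```python
def clock_adjusted_gap(score_a: float, clock_a: float,
                       score_b: float, clock_b: float) -> float:
    """Residual % gap between two GPU-bound scores after removing the clock
    difference (rule of thumb: score scales linearly with core clock)."""
    expected_b = score_a * clock_b / clock_a
    return (score_b - expected_b) / expected_b * 100.0

# hypothetical: 9500 points at 1450 MHz vs 9200 points at 1400 MHz
print(round(clock_adjusted_gap(9500.0, 1450.0, 9200.0, 1400.0), 1))  # 0.3
```

A residual near zero suggests the clock mismatch accounts for the whole gap; a large residual points at something else (throttling, drivers, background load).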
 

moonbogg

Diamond Member
Jan 8, 2011
#50
Are we doing this? Let's move on to games. If there is any CPU bottlenecking, we will know when we test games at 1080p and compare to 1440p. I'm ready.
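The 1080p-vs-1440p comparison works because dropping resolution relieves only the GPU: if the frame rate barely rises at 1080p, the CPU was the wall. A rough classifier; the 15% threshold is my own assumption, not a figure from the thread:

```python
def likely_cpu_bound(fps_1080p: float, fps_1440p: float,
                     threshold: float = 1.15) -> bool:
    """If dropping to 1080p gains less than ~15% FPS, the CPU is probably the limit."""
    return fps_1080p / fps_1440p < threshold

print(likely_cpu_bound(98.0, 95.0))   # tiny gain at 1080p -> True (CPU-bound)
print(likely_cpu_bound(140.0, 95.0))  # big gain at 1080p -> False (GPU-bound)
```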
 

