GTX 680 SLI vs Radeon HD 7970 CrossfireX


Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Oh my god. IT IS GAMEPLAY.

Everything is rendered with the game engine. Further, the entire level is 10-15 fps slower,


That's bold. It's not a cut scene. Do the test like I did. It's 100% rendered with the game engine, and the entire level is consistently 10-15 fps slower.

Lastly, I'm not sure why apoppin would be so butthurt. I've stated that the 680 wins many games (Batman, BF3 very slightly, Dirt 3), with the 7970 winning all Crysis games, Metro 2033, and DEHR.

Please explain how this is an unreasonable statement. Multiple websites have shown benches of these games (Crysis 2 @ 2560, Metro 2033) being slightly slower on the 680 at stock speeds in a single-card config. Please explain how some games being slower on the 680, with others being faster, is an unreasonable statement. Lastly, PLEASE DO TEST it. Replicate my testing methods and try it. Please don't bother me with the one benchmark out of 20 which somehow shows a single-card 680 winning. Replicate the 7970 xfire and 680 SLI test with the latest drivers/CAPs with both cards overclocked.

By all means test it yourself. IMO apoppin is doing damage control for nvidia viral marketing, I guess? :p Latest driver for both cards, using Afterburner to overclock, 7970 @ 1125/1700 and 680 @ +135 offset. 2560 resolution. DX11 mode with tess, high textures. 12.2 WHQL with the latest CAP, 301.10 driver for NV. Both cards oc'ed as above. Full eye candy, everything enabled except motion blur. That's the testing method. There are no pre-rendered scenes in the game; it's all done with the game engine.
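If anyone does replicate this and wants to put a hard number on the gap, the rough Python sketch below averages a Fraps frametimes log for each setup. The file names and the CSV layout (header row, then frame number plus cumulative time in milliseconds) are assumptions for illustration, not something posted in this thread:

import csv

def avg_fps(path):
    # Average FPS from a Fraps-style frametimes CSV.
    # Assumed layout: header row, then "frame, cumulative time (ms)" per row.
    with open(path, newline="") as f:
        rows = [r for r in csv.reader(f) if r][1:]   # drop header and empty rows
    times_ms = [float(r[1]) for r in rows]
    return (len(times_ms) - 1) / ((times_ms[-1] - times_ms[0]) / 1000.0)

# Hypothetical capture names -- point these at your own two runs.
amd = avg_fps("7970cf_frametimes.csv")
nv = avg_fps("680sli_frametimes.csv")
print("7970 CF: %.1f fps | 680 SLI: %.1f fps | delta: %.1f fps" % (amd, nv, amd - nv))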

So you're saying flat out that a cutscene is gameplay. And that is something that all the review sites included in their benches? Or just you? I've never heard of anyone benchmarking cutscenes anyway. You should format anyway, get a nice clean install of Windows 7, and make sure no GPU drivers are ever present in the system beforehand.

I always make it a point, when I benchmark, to have two hard drives. I install a fresh copy of Windows 7 (they give you 30 days to register, but for just benching purposes that's fine). Install every driver except the GPU's driver. Then I clone the drive. The machine never gets fired up if the drive doesn't match the cards I'm using.

It's the best and purest way to bench, IMHO. And not cut scenes. :D
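A sanity check that fits that workflow (just a sketch, not something from this thread): log the active display adapter and driver version right before each run, so every result stays tied to the drive/driver combo it came from. Python calling the stock wmic tool on Windows 7 is enough:

import subprocess

# Query the active display adapter(s) and driver version through WMI.
# wmic ships with Windows 7, so nothing extra needs installing.
info = subprocess.check_output(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"]
)
print(info.decode(errors="replace"))  # keep this output alongside each bench log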

I'd also invite you to freely browse Crysis SLI results for GTX 680s and CrossFired 7970s on a veritable plethora (3 amigos xD ) of review sites.

Here is one to start you off. http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-6.html
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I'm done discussing this. Do the test, or don't do it. If you have a hard time believing it, I welcome you to test it yourself.

The passive-aggressive jabs are cute, but if someone is calling me a liar, please, by all means, replicate my testing methods and test it.

Again it is not a cutscene. Have you played the game? It is the entire level, consistently 10-15 fps lower.

Here's another to start you off:

[Image: Crysis_wh.png]
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yeah, I edited because, having played it, I know the benchmark they used is limited to around 99 fps on the top end for most people.

[Image: b8151c2b.png]


So it's probably not exactly a scaling issue, as they have $1000-$1200 worth of cards running 1200p, and Crysis 2 has no real AA; it's only FXAA, so adding 4xAA does nothing.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I just wanted to chime in and point out that a canned bench is the equivalent of an in-game rendered cutscene. There is no input from the player, and the scene is entirely rendered by the engine in real time.

So I find it ironic that someone claims a "cutscene" isn't proper yet a canned bench is. You can play a "cutscene" rendered in-game over and over and over, and you'll notice it is the exact same thing. EDIT: Unless it is an intro scene over an MP scene, and you can actually have players in the background monkeying around as your "cutscene" is playing out. Think of WoW when you create a new toon and there are players in the starting area moving about.

Kind of like a canned benchmark ;)
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I just wanted to chime in and point out that a canned bench is the equivalent of an in-game rendered cutscene. There is no input from the player, and the scene is entirely rendered by the engine in real time.

So I find it ironic that someone claims a "cutscene" isn't proper yet a canned bench is. You can play a "cutscene" rendered in-game over and over and over, and you'll notice it is the exact same thing. EDIT: Unless it is an intro scene over an MP scene, and you can actually have players in the background monkeying around as your "cutscene" is playing out. Think of WoW when you create a new toon and there are players in the starting area moving about.

Kind of like a canned benchmark ;)

I test the level in its entirety, so this argument about cutscenes is meaningless. The intro is rendered by the game engine anyway; it is not a pre-rendered cut scene. But, again -- I tested the entire level on both setups. Balla, is the entire Out of the Ashes level a cutscene? ROFL, I'm asking because you've actually played it, unlike some others.


I still cannot understand how some are so butthurt by this; I've acknowledged that the 680 wins other games such as Skyrim, Dirt 3, and a couple of others. Unfortunately, it does not win any Crysis game at the resolution and settings I play at. Which is fine, because I play Skyrim more than Crysis 2 (which I've played through several times), and Skyrim is faster on the 680s.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I test the level in its entirety, so this argument about cutscenes is meaningless. The intro is rendered by the game engine anyway; it is not a pre-rendered cut scene. But, again -- I tested the entire level on both setups. Balla, is the entire Out of the Ashes level a cutscene? ROFL, I'm asking because you've actually played it, unlike some others.


I still cannot understand how some are so butthurt by this; I've acknowledged that the 680 wins other games such as Skyrim, Dirt 3, and a couple of others. Unfortunately, it does not win any Crysis game at the resolution and settings I play at. Which is fine, because I play Skyrim more than Crysis 2 (which I've played through several times), and Skyrim is faster on the 680s.

The crusade must be fought. When making bold claims, any straw that can be grasped will be pulled.

After reading over there (and then finding this thread was the origin), I asked myself: how does one card have an advantage at rendering a cutscene over another when they are essentially doing the same work? In-game rendered cutscenes are still done on the fly; they are akin to canned benchmarks.

I'm not focusing on anything else but that simple contention - in-game cutscenes favor AMD according to him. Guess AMD optimized for cutscenes haha.
 

AdamK47

Lifer
Oct 9, 1999
15,842
3,628
136
Get a 3rd 7970 and then wait to upgrade to the GK110 at the end of the year.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I guess AMD optimized their drivers for the first 10-second intro of the Out of the Ashes level, and since no website tests that level, that's pretty interesting. I wonder how AMD is consistently faster throughout the entire level, though? Hmm, hmm, AMD must be cheating, clearly.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Nope, Out of the Ashes is what I used to test it when it first came out, in DX9.

[Image: e51e87d1.png]


You still control the gun, and its run time is always the same. I think I used a 180-second timer on Fraps for each run.
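For what it's worth, a fixed-length run like that is easy to summarize afterwards. A minimal sketch, assuming the log is one FPS reading per second with a header line (the file name is made up):

# Summarize a fixed-duration FPS log: one reading per second, ~180 lines for a 180 s run.
with open("crysis2_fps.csv") as f:
    next(f)                                  # skip the header line
    fps = [float(line) for line in f if line.strip()]

print("samples: %d" % len(fps))
print("avg fps: %.1f" % (sum(fps) / len(fps)))
print("min fps: %.1f" % min(fps))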
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Nope, Out of the Ashes is what I used to test it when it first came out, in DX9.

[Image: e51e87d1.png]


You still control the gun, and its run time is always the same. I think I used a 180-second timer on Fraps for each run.

Thank you :cool: It's a great level for testing, IMO, because it's a pretty detailed and demanding level. And it's not a cutscene like others suggest; the intro only lasts about 10 seconds (and is rendered with the game engine).
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
The intro is basically one guy talking with the OSD showing, and the earth moving and causing some debris to fall from the top level of what appears to be a road or parkway.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Apples and oranges.
Cut scenes that possibly play from a compressed proprietary format (essentially a DivX movie) are in no way the same as in-game benchmarks that run without user input and are rendered through the GPU hardware.
Those canned benchmarks are also what many intelligent reviewers call scientific testing.
I used the word intelligent because I doubt there is a college degree for benchmarking. I'm just saying that the journalists/reviewers are not all cut from the same cloth, but none of them are dummies.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Apples and oranges.
Cut scenes that possibly play from a compressed proprietary format (essentially a DivX movie) are in no way the same as in-game benchmarks that run without user input and are rendered through the GPU hardware.
Those canned benchmarks are also what many intelligent reviewers call scientific testing.
I used the word intelligent because I doubt there is a college degree for benchmarking. I'm just saying that the journalists/reviewers are not all cut from the same cloth, but none of them are dummies.

An in-game rendered cutscene isn't a pre-recorded file that is played back. You can tell when your AF/AA settings are applied to in-game rendered cut scenes. Lots of games are now using engine-rendered cutscenes instead of pre-rendered ones, due to cost and time.

Pick up Call of Duty: most of the "cinematics" are done via the in-game engine, rendered on the fly.

There is a difference, and it is being clearly noted.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Every cut scene in StarCraft 2 is rendered in real time through the engine; they also happen to be some of the most intensive on your system.

Not a fan of canned bench runs in reviews, for the most part. AMD and nvidia specifically optimize for the more well-known canned benches, so they may not represent what your gameplay will turn out to be. Sure, a couple are alright, but a review full of canned benches is not much of a review, and a lazy one at that, if all it consists of are results from canned benching.
 

themodernlife

Member
Mar 24, 2010
80
0
0
And, in order for your personal benchmarks to even begin to be taken seriously, you need to run each of them on its own fresh Windows installation.
I kind of figured your results wouldn't reflect what the review sites came up with. You may need to improve your benchmark methods.

LoL. He's just sharing his genuine real-world experience, which reflects what we ourselves would be experiencing. Kudos to him for providing numbers that reflect an honest attempt to figure out the best product for his specific needs. He may not have companies sending him free gear to use, but he is doing us all a favor.
 

Elfear

Diamond Member
May 30, 2004
7,168
826
126
Every cut scene in StarCraft 2 is rendered in real time through the engine; they also happen to be some of the most intensive on your system.

+1

When testing for my overclocking limit, the cut scenes would crash faster than actual gameplay.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
This is actually ridiculous: nvidia fanboys refusing to believe that GTX 680 SLI is slower in some games than 7970 CrossFire, including Crysis 2. Saying that cut scenes rendered in the game engine are actually pre-recorded takes the cake. If they were pre-recorded, there wouldn't be any difference in fps, would there?
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
This is actually ridiculous: nvidia fanboys refusing to believe that GTX 680 SLI is slower in some games than 7970 CrossFire, including Crysis 2. Saying that cut scenes rendered in the game engine are actually pre-recorded takes the cake. If they were pre-recorded, there wouldn't be any difference in fps, would there?
This preaching takes 2 pictures :)
[Images: exposestraw.jpg, Strawman (light).jpg]
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Silly wabbits, don't you know it's pointless to use logic, reason, or objectivity in discussions with those wearing green blinders?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I would think he would need, at the bare minimum, six 7970s in crossfire to match 680 sli.

More to the point, we should probably give nvidia a chance to actually address SLI scaling within the drivers, since the only driver released to the public is the 301.10 WHQL.

AMD has had their cards paper launched since 2011, so they've got a bit of a head start on driver support.