My GTX680 Review (48 games tested!)


Hugh H

Senior member
Jul 11, 2008
315
0
0
I ordered an Asus 680 today to replace my unlocked 6950. But 10 mins later I had a change of mind and cancelled the order. The desire to own new technology is tempting, but the fact that I only play 1 PC game atm (Tropico 4) and the highest resolution on my TV is 1080p is counteracting this. I am sure many of you have gone through this. How do you finally justify your purchase and feel good about it?

You don't. Just gotta do it man.
 

AdamK47

Lifer
Oct 9, 1999
15,844
3,632
136
The GTX 680 is to Sandy Bridge as the GK110 is to Sandy Bridge E.

I'm waiting this one out.
 
Mar 11, 2004
23,444
5,852
146
Yep, that’s OGSS. The xS and X modes do that. 32xS in this instance internally renders at 5120x3200 with 8xMSAA, then down-samples. I’ve been using those modes since 2004 on the 6800 Ultra.

nVidia’s new version just adds support for DX10/DX11 and likely resolves later in the pipeline to account for deferred rendering, but it’s probably too slow for modern games.
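For anyone unfamiliar with how those xS modes resolve, here's a rough sketch of what an ordered-grid supersampling (OGSS) downsample does, assuming a plain box filter (my illustration, not NVIDIA's actual resolve kernel):

```python
# Rough sketch of an OGSS resolve, assuming a simple box filter --
# not NVIDIA's actual downsampling kernel.
import numpy as np

def ogss_downsample(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average each factor x factor block of a (H, W, C) frame."""
    h, w, c = frame.shape
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# 32xS at 2560x1600 renders internally at 5120x3200 (2x per axis),
# applies 8xMSAA there, then averages 2x2 blocks back down:
frame = np.random.rand(32, 32, 3)    # small stand-in for the 5120x3200 buffer
print(ogss_downsample(frame).shape)  # (16, 16, 3)
```

The 2x2 averaging is why each output pixel blends four shaded samples on top of whatever MSAA already resolved.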

As an aside, I’m finding FXAA to be far superior to MLAA and very usable in games where no other AA works or 2xMSAA is too slow.

Ah, ok, thanks for the clarification, I really don't know a lot of this stuff and haven't been keeping up on it too much.

Yeah, FXAA seems to be quite good, and I'm happy to see developments in AA that aren't just about upping sample counts on modes that are nigh-unusable in too many games.

You might find this thread interesting: http://forums.anandtech.com/showthread.php?t=2209459
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
For Quake 3, you could probably try 32x CSAA with 8x SGSSAA.. to improve things even further?!?
True, but such an improvement would probably be academic at best. When you reach a certain level of IQ, maintaining a high (and constant) framerate becomes much more noticeable than theoretical IQ increases.

This is especially true in those old twitchy games where it’s very easy to detect when you drop from a constant 120 FPS to something like 60 FPS.

As to the 32xS modes for UT99, etc., why not just use the SGSSAA modes like you did for Quake 3? It seems that 8x SGSSAA is better than 32xS, since 32xS only does 2x2 OGSS, with 8x RGMSAA on top of that. That means you're only getting 32x effective AF (2x 16x AF) but with 8x sparse-grid SSAA, you're getting more than that, right? Plus at 2560x1600, the edges are already pretty much sharp enough with 8x (you could do 32x CSAA plus 8x SGSSAA, right?)..
Old games usually have no shader aliasing and only slight texture aliasing at worst. You don’t need the strongest SSAA available to clean that up.

Also SGSS can cause slight image blurring unless you adjust the LOD. Not only would this complicate my settings, but the LOD adjustment doesn’t work for OpenGL, which accounts for a massive portion of my older games.

I don’t want to blur opaque textures in old games without a good reason, so I stick to xS. The only reason I used 8xSGSSAA in Quake 3 was because there’s a current driver bug that prevents 32xS from working in it, and it also has pretty similar performance to 32xS.
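For reference, the commonly quoted rule for that SGSSAA LOD adjustment is a shift of -0.5 per doubling of the sample count (standard texturing math; my sketch, not from the post):

```python
import math

def sgssaa_lod_bias(samples: int) -> float:
    """Textbook LOD adjustment for n-sample sparse-grid SSAA:
    shift mip selection by -0.5 per doubling of the sample count."""
    return -0.5 * math.log2(samples)

for n in (2, 4, 8):
    print(f"{n}x SGSSAA -> LOD bias {sgssaa_lod_bias(n):+.2f}")
# 8x SGSSAA -> LOD bias -1.50
```

Which is why 8x SGSSAA without the -1.5 adjustment looks slightly blurred: the mip chain is still being sampled as if only one sample per pixel were taken.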

This reminds me to remind you.. you gotta try out Stereo3D, my friend!!! I'd guess that 90% of those who enjoy playing older games would hugely appreciate playing them again in 3D, the biggest and most refreshing change since color vs black-n-white. About 60% of the games can be played without any serious issues.
60% is not even remotely close to the level of robustness I expect, not to mention that I’m not willing to downgrade to a 1080p TN panel from my current 1600p S-IPS.

What happened to doing reviews for ABT? Your unique review is definitely worthy of an "official" review article!
Kind words, thanks. :)

I don’t write for ABT anymore. Things didn’t work out for me over there, but I wish them the best for the future.
 

BoFox

Senior member
May 10, 2008
689
0
0
Cool, thanks for the reply! Same here - I can notice a difference between 85fps and 60fps (with the refresh rate at least 85Hz, of course), but ~90fps/Hz is about as fluid as it gets for my eyes to discern any further benefit.

Yeah, I'd agree that 32xS is like pretty much "good enough" for most older games - unless you can enable 4x4 SSAA at playable frame rates!!
I found a nifty chart explaining the different modes (though the CSAA modes are just alternate names for the modes with identical numbers, and I'd say the color/coverage samples need some more explaining):
[attached image: Antialiasing.jpg]

http://www.techspot.com/community/topics/question-about-anti-aliasing.166645/
It seems that 8x SGSSAA multiplies 16x AF by 3-4 with a negative LOD bias of 1.5 to 2.0, giving up to 64x effective AF (while the hybrid OGSSAA modes using only 2x2 OGSS give 32x AF).
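That 3-4x multiplier checks out as back-of-envelope texturing math: an LOD bias of -b samples the mip chain as if 2^b more texture detail were requested (illustrative numbers, not measurements):

```python
# Back-of-envelope "effective AF" numbers -- illustrative, not measured.
base_af = 16

# 2x2 OGSS doubles the sampling rate along each axis:
ogss_effective = base_af * 2          # -> 32x

# An LOD bias of -b samples textures as if 2**b more detail were requested:
for bias in (1.5, 2.0):
    print(f"LOD {-bias:+.1f}: ~{base_af * 2 ** bias:.0f}x effective AF")
# LOD -1.5: ~45x effective AF
# LOD -2.0: ~64x effective AF
```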

As to the LOD adjustments, you could probably adjust them in the .ini files of most OpenGL games (and most of the Unreal-engine games too), and then set the Negative LOD Bias option in the CP to "Allow" rather than "Clamp".

Hey, who said that you have to "downgrade" your monitor? You could just get a DLP HDTV (especially the Mitsubishi one) that's a whopping 65 inches for only $700 or so if you find a good deal on last year's model on sale. It's perfectly compatible with Nvidia 3DVision, and also 3DTV-ready (with the "killer" benefit being that it has 0.05ms of response time, meaning zero ghosting issues). There, you have an awesome 3DTV that you could enjoy with DLP Link 3D glasses, or with NV's 3DVision kit when you hook it up to your rig using a long HDMI cable. The "level of robustness" for S3D compatibility is orders of magnitude better than AMD's HD3D support with 3rd-party software, for a vast library of games (I'd say hundreds of games, if not 500 as a guess).

Well, the future is coming anyways. :cool:

EDIT- Ever tried out the MIXEDSAMPLE_64X or 128X modes (at the bottom of the menu in NVIDIA Inspector)??? I haven't bothered yet, but one Russian guy said (roughly translated):
MIXEDSAMPLE_64X sometimes worked, but showed 2.4 FPS in the Fraps log screenshot; other times it just rendered a black screen.
MIXEDSAMPLE_128X hung the video card so badly the monitor reported "no signal detected". Ctrl+Alt+Del cured it in some cases, but I think the point is clear: it's something uber-hellish, seemingly intended only for some kind of quad-card setup.))
http://forum.eve-ru.com/index.php?showtopic=46579

It looks like AA_MODE_METHOD_SUPERVCAA_64X_8v8 could be better than 32xS! It's 2x2 OGSS + 16xQ CSAA (8x MS + 8x CS) :)
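Tallying the sample counts the same way as above (my arithmetic; the mode's internals are inferred from its name, so treat them as a guess):

```python
# Sample-count tally for the hybrid modes discussed above.
# The SUPERVCAA mode's internals are inferred from its name, not documented.
modes = {
    # name: (OGSS grid per axis, MSAA color samples, extra coverage samples)
    "32xS":                  (2, 8, 0),   # 2x2 OGSS + 8x MSAA
    "SUPERVCAA_64X_8v8 (?)": (2, 8, 8),   # 2x2 OGSS + 16xQ CSAA (8 MS + 8 CS)
}
for name, (grid, msaa, csaa) in modes.items():
    color = grid * grid * msaa
    coverage = grid * grid * (msaa + csaa)
    print(f"{name}: {color} color samples, {coverage} coverage samples")
# 32xS:                  32 color samples, 32 coverage samples
# SUPERVCAA_64X_8v8 (?): 32 color samples, 64 coverage samples
```

So both modes shade the same number of color samples; the CSAA-based one only adds cheaper coverage samples on top.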
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126

DeeJayeS

Member
Dec 28, 2011
111
0
0
Only 25% increase from flagship to flagship?

Seems like only yesterday there were multiple threads being created about how poor the performance increase was for AMD's 7970 vs. the 580/6970.

Why aren't those people now screaming about NV's "failure"?

/roll eyes...
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
So does this really have to be explained to you? The 7970 gave you a 40% increase over the 6970 BUT cost you 50% MORE. Sure, the increase the GTX 680 gives over the GTX 580 is smaller, but they are at least giving it to you for the SAME price. Bottom line is that Nvidia gives you 25-30% more performance per dollar over the previous gen, whereas AMD gave you 5-10% LESS. :whistle:
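The perf-per-dollar arithmetic holds up if you plug in the rough launch numbers (assumed here: 7970 at +40% perf for +50% price, GTX 680 at +27% perf for the same price):

```python
# Sanity-checking the perf-per-dollar claim with rounded launch numbers.
def perf_per_dollar_change(perf_gain: float, price_change: float) -> float:
    """Relative change in performance per dollar,
    e.g. +0.40 perf at +0.50 price -> about -0.07."""
    return (1 + perf_gain) / (1 + price_change) - 1

print(f"7970 vs 6970:  {perf_per_dollar_change(0.40, 0.50):+.0%}")  # -7%
print(f"680 vs 580:    {perf_per_dollar_change(0.27, 0.00):+.0%}")  # +27%
```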
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126

BUT, the GTX 680 is really the GTX 460 replacement :p

Have to throw that in there - the scaling is the same :colbert: