[H]ardOCP - GTX 680 reviewer's press guide posted

Grooveriding

Diamond Member
Dec 25, 2008
9,144
1,322
126
http://www.hardocp.com/article/2012/03/30/nvidia_geforce_gtx_680_reviewers_guide

[H]ardOCP has done something cool and put up the reviewer's guide that NVIDIA distributed with the GTX 680. It's interesting to see what NVIDIA says and offers to reviewers along with the cards they send out for review.

Also, as the [H] article says:

If nothing else, you will get to see the overall message that NVIDIA is evangelizing to the folks that write the reviews you are reading across the Web. It should be interesting to see how the review messages you have consumed differ or agree with what exactly NVIDIA’s messages were on the new GPU. Enjoy.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
There is not much technical info there, unfortunately. I believe AT covered most of the points in their review. A good read nonetheless.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I read through it yesterday... More general than such things have been rumored to be in the past. This is a good thing.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I'm surprised that they would post this. If I were NV I'd be pissed at somebody for doing that. Maybe Kyle will stop getting cards from NV going forward... he and Apoppin could compare notes, I guess...
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I agree with testing video cards the way the drivers install by default. This is from a discussion in the [H] forums.
http://hardforum.com/showpost.php?p=1038556058&postcount=15
Brent_Justice [H] Video Card Managing Editor, 12.0 Years



Quote:
Originally Posted by demingo
So when you guys test, do you follow any of the information relating to the AMD driver settings? Or do you just leave both at default? Is there a noticeable image quality difference with the AMD driver enhancements on? I am wondering how much of this is true and how much is Nvidia PR spin.

Default driver configs on both products, out-of-box experience as AMD and NV intend. The only fair way to test is to do nothing and leave the settings at the defaults that AMD and NV ship with, and then look at how performance and IQ compare.

Optimization does not equal a bad thing; it isn't a bad word. Any optimization that can improve the gameplay experience and performance, without a negative change in visual quality, is a positive thing.

But I'm conflicted about that statement when I know AMD CCC, by default, checks/enables tessellation optimization.

 

djsb

Member
Jun 14, 2011
81
0
61
But I'm conflicted about that statement when I know AMD CCC, by default, checks/enables tessellation optimization.
But if it made any negative visual difference, they would note in the review that Nvidia has better image quality. They have done this in the past where AA and AF modes are concerned.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
For the tessellation optimization to have any effect, a profile is required for each application where it would be used. As of now, no such profiles exist, so the optimization has no effect.

However, it is important to sensitize reviewers to check once in a while, because AMD mostly implemented this slider to get away with (cheat around) their lower tessellation performance in the 5000 and 6000 series.

It is useful for people with less powerful cards, but then the "AMD optimized" setting should be optional and "application controlled" should be the default setting just like it is with AA and AF.
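
Purely as an illustration of that behaviour (a sketch, not AMD's actual driver code; the table and function names are made up), "AMD Optimized" can only change anything once a per-game profile exists to clamp the factor:

Code:
# Illustrative sketch only -- not AMD driver code; names are hypothetical.
# With an empty profile table, "AMD Optimized" just passes the application's
# requested tessellation factor straight through.
TESS_PROFILES = {}  # hypothetical per-game caps, e.g. {"crysis2.exe": 16}

def effective_tess_factor(game_exe, app_requested_factor):
    cap = TESS_PROFILES.get(game_exe)
    if cap is None:
        # no profile -> behaves exactly like "use application settings"
        return app_requested_factor
    return min(app_requested_factor, cap)

print(effective_tess_factor("crysis2.exe", 64))  # 64 -- no profile, no effect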
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
But if it made any negative visual difference, they would note in the review that Nvidia has better image quality. They have done this in the past where AA and AF modes are concerned.

Yes, that can be subjective, and we have seen them voice, and stand by, certain observations that are subjective. So that's consistent.
But the team over there also beats into everyone that in-game benchmarks are just plain evil. They gave them a nickname, "canned," and beat into the readers' heads that these canned benchmarks can be optimized. Hey, there's that word again, optimization!
What I trust from reviewers that use both methods of benchmarking, gameplay and in-game benchmarks, is to inform me if they have a subjective opinion that the canned benchmark is not realistic.
That's something [H] won't give other reviewers credit for.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Curbing excessive tessellation is a good thing if it doesn't impact IQ. I think it should be left enabled in benchmarks provided there's no discernible difference while gaming.

because AMD mostly implemented this slider to get away with (cheat around) their lower tessellation performance in the 5000 and 6000 series.

How is providing higher performance with no discernible difference in IQ cheating? If developers had any sense when implementing tessellation, there would be no need for such a feature. Tessellation was supposed to improve IQ with little performance cost, not cripple performance while providing no appreciable difference in IQ. That's just tessellation done wrong.
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Curbing excessive tessellation is a good thing if it doesn't impact IQ. I think it should be left enabled in benchmarks provided there's no discernible difference while gaming.

How is providing higher performance with no discernible difference in IQ cheating? If developers had any sense when implementing tessellation, there would be no need for such a feature. Tessellation was supposed to improve IQ with little performance cost, not cripple performance while providing no appreciable difference in IQ. That's just tessellation done wrong.

Benchmarking means having a specific workload that all cards have to do. Altering this workload by using fp16 instead of fp32 render targets, 16-bit instead of 32-bit, or 4xAA instead of 8xAA is cheating. It is not for AMD/Nvidia to decide what "no discernible difference in IQ" is. If one thinks a certain benchmark is not suitable, then one should just not bench it, simple as that.

Look at the time when 32-bit was new. Turning it on gave a huge performance hit; today it is standard. The slider is okay for gaming, not for benchmarking.
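
To put a rough number on why swapping render-target precision really is a different workload (back-of-the-envelope arithmetic, not from the reviewer's guide): an RGBA fp32 target moves twice the data of an fp16 one on every read or write pass.

Code:
# Back-of-the-envelope only: per-pass size of a 1920x1200 RGBA render target.
width, height = 1920, 1200
fp32_bytes = width * height * 4 * 4  # 4 channels x 4 bytes each
fp16_bytes = width * height * 4 * 2  # 4 channels x 2 bytes each
print(round(fp32_bytes / 2**20, 1), "MiB vs", round(fp16_bytes / 2**20, 1), "MiB")
# -> 35.2 MiB vs 17.6 MiB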
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Benchmarking means having a specific workload that all cards have to do. Altering this workload by using fp16 instead of fp32 render targets, 16-bit instead of 32-bit, or 4xAA instead of 8xAA is cheating. It is not for AMD/Nvidia to decide what "no discernible difference in IQ" is. If one thinks a certain benchmark is not suitable, then one should just not bench it, simple as that.

Look at the time when 32-bit was new. Turning it on gave a huge performance hit; today it is standard. The slider is okay for gaming, not for benchmarking.

Agreed. If you're going to bench something for a 1:1 comparison, you make sure both are executing the same workload. Otherwise, disclose all differences during benching.

It isn't the reviewer's job to determine how the end user will configure their setup, just to test the products as equally as possible.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Benchmarking means having a specific workload that all cards have to do. Altering this workload by using fp16 instead of fp32 render targets, 16-bit instead of 32-bit, or 4xAA instead of 8xAA is cheating. It is not for AMD/Nvidia to decide what "no discernible difference in IQ" is. If one thinks a certain benchmark is not suitable, then one should just not bench it, simple as that.

Look at the time when 32-bit was new. Turning it on gave a huge performance hit; today it is standard. The slider is okay for gaming, not for benchmarking.

I didn't mean benchmarks, only games. Doing it in 3DMark or Unigine is far different from doing it in an actual game. I don't condone such practices in benchmarks, far from it; it is blatant cheating. nVidia is no stranger to such things; everyone remembers their blatant cheating in 3DMark03, where their flagship card was getting a whipping from a competitor's midrange card, so they decided to do something about it. However, when it comes to games things are far different. Benchmarking games is not about fairness; it is about the actual experience that a given card provides, so not impacting IQ while increasing performance is a very desirable thing. For example, if you can increase performance by 25% while impacting IQ only to the degree that you need two static screenshots and a magnifier to see a difference, do you think it should not be done?
 
Last edited:

Rvenger

Elite Member <br> Super Moderator <br> Video Cards
Apr 6, 2004
6,283
5
81
I call shens on the Crysis 2 Nvidia results. 7970 should have outperformed.
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I didn't mean benchmarks, only games. Doing it in 3DMark or Unigine is far different from doing it in an actual game. I don't condone such practices in benchmarks, far from it; it is blatant cheating. nVidia is no stranger to such things; everyone remembers their blatant cheating in 3DMark03, where their flagship card was getting a whipping from a competitor's midrange card, so they decided to do something about it. However, when it comes to games things are far different. Benchmarking games is not about fairness; it is about the actual experience that a given card provides, so not impacting IQ while increasing performance is a very desirable thing. For example, if you can increase performance by 25% while impacting IQ only to the degree that you need two static screenshots and a magnifier to see a difference, do you think it should not be done?

So now the reviewer is dictating what we, the consumers/audience, deem acceptable IQ?

IQ is subjective from person to person. I don't read reviews to see what some guy thinks about the IQ; in the end I may totally disagree and use something more lax or more aggressive. I read reviews to get an understanding of the product and to see how it fares against its competitors.

Reducing IQ to gain performance because the reviewer deems it a wash doesn't tell me anything about the product, only what the reviewer prefers. Now, if they want to do a separate IQ comparison, go for it, but don't mix the two and tell me where I should be setting my options for MAXIMUM PERFORMANCE. (Stupid Crysis, always have to do it that way.)
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
^ this.
If you want to change the setting, do it yourself. A separate part of the review where they show what the slider does in one or two games is okay if there is time for it. But it has no place in general benchmarks, synthetic or "real".
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
So now the reviewer is dictating what we, the consumers/audience, deem acceptable IQ?

IQ is subjective from person to person. I don't read reviews to see what some guy thinks about the IQ; in the end I may totally disagree and use something more lax or more aggressive. I read reviews to get an understanding of the product and to see how it fares against its competitors.

Reducing IQ to gain performance because the reviewer deems it a wash doesn't tell me anything about the product, only what the reviewer prefers. Now, if they want to do a separate IQ comparison, go for it, but don't mix the two and tell me where I should be setting my options for MAXIMUM PERFORMANCE. (Stupid Crysis, always have to do it that way.)
There are always differences in how games look on NV and AMD. If you want the game to look exactly as it should, your only choice is to use the MS software renderer.

If you want to change the setting, do it yourself. A separate part of the review where they show what the slider does in one or two games is okay if there is time for it. But it has no place in general benchmarks, synthetic or "real".
That way you aren't really doing any real-world benchmarks. All of your benchmarks become synthetic that way. You can compare ASICs like that, but not the final products.
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
That way you aren't really doing any real-world benchmarks. All of your benchmarks become synthetic that way. You can compare ASICs like that, but not the final products.

No, there are enough settings to cater to the majority of gamers out there, namely different resolutions with 4x or 8x MSAA and 16xAF. If you want "real" benchmarks, why not change in-game settings as well? And use slower CPUs - not everyone has an overclocked Core i7. Do you know how much time that would cost and how bloated the reviews would become?
Also, you bench what is comparable. As you cannot influence the tessellation level on Nvidia hardware, you cannot compare it to AMD.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
LOL, I get better performance with the "use application setting" and the slider maxed than with "AMD optimized" tessellation.

Maybe this setting is a throwback from the Cayman days, since Cayman was bad with tessellation. My Batman and Crysis 2 numbers went up when I enabled "use application setting" for tessellation.
I don't see a difference in image quality. Is there a way to enable wireframe mode with tessellation? Any other 7970 users want to test it?
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Also, you bench what is comparable. As you cannot influence the tessellation level on Nvidia hardware, you cannot compare it to AMD.

All the more reason for them to add that; more options are always better. We should use that slider, provided there's no impact on IQ. nVidia convinced some developers to cripple the performance of their own cards just to cripple the performance of the competitor's cards more. That's a really condemnable action, and that's the reason for the existence of that slider; if developers implemented tessellation for the IQ benefit, there would be no reason for such driver intervention. It is very fitting to use it.
 

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
But I'm conflicted about that statement when I know AMD CCC, by default, checks/enables tessellation optimization.

While AMD has the ability to override a game's tessellation level, they have yet to implement game-specific tessellation profiles to take advantage of it. The drivers say "optimized," but without a profile to tell them what to do, they render the application's default level. How long it stays that way is anyone's guess, but as of now it's not giving AMD an unfair advantage.

I've tested Crysis 2, Batman AC, Heaven, and Metro 2033, and the image quality and performance are the same with the "optimized" setting as with the option to let the game decide.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
There are always differences in how games look on NV and AMD. If you want the game to look exactly as it should, your only choice is to use the MS software renderer.

I never argued about what the game looks like in the end; that is more your argument. Mine was about making sure that both products are doing the same amount of work as closely as possible.

That way you aren't really doing any real-world benchmarks. All of your benchmarks become synthetic that way. You can compare ASICs like that, but not the final products.

Real-world benching? Because everyone has a highly clocked i7 processor and a cushy set of RAM? I think you are now reaching for something that completely turns your original position upside down.

The final product probably won't ever be used as it is reviewed. Everyone will tweak their end product to accommodate bottlenecks in their setup or to satisfy some requirements.

In the end, what you are asking for has more variables that will make the end result even more skewed than using optimizations. The reviewers have to run the products as close to 1:1 as they can, and if they wish to run a second article about optimization - more power to them. The user will decide which optimizations they want to use.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
LOL, I get better performance with the "use application setting" and the slider maxed than with "AMD optimized" tessellation.

Maybe this setting is a throwback from the Cayman days, since Cayman was bad with tessellation. My Batman and Crysis 2 numbers went up when I enabled "use application setting" for tessellation.
I don't see a difference in image quality. Is there a way to enable wireframe mode with tessellation? Any other 7970 users want to test it?

I monkeyed around with it in Batman: AC. When I owned my HD 5870, yes, the slider was useful. Right now, from my latest monkeying around in Batman: AC, the hardware is just so much better that the slider does nothing EXCEPT affect IQ (negatively as you scale down) for FPS gains that won't even matter.

I can tell you that in Batman: AC during my HD 5870 days it was bad to the point where tessellation acted like pop-up; flat objects would just grow in volume as you got closer, which made me laugh.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I monkeyed around with it in Batman: AC. When I owned my HD 5870, yes, the slider was useful. Right now, from my latest monkeying around in Batman: AC, the hardware is just so much better that the slider does nothing EXCEPT affect IQ (negatively as you scale down) for FPS gains that won't even matter.

I can tell you that in Batman: AC during my HD 5870 days it was bad to the point where tessellation acted like pop-up; flat objects would just grow in volume as you got closer, which made me laugh.

This doesn't even make a bit of sense, though. AMD optimized is slower than application preference, but I've only tested a couple of games. The 3DMark 11 score is the same with both settings.

When I select application preference, it automatically maxes the slider out to 64x. Anyway, I really think this setting was geared toward the 5xxx and 6xxx cards, which had relatively poor tessellation performance. That isn't really the case with the 7xxx.
 
Last edited:

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I'm in favor of the slider and the effort to give more options to adjust your experience. I just don't think it should be the default implementation.
By doing that, reviewers like Brent have no problem justifying leaving it that way.
I also think the whole "Nvidia has developers sabotage their own games" thing is nonsense.

I can cite Dirt 2. DX11 vs DX9 on the 5870 was an incredible performance hit, yet AMD needed Dirt 2 to have a DX11 game. In that game, tessellation was used for waving flags in the grandstands and a puddle water splash.
So even now, a lower-tier AMD card can still run DX11 and use that slider to gain some fps, but still benefit from the other DX11 coding in the executable that does exist.
The above real scenario had nothing to do with Nvidia.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
This doesn't even make a bit of sense, though. AMD optimized is slower than application preference, but I've only tested a couple of games. The 3DMark 11 score is the same with both settings.

When I select application preference, it automatically maxes the slider out to 64x.

You're confusing "grey-out" with "maxed-out."

You can freely bench it yourself; messing with the slider affects IQ once you go lower (I believe it was around 16x; below that you can see the changes in the vines, trees, and cobblestone - you start getting that pop-up thing I mentioned).

If the game is set to use 16-pixel triangles, setting the slider to 64x won't do anything. If the game is set to use 1-pixel triangles, setting it to 64x versus 32x will show you a difference (this is evident in Crysis 2).
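
A tiny sketch of that interaction (the numbers are just for the example): the slider only acts as an upper bound on whatever the application asks for, so a cap above the game's own level changes nothing.

Code:
# Illustrative only: the driver slider caps the tessellation factor the game requests.
def rendered_factor(app_factor, slider_cap):
    return min(app_factor, slider_cap)

print(rendered_factor(16, 64))  # 16 -- a cap above the game's level does nothing
print(rendered_factor(64, 32))  # 32 -- a cap below it visibly reduces geometry detail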