[VR] NVIDIA GeForce GTX 680 Specifications Revealed


BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
No way GK104, which is rumored to be 20% faster than a single GTX 580, would be able to do that.

GK100?

Also, wow.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Still, the overhead from 4xAA doesn't require two additional GTX 580s.

Any other setting reductions?

FXAA also looks considerably better than 4xAA.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
You know what

With well-executed UE3, particularly on something dreamy with lots of DoF/bokeh like Samaritan - IT DOES.
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
But it's also a single card, which has its own benefits.


OT - I really hope GK104 ends up being somewhere between 7950 and 7970. Not just because I already invested in Tahiti but because I like it when Nvidia and AMD are neck and neck. If Nvidia's midrange really beats AMD's best, it could be bad for future competition. I don't want the GPU world to start looking like the CPU world.

If NV mid range kicks AMD high range, that means the NV high range would be such a monster we wouldn't need to upgrade so often - that would have to be a savings for many... As it is, one 7970 is plenty and more for most games...
Not that I do...lol
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Jaw was hitting the floor at 3 GTX 580s to 1 Kepler, then I read the linked NVIDIA PR and about the switch from MSAA to FXAA. Back to waiting on actual game results.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
MSAA on a deferred engine is particularly heavy, and not easy to implement.

BF3 uses some kind of supersampling, Witcher 2's MSAA is lacking,
UE3 implementations often have white outlining - so does Skyrim, and so on.

We need FXAA 4.0 and new SMAA!
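To put rough numbers on why MSAA hurts so much on a deferred engine: every G-buffer target has to be stored (and read back) once per sample, so memory and bandwidth scale with the sample count before you even get to the extra shading at edges. A back-of-the-envelope sketch - the four-target layout and formats here are my own assumptions for illustration, not any particular engine's:

```cpp
#include <cstdio>
#include <initializer_list>

// Back-of-the-envelope G-buffer cost for a deferred renderer at 1080p.
// The four-target layout below (albedo RGBA8, normals RGBA16F,
// material RGBA8, depth D32F) is an assumption for illustration only.
int main() {
    const int width = 1920, height = 1080;
    const int bytesPerPixel = 4 + 8 + 4 + 4;  // sum of the targets above

    for (int samples : {1, 4}) {  // no MSAA vs 4x MSAA
        double mib = double(width) * height * bytesPerPixel * samples
                     / (1024.0 * 1024.0);
        printf("%dx: G-buffer ~%.0f MiB\n", samples, mib);
    }
    return 0;
}
```

With those made-up but typical numbers, 4x MSAA takes the G-buffer from roughly 40 MiB to roughly 160 MiB, which is why deferred engines tend to skip MSAA entirely or apply it only selectively.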
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Well, we all know FXAA does tend to blur, but it really depends.
SMAA is much better, and MLAA was the worst blur offender and the slowest.

Dunno how the new MLAA 2.0 is - I've heard it's faster but worse looking.

New SMAA and FXAA 4.0 can't come fast enough.
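For what it's worth, the blur comes from how these filters work: FXAA/MLAA-style AA runs on the finished image, looks for high-contrast pixels, and blends them with their neighbours, so it can't tell a geometry edge from texture detail. A stripped-down sketch of that idea - this is not NVIDIA's actual FXAA shader, just the core concept in plain C++:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Minimal sketch of a post-process AA filter in the FXAA/MLAA family:
// detect pixels with high luma contrast against their neighbours and
// blend them toward a local average. Real FXAA searches along the edge
// direction instead of doing a plain 4-neighbour blend.

struct Pixel { float r, g, b; };

static float luma(const Pixel& p) {
    return 0.299f * p.r + 0.587f * p.g + 0.114f * p.b;  // Rec. 601 weights
}

void postProcessAA(std::vector<Pixel>& img, int w, int h, float threshold = 0.1f) {
    std::vector<Pixel> src = img;  // read from a copy, write in place
    auto at = [&](int x, int y) -> const Pixel& {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return src[y * w + x];
    };
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float c = luma(at(x, y));
            float n = luma(at(x, y - 1)), s = luma(at(x, y + 1));
            float e = luma(at(x + 1, y)), wl = luma(at(x - 1, y));
            float range = std::max({c, n, s, e, wl}) - std::min({c, n, s, e, wl});
            if (range < threshold) continue;  // flat area: leave untouched

            // blend toward the average of the 4 neighbours; this is also
            // where texture detail gets softened along with real edges
            const Pixel& pn = at(x, y - 1); const Pixel& ps = at(x, y + 1);
            const Pixel& pe = at(x + 1, y); const Pixel& pw = at(x - 1, y);
            Pixel avg{ (pn.r + ps.r + pe.r + pw.r) * 0.25f,
                       (pn.g + ps.g + pe.g + pw.g) * 0.25f,
                       (pn.b + ps.b + pe.b + pw.b) * 0.25f };
            Pixel& out = img[y * w + x];
            out.r = 0.5f * (out.r + avg.r);
            out.g = 0.5f * (out.g + avg.g);
            out.b = 0.5f * (out.b + avg.b);
        }
    }
}

int main() {
    // tiny synthetic test: an 8x8 image with a hard vertical edge
    const int w = 8, h = 8;
    std::vector<Pixel> img(w * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            img[y * w + x] = (x < w / 2) ? Pixel{0.f, 0.f, 0.f} : Pixel{1.f, 1.f, 1.f};

    postProcessAA(img, w, h);
    printf("row 0 luma after filtering: ");
    for (int x = 0; x < w; ++x) printf("%.2f ", luma(img[x]));
    printf("\n");
    return 0;
}
```

Because the filter only sees colours, not geometry, some legitimate texture contrast gets smoothed too - that's the blur people complain about, and what SMAA's pattern matching tries to reduce.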
 

thilanliyan

Lifer
Jun 21, 2005
12,066
2,279
126
I say unfortunately because AMD didn't set the bar high enough so now Nvidia is able to label what was intended as a midrange card as a high end card and sell it as such.

Oh dear... so when nV releases their cards, if they price them relative to current 7-series prices, are you gonna blame AMD for that? Wow, just wow, I guess nVidia can do no wrong in some people's eyes.

Why do you blame AMD now for pricing their cards relative to the GTX580, if you're going to excuse nVidia for doing the same thing with Kepler?

Should AMD have priced the 4870 just under the GTX 280, which was $650 at the time? I mean, nV set the price bar pretty high, so AMD should have followed suit, right?

And after admitting you don't usually buy AMD, now it makes sense why you came into every 7 series thread and complained about the price.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I'm just guessing, so don't come at me bro... But I would assume the logic behind their thought process is that AMD released a next-gen product that took the price point of last gen's high-end cards, but only provided next-gen mid-range performance.
 
Feb 19, 2009
10,457
10
76
MSAA on a deferred engine is particularly heavy, and not easy to implement.

BF3 uses some kind of supersampling, Witcher 2's MSAA is lacking,
UE3 implementations often have white outlining - so does Skyrim, and so on.

We need FXAA 4.0 and new SMAA!

BF3 uses a heavily tweaked and optimized MSAA that acts only on a small portion of objects in a scene, which is why it only incurs a 15-20% performance hit in a deferred rendering engine.

Without those optimizations, it would be much, much slower, as most of the AA is wasted on things you don't see.
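That lines up with a simple cost model of selective per-sample shading: if only the pixels flagged as geometry edges get lit once per MSAA sample and everything else is lit once, the extra lighting work stays small. The 5% edge fraction below is an assumption for illustration, not a measured BF3 figure:

```cpp
#include <cstdio>

// Rough cost model for selective per-sample lighting in a deferred
// renderer: edge pixels are lit once per MSAA sample, interior pixels
// once. The 5% edge fraction is an assumed, illustrative number.
int main() {
    const double pixels   = 1920.0 * 1080.0;
    const int    samples  = 4;      // 4x MSAA
    const double edgeFrac = 0.05;   // assumed share of pixels on geometry edges

    double naive     = pixels * samples;              // light every sample everywhere
    double selective = pixels * (1.0 - edgeFrac)      // interior: 1 sample
                     + pixels * edgeFrac * samples;   // edges: all samples

    printf("brute force: %.0f lighting invocations\n", naive);
    printf("selective  : %.0f lighting invocations (+%.0f%% vs no AA)\n",
           selective, (selective / pixels - 1.0) * 100.0);
    return 0;
}
```

With those assumptions the lighting cost lands about 15% above the no-AA case, which is in the same ballpark as the hit quoted above.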
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
I'm just guessing, so don't come at me bro... But I would assume the logic behind their thought process is that AMD released a next-gen product that took the price point of last gen's high-end cards, but only provided next-gen mid-range performance.

This gets thrown around a lot and it's just not true... Since when are mid-range cards supposed to beat previous high-end ones? Sure, there's the occasional "amazing" generation where that happened, like the 9700 Pro/8800 GTX, but most of the time it's not true.

I still don't get why Nvidia would label their midrange part as the 680, with the high end nowhere in sight... It reminds me of how they skipped the 380 series because Fermi was "so good it warranted 2 generations of upgrades" :D
 

thilanliyan

Lifer
Jun 21, 2005
12,066
2,279
126
I'm just guessing, so don't come at me bro... But I would assume the logic behind their thought process is that AMD released a next-gen product that took the price point of last gen's high-end cards

This doesn't make sense to me...was AMD supposed to GUESS what kind of performance nV would bring with their mid- or high-end and then price it according to their GUESS? You price according to what's available in the market. Trying to argue that AMD should price according to what they THINK will COME reeks of ignorance and other unpleasant words :).

AMD did bargain hunters a favour by starting price wars during the 4870/5870/6970 days. They don't owe us that continuously. Why aren't these same people clamoring for nV to start a price-war? No, now we have people that will blame AMD for nV's pricing. :confused:
I want to see both AMD AND NVIDIA doing well. They both need to make money. I see no problem with both companies pricing their cards relative to what the market will bear and what the competition brings. If we get price wars, all the better.

but only provided next-gen mid-range performance.
This is an unknown right now. How can you say with certainty that the 7970 only brought "mid-range" performance? We don't even know yet what nV is bringing.
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Public Service Announcement:

If you don't like the product at the price that it sells for, for whatever reason, don't buy it.

If you like the product at the price that it sells for, and you have a need for it, buy it.

It serves no purpose to go on and on about how it compares to last gen price/perf-wise or whatever (because otherwise you'd just buy last-gen stuff if you preferred it, probably at a nice discount, and be done with it). Or to compare it with other products that won't release anytime soon in an attempt to characterize something as midrange or high end or whatever, when those categories are somewhat arbitrary. You can kvetch all you want, but the companies do not listen. They only care about how you vote with your wallet.

So vote with your wallet. Case closed.

(mutters something about First World Problems...)
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
http://international.download.nvidi.../GDC2012/Samaritan-MSAAFXAA-Comparison-1.html
http://international.download.nvidi.../GDC2012/Samaritan-NoAAFXAA-Comparison-1.html

I can't tell a difference in the textures with either full-screen shot comparing MSAA and no AA to FXAA. You must have the most perfecterist vision ever!
The entire FXAA side is blurred by depth of field to hide its poorer image quality. This is typical marketing deceit that most discerning customers would catch. Fanboys, however, do not.
 

thilanliyan

Lifer
Jun 21, 2005
12,066
2,279
126
The entire FXAA side is blurred by depth of field to hide its poorer image quality. This is typical marketing deceit that most discerning customers would catch. Fanboys, however, do not.

Move the slider left or right and you can see the entire scene with FXAA or MSAA. FXAA does blur noticeably though.