
GTX680, images from THG review leaked

Page 12 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Skyrim with mods does not count as a retail game, which was my stipulation

with enough modding you can get any game to break any configuration of hardware, including your 3GB 7970

This is kind of roundabout thinking; they don't code a game so that it won't run on gaming machines. Developers and modders will, however, provide ways to leverage the bleeding edge.
 
GTX 680 is mid range?

Absolutely. Regardless of what NVIDIA calls it or charges for it, this is a replacement for the GTX 570. The high end cards need to deliver 2560x1600, 3D Vision (basically 2 x vertical resolution), and 2D/3D Vision Surround (5760 x 1080 and up). Again, I'm very impressed by the 680. It simply doesn't deliver 1600p performance superior to the GTX 590 (which is roughly the same as my SLI 570s).
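For a sense of scale, here's the raw pixel math behind those targets (a quick back-of-the-envelope sketch using only the resolutions named above):

```python
# Pixel counts for the display targets mentioned above, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1600p": (2560, 1600),
    "surround": (5760, 1080),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    ratio = count / pixels["1080p"]
    print(f"{name:>8}: {count:>9,} pixels ({ratio:.2f}x 1080p)")
```

Surround pushes three times the pixels of a single 1080p screen, and 3D Vision roughly doubles the work again by rendering the scene once per eye, which is the basis of the argument that high end cards have to deliver at these resolutions.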
 
It was good enough for 6950 and 6970 2GB models. In Crossfire that is.

It was the best you could get then, but it wasn't enough. Hardcore eyefinity/surround enthusiasts moved to 3GB 580s when they became available because they were hitting VRAM walls. They'll stick to 7970s until there is a 4GB 680 around, if it is even faster at those resolutions (doesn't look like it will be).
 
Absolutely. Regardless of what NVIDIA calls it or charges for it, this is a replacement for the GTX 570. The high end cards need to deliver 2560x1600, 3D Vision (basically 2 x Vertical resolution), and 2D/3D Vision Surround (5760 x 1080 and up)
So the top end card will be called GTX 685? GTX 690? If it's called GTX 780 does it make the GTX 680 high end for this generation?
 
All in one fell swoop, you're completely missing the point I was trying to make as well as reinforcing my argument: without mods, the VRAM advantage of a 7900 is moot when playing those games on a single monitor. (These games will also be faster on a 680 unless you're running multiple monitors; again, because there are no mods, the extra VRAM will not help.)

Nah, your statement was


Skyrim with mods does not count as a retail game, which was my stipulation

with enough modding you can get any game to break any configuration of hardware, including your 3GB 7970

You said any game would break any configuration of hardware with mods. Which is 100% incorrect.

You are correct, the VRAM advantage is moot. I had a 2GB 6950 which was more than enough, too. You just made an incorrect blanket statement, that's all 🙂
 
Absolutely. Regardless of what NVIDIA calls it or charges for it, this is a replacement for the GTX 570. The high end cards need to deliver 2560x1600, 3D Vision (basically 2 x Vertical resolution), and 2D/3D Vision Surround (5760 x 1080 and up)

Because random internet man says so!
 
Because random internet man says so!

Did you see the benchmarks? 1600p performance is clearly inferior to the GTX 590. You don't need to take my word for it. Look at the data:
[Image: 680at1600p.png]

Source: http://videocardz.com/31116/nvidia-geforce-gtx-680-performance-comparison

And there is little doubt the GK110 is on its way.
 
That's being thrown around here as fact, left and right. And left again.

On one hand, this chip is much smaller than Nvidia's recent flagships, has a 256-bit memory bus, and loses to the HD 7970 at very high resolutions and in GPU compute. On the other hand, there's no proof that Nvidia is actually planning to release a bigger flagship; however, given how poor GPU compute is on the GTX 680, I think a bigger Kepler chip is likely coming out eventually.
 
I've heard that games currently can't use more than 4GB total RAM including VRAM. Can anyone clarify if this is true? If it's true, I can't see 3GB cards being all that useful.
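For what it's worth, that 4GB figure usually traces back to 32-bit game executables: a 32-bit process only has 4 GiB of virtual addresses, and any VRAM the driver maps into the process comes out of that same budget. A rough sketch of the arithmetic, assuming the default Windows 32-bit user/kernel split and a made-up 1 GiB VRAM mapping:

```python
# Why a 32-bit game tops out near 4GB: every byte it can touch, whether
# system RAM or mapped VRAM, must fit into one 32-bit virtual address space.
GIB = 1024 ** 3
total_addressable = 2 ** 32            # 4 GiB of virtual addresses in a 32-bit process
user_space = total_addressable // 2    # default Windows split: kernel reserves half
vram_mapped = 1 * GIB                  # hypothetical VRAM aperture mapped into the process
left_for_game = user_space - vram_mapped
print(f"addressable:        {total_addressable // GIB} GiB")
print(f"user space:         {user_space // GIB} GiB")
print(f"left after mapping: {left_for_game // GIB} GiB")
```

In practice drivers only map a slice of VRAM at a time, and /LARGEADDRESSAWARE builds can claim more than 2 GiB, so the exact numbers vary; the point is just that a 32-bit game can't fully exploit a 3GB card plus system RAM at once.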
 
People saying 3GB doesn't matter aren't the kinds of people who buy once and hang onto the card for 3+ years before upgrading again. 3GB VRAM and better high-res performance is more futureproof than 2GB with worse high-res performance. Esp. with 3D thrown in for good measure.

On the other hand: People who upgrade frequently don't really need that 3GB unless they are at very high res or 3D. Also, adaptive VSync, GPU Boost, PhysX, and CUDA may be attractive to some people.
 
On one hand, this chip is much smaller than Nvidia's recent flagships, has a 256-bit memory bus, and loses to the HD 7970 at very high resolutions and in GPU compute. On the other hand, there's no proof that Nvidia is actually planning to release a bigger flagship; however, given how poor GPU compute is on the GTX 680, I think a bigger Kepler chip is likely coming out eventually.

There has got to be one coming. They need that chip they can slap on a card and sell for $5000. GK104 is not that chip. I sure hope it is coming. 8800GTX to GTX 280 = massive increase, GTX 285 to 480 = massive increase, 580 to 680 = increase :sneaky:
 
So the top end card will be called GTX 685? GTX 690? If it's called GTX 780 does it make the GTX 680 high end for this generation?

I tend to think of generations as being either new nodes or entirely new architectures, depending on the situation. GF110 and Cayman were not, to me, new generations. But I considered the GTX 280, despite being on the same process node as G92, a new generation.
 
I tend to think of generations as being either new nodes or entirely new architectures, depending on the situation. GF110 and Cayman were not, to me, new generations. But I considered the GTX 280, despite being on the same process node as G92, a new generation.
Maybe they will release BigKepler in December to compete with whatever AMD brings out around that time.
 
It's a subpar nvidia flagship though imo, and using Tom's results, is less of an improvement over the 580 than the 7970 was over the 6970.

So when this card is rebadged as the gtx760ti, and has a reference core speed of 1150mhz, will it be above par then? 😀 😀 😀 😀 😀 😀
 
So when this card is rebadged as the gtx760ti, and has a reference core speed of 1150mhz, will it be above par then? 😀 😀 😀 😀 😀 😀

Like I said about the HD 7980 thing - just give me the bios file and a sharpie, watch me flash my GTX 680 over!

Who didn't do that with their Radeon 9500 or 8800 GTX? You know you did it! Haha.
 
Maybe they will release BigKepler in December to compete with whatever AMD brings out around that time.

I think they will release big Kepler when they consider it to be ready. I still expect it to come out sometime this fall. Our best bud Charlie had an article in February saying it taped out, so it probably taped out at the beginning of February or the end of January. Tack on 6-8 months for testing and a few respins, and that puts it in the August-October time frame. I fully expect that beast to top GK104 by 40%.
 
You tell me, is that what it would take to get you to buy it ?

I think these new high end cards, despite being "high end" for what is available right now, need to be around $400 for me to consider purchasing. But I feel like by the time hd7970 and gtx680 get down to $400, refreshes will literally be 1-2 months away. So then, to answer your question, yes I think that is what it will take to get me to buy one of these cards. Either that or lots of overtime at my work.
 
Hopefully this clears up some confusion about Nvidia releasing a high end part later this year. Apologies if already posted.
Gibbo said:
NVIDIA won't release a faster single GPU this year, the next faster product from both AMD and NVIDIA shall be dual GPU based.

source
 
So did the official MSRP of $499 get laid out somewhere in this thread yet? $499 HERE That's a nice little surprise. Nvidia launching a GPU that is about 10-15% faster and $50 cheaper to boot.
 
The point is simple for those who don't think this is a mid-range GPU being sold at enthusiast prices: Enthusiast GPUs have always focused on giving the highest performance at the highest resolution and AA. 1080p is not enthusiast resolution, sorry for those who think it is.

The fact that the performance delta drops significantly at 1600p (and, by extension, worse at 1080p x 3) should really tell you it's a mid-range GPU, aimed at 1080p performance, castrated in DP and compute performance. Its traits are reminiscent of the GTX 460, 560, and 560 Ti; we've seen it before, we can smell it. It smells like a mid-range product being sold at high prices. The same goes for the 7970, which is why I've always said its prices were ridiculous.

The architecture is great, amazing perf/W. But the entire line-up from both companies is ~$100 too high.

Really, even 7870 with its tiny GPU is being sold near $400, that's just silly season.

Edit: So NV can't get their gk110 out this year? Sad.
 
Agreed with all but the first bolded juvenile crap and the second, which is bull. Nvidia delivered twice the performance increase on 65nm and 40nm that they have on 28nm.

This card is impressive because it raised the bar @ 1080p and being the fastest card is worth something. Bringing new levels of performance that were not available before is awesome. It's a subpar nvidia flagship though imo, and using Tom's results, is less of an improvement over the 580 than the 7970 was over the 6970.

Yeah I care about 1600P, so I guess I am biased. :sneaky: Also, very surprised at nvidia shifting their emphasis from delivering the most performance they can, to lower performance but more efficiency. I don't like noise/heat, but don't care about power consumption, power is cheap in Canada - we make tons of it 😀

While it's subpar for an nVidia flagship, it's also well above where their midrange performance has historically settled, which bodes very well for any prospective "true" monster flagship part: if we can correlate GF104/114 performance to GF100/110 performance, we can extrapolate what we might get from GK100/110 based on GK104.


You said any game would break any configuration of hardware with mods. Which is 100% incorrect.

You are correct, the VRAM advantage is moot. I had a 2GB 6950 which was more than enough, too. You just made an incorrect blanket statement, that's all 🙂

Wow, you are taking things way too literally. I assumed anyone here would know that I meant any modifiable game could be made to break any hardware, which was what we were talking about.

And really just about any PC game can be modded. The games you list simply aren't common for mods. Heck, I've already found mods for Witcher 2 and Mass Effect with some quick Google-Fu

http://kotaku.com/5880354/mod-makes-mass-effect-on-pc-look-absolutely-incredible/gallery/1

http://www.gamingreality.com/2011/06/witcher-2-mods-list.html

My point was that, if someone with the know-how was determined enough, they really could create a mod for any game that would make even the most powerful rigs crawl.

It doesn't take a Rocket Surgeon to figure out why we don't see 3rd party image quality mods benchmarked (at least not on a regular basis as I'm sure you'd be quick to point out that someone, somewhere, at some point in time has done such tests, and I am sure it has happened)


People saying 3GB doesn't matter aren't the kinds of people who buy once and hang onto the card for 3+ years before upgrading again. 3GB VRAM and better high-res performance is more futureproof than 2GB with worse high-res performance. Esp. with 3D thrown in for good measure.

On the other hand: People who upgrade frequently don't really need that 3GB unless they are at very high res or 3D. Also, adaptive VSync, GPU Boost, PhysX, and CUDA may be attractive to some people.
It's generally not very economical to blow your wad all at once and expect it to last: by the time the extra VRAM is truly a difference maker, chances are you're going to be turning settings down regardless, because the GPU will have lost its edge and become a bottleneck of its own accord. It really would be better to spend half the money twice as frequently to keep up to date with midrange parts than to go all in with high end parts and cling to them for years and years.

Of course this is where AMD still has the advantage (and really has for some time) as they actually have "midrange" parts available for this generation and have typically had very strong offerings in that niche. Even though we could argue they are currently overpriced, they currently lack competition.


This is kind of round about thinking, they don't code a game so that it won't run on gaming machines. Developers and modders will however provide ways to leverage the bleeding edge.
That was my point, and it was also my point that the line has to be drawn somewhere; otherwise we can always cook up some scenario where one option is better than another, when those extreme scenarios really don't matter for the vast majority. If we throw enough 3rd party textures at a game, of course the GPU with more memory is going to benefit from it. However, most people simply won't care because most people aren't going to be running those mods.
 