OFFICIAL KEPLER "GTX680" Reviews


bryanW1995

Lifer
May 22, 2007
11,144
32
91
Anandtech. Well that settles it.

Anyway, sorry about the misunderstanding on the clock scaling thing. I thought you were talking about scaling on the new GPUs.

Or, you could click a few buttons and check out every other review on the web. His statement was pretty tame, really, he just said that gtx 680 wins by a little bit.
 

Crap Daddy

Senior member
May 6, 2011
610
0
0
Both cards can run with the same clocks. It seems 7970 is faster at the same clocks.

They can't. Or it's very damn hard to check this. You see, the GTX680 is not running at a fixed clock; it is constantly adapting to load. You cannot fix the clock speed to a certain value.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Congrats!

Now I have to decide if I want to do some benches on my current setup to compare to the new one. On and off the fence, would rather not find any situations where what I have now is faster...

I'll probably do a few easy to run canned ones. I have to yank the cards out to sell them tomorrow, so I'll get a few in for comparison's sake.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Both cards can run with the same clocks. It seems 7970 is faster at the same clocks.

Who knows with DOC? Who cares? All that matters is which card is faster, wins price/performance, runs cooler, and draws less power: the GTX 680 wins them all.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
At EVGA's site, they do plan on releasing models with the names Classified / FTW.

When you click on the gtx 680 secrets (lol) you see more information.
http://www.evga.com/articles/00669/#GTX680

  • 8+6 Pin/8+8 Pin Power Connectors
  • Available on EVGA GeForce GTX 680 Superclocked Signature, FTW, Classified and Hydro Copper models, these connectors can provide up to 225/375 watts, delivering better overclock stability.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Well, the overclocking options are somewhat disappointing. I'd be interested to see what Afterburner and the like offer over the next couple of weeks. That, and checking out some overclocking-focused reviews (hopefully AT has one coming; I know [H] does). I'm not sure how I'd feel buying a card that's closer to its tweaking limit.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Now I have to decide if I want to do some benches on my current setup to compare to the new one. On and off the fence, would rather not find any situations where what I have now is faster...

I'll probably do a few easy to run canned ones. I have to yank the cards out to sell them tomorrow, so I'll get a few in for comparison's sake.

I think you did a bunch of benches when you replaced your 1366 though right? I remember you had Nehalem bottlenecking in BF3 in Multiplayer before you jumped to SB-E.

Some interesting notes from Xbitlabs review:

"1) Nvidia’s GeForce 300.99 driver doesn’t look stable as yet. We could observe image tearing in Metro 2033, occasional image defects in Crysis 2 and Battlefield 3, and long test loading times in 3DMark Vantage. Hopefully, driver updates will solve these and other bugs and will also push the graphics card's performance even higher up.

2) When we replaced the default GPU thermal interface with Arctic MX-4, we saw the GPU temperature drop by 4°C at peak load. The top speed of the fan decreased by 120 RPM in the automatic regulation mode at that. (Looks like NVidia's poor TIM quality/application is carried over from Fermi days. :D)

3) Our testbed with GeForce GTX 680 would occasionally not start up at all, emitting two long beeps. This must be some partial incompatibility with the Intel Siler DX79SI mainboard or a defect of our sample of the GeForce GTX 680 card.

4) When checking out the overclocking potential of the card, we found a strong correlation between its stability and cooling. For example, if the GPU temperature was not higher than 70°C (at 85% fan speed), the card's GPU could be overclocked by 15-20 MHz more."

I'm not sure how I'd feel buying a card that's closer to its tweaking limit.

It would feel like:

- A 30% faster HD7950 for $50 more
- A factory pre-overclocked 1050mhz HD7970 for $50 less out of the box
- An 1186mhz air cooled GTX680 that's 5-10% faster than an 1150mhz HD7970 for $50 less
- A $500 overclocked on air GPU that delivers 90% of the performance of a water-cooled 1350mhz HD7970 for $200 less
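Those comparisons boil down to simple price/performance arithmetic. A quick sketch of that math, using the launch prices discussed in this thread but with hypothetical relative-performance numbers that merely mirror the bullet points (they are placeholders, not benchmark results):

```python
# Hypothetical price/performance comparison. Prices are the launch MSRPs
# mentioned in the thread; the "perf" values are illustrative placeholders
# (relative to a stock-ish HD 7970 = 1.00), NOT measured benchmark data.
cards = {
    "GTX 680 @ 1186 MHz (air)":   {"price": 500, "perf": 1.07},
    "HD 7970 @ 1150 MHz (air)":   {"price": 550, "perf": 1.00},
    "HD 7970 @ 1350 MHz (water)": {"price": 700, "perf": 1.18},  # card + water loop
}

def perf_per_dollar(card):
    # Normalize to perf-units per $1000 so the numbers are easy to read
    return card["perf"] / card["price"] * 1000

for name, card in cards.items():
    print(f"{name}: {perf_per_dollar(card):.2f} perf per $1000")
```

With numbers in that ballpark, the air-cooled GTX 680 comes out ahead on perf per dollar, which is the point the bullets are making.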
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Who knows with DOC? Who cares? All that matters is which card is faster, wins price/performance, runs cooler, and draws less power: the GTX 680 wins them all.

For enthusiasts the most important metrics are OC performance and OC multi-gpu performance.

OC performance is very close; if you get a good sample, the 7970 is actually faster. I haven't seen any multi-GPU tests with OC. The only SLI test I have seen actually puts 7970 CF ahead of GTX680 SLI.

Anyway are there any 4-way benches against 7970 4-way? I'm getting an upgrade bug...
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
4) When checking out the overclocking potential of the card, we found a strong correlation between its stability and cooling. For example, if the GPU temperature was not higher than 70°C (at 85% fan speed), the card's GPU could be overclocked by 15-20 MHz more."
Overclocking has always worked like that. That's why people turn up their fans to go for maximum clocks. This is nothing new.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Actually the voltages I listed are very realistic for HD7970 cards. Did you take a look at average overclocks on air for cream-of-the-crop HD7970s?

Here is an Asus DirectCU II TOP - one of the most stringent bins of HD7970s in the industry:

The Makings Of ASUS TOP Graphics Cards.

"The labs then use the selected GPUs to assemble complete graphics cards, with PCBs and memory. The products are put under high resolution automated optical inspection (AOI) hardware for an extended certification stage that ferrets out any defects or flaws in manufacturing (special attention given to soldering and contact points, such as data interfaces)."

That highly-binned 7970 chip needed 1.3V to reach 1250mhz on air. We are talking cream-of-the-crop 7970 chips. At 1150mhz, HD7970 can't beat GTX680 and costs more. That means you are really getting into insane overclocking gambling, trying to find a chip that'll do 1250mhz easily, or you have to pay $580-600 for a non-reference 7970 that has been specifically binned (XFX Double D, Asus DirectCU, MSI Lightning).

vs. a reference GTX680 with no chip binning, no aftermarket cooling, no beefed-up components, for less money?

It's an assumption on your part about binning. High overclocks need higher voltages, which means you don't do it with a high-quality ASIC. You actually want something middle-of-the-road to high-average quality so you can feed it voltage.

From everything I have seen, Nvidia and AMD are back in the same ballpark, and it's a small park. Normally Nvidia leaves zero question as to who dominates because of their big-die strategy. I think it's funny that it took Nvidia three generations of cards to figure out that AMD's strategy of smaller, more efficient dies is the way to go. The added bonus is that the profit margins are way better.

One thing that might not be relevant to gamers, but is relevant to card owners, is Bitcoin. My AMD cards make me money mining Bitcoin. Every argument that is pro-Nvidia is laughable in that context because Nvidia is so bad at hashing. I've already made back 25% of my card cost in Bitcoin, and that is net of power costs. So my 7970 has cost me $412 and continues to drop in cost every day. All arguments are already out the window at that price point. Obviously work on 680 CUDA miners is about to start as soon as the coders get their cards. HardOCP didn't run their Bitcoin analysis like they did with the 7970. I don't know if that is bad or good.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Wow.

gg, AMD. Your only weapon now is price cuts. I guess you should be used to being in this situation, what with Intel and all.

EDIT: Wait, that can't be right, RS. I think those numbers are wrong because of AT's table showing how they are already beyond that voltage well before 1.2GHz: http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/4

AMD had to have seen this possibility coming from Nvidia. I think they priced the 7970 as high as they possibly could to take advantage of coming to the market first, being new, and being the fastest, even if it was for only a short time.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
One thing that might not be relevant to gamers, but is relevant to card owners, is Bitcoin. My AMD cards make me money mining Bitcoin. Every argument that is pro-Nvidia is laughable in that context because Nvidia is so bad at hashing. I've already made back 25% of my card cost in Bitcoin, and that is net of power costs. So my 7970 has cost me $412 and continues to drop in cost every day. All arguments are already out the window at that price point. Obviously work on 680 CUDA miners is about to start as soon as the coders get their cards. HardOCP didn't run their Bitcoin analysis like they did with the 7970. I don't know if that is bad or good.

That's because you have very cheap electricity in the US. In my country bitcoin mining became unprofitable a long time ago. I don't know about now but I doubt that something changed for the better.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I think the auto-OC features are good for non-enthusiast overclockers. Just like Sandy Bridge's turbo boost.

A bit wonky that AMD is now the forerunner in GPU compute and apparently now more amenable to clock/voltage tweaking. Having no x86 business to keep fed, NVIDIA has taken the opportunity to focus this 680 chip 100% on video game performance.

Soon we will see if AMD was milking with their default 7970 clocks or whether it's just a consequence of the current state of 28nm. Bare minimum expectations would be for 1GHz retail box 7970s at 680 prices.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
lol, seriously, you just went there? Please, stop acting like a fanboy. Fermi was way more power hungry than Cypress/Evergreen, but back then it was fine; now that the tables are turned, so are your priorities. Who cares about power consumption with high-end cards?

Anyway, 7970 power consumption isn't bad; it's that GTX 680 power consumption is just great.


bwahaha we have to listen to it for 1 1/2 years because the slower card used 50 less watts, but now that AMD has the slower card that uses more power we can't talk about it because we'd be fanboys if we did? :thumbsup:
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Even 1.3GHz with boost will be very hard to find.
It seems 680 and 7970 can clock head to head. There is not much difference there.
What's the performance clock for clock?

From what I've seen, they're very close to the same clock/clock, though with the extra memory headroom I'd expect 7970 to pull away a bit at higher clocks. Doesn't matter, however, because even if they're identical, gtx 680 still rules in power/heat/noise.
 

mak360

Member
Jan 23, 2012
130
0
0
bwahaha we have to listen to it for 1 1/2 years because the slower card used 50 less watts, but now that AMD has the slower card that uses more power we can't talk about it because we'd be fanboys if we did? :thumbsup:

lol, just a couple months ago, am sure i can remember you saying the exact same thing, that people don't buy high end for power and that they should buy lower/mid end for that.

ha-ha good one, long time laugh - kthxbai

Edit: Bring on the overclocking since nvidia is going out of its way to stop/pull any site going over the top
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
WOW, this just got pulled http://www.overclock3d.net/reviews/gpu_displays/nvidia_gtx680_review it was showing overclocks, so got pulled

maybe nda or something (which explains why there's no overclocks)


This is ridiculous, man. Is Nvidia pulling web reviews that have any mention of overclocking on the 680? What gives?

Kyle at HardOCP specifically mentioned that Overclocking wouldn't be covered in their initial review. They've never not covered OC's in a GPU review to my knowledge.
 
Last edited:

aceshigh23

Member
Oct 20, 2008
30
0
0
took the liberty to arrange some review links that are not in the OP


http://techreport.com/articles.x/22653/1

Thanks for this review. I loved the methodology that techreport is using. I'm mostly just looking for the best card for Skyrim (still haven't purchased it yet because I've been waiting to refresh my 2 year old system), and their review makes me much more comfortable with pulling the trigger on a 680 with my 2560x1600 monitor.
 

mak360

Member
Jan 23, 2012
130
0
0
Am not sure yet, but we will know what's going on by tomorrow, am sure (am guessing most are still under NDA)

Edit: that can't be right, i thought the NDA expired today... hhmm
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
One thing that might not be relevant to gamers, but is relevant to card owners, is Bitcoin. My AMD cards make me money mining Bitcoin. Every argument that is pro-Nvidia is laughable in that context because Nvidia is so bad at hashing. I've already made back 25% of my card cost in Bitcoin, and that is net of power costs.

Yep I have made back the full cost of one of my 6950s in the equivalent of about 3.5 months of mining at ~$5/bitcoin. The mining actually makes it hard for me to switch to nV now. Hopefully people have better results with the GTX680.
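For anyone curious how that payback math shakes out, here's a rough sketch. Only the ~$5/BTC figure comes from the posts above; the card cost, daily BTC yield, wattage, and electricity prices are made-up illustrative inputs, not anyone's actual numbers:

```python
def payback_days(card_cost, daily_btc, btc_price, watts, kwh_price):
    """Days until mining revenue, net of electricity, repays the card.

    All inputs are hypothetical; this just illustrates the arithmetic.
    """
    daily_revenue = daily_btc * btc_price
    daily_power_cost = (watts / 1000) * 24 * kwh_price
    daily_net = daily_revenue - daily_power_cost
    if daily_net <= 0:
        return float("inf")  # mining at a loss: the card never pays for itself
    return card_cost / daily_net

# Cheap electricity (~$0.10/kWh): payback in roughly four months
print(payback_days(card_cost=250, daily_btc=0.5, btc_price=5.0,
                   watts=200, kwh_price=0.10))  # ~124 days

# Expensive electricity (~$0.60/kWh): power eats all the revenue
print(payback_days(card_cost=250, daily_btc=0.5, btc_price=5.0,
                   watts=200, kwh_price=0.60))  # inf
```

With numbers like these, cheap power gives the "few months to break even" result reported above, while a high enough electricity price turns the daily net negative, which matches the unprofitability mentioned earlier in the thread for some countries.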
 
Last edited:

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
Or, you could click a few buttons and check out every other review on the web. His statement was pretty tame, really, he just said that gtx 680 wins by a little bit.

He said that they win most of the time (first 95% of the time, then 90%). He wasn't referring to the performance difference, but to the frequency of their wins.

aceshigh23 said:
Thanks for this review. I loved the methodology that techreport is using. I'm mostly just looking for the best card for Skyrim (still haven't purchased it yet because I've been waiting to refresh my 2 year old system), and their review makes me much more comfortable with pulling the trigger on a 680 with my 2560x1600 monitor.

Definitely. I don't think I've seen anyone else trying to address how often games experience very low frame rates or frame latencies.



mak360 said:
WOW, this just got pulled http://www.overclock3d.net/reviews/g..._gtx680_review it was showing overclocks, so got pulled

maybe nda or something (which explains why theres no overclocks)

Damn ... that was the most balanced review, what with all those overclocked cards in every game. It also happened to be the poorest showing for overclocking compared to Tahiti out of all the reviews I've seen.
 
Last edited: