There are already plenty of links showing OC vs. OC reviews and benchmarks (a max-OC 7970 beat a +135% GTX 680); you can dig them out of the old review thread yourself. This is beating a dead horse.
"The implementation of hardware monitoring and power/performance balancing doesn't negatively impact framerates though and at 1920x1080 the GTX 680 almost always exceeds the performance of a reference clocked 7970, in fact it tends to match or exceed the 7970 running at its maximum rate for air cooling." ~ Hardware heaven
On average, an overclocked GTX 680 and an overclocked HD 7970 trade blows. However, when it comes down to modern vs. older games, the GTX 680 wins in modern titles while the HD 7970 wins in older ones (Metro 2033, Crysis 1/Warhead).
In BF3, The Witcher 2, Dragon Age 2, Batman: Arkham City, The Old Republic, Skyrim, and Dirt 3, the GTX 680 sweeps the 7970 without any problems up to 1920x1200. At 2560x1600, both cards are too slow to deliver playable framerates in the most demanding games without a second card, so it's a moot point. Considering the HD 7970 costs more in the US, it's a non-starter.
Once aftermarket 680s launch, any overclocking advantage the 7970 has right now will completely disappear. Here is a Zotac that hit 1400MHz on air.
We already know from the MSI Lightning reviews that it takes at least a 1050-1070MHz HD 7970 just to match a stock 680.
Also, AMD continues to focus heavily on texture optimization, which is an acceptable practice in itself, but not when it reduces image quality and unfairly inflates performance on the HD 7000 series of cards. AMD was criticized immensely for this when the HD 6870 launched with cheating drivers that produced inferior texture quality compared to the HD 5850/5870 in exchange for a 5-6% performance increase, and now the same thing is happening again with the HD 7970:
"Considering the data we’ve seen up until this point, we have to come to the disturbing conclusion that AMD's Radeon HD 7000-series cards currently enjoy more aggressive benchmark results at their default driver settings, resulting in reduced texture quality compared to the Radeon HD 6000s and GeForce GTX 500s. Using the highest Catalyst A.I. setting appears to be the remedy, though it costs additional speed." - Toms Hardware
It looks like AMD released a driver that fixes this issue. However, shouldn't image quality be nearly a non-factor in 2012? Why does a professional review website have to go out of its way to investigate such a thing? Comparable image quality should simply be a given.
It's not about whether the issue was fixed; it's that, because this has now happened with both the HD 6870 and the HD 7970, we can't automatically assume AMD won't do it again. That's disappointing.
You then do not understand what a baseline is. For it to be a baseline, it needs to be utterly repeatable on different cards, in the same game, with the same settings. GPU Boost prevents that; the boost numbers in the [H] review were all over the place.
All benchmarks have a 1-3% variation as a result of margin of error. GPU Boost working out of the box is exactly how the consumer will receive these cards, so disabling GPU Boost runs counter to how 100% of GTX 680 owners will actually use them.
On the other hand, manual overclocking of an HD 7970 or GTX 680 is not something every user will perform. A fair comparison would entail:
1) Out of the box GTX680 vs. out of the box HD7970 with no manual overclocking at all
OR
2) Manual overclocking on GTX680 vs. manual overclocking on HD7970.
The baseline performance for any product is the performance the user gets after taking the component out of the box and putting it in their system. Any speed above that which results from manual adjustments is overclocking.
Just wait until AMD implements the same feature; then all these arguments will be put to rest. Dynamic overclocking is the future because it allows for a better balance between performance and power consumption.
Two or three cards have been compared in this thread; that is not an adequate sample size. As has already been discussed, [H]'s card hit 1200MHz. Either they were sent a ringer by Nvidia for their review or there are cards out there that boost a significant amount. Such a variable range of stock performance makes benchmarking very difficult and gives no true baseline of what the end user can expect from a stock card. 1097MHz to 1200MHz is a ~9% difference; that's more than a little variance.
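For what it's worth, here's the arithmetic behind that ~9% figure (a minimal Python sketch using only the two boost clocks quoted in this thread; actual sample-to-sample spreads will vary):

# Rough check of the boost-clock spread discussed above.
# 1097 MHz and 1200 MHz are the two boost clocks cited in this thread.
low_boost_mhz = 1097
high_boost_mhz = 1200

spread_pct = (high_boost_mhz - low_boost_mhz) / low_boost_mhz * 100
print(f"Sample-to-sample boost spread: {spread_pct:.1f}%")  # prints ~9.4%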
If HD 7970s with aftermarket coolers were $499, then we could start having these kinds of discussions. As it stands, it takes a cream-of-the-crop HD 7970 that can reach 1250-1280MHz on air to beat a factory-default 680. Since most 680s can hit 1200MHz+ with a GPU offset, that brings the 680 back to parity with an overclocked 7970 while retaining a lower price, lower power consumption, and more features.
And then there are the aftermarket non-reference 680s that will hit 1300MHz+. The HD 7970 is still a good card, but until there is a price cut it comes down to which specific games/programs you run, since an overclocked-on-air 7970 still cannot convincingly beat a manually overclocked 680.
Also, I think the debate is more relevant for people running 2560x1600 screens. At 1920x1080 or 1920x1200 it's not that close; the GTX 680 has a healthy 15%+ lead in more recent games at those resolutions.
I think it's best to buy based on the games/programs you run. If you play Anno 2070, Bulletstorm, Metro 2033, or Crysis 1/Warhead, get the AMD card. If you play BF3, Crysis 2, Batman: Arkham City, Dirt 3, or Skyrim, get the 680. For current HD 7970 owners, there is no point in wasting money on a side-grade.
The choice is really quite simple. When AMD and NV had similarly priced cards (e.g., X850 XT PE vs. 6800 Ultra), this is how we used to recommend them.
