
Introduction - 5/14/12
I will be running a few games and benchmarking applications to see what type of increase I get. I'm guessing anywhere from 0% to 20%. Will also look for any improvements in minimums or microstutter (honestly not sure if I'm experiencing it, but this should help determine that). Also looking to drop idle power draw by 25w and load power draw by 125w. And yes, I'm ready for the "sidegrade" comments - I'm doing this mostly for the power use reduction, and just for the fun of it. I don't actually have any issues with my 5850s.
You can see my rig specs in my sig - will be running this on my i7-860 at 3.5GHz, and unless I start to see serious bottlenecking, won't be increasing it above that. The HD5850s were clocked at 850/1200, the GTX670 I'll keep at stock to start, and then I'll see what I can do with the limited options allowed by nVidia.
Here are the applications I'll be benching, all at 1920x1200 (except for 3dMark11):
Battlefield 3 (single-player and multiplayer)
Metro 2033
Dragon Age II
Borderlands
3dMark11
Heaven 3.0
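Since I'll be looking at minimums and microstutter, here's roughly how I plan to crunch the frame-time logs. This is just a quick Python sketch assuming a FRAPS-style frametimes CSV (frame index plus cumulative time in milliseconds); the filename and exact column layout are placeholders for whatever your capture tool actually spits out.

```python
# Quick-and-dirty frame-time analysis: average FPS, worst-1% minimums, and
# frame-to-frame variation as a rough microstutter indicator.
# Assumes a FRAPS-style CSV: frame index, cumulative time in ms.
import csv

def analyze(path):
    cumulative = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            cumulative.append(float(row[1]))  # cumulative time in ms

    # Per-frame durations in ms, in chronological order
    frame_ms = [b - a for a, b in zip(cumulative, cumulative[1:])]

    avg_fps = 1000.0 * len(frame_ms) / (cumulative[-1] - cumulative[0])
    # "Minimum" FPS taken as the slowest 1% of frames
    worst_1pct = sorted(frame_ms)[int(len(frame_ms) * 0.99)]
    # Microstutter proxy: average jump between consecutive frame times
    jitter = sum(abs(b - a) for a, b in zip(frame_ms, frame_ms[1:])) / (len(frame_ms) - 1)

    print(f"avg {avg_fps:.1f} fps, 1% low {1000.0 / worst_1pct:.1f} fps, "
          f"avg frame-to-frame jitter {jitter:.2f} ms")

analyze("bf3_frametimes.csv")  # hypothetical capture from a BF3 run
```

High frame-to-frame jitter with a decent average is basically what microstutter looks like in the numbers, so this should tell me whether I actually had it on the 5850s.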
Conclusions - 5/15/12
Overall, the 670 just about met my expectations. Surprisingly, both at stock and overclocked, it was beaten by the Crossfired 5850s in several games, but it handily beat the 5850s by 12-17% in the benchmarks, while also notching better minimums in practically all tests. Idle power dropped 23w, and load power dropped about 125w depending on the game.
In my opinion, many reviewers have praised the 670 for both its overclocking and its efficiency, but on the overclocking side they're partly making the same mistake any of us might make out of frustration with the initial overpricing of the HD7000 series - they've overcompensated.
The reference GTX670 is not a great overclocker, relative to modern cards. While some reviewers did better than my 13% offset, I haven't seen many reviews where the 1058 boost clock was raised to more than 1234, which is a 16% overclock (calling that a 35% overclock over the 915 base clock is just disingenuous, BTW, and reviewers should know better). Same for the memory - the absolute highest I've seen is 7GHz, or about 17%, and I'd bet that wasn't stable. I'm settling at 10%, with a bit of a safety margin built in. At one time ~15% was considered pretty good (that's about what I got out of my 5850 cores), but compared to the better 7000-series cards (7970, 7850), it isn't even close.
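For anyone wondering where those percentages come from, here's the arithmetic using the reference clocks (915 MHz base, 1058 MHz rated boost) and the ~1234 MHz overclocked boost figure:

```python
# Reference GTX 670 clocks vs. the overclocked boost being quoted in reviews.
base_mhz = 915           # reference base clock
rated_boost_mhz = 1058   # reference rated boost clock
oc_boost_mhz = 1234      # overclocked boost the better reviews reached

print(f"vs rated boost: {100 * (oc_boost_mhz / rated_boost_mhz - 1):.1f}%")  # ~16.6%
print(f"vs base clock:  {100 * (oc_boost_mhz / base_mhz - 1):.1f}%")         # ~34.9% - the misleading number
```

Same overclock, two very different-sounding numbers, depending on which baseline you pick.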
On the other hand, the GTX670 is massively efficient. The highest power draw I've seen is 280w, compared to 410w for my crossfire pair (and about 265w for a single OC'd 5850), with at least equivalent performance. That's an amazing improvement over generations past in terms of performance per watt. When I eventually SLI this card, sometime next year, I will again have performance to spare over any existing single card, and will still be well under 450w at load. I am now more convinced than ever that the slowing pace of GPU improvement makes Crossfire/SLI absolutely the best way to experience next-gen performance today. I truly enjoyed my 5850 Crossfire setup - it dramatically improved my BF3 gaming experience. I now have about the same performance in a single card, and when the time comes, I'll be ready to SLI for the next generation of games.
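To put the efficiency claim in rough numbers, using my own load figures from above and assuming roughly equal frame rates between the two setups (which is about what I saw), here's a quick sketch - the 60fps figure is just a placeholder:

```python
# Rough performance-per-watt comparison at equal frame rates.
fps = 60.0                # placeholder frame rate, assumed equal for both setups
gtx670_watts = 280.0      # highest load draw I saw with the 670
crossfire_watts = 410.0   # load draw with the overclocked 5850 pair

print(f"GTX 670:        {fps / gtx670_watts:.3f} fps/W")
print(f"5850 CrossFire: {fps / crossfire_watts:.3f} fps/W")
print(f"Improvement:    {100 * (crossfire_watts / gtx670_watts - 1):.0f}%")  # ~46% better perf/W
```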
Update - 5/16/12
With the GTX600 series more than ever, enabling vsync (or Adaptive Vsync in this case) has very tangible performance benefits. In addition to reducing tearing, it keeps temps down, and while lower temps are nice on any card, with the way 600-series boost works they translate directly into performance. The GTX670 automatically drops voltage from 1.175 to 1.162 as soon as it hits 70C, which in my testing makes a boost above 1200 impossible. That makes overclocking a whole new ballgame: you absolutely have to run a high fan speed or buy a card with a non-reference open-air cooler.
When the card is under 70C with Adaptive Vsync on, and a difficult-to-render scene appears, boost can shoot up dramatically to handle the rendering chores until that scene passes, or until "boost runs out" (i.e., temps get too high). So to all those 670/680 owners out there, seriously consider enabling Adaptive Vsync - it really was made to work hand-in-hand with GPU Boost. You weren't going to see those extra frames (>60fps) anyway, except in your benchmarks.
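If you want to watch the boost/temperature interaction yourself, a simple way is to poll the GPU while you play. Here's a minimal Python sketch that shells out to nvidia-smi once a second - the query fields exist in nvidia-smi's query interface, but not every field is reported on every GeForce card or driver, so treat it as a starting point:

```python
# Poll GPU temperature, core clock, and load once a second to watch boost
# behaviour around the ~70C threshold. Adjust or drop any fields your
# driver/card doesn't report.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.gr,utilization.gpu",
         "--format=csv,noheader,nounits"]

while True:
    out = subprocess.check_output(QUERY, text=True).strip()
    temp_c, core_mhz, util_pct = [s.strip() for s in out.split(",")]
    print(f"{time.strftime('%H:%M:%S')}  {temp_c} C  {core_mhz} MHz  {util_pct}% load")
    time.sleep(1)
```

Run it in a second window during a BF3 round and you can see the clocks step down right as the card crosses 70C.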
Happy gaming, folks!
Update - 5/17/12
Well, nothing is simple in hardware upgrades, it seems... I realized yesterday that enabling Adaptive Vsync introduces terrible input lag, at least in BF3. This isn't visible in benchmarks, where there's no input involved. For now I've switched back to using the Frame Rate Target in PrecisionX, but that doesn't have the same level of interaction with the 670 drivers, meaning it doesn't reduce clocks to compensate for the lower load, and therefore isn't quite as efficient. Edit: I think the problem was enabling Adaptive Vsync after the game had already been loaded. I just played a few rounds with Adaptive Vsync turned on before launching BF3, and it worked fine with no stuttering. That's unlike regular vsync, which you can toggle at any time from the in-game menu.
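For what it's worth, here's a toy Python sketch of what an application-side frame cap like PrecisionX's Frame Rate Target does in principle (the 5ms "render" is obviously a stand-in): render the frame, then sleep away the rest of the frame budget. The GPU goes idle during the sleep, but unless the driver decides to drop clocks for the lighter load, it stays at full speed - which matches what I'm seeing on efficiency compared to the Adaptive Vsync route.

```python
# Toy illustration of an application-level frame-rate cap: render, then sleep
# away the remainder of the 60fps frame budget. The GPU idles during the sleep,
# but the cap itself does nothing to lower clocks.
import time

TARGET_MS = 1000.0 / 60.0  # 60 fps frame budget

def render_frame():
    """Placeholder for the game's actual rendering work."""
    time.sleep(0.005)  # pretend the frame took 5 ms to render

while True:
    start = time.perf_counter()
    render_frame()
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    if elapsed_ms < TARGET_MS:
        time.sleep((TARGET_MS - elapsed_ms) / 1000.0)
```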