GTX670 Upgrade and Overclocking Review (vs. 5850 crossfire)


Ieat

Senior member
Jan 18, 2012
That's a nice looking card.



That's an amazing out-of-box overclock. Honestly, manufacturers should just stop advertising core clock. It's totally meaningless. If I had known the Windforce had a default boost of 1176 versus 1058 for reference, hell yeah I would have bought it. But instead we see 980 versus 915, which didn't sound like too big a hurdle to overcome.

Yep, my Gigabyte boosts to 1189 out of the box. I've heard of some boosting to 1202. It seems all or most of the Gigabyte cards have a crazy stock boost out of the box.
 

Termie

Diamond Member
Aug 17, 2005
So I managed to get the card up to 1190 boost and 6468 memory (+105 and +130, respectively). Gonna call it good and leave it - it's fast enough.

That FTW edition has a different core/boost ratio than most, maybe because it's based on the 680. The default clock is nearly 100MHz higher, but the default boost is barely any higher than stock (~30MHz). In my opinion, that makes it more like a stock card in performance...and not as much like the Gigabyte card, for which the default boost is out of this world. Still a nice card, no doubt, and you get the upgraded, quieter cooler. And the clocks you got, which are basically what I'm running at, are plenty fast indeed - just slightly faster than a 680.

Awesome review, Termie!

Just one question: the microstuttering that we hear about with dual cards, is it true? Did you notice a difference between the 670 & the 5850s? I totally agree that SLI seems the way to go for the future due to slow video card development, but microstuttering and driver support issues always turned me off.

Another thing about SLIing in the future: won't 2GB be an issue in two years like 1GB is now? So ideally you'd get a 3GB/4GB GTX 670 and SLI that in the future, yeah?

You're welcome! I'll be honest...I'm still not sure I had any microstutter. The OC'd 5850 crossfire versus 670 comparison is a good one because the performance on paper is nearly identical. Overall the 670 is definitely smoother, particularly in rendering explosions. Lower minimums might have been the culprit more than microstutter, but there's no way for me to say for sure. In either case, the difference is not significant enough for me to say people looking for the ultimate value should not go crossfire/SLI. There's just no comparison in terms of price, but I really say this for people who already have one card - it's not as clear a decision for people who are building from scratch. But if you have a single 6870 or 560, for instance, my opinion is that buying another 6870 or 560 is absolutely a better deal than upgrading to a 7850, 7950, or even a 670.

And that brings me to your last question - 2GB versus 4GB. Yes, 1GB is already showing significant limitations, but it has been the standard since at least 2008. 2GB only came about in late 2010, and only showed benefits in 2011. Maybe by 2015, 2GB will be a serious limitation. Given there are no 4GB 670 cards yet, that isn't an option at this point, but 2GB will be good enough for a while, except at ultra resolutions. In that case, I think I'd get a 7970 or a 4GB 680.

Which of the GTX 670 cards have voltage control?

None, and in my opinion, there never will be one. Total control of voltage, including the ability to automatically undervolt when a certain temperature is reached (70C), is central to nVidia's Boost design. That is how nVidia was able to get such amazing clocks out of a chip that likely was never meant for such high frequencies. My hunch is that boost may have been developed after the initial design phase once the performance of the 7000 series was revealed. A 670 at a clock of 915 wouldn't be nearly as impressive against the 7950, for instance.
 

guskline

Diamond Member
Apr 17, 2006
Termie: I have especially enjoyed this thread due to my rigs below (2 5850s in CF vs. a GTX 680). No doubt the GTX 680 is a fast, powerful card, but the 5850s in CF are remarkable considering their age versus the 680. However, the GTX 680 with 2GB of VRAM really shows its muscle in a multi-monitor environment at 5760x1080 overall resolution.
 

Termie

Diamond Member
Aug 17, 2005
Termie: I have especially enjoyed this thread due to my rigs below (2 5850s in CF vs. a GTX 680). No doubt the GTX 680 is a fast, powerful card, but the 5850s in CF are remarkable considering their age versus the 680. However, the GTX 680 with 2GB of VRAM really shows its muscle in a multi-monitor environment at 5760x1080 overall resolution.

Thanks guskline. Before I bought my 670, I saw your comment in another thread stating that 5850CF was very similar to a 680. I didn't quite believe you, so I had to try it myself. You were right. ;)

What is your opinion of the two setups at the same resolution (if you've been able to test that)? In some games, crossfire doesn't scale well or at all, so that's an obvious win for the 670/680. But for all the rest, there's generally just a slight advantage in playability in favor of the 670 (where 5850CF wins in FPS, it usually still doesn't have a minimums advantage). Did you notice this? If so, could it be microstutter? I think there may have been times when I sensed this with my 5850s, as if the screen would get stuck ever so briefly. Again, that could just be minimums, and either way, it was pretty subtle.
 

guskline

Diamond Member
Apr 17, 2006
Termie, the GTX 680 is faster and has an edge in VRAM, but it isn't as obvious with just one monitor at 1920x1080 or below. I don't own a higher-resolution monitor. With three 1920x1080 monitors the 680 wins EASY! That being said, if you have a CF 5850 setup with one monitor at 1920x1080 or below, the performance jump to a GTX 680 is not as clear as it is with a multi-monitor setup.
 

cmdrdredd

Lifer
Dec 12, 2001
That FTW edition has a different core/boost ratio than most, maybe because it's based on the 680. The default clock is nearly 100MHz higher, but the default boost is barely any higher than stock (~30MHz). In my opinion, that makes it more like a stock card in performance...and not as much like the Gigabyte card, for which the default boost is out of this world. Still a nice card, no doubt, and you get the upgraded, quieter cooler.

I have found boost to act a little weird. When it says 1190, it will sometimes run at 1230 or so, and it had a few problems in some testing I did. I had to back it down a bit, because the way it seems to work, the boost clock is not really the max clock the card will run at. It doesn't matter much in terms of performance to drop a few MHz off the clocks for total stability.

Also, the default stock reference boost is 980, the Gigabyte is 1058, and the EVGA FTW is 1084. The stock FTW has a faster memory clock as well (6208 vs. 6008), according to specs.
 

Termie

Diamond Member
Aug 17, 2005
I have found boost to act a little weird. When it says 1190, it will sometimes run at 1230 or so, and it had a few problems in some testing I did. I had to back it down a bit, because the way it seems to work, the boost clock is not really the max clock the card will run at. It doesn't matter much in terms of performance to drop a few MHz off the clocks for total stability.

I agree - it's very hard to actually say what the card's clocks are for comparing to other cards (and therefore figuring out your own overclock). If you use GPU-z, it reports boost clocks that are totally different from what the card actually boosts to. Similarly, I've seen as high as a 1084 boost at stock, even though both 980 and 1058 have been published as the actual boost for a stock card. The Gigabyte's boost is published as 1058, but actually appears to be something like 1189 from user reports.

Anyway, you have to take the good with the bad - the 600-series has some trickery up its sleeve to give you good performance, but you can't always be in control of it. In that sense it's not a card for someone who really wants to get into overclocking.
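
If you want to pin down what your particular card is actually doing, the most reliable thing I've found is to log a benchmark run with your monitoring tool and read the numbers back afterward. Here's a rough sketch of the idea in Python - just a starting point, and the file name and column header are assumptions, since each tool (GPU-Z, Precision, Afterburner) names its log columns differently:

```python
# Rough sketch: pull the core clock column out of a sensor log (CSV) and see
# what the card actually boosted to during a run. The file name and column
# header below are assumptions - check the first line of your own log and
# adjust them to match.
import csv

LOG_FILE = "sensor_log.csv"          # wherever your monitoring tool saved its log
CLOCK_COL = "GPU Core Clock [MHz]"   # assumed header; varies by tool and version

clocks = []
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f):
        value = (row.get(CLOCK_COL) or "").strip()
        if value:
            clocks.append(float(value))

clocks.sort()
print(f"Samples: {len(clocks)}")
print(f"Highest boost seen: {clocks[-1]:.0f} MHz")
print(f"Median clock:       {clocks[len(clocks) // 2]:.0f} MHz")
```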
 
May 13, 2009
Anyone try the voltage unlocking with Afterburner? My voltage unlocked but didn't seem to do any good. Is it just a dummy switch or what?
 

Termie

Diamond Member
Aug 17, 2005
Anyone try the voltage unlocking with Afterburner? My voltage unlocked but didn't seem to do any good. Is it just a dummy switch or what?

I've read that you shouldn't use that on a 670/680, as it just increases the heat, which will lead to an automatic undervolt and thus lower clocks. At least one forum user found using the voltage feature did indeed decrease performance. I'm not going to play with it, because I think that's just an option that appears for the benefit of older generation cards.

If you decide to test it, let us know what the results are - (a) if it actually increases reported voltage, and (b) if you can keep your temps in check long enough to benefit from it before the card's undervolting takes over. With a Windforce you might just be able to.
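
For what it's worth, here's how I'd check both of those from a sensor log - the usual caveat applies: it's only a sketch, and the voltage and temperature column names are assumptions you'd need to match to whatever your tool actually writes:

```python
# Sketch for checking (a) whether the voltage slider actually changed anything
# and (b) how long the card stayed under 70C. Column names are assumptions -
# match them to the header row of your own log before running.
import csv

LOG_FILE = "sensor_log.csv"
VOLT_COL = "VDDC [V]"               # assumed name of the core voltage column
TEMP_COL = "GPU Temperature [C]"    # assumed name of the temperature column

volts, temps = [], []
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f):
        if row.get(VOLT_COL) and row.get(TEMP_COL):
            volts.append(float(row[VOLT_COL]))
            temps.append(float(row[TEMP_COL]))

under_70 = sum(1 for t in temps if t < 70)
print(f"(a) Max reported voltage: {max(volts):.3f} V")
print(f"(b) Under 70C for {100 * under_70 / len(temps):.0f}% of samples")
```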
 

cmdrdredd

Lifer
Dec 12, 2001
I agree - it's very hard to actually say what the card's clocks are for comparing to other cards (and therefore figuring out your own overclock). If you use GPU-z, it reports boost clocks that are totally different from what the card actually boosts to. Similarly, I've seen as high as a 1084 boost at stock, even though both 980 and 1058 have been published as the actual boost for a stock card. The Gigabyte's boost is published as 1058, but actually appears to be something like 1189 from user reports.

Anyway, you have to take the good with the bad - the 600-series has some trickery up its sleeve to give you good performance, but you can't always be in control of it. In that sense it's not a card for someone who really wants to get into overclocking.

This is why I actually settled for a much lower boost than some sites report in reviews. The boost clocks I see them at actually end up close to 1300 on my card and totally crash. So what I did last night was settle on just a bit more than the stock overclock and call it good enough. I am now leaving it 24/7 at +30 boost for 1115MHz (the actual clock is a lot higher when playing a game), and memory at +50 for 6258. Not a huge memory overclock, but I didn't really get anywhere when I went up too much. Very minor difference at 1920x1200.
 

Termie

Diamond Member
Aug 17, 2005
[snip]

Here is Unigine with the same overclock of +135/6600. It boosted to 1202 during the entire benchmark except for a few seconds where it hit 70C and dropped voltage - then boost was at 1194, but that shouldn't make a huge difference. If I do it again, I'll just max fan for the test.

[Screenshot: Unigine Heaven result at +135/6600]


[snip]

Here are some sample boost levels I've seen:

At stock: 1058, 1071, 1084
At +125: 1184
At +135: 1194
At +150: 1202
At +151: 1215
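
If you check those against the naive math (published boost plus offset), it doesn't quite line up, which is part of why comparing clocks between cards is such a mess. A trivial sketch using the published 1058 boost and the numbers above:

```python
# Naive expectation: actual boost = published boost + slider offset.
# Comparing that to what the card actually reported shows the offset doesn't
# map 1:1 onto the clocks you see - boost moves in its own steps.
PUBLISHED_BOOST = 1058  # MHz, the advertised boost clock for this card

observed = {0: 1058, 125: 1184, 135: 1194, 150: 1202, 151: 1215}  # offset -> clock seen

for offset, seen in observed.items():
    expected = PUBLISHED_BOOST + offset
    print(f"+{offset:>3}: expected {expected} MHz, saw {seen} MHz ({seen - expected:+d})")
```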

So I have a big update for you all. I ended up RMA'ing my EVGA GTX670 for three reasons: (1) annoying fan buzzing, (2) graphical corruption on cold boots, and (3) regular graphics card shutdowns (with sound still running) in BF3 with a mild overclock (10%). Of these, it was the second that suggested there was really something wrong, but anyway, I found out a lot more than I expected when I got my new card.

(1) The fan on the new card sounds exactly the same as the old one. Seems like this is just how these are built, and if you want more proof, read the Anandtech review here: http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/17. Scroll down to read all about the problems EVGA's card has in regard to sound. I read the review before buying and thought it was a fluke in their test card. It wasn't.

(2) Not all stock 670s are the same! My first card would default to a boost of 1058, and very rarely (as seen above) would hit 1071 or 1084. Well, the first benchmark I ran on the new card showed a boost of 1110. I was so confused - I thought somehow my OC profiles were kicking in, and yet I'd erased them all to start fresh. What was going on? This card is simply faster at stock speeds than my previous GTX670.

(3) And the kicker - not only does it start faster, it can also take more boost. My old card crapped out after a 135 boost (and frankly might not have been entirely stable) - that was 1058 + 135 = 1202 (I know - it didn't add up!). My new card easily boosts to 1110 + 150 = 1260. Here's a Unigine benchmark run at 1260/6700:

[Screenshot: Unigine Heaven run at 1260/6700]


Compared to the score above at 1202, it's only about 2% faster, but that could be due to the memory being just a bit faster. Like the run with the old card, the boost would drop a bin or two (to 1254 or 1247) upon hitting 70C. The amazing thing is that it runs at all. My old card couldn't come close.

Another very interesting finding I made: this card boosts so high at stock speeds that it blows past its power limiter. At 1110, it is above TDP, and it downclocks itself to stay within the power limiter. This is just another reminder that any factory-overclocked card (which this one strangely acts like) really isn't OC'd until you manually raise the power limiter. At about 106%, it will never downclock for power reasons (but will for temps above 70C).
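
Put another way, the behavior boils down to something like the toy model below. To be clear, this is just my simplified mental model, not nVidia's actual algorithm - the point is that full boost only happens when the card is both under 70C and under its power target, and raising the power limit slider removes one of those two ceilings:

```python
# Toy model of the limiting behavior described above - NOT nVidia's actual
# algorithm, just the pattern visible in the logs: full boost only when the
# card is under 70C AND under its power target.
BIN_MHZ = 13  # boost appears to step in roughly 13MHz bins

def effective_boost(max_boost_mhz, temp_c, power_pct, power_limit_pct=100):
    clock = max_boost_mhz
    if power_pct > power_limit_pct:  # over the power target -> shed a bin or two
        clock -= 2 * BIN_MHZ
    if temp_c >= 70:                 # at/over the temperature threshold -> shed a bin
        clock -= BIN_MHZ
    return clock

# Stock power limit (100%): the card throttles itself even at its stock boost.
print(effective_boost(1110, temp_c=65, power_pct=104))                       # 1084
# Power limit raised to ~106% and temps kept under 70C: full boost the whole time.
print(effective_boost(1110, temp_c=65, power_pct=104, power_limit_pct=106))  # 1110
```

The step size and the numbers here are only illustrative (the ~13MHz bin is what the observed clocks suggest), but the pattern matches what I see in the logs.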

These Kepler cards are strange beasts. I'm very pleased with the performance (with this new card and its healthy OC, I'm now comfortably above my 5850 duo in FPS and more importantly in smoothness). But the idle fan noise on such a high-end card is very disappointing, and the fact that you don't necessarily get the same performance out of the same model of card is perplexing at best. OC'ing was never guaranteed, but now neither is out-of-box performance. And you absolutely have to keep the card below 70C to hit your max boost.

One more thing - I'd like to make a plug here for EVGA. They just have first-rate service. While their card designs aren't very innovative (compared to Gigabyte/Asus, for example), and the fan on this model isn't top notch, I couldn't have had a better experience with this RMA. They are easy to reach, they cross-shipped the replacement, and they made me feel like a valued customer the whole way through. I'll buy from them again (but take Anandtech's noise testing more seriously next time!).
 

chimaxi83

Diamond Member
May 18, 2003
Yep, Boost has effectively removed consistent baselines. Boost clocks are everywhere, even within the same model/manufacturer.
 

Termie

Diamond Member
Aug 17, 2005
So the Asus 670 TOP would be the way to go, to avoid the fan noise?

Probably anything other than a reference model. The 680 cooler used on the 670FTW is supposedly good, as is the Gigabyte model. There's been a lot of talk about the Asus model, but not a lot of user reviews here or on Newegg, so I'd probably wait to see what people say before jumping on that one. But there are actually a surprising number of user reviews stating that cards from each manufacturer have gone bad, so I really don't know what I'd do if I were buying now.
 

realjetavenger

Senior member
Dec 8, 2008
Termie said:
Update - 5/17/12

Well, nothing is simple in hardware upgrades, it seems...I realized yesterday that enabling Adaptive Vsync introduces terrible input lag, at least in BF3. This is not visible in benchmarks, where there is no input involved. For now I've switched back to using the Frame Rate Target in PrecisionX, but this doesn't have the same level of interaction with the 670 drivers, meaning that it doesn't reduce clocks to compensate for the lower load, and therefore isn't quite as efficient. Edit: I think the problem was enabling Vsync after a game had already been loaded. Just played a few rounds and it worked fine without stuttering, but I enabled Adaptive Vsync before loading BF3. This is unlike regular vsync, which you can turn on and off at any time within the game menu.

Is the edit from your 5/17 update still holding true? Any more evidence of input lag or stuttering while using adaptive Vsync in BF3 (or with the replacement card)?

Another very interesting finding I made: this card boosts so high at stock speeds that it blows past its power limiter. At 1110, it is above TDP, and it downclocks itself to stay within the power limiter. This is just another reminder that any factory-overclocked card (which this one strangely acts like) really isn't OC'd until you manually raise the power limiter. At about 106%, it will never downclock for power reasons (but will for temps above 70C).

These Kepler cards are strange beasts. I'm very pleased with the performance (with this new card and its healthy OC, I'm now comfortably above my 5850 duo in FPS and more importantly in smoothness). But the idle fan noise on such a high-end card is very disappointing, and the fact that you don't necessarily get the same performance out of the same model of card is perplexing at best. OC'ing was never guaranteed, but now neither is out-of-box performance. And you absolutely have to keep the card below 70C to hit your max boost.

This is what concerned me about getting a Kepler. Sure is odd trying to figure out what the card is doing out of the box. Then add another layer of confusion trying to figure out where your max stable OC is. Not that it stopped me from getting one. ^_^ (Tracking is telling me my Asus TOP was delivered and is waiting for me to get home from work.)
 

Termie

Diamond Member
Aug 17, 2005
Is the edit from your 5/17 update still holding true? Any more evidence of input lag or stuttering while using adaptive Vsync in BF3 (or with the replacement card)?

I honestly haven't had the vsync problem since I installed 301.42, but nVidia made no mention of fixes to vsync in that driver release, so I have no idea why the problem disappeared. It's been working very well for me in BF3 for the past week - ultra smooth, and it keeps temps down (which in turn allows max boost all the time).

This is what concerned me about getting a Kepler. Sure is odd trying to figure out what the card is doing out of the box. Then add another layer of confusion trying to figure out where your max stable OC is. Not that it stopped me from getting one. ^_^ (Tracking is telling me my Asus TOP was delivered and is waiting for me to get home from work.)

Congrats - let us know how it goes. Lots of interest here in the TOP! Hopefully you've found this thread helpful - it should give you a leg up in understanding how to approach OC'ing. Power limits, temps, variable boost - it all matters. Frankly, the professional reviewers simply had no idea what they were doing when they tested this card. Anandtech reported a max OC boost of 1263 on their GTX670 SC, but also said they were running it at over 80C. Not gonna happen. My guess is that their cards were running at about 30MHz lower than their max boost.
 

Termie

Diamond Member
Aug 17, 2005
To all the loyal followers of this thread, I have something extra special for you tonight. Because I happen to have two GTX670s sitting here (my original and my RMA), and because my original 670 works well enough as a slave card (other than a nasty BSOD sitting on the desktop), and because I've got the MB and PSU for SLI, well, you know what's coming....

I present to you GTX670 SLI Benchmarks. Rig in sig, both cards running at 1110/6000 (the default boost of the new card - I had to manually set the first card to run that high). I'll be comparing the results to the stock original card, meaning the SLI'd cards are running about 4% faster than the single card - closest I could get it given that nVidia now sees fit to ship stock cards that run at different clocks.

3dMark11

[Screenshot: 3dMark11 comparison - SLI vs. single card]


Note that my physics and combined scores have dropped significantly - perhaps SLI is presenting a CPU load, or maybe something else isn't optimized, but I'm not going to worry about it. The graphics tests speak for themselves. When it comes to theoretical tests, scaling is near 100% (accounting for slightly different clocks).

Unigine

[Screenshot: Unigine Heaven comparison - SLI vs. single card]


64% scaling in Heaven, and minimums nearly identical. Eh, not too impressive.

BF3

[Screenshot: BF3 benchmark results - SLI vs. single card]


Less than 40% scaling once you account for the different clocks. Because this result is so strange, I'm including a screenshot of all the performance parameters during this benchmark:

[Screenshot: performance monitoring during the BF3 SLI benchmark]


The cards are running at about 70% load. CPU bottleneck? I don't think so, not in single-player. PCIe lane bottleneck? Not likely, since the cards weren't that close to full load. Whatever the case may be, based on reviews of 680SLI and the 690, and extrapolating down, I should have hit about 145. Oh well...again, I was never intending to run 670SLI in this rig.

A few other observations:

(1) SLI was incredibly easy to set up, even easier than my old 5850 crossfire. I plugged in the new card, it was detected by the drivers, I restarted, and a popup said SLI was ready to enable. Cool.

(2) Power use of the cards is very reasonable. Peak power I saw was 430W in Heaven and 438W in 3dMark11. Definitely doable on a 650W power supply, even with hefty overclocks (although I didn't try). Idle power was 79W (vs. 62W with one card).

(3) The cards ran cooler together than one alone. This is probably because they never really hit 100%. Probably also helped by the blower-style fans and the extra gap between them on my motherboard. I don't know how anyone runs cards stacked together, though...

Hope you enjoyed this little update! My original card will be on its way back to EVGA tomorrow (thank you EVGA!).
 

realjetavenger

Senior member
Dec 8, 2008
Those SLI results you state sure are odd. (I can't see the images - they are blocked by firewalls at work.) I wonder if it has something to do with the bad card you are using. Even though 3dMark is scaling at 100%, Heaven at only 64% doesn't sound right. The few user benchmarks out there on the interwebz (that I've seen) with either a 670 or 680 SLI setup are getting better scaling than that in Heaven. But it may not be all that relevant, as the sample size isn't all that large. And BF3 at 40%? That's just weird.

By the way, I did get my TOP yesterday and fully intended on installing it and running a couple of benches. But life got in the way, and the only thing I was able to do was take it out of the box and oohhh and ahhh at it. It is a good-looking card with that cooler and the racing stripes. The red stripes alone have to be worth at least 10MHz on the boost clock.
There is an interesting user thread over at overclock.net. Most guys are getting impressive clocks (a couple over 1300 boost, with most at or over 1250). However, there are two guys who said their cards are crashing out of the box while running Heaven. It's too early to tell why they are having this issue. Plus, it is not clear if they have the TOP or non-TOP version.

p.s. nice job Termie on this thread. It has been a very informative and interesting read. :thumbsup:
 

Termie

Diamond Member
Aug 17, 2005
Those SLI results you state sure are odd. (I can't see the images - they are blocked by firewalls at work.) I wonder if it has something to do with the bad card you are using. Even though 3dMark is scaling at 100%, Heaven at only 64% doesn't sound right. The few user benchmarks out there on the interwebz (that I've seen) with either a 670 or 680 SLI setup are getting better scaling than that in Heaven. But it may not be all that relevant, as the sample size isn't all that large. And BF3 at 40%? That's just weird.

By the way, I did get my TOP yesterday and fully intended on installing it and running a couple of benches. But life got in the way, and the only thing I was able to do was take it out of the box and oohhh and ahhh at it. It is a good-looking card with that cooler and the racing stripes. The red stripes alone have to be worth at least 10MHz on the boost clock.
There is an interesting user thread over at overclock.net. Most guys are getting impressive clocks (a couple over 1300 boost, with most at or over 1250). However, there are two guys who said their cards are crashing out of the box while running Heaven. It's too early to tell why they are having this issue. Plus, it is not clear if they have the TOP or non-TOP version.

p.s. nice job Termie on this thread. It has been a very informative and interesting read. :thumbsup:


Thanks for the feedback. You know, I think it's pretty early yet for 600-series SLI. Even Anandtech only got 64% scaling in BF3 from their 680 pair at the same settings (160 vs. 97): http://www.anandtech.com/bench/Product/585?vs=555, so my 40% scaling isn't that much worse.
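
In case anyone wants to check my math on "scaling": it's just the SLI result over the single-card result, minus 100%, and for my own runs I also shave off the ~4% clock advantage the SLI pair had over the single-card baseline. A quick sketch - the second call uses placeholder numbers purely to show the clock correction, not real results:

```python
# "Scaling" = SLI result over single-card result, minus 100%. If the SLI pair
# ran at faster clocks than the single-card baseline, remove that advantage
# first so the clocks don't inflate the number.
def scaling_pct(sli_fps, single_fps, clock_advantage_pct=0.0):
    adjusted_sli = sli_fps / (1.0 + clock_advantage_pct / 100.0)
    return (adjusted_sli / single_fps - 1.0) * 100.0

# Anandtech's 680 SLI vs. single 680 in BF3 from the link above: 160 vs. 97 fps.
print(f"{scaling_pct(160, 97):.0f}% scaling")                            # ~65%

# Placeholder numbers: a pair that looks 45% faster but ran 4% higher clocks
# only "scales" about 39%.
print(f"{scaling_pct(145, 100, clock_advantage_pct=4.0):.0f}% scaling")  # ~39%
```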

I honestly don't think SLI is the way to go right off the bat with a new series. I see it as an upgrade option down the line, and my brief test above shows that there's potential, but either the drivers or my old platform are holding it back. I will consider 670SLI right around Haswell time, if you know what I mean. ;)

By the way, I was having major crashing issues in Heaven with my original 670 with pretty minor overclocks. I wonder if Asus is really binning the TOP, because if it's not, plenty of users out there (maybe even you) will get stuck with one of the 670 GPUs that can't break 1200 boost (and keep in mind that a cooler alone won't do the trick, as mine would crash at around 70C). My new one sailed right through Heaven at 1260 boost.
 

guskline

Diamond Member
Apr 17, 2006
Nice job Termie. Since the GTX670s are so new, I'm sure we will see what gives with SLI results.
 

redrider4life4

Senior member
Jan 23, 2009
Glad to know a fellow i7 860 owner had success with this. I am still running a 5850 solo and am going to upgrade to the 670 by the end of the summer.
 

.Root

I am the banned bad trader, m3t4lh34d
Jun 3, 2012
Thanks for the feedback. You know, I think it's pretty early yet for 600-series SLI. Even Anandtech only got 64% scaling in BF3 from their 680 pair at the same settings (160 vs. 97): http://www.anandtech.com/bench/Product/585?vs=555, so my 40% scaling isn't that much worse.

I honestly don't think SLI is the way to go right off the bat with a new series. I see it as an upgrade option down the line, and my brief test above shows that there's potential, but either the drivers or my old platform are holding it back. I will consider 670SLI right around Haswell time, if you know what I mean. ;)

By the way, I was having major crashing issues in Heaven with my original 670 with pretty minor overclocks. I wonder if Asus is really binning the TOP, because if it's not, plenty of users out there (maybe even you) will get stuck with one of the 670 GPUs that can't break 1200 boost (and keep in mind that a cooler alone won't do the trick, as mine would crash at around 70C). My new one sailed right through Heaven at 1260 boost.


Considering that I'm running FOUR 680s at the moment, I agree with your statement. I wouldn't recommend anything more than 2 670/680s in SLI until drivers are refined more. The only game that makes use of all of my cards is BF3 @ 2560x1440 maxed. Even then I only see usage in the 80% range. It seems my 7970s scale much better than the 680s, but most likely it's because of the driver lead that the 7970s have on the 680s at this time.
 

Termie

Diamond Member
Aug 17, 2005
Hey all - I have another update for you. This one is a close-up view of how 670 overclocking works, along with the highest 3dMark11 score I've achieved so far, running at a theoretical boost of 1110 + 150 = 1260, and 6600 on the memory:

[Screenshot: 3dMark11 run at 1260/6600 with the sensor graph discussed below]


There are a number of things to look at in this graph. First, note that 3dMark11 has six main tests - four graphics tests, a physics test (where GPU load is very low), and a combined test. You can see the four high peaks, the one small peak, and then the final somewhat high peak on the graph. Here are the other things to look out for:

(1) I had the fan at maximum (80%) to keep the temperature under 70C, which is one key to full boost. The card averaged around 67-68C in this benchmark, and I don't believe it actually went above 69C. This should mean that voltage is maintained at its max (1.175V).

(2) The other major limiter on boost is the power limiter. What you can see in the first two tests is that the power limit is actually exceeded - it goes beyond 122% (the maximum) - and then voltage is cut. What results is a decrease in core boost of up to 19MHz (I believe). There is nothing I (or anyone) can do to increase the power limiter beyond 122%, as far as I know, but maybe tweaks will come out later.

(3) Given that temps weren't a limiter, and for some reason power didn't exceed 122% in the second two graphics tests or the combined test, the core is at 1260 the entire time.

(4) Also, I've done some more testing on memory overclocks, and as far as I can tell, I get no additional benefits from exceeding 6600. I did this testing in BF3, which is the easiest way I've found to do it (using a singleplayer campaign). The memory keeps going up, but FPS does not (but I haven't seen it go down either), so rather than stress the memory unnecessarily, I'm leaving it at 6600MHz.
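
If you want to do the same kind of memory testing yourself, the whole judgment call is really just "is the FPS difference bigger than my run-to-run variation?" Here's a tiny sketch of how I think about it - the FPS numbers are placeholders for your own repeated runs at each memory clock, not results from my card:

```python
# Tiny sketch for deciding whether a memory OC actually helped: compare the
# average FPS at two memory clocks against the run-to-run spread. The FPS
# lists below are placeholders - substitute your own repeated runs.
from statistics import mean, pstdev

runs_6600 = [88.1, 87.6, 88.4]   # placeholder: FPS from repeated runs at 6600MHz
runs_6800 = [88.3, 87.9, 88.0]   # placeholder: FPS from repeated runs at 6800MHz

gain = mean(runs_6800) - mean(runs_6600)
noise = max(pstdev(runs_6600), pstdev(runs_6800))

print(f"Average gain: {gain:+.1f} fps, run-to-run spread: ~{noise:.1f} fps")
if abs(gain) <= noise:
    print("Within the noise - not worth stressing the memory for.")
else:
    print("Looks like a real difference.")
```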

Hope this was helpful. Any questions?
 