GTX670 Upgrade and Overclocking Review (vs. 5850 crossfire)


Alaa

Senior member
Apr 26, 2005
Would you please post unigine heaven with 4xAA for me to compare it to my 7950?
 

tviceman

Diamond Member
Mar 25, 2008
www.facebook.com
Thanks for the tip. Sounds like you're very familiar with the issue, then. Next build I'll go for a more modern case; layouts have really changed since the Antec 900 came out. I got mine in Nov. 2007, so it's been through a few builds.

I am rocking the Temjin TJ-08E now. Micro-ATX motherboards only. I had the Silverstone FT-03B for a while, and that was probably the easiest, most user-friendly advanced case I have ever (and will ever) own, but man, it was just too big for me. So I sold it to get the Temjin. It's a fairly small case (noticeably smaller than the Antec 900) but still has pretty good cooling. I personally like its smaller footprint, and it doesn't have any of the annoying LED lights I thought were cool 10 years ago.
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
Would you please post unigine heaven with 4xAA for me to compare it to my 7950?

Please list all heaven parameters and I'll do that for you.

I am rocking the Temjin TJ-08E. Micro-ATX motherboards only. I had the Silverstone FT-03B for a while, and that was probably the easiest, most user-friendly advanced case I have ever (and will ever) own, but man, it was just too big for me. So I sold it to get the Temjin. It's a fairly small case (noticeably smaller than the Antec 900) but still has pretty good cooling. I personally like its smaller footprint, and it doesn't have any of the annoying LED lights I thought were cool 10 years ago.

Believe it or not, that is exactly the case I intend to use for my next build. Was spec'ing out parts for an Ivy build and then decided to wait for Haswell. I'll still be buying the Temjin or whatever replaces it.
 

Alaa

Senior member
Apr 26, 2005
Please list all heaven parameters and I'll do that for you.
Thanks, here is mine:
7950.jpg
 

Forgets

Junior Member
Apr 24, 2012
Figured I'd throw mine in as well. Seems like most of my advantage is in minimums.
7950 @ 1200/6000
Thuban 1055t@ 3.9

7hIPC.jpg
 

Alaa

Senior member
Apr 26, 2005
Here you have it, at +135 offset, 6600 memory. What type of overclock do you have on that 7950? 1150/5600? Amazing that it can beat the 670. Then again that's a 44% overclock, so it looks like it's paid off.
Actually it was 1200/5600. I just lowered it a little to accommodate the hot weather here.
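For readers following the percentages being thrown around here: they fall out of the HD 7950's 800 MHz reference core clock. A quick sketch (the function name is just for illustration):

```python
# The HD 7950's reference core clock is 800 MHz; the "44%" and "50%"
# overclock figures quoted in this thread follow directly from it.

REFERENCE_7950_MHZ = 800

def oc_pct(core_mhz, reference_mhz=REFERENCE_7950_MHZ):
    """Overclock expressed as a percentage over the reference clock."""
    return round((core_mhz / reference_mhz - 1.0) * 100.0)

print(oc_pct(1150))  # -> 44 (the clock Termie guessed)
print(oc_pct(1200))  # -> 50 (the clock Alaa actually runs)
```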
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
Well, I'm pretty impressed there are multiple examples of the 7950 doing a 50% overclock. Bet you're using more power, though. ;)

Are these completely game stable? That would surprise me, given that HardOCP could only get up to 1050: http://www.hardocp.com/article/2012/05/14/geforce_680_670_vs_radeon_7970_7950_gaming_perf/3

EDIT: After doing some reading, looks like those clocks, or something close to them, are in fact possible: http://www.guru3d.com/article/radeon-hd-7950-overclock-guide/1

Pretty amazing performance, no doubt. Perhaps the 7950's OC potential is being undersold...because this would make it better than the 7970's on air.

Comparing a few benchmarks, this OC would make it:

(1) Faster than a stock 670 in 3dMark11 but slower than an OC'd 670:
7950oc: http://www.guru3d.com/article/radeon-hd-7950-overclock-guide/11
670stock: http://www.guru3d.com/article/evga-geforce-gtx-670-sc-review/20
670oc: http://www.guru3d.com/article/evga-geforce-gtx-670-sc-review/23

(2) Much faster than a stock 670 in Metro2033:
7950oc: http://www.guru3d.com/article/radeon-hd-7950-overclock-guide/10
670stock: http://www.guru3d.com/article/evga-geforce-gtx-670-sc-review/16

(3) Equal to a stock 670 in BF3:
http://www.guru3d.com/article/radeon-hd-7950-overclock-guide/12
http://www.guru3d.com/article/evga-geforce-gtx-670-sc-review/19

Eh, I think I'd still take the 670 for its lower power use. Even when OC'd 15%, it only uses about 20-30 watts more, instead of 80 watts more like the 7950: http://www.guru3d.com/article/radeon-hd-7950-overclock-guide/3
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
Another update for people...looks like nVidia still has its work cut out on the driver side. I'm getting stuttering in BF3 when turning on Adaptive Vsync, which is a shame given the potential efficiency and boost advantages. It runs fine without it. For now I'm using the frame rate target in Precision, but I'm not sure it's much better...still testing...

Apparently, this affects both the 670 and 680: http://hardforum.com/showthread.php?p=1038716805#post1038716805

http://forums.nvidia.com/index.php?showtopic=226227&st=0
 

f1sherman

Platinum Member
Apr 5, 2011
I'm not saying it's user error on your part, but you must be the only person in the world who gets stuttering when turning Adaptive ON.

Nvidia has acknowledged an issue with Kepler driver polling that can be seen on some systems with VSYNC ON. They have announced a fix in the next driver, but in the meantime Adaptive fixes this for most people.

Of course stuttering will never be fixed 100% because

(allow me the quote)

The PC really is a minefield for these; the worst issue is that even developers close to the root of the various problems, e.g. NIC developers, make a poor effort (or none at all) at figuring them out.

Same with engine developers... look at Epic's Unreal Engine: to this day there are games out there with suboptimal ini/cfg settings. It would be easy for Epic to recommend or update certain defaults, but apparently they don't care to explain, much less fix things...

E.g. bInitializeShadersOnDemand is still set to "true". It became the default because certain Nvidia cards/drivers could apparently crash; that hasn't been an issue for a while, but the workaround stayed, even though setting it to "false" often helped with some of the lag/stutter one could encounter... Even worse, recent Unreal Engine games revert manual changes if you don't write-protect the files (which can cause other issues).

Sound cards in "exclusive mode" can cause audio stutter, which can also lead to visual stutter; best is to disable any "exclusive mode" settings.

Network cards with flow control and interrupt moderation enabled: many NICs default these to enabled (their devs even recommend it), which often causes frequent lag and stutter in streaming or simply any kind of I/O activity with interrupt sharing (which on a lot of hardware is the typical scenario).

Hard disks and IDE/SATA/PATA/AHCI controllers can easily affect lag/stutter; what a difference an SSD makes is unbelievable (of course that's only another workaround and doesn't solve the problem), especially in Unreal Engine games, where apparently the streaming/caching/loading of textures and all kinds of data is not yet anywhere near perfect. Often I think it's similar to the lazy implementation of available tools/tech, like with FXAA: some people (cough, Bethesda) didn't bother to tweak the settings they shipped with; it works, but might not be working very well.

I recently played Batman: Arkham City. Great game, but if you can't run it on a top-notch system you're likely going to be disappointed and annoyed by its performance... Seriously, it lags all over the place; even its benchmark run has areas where texture streaming/LOD pop-ins are terribly bad. It makes you wonder if these things only get tested on ultra-fast systems where you're not likely to encounter them...

Instead of optimizing what they have, they introduce "one thread frame lag" and "frame smoothing", or keep old workarounds like "UseMinimalNVIDIADriverShaderOptimization" around... Together with driver profiles, or drivers assuming certain things on top of all these workarounds, who knows what's happening there... It really seems to be a bloody mess, and as much as Nvidia and ATI usually get blamed for stutter, I'm afraid it's not even their fault most of the time.
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
I'm not saying it's user error on your part, but you must be the only person in the world who gets stuttering when turning Adaptive ON.

Thanks for that, but did you read the threads I linked? nVidia has replicated the issue on the 600 series.
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
I'm now pretty sure that the "stutter" that was evident in BF3 under Adaptive Vsync is actually input lag gone terribly wrong.

First hint that this was the issue was that there was absolutely no stuttering in benchmarks. Second was that the "stutter" only occurred when I moved my character in game. Standing still there was no stutter, and in fact the frame rate was locked at 60fps the entire time, so it wasn't an actual frame rate slowdown.

Another interesting fact: some of the threads on this issue have asked users for their mouse polling rate. I think this is a key to the problem. The "stutter" was much worse on my high polling rate g9x than on my low polling rate MX1100. Seems that nVidia just needs to sort out the issue in the drivers.

For now, I've solved the stutter/lag problem by disabling Adaptive Vsync and using the Frame Rate Target in PrecisionX.

Edit: I think this was caused by enabling Adaptive Vsync after loading the game. I just played a few rounds without a problem, but had enabled it before loading. So you can't switch it on and off during a game, it seems. I've read the same about the Frame Rate limiter.
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
Here you have it, at +135 offset, 6600 memory. What type of overclock do you have on that 7950? 1150/5600? Amazing that it can beat the 670. Then again that's a 44% overclock, so it looks like it's paid off.

captureqdz.jpg

Well, I figured I'd try it with my 670, which OCs a little better than Termie's. I can run these settings 24/7 with temps that top out in the mid-60s. Windforce 670, btw: 1270 core and 6840 memory.
heaven.jpg

Very nice bench. Exactly in line with the ~7% overclock advantage you have. Again, Heaven is pretty consistent across platforms.

By the way, what's your offset to get to 1270? I've found that my max offset is +151, which gets me to 1215, so you have a good 5% on me due to the windforce cooler and larger PCB. If I were to do it all over again, I'd certainly consider the Gigabyte. In my Antec 900, though, it would force me to move an internal 120mm fan and would likely lead to higher case temps...

Don't forget to update your sig!
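The clock comparison above works out like this; a rough sketch, assuming the final boost clocks quoted in the thread (1215 MHz at +151 for Termie's card, 1270 MHz for the Windforce card) and ignoring GPU Boost bin variation:

```python
def advantage_pct(clock_a_mhz, clock_b_mhz):
    """Percentage clock advantage of card A over card B."""
    return round((clock_a_mhz / clock_b_mhz - 1.0) * 100.0, 1)

termie_670 = 1215      # reached with a +151 MHz offset
windforce_670 = 1270   # reached with a +94 MHz offset over a higher default boost

print(advantage_pct(windforce_670, termie_670))  # -> 4.5, i.e. "a good 5%"
```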
 

Alaa

Senior member
Apr 26, 2005
I think this is still the same performance as mine, because the min fps in this bench for me is probably in the first second, and it never dips that low at any point after that. Maybe this is due to my dual-core CPU, which I bought while waiting for Ivy Bridge.
 

cmdrdredd

Lifer
Dec 12, 2001
Nvidia Geforce driver v302.59 is available http://forums.guru3d.com/showthread.php?t=363158

Supports GeForce 6800 through GTX 690. A second card can now be used for PhysX with the GTX 600 series.



Just got done installing and testing my new GTX 670 FTW card. Here are some comparison shots, as well as a screenshot of the clocks I currently have it at. I'm not trying to max this out. I did try going over +100 on the core, but that took me above a 1200 MHz boost clock and crashed Crysis 2. +75 keeps it around ~1160. The card with the white stripe on it is the GTX 670 SC version. You can see the FTW card has a full-size PCB and the cooler off the GTX 680.

img20120517180752.jpg


img20120517180825.jpg


img20120517180852.jpg


img20120517180918.jpg


44191882.png
 
May 13, 2009
By the way, what's your offset to get to 1270?

I have a +94 offset to get to 1270. Mine comes OC'd out of the box. I had to get the memory to 6800 for it to catch up to the 1270 core; after 6800 the memory wasn't getting me any more gains.
It really seems as if I finally got one of the better overclocking cards on the forum. :)
 

cmdrdredd

Lifer
Dec 12, 2001
I have a +94 offset to get to 1270. Mine comes OC'd out of the box. I had to get the memory to 6800 for it to catch up to the 1270 core; after 6800 the memory wasn't getting me any more gains.
It really seems as if I finally got one of the better overclocking cards on the forum. :)

I've pretty much given up trying to overclock my GPU like I do my CPU. I'll do a baby step and get it above factory, but trying to figure out the optimal core and memory clock for every game I play isn't fun, lol. I did manage to get boost to around 1300, but it crashed Crysis 2, lol.

Grats on your fine card though.
 

Termie

Diamond Member
Aug 17, 2005
www.techbuyersguru.com
Just got done installing and testing my new GTX 670 FTW card. Here are some comparison shots, as well as a screenshot of the clocks I currently have it at.

That's a nice looking card.

I have a +94 offset to get to 1270. Mine comes OC'd out of the box. I had to get the memory to 6800 for it to catch up to the 1270 core; after 6800 the memory wasn't getting me any more gains.
It really seems as if I finally got one of the better overclocking cards on the forum. :)

That's an amazing out-of-box overclock. Honestly, manufacturers should just stop advertising core clock; it's totally meaningless. If I had known the Windforce had a default boost of 1176 versus 1058 for reference, hell yeah I would have bought it. But instead we see 980 versus 915, which didn't sound like too big a hurdle to overcome.
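Termie's point about advertised clocks understating the real gap can be checked with the numbers he gives (915/980 MHz advertised, 1058/1176 MHz default boost); a quick sketch:

```python
def gap_pct(a_mhz, b_mhz):
    """Percentage gap of clock a over clock b."""
    return round((a_mhz / b_mhz - 1.0) * 100.0, 1)

# Advertised base clocks vs. default boost clocks, reference vs. Windforce 670:
print(gap_pct(980, 915))    # -> 7.1  (what the spec sheets suggest)
print(gap_pct(1176, 1058))  # -> 11.2 (the gap that actually matters)
```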

Termie, which drivers are you using?

301.34. I read on another forum that there are already some newer betas, like 301.40 and 302 something, can't quite remember. Edit: here's a thread on the new 302.59 beta: http://forums.anandtech.com/showthread.php?t=2246723

Anyway, I've definitely had some glitching, like vsync input lag problems if enabled during the game without a restart, menus becoming unclickable, and screen blackouts, all in BF3. But then I played some rounds tonight and had no problems at all. Go figure.
 

cmdrdredd

Lifer
Dec 12, 2001
So I managed to get the card up to 1190 Boost and 6468 memory. +105 and +130 respectively. Gonna call it good and leave it. It's fast enough.
 

poohbear

Platinum Member
Mar 11, 2003
Awesome review, Termie!

Just one question: the microstuttering we hear about with dual cards, is it real? Did you notice a difference between the 670 and the 5850s? I totally agree that SLI seems the way to go for the future due to slow video card development, but microstuttering and driver support issues always turned me off.

Another thing about SLI in the future: won't 2GB be an issue in 2 years like 1GB is now? So ideally you'd get a 3/4GB GTX 670 and SLI that in the future, yeah?