Gigabyte Windforce Nvidia GTX 670 OC Version (User Review) & Owners' Thread

Destiny

Platinum Member
Jul 6, 2010
Gigabyte Windforce Nvidia GTX 670 OC Version (User Review)


windforcegtx670specs.jpg




The Gigabyte Windforce GTX 670 OC is about an inch longer than my EVGA GTX 570!
570x670.jpg


It is longer due to the custom cooling features and the use of the exact same PCB as the Gigabyte Windforce GTX 680! Some people may be turned off by the BLUE PCB, but how many of us are going to be staring at the GPU in our PC cases?
bluepcb.jpg



I was a bit worried because my EVGA GTX 570 was already a tight fit, but luckily the Gigabyte GTX 670 fits in my Rosewill Challenger v2 Mid Tower case, too! Of course, the blue PCB does not match the black color scheme of my rig... (See, the BLUE PCB doesn't really matter when it's in the case!)
installed.jpg



It barely fits!
cuttingitclose.jpg






The next two benchmark tests were performed at "Out of the Box" shipping specifications with no tweaks to any of my PC hardware (I will save that for an update to the review). Please note I am using the free benchmark software versions, which may not allow me to adjust additional settings. These two benchmarks should help current GTX 570 and GTX 560 Ti owners decide whether the upgrade is worth it. My PC hardware specifications are in my signature.



Heaven Benchmark v3.0 Basic Benchmark:
Upgrading from an EVGA GTX 570 to the Gigabyte Windforce GTX 670 OC showed over a 50% improvement in frames per second and a 65% improvement in the Unigine engine score at 1920x1200 resolution!
gtx570vsgtx670ue.jpg



I will run the Heaven Benchmark again when I purchase a 2560x1440 monitor.




3DMARK 11 Basic Edition (Version 1.0.3):
3DMark 11 provides a series of tests to benchmark your PC hardware's gaming performance. I ran both my EVGA GTX 570 and the Gigabyte Windforce GTX 670 OC through all of the performance tests the 3DMark 11 Basic Edition has to offer.
With the EVGA GTX 570 my rig received a 3DMark 11 score of P5846, and with the Gigabyte Windforce GTX 670 OC it received a score of P8832. Again, the 3DMark 11 scores indicate more than a 50% improvement from the GTX 670 upgrade.
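
Quick sanity check on that percentage, for anyone who wants the arithmetic (it just uses the two overall scores above; the "P" prefix is 3DMark's preset label, not part of the number):

```python
# Quick sanity check on the 3DMark 11 overall scores quoted above.
gtx570_score = 5846   # P5846 with the EVGA GTX 570
gtx670_score = 8832   # P8832 with the Gigabyte Windforce GTX 670 OC

improvement = (gtx670_score - gtx570_score) / gtx570_score * 100
print(f"3DMark 11 overall score improvement: {improvement:.1f}%")  # ~51.1%
```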


gtx570vsgtx6703dmark11b.jpg


Graphics Test #1 through Graphics Test #4 each showed double-digit frames-per-second improvements with the GTX 670.


The Physics Score and Physics Test FPS show little or no improvement because I used the same i5-3570K for both runs.


The final Combined Test score, which includes DirectCompute tests, shows a 17% (5 FPS) improvement. It is common knowledge that the GK104 Kepler chip in the GTX 670 sacrificed compute performance.


Overall, the Gigabyte Windforce GTX 670 OC was roughly a 50% performance increase over my previous EVGA GTX 570. Whether this is enough to sway current GTX 570 and GTX 560 Ti owners is, of course, subjective and a matter of user preference.



MSI Kombustor Card Burn-in & Benchmark Utility:
To test the Gigabyte Windforce GTX 670 OC Version's advertised features (factory overclock, higher core/boost clocks, Ultra Durable VGA, and Windforce "Triangle" cooling technology), I ran three Kombustor stress tests for at least 5 minutes each. The reason for only 5 minutes is that I didn't see fan speeds and temperatures increase (sometimes they decreased) over longer test durations. After 5 minutes I took screenshots of the results:


Tessy Spheres on Plane v2 (GL4) at 2560x1600 with 8X MSAA
tessyspheresonplayv2gl4.jpg



Wavy Plane (GL2) at 2560x1600 with 8X MSAA
kbwaveyplanegl22560x160.jpg



KMark (PhysX) at 2560x1600 with 8X MSAA
gtx670kmarkphysx2560x16.jpg



For reference, I will be comparing my stress test results to the results in this section of AnandTech's GTX 670 review: http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/17

Please note: these are "Out of the Box" shipping performance results and I have not tweaked or attempted to overclock anything yet.


In all three stress tests the fan speed never went past 45%. Temperatures hovered between 60 and 67 degrees Celsius, which is roughly 10 degrees lower than the reference GTX 670 and EVGA GTX 670 SC load temperatures. At idle the Gigabyte Windforce GTX 670 OC sits at 35 degrees Celsius with the fans at 23%. During the stress tests the fan noise was noticeably quieter than my EVGA GTX 570.


For the entire duration of all three stress tests the Gigabyte card's clock speed stayed at 1162MHz with a load voltage of 1.175V. That constant boost clock of 1162MHz also exceeds Gigabyte's own advertised boost clock of 1058MHz. Also, the TDP reading never surpassed 75% of the power target...
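
If anyone wants to log this kind of data instead of sitting and watching the monitoring window, here's a minimal sketch of a polling loop using NVIDIA's NVML through the pynvml Python package. This is just how I'd approach it, not what I actually used for the numbers above, and it assumes your driver exposes the fan and power counters:

```python
# Minimal GPU telemetry logger sketch using pynvml (pip install pynvml).
# Assumes an NVIDIA driver with NVML support; adjust interval/duration to taste.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(60):  # ~5 minutes at a 5-second interval
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)  # deg C
        fan = pynvml.nvmlDeviceGetFanSpeed(gpu)                                   # percent
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)     # MHz
        power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0                      # watts
        print(f"{temp}C  fan {fan}%  core {core}MHz  {power:.0f}W")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```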


In my opinion, the Gigabyte Windforce Nvidia GTX 670 OC is definitely worth it for the roughly 50% performance increase over the Nvidia GTX 570. Gigabyte's choice of a non-reference PCB and Windforce cooling makes it an Nvidia GTX 670 that cannot be ignored when you are shopping for your next graphics card upgrade.

I created this thread for all Gigabyte Windforce GTX 670 owners and other GTX 670 owners to post their results in one place and for discussion... I'm a noob, so I need help with this stuff too...
 

Destiny

Platinum Member
Jul 6, 2010
Games I own: Crysis, Crysis Warhead, Mafia II, Saints Row: The Third, Total War: Shogun 2, L.A. Noire, StarCraft II, Battlefield 3, Diablo III... and others


1920x1200gaming5212012.jpg

Note: For Total War: Shogun 2, I used the maximum army size and maximum number of army units on both sides, then ran FRAPS during the battle to record FPS.
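
If you want to turn a FRAPS benchmark run into an average and worst-case FPS number afterwards, here's a rough sketch. It assumes the frametimes log option is enabled and that the CSV has the usual two columns (frame index and cumulative time in milliseconds); the filename at the bottom is just a made-up example:

```python
# Rough sketch: summarize a FRAPS "frametimes" CSV (assumed columns: Frame, Time (ms)).
import csv

def summarize_fraps(path):
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            if len(row) >= 2 and row[1].strip():
                times_ms.append(float(row[1]))

    # Convert cumulative timestamps into per-frame times.
    frame_times = [b - a for a, b in zip(times_ms, times_ms[1:])]
    avg_fps = 1000.0 * len(frame_times) / (times_ms[-1] - times_ms[0])
    worst_fps = 1000.0 / max(frame_times)
    print(f"average: {avg_fps:.1f} fps, worst single frame: {worst_fps:.1f} fps")

summarize_fraps("shogun2 frametimes.csv")  # hypothetical filename
```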

Mafia II
mafia22012052004224757.jpg

mafia22012052004205453.jpg


Running the Mafia II benchmark with the Nvidia GTX 570, the average frame rate was only 2-3 FPS lower at 45 fps... so it seems the Nvidia GTX 670 cannot handle PhysX much better than the Nvidia GTX 570...o_O
With PhysX turned off, the Gigabyte GTX 670 OC's average frame rate was 90 fps.

Gaming Performance for other games coming soon!


2560 x 1440 Gaming Performance Coming Soon!
 

Destiny

Platinum Member
Jul 6, 2010
Reserved for Overclocking Performance Results.
 

Destiny

Platinum Member
Jul 6, 2010
Nvidia GTX 670 Owners' Silicon Lottery Results - There seem to be two lotteries: out-of-the-box "actual" boost clocks and user overclocking clocks.

Shipping (out of the box) core/boost clocks have been all over the place for the Nvidia GTX 670. It would be interesting to see what everyone gets! Post your out-of-the-box shipping boost clock results and your stable OC results! I will try to keep this updated - please provide proof (screenshots). Thanks!

Gigabyte Windforce GTX 670 OC - Actual Out of the Box (Shipping) Boost Clock Results:

Destiny = 1163MHz
sontin = 1186MHz
Emo = 1175MHz
Hauk = 1176MHz
Ieat = 1189MHz
Anonymouscrayon = 1202MHz
Caboob = Tri-SLI: Card#1 = 1189MHz, Card#2 = 1189MHz, Card#3 = 1150MHz
Zanover = 1176MHz

Other Nvidia GTX 670 - Out of the Box (Shipping) Boost Clock Results:



Gigabyte Windforce GTX 670 OC - User Overclocking Results (Stable):

OILFIELDTRASH = 1270MHz
Ieat = 1309MHz
Anonymouscrayon = 1342MHz
Caboob = Tri-SLI: Card#1 = 1295MHz, Card#2 = 1290MHz, Card#3 = 1298MHz
Zanover = 1241MHz

Other Nvidia GTX 670 - User Overclocking Results (Stable):
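
If anyone wants a quick way to eyeball the spread as these lists fill up, here's a tiny sketch using the out-of-the-box results collected so far (Caboob's tri-SLI cards counted individually):

```python
# Quick spread summary of the out-of-the-box boost clock results collected above (MHz).
oob_boost = {
    "Destiny": 1163, "sontin": 1186, "Emo": 1175, "Hauk": 1176, "Ieat": 1189,
    "Anonymouscrayon": 1202, "Caboob #1": 1189, "Caboob #2": 1189, "Caboob #3": 1150,
    "Zanover": 1176,
}

clocks = list(oob_boost.values())
print(f"min {min(clocks)}MHz, max {max(clocks)}MHz, "
      f"average {sum(clocks) / len(clocks):.0f}MHz")
```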
 

Zanovar

Diamond Member
Jan 21, 2011
Cheers for that :thumbsup:, I'm still waiting for my Windforce 670 to arrive and looking forward to it, although the thread on the Nvidia forums about stuttering has me concerned.
 

Destiny

Platinum Member
Jul 6, 2010
Cheers for that :thumbsup:, I'm still waiting for my Windforce 670 to arrive and looking forward to it, although the thread on the Nvidia forums about stuttering has me concerned.

Thanks!

Stuttering in games? I haven't experienced that yet...

when you get it let me know what your boost clock is during gaming... From what I'm reading every GTX 670 is different

My Boost Clock is sitting at 1162MHz during gaming at stress loads...:D
 

Jaydip

Diamond Member
Mar 29, 2010
Congrats on getting the card OP, it's really really hot :thumbsup:
 

Termie

Diamond Member
Aug 17, 2005
Nice work and nice card, OP!

One note - the physics test in 3dMark11 is a CPU test, so you may want to note that where you say it hasn't changed with your new card.

Overall, these Gigabyte cards are just in a different class altogether. The size, as you note, could be a problem, but the stock boost and fan noise make them superior to the reference design.

And in response to the poster above about stuttering, Nvidia has determined there is a problem with the drivers relating to vsync and will issue a fix in the next major driver release in June. This is according to Tom's Hardware.
 
May 13, 2009
I'm getting 1270 core 6800 memory for overclocks.

Not sure I'd have upgraded from a 570 though. I went from a GTX 480.
 

Zanovar

Diamond Member
Jan 21, 2011
Nice work and nice card, OP!

One note - the physics test in 3dMark11 is a CPU test, so you may want to note that where you say it hasn't changed with your new card.

Overall, these Gigabyte cards are just in a different class altogether. The size, as you note, could be a problem, but the stock boost and fan noise make them superior to the reference design.

And in response to the poster above about stuttering, Nvidia has determined there is a problem with the drivers relating to vsync and will issue a fix in the next major driver release in June. This is according to Tom's Hardware.

Are you having any probs with stuttering in games?also how is gpu boost functioning with older games or games that need less grunt, any framerate drops in those games?
 

Termie

Diamond Member
Aug 17, 2005
I'm getting 1270 core 6800 memory for overclocks.

Not sure I'd have upgraded from a 570 though. I went from a GTX 480.

Given that a 570 and 480 are about the same thing, are you saying you wouldn't have upgraded either?

By the way, what's the default power target on the Windforce? If it's 100%, I wonder if it's actually getting more power than a reference card at 100%, because running a high-1100s boost at reference power is impossible. I assume at 1270 you have a 122% power target set.

Are you having any probs with stuttering in games?also how is gpu boost functioning with older games or games that need less grunt, any framerate drops in those games?

I got stuttering in BF3 - more like input lag (my character wouldn't move when I moved the mouse). When I ran Borderlands (which has vsync enabled by default), the card never boosted. That could be a result of vsync, though. I also noticed that it will drop clocks in menus when FPS gets really high, so I do think it responds to load to some degree. Just for the heck of it, I loaded up Age of Empires III, and at 150fps (CPU-limited) it ran at the 980MHz boost at stock.

That actually reveals something I never understood - the meaning of the default boost. It's the minimum boost you'll get, and you'll get more if you need it. 980 is the "default boost" for a stock 670, but I've actually never seen it until just now. Edit: now after playing for a few minutes it's boosted and I'm at over 200fps. Guess it actually isn't really limiting itself.
 
May 13, 2009
Given that a 570 and 480 are about the same thing, are you saying you wouldn't have upgraded either?

By the way, what's the default power target on the Windforce? If it's 100%, I wonder if it's actually getting more power than a reference card at 100%, because running a high-1100s boost at reference power is impossible. I assume at 1270 you have a 122% power target set.



I got stuttering in BF3 - more like input lag (my character wouldn't move when I moved the mouse). When I ran Borderlands (which has vsync enabled by default), the card never boosted. That could be a result of vsync, though. I also noticed that it will drop clocks in menus when FPS gets really high, so I do think it responds to load to some degree.

122%? Afterburner only lets me turn it up to 111%. Where can I get the 122?

As far as the 570 v 480. I got tired of my 480 sucking down the wattage and also for some reason the 570 is a bf3 beast compared to the 480. I mainly play bf3.
 

Termie

Diamond Member
Aug 17, 2005
122%? Afterburner only lets me turn it up to 111%. Where can I get the 122?

As far as the 570 v 480. I got tired of my 480 sucking down the wattage and also for some reason the 570 is a bf3 beast compared to the 480. I mainly play bf3.

Ah, that explains some things. Gigabyte has hard-wired an 11% boost in the power offset at default. That explains how, without touching the offset, you're at a higher boost than would otherwise be possible (1160-1180).

If you look at the AnandTech review, you'll see they did something really interesting in their overclocking tests - they ran default, default with the power target maxed, and max overclock. You'll see that for the EVGA Superclocked, the default increase in clocks does almost nothing at all. It's basically a wash unless you manually increase the power target, and then it starts to act like an overclocked card. Gigabyte got around that by upping the power target for you.

46559.png


Source: http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/19

And I didn't realize the 570 beats the 480 in BF3, but I definitely hear you on power use. I was a bit tired of gaming at GTX480 power levels with my 5850 crossfire.
 

Zanovar

Diamond Member
Jan 21, 2011
Termie, sorry for all the questions dude :p - are you able to hold a constant 60 fps with adaptive or normal vsync in Age of Empires 3? Just interested to see if this GPU boost is behaving :p
 

Hauk

Platinum Member
Nov 22, 2001
I ordered two from Amazon when it showed "sold by Amazon" for $399 each. They shipped one and backordered the other; it shows as shipping this week. The system is ready for the second card, and the blue PCB is not so bad. So far an awesome experience. Fast and quiet:
IMG_1268.jpg


I just ran Skyrim, and the latest GPU-Z is reporting only a 44C load temp:
GPUZ.jpg

I guess that could be accurate; it's not an overly demanding title and I have decent airflow. Those are quiet 900 RPM intake fans in the door, open mesh above, quiet 1,200 RPM fans exhausting on my H80, and two quiet 200mm fans (one intake, one exhaust). I find what GPU-Z reports for the 3D clock interesting. Could that be right? I took the screenshot right after exiting the game; GPU-Z was running in the background.

Base clock reports correctly:
GPUZ1.jpg
 

Termie

Diamond Member
Aug 17, 2005
Termie, sorry for all the questions dude :p - are you able to hold a constant 60 fps with adaptive or normal vsync in Age of Empires 3? Just interested to see if this GPU boost is behaving :p

No problem, man!

Here's a graph of temps and clocks while playing a round of Age of Empires with Adaptive Vsync enabled.

90742589.jpg



What you see there is pretty amazing. With the framerate hovering right at 60fps and GPU usage around 50%, I'm gaming at 46C. The real kicker is the clock speed - I was at 589MHz until I jumped out of the menu to take this screenshot. It even radically downclocks the memory from 3200 (6400) to 810 (1600). Clearly, the 670 intelligently manages its clocks to save power and reduce heat. And this will give you a laugh: power use while gaming... a cool 98W. :) Basically, it shut half of itself down to play this game (with adaptive vsync).

I may not have gotten much of a performance increase from my upgrade (5850 CF -> 670), but I just love the technology packed into this thing. The energy efficiency, not just at maximum load but perhaps more importantly below it, is astounding.
 

sontin

Diamond Member
Sep 12, 2011
By the way, what's the default power target on the Windforce? If it's 100%, I wonder if it's actually getting more power than a reference card at 100%, because running a high-1100s boost at reference power is impossible. I assume at 1270 you have a 122% power target set.

The default TDP is higher than the reference design's. My card runs at 1186MHz at less than 100% of the power target. So I think the "average power consumption" target is 170 watts instead of the ~150 watts of the reference GTX 670. And that's the reason why you can only move the power slider up by 11%.
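
If you run the numbers on that, both limits end up at roughly the same absolute ceiling. Here's the back-of-the-envelope version, using the ~150W/~170W figures above and the 122%/111% slider caps mentioned earlier in the thread (these are all estimates from the thread, not official specs):

```python
# Back-of-the-envelope power ceiling comparison (figures are thread estimates, not specs).
reference_target_w = 150   # assumed reference GTX 670 power target
windforce_target_w = 170   # sontin's estimate for the Gigabyte Windforce default

reference_ceiling = reference_target_w * 1.22   # 122% slider cap -> ~183W
windforce_ceiling = windforce_target_w * 1.11   # 111% slider cap -> ~189W
print(f"reference max ~{reference_ceiling:.0f}W, Windforce max ~{windforce_ceiling:.0f}W")
```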
 

Emo

Senior member
Oct 9, 1999
I had stuttering (even when running Unigine) when I tried 301.34 and 301.40, and had to go back to 301.24. My card boosts to 1175 by default and that is enough for me, for now. :)
 

tviceman

Diamond Member
Mar 25, 2008
I've got newegg set to auto-notify me when this card is back in stock. :(

Destiny! When you run some Mafia II benchmarks, do it with PhysX enabled on the highest setting. If you still have your GTX 570 to compare it to, I would be interested in seeing that comparison for the games you have.
 

cmdrdredd

Lifer
Dec 12, 2001
Thanks!

Stuttering in games? I haven't experienced that yet...

when you get it let me know what your boost clock is during gaming... From what I'm reading every GTX 670 is different

My Boost Clock is sitting at 1162MHz during gaming at stress loads...:D

That's pretty much what the boost clock does. I have a hard time understanding how to figure out what it will go to, so when you overclock, the jump is sometimes too much.
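
The way I understand it (and this is just my reading, so take it as an assumption), the offset you dial in simply shifts the boost clock you already observe, and power/temperature headroom decides whether the card actually holds it. A trivial sketch using this thread's numbers:

```python
# Rough rule of thumb (an assumption on my part, not an official formula): the offset
# shifts the boost clock you already see at stock, and the card then steps down in
# ~13MHz bins from there if it hits its power or temperature limits.
def expected_boost(observed_boost_mhz, offset_mhz):
    return observed_boost_mhz + offset_mhz

# Using this thread's numbers: Destiny's card sits at 1162MHz out of the box, so a
# hypothetical +100MHz offset should land near 1262MHz (close to the ~1270MHz stable
# overclocks reported above), headroom permitting.
print(expected_boost(1162, 100))  # 1262
```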
 

Hauk

Platinum Member
Nov 22, 2001
I installed Precision X to get a more detailed account of what's happening during gaming. This is 30 or so minutes of playing Skyrim. It shows the GPU clock all over the place, boosting to 1176MHz max with a max GPU temp of 59C. I had Precision running for monitoring only, no adjustments:
PRECISION.jpg


So this is how Kepler works? I never really took the time to read up on it. Interesting..
 

Destiny

Platinum Member
Jul 6, 2010
Nice work and nice card, OP!

One note - the physics test in 3dMark11 is a CPU test, so you may want to note that where you say it hasn't changed with your new card.

Overall, these Gigabyte cards are just in a different class altogether. The size, as you note, could be a problem, but the stock boost and fan noise make them superior to the reference design.

And in response to the poster above about stuttering, Nvidia has determined there is a problem with the drivers relating to vsync and will issue a fix in the next major driver release in June. This is according to Tom's Hardware.

Thanks for the explanation... I was hoping someone could help me explain some of the test results... I will update.

Also, I don't play with v-sync; maybe that's why I don't notice the stuttering.
 

cmdrdredd

Lifer
Dec 12, 2001
Vsync is the cause of the stuttering. The way they designed it into these drivers for the GTX 600 series causes a problem where your fps drops and you get a really bad stutter. The fps drop doesn't even have to be that big. I have experienced it with both adaptive vsync and regular vsync enabled, and my fps drop was only about 2. It happens at random times, not always in the same spot or when you do the same thing over to try to repeat the effect. Vsync off is the only way around it, but I just don't like screen tearing.