If it's stable in a few loops of 3DMark 11, AvP and Heaven, then it's stable, in my experience.
LOL, another thread headed for the disaster zone. This is just overclocking in general. Stable in some cases but not in others? Then your overclock is not stable; dial it down until it's stable in all cases.
What kind of stress testing do you guys do? I wasn't able to hit 1300 core (I crashed in my 5th or 6th hour of Heaven), but when I overclock I run Heaven for 8 hours or so, and that hasn't failed me yet. I used to use FurMark, but it seems like the card throttles itself; I'm also on water, so maybe that has something to do with it. MrK6's 7970 is at 1350 core, and I remember another member of our forums had his at 1375 core. All under water, though; I wouldn't attempt these kinds of overclocks with stock coolers.
At $500-$550 this really throws a wrench into my plans of waiting for 20nm.
:hmm:
Why in the world is it even remotely a fair comparison to pit one massively overclocked card against one at stock speeds? And the HD 7970 was absolutely not a "huge increase" over an overclocked GTX 580.
What in the world are you smoking??
its cheating just like Intel does with the i7 and i5. <sarcasm>
Because that is the default behaviour of the GTX 680 out of the box... it would be worse to artificially cap the clocks, don't you think?
its cheating just like Intel does with the i7 and i5. <sarcasm>
Funny, now that AMD CPUs and GPUs do that too...
yes its not exactly the same but the main point was that when Intel came out with the i7 so many morons called it cheating too. as for Kepler, anybody capable of reading and that keeps up with hardware knows max boost is going be higher than the advertised typical boost clock. and most cards with the same advertised clocks will not vary all that wildly for max boost. take the stock gtx680 which lists 1058 for typical boost. most cards have max boost of 1100-1150. big deal
What? Why are you posting stuff like this? You know that nVidia GPU Boost isn't at all the same as what AMD does with their CPU/GPU boost. Nor Intel with theirs.
Yes, it's not exactly the same, but the main point was that when Intel came out with the i7, so many morons called that cheating too. As for Kepler, anybody capable of reading who keeps up with hardware knows the max boost is going to be higher than the advertised typical boost clock, and most cards with the same advertised clocks will not vary all that wildly in max boost. Take the stock GTX 680, which lists 1058 as its typical boost; most cards have a max boost of 1100-1150. Big deal.
"Kyle saw a GTX 680 sample card reach over 1300MHz running live demos but it could not sustain this clock"True enough about the complaining about Intel's turbo feature. Just like people complained about comparing 8 core FX CPU's against 4 core Intel CPU's.
Some review cards boosted much higher, though, and that's the big difference. How many of the 680s people own will boost to 1300MHz? Intel's and AMD's boost features boost to a predetermined clock, which is the same across all chips.
"Kyle saw a GTX 680 sample card reach over 1300MHz running live demos but it could not sustain this clock"
I agree 1300 does sound high, but briefly hitting that in one game doesn't mean much. We all know boost can vary a bit from game to game, and max boost may not last, or even be needed, for long. It is more confusing, but you still get a minuscule overall average difference between cards with the same advertised clocks.
5% lower clock speed is probably a 2% difference in fps.
Dynamic boost is the future.
It's a better solution than Intel's. It needs work, but overall it is genius: dynamically adjusting clock speeds based on a set power envelope... what's not to like? Hopefully they can work in VRAM speed as well!
If your i7 could self-clock to 4.5GHz instead of 3.6GHz, while using 77W of power instead of 56W, in a dual-threaded game, wouldn't you be impressed? Any self-respecting enthusiast who appreciates the technology aspect of this hobby would be.
The problem with boost is that it came packaged with strict voltage and TDP caps; who says it has to?
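The "set power envelope" idea from the posts above can be sketched as a simple control loop: keep raising the core clock while estimated board power stays under a fixed cap, and back off when it goes over. This is a toy Python illustration only; the base clock, 13MHz step, quadratic power model, and 170W cap are all made-up assumptions, not how NVIDIA's actual GPU Boost firmware works.

```python
# Toy sketch of a GPU Boost-style controller (all numbers hypothetical).
# Power is modeled as scaling with the square of the clock, just for shape.

def estimate_power(clock_mhz, base_clock=1006, base_power=130.0):
    """Hypothetical power model: power grows with the square of the clock."""
    return base_power * (clock_mhz / base_clock) ** 2

def boost_step(clock_mhz, power_cap_w=170.0, step_mhz=13):
    """One control iteration: step up if it fits the cap, down if over it."""
    if estimate_power(clock_mhz + step_mhz) <= power_cap_w:
        return clock_mhz + step_mhz          # headroom left: boost higher
    if estimate_power(clock_mhz) > power_cap_w:
        return clock_mhz - step_mhz          # over budget: throttle back
    return clock_mhz                         # at the cap: hold steady

clock = 1006
for _ in range(30):
    clock = boost_step(clock)
print(clock)  # settles at 1149, the highest step whose modeled power fits the cap
```

With these made-up numbers the loop settles in the 1100-1150 range mentioned earlier, which is the point: the ceiling falls out of the power cap, not a fixed clock table.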
Do you think we could lobby MSI to use larger fans with their Twin Frozr heatsink? 90mm fans make so much noise when spinning up. I swear the Twin Frozr series back in 2010 (on the GTX 400-series cards) had larger fans, because I had a GTX 465 that was dead quiet up to 50% fan speed. My PE GTX 670 is audible at 45% and above.
I think there is room for an easy redesign of their shroud to accommodate 100mm fans. Let's do it.
I wonder whatever happened to the MSI Titan? There had been some hints of something, and then... nothing.
