[Semiaccurate] GK104/Kepler/GTX680 Next Week?


notty22

Diamond Member
Jan 1, 2010
3,375
0
0
If true, I hope the "Turbo Boost" can be disabled for regular overclockers.


I bet it can. In fact I will guess how it works. It will be bios implemented, but changeable only through the Nvidia control panel and will require a re-boot to enable and disable it.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Wish I owned a popcorn company.

*Grabs some popcorn, awaits epic GPU media battle*

Wonder if clock auto-adjusting Kepler SKUs will carry a premium.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
I don't understand that "Speed Boost" thingie. Any modern card already has several core and memory frequencies for different types of work and load. What's the difference here?
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
This Speed Boost" thingie is a function of TDP and the required GPU load

so pretty much like Intel Turbo
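
If it really is TDP-driven, the loop might look something like this rough Python sketch. Every number and helper name in it (read_board_power, set_core_clock) is made up for illustration, not anything Nvidia has published:

[code]
# Hypothetical sketch of a TDP-driven boost loop; not Nvidia's actual algorithm.
# All numbers and helper names here are invented for illustration.

BASE_CLOCK_MHZ = 700
MAX_BOOST_MHZ = 1000
STEP_MHZ = 13            # boost in small bins
TDP_WATTS = 195

def boost_step(current_mhz, board_power_watts):
    """Step the core clock up while there's power headroom, back off when over budget."""
    if board_power_watts < 0.95 * TDP_WATTS and current_mhz < MAX_BOOST_MHZ:
        return current_mhz + STEP_MHZ   # headroom left: clock up
    if board_power_watts > TDP_WATTS and current_mhz > BASE_CLOCK_MHZ:
        return current_mhz - STEP_MHZ   # over budget: clock down
    return current_mhz                  # hold

# The driver would presumably call something like this every few milliseconds:
#   clock = boost_step(clock, read_board_power())   # read_board_power() is made up
#   set_core_clock(clock)                           # so is set_core_clock()
[/code]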
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
It's not a problem with Fermi; my 470s will switch between 405 MHz and whatever overclock speed I have in some games, depending on the load.

Sometimes one will be running at 60% load and 405 MHz while the other is at like 30% and 900 MHz.

At least when using an FPS limiter/vsync.
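
Something like this is presumably what the driver is doing; a rough sketch with guessed thresholds, nothing from any actual spec:

[code]
# Rough illustration of load-based P-state switching, the kind of thing Fermi
# already does under an FPS cap. Thresholds and clocks are guesses, not spec.

P_STATES_MHZ = [51, 405, 900]   # idle / low-power 3D / full 3D (900 = example OC)

def pick_pstate(gpu_load_percent):
    """Drop to a lower clock when the frame limiter leaves the GPU mostly idle."""
    if gpu_load_percent < 15:
        return P_STATES_MHZ[0]
    if gpu_load_percent < 40:
        return P_STATES_MHZ[1]   # the familiar 405 MHz state
    return P_STATES_MHZ[2]

print(pick_pstate(10))   # 51
print(pick_pstate(30))   # 405
print(pick_pstate(80))   # 900
[/code]

Of course, once a card drops to 405 MHz the same amount of work shows up as a higher utilization percentage, which is probably why one card reads 60% at 405 MHz while the other reads 30% at 900 MHz.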
 

thilanliyan

Lifer
Jun 21, 2005
11,864
2,066
126
BallaTheFeared said:
It's not a problem with Fermi; my 470s will switch between 405 MHz and whatever overclock speed I have in some games, depending on the load.

I think the bigger issue is memory clocks. IIRC there have been issues with memory clocks when switching between power states. Can't remember if it was AMD, nV, or both that had that problem.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
If this is a turbo core feature, that probably wouldn't be an issue though, right, since only the core/shader frequency would be in flux?
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
Oh boy, it's not like Nvidia's TDP ratings aren't too misleading already. "Hey guys, it's a 250W part, we promise!*"

*Unless you're thinking of actually, you know, using it.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Well then, maybe instead of putting a 600W PSU in her rig, I'll go ahead and up that to a 750W. Just in case.

Turbo Boost in our video cards? That is actually a nifty idea. I wonder if it will work like Intel's technology, where it idles part of the chip and ramps up the working end, or if it will just be like my old 2004 MSI motherboard with DOT (Dynamic Overclocking Technology), which overvolted the processor and raised the multiplier a few ticks when some arbitrary limit was reached (I never did figure out what that limit was).

Bring the new tech on!
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Arzachel said:
Oh boy, it's not like Nvidia's TDP ratings aren't too misleading already. "Hey guys, it's a 250W part, we promise!*"

*Unless you're thinking of actually, you know, using it.
*Desktop and decoding usage only.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
That ship sailed long ago - TDP is under control.
Chip size alone should tell you that.
Then remember Apple is back, and all those laptop wins...

On power envelope alone, AMD will have trouble selling it to a PETA Beowulf cluster built by Al Gore.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
I can understand multi-core turbo stepping when only one core is used, allowing it to clock up to the TDP limit, but if this is just one core, why not just clock it as high as possible whenever it's in 3D?
Will there be an FPS threshold before it kicks in? It sounds cool, and good on NV for leading GPU hardware again, but it seems like a bit of a gimmick.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Arzachel said:
Oh boy, it's not like Nvidia's TDP ratings aren't too misleading already. "Hey guys, it's a 250W part, we promise!*"

Oh boy... talk about barking up the wrong tree

:whistle:

Speaking of the "Samaritan" demo, Rein noted that when they showed it off last year, it took three Nvidia cards and a massive power supply to run. However, they showed the demo off again, this time running on a new, not-yet-released Nvidia card and one 200 watt power supply.


http://www.gamesindustry.biz/articles/2012-03-08-gdc-epic-aiming-to-get-samaritan-into-flash
 

PhoenixEnigma

Senior member
Aug 6, 2011
229
0
0
SolMiester said:
I can understand multi-core turbo stepping when only one core is used, allowing it to clock up to the TDP limit, but if this is just one core, why not just clock it as high as possible whenever it's in 3D?
Will there be an FPS threshold before it kicks in? It sounds cool, and good on NV for leading GPU hardware again, but it seems like a bit of a gimmick.

Probably similar to the all-core turbo on CPUs. Not all instructions use the same amount of power, there's usually a gap between the maximum specified ambient temperature and the actual ambient, and there are probably a couple of other factors. If there's a safe way to eat into some of that headroom, why not grab the extra couple of FPS?
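
To put very rough numbers on that headroom (every figure here is an assumption, just to show the idea):

[code]
# Back-of-the-envelope headroom estimate; everything here is a guess.

tdp_watts = 195          # say the card is rated 195 W at worst-case conditions
measured_watts = 160     # what a typical game workload might actually draw
spec_ambient_c = 45      # worst-case ambient the rating assumes (assumption)
room_ambient_c = 25      # a normal room

power_headroom = tdp_watts - measured_watts          # 35 W of unused budget
thermal_headroom = spec_ambient_c - room_ambient_c   # 20 C of extra cooling margin

print(power_headroom, "W and", thermal_headroom, "C of margin")
# If power scales roughly with clock, 35 W on a 160 W draw is on the order
# of a 20% clock bump before the rated TDP is reached.
[/code]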
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Does that mean that one 200W PSU alone is powering the card or the whole system? If the whole system, this thing must run at negative power!

WTB clarification.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Must be off on the wattage, or referring to one of those secondary GPU-only power supplies.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Well, the CPU would probably be using a third of that or more. That demo doesn't look Celeron-friendly.
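
Rough arithmetic, if that 200 W really is the whole box (every number below is a guess):

[code]
# Quick sanity check on the 200 W figure; all numbers are rough guesses.

psu_watts = 200
cpu_watts = 70       # "a third of that or more" for a gaming CPU under load
rest_watts = 30      # board, RAM, drive, fans

gpu_budget = psu_watts - cpu_watts - rest_watts
print(gpu_budget)    # ~100 W left for the GPU if 200 W is the whole system
[/code]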
 