GTX 680 launch video - Narrated by Ujesh Desai, VP Corporate Marketing - Nvidia

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
Finally. The only thing I had against Nvidia was its lack of multi-monitor support. You were limited to either buying a second card or sticking with dual monitors, unlike ATI, which supports up to six monitors on some cards. If what this guy said is true, Nvidia finally realized its error and is now letting you run triple monitors off one card.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
Finally. The only thing I had against Nvidia was its lack of multi-monitor support. You were limited to either buying a second card or sticking with dual monitors, unlike ATI, which supports up to six monitors on some cards. If what this guy said is true, Nvidia finally realized its error and is now letting you run triple monitors off one card.

Only thing missing in the video was a green colored progress bar :biggrin:

Most likely you'd still need a second card to get decent framerates anyway. The same holds true for AMD's cards. For those who run multiple monitors but game on a single screen, though, it's a plus.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
I swear, I have no idea where they're going with this GPU Boost thing. Why would I want to waste power when rendering gets simpler and performance is already sufficient?

It's counterproductive.
 

96Firebird

Diamond Member
Nov 8, 2010
5,740
337
126
For those who run multiple monitors but game on a single screen, though, it's a plus.

This is me. Finally I can get a second monitor for home and work on one screen, browse on another, and keep my TV hooked up for movies with one Nvidia card. :thumbsup:
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
I swear, I have no idea where they're going with this GPU Boost thing. Why would I want to waste power when rendering gets simpler and performance is already sufficient?

It's counterproductive.

I think you've got it arse backwards, as far as I can tell.

Looks more like it gives you a little more oomph when it's actually needed. Kinda similar to Turbo Boost, I guess.
 

superjim

Senior member
Jan 3, 2012
293
3
81
I think you've got it arse backwards, as far as I can tell.

Looks more like it gives you a little more oomph when it's actually needed. Kinda similar to Turbo Boost, I guess.

How do you define "when it's needed"? In my book it's always needed if the fps isn't greater than 59.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I swear, I have no idea where they're going with this GPU Boost thing. Why would I want to waste power when rendering gets simpler and performance is already sufficient?

It's counterproductive.

I think we'll have to see how it ends up working out in the real world before passing judgement, but I see where you are coming from with this.

I imagine in an overclocking scenario one would just set the card to run at max clocks all the time (in 3D) and set the boost gap to zero, if that's possible. Granted, it may be the case that the card can't run those clocks all the time but is able to do so only in boost mode when the GPU is not being fully utilized.
 

omeds

Senior member
Dec 14, 2011
646
13
81
How do you define "when it's needed"? In my book it's always needed if the fps isn't greater than 59.

I think you just answered your own question. Pretty sure I saw a slider for desired frame rate. Will be handy with the new dynamic vsync feature too.
 

kami

Lifer
Oct 9, 1999
17,627
5
81
Different games stress the card in different ways, just like how FurMark is way more demanding than any game.

Just like Intel's Turbo Boost: the full capability of the chip isn't being utilized, so it boosts what IS being utilized, since power-wise and temperature-wise it's able to.

At least that's how I understand it... I could be wrong. I think it works with overclocking too, so it will boost from your overclocked base if it's able to.

Interested in seeing real-world results.
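The Turbo Boost-style reading above (step clocks up while power and thermal headroom remain, step back down otherwise) can be sketched as a simple control loop. This is purely illustrative guesswork at the concept, not Nvidia's actual algorithm: the step size, power target, temperature limit, and boost ceiling below are invented for the example; only the 1006 MHz base clock matches the GTX 680's launch spec.

```python
# Hypothetical sketch of a turbo-boost-style clock control loop.
# All constants except the base clock are illustrative, not real values.

BASE_CLOCK_MHZ = 1006   # GTX 680 base clock (launch spec)
BOOST_STEP_MHZ = 13     # illustrative step size per interval
MAX_BOOST_MHZ = 1110    # illustrative boost ceiling
POWER_TARGET_W = 195    # illustrative board power target
TEMP_LIMIT_C = 98       # illustrative thermal limit

def next_clock(current_mhz, power_draw_w, temp_c):
    """Return the clock for the next interval based on available headroom."""
    if power_draw_w < POWER_TARGET_W and temp_c < TEMP_LIMIT_C:
        # Headroom available: step the clock up toward the ceiling.
        return min(current_mhz + BOOST_STEP_MHZ, MAX_BOOST_MHZ)
    # No headroom: step back down toward the base clock.
    return max(current_mhz - BOOST_STEP_MHZ, BASE_CLOCK_MHZ)
```

Run repeatedly, a loop like this would also "boost from your overclocked base" if `BASE_CLOCK_MHZ` were simply replaced with the overclocked value.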
 

Granseth

Senior member
May 6, 2009
258
0
71
I think this boost is a complex thing.

It's very nice in games that have trouble with framerate, especially with 3D Vision, even though it probably won't do anything when there's a lot happening on screen at the same time.

But it seems like it will boost fps unnecessarily when it's already high, which will give a strange result in the average framerates in reviews. What I mean is, it will probably not increase the minimum framerate, so sometimes the extra frames will do little to enhance the game experience but will bump the 680 up in the charts.

But then again, who can complain about getting some extra performance?
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I think you just answered your own question. Pretty sure I saw a slider for desired frame rate. Will be handy with the new dynamic vsync feature too.

lol dude that wasn't a slider, that was a frame rate counter :confused:
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Very cool, I liked everything he talked about. Can't wait to SLI two of them. I want the higher-memory model though.
 

Nintendesert

Diamond Member
Mar 28, 2010
7,761
5
0
I'm hoping GPU Boost works the way Kami stated; that would, I suppose, help improve the minimum framerate, which to me is much more important than the max. Those dips are what's noticeable, but going from 80 to 110 fps might not be.

I can't wait for some of the reviews to hit!
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
The biggest thing for me is the multi-screen support. That was the only thing edging me toward AMD this round, because I know I'll be going to three screens for productivity, not necessarily gaming. I hope they fixed the P12 multi-screen issue though; I don't want the card in 3D mode all the time.

NDA up tonight with this press release?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
I'm hoping GPU Boost works the way Kami stated; that would, I suppose, help improve the minimum framerate, which to me is much more important than the max. Those dips are what's noticeable, but going from 80 to 110 fps might not be.

I can't wait for some of the reviews to hit!

If this is how their GPU boost tech works, then it's lame.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I thought he explained it well and the video did a good job of conveying the concept.

When you're inside the carrier in BF3 there isn't a need for high clocks, as it's already rendering well above 60 fps. However, when it starts to render the waves, other ships, rain, and everything else, it quickly boosts itself up to prevent a sudden fps loss. When he turns the corner and lessens the rendering load, it starts to drop boost levels.

This, coupled with their new vsync, could be the solution people who are sensitive to microstutter are looking for, or it might not be. Hopefully you can actually control the dynamic boost levels, allowing you to run your card much more efficiently instead of going full throttle with massive overclocks all the time that do nothing but waste power and create excessive heat.
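The behavior described above (low clocks inside the carrier, boosting as the waves and rain pile on) amounts to mapping rendering load to a clock speed. Here is a mock-up of that idea; the utilization thresholds and clock values are invented for illustration and are not Nvidia's implementation.

```python
# Illustrative mapping from GPU load to clock speed: light scenes run at
# low clocks, heavy scenes get boosted, with a linear ramp in between.
# All thresholds and clocks are made-up numbers for the example.

LOW_CLOCK_MHZ = 1006
HIGH_CLOCK_MHZ = 1110

def clock_for_load(gpu_utilization_pct):
    """Pick a clock for the measured GPU utilization (0-100)."""
    if gpu_utilization_pct >= 90:
        return HIGH_CLOCK_MHZ  # scene got heavy: boost to avoid an fps drop
    if gpu_utilization_pct <= 60:
        return LOW_CLOCK_MHZ   # light scene: save power and heat
    # In between: interpolate linearly between the two clocks.
    frac = (gpu_utilization_pct - 60) / 30
    return round(LOW_CLOCK_MHZ + frac * (HIGH_CLOCK_MHZ - LOW_CLOCK_MHZ))
```

If the boost levels were user-controllable as hoped, that would correspond to exposing these thresholds and clock endpoints as settings.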
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
That is actually a really cool video! And honestly, no PR BS o_O

- fastest single GPU
- focus on low noise and great perf/watt
- GPU Boost - looks nice! Boosts clocks from stock when power usage permits
- multi-monitor support off a single card ("me too", but still nice)

I wonder if they will listen to the gamers regarding its price though ;)