
Gigabyte GTX680 retail pictures

Since it looks like hotclocks are still there, I don't think the shaders can keep up with the core speed.

I'd really like to see an OC scaling review the first day.
 
I play at 1680x1050 and I am planning on buying a 680 once it's released. Why? Because my GTX 560s in SLI aren't adequate to run Battlefield 3 at my preferred settings at a minimum of 60 fps.

The 680 looks like the perfect solution for me as I will get a great performing card and will be able to get rid of my SLI set up.

I just noticed this now. That is a sidegrade; I believe the benches showed the 680 a little slower than a 590?

You might want to buy 2x 680s if you want to actually see a difference, beyond the lack of microstutter, if you even notice that.
 
Since it looks like hotclocks are still there, I don't think the shaders can keep up with the core speed.

I'd really like to see an OC scaling review the first day.

Where have you seen the hotclocks? The card is clocked at 1006MHz, there's no shader clock.
 
The thing people are forgetting is how great the 7870 is for gaming at 1080p. If 20% more performance is worth the money, go for the 680.

I made that 20% up, by the way; I'm just guessing right now like everyone else.
 
holy crap uber 1337!!!

GK104 hit a home run and nV Engineers bring home the gold!

nV Engineers are credit to team!
 
There is now absolutely zero reason to buy a 7970, unless it gets at least a $100 price cut.
So you were able to definitively come to that conclusion based on six benchmarks from one leaked review of an unreleased card to be sold at estimated pricing?

Wow... You're easy to convince. Personally, I plan to wait until all the reviews are in before I form my conclusion.
 
Where have you seen the hotclocks? The card is clocked at 1006MHz, there's no shader clock.

Right or wrong, GPU-Z 0.5.9 is reading a shader frequency.

[GPU-Z screenshot showing a shader clock reading]


GPU-Z 0.6.0 fixed the core frequency reading and dropped the shader reading, but most of the leaks said the GTX 680 will have hotclocks.

Just a guess.
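
If anyone wants to check for themselves once cards are in hand: NVML exposes both clock domains, so on a hotclocked part the SM (shader) clock reads roughly 2x the graphics clock, while on a single-clock design the two match. Rough sketch using the pynvml Python bindings (the 1.5x cutoff is just my own arbitrary threshold):

Code:
# Compare the graphics clock against the SM (shader) clock via NVML.
# On a hotclocked GPU the SM clock is ~2x the graphics clock; if the
# two match, there is no separate shader clock domain.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

gfx = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
sm = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)

print(f"graphics clock: {gfx} MHz")
print(f"SM (shader) clock: {sm} MHz")
print("hotclocks" if sm > gfx * 1.5 else "single clock domain")

pynvml.nvmlShutdown()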
 
The way I understand it, it will basically add whatever little % boost you see to the min fps, which is awesome if it works. You don't need it when you're looking at the sky in BF3, for example, but you'd want it when an RPG comes flying into the scene and explodes.
 
The GPU boost feature may prove useful when using the dynamic vsync option, clocking up to help achieve 60fps, then clocking down when at 60+.
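
Toy sketch of the control loop I'm picturing for that interaction; every name and number here is made up, since nobody outside NV knows the real algorithm:

Code:
# Toy model of GPU boost feeding a 60 fps dynamic-vsync target:
# step the clock up while fps is short of the target and power
# headroom remains, and step back down once 60 fps is reached.
BASE_MHZ, MAX_BOOST_MHZ, STEP_MHZ = 1006, 1058, 13  # boost/step guessed
TARGET_FPS = 60

def next_clock(clock_mhz, fps, headroom_w):
    if fps < TARGET_FPS and headroom_w > 0:
        return min(clock_mhz + STEP_MHZ, MAX_BOOST_MHZ)  # clock up toward 60
    if fps >= TARGET_FPS:
        return max(clock_mhz - STEP_MHZ, BASE_MHZ)       # relax at 60+
    return clock_mhz

clock = BASE_MHZ
for fps, headroom in [(48, 20), (52, 15), (57, 8), (61, 5), (64, 3)]:
    clock = next_clock(clock, fps, headroom)
    print(f"fps={fps:3d} headroom={headroom:2d}W -> {clock} MHz")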
 
I can't believe overclock.net changed that OP link so you have to log in! I read the thread a couple of days ago before they removed the pictures, and they were clearly impressed at being linked by basically every tech forum and getting 100,000-plus views in like 24 hrs, but still, using their minute of fame to get people to register, wow.
 
I can't believe overclock.net changed that OP link so you have to log in! I read the thread a couple of days ago before they removed the pictures, and they were clearly impressed at being linked by basically every tech forum and getting 100,000-plus views in like 24 hrs, but still, using their minute of fame to get people to register, wow.

[Office Space meme image]


Uh, no registration requirement found, bud; the thread is gone. Error message:

Insufficient Permissions
Your account does not have the required permissions to access this page.
Logout and try again with an administrative account, or contact a site administrator for support.
 
Uh, no registration requirement found, bud; the thread is gone. Error message:

Insufficient Permissions
Your account does not have the required permissions to access this page.
Logout and try again with an administrative account, or contact a site administrator for support.

Suitable picture anyway 🙂
Point taken, although it's not entirely clear; it appears you need to "login" to see the thread.
 
The way I understand it, it will basically add whatever little % boost you see to the min fps, which is awesome if it works. You don't need it when you're looking at the sky in BF3, for example, but you'd want it when an RPG comes flying into the scene and explodes.

The GPU boost feature may prove useful when using the dynamic vsync option, clocking up to help achieve 60fps, then clocking down when at 60+.
If the stated effect of this boost, per Nvidia, is to add clock speed when you're NOT using the max resources of your GPU, how will it help in the low-fps sections of a game? AFAIK, those are exactly when you're stressing the GPU to the maximum and pulling the most power.

Isn't that exactly when your frame rate drops?
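
If this boost really is driven by a power target, the heavy moments are exactly where the headroom shrinks, so any min-fps boost would be smaller than what you see in light scenes. Back-of-envelope sketch of what I mean (every wattage figure here is invented):

Code:
# If boost is a power-target controller, the boost you get is whatever
# clock still fits under the power target. A light scene (sky in BF3)
# leaves lots of headroom; an RPG explosion eats most of it, so the
# min-fps boost would be smaller, not zero.
POWER_TARGET_W = 195                  # invented power target
BASE_MHZ, MAX_BOOST_MHZ = 1006, 1058  # base clock leaked, boost guessed

def boost_clock(scene_draw_w):
    headroom = max(POWER_TARGET_W - scene_draw_w, 0)
    return min(BASE_MHZ + headroom, MAX_BOOST_MHZ)  # pretend 1 W ~ 1 MHz

for scene, watts in [("sky", 140), ("firefight", 180), ("RPG explosion", 193)]:
    print(f"{scene:13s}: ~{boost_clock(watts)} MHz")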
 