Gigabyte GTX680 retail pictures

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
A 225W limit might actually be a bad thing.
It means the TDP should be low, but look at the 7970: its "TDP" is 250W, while typical power use is more like 190W.

By capping the card at 225W within spec, it might limit overclocking. It would be nicer to see 300W of potential power with a below-225W TDP.
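To put numbers on the 225W figure: per the PCI Express spec, the x16 slot supplies up to 75W, each 6-pin plug another 75W, and each 8-pin plug 150W. A minimal sketch of that budget math (the helper function is mine, for illustration only):

```python
# In-spec power ceilings per the PCIe spec: 75 W from the x16 slot,
# 75 W per 6-pin auxiliary plug, 150 W per 8-pin auxiliary plug.
SLOT_W = 75
PLUG_W = {"6-pin": 75, "8-pin": 150}

def board_power_limit(plugs):
    """Return the in-spec power ceiling (watts) for a card with the given aux plugs."""
    return SLOT_W + sum(PLUG_W[p] for p in plugs)

print(board_power_limit(["6-pin", "6-pin"]))  # 225
print(board_power_limit(["6-pin", "8-pin"]))  # 300
```

So a 2x6-pin card is capped at 225W within spec, while a 6+8 layout would allow 300W of headroom.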
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Curious that the box reads DirectX 11 -- not DirectX 11.1.

I guess after the fad with DX10.1...NVIDIA thinks it's a moot point....I know I do.
How many DX11.1 titles do you think we will see...before DX12? ;)
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Curious that the box reads DirectX 11 -- not DirectX 11.1.

Regardless of what instruction set Kepler will support, I anticipate DX11.1 becoming about 1/3 as popular as GPU-accelerated PhysX was last year. In other words, there might be two games that utilize DX11.1 features, and those will most likely be entirely transparent from an end user's point of view (other than an option check-marked in a game's visual options menu).

I'm not making excuses for Kepler if it doesn't support DX11.1 -- it probably should, given that AMD's hardware supports it and was released 2 1/2 months before Nvidia's. I'm just stating what I think the scope of DX11.1's importance will be.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
A 225W limit might actually be a bad thing.
It means the TDP should be low, but look at the 7970: its "TDP" is 250W, while typical power use is more like 190W.

By capping the card at 225W within spec, it might limit overclocking. It would be nicer to see 300W of potential power with a below-225W TDP.

I wondered this myself when the 2x6-pin information was all but confirmed with the leaked stacked PCIe power plugs, but I don't think it will be a huge issue -- not with reference-based cooling, anyway. At the same time, though, I too would like to see a 1x6 + 1x8 setup.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
A 225W limit might actually be a bad thing.
It means the TDP should be low, but look at the 7970: its "TDP" is 250W, while typical power use is more like 190W.

By capping the card at 225W within spec, it might limit overclocking. It would be nicer to see 300W of potential power with a below-225W TDP.

You still need to put GK104 in context with its "Kepler" brothers and sisters...and the GF104...*hint-hint*

Unless, of course, you think that this GPU is the top GPU from NVIDIA this round...
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It probably is the flagship GPU for the 6XX series, one may imagine. I doubt it is going to be the flagship GPU for the Kepler architecture.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Regardless of what instruction set Kepler will support, I anticipate DX11.1 becoming about 1/3 as popular as GPU-accelerated PhysX was last year. In other words, there might be two games that utilize DX11.1 features, and those will most likely be entirely transparent from an end user's point of view (other than an option check-marked in a game's visual options menu).

I'm not making excuses for Kepler if it doesn't support DX11.1 -- it probably should, given that AMD's hardware supports it and was released 2 1/2 months before Nvidia's. I'm just stating what I think the scope of DX11.1's importance will be.

DX11.1 has some improvements for Stereo 3D -- this is why it is surprising. nVidia may still offer 11.1 capabilities.
 

MrTeal

Diamond Member
Dec 7, 2003
3,898
2,621
136
Why would 6pin vs 8pin make a big difference in overclocking? It's not like you're hard limited to 225W with 2x6pin. All adding another two ground pins will do is slightly lower voltage drop and keep your ground reference from creeping up, but the extra couple amps going from 225W to 250W shouldn't be a big deal. I would say that the power supply design on the card is an order of magnitude more important than the extra two grounds on the connectors.
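The "extra couple amps" point can be checked with back-of-the-envelope math: nearly all of the card's power comes off the 12V rail, so the total supply current at 225W versus 250W differs by only about 2A. A rough sketch (the function name is mine, and it ignores converter efficiency and the 3.3V slot rail):

```python
# Back-of-the-envelope 12 V supply current; ignores VRM efficiency and the
# small 3.3 V slot rail, so treat these as rough upper-bound figures.
RAIL_V = 12.0

def rail_current_amps(power_w):
    """Total 12 V current needed to deliver power_w watts to the card."""
    return power_w / RAIL_V

print(f"{rail_current_amps(225):.2f} A")  # 18.75 A
print(f"{rail_current_amps(250):.2f} A")  # 20.83 A
print(f"delta: {rail_current_amps(250) - rail_current_amps(225):.2f} A")  # delta: 2.08 A
```

That ~2A is spread across all the 12V pins together, which is why the two extra ground pins of an 8-pin plug matter less than the card's own power delivery design.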
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
So it looks like power consumption at least will be good. Happy to see this won't be a room furnace like GF100.

But then again, this is to be expected. Remember that GK104 replaces GF114, and the GTX 560 Ti had 2x 6-pin PCIe connectors. TDP at full load should be roughly 200W; it should consume a similar amount of power as the GTX 560 Ti, which wasn't half bad when it came to efficiency.

He ran Crysis 1, but at 1600x900, which is a pretty useless resolution for this type of card. Maybe when we see some benches at the target 1920x1200...
 

Mopetar

Diamond Member
Jan 31, 2011
8,345
7,416
136
You still need to put GK104 in context with its "Kepler" brothers and sisters...and the GF104...*hint-hint*

Unless, of course, you think that this GPU is the top GPU from NVIDIA this round...

What do you mean by 'this round'? If they call it the 680, it's going to be their biggest GPU until the 700 series launches, whenever that is.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
What do you mean by 'this round'? If they call it the 680, it's going to be their biggest GPU until the 700 series launches, whenever that is.

I don't care for "monikers" on cards.
I care about the GPU on the board.

GK104 has the same performance whether they call it 680GTX, 780GTX or 880GTX...the latter being a useless marketing name.

You are eating up all the PR FUD...but neglecting the devil in the details.
GF104 -> GK104.

Now try to add this:
GF110 -> GK110

You might think the 680GTX is high end.
I think it's NVIDIA's planned midrange GPU (GK104 A2), rebranded from a 660Ti to a 680GTX...due to the (lack of) performance of the 7970.

That's why NVIDIA has been so quiet...it's another G80 round.
 

Nintendesert

Diamond Member
Mar 28, 2010
7,761
5
0
Regardless of what instruction set Kepler will support, I anticipate DX11.1 becoming about 1/3 as popular as GPU-accelerated PhysX was last year. In other words, there might be two games that utilize DX11.1 features, and those will most likely be entirely transparent from an end user's point of view (other than an option check-marked in a game's visual options menu).

I'm not making excuses for Kepler if it doesn't support DX11.1 -- it probably should, given that AMD's hardware supports it and was released 2 1/2 months before Nvidia's. I'm just stating what I think the scope of DX11.1's importance will be.

My first guess is that it's just marketing: stating DX11 is enough, versus writing out DX11.1, which nobody really knows the meaning of. The people who do will read reviews and find out whether the chips actually do DX11.1.
 

moriz

Member
Mar 11, 2009
196
0
0
I don't care for "monikers" on cards.
I care about the GPU on the board.

GK104 has the same performance whether they call it 680GTX, 780GTX or 880GTX...the latter being a useless marketing name.

You are eating up all the PR FUD...but neglecting the devil in the details.
GF104 -> GK104.

Now try to add this:
GF110 -> GK110

You might think the 680GTX is high end.
I think it's NVIDIA's planned midrange GPU (GK104 A2), rebranded from a 660Ti to a 680GTX...due to the (lack of) performance of the 7970.

That's why NVIDIA has been so quiet...it's another G80 round.

I fail to see how any of this is relevant. If NVIDIA decides to release it as a high-end product and price it as such, then it is a high-end product. The codename means nothing.

In the end, the GPU codename and the product name are just names. It's the price and performance that really matter.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
I don't care for "monikers" on cards.
I care about the GPU on the board.

GK104 has the same performance whether they call it 680GTX, 780GTX or 880GTX...the latter being a useless marketing name.

You are eating up all the PR FUD...but neglecting the devil in the details.
GF104 -> GK104.

Now try to add this:
GF110 -> GK110

You might think the 680GTX is high end.
I think it's NVIDIA's planned midrange GPU (GK104 A2), rebranded from a 660Ti to a 680GTX...due to the (lack of) performance of the 7970.

That's why NVIDIA has been so quiet...it's another G80 round.

Which Nvidia authority was it that said this is a mid range part, despite its flagship x80 designation?

Just wondering.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I fail to see how any of this is relevant. If NVIDIA decides to release it as a high-end product and price it as such, then it is a high-end product. The codename means nothing.

In the end, the GPU codename and the product name are just names. It's the price and performance that really matter.

Then wait for later this year...it will be very clear to you then.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Which Nvidia authority was it that said this is a mid range part, despite its flagship x80 designation?

Just wondering.

Part A:
GTX680

Part B:
GK104 A2

Which one matters?
Which one isn't just an empty name, put on by marketing?
Which one carries a mid-range GPU moniker?

I thought people were more technical here...and didn't just gobble the PR Kool-Aid without any thought...was I wrong?



That last inflammatory flamebait quip just bought you a vacation. You will be released from vacation two days after Kepler's release, however far off that may be.

Public Service Announcement for the rest of the readership - do not post in kind as Lonbjerg has done unless you wish to join him in his vacation.

The baiting and flaming in the run-up to Kepler release has been unacceptable, it ends now.

Administrator Idontcare
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Then wait for later this year...it will be very clear to you then.

I agree with you, Lonbjerg, that there is of course a 110 Kepler in development. However, as was stated, Nvidia calls this a 680 and prices it high end. So that's the high end.

There's a chance 110 will never come out, or will be extremely delayed. Kinda makes me sad. I wanted the second coming of the 8800GTX / 9700Pro.
 

MrTeal

Diamond Member
Dec 7, 2003
3,898
2,621
136
That last inflammatory flamebait quip just bought you a vacation. You will be released from vacation two days after Kepler's release, however far off that may be.

Public Service Announcement for the rest of the readership - do not post in kind as Lonbjerg has done unless you wish to join him in his vacation.

The baiting and flaming in the run-up to Kepler release has been unacceptable, it ends now.

Administrator Idontcare

Not so much against Lonbjerg specifically since his post was not nearly as bad as some on here have been, but +1. I come here to see if any new information has leaked on Kepler and to hopefully discuss what the leaks might mean. The atmosphere around here lately has been terrible for a place that has specific rules about maintaining civility.
 

moriz

Member
Mar 11, 2009
196
0
0
Then wait for later this year...it will be very clear to you then.

So, in translation: eventually NVIDIA will release a faster card that will replace the GTX680, which may or may not be at the same price point?

Really? You don't say.

It's all just names. If it performs like a high-end product and is priced like a high-end product, then it IS a high-end product. I couldn't care less what they call it, both its product name and its codename.
 

Quantos

Senior member
Dec 23, 2011
386
0
76
The possibilities really aren't endless.

It's one of two things:

1. The GTX680 is actually mid range and the GK104 codename is the valid piece of information. The 680 name then loses its high-end meaning, though only in the long run, whenever GK110 is released.

2. The GTX680 is actually high end and the GTX680 name is the valid piece of information. GK104 then loses its mid range meaning, again until GK110 is released.

All this is pending the release of GK110, but until then, no one can argue that the GTX680 is mid range. It just isn't. It might become mid range at some point, but only if GK110 is actually the same generation. If GK110 simply supersedes GK104 with the name of GTX780, then it just becomes the new generation high end, and GTX680 becomes the old generation high end.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
That same poster posted results of Crysis benchmarks in that same thread on OCN and got pretty terrible results, to be honest, but he's using an i3....

I'm not sure if the CPU is a limiting factor for a game that old; it's mostly GPU limited IIRC, especially at very high quality settings. Hopefully we'll get data on a proper test bed soon. I sure hope his Crysis results are a fluke.

Edit: His Heaven DX11 results are low too. I'm pretty sure that is GPU limited for the most part. Not sure if he has the most up-to-date driver, but he is using the driver on the install CD, ForceWare 300.65.
 