GTX 680 Longevity question

Fuelrod

Senior member
Jul 12, 2000
369
0
76
As I don't keep up with what is in the pipeline as much as I used to, can someone who does help me out? I remember purchasing the GTX 480 only to be miffed when, six months later, they came out with the GTX 580 back in 2010. I want to buy the GTX 680, but I would wait if there is something better coming by the end of the year (significantly better, like the 480 to 580 jump was). When is the next major bump coming: 6 months, 12 months, 18 months? What are the thoughts here on how long the legs are on the GTX 680?
 
Last edited:

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
Shorter than the 480. BigK will likely come this year - when exactly remains to be seen. I'm pretty sure it will bring a larger performance increase over the 680 than the 580 brought over the 480.

Why is my post in front of yours??? :)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I think it will be anywhere from the 6 months we saw with 480>580 to maybe early 2013 at the latest. They won't drop GK110 in 3 months because that would be too much of a finger to the people who bought the 680 (also everyone who went EVGA would step up :D )

I don't think it will be all that long though, and I expect that whereas the 680 is only about 35% faster than a GTX 580, GK110 will be more in the area of 50-80% faster.

I couldn't wait myself; I'd had enough of my current setup, and I can get about 20% more performance out of two 680s without the headaches of the 480s. If you're worried about feeling the way you did about the 480 to 580, I would wait; it won't be that long in the scheme of things.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Grooveriding said:
I think it will be anywhere from the 6 months we saw with 480>580 to maybe early 2013 at the latest. They won't drop GK110 in 3 months because that would be too much of a finger to the people who bought the 680 (also everyone who went EVGA would step up :D )

I don't think it will be all that long though, and I expect that whereas the 680 is only about 35% faster than a GTX 580, GK110 will be more in the area of 50-80% faster.

I couldn't wait myself; I'd had enough of my current setup, and I can get about 20% more performance out of two 680s without the headaches of the 480s. If you're worried about feeling the way you did about the 480 to 580, I would wait; it won't be that long in the scheme of things.

They won't drop GK110, not because of the "finger," but because there is no need to. The 480 had some pretty significant problems with power, heat, and noise. The 580 was needed. The GTX 680 is a fantastic card all around. If AMD can't compete for a while, you can be sure GK110 will be a long ways off.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Subyman said:
They won't drop GK110, not because of the "finger," but because there is no need to. The 480 had some pretty significant problems with power, heat, and noise. The 580 was needed. The GTX 680 is a fantastic card all around. If AMD can't compete for a while, you can be sure GK110 will be a long ways off.

I am assuming they need the GK110. They're not going to use the GK104 for their professional cards. So they'll need something to put on a ridiculously overpriced $5000 card.
 
Feb 19, 2009
10,457
10
76
Think about this:

TSMC has all its production capacity filled, so NV has limited production capability on 28nm wafers. Which will they focus on: a "good"-yielding small die they can sell in cards at $500, or an abysmal and probably un-manufacturable (if I can borrow the term) 550mm² die that would still be selling around ~$600?

Big K will come when the dust settles and TSMC can allocate to NV much more production. Soon? Unlikely.

The original Kepler leak months ago is spot on so far. GK100 gone, TSMC can't make it this early in 28nm. GK110 ETA Q1 2013.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Grooveriding said:
I am assuming they need the GK110. They're not going to use the GK104 for their professional cards. So they'll need something to put on a ridiculously overpriced $5000 card.

I agree with this theory. Put out a compute heavy GPU for the scientific computing crowd. Research doesn't flinch at thousand dollar cards. I know my professors constantly talk about how incredibly cheap the GPUs are. Nvidia could easily tap that crowd.
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
Grooveriding said:
I am assuming they need the GK110. They're not going to use the GK104 for their professional cards. So they'll need something to put on a ridiculously overpriced $5000 card.

I agree. Those supercomputers need upgrades.

I would think GK110, with its focus on compute, might be disappointing compared to GK104 in its power usage and graphics performance.

GK104 is so slanted towards graphics performance that bringing back compute is going to seriously compromise the design "ethos" (for lack of a better term?) that makes GK104 such a sweet "little" chip. My $.02. It is possible that Big-K still has the hardware scheduler, etc.

And if there are production volume issues (with a big chip on the latest and greatest process), you can bet every big-K is going to be headed for a server somewhere rather than your PC. The margins are just so pale in comparison...

I.e., why wait? This is going to be a pretty awesome chip for quite a while.
 
Last edited:

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
There is no guarantee GK110 can even work; it's all rumor at this point. As usual, buy what you need NOW. You will always be disappointed either way; that's just a fact, because hardware gets cheaper and faster all the time, so your products are always undercut and worth less than when you bought them. But if you wait, you'll be disappointed too, because you can't play. Sux, huh?
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
After reading the AnandTech review, I am convinced it was as some people said all along: the 680 was supposed to be their mainstream part, only they upped the product number and price since it was just as good as AMD's 7970.
 

Mars999

Senior member
Jan 12, 2007
304
0
0
What about 4GB in a GTX 680? I want the high-res texture packs for Skyrim, Rage, etc., and I heard they can suck up to 2GB of data alone... Anyone seen any benchmarks with the 680 vs. the 7970 3GB to see if it matters?
Thanks
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
Subyman said:
They won't drop GK110, not because of the "finger," but because there is no need to. The 480 had some pretty significant problems with power, heat, and noise. The 580 was needed. The GTX 680 is a fantastic card all around. If AMD can't compete for a while, you can be sure GK110 will be a long ways off.

The GTX 480 had no problems with power; or rather, the GTX 580 had just as many problems. Either way, the 480 had the exact same TDP as the 580.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Really?

So what was with the low clock speeds if not power?

I think he meant that they consume similar amounts of power which is true. However, you are right in that Nvidia managed to raise the clocks and increase the core count with the 580 while keeping a similar power envelope.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Mars999 said:
What about 4GB in a GTX 680? I want the high-res texture packs for Skyrim, Rage, etc., and I heard they can suck up to 2GB of data alone... Anyone seen any benchmarks with the 680 vs. the 7970 3GB to see if it matters?
Thanks

Without any mods, 3GB doesn't confer any advantage in current games. With mods you probably could overfill 2GB of memory. For now I wouldn't worry about 2GB of VRAM. If you want to keep your card for 3 years, then it might matter.
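For a rough sense of how texture packs eat into 2GB, here is a back-of-the-envelope sketch of the VRAM footprint of uncompressed RGBA textures with full mipmap chains. The numbers are illustrative assumptions, not measurements from any specific game; real games typically use compressed formats (e.g. DXT/S3TC), which cut these figures by roughly 4-8x.

```python
def texture_mib(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture, in MiB."""
    base = width * height * bytes_per_texel      # base mip level
    total = base * 4 / 3 if mipmaps else base    # full mip chain adds ~1/3
    return total / (1024 * 1024)

# One uncompressed 2048x2048 RGBA texture with mipmaps:
print(round(texture_mib(2048, 2048), 1))  # -> 21.3 (MiB)

# A hundred such textures resident at once already approach 2 GiB:
print(round(texture_mib(2048, 2048) * 100 / 1024, 2))  # -> 2.08 (GiB)
```

So a hi-res mod that swaps in a few hundred 2048x2048 textures can plausibly press against a 2GB framebuffer, which is why the 3GB question only really bites with mods.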
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Big K would first and foremost be going out the door in $1000+ cards to replace the Fermi Tesla line. They might release a Big K gaming flagship a month or two after they have those in the channel.

Think about this:

TSMC has all its production capacity filled, so NV has limited production capability on 28nm wafers. Which will they focus on: a "good"-yielding small die they can sell in cards at $500, or an abysmal and probably un-manufacturable (if I can borrow the term) 550mm² die that would still be selling around ~$600?

Big K will come when the dust settles and TSMC can allocate to NV much more production. Soon? Unlikely.

The original Kepler leak months ago is spot on so far. GK100 gone, TSMC can't make it this early in 28nm. GK110 ETA Q1 2013.
 

Doougin

Member
Jul 4, 2011
80
0
66
2GB cards are all that's out right now, since only reference designs are available. When the non-reference cards start to come out, there will most likely be a 4GB version.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
Such an unusual release for NV: no heat spreader on the core, no hot clock. What is this world coming to!
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Grooveriding said:
I am assuming they need the GK110. They're not going to use the GK104 for their professional cards. So they'll need something to put on a ridiculously overpriced $5000 card.

I think you fail to understand the purpose of those 'overpriced cards'. NV's professional line is well-respected for offering exceptional driver support and performance. You pay a lot (for sure), but for a specific reason: the bang-for-the-buck is definitely there for many applications that heavily use parallel computation. With a Quadro purchase, you get a dedicated team of driver developers focusing on professional applications.

When you look at the associated software development or licensing costs, the hardware cost of the servers + GPUs is likely a minority.

Sure, $5000 is a complete waste for Joe-schmoe who uses it for gaming and CS5, but that's not the market. :)