NVIDIA GeForce GTX 780 To Be Based on GK114 GPU


3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Designing new chips is very, very expensive and time-consuming. You can't run them for only 6 or 9 months and make any money. Even the almighty Intel would lose money if they replaced their lineup that often. Considering how many fewer GPUs are sold than CPUs, I can understand the difficulty in making money.

Expecting a 30% improvement on refreshes and a 100% improvement on shrinks, continuously, isn't realistic.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
The GTX 480 was a king ... when released ;/ I don't have the same feeling about the 680 ... and won't about the 780 either, if it's only 15% more than the 680 :/ Hoping there'll be a jump like from the 285 to the 480 with Maxwell ... 780 to 880 ....

Fermi destroyed across the board, while the 680 chokes just as badly as Fermi on max shadows/4x MSAA in BF3, which happens to be the only game I play that could actually use the extra power. Better thermals would be nice, but not four-to-five-hundred-dollars nice.

At that price I could just buy another GTX 480 and a water cooling setup for even better performance.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
Fermi destroyed across the board, while the 680 chokes just as badly as Fermi on max shadows/4x MSAA in BF3, which happens to be the only game I play that could actually use the extra power. Better thermals would be nice, but not four-to-five-hundred-dollars nice.

At that price I could just buy another GTX 480 and a water cooling setup for even better performance.


Yep .... and waiting for a 600 series on steroids is useless ;/ so there won't be anything good till Maxwell ... but even Maxwell could fail ... who knows ... sad .... very sad ....
 
Feb 19, 2009
10,457
10
76
Designing new chips is very, very expensive and time-consuming. You can't run them for only 6 or 9 months and make any money. Even the almighty Intel would lose money if they replaced their lineup that often. Considering how many fewer GPUs are sold than CPUs, I can understand the difficulty in making money.

Expecting a 30% improvement on refreshes and a 100% improvement on shrinks, continuously, isn't realistic.

Because some of us were expecting at least a few GK110s to be made into consumer GeForces... sad panda that they aren't likely to even try. It suggests TSMC just plain failed at making it, since the few that are made need to go to fill pre-orders from Tesla buyers who pay big $$.

But for a pure refresh, 15-20% is good. Yes, I feel the need to troll both NV and AMD: where's the huge-die, massive single-GPU performance winner? There are games that even a single 680/7970 struggles with at 1080p, especially if you want something more than blur-fest FXAA.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I won't rule out GK110 until Maxwell drops; it seems unlikely they won't use it.

I can't imagine every GK110 spun is going to meet every specification for a workstation card; likewise, I can't see them getting tossed into a trash bin after they fail to meet that standard.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
Wait, so if the 256-bit interface remains, there'll be no 3 GB cards ;/ ? Only 2 and 4 GB ???? I'm talking about the 780 :/ ....

I hoped for a 3 GB high-end card with more memory bandwidth :/ I think all the 4 GB versions of the 680 or 670 are too unbalanced because of the small memory bandwidth :/
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I won't rule out GK110 until Maxwell drops; it seems unlikely they won't use it.

I can't imagine every GK110 spun is going to meet every specification for a workstation card; likewise, I can't see them getting tossed into a trash bin after they fail to meet that standard.

This sounds pretty logical. I wouldn't be surprised if AMD pulled something out of their hat (with the HD 8970) such that GK110 would suddenly become the GTX 780, though I don't expect this scenario.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
UGH!!

Super frustrating... the 680 was, quite frankly, a mild improvement over the 580, not the sort of generational leap I've come to expect from Nvidia, and now we're expecting another mild jump from the 680 to the 780?

I stuck with my 580 because, if you factor in how little horsepower is required with the market flooded with console ports, plus the pretty mild increases from the new hardware, it's about the least enticing time to upgrade ever.

I've had a top-end flagship card from pretty much every generation since the Ti4600, dropping the cash every 18 months to get the latest and greatest, but the 580-to-680 jump was the first time in my gaming life I've skipped a generation. It's also the point in my life where I have the most disposable income to throw at hardware...

This is really pretty lame if true.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The source seemed to be an updated road map, according to Charlie -- interesting, thanks for the link!
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
What happened to what the fanboys made out to be the fabled beast GK110, shaped like a unicorn and [generating] rainbows? You mean to tell me that all Nvidia could produce for GeForce was a "mid-range" GK104 that they "decided to name" GTX 680 only AFTER "they saw how easily it beat the 7970"? Lulz. Sorry for the quotes; it's just random lines from Nvidia zealots I've read here.

No profanity in the tech forums
-ViRGE
 
Last edited by a moderator:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
What happened to what the fanboys made out to be the fabled beast GK110, shaped like a unicorn and pissing rainbows? You mean to tell me that all Nvidia could produce for GeForce was a "mid-range" GK104 that they "decided to name" GTX 680 only AFTER "they saw how easily it beat the 7970"? Lulz. Sorry for the quotes; it's just random lines from Nvidia zealots I've read here.

Yes, GK100 was scrapped. All Nvidia could produce in Q2 2012 was GK104. It's also all they needed to produce to retain the $500 price point.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
What happened to what the fanboys made out to be the fabled beast GK110, shaped like a unicorn and pissing rainbows? You mean to tell me that all Nvidia could produce for GeForce was a "mid-range" GK104 that they "decided to name" GTX 680 only AFTER "they saw how easily it beat the 7970"? Lulz. Sorry for the quotes; it's just random lines from Nvidia zealots I've read here.

NV knows we're in a worldwide economic depression/recession, PC demand is weak, consoles are around the corner which may further depress video card demand and current console demand... plus so many games are console ports anyway, and 3D hasn't really taken off yet, nor has triple monitor (unfortunately). So they are relentlessly cost shaving (smaller PCBs, 2GB VRAM instead of 3GB, 256 bit memory bus, pathetic stock heatsinks on the GTX 6xx series, etc.).

They can get away with this because AMD's best Radeon is not decisively faster than the GK104, and by decisively I mean >20% faster. They are using GK104 as a stopgap so they can make decent money off GeForce today. Not tomorrow, today. Meanwhile they are making buckets of money in Quadro/Tesla. NV could have fielded a faster GPU in all likelihood--but at what cost? Apparently too high of a cost. So they probably decided it was better to hold the fort with GK104 and accrue cash to ride out this rough period.

http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html

http://techreport.com/review/22989/a-brief-look-at-nvidia-gk110-graphics-chip/3

http://vr-zone.com/articles/nvidia-...-k20-2013-geforce-and-quadro-cards/15884.html

Like it or not, PC gamers are the tail, not the dog, thanks to our increasing marginalization and the dominance of console games and console ports that do not require much more GPU horsepower unless you do funky stuff like play 3D/120Hz/multi-monitor or use fancy DoF/Tess/etc. settings.

In other words, PC consumer graphics (and in a way, console GPUs during this lull before the next-gen console storm) are a stagnant market compared to the fast-growing mobile and HPC markets. Pro graphics is also stagnant thanks to longstanding industry ties to NV, so Quadro is also milked like GeForce, but at least they are high margin cards and thus more deserving of getting GK110 first. If NV has any leftover GK110s, they may find themselves in GeForce cards, but don't bank on it (see links above).

Some of you sound like if you were in charge of NV you'd issue GK110 as GeForce for meager profits just to appease the dog's tail, instead of doing enough to get by in GeForce while pursuing far more lucrative markets and selling GK110 for far, far more money as Quadro/Tesla. Sorry but NV is a corporation. Corporations exist to make money. Obviously as a PC gamer I'd love things to be different, but let's face economic reality here.

Edit to add: Okay, so now we're hearing a story about cut-down GK110s being sold as GeForces; that is much more likely. Tesla/Quadro gets GK110, GeForce gets imperfect GK110s that would otherwise be thrown away. But non-cut-down GK110s being sold as GeForce when they could have been sold as Tesla/Quadro? Maybe, but I'm not betting on it.
 
Last edited:

brandon888

Senior member
Jun 28, 2012
537
0
0
What happened to what the fanboys made out to be the fabled beast GK110, shaped like a unicorn and pissing rainbows? You mean to tell me that all Nvidia could produce for GeForce was a "mid-range" GK104 that they "decided to name" GTX 680 only AFTER "they saw how easily it beat the 7970"? Lulz. Sorry for the quotes; it's just random lines from Nvidia zealots I've read here.

Huh, I'm an Nvidia fanboy, but I never expected 60%+ performance of the 780 over the 680 :D My maximum expectation was 25-30% .... the only things I hoped for were a 3 GB version of the 780 and more memory bandwidth for better AA performance ....

I'm an Nvidia fanboy! Yes, and the same memory bandwidth on the 680 as on the 580 is a disappointment!
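For what it's worth, that bandwidth complaint checks out on paper. A quick back-of-the-envelope sketch, using the stock reference specs (peak bandwidth is just the effective memory data rate times the bus width):

```python
# Peak memory bandwidth (GB/s) = effective memory clock (MT/s) * bus width (bits) / 8 / 1000
def mem_bandwidth_gbs(effective_clock_mts, bus_width_bits):
    return effective_clock_mts * bus_width_bits / 8 / 1000

# GTX 580: 4008 MT/s effective GDDR5 on a 384-bit bus
# GTX 680: 6008 MT/s effective GDDR5 on a 256-bit bus
gtx_580 = mem_bandwidth_gbs(4008, 384)
gtx_680 = mem_bandwidth_gbs(6008, 256)
print(f"GTX 580: {gtx_580:.1f} GB/s, GTX 680: {gtx_680:.1f} GB/s")
# GTX 580: 192.4 GB/s, GTX 680: 192.3 GB/s -- effectively identical
```

The 680's faster 6 Gbps GDDR5 almost exactly cancels out its narrower 256-bit bus, which is why the two cards end up with the same headline bandwidth figure.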
 
Feb 19, 2009
10,457
10
76
It's BS to blame consolitis for the lack of GPU grunt. There are current games that bring GK104 to its knees at 1080p.

GK110 isn't here for consumers because TSMC failed to make a ~600mm² chip on their immature node, that's it.
 

brandon888

Senior member
Jun 28, 2012
537
0
0
It's BS to blame consolitis for the lack of GPU grunt. There are current games that bring GK104 to its knees at 1080p.

GK110 isn't here for consumers because TSMC failed to make a ~600mm² chip on their immature node, that's it.

True ... for example Sleeping Dogs .... Max Payne 3 at 8x AA ... in BF3 the 670/680 are good but still can't maintain 60 fps as a minimum .... +15% performance won't change that situation :/ so no decent cards till Maxwell :/


The only good thing is that none of the 600-series buyers will regret it now ... 15% is not a big deal .... and they can hold out till Maxwell easily...
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
its BS to blame consolitis for the lack of gpu grunt. theres current games that bring gk104 to its knees at 1080p.

its not here for consumers *gk110, because tsmc failed to make a ~600mm2 chip on their immature node, thats it.

This forum is not representative of the larger market, which is apparently happy at lower settings and on consoles, no matter how much you may hug your desktop PC at night; furthermore, you took one thing out of my post and one thing only. Nice try.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Yeah sure right okay :rolleyes:
50% more bandwidth, at least 50% more texturing power and compute power but not much faster than GK104. Charlie is spreading bullshit again.

Die harvesting is a longstanding tradition in GPU manufacturing so I can see a cut-down GK110 sold as GeForce happening. Basically the Tesla/Quadro rejects get a faulty SMX turned off or whatever and turn into GeForce. However, I have read elsewhere that TSMC 28nm yields are actually quite good, though admittedly those sources didn't talk about NV specifically. And a GK110 would have to be seriously clocked down or cut down or both, if what Charlie wrote is accurate. Either that or there is some other bottleneck we don't know about.
 
Feb 19, 2009
10,457
10
76
This forum is not representative of the larger market, which is apparently happy at lower settings and on consoles, no matter how much you may hug your desktop PC at night; furthermore, you took one thing out of my post and one thing only. Nice try.

>$500 GPUs are not representative of the larger market. Nor are $300 ones. What are you getting at? NV doesn't want to make money even though they have GK110 in abundance and ready for consumers, but feels like earning less??