GTX780 will not be based on GK110? (OBR Rumor)

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
http://www.obr-hardware.com/2012/09/exclusive-some-geforce-gtx-780-details.html

"Successor GeForce GTX 670/680 will NOT be based on core GK110. These cards will be based on a completely different, new core ... more detail later."

newgpu.jpg


In the top-left corner, the memory chip has "SA" on it --> I'm guessing it uses Samsung memory chips.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
If that's true I guess I'll never buy Nvidia again.

I'm a cynical A-hole about my computer parts, and the fact that they sold us a mid-range card (GTX 680) for $500 really grinds my gears. Plus the 660Ti is based on a low-end core and sells for $300. I was waiting for Big K to buy a real high-end card for a real high-end price, but I guess I won't now.

If you have no idea what I'm talking about, it's this:
GTX 460/560/560Ti = GF104/GF114
GTX 680 = GK104, the mid-range core
GTS 450/550Ti = GF106/GF116
GTX 660Ti = GK106, a low-end core
GTX470/480/570/580 = GF100/GF110
Big K = GK110, if this never comes out I won't be buying Nvidia ever again.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Actually, if people bothered to read the GK110 whitepaper, it becomes very believable indeed. But I don't have much regard for these kinds of sites, so I will take it with a large grain of salt.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Smoblikat, so if the GTX780 has 2304 or 2536 SPs and beats the HD8970, you still won't buy it on principle because they didn't deliver you a 275-300W, 1GHz, 2880 SP GK110 part?

A custom GK11x part could make sense since it would allow NV to maintain good performance/watt and still have 30%+ more performance over the 680, I'd bet.

Perhaps because the 28nm node is tricky, AMD and NV didn't want another GTX480 repeat. Maybe they both split the generational increase we normally see into 2 releases. Instead of delivering 75% at once and then having a mild refresh in between, they are just going Step 1: 30-45%, Step 2: 30-45%. In the end, we still end up with 70-100% increase in about 2 years from a previous gen flagship.

Again, this is just a rumor and could be false.
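
To sanity-check that split-generation math, here's a quick Python sketch (the 30-45% step sizes are the ones from the post above; the function name is just for illustration):

```python
# Two smaller generational steps compound multiplicatively,
# so two 30-45% uplifts land in roughly the 70-110% range.
def compound_gain(step1, step2):
    """Return the total speedup from two successive uplifts."""
    return (1 + step1) * (1 + step2) - 1

low = compound_gain(0.30, 0.30)    # two 30% steps
high = compound_gain(0.45, 0.45)   # two 45% steps
print(f"two 30% steps -> +{low:.0%} total")   # +69%
print(f"two 45% steps -> +{high:.0%} total")  # +110%
```

So two modest steps really do add up to roughly the 70-100%+ two-year gain described above.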
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
Smoblikat, so if the GTX780 has 2304 or 2536 SPs and beats the HD8970, you still won't buy it on principle because they didn't deliver you a 275-300W, 1GHz, 2880 SP GK110 part?

A custom GK11x part could make sense since it would allow NV to maintain good performance/watt and still have 30%+ more performance over the 680, I'd bet.

Perhaps because the 28nm node is tricky, AMD and NV didn't want another GTX480 repeat. Maybe they both split the generational increase we normally see into 2 releases. Instead of delivering 75% at once and then having a mild refresh in between, they are just going Step 1: 30-45%, Step 2: 30-45%. In the end, we still end up with 70-100% increase in about 2 years from a previous gen flagship.

Again, this is just a rumor and could be false.

Precisely. If I wanted a low-power machine, why would I get a high-end card? I'm all for 130W CPUs and 250W+ GPUs.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The key is not how the GK110 will compare with the GK104 but how the GK110 will compare to the GK114, from a gaming point of view; and in that context, does it make sense to bring the GK110 to GeForce?
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
1x0 part has never been a 1x4 part replacement
1(x+1)4 is what replaces 1x4

If GK114 is unobtainable we might see GK110 in the GTX 780.
Other than that, why would NV want to jeopardize perf/W and mindshare by using the compute-workload-optimized GK110 for a regular GTX 780?

IMHO there's very little chance that Nvidia would want to screw 99% of regular users so that 1% can boast about all those fresh new GK110 features... that they will never actually use.

The above is pretty clear; AMD is on the move next - whether they will continue an uphill fight or concede and go back to what they do best: making small, energy-efficient gaming/media chips instead of all-purpose ones like the 7970.
 

dangerman1337

Senior member
Sep 16, 2010
333
5
81
Can someone calculate the size of that GPU in the photo? I think it'll give us an idea of what it could be (I've got a feeling they're going to do a 'GK112' which will fall somewhere between GK110 and GK104).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
GTX480 used about 250W of power in gaming (which I am cool with), but the reference card initially ran pretty hot, although later samples were much better. I don't think NV wants to go back to those days. All they have to do is beat the HD8970 by 10% or more and they have a winner for the intended high-end gaming market. Why go for a 275W+ monster and unleash a 75-80% faster card and make Maxwell look bad? You want to leave some reserve for the next generation, or you end up with Maxwell as some mid-range performance/watt part that's 20% faster than a 550-600mm2 GK110, and everyone makes fun of it. There are plenty of reasons why a 2880 SP 1GHz GK110 is very unlikely:

1) Power consumption, noise levels & heat for reference design blower card;

2) Marketing-wise, it could make a full-blown GK110 look a lot better vs. GK104 than Maxwell will look against GK110;

3) Strategy: Why would NV release a 75-80% faster GPU in early 2013 when most games are console ports? The biggest increase sounds a lot more marketable in 2014, when next-generation consoles launch, DX11 next-generation games are in full swing, and Maxwell looks extremely powerful against the outdated Kepler tech.

Can someone calculate the size of that GPU in the photo? I think it'll give us an idea of what it could be (I've got a feeling they're going to do a 'GK112' which will fall somewhere between GK110 and GK104).

That's almost impossible to do, since the die under the heatspreader could be anything.

GT200 - GTX280

dsc0008smallbp5.jpg


GF100 - GTX480
H-8-240380-3.jpg

gf100-angle.jpg

chip-cap.jpg


Source: http://www.gpu-tech.org/content.php/136-Hardware-porn-The-naked-Die-of-GF100-exposed

GK104 - GTX680 - very small die under the huge heatspreader.
chip-quarter.jpg


If you just look at the heatspreaders, they aren't much different, but the dies underneath differ dramatically in size.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
@RussianSensation

Dropping in the Maxwell vs. GK110 argument is pretty far-fetched.
Perhaps you don't squeeze out max performance because of the refresh, but there's little evidence that anyone is holding back on the current arch.

Not having to use 500mm2 dies, at ever-increasing next-gen silicon cost, for a $500 GPU is more like it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
f1sherman, I agree that 500mm2+ die-size cost and power consumption (and the associated heat and noise) are the more plausible factors that would entice NV to move to a custom GK11x chip. It also would work nicely with their strategy of segregating HPC chips and gaming chips. They paid the price for not doing so with the GTX480/580 in terms of power consumption and heat. If they have the $ to design a pure gaming chip larger than GK104, I think most gamers would prefer it. DirectCompute in games may take off, but NV can incorporate more substantial compute in Maxwell on a lower node without suffering the huge penalty on the 28nm node that AMD has had to face.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
If they have the $ to design a pure gaming chip larger than GK104, I think most gamers would prefer it. DirectCompute in games may take off, but NV can incorporate more substantial compute in Maxwell on a lower node without suffering the huge penalty on the 28nm node that AMD has had to face.


The 256-bit MC is the only performance-hurting part of GK104.
I'm pretty sure that the lack of GPGPU resources compared to Tahiti is not the reason for the 680 losing in a single gaming benchmark.

If GPGPU gaming workloads were to substantially increase then YES, but that remains to be seen.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I believe the Tesla K20 has a 300W TDP and will get a 1.5 TFLOPS DP theoretical peak at 1/3 rate (or maybe it was 1.2 TFLOPS). Working backwards at 2880 SPs, that gets us a 780 MHz GPU clock speed. K10 is already confirmed to work at just 745 MHz. A 2304 SP Kepler at 1000 MHz would still be a beast. We can't just look at shaders/CUDA cores and ignore clocks. You can have a very fast Kepler chip with even 2304 SPs.
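
Here's the back-of-the-envelope calculation in Python, for anyone who wants to check it (assumes 2 FLOPs per core per cycle from FMA, and the 1/3 DP rate mentioned above; the function name is just for illustration):

```python
# Work backwards from a DP-FLOPS target to the required core clock.
# Kepler: each core does 2 FLOPs/cycle (one FMA), DP runs at 1/3 SP rate.
def required_clock_mhz(dp_tflops_target, cores, dp_rate=1/3):
    sp_flops_per_cycle = cores * 2                  # FMA = 2 FLOPs per core
    dp_flops_per_cycle = sp_flops_per_cycle * dp_rate
    return dp_tflops_target * 1e12 / dp_flops_per_cycle / 1e6

print(required_clock_mhz(1.5, 2880))  # ~781 MHz, matching the ~780 MHz above
print(required_clock_mhz(1.2, 2880))  # ~625 MHz for the 1.2 TFLOPS case
```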
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Has it ever been confirmed that the GK110 was ever meant to be a GeForce chip? Do we really know that the GK104 was supposed to be midrange? I wasn't sure if that was ever confirmed or is still more or less a rumor.
 

OCGuy

Lifer
Jul 12, 2000
27,227
36
91
If that's true I guess I'll never buy Nvidia again.

I'm a cynical A-hole about my computer parts, and the fact that they sold us a mid-range card (GTX 680) for $500 really grinds my gears. Plus the 660Ti is based on a low-end core and sells for $300. I was waiting for Big K to buy a real high-end card for a real high-end price, but I guess I won't now.

If you have no idea what I'm talking about, it's this:
GTX 460/560/560Ti = GF104/GF114
GTX 680 = GK104, the mid-range core
GTS 450/550Ti = GF106/GF116
GTX 660Ti = GK106, a low-end core
GTX470/480/570/580 = GF100/GF110
Big K = GK110, if this never comes out I won't be buying Nvidia ever again.



Oh god, please stop this garbage. They sold the top single-GPU card. You don't know how the new chip would perform, so you don't know if you would buy it.


Just because it is repeated on the internet 1,000 times by the same 10 people doesn't make it true.
 
Feb 19, 2009
10,457
10
76
If that's the case, it's not the potential high TDP, because if it's high TDP but high performance, enthusiasts will buy it without hesitation. People who buy $500+ GPUs don't give a damn about TDP as long as it performs.

This suggests TSMC cannot make enough GK110 for reasons which are obvious: their yields are poor, and combined with a GK110 die near 600mm2 as rumored, that's not a good combination. What they can make wouldn't even be enough to supply the HPC market (with its huge profits), when we're talking orders of 15k units from just ONE lab... that puts things in perspective.

A revamped GK104 => GK114 would be great for gaming efficiency, but I was hoping for an uber-beast single card.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Has it ever been confirmed that the GK110 was ever meant to be a GeForce chip? Do we really know that the GK104 was supposed to be mid-range? I wasn't sure if that was ever confirmed or is still more or less a rumor.

No to both questions. A lot of us thought back in February-March 2012 that NV would do a "7900GTX-style" refresh with GK110, the kind of upgrade the 7900GTX was over the 7800GTX 256MB (simply because 30-35% faster than the 580 seemed underwhelming at the time), but it never happened. It sounded nice in theory that NV held back GK110 on purpose, but given that they only sent out 32 and then 1,000 GK110 samples for 'development testing' to Oak Ridge National Laboratory just recently, a GK110 consumer card was simply unmanufacturable at profitable levels in the consumer space for most of 2012. I used to think NV held back GK110 on purpose, but now it's clearer than ever that NV went with GK104 not because they wanted to but because they had to.

If NV had had GK110 ready for a mass consumer launch in 2012, they would have needed volume production a long time ago. Since it looks like Oak Ridge is only getting the remaining 13,000 GK110 chips by March 2013, a GK110 consumer part in 2012 was basically a pipe dream. These rumors support the notion that GK110 could forever remain an HPC/scientific-compute Tesla part only. However, that doesn't mean NV can't produce some new chip that would be impressive. I guess if people don't get a 75-100% faster GPU next generation they will quit PC gaming? It's a matter of perspective: 30-40% faster every 12 months, or 75-100% faster every 24 months. If the GTX780 is 30-40% faster by Spring 2013, that looks great to me, since it would be only 1 year since the 680 launched, and 30-40% faster in 1 year is very good.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
It seems that nVidia needs separate cores for compute and for gaming. For whatever reason, AMD seems able to do both with no problem. I wouldn't be surprised at all if Big Kepler is not a good gaming GPU whereas a different variant is. The GTX 680 sucks at compute, so this could give credence to the rumor.

That said, this rumor could be AMD viral designed to stop people from getting excited about Big Kepler. *shrug*
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Unless my memory is faulty, I recall references to a "GK112" back when Kepler leaks first started coming out. Interesting turnabout, if true. Although I have been adamantly saying GK110 will be a GeForce product in 2013, I think it's plausible they are beefing up GK104 for 2013. *IF* GK104 had been outfitted with a 320-bit bus in the first place, the 25% more memory bandwidth, TMUs, and extra ROPs over its current configuration would probably have let it slightly edge the HD7970 GE.

Therefore, moving forward, they'll need a 320-bit bus with 7GHz vram and 2 more SMX cores, or a 384-bit bus with 6GHz vram and 2 more SMX cores, if they want to keep pace in high-end graphics performance without having to resort to a 550mm^2 chip. A 384-bit bus with the same ROP and TMU ratios would give it a 50% memory-bandwidth, texture, and ROP improvement, while 2 more SMX cores would give it a 25% computational improvement. I think the net result would be ~35% faster than the GTX 680 at 1440p. And it could all be done with a die size probably <= 375mm^2.
Therefore, moving forward, they'll need a 320-bit bus with 7ghz vram and 2 more SMX cores, or a 384-but bus with 6ghz vram and 2 more SMX cores if they want to keep pace in the high end graphics performance without having to resort to a 550mm^2 chip. A 384-bit bus, with the same ROP and TMU ratios would give it a 50% memory, texture, and ROP improvement while 2 more SMX cores would give it a 25% computational improvement. I think the next result would be ~35% faster than gtx680 at 1440p. And it could all be done with a die size probably =< 375mm^2.