NVIDIA GeForce GTX 780 To Be Based on GK114 GPU


blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
>$500 GPUs is not representative of the larger market. Nor are $300 ones. What are you getting at? NV doesn't want to make money even though they have GK110 in abundance and ready for consumers, but feels like earning less??

If you want to completely misread my post, sure.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Die harvesting is a longstanding tradition in GPU manufacturing so I can see a cut-down GK110 sold as GeForce happening. Basically the Tesla/Quadro rejects get a faulty SMX turned off or whatever and turn into GeForce. However, I have read elsewhere that TSMC 28nm yields are actually quite good, though admittedly those sources didn't talk about NV specifically. And a GK110 would have to be seriously clocked down or cut down or both, if what Charlie wrote is accurate. Either that or there is some other bottleneck we don't know about.

The thing to take from this article is: yes, there is going to be a GK110-based GeForce.

Because we already knew that GK110 won't be anywhere close to the GTX 690, and of course GK104 will have superior metrics versus a heavy HPC GPU.
But that's why they're making two different chips :rolleyes:

And then there's the part about the terrible yields, which are supposedly so serious that, and this is the best part, NVIDIA will willingly sell those parts to consumers for a paltry few hundred dollars instead of to pros for a couple of thousand.

But he addresses that. By stating that Intel's KC, whose performance nobody knows anything about,
other than that it won't make existing (x86) code magically parallel and fast, while NV seems to be doing just that (1) just fine (2),
anyway, Knights Corner is going to be fast, poor-yielding but so fiercely fast, that NV needs to push everything they've got. It's Intel's first MIC, but it just so happens to be neck and neck with GK110.

He forgets that it's ALWAYS good to have high volume to amortize Non-Recurring Engineering costs, that NV has been doing just that since forever, that Intel is a newcomer at designing parallel monsters, and that everyone who needs the absolute fastest parallel code has already learned CUDA or something else because x86 just doesn't cut it there. Nvidia is not afraid of losing on performance to Intel, but on convenience and middle-ground customers.
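
A rough back-of-the-envelope sketch of that NRE point, with entirely made-up numbers, just to show how the per-unit cost collapses as volume grows:

```python
# Hypothetical NRE amortization sketch -- every figure here is invented for
# illustration; only the trend matters: per-unit cost falls sharply with volume.

nre_cost = 100_000_000       # one-time design/mask/validation cost (hypothetical)
marginal_cost = 50           # per-die manufacturing cost (hypothetical)

for units in (100_000, 500_000, 2_000_000, 10_000_000):
    per_unit = nre_cost / units + marginal_cost
    print(f"{units:>10,} units -> ${per_unit:,.2f} per chip")
```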

But it's nice to see Charlie back to his old routine, Bobcat and Hondo alive and mutating, while "at $1 per year overpaid" Jen-Hsun is once again about to run NVIDIA aground :ninja:


TL;DR Charlie confirms a GK110-based GeForce.
 
Last edited:

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Wait - Intel's KC is already out? I thought only people with an NDA have pre-samples...

BTW: A German workstation seller has listed specs for the K20 which may or may not be true:
13 SMX | 700MHz | 320-bit.

/edit: Oh, I guess Charlie never looked at the specs of the Tesla cards. They're always clocked lower than the GeForce products...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Wait - Intel's KC is already out? I thought only people with an NDA have pre-samples...

BTW: A German workstation seller has listed specs for the K20 which may or may not be true:
13 SMX | 700MHz | 320-bit.

/edit: Oh, I guess Charlie never looked at the specs of the Tesla cards. They're always clocked lower than the GeForce products...

That would put it at 2496 shaders (192 x 13),
and 320-bit would give us 2.5 gigabytes of RAM.
Think GTX 570 with its 320-bit bus and 1.25 gigabytes, times two.
Would be pretty kewl.
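
For what it's worth, a quick sanity check of that arithmetic, assuming Kepler's 192 CUDA cores per SMX and one 2 Gbit (256 MB) GDDR5 chip per 32-bit memory channel (as on the GTX 570); denser memory chips would double the total:

```python
# Back-of-the-envelope check of the rumored K20 specs (13 SMX, 320-bit bus).
# Assumes 192 CUDA cores per Kepler SMX and one 2 Gbit (256 MB) GDDR5 chip
# per 32-bit channel -- both are assumptions, not confirmed specs.

CORES_PER_SMX = 192          # Kepler architecture
smx_count = 13               # rumored K20 configuration
bus_width_bits = 320         # rumored memory bus
chip_capacity_mb = 256       # 2 Gbit GDDR5 chip (assumption)

cuda_cores = CORES_PER_SMX * smx_count
memory_channels = bus_width_bits // 32
vram_gb = memory_channels * chip_capacity_mb / 1024

print(f"{cuda_cores} CUDA cores, {memory_channels} memory chips, {vram_gb} GB VRAM")
# -> 2496 CUDA cores, 10 memory chips, 2.5 GB VRAM
```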
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
What happened to what the fanboys made out to be the fabled beast GK110, shaped like a unicorn and pissing rainbows? You mean to tell me that all Nvidia could produce for GeForce was a "mid range" GK104 that they "decided to name" GTX 680 only AFTER "they saw how easily it beat the 7970"? Lulz. Sorry for the quotes, it's just random lines from Nvidia zealots I've read here.

Do you ever, EVER say anything useful and not inflammatory? When was the last time you actually contributed a non-inflammatory, self-generated, worthwhile post on here?
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Do you ever, EVER say anything useful and not inflammatory? When was the last time you actually contributed a non-inflammatory, self-generated, worthwhile post on here?

You sound mad. Are you mad? :colbert:

I was just wondering what happened to the Geforce iteration of GK110, whose mere RUMOR caused green nerdgasms forum-wide. It was very recent. Don't be mad if I question one rumor with another, just because it isn't necessarily positive for Nvidia.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
What happened to what the fanboys made out to be the fabled beast GK110, shaped like a unicorn and pissing rainbows? You mean to tell me that all Nvidia could produce for GeForce was a "mid range" GK104 that they "decided to name" GTX 680 only AFTER "they saw how easily it beat the 7970"? Lulz. Sorry for the quotes, it's just random lines from Nvidia zealots I've read here.

1) Only AMD fanboys and haters called the GTX 680 mid range
2) I don't know anyone who thought the GTX 680 was going to be named any differently
3) The GK110 does exist but you have to buy Quadro cards to have it. Makes sense due to the limited supply of GPUs and the money they make on the professional lines.
4) Why are you changing the subject randomly?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
1) Only AMD fanboys and haters called the GTX 680 mid range
2) I don't know anyone who thought the GTX 680 was going to be named any differently

Stop it with the name-calling, especially when you are wrong. It wasn't supposed to be their top GPU, it just worked out that way because 7970 wasn't as fast as NV had feared. No need to squander GK110 GPUs on GeForce then. GK104 is indeed midrange but it outperformed expectations and got shipped out with their high-end moniker "GTX x80."

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I don't disagree with you, but it is still theory, speculation and conjecture: even the TechPowerUp link only offers a theory, and VR-Zone just lists sources. Nothing official or concrete.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
You sound mad. Are you mad? :colbert:

I was just wondering what happened to the Geforce iteration of GK110, whose mere RUMOR caused green nerdgasms forum-wide. It was very recent. Don't be mad if I question one rumor with another, just because it isn't necessarily positive for Nvidia.

TBH it's you who sounds mad.

Mostly at "fanboys made out to be the fabled beast GK110, shaped like a unicorn and pissing rainbows"?
Considering this is the AnandTech forum, where nerds wallowing in the arrival of shiny new tech is a pretty normal thing,
the really interesting question is: why are you bothered by something that's normal and expected?

The occasional bit of irony and a stab here and there is perfectly fine, particularly if it's humorous.
But understand that you cannot stab at these fanboys with everyday irony in every single post.
Because that's just bitterness. And they just laugh at you, thinking "damn, I am a zealot fanboi, but this one, he's gone."
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I don't disagree with you but it still is a theory, speculation and conjecture though --- even in the Techpowerup link they offer theory and Vrzone, list sources -- nothing official or concrete.

Do you seriously think NV is stupid enough to officially announce to the world that they took their better-than-expected midrange part and turned it into what is now known as the GTX 680? You guys in the forums would rip them apart for saying so, and people would look at the shrunken PCB, memory bandwidth, heatsinks, etc. and nod to themselves, "yeah, it looks midrange." NV is not that stupid.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
You sound mad. Are you mad? :colbert:

I was just wondering what happened to the Geforce iteration of GK110, whose mere RUMOR caused green nerdgasms forum-wide. It was very recent. Don't be mad if I question one rumor with another, just because it isn't necessarily positive for Nvidia.

Calling it like I see it is not being mad. It's just pointing out that people like you, who offer inflammatory comments far more often than general, intelligent discussion, ruin threads and forums like this. It has nothing to do with positive or negative opinions of Nvidia. I've posted plenty of negative things regarding Nvidia in the past two days without mocking anyone or trolling threads. You are specifically trolling on this forum. And even after I call you out, you're still doing it. You can't post something worthwhile and not inflammatory. You contribute nothing but troll posts and trash that derail worthwhile discussion.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Do you seriously think NV is stupid enough to officially announce to the world that they took their better-than-expected midrange part and turned it into what is now known as the GTX 680? You guys in the forums would rip them apart for saying so, and people would look at the shrunken PCB, memory bandwidth, heatsinks, etc. and nod to themselves, "yeah, it looks midrange." NV is not that stupid.

I don't disagree with your theory!
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Huh, I'm an Nvidia fanboy but I never expected 60%+ performance of the 780 over the 680 :D My maximum expectation was 25-30% .... the only things I hoped for were a 3GB version of the 780 and more memory bandwidth for better AA performance ....

I'm an Nvidia fanboy! Yes, and the same memory bandwidth on the 680 as on the 580 is a disappointment!

No. A fanboi defends them to the end (like the "It's AMD's fault" crowd). You simply prefer nVidia, and are disappointed.

To nVidia's credit, they are attacking every reason people have had, for some time now, to buy AMD, and improving those aspects on their own products. They just seem to shoot too high (too big) on these recent node changes and run into yield/manufacturing and/or power usage issues that hold them back. They seem to have real issues working within the manufacturing limits. AMD, on the other hand, doesn't promise quite such compelling products, but more often than not they deliver, functional and on time.

What most nVidia fans are used to is that at some point nVidia is going to release a superior performing product. This round, they haven't been able to do that. If rumors are true they might not be able to do it next round either.
 

Siberian

Senior member
Jul 10, 2012
258
0
0
Stop it with the name-calling, especially when you are wrong. It wasn't supposed to be their top GPU, it just worked out that way because 7970 wasn't as fast as NV had feared. No need to squander GK110 GPUs on GeForce then. GK104 is indeed midrange but it outperformed expectations and got shipped out with their high-end moniker "GTX x80."

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html

If AMD had not fallen so far behind this generation, NVIDIA never would have been able to pull that off.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Calling it like I see it is not being mad. It's just pointing out that people like you, who offer inflammatory comments far more often than general, intelligent discussion, ruin threads and forums like this. It has nothing to do with positive or negative opinions of Nvidia. I've posted plenty of negative things regarding Nvidia in the past two days without mocking anyone or trolling threads. You are specifically trolling on this forum. And even after I call you out, you're still doing it. You can't post something worthwhile and not inflammatory. You contribute nothing but troll posts and trash that derail worthwhile discussion.

Meh :rolleyes: Take it to PM, bro, and/or report your trolling theory. Thanks :thumbsup:

Also, your hilarious hypocrisy shows in not calling out something like the following:

If AMD had not fallen so far behind this generation, NVIDIA never would have been able to pull that off.

Which he does day in, day out. But at least it's negativity about AMD, so it's all good in your eyes :rolleyes:

Please take the next 3 days off to refresh yourself on why you aren't supposed to use the technical forums to discuss your issues with other posters
-ViRGE
 
Last edited by a moderator:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Stop it with the name-calling, especially when you are wrong. It wasn't supposed to be their top GPU, it just worked out that way because 7970 wasn't as fast as NV had feared. No need to squander GK110 GPUs on GeForce then. GK104 is indeed midrange but it outperformed expectations and got shipped out with their high-end moniker "GTX x80."

http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html

So, if 7970 had been faster what would they have done? They had no GK110 to counter with. The GK104 would have still been their top of the line this round.

They had designed their 2nd-tier chip larger this round. It was big enough that they could wring it out and get gaming performance competitive with Tahiti. It was originally slotted to be a lower-clocked chip, but they adjusted that and put out a winner. Make no mistake though, GK104 isn't truly competitive with GCN. There are workloads where Tahiti is more than 100% faster than GK104 (OpenCL rendering, for example).

Whether they will or not (I tend to doubt it), AMD could bump the size of Pitcairn a bit and catch GK104 (the current version, not the refresh). It would still outperform it convincingly in compute. nVidia has done a wonderful job of leveraging the strengths of GK104 and marketing them. It doesn't truly compete with GCN overall, though. GK110 should. I'll be curious to see how it affects gaming performance.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
If AMD had not fallen so far behind this generation, NVIDIA never would have been able to pull that off.

True, but if Nvidia wasn't such a corporation they wouldn't have done us so dirty.


Every time I see the reference 670 I both laugh on the inside and die on the inside; the thing is something like 4 inches long and costs $400.


So, if 7970 had been faster what would they have done? They had no GK110 to counter with. The GK104 would have still been their top of the line this round.

I don't know, what kind of question is that?

You can't make any logical assessments after you've changed history. Maybe they would have pushed the 680 out to compete for sweet-spot volume sales. Maybe they would have delayed longer than 2 months and pushed GK110 out the door GF100-style, who knows?
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
So, if 7970 had been faster what would they have done? They had no GK110 to counter with. The GK104 would have still been their top of the line this round.

They had designed their 2nd-tier chip larger this round. It was big enough that they could wring it out and get gaming performance competitive with Tahiti. It was originally slotted to be a lower-clocked chip, but they adjusted that and put out a winner. Make no mistake though, GK104 isn't truly competitive with GCN. There are workloads where Tahiti is more than 100% faster than GK104 (OpenCL rendering, for example).

Whether they will or not (I tend to doubt it), AMD could bump the size of Pitcairn a bit and catch GK104 (the current version, not the refresh). It would still outperform it convincingly in compute. nVidia has done a wonderful job of leveraging the strengths of GK104 and marketing them. It doesn't truly compete with GCN overall, though. GK110 should. I'll be curious to see how it affects gaming performance.

According to the VR Zone article, which SirPauly rightfully said could be wrong (but could be right and sure looks right based on anecdotal evidence such as that unearthed by TPU), the original HPC chip was not as powerful and they further strengthened it into what is now known as the GK110. The original, weaker HPC chip would presumably have gotten tagged as the GTX 680 if Tahiti were considerably stronger. Please read the VR Zone article for more details:

"The decision to rebrand GK107 into GTX 680/690 was made. Furthermore, the company decided to further increase the performance of the high-end die, giving birth to GK110"

(I think they made a typo, they meant GK104 instead of GK107)



True, but if Nvidia wasn't such a corporation they wouldn't have done us so dirty.

Every time I see the reference 670 I both laugh on the inside and die on the inside; the thing is something like 4 inches long and costs $400.

If you're looking for inches, AMD has eleven-inchers and even footlongs in its stable. Some of them are 50% thicker than normal, too.
 
Last edited:
Dec 30, 2004
12,553
2
76
Or you can have cards like the 7970, which have way more memory bandwidth. Hopefully the 8970 isn't as much of a letdown as a GK114 chip would be.

I believe this may be a "testing the waters" move for Nvidia, telegraphing to AMD that they don't really want to compete this round. They would do this because they think AMD is going to play ball, and because it is mutually beneficial not to try to compete since the market for high-end graphics cards is pretty stagnant.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I believe this may be a "testing the waters" move for Nvidia, telegraphing to AMD that they don't really want to compete this round. They would do this because they think AMD is going to play ball, and because it is mutually beneficial not to try to compete since the market for high-end graphics cards is pretty stagnant.

"I really think we should work harder together on the marketing front. As you and I have talked about, even though we are competitors, we have the common goal of making our category a well positioned, respected playing field. $5 and $8 stocks are the result of no respect."

"We launch the GPU initiative at some industry show together. Perhaps something like Meltdown or IDF. We could even share a GPU initiative booth together to get tons of PR from the press."

"Both of us have spent the last three years trying to bring the perceived value of our products up to the level of Intel. The "GPU" category is clean and has served us well that way. We both have increased the price of our high end product several fold over the last 4 years while Intel’s high end prices have more than halved. Creating another category serves to work contradictory to that. How does one cleanly position it versus a GPU and a CPU?? It will tear down what we have both built."

http://www.tomshardware.com/news/Nvidia-ATI-lawsuit-antitrust,6421.html

http://www.tomshardware.com/news/nvidia-amd-ati-graphics,6311.html
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
"I really think we should work harder together on the marketing front. As you and I have talked about, even though we are competitors, we have the common goal of making our category a well positioned, respected playing field. $5 and $8 stocks are the result of no respect."

http://www.tomshardware.com/news/Nvidia-ATI-lawsuit-antitrust,6421.html


(image: stella21.gif)
 

Siberian

Senior member
Jul 10, 2012
258
0
0
I think NVIDIA put a longer cooler on the 670 to make it look more like a high-end card. I don't think it needs it when you look at the PCB. Looking at the launch reviews, the 680 was the clear winner over the 7970, so you can't blame NVIDIA for taking advantage of the situation.