Lenzfire.com: Entire Nvidia Kepler Series Specifications, Price & Release Date

Page 4 - AnandTech Forums

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
Wow, if the specs are true, this is not a victory. A 550mm2 GPU, 2/3rds bigger than Tahiti, will need how much power? Offers how much more performance? Those kinds of clock speeds? And how big is the cooler?

Rethink this --- the 660ti, a supposed GK110 GPU, at 550mm2, is only 10% faster than Tahiti at 365mm2. If true, AMD is still superior in GPU design.

Well, that's a pretty staggering if. If they were true, parts like the 660 would be great, but the numbers just don't make sense. Look at the 680 vs the 670: performance relative to the 7970 is claimed to be 21% higher (1.45/1.2), while the 680 is clocked the same and only has 14% more SPs, ROPs and bus width than the 670. Sure, memory is clocked a little higher, but in effect this data claims that Kepler gets greater-than-unity increases in performance from added stream processors. Now that's amazing.
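The scaling claim can be sanity-checked with the numbers quoted above (all of them rumored figures from the leak, not confirmed specs):

```python
# Ratios taken from the leak discussed above: claimed performance
# of the rumored 680 and 670 relative to the HD 7970.
perf_680 = 1.45
perf_670 = 1.20

perf_gap = perf_680 / perf_670 - 1   # relative performance gap, ~21%
unit_gap = 0.14                      # claimed extra SPs/ROPs/bus width

# A scaling factor above 1.0 means performance grows faster than the
# hardware added, which is implausible for a wider GPU at the same clock.
scaling = perf_gap / unit_gap
print(f"{perf_gap:.1%} more performance from {unit_gap:.0%} more units "
      f"-> scaling {scaling:.2f}x")
```

With these numbers the implied scaling comes out well above 1.0, which is the poster's point.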
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The real question is how does nvidia plan to fit a 550mm2 die inside an ultrabook? I know NV has stated their intention to make Kepler work in ultrabooks, yet another sign of the true nature of this "leak". In fact, isn't Apple using Kepler for the Macbook Pro this summer? Heck, might as well put a GTX 480 in it.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
lolwut?

Laptop SKUs aren't desktop SKUs... Nvidia is without question going to get better performance per watt than last gen; that shouldn't even be questioned.

The question is how much better, and how much power they can pack into ultra-low-power SKUs.

You don't believe a 580M is the same as a 580, do you?
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
The real question is how does nvidia plan to fit a 550mm2 die inside an ultrabook? I know NV has stated their intention to make Kepler work in ultrabooks, yet another sign of the true nature of this "leak". In fact, isn't Apple using Kepler for the Macbook Pro this summer? Heck, might as well put a GTX 480 in it.

Kepler is more than just the GK110. If Kepler were going into an Ultrabook it would probably be a GK106- or GK108-class chip and be under 100mm^2.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
The real question is how does nvidia plan to fit a 550mm2 die inside an ultrabook? I know NV has stated their intention to make Kepler work in ultrabooks, yet another sign of the true nature of this "leak". In fact, isn't Apple using Kepler for the Macbook Pro this summer? Heck, might as well put a GTX 480 in it.

Apple has been using AMD GPUs for discrete cards for around two years now, IIRC. They don't have options with NVIDIA GPUs for the vast majority of their lines because AMD has superior performance/watt.

And even if I don't like Apple, that's the best combo right now: Intel CPU, AMD GPU.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
More likely a rebrand. Half of AMD and NV's mobile lineup seems to be rebrands these days.

http://www.anandtech.com/show/5200/...-7000m-and-nvidias-geforce-600m-mobile-gpus/2

Getting sick of these complete BS rumours. Nothing but junk just like in the months leading up to Fermi's launch. Get your act together nvidia, catch up on 28nm, and put some cards out or do a paper launch at the least.

It's taking forever for the MSI Lightning 7970 to come out, which is the card I want, with a full cover waterblock. So I figure I have a few months until that is available. I was hoping nvidia would release by then so I could compare their offerings, but it's starting to look really bleak with nothing but these obvious fantasy leaks.
 

trek554

Banned
Feb 3, 2012
17
0
0
224-bit memory bus? LOL, that is a new one. I thought memory controllers were 64-bit, so how would that even be possible?
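The objection comes down to divisibility: the total bus is built from identical controllers, so its width must be a whole multiple of the controller width. A quick check (the 64-bit and 32-bit widths here are assumptions about plausible controller sizes, not sourced specs):

```python
# A GPU memory bus is the sum of its identical memory controllers,
# so the total width must divide evenly by the controller width.
bus_width = 224

for controller_width in (64, 32):
    n, rem = divmod(bus_width, controller_width)
    print(f"{controller_width}-bit controllers: {n} whole, remainder {rem}")

# 224 / 64 = 3.5, so a 224-bit bus is impossible with whole 64-bit
# controllers; it would need 7 x 32-bit controllers instead.
```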
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
lolwut?

Laptop SKUs aren't desktop SKUs... Nvidia is without question going to get better performance per watt than last gen; that shouldn't even be questioned.

The question is how much better, and how much power they can pack into ultra-low-power SKUs.

You don't believe a 580M is the same as a 580, do you?

Understand that the thermal requirements for ultrabooks are even stricter than for notebooks. While I'm aware that GK110 will not be in an ultrabook, the underlying architecture _must_ be super efficient for it to work in an ultrabook setting. AFAIK Fermi was never put in an ultrabook; even the mobile part is not efficient enough... the architecture is not efficient enough.

Charlie has hinted that Kepler is super small and has great thermals, and I'm not sure everything he says is credible, but I'm inclined to believe that Kepler will not be a large-die solution. This has been stated numerous times on many websites: Kepler is supposed to be NV's "efficient" chip - their first one ever.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Understand that the thermal requirements for ultrabooks are even stricter than for notebooks. While I'm aware that GK110 will not be in an ultrabook, the underlying architecture _must_ be super efficient for it to work in an ultrabook setting. AFAIK Fermi was never put in an ultrabook; even the mobile part is not efficient enough... the architecture is not efficient enough.

Charlie has hinted that Kepler is super small and has great thermals, and I'm not sure everything he says is credible, but I'm inclined to believe that Kepler will not be a large-die solution. This has been stated numerous times on many websites: Kepler is supposed to be NV's "efficient" chip - their first one ever.


This is probably why Kepler is going into ultra books:

http://www.techpowerup.com/159917/AMD-Slips-Out-Trinity-ULV-3DMark-Performance.html

In a footnote of a slide detailing AMD's Trinity A6 APU for Ultrathin notebooks at the company's Financial Analyst Day event, the new chip's 3DMark performance was revealed. The company was talking about the 17W ULV (ultra-low voltage) variant of the "Trinity" APU in the slide, that's designed for compact notebooks. The 3DMark Vantage performance of the APU was measured to be 2,355 points, in the same test, an Intel Core i5-2537M ULV 17W "Sandy Bridge" processor scored 1,158 points. The AMD chip, hence, emerged with a 103% graphics performance lead.
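The "103%" figure checks out against the two scores in the excerpt:

```python
# Scores quoted in the TechPowerUp excerpt above (3DMark Vantage).
trinity_score = 2355   # AMD Trinity A6 APU, 17W ULV
intel_score = 1158     # Intel Core i5-2537M, 17W

lead = trinity_score / intel_score - 1
print(f"Trinity graphics lead: {lead:.0%}")
```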
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
It's taking forever for the MSI Lightning 7970 to come out, which is the card I want, with a full cover waterblock. So I figure I have a few months until that is available. I was hoping nvidia would release by then so I could compare their offerings, but it's starting to look really bleak with nothing but these obvious fantasy leaks.

A little off topic, but full cover blocks for non-reference cards are generally pretty hard to come by.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
That was a confirmed rebrand like a month ago.


Yes, there was talk of model rebrands, but the linked article was written on Jan 27 and speaks of 2 separate GPUs.
We're still not entirely sure whether or not N13P-GT is a Kepler part or not considering its relative similarity to the GT 630 which has been known to be a Fermi rebranded mobile GPU.

We are hopeful that we've stumbled upon the first Kepler part in a mobile devices, but we remain somewhat cautious about it being one considering all of the similarities.
You are aware that Nvidia GPUs are going back into Apple laptops? You seem to write otherwise.

http://blogs.barrons.com/techtrader...o-use-gpus-in-place-of-amd-says-semiaccurate/
 

Ajay

Lifer
Jan 8, 2001
16,094
8,112
136
Parts with hotclocks require insanely huge dies, are less efficient, and most of the time are power hungry. Nvidia wants Kepler to be super efficient. They want it to be viable for ultrabooks.

What?? Hotclocks reduce die area at the cost of power consumption. That's the trade off. If you drop hotclocks, you need more CCs, increasing die area, but improving efficiency. However, if NV drops hotclocks, the new architecture will deviate further from Fermi and possibly create new problems in production.
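The trade-off can be illustrated with a toy model (the numbers are purely illustrative, not real Fermi or Kepler figures):

```python
# Toy model: shader throughput scales with cores * clock, so running
# the shader domain at a hotclock halves the core count needed.
target_throughput = 1024.0   # arbitrary units of cores * GHz

base_clock = 1.0             # GHz, no hotclock
hot_clock = 2.0              # GHz, Fermi-style 2x shader domain

cores_with_hotclock = target_throughput / hot_clock   # fewer cores, smaller die
cores_without = target_throughput / base_clock        # double the cores

print(f"hotclock: {cores_with_hotclock:.0f} cores, "
      f"no hotclock: {cores_without:.0f} cores")
```

Fewer cores means less die area, but the fast clock domain costs more power per core; dropping the hotclock trades die area back for efficiency, which is exactly the trade-off described above.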
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136

Er, don't you mean why Kepler would NOT be going into Ultrabooks? Because they won't be able to due to the much higher heat/power cost of having a CPU and separate GPU?

Even if Intel offers chips that are much lower power, having a separate GPU adds not only more power and heat, but there is physically not much room in ultrabooks for a standalone GPU.
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
Er, don't you mean why Kepler would NOT be going into Ultrabooks? Because they won't be able to due to the much higher heat/power cost of having a CPU and separate GPU?

Even if Intel offers chips that are much lower power, having a separate GPU adds not only more power and heat, but there is physically not much room in ultrabooks for a standalone GPU.

I think he was implying that because Intel performs so poorly in gaming relative to Trinity, they would need a discrete card to compete.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Er, don't you mean why Kepler would NOT be going into Ultrabooks? Because they won't be able to due to the much higher heat/power cost of having a CPU and separate GPU?

Even if Intel offers chips that are much lower power, having a separate GPU adds not only more power and heat, but there is physically not much room in ultrabooks for a standalone GPU.


As with Apple laptops, there are Ultrabooks that are going to have Ivy Bridge and Kepler.

Kepler to end up in Ultrabooks

However, we stand tall by the news that Kepler gets to Ultrabooks this year. This was confirmed by several industry sources.
 

WMD

Senior member
Apr 13, 2011
476
0
0
I love how this thread now turns into Nvidia mobile GPU speculation thread.
 

WMD

Senior member
Apr 13, 2011
476
0
0
Intel forced you to buy a low-end GPU anyway. :sneaky:

Sneaky and not very nice of them, but at least it bundles with a CPU that beats previous high-end models at a much lower price. AMD can learn a thing or two from that.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Oh snap, I guess something did come from AMD bankrupting themselves in the acquisition of ATI, thus stagnating their CPU performance for the last six years. Because of ATI we got Bulldozer, heck yeah!

OT: Even if I do go triple screen I may forgo an upgrade to even Kepler this gen, games are just so far behind and the difference between "Ultra" and "High" is unnoticeable in many games despite the large performance impact (cough BF3/cough shadows).
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
What?? Hotclocks reduce die area at the cost of power consumption. That's the trade off. If you drop hotclocks, you need more CCs, increasing die area, but improving efficiency. However, if NV drops hotclocks, the new architecture will deviate further from Fermi and possibly create new problems in production.

wtf, I don't know if you are being sarcastic or not, but if the hotclock really reduces die area then why are Nvidia GPUs always bigger and more power hungry than AMD's in the same performance category?