Lenzfire.com: Entire Nvidia Kepler Series Specifications, Price & Release Date


Ajay

Lifer
Jan 8, 2001
16,094
8,112
136
Ah, OK. I was wondering if I missed some announcement or something because such a change would be rather drastic and I'd like to read more about it.

Personally, I don't see Kepler being a "throw the baby out with the bathwater" microarchitecture redesign over Fermi.

But if they are really going after power usage then it makes sense that they would eliminate hot-clocks since it takes a disproportionate amount of voltage to get those extra MHz.

But NV could just lower the shader-to-core clock ratio from 2:1 down to 3:2, or whatever works for them. That would make Kepler more like a Fermi v2. If they've dropped the hot clocks, they would need some more significant architectural changes to be competitive (which could make sense if they are indeed late, since they would be facing a bigger arch change plus a new node).
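Back-of-the-envelope, the ratio tradeoff looks like this (a rough sketch; the GTX 580 figures are real, the alternative shader counts are purely illustrative):

```python
# Rough sketch of the hot-clock tradeoff. GTX 580-like figures: 512 shaders,
# 772 MHz core, 2:1 hot clock. The alternative shader counts are hypothetical.

def peak_gflops(shaders, core_clock_mhz, ratio):
    """Peak FP32 GFLOPS: 2 FLOPs/clock (FMA) per shader, shaders at core * ratio."""
    return 2 * shaders * (core_clock_mhz * ratio) / 1000

print(peak_gflops(512, 772, 2.0))   # ~1581 GFLOPS (GTX 580-like, 2:1)
print(peak_gflops(683, 772, 1.5))   # ~1582 GFLOPS: a 3:2 ratio needs ~1/3 more shaders
print(peak_gflops(1024, 772, 1.0))  # ~1581 GFLOPS: no hot clock needs 2x the shaders
```

Same peak throughput either way; the question is whether the extra shaders cost more in area and power than the hot clock costs in voltage.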
 
Feb 19, 2009
10,457
10
76
Why even drop the hotclocks? I don't think it's the cause of their lower perf/W, considering their GPUs tend to be much larger and focused on HPC output.

If it ain't broke, why fix it?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Why even drop the hotclocks? I don't think it's the cause of their lower perf/W, considering their GPUs tend to be much larger and focused on HPC output.

If it ain't broke, why fix it?

Parts with hotclocks require insanely huge dies, are less efficient, and most of the time are power hungry. Nvidia wants Kepler to be super efficient. They want it to be viable for ultrabooks.

This makes complete sense too because, as we all know, discrete sales are down thanks to Intel IGPs and the HDD shortage. Intel has 60% of the graphics market, and with HDD prices up, what do PC makers do? They remove discrete graphics to offset the cost. Dell in particular is doing this, and they are one of AMD's biggest customers. Anyway, dropping the hotclocks was the smart decision (in addition to the above reasons); designing large-die GPUs is an engineering nightmare.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
The Pentium 4 had hot-clocked (double-pumped) integer units, I believe. That seems to add an extra layer of difficulty in controlling power and heat characteristics. A process might be fine at 1GHz but have issues (leakage?) at the 2GHz your pumped unit is set to run at.
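The usual first-order approximation makes the same point: dynamic power scales roughly as C * V^2 * f, so the extra voltage needed to hold a doubled clock enters squared. A quick sketch (the voltage figures are invented for illustration):

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# The voltage values below are invented for illustration, not real process data.

def rel_power(freq_ghz, volts, base_freq=1.0, base_volts=0.9):
    """Dynamic power relative to a 1 GHz / 0.9 V baseline."""
    return (freq_ghz / base_freq) * (volts / base_volts) ** 2

print(rel_power(1.0, 0.9))  # 1.00x: baseline
print(rel_power(2.0, 0.9))  # 2.00x: doubling frequency alone doubles power
print(rel_power(2.0, 1.1))  # ~2.99x: if 2 GHz needs 1.1 V, power nearly triples
```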
 

Crap Daddy

Senior member
May 6, 2011
610
0
0
Those prices are insane if remotely accurate.

Insane, you mean too high?

Anyway, the hotclock rumor was launched by only one source, 3DCenter, and everybody took it for granted. We don't know anything for real, so the Lenzfire stuff might not be too far off.
 

thilanliyan

Lifer
Jun 21, 2005
12,037
2,249
126
Trying to wade through your sarcasm here, but am I to understand that you believe that 7970 performance over the GTX 580 is, or isn't (or is spot on), underwhelming for its price?

Sorry, but the screen was dripping with sarcasm. :)

I wish the 7970 was priced lower, but it is what it is... I don't think I'm the one to judge whether it is or isn't worth it. Until nV brings out some competition, AMD has no reason to bring the price down, just like nV charges a premium for the GTX 580.

I'm more mocking the people who were disappointed/bitter with the 7970 launch performance and/or price. They were comparing the 7970 launch to the Bulldozer launch of all things lol. Go figure...
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
Insane as in good deals, IMO. GTX580 performance in a better power envelope and for only $320 would be an awesome deal.

There are a few interesting things in that data that stand out:
1) The memory speed is great for a 512-bit bus. Most GTX 580 OCs I see have trouble cracking 4400 on a 384-bit bus. Reaching 5.5GHz as a stock speed on a 512-bit bus would be incredibly impressive for nVIDIA, since Fermi has had issues maximizing the potential of GDDR5 (quick math after this list).
2) The 690 seems like a dangerous card. With that much power, I can't see how 1.75GB effective would be enough. For $1000, they could really make that 3.5GB x2.
3) The 690 is strange. Most nVIDIA parts since Fermi have had the bus size tied to the shader count. The 580->570 harvest drops stream processors from 512 to 480 and bus width from 384 to 320. The 670 drops from 1024/512 to 896/448. The 690, though, has the 1024 SPs of the 680 with the 56 ROPs and 448-bit bus of the 670. Odd.
4) Stock memory speed on the GTX 660 is insane.
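Putting quick numbers on point 1 (and the 690's memory in point 2), taking the rumored figures at face value:

```python
# Quick math on the rumored memory specs.
# bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)

def bandwidth(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth(384, 4.0))  # GTX 580 stock: ~192 GB/s
print(bandwidth(512, 5.5))  # rumored 680: 352 GB/s, nearly double

# The 690's "1.75GB effective": each GPU on a dual-GPU card mirrors the
# working set, so 3.5GB total comes out to 1.75GB usable per frame.
print(3.5 / 2)  # 1.75
```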
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Insane as in good deals, IMO. GTX580 performance in a better power envelope and for only $320 would be an awesome deal.

There are a few interesting things in that data that stand out:
1) The memory speed is great for a 512-bit bus. Most GTX 580 OCs I see have trouble cracking 4400 on a 384-bit bus. Reaching 5.5GHz as a stock speed on a 512-bit bus would be incredibly impressive for nVIDIA, since Fermi has had issues maximizing the potential of GDDR5 (quick math after this list).
2) The 690 seems like a dangerous card. With that much power, I can't see how 1.75GB effective would be enough. For $1000, they could really make that 3.5GB x2.
3) The 690 is strange. Most nVIDIA parts since Fermi have had the bus size tied to the shader count. The 580->570 harvest drops stream processors from 512 to 480 and bus width from 384 to 320. The 670 drops from 1024/512 to 896/448. The 690, though, has the 1024 SPs of the 680 with the 56 ROPs and 448-bit bus of the 670. Odd.
4) Stock memory speed on the GTX 660 is insane.

It's unfortunate that the specs are a complete fabrication.
 

Crap Daddy

Senior member
May 6, 2011
610
0
0
Insanely high, yes. How many here think they are insanely low? Anyone?

If the prices turn out to be true then yes, we can thank AMD for their very brave venture into high-end pricing territory reserved until now for NV.

The fact that AMD think they can charge whatever they like for the "fastest GPU on Earth" from unsuspecting tech enthusiasts brings the logical consequence of Nvidia launching a faster - I do believe it will be much faster - GPU at a much higher price.

But let's leave the high-end to the very small community who want and can afford it and see what's happening just one level behind.

The second in line for both companies were the 570 and the 6950 ($350 and $300 at launch). Now, the new generation will bring the 7950 at $450-plus and the 670 at $500!
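In percentage terms, a quick check of those launch prices:

```python
# Launch-price jump at the second-tier slot, old gen vs. rumored new gen
for name, old, new in [("6950 -> 7950", 300, 450), ("570 -> 670", 350, 500)]:
    print(f"{name}: +{(new - old) / old:.0%}")
# 6950 -> 7950: +50%
# 570 -> 670: +43%
```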

And, to make things worse, there's not a game out there that really needs the power of these cards and I don't see one being launched this year.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
And, to make things worse, there's not a game out there that really needs the power of these cards and I don't see one being launched this year.

At 1680x1050, sure, but at 1600p and above that is not true. Metro 2033 averages in the 30s with Tahiti XT without maxing out the settings. There are other games that bring the high-end GPUs to their knees as well.

1080p for the most part is fine on 6950/GTX 570 cards, but even then there are games that will give them a digital spanking. We need moar powuh!
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
If the prices turn out to be true then yes, we can thank AMD for their very brave venture into high-end pricing territory reserved until now for NV.

The fact that AMD think they can charge whatever they like for the "fastest GPU on Earth" from unsuspecting tech enthusiasts brings the logical consequence of Nvidia launching a faster - I do believe it will be much faster - GPU at a much higher price.

Hahahaha, "unsuspecting tech enthusiasts"? "AMD's very brave venture into high-end pricing territory reserved until now for NV"? This is an amazing post for all the wrong reasons. I mean, I'd agree that the 7950 is priced far too high, but seriously?

Enthusiasts actually check a product's performance and are less likely to make blind buys. What you're saying is that mean old AMD is robbing the "unsuspecting tech enthusiasts" and now poor Nvidia has to charge more for Kepler, because otherwise they wouldn't be shafting their loyal clients! So much cognitive dissonance in a single post.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Hahahaha, "unsuspecting tech enthusiasts"? "AMD's very brave venture into high-end pricing territory reserved until now for NV"? This is an amazing post for all the wrong reasons. I mean, I'd agree that the 7950 is priced far too high, but seriously?

Enthusiasts actually check a product's performance and are less likely to make blind buys. What you're saying is that mean old AMD is robbing the "unsuspecting tech enthusiasts" and now poor Nvidia has to charge more for Kepler, because otherwise they wouldn't be shafting their loyal clients! So much cognitive dissonance in a single post.

Didn't you get the memo? Nvidia is the poor victim. I weep for them.

Thank GOD they have the ability to rip us off; I'd feel horrible if they didn't. I bought an $850 8800 Ultra because I felt so bad for them.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
It's unfortunate that the specs are a complete fabrication.
Yes, you have said this again and again. Do you have some kind of alternate facts to present?


Didn't you get the memo? Nvidia is the poor victim. I weep for them.

Thank GOD they have the ability to rip us off; I'd feel horrible if they didn't. I bought an $850 8800 Ultra because I felt so bad for them.

Really, just 1? Not 2 or 3? Was it a good card for you?
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Performance seems to be the only threshold Nvidia cares about.

That said, unless that changes, Kepler will be a disappointment if they don't at the very least get the same gains over the 580 as AMD got over the 6970.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
And you know that how? Every new generation since 2006 has almost doubled the SIMD count: 128 -> 240 -> 480.
It's mostly other things that make the chip big and hungry, like all the HPC stuff they've been concentrating on since GT200.

Kepler is just a shrink/optimization of Fermi. Maxwell will be the true "next gen" for Nvidia.
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
Yes, you have said this again and again. Do you have some kind of alternate facts to present?

Would it be fair to ask the question, "How is this not a fabrication?"

I think it is disingenuous to ask for "alternate facts" when, in fact, there aren't any to begin with.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Performance seems to be the only threshold Nvidia cares about.

That said, unless that changes, Kepler will be a disappointment if they don't at the very least get the same gains over the 580 as AMD got over the 6970.
I agree those are the minimum goals. I will be disappointed if we don't get a card for under 200 dollars that can easily deliver 6870/GTX 560 Ti performance + 20%.
I don't know if a 300-320 dollar card even counts as mid-range. IMO, it doesn't.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Would it be fair to ask the question, "How is this not a fabrication?"
Tech sites, even blogs, have to be held to a higher standard than your average forum poster. Yes, they could be wrong, or it could also be a scoop. He said/she said is pretty useless without proof.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
Wow... and I thought that Apple fanboy websites were the kings of creating bullshit technical specs. They got NOTHING on these guys.
 

superjim

Senior member
Jan 3, 2012
293
3
81
I don't know if a 300-320 dollar card even counts as mid-range. IMO, it doesn't.

It depends on what metric you use to determine "mid-range". If NV has 30 SKUs then a mid-range card could be $600 for all we know, with the high-end being a $2000 card.

If you use sales to determine mid-range (what sells better than the cheap cards but not as well as the expensive cards), then you could have a $100 mid-range card.

If we use performance, then the mid-range is going to be whatever cards fall into the middle of the pack. Those could be $100 or $600.

I think most would agree that "mid-range" is normally in the $150-250 range. I consider anything higher than $300 to be firmly in the high-end regardless of performance or number of SKUs.
 

kevinsbane

Senior member
Jun 16, 2010
694
0
71
Tech sites, even blogs, have to be held to a higher standard than your average forum poster. Yes, they could be wrong, or it could also be a scoop. He said/she said is pretty useless without proof.

But isn't this just that? Useless? Fun, to be sure, to speculate on, but since the source is of questionable authenticity, it does boil down to a he said/she said situation.


I'm not sure I get what you're saying. Are you saying that we should hold tech sites and blogs to a higher standard of trustworthiness (we require their information be reliably based on solid proof) or that because they are tech sites and blogs, the information they provide is inherently of greater trustworthiness due to their higher standard?
 

Jionix

Senior member
Jan 12, 2011
238
0
0
Wow, if the specs are true, this is not a victory. A 550mm2 GPU, roughly 50% bigger than Tahiti: how much power will it need? How much more performance does it offer? At those kinds of clock speeds? And how big is the cooler?

Rethink this: the 660 Ti, a supposed GK110 GPU at 550mm2, is only 10% faster than Tahiti at 365mm2. If true, AMD is still superior in GPU design.
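Taking those rumored numbers at face value, the perf-per-area math comes out like this (performance normalized to Tahiti = 1.0):

```python
# Perf-per-area using the rumored figures: Tahiti 365mm2 as the baseline,
# GK110 at 550mm2 and 10% faster. Both figures are rumor, not confirmed.
tahiti_area, tahiti_perf = 365, 1.00
gk110_area, gk110_perf = 550, 1.10

print(gk110_area / tahiti_area - 1)  # ~0.51: the die is ~51% bigger
print((gk110_perf / gk110_area) / (tahiti_perf / tahiti_area))  # ~0.73
```

So on these numbers, GK110 would deliver roughly 27% less performance per mm2 than Tahiti.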