Kepler Before Xmas


formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
Definitely don't get your hopes up. The chance might be there, but I really think the odds are against a higher-end desktop part arriving in 2011. But that's only my read, not anything from nvidia or actual proof. I am surprised nvidia hasn't shown off any working silicon, unless I overlooked it?
 
Last edited:

Ajay

Lifer
Jan 8, 2001
16,094
8,112
136
Most of the rumors I've read put the GK100 part as coming out sometime this spring. Other rumors are that nVidia will be releasing lower-end parts first (around the beginning of the year). Unless something magical has happened with TSMC's 28nm node, it doesn't seem likely that NV's large, high-performance graphics parts will be available in acceptable quantities.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
bunnyfubbles said:
Unless they want to make a drastic change and suddenly try to really challenge nVidia in the GPGPU arena, I seriously doubt that would be the case. They've been golden ever since learning from their HD2900 debacle, which they rectified with the 3000 series and have built on ever since. They've never held the single-GPU title (except for the period when the 5800s had no competition at all due to Fermi's delay), but they've been besting nVidia in price/performance, performance/watt, and performance/die-size... and I see no reason for them to change. In fact, their strategy is slowly working in their favor, with multi-GPU solutions starting to tilt toward AMD when it was nVidia that dominated that arena for so long.

9700 pro
x1950xtx
5970
6990

The reason to change is that they were much more profitable when they had the single-GPU lead instead of resorting to cherry-picked, ultra-limited-availability dual-GPU cards for the top end.

They haven't made more than a couple hundred million dollars TOTAL since AMD bought them. I was a big fan of the small ball strategy for many years, but they simply aren't making enough money to justify their continued existence right now. As intelligent/cool/interesting as small ball is, it just doesn't generate profits the way "big-ball" does. They've done a good job of rehabilitating the company since the 2900 fiasco, and they made a bit of money a couple of years ago when they beat nvidia to the punch by 6 months, but it's time for them to start making real money. As in 9 figures per quarter instead of 7.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Nope, ATI/AMD has almost always beaten nVidia to a new fabrication process. Yes, nVidia has beaten ATI/AMD to the punch when it comes to releasing a new generation of GPU and a higher-end part, but I can't remember when they've been first to a new fabrication process.

The GT200s were 65nm when ATI had long been on 55nm with the 3000 series, which is why the 4000 series was such a huge blow to nVidia: despite ATI being unable to claim the performance crown, they had such a huge thermal and price/performance advantage that it didn't matter.

Same went for the 8800 GTX (90nm) vs. the 2900 (80nm). And again for the 6800 Ultra, which was only 130nm while ATI was first to 110nm.


So unless nVidia releases a 40nm version of Kepler (which would pretty much nullify any chance of it being an earth-shattering performance part), they'd be first to the 28nm party and first to a new fabrication process in forever.

As I understand it, this release is 40nm and has 4 CPU cores. So I guess CPU and GPU are now the same; might as well merge them now. No need for separate GPU and CPU sections, merge those. OK, I went to look and I see we're talking about 2 different products. OOPS
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
9700 pro
x1950xtx
5970
6990

The reason to change is that they were much more profitable when they had the single-GPU lead instead of resorting to cherry-picked, ultra-limited-availability dual-GPU cards for the top end.

They haven't made more than a couple hundred million dollars TOTAL since AMD bought them. I was a big fan of the small ball strategy for many years, but they simply aren't making enough money to justify their continued existence right now. As intelligent/cool/interesting as small ball is, it just doesn't generate profits the way "big-ball" does. They've done a good job of rehabilitating the company since the 2900 fiasco, and they made a bit of money a couple of years ago when they beat nvidia to the punch by 6 months, but it's time for them to start making real money. As in 9 figures per quarter instead of 7.

Well, big ball doesn't work either; the 2900 proved that, and the GT200 series and the Fermi fiasco that followed ended nVidia's undisputed dominance, so the only way to go from here is to bow out of discrete GPUs entirely
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Well, big ball doesn't work either; the 2900 proved that, and the GT200 series and the Fermi fiasco that followed ended nVidia's undisputed dominance, so the only way to go from here is to bow out of discrete GPUs entirely

How so? On the high end, NV dominated with 8800GTX over 2900XT/3870, GTX280 was faster than HD4870, GTX285 was faster than HD4890, GTX470 beat HD5850 and GTX480 beat HD5870, while GTX580 beat a 6970. I think AMD provides excellent bang-for-the-buck options (although NV also did well with 8800GT, 9800GT, and GTX460/560/560Ti series). However, in terms of single-GPU performance, NV has now won 3 consecutive generations.

Also, the HD5870 and especially the 6950/70 series have a much larger die than ATI's previous high-end cards. I have no idea why AMD even called it a "small die" strategy, unless it was a comparison to NV's dies. In comparison to ATI's own history, it was very misleading to call it that, since die sizes continue to grow on the AMD side while at the same time AMD has completely abandoned the $400-550 single-GPU market bracket.

To make matters worse, their HD6950 can often be unlocked into their flagship chip. Now look at what happens then:

AMD sells a 389mm^2 HD6950 for $240-270, and gives you 2GB of faster VRAM than what's found in the 570/580. That 2GB of fast GDDR5 most likely costs AMD more than the 1.28-1.5GB of slower VRAM costs NV. And yet, NV is able to sell a 520mm^2 GTX580 for a whopping $430-500!

I have a feeling the cost difference between a GTX580/GTX570 GPU and a 6950 GPU is not enough to offset the huge price premium that NV commands. That means JHH is laughing all the way to the bank every time a single GTX570/580 is sold!

Since AMD's die size on the HD6970 is almost 400mm^2, I have my doubts that a GF110 GPU costs 2x more to manufacture than a Cayman GPU....
 
Last edited:
Feb 19, 2009
10,457
10
76
You Americans place no emphasis on perf/watt at all.

Come back and consider when your electricity is charged at 30-40 US cents per kilowatt hour, instead of the current 8 cents.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You Americans place no emphasis on perf/watt at all.

Come back and consider when your electricity is charged at 30-40 US cents per kilowatt hour, instead of the current 8 cents.

No, some people do care, but not because of cost; it's primarily because of the extra heat, and the noise required to dissipate that heat.

But I'll tell you why I personally don't care about the cost, using some simple math.

OK, let's consider a 120W HD7870 with performance about 15% faster than an HD6970 vs. a 250W HD7970 with performance 60-70% faster than an HD6970. I'll use your 'expensive' rate of 40 cents per kWh.

130W power differential x 20 hours of gaming a week x $0.40/kWh = $1.04 a week, or $54.08 per year

Is an extra $54 per year for gaming 20 hours a week x 52 weeks a lot of $ to pay for our PC gaming hobby? That works out to 1040 hours of gaming for the year. For some people, that's just what their internet costs them a month.

Therefore, we arrive at an incremental electricity cost of $0.052 per hour to game on a 250W card vs. a 120W card ($54.08 / 1040 gaming hours a year). I am pretty sure most people on our forum make more than $4 an hour.... I'll even say that most of us buying $200-300+ GPUs are making at least $20-25 an hour, if not higher.

Ok champ, now is 5 cents an hour savings worth it to get a mid-range HD7870 over a high-end HD7970/GTX680 just to save on electricity? :hmm:

Even at your rates, the cost of electricity is such a tiny expense for this hobby, that it's practically irrelevant. And if you game less than 20 hours a week, then the cost is even less relevant! :biggrin:

Just think about this statistic for a second: Someone who games 1040 hours a year x 50 years will have spent 52,000 hours of his life gaming (or almost 6 years of that person's life on non-stop gaming).

The total cost associated with the extra 130 Watts of electricity at $0.40/kWh x 1040 hours of gaming per year x 50 years = $2,704.

Now you tell me, is $2,704 over your entire life a lot to spend on something you enjoy?
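
For anyone who wants to double-check those figures, here is the same arithmetic as a minimal Python sketch; the function name and structure are mine for illustration, not anything posted in the thread:

```python
# Incremental electricity cost of running a higher-wattage card vs. a lower one.
def incremental_cost_dollars(delta_watts, hours_per_week, cents_per_kwh, years=1):
    """Extra electricity cost (USD) for the given power gap and usage."""
    kwh_per_week = delta_watts / 1000 * hours_per_week   # kW x hours = kWh
    weekly_cost = kwh_per_week * cents_per_kwh / 100     # dollars per week
    return weekly_cost * 52 * years

per_year = incremental_cost_dollars(130, 20, 40)           # 130W gap, 20 h/week, 40 c/kWh
print(round(per_year, 2))                                   # 54.08 per year
print(round(per_year / (20 * 52), 3))                       # 0.052 per gaming hour
print(round(incremental_cost_dollars(130, 20, 40, 50), 2))  # 2704.0 over 50 years
```

Swapping the 130W gap for the 60-70W difference you actually see between same-generation cards (as discussed later in the thread) roughly halves the annual figure, to about $27-29 per year.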

Please name a cheaper hobby and I'll gladly start caring about electricity costs. I can understand your argument if a person were to make $400-600 a month in income. But last time I checked, the average person in America doesn't make 3rd world country wages. And on top of that, the overall purchasing power of an individual in America is often a lot higher than in other parts of the world (cars and food and real estate are actually pretty cheap vs. the rest of the world when you consider per capita income in US/Canada).

If you just purchased BF3 at launch at $59 USD, you just negated your estimated annual electricity cost savings -- with 1 game purchase.

Just take a look at the cost of expensive hobbies:

http://www.mainstreet.com/slideshow/smart-spending/most-expensive-hobbies

OR

http://www.catalogs.com/info/bestof/top-10-most-expensive-hobbies

Ya, it's pretty hard to consider a $55 electricity cost "a lot of $$" in light of all the other things you can do in life!
 
Last edited:

Ajay

Lifer
Jan 8, 2001
16,094
8,112
136
You Americans place no emphasis on perf/watt at all.

Come back and consider when your electricity is charged at 30-40 US cents per kilowatt hour, instead of the current 8 cents.

I think the current average is ~11 cents/kWh. At 30-40 cents/kWh, I'd probably stop running F@H :eek:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Wait, high end GPUs are going to use 250W 50 years from now...?

Obviously we don't know that. But for simplicity's sake, I just assumed a constant electricity rate and a 130W power consumption differential between a high-end and a lower-end card. Unless electricity costs skyrocket in North America, or wages plummet, people will keep buying more power-efficient devices to reduce heat/noise and to feel like they are doing something good for the environment. Look at how many people buy a Toyota Prius, even though the cost savings will take 10+ years to break even against a budget $15k-17k compact car that gets 40mpg. The cost savings alone doesn't justify buying a Prius for most Prius buyers. There's more to it than that - those people feel strongly about saving the environment, or about having a certain image, etc. The whole movement towards greener devices has as much to do with it being "trendy" and "saving the world" as it does with costs.

I was reading that the average person in North America uses 100 litres of water a day (shower, brushing teeth, tea/coffee, flushing the toilet, washing clothes, etc.). Do you care about wasting 100 litres of water a day? The media just isn't *hot* about conserving water right now, but in 20-30 years it'll also be "trendy", like conserving electricity is today. The new generation is starting to care much, much more about having a sustainable planet for future generations than about cost savings.

The entire movement behind recycling, solar panels, wind power, reduced reliance on dirty coal and fossil fuels, and the worldwide shift towards nuclear energy - almost none of it was initially driven by cost savings. In fact, a lot of the alternative energy sources cost more than traditional sources such as natural gas, coal or other fossil fuels. Instead, these things are becoming more popular because many more people on our planet strive for a cleaner, safer, more sustainable planet over the long term, with less pollution. Obviously, if cost savings can be realized, that's just a bonus, but it's not the primary motivator at the moment.

My main point was that gaming is a super cheap hobby in the grand scheme of things and an extra $55 in electricity costs is practically irrelevant on an annual basis considering that's what a single game costs.

If you look at it strictly from a financial perspective, you'd save a lot more money by air-drying your household's clothes instead of using the clothes dryer. Do people do something that drastic? If not, they don't care enough about electricity costs.

I think the current average is ~11 cents/kWh. At 30-40 cents/kWh, I'd probably stop running F@H :eek:

When you are folding for 24 hours a day, you aren't actually "experiencing" the event for 24 hours. You are just arriving at some score. So obviously the marginal utility of folding at home likely provides a lower emotional benefit than gaming. How high would the cost of electricity need to rise for you to stop buying a 250W GPU and only consider a 120W GPU? It would likely need to rise far beyond 40 cents/kWh for you to stop considering a 250W GPU.
 
Last edited:
Feb 19, 2009
10,457
10
76
$55 doesn't sound like much, but people argue over which card to buy when they're within $20 of each other. Again, $20 ain't much.

My point is that if people care about perf/$$, they should also care about perf/watt, because over the long term it does make a difference if you live outside the USA. If $20-50 doesn't make a difference to you, then obviously you didn't care about perf/$$ to start with and you buy cards based on other factors.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
$55 doesn't sound like much, but people argue over which card to buy when they're within $20 of each other. Again, $20 ain't much.

Ya, you are right. Often, even I am not consistent in that regard. :oops:

But I want to point out that, of course, we haven't yet had a situation where a 120W card could provide the performance of a 250W card of the same generation.

I mean, if you look at the current generation (GTX470/480/560Ti/570/580/5870/6950/6970), the difference between the most power efficient of these and the least is only 85W (322W vs. 407W) in gaming.

[Image: Power.png - total system power consumption chart]


So unless you start comparing something like an HD6870 to an overclocked GTX480/580/6970, the $55 estimate is way too high, since instead of a 130W power consumption difference you'll most likely be looking at a 60-70W difference.

Keep in mind that someone who buys a $150 HD6870 probably isn't cross-shopping a $430 GTX580 though. So again, my view is that people who are buying $250-500 GPUs are less concerned with electricity costs at load. Instead, they care about higher performance/watt for other reasons:

1) So that their room doesn't turn into a sauna in the summer time;
2) So that their GPU doesn't sound like a jet engine trying to get rid of all that heat;
3) So that all this heat isn't dumped inside their case, consequently raising the temperatures of other components;
4) PSU limitations.
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
How so? On the high end, NV dominated with 8800GTX over 2900XT/3870, GTX280 was faster than HD4870, GTX285 was faster than HD4890, GTX470 beat HD5850 and GTX480 beat HD5870, while GTX580 beat a 6970. I think AMD provides excellent bang-for-the-buck options (although NV also did well with 8800GT, 9800GT, and GTX460/560/560Ti series). However, in terms of single-GPU performance, NV has now won 3 consecutive generations.

I guess you forgot the part where nVidia had to slash prices on the GTX 280 and 260 by an embarrassingly large amount, which was certainly a major blow to their profit margins, simply because of how efficient the 4800 series was. Or that nVidia offered no competition against the 5800s for half a generation because they didn't even have a product out, and even when they finally did have something, they still couldn't distance themselves by a proportional margin despite the brute-force tactics.

Unfortunately it appears G92 was just a happy aberration, but it's a good thing, because while nVidia can continue to claim the performance crowns, AMD continues to thrive because of their practicality.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
DX11 tessellation and GPGPU will make AMD's high-end graphics chips have lower gaming performance per unit of die size every time.

Perhaps they will try not to pass the 400mm2 mark, but the road to GPGPU comes with a die-size-to-gaming-performance penalty.

GCN paves the road for GPGPU, which leads to Fusion, and with it gaming performance efficiency will go downhill.

The only feature I can see scaling higher in the future is DX11 and tessellation performance, thanks to more parallel tessellator units in the new GPU designs from both AMD and NVIDIA.

That will translate to heavier use of tessellation inside games and more games adopting it in the near future.
 

Arzachel

Senior member
Apr 7, 2011
903
76
91
There is no law that states only Nvidia can make a >400mm^2 GPU; a 520mm^2 Radeon would beat the GTX580. That such a GPU doesn't exist tells us that AMD thinks the halo effect and the price premium would not make up for the R&D and manufacturing costs. There just aren't that many people in the market for the fastest single-GPU card.
 

Don Karnage

Platinum Member
Oct 11, 2011
2,865
0
0
You Americans place no emphasis on perf/watt at all.

Come back and consider when your electricity is charged at 30-40 US cents per kilowatt hour, instead of the current 8 cents.

We'd still game at a high level. You seem to forget we pay $4+ a day for Starbucks.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,112
136
When you are folding for 24 hours a day, you aren't actually "experiencing" the event for 24 hours. You are just arriving at some score. So obviously the marginal utility of folding at home likely provides a lower emotional benefit than gaming. How high would the cost of electricity need to rise for you to stop buying a 250W GPU and only consider a 120W GPU? It would likely need to rise far beyond 40 cents/kWh for you to stop considering a 250W GPU.

Well, I spend around $55/mo folding right now ($0.12/kWh). If I had to pay 32 cents/kWh, I'd be up to ~$140/mo. My wife would notice the difference, so I'd better be earning some good money by then or I'll be cutting back :\
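
For what it's worth, the bill scales linearly with the rate; a quick sketch using the figures above (the variable names are just illustrative) puts the new bill around $147/month, which squares with the ~$140 estimate:

```python
# Rough sanity check of the folding-bill scaling; figures are taken from the post above.
current_bill = 55.0    # dollars per month at the current rate
current_rate = 0.12    # dollars per kWh
new_rate = 0.32        # dollars per kWh

kwh_per_month = current_bill / current_rate   # ~458 kWh/month drawn by the folding rig
new_bill = kwh_per_month * new_rate           # ~147 dollars per month at the higher rate
print(round(kwh_per_month), round(new_bill, 2))
```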
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Ya, you are right. Often, even I am not consistent in that regard. :oops:

But I want to point out that, of course, we haven't yet had a situation where a 120W card could provide the performance of a 250W card of the same generation.

I mean, if you look at the current generation (GTX470/480/560Ti/570/580/5870/6950/6970), the difference between the most power efficient of these and the least is only 85W (322W vs. 407W) in gaming.

[Image: Power.png - total system power consumption chart]


So unless you start comparing something like an HD6870 to an overclocked GTX480/580/6970, the $55 estimate is way too high, since instead of a 130W power consumption difference you'll most likely be looking at a 60-70W difference.

Keep in mind that someone who buys a $150 HD6870 probably isn't cross-shopping a $430 GTX580 though. So again, my view is that people who are buying $250-500 GPUs are less concerned with electricity costs at load. Instead, they care about higher performance/watt for other reasons:

1) So that their room doesn't turn into a sauna in the summer time;
2) So that their GPU doesn't sound like a jet engine trying to get rid of all that heat;
3) So that all this heat isn't dumped inside their case, consequently raising the temperatures of other components;
4) PSU limitations.


I am often extremely impressed with your insightful posts. They come from a well-thought-out, complete understanding and build up to extremely clear points of view.

Your posts on this subject are a great example. There is a lot one could gather from them. I am glad you still take the time. It's sad that most people won't pay much mind, or grasp the real importance and ramifications intertwined within. That's a deterring factor from my standpoint, but I do still sometimes find myself rambling. So I commend you for taking your time, as you do. Hopefully there are enough people who will value it as such.
 
Last edited:

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I guess you forgot the part where nVidia had to slash prices on the GTX 280 and 260 by an embarrassingly large amount, which was certainly a major blow to their profit margins, simply because of how efficient the 4800 series was. Or that nVidia offered no competition against the 5800s for half a generation because they didn't even have a product out, and even when they finally did have something, they still couldn't distance themselves by a proportional margin despite the brute-force tactics.

Unfortunately it appears G92 was just a happy aberration, but it's a good thing, because while nVidia can continue to claim the performance crowns, AMD continues to thrive because of their practicality.

How is AMD's GPU division "thriving" when they're not making any money? And how much does it really hurt nvidia today that they had to drop prices on high-end/low-volume parts 2 1/2 years ago? Nvidia's bottom line even last year, with the Fermi I launch fiasco, was a LOT better than AMD's GPU division's. And last year was nearly a perfect-storm scenario for AMD, in that they had a huge advantage for an extended period of time. In fact, NV didn't really even come to play in the DX11 market until summer 2010 with the GTX 460.

What will it look like in 2011? Very good for nvidia and very weak (again) for AMD's GPU division.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
There is no law that states only Nvidia can make a >400mm^2 GPU; a 520mm^2 Radeon would beat the GTX580. That such a GPU doesn't exist tells us that AMD thinks the halo effect and the price premium would not make up for the R&D and manufacturing costs. There just aren't that many people in the market for the fastest single-GPU card.

I have read this probably about ten thousand times. The R&D is about the architecture more so than just the high-end SKU. The higher-end SKU, if flexible, can spawn multiple SKUs for different sectors.

One still has to spend resources on an architecture; why not be strong top-to-bottom across different sectors?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
How is AMD's GPU division "thriving" when they're not making any money? And how much does it really hurt nvidia today that they had to drop prices on high-end/low-volume parts 2 1/2 years ago? Nvidia's bottom line even last year, with the Fermi I launch fiasco, was a LOT better than AMD's GPU division's. And last year was nearly a perfect-storm scenario for AMD, in that they had a huge advantage for an extended period of time. In fact, NV didn't really even come to play in the DX11 market until summer 2010 with the GTX 460.

What will it look like in 2011? Very good for nvidia and very weak (again) for AMD's GPU division.

You reek of bias.

And if you can't understand the meaning of "relative," then I can't help you.

If it had not been for nVidia's relative blunder and AMD's relative success, I wouldn't have been surprised if we only had nVidia as a discrete GPU option today, although I'm sure you'd be thrilled if that were the case.