Why a generation of cards obsessed with power efficiency at the cost of raw power?

lauren24

Junior Member
Mar 25, 2013
3
0
0
I can only speak of Nvidia here because those are the only cards I have used. I used to have a 260, then I went to two 580s. I have stuck with the two 580s since, because Kepler has been kind of a disappointment. I thought Titan would be my next upgrade, but it also seems to have its issues.

I get that there is a market for high power efficiency and low heat - namely the server and mobile markets. However, I feel like Nvidia has catered to those people almost exclusively with Kepler. The Fermi generation of cards was all about raw power. If the cards ran too hot you would watercool them - that is what enthusiasts did. Nowadays it seems like the 600 line is all about low wattage. I understand some people are fine with sidegrading to a card with similar performance just because it uses fewer watts. But it seems like there is a hole in the market for enthusiasts who say "I don't care about low watts, give me all the power you can". Intel found a way to give CPUs to both sides.

For example, Intel releases high-efficiency/low-heat CPUs, mainly the i5 line and below. When overclocked, the i5-3570K only draws around 267W under load. That is great for people who want that. However, Intel also released a raw-power enthusiast CPU for people who want the most performance: the i7-3930K. When overclocked, this i7 draws about 525W under load. I really wish Nvidia had done this with their 600 series of cards. The 670 and below could have been the efficiency cards, and the 680, Titan, and rumored Titan Lite could have been the raw-power enthusiast cards.
 

Eureka

Diamond Member
Sep 6, 2005
3,822
1
81
Because good engineering is not the pursuit of performance at the cost of all else. Once you reach an acceptable level of performance, the next step is to make it more manageable. It's also in the nature of the die shrink... lower power draw.

The 670 and 680 are the same chip... why would the 680 be any different? They do have a high power card and it's the Titan.

Not to mention, the enthusiast market is very, very small and it's not profitable to just release for that small market. Consumers would like to save on power and buy cheaper power supplies/not replace their current ones. And the majority of manufacturers would probably like to deliver the same performance while cheaping out on power supplies, too.
 
Last edited:

lauren24

Junior Member
Mar 25, 2013
3
0
0
Because good engineering is not the pursuit of performance at the cost of all else. Once you reach an acceptable level of performance, the next step is to make it more manageable. It's also in the nature of the die shrink... lower power draw.

The 670 and 680 are the same chip... why would the 680 be any different? They do have a high power card and it's the Titan.

Not to mention, the enthusiast market is very, very small and it's not profitable to just release for that small market. And the majority of manufacturers would probably like to deliver the same performance while cheaping out on power supplies, too.

Titan is crippled by throttling problems. The power limit on the Titan is ridiculously low. People are flashing the BIOS to fix the throttling, but this voids the warranty. Titan is the most powerful single-GPU card, but it seems like it could have been more.

Because some of us have to pay electric bills.

I am not saying it should ONLY be one way or the other. That's why I gave the Intel example. They can cater to both like Intel tried to do.
 

Eureka

Diamond Member
Sep 6, 2005
3,822
1
81
Titan is crippled by throttling problems. The power limit on the Titan is ridiculously low. People are flashing the BIOS to fix the throttling, but this voids the warranty. Titan is the most powerful single-GPU card, but it seems like it could have been more.

Probably trying to cut down on the number of warranty returns, and probably due to the limited number they can manufacture. If it's as simple as a BIOS limitation, it means they're letting you do it - it's just your own responsibility. They could have done a hardware limitation and completely screwed the consumer over.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
GF100 - Hot - Power hungry - Loud at release, later becomes one of the best-clocking generations for many - Failure in sales.

GK104 - Warm - Efficient - Under 200W - Quiet, voltage locked, power locked, limited overclocking - Major success in sales.
 

lauren24

Junior Member
Mar 25, 2013
3
0
0
GF100 - Hot - Power hungry - Loud at release, later becomes one of the best-clocking generations for many - Failure in sales.

GK104 - Warm - Efficient - Under 200W - Quiet, voltage locked, power locked, limited overclocking - Major success in sales.

Totally agree with this. Shame that Fermi did not do better. This is what I mean about Nvidia putting efficiency over power. Voltage locks and power locks are not good for enthusiasts. I own two 580s and I see no reason to upgrade (and I am the type to upgrade frequently, as I am sure lots of people are :biggrin:)
 

Greenlepricon

Senior member
Aug 1, 2012
468
0
0
I like my $60 600W PSU, my temps around 45C under full load, the relative efficiency of my tower, and the quietness of it all. I also like my 7950 that pumps out insane graphics. Sure, some people would want to fit the biggest GPU possible in their case (you seem to be one of them), but there are also engineering problems with large GPUs, higher costs associated with them, etc. Electronics work better if they're smaller, more efficient, and cool. You can get more power out of them by ignoring all of the aforementioned, but in my opinion it's definitely not worth the problems.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The more power a device uses, the more expensive it becomes. You need a more expensive PCB, VRM, cooling solution, power supply, etc. Making something more efficient is just a smarter way to go.

If your competition can get similar performance and use less power they are going to win.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
It's also possible that the premise is mistaken.

Perhaps the Nvidia cards have *BOTH* extreme performance and also power efficiency.

I don't think that there is some rule to the universe that prevents a video card from demonstrating both characteristics simultaneously. Or maybe I'm looking at it the wrong way - triple SLI would appear to have high power consumption?
 

Centauri

Golden Member
Dec 10, 2002
1,631
56
91
...because TCO (total cost of ownership) is one of the most important equations in the purchase of anything - always has been and always will be?
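To put a rough number on the running-cost side of TCO - the wattage gap, hours of use, and electricity rate below are made-up example figures, not measurements of any real card:

```python
# Sketch: extra electricity cost of a higher-wattage card over a year.
# All numbers here are hypothetical examples, not measured figures.

def annual_power_cost(watts, hours_per_day, price_per_kwh):
    """Yearly electricity cost of a component drawing `watts` under load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. a 250 W card vs a 195 W card, gaming 4 h/day at $0.12/kWh
hot = annual_power_cost(250, 4, 0.12)
cool = annual_power_cost(195, 4, 0.12)
print(f"extra cost per year: ${hot - cool:.2f}")
```

With these example numbers the gap is under ten dollars a year, so the purchase price and PSU/cooling costs usually dominate the TCO picture.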
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
For example, Intel releases high-efficiency/low-heat CPUs, mainly the i5 line and below. When overclocked, the i5-3570K only draws around 267W under load. That is great for people who want that. However, Intel also released a raw-power enthusiast CPU for people who want the most performance: the i7-3930K. When overclocked, this i7 draws about 525W under load. I really wish Nvidia had done this with their 600 series of cards. The 670 and below could have been the efficiency cards, and the 680, Titan, and rumored Titan Lite could have been the raw-power enthusiast cards.

Your numbers are way, way off. Those look like "total system power consumption" numbers, and probably include a fully-loaded and overclocked GPU too.

The 3570K is a 77W TDP CPU, and overclocked to the max, worst-case it probably only takes double that.
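A back-of-the-envelope sum shows why a ~267W figure has to be total system draw rather than the CPU alone. The component wattages below are rough illustrative assumptions, not measurements:

```python
# Sketch: why a reported "CPU power" of ~267 W is really total system draw.
# Component wattages are rough illustrative assumptions.

system_draw = {
    "CPU (3570K, overclocked)": 120,  # well above the 77 W stock TDP
    "GPU under load": 100,
    "Motherboard/RAM/drives/fans": 45,
}

total = sum(system_draw.values())
print(total)  # roughly matches the quoted wall figure; the CPU is under half
```

Even with a generous overclock allowance, the CPU alone cannot plausibly account for the whole number.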
 

kurosenpai

Junior Member
Mar 25, 2013
18
0
0
A smaller GPU chip means less cost, cheaper to buy, fewer watts, less heat, a longer lifespan, lower electric bills - and lots of people like those things. The graphics card manufacturers are just going with the majority.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Actually, this generation just had a weak showing from AMD and a meh showing from Nvidia. The only real difference is that Nvidia's mid-range competed directly with AMD's top end for a while - it still does really, against the GHz Edition, at a lower TDP and with better perf/W, and it was a sub-200W chip. Later we see GK110 with a 250W TDP using about what the GHz Edition uses, only noticeably faster.

TDP/power really hasn't changed that much since Fermi; a 7970 GHz uses about as much power as the 580, as does the GTX Titan.

Below that we see the same cut-down versions offering less performance with less power consumption.

The only big differences this generation have been AMD's slip in performance advancement vs Nvidia, Nvidia's stupid boost (which AMD has copied), Nvidia's stupid power and overclocking lockout, and of course insane price gouging from both sides, with Nvidia taking the cake, eating it, and baking another with Titan.
 
Last edited:

Obie327

Junior Member
Mar 25, 2013
20
0
0
I think performance is key here - meaning efficient pixel-pushing performance. I've had a lot of Nvidia cards over the years and really like the GTX 680/660 performance. Not very strong compute cards, but great for gaming. Some of my favorites I had or still have: GeForce 6800 GT, 7800 GT, 7900, 8800 GTS and Ultra, GTS 250 1GB, EVGA GTX 260 216 Superclocked, EVGA GTX 460 1GB overclocked, MSI 560 Ti Twin Frozr overclocked, Zotac 550 Ti, EVGA Classified 560 Ti 448 overclocked (twin fan), MSI GTX 660 Twin Frozr III OC, Zotac GTX 680. Some were power hogs; others, like the 660 (non-Ti), are just amazing. I like to get the best bang for the buck these days - not too power hungry, but a good balance of price, power, and performance. I just play at 1080p though. I got the 680 for extreme eye candy for my 2600K :)
 

Obie327

Junior Member
Mar 25, 2013
20
0
0
I also have to say that single-card performance is most important for me. I really don't like dabbling in SLI'ing two cards; I'd rather get good gaming from one single good card than deal with stuttering, latency, and driver issues. With electricity, heat, and wattage getting out of control these days, I'm very satisfied with my purchase decisions, and with Nvidia lately making lower-wattage, efficient performance gaming cards.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
GF100 - Hot - Power hungry - Loud at release, later becomes one of the best-clocking generations for many - Failure in sales.

GK104 - Warm - Efficient - Under 200W - Quiet, voltage locked, power locked, limited overclocking - Major success in sales.

We have a winner.

High wattage, besides the continual cost, also puts more pressure on the product in terms of lifespan.

But again, 99% don't want high-wattage products if they can avoid it.
 

Maximilian

Lifer
Feb 8, 2004
12,604
15
81
GF100 - Hot - Power hungry - Loud at release, later becomes one of the best-clocking generations for many - Failure in sales.

GK104 - Warm - Efficient - Under 200W - Quiet, voltage locked, power locked, limited overclocking - Major success in sales.

Didn't realize Fermi failed in sales :confused:
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Didn't realize fermi failed in sales :confused:

It didn't. It was a major success. The GTX 460, GTX 560 Ti, and GTX 580 were way better sellers than any card in the current gen.

And the only AMD card with worse perf/w than its Nvidia counterpart is the 7970.

But that's Balla for you.
 

Haserath

Senior member
Sep 12, 2010
793
1
81
Because power efficiency is also performance.

And the fact that raw GPU performance costs money to make.
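One way to read "efficiency is also performance": within a fixed board power budget, performance is perf/W multiplied by the watts you are allowed to spend. The budget and perf/W figures below are hypothetical, chosen only to show the relationship:

```python
# Sketch: within a fixed power budget, performance = (perf per watt) * watts.
# Both perf/W figures are hypothetical, chosen only to show the relationship.

POWER_BUDGET_W = 250  # e.g. a typical high-end board power limit

def achievable_perf(perf_per_watt, budget_w=POWER_BUDGET_W):
    """Performance reachable when power, not architecture, is the ceiling."""
    return perf_per_watt * budget_w

old_arch = achievable_perf(0.30)  # 0.30 fps/W at 250 W
new_arch = achievable_perf(0.40)  # 0.40 fps/W at the same 250 W
print(old_arch, new_arch)
```

Since every card ships against some cooling and power ceiling, the more efficient architecture simply gets more performance out of the same budget.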
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Power consumption hit a maximum that they could not exceed without exotic cooling, and that is very expensive. The new process was also designed to help reduce power consumption; combine these two elements and the GPU uses less power and has some headroom compared to the previous generation. It also means the fans are not that loud.
 
Feb 19, 2009
10,457
10
76
Actually this generation just had a weak showing from AMD, and a meh showing from Nvidia.

Before Titan's release, AMD was not behind in single-GPU for consumers. Also, inferring power use from reference cards at 1.25V vcore, versus custom cards running much lower voltage, made AMD's efficiency look a lot worse than it really is.