Is this the best generation of GPUs ever?

AnandTech Forums — page 4

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
8800GTX was by far the biggest leap in both performance and technology. I bought one right after they were released. It was just awesome compared to earlier Nvidia cards and ATI's top-of-the-line cards.

Another big leap for me was when the Voodoo 1 cards were released. I remember patching Quake 1 with the 3DGlide patch. The jump was jaw-dropping, to say the least.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The biggest leap I ever witnessed was with the ATI 9700 Pro with AA/AF gaming. Not only that, but the product offered wonderful quality on polygon edges and finally rid the gaming world of ordered-grid AA. High performance, high quality, efficient, and balanced -- maybe the most impressive architecture for a window of time.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
How many times do people have to tell you peak power consumption is not indicative of power draw in games?

Peak gives an idea of what PSU we need to handle the card. TPU has average measurements as well.

If you want me to link averages:
7970 = 163W
680 = 166W
7970 SOC = 203W
7970 GE reference (a card that doesn't exist in resale) = 209W
GTX580 = 214W
GTX480 = 257W

The 480 is a 250 W card and so is the 7970.

No, not even close.

TDP != power consumption.
http://www.techspot.com/review/555-gigabyte-radeon-hd-7970-soc/page6.html

and
http://www.guru3d.com/article/msi-geforce-gtx-670-power-edition-oc-review/8

and
http://www.behardware.com/articles/...asus-his-msi-powercolor-sapphire-and-xfx.html

There isn't a single review in the world where the HD7970 draws 250W of power at the PSU level. It is nowhere near as power hungry as the 480. In fact, an 1100 MHz HD7970 uses less power than a stock 480.
http://www.behardware.com/articles/...asus-his-msi-powercolor-sapphire-and-xfx.html
1070 / 1400 MHz @ 1.150V: 202W
1070 / 1400 MHz @ 1.174V: 211W (by default)
1100 / 1775 MHz @ 1.174V: 225W
1125 / 1775 MHz @ 1.200V
1150 / 1775 MHz @ 1.225V: 256W (only after you raise the voltage past 1.2 V and the clocks to 1150-1200 MHz does the card start drawing more than 250 W). A lot of 7970 cards can hit 1150 MHz at 1.174/1.175 V (stock), so this test isn't exactly the best-case scenario either.
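As a quick sanity check, the averaged figures quoted above can be compared directly. A minimal sketch (card names and wattages are copied from the numbers in this post; the 225 W figure is the behardware 1100 MHz reading):

```python
# Average in-game power draw (watts) as quoted in this post.
avg_power = {
    "HD 7970": 163,
    "GTX 680": 166,
    "HD 7970 SOC": 203,
    "HD 7970 GE ref": 209,
    "GTX 580": 214,
    "GTX 480": 257,
}

def pct_more(a, b):
    """Percent more power card a draws than card b, on average."""
    return 100.0 * (avg_power[a] - avg_power[b]) / avg_power[b]

# A stock GTX 480 averages ~58% more than a stock HD 7970:
print(f"GTX 480 vs HD 7970: +{pct_more('GTX 480', 'HD 7970'):.0f}%")

# Even an 1100 MHz HD 7970 (225 W per behardware) stays under a stock 480:
print(225 < avg_power["GTX 480"])
```

The same one-liner works for any pair in the table, which is why average draw, not TDP, is the useful comparison here.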


7970s pull the same amount of power as a 580; quit lying.

That's very mature.

The 7970 drew less in your link than the 580, and the 7970 GE reference used 1 W more. Of course you can't buy a reference 7970 GE since they don't exist. How about that Gigabyte SuperClock GE card that drew 203W at TPU?

I said the HD7970 draws less than the 580, not the HD7970 GE. The power consumption of the 7970 has been discussed in many threads already. It draws about 210-225 W at 1150 MHz @ 1.175 V depending on the model, and 250-260 W or so at 1200 MHz @ 1.225 V. The Lightning consumes the most power out of any 7970, so I used the worst-case scenario. Believe what you want.
 
Last edited:

mikeymikec

Lifer
May 19, 2011
20,378
15,068
136
[attached image: 000049-00.jpg]
:D

I'll raise you a Creative Labs Graphics Blaster Exxtreme :)
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
5850 came out in September of 2009 for $259. Nearly 3 years later the $250 7850 only beats it by ~25%.
 

lakedude

Platinum Member
Mar 14, 2009
2,778
528
126
5850 came out in September of 2009 for $259. Nearly 3 years later the $250 7850 only beats it by ~25%.
Beats it at what by only 25%? How about compute? How about power consumption? How about Crossfire scaling?

My 7850 runs 24/7 doing compute work when it is not gaming. That 40nm 5850 is going to do less work and take more power to do it.

If you don't care about power consumption or compute work then the 7850 might seem a bit underwhelming...certainly not worth upgrading from a 5850 in any case.

I find the 7850 amazing. It is fast, cool, quiet, and it does not run up my electric so much.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
Beats it at what by only 25%? How about compute? How about power consumption? How about Crossfire scaling?

My 7850 runs 24/7 doing compute work when it is not gaming. That 40nm 5850 is going to do less work and take more power to do it.

If you don't care about power consumption or compute work then the 7850 might seem a bit underwhelming...certainly not worth upgrading from a 5850 in any case.

I find the 7850 amazing. It is fast, cool, quiet, and it does not run up my electric so much.
The 7850 is for gaming, not compute, LOL. 25% is not impressive, and forget OCing.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Beats it at what by only 25%? How about compute? How about power consumption? How about Crossfire scaling?

My 7850 runs 24/7 doing compute work when it is not gaming. That 40nm 5850 is going to do less work and take more power to do it.

If you don't care about power consumption or compute work then the 7850 might seem a bit underwhelming...certainly not worth upgrading from a 5850 in any case.

I find the 7850 amazing. It is fast, cool, quiet, and it does not run up my electric so much.
That's beyond sad that, after nearly 3 years, you find a 25% increase in performance at the same price point amazing.
 

CraigRT

Lifer
Jun 16, 2000
31,440
5
0
[attached image: 000049-00.jpg]
:D


I think the Radeon 5xxx / GTX 4xx generation was a good one.

Amazing. I was going to post this exact thing. I loved my V770; the thing was so awesome in its day. It was definitely my most memorable video card ever. I really liked my GF2 GTS too.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
This last generation has been pretty shit; the percentage jump in performance from the 580 to the 680 is fairly poor for a generational leap.

The most memorable leaps in recent history are the 8800GTX, which was something like 2x the speed of the previous generation, and the 9700 Pro, which brought down the performance cost of MSAA significantly and blew both Nvidia and the previous generation out of the water.

I've bought a top-end video card from every generation since the GeForce 4. This current generation is the first time I've skipped getting a new card, and it's also the time in my life when I've had the most disposable income.

Not that it matters much; we don't really have any new games to make use of all this extra power anyway. We're stuck with nothing but console ports designed to run on 6-year-old hardware.