Is this the best generation of GPUs ever?

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I wish people would stop comparing console graphics with PC graphics.

There are companies that build engines catering to high-end PCs. You can always ask for more; 1080p is a pretty low resolution, and it won't be long before 1080p benchmarks are commonly replaced by 1440p. Heck, almost every review today uses 1440p or 1600p to show performance at higher resolutions. Today's single cards cannot handle that in every game with every option set to max quality. SLI or XFire, sure, but these high-resolution screens are about as demanding as 1080p was four years ago.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
All it's going to take is one game like Crysis 1 that's actually good to play and that hammers GPUs, and everyone will be scrambling to upgrade.

There are a couple of games on the horizon that look pretty GPU-demanding with everything maxed out, even at 1080p:

- Medal of Honor Warfighter
- Metro Last Light
- Crysis 3 (maybe?)

I think it'll take another 2 years before we see a true leap in graphics. But I wouldn't rule out something surprising us soon.

"2K Games boss Christoph Hartmann--who heads up development on franchises like Duke Nukem, BioShock, and Borderlands--believes photorealistic visuals are needed to help propel the industry into new genres. 'To dramatically change the industry to where we can insert a whole range of emotions, I feel it will only happen when we reach the point that games are photorealistic,' he said." ~ GameSpot

With the PS4 and Xbox 720 launching in late 2013 or early 2014, we should get a nice boost in gaming graphics. The current consoles are holding back the adoption of next-generation game engines, since it's too costly to design a large number of games around CryEngine 3 or Unreal Engine 4 when consoles can't even run them. Once next-gen consoles are DX11 with at least 1GB of GPU VRAM, future games should use next-generation DX11 features (bokeh depth of field, tessellation, dynamic lighting models, parallax occlusion mapping) on a regular basis. That's going to elevate the graphics of the average game.
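For illustration, here's a minimal C++ sketch (not from any real engine or console SDK; the 1GB bar just mirrors the figure above) of the kind of DXGI query a PC game could use to check dedicated VRAM:

// Minimal sketch: query the primary GPU's dedicated VRAM through DXGI
// and compare it against a 1GB threshold.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == DXGI_ERROR_NOT_FOUND) {
        factory->Release();
        return 1;
    }

    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);

    // DedicatedVideoMemory is reported in bytes.
    const SIZE_T oneGB = SIZE_T(1) << 30;
    std::printf("Dedicated VRAM: %zu MB -- %s\n",
                desc.DedicatedVideoMemory >> 20,
                desc.DedicatedVideoMemory >= oneGB ? "meets the 1GB bar"
                                                   : "below the 1GB bar");

    adapter->Release();
    factory->Release();
    return 0;
}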
 
Last edited:

Spjut

Senior member
Apr 9, 2011
931
160
106
The funny thing about finally getting at least DX11-class hardware into the consoles is that soon thereafter DX12 will probably be released (perhaps with Maxwell?), and this discussion will start all over again :D

If the PS4 and Xbox 3 get six-core CPUs or better, it's also possible that we'll finally have a reason to upgrade beyond quad cores.


I'd say the HD 4000 vs. GT200 generation was great; it brought AMD back for real and was about as "cheap" as high-end PC gaming can be.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It's been almost 5 years since DX9, and only recently have games started to use DX11 in decent volumes. DX11 is here to stay for a long time, especially since it's the foundation for Windows 7 and 8 and the new consoles. By the time DX12 games are out and using DX12-specific features, I bet it'll be past 2014. The HD 5870 had DX11 in September of 2009, and look how long it took before we started to see DX11 games. So if DX12 launches in 2014, at this pace it'll probably be 2016 before we see DX12 games.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
A good engine scales down. So it's built for high-end PC hardware, but you can scale it down for a console without a ton of reworking. I think that's the difference between the average game and a great engine used in a good game.

I do agree, though, that the first tastes of 3D-accelerated gaming were the best. We didn't care about max fps so much as enjoying the new visuals. Voodoo 1 cards, etc.
 
Last edited:

lakedude

Platinum Member
Mar 14, 2009
2,778
528
126
Part 1, reminiscing:

3dfx Voodoo with GLQuake: an amazing improvement in visual quality and smoothness. It makes clear exactly what a video card does, in case you didn't already know. Very memorable.

Other cards were memorable for bad reasons, like the All-in-Wonder cards that were too slow to record video properly. Many latest-and-greatest cards had lots of driver issues when they were brand new. Many high-end cards sucked power and ran very hot as well. Some were super loud (the 1800, I think).

Wasn't there a Ti 4200 or something that was the Celeron 300A of its day?

Part 2, today's cards:

My HD 7850 cost less, performs better, uses less power, and is rock solid (no issues at all). It is not all that much better than my 6950, but I love the 7850 just the same. It makes tons of credit on BOINC and plays FC2 at 1920x1200 with everything cranked up, including 8xAA, smooth as silk. Total system power for my best system ever is only around 180 watts doing any normal thing like gaming or running BOINC (overclocked; more like 160 watts at stock clocks). I love it!

Don't know what all the whining is about. Ivy Bridge is the same kind of deal: faster and uses less power. Awesome!
 

Spjut

Senior member
Apr 9, 2011
931
160
106
It's been almost 5 years since DX9, and only recently have games started to use DX11 in decent volumes. DX11 is here to stay for a long time, especially since it's the foundation for Windows 7 and 8 and the new consoles. By the time DX12 games are out and using DX12-specific features, I bet it'll be past 2014. The HD 5870 had DX11 in September of 2009, and look how long it took before we started to see DX11 games. So if DX12 launches in 2014, at this pace it'll probably be 2016 before we see DX12 games.

Yeah, I know DX11 will be used for a very long time; I'm just saying that next gen won't be different when it comes to PC gamers complaining about console limitations holding the PC back.

The PS3/360 were DX9-ish, and DX10 hardware came not too long thereafter, whereas this time the consoles will be DX11-ish and DX12 may not come too long thereafter. Perhaps DX12 won't be as big a change as DX10 was, though.

I'd guess the only way to "guarantee" a quick move towards DX12 is if the API gets backported to Vista/Win7/Win8 and can target DX11 hardware as well (like how the DX11 API can target DX9 and DX10/10.1 hardware, even getting some small benefits).
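As a minimal sketch of how that down-level targeting works in practice (assuming nothing beyond the stock d3d11.h API; the wrapper function name is made up), the app hands D3D11CreateDevice a best-first list of feature levels and the runtime returns the highest one the GPU supports, all through the single DX11 API:

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Sketch: one DX11 code path that runs on DX9/DX10/DX10.1/DX11 hardware.
bool CreateBestDevice(ID3D11Device** device, ID3D11DeviceContext** context,
                      D3D_FEATURE_LEVEL* achieved) {
    // Listed best-first; the runtime picks the highest level the GPU supports.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DX11 hardware
        D3D_FEATURE_LEVEL_10_1,  // DX10.1-class GPUs
        D3D_FEATURE_LEVEL_10_0,  // DX10-class GPUs
        D3D_FEATURE_LEVEL_9_3,   // DX9-class GPUs
    };
    // `achieved` reports the hardware class actually found, so an engine can
    // switch off tessellation and other DX11-only features on older GPUs.
    return SUCCEEDED(D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
        device, achieved, context));
}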

This is a little old, but the dev in this thread gives some insight into the current issue with draw calls in DX11. His posts are definitely worth a read. AMD has also spoken of the PC's big disadvantage in this area.
http://www.rage3d.com/board/showpost.php?p=1336825638&postcount=80
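To make the draw-call overhead concrete, here's a hypothetical sketch (the function names and setup are invented; only the two ID3D11DeviceContext calls are real API) contrasting per-object draws with a single instanced draw:

#include <d3d11.h>

// Naive: one driver round-trip per object -- thousands of calls per frame.
void DrawForestNaive(ID3D11DeviceContext* ctx, UINT indexCount, UINT trees) {
    for (UINT i = 0; i < trees; ++i) {
        // (per-tree constant-buffer update omitted)
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// Batched: per-instance transforms live in a second vertex buffer, and a
// single instanced call submits every tree at once.
void DrawForestInstanced(ID3D11DeviceContext* ctx, UINT indexCount, UINT trees) {
    ctx->DrawIndexedInstanced(indexCount, trees, 0, 0, 0);
}

Each Draw* crosses into the user-mode driver, which is the per-call CPU cost the dev in that thread is describing; instancing amortizes it.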
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Poor price/performance at 28nm launch MSRPs, but with 28nm maturity and competition heating up, it's improving. Overall, AMD and nVidia are offering compelling choices to consider.

Did go from an MSI Twin Frozr 2 GTX 470 to an MSI GTX 670 Power Edition -- a nice upgrade, but I felt I had to pay a 28nm premium.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yeah, I know DX11 will be used for a very long time; I'm just saying that next gen won't be different when it comes to PC gamers complaining about console limitations holding the PC back.

I agree. Usually the consoles hold up all right in the first 2-3 years but then become a bottleneck for game development again. This round, though, all the rumours point to the next generation of consoles not being that powerful. The rumoured GPU specs are atrocious (HD 6670/HD 7670), in the PS4's case paired with a Trinity/Kaveri-style APU. If true, that would be a bottleneck almost from day 1.
 
May 13, 2009
12,333
612
126
Not even close. Now is the worst time I can ever remember to upgrade your video card. Last-gen cards like the 580/6970, or even the previous-gen 480/5870, are more than capable of playing any game out there at halfway decent settings. Smart money kept their current cards and will wait for dramatic price drops or the next generation of cards. There are not many demanding games out that require buying the latest and greatest; there might be 2 or 3 games out right now that require high-end cards. To me it's just not worth it to buy a $400-$500 card to play 3 games at the highest settings. Now, if every new game that came out was a GPU killer, I could see buying a high-end card. For now I can live with turning down the AA or whatever other eye candy for a couple of games and save the money for a real upgrade.
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
The current situation has got nothing to do with consoles per se. It is a change in the common resolution that is taking place.

Back in the day 1280x1024 was the norm, and GPUs could max that out but cried at 1080p. The same is now the case with 1440p. In a few years' time a new resolution will be the norm, and GPUs will cry again.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
A good engine scales down. So it's built for high-end PC hardware, but you can scale it down for a console without a ton of reworking. I think that's the difference between the average game and a great engine used in a good game.

I do agree, though, that the first tastes of 3D-accelerated gaming were the best. We didn't care about max fps so much as enjoying the new visuals. Voodoo 1 cards, etc.

The thing is, that is totally not the approach currently taken. Many of the game engines today are being built for the console and then scaled up for the PC. Sometimes this is OK; sometimes this is terrabad.

The thing is, targeting multiple platforms is expensive and hard. There are some pretty huge differences between the Xbox, the PlayStation, and your PC.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
Many of the game engines today are being built for the console and then scaled up for the PC. Sometimes this is OK; sometimes this is terrabad.

Sometimes? I submit it's always bad. (How else will we get the developers to stop doing it? >.<)
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Sometimes? I submit it's always bad. (How else will we get the developers to stop doing it? >.<)

It really isn't the developers' fault. The publisher wants to publish for the console. The console is their primary, #1, absolute goal (it is where they make the most sales and the most money). If something doesn't work well on a console, it is a show-stopping "we can't ship this" sort of deal. It only makes sense that they target the console first and grab whatever low-hanging fruit they can later.

If PC gaming were more lucrative, we would see more games targeting it. Unfortunately, PC gaming is more complex than console gaming. That extra bit of knowledge about a computer's speed and the installation process is just too much for most of the population. It is much easier for them to have a machine that just always works.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
We PC gamers absolutely understand that. And we absolutely hate it.

:), understood. Things would be better if publishers were willing to lose a little bit of money to polish and perfect things. As it stands, they just want to push crap out and move on to the next title.

IMO, one of the reasons Blizzard and Valve have done as well as they have is that they have been willing to polish things up before (and a little after) releasing.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
Honestly, with how large the enthusiast community is (just take a quick look at the active members on this board alone), they'd more than make up the money 'lost' by developing initially for a more capable system. Granted, they'd need to do it correctly. Indeed, Blizzard and Valve are good examples of correctly polishing a game, though their steps are more mediocre than I'd like to see.

I'm demanding because my PC is rawr! Hardware and software will never be on par, but it just sucks being held down by the masses. :/
 

RaistlinZ

Diamond Member
Oct 15, 2001
7,470
9
91
:D


I think the Radeon 5xxx / GTX 4xx generation was a good one.
 

Gordon Freemen

Golden Member
May 24, 2012
1,068
0
0
The one thing that has become substantially better is efficiency: in general, with the current crop of brand-new cards, the low end matches the performance of last gen's midrange, and this year's midrange matches last gen's high end, and as a bonus it does so while consuming less power and creating less heat/noise. The one thing where this latest gen fails consumers is that the industry has stealthily convinced end users to pay substantially more money for the high-end cards this time around, to the tune of $100 or so over last generation's high-end offerings. SURPRISE. But am I the only one that sees this, or am I off point a bit? Let me know what you think.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
If the HD 7970 is hot and power hungry, what does that make a stock GTX 480? At stock speeds the 480 uses 80W+ more power than an HD 7970, and even at 1080MHz, after-market 7970s draw 70W less than a stock 480. Only reference 7970s run hot, btw. Your overclocked 480 is probably drawing 100W more power than a 1.2GHz HD 7970. If you look at the difference in power consumption between the 7970 and 680, it's very small. The difference in power usage between the GTX 480 and 7970 is huge!

How many times do people have to tell you peak power consumption is not indicative of power draw in games? The 480 is a 250-watt card and so is the 7970.

http://www.tomshardware.com/reviews/radeon-hd-7970-ghz-edition-review-benchmark,3232-18.html

7970s pull the same amount of power as a 580. Quit lying.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Well, his link is for the peak power consumption in Crysis 2, which last time I looked was a game, as opposed to your 3DMark link, which is not...

BD231

Lifer
Feb 26, 2001
10,568
138
106
Well, his link is for the peak power consumption in Crysis 2, which last time I looked was a game, as opposed to your 3DMark link, which is not...

Proof? And even if it is, he's still lying; the GHz edition is only drawing 40 watts less.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@BD231

How many times do people have to tell you peak power consumption is not indicative of power draw in games? The 480 is a 250-watt card and so is the 7970.

7970s pull the same amount of power as a 580. Quit lying.
When playing Crysis 2 (TechPowerUp's "Average" measurement: Crysis 2 at 1920x1200, Extreme profile, representing typical gaming power draw; average of all readings (12 per second) while the benchmark was rendering, no title/loading screen):

163 watts for the 7970.
214 watts for the 580 (~31% more than the 7970).
257 watts for the 480 (~58% more than the 7970).

When idle at the desktop (http://tpucdn.com/reviews/AMD/HD_7970/images/power_idle.gif):
12 watts for the 7970.
32 watts for the 580.

You quit lying; can't argue with facts.

No, the 7970 *isn't* a 250-watt card *like* the 480. <------- (the point)
The 7970 uses ~163 watts while you play Crysis 2, compared to the 257 watts the 480 uses doing the same thing.


Proof? And even if it is, he's still lying; the GHz edition is only drawing 40 watts less.

More like 50 watts :) but still... you're right, that's only like 25% more than the GHz edition. Small potatoes, right?
 
Last edited: