Is the GPU market doomed from a consumer point-of-view?


poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I see hope on the software side now that hardware advances are slowing down.

What has sucked about PC gaming the last three years is that you need something like 50%+ more raw power in your PC than a console has to match the console experience, because of the inefficiencies of DirectX 11. And that isn't just the GPU; your CPU had to be a rockstar (compared to the crap console cores) as well.

When I see a game like Doom running on Vulkan, I am given hope for the whole industry. It looks amazing, and it runs great on old hardware from both vendors. That sort of game wasn't possible when DirectX 11 and the drivers associated with it were lobotomizing everything. Hopefully, with Vulkan and DirectX 12 (which we still haven't seen a great example of yet, IMHO), PC gaming will enter a golden era where the increased costs of GPUs and the stagnation of CPUs don't matter as much. A longer hardware refresh cycle isn't a bad solution to higher prices.

The other side is VR. If VR really takes off we could see new competition in the gaming space. Within 5 years we will see mobile SoCs that are what we currently consider to be "VR Capable" and that will change the game for the public's access to that technology.
 

philipma1957

Golden Member
Jan 8, 2012
1,714
0
76
Well, not for nothing, but as a heavy user of GPUs to earn money rather than game, the RX 480 is a better card than the R9 390.

The last 4-5 months have seen big profits mining ETH coins, and this affects the GPU market.

https://etherchain.org/


The network hash rate is about 4 TH/s. Since 40 RX 480s ≈ 1 GH/s, that works out to 4,000 GH/s × 40, or roughly 160,000 RX 480s.

We know the network is not purely RX 480s, but it is a lot of GPUs; at $250 USD per GPU, that is $40 million in graphics cards.
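As a sanity check, that back-of-the-envelope estimate can be written out in a few lines of Python. The figures (a ~4 TH/s network rate, 40 RX 480s per GH/s, i.e. ~25 MH/s per card, and ~$250 per card) are taken from the post above, not measured:

```python
# Back-of-the-envelope estimate of the GPUs behind Ethereum's hash rate.
# Figures from the post: ~4 TH/s network rate, 40 RX 480s per 1 GH/s
# (~25 MH/s per card), ~$250 USD per card.
NETWORK_HASHRATE_GHS = 4_000   # 4 TH/s expressed in GH/s
CARDS_PER_GHS = 40             # RX 480s needed to produce 1 GH/s
PRICE_PER_CARD_USD = 250

cards = NETWORK_HASHRATE_GHS * CARDS_PER_GHS
dollars = cards * PRICE_PER_CARD_USD

print(f"~{cards:,} RX 480-equivalents on the network")   # ~160,000 cards
print(f"~${dollars / 1e6:.0f} million worth of GPUs")    # ~$40 million
```

Of course the real network is a mix of cards, so "RX 480-equivalents" is just a unit of account here.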

So there is an extra demand for both AMD and Nvidia cards due to this.

How long it will last remains to be seen.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
I'm just annoyed that I'm on a GPU that is two generations old now, the cards near its price are closer to a side-grade than an upgrade, and the card that is an upgrade costs almost twice what I paid for my current GPU. It's a lot less than I expected out of a node shrink, for sure.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
290s and 980s were selling for around 350 at launch. Now we have $200 cards that match their performance. I don't see anything wrong with it.

Just don't expect rapid performance increases like we were used to before 28nm.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
290s and 980s were selling for around 350 at launch

Wat? :confused:

290's launched at $399 and were sold out everywhere then mining made their prices go over $500. 980's launched and stayed at $549 for a long time.
 
Aug 11, 2008
10,451
642
126
Pretty sure the RX460 is targeting $100-120. RX470 will be the $150 option. With proper DX12/Vulkan support, these low end GPUs should actually perform really well. Aren't they touting the RX460 as a VR card? That's a bold statement given its price point. I think they'll be lackluster against the current DX11 (950/960) cards though.

Far from it. It is being targeted at the MOBA crowd; it has 896 shaders or something. The 480 and perhaps the 470 are targeting entry-level VR.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Wat? :confused:

290's launched at $399 and were sold out everywhere then mining made their prices go over $500. 980's launched and stayed at $549 for a long time.
Woah, is my memory that bad? :oops: Should have googled to confirm my prices before posting, my bad. :oops:
 

nerp

Diamond Member
Dec 31, 2005
9,866
105
106
This entire thread is in the context of supply shortages and extreme demand for all the new generation of cards. Would be smart to bookmark and revisit 9 months from now.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Moore's law is dead, and this is essentially why dramatic improvements in performance have slowed down.

The excuse the industry has used is that "performance per watt" is now the focus. That focus wasn't a choice; they were forced to go this route as die sizes shrunk.

I'm not saying performance per watt is a bad metric to focus on, but it should never have been prioritized over faster performance. And this is why we still don't have adequate video cards that can handle 4K gaming without getting into stupidly expensive price ranges.

A GTX 1080 can drive older games at 4K, but no matter how fast a new card is, 4K (at max settings) will always be out of reach of single cards, mostly due to the devs always moving the goalposts. If they provide settings that push the same card at 1080p, 4K will struggle with them. And that is what devs always do in demanding high-end games.

It's the devs that prevent you from maxing out 4K with a single GPU. If they wanted, they could simply dial back the high-end settings of games, and you could do it on a single card.

Or you could dial back the settings, but that is absurd to most people, because 1080p or 1440p with higher settings looks better than 4K at lower settings (at least to those complaining). To me that means 4K is simply not worth it, since it gives less benefit than higher IQ settings; that, or it's a psychological issue that prevents gamers from accepting lower settings for a higher resolution.
 

guachi

Senior member
Nov 16, 2010
761
415
136
I had a 2560x1600 30" monitor when they were rare (still do) and I even used three screen Eyefinity.

Personally, I'd much rather have more pixels and screen real estate than max settings. Well, to be fair, more pixels is a setting unto itself. So, to me, 4K is "max settings" and 1080p is "low settings."

Also, the extra pixels are just wonderful to have in non-gaming mode.
 

Mat3

Junior Member
Jun 3, 2016
4
0
0
If people feel progress is starting to stagnate and prices are too high, just wait until the next process node (10nm FF). It'll make the four years at 28nm seem like they breezed by.
 

TidusZ

Golden Member
Nov 13, 2007
1,765
2
81
TIL: You could buy GTX 1080 performance in 2012 for $350.

There is a reason why these kinds of arguments can't be held. If you want to say the cost of a specific GPU die size went up, sure, you've got that right.

However, the performance stack is still somewhat the same.

Halo Parts outside of "Prosumer" or whatever the hell Nvidia/supporters want to call it are still $500+

Mainstream are still $200-300
Performance/Enthusiast are still $300-500

Relatively the same dollar value still nets you the same tier within current performance brackets.

In 2012 you were paying $350 for a card slotted between the $300 and $400 card performance, not GTX 1080 performance, which by today's standards is a $100 card at best.

In 2010 I bought a Radeon 5870 for $300-350 CDN, which was about as close to top of the line as the 1070 is now (second best, reasonably close to the best). The 1070 cost me $610 when I bought the cheapest one on Newegg one week ago. Btw, I said 2010, not 2012, and this is Canadian currency.

In 2009, I bought a top-of-the-line two-GPUs-on-one-card GTX 295 for like $500. Comparing it to today's cards 1:1, it would be a 1070 SLI on one card (two of the 2nd-best cards in one). It was the fastest card you could buy at the time. That'd cost about $1200 today.

If you wanna go back to 2002, I recall buying a GeForce4 Ti 4200 for well under $300 CDN, and it was within 20% of the fastest card before overclocking and probably about 5% after both are overclocked. That was a sub-$200 USD card. Of course, the top-of-the-line card at that time was only around $320 USD, compared to the 1080 at well over $600 USD.

$300 CDN used to get you the second-best card, with perhaps some change left over. When the GTX 670 came out it cost over $400, and with the 1070 it's $600+ CDN for the 2nd-best card, or $1000-1700 CDN for the best card now (I'd say the Titan is comparable to the two-GPU-on-one-card parts they never seem to make anymore, which cost $500 CDN in 2009).

I've been buying video cards and reading reviews and shit on them since before GeForce existed (since before AGP, even), and I've only seen the prices get ridiculous fairly recently, in the last 5 years or so. Of course, that coincides with ATI not being able to keep up with the fastest Nvidia cards, which they were able to do 5 years ago when they often had the fastest card.

A similar situation has existed/still exists with AMD/Intel, but luckily the price gouging by Intel has been absolutely minimal compared to what NVIDIA has done.

I bought a 1070 tho, so I guess I'm part of the problem, but that doesn't mean I'm gonna deny the history of it.
 
Last edited:

Rifter

Lifer
Oct 9, 1999
11,522
751
126
A GTX 1080 can drive older games at 4K, but no matter how fast a new card is, 4K (at max settings) will always be out of reach of single cards, mostly due to the devs always moving the goalposts. If they provide settings that push the same card at 1080p, 4K will struggle with them. And that is what devs always do in demanding high-end games.

It's the devs that prevent you from maxing out 4K with a single GPU. If they wanted, they could simply dial back the high-end settings of games, and you could do it on a single card.

Or you could dial back the settings, but that is absurd to most people, because 1080p or 1440p with higher settings looks better than 4K at lower settings (at least to those complaining). To me that means 4K is simply not worth it, since it gives less benefit than higher IQ settings; that, or it's a psychological issue that prevents gamers from accepting lower settings for a higher resolution.


What would help is if people realized that settings have little to no effect on actual appearance. Most people would be hard pressed to tell the difference between medium and high settings from stills you can look at and inspect for a long time, let alone in a fast-paced game while you are playing...

[Attached image: crysis3minimista_702591.jpg (Crysis 3 on minimum settings)]
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
What would help is if people realized that settings have little to no effect on actual appearance. Most people would be hard pressed to tell the difference between medium and high settings from stills you can look at and inspect for a long time, let alone in a fast-paced game while you are playing...

[Attached image: crysis3minimista_702591.jpg (Crysis 3 on minimum settings)]

I don't have a 4K monitor to compare, so I'm just guessing, but I'd say that 1440p may be the sweet spot atm. There are definitely diminishing returns on higher pixel counts, just as there are diminishing returns on higher quality settings. It will likely vary from person to person, but 1440p may be where most people would feel you get the best bang.

That said, there is definitely a psychological issue at play. The existence of settings that you cannot use when running 4K on a single card just overwhelms people. They'd actually be happier if the Ultra settings were removed and the "Ultra" label were slapped on what is now medium... but only if they owned a 4K display.
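For scale, the diminishing-returns point is easy to see in the raw pixel math. GPU fill-rate and shading cost grow roughly in proportion to pixels rendered per frame (only roughly, since geometry and CPU work don't scale with resolution), so 4K is about 4x the work of 1080p for a much smaller perceived jump. A quick sketch:

```python
# Pixel counts for common gaming resolutions. Shading and fill-rate
# cost scale roughly with the number of pixels rendered per frame.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the pixels of 1080p")
```

So 1440p renders about 1.78x the pixels of 1080p, while 4K renders exactly 4x, which is why the last step up costs so much more GPU than the one before it.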
 
Last edited:

Rifter

Lifer
Oct 9, 1999
11,522
751
126
I don't have a 4K monitor to compare, so I'm just guessing, but I'd say that 1440p may be the sweet spot atm. There are definitely diminishing returns on higher pixel counts, just as there are diminishing returns on higher quality settings. It will likely vary from person to person, but 1440p may be where most people would feel you get the best bang.

That said, there is definitely a psychological issue at play. The existence of settings that you cannot use when running 4K on a single card just overwhelms people. They'd actually be happier if the Ultra settings were removed and the "Ultra" label were slapped on what is now medium... but only if they owned a 4K display.

One thing I learned from having SLI 460s for 5+ years is that turning down the settings to keep up the framerate in newer games has little effect on visual quality (at 1080p, anyways).

I have an RX 480 on order and a 1440p display, so I'm most likely going to have to game at medium to keep 60+ FPS. I'll let you guys know if I notice a difference in settings at 1440p.