PSA: GPU prices are only going up

Page 2

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I don't see this happening. What you keep overlooking is that the biggest profit margins, by far, are found on high-end professional products that aren't going to be supplanted by less powerful integrated solutions. GK110 could easily have justified its production cost based on Tesla cards alone; the consumer releases were just icing on the cake for Nvidia. This is demonstrated by the fact that they made an improved version, GK210, just to do a Tesla refresh. And even though Titan X (GM200) sacrifices Double Precision performance, Nvidia still touted its GPGPU capabilities for applications that don't need DP.

The Quadro K5000 debuted at $2,499. That's a GK104-based card. Imagine the profit margin on this. It wouldn't surprise me if a majority of GK104's profits (not number of cards sold, but actual profits measured in dollars) came from the Quadros, rather than the GTX series (680, 670, 660 Ti, 770, 760). Those GTX sales were just a nice extra bonus that required little additional R&D costs to generate.

You seem to think that losing sales on low-end trash is going to deal a death blow to the discrete GPU market. I don't see that happening. The sub-$100 cards are already pretty much dead, and no one cares. They haven't been refreshed and probably won't get refreshed. Better iGPUs may push the level of minimum discrete GPU viability up to $200, but that won't make a substantial difference in the ability to pay for R&D, because those sales were always low margin to begin with. The real money was always in the high end.

First of all, nVidia can't live by selling Tesla and Quadro cards only.

Secondly, Tesla is going to die to Xeon Phi (for DP). That train has already departed and even nVidia doesn't dispute it; hence their newfound love for half precision and neural networks. Intel's FPGA purchase didn't exactly make it easier either.

Not low margins, but low manufacturing cost and low sales price. Plus, remember cash flow. You forget how the business works. The workstation GPUs you say will save the day only ship around 4 million units a year, and that's AMD + nVidia combined, covering top to bottom.

You may not see it happening, but it's already happening. Look at AMD's GPU division today.

The GPU will die. Its fate is sealed.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Fact: AMD is releasing new products very soon, on an updated arch., with new HBM technology. Sure doesn't sound like a company that is "out of the game completely" to me...

Ironically, compared to your statement, AMD still designs and produces leading-edge IGPs, too. They aren't only a discrete company, like NVidia mostly is.

How many new GPUs are there in AMD's new products that will release soon? One new GPU in one or two new cards? What do you call that?

For a successful IGP you also need a successful CPU.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
I don't see this happening. What you keep overlooking is that the biggest profit margins, by far, are found on high-end professional products that aren't going to be supplanted by less powerful integrated solutions. ...

This argument keeps showing up; it is as old as dirt and was as worthwhile 20 years ago as it is now. These companies cannot survive on the paltry 2-5% of the market the high end offers. The overall margin for Nvidia dGPUs is ~55%. You cannot sever 95%+ of your unit sales and survive on the high end, except in niche markets, even if the high end had, say, a 90% margin (which itself is very doubtful). Do the math.
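
To make "do the math" concrete, here is a minimal back-of-the-envelope sketch. Apart from the ~55% blended margin quoted above, every number in it (unit volume, average selling prices, the 5% high-end share, the 90% high-end margin) is an assumption for illustration, not a figure from any report:

```python
# Back-of-the-envelope: could the high end alone replace today's gross profit?
# All inputs except the ~55% blended margin are illustrative assumptions.

total_units = 10_000_000      # assumed annual dGPU unit shipments
avg_price = 250               # assumed blended average selling price ($)
blended_margin = 0.55         # ~55% blended gross margin (figure from the post)

high_end_share = 0.05         # high end assumed to be 5% of units
high_end_price = 700          # assumed high-end ASP ($)
high_end_margin = 0.90        # deliberately generous 90% margin assumption

current_profit = total_units * avg_price * blended_margin
high_end_profit = total_units * high_end_share * high_end_price * high_end_margin

print(f"Blended gross profit:  ${current_profit / 1e6:,.0f}M")
print(f"High-end-only profit:  ${high_end_profit / 1e6:,.0f}M")
print(f"Fraction retained:     {high_end_profit / current_profit:.0%}")
```

Even with a 90% margin on the top tier, the high end alone recovers under a quarter of the blended gross profit with these made-up numbers.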

Nvidia has a long way to go before death in the dGPU space, but it's easy to see that train coming.

http://en.wikipedia.org/wiki/Matrox


"... despite huge claims for the Matrox Parhelia, their performance continued to be quickly outpaced by the major players.

Since then, Matrox has continued to shift the focus of its card designs towards specialized, niche markets, moving more deeply into enterprise, industrial, and government applications. "
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
iGPUs are unlikely to be enough unless we end up with some massive improvements in chip manufacturing. Even at 7nm they are unlikely to match up to mid range GPUs.

They will be enough. Just look at what AMD has achieved with the PS4, and that's already 1.5-year-old tech. Well, the tech is actually a lot older than that, but it's been 1.5 years since AMD/Sony brought it together in a highly efficient package.

The only things limiting them today are TDP and slow interfaces (PCI Express/memory access), but we have the tech to overcome that as well. With a unified pool of HBM2 memory, the slow-interface issue is all but gone, and TDP is a very easy thing to crank up provided you have the cooling, and we have the experience to do that as well.
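
For a sense of scale on those interfaces, here is a quick comparison of approximate peak bandwidths. These are nominal published peaks, rounded; treat them as ballpark figures rather than benchmarks:

```python
# Approximate peak bandwidths of the links being discussed (nominal, rounded).

links_gb_per_s = {
    "PCI Express 3.0 x16 (one direction)": 15.75,
    "Dual-channel DDR4-2133 system RAM": 34.1,
    "Single HBM2 stack (1024-bit @ 2 Gbps)": 256.0,
}

for link, bw in sorted(links_gb_per_s.items(), key=lambda kv: kv[1]):
    print(f"{link:40s} ~{bw:6.1f} GB/s")
```

A unified HBM2 pool sits an order of magnitude above both the PCIe link and ordinary system RAM, which is the point being made about the interface bottleneck.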
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
Remember, the discussion is not about whether IGPs will catch up to GPUs. Whether they do or don't has no bearing on the fate of GPUs.

GPUs will vanish due to not being economically viable to produce anymore.

Not in a reasonably short timeframe. I anticipate the demand for better discrete GPUs will only increase with mainstream 4K+ adoption and things like VR getting close to its first mainstream product.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
No way dude, you are telling me you actually read the report and analyzed the numbers? Why are you making so much sense??

Also, OP, chill out. Bang for buck will continue to go up even if average selling prices for dGPUs go up.

Ain't NV making record revenues and profits in the same timeframe where Intel iGPUs were improving by leaps and bounds? :rolleyes:
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Not in a reasonably short timeframe. I anticipate the demand for better discrete GPUs will only increase with mainstream 4K+ adoption and things like VR getting close to its first mainstream product.

Hi, do you have any links, or at least the name(s) of this/these first mainstream VR product(s)?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Not in a reasonably short timeframe. I anticipate the demand for better discrete GPUs will only increase with mainstream 4K+ adoption and things like VR getting close to its first mainstream product.

Unlikely. I don't see anything that can change the trend.
http://jonpeddie.com/publications/add-in-board-report/

On a year-to-year basis, we found that total AIB shipments during the quarter fell -19.41%, which is more than desktop PCs, which fell -6.52%.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Ain't NV making record revenues and profits in the same timeframe where Intel iGPUs were improving by leaps and bounds? :rolleyes:

nVidia only does that because AMD is in free fall. Remember the market share changes over just the last year.

[Attached image: Table1rev2.JPG]


For total share with Intel included and AMD's APUs, it looks like this:
[Attached image: Chart 1.JPG]
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
We see these threads periodically over the years but it never comes to pass.

Quite frankly, GPU prices have always remained pretty good, even at the high end. I've purchased a flagship GPU (or two mid-range cards for SLI/Crossfire) every generation since I was 13 (18 years ago), up until the GTX 580, which is the first time I ever took a break; I didn't buy again until the GTX 980.

The prices have increased somewhat, but that is primarily due to inflation. Pick a card from at least a decade ago, something like the GeForce4 Ti 4600 which I owned, retailing at about $400 USD; calculate the inflation on that and today it's worth about $530, which is pretty much what the GTX 980 launched at.
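
As a quick sketch of that adjustment (the ~33% cumulative US CPI change from 2002 to 2015 is my approximation, not an exact figure):

```python
# Inflation-adjust the GeForce4 Ti 4600 launch price into ~2015 dollars.

ti4600_launch_price = 400      # approximate 2002 launch price ($), from the post
cumulative_inflation = 1.33    # assumed 2002 -> 2015 US CPI factor (~33%)

adjusted = ti4600_launch_price * cumulative_inflation
print(f"Ti 4600 price in ~2015 dollars: ${adjusted:.0f}")   # ~$530
# For comparison, the GTX 980 launched at $549.
```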

The tech is increasing in speed a bit more slowly now than it used to; each node shrink is getting harder and takes more investment to get working, which is partly why I stopped upgrading after the GTX 580. The 600 and 700 series were disappointing and largely unnecessary, due in large part to consoles dumbing down graphics for the better part of a decade.

If you should be concerned about anything, it's the government robbing you with hidden taxes (a.k.a. inflation) by printing huge sums of money, devaluing all the savings and assets you hold as well as devaluing your wage.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
We see these threads periodically over the years but it never comes to pass. ...


Your time horizon just isn't long enough. I've been working with computers since the time of the S-100 bus systems.

What you're talking about, and what most people talk about regarding GPUs today, is all just an evolution of existing tech. You note a trend (moving to iGPUs) but don't see the ultimate conclusion of that trend.

Higher resolution, faster 3D rendering, etc. isn't going to stop the current trends. And the current trend is that dGPU markets are shrinking, and the pace of shrinking is accelerating. It's partly because the desktop space is stagnant, but within that space dGPUs are losing ground to iGPUs as well.

If dGPUs are to regain any kind of traction in the market, it will have to come from something revolutionary that you can't effectively do with an SoC or iGPU; it's not going to come from doing the same thing.

Companies like Nvidia, Google, Intel, and Microsoft understand this. Technologies like HoloLens, Oculus Rift, Google Glass, and Meta Lens are all efforts to move to the next generation of UIs.

If we are still on the LCD / mouse paradigm in 10 years, it will be almost entirely on iGPUs.

If there is a long-term future for dGPUs and the graphics industry in general, it will be in doing something like this:


[Attached images: maxresdefault.jpg, 0feb7a56-f8c1-45e1-a412-a7edf8141f4e.jpg, hololens3b.jpg]
 

artvscommerce

Golden Member
Jul 27, 2010
1,145
17
81
I'm quite curious how far off we are from it being common for games to be rendered on a server somewhere. I know most of us hardware enthusiasts probably hate that idea, but at the end of the day, if those options give us a better gaming experience, it would inevitably become a popular choice.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I'm quite curious how far off we are from it being common for games to be rendered on a server somewhere. I know most of us hardware enthusiasts probably hate that idea, but at the end of the day, if those options give us a better gaming experience, it would inevitably become a popular choice.

It's getting closer, that's for sure. nVidia is making an attempt at it. It's not an entirely far-fetched idea considering almost everything is moving to the cloud now.
 

artvscommerce

Golden Member
Jul 27, 2010
1,145
17
81
It's getting closer, that's for sure. nVidia is making an attempt at it. It's not an entirely far-fetched idea considering almost everything is moving to the cloud now.

I've tried "GRID Gaming" on the Shield briefly and it worked much better than I was expecting it to. However I only spent a few minutes with it. Seems like the technology is close enough, but i think its going to be a tough sell since it's a bit of a game changer.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
I'm quite curious how far off we are from it being common for games to be rendered on a server somewhere. I know most of us hardware enthusiasts probably hate that idea, but at the end of the day, if those options give us a better gaming experience, it would inevitably become a popular choice.

Latency makes this an unviable option, especially given the dilapidated condition of America's Internet infrastructure. Even with Google Fiber to every house in the country, you'd still have a hard time making it work as well as local play for twitch games.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I'm quite curious how far off we are from it being common for games to be rendered on a server somewhere. I know most of us hardware enthusiasts probably hate that idea, but at the end of the day, if those options give us a better gaming experience, it would inevitably become a popular choice.

I think it's physically impossible for the experience to be better than on an actual gaming computer. I mean, we talk about how PCI Express 3.0 is "slow" when discussing major bottlenecks between the CPU and GPU. I just don't see how a computer miles and miles away rendering the image and streaming it through an exponentially slower and higher-latency medium (a horribly inconsistent one, I might add) is going to give a better experience.
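
To put a rough number on that gap (the PCIe figure is the nominal peak; the 50 Mb/s home connection is just an assumed example):

```python
# Compare the "slow" local link (PCIe 3.0 x16) with a typical home connection.

pcie3_x16_gb_per_s = 15.75                                # nominal peak, one direction
home_downlink_mbit = 50                                   # assumed home connection (Mb/s)
home_downlink_gb_per_s = home_downlink_mbit / 8 / 1000    # convert to GB/s

ratio = pcie3_x16_gb_per_s / home_downlink_gb_per_s
print(f"PCIe 3.0 x16 has roughly {ratio:,.0f}x the bandwidth of a 50 Mb/s link")
```

That is a gap of more than three orders of magnitude, before latency and jitter even enter the picture.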

It's nothing more than a compromise brought on by the explosion of mobile computing and the push towards low power and high efficiency. For those devices it may provide a better experience, but it isn't going to beat having the actual hardware doing the processing right at your fingertips.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yeah, it's not going to be the same thing. But it may capture some of the market. I could imagine future console gaming being more or less cloud-based. Just think of MS's claims about Xbox servers.
 

artvscommerce

Golden Member
Jul 27, 2010
1,145
17
81
I think it's physically impossible for the experience to be better than on an actual gaming computer. I mean, we talk about how PCI Express 3.0 is "slow" when discussing major bottlenecks between the CPU and GPU. I just don't see how a computer miles and miles away rendering the image and streaming it through an exponentially slower and higher-latency medium (a horribly inconsistent one, I might add) is going to give a better experience.

It's nothing more than a compromise brought on by the explosion of mobile computing and the push towards low power and high efficiency. For those devices it may provide a better experience, but it isn't going to beat having the actual hardware doing the processing right at your fingertips.


While I don't disagree with what you're saying, I think if you gave the technology a try you would be quite surprised/impressed. I played Dirt 3 over a WAN connection with ~140ms latency and ~11 Mb/s of bandwidth, and I couldn't tell the difference in responsiveness between remote and local (however, I don't doubt that a more competitive gamer would have noticed the difference). I also think there are other reasons that will make this technology difficult to get off the ground.
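
For context on what ~140ms of round-trip latency means in frame terms (a minimal sketch; the 60 fps target and treating the full round trip as added delay are my assumptions):

```python
# How many 60 fps frames fit inside a 140 ms network round trip?

network_rtt_ms = 140          # round-trip latency from the post
frame_time_ms = 1000 / 60     # ~16.7 ms per frame at 60 fps

extra_frames = network_rtt_ms / frame_time_ms
print(f"~{extra_frames:.1f} frames of added input-to-display delay")   # ~8.4
```

Whether roughly eight extra frames of delay is noticeable clearly varies with the game and the player, which matches the caveat about competitive gamers.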
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Your time horizon just isn't long enough. I've been working with computers since the time of the S-100 bus systems. ...

My time frame is 100% of the 3D accelerator market. I was using PCs and getting into the hardware when graphics cards weren't accelerated for 3D, and when the first 3D accelerators coming out were second video cards that you'd link to your 2D video card with an external VGA cable; I think those were the Matrox Millennium days. I had experience with all the early video cards through 3dfx's lifespan until their death, and the raging war between Nvidia and ATI/AMD ever since.

There is no trend to move towards iGPUs as replacements for gamers; there's a trend to include iGPUs as standard because there's plenty of die space on CPUs, so why the hell not. It's great for businesses and for mobile devices like laptops, which would otherwise need a very slow dGPU.

The point is that the dGPU market is bigger than the gaming segment, and not everyone who has a dGPU necessarily needs one. It's great that iGPUs are taking that burden, but those people aren't the ones spending huge amounts in the dGPU market on gaming cards to run CoD.

It's funny you mention stuff like the Rift, because with a 75-90Hz expectation, and with 1080p being woefully inadequate, what they need is bigger and faster dGPUs, not horribly slow iGPUs. There will be no gaming on the Oculus Rift on iGPUs, because there's going to be demand for at least 2560x1440 resolution at 90Hz, and even the very fastest dGPUs will struggle to adequately supply that in modern games. 4K is here now as well; the thirst for ever better graphics drives the adoption of dGPUs. Sure, sales can go down because some of the prior dGPU market was servicing a segment that's better served by iGPUs, but that's only one segment of the market. PC gaming (the kind that requires dGPUs) is alive and well and always has been.

I've seen these threads over several decades and nothing much has changed.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
While I don't disagree with what you're saying, I think if you gave the technology a try you would be quite surprised/impressed. I played Dirt 3 over a WAN connection with ~140ms latency and ~11 Mb/s of bandwidth, and I couldn't tell the difference in responsiveness between remote and local (however, I don't doubt that a more competitive gamer would have noticed the difference). I also think there are other reasons that will make this technology difficult to get off the ground.

I have. I use Share Play with friends all the time on PS4 and I've tried PS Now, which is cloud-based as well. It works, I'm not saying it doesn't. I'm just saying it's not going to provide a better experience than having your monitor connected directly to a capable gaming machine.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Yeah, it's not going to be the same thing. But it may capture some of the market. I could imagine future console gaming being more or less cloud-based. Just think of MS's claims about Xbox servers.
Would easily capture me. I already use in-home streaming, and I get high latency over wireless if I use that, and it's still fine.

I think MS did have Xbox servers, right? But the market freaked out when MS tried to introduce cloud computing to gaming.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Not sure why people even think it would be beneficial to keep dGPUs alive for the common user...their very existence is problematic in the future.

If dGPUs stuck around indefinitely....they would for sure become a lot more expensive over time, unless the market decided to stand still for all eternity...which it won't.


Not only can companies make more profit by putting it all on one chip...the user will also benefit from cheaper prices.

Let's say you buy 2 chips produced at 7nm, at a ridiculously low yield....
So you end up having to essentially shoulder the failure rate cost for 2 chips.

Now lose maybe 15% of that performance...but only be required to purchase ONE chip and have savings of up to 50% (in a perfect world scenario).

Dunno man....MOST people would prefer paying $1000 instead of $2000 if the difference you get in performance is that marginal.
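
A toy sketch of the kind of cost math being gestured at here, using a simple exponential yield model. Every input (wafer cost, defect density, die areas, the per-product overhead for package/board/memory/cooling, and the retail markup) is an assumption for illustration only:

```python
import math

# Toy cost model: two separate chips (CPU + dGPU) vs one integrated chip.
# All inputs are illustrative assumptions, not real cost data.

def silicon_cost(wafer_cost, wafer_area_mm2, die_area_mm2, defects_per_mm2):
    """Cost of one good die, with a simple exponential (Poisson) yield model."""
    dies_per_wafer = wafer_area_mm2 / die_area_mm2            # ignores edge loss
    yield_fraction = math.exp(-die_area_mm2 * defects_per_mm2)
    return wafer_cost / (dies_per_wafer * yield_fraction)

WAFER_COST = 8000        # assumed cost of a leading-edge wafer ($)
WAFER_AREA = 70686       # 300 mm wafer area (mm^2)
D0 = 0.002               # assumed defect density (defects per mm^2)
OVERHEAD = 120           # assumed per-product cost: package, board, memory, cooling
MARKUP = 2.0             # assumed retail markup over build cost

cpu_die, gpu_die = 150, 250    # assumed die areas (mm^2)

two_products = sum(
    MARKUP * (silicon_cost(WAFER_COST, WAFER_AREA, area, D0) + OVERHEAD)
    for area in (cpu_die, gpu_die)
)
one_product = MARKUP * (
    silicon_cost(WAFER_COST, WAFER_AREA, cpu_die + gpu_die, D0) + OVERHEAD
)

print(f"CPU + discrete GPU at retail: ${two_products:.0f}")
print(f"Single integrated part:       ${one_product:.0f}")
```

Under these made-up numbers the single larger die actually yields worse than either small die, but paying for only one package, board, and markup still makes the integrated part noticeably cheaper, which is roughly the argument above.
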
iGPUs WILL catch up...they already are. Four years ago people would've laughed at you for trying to play new games on an iGPU at acceptable resolutions/settings.
Today you can actually play even new titles at 1080p on one if you are willing to turn down enough settings...and 30+ fps is not a pipe dream.


But yes...dGPUs will rise in price...and that is EXACTLY why some companies are putting funding into iGPU R&D...and you should be happy about it. It might take quite a few years...but eventually iGPUs will catch up all the way, simply because they MUST.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Not sure why people even think it would be beneficial to keep dGPUs alive for the common user...their very existence is problematic in the future. ...

Agreed. It's going to take a while, but it WILL get there. I'm reminded of the Voodoo and Voodoo2 days. When people first learned about the Voodoo3 and how it was going to be a 2D AND 3D card, everyone thought there was no way you could make a single card that did both better than having a purpose-built 2D card and 3D card.

Now look where we are; can you imagine having to buy a card for desktop use and a separate card for gaming use today?