Discrete GPU is dying? NVIDIA Disagrees


Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I meant the 3770K, which is used for gaming.

Nobody owns the 3970X. Also, my point is valid: a 250 W GPU with a 150 W TDP CPU = 400 W+. Good luck finding a cooler that will handle that on a warm day.

You are making some pretty bold claims there. People who can afford a top-of-the-line CPU and GPU can afford a good cooling solution as well.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
People have said discrete has been dying ever since Intel/AMD started sticking graphics on the CPU. At the time everyone just mentioned cost, memory bandwidth, and CPU TDP. A CPU has insufficient memory bandwidth for anything but low-end graphics, and it's not cost-efficient to add enough bandwidth (extra pins cost lots of money). Equally, CPU power usage has to stay within sensible limits; sticking a huge GPU on there doesn't allow that.
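To put rough numbers on the bandwidth gap (a back-of-the-envelope sketch; the transfer rates and bus widths below are published specs):

```python
# Peak memory bandwidth = transfers/s * bytes moved per transfer.
# "GB" here means 10^9 bytes; figures are published specs.

def peak_gb_per_s(transfers_mt_per_s, bus_width_bits):
    return transfers_mt_per_s * 1e6 * (bus_width_bits / 8) / 1e9

ddr3 = peak_gb_per_s(1600, 128)   # dual-channel DDR3-1600: 2 x 64-bit
gddr5 = peak_gb_per_s(5500, 384)  # HD 7970: 5.5 Gbps GDDR5, 384-bit bus

print(f"Dual-channel DDR3-1600: {ddr3:.1f} GB/s")   # ~25.6 GB/s
print(f"HD 7970 GDDR5:          {gddr5:.1f} GB/s")  # ~264.0 GB/s
print(f"Gap: ~{gddr5 / ddr3:.0f}x")                 # ~10x
```

A socket would need several times the pin count to close a ~10x gap, which is exactly the cost problem.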

Basically, to keep CPUs and motherboards cheap for most people, who don't care about graphics, you can't stick anything too powerful on the CPU. Those who do care about graphics will probably still buy a standalone card anyway, so it's not worth catering to them.

The one reason to add a powerful GPU is compute, but really the only one pushing that is AMD (Intel prefers to add extra instructions to the x86 CPU instead, as Haswell is doing). AMD is too small to drive the market, so that'll stay in PowerPoint land for now. Perhaps Intel will change that in the end, though Intel will almost certainly make those lightweight compute cores x86, which is not compatible with anything AMD has coming out.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I think this console generation is going to be the first where we have APUs that are fully capable of running games at ultra settings at 1080p. Discrete cards will be relegated to higher resolutions like 4K. In a few years 4K will become the norm, but 1080p is going to stick around for a long time, and the average consumer will be able to buy an APU to run games on with ease.

I could also see this being a very short-lived console generation. I think in 4-5 years we will see new consoles again, especially with Sony and Microsoft using semi-custom parts. They don't need to ride the console for 10 years to try and turn a profit. Just grab the next APU in line from AMD/Intel and put it in a case.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
APUs will take the low end, but they won't compete at the high end, at least not for the next decade. A high-end APU would make motherboards and RAM more expensive, make cooling more difficult and costly, be next to impossible to upgrade properly, and require new standards (a new power supply standard, for one) in a very established market that doesn't actually need them.

An APU makes sense for low-end systems because it simplifies things and the demand for performance simply isn't there. It makes sense for a console because it's a single hardware configuration, so you can afford to back its performance up with decent RAM. It doesn't make sense for the PC gaming market as it currently stands, and probably never will in its current state, because there's no real benefit to be had. It takes away choice and would probably only increase upgrade costs because of hurdles such as new sockets and RAM requirements.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
You are making some pretty bold claims there. People who can afford a top-of-the-line CPU and GPU can afford a good cooling solution as well.

Yes, and none of those people would ever use an APU.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I meant the 3770K, which is used for gaming.

Nobody owns the 3970X. Also, my point is valid: a 250 W GPU with a 150 W TDP CPU = 400 W+. Good luck finding a cooler that will handle that on a warm day.

Stick to what you originally said.

"Fastest desktop CPU"? But not really.

"Decent GPU"? But not really.

So what you really mean to say is nothing at all but BS.

A 250 W GPU is in 7970 or 7970 GHz Edition range, isn't it?
And what counts as a "decent" GPU? (I deem "decent" to mean middle of the road.)
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
APUs will take the low end, but they won't compete at the high end, at least not for the next decade. A high-end APU would make motherboards and RAM more expensive, make cooling more difficult and costly, be next to impossible to upgrade properly, and require new standards (a new power supply standard, for one) in a very established market that doesn't actually need them.

An APU makes sense for low-end systems because it simplifies things and the demand for performance simply isn't there. It makes sense for a console because it's a single hardware configuration, so you can afford to back its performance up with decent RAM. It doesn't make sense for the PC gaming market as it currently stands, and probably never will in its current state, because there's no real benefit to be had. It takes away choice and would probably only increase upgrade costs because of hurdles such as new sockets and RAM requirements.

Current APUs do not require expensive RAM or more expensive motherboards, and they are cooled just like any regular CPU. Also, why would they need a new power supply? I think you may be confused about what an APU is.

Upcoming Intel CPUs may require a new PSU standard to support their new power-saving features, but that does not apply to AMD's APUs at this time. I also believe the Intel chips will work on a normal PSU, just without the new features.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Why shouldn't NVIDIA be happy? Their recent Kepler architecture is selling like hotcakes.

[Image: GeForce-700M-Series_1.jpg]

Only 4 posts before something completely OT gets inserted and the thread gets derailed.
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
Stick to what you originally said.

"Fastest desktop CPU"? But not really.

"Decent GPU"? But not really.

So what you really mean to say is nothing at all but BS.

A 250 W GPU is in 7970 or 7970 GHz Edition range, isn't it?
And what counts as a "decent" GPU? (I deem "decent" to mean middle of the road.)

What exactly are you saying? Because to me your post looks like BS. It has nothing of relevance in it apart from a load of rebuttals. Any idiot can do that: just disagree and call everyone else's posts BS.

Please tell me how an APU will fit a 100+ W CPU and a 150-250 W GPU in the same chip. Also, where does the needed GDDR5 memory come in?

Prescott had a TDP of 103 W, and that was considered a very hot CPU. Even a 7790 has an 85 W TDP and isn't even as powerful as the PS4's chip.

So tell me, Keysplayr, how will this APU take over from the CPU/GPU setup of today?
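For rough scale, peak GCN throughput can be estimated from published shader counts and clocks (a back-of-the-envelope sketch; theoretical FLOPS, not game performance):

```python
# Theoretical FP32 throughput for GCN parts:
# shaders * 2 FLOPs per clock (fused multiply-add) * clock speed.
# Shader counts, clocks, and TDPs are published figures.

def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

parts = [
    ("HD 7790 (85 W TDP)",   896, 1000),
    ("PS4 GPU portion",     1152,  800),
    ("HD 7970 (250 W TDP)", 2048,  925),
]

for name, shaders, mhz in parts:
    print(f"{name}: {tflops(shaders, mhz):.2f} TFLOPS")
# HD 7790: 1.79, PS4 GPU: 1.84, HD 7970: 3.79
```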
 

Unoid

Senior member
Dec 20, 2012
461
0
76
In 15 years we'll all still be on AT posting about some type of discrete graphics.
I think that's the main point.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The key metric for me is attach rates. When those start to suffer more and more, discrete may slowly die. But when?

What's the timeline for integration to cross the tipping point of "good enough"?

What's the timeline for integration to make discrete irrelevant?
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
It's inevitable that discrete GPUs for gaming will go away, but not for a very, very long time.

Unlike sound cards, video cards gobble up vast amounts of electricity, so die shrinks can only do so much so soon.

But just as we reached "good enough" levels in sound cards, so that we started getting integrated sound chips on mobos for "free," we already see something like that for non-gaming PCs (Ivy Bridge, Haswell). Low-end rigs can run APUs and do okay with older games. It's not the best, but it's "good enough." You get diminishing marginal returns on buying big, expensive, hot GPUs.

We may already be reaching a saturation point in pixel sizes, where going any smaller doesn't bring appreciable benefits to most people. Let's take 4K as the point where it no longer makes sense to increase pixels per inch (some might say 1080p is already enough). A Surface Pro already does okay at 720p and is passable at 1080p. Fast-forward a decade and the 2023 Surface Pro ought to be at least passable at 4K, and from there on out there isn't really anywhere to go.

You also run into size limitations: nobody wants to lug around a 20" tablet, and huge desktop monitors don't fit many people's living situations. 4K is already overkill for many living rooms, too, when you take viewing distance into account; 60"+ is really big, and many people will balk at buying even bigger TVs.
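The viewing-distance point can be made concrete with the usual 20/20-vision rule of thumb, that the eye resolves about one arcminute (a rough sketch, not a vision model):

```python
import math

# Distance beyond which adjacent pixels blur together for 20/20 vision,
# taking visual acuity as ~1 arcminute.

def max_useful_distance_ft(diag_in, h_pixels, aspect=(16, 9)):
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # panel width from diagonal
    pixel_pitch_in = width_in / h_pixels       # inches per pixel
    one_arcmin = math.radians(1 / 60)          # acuity limit in radians
    return pixel_pitch_in / math.tan(one_arcmin) / 12  # inches -> feet

for name, px in [("1080p", 1920), ("4K", 3840)]:
    d = max_useful_distance_ft(60, px)
    print(f'60" {name}: extra pixels are invisible beyond ~{d:.1f} ft')
# 1080p: ~7.8 ft; 4K: ~3.9 ft
```

By this estimate, 4K on a 60" set only pays off if you sit within about four feet of it, which is exactly the living-room argument.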

All of this assumes we don't run into some physical limit, though, like how hard it is to dissipate heat from tinier and tinier dies.
 

Bobisuruncle54

Senior member
Oct 19, 2011
333
0
0
Current APUs do not require expensive RAM or more expensive motherboards, and they are cooled just like any regular CPU. Also, why would they need a new power supply? I think you may be confused about what an APU is.

Upcoming Intel CPUs may require a new PSU standard to support their new power-saving features, but that does not apply to AMD's APUs at this time. I also believe the Intel chips will work on a normal PSU, just without the new features.

You clearly have no understanding of what I was talking about.

I am talking about the potential for APUs to replace high-end CPUs and graphics cards. Current APUs are low-end hardware, which I addressed in my post. Please read it next time.
 

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
In 15 years we'll all still be on AT posting about some type of discrete graphics.
I think that's the main point.

I think in 15 years we'll be posting "hey, remember when we all had discrete GPUs?" ;)
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,129
3,067
146
I don't think discrete GPUs are going anywhere, not anytime soon at least. Until APUs can deliver as much processing power for modern games as the highest-end discrete GPUs, there will still be a market for the high end. If someone wants the best, the APU would have to be better than the highest-end discrete card available for them to buy an APU. Thus, there would still be a market for discrete cards, even if a smaller one.

And I believe a decent card to be an HD 7850 or better :D
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
163
106
I don't think discrete GPUs are going anywhere, not anytime soon at least. Until APUs can deliver as much processing power for modern games as the highest-end discrete GPUs, there will still be a market for the high end. If someone wants the best, the APU would have to be better than the highest-end discrete card available for them to buy an APU. Thus, there would still be a market for discrete cards, even if a smaller one.
The high-end market will still be there regardless of APUs, simply because you can never cram the power of 10 billion transistors on a ~450 mm² die (Radeon 9970 :p) into a desktop CPU, hell, not even into server parts, so that market will remain for as long as we have the gaming/movie/animation/TV industries!
And I believe a decent card to be an HD 7850 or better :D
With the potential of HSA we might get mid-range GPU-level performance in 5 years (or less), so I'm not sure this space is gonna last that much longer!
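To back the die-size point above with rough arithmetic (a sketch using GK110's published figures of ~7.1B transistors in ~551 mm²):

```python
# How much 28 nm silicon would 10 billion transistors need at the
# density of NVIDIA's GK110, the biggest GPU die of this generation?

gk110_transistors = 7.1e9
gk110_area_mm2 = 551.0
density_per_mm2 = gk110_transistors / gk110_area_mm2  # ~12.9M / mm^2

print(f"10B transistors: ~{10e9 / density_per_mm2:.0f} mm^2")
# ~776 mm^2 -- larger than any shippable die, and an APU would still
# need room for CPU cores, cache, and the memory controller.
```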
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I think in 15 years we'll be posting "hey, remember when we all had discrete GPUs?" ;)

+1

I don't think discrete GPUs are going anywhere, not anytime soon at least. Until APUs can deliver as much processing power for modern games as the highest-end discrete GPUs, there will still be a market for the high end. If someone wants the best, the APU would have to be better than the highest-end discrete card available for them to buy an APU. Thus, there would still be a market for discrete cards, even if a smaller one.

And I believe a decent card to be an HD 7850 or better :D

The PS4's APU in a PC will be enough for 9 out of 10 people. The GPU market, as we know it now, will just become uneconomical. Exactly how long before that happens? Not sure, but I'd imagine that by the time we get 14 nm APUs, not many people will be interested in paying the money for a discrete card. Not enough to support the development, IMO.