Discrete GPU is dying? NVIDIA Disagrees


StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Who says Intel is not speaking with Apple about powering the iPhone 7 in 2014 or the iPhone 8 in 2015? Apple switched from PowerPC to Intel x86 because Intel provided the performance within the power envelopes that no other company could, so Apple moved to Intel. Apple would not hesitate to switch to Intel if they know the competition will be far behind in perf and perf/watt.

Except Apple also designs their own custom ARM SoCs, and there is only so much CPU needed for the usual phone/tablet stuff, which their current SoCs already provide and then some. Besides, they were smart enough to shake off Intel's x86 leash with iDevices, and they are not stupid enough to go back to those days, especially after two decades of Intel's monopolistic antics.
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
What are you trying to say here? Intel's Atom chips have a completely different roadmap than their Core counterparts (Bonnell-Saltwell-Silvermont-Airmont versus Sandy Bridge-Ivy Bridge-Haswell-Broadwell). They are planning to advance on a yearly tick-tock cadence; next year is a die shrink to 14nm.
Yeah, my bad. What I meant was that since the next node shrink is next year, Intel is gonna fast-track their mobile roadmap by going tick-tock starting this year or next, depending on how you look at it!
Really? The cost of building and upgrading 14nm fabs is upwards of a billion dollars. Qualcomm, Samsung, Apple, and other SoC designers still have to design the SoCs, and that takes R&D too. You are the one that makes no sense; how can it magically cost Samsung and co. less to design and manufacture as a whole at 14nm than Intel?
So tell me, when is Samsung going 14nm? They're at 32nm last I checked. The amount of money spent on fine-tuning reference, or custom, ARM designs is far less than going for a node shrink, and btw the costs borne by TSMC are eventually spread between a dozen or so chipmakers that use its fabs, so do the math yourself!
I'm not saying that Intel will magically dominate. They have a hard fight ahead of them, which they may lose, especially since, as you point out, there are many established chip designs. But in tablets and some smartphones (HTC's phones, the Nexus lines, Asus tablets) they have a chance to gain some significant marketshare.
Ah, you see, "but they will not dominate"; if and when they do, I'll be here to eat these words :p
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
@raghu78

It's more likely Apple ditches Intel from their desktops and laptops than goes Intel for everything; they'd rather keep better margins on their whole lineup than throw them away on their successful, growing mobile business.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The era of the APU has only just begun; give it a couple of years and 80% of people will no longer need dedicated cards to do what they do on their computers.
It will take more than a couple of years. The external GPU will die when it becomes cheap enough to put sufficient RAM bandwidth on the CPU while still allowing RAM expansion.

Though I think the lower-end discrete GPU market will eventually die out.
It almost has. Adding displays, and/or getting driver features not offered by Intel, already accounts for a lot of it. If the IGP in Haswell Xeons has some MCA support, you can expect the low-end Quadro and FirePro market to shrivel up, too: they only just added ECC, which Intel's will have "for free," so if Intel can get the app certs and add logged errors for the hardware, only the cards offering many times the IGP's performance will be worth it.

The days of MMOs have been and gone.

Next gen is going to push games more than ever, and that means more powerful GPUs.

There is no need to play games on cheap PCs when consoles are going to be rocking decent graphics, unless you want to play PC-only games.
...and they have a keyboard, mouse/trackball, and can have other programs of mine running in the background? You can't play the same game on a console, no matter what the graphics (in many cases, difficulty/balance mods can come to the rescue for multiplatform games). Some genres just don't work as well with a pad, and/or aren't as fun with an aimbot, and so on and so forth. Some games could benefit, though: imagine Dwarf Fortress with a streamlined menu system, required for gamepad use :D.

A decent GPU has 3-5x the power requirements of the fastest desktop CPU. This will always be the case. BF3 today looks OK, but in 3 years' time it will look crap in comparison to what's new on the market. You will never have all that power inside an APU.
Show me the DIMMs. Cheap RAM technology is already a limiting factor for current APUs, and there's no sign of that ending on the horizon, unless AMD pulls some kind of coup along with the XB720, offering API-exposed embedded RAM on COTS x86 processors, or something like that. If not that, we will need the next major DRAM spec to improve bandwidth by several times (2x every ~4 years is just keeping up with increasing requirements for CPUs alone) before it becomes feasible. It will happen, but the technology to make cards obsolete is not apparent. What we have now is merely not having to buy a card just because the IGP is total crap.

Solve the bandwidth/pin problem, in a way that doesn't require coding to one specific HW platform, and we'll get there quickly (and, maybe Intel has some driver magic that can do that with their DRAM). Until then, it will remain around the next corner, and the next, and the next...
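Rough numbers, just to show the size of the gap I mean (the data rates here are illustrative assumptions, not any particular product's spec):

```python
# Back-of-the-envelope peak memory bandwidth; all clocks are assumed figures.
# peak GB/s = (bus width in bits / 8) * effective transfer rate in GT/s

def peak_bw_gbs(bus_width_bits: int, rate_gts: float) -> float:
    """Theoretical peak bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * rate_gts

apu_ddr3 = peak_bw_gbs(128, 1.6)     # dual-channel DDR3-1600: ~25.6 GB/s
card_gddr5 = peak_bw_gbs(256, 5.0)   # hypothetical 256-bit GDDR5 card: ~160 GB/s

print(f"APU, dual-channel DDR3-1600 : {apu_ddr3:.1f} GB/s")
print(f"256-bit GDDR5 card @ 5 GT/s : {card_gddr5:.1f} GB/s")
print(f"Gap: ~{card_gddr5 / apu_ddr3:.1f}x")
```

Even with generous assumptions on the IGP side, the card has several times the bandwidth available to feed its shaders.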

we used to have a math co-processor

history teaches well
So, you think we'll have a video card that, when plugged in, replaces the CPU? :p
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
So tell me, when is Samsung going 14nm? They're at 32nm last I checked. The amount of money spent on fine-tuning reference, or custom, ARM designs is far less than going for a node shrink, and btw the costs borne by TSMC are eventually spread between a dozen or so chipmakers that use its fabs, so do the math yourself!
You can only refine a process node so much before a shrink becomes inevitable. Of course, a node shrink does not promise immensely better characteristics in and of itself, but in the hands of a good team it typically does provide significant improvements. Additionally, R&D of a microarchitecture probably takes just as much work as a node shrink; after all, you don't see Intel's R&D spending fluctuating wildly up and down. SOMEONE is paying for TSMC's fab upgrades, and even if it is spread across many chip developers, that cost is cumulatively the same.

Think about it this way: you can buy books for roughly the same price anywhere. It doesn't matter if the contents were published and written by different firms and authors; they all still ultimately have to pay the same printing press, which must charge enough to cover its costs of operation. It doesn't matter how many publishers there are: the printing press's operating costs remain the same, and that cost is ultimately passed on to the end consumer.
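To put toy numbers on the amortization point (all figures here are made up purely for illustration):

```python
# Toy illustration: spreading a fab's cost across more customers doesn't
# shrink the total; it only changes who writes the checks.
fab_upgrade_cost = 5_000_000_000   # hypothetical fab cost, dollars
wafers_produced  = 2_000_000       # hypothetical lifetime wafer output

print(f"Cost baked into every wafer: ${fab_upgrade_cost / wafers_produced:,.0f}")

for customers in (1, 12):          # one integrated firm vs. a dozen fabless firms
    share = fab_upgrade_cost / customers
    wafers_each = wafers_produced / customers
    print(f"{customers:>2} customer(s): ${share / wafers_each:,.0f} per wafer each")
```

Split the bill twelve ways and each firm pays less up front, but the cost baked into every wafer, and every chip sold to a consumer, is the same.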
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
@raghu78

It's more likely Apple ditches Intel from their desktops and laptops than goes Intel for everything; they'd rather keep better margins on their whole lineup than throw them away on their successful, growing mobile business.

Apple is fighting the ARM/Android crowd; the bigger fight/challenge is with the Android devices. Intel will provide them with the chip to beat the ARM crowd on perf and perf/watt. Importantly, it will give them better performance at the same power, or the same perf at lower power, thus allowing them to reduce size and form factor.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
imagine Dwarf Fortress with a streamlined menu system, required for gamepad use :D.
I imagine it would start to crawl (seewhatididthere?). I mean, it's basically a straight-up CPU benchmark as you progress.

On the other hand, improved menus and consolidation of said menus would be really, really nice.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I imagine it would start to crawl (seewhatididthere?). I mean, it's basically a straight-up CPU benchmark as you progress.
Worse, it's single-threaded. They could add some fancy animations and a nice 3D view, practically for free, with those 7 other cores and a nice GPU :).
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
Poor dorfs. Their creator is so focused on adding features and staying true to his vision that the performance of the game suffers :(. Sometimes I wish Adams would swallow his pride and get a third party to check over his code.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Apple is fighting the ARM/Android crowd; the bigger fight/challenge is with the Android devices. Intel will provide them with the chip to beat the ARM crowd on perf and perf/watt. Importantly, it will give them better performance at the same power, or the same perf at lower power, thus allowing them to reduce size and form factor.

Again, margins and independence. Apple is doing darn good by itself without Intel and won't lower its margins for something only geeks give a flying thing about.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,373
10,068
126
We still do, it's called a floating point unit.

The point being, an FPU used to be a separate, optional, discrete chip that you would purchase and add to your motherboard. Now it's integrated into the CPU itself. Just like APUs.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,373
10,068
126
The GPU market, as we know it now, will just become uneconomical. Exactly how long before that happens? Not sure, but I'd imagine that by the time we get 14nm APUs, not too many people will be interested in paying the money for a discrete card. Not enough to support the development, IMO.

In my eyes, the trend is toward more expensive GPUs from both camps in general: $400-500 mid-range, with $700-1000 now reserved for flagship cards, up from the $250-300 and $500-650 we previously had.

Discrete GPUs are getting MORE expensive, as the overall market for them shrinks. Soon, they will be elite enthusiast-only, and cost $2000 a pop. Heck, they are already at $1000 ea for high-end.

All of this, because "good enough" IGPs integrated into CPUs are stealing the marketshare, and thus the R&D funding, for new discrete GPUs.
 

Unoid

Senior member
Dec 20, 2012
461
0
76
Discrete GPUs are getting MORE expensive, as the overall market for them shrinks. Soon, they will be elite enthusiast-only, and cost $2000 a pop. Heck, they are already at $1000 ea for high-end.

All of this, because "good enough" IGPs integrated into CPUs are stealing the marketshare, and thus the R&D funding, for new discrete GPUs.

I think you're exaggerating a little. There will still be a large mainstream market, mainly for PC gamers, with cards of increasing power at the same $200-500 price points.

I think progress will definitely slow down as we hit process walls. But gaming will drive a good, large, and profitable market for AMD, NVIDIA, and anyone else who wants to be in it.

Wait until 3D engines go mainly ray-tracing. GPUs will need to be powerful as shit, and no integrated GPU will do it well enough for 10+ years; hence discrete is here to stay.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
I don't think so either. NVIDIA and AMD know that a dedicated card will always outperform an IGP (as long as there are gamers).
Remember, the Intel HD series and AMD Trinity are advancing and will for sure replace the lower-performing dedicated card market, but they'll never outperform a full-blown GeForce or Radeon.
 

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
Show me the DIMMs. Cheap RAM technology is already a limiting factor for current APUs, and there's no sign of that ending on the horizon, unless AMD pulls some kind of coup along with the XB720, offering API-exposed embedded RAM on COTS x86 processors, or something like that. If not that, we will need the next major DRAM spec to improve bandwidth by several times (2x every ~4 years is just keeping up with increasing requirements for CPUs alone) before it becomes feasible. It will happen, but the technology to make cards obsolete is not apparent. What we have now is merely not having to buy a card just because the IGP is total crap.

JEDEC has certified a specification for SODIMM GDDR5 memory, even if no one is currently shipping it. And why do you think AMD has moved into selling RAM? ;)
 

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
Wait until 3D engines go mainly ray-tracing. GPUs will need to be powerful as shit, and no integrated GPU will do it well enough for 10+ years; hence discrete is here to stay.

The next console generation won't be powerful enough for raytracers, so we won't be seeing them for at least another 5 years. Think the discrete GPU market will still be around in 5 years?
 

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
I don't think so either. NVIDIA and AMD know that a dedicated card will always outperform an IGP (as long as there are gamers).
Remember, the Intel HD series and AMD Trinity are advancing and will for sure replace the lower-performing dedicated card market, but they'll never outperform a full-blown GeForce or Radeon.

The majority of discrete GPU sales are in the low and mid range. If those markets (continue to) implode due to iGPU, AMD and NVidia are going to lose big chunks of revenue. And how do you think they are going to fund the R&D to compete with Intel then?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I think it's pretty obvious for financial reasons why the discrete market is doomed. It's just a matter of when, and it's unavoidable, just like previously separate components in the past.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
JEDEC has certified a specification for SODIMM GDDR5 memory, even if no one is currently shipping it. And why do you think AMD has moved into selling RAM? ;)
If they actually come out with it, they might make everything up to the 7750 obsolete, based on their current roadmaps (for instance, Kaveri). That still leaves the whole midrange and high end, which need pins that their SoCs aren't going to have any time soon. Midrange cards have 150+ GB/s, and that needs both fast chips and a wider interface. Fast chips alone will give it roughly the performance of 2 DDR3 channels per GDDR5 channel. Not bad for Jaguar, in some cases, and maybe nice for larger chips, too, but that's still half the width they need to do it well. To make it cheap, they'll need to double up again past what GDDR5 offers, then again fairly quickly after that, once more powerful GPUs are using more.

nV: 192-bit, 256-bit, 384-bit
AMD: 256-bit, 384-bit

They are going for cheap, not for midrange equivalence, much less high-end. APUs are going to be hamstrung, unless they do something like work together with Intel, or MS, on some standard for temporary storage, to offset the bandwidth costs.
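As a rough sketch of the fast-chips-versus-wide-bus trade-off (the data rates are round-number assumptions for illustration, not announced specs):

```python
# Peak bandwidth for the same GDDR5 chips on different bus widths,
# versus DDR3 on the 128-bit interface an APU actually has.

def peak_bw_gbs(width_bits: int, rate_gts: float) -> float:
    return width_bits / 8 * rate_gts

ddr3_rate, gddr5_rate = 1.6, 5.0   # GT/s, assumed effective rates

print("128-bit SoC interface:")
print(f"  DDR3-1600 : {peak_bw_gbs(128, ddr3_rate):6.1f} GB/s")
print(f"  GDDR5     : {peak_bw_gbs(128, gddr5_rate):6.1f} GB/s  (faster chips, same pins)")

print("Typical discrete-card widths, same GDDR5 chips:")
for width in (192, 256, 384):
    print(f"  {width}-bit   : {peak_bw_gbs(width, gddr5_rate):6.1f} GB/s")
```

Swapping in faster chips helps a lot, but without more pins it still lands well short of the 150+ GB/s that midrange cards already have.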
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Games move up: the PS4 should push most games at 1080-1200 past an IGP's low-mid settings on a PC, online or SP, let alone at 2560x1440.
IGPs move up.
dGPUs move up.
Nothing will change, IMO.
 
Jan 31, 2013
108
0
0
If they actually come out with it, they might make everything up to the 7750 obsolete, based on their current roadmaps (for instance, Kaveri). That still leaves the whole midrange and high end, which need pins that their SoCs aren't going to have any time soon. Midrange cards have 150+ GB/s, and that needs both fast chips and a wider interface. Fast chips alone will give it roughly the performance of 2 DDR3 channels per GDDR5 channel. Not bad for Jaguar, in some cases, and maybe nice for larger chips, too, but that's still half the width they need to do it well. To make it cheap, they'll need to double up again past what GDDR5 offers, then again fairly quickly after that, once more powerful GPUs are using more.

nV: 192-bit, 256-bit, 384-bit
AMD: 256-bit, 384-bit

They are going for cheap, not for midrange equivalence, much less high-end. APUs are going to be hamstrung, unless they do something like work together with Intel, or MS, on some standard for temporary storage, to offset the bandwidth costs.
Of course it's going to be quite a long time before APUs replace most of the viable dedicated GPU market. I have my doubts about Kaveri even beating out the 7750, based on the stream counts we have seen so far. If AMD delivers both the GDDR5 and 512 GCN streams, then AMD might as well be shooting itself in the foot. What we do know for a fact is that Kaveri's flagship will ship with either 384, 448, or 512 SPs. I personally see them delivering 448 SPs, sorta like the 560 Ti 448-core, which will keep even the lowest of AMD's dedicated cards above water (if they go with 384 again, I think it's going to fall short of people's expectations).

Meanwhile, 448 GCN cores, even paired with desktop DDR3, will shake the ground beneath anything we have seen in an APU (definitely one giant step ahead of Intel). Trinity can already play games like BF3 above 30 FPS on low-med mixed settings. Toss in another compute unit and GCN core performance, and you've got something to talk about. Not to mention the 800 MHz and 850 MHz QDR memory AMD plans on having available as replacement memory. For less than $200, you could upgrade the entire core of your grandma's/child's computer to play games like MW2 on max settings. Sure, those games are old, but tons of people still play them. Even MW3 is no more demanding than MW2, so there is quite a bit of room for APUs to shine (even more so seeing how many Steam games are not very demanding).

Then again, AMD could bring on a huge failure, with all kinds of problems with the new unified architecture. We won't know until release day, or until verified people have gotten their hands on an ES unit to test.
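For scale, here's a rough shader-throughput estimate for those SP counts (the 800 MHz GPU clock is an assumption on my part, not a confirmed Kaveri spec; the 7750 figure uses its actual 512 SPs at 800 MHz):

```python
# Peak single-precision throughput: SPs * 2 ops/clock (FMA) * clock (GHz) = GFLOPS.
# The APU clock is assumed; the HD 7750 numbers reflect its shipping configuration.

def gcn_gflops(stream_processors: int, clock_ghz: float = 0.8) -> float:
    return stream_processors * 2 * clock_ghz

for sps in (384, 448, 512):
    print(f"{sps} SPs @ 0.8 GHz (assumed): ~{gcn_gflops(sps):.0f} GFLOPS")

print(f"HD 7750 (512 SPs @ 0.8 GHz)  : ~{gcn_gflops(512, 0.8):.0f} GFLOPS")
```

Raw GFLOPS land in the same ballpark; the memory bandwidth feeding those shaders is the bigger gap, as discussed above.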
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The majority of discrete GPU sales are in the low and mid range. If those markets (continue to) implode due to iGPU, AMD and NVidia are going to lose big chunks of revenue. And how do you think they are going to fund the R&D to compete with Intel then?

AMD is going to do everything they can to transfer that loss of GPU sales to APU sales and make money, rather than lose it.

When the bulk of the sales go away, due to iGPU, is the high end market going to be enough to continue the R&D, marketing, etc. of discrete desktop graphics? I agree with you, and personally, I don't think so.

As far as competing with Intel goes, though, Intel doesn't have discrete GPU income either, so it should be a level playing field. Not in terms of company size and assets, but smaller companies compete with large companies all of the time. Being small sometimes has its benefits, if the company can leverage them.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Of course it's going to be quite a long time before APUs replace most of the viable dedicated GPU market. I have my doubts about Kaveri even beating out the 7750, based on the stream counts we have seen so far.
Probably not. But with a 128-bit bus, that would be about the best they could hope for, even with GDDR5, since the CPU will need some RAM IO time no matter how many SPs they use, unless some generational GCN efficiency improvements are pretty major. The chip needs to be bigger and more expensive to fit more and more pins, which also increases power draw, so without products that can demand high prices (think Core i7), there's little incentive to even try for better.

The $100 price point will be like the $50 price point in short order, but there's still a big gulf to cross to come close to matching the $200 cards of today, much less near-future ones. Most cards today have enough bandwidth, in the sense that overclocking the memory does not yield good returns, but they do need what they have, absent some fancy caching.
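To make that concrete, a quick sketch of what a shared 128-bit GDDR5 interface would leave for the iGPU versus what a 7750 gets to itself (the 5 GT/s rate and the CPU's share of traffic are assumptions; the 7750 figure is its real 128-bit GDDR5 at 4.5 GT/s):

```python
# How much of a shared 128-bit GDDR5 bus is left for the iGPU once the CPU
# cores take their cut. The rate and the 25% CPU share are assumptions.

def peak_bw_gbs(width_bits: int, rate_gts: float) -> float:
    return width_bits / 8 * rate_gts

shared_bus = peak_bw_gbs(128, 5.0)   # ~80 GB/s for the whole SoC
cpu_share  = 0.25                    # assumed fraction consumed by the CPU cores
gpu_budget = shared_bus * (1 - cpu_share)
hd7750_bw  = peak_bw_gbs(128, 4.5)   # ~72 GB/s, all to itself

print(f"Shared 128-bit GDDR5 bus    : {shared_bus:.0f} GB/s total")
print(f"Left for the iGPU (assumed) : {gpu_budget:.0f} GB/s")
print(f"HD 7750, dedicated          : {hd7750_bw:.0f} GB/s")
```

So even in the best GDDR5-on-an-APU case, the iGPU is sharing roughly 7750-class bandwidth with the CPU, which is exactly why the midrange and high end stay out of reach for now.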