I don't think GT4 exists to bow to Apple's whims, nor does the unlocked Iris Pro Broadwell-K.
If that were the case, GT2 would be dead, and GT4 would be on every CPU from the i3 and up.
Doubt it will be that fast. Even if physics doesn't get in Intel's way, DDR4 holds up to its speed-scaling promises, and on-package speedy stacked DRAM proliferates, it's going to be 3-5 years before the $100-125 cards are usurped. If any of that takes longer, or doesn't work out as well as expected, then it will take even longer to happen. The high-end market will basically last until only HPC cards are worth building and software has changed enough that they aren't much better than integrated graphics. Not soon.

To you and everyone else agreeing with Torvalds, what is your prediction on how long it'll be until the high-end gamer market becomes too small to be worth supporting? 10 years?
Putting GT4 into FC-LGA would require a new and larger socket.
I can agree with this. In 10 years it may be a niche market with only cards like Titan available, but to say that GPU advancement will end in 5 years is ridiculous. I don't understand why so many people think that, if something applies to one market, it'll be the same for every market. Have any of you considered that sound cards died so quickly because they weren't a huge market in the first place? The number of sound enthusiasts was never anything close to the number of gamers or people doing graphics work.

Doubt it will be that fast. Even if physics doesn't get in Intel's way, DDR4 holds up to its speed-scaling promises, and on-package speedy stacked DRAM proliferates, it's going to be 3-5 years before the $100-125 cards are usurped. If any of that takes longer, or doesn't work out as well as expected, then it will take even longer to happen. The high-end market will basically last until only HPC cards are worth building and software has changed enough that they aren't much better than integrated graphics. Not soon.
AMD might be in yet another bind, but nVidia has their hands in plenty of pockets, so the long-predicted demise of high-volume low-end discrete GPUs won't hurt them (in the time frame it will take to creep into the midrange, they will have plenty of time to adapt). In a couple of generations, they'll be using fundamentally the same designs in PC AIB GPUs as what they sell for HPC servers, cars, tablets, etc. It's kind of a bummer that they bowed out of making a server-class CPU, but that's not too big a deal (they also worded it in a way that didn't claim they were killing off their CPU development, either). As such, I also doubt the high-end GPUs will go away too soon, because they are being developed as much for gamers as for workstations and servers--the more sales, the better--and it will take software a while to get to the point where those cards are slower, once they become sufficiently uncommon.
I'll believe it when I see it. Perhaps claiming all i3s was pushing it, but at least the highest-end one will need to have it before I'm convinced. (Reminder that I mean GT3 here.) That won't happen with Broadwell, Skylake, or even Cannonlake. On top of that, with more efficient architectures on the way from Nvidia and AMD, I have serious doubts that Cannonlake will have capabilities comparable to the midrange of the time. It needs to be nipping at the $200-250 cards before the market is in danger. Iris Pro can't even match the $100 cards of today.

That's utter nonsense.
AMD should sell 512SP APUs only too, right? Or drop 512SP APUs because they aren't an Apple supplier?
It's obvious that Iris Pro and higher GT models will become much more common as the node shrinks and the economics allow for it.
I can agree with this. In 10 years it may be a niche market with only cards like Titan available, but to say that GPU advancement will end in 5 years is ridiculous. I don't understand why so many people think that, if something applies to one market, it'll be the same for every market. Have any of you considered that sound cards died so quickly because they weren't a huge market in the first place? The number of sound enthusiasts was never anything close to the number of gamers or people doing graphics work.
I'll believe it when I see it. Perhaps claiming all i3s was pushing it, but at least the highest-end one will need to have it before I'm convinced. (Reminder that I mean GT3 here.) That won't happen with Broadwell, Skylake, or even Cannonlake. On top of that, with more efficient architectures on the way from Nvidia and AMD, I have serious doubts that Cannonlake will have capabilities comparable to the midrange of the time. It needs to be nipping at the $200-250 cards before the market is in danger. Iris Pro can't even match the $100 cards of today.
And the bulk of that volume is sold in the midrange, hence what I said in my post. Maybe I should change that range to $150-200? That doesn't really change my opinion. We'll see what happens, but I think it'll be at least 10 years for the "stage 1" you're talking about to happen. 5 years is a really short time for something like this to jump from being a foreseeable trend to killing off an entire market. I saw people saying that the traditional gaming market would be dead within 5 years back in 2009, with arguments similar to what you're giving, yet it took 3 years for there to even be a significant slowdown. Even then, the new consoles reduced the impact of that slowdown.

Yet there are only two dGPU graphics companies left, and they are on a fast track to becoming one.
I think you completely disregard the R&D cost factor and how it depends on volume.
The first Kepler/GCN chip that nVidia or AMD shipped essentially had an R&D price tag of over $500 million. For nVidia it may have been over a billion. What brings that cost down is volume. If you can't get that money back (ROI), then you don't do it in the first place, no matter what gamers or anyone else demand.
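To make the volume argument concrete, here is a back-of-the-envelope sketch. The ~$500 million figure comes from the post above; the unit volumes are hypothetical, just to show how amortization scales:

```python
# Back-of-the-envelope amortization of GPU R&D cost over unit volume.
# The ~$500M figure is from the post above; the volumes are made up.

def rnd_cost_per_unit(rnd_cost: float, units_sold: int) -> float:
    """R&D dollars that each shipped chip must recover."""
    return rnd_cost / units_sold

RND_COST = 500_000_000  # ~$500M for a new architecture, per the post

for units in (1_000_000, 10_000_000, 50_000_000):
    print(f"{units:>12,} units -> ${rnd_cost_per_unit(RND_COST, units):,.2f} per chip")
```

At a million units, each chip has to carry $500 of R&D before a single wafer is paid for; at fifty million units, only $10. That is the whole ROI argument in one division.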
I'll believe it when I see it. Perhaps claiming all i3s was pushing it, but at least the highest-end one will need to have it before I'm convinced. (Reminder that I mean GT3 here.) That won't happen with Broadwell, Skylake, or even Cannonlake. On top of that, with more efficient architectures on the way from Nvidia and AMD, I have serious doubts that Cannonlake will have capabilities comparable to the midrange of the time. It needs to be nipping at the $200-250 cards before the market is in danger. Iris Pro can't even match the $100 cards of today.
...At the $200-250 cards, the dGPU is already dead.
In terms of your previous faith in the enthusiast gamer: PowerColor's latest dual 290X, for example, is a 250-card batch.
Sure, but nVidia folk have more than once said they make good money on the mid-range and high-end GPUs. The very same chips also go into other parts. Despite much marketing, we know the chip a Geforce has is the chip a Quadro has is the chip a Tesla has. If the gaming market went away tomorrow, nVidia would still be developing and making their GPUs (eventually all SoCs); they'd just cost more to the buyers that were left. If only, say, a GTX 760 or higher were worth making anymore, they would still sell many of them, probably millions, on top of the higher-end variants and professional products. They'd need to charge more to keep up the R&D, but that level of GPU is still well inside the mainstream for PC gaming.

Entry cost for sound cards is also pennies by comparison!
Cannonlake is three GPU revisions down the road (Gen8, Gen9, and Gen10) and two node shrinks (14nm, then 10nm with germanium transistor innovation). It will have 3x higher density than 16FF+ and much better power and performance characteristics. I'm not sure if it will compete against the midrange, but it should be much better than Haswell's GT2 is today.
Does it count if one uses tilesets?

I guess if you play ASCII games on your Linux box, then yeah, the dGPU is out... Linus needs to get out more instead of living his sheltered life.
Your example is irrelevant because it doesn't apply to the desktop/consumer space we're discussing, which is apparently what Linus is talking about too.
That's not a consumer LGA1150 CPU or socket. You can't put something like that on a Z97 motherboard (for example). In the consumer space, sockets have neither the TDP nor the cost budget of a Knights Landing.
And after all that, there's absolutely no indication that it'll be anything other than a flop for graphics performance, and that's exactly where it needs to compete if it's going to replace discrete GPUs.
Yeah, sorry. I got caught up in that argument with that other person.

I'm not saying that the dGPU market will die within 5 years. I don't think that's the case and I also don't really care. I just want to point out that the gap is quickly shrinking.
I'm not sure why you think this has to happen in an LGA1150 socket... we're not talking about this year's tech.
I am talking about a NEW style of motherboard, a complete redesign.
A motherboard with two sockets, much like the x86 boards which previously had an x87 drop-in coprocessor.
Imagine a new Intel motherboard with a socket for a CPU and an adjacent socket for a GPU, with a data bus linking them.
The CPU and GPU could be situated close enough together that a single heat-pipe cooler, not unlike those found on graphics cards today, could cool both sockets simultaneously.
The bandwidth of the bus and socket set would not be the limiting factor in such a case (see the rough bandwidth arithmetic after this post). At that point you could even have shared memory for both the CPU and GPU.
Of course, it would require a complete reworking of such a board's power distribution and cooling.
But the market for such a solution could be great.
Not only could you serve all single-GPU consumer PCs, but such a configuration could be a drop-in for nearly all future generations of console applications.
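For a sense of scale on the bandwidth claim above, here is a minimal sketch. The PCIe 3.0 and GDDR5 figures are standard; the 512-bit, 2 GT/s socket-to-socket bus is purely an assumed number for illustration, not anything announced:

```python
# Rough link-bandwidth arithmetic for the two-socket CPU+GPU idea above.
# PCIe 3.0 and GDDR5 numbers are standard; the wide socket bus is hypothetical.

def bus_bandwidth_gb_s(width_bits: int, transfer_rate_gt_s: float) -> float:
    """Peak bandwidth in GB/s for a parallel bus: bytes per transfer x rate."""
    return width_bits / 8 * transfer_rate_gt_s

# PCIe 3.0 x16: 8 GT/s per lane with 128b/130b encoding, 16 lanes.
pcie3_x16 = 16 * 8 * (128 / 130) / 8           # ~15.75 GB/s per direction

# Hypothetical 512-bit board-level bus at 2 GT/s (assumed figure).
socket_bus = bus_bandwidth_gb_s(512, 2.0)      # 128 GB/s

# Typical midrange card of the era: 256-bit GDDR5 at 6 GT/s.
gddr5_256 = bus_bandwidth_gb_s(256, 6.0)       # 192 GB/s

print(f"PCIe 3.0 x16:            {pcie3_x16:6.2f} GB/s")
print(f"Hypothetical socket bus: {socket_bus:6.2f} GB/s")
print(f"256-bit GDDR5 card:      {gddr5_256:6.2f} GB/s")
```

Under those assumptions, a wide board-level link would sit an order of magnitude above PCIe 3.0 x16 and in the same league as a midrange card's own memory bus, which is what would make shared CPU/GPU memory plausible.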