
News: Intel to develop discrete GPUs

Qwertilot

Golden Member
Nov 28, 2013
1,475
162
106
Might be quite an empty place if they take a few years releasing it :)

Like the main page article says, it'd be distinctly non-trivial to just scale up their current iGPU stuff. In some ways it almost makes you wonder why they didn't buy Imagination.
 

tviceman

Diamond Member
Mar 25, 2008
6,700
467
126
www.facebook.com
Intel has twice tried to enter the discrete GPU market and failed. Given Vega's obsolete perf/w vs. Nvidia under Koduri's direction, at best Intel will not be entering the consumer space anytime soon.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
Which means AMD gets even further left in the dust than they already are in the GPU race.
Actually it leaves NVidia without a dance partner in laptop GPUs, and the increasing trend is that people are mostly buying laptops.

Both Intel and AMD will have better and better "good enough" laptop GPUs integrated on the CPU die. NVidia is locked out at this level.

Both AMD and Intel will also have dGPUs that get put in special packages, tightly integrated with something like EMIB. NVidia will have a hard time integrating its dGPUs at this level, so NVidia will mostly be locked out of this level of integration too.

Intel may have packaged up an EMIB commercial proof of concept using a Radeon dGPU chip, but you can bet that future integrated designs from Intel will be all-Intel once they get their dGPU ready, and AMD will be packaging all-AMD solutions.

Where does that leave NVidia? Mostly on the outside without a CPU partner for laptops.

NVidia dGPUs will be pushed to the niche of hardcore gamer laptops that run a GTX 1070 or better.

So the real loser here is NVidia, not AMD.
 
  • Like
Reactions: luger

tviceman

Diamond Member
Mar 25, 2008
6,700
467
126
www.facebook.com
Actually it leaves NVidia without a dance partner in laptop GPUs, and the increasing trend is that people are mostly buying laptops.

Both Intel and AMD will have better and better "good enough" laptop GPUs integrated on the CPU die. NVidia is locked out at this level.

Both AMD and Intel will also have dGPUs that get put in special packages, tightly integrated with something like EMIB. NVidia will have a hard time integrating its dGPUs at this level, so NVidia will mostly be locked out of this level of integration too.

Intel may have packaged up an EMIB commercial proof of concept using a Radeon dGPU chip, but you can bet that future integrated designs from Intel will be all-Intel once they get their dGPU ready, and AMD will be packaging all-AMD solutions.

Where does that leave NVidia? Mostly on the outside without a CPU partner for laptops.

NVidia dGPUs will be pushed to the niche of hardcore gamer laptops that run a GTX 1070 or better.

So the real loser here is NVidia, not AMD.
If it all pans out, Intel wins and everyone else loses. But what I said a few posts above is more relevant than ever, and people have been saying Nvidia is dead since 2009 because the high end was dying and iGPUs would take over. Each year since then, Nvidia has made more money than the previous one. They're doing better than ever now.
 
  • Like
Reactions: Zstream and s44

Genx87

Lifer
Apr 8, 2002
41,083
489
126
NVidia dGPUs will be pushed to the niche of hardcore gamer laptops that run a GTX 1070 or better.

So the real loser here is NVidia, not AMD.
Said every year for a decade.

Intel has failed in dGPUs not necessarily due to design but due to margins. The GPU business does not present high enough margins for them to sustain development. I'm curious to see whether they have decided to let margins slip to gain market share in this arena. I also believe some of this is to fund R&D for their HPC efforts; Nvidia makes fat margins with its Tesla line.
 

antihelten

Golden Member
Feb 2, 2012
1,759
261
126
Intel has twice tried to enter the discrete GPU market and failed. Given Vega's obsolete perf/w vs. Nvidia under Koduri's direction, at best Intel will not be entering the consumer space anytime soon.
I doubt Intel's goals are necessarily to be targeting the consumer space anyway (at least not as their primary focus). This seems to be first and foremost a response to Nvidia's success in the deep learning and compute space, where Intel has probably finally realized that their current Xeon Phi efforts are not going to be enough to compete in the long run.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
If it all pans out, Intel wins and everyone else loses. But what I said a few posts above is more relevant than ever, and people have been saying Nvidia is dead since 2009 because the high end was dying and iGPUs would take over. Each year since then, Nvidia has made more money than the previous one. They're doing better than ever now.
Any predictions about the death of high end were clearly wrong.

But they have certainly lost much of the laptop business, to the point that something like 60% of the GPUs sold in 2016 were Intel IGPs.

NVidia making more money? Sure. In 2009, NVidia's high-end GPU (the dual-GPU GTX 295) was ~$500; today it's a $1200 Titan. I wonder why they make more money every year.

So far, IGPs/APUs have only soaked up low-end GPUs aimed at essentially non-gamers.

The EMIB/interposer + dGPU approach will lock NVidia out of the mid-range as well, and that is going to sting.

NVidia saw this coming, which is why they pushed so hard into supercomputing.
 

Glo.

Diamond Member
Apr 25, 2015
4,285
2,362
136
I doubt Intel's goals are necessarily to be targeting the consumer space anyway. This seems to be a response to Nvidia's success in the deep learning and compute space, where Intel has probably finally realized that their current Xeon Phi efforts are not going to be enough to compete in the long run.
Ignoring AMD, especially when they are on the rise again, is plain stupidity.

AMD has APUs, and greater engineering capability for delivering that kind of hardware (SoCs) than anyone else in the industry. Nvidia does not have x86 CPUs; Intel and AMD do.

It's also funny how Raja morphed from the incompetent fool running the Radeon Technologies Group into a star of GPU engineering simply by joining Intel.

AMD and Nvidia have 4-5 years of breathing room. Any work that Raja does at Intel will come to fruition with Intel's next-generation architecture. In that time span, A LOT can change for both of those companies.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,562
96
Ignoring AMD, especially when they are on the rise again, is plain stupidity.

AMD has APUs, and greater engineering capability for delivering that kind of hardware (SoCs) than anyone else in the industry. Nvidia does not have x86 CPUs; Intel and AMD do.

It's also funny how Raja morphed from the incompetent fool running the Radeon Technologies Group into a star of GPU engineering simply by joining Intel.

AMD and Nvidia have 4-5 years of breathing room. Any work that Raja does at Intel will come to fruition with Intel's next-generation architecture. In that time span, A LOT can change for both of those companies.
Yeah, I doubt AMD and Nvidia will be sitting still, and both have way more experience than Intel does in this field.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
Yeah, I doubt AMD and Nvidia will be sitting still, and both have way more experience than Intel does in this field.
I doubt we will see Intel releasing a GPU card to compete with this:
https://www.anandtech.com/show/12009/nvidia-launches-star-wars-themed-titan-xp-collectors-edition-graphics-cards

But they will leverage packaging advantages to tie their CPUs to their dGPUs. Even if they aren't the best, the overall package will be what gets them sales.

Much like how the Intel IGP is the worst GPU on the market today, yet sells in the greatest numbers.
 

zlatan

Senior member
Mar 15, 2011
566
258
136
Intel has twice tried to enter the discrete GPU market and failed.
Don't think about this in the traditional way. There are some potential problems in the future. Even though PCIe 4.0 and 5.0 are coming, a GPU will need a more robust connection to the CPU. AMD had a lot of ideas for this, like the Greenland project, which was basically an MCM design with a Zen CPU connected to a Vega GPU through four GMI links. Intel wants to pursue this concept. They don't really care about the legacy PCIe connection; it will be too slow even with PCIe 5.0.
I can't talk about it too much, but Microsoft is working on some huge structural change for Windows. It won't come soon, but Intel probably knows about it, and it will be the perfect time to attack this market, because a superfast, low-latency connection between the CPU and the GPU will be necessary.
 

PingSpike

Lifer
Feb 25, 2004
21,261
196
106
This might work if Intel fires its CEO.

Nvidia certainly is, and has been, in a strategically weaker position, but they have been tactically masterful to make up for it. Intel is obviously after that compute money here more than anything, yet that is a market Nvidia seems to have created for itself out of whole cloth. If I had to guess what will happen right now, I'd say, based on Intel's recent efforts to move into basically any market, that they will fail here too. I'd expect them to release an underwhelming offering in a few years, probably poorly priced because Intel is too cheap to buy its way ahead in new markets. Then they'll lose interest and let the whole thing die on the vine like they always do.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,562
96
Don't think about this in the traditional way. There are some potential problems in the future. Even though PCIe 4.0 and 5.0 are coming, a GPU will need a more robust connection to the CPU. AMD had a lot of ideas for this, like the Greenland project, which was basically an MCM design with a Zen CPU connected to a Vega GPU through four GMI links. Intel wants to pursue this concept. They don't really care about the legacy PCIe connection; it will be too slow even with PCIe 5.0.
I can't talk about it too much, but Microsoft is working on some huge structural change for Windows. It won't come soon, but Intel probably knows about it, and it will be the perfect time to attack this market, because a superfast, low-latency connection between the CPU and the GPU will be necessary.
It has been my understanding that even high-end GPUs don't take full advantage of the bandwidth of a PCIe 3.0 x16 slot, so there's still plenty of headroom left there. It is basically the data center and the like that need PCIe 4.0 and 5.0.
 

NTMBK

Diamond Member
Nov 14, 2011
8,809
1,855
136
It has been my understanding that even high-end GPUs don't take full advantage of the bandwidth of a PCIe 3.0 x16 slot, so there's still plenty of headroom left there. It is basically the data center and the like that need PCIe 4.0 and 5.0.
Your understanding is wrong.
 

zlatan

Senior member
Mar 15, 2011
566
258
136
It has been my understanding that even high-end GPUs don't take full advantage of the bandwidth of a PCIe 3.0 x16 slot, so there's still plenty of headroom left there. It is basically the data center and the like that need PCIe 4.0 and 5.0.
It is important to understand that current software is designed around these limits. But I can write stuff where PCIe 3.0 x16 won't be enough. And I'm totally sure you would like that. :)

In the future, this will change when a GPU connects to the CPU with a ~100-150 GB/s low-latency interconnect, with direct access to the CPU's page tables. This is what AMD and Intel are pursuing. And you can bet they don't care that NVIDIA is only able to follow them with ARM. :)
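For a sense of scale, here is a rough transfer-time sketch comparing PCIe 3.0 x16 against a hypothetical coherent link in the ~100-150 GB/s range described above. The 8 GB working-set size is an arbitrary illustrative number, and the link figures are theoretical peaks, not measurements of any shipping product:

```python
# Rough transfer-time comparison: moving a working set between CPU and
# GPU over PCIe 3.0 x16 versus a hypothetical ~100-150 GB/s coherent link.

working_set_gb = 8.0  # illustrative dataset size, chosen arbitrarily

links = {
    "PCIe 3.0 x16":         15.75,   # GB/s, theoretical peak (8 GT/s, 128b/130b)
    "Coherent link (low)":  100.0,   # GB/s, assumed lower bound from the post
    "Coherent link (high)": 150.0,   # GB/s, assumed upper bound from the post
}

for name, bw in links.items():
    ms = working_set_gb / bw * 1000  # seconds -> milliseconds
    print(f"{name}: {ms:.0f} ms to move {working_set_gb:.0f} GB")
```

Roughly half a second over PCIe 3.0 versus well under 100 ms over the faster link; for workloads that repeatedly shuttle data, that difference compounds quickly.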
 
  • Like
Reactions: Despoiler

tamz_msc

Platinum Member
Jan 5, 2017
2,627
2,296
106
It has been my understanding that even high end GPUs don't take full advantage of the bandwidth of PCIe v3 x16 slot and so there still plenty of headroom left there. It is basically the data center and the like that needs PCIe version 4 and 5.
If you have to take the data out of the GPU's memory (and you have to; it's obviously not going to just sit there in compute applications), then PCIe places severe restrictions on attainable memory bandwidth and, indirectly, on achievable GFLOP/s.

This is a huge problem for GPGPU, and it's basically why NVLink exists in the first place. There is a reason PCIe 5.0 was fast-tracked after the delays with PCIe 4.0, and some people plan to skip straight to 5.0.
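A quick back-of-the-envelope calculation makes the gap concrete. The PCIe numbers below are the standard theoretical one-direction peaks (per-lane transfer rate times 16 lanes, with 128b/130b encoding overhead), and the 900 GB/s HBM2 figure is an illustrative V100-class number, not taken from this thread:

```python
# Compare theoretical PCIe x16 link bandwidth against on-card HBM2
# bandwidth, to show why bus-bound compute workloads are bottlenecked.

def pcie_bw_gbs(gt_per_s, lanes=16, encoding=128 / 130):
    """Theoretical one-direction PCIe bandwidth in GB/s."""
    return gt_per_s * lanes * encoding / 8  # GT/s -> GB/s per lane, times lanes

pcie3 = pcie_bw_gbs(8.0)    # PCIe 3.0: 8 GT/s per lane  -> ~15.8 GB/s
pcie4 = pcie_bw_gbs(16.0)   # PCIe 4.0: 16 GT/s per lane -> ~31.5 GB/s
pcie5 = pcie_bw_gbs(32.0)   # PCIe 5.0: 32 GT/s per lane -> ~63.0 GB/s

hbm2 = 900.0  # GB/s, illustrative HBM2 figure (roughly Tesla V100 class)

for name, bw in [("PCIe 3.0 x16", pcie3),
                 ("PCIe 4.0 x16", pcie4),
                 ("PCIe 5.0 x16", pcie5)]:
    print(f"{name}: {bw:.1f} GB/s, {hbm2 / bw:.0f}x slower than HBM2")
```

Even PCIe 5.0 x16 leaves roughly an order of magnitude between link bandwidth and on-card memory bandwidth, which is the gap NVLink and the CPU-GPU interconnects discussed above are meant to close.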
 

Genx87

Lifer
Apr 8, 2002
41,083
489
126
Any predictions about the death of high end were clearly wrong.

But they have certainly lost much of the laptop business, to the point that something like 60% of the GPUs sold in 2016 were Intel IGPs.

NVidia making more money? Sure. In 2009, NVidia's high-end GPU (the dual-GPU GTX 295) was ~$500; today it's a $1200 Titan. I wonder why they make more money every year.

So far, IGPs/APUs have only soaked up low-end GPUs aimed at essentially non-gamers.

The EMIB/interposer + dGPU approach will lock NVidia out of the mid-range as well, and that is going to sting.

NVidia saw this coming, which is why they pushed so hard into supercomputing.
Nvidia saw the writing on the wall with regard to PC sales. Sales have been slumping for about six years running with no bottom in sight; right now PC sales are at 2007 levels. Instead of fighting over shrinking markets, create a new one with Tesla, and another with automotive integration. Both are lucrative.
 
