News Intel GPUs - Intel launches A580


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
If Intel is successful, AMD will be squeezed out of the discrete GPU market in both the consumer and professional space. If that is the case, Nvidia will eventually be hurting too down the line and I can see them being bought out by Samsung or Apple.

But I think everyone is over-estimating Intel's ability to deliver. They have failed twice, and despite a process lead in the CPU space, they let that lead nearly entirely shrink away vs. the struggling, cash-strapped, and much smaller AMD.

Nvidia has a huge head start and it takes a massive amount of good engineers plus time to build something from the ground up that can compete.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Betting against nV already? They keep finding ways to print money.

Leadership matters and JHH has it. Imagine if a decade ago we got that proposed AMD-Nvidia under JHH management instead of AMD-ATi under clowns.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Nvidia is not just fighting over the high end and in supercomputing/AI. They also have a robust Tegra division that integrates GPU and CPU together (though using ARM). Chromebook sales continue to rise in absolute numbers and market share each year, and Nvidia already has a Tegra solution there with major OEMs pushing product. They may lack a bit in CPU performance at present, but this is not a stagnant situation.

The Tegra solution is already in the Switch and selling in volume, and Nvidia already has faster parts available. I think that is where Nvidia will go for volume product: Chromebooks and SFF desktop devices running ChromeOS, using Tegra-based solutions. That will be their volume play, with dGPUs for the high end.
 

Elixer

Lifer
May 7, 2002
10,376
762
126
If Intel is successful, AMD will be squeezed out of the discrete GPU market in both the consumer and professional space. If that is the case, Nvidia will eventually be hurting too down the line and I can see them being bought out by Samsung or Apple.
So, you are thinking that AMD will basically become a patent house?
I don't see that happening any time soon.
I do expect Nvidia to partner up with someone; it could be Apple, Qualcomm, or Broadcom, or perhaps someone else.

All we know for sure is, things are going to get very interesting.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
If you've got to take the data out of the GPU's memory (because you have to; it's obviously not going to just sit there in compute applications), then PCIe places severe restrictions on attainable memory bandwidth and, indirectly, on actual GFLOP/s.

This is a huge problem for GPGPU, and basically why NVLink exists in the first place. There is a reason PCIe 5.0 was fast-tracked after the delay with PCIe 4.0, and people are planning to go directly to 5.0.
Well, I was referring to consumer-level hardware, and yes, I'm well aware that there are "prosumers" and content creators who have run into PCIe 3.0 limitations.
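The bandwidth argument above can be sketched as a rough roofline calculation. The peak throughput, bus bandwidths, and per-kernel arithmetic intensity below are illustrative assumptions, not measurements of any real card:

```python
# Rough roofline sketch: how bus bandwidth caps effective GPGPU
# throughput when every byte must stream over PCIe.
# All numbers here are illustrative assumptions.

def effective_gflops(peak_gflops, flops_per_byte, bus_gb_per_s):
    """Attainable GFLOP/s when each byte crosses the bus once."""
    bus_limited = flops_per_byte * bus_gb_per_s  # ceiling imposed by the bus
    return min(peak_gflops, bus_limited)

PCIE3_X16 = 16.0  # GB/s, roughly practical PCIe 3.0 x16
PCIE5_X16 = 63.0  # GB/s, roughly PCIe 5.0 x16

# A kernel doing 4 FLOPs per transferred byte, on a 10 TFLOP/s card:
print(effective_gflops(10_000, 4, PCIE3_X16))  # 64.0  -> bus-bound
print(effective_gflops(10_000, 4, PCIE5_X16))  # 252.0 -> still bus-bound
```

Even with these made-up numbers, a streaming kernel sees only a tiny fraction of the card's peak over PCIe 3.0, which is exactly the pressure NVLink and PCIe 5.0 aim to relieve.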
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Nvidia is not just fighting over the high end and in supercomputing/AI. They also have a robust Tegra division that integrates GPU and CPU together (though using ARM). Chromebook sales continue to rise in absolute numbers and market share each year, and Nvidia already has a Tegra solution there with major OEMs pushing product. They may lack a bit in CPU performance at present, but this is not a stagnant situation.

The Tegra solution is already in the Switch and selling in volume, and Nvidia already has faster parts available. I think that is where Nvidia will go for volume product: Chromebooks and SFF desktop devices running ChromeOS, using Tegra-based solutions. That will be their volume play, with dGPUs for the high end.

I think Tegra is all but dead. The part in the Switch is an older design. When was the last new Tegra chip design? AFAIK it was 2015.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
What would you get if you put a 72 EU Gen 9.5 graphics chip on a PCIe card and clocked it around 1.25GHz?

Probably you'd have about a 750 Ti?
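The back-of-envelope math supports that guess. Each Gen9 EU has two SIMD-4 FP32 ALUs, and counting an FMA as two FLOPs gives 16 FLOPs per clock per EU; the EU count and clock below are just the numbers from the question:

```python
# Peak FP32 throughput of a hypothetical 72-EU Gen 9.5 part at 1.25 GHz.
# Gen9 EU: 2 SIMD-4 FP32 ALUs, FMA counted as 2 FLOPs -> 16 FLOPs/clock.
eus = 72
flops_per_clock_per_eu = 2 * 4 * 2  # ALUs * SIMD width * FMA
clock_ghz = 1.25

peak_gflops = eus * flops_per_clock_per_eu * clock_ghz
print(peak_gflops)  # 1440.0 GFLOP/s
```

That lands around 1.4 TFLOP/s, in the same ballpark as a GTX 750 Ti's roughly 1.3 TFLOP/s peak, so "750 Ti-class" looks about right on paper, setting aside memory bandwidth and drivers.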
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
Intel has a giant mountain to overcome and a past timeline of failures. They have a year-old division with several market segments under it that has only made some expensive acquisitions. They add Raja the Dreamer to bring it all together. They are going after the top from what I've read. They have zero software ecosystem for DC/ML/AI and they don't even have a dGPU. It took Raja 2 years to go from Boltzmann to just starting to be a viable alternative to Nvidia. AMD obviously already had compute-focused GPUs. Intel wants to get all this done by 2020 from what I've read. I don't see how that happens, at least not on that timeline.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
They work on scaling existing iGPU technology to larger dGPU chips to compete with the midrange (nV Gx106 or so) while developing a new architecture that will replace it in several years and compete fully top to bottom.
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
If Intel is successful, AMD will be squeezed out of the discrete GPU market in both the consumer and professional space. If that is the case, Nvidia will eventually be hurting too down the line and I can see them being bought out by Samsung or Apple.
Yeah, like AMD will not develop their own SoCs. They have no capability to compete. THEY ARE BLOODY DOOMED!

I'm sorry, but IT'S INTEL who has to play catch-up in GPUs, to both Nvidia and AMD. It's Intel who has to establish mindshare.

So suddenly, because they hired ONE GUY, Intel is god almighty here. Let me ask you all a question, Intel dGPU supporters: if Intel were so capable of delivering graphics, what stopped them before? Why did they not come up with the tech so far, but instead had to pair themselves with AMD RADEON tech to deliver decent performance?

And lastly: why do you believe this is not another stupid Brian Krzanich idea that in the long run turns into another massive failure, directly because of his idiotic management style?
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I would guess that the Radeon deal has one source and that's Apple. While Intel may use the chips in NUCs or whatever, I think the driver of the deal was Apple.
Without Apple, we probably don't see this Intel/AMD collaboration.

I would also guess that Intel has been working on a dGPU for a while, and that the Raja hiring just broke cover.
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
Intel has a giant mountain to overcome and a past timeline of failures. They have a year-old division with several market segments under it that has only made some expensive acquisitions. They add Raja the Dreamer to bring it all together. They are going after the top from what I've read. They have zero software ecosystem for DC/ML/AI and they don't even have a dGPU. It took Raja 2 years to go from Boltzmann to just starting to be a viable alternative to Nvidia. AMD obviously already had compute-focused GPUs. Intel wants to get all this done by 2020 from what I've read. I don't see how that happens, at least not on that timeline.
From a consumer point of view: sad but true. Intel has very capable hardware in terms of software compatibility, but the hardware itself is rubbish in its current state, and their drivers are just ... well, let's not comment on that. There was no incentive at Intel to develop the drivers beyond "usability".

If they develop both hardware and software, we could end up with great products for consumers. But it will be at least 5 years from now before we see this.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
From a consumer point of view: sad but true. Intel has very capable hardware in terms of software compatibility, but the hardware itself is rubbish in its current state, and their drivers are just ... well, let's not comment on that. There was no incentive at Intel to develop the drivers beyond "usability".

If they develop both hardware and software, we could end up with great products for consumers. But it will be at least 5 years from now before we see this.
A lot can and will happen during that timeframe....
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Well, it would be good if Intel could get in the game with a competitive card.

The competition is needed, imo.

I think we will see their cards much faster than 4 or 5 years, though.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
AFAIK there's nothing besides a single SoC.

Intel still has Nvidia IP licensed through early 2017, last I checked. Eventually early 2017 will be old beans, so they've got to go somewhere next.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136
From a consumer point of view: sad but true. Intel has very capable hardware in terms of software compatibility, but the hardware itself is rubbish in its current state, and their drivers are just ... well, let's not comment on that. There was no incentive at Intel to develop the drivers beyond "usability".

If they develop both hardware and software, we could end up with great products for consumers. But it will be at least 5 years from now before we see this.

Totally. I mean, Intel finally has a whole 1 TFLOP GPU. Meanwhile AMD and Nvidia are at least 12x that. I also believe Intel GPUs are well behind AMD in perf per watt, though I'm having a hard time finding the wattage of their GT4e part. I don't think simply scaling up what they have is plausible.
 

positivedoppler

Golden Member
Apr 30, 2012
1,103
171
106
Totally. I mean, Intel finally has a whole 1 TFLOP GPU. Meanwhile AMD and Nvidia are at least 12x that. I also believe Intel GPUs are well behind AMD in perf per watt, though I'm having a hard time finding the wattage of their GT4e part. I don't think simply scaling up what they have is plausible.

Remember when Intel underestimated AMD because Bulldozer was so far behind in perf/watt and overall performance? That gap evaporated quickly.
 

Krteq

Senior member
May 22, 2015
991
671
136
Just for fun.. :D
 
