Speculation: Radeon will become the "x86" of GPUs

Will Intel adopt the Radeon ISA for their future GPU?

  • Yes

    Votes: 12 30.8%
  • No

    Votes: 27 69.2%

  • Total voters
    39

Vattila

Senior member
Oct 22, 2004
799
1,351
136
Now that Kaby Lake-G has been revealed (an Intel MCM with a Core CPU and a Radeon GPU), with Radeon dominating the high-end console space for the foreseeable future, with Raja Koduri moving to Intel, and with Intel announcing their intent to play in the high-end GPU space, what are the chances that Intel will adopt the Radeon Instruction Set Architecture (ISA), which AMD documents publicly (here), rather than creating a new GPU ISA from scratch?

Would there be a strategic benefit to AMD and Intel by cooperating on the GPU ISA, similar to their cooperation on x86 for the CPU ISA? Similarly, would it benefit AMD and Intel to cooperate on AMD's open-source initiatives GPUOpen and ROCm, to combat Nvidia's dominance with CUDA and their associated proprietary ISA?

To me it seems "yes".
 
  • Like
Reactions: jrphoenix

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Any significant "win" in Red vs. Green seems doubtful to me.

GPU design is not really an ISA or even an extension like the AMD x64 instructions intel adopted after Itanium flopped.

Vulkan-type low-level "to the metal" coding will need to change for each major design shift in GPUs; it can't be abstracted away like higher-level Direct3D and OpenGL coding, which itself still changes with new designs.

In other words, there is (to me) no direct mapping to an ISA that largely stays the same over 5+ generations of CPUs counting extensions, or decades ignoring them.

Open-source initiatives don't really depend on shared designs, except maybe if the initiative favors a feature that other designs are weak at supporting.
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Intel is just buying actual physical GPUs. They're not licensing anything from AMD AFAIK. GCN ISA is a thing though, Intel just doesn't have the rights to use it.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Intel is just buying actual physical GPUs. They're not licensing anything from AMD AFAIK. GCN ISA is a thing though, Intel just doesn't have the rights to use it.

Exactly. Selling GPUs is Win for AMD. Licensing their main competitor to use their one real competitive advantage doesn't make sense.

In fact, that was exactly what Lisa Su said when the Licensing rumors started flying:

"We're not looking at enabling a competitor to compete with our products," $AMD CEO Lisa Su says at JPMorgan tech conference.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
NVIDIA however is who Intel is gunning for. So it would make as much sense as Intel purchasing Zen cores to add to their graphics.

Says who? This "Gunning for" mentality is nothing more than forum blather.

Most of Intel's business is selling Desktop/Laptop/Server x86 processors.

They really only have one competitor there: AMD, which just became a MUCH bigger threat in the last year.

No matter who Intel buys GPU chips from, it will be from a competitor.

If Intel is run with any logic, they should just buy the GPU chip that has the best specs for the best price.

Currently that is Vega Mobile, 6 months from now it could be GTX 2060 mobile.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,686
1,221
136
Intel's CPU and GPU ISAs are fusible, just like AMD's CPU and GPU ISAs. I see it as more likely that a new graphics extension in x86 will pop up, one in which GPUs translate CISC ops into native RISC ops, like CPUs do.

AMD and Intel both have split-core research/patents for general purpose and graphics inside the same core. AMD's is a bit more fleshed out currently, with two SIMDs from GCN residing either in the FPU or in a GPU-specific execution pipeline inside the CPU processing core. Even VIA/S3 has a bit of this as well.

GCN => Wavefronts = 20, 4 SIMDs per CU, 1 RBE per CU array, 4 TMUs per CU.
X86-GCN => Wavefronts = 8, 2 SIMDs per FPU?, 1 ROP per core, 1 TMU per core.
From the patents/research for AMD. (There is an L2-uncore component for GCNx86.)
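To put the GCN layout above in perspective, here is a back-of-envelope sketch of what those per-CU figures (4 SIMD16 units, so 64 lanes per CU, 2 FLOPs per lane via FMA) imply for peak throughput. The function and the Vega 64 example numbers (64 CUs, ~1.536 GHz boost) are my own illustration, not from the patents:

```python
# Rough peak FP32 throughput for a GCN-style GPU from its CU layout:
# each CU has 4 SIMD16 units (64 lanes), and each lane can retire a
# fused multiply-add (2 FLOPs) per clock.

def gcn_peak_fp32_tflops(cus: int, clock_ghz: float,
                         simds_per_cu: int = 4, lanes_per_simd: int = 16) -> float:
    lanes = cus * simds_per_cu * lanes_per_simd
    flops_per_clock = lanes * 2              # FMA counts as 2 FLOPs
    return flops_per_clock * clock_ghz / 1000.0  # ops/clock * GHz -> TFLOPs

# Vega 64: 64 CUs at ~1.536 GHz
print(round(gcn_peak_fp32_tflops(cus=64, clock_ghz=1.536), 2))  # prints 12.58
```

The hypothetical x86-GCN variant in the post (2 SIMDs per FPU) could be sketched the same way by passing `simds_per_cu=2`.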
 
Last edited:

Yakk

Golden Member
May 28, 2016
1,574
275
81
The more OEMs AMD can make deals with, the better for them. Intel is just one company amongst others.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
Says who? This "Gunning for" mentality is nothing more than forum blather.

Most of Intel's business is selling Desktop/Laptop/Server x86 processors.

They really only have one competitor there: AMD, which just became a MUCH bigger threat in the last year.

No matter who Intel buys GPU chips from, it will be from a competitor.

If Intel is run with any logic, they should just buy the GPU chip that has the best specs for the best price.

Currently that is Vega Mobile, 6 months from now it could be GTX 2060 mobile.
Did you notice they hired Raja and started investing heavily into that field?

NVIDIA is becoming a large monster of a company for a reason: their GPUs are used in the fastest-growing fields in the market, with the obvious end goal of turning the GPU into the central focal point of the system. Don't believe me? Check out Huang's speech during the reveal of GV100. He made it quite clear. This is a direct threat to Intel unless they work fast to grow their own graphics and compute business, both to keep their CPUs the focal point for longer and to have a stake in the growing markets.

In that sense AMD and Intel have a combined interest. They want to keep x86 the focal point in the lucrative server and HPC markets since it's something only they (realistically) can do, shutting out competitors. NVIDIA's growth is a threat to x86's dominance in computing, and that's a threat to Intel and AMD.

Your view is shortsighted.

Intel will not use NVIDIA, and they certainly don't like using AMD, considering how their marketing people were trained to avoid saying the word AMD and instead say Radeon. They're developing their own dGPU, but for now the only real logical choice is AMD.
 
Last edited:

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Did you notice they hired Raja and started investing heavily into that field?

NVIDIA is becoming a large monster of a company for a reason: their GPUs are used in the fastest-growing fields in the market, with the obvious end goal of turning the GPU into the central focal point of the system. Don't believe me? Check out Huang's speech during the reveal of GV100. He made it quite clear. This is a direct threat to Intel unless they work fast to grow their own graphics and compute business, both to keep their CPUs the focal point for longer and to have a stake in the growing markets.

In that sense AMD and Intel have a combined interest. They want to keep x86 the focal point in the lucrative server and HPC markets since it's something only they (realistically) can do, shutting out competitors. NVIDIA's growth is a threat to x86's dominance in computing, and that's a threat to Intel and AMD.

Your view is shortsighted.

Intel will not use NVIDIA, and they certainly don't like using AMD, considering how their marketing people were trained to avoid saying the word AMD and instead say Radeon. They're developing their own dGPU, but for now the only real logical choice is AMD.

This kind of thinking is the stuff of forum rhetoric, not corporate decision making. Apple and Samsung are brutal competitors and have been involved in lawsuits, but Apple still buys billions of dollars in parts from Samsung.

Because that is how you make the best product for the best price.

Not by acting as if somehow withholding purchase from your "enemy" will weaken them and allow you to crush them. That is pure nonsense.

If NVidia has the better part for a better deal, they would be fools not to use it.
 
  • Like
Reactions: Muhammed and DooKey

2is

Diamond Member
Apr 8, 2012
4,281
131
106
AMD can’t even manage to supply the relatively few (by comparison to nvidia) people who buy their GPUs. None of this is relevant unless they can start cranking out GPUs at a significantly faster rate.
 
  • Like
Reactions: tential and n0x1ous

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Did you notice they hired Raja and started investing heavily into that field?

Hiring away their GPU leader is not a friendly act.

Raja is at Intel to help with their own Intel dGPU. He is there to help replace Radeon, not integrate it. I said before that the AMD part of Kaby-G was like a proof of concept until they get their own dGPU ready:

Intel Preps Their Own Discrete GPUs For Gen 12 and 13 Codenamed Arctic Sound and Jupiter Sound – Will Be Featured in Post-Cannonlake Chips Replacing AMD’s dGPU Solutions
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
This kind of thinking is the stuff of forum rhetoric, not corporate decision making. Apple and Samsung are brutal competitors and have been involved in lawsuits, but Apple still buys billions of dollars in parts from Samsung.

Because that is how you make the best product for the best price.

Not by acting as if somehow withholding purchase from your "enemy" will weaken them and allow you to crush them. That is pure nonsense.

If NVidia has the better part for a better deal, they would be fools not to use it.
Suuuure. Just forum rhetoric.... Not like Intel has a history of pulling moves to weaken their competition or something. Not at all.

Hiring away their GPU leader is not a friendly act.

Raja is at Intel to help with their own Intel dGPU. He is there to help replace Radeon, not integrate it. I said before that the AMD part of Kaby-G was like a proof of concept until they get their own dGPU ready:

Intel Preps Their Own Discrete GPUs For Gen 12 and 13 Codenamed Arctic Sound and Jupiter Sound – Will Be Featured in Post-Cannonlake Chips Replacing AMD’s dGPU Solutions
Who said anything about it being a friendly move? I also never said they intend to integrate Radeon. Intel is using AMD right now as a temporary means to an end: getting into proper graphics and stemming NVIDIA's meteoric growth. The more money NVIDIA has, the more they can pressure Intel on the data center front, which is Intel's bread and butter. NVIDIA makes a LOT of money off of laptops. It might shock you how much NVIDIA charges for MXM GPUs.

Intel isn't friendly with AMD and I never said that. Acting in common interests isn't a friendly move and both would cut the deal if it wasn't good for either of them.
 
  • Like
Reactions: KompuKare

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Suuuure. Just forum rhetoric.... Not like Intel has a history of pulling moves to weaken their competition or something. Not at all.

It's a mistake to think Kaby-G has any real effect on NVidia. It's one niche part, not a lockout from the market.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
It's a mistake to think Kaby-G has any real effect on NVidia. It's one niche part, not a lock out from the market.
First, I disagree about it being niche: it hits exactly where NVIDIA generates the most revenue. Second, I see it as a step, not the entire move.
 
  • Like
Reactions: prtskg

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,642
136
It is unclear how Intel's own attempt at the graphics market will pan out, as the first products are not expected for 3+ years.

However, when it comes to the software ecosystem, AMD vs NVIDIA is like LibreOffice/OpenOffice vs Microsoft Office. Everybody likes to extol the virtues of open source and software/hardware agnosticism, but at the end of the day they'll be using CUDA.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
However, when it comes to the software ecosystem, AMD vs NVIDIA is like LibreOffice/OpenOffice vs Microsoft Office. Everybody likes to extol the virtues of open source and software/hardware agnosticism, but at the end of the day they'll be using CUDA.

AMD's GPU IP is not open source.

This thread is about whether Intel will license and adopt AMD's Radeon IP, which is proprietary.

It is a virtual certainty that Intel will NOT be adopting AMD's IP, but will be pursuing their own unique designs/ISA for their in-house GPUs.
 

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
Intel wouldn't adopt GCN's ISA; that would be pointless. What they might do is use the open-source APIs that AMD developed and contribute to those projects, to have one strong ecosystem vs CUDA rather than multiple weak ones, but even that's not a given.