[PC Watch] Intel Announces Discrete GPU Prototype

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
This looks to be very power efficient. I hope it is!
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I have to say this because there are articles saying this is a GPU that's actually coming to the market.

It's not. It's a proof-of-concept to demonstrate advanced power management for GPUs.

-EUs are based on Gen 9, the same generation introduced with Skylake back in 2015.
-14nm process
-18 EUs, the same count as in the Atom-based Goldmont Pentium and Celeron chips
-The article says 2 of the 3 subslices (12 of the 18 EUs) incorporate the new circuitry, while the remaining subslice is conventional. Final products aren't going to do that; they did it here so they could compare EUs using the new circuitry directly against EUs using conventional techniques.
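For reference, the subslice split described above works out as follows (a trivial sketch of the article's numbers, nothing beyond what the post states):

```python
# Subslice arithmetic for the Gen9 prototype described in the article.
total_eus = 18
subslices = 3
eus_per_subslice = total_eus // subslices  # 6 EUs per subslice

new_circuit_subslices = 2
new_circuit_eus = new_circuit_subslices * eus_per_subslice  # 12 EUs with new circuitry
conventional_eus = total_eus - new_circuit_eus              # 6 EUs left conventional

print(eus_per_subslice, new_circuit_eus, conventional_eus)  # 6 12 6
```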

Typically, it takes many years before such techniques are used in a final product. FIVRs on Haswell took something like 10 years; so did Tri-gate/FinFET transistors, and the 80-core many-core concept that laid the groundwork for Xeon Phi.

There are many more concept circuits Intel has demonstrated that haven't been used. This may be one of them.
 
Last edited:

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Given the current production constraints on DDR chips, why is this something that Intel is even thinking about? Even if they could quickly take their top-end iGPU package, scale it out, say, 4X, marry a bus and memory controller to it, and slap it on a board, where are they going to get the RAM for the board? They don't currently make any DDR chips of the type they would need. It would cost them a fortune to license the needed IP and convert a foundry to the correct production type, and it would be more than a year before we'd see production.

Intel's dGPU plans are either going to incorporate AI heavily, or, they are a long, long way away from affecting the market in a meaningful way.
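As a rough sanity check on the "4X" scale-out idea, here's a back-of-the-envelope FP32 estimate. The baseline and clock are my own assumptions, not figures from the thread: a Gen9 GT2 iGPU of 24 EUs, 16 FP32 FLOPs per EU per clock (two SIMD-4 FPUs with FMA), and a hypothetical ~1.15 GHz clock:

```python
# Back-of-the-envelope FP32 throughput for a hypothetical 4X iGPU scale-up.
# Assumed baseline: Gen9 GT2, 24 EUs, 16 FP32 FLOPs/EU/clock, ~1.15 GHz.
FLOPS_PER_EU_PER_CLOCK = 16
BASE_EUS = 24
CLOCK_HZ = 1.15e9

def fp32_gflops(eus: int, clock_hz: float = CLOCK_HZ) -> float:
    """Peak FP32 GFLOPS for a given EU count at the assumed clock."""
    return eus * FLOPS_PER_EU_PER_CLOCK * clock_hz / 1e9

base = fp32_gflops(BASE_EUS)        # ~442 GFLOPS for the iGPU baseline
scaled = fp32_gflops(BASE_EUS * 4)  # ~1766 GFLOPS for a 4X part
print(f"{base:.0f} GFLOPS -> {scaled:.0f} GFLOPS")
```

Even at ~1.8 TFLOPS, such a part would need far more memory bandwidth than dual-channel DDR4 provides, which is exactly the RAM-sourcing problem raised above.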
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
No wonder we can't trust the reporters with important news. They can't even get a simple tech story right. :rolleyes: Shame on them. It's as if, the moment they got the translation of the first sentence, they gathered like drones (meaning brainless autonomous creatures) and reported it as quickly as possible.

Fake news, and it's a very legitimate use of the term.
 

ksec

Senior member
Mar 5, 2010
420
117
116
Given the current production constraints on DDR chips, why is this something that Intel is even thinking about? Even if they could quickly take their top-end iGPU package, scale it out, say, 4X, marry a bus and memory controller to it, and slap it on a board, where are they going to get the RAM for the board? They don't currently make any DDR chips of the type they would need. It would cost them a fortune to license the needed IP and convert a foundry to the correct production type, and it would be more than a year before we'd see production.

Intel's dGPU plans are either going to incorporate AI heavily, or, they are a long, long way away from affecting the market in a meaningful way.

This makes zero sense.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Given the current production constraints on DDR chips, why is this something that Intel is even thinking about? Even if they could quickly take their top-end iGPU package, scale it out, say, 4X, marry a bus and memory controller to it, and slap it on a board, where are they going to get the RAM for the board? They don't currently make any DDR chips of the type they would need. It would cost them a fortune to license the needed IP and convert a foundry to the correct production type, and it would be more than a year before we'd see production.

Intel's dGPU plans are either going to incorporate AI heavily, or, they are a long, long way away from affecting the market in a meaningful way.

GPUs are planned years in advance....
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,037
431
126
just hope it's not like Intel's last attempt,
It isn't. This chip is not even something they will ever release. It is purely a prototype to test a new design against the old one, mainly aiming for more power savings while performing the same. The technique will simply be folded back into the on-chip CPU+GPU designs they currently have and marketed heavily toward the laptop and phone/mobile markets, as they keep attempting to break into the phone market (as they have been trying to do over the last 8-10 years).
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
How well does it mine? Will it be available at MSRP? This is literally all I care about with GPUs nowadays; it's sad.
 

ksec

Senior member
Mar 5, 2010
420
117
116
How well does it mine? Will it be available at MSRP? This is literally all I care about with GPUs nowadays; it's sad.

This is actually a good question. Assuming Intel's GPUs aren't any good at mining, and given that Intel already has the drivers done for its iGPU, it could have scaled the design up and made a dGPU that the market can actually buy.

Why aren't they doing it, with 128 or even 256 EUs?

I mean, they have now announced they are interested in their own dGPU.