[PC Watch] Intel Announces Discrete GPU Prototype

ZGR

Golden Member
Oct 26, 2012
1,827
42
126
#2
This looks to be very power efficient. I hope it is!
 
Oct 14, 2003
6,209
324
126
#3
I have to say this because there are articles saying this is a GPU that's actually coming to the market.

It's not. It's a proof-of-concept to demonstrate advanced power management for GPUs.

-EUs are based on Gen 9, the same generation introduced with Skylake back in 2015.
-14nm process
-18 EUs, the same count as the one in the Atom-based Goldmont Pentium/Celeron chips
-The article says 2 of the 3 subslices (12 of the 18 EUs) incorporate the new circuitry, while the remaining subslice is conventional. Final products aren't going to do that; they did it here so they could compare EUs using the new circuitry directly against EUs using conventional techniques.
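Purely to make that split concrete, here's a back-of-envelope sketch in Python; the 18-EU / 3-subslice numbers are from the article, but the 30% power-savings figure (and the script itself) is an invented placeholder, not measured data:

```python
# Back-of-envelope view of the prototype's EU split. The savings figure below
# is a made-up placeholder, only to show how mixing the two subslice types
# gives a direct new-vs-conventional comparison on a single die.
TOTAL_EUS = 18
SUBSLICES = 3
EUS_PER_SUBSLICE = TOTAL_EUS // SUBSLICES      # 6 EUs per subslice

new_circuitry_eus = 2 * EUS_PER_SUBSLICE       # 12 EUs with the new power circuitry
conventional_eus = 1 * EUS_PER_SUBSLICE        # 6 EUs left conventional as a baseline

assumed_savings = 0.30                         # placeholder assumption, not from the article
conventional_power = 1.0                       # arbitrary units per EU
new_power = conventional_power * (1 - assumed_savings)

print(f"New-circuitry EUs: {new_circuitry_eus}, conventional EUs: {conventional_eus}")
print(f"Hypothetical per-EU power ratio (new/conventional): {new_power / conventional_power:.2f}")
```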

Typically, it takes many years before such techniques show up in a final product. The FIVR on Haswell took something like 10 years from research to product, and so did Tri-Gate/FinFET transistors and the 80-core many-core concept that laid the groundwork for Xeon Phi.

There are many more concept circuits Intel has demonstrated that haven't been used. This may be one of them.
 

LightningZ71

Senior member
Mar 10, 2017
250
17
86
#4
Given the current production constraints on DDR chips, why is this even something Intel is thinking about? Even if they could quickly take their top-end iGPU package, scale it out by, say, 4x, marry a bus and memory controller to it, and slap it on a board, where are they going to get the RAM for that board? They don't currently make any DDR chips of the type they would need. It would cost them a fortune to license the needed IP and convert a foundry to the correct production type, and it would be more than a year before we'd see production.
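To put a very rough number on that scale-up (the 72-EU figure for a top-end Gen9 iGPU and the dual-channel DDR4-2400 bandwidth are my own ballpark assumptions, not anything from the thread or the article):

```python
# Crude bandwidth scaling: keep bandwidth-per-EU constant and see what a 4x
# part would want. Both starting figures are ballpark assumptions.
TOP_IGPU_EUS = 72                  # assumed top-end Gen9 iGPU (Iris Pro class)
DDR4_DUAL_CHANNEL_GBPS = 38.4      # 2 channels x 64-bit wide x 2400 MT/s / 8 bits per byte

SCALE = 4
scaled_eus = TOP_IGPU_EUS * SCALE
needed_bandwidth_gbps = DDR4_DUAL_CHANNEL_GBPS * SCALE

print(f"{scaled_eus} EUs would want roughly {needed_bandwidth_gbps:.0f} GB/s,")
print("which is GDDR5/HBM territory rather than commodity DDR4.")
```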

Intel's dGPU plans are either going to incorporate AI heavily, or they are a long, long way from affecting the market in a meaningful way.
 
Mar 13, 2006
10,129
130
126
#5
Keep in mind: Prototype.

Heck, the front end is a FPGA.
 
Jul 1, 2001
21,047
138
126
#6
Keep in mind: Prototype.

Heck, the front end is a FPGA.
So, what you're saying is that it's a bit too early to get in line at Best Buy to get a dozen of them for my cryptocoin miner :)
 
Oct 14, 2003
6,209
324
126
#7
No wonder we can't trust reporters with important news. They can't even get a simple tech story right. :rolleyes: Shame on them. It's like as soon as they got the translation of the first sentence, they gathered like drones (meaning brainless autonomous creatures) and reported it as quickly as possible.

Fake news, and it's a very legitimate use of the term.
 

ksec

Senior member
Mar 5, 2010
351
5
91
#8
Given the current production constraints on DDR chips, why is this even something Intel is thinking about? Even if they could quickly take their top-end iGPU package, scale it out by, say, 4x, marry a bus and memory controller to it, and slap it on a board, where are they going to get the RAM for that board? They don't currently make any DDR chips of the type they would need. It would cost them a fortune to license the needed IP and convert a foundry to the correct production type, and it would be more than a year before we'd see production.

Intel's dGPU plans are either going to incorporate AI heavily, or they are a long, long way from affecting the market in a meaningful way.
This makes zero sense.
 

Headfoot

Diamond Member
Feb 28, 2008
4,408
55
126
#9
Given the current production constraints on DDR chips, why is this even something Intel is thinking about? Even if they could quickly take their top-end iGPU package, scale it out by, say, 4x, marry a bus and memory controller to it, and slap it on a board, where are they going to get the RAM for that board? They don't currently make any DDR chips of the type they would need. It would cost them a fortune to license the needed IP and convert a foundry to the correct production type, and it would be more than a year before we'd see production.

Intel's dGPU plans are either going to incorporate AI heavily, or they are a long, long way from affecting the market in a meaningful way.
GPUs are planned years in advance....
 
Mar 10, 2004
28,522
238
126
#10
Intel, like most mfgs, probably has a boatload of prototypes of various types, the majority of which are going nowhere.
 

wahdangun

Senior member
Feb 3, 2011
994
7
106
#11
Just hope it's not like Intel's last attempt.
 

Fallen Kell

Diamond Member
Oct 9, 1999
5,274
16
91
#12
Just hope it's not like Intel's last attempt.
It isn't. This chip is not something they will ever release. It is purely a prototype to test out the new design and compare it against the old one, mainly looking for more power savings while still performing the same. The core will simply be added back into the on-chip CPU+GPU designs they currently have and marketed heavily toward the laptop and phone/mobile markets, as Intel keeps attempting to break into the phone market (as they have been trying to do for the last 8-10 years).
 
Oct 9, 1999
11,426
94
126
#13
How well does it mine? Will it be available at MSRP? That's literally all I care about with GPUs nowadays; it's sad.
 

ksec

Senior member
Mar 5, 2010
351
5
91
#14
How well does it mine? Will it be available at MSRP? That's literally all I care about with GPUs nowadays; it's sad.
This is actually a good question, assuming Intel's GPUs aren't any good at mining. Intel already has the drivers done for their iGPU; they could have scaled it up and made a dGPU that the market can actually buy.

Why aren't they doing it, with 128 or even 256 EUs? (Rough numbers below.)

I mean, now they have announced they are interested in their own dGPU.
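For a rough sense of what 128 or 256 EUs would mean on paper: a Gen9 EU peaks at 16 FP32 ops per clock, while the ~1.15 GHz clock below is my own assumption borrowed from existing iGPUs, not a figure from the thread.

```python
# Theoretical FP32 throughput for hypothetical 128- and 256-EU Gen9-style parts.
# 16 ops/clock/EU = 2 x SIMD-4 FMA pipes, with an FMA counted as 2 ops.
OPS_PER_EU_PER_CLOCK = 16
ASSUMED_CLOCK_GHZ = 1.15            # assumption, roughly an HD 530-class clock

for eus in (128, 256):
    tflops = eus * OPS_PER_EU_PER_CLOCK * ASSUMED_CLOCK_GHZ / 1000
    print(f"{eus} EUs @ {ASSUMED_CLOCK_GHZ} GHz ~= {tflops:.2f} TFLOPS FP32")
```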
 

