News Intel GPUs - Intel launches A580


DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
I actually hope it doesn't have much crypto support so they get the gaming side developed.

The problem is that PoW algos are really just generic GPGPU. Unless Intel goes the nVidia route and tries to disable certain algorithms in the driver, not having much "crypto support" means that the Intel consumer dGPUs could be pretty broken for all GPGPU acceleration.
 
  • Like
Reactions: KompuKare

KompuKare

Golden Member
Jul 28, 2009
1,013
924
136
The problem is that PoW algos are really just generic GPGPU. Unless Intel goes the nVidia route and tries to disable certain algorithms in the driver, not having much "crypto support" means that the Intel consumer dGPUs could be pretty broken for all GPGPU acceleration.
Yes, that was the worry about Nvidia changing mining speeds. Not that Nvidia minds segmenting their line even more, but there is the danger that if general GPU compute gets too slow, people will just use CPUs.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Yes, that was the worry about Nvidia changing mining speeds. Not that Nvidia minds segmenting their line even more, but there is the danger that if general GPU compute gets too slow, people will just use CPUs.

Well, not only that, but some GPGPU applications are "legit" and don't lead to massive shortages of consumer cards. Anyone wanting to play with FP64 loved the Radeon VII.
 

JTsyo

Lifer
Nov 18, 2007
11,718
877
126
You know, this would have been the ideal time for Intel to get their GPUs out. Everyone is so desperate for cards that they could have gotten decent market share right off the bat.
 

xpea

Senior member
Feb 14, 2014
429
135
116
You know, this would have been the ideal time for Intel to get their GPUs out. Everyone is so desperate for cards that they could have gotten decent market share right off the bat.
Not a single chance of that happening if they use TSMC N6 as reported...
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Not a single chance of that happening if they use TSMC N6 as reported...

There will definitely be a dynamic component to it, but there's a lot of static allocation, also known as minimum orders.

So if everyone turns their backs on Intel GPUs, there will be plenty of them available. Make sense?

Just because AMD/Nvidia GPUs are unavailable doesn't mean Intel will end up the same way. Of course parts shortages extend to semiconductors in general, not just TSMC, but pinning all the blame on TSMC and assuming the same will apply to an entirely different product makes little sense.

Intel will order some amount X, and depending on demand an additional amount Y will be added and/or granted by TSMC. Because if whatever they can order were entirely dependent on what TSMC has left over, then they'd be totally screwed, and it would be a stupid business decision.

Maybe we should think the opposite: perhaps Intel is ordering so many chips from TSMC that that's why AMD/Nvidia chips are in short supply? It goes both ways.
 
  • Haha
Reactions: scineram

etrin

Senior member
Aug 10, 2001
692
5
81
Intel has been shouting about this stuff for over 3 years and nothing. Their last reply was integrated graphics... uh, not discrete, but you're Intel.
 
  • Like
Reactions: dvsv

Red Squirrel

No Lifer
May 24, 2003
67,341
12,099
126
www.anyf.ca
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard? I always assumed that's what built-in video actually was, until I went AMD and found out the hard way. Is it a patent thing, I guess, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard?

Like put a GTX 1050 and GDDR5 memory on the motherboard?

It has been done before, but it's pretty rare. It would raise the price of the board, though, and you can't unplug it and upgrade.
 
Feb 4, 2009
34,553
15,766
136
Like put a GTX 1050 and GDDR5 memory on the motherboard?

It has been done before, but it's pretty rare. It would raise the price of the board, though, and you can't unplug it and upgrade.

Yeah, to my admittedly amateur understanding there simply isn't a market for it, because basic users don't want to pay extra for mid-range performance and the upper end doesn't want built-in 1050-level performance.
I suspect cooling is also a huge challenge, because isn't that area on the board really close to the CPU?
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard? I always assumed that's what built-in video actually was, until I went AMD and found out the hard way. Is it a patent thing, I guess, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?

They do, but it's mostly on server boards, and those have 2D-only graphics. Very low end.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Is it a patent thing, I guess, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?

Patent/branding would only matter if they were rebranding the thing entirely. But in this case they can simply sell it mentioning the manufacturer and its brand name.

For example, an AMD X570 chipset board with onboard GeForce GTX 1050 graphics.
 

Gideon

Golden Member
Nov 27, 2007
1,625
3,650
136
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard? I always assumed that's what built-in video actually was, until I went AMD and found out the hard way. Is it a patent thing, I guess, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?
In the olden days, processors used to communicate with memory via a separate chip on the motherboard (called the north bridge). Back then you could easily just add an integrated GPU to that chip (which had the memory controller anyway) and be done with it.

Once the memory controller was integrated into the CPU (ever since the Athlon 64), this became much more complex. There were still some motherboards that offered integrated GPU options, but those already had the GPU connected to system memory via PCIe, which is obviously less than ideal.

The alternative is to solder (GDDR) memory onto the motherboard directly for the GPU. On laptops it's a common practice (and actually most mobile GPUs are built directly onto the mobo), but on desktops the added cost of memory, the GPU chip (and licensing), plus more layers on the PCB make this a costly endeavor. These same mobos have a PCIe slot anyway, so it's much more natural to just have a separate budget GPU.

With the advent of chiplets, the CPU package is essentially becoming "the new motherboard" (grossly simplifying things), e.g. the I/O die on AMD's chips plays roughly the role the old north bridge used to. And according to leaks, for instance, AMD's Raphael (the 5nm Ryzen 7xxx series coming in 2022) will presumably offer integrated graphics as an optional chiplet.

TL;DR:

With current CPU architectures, integrated GPUs that are not physically on the processor package can't really use system memory well, and having to solder memory for the integrated GPU onto the motherboard usually isn't worth it on the desktop vs. a discrete GPU.
 
  • Like
Reactions: NTMBK and Tlh97

Gideon

Golden Member
Nov 27, 2007
1,625
3,650
136
They can over PCIe via CCIX or CXL.
That's decidedly next-gen, not current-gen :D

And besides, while possible, routing every single memory request through the PCIe bus adds significant latency. GPUs do hide latency well, but it's still not free. I could see this working better with solutions like Infinity Cache, but overall, just having integrated GPUs on-package and having separate memory for the ones that aren't on-package makes a ton more sense.