
News: Intel GPUs - we've given up on B770, where's Celestial already?

Page 37 - AnandTech Forums
I actually hope it doesn't have much crypto support so they get the gaming side developed.

The problem is that PoW algorithms are really just generic GPGPU workloads. Unless Intel goes the Nvidia route and tries to disable certain algorithms in the driver, not having much "crypto support" means the Intel consumer dGPUs could be pretty broken for all GPGPU acceleration.
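To illustrate the point, here's a toy proof-of-work loop (a sketch, not any real coin's algorithm): the hot path is nothing but hashing and integer comparisons, which is exactly the kind of generic throughput work any GPGPU kernel does, so a driver can't easily single it out.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 1_000_000):
    """Toy proof-of-work: find a nonce whose SHA-256 digest is below a target."""
    target = 1 << (256 - difficulty_bits)  # more difficulty bits = smaller target = harder
    for nonce in range(max_nonce):
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:  # plain integer compare
            return nonce, digest.hex()
    return None, None

# The inner loop is generic hash-and-compare work, indistinguishable
# from ordinary compute as far as the hardware is concerned.
nonce, digest = mine(b"block header", difficulty_bits=16)
```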
 
The problem is that PoW algorithms are really just generic GPGPU workloads. Unless Intel goes the Nvidia route and tries to disable certain algorithms in the driver, not having much "crypto support" means the Intel consumer dGPUs could be pretty broken for all GPGPU acceleration.
Yes, that was the worry about Nvidia throttling mining speed. Not that Nvidia minds segmenting their line even more, but there is the danger that if general GPU compute gets too slow, people will just use CPUs.
 
Yes, that was the worry about Nvidia throttling mining speed. Not that Nvidia minds segmenting their line even more, but there is the danger that if general GPU compute gets too slow, people will just use CPUs.

Well, not only that, but some GPGPU applications are "legit" and don't lead to massive shortages of consumer cards. Anyone wanting to play with FP64 loved the Radeon VII.
 
You know, this would have been the ideal time for Intel to get their GPUs out. Everyone is so desperate for cards that they could have grabbed decent market share right off the bat.
 
You know, this would have been the ideal time for Intel to get their GPUs out. Everyone is so desperate for cards that they could have grabbed decent market share right off the bat.
Not a single chance of that happening if they use TSMC N6 as reported...
 
Not a single chance of that happening if they use TSMC N6 as reported...

There will definitely be a dynamic component to it, but there's a lot of static allocation, also known as minimum orders.

So if everyone turns their backs on Intel GPUs there will be plenty of them available. Makes sense?

Just because AMD/Nvidia GPUs are unavailable doesn't mean Intel will end up the same way. Of course the parts shortage extends to semiconductors in general, not just TSMC, but pinning all the blame on TSMC and assuming it will apply to an entirely different product makes little sense.

Intel will order some amount X, and depending on demand an additional amount Y will be added and/or granted by TSMC. Because if whatever they can order is entirely at the mercy of TSMC's shortfall, then they are totally screwed and it was a stupid business decision.

Maybe we should think about it the other way around: perhaps Intel is ordering so many chips from TSMC that it's contributing to the AMD/Nvidia shortage? It goes both ways.
 
Intel has been shouting about this stuff for over 3 years and... nothing. Their last reply was integrated graphics. Uh, not discrete, but you are Intel.
 
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard? I always assumed that's what built-in video actually was, until I went AMD and found out the hard way. Is it a patent thing, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?
 
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard?

Like put a GTX 1050 and GDDR5 memory on the motherboard?

It has been done before, but it's pretty rare. It would raise the price of the board, though, and you can't unplug and upgrade.
 
Like put a GTX 1050 and GDDR5 memory on the motherboard?

It has been done before, but it's pretty rare. It would raise the price of the board, though, and you can't unplug and upgrade.

Yeah, to my admittedly amateur understanding there simply isn't a market for it: basic users don't want to pay extra for mid-range performance, and the upper end doesn't want built-in 1050-level performance.
I suspect cooling is also a huge challenge, because isn't that area of the board really close to the CPU?
 
Come to think of it, what's stopping motherboard manufacturers from integrating graphics right into the motherboard? I always assumed that's what built-in video actually was, until I went AMD and found out the hard way. Is it a patent thing, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?

They do, but it's mostly on server boards, which have 2D-only graphics. Very low end.
 
Is it a patent thing, where they wouldn't be allowed to because Intel, AMD and Nvidia own all the rights?

Patent/branding would only matter if they were rebranding the thing entirely. But in this case they can simply sell it while crediting the manufacturer and its brand name.

For example: AMD X570 chipset with onboard GeForce GTX 1050 graphics.
 
Come to think of it what's stopping motherboard manufacturers from integrating graphics right into the motherboard? I always assumed this is what built on video actually was until I went AMD and found out the hard way. I guess is it a patent thing where they would not be allowed to because Intel, AMD and Nvidia own all the rights?
In the olden days, processors used to communicate with memory via a separate chip on the motherboard (called the north bridge). Back then you could easily just add an integrated GPU to that chip (which had the memory controller anyway) and be done with it.

Once the memory controller was integrated into the CPU (ever since the Athlon 64), this became much more complex. There were still some motherboards that offered integrated GPU options, but those GPUs were already connected to system memory via PCIe, which is obviously less than ideal.

The alternative is to solder (GDDR) memory directly onto the motherboard for the GPU. On laptops this is common practice (and in fact most mobile GPUs are built directly onto the mobo), but on desktops the added cost of memory, the GPU chip (and licensing), plus extra PCB layers make this a costly endeavor. These same motherboards have a PCIe slot anyway, so it's much more natural to just use a separate budget GPU.

With the advent of chiplets, the CPU package is essentially becoming "the new motherboard" (grossly simplifying things): the I/O die on AMD's chips, for instance, serves roughly the same role the old north bridge used to. And according to leaks, AMD's Raphael (5nm Ryzen 7xxx series coming in 2022) will presumably offer integrated graphics as an optional chiplet.

TL;DR:

With current CPU architectures, integrated GPUs that are not physically on the processor package can't really use system memory well. And soldering memory for the integrated GPU onto the motherboard usually isn't worth it on the desktop versus a discrete GPU.
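A quick back-of-envelope comparison (my own numbers, based on the published PCIe 4.0 link rate and the GTX 1050's 128-bit GDDR5 at 7 Gbps per pin) shows why feeding a board-level GPU from system memory over PCIe is a poor substitute for soldered GDDR:

```python
# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding, x16 link, per direction.
pcie4_x16_gbs = 16 * (128 / 130) * 16 / 8  # ~31.5 GB/s

# GTX 1050-style local memory: 128-bit GDDR5 bus at 7 Gbps per pin.
gddr5_gbs = 7 * 128 / 8                    # 112 GB/s

print(f"PCIe 4.0 x16 : {pcie4_x16_gbs:5.1f} GB/s")
print(f"128-bit GDDR5: {gddr5_gbs:5.1f} GB/s")
print(f"local memory is ~{gddr5_gbs / pcie4_x16_gbs:.1f}x wider")
```

Even on a next-gen x16 link, the soldered memory has several times the bandwidth, which is why the "GPU on the motherboard" only makes sense with its own memory.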
 
They can over PCIe via CCIX or CXL.
That's decidedly next-gen, not current-gen 😀

And besides, while possible, routing every single memory request through the PCIe bus adds significant latency. GPUs do hide latency well, but it's still not free. I could see this working better with solutions like Infinity Cache, but overall, having integrated GPUs on-package, and giving separate memory to the ones that aren't, makes a ton more sense.
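"Hiding" that latency means keeping enough requests in flight, and by Little's law the required in-flight data is bandwidth × latency. A sketch with ballpark latencies I'm assuming for illustration (roughly 100 ns for local GDDR, on the order of 1 µs for a PCIe round trip):

```python
def bytes_in_flight(bandwidth_gb_s: float, latency_ns: float) -> float:
    # Little's law: (GB/s) * (ns) conveniently cancels to plain bytes.
    return bandwidth_gb_s * latency_ns

# Assumed ballpark figures, not measurements:
local = bytes_in_flight(112.0, 100)  # soldered GDDR5: ~11 KB outstanding
pcie = bytes_in_flight(31.5, 1000)   # over PCIe: ~31 KB outstanding, for ~1/3 the bandwidth
```

The PCIe path needs roughly 3x the outstanding data just to sustain a third of the bandwidth, which is the sense in which latency hiding "isn't free".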
 