why don't cpu cores integrate stream/shader units

her34

Senior member
with intel and amd both moving to merge gpu with cpu, it seems that they are simply moving the gpu from one location to another and packaging a gpu core with the cpu cores.

instead why don't they integrate a few stream/shader units with each core to truly integrate the gpu? wouldn't that have a greater benefit to software developers?

how did integration happen with math co-processors?
 
wouldn't that have a greater benefit to software developers?
Not if AMD and Intel and Via do it in incompatible ways.

how did integration happen with math co-processors?
Intel spec'd an x87 instruction set that was "standardized". Then, lots of people (Weitek, IDT, etc) built external FPUs, and eventually they were integrated. There is no agreed-upon standard for the types of operations that GPUs tend to do.
 
because that's not what CPUs are supposed to do. Keep in mind that GPUs are often larger than CPUs, so integrating one into a CPU would double its size. In other words, most people would end up paying more for hardware they'd never use. Furthermore, GPUs need HUGE amounts of RAM bandwidth, and CPU sockets don't provide NEARLY enough. CPUs and GPUs want completely different types of RAM, and putting both memory systems on the same motherboard would be nearly impossible given the problems of routing so many traces. So basically it makes absolutely no sense to integrate GPU components onto CPUs except as a least-common-denominator type thing (IE: only enough to run basic programs, not to run fancy games). Also, it should be pointed out that modern CPUs do have vector processing units, which are pretty much the same thing.
 
Originally posted by: CTho9305
Intel spec'd an x87 instruction set that was "standardized". Then, lots of people (Weitek, IDT, etc) built external FPUs, and eventually they were integrated. There is no agreed-upon standard for the types of operations that GPUs tend to do.

1) was the industry hurt by intel defining the standard to be used by all?

2) isn't msft defining the gpu standard, narrowing variability each generation? are shaders going to change much from this point on as they become more and more programmable?
 
Graphics are getting more and more variable with each generation, not the other way around. For a while pretty much everything looked the same since it all went through the same fixed graphics pipeline, but now with all the programmable aspects (and many more built-in options) you have much, much more variability in what you can do. Not to mention the huge increase in computing power makes methods viable today that were too computationally intensive to run in real time only a few years ago.
 