
Intel's reaction to the ATI merger?

Viditor

Diamond Member
Inq article

"Imagination chief exec Hossein Yassaie explained that the companies will work together to put graphics and video capabilities into Intel microprocessors"

I don't know much about Imagination...does anyone here know their reputation?
 
Think PowerVR. I believe they own that technology now.

They are known for making power-efficient GPUs. Probably the most well-known product PowerVR was involved in was the Dreamcast console. It was a good system despite its commercial failure.
 
I think Intel may be worried about the ATI/AMD merger. However, AMD would be idiotic to make ATI cards and chipsets only for AMD motherboards and processors... they would lose a lot more than they would gain. I don't have a problem with AMD stuff, but I've never really used it and don't plan on using it unless they come out with something very enticing. I love my ATI CrossFire cards and would be disappointed if I could no longer get ATI cards... but if I were forced to get Nvidia SLI instead of CrossFire, I would. I just hope AMD isn't stupid enough to make ATI's architecture only for AMD boards and processors. I personally don't think they will, but you never know.
 
Intel has already had a stake in PowerVR for quite a while. All their 9xx series GMA onboard parts were more or less based on PowerVR technology. Obviously they weren't high performance, because they weren't targeted at high performance. PowerVR's last performance designs were the PowerVR2, which powered the Dreamcast, and the Series 3 core found in the venerable Kyro II from Hercules. For its seemingly low-end specs, the Kyro II actually performed very well due to a very efficient engine design. This was during the GeForce2 GTS era, but it managed to pull respectable numbers on par with a GeForce DDR. However, things have changed dramatically during the last few years. I don't know if they can pull off another performance (or even mainstream) discrete design.

And BTW, it's not a "buy" per se. It's only a 10% stake, a whole lot different from an ATI merger deal. Another interesting tidbit is that after Creative disbanded the 3Dlabs team, most of them went to Intel.
 
Please note that the key point here is integration onto the CPU...this is widely believed to be a major reason for the ATI merger.
 
Originally posted by: Viditor
Please note that the key point here is integration onto the CPU...this is widely believed to be a major reason for the ATI merger.

The last time Intel tried that was Timna, and we all know where that ended up... axed. The last time an integrated CPU/GPU was on-die and commercially available was the Cyrix MediaGX, and that was rather subpar, to say the least. I'm not sure *why* anyone would want to make an integrated CPU/GPU (there was a debate on the Ars forum about this). It's certainly NOT for the performance market or even mainstream. It's for the budget/value segment. Now the question is, how much can it cut costs? Low-end Dell systems with OS/monitor/keyboard+mouse/printer can be had for $300.

I see this acquisition more as being about the CPU and GPU working closer together, maybe even sharing a common protocol (Torrenza/CSI/FSB), than anything else. Needless to say, such a link would be much faster than the standard PCI-E bus.
 
Originally posted by: dexvx
Originally posted by: Viditor
Please note that the key point here is integration onto the CPU...this is widely believed to be a major reason for the ATI merger.

The last time Intel tried that was Timna, and we all know where that ended up... axed. The last time an integrated CPU/GPU was on-die and commercially available was the Cyrix MediaGX, and that was rather subpar, to say the least. I'm not sure *why* anyone would want to make an integrated CPU/GPU (there was a debate on the Ars forum about this). It's certainly NOT for the performance market or even mainstream. It's for the budget/value segment. Now the question is, how much can it cut costs? Low-end Dell systems with OS/monitor/keyboard+mouse/printer can be had for $300.

I see this acquisition more as being about the CPU and GPU working closer together, maybe even sharing a common protocol (Torrenza/CSI/FSB), than anything else. Needless to say, such a link would be much faster than the standard PCI-E bus.

I would say that the market to benefit the most would be the mobile market...
That said, if you've been watching the more recent AMD/ATI foils, they've been showing multiple GPUs on-die (I saw one with 3 GPUs on-die). An argument certainly could be made for the performance market there, but I suspect that it's the mobile market which is the target here...
Another target would be the industrial SOC market...
 
Well, those Kyro GPUs are very efficient memory-bandwidth-wise, which is perfect for integrated chipsets. They used the tile-based rendering method.

Intel sure could be competitive with them. Their chipset plants will be on 90nm, a more advanced 90nm process than most of the TSMC plants that Nvidia/ATI use at 90nm. So I wouldn't doubt Intel one bit.
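For readers unfamiliar with why tile-based rendering saves memory bandwidth: the screen is split into small tiles, triangles are binned into the tiles they touch, and each tile is then shaded entirely in fast on-chip memory before being written out once. Here is a rough Python sketch of just the binning step; the tile size and the bounding-box triangle representation are illustrative, not PowerVR's actual scheme.

```python
# Minimal sketch of tile binning, the first stage of a tile-based renderer.
# Triangles are approximated by screen-space bounding boxes; real hardware
# also performs exact triangle/tile intersection tests.

TILE = 32  # tile size in pixels (hypothetical value for illustration)

def bin_triangles(tris, width, height):
    """Map each triangle index to every tile its bounding box overlaps."""
    cols = (width + TILE - 1) // TILE
    rows = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(rows) for tx in range(cols)}
    for i, (x0, y0, x1, y1) in enumerate(tris):
        for ty in range(y0 // TILE, min(y1 // TILE, rows - 1) + 1):
            for tx in range(x0 // TILE, min(x1 // TILE, cols - 1) + 1):
                bins[(tx, ty)].append(i)
    return bins

# Two overlapping triangles on a 64x64 screen (a 2x2 grid of tiles).
tris = [(0, 0, 40, 40), (20, 20, 60, 60)]
bins = bin_triangles(tris, 64, 64)
# Each tile is then shaded in on-chip memory, so the external framebuffer
# is written once per tile rather than re-read per overlapping triangle.
print(bins[(0, 0)], bins[(1, 1)])  # both tiles hold triangles [0, 1]
```

The bandwidth win comes from the shading stage: because all triangles touching a tile are known up front, overdraw can be resolved on-chip and only final pixels reach external memory.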
 
Originally posted by: Viditor
I would say that the market to benefit the most would be the mobile market...
That said, if you've been watching the more recent AMD/ATI foils, they've been showing multiple GPUs on-die (I saw one with 3 GPUs on-die). An argument certainly could be made for the performance market there, but I suspect that it's the mobile market which is the target here...
Another target would be the industrial SOC market...

One of the primary issues here is die space. The bigger the die, the lower the yields. One of the arguments is that if a GPU and CPU share a die, you're going to have either a subpar GPU or a subpar CPU (compared to a dedicated 100% CPU die and 100% GPU die). Of course, this is in relation to current/past technology trends. The mobile market is another story, where performance doesn't matter as much.
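The die-size/yield relationship above can be made concrete with the classic Poisson yield model, Y = exp(-A * D0), where A is die area and D0 is defect density. The numbers below are purely illustrative, not real Intel or AMD figures, but they show how combining a CPU and GPU onto one larger die hurts yield disproportionately.

```python
import math

def poisson_yield(area_mm2, defect_density_per_mm2):
    """Simple Poisson die-yield model: Y = exp(-A * D0)."""
    return math.exp(-area_mm2 * defect_density_per_mm2)

D0 = 0.005  # hypothetical defect density, defects per mm^2

cpu_only = poisson_yield(140, D0)        # standalone CPU die
cpu_plus_gpu = poisson_yield(280, D0)    # on-die GPU doubles the area

print(f"140 mm^2 die yield: {cpu_only:.1%}")      # ~49.7%
print(f"280 mm^2 die yield: {cpu_plus_gpu:.1%}")  # ~24.7%
```

Doubling the area more than halves the yield in this model, which is why a shared die tends to force compromises on one side or the other.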
 
Originally posted by: dexvx
Originally posted by: Viditor
I would say that the market to benefit the most would be the mobile market...
That said, if you've been watching the more recent AMD/ATI foils, they've been showing multiple GPUs on-die (I saw one with 3 GPUs on-die). An argument certainly could be made for the performance market there, but I suspect that it's the mobile market which is the target here...
Another target would be the industrial SOC market...

One of the primary issues here is die space. The bigger the die, the lower the yields. One of the arguments is that if a GPU and CPU share a die, you're going to have either a subpar GPU or a subpar CPU (compared to a dedicated 100% CPU die and 100% GPU die). Of course, this is in relation to current/past technology trends. The mobile market is another story, where performance doesn't matter as much.

Timna died because of RDRAM. I do think it would have been a decent performer; after all, it was designed by the same team that brought us Banias.
 