Originally posted by: aka1nas
Amdahl's law will quickly bite you hard if you just try to take the same algorithm or piece of work and break it up between increasing numbers of CPU cores or threads. Developers can still take great advantage of more cores by finding additional work for them to do. There's plenty of extra things that would be interesting to simulate to add more realism and depth to games and many of those tasks can be farmed out to extra cores. I for one would like to see real-time text-to-speech engines in games instead of scripted dialogue.
The real problem is that we'll "only" have 8 or 16 cores in a few years.
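The quoted point about Amdahl's law is easy to see with a couple of lines of arithmetic: the serial fraction of a workload caps the speedup no matter how many cores you throw at it. A minimal sketch (function name is mine):

```python
# Amdahl's law: with parallel fraction p and n cores,
# speedup = 1 / ((1 - p) + p / n) -- the serial part is the ceiling.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Even a workload that's 90% parallel only gets ~6.4x from 16 cores,
# and as n -> infinity the limit is 1 / (1 - p) = 10x.
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

Which is why finding *additional* independent work for the extra cores (more parallel fraction) pays off far more than splitting the same task ever finer.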
Which is exactly the idea behind the Sony/Toshiba Cell processor. The only problem being they did it half-assed and wasted enormous amounts of money on research that ended up being fruitless.
They go through all the trouble of developing Cell, then slump on the manufacturing technology, making its premiere product (the PS3) late by proxy.
They cut corners in areas that would have made the CPU an excellent platform for things other than gaming, like making it an in-order CPU.
Early in development they abandoned x86 support.
It's an incredibly complex chip to try to design, and I don't think Sony/Toshiba's team was up to the task, to be frank. They got a product, but it's nothing like it could have been.
It went from a multiplatform, wide-adoption concept to basically the PS3 CPU and some completely proprietary uses for it. Oh wait, you can use the defective ones in TVs and Blu-ray players!
I kind of wandered off the subject, but my point is: if someone were to make a "real Cell" (x86 & x64 support, out-of-order execution, a lot more cache, a smaller process, a PC-based platform), it would be a very dominant CPU.
AMD is kind of flirting with a similar idea, but using specialized cores for things like physics and graphics.
Intel is developing an entire "system on a chip" architecture where you have the CPU, mobo, and memory, and that's it for a fully working system.