smackababy
Lifer
- Oct 30, 2008
See, this is where I start to think that they're insane.
Perhaps on a local 10GbE network, where bandwidth is high and latency is low, this is feasible.
But let's take your example of puffy clouds/etc. How would this 'sync' up with the local game engine and GPU output in any helpful way? What content can it actually stream? If it's just static texture data, maaaybe, but it'll still take time to transfer depending on size. If it's something more complex, well, good luck with that. Developers already struggle to utilize multiple CORES, not just on PCs, where they can argue they won't waste time optimizing when most players are on dual-cores, but on consoles that have multi-core resources to work with.
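Just to put rough numbers on the "it'll still take time to transfer" point (these are my own illustrative figures, not anything official):

# Back-of-envelope: how long does pulling texture data over a typical
# home connection take? Numbers below are assumptions for illustration.
texture_mb = 64            # a modest texture set, in megabytes
downlink_mbps = 25         # optimistic consumer downlink, in megabits/sec

transfer_seconds = (texture_mb * 8) / downlink_mbps
print(f"{texture_mb} MB at {downlink_mbps} Mbps ~= {transfer_seconds:.1f} s")
# -> 64 MB at 25 Mbps ~= 20.5 s, versus effectively instant from local disk/VRAM

That's the kind of gap I mean: local hardware serves that data in milliseconds, the pipe down the street doesn't.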
This is like taking the hyperthreading section of a processor, moving it off the die, moving it down the street, moving it across the country over the internet, and then saying that it'll somehow be useful.
Don't get me wrong: a powerful server network and good infrastructure (which I firmly believe they WILL have) will make for a good user experience for those with good connections. I just severely doubt it will introduce anything that couldn't be done better locally with the right work, and at worst, trying to 'cloud' graphics will screw things up terribly.
I think it will more likely be used to offload some math and physics calculations to free up processor cycles. How it is implemented will be key. Unless you have some crazy rendering farm and you're just streaming the whole game, I don't think it will have a huge impact, but the implications scale as far as the implementation allows.
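To be clear about what I mean by offloading, something along these lines: a non-latency-critical calculation gets shipped off while the local frame loop keeps running, and the result only gets folded in whenever it comes back. This is purely my own sketch of the idea, not how any actual service does it:

# Minimal sketch (my own assumption): offload a slow, non-frame-critical
# calculation and keep rendering while we wait for the result.
import concurrent.futures
import time

def remote_physics_job(params):
    # Stand-in for a cloud call; in reality this would be a network request
    # with real round-trip latency, not a local function.
    time.sleep(0.5)                      # pretend round-trip + compute time
    return sum(p * p for p in params)    # dummy "expensive" result

with concurrent.futures.ThreadPoolExecutor() as pool:
    future = pool.submit(remote_physics_job, [1.0, 2.0, 3.0])
    for frame in range(60):              # game loop keeps ticking meanwhile
        if future.done():
            result = future.result()     # apply the offloaded result late
            print(f"frame {frame}: got offloaded result {result}")
            break
        time.sleep(1 / 60)               # ~16 ms frame budget

Anything that has to land within a single frame can't work like this, which is why I only buy it for slow-moving stuff like background simulation, not for actual rendering.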
