Cloud gaming - the end of the desktop performance PC CPU market?


sephiroth9878

Member
Sep 29, 2012
26
0
0
I think it's a fairly good idea, yet hard copies are always better since you have them in your hand and it's not just data... I think it's mostly an attempt to stop game-sharers, maybe?
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
Cloud computing for gaming is a clear case of ignoring the basics.

In discussions like this, I keep referring to an old and simple document.
RFC1925: http://tools.ietf.org/rfc/rfc1925.txt

The Twelve Networking Truths

(2) No matter how hard you push and no matter what the priority,
you can't increase the speed of light.
The fact that OnLive exists is simply a case of truth number 3.
(3) With sufficient thrust, pigs fly just fine. However, this is
not necessarily a good idea. It is hard to be sure where they
are going to land, and it could be dangerous sitting under them
as they fly overhead.
RFC1925 is from 1996.
But clearly, not enough people have read it.
It might be a document that was written as a joke. But if you don't understand the jokes in it, you should not be designing new networking technology.

The older I get, the more often I think of RFC1925.
In particular, I have developed a strong respect for truth number 8.
(8) It is more complicated than you think.
 
Last edited:

kelco

Member
Aug 15, 2012
76
0
0
Well, I guess Nvidia is smoking crack with their cloud dreams. Wouldn't they be shooting themselves in the foot? Or do they really believe they won't cannibalize the market?
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Well, I guess Nvidia is smoking crack with their cloud dreams. Wouldn't they be shooting themselves in the foot? Or do they really believe they won't cannibalize the market?

I think they would prefer selling to businesses virtualizing GPUs rather than to consumers. Firms will be willing to pay more per card, knowing they can use the same card to service multiple customers.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Will never happen. Most people, including myself, have way too low of a bandwidth cap to make this even remotely possible.

Not to mention every company that has tried this has gone down in flames.
 

NickelPlate

Senior member
Nov 9, 2006
652
13
81
I'm skeptical. There have been a number of companies that have claimed this and tried to make cloud gaming work over the past decade, and they have all failed.

With ISPs starting to put bandwidth caps in place, it seems even less likely to me.

^This. Others have been trying to bring this into the mainstream for quite some time now. I can't remember if it was OnLive that was claiming they could stream the games to the client system without the client system requiring expensive hardware.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,060
3,551
126
So after seeing Carmack's keynote at QuakeCon, and acknowledging that cloud computing is capable of streaming games in real time, will that kill off the performance computing parts market and just leave Xeons and Opterons in the future?

No, it's not possible... OnLive tried it for years and failed horribly.
The stream quality is not great, and there is latency lag.

OnLive being in trouble suggests the opposite; I think there is still a long way to go before cloud gaming becomes really relevant.

Yeah, a lot of people I know who tried it said it was horrible.

It's about good enough for streaming early console-type ports...
But trying to stream something like BF3 is asking for way too much.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
<--- Skeptic of cloud gaming.

I want to see a game server that can handle 300,000 users all "streaming" the game from servers.
I want to know how bad the latency/lag would be if everyone is "streaming" a 1920x1200 window.

I mean, with YouTube you can watch 720p videos without having to wait on the video, so I guess it's possible with a fast connection. However, the amount of server horsepower it'd take to run 300,000 "game" instances would be insane.

Will never happen. Most people, including myself, have way too low of a bandwidth cap to make this even remotely possible.
Yeah, cloud gaming would eat up bandwidth reaaaaaally quickly.

Imagine spending 12 hours gaming - that would be something like 12 hours of watching YouTube at 1080p.


*edit:
I tested a 1080p YouTube video, and it took about 150 MB to stream 4 minutes of video.
That means if you "play" a 1080p video for 1 hour, it would be 60/4 = 15 times as much => 2,250 MB per hour.

So if you play around 8 hours a day, you'd use about ~18 gigabytes of bandwidth per day.
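
A quick back-of-envelope sketch of that estimate in Python (the 150 MB per 4 minutes figure is just my one measurement, so treat the output as rough):

Code:
# Rough bandwidth estimate for streamed 1080p gameplay, extrapolated
# from one measured YouTube sample (~150 MB per 4 minutes of video).
mb_per_4_min = 150

mb_per_hour = mb_per_4_min * (60 / 4)   # 2,250 MB per hour
gb_per_day = mb_per_hour * 8 / 1024     # ~17.6 GB for 8 hours of play

print(f"{mb_per_hour:.0f} MB/hour, about {gb_per_day:.1f} GB per 8-hour day")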
 
Last edited:

kelco

Member
Aug 15, 2012
76
0
0
<--- Skeptic of cloud gaming.

I want to see a game server that can handle 300,000 users all "streaming" the game from servers.
I want to know how bad the latency/lag would be if everyone is "streaming" a 1920x1200 window.

I mean, with YouTube you can watch 720p videos without having to wait on the video, so I guess it's possible with a fast connection. However, the amount of server horsepower it'd take to run 300,000 "game" instances would be insane.

Yeah, cloud gaming would eat up bandwidth reaaaaaally quickly.

Imagine spending 12 hours gaming - that would be something like 12 hours of watching YouTube at 1080p.


*edit:
I tested a 1080p YouTube video, and it took about 150 MB to stream 4 minutes of video.
That means if you "play" a 1080p video for 1 hour, it would be 60/4 = 15 times as much => 2,250 MB per hour.

So if you play around 8 hours a day, you'd use about ~18 gigabytes of bandwidth per day.

I'm sure they'll have compression, and maybe some smart stuff that only sends the changed bits instead of the whole screen each refresh. Reminds me of that graphics card - what was it, the S3? - that only rendered what was visible to the user.
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
<--- Skeptic of cloud gaming.

I want to see a game server that can handle 300,000 users all "streaming" the game from servers.
I want to know how bad the latency/lag would be if everyone is "streaming" a 1920x1200 window.

I mean, with YouTube you can watch 720p videos without having to wait on the video, so I guess it's possible with a fast connection. However, the amount of server horsepower it'd take to run 300,000 "game" instances would be insane.

Yeah, cloud gaming would eat up bandwidth reaaaaaally quickly.

Imagine spending 12 hours gaming - that would be something like 12 hours of watching YouTube at 1080p.


*edit:
I tested a 1080p YouTube video, and it took about 150 MB to stream 4 minutes of video.
That means if you "play" a 1080p video for 1 hour, it would be 60/4 = 15 times as much => 2,250 MB per hour.

So if you play around 8 hours a day, you'd use about ~18 gigabytes of bandwidth per day.

YouTube can be streamed ahead of the point you are watching (called buffering). It is also very lossy. On a PC, a "1080p" image will be uncompressed and cannot be buffered, because it is being generated on the fly.

1920 x 1080 = 2073600 pixels
2073600 x 32 = 66355200 bits
66355200 / 8 = 8294400 bytes
8294400 / 1024 = 8100 kilobytes
8100 / 1024 = 7.91015625 megabytes

Running at 60 FPS, getting the exact same data rate as your PC (not latency, which is different) means you would need to do 474.6 megabytes/second to have the same experience as a PC.

From there you would have to deal with latency, since video card to your eyes is 7-10 ms on good monitors. 50-100 ms latency is "expected" on the internet, so now you have latency in the 60-110 ms range, which for some games is way too much. You can simulate this issue for yourself by telling your living room TV to do post-processing on your video game input. It becomes very obvious that it is there when you flip back and forth. Adding compression to lower the data rate will also increase this latency.
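
The same raw-frame math as a small Python sketch (assuming 32 bits per pixel and 60 FPS, as above):

Code:
# Uncompressed 1080p frame size and the data rate needed at 60 FPS.
width, height = 1920, 1080
bits_per_pixel = 32
fps = 60

frame_bytes = width * height * bits_per_pixel / 8   # 8,294,400 bytes
frame_mb = frame_bytes / 1024 / 1024                 # ~7.91 MB per frame
rate_mb_s = frame_mb * fps                           # ~474.6 MB/s uncompressed

print(f"{frame_mb:.2f} MB per frame, {rate_mb_s:.1f} MB/s at {fps} FPS")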
 

kelco

Member
Aug 15, 2012
76
0
0
YouTube can be streamed ahead of the point you are watching (called buffering). It is also very lossy. On a PC, a "1080p" image will be uncompressed and cannot be buffered, because it is being generated on the fly.

1920 x 1080 = 2073600 pixels
2073600 x 32 = 66355200 bits
66355200 / 8 = 8294400 bytes
8294400 / 1024 = 8100 kilobytes
8100 / 1024 = 7.91015625 megabytes

Running at 60 FPS, getting the exact same data rate as your PC (not latency, which is different) means you would need to do 474.6 megabytes/second to have the same experience as a PC.

From there you would have to deal with latency, since video card to your eyes is 7-10 ms on good monitors. 50-100 ms latency is "expected" on the internet, so now you have latency in the 60-110 ms range, which for some games is way too much. You can simulate this issue for yourself by telling your living room TV to do post-processing on your video game input. It becomes very obvious that it is there when you flip back and forth. Adding compression to lower the data rate will also increase this latency.
Depends on where the compression is happening.
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
I think eventually we'll see something like this with Steam. I can see them branching out so you can play your games on devices other than your computer.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Even in a perfect world with infinite bandwidth at zero latency, cloud gaming will still never take off. Now that Intel has taken an interest in iGPU performance, the delta between what the common consumer has and what the common game requires is going to be much smaller. As GPUs get more complex, the gains they can achieve over an IGP aren't going to work out in the cloud's favor. The difference between a cost-effective streamed cloud setup and just running it locally is only going to diminish.

Even phone GPUs are getting to the point where you wouldn't see any gains from having a super-powerful cloud renderer. Wireless eats battery life - not quite as fast as running the SoC at full power, but most of the battery-life improvements from wireless have already been made. Physics wins again.
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
Depends on where the compression is happening.

And? Depends on what? Your answer doesn't give anything constructive to go on. And compression was addressed in the post.

---

Here are some real-world examples:

http://kdepepo.wordpress.com/2012/01/30/fast-lossless-color-image-compression/

Code:
Method        File Size    Compression    Decompression
uncompressed  46380 KB     -              -
JLS           14984 KB     6.6 s          7.3 s
PNG           16256 KB     42.4 s         2.4 s
IZ            15496 KB     1.2 s          1.3 s

You have 0.016 seconds to compress and decompress each image at 60 FPS. The test image was about 4x the size of a 1080p screen, so even 1.2 s / 4 is 0.3 seconds, which is not even close. That would be good for about 3 FPS with compression, and still a data rate of 3.8 MB/s. You would need at least a 30 Mbps internet connection to get there, and it would be running near 100%.

This is all just to get the same quality as a local PC. With Intel making even the cheapest GPUs able to do this easily, I don't have high hopes for viable cloud gaming yet.
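
The frame-time budget argument as a Python sketch, scaling the blog's IZ timings down by the roughly 4x size difference (my own rough scaling, not a benchmark):

Code:
# Frame-time budget at 60 FPS vs. lossless compression timings from the
# linked post (test image is roughly 4x the pixels of a 1080p frame).
fps_target = 60
budget_s = 1 / fps_target            # ~0.0167 s available per frame

iz_compress_s = 1.2 / 4              # ~0.3 s per 1080p frame (scaled IZ time)
iz_frame_mb = 15496 / 4 / 1024       # ~3.8 MB per compressed frame (scaled)

achievable_fps = 1 / iz_compress_s   # ~3 FPS from compression time alone
print(f"budget {budget_s * 1000:.1f} ms/frame, compression ~{iz_compress_s * 1000:.0f} ms "
      f"=> ~{achievable_fps:.0f} FPS at ~{iz_frame_mb:.1f} MB per frame")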
 
Last edited:

kelco

Member
Aug 15, 2012
76
0
0
And? Depends on what? Your answer doesn't give anything constructive to go on. And compression was addressed in the post.

Well, I'm sure an imaginative engineer can develop some neat tricks to make it happen. Without knowing what those Kepler cloud cards can do, I really am just shooting in the dark. I'm sure they'll have either some sort of CUDA-accelerated compression or maybe even direct hardware-accelerated compression.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Will never happen. Most people, including myself, have way too low of a bandwidth cap to make this even remotely possible.

Not to mention every company that has tried this has gone down in flames.

I thought the same thing about Netflix. No really! For years I was an ardent and staunch member of the "Netflix is never going to fly - who wants postage-stamp movies delivered at 28.8k modem speeds?" camp.

And years went by, and I held to my preconceived notions that the very concept of a Netflix was dead on arrival.

Then one year, while on vacation visiting my sister and her husband, we were watching what I thought was a TV episode of MythBusters. Only when the episode ended did I realize it was Netflix streaming through a Wii. I had no idea it wasn't actual television/cable until then.

And now I am a happy Netflix subscriber.

I can understand the arguments for why the cloud gaming model is a no-go in today's world - but so was Netflix in 1999, when modems were still the dominant connection for home users.

But don't fall victim to this "technology as a snapshot in time" mentality. Bandwidth is improving at ridiculous speeds. Hell, the very fact that bandwidth is prevalent enough that you and I can even have this conversation about the topic is absurd, proof-of-the-pudding type stuff.

Our parents could not have even dreamed of us having this opportunity, just as we cannot hope to dream of the reality that the future holds for our kids and for us aging enthusiasts.

Never say never, time will make a fool of you every chance it can get ;)
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
And? Depends on what? Your answer doesn't give anything constructive to go on. And compression was addressed in the post.

---

Here are some real-world examples:

http://kdepepo.wordpress.com/2012/01/30/fast-lossless-color-image-compression/

Code:
Method        File Size    Compression    Decompression
uncompressed  46380 KB     -              -
JLS           14984 KB     6.6 s          7.3 s
PNG           16256 KB     42.4 s         2.4 s
IZ            15496 KB     1.2 s          1.3 s
You have 0.016 seconds to compress and decompress each image at 60 FPS. The test image was about 4x the size of a 1080p screen, so even 1.2 s / 4 is 0.3 seconds, which is not even close. That would be good for about 3 FPS with compression, and still a data rate of 3.8 MB/s. You would need at least a 30 Mbps internet connection to get there, and it would be running near 100%.

This is all just to get the same quality as a local PC. With Intel making even the cheapest GPUs able to do this easily, I don't have high hopes for viable cloud gaming yet.

Of course, fixed-function hardware could be developed to accelerate the compression.
 

imagoon

Diamond Member
Feb 19, 2003
5,199
0
0
I thought the same thing about Netflix. No really! For years I was an ardent and staunch member of the "Netflix is never going to fly - who wants postage-stamp movies delivered at 28.8k modem speeds?" camp.

And years went by, and I held to my preconceived notions that the very concept of a Netflix was dead on arrival.

Then one year, while on vacation visiting my sister and her husband, we were watching what I thought was a TV episode of MythBusters. Only when the episode ended did I realize it was Netflix streaming through a Wii. I had no idea it wasn't actual television/cable until then.

And now I am a happy Netflix subscriber.

I can understand the arguments for why the cloud gaming model is a no-go in today's world - but so was Netflix in 1999, when modems were still the dominant connection for home users.

But don't fall victim to this "technology as a snapshot in time" mentality. Bandwidth is improving at ridiculous speeds. Hell, the very fact that bandwidth is prevalent enough that you and I can even have this conversation about the topic is absurd, proof-of-the-pudding type stuff.

Our parents could not have even dreamed of us having this opportunity, just as we cannot hope to dream of the reality that the future holds for our kids and for us aging enthusiasts.

Never say never, time will make a fool of you every chance it can get ;)

I think the main issue is that gaming is two-way vs. one-way. Netflix can take the time to build a well-tuned, lossy compressed file that can be sent to the Wii at a decent rate. When you factor in gaming, the speed of light starts to apply, in the sense that 40-50 ms is pretty destructive. It takes about 135 ms for light to go around the earth (in a vacuum... glass fiber would be slower), and that excludes processing on the routers, etc. Netflix can handle 135 ms of latency and you as the end user would never care. However, 135 ms from the server to the screen and then a 135 ms response back to the server is 270 ms, which is very noticeable to an end user. The issue is more of a physics problem at the moment than a bandwidth one.
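
A rough propagation-delay sketch in Python (assuming light in fiber travels at about two-thirds of c; the distances are just illustrative):

Code:
# Best-case round-trip propagation delay over fiber, ignoring router
# queuing, encoding, and decoding time entirely.
C_VACUUM_KM_S = 299_792
C_FIBER_KM_S = C_VACUUM_KM_S * 2 / 3   # light is roughly 1/3 slower in glass

def round_trip_ms(one_way_km):
    """One-way distance in km -> round-trip latency in milliseconds."""
    return 2 * one_way_km / C_FIBER_KM_S * 1000

for km in (100, 1000, 5000, 20000):
    print(f"{km:>6} km one way: ~{round_trip_ms(km):5.1f} ms round trip")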
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I think the main issue is that gaming is two-way vs. one-way. Netflix can take the time to build a well-tuned, lossy compressed file that can be sent to the Wii at a decent rate. When you factor in gaming, the speed of light starts to apply, in the sense that 40-50 ms is pretty destructive. It takes about 135 ms for light to go around the earth (in a vacuum... glass fiber would be slower), and that excludes processing on the routers, etc. Netflix can handle 135 ms of latency and you as the end user would never care. However, 135 ms from the server to the screen and then a 135 ms response back to the server is 270 ms, which is very noticeable to an end user. The issue is more of a physics problem at the moment than a bandwidth one.

If the ping is that long then yes, no question, the model itself won't take off for those games which are dependent on latency (FPS). But there is a large group of people who don't need that - they already play FarmVille.

My point is more to say that obviously the solution here is quantum entanglement, as that negates space-time latency at its core. Er, wait, that's not what I meant; what I meant was that they will need to put the servers closer to their customers to ensure that the ping is not deleterious.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I thought the same thing about Netflix. No really! For years I was an ardent and staunch member of the "Netflix is never going to fly - who wants postage-stamp movies delivered at 28.8k modem speeds?" camp.

And years went by, and I held to my preconceived notions that the very concept of a Netflix was dead on arrival.

Then one year, while on vacation visiting my sister and her husband, we were watching what I thought was a TV episode of MythBusters. Only when the episode ended did I realize it was Netflix streaming through a Wii. I had no idea it wasn't actual television/cable until then.

And now I am a happy Netflix subscriber.

I can understand the arguments for why the cloud gaming model is a no-go in today's world - but so was Netflix in 1999, when modems were still the dominant connection for home users.

But don't fall victim to this "technology as a snapshot in time" mentality. Bandwidth is improving at ridiculous speeds. Hell, the very fact that bandwidth is prevalent enough that you and I can even have this conversation about the topic is absurd, proof-of-the-pudding type stuff.

Our parents could not have even dreamed of us having this opportunity, just as we cannot hope to dream of the reality that the future holds for our kids and for us aging enthusiasts.

Never say never, time will make a fool of you every chance it can get ;)

I have been working in the ISP business for the last 10 years, and I work in a country that is 5-10 years ahead of the USA in terms of internet infrastructure. There is no way it's going to be viable here for the next 10 years, not to mention the logistics needed. We already host Akamai servers today, so we know a bit about distribution.

Streaming movies is also extremely forgiving due to the one-way communication. A small hiccup is buffered away, and there is no input from you to the server that needs processing before coming back. You can stream a movie for basically no cost due to the simplicity of the hardware and software: a $500 PC can stream multiple movies to thousands of active users when the content is cached in cheap memory. A $500 PC today can barely play a game for one user.

Not to mention, you still need a PC at home...

And lastly, let's rewind to 2010 and OnLive, which AMD invested in. It went bankrupt.

Also, just think about it: even with a high-end card like a GTX 680, you can't really share it with more than two users. With two users they are already down to 1 GB, a 128-bit bus, and 768 SPs each, plus some switching overhead. Compare that to the price of a PC that can stream movies out to thousands of users... ouch. That ROI is just doomed unless people start playing 1990 games on 2012 hardware.
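
Following the same naive per-user split, a tiny Python sketch (GTX 680 launch specs: 2 GB VRAM, 256-bit bus, 1536 SPs):

Code:
# Naive even split of a GTX 680's resources between concurrent users,
# mirroring the halving logic above (ignores scheduling overhead).
vram_gb, bus_bits, shader_processors = 2, 256, 1536

for users in (1, 2, 4):
    print(f"{users} user(s): {vram_gb / users:.1f} GB, "
          f"{bus_bits // users}-bit share, {shader_processors // users} SPs each")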
 
Last edited:

Hmoobphajej

Member
Apr 8, 2011
102
0
76
I think the problem is going to be mostly latency with gamers. It would require a lot of dedicated servers for players to play on and i don't think there is a real company out there that wants to dedicated all that money to it. At the same time bandwidth is going to be a problem. I understand that the internet is getting faster but at the same time I don't think ISPs are letting go of their bandwidth cap. Where I live we're capped at roughly 250GB a month. I once went over it by a lot... it was not a pretty call we received from the ISP. I would assume that if you had a family of gamers or even just roommates of gamers it would spill doom for us if we all moved on to cloud gaming.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
I have been working in the ISP business for the last 10 years, and I work in a country that is 5-10 years ahead of the USA in terms of internet infrastructure. There is no way it's going to be viable here for the next 10 years, not to mention the logistics needed. We already host Akamai servers today, so we know a bit about distribution.

Streaming movies is also extremely forgiving due to the one-way communication. A small hiccup is buffered away, and there is no input from you to the server that needs processing before coming back. You can stream a movie for basically no cost due to the simplicity of the hardware and software: a $500 PC can stream multiple movies to thousands of active users when the content is cached in cheap memory. A $500 PC today can barely play a game for one user.

Not to mention, you still need a PC at home...

And lastly, let's rewind to 2010 and OnLive, which AMD invested in. It went bankrupt.

Also, just think about it: even with a high-end card like a GTX 680, you can't really share it with more than two users. With two users they are already down to 1 GB, a 128-bit bus, and 768 SPs each, plus some switching overhead. Compare that to the price of a PC that can stream movies out to thousands of users... ouch. That ROI is just doomed unless people start playing 1990 games on 2012 hardware.

While we've certainly lost the internet speed edge we had 15 years ago, I'd say your perspective is a bit harsh.

The monopoly of a certain company is the only thing holding us back ;).


I find it funny that you use an argument like "a single CARD cannot service multiple users."
Nvidia is moving heavily towards a more threaded architecture at the hardware level - then it's just the same problem as on the CPU side:
uber-fast "long distance" interconnects.


Anyhow, the biggest issue is latency and correction for packet loss.
We're just NOT there in terms of technology to keep this stable in a scaled environment (imagine a game streaming service à la CDNs around the world) - including enough upstream to the server environment without speed issues - when compared to a local desktop.

We might be at an acceptable level for casual players, as the OnLive-type companies have shown, but whenever something like this launches it's going to have to be "community/PR" accepted by us geeks first to gain mainstream traction.

That won't happen until latency is within 99% of a local setup.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
If the ping is that long then yes, no question, the model itself won't take off for those games which are dependent on latency (FPS). But there is a large group of people who don't need that - they already play FarmVille.

My point is more to say that obviously the solution here is quantum entanglement, as that negates space-time latency at its core. Er, wait, that's not what I meant; what I meant was that they will need to put the servers closer to their customers to ensure that the ping is not deleterious.


Let's ignore the whole latency issue and pretend that it is acceptable. At that point, it's the VDI or app presentation model, but with a workload that requires special equipment (a GPU) that doesn't allow for resource sharing on that particular piece of equipment. That puts it in a realm even worse than VDI, and regardless of what you may hear, no one is ever implementing VDI for cost savings. Sure, you only have to deal with peak load, but the gear is a lot more expensive than home systems.

Now let's talk processing, delivery, etc.

The Netflix analogy really doesn't fly. Netflix is purely a bandwidth problem. Given enough bandwidth, and even marginal latency, you can watch a movie.

Remote gaming is much more involved. You have rendering, *then* you have to be able to encode the video. Only at that point can you send it (all Netflix videos are pre-encoded for the minimum bandwidth needed). Once the data is there, the remote system has to decode it, display it, and only then can the player act upon it. I've seen OnLive before, and the visual fidelity was pretty much garbage in order to encode fast enough to keep latency low enough that it didn't totally destroy the experience.

Now, the speed of light isn't going to change, but the encoding time will, over time, drop. The decoding time will as well, though they'll never be 0. While there is room for improvement, there is no magic sauce that will remove the latency entirely, and as we're trending towards (slowly) higher resolution displays, the encoding improvements may or may not be entirely offset by having to drive higher expected resolutions.

Netflix is easy: pre-encode in the most efficient but reasonable-to-decode format, and add enough bandwidth. That just doesn't translate into an interactive medium.
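
To make that concrete, here is a minimal Python sketch of the round trip; every per-stage number is an illustrative placeholder, not a measurement:

Code:
# Illustrative remote-gaming round trip: every stage adds latency, and
# only the network legs shrink by moving servers closer to the player.
stages_ms = {
    "input -> server (network)":   30,
    "game simulation + render":    16,
    "encode frame":                10,
    "server -> client (network)":  30,
    "decode frame":                 5,
    "display":                      8,
}

for stage, ms in stages_ms.items():
    print(f"{stage:<28} {ms:>3} ms")
print(f"{'total input-to-photon':<28} {sum(stages_ms.values()):>3} ms")  # ~99 ms in this example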
 

kelco

Member
Aug 15, 2012
76
0
0
Let's ignore the whole latency issue and pretend that it is acceptable. At that point, it's the VDI or app presentation model, but with a workload that requires special equipment (a GPU) that doesn't allow for resource sharing on that particular piece of equipment. That puts it in a realm even worse than VDI, and regardless of what you may hear, no one is ever implementing VDI for cost savings. Sure, you only have to deal with peak load, but the gear is a lot more expensive than home systems.

Now let's talk processing, delivery, etc.

The Netflix analogy really doesn't fly. Netflix is purely a bandwidth problem. Given enough bandwidth, and even marginal latency, you can watch a movie.

Remote gaming is much more involved. You have rendering, *then* you have to be able to encode the video. Only at that point can you send it (all Netflix videos are pre-encoded for the minimum bandwidth needed). Once the data is there, the remote system has to decode it, display it, and only then can the player act upon it. I've seen OnLive before, and the visual fidelity was pretty much garbage in order to encode fast enough to keep latency low enough that it didn't totally destroy the experience.

Now, the speed of light isn't going to change, but the encoding time will, over time, drop. The decoding time will as well, though they'll never be 0. While there is room for improvement, there is no magic sauce that will remove the latency entirely, and as we're trending towards (slowly) higher resolution displays, the encoding improvements may or may not be entirely offset by having to drive higher expected resolutions.

Netflix is easy: pre-encode in the most efficient but reasonable-to-decode format, and add enough bandwidth. That just doesn't translate into an interactive medium.

The only way to beat the speed of light is to know what happens in the future.