sephiroth9878
- Sep 29, 2012
I think it's a fairly good idea, but hard copies are always better since you have something in your hand and it's not just data... I think it's mostly to try and stop game-sharers, maybe?
The fact that OnLive exists is simply a case of truth number 3. From RFC 1925 (1996), The Twelve Networking Truths:
(2) No matter how hard you push and no matter what the priority, you can't increase the speed of light.
(3) With sufficient thrust, pigs fly just fine. However, this is not necessarily a good idea. It is hard to be sure where they are going to land, and it could be dangerous sitting under them as they fly overhead.
(8) It is more complicated than you think.
Well, I guess Nvidia is smoking crack with their cloud dreams. Wouldn't they be shooting themselves in the foot? Or do they really believe they won't cannibalize their own market?
I'm skeptical. There have been a number of companies that have claimed this and tried to make cloud gaming work over the past decade, and they have all failed.
With ISPs starting to put bandwidth caps in place, it seems even less likely to me.
So after seeing Carmack's keynote at QuakeCon, and acknowledging that cloud computing is capable of streaming games in real time, will that kill off the performance-computing parts market and just leave Xeons and Opterons in the future?
OnLive being in trouble suggests the opposite; I think there is still a long way to go before cloud gaming becomes really relevant.
Yeah, cloud gaming would eat up bandwidth really quickly.
Will never happen. Most people, including myself, have way too low a bandwidth cap to make this even remotely possible.
<--- Skeptic of cloud gaming.
I want to see a game server that can handle 300,000 users all "streaming" the game from servers.
I want to know how bad the latency/lag would be if everyone is "streaming" a 1920x1200 window.
I mean, with YouTube you can watch 720p videos without having to wait on the video, so I guess it's possible with a fast connection; however, the amount of server horsepower it would take to run 300,000 "game" instances would be insane.
Yeah, cloud gaming would eat up bandwidth really quickly.
Imagine spending 12 hours gaming; that would be something like 12 hours of watching YouTube at 1080p.
*edit:
I tested a 1080p YouTube video and it took about 150 MB to stream 4 minutes of video.
That means if you "play" 1080p video for 1 hour it would be 60/4 = 15 times as much => 2,250 MB per hour.
So if you play about 8 hours a day, you'd use roughly 18 GB of bandwidth per day.
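For anyone who wants to sanity-check that, here is a rough Python sketch of the same arithmetic (the 150 MB per 4 minutes figure is just my one-off measurement, not a proper benchmark):
Code:
# Back-of-the-envelope check of the YouTube bandwidth figures above.
# Assumption: ~150 MB observed for ~4 minutes of 1080p video (one-off test).
mb_per_4_min = 150
mb_per_hour = mb_per_4_min * (60 / 4)      # ~2250 MB per hour
gb_per_day_8h = mb_per_hour * 8 / 1024     # ~17.6 GB for 8 hours of play
print(f"{mb_per_hour:.0f} MB/hour, ~{gb_per_day_8h:.1f} GB per 8-hour day")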
Depends on where the compression is happening. YouTube can be streamed ahead of the point you are watching (called buffering). It is also very lossy. On a PC, a "1080p" image will be uncompressed and cannot be buffered, because it is being generated on the fly.
1920 x 1080 = 2073600 pixels
2073600 x 32 = 66355200 bits
66355200 / 8 = 8294400 bytes
8294400 / 1024 = 8100 kilobytes
8100 / 1024 = 7.91015625 megabytes
Running at 60 FPS, to get the exact same data rate as your PC (not latency, which is a separate issue), you would need to push 474.6 megabytes per second to have the same experience as a PC.
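The same arithmetic in a few lines of Python, for anyone who wants to try other resolutions or colour depths:
Code:
# Raw (uncompressed) 1080p frame size and 60 FPS data rate, as above.
width, height = 1920, 1080
bits_per_pixel = 32                        # 8 bits per channel, RGBA
frame_bytes = width * height * bits_per_pixel // 8
frame_mb = frame_bytes / 1024 / 1024       # ~7.91 MB per uncompressed frame
rate_mb_s = frame_mb * 60                  # ~474.6 MB/s at 60 FPS
print(f"{frame_mb:.2f} MB per frame, {rate_mb_s:.1f} MB/s at 60 FPS")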
From there you would have to deal with latency, since video card to your eyes is 7-10 ms on good monitors. 50-100 ms of latency is "expected" on the internet, so now you have latency in the 60-110 ms range, which for some games is way too much. You can simulate this issue for yourself by telling your living room TV to do post-processing on your video game's input; it becomes very obvious when you flip back and forth. Adding compression to lower the data rate will also increase this latency.
Depends on where the compression is happening.
Method File Size Compression Decompression
uncompressed 46380 KB - -
JLS 14984 KB 6.6 s 7.3 s
PNG 16256 KB 42.4 s 2.4 s
IZ 15496 KB 1.2 s 1.3 s
And? Depends on what? Your answer doesn't give anything constructive to go on. And compression is addressed in the post.
Will never happen. Most people, including myself, have way too low a bandwidth cap to make this even remotely possible.
Not to mention every company that has tried this has gone down in flames.
And? Depends on what? Your answer doesn't give anything constructive to go on. And compression is addressed in the post.
---
Here are some real-world examples:
http://kdepepo.wordpress.com/2012/01/30/fast-lossless-color-image-compression/
You have 0.016 seconds to compress and decompress the image at 60 FPS. The test image was about 4x the size of a 1080p frame, so even 1.2 s / 4 is 0.3 seconds, which is not even close. That would be good for about 3 FPS with compression, and still a data rate of 3.8 MB/s; you would need at least a 30 Mbps internet connection to get there and it would be running near 100%.
Code:
Method        File Size  Compression  Decompression
uncompressed  46380 KB   -            -
JLS           14984 KB   6.6 s        7.3 s
PNG           16256 KB   42.4 s       2.4 s
IZ            15496 KB   1.2 s        1.3 s
This is all just to get the same quality as a local PC. With Intel making even the cheapest GPUs able to do this easily, I don't have high hopes for viable cloud gaming yet.
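Roughly, in Python, using the IZ row of the table and assuming compression time scales linearly with image area:
Code:
# Frame-time budget vs. the lossless compression timings quoted above.
# Assumptions: compression time scales with image area, and the test image
# is roughly 4x the area of a 1080p frame (the estimate used above).
fps_target = 60
budget_ms = 1000 / fps_target              # ~16.7 ms per frame
iz_compress_s = 1.2 / 4                    # ~0.3 s per 1080p frame (IZ row)
achievable_fps = 1 / iz_compress_s         # ~3.3 FPS from compression alone
frame_mb_compressed = (15496 / 1024) / 4   # ~3.8 MB per compressed frame
print(f"budget {budget_ms:.1f} ms, IZ needs {iz_compress_s * 1000:.0f} ms")
print(f"=> ~{achievable_fps:.1f} FPS, ~{frame_mb_compressed:.1f} MB per frame")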
I thought the same thing about Netflix. No, really! For years I was an ardent and staunch member of the "Netflix is never going to fly, who wants postage-stamp movies delivered at 28.8k modem speeds?" camp.
And years went by, and I held to my preconceived notion that the very concept of a Netflix was dead on arrival.
Then one year while on vacation I was visiting my sister and her husband, and we were watching what I thought was a TV episode of MythBusters. Only when the episode ended did I realize it was Netflix streaming through a Wii. I had no idea it wasn't actual television/cable until then.
And now I am a happy Netflix subscriber.
I can understand the arguments for why the cloud gaming model is a no-go in today's world; so was Netflix in 1999, when modems were still the dominant connection for home users.
But don't fall victim to this "technology as a snapshot in time" mentality. Bandwidth is improving at ridiculous speeds. Hell, the very fact that the concept of bandwidth itself is prevalent enough that you and I can even have this conversation about the topic is absurd, proof-of-the-pudding type stuff.
Our parents could not have even dreamed of us having this opportunity, just as we cannot hope to dream of the reality that the future holds for our kids and for us aging enthusiasts.
Never say never; time will make a fool of you every chance it can get!
I think the main issue is that gaming is two-way vs. one-way. Netflix can take the time to build a well-tuned, lossy-compressed file that can be sent to the Wii at a decent rate. When you factor in gaming, the speed of light starts to apply, in the sense that 40-50 ms is pretty destructive. It takes about 135 ms for light to go around the earth (in a vacuum; glass fiber would be slower), and that excludes processing on the routers etc. Netflix can handle 135 ms of latency and you as the end user would never care. However, 135 ms from the server to your screen and then a 135 ms response back to the server is 270 ms, which is very noticeable to an end user. The issue is more of a physics thing at the moment than a bandwidth issue.
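A quick sketch of just the propagation delay; the 1.5x fibre factor is my rough approximation, and real routing and queuing would only add to it:
Code:
# Propagation delay only: ignores routing, queuing, encode/decode and display lag.
C_VACUUM_KM_S = 299_792                    # speed of light in vacuum
EARTH_CIRCUMFERENCE_KM = 40_075
FIBRE_SLOWDOWN = 1.5                       # assumption: signals in glass take ~1.5x as long

one_way_ms = EARTH_CIRCUMFERENCE_KM / C_VACUUM_KM_S * 1000   # ~134 ms around the earth
round_trip_vacuum_ms = 2 * one_way_ms                        # ~267 ms
round_trip_fibre_ms = round_trip_vacuum_ms * FIBRE_SLOWDOWN  # ~401 ms
print(f"one way (vacuum): {one_way_ms:.0f} ms")
print(f"round trip: {round_trip_vacuum_ms:.0f} ms vacuum, ~{round_trip_fibre_ms:.0f} ms in fibre")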
I have been working in the ISP business for the last 10 years, and I work in a country that is 5-10 years ahead of the USA in terms of internet infrastructure. There is no way it's going to be viable for the next 10 years here, not to mention the logistics needed. We already host Akamai servers today, so we know a bit about distribution.
Streaming movies is also extremely forgiving due to the one-way communication. A small hiccup is buffered away, and there is no input from you to the server that needs processing before coming back, either. And you can stream a movie for basically no cost thanks to the simplicity of the hardware and software: a $500 PC can stream multiple movies to thousands of active users when the content is cached in cheap memory. A $500 PC today can barely play a game for one user.
Not to mention, you still need a PC at home...
And lastly, let's rewind to 2010 and OnLive, which AMD invested in. It went bankrupt.
Also, just think about it: even with a high-end card like a GTX 680, you can't really share it with more than two users. With two users they are each already down to 1 GB of memory, a 128-bit bus, and 768 SPs, plus some switching overhead. Compare that to the price of the PC that can stream out to thousands of users... ouch. That ROI is just doomed unless people start to play 1990s games on 2012 hardware.
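Roughly, the split looks like this (a naive halving; real sharing overhead would make it worse):
Code:
# Per-user share of a GTX 680 if one card is split between two players.
gtx680 = {"vram_gb": 2, "bus_width_bits": 256, "shader_cores": 1536}
players = 2
per_player = {k: v // players for k, v in gtx680.items()}
print(per_player)   # {'vram_gb': 1, 'bus_width_bits': 128, 'shader_cores': 768}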
If the ping is that long then yes, no question, the model itself won't take off for those games that are dependent on latency (FPS games). But there is a large group of people who don't need that; they already play Farmville.
My point is more to say that obviously the solution here is quantum entanglement, as that negates space-time latency at its core. Er, wait, that's not what I meant; what I meant was that they will need to put the servers closer to their customers to ensure that the ping is not deleterious.
Let's ignore the whole latency issue and pretend that it is acceptable. At that point, it's the VDI or app-presentation model, but with a workload that requires special equipment (a GPU) that doesn't allow for resource sharing on that particular piece of equipment. That puts it in a realm even worse than VDI, and regardless of what you may hear, no one is implementing VDI for cost savings. Sure, you only have to deal with peak load, but the gear is a lot more expensive than home systems.
Now let's talk processing, delivery, etc.
The Netflix analogy really doesn't fly. Netflix is purely a bandwidth problem: given enough bandwidth, and even marginal latency, you can watch a movie.
Remote gaming is much more involved. You have rendering, *then* you have to encode the video. Only at that point can you send it (all Netflix videos are pre-encoded for the minimum bandwidth needed). Once the data is there, the remote system has to decode it and display it, and only then can a player act on it. I've seen OnLive before, and the visual fidelity was pretty much garbage in order to encode fast enough to keep latency low enough that it didn't totally destroy the experience.
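To make that chain concrete, here is a toy latency budget in Python; every figure is an illustrative guess, not a measurement of any actual service:
Code:
# Hypothetical end-to-end latency budget for remote rendering.
# Every number below is an illustrative assumption, not a measurement.
stages_ms = {
    "input -> server (network)":  40,
    "server render":              16,   # one 60 FPS frame
    "video encode":               10,
    "server -> client (network)": 40,
    "client decode":               8,
    "display":                     8,
}
total_ms = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:28s} {ms:3d} ms")
print(f"{'total':28s} {total_ms:3d} ms")    # ~122 ms with these guesses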
Now, the speed of light isn't going to change, but the encoding time will drop over time. The decoding time will as well, though neither will ever be zero. While there is room for improvement, there is no magic sauce that will remove the latency entirely, and as we're (slowly) trending towards higher-resolution displays, the encoding improvements may or may not be entirely offset by having to drive higher expected resolutions.
Netflix is easy: pre-encode in the most efficient but reasonable-to-decode format, and add enough bandwidth. That just doesn't translate into an interactive medium.