Yeah, okay...hello, bandwidth/latency problems!
This is incredibly stupid.
Yeah, this. Since most people are on capped internet connections and decent-quality HD video streams take a LOT of data, I can't see this working. Total fail of an idea.
I can share my experience with Diablo 3. After 4-5 hours of gaming the data transfer is a measly ~20 MB. It needs a good ping to enjoy the game, though.
For streamed 60fps 720p video? Or, if you want to actually show off the processing power on the other end, 1080p video? How much bandwidth is needed?
Latency is already debunked... it's the same as on consoles.
For streamed 60fps 720p video? Or, if you want to actually show off the processing power on the other end, 1080p video?
A hell of a lot. A half-hour 1080p YouTube video is 1 GB (MP4 format).
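As rough back-of-envelope math (the 1 GB per half hour figure is from the post above; the hours-per-day and the 250 GB cap are my own assumptions for illustration):

```python
# Back-of-envelope: data used by a 1080p stream at ~1 GB per half hour
# (figure from the post above; daily play time and the 250 GB cap are
# assumptions for illustration only).
gb_per_half_hour = 1.0
gb_per_hour = gb_per_half_hour * 2           # 2 GB per hour of play

hours_per_day = 2                            # assumed play time
monthly_gb = gb_per_hour * hours_per_day * 30
print(gb_per_hour, monthly_gb)               # 2.0 GB/h, 120.0 GB/month

assumed_cap_gb = 250                         # a common US cap at the time
print(monthly_gb / assumed_cap_gb)           # ~half the cap on gaming alone
```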
No, it has not and is not. The article you linked earlier shows a broad range of console latencies, some as low as 67ms, and even in the absolute worst case they only begin to approach the very best case Nvidia's marketing department is reporting.
Less of an issue than some people might assume? Maybe. But not even close to the same as consoles.
Let's adjust the chart a bit.
Issue 1: the network does not get faster via Nvidia's GRID. It should be the same in both cases: 75 ms (assuming a ping of 37.5 ms; remember, they are talking about lag, so you have to send an input first and only then receive a reply; their chart assumes ping went from 37.5 ms on gen-1 cloud to 15 ms on GRID).
Issue 2: the game pipeline does not take 100 ms.
If the game is chugging along at a weak 30 fps, it takes 33.3 ms to render each frame.
Issue 3: the display does not take 66 ms to show an image; it is much closer to 10 ms.
Issue 4: GRID makes virtualization easier. Services like OnLive already virtualize to meet a certain minimum framerate at a specific quality setting and then use the rest of the GPU for other things. All GRID does is let them buy fewer GPUs to reach that target... so the pipeline does NOT get any shorter.
So the realistic console figure is 43 ms total. Assuming their capture, decode, and network figures are correct, gen-1 cloud gives you 163 ms, and GRID cuts that down by a whopping 30 ms to 133 ms.
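The arithmetic above can be sketched like this (stage timings are the ones argued in the post; the 45 ms capture+encode+decode total is inferred so that gen-1 cloud lands at the post's 163 ms, and the exact split within it is my assumption):

```python
# Latency pipeline sums using the figures argued above. The 45 ms
# capture+encode+decode total is inferred so that gen-1 cloud lands
# at the post's 163 ms; the exact split is an assumption.
game_render_ms = 33    # one frame at 30 fps (~33.3 ms)
display_ms = 10        # realistic display latency per the post

console_ms = game_render_ms + display_ms                   # 43 ms
network_ms = 75                                            # 2 x 37.5 ms ping
capture_codec_ms = 45                                      # capture+encode+decode
gen1_cloud_ms = console_ms + network_ms + capture_codec_ms # 163 ms
grid_ms = gen1_cloud_ms - 30                               # GRID shaves ~30 ms

print(console_ms, gen1_cloud_ms, grid_ms)  # 43 163 133
```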
Where did I state that? Only a very small handful of current generation console titles do. The new systems coming out? I don't know.
LOL
This targets the audience that uses consoles.
You think consoles run at 60 FPS?
(strike 1)
Seriously?
Currently, again, a rarity.
At 720p?
(strike 2)
And cherry picking numbers, eh?
The latency goes all the way up to 160+ ms in controller lag.
(strike 3)
Cloud is great until the connection or service craps itself at either end. Anybody who's had Steam offline access fail or lost access to their GFWL saved games at the most inopportune moment knows this.
Furthermore, the quality of this will be directly influenced by the quality of your network connection. A fixed console box will guarantee 100ms + 66ms much more often than an arbitrary network connection will guarantee 30ms. Many network connections measure in the hundreds.
Where did I state that? Only a very small handful of current generation console titles do. The new systems coming out? I don't know.
If you are trying to say that I shouldn't expect 60fps because our 6-7 year old consoles can't reliably do it, then where is the need for cloud processing? Which (again) is why I think GRID is going to be aimed at very specific applications that will not challenge home consoles.
Currently, again, a rarity.
Not my point though. Are you proposing that GRID is only going to match what current consoles offer in terms of visual quality? What happened to invalidating the "crap boxes" when you're setting the bar at tech 7 years old?
Cherry picking? This is the entire list from the article YOU linked:
Game Latency Measurement
Burnout Paradise 67ms
BioShock (frame-locked) 133ms
BioShock (unlocked) as low as 67ms
Call of Duty 4: Modern Warfare 67ms-84ms
Call of Duty: World at War 67ms-100ms
Call of Juarez: Bound in Blood 100ms
Forza Motorsport 2 67ms
Geometry Wars 2 67ms
Guitar Hero: Aerosmith 67ms
Grand Theft Auto IV 133ms-200ms
Halo 3 100ms-150ms
Left 4 Dead 100ms-133ms
LEGO Batman 133ms
Mirror's Edge 133ms
Street Fighter IV 67ms
Soul Calibur IV 67ms-84ms
Unreal Tournament 3 100ms-133ms
X-Men Origins: Wolverine 133ms
I reiterate what I said: at worst you might see up to 166 ms, and that includes the display on top of controller input. Nvidia's marketing graph states 166 ms as the best case. You stated they are equal, which is simply not true.
GRID is interesting tech that, as others have mentioned, might be nice as a supplement to other services and content delivery systems. It will not challenge home consoles in a meaningful way by itself, and I don't know why those keep getting brought up.
Yeah, this. Since most people are on capped internet connections and decent-quality HD video streams take a LOT of data, I can't see this working. Total fail of an idea.
This. Unless I can cache the content to my fileserver and run it from my GbE network locally, this is destined to fail. Most home network connections are going to be 100-200ms latency.
Yet another good reason this will fail hard. Data caps from USA ISPs are yet again stifling innovation due to the ISPs also being streaming video content providers. I would really like to see the US government intervene and make IPTV from the ISP count towards bandwidth caps. You'd see the bandwidth caps go away really quickly at that point.
I will let u know tomorrow
Right, but they are talking about doing the rendering on their end and streaming you the video. Go screen-capture Diablo for 5 hours and let me know what the file size of that is.
I mean, a DVD is 2-5 GB and it's not even HD. Sure, you can compress it, but an artifact-free 1080p stream is still going to be in the 500 MB-1 GB an hour range. I would hit my monthly data cap in less than 2 days. I would hit my cell phone data cap in less than an hour.
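To put the post's 500 MB-1 GB per hour estimate into cap terms (the 40 GB cap below is a hypothetical number for illustration, not anyone's actual plan):

```python
# How fast streaming burns through a data cap, using the post's
# 500 MB-1 GB per hour estimate for artifact-free 1080p. The 40 GB
# cap is a hypothetical figure for illustration.
def hours_until_cap(cap_gb, gb_per_hour):
    return cap_gb / gb_per_hour

for rate in (0.5, 1.0):                      # GB per hour
    h = hours_until_cap(40, rate)
    print(f"{rate} GB/h -> cap hit after {h:.0f} h ({h / 24:.1f} days)")
# At 1 GB/h a 40 GB cap is gone in well under 2 days of continuous play.
```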
What?
So you artificially raise the bar... pulling numbers from your *beep* *sigh*
Actually, those are the primary two factors in determining bandwidth requirements, which is where this started.
I can see this must be very confusing to you.
Think of console games' in-game graphics vs. CGI.
You are hopelessly stuck in resolution + FPS.
Now who is setting artificial bars to suit their position?
These should just match the OUTPUT of consoles (after upscaling).
That gives us a target of 30 FPS @ 720p.
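For scale (my own arithmetic, not a figure from the thread): even that 30 FPS @ 720p target is enormous before compression, which is why the whole argument ends up being about codecs and bitrate.

```python
# Raw (uncompressed) bandwidth for a 30 FPS @ 720p target at
# 24-bit color. Shows why heavy video compression is mandatory
# for any streaming service.
width, height, bytes_per_px, fps = 1280, 720, 3, 30
raw_bytes_per_s = width * height * bytes_per_px * fps
print(raw_bytes_per_s / 1e6)   # ~82.9 MB/s uncompressed
# A streaming service has to compress this by roughly two orders of
# magnitude to fit typical home connections, which is where the
# compression artifacts come from.
```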
Stay classy, Lonbjerg! <3 <3
Anyways, I am done with you.
Well, I did answer your question. I seem to be the only one who did.
Nothing you have said has made any positive impact on me... or shown me that you grasp the technology.
If you think 100-200 ms of latency is the norm... you are sadly uninformed.
I don't even get 200 ms of ping if I game on a US server from the EU... I get ~100 ms (0.65 × c matters here).
Emotions running high in here.
100-200 ms is pretty normal. I have seen much worse... A really good broadband connection can get much lower, though.
0.65c, a.k.a. roughly 200 km/ms, is the speed of light in a fiber optic cable.
The US is about 3000 miles across (east to west).
3000 miles = 4,828,032 meters.
It takes a mere 24.14 ms for light to cross that distance in a fiber optic cable, and usually the servers are closer. The vast majority of your ping comes from switching (the fact that the cables don't run in a straight line adds a bit too).
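Reproducing that propagation-delay math with the post's own figures (4,828,032 m and the rounded 200 km/ms for light in fiber):

```python
# Propagation delay across the US in fiber, using the post's numbers:
# ~3000 miles (4,828,032 m) at ~0.65c, rounded to 200 km/ms.
distance_m = 4_828_032          # 3000 miles in meters (from the post)
speed_km_per_ms = 200           # 0.65c rounded, as in the post

one_way_ms = (distance_m / 1000) / speed_km_per_ms
print(f"{one_way_ms:.5f} ms one way")        # 24.14016 ms
print(f"{2 * one_way_ms:.1f} ms round trip at best")
# Everything beyond this in a real ping is switching/routing overhead
# and the non-straight cable routes.
```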
It will be obscene...
Yeah, but DVDs are encoded in MPEG-2, an algorithm from 1996.
Still, it's going to be an obscene amount of data that will make ISPs throw a hissy fit.