NVIDIA introduces GRID - Might be the end of consoles!


Rifter

Lifer
Oct 9, 1999
11,522
751
126
Yeah, okay...hello, bandwidth/latency problems!

This is incredibly stupid.

Yeah, this. Since most people are on capped internet connections and decent-quality HD video streams take a LOT of data, I can't see this working. Total fail of an idea.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Just to add something:
Not everywhere has a cap on internet.
In Denmark there is no such thing.

And I'll bite (since this is my very field):

How much bandwidth is needed?

Latency is already debunked... it's the same as on consoles.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
I can share my experience with Diablo 3. After 4-5 hrs of gaming the data transfer is a measly ~20MB. It needs a good ping to enjoy the game, though.
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
Cloud processing is a growing trend in technology and is good for all of us, because we won't need to spend as much on processing power and we get lighter, cooler devices as a result.

Online gaming keeps picking up in popularity, and having one computer process an entire 64-player FPS match and dole out individual video feeds to players is clearly the most adaptable and efficient way of doing things. This will also allow developers to make games for incredibly powerful computers, if they can bear the cost of hogging that much cloud processing.
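To make the "one server, many video feeds" idea concrete, here is a minimal Python sketch; every name and number in it (TICK_RATE, render_view, encode_frame, and so on) is a made-up placeholder, not anything from GRID, OnLive, or a real game server:

import time

TICK_RATE = 30   # hypothetical server tick rate
PLAYERS = 64     # one 64-player match

def simulate_world(state):
    # Advance the shared game state once for every player.
    state["tick"] += 1
    return state

def render_view(state, player_id):
    # Placeholder for rendering this player's viewpoint on the server GPU.
    return f"frame {state['tick']} for player {player_id}"

def encode_frame(frame):
    # Placeholder for a hardware video encoder (H.264 or similar).
    return frame.encode("utf-8")

def send_to_client(player_id, packet):
    # Placeholder for the network send back to the player's thin client.
    pass

state = {"tick": 0}
for _ in range(3):                       # a few ticks of the main loop
    state = simulate_world(state)
    for player_id in range(PLAYERS):     # one render + encode + send per player
        send_to_client(player_id, encode_frame(render_view(state, player_id)))
    time.sleep(1 / TICK_RATE)
print("simulated", state["tick"], "ticks for", PLAYERS, "players")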
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
I can share my experience with Diablo 3. After 4-5 hrs of gaming the data transfer is a measly ~20MB. It needs a good ping to enjoy the game, though.

Right, but they are talking about doing the rendering on their end and streaming you the video. Go screen-capture Diablo for 5 hours and let me know what the file size of that is :)

I mean, a DVD is 2-5GB and it's not even HD. Sure, you can compress it, but an artifact-free 1080p stream is still going to be in the 500MB-1GB an hour range. I would hit my monthly data cap in less than 2 days. I would hit my cell phone data cap in less than an hour.
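As a rough sanity check on the cap math, a few lines of Python; only the 500 MB-1 GB/hour figure comes from the post above, the cap sizes are hypothetical examples:

# Hours of play before hitting a data cap, using Rifter's 500 MB - 1 GB/hour
# estimate. The cap sizes are hypothetical; plug in your own plan's numbers.
stream_mb_per_hour = [500, 1000]
home_cap_mb = 100 * 1000    # e.g. a 100 GB monthly cap (assumption)
phone_cap_mb = 2 * 1000     # e.g. a 2 GB mobile cap (assumption)

for rate in stream_mb_per_hour:
    print(f"at {rate} MB/h: home cap lasts {home_cap_mb / rate:.0f} h, "
          f"phone cap lasts {phone_cap_mb / rate:.1f} h")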
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
How much bandwidth is needed?
For streamed 60fps 720p video? Or, if you want to actually show off the processing power on the other end, 1080p video?

A hell of a lot. A half-hour 1080p YouTube video is 1GB (mp4 format).

Latency is already debunked... it's the same as on consoles.
No, it has not been debunked, and it is not the same. The article you linked earlier shows a broad range of console latencies, some as low as 67ms, and even in the absolute worst case they only begin to approach the very best case Nvidia's marketing department is reporting.

Less of an issue than some people might assume? Maybe. But not even close to the same as consoles.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
OnLive literally makes me feel sick from all the lag. That, coupled with the low graphics settings at 720p, makes for an ass-ugly, nauseating experience.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
For streamed 60fps 720p video? Or, if you want to actually show off the processing power on the other end, 1080p video?

A hell of a lot. A half-hour 1080p YouTube video is 1GB (mp4 format).

No, it has not been debunked, and it is not the same. The article you linked earlier shows a broad range of console latencies, some as low as 67ms, and even in the absolute worst case they only begin to approach the very best case Nvidia's marketing department is reporting.

Less of an issue than some people might assume? Maybe. But not even close to the same as consoles.

LOL

This is targeting the audience that uses consoles.
You think consoles run at 60 FPS?
(strike 1)
Seriously?

At 720p?
(strike 2)

And cherry-picking numbers, eh?
The latency goes all the way up to 160+ ms in controller lag.
(strike 3)

Come back when you have facts; otherwise don't bother posting to me.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
[Image: 05.jpg, Nvidia's GRID latency comparison chart]


Let's adjust the chart a bit.
Issue 1: The network does not get faster via Nvidia's GRID. It should be the same in both, which is 75ms in both (assuming a ping of 37.5ms, that is; remember they are talking about lag... so you need to send an order first and only then receive a reply; their chart assumes ping went from 37.5ms on gen1 cloud to 15ms on GRID).
Issue 2: The game pipeline does not take 100 ms.
If the game is chugging along at a weak 30 fps, then it takes 33.(3) ms for the game to render each frame.
Issue 3: The display does not take 66 ms to display an image; it's much closer to 10.
Issue 4: GRID makes virtualization easier. Services like OnLive already virtualize to meet a certain minimum framerate at a specific quality setting and then use the rest of the GPU for other things. All GRID does is require them to buy fewer GPUs to reach that target... as such the pipeline does NOT get any shorter.

So, a realistic console figure is 43ms total. Assuming their capture, decode, and network figures are correct, gen1 cloud gives you 163 ms, and GRID cuts that down by a whopping 30 ms, to 133 ms.
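taltamir's totals can be reproduced with simple addition; this little Python sketch just restates the assumptions from his post and Nvidia's chart, it is not a measurement of anything:

# Latency pipeline under taltamir's assumptions (all values in ms).
frame_ms   = 1000 / 30   # game running at 30 fps -> ~33.3 ms per frame
display_ms = 10          # his estimate for display latency

console = frame_ms + display_ms
print(f"console:    ~{console:.0f} ms")        # ~43 ms

# Cloud adds capture/encode, the network round trip, and client-side decode.
# Only the totals (163 ms gen1, 133 ms GRID) come from the post above; the
# ~120 ms of extra pipeline is treated here as a single lump, not a breakdown.
gen1_cloud = console + 120
grid       = gen1_cloud - 30   # GRID trims ~30 ms, per his reading of the chart
print(f"gen1 cloud: ~{gen1_cloud:.0f} ms, GRID: ~{grid:.0f} ms")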
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
[Image: 05.jpg, Nvidia's GRID latency comparison chart]


Let's adjust the chart a bit.
Issue 1: The network does not get faster via Nvidia's GRID. It should be the same in both, which is 75ms in both (assuming a ping of 37.5ms, that is; remember they are talking about lag... so you need to send an order first and only then receive a reply; their chart assumes ping went from 37.5ms on gen1 cloud to 15ms on GRID).
Issue 2: The game pipeline does not take 100 ms.
If the game is chugging along at a weak 30 fps, then it takes 33.(3) ms for the game to render each frame.
Issue 3: The display does not take 66 ms to display an image; it's much closer to 10.
Issue 4: GRID makes virtualization easier. Services like OnLive already virtualize to meet a certain minimum framerate at a specific quality setting and then use the rest of the GPU for other things. All GRID does is require them to buy fewer GPUs to reach that target... as such the pipeline does NOT get any shorter.

So, a realistic console figure is 43ms total. Assuming their capture, decode, and network figures are correct, gen1 cloud gives you 163 ms, and GRID cuts that down by a whopping 30 ms, to 133 ms.

FALSE... read my link.
Console latency (input lag) goes from ~67ms to 160+ ms.

And you might want to look into network stacks.
(I do this for a living)

http://channel9.msdn.com/Events/BUILD/BUILD2011/SAC-593T

I predict that a lot of geeks need to hone their networking skills... before entering debates that involve networking... and not just basic networking.

Just saying...:cool:
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
LOL

This is targeting the audience that uses consoles.
You think consoles run at 60 FPS?
(strike 1)
Seriously?
Where did I state that? Only a very small handful of current generation console titles do. The new systems coming out? I don't know.

If you are trying to say that I shouldn't expect 60fps because our 6-7 year old consoles can't reliably do it, then where is the need for cloud processing? Which (again) is why I think GRID is going to be aimed at very specific applications that will not challenge home consoles.

At 720p?
(strike 2)
Currently, again, a rarity.

Not my point though. Are you proposing that GRID is only going to match what current consoles offer in terms of visual quality? What happened to invalidating the "crap boxes" when you're setting the bar at tech 7 years old?

And cherry-picking numbers, eh?
The latency goes all the way up to 160+ ms in controller lag.
(strike 3)
Cherry picking? This is the entire list from the article YOU linked:
Game Latency Measurement
Burnout Paradise 67ms
BioShock (frame-locked) 133ms
BioShock (unlocked) as low as 67ms
Call of Duty 4: Modern Warfare 67ms-84ms
Call of Duty: World at War 67ms-100ms
Call of Juarez: Bound in Blood 100ms
Forza Motorsport 2 67ms
Geometry Wars 2 67ms
Guitar Hero: Aerosmith 67ms
Grand Theft Auto IV 133ms-200ms
Halo 3 100ms-150ms
Left 4 Dead 100ms-133ms
LEGO Batman 133ms
Mirror's Edge 133ms
Street Fighter IV 67ms
Soul Calibur IV 67ms-84ms
Unreal Tournament 3 100ms-133ms
X-Men Origins: Wolverine 133ms

I reiterate what I said: at worst you might see up to 166ms, and that includes the display on top of controller input. Nvidia's marketing graph gives 166ms as its best case. You stated they are equal, which is simply not true.

GRID is interesting tech that, as others have mentioned, might be nice as a supplement to other services and content delivery systems. It will not challenge home consoles in a meaningful way by itself, and I don't know why those keep getting brought up.
 

Golgatha

Lifer
Jul 18, 2003
12,640
1,482
126
Cloud is great until the connection or service craps itself at either end. Anybody who’s had Steam offline access fail or lost access to their GFWL saved games at the most inopportune moments knows this.

Furthermore, the quality of this will be directly influenced by the quality of your network connection. A fixed console box will guarantee 100ms + 66ms much more often than an arbitrary network connection will guarantee 30ms. Many network connections measure in the hundreds.

This. Unless I can cache the content to my fileserver and run it from my GbE network locally, this is destined to fail. Most home network connections are going to be 100-200ms latency.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Where did I state that? Only a very small handful of current generation console titles do. The new systems coming out? I don't know.

So the set "target" is one you just invented to suit your position... nice *cough*

If you are trying to say that I shouldn't expect 60fps because our 6-7 year old consoles can't reliably do it, then where is the need for cloud processing? Which (again) is why I think GRID is going to be aimed at very specific applications that will not challenge home consoles.

I guess you choose to neglect the target audience...good job!

Currently, again, a rarity.

So you artificially raise the bar... pulling numbers from your *beep* *sigh*

Not my point though. Are you proposing that GRID is only going to match what current consoles offer in terms of visual quality? What happened to invalidating the "crap boxes" when you're setting the bar at tech 7 years old?

I can see this must be very confusing to you.
Think of console games' in-game graphics vs CGI.
You are hopelessly stuck on resolution + FPS.
These should just match the OUTPUT of consoles (after upscaling).

That gives us a target of 30 FPS @ 720p.

How can that be better than consoles?

I'll let you think for a little while... because this is too funny.
If you give up, here is the answer:

A console does output 720p at 30-ish FPS.
But what it outputs is the key here!

Not resolution... because it's matched.
Not FPS... because it's matched.
Not lag... because it's matched.

It's what is being rendered in-game that separates and elevates this over consoles.

Gone are the limitations of 256MB RAM.
Textures can be HIGH res in the game engine... unlike current console textures = I.Q. IMPROVEMENT!
You can use HIGH res shadow maps in the game engine = I.Q. improvement.
You can use AF in the game engine = I.Q. improvement.
You can use AA in the game engine = I.Q. improvement.

Name any I.Q. feature... and you can add it... unlike on consoles.

And it doesn't stop there.
Because you are not limited by 6-7 year old hardware with limited resources, you can go further:

Sandbox.
Destructible architecture.
Better A.I.
Better control options.

The list is long.

Cherry picking? This is the entire list from the article YOU linked:
Game Latency Measurement
Burnout Paradise 67ms
BioShock (frame-locked) 133ms
BioShock (unlocked) as low as 67ms
Call of Duty 4: Modern Warfare 67ms-84ms
Call of Duty: World at War 67ms-100ms
Call of Juarez: Bound in Blood 100ms
Forza Motorsport 2 67ms
Geometry Wars 2 67ms
Guitar Hero: Aerosmith 67ms
Grand Theft Auto IV 133ms-200ms
Halo 3 100ms-150ms
Left 4 Dead 100ms-133ms
LEGO Batman 133ms
Mirror's Edge 133ms
Street Fighter IV 67ms
Soul Calibur IV 67ms-84ms
Unreal Tournament 3 100ms-133ms
X-Men Origins: Wolverine 133ms

I reiterate what I said: at worst you might see up to 166ms, and that includes the display on top of controller input. Nvidia's marketing graph gives 166ms as its best case. You stated they are equal, which is simply not true.

GRID is interesting tech that, as others have mentioned, might be nice as a supplement to other services and content delivery systems. It will not challenge home consoles in a meaningful way by itself, and I don't know why those keep getting brought up.

You are funny; you are so lost staring at the color of the truck... you ignore its cargo.

Anyway, I am done with you; nothing you have said has made any positive impact on me... or shown me that you grasp the technology o_O
 

Golgatha

Lifer
Jul 18, 2003
12,640
1,482
126
Yeah, this. Since most people are on capped internet connections and decent-quality HD video streams take a LOT of data, I can't see this working. Total fail of an idea.

Yet another good reason this will fail hard. Data caps from USA ISPs are yet again stifling innovation due to the ISPs also being streaming video content providers. I would really like to see the US government intervene and make IPTV from the ISP count towards bandwidth caps. You'd see the bandwidth caps go away really quickly at that point.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
This. Unless I can cache the content to my fileserver and run it from my GbE network locally, this is destined to fail. Most home network connections are going to be 100-200ms latency.

I call bullshit!

I work for an ISP.

You just pulled a number from thin air.

Do you even know the protocol latency of:

ITU G.992.1
ITU G.992.3 Annex L
ITU G.992.5 Annex M
ITU G.991.2
ITU G.993.1
ITU G.993.2

Do you?

If you think 100-200ms of latency is the norm... you are sadly uninformed.
I don't even get 200ms of ping if I game on a US server from the EU... I get ~100 ms (0.65 × c matters here).

What do you base your random numbers on?
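For readers who don't work at an ISP, the ITU codes listed above map to these common marketing names; the mapping is standard, but no latency figures are given here because the real number depends on how the ISP configures the line (interleaved vs. fastpath):

# Common names for the ITU DSL standards listed above. Actual protocol latency
# is not listed, since it depends mostly on whether the line runs interleaved
# or fastpath, which the ISP configures per line.
DSL_STANDARDS = {
    "ITU G.992.1":         "ADSL (G.dmt)",
    "ITU G.992.3 Annex L": "ADSL2, reach-extended (RE-ADSL2)",
    "ITU G.992.5 Annex M": "ADSL2+ with extended upstream",
    "ITU G.991.2":         "G.SHDSL",
    "ITU G.993.1":         "VDSL",
    "ITU G.993.2":         "VDSL2",
}

for code, name in DSL_STANDARDS.items():
    print(f"{code:22} -> {name}")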
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Yet another good reason this will fail hard. Data caps from USA ISPs are yet again stifling innovation due to the ISPs also being streaming video content providers. I would really like to see the US government intervene and make IPTV from the ISP count towards bandwidth caps. You'd see the bandwidth caps go away really quickly at that point.

I never got the cap part of US ISPs.
Your market is supposed to be free, yet no one steps up... and offers no-cap lines.

If an ISP offered capped lines in DK... their customer count would be: 0
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Right, but they are talking about doing the rendering on their end and streaming you the video. Go screen-capture Diablo for 5 hours and let me know what the file size of that is :)

I mean, a DVD is 2-5GB and it's not even HD. Sure, you can compress it, but an artifact-free 1080p stream is still going to be in the 500MB-1GB an hour range. I would hit my monthly data cap in less than 2 days. I would hit my cell phone data cap in less than an hour.
I will let you know tomorrow ;)
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
So you artificially raise the bar... pulling numbers from your *beep* *sigh*
What?

You asked about bandwidth requirements. I responded with what my expectations would be; you responded condescendingly because "consoles don't currently deliver that" (in fact they do, just not in most cases).

I can see this must be very confusing to you.
Think of console games' in-game graphics vs CGI.
You are hopelessly stuck on resolution + FPS.
Actually, those are the two primary factors in determining bandwidth requirements, which is where this started.

These should just match the OUTPUT of consoles (after upscaling).
That gives us a target of 30 FPS @ 720p.
Now who is setting artificial bars to suit their position?

By the way, you're still looking at roughly 900MB/hour at 30fps @ 720p, in answer to your initial question plus the standards you have now set.

And yes, more processing power means a potentially better image above standardized hardware limitations. That's great! I wasn't the one who brought consoles into the discussion (multiple times); I was just remarking on your need to set the bar at what they deliver as far as resolution and FPS. Which, again, are the only pieces in all this that matter when talking about bandwidth.

Anyway, I am done with you
Stay classy, Lonbjerg! <3 <3

nothing you have said has made any positive impact on me... or shown me that you grasp the technology o_O
Well, I did answer your question. I seem to be the only one who did.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
If you think 100-200ms of latency is the norm... you are sadly uninformed.
I don't even get 200ms of ping if I game on a US server from the EU... I get ~100 ms (0.65 × c matters here).

100-200 ms is pretty normal. I have seen much worse... A really good broadband connection can get much lower, though.

0.65c, aka 200km/ms, is the speed of light in a fiber optic cable.

The US is about 3000 miles across (east to west).
3000 miles = 4,828,032 meters.

It takes a mere ~24.14 ms for light to cross that distance in a fiber optic cable, and usually the servers are closer. The vast majority of your ping comes from switching (the fact that the cables don't run in a straight line adds a bit too).
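His propagation figure checks out; here is a quick Python check using the same numbers (straight-line distance only, with no routing or switching modeled):

# One-way light propagation across a straight coast-to-coast fiber run,
# using the numbers from the post above.
SPEED_IN_FIBER_KM_PER_MS = 200            # ~0.65 c
us_width_km = 3000 * 1.609344             # 3000 miles -> ~4828 km

one_way_ms = us_width_km / SPEED_IN_FIBER_KM_PER_MS
print(f"~{one_way_ms:.2f} ms one way, ~{2 * one_way_ms:.1f} ms round trip")
# ~24.14 ms one way; the rest of a real ping is switching/routing and the
# fact that cables do not run in a straight line.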

Right, but they are talking about doing the rendering on their end and streaming you the video. Go screen-capture Diablo for 5 hours and let me know what the file size of that is :)
It will be obscene...

I mean, a DVD is 2-5GB and it's not even HD.

Yeah, but DVDs are encoded in MPEG-2, an algorithm from 1996.
Still, it's going to be an obscene amount of data that will make ISPs throw a hissy fit.
 
Last edited:

Rifter

Lifer
Oct 9, 1999
11,522
751
126
100-200 ms is pretty normal. I have seen much worse... A really good broadband connection can get much lower, though.

0.65c, aka 200km/ms, is the speed of light in a fiber optic cable.

The US is about 3000 miles across (east to west).
3000 miles = 4,828,032 meters.

It takes a mere ~24.14 ms for light to cross that distance in a fiber optic cable, and usually the servers are closer. The vast majority of your ping comes from switching (the fact that the cables don't run in a straight line adds a bit too).


It will be obscene...



Yeah, but DVDs are encoded in MPEG-2, an algorithm from 1996.
Still, it's going to be an obscene amount of data that will make ISPs throw a hissy fit.

I realize DVD is an old algorithm, but it's fast to encode and decode. Remember, any compressing they do of the output (H.264 or whatever they go with) is going to add latency, which is already going to be high with this type of setup. They would have to go with a very fast algorithm that doesn't take much CPU/GPU time to encode/decode for best performance, and that's going to result in larger files to stream.
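To illustrate the trade-off Rifter is describing, here is a toy Python calculation; the per-frame encode times and bitrates are invented placeholders, not benchmarks of any real codec:

# Toy illustration of the encode-speed vs. stream-size trade-off.
# All numbers are invented placeholders, not codec benchmarks.
FPS = 60
frame_budget_ms = 1000 / FPS              # ~16.7 ms per frame at 60 fps

presets = {
    # name             (encode ms/frame, Mbps for comparable quality)
    "fast, cheap":     (3, 4.0),
    "slow, efficient": (12, 2.0),
}

for name, (encode_ms, mbps) in presets.items():
    gb_per_hour = mbps * 3600 / 8 / 1000  # Mbit/s -> GB per hour of play
    print(f"{name:15}: +{encode_ms:>2} ms encode latency per frame "
          f"(budget {frame_budget_ms:.1f} ms), ~{gb_per_hour:.1f} GB/hour")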
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
They don't have to send everything from GRID fully video-encoded.

Parts could be done locally and then combined with data from the cloud.

Also, the number of possible images in a game is finite, so some clever compression may be at work.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Even if GRID flops in North America, it may be a hit in parts of the world with much faster (including lower-latency) and more pervasive broadband. Also in cyber-gaming cafes and such. Plus, not all games are fast-paced. :)