
Is OnLive going to bring an end to the graphics card market?

Originally posted by: Harvey
Give me one reason I would want to give up a stand alone machine that can get things done without having to be connected to any other machine or system. :roll:

When it becomes self aware I'd rather have some IT dweebs getting their arms ripped off 1000 miles away than have my computer start whipping dvds at me.
 
Here's a serious question: why don't you get a life, preferably one that involves English lessons?

Mutz is confusing the description of the technology, and you guys are making knee-jerk reactions based on billiard-table physics that don't apply.

Many of the graphics routines that 10 years ago had to be inefficiently, brute-force rasterized by that hardware are now generated by more efficient algorithms built into other layers in the hardware. What Mutz is actually referring to is moving some of those more descriptive layers to an actual client level, which is perfectly viable.

Take a popular game like WoW, which is not only absurdly popular but absurdly primitive in terms of graphics rendering. There's no reason that graphics calls in a game like that couldn't be sent down the pipe beforehand. I'd also guess there are more people playing WoW, and hence a lot more money to be made on it via subscriptions, than a render hog like Crysis with .0001% the market share.

Nobody (and if they are, then they are incorrect) is claiming that every f---ing frame of the game would be centrally rasterized at 1600x1200 resolution and broadcast at 30fps - christ. Some of you (and it isn't Mutz) don't comprehend the difference between H.264 and OpenGL. We're gaming here and moving graphics objects around that can easily be pre-canned via client software - not streaming porn. You're simply moving from 2-D to 3-D, since the former has been handled quite efficiently with Citrix and RDP for over a decade.

The server was $250k, so it was a cheaper option than buying more workstations... The standalone workstation kitted out with appropriate hardware and proprietary software runs around $50k

Been there - done that. 98% of the costs you describe are either licenses, or fat GPUs with an aftermarket value a year from now that won't fill up your gas tank, because they can't do anything else. If you have a problem or disagreement with this I'll be happy to trade you some Silicon Graphics stock, given they tried the exact same model. By the time SGI was done jerking clients around you could buy an Intel station with integrated GPU that was just as fast for 1/4 the price.

Yeah, let's brute render everything on custom cards and/or a Unix back end, and charge the customer a fortune because they have them by the nads and were forced to buy the software. Pull my finger. Trust me, the guy in India that will be doing your job in the future won't be using this kind of hardware. He doesn't now.

and thin client software on the general PCs.

Hate to tell you this, but the terms 'thin client' and 'high-end switch' are mutually exclusive. If you honestly need a $5k switch to run a thin client, then you're running software that shouldn't be running in a thin client stream, because it's causing huge framing problems. Again, been there - done that.

I've worked for a lot of big manufacturers, and *none* of them will ever again try to run CAD on a thin client topology, because it totally sucks. It's like trying to run full-motion video through a Citrix client - it actually slows down because the latency algorithms built into Citrix / RDP don't jibe with H.264.

So, if we sum this up, you're complaining about the per-seat hardware costs of running CAD stations, but then spend $5k on a LAN switch to run a thin client. You don't by chance work for GM, do you?

Otherwise, while Mutz is being a bit sensitive here, it's the same old argument every time somebody dares insult the PC gamer crowd that defends its right to buy the latest and greatest GPU. I haven't upgraded the GPU in my PS3 or Xbox, and they still run the latest games just fine. The only difference is moving more of the graphics routines out to a client level, which NVidia and ATI will obviously scream about - until, of course, they cut a license deal with some fat cash cow like Microsoft or Apple.
 
OnLive can work if they can work out the time-in-transit problems.
If I am using an application that I want to display at 30 frames per second, then the maximum time for each frame cannot exceed 33ms. That covers both send and receive, so no more than a 16ms delay between me and the server. Taking into account the limit of the speed of light, that means the server cannot be more than 3000 miles from the PC using the service, and has to have an absolutely perfect connection at that distance.

Compression of the data isn't the problem, the speed of light is, in this case, not fast enough.
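The arithmetic in that post can be checked directly. A back-of-envelope sketch: 30 fps and the vacuum speed of light are the only inputs from the post; the fiber figure uses the usual ~2/3 c rule of thumb, not anything OnLive has published.

```python
# Back-of-envelope check on the 30 fps / 3000-mile claim above.
C_VACUUM_M_S = 299_792_458          # speed of light in vacuum, m/s
METERS_PER_MILE = 1609.344

def max_server_distance_miles(fps: float, light_speed: float = C_VACUUM_M_S) -> float:
    """Farthest the server can be if the whole frame period is spent on
    round-trip propagation alone (no rendering, encoding, or queuing)."""
    frame_period_s = 1.0 / fps        # ~33.3 ms at 30 fps
    one_way_s = frame_period_s / 2.0  # ~16.7 ms each direction
    return light_speed * one_way_s / METERS_PER_MILE

print(round(max_server_distance_miles(30)))  # ~3105 miles in vacuum
# Light in optical fiber travels at roughly 2/3 c, shrinking the radius further:
print(round(max_server_distance_miles(30, C_VACUUM_M_S * 2 / 3)))  # ~2070 miles
```

So the 3000-mile figure is the absolute vacuum ceiling with zero time left for anything else; real fiber and real processing shrink it considerably.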
 
I wouldn't mind OnLive for turn based games but the technical hurdles are staggering for FPS and RTS games where the order of hits and operations are critical to the game play itself.

ATI's cloud computing seems to be a step in this direction but realize that they are currently marketed as outsourced 3d render farms rather than for real time video game content.

Thinking about it, I wouldn't mind playing Empire: Total War with all graphics settings on my netbook. But I wouldn't want to subscribe to anything to do so. Maybe a one-time "purchase" fee for the game and game time, similar to using minutes on a cell phone.
 
Taking into account the limit of the speed of light, that means the server cannot be more than 3000 miles from the PC using the service and has to have an absolutely perfect connection at that distance.

I'd say much less.
If you go to certain test sites, or ping some IPs, servers 800 miles away might give you a 200ms ping, and servers 3000 miles away might give you the same result.
So it seems to really depend on the infrastructure, the actual path distance, and how many nodes are on the packet's way,
and less on the geographic distance.
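That observation fits a simple toy model where RTT is propagation plus per-hop delay. Both the per-hop figure and the hop counts below are illustrative assumptions, not measurements:

```python
# Toy model: round-trip time = fiber propagation + per-hop router delay.
FIBER_MILES_PER_MS = 124.0   # light at roughly 2/3 c, in miles per millisecond
PER_HOP_MS = 2.0             # assumed average per-router delay (illustrative)

def estimated_rtt_ms(path_miles: float, hops: int) -> float:
    propagation = 2 * path_miles / FIBER_MILES_PER_MS   # out and back
    processing = 2 * hops * PER_HOP_MS                  # each hop, both ways
    return propagation + processing

# A short geographic distance over a congested, hop-heavy path can cost
# about as much as a long distance over a clean backbone route:
print(estimated_rtt_ms(800, hops=20))    # ~93 ms
print(estimated_rtt_ms(3000, hops=6))    # ~72 ms
```

Which is exactly the effect described: routing and node count can dominate raw geographic distance.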

the maximum time for each frame cannot exceed 33ms, that is for send and receive so not more than a 16ms delay between me and the server.
I'd remark here that the video will be streaming one way from the server on a handshake connection, while only the controls would be sent back to the server,
so 33ms should be enough.

Thinking about it, I wouldn't mind playing Empire Total War with all graphic settings on my netbook. But I wouldn't want to subscribe to anything to do so. Maybe a 1 time "purchase" fee for the game and game time similar to using minutes on a cell phone.
It does raise a question: will one have to pay a certain amount each month, even when he's not playing?
So what exactly is he paying for?
Being able to access the server and watch other people play..? As he'll have to pay for the console anyway.
Or will he be able to watch free of charge?!
That would be absurd.
I'd say
they'll have to give the service away for free and charge only for game time, but that way
they probably couldn't afford it!


 
Originally posted by: mutz
Sometimes I wonder how some people manage to not drown in the shower each morning.
Sounds like a truly unwise and arrogant remark;
it would've been wiser to explain yourself better.
Aside from that, you probably meant the missing pieces of information that came out eventually.
Well, I'm just viewing the product, I haven't created it, and I'm trying to raise some serious questions that were on my mind.
Not every forum is capable of delivering profound information about technology, and so one happens to make mistakes.
It is also very demanding for anyone without any core IT background,
and so it happens that you forget some things or ignore others, as you cannot catch everything at once.
Still, if you're trying to obtain a rock from the bottom of a spring, the best thing is to dive straight for it,
as hard as that may be.


LOL

And ahahahah @ you guys discussing payment methods for this failure of a model. People aren't ever going to give up their home systems. Have you seen the stink about DRM? Limited game installs? People are about to cast EA into the pits of hell just for doing an extremely innocent version of what you propose, except with disconnects, downtime and lag! WHAT A DEAL!

 
Data progression seems to be -
  • Control input
  • Internet
  • Server (runs game and renders)
  • Compression (1ms)
  • Internet
  • Decompression (depends on your computer)
  • Display

Getting all that done in <30 ms is challenging...
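Summing plausible figures for those stages shows why. Only the 1ms compression step comes from the list above; every other number below is an assumption for illustration:

```python
# Illustrative end-to-end latency budget for one remotely rendered frame.
# Only the 1 ms compression figure comes from the thread; the rest are guesses.
budget_ms = {
    "control input upstream": 10.0,   # assumed one-way network delay
    "render frame on server": 16.7,   # one frame at 60 fps server-side
    "compress frame":          1.0,   # the figure quoted above
    "video downstream":       10.0,   # assumed one-way network delay
    "decompress frame":        5.0,   # depends on the client CPU
    "display scanout":         8.0,   # about half a 60 Hz refresh on average
}

total = sum(budget_ms.values())
print(f"total: {total:.1f} ms")             # total: 50.7 ms
print("within 30 ms budget:", total <= 30)  # within 30 ms budget: False
```

Even with fairly generous guesses, the pipeline blows past the 30 ms target before any congestion or retransmission is counted.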

A 1MB add-on to the browser 😕

720p requires a 5 Mbps connection, which not everybody will have. 1080p will require infrastructure upgrades from the ISPs.
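For scale, here is the compression ratio that squeezing 720p video into 5 Mbps implies (30 fps and 24-bit color are assumptions; OnLive never published its encoder parameters):

```python
# How much compression does 720p at 30 fps over a 5 Mbps link imply?
width, height, fps = 1280, 720, 30
bits_per_pixel = 24                      # uncompressed RGB

raw_mbps = width * height * bits_per_pixel * fps / 1e6
link_mbps = 5.0
ratio = raw_mbps / link_mbps

print(f"raw: {raw_mbps:.0f} Mbps, needed compression ~{ratio:.0f}:1")
# raw: 664 Mbps, needed compression ~133:1
```

A ~133:1 ratio is within reach of modern codecs, but hitting it with near-zero encode latency is the hard part.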

Requires a subscription, and requires you to buy/rent the games you want to play. Although that is similar to Xbox Live, isn't it?

If people want to sign up there is a beta program - here. I can't since I'm not a Yank.
 
The bandwidth claims are entirely possible.

ATSC HDTV channels (including the -2, -3, etc. off of each station #) are limited to 19Mbps total per channel, and a typical 1080i stream uses less than half of that. DivX and Xvid are both capable of far better compression ratios than the MPEG-2 used for American TV (IIRC European digital cable is MPEG-4 based).

But the latency will make it entirely unusable for games. While bandwidth gets better with new technology, latency does not. You are limited by the speed electricity or even light can travel through the cables (or up to satellites & back). At the speed of light, it takes about 20ms to go from the east coast of the U.S. to the west coast, one way, which means the minimum theoretically possible cross-country ping time is 40-50 ms.
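That 20ms figure checks out once you account for light being slower in glass than in vacuum. The refractive index below is the standard value for silica fiber and the distance is a rough NYC-to-LA estimate, neither taken from this thread:

```python
# One-way propagation delay coast to coast, vacuum vs. optical fiber.
C = 299_792_458                 # m/s, speed of light in vacuum
N_FIBER = 1.468                 # typical refractive index of silica fiber
COAST_TO_COAST_MILES = 2800     # rough NYC-to-LA path distance (assumed)
METERS_PER_MILE = 1609.344

distance_m = COAST_TO_COAST_MILES * METERS_PER_MILE
vacuum_ms = distance_m / C * 1000
fiber_ms = distance_m / (C / N_FIBER) * 1000   # light in glass: c / n

print(f"vacuum: {vacuum_ms:.1f} ms, fiber: {fiber_ms:.1f} ms one-way")
# vacuum: 15.0 ms, fiber: 22.1 ms one-way
```

So before a single router touches the packet, a cross-country round trip in fiber already costs ~44 ms, more than an entire 30 fps frame period.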
 
Requires a subscription and requires you to buy/rent the games you want to play.
They kind of hold you by the balls!
The manifestation of their dream :laugh:.

DivX and Xvid are both capable of far better compression ratios than the MPEG-2 used for American TV
why not H.264?

While bandwidth gets better with new technology, latency does not. You are limited by the speed electricity or even light can travel through the cables (or up to satellites & back).
Yeah, that's exactly it. Though, isn't it that light travels at the same speed as electricity..? Curious..
And I was wondering whether different waveguides lower or raise the wave speed; they might control its strength, but its speed? Any idea?
The speed of light basically differs through different waveguides, or at least through humidity and air:
it is much lower going through the atmosphere than through vacuum.
Any light on how exactly this happens would be strongly appreciated.
As for the traveling time coast to coast in the States, it'll probably be much more:
the infrastructure isn't that straight, of course, and the signal has to go through all the different nodes on its way.
Maybe the best way of testing the approximate speed might be simply pinging even a 50-mile-away address.
Note that the developers are planning on end points at most a thousand miles from the serving centers.
Anyhow, your point is actually the main bugging matter!
This is something that is driving one NUTS!
 
The only way this could work is if hardware was cheap enough to put servers in every neighborhood. That or someone figures out subspace from Star Trek 🙂
 
welcome to viral
If you're claiming that's the reason for opening this thread, then you've got it all wrong.
It was opened in another forum before it came here, to seek out the truth behind it;
at other places it got only one or two responses, very unprofessional, you could even say childish.
About the viral thing:
yeah, it's kind of like that, but the tech is also both interesting and curiosity-creating; it is a kind of a "bomb"..
Pleading "not guilty" to this, not infected (though some would argue that :laugh:), but this is something people are pulling their hair out trying to figure out how it'll work..
I guess this is one of the web's characteristics; it's not exactly viral marketing, though, as this thread started as a query, a piece of research, and not as an enthusiast remark/thought.
Aspects of the possibility of it working were treated with doubt, and questions have been raised to see whether it's a fake or a reality.
It's hard to tell currently about the future, as this project seems to be quite unfamiliar even on renowned forums, and almost no technical details about it have been released, even by the company producing it..
I think it's an interesting subject for anyone who deals at least a bit with computers, as it is aiming to rock the ground underneath (probably) the entire conception of human interaction with the machine, including many "facts" and situations that have already been taken for granted.
This is a revolution, something very unique; we'll have to wait till the upcoming winter to see how exactly it's done.
But let's be patient and wait for it to be released;
it just may be "the next big thing.."🙂.
regards.
 
Heh, anyone remember StreamMyGame? That had noticeable lag even on local gigabit Ethernet; I can't imagine how this would work. And even if the workload is shared between the network and the local machine (say, do shader calculations on the back end and just send display lists to the front end), it wouldn't work well. Remember how much PCI bottlenecked graphics cards? The Internet is slower.

Might be usable for cell-phone games though.
 
It must be possible in a way, although it seems rather not.
Even YouTube streaming is laggy on a 5Mbps connection, though that could be due to the file size..
If they manage a fast, responsive compression algorithm, that'll be awesome; that just might be it.
The only matter that remains is the latency, and there doesn't seem to be any way past it (currently..).
Regarding Mark R's post, it doesn't seem to demand such huge connection and server resources,
as it'll stream through compression.
As Modelworks noted,
they'll probably have to place server infrastructure very nearby,
the connection has to be perfect at all times (which is possible),
and they just have to enable highly preferred lossless compression.
But still the latency remains a flaw
that cannot be overcome..
30 FPS in the current environment
seems almost impossible,
or at least scraping the limits..🙂.
 
They could...

1. incorporate branch prediction, much like CPU cache.
2. pre-render static objects such as walls, obstructions and environment, store them in a cache, and only render dynamic objects
3. apply hit detection, collision, and movement calculations on the client, which then sends only a small subset of information to the central server (prone to hackers)
4. render at low-res, and up-convert video to hi-def

Some speculation. Although I doubt this will ever work for FPS games, I would love to see technology such as this come live.
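Point 3 above is essentially the client-side prediction trick networked games already use to hide round-trip latency. A minimal sketch, with all names, state, and numbers purely illustrative rather than anything OnLive describes:

```python
# Minimal client-side movement prediction with server reconciliation.
# The client applies its own inputs immediately, then later reconciles
# against the authoritative server state, hiding the round-trip delay.
from dataclasses import dataclass

@dataclass
class State:
    x: float = 0.0

def apply_input(state: State, dx: float) -> State:
    return State(state.x + dx)

pending = []          # inputs sent but not yet acknowledged by the server
predicted = State()

for seq, dx in enumerate([1.0, 1.0, 1.0]):      # player holds "right"
    pending.append((seq, dx))
    predicted = apply_input(predicted, dx)      # move immediately, no waiting

# A server ack arrives later: authoritative state after input #0 only.
server_state, last_acked = State(1.0), 0

# Reconcile: restart from the server state, replay unacknowledged inputs.
reconciled = server_state
for seq, dx in pending:
    if seq > last_acked:
        reconciled = apply_input(reconciled, dx)

print(predicted.x, reconciled.x)   # 3.0 3.0 - prediction matched the server
```

The catch, as noted, is that moving simulation to the client is exactly what opens the door to hackers, since the server no longer verifies every step.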
 
It sounds both complicated to create and inaccurate, though possible.
Although I doubt this will ever work for FPS games
Can't tell much about it either..
I would love to see technology such as this come live.
Sharing the same hopes,
that's the spirit!😉
 
I don't see this working for games, but for general computing it could become attractive. If I could ditch my computer and just hook my monitor up to my cable modem to search the net, write programs, etc. etc., I'd be all for that. There would certainly be challenges but in the general case I am definitely interested in this.
 
The problem with these kinds of concepts is that they are a large-scale version of the x86 PC: very powerful hardware connected through relatively narrow buses.

Dedicated hardware will always be faster. Today the problem is not calculation power but the means to get data from one point to another as fast as we want. Take, for example, Intel processors with the FSB: through the use of prefetchers that load data into the local cache before it is needed, Intel has been able to hide the memory bottleneck. These prefetchers analyse the data stream and fetch the data they expect will be needed.

Let's say OnLive uses the same principle. It can only work if the server farm knows in advance what you are going to do, which will limit the choice of games you can play. That certainly means no first-person shooters. Racing would be a possibility. But to compete with any current racing game on a local GPU, you'd need gigabit internet that performs like gigabit intranet.
 
Let's say Onlive uses the same principle.
It cannot use it
(they presented Crysis running on it, from a server 50 miles away),
so that doesn't seem to be it.
You probably won't be able to play any FPS games through this kind of prefetching technology; you'll never be able to foresee the next frame..
It also extensively limits the game graphics, and will probably need a very fast link (not the current internet) to overcome these limitations.
If you ping a very nearby server, let's say some 20-30 miles away, you can easily get a 30ms ping or less; this makes it plausible and feasible.
They claimed servers up to 1000 miles away;
can't see how that's possible..
They could place a few server farms at the center of a big city, supplying the service to up to millions of people..
that is possible,
just probably only in a very nearby ("tight") environment..🙂.
If I could ditch my computer and just hook my monitor up to my cable modem to search the net, write programs, etc.
They're probably going in that direction;
there is a storage@home project incoming (same as F@H but for storage use).
Maybe in the future the entire computer interface will be based on cloud computing,
with everything that's needed shared across interconnected dedicated servers..
If they can overcome latency and the speed of light in that matter...
the possibilities are endless!
There were some rumors (apart from OnLive) about quantum tunneling, which supposedly enables instant particle movement at timeless speed, or through a different dimension than the one light currently travels through, so actually going faster than the speed of light..
There have also been rumors about Yale creating the first quantum processor..
That kind of hardware would need a much faster environment to support it:
cables, HDDs, motherboards, and probably also the internet connection.
Maybe that'll be something the future will bring..? 🙂.
 
Quantum tunneling does not get around the faster-than-light limit on the travel of information. I don't see the market going in a direction that's slower, less efficient, or more expensive.
 
Mutz is either a shill, a troll (what was your name before you were banned?), or a small child. This thread made me face palm in real life.

If the video data is being compressed then you're going to need a hefty CPU to decompress it. H.264 isn't exactly friendly to older CPUs (anything modern will run it, but when it first came out a lot of CPUs in use would choke on it), and you're expecting this compression and decompression to happen in almost real time?
 
I don't see the market going in a direction that's slower, less efficient, or more expensive.
Can't tell much about it, as these were rumors.., not too deeply researched by the writer, and quite new even as ideas..
Just a thought... talking very modestly here.
P.S. - in time things will probably get cheaper..
The current Yale quantum CPU is probably the equivalent of Intel's first CPU, the 4004, or even its prototype, if anything.
Mutz is either a shill, a troll (what was your name before you were banned?), or a small child. This thread made me face palm in real life.
Yeah, I'm a child, a troll, whatever you like, a shill, and you're the master. What shall I do? I'm so dumb, so ashamed; how could it ever be thought that any genuine answer could come from posting such ridiculous thoughts at the temple of the oracles?
One should never do something like that; one should hide underground.
Now, seriously:
H.264 isn't exactly friendly to older cpus
Yeah, that's right; it was a suggestion, maybe better than the MPEG suggested before.
They never released their specs; these were mere assumptions.
As spikespiegal said:
Mutz is confusing the description of the technology, and you guys are making knee-jerk reactions based on billiard table physics that don't apply
It doesn't have to be H.264; they provide a console that'll probably handle the decompression.
Can't tell if they created a new codec;
they claim a highly efficient, very low latency compression algorithm.
As has been said before,
this thread was opened in a few forums and got some unserious comments;
here at least it was discussed, so some sides of this product were illuminated.
Or shouldn't anyone ask any questions, raise any thoughts..?
Or should he otherwise take no interest in any new technologies, because people will hound him for advertising..?
I think this world is not too welcoming (at least some of it..), though,
maybe even too cynical in this era,
at this point..
 
So it's going to be very fast, give the kind of amazing compression you'd need for this kind of system to work, and run on a very cheap console of some sort. If you had come in here asking about the feasibility of such a setup, people would have approached you open-mindedly, but your poor grammar, ridiculous claims about its effects, and pimping of a vaporware project are of course going to cause you to catch flak.

This thread basically boils down to:

"hey look at this amazing technology this company is touting they're going to change the world!!"
"Ugh, that's both physically and economically unfeasible, how are they going to do it?"
"Magic!!!"
"That's not good enough"
"Look they're being funded by big companies and are hyping it up it's got to be for realz!!!"
"Yeah, because no company ever hyped up some completely unreasonable concept to get funding from loose-walleted companies, spent a few years in development always being as vague as possible, and then disappeared into obscurity without ever bringing anything to market"
 
First, thanks for caring, for reaching out.
As a note to what you said,
I think people here were actually quite serious about it;
most of the replies were considerably thoughtful and reasonably contributing.
ridiculous claims about its effects
I think that if it works, it'll have a tremendous impact.
It just seemed like something everybody is impatiently waiting for, excited to see whether it'll work or not.
As a remark, there are better ways to put it..

hey look at this amazing technology this company is touting they're going to change the world!!
^^ look above,

Ugh, that's both physically and economically unfeasible, how are they going to do it?
Economically it probably is, but as RealyScrued said,
we cannot really discuss it, as no prices or hardware requirements have been released..
And physically it does seem possible, not 1000 miles away, but it seems to depend on the infrastructure too.
No, it's not magic..:laugh:
Nice of you to notice..:laugh:,
That's not good enough
Yep, you noticed..🙂
It came into realization eventually..:laugh: 🙂,
Look they're being funded by big companies and are hyping it up it's got to be for real
It just seems that Perlman is really enthusiastic about it; it's the feeling of it 🙂,
like when you buy something new, and you talk to the salesman and read him, see whether he's truthful, whether he's passionate about his product,
or does it just to impress, just to make a deal.
I don't think they'll risk their reputation on it; they seem aware and fully visionary, people not of talk but of deeds, serious guys,
Yeah, because no company ever hyped up some completely unreasonable concept to get funding from loose-walleted companies, spent a few years in development always being as vague as possible, and then disappeared into obscurity without ever bringing anything to market
Yep, RealyScrued pointed that out too,
but this product is just too huge to miss.
It's not like designing a new Xbox or mobile phone;
it's not related 100% to people's likes or dislikes, it's a whole new idea..
from realyscrued (page 1)
Yeah, and surely Nintendo wouldn't have invested in VirtualBoy. Nokia in N-Gage, Apple in the Newton, Phillips, NEC, Panasonic in the 3DO company, yada yada yada.
I don't think one can compare this product to the likes of the N-Gage, the Newton, or the 3DO (which was actually a partnership between LG, Matsushita, AT&T, MCA, Time Warner, and Electronic Arts - per Wikipedia).
This product (not that I know everything about it) is aiming at eliminating piracy..!! and canceling the need for some future upgrades..!!
And don't think only about computer experts, but about the millions who spend a lot of money buying computers they don't really need (except for gaming..!!).
It's organizing the games market so companies will have much better control over the product, which suits these companies' needs,
both because 1. people won't be able to steal their products!!
and 2. they'll be able to watch online, viewing the customers' likes and dislikes about it, and directly be able to fix it..!
This will enable a much stronger bond between the producers and the consumers!
I think it just opens up so many possibilities that seem far away at this point,
possibilities that will pump new blood into this wounded market.
 