After seven years in development, OnLive, “the most powerful gaming system in the world”, has been launched at GDC in San Francisco and threatens to make existing home consoles obsolete. Nine game publishers have signed up, including Ubisoft, Electronic Arts, THQ and Atari Interactive.
OnLive is the server-based thin-client model that I have mentioned so many times on here. This is what I thought Steam and Xbox Live would evolve into, and now they have been leapfrogged. When this takes off, high street game retail will be dead. Initially it runs on PC (including all those millions of netbooks pouring out of the factories) and Mac, with a cheap and simple OnLive MicroConsole available if you want to use your television instead. Eventually, as I have said before, the electronics will be built into TVs; it would only cost a few cents. The potential here is so enormous, and the capabilities so vast, that it could put everything we have done till now into the shade.
There is the potential here for a community of Facebook proportions, because thousands of players will be on the same server at the same time. For the same reasons a game can be released to a global audience instantly, and all updates and expansions will be distributed just as instantly. So you are always playing the latest games, and the latest versions of games.
Another nice thing is that this could be made to work on smartphones, so you would have a seamless gaming experience across PC, television and mobile phone. Gaming of the highest order would be there on demand, anytime, anywhere, complete with a big and active community. This is exactly what I was expecting technology and enterprise to deliver.
And it is goodbye to piracy. All those people who steal games instead of paying for them will have to go and steal something else instead. Motor cars maybe. Which means that all the people who put their work and lives into creating games will be rewarded properly for their labours. And which makes the whole industry a far more sensible place to invest.
Of course the genie is out of the bottle now. This vision has been taken on board by a vast public. So if OnLive doesn’t succeed then someone else will. And Nintendo, Sony, Microsoft and Steam are all going to have to radically change their business models and their technology if they are not to be left behind.
In this industry we always live in interesting times!
Permalink
I’m highly sceptical about this until I see it in action. Having worked with realtime financial trading systems, I know how difficult it can be to get even small amounts of data around the globe in a timely fashion. They’d need server farms dotted all over the globe, and would have to wait for infrastructure to improve.
Interesting idea, but I’m putting this in the “Phantom” category until we’ve seen it working in the wild.
Permalink
There isn’t anything in OnLive that makes your arguments apply to this service but not to Steam or anything else.
If there is enough bandwidth to stream the entire experience as a video stream, not rendering anything on the local client, then surely there is enough bandwidth to download patches and new content almost instantaneously.
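A back-of-the-envelope check of that comparison, as a minimal sketch: the 5 Mbit/s figure is the stream rate widely reported for OnLive, while the patch size is a made-up illustrative number.

```python
# Rough comparison of streaming bandwidth vs. patch download time.
# The 5 Mbit/s figure is OnLive's widely reported stream rate;
# the 500 MB patch size is purely illustrative.
stream_mbit_s = 5
patch_mb = 500  # hypothetical patch size in megabytes

download_s = patch_mb * 8 / stream_mbit_s  # megabits / (megabits per second)
print(f"A {patch_mb} MB patch at {stream_mbit_s} Mbit/s takes ~{download_s / 60:.0f} minutes")
```

So “almost instantaneously” is optimistic for a big patch, but the point stands relative to shipping discs around the world.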
If the server is powerful enough to stream video to thousands of users sharing “the same server at the same time”, then this same power can be used to let WoW servers handle 50K simultaneous users.
But if it’s not about the video streaming and it’s more a community, massive server thing, then there’s no magic ingredient that prevents Steam or whoever from doing the same thing either. OnLive is just another API/framework/platform then.
Also, the difficulty in instant global release of a game is in localizing text, cutscenes, voice-overs, supporting cultural settings, fancy text rendering, IME work, and testing/QA; the problem is not shipping a couple of discs around the world.
Never mind technical feasibility, buffering vs. latency, image quality, needing to be online over an extremely high-quality link all the time, having to pay for the link _and_ the service _and_ the games…
So yeah, sceptical 🙂
Permalink
Definitely in the Phantom category until they either explain convincingly how they are getting over the input/response latency problem, or show it in action properly.
Permalink
I don’t think that this is the technical mountain that you guys make it out to be. Jagex already deliver the Runescape MMO (upgraded to higher definition last year) and the FunOrb suite of casual games to millions of users worldwide using servers and thin clients.
http://www.jagex.com/
Permalink
Acclaim’s David Perry has also announced he’s been working on a cloud service, by the way, Bruce:
http://www.vg247.com/2009/03/25/gdc-david-perry-confirms-entry-into-cloud-gaming-race/
Watch the GDC presentation; it is impressive, to be honest:
http://www.engadget.com/2009/03/25/video-onlive-streaming-game-demonstrated/#continued
I regrettably gave up PC gaming as I simply couldn’t afford the upgrades (or the technical knowledge) to play the latest games, and have since opted for console. A service such as OnLive would be absolutely superb, although surely the state of the UK’s net infrastructure would prove a hindrance.
I’m not skeptical at all regarding the Cloud. OnLive may not succeed, as I personally think it’s still too early given the state of broadband (primarily in the UK), but eventually all gaming will be Cloud-based, methinks.
Permalink
Lag kills it. End of. It takes about 150ms to ping even huge sites outside the UK, so even if encoding and decoding were instant (they’re not) you’re playing a game nearly a fifth of a second behind. With Forza that’s 50 frames of physics calculations gone by before you even get the chance to react.
For non-arcade games like, say, Peggle, absolutely. For Quake? Forza? FIFA? No.
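The arithmetic above, as a minimal sketch: the 150 ms round trip is the figure from the comment, while the encode and decode times are my own assumed placeholders, not measured numbers.

```python
# Input-to-photon lag estimate. The 150 ms round trip is from the
# comment above; the encode/decode figures are assumed placeholders.
ping_ms = 150      # round trip to a large non-UK site
encode_ms = 20     # server-side video encode (assumption)
decode_ms = 15     # client-side decode and display (assumption)

total_ms = ping_ms + encode_ms + decode_ms
fps = 60           # a typical racing-game display rate
frames_late = total_ms * fps / 1000
print(f"~{total_ms} ms input-to-photon = ~{frames_late:.0f} frames behind at {fps} fps")
```

Even with generous encode/decode assumptions, that is an extra ten-plus displayed frames between your thumb and the screen.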
Permalink
Agreed with Dudley.
At the end of the day, no amount of compression will get light to travel down those fibres more quickly.
As an aside, it takes 6ms for me to ping my own router!
Permalink
Interestingly we got an online multiplayer offroad racing game working very well on a server at Codemasters.
It was called 1nsane, the system it ran on was Codemasters Multiplayer Network and the year was 2001.
Admittedly the client wasn’t very thin but the game ran very well in online multiplayer mode. And compression technology and bandwidths have improved enormously in eight years.
Plus, OnLive has been seen and played by journalists at GDC, who have reported that it works well.
http://www.mad.co.uk/Main/News/Disciplines/Digital/Articles/152702e9b04544788899755ee0c6a429/Codemasters-first-to-market-with-multiplayer-games.html
Permalink
Wow, that was quick. I saw this on the BBC website today and was planning to email you about it, but you beat me to it…
Permalink
“And compression technology and bandwidths have improved enormously in eight years.”
But, Bruce – the speed of light has not.
Permalink
Bruce, Jagex’s technology (and 1nsane) is in no way, shape or form comparable to what OnLive are claiming of their technology.
This is how selling snake oil works. Make it sound plausible to the layperson by making vague (faulty) comparisons to existing tech. Have a physical prototype to show off, and big-name publishers giving their “backing” (i.e. accepting free money).
I’ve seen a few greedstruck business journalists get excited about this. I’ve seen far more developers comment that it faces insurmountable problems (the main one being latency) that will prevent it from ever working as described.
Digital Foundry (probably the most authoritative source on such matters) are going to be giving their view soon – I expect it won’t be positive.
Permalink
Developers have a hard time getting input from the controller to update the world on screen within four or five frames (at a good, stable frame rate). Any slower and the controls feel sluggish.
The above + extra unknown lag = too much delay in response and not much you can do to hide it.
There are smoke-and-mirrors tricks you can use, but they limit the kinds of games that will be able to get away with it.
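That frame budget, as arithmetic: the five-frame local budget is the comment’s own figure, and the 80 ms network delay is the number OnLive themselves quote elsewhere in this thread.

```python
# Controller-to-screen frame budget. The 5-frame local budget is the
# comment's figure; the 80 ms network delay is OnLive's own quoted number.
fps = 60
frame_ms = 1000 / fps          # ~16.7 ms per displayed frame
local_ms = 5 * frame_ms        # the comment's 5-frame local budget

network_ms = 80                # OnLive's quoted round-trip figure
total_frames = (local_ms + network_ms) / frame_ms
print(f"Local ~{local_ms:.0f} ms; add {network_ms} ms of network and "
      f"you're at ~{total_frames:.0f} frames of delay")
```

Roughly double the budget developers already struggle to hit, which is the “too much delay in response” point in numbers.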
Permalink
Seven years of behind-the-scenes work, and internet commentators are able to dismiss OnLive with a breezy “Lag. End of.” or “It’ll never work”.
These OnLive people are obviously idiots who’ve never achieved anything like the monumental feats of these anonymous message board posters.
I am of the opinion that this form of delivery WILL work – eventually.
Who knows if OnLive are a little (or a lot) too early, but the idea of a Cloud-based service which handles processing and input, and relays an image back to the player, feels like the next stage of evolution for this industry.
Just think, these guys may have removed a massive barrier to entry (the $150-$400 console purchase). All they need is to get their technology bundled with a few TV manufacturers and things will get really interesting.
(assuming it all works, of course)
Permalink
Nick, it doesn’t just have to work, it has to work better than the alternative.
The alternative is the ultimate form of compression: send textures and meshes to the client just once, and render there. Sure, it needs horsepower, but GPU power has been going up MUCH faster than bandwidth. This is the crux of the matter: to make OnLive work, it needs massive technical progress on the bandwidth front, but in the meantime notebooks will continue to evolve, until in X years your “thin client” runs circles around today’s HD4870 / GF9.
If it competes at all, it’ll likely be with something that isn’t 3D-sourced: 7th Guest meets Wing Commander cutscenes, or somesuch? The Sims with prepared live segments?
Permalink
If this stuff can be made so cheap it needs to be given away free.
Permalink
We’re terribly close to the 1st of April. 😐
Some say OnLive is aimed at casual gaming, notably to sidestep the question of lag (and the rest) that dogs high-end gaming.
But unless I missed something, there are elements to consider:
1. Many casual games are not Solitaire. They are REAL TIME applications.
There, no matter the power requirements, lag happens.
2. I wonder how you’ll convince mom and dad that they should play their casual game at a low resolution on their super expensive HD TV because there are “packets being lost”…
3. How are you going to explain to your average casual gamer what lag is?
Why would they care? They’re used to their PC, Wii and DS games. Eventually, you may tell them that it’s the future: see that mobile phone game that seems to lag simply because the phone isn’t powerful enough? Well, kinda the same here; there will be lag, regardless of the CPU/GPU requirements.
I really think mom is going to like the idea of having her e-pees take intermittent naps and I surely don’t want to be the one to explain her this stuff, no thanks.
By the way, how the hell can this stop piracy pray tell?
Sure, you cannot pirate THEIR system, but what about the current way: you know, buy a $600-800 machine that does plenty of things, and go torrent some games?
Permalink
Besides, light doesn’t travel at c in optical fibre. Depending on your “mileage”, it can lose a great deal of its speed.
OnLive will be good when computers can process the light signal directly, without conversion, rushing through boosted cables.
Meanwhile, we’ll probably be buying the Playstation 5.
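To put numbers on the fibre point: light in silica fibre travels at roughly c divided by the refractive index (about 1.47), i.e. around two-thirds of its vacuum speed. The cable distance below is an assumed round figure, not a measured route.

```python
# One-way and round-trip delay for light in optical fibre.
# c and the refractive index are physical constants; the cable
# distance is an assumed illustrative figure.
c_km_s = 299_792          # speed of light in vacuum, km/s
n_fibre = 1.47            # typical refractive index of silica fibre
v_fibre_km_s = c_km_s / n_fibre

distance_km = 5_600       # assumed cable length, roughly London to New York
one_way_ms = distance_km / v_fibre_km_s * 1000
print(f"One way ~{one_way_ms:.0f} ms, round trip ~{2 * one_way_ms:.0f} ms "
      f"before any routing, encoding or queuing")
```

And that is the physics floor; real routes add switching, queuing and last-mile delay on top.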
Permalink
http://news.bbc.co.uk/1/hi/technology/7976206.stm
OnLive say it will work:
“We have nine of the largest game publishers in the world signed up.
“They have spent several years in some cases actually going and reviewing our technology before allowing us to associate with their company names and allowing us to have access to their first-tier franchises.”
Permalink
“We are not doing video encoding in the conventional sense,” explained Mr Perlman, dismissing an article in gaming website Eurogamer that said the service was unworkable.
“It’s a very ignorant article,” said Mr Perlman, who said Eurogamer had conflated issues of frame rate and latency.
“They are independent factors,” he said.
FYI, the Eurogamer article:
http://www.eurogamer.net/articles/gdc-why-onlive-cant-possibly-work-article
We can’t tell for sure that OnLive won’t work, but the article is interesting nevertheless and, at least, seems well documented and argued.
We’ll see soon enough if OnLive is the revolution that has been announced!
Permalink
It can’t change the fact that even THEY say 80ms, which would kill Burnout Paradise stone dead.
The tests at GDC were done with dedicated machines in the building (despite what they claimed about external connections, such things are not allowed at GDC in that form), and so of course it CAN work there.
This DOES have a future in local places like hotels. In the outside world, no.
Bruce – yes, online multiplayer can work even up to 300ms or so, because a) that’s not lag on your own input and b) the computer can usually predict the paths of your opponents quite well.
Here it can’t predict or buffer inputs; it’s playing a game and trying to stream 5 Mbit/s of video back to you without buffering or the ability to drop a frame. Even YouTube, with massive, massive servers and video less than a tenth that size, has to buffer ahead and doesn’t always work.
The final killer for the real world is that, if they really have invented encoding this magical, they’d make a billion times more selling it to the likes of YouTube, or even remote trading floors, than they ever will here. The price won’t work either: they have to pay for the computers, the encoding hardware and the massive bandwidth out of subscriptions, so they won’t be price-competitive with consoles.
Nice idea: it will work nicely in hotels, but not worldwide. You can’t ignore the speed of light, which, even if everything else were perfect, means around 100ms of lag between here and New York.
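Pulling the thread’s numbers into one controller-to-screen budget, as a sketch: every figure below is either quoted in the comments above or explicitly marked as my own assumption.

```python
# End-to-end latency budget assembled from figures in this thread.
# Stages marked "assumed" are placeholders, not OnLive's numbers.
budget_ms = {
    "network round trip (OnLive's own 80 ms figure)": 80,
    "server-side video encode (assumed)": 20,
    "client decode and display (assumed)": 15,
    "game's own input-to-render, ~5 frames at 60 fps": 83,
}
total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{ms:>4} ms  {stage}")
print(f"{total:>4} ms  total controller-to-screen")
```

Around 200 ms in total, against the roughly five frames (~83 ms) a local console manages; whether that is playable depends entirely on genre, which is exactly the Peggle-versus-Forza argument made above.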