OnLive and cloud gaming

So OnLive is having some trouble, leading a lot of people to proclaim the death of cloud gaming. Not me. I’m pretty convinced that cloud gaming is the inevitable future; the only question is one of timing.

First, let’s just talk about the technical issue of latency. It’s not solved right now, but it’s not a fundamental law of physics like some people make it out to be. Plenty of games released today have more than 100ms of latency, most of which is avoidable through software, and some of which can be avoided with better hardware and protocols (e.g. higher refresh rates for monitors). The point is that we could probably get the engine+display side of the latency down to ~25-30ms if we really wanted to: 16.7ms for the game engine with no game/render thread pipelining (use fork/join parallelism instead of pipelined concurrency), plus a few extra milliseconds to keep the GPU slightly behind the CPU so it doesn’t starve, plus new hardware for low-latency transfer to and display on the screen. That gives us more than enough leeway to add another 20ms or so for network traffic and low-latency compression. Go to speedtest.net and check your ping to a nearby server; I bet you it’s less than 20ms. Altogether, 40-50ms is totally doable, which is competitive with the very lowest-latency first person shooters played locally today.
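
Just to make that budget concrete, here’s a back-of-the-envelope sketch in Python. The individual figures are rough assumptions pulled from the estimates above, not measurements:

```python
# Back-of-the-envelope latency budget. Every figure here is a rough
# assumption taken from the estimates in the paragraph above.
budget_ms = {
    "game engine frame (60 Hz, no game/render pipelining)": 16.7,
    "GPU trailing the CPU so it never starves":              4.0,
    "low-latency video capture + encode":                    5.0,
    "network round trip to a nearby data center":           15.0,
    "decode + low-latency display":                          5.0,
}

for stage, ms in budget_ms.items():
    print(f"{stage:55s} {ms:5.1f} ms")
print(f"{'total':55s} {sum(budget_ms.values()):5.1f} ms")  # ~45 ms, inside the 40-50 ms target
```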

So, the technical issues can be solved if we really want to. What about the business side? I like to structure my thinking on this in a sort of “incremental” way, starting from how the console business already works. So, imagine that the console cycle after the next one rolls out. You go and pay your five hundred bucks for a console (equivalent to a mid-range PC, power-wise), but instead of actually getting a physical console in your hands, you get a video receiver and a controller, and your console is put into a data center near your house.

So far, the business model hasn’t changed much w.r.t. the normal console business. Each customer gets a mid-range PC, with some custom bits, but we’ve added the cost of video transfer hardware and bandwidth. So really we’re losing money overall at this point.

The next step is to exploit the fact that we’re in a data center rather than in people’s homes. Get rid of all the sleek packaging, miniaturized hardware, cooling solutions etc. that come with being consumer hardware. We no longer care about fan noise or slight bulkiness; just get the cheapest parts that will do the job. This alone probably saves more than enough to cover the cost of the extra streaming hardware/software.

Next, when consoles fail (as they will, to some degree), you can just transfer customers transparently over to another box in the data center. No support calls, no expensive shipping, no PR nightmares. Another huge source of savings.

Next up, combine multiple consoles onto the same machine. Amortize the cost of storage, networking, cooling, etc. In fact, games for this platform could be careful to mark their read-only data pages (textures, audio, geometry, etc.), and the OS could share those pages between game instances and reduce overall memory usage. This saves more money.
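
As a minimal sketch of that page-sharing idea (assuming a hypothetical packed asset file, assets.pak): if every game instance memory-maps its read-only assets from the same file instead of copying them into private heap memory, the OS can back all of them with the same physical pages.

```python
import mmap
import os

def map_readonly_assets(path="assets.pak"):
    """Map a (hypothetical) packed asset file read-only.

    Every process that maps the same file read-only is backed by the same
    page-cache pages, so N game instances pay for the data roughly once.
    """
    fd = os.open(path, os.O_RDONLY)
    try:
        size = os.fstat(fd).st_size
        return mmap.mmap(fd, size, access=mmap.ACCESS_READ)
    finally:
        os.close(fd)  # the mapping stays valid after the descriptor is closed
```

Whether a real platform would do this at the page level, the file level, or in the hypervisor is an open design question; the point is just that the mechanism already exists in commodity operating systems.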

Next, move cooling and power supplies out to a central system per data center. Having tight control over the climate means you can relax the environmental tolerances of every component, and you probably save some money by not having each box be a consumer-grade cooled system that has to cope with highly variable environments.

Here’s the big one: piracy drops to 0% for any title exclusive to the platform. If all a customer ever sees is a video stream, they can’t possibly pirate it. I suspect this will benefit both consumers and producers. It’s unclear what the market dynamic of piracy actually does to retail prices; I suspect it pushes them artificially higher than they would be if consumer spending were the only factor.

Next, the obvious step of oversubscribing each box. If you’re playing an indie game that only uses 20% of the box, four other people can use the same hardware at the same time. Presumably the platform would charge developers higher licensing fees based on how much performance they use. Find some conservative bound (based on peak numbers for AAA releases) for your “customers : consoles” ratio, maybe 2:1, and all of a sudden you’ve saved a boatload of money by simply not having one console per customer.
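
A toy version of that provisioning arithmetic, with made-up player counts and per-instance usage fractions purely for illustration:

```python
# Made-up illustrative numbers; the point is only the shape of the calculation.
titles = {
    "AAA shooter": {"players": 400, "box_fraction": 1.0},  # needs a whole box
    "mid-tier":    {"players": 350, "box_fraction": 0.5},
    "indie":       {"players": 250, "box_fraction": 0.2},
}

total_players = sum(t["players"] for t in titles.values())
# Conservative bound: everyone online at once, every title at its peak usage.
boxes_needed  = sum(t["players"] * t["box_fraction"] for t in titles.values())

print(f"{total_players} customers on {boxes_needed:.0f} boxes "
      f"= {total_players / boxes_needed:.1f} : 1")  # 1000 on 625 boxes = 1.6 : 1
```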

Finally, any remaining cycles can be plugged into existing cloud computing for businesses, web sites, etc. Gaming tends to have clear “peak hours”, so for work that has no latency requirements (typical cloud compute stuff) you just shuffle it around the time zones until you find a place where gaming isn’t peaking and there are enough cycles to spare. Effectively, business computing ends up subsidizing entertainment computing.
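
A toy sketch of that “follow the quiet hours” routing; the regions, UTC offsets, and peak window are all assumptions for the sake of the example:

```python
from datetime import datetime, timedelta, timezone

REGIONS = {"us-east": -5, "eu-west": 0, "ap-east": 8}  # region -> rough UTC offset
PEAK_HOURS = range(17, 24)  # assume gaming peaks roughly 5 pm to midnight, local time

def off_peak_regions(now_utc=None):
    """Return the regions whose local time is outside the gaming peak,
    i.e. where spare cycles can run latency-insensitive batch work."""
    now_utc = now_utc or datetime.now(timezone.utc)
    return [region
            for region, offset in REGIONS.items()
            if (now_utc + timedelta(hours=offset)).hour not in PEAK_HOURS]

print(off_peak_regions())
```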

We haven’t even begun to talk about the consumer benefits here. Instant access to any game (each data center could keep a memory image of the most popular games already loaded up, ready to launch with copy-on-write semantics), no massive console taking up space, no worrying about hardware breakage, transparent and incremental hardware upgrades (it would start as simply allowing more people per box, but eventually high-end games could start taking more resources per instance), etc. This is a good thing.
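
For the ready-to-go memory image, one way to get those copy-on-write semantics on a Unix-like host is the classic zygote pattern: keep one warm process per popular title and fork() it for each new session. This is only a sketch under that assumption; load_game_assets and run_session are hypothetical stand-ins:

```python
import os

def load_game_assets():
    # Hypothetical: pretend this is the expensive load of a big, read-mostly image.
    return bytes(64 * 1024 * 1024)

def run_session(assets, player_id):
    # Hypothetical: the actual game session would run here.
    print(f"player {player_id} started with {len(assets) // 2**20} MB already warm")

def zygote_loop(wait_for_player):
    assets = load_game_assets()      # paid once, when the image is warmed up
    while True:
        player_id = wait_for_player()
        if os.fork() == 0:           # child starts instantly; pages are shared
            run_session(assets, player_id)   # copy-on-write until written to
            os._exit(0)
```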

Anyway, this was supposed to be a quick note. Suffice it to say I think OnLive is on to something. They may not have timed it right, but I do think this is inevitable, and we shouldn’t read too much into their current troubles.
