Thoughts on OnLive


June 18th, 2010

Today OnLive officially launched. If you haven't heard of it, OnLive is an ambitious new gaming service with a unique proposition: instead of downloading and running games traditionally on your computer, you run the games on their remote data centers and they stream the audiovisual output to your computer. Regardless of whether or not it works in your area, whether the proposition is attractive to you, or even if you are interested in the gaming industry at all -- I think OnLive is worth paying attention to.

It is real!

When it was first announced, it was met with incredulity from pretty much everyone, especially game journalists and game developers. Most stories about it inevitably focused on how unbelievable it was and how it's at least several years too early -- the infrastructure couldn't possibly support it. OnLive's heavy optimism, combined with their policy of secrecy, definitely gave off the fumes of vaporware. If their product was so great, why not let it speak for itself?

Well, today the veil and all media embargoes have been lifted. I have been part of the beta program, and have been playing it on my MacBook Pro and PC pretty extensively, so I thought I'd add my two cents. First of all, let me start with the most important point: this is the real deal. It's not perfect, but at least for beta users in San Francisco, it works pretty well.

At the very least, OnLive serves as a proof of concept that some of the most ambitious, computationally intensive software applications can be virtualized and streamed in real-time at extremely low latencies. What this means for the game industry and consumers is even more fascinating though.

How does it work?

The actual OnLive app for Mac OS X is an 8 MB executable. You launch it, type in your username and password, and hit connect. OnLive authenticates your user account and begins to perform some network tests.

Connecting to OnLive

On my computer, OnLive sends requests to several different IPs at servers in various data centers before deciding on one to use. I assume it is performing some kind of network diagnostics to figure out which route to which server is optimal for my location. For me (Comcast in San Francisco), the client chooses the Santa Clara data center, which is on average a 14 ms round trip for any given packet.
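That selection step could plausibly be sketched like this -- note that the server names, RTT numbers, and probing logic below are all made up for illustration; OnLive's actual client logic is not public:

```python
import random

# Hypothetical candidate data centers -- names and RTTs are invented
# for illustration only.
CANDIDATES = ["santa-clara", "dallas", "virginia"]

def measure_rtt(server: str) -> float:
    """Stand-in for a real ping; returns a simulated RTT in ms.

    A real client would send a timestamped probe packet and time the echo.
    """
    simulated_base = {"santa-clara": 14.0, "dallas": 48.0, "virginia": 82.0}
    return simulated_base[server] + random.uniform(0.0, 2.0)  # a little jitter

def pick_data_center(servers) -> str:
    """Probe each server a few times and choose the lowest median RTT."""
    def median_rtt(server):
        samples = sorted(measure_rtt(server) for _ in range(5))
        return samples[len(samples) // 2]
    return min(servers, key=median_rtt)

print(pick_data_center(CANDIDATES))  # "santa-clara" for this simulation
```

Taking the median of several probes rather than a single sample is one simple way to keep a lone delayed packet from skewing the choice.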

Connecting to OnLive
A successful connection to OnLive, adapted from the OS X Activity Monitor. Note that after the peak, OnLive is using 700 KB/sec.

According to a naive glance at the Mac OS X Activity Monitor, OnLive opens a massive HD stream at 700 KB/sec (about 5.6 megabits per second).

Hold on a second... 700 KB/sec? While people in South Korea, Japan, or many parts of Europe may take that sort of bandwidth for granted (or even orders of magnitude more), in the United States, that is a big deal. I have the cheapest residential Comcast plan possible -- nothing special. So how is this possible? I'll go into that later in this post.
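To put that number in perspective, here's the arithmetic on the 700 KB/sec figure from the Activity Monitor reading above:

```python
stream_kbytes_per_sec = 700  # observed OnLive stream rate

# Convert to megabits per second: 700 KB/s * 8 bits/byte / 1000.
mbits_per_sec = stream_kbytes_per_sec * 8 / 1000
print(mbits_per_sec)  # 5.6

# An hour of play pulls down roughly two and a half gigabytes.
gbytes_per_hour = stream_kbytes_per_sec * 3600 / 1_000_000
print(round(gbytes_per_hour, 2))  # 2.52
```

So a single evening of OnLive gaming moves more data than many people's entire monthly browsing.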

On the other hand, I tried it at a friend's house who has ADSL from AT&T. Even though she lives only a mile away from me, she can barely watch YouTube on her poor connection. Needless to say, OnLive didn't even let me log in under those conditions.

Connecting to OnLive
Trying to log in on a sub-standard connection.

A common point of skepticism about OnLive goes something like this: 700 KB/sec seems unbelievable. Normally, I'm happy to download at around 150 KB/sec! Also, isn't this costing OnLive a fortune in bandwidth fees? There's no way it can be viable, right?

If you are not familiar with content delivery network technology, like Akamai, then that does seem like a deathblow to OnLive. However, here's another way to think about it:

How much does it cost to transfer 100 GB at 100 megabits/second to your roommate's computer? How about a terabyte? A petabyte? The answer is about $10: the cost of a standard Cat5 Ethernet cable.

Now how about to your neighbor one room over in the apartment complex or in a college dorm? Depending on how the network is set up, your packets may not even leave the building and you will be transferring at 100 megabits right on an internal network.

Now how about transferring to the building across the street? If you are both on the same network provider, it's likely that you can still transfer extremely fast. By doing a traceroute, you'll likely see that you are only a few routers away from the other person.

Just like the $10 Cat5 cable in the first example, none of these transfers really cost anyone anything. The infrastructure has already been placed. Any fees that are incurred are the companies amortizing their initial infrastructure investment. In many cases, the pricing is simply marketing: sort of like how phone companies may charge you $0.25 per 160-character text message, while letting you call someone across the country for free (using the bandwidth of tens of thousands of text messages). Data centers typically charge you for a direct pipe to the internet: for example, you reserve a 1 gigabit/second connection, and you are not metered by the gigabyte.
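The text-message comparison holds up to back-of-the-envelope math. Assuming classic uncompressed telephony (G.711 at 64 kbit/s) and the standard 140-byte SMS payload -- both assumptions on my part, not figures from any carrier:

```python
# How many text messages' worth of data does a voice call move per minute?
voice_bits_per_sec = 64_000   # G.711 PCM voice: 64 kbit/s (assumed codec)
sms_payload_bytes = 140       # maximum payload of a single SMS

voice_bytes_per_minute = voice_bits_per_sec / 8 * 60
sms_equivalent_per_minute = voice_bytes_per_minute / sms_payload_bytes

print(int(sms_equivalent_per_minute))  # 3428 messages' worth per minute
```

A few minutes of "free" calling really does carry the data of tens of thousands of $0.25 messages.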

The real problem comes when you try to go through underdeveloped infrastructure. For example, there are only a handful of routers between me and the Santa Clara data center. However, if I do a traceroute to the same data center from my friend's ADSL modem, it balloons to twice as many routers. The more networks you have to pass through, the more likely it is to hit a bottleneck.

OnLive seems to be addressing this by running many geographically distributed data centers and by striking deals with various ISPs. They are very tight-lipped about the specifics. All I can really say is that "it works for me," at least under the light beta load -- we may soon find out whether it works at a larger scale.

How is the latency?

This will definitely depend on where you are and how good your connection to an OnLive data center is. The Santa Clara OnLive data center responds to a ping from me in San Francisco on Comcast in about 14 ms, so I am assuming the latency of the display is roughly that, plus a few more milliseconds for their compression algorithm.

It is definitely noticeable, but I quickly got used to it. I played through the entirety of F.E.A.R. 2 with this latency and it didn't bother me. However, if you have a local copy of F.E.A.R. 2 running and switch back and forth, it takes a moment to adjust to the OnLive version.

Remember when LCD screens first came out and had noticeable display latency? Some even sported a "gamer mode" that made the screen more responsive when enabled. OnLive feels like playing on one of those screens -- it is most definitely playable for me, but if you're used to zero latency, it will feel weird.

Something interesting: when I first heard about OnLive, I thought it would be perfect for games like World of Goo while FPS games would be unplayable. After actually trying it, I found the opposite to be true. Games where you need to track a cursor are very difficult to control with any latency, while I found it easier to get into 3D games like Batman, Borderlands, and F.E.A.R. 2.

Another interesting thing is that OnLive doesn't let you use a Wi-Fi connection, although support is supposed to be coming eventually; you must be connected over Ethernet. I was a little peeved at this, so I tried a pretty simple experiment: I set up an AirPort Express to join my wireless network and then ran an Ethernet cable to the AirPort. This added about 4 ms of latency and caused some jitter when packets were dropped, so I can see why they want you to plug in for the launch.
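Jitter -- the variation in round-trip time, as opposed to the latency itself -- is easy to quantify from a handful of ping samples. A small sketch, with RTT numbers invented to mimic what I saw (a wired link versus the same link through the extra Wi-Fi hop):

```python
import statistics

# Invented RTT samples in milliseconds. The Wi-Fi run includes one
# spike standing in for a dropped-and-retransmitted packet.
wired_rtts = [14.1, 14.3, 13.9, 14.2, 14.0]
wifi_rtts = [18.2, 18.6, 17.9, 45.0, 18.4]

def summarize(rtts):
    """Return (mean latency, jitter), with jitter as the standard deviation."""
    return statistics.mean(rtts), statistics.stdev(rtts)

for name, rtts in [("wired", wired_rtts), ("wifi", wifi_rtts)]:
    mean, jitter = summarize(rtts)
    print(f"{name}: mean={mean:.1f} ms, jitter={jitter:.1f} ms")
```

A stream can buffer around a consistently high latency, but spikes like that 45 ms outlier arrive too late to display, which is presumably why OnLive insists on a wire.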

How do games look?

This is another contentious issue. A game looks like a 720p video of itself: nearly as good as the real thing, but the compression is lossy, so it's not perfect.

It's really easy to tell in a game like World of Goo, which has very precise 2D art. In 3D games, however, the compression is much harder to notice, since the textures are less precise to begin with.

World of Goo on OnLive versus native.

When I looked at F.E.A.R. 2, I made a weird discovery -- the OnLive version is definitely not running at maximum detail, as I had originally assumed. I asked OnLive about this, and they said that they don't necessarily max out the settings; they choose what they feel is a good balance.

F.E.A.R. 2 on OnLive versus the F.E.A.R. 2 demo on my PC at max settings. I was trying to show the OnLive video compression, but it turns out that some of the quality settings are left at their defaults, so the comparison isn't too useful (although that fact is interesting in itself).

It is hard to describe exactly what the compression algorithm is doing, and screenshots are not too useful, since freezing one frame of a moving video is somewhat misleading.
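One way to appreciate how aggressive the compression has to be is to compare the stream to the raw pixel data it replaces. Assuming 720p at 30 frames per second (the frame rate is my assumption; OnLive doesn't publish the exact figure):

```python
# Raw, uncompressed 720p video bandwidth vs. the observed OnLive stream.
width, height = 1280, 720
bytes_per_pixel = 3             # 24-bit color
frames_per_sec = 30             # assumed frame rate

raw_bytes_per_sec = width * height * bytes_per_pixel * frames_per_sec
stream_bytes_per_sec = 700_000  # the ~700 KB/sec observed earlier

compression_ratio = raw_bytes_per_sec / stream_bytes_per_sec
print(f"raw: {raw_bytes_per_sec / 1e6:.0f} MB/s, "
      f"ratio: {compression_ratio:.0f}:1")
```

Squeezing out more than a hundredfold reduction in real time, within a few milliseconds per frame, is why some texture detail inevitably gets smeared.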

What's the point?

So we've established that OnLive is really, really hard to do, but seems to work quite well (via Comcast in San Francisco during the beta, at least). But why bother in the first place? Why is this such a big deal?

Hardware is no longer relevant

The main feature seems to be that OnLive removes the user's hardware from the equation. No matter how obsolete your computer may become, you can count on it being able to play the latest AAA title. The implications of this are massive.

Furthermore, OnLive could theoretically publish games that even the latest desktop computers couldn't reasonably play.

OnLive is its own self contained platform

You "port" a PC game to OnLive and it gets sold to OnLive clients. Meanwhile, the OnLive platform itself runs on Mac OS X and Windows, and soon, independently on your TV. At E3, OnLive demoed it running on the iPad.

Mac OS X users now get access to a ton of games that they wouldn't have otherwise. If OnLive creates a Linux client, the effect will be even more dramatic: Linux users, who are traditionally lucky to see one AAA title per half-decade, would suddenly be treated to a buffet of games.

Of course, as you might imagine, some publishers are not too happy about this:

Mass Effect 2 on OnLive
Mac users are greeted with this when they try to buy Mass Effect 2.

From the OnLive FAQ:

Unfortunately, because of licensing restrictions, we can only offer Mass Effect 2 for play under Windows. So, if you do not have access to a PC, your only option to play it on a Mac is under Windows using Boot Camp or a similar system. We apologize for the inconvenience. OnLive has no other games in the pipeline that are Windows-only, and we do not expect to have any others.

This is ridiculous on so many levels and a great example of why OnLive is so fascinating and controversial. I might be able to virtualize OnLive in Parallels, so that I would be playing Mass Effect 2 through OnLive on Windows running inside of Parallels virtualized on Mac OS X. It feels bad enough when publishers don't make the effort to support Mac OS X and Linux; the fact that EA has actually gone out of its way to make ME2 inaccessible to Mac OnLive users is worth examining in its own blog post.

Games are completely sandboxed

It is a weird feeling to have a game running, not just in a sandbox, but on an entirely different computer. Even though you have 32 bots running around, with tons of explosions and other things that might bring your computer to its knees, OnLive always uses the exact same local resources. The result is that you can cmd-tab out of a game at any point, run it inside of a window, resize the window in real-time, and pause the game no matter what is happening.

We take it for granted that you can do things like watch movies on your computer while chatting with friends, answering emails, or writing code. However, when you are playing a AAA game, it typically takes significant effort to pause the game and tab to another window (if that is even possible). With OnLive all games are good citizens: you can easily pause the game (no matter what is happening) and you can always instantly cmd-tab out of it, or run it in a window to start with.

In fact, you can even encode video, install things in the background, compile code, and perform other tasks that would normally be impossible while playing a game. Depending on the task, you may introduce some latency into OnLive. I was able to encode a video while playing a game and it seemed to be fine, but while capturing the screen with ScreenFlow for the video below, I definitely overloaded my MacBook Pro and added a bit of latency to OnLive.

Demos are painless

For this post, I wanted to do some side-by-side comparison screenshots with Unreal Tournament 3. I don't own the game, so I figured I would get the demo. Let's see how that turned out:

I had to Google around for "Unreal Tournament 3 demo" and do some research. I found a few demo links from various sites, but they appeared to be the beta version. The seventh or so result on Google looked pretty legitimate: UT3 on the NVIDIA site.

I started the 758 MB download (not bad -- the Batman and F.E.A.R. 2 demos weigh in at 2 GB each). A 30-minute download later and I'm in business!

I ran the installer. First, it had to self-extract the highly compressed installer assets (maybe 5 minutes). I had to babysit the installer, agreeing to the EULA and hitting next a few times. Unfortunately, Windows was installing an update in the background, so I had to restart my PC and try again: another 5 minutes to launch, re-extract the installer data files, and manually begin the install.

Even more babysitting was necessary during the install process, when it wanted to install a version of PhysX as well. Not a big deal, but it forces you to monitor the install instead of letting it complete in the background.

Finally -- everything finished installing successfully.

I ran the UT3 demo, and I wish I were joking, but I got an error message saying that the UT3 demo has been tampered with and cannot run. I never managed to run it at all.

Compare to OnLive:

Well, I'll let this YouTube video do the talking. :) It shows the complete launch of OnLive from a cold start to entering the UT3 demo. Note that this video isn't meant to demo OnLive's graphics (it looks kind of weird after being compressed by OnLive, then ScreenFlow, then again by YouTube) or my rusty UT3 skills!

"Installing" and running the UT3 demo in OnLive in seconds.

Oh, I forgot to mention something else: UT3 doesn't actually exist on Mac OS X!

To be fair, I also tried the Batman and F.E.A.R. 2 demos on my PC. These were much easier to get through Steam despite the 2 GB download sizes, and there were no installer issues. Granted, Batman didn't run well on my PC due to system requirements, but F.E.A.R. 2 was great.

Software as a service

As a "software as a service" application, OnLive has a huge amount of flexibility in what it can do. For example, it is possible to buy 3- or 5-day passes for games at a hugely discounted price. Steam and other more traditional distributors can't do this unless they load up the games with heavy DRM; OnLive can do it effortlessly. They also let you play most games for 30 minutes for free (currently, as many times as you want).

The flipside is that you have absolutely no ownership of games. See Stallman's warning about software as a service. This is basically a free software nightmare: you run a small proprietary portal into a world of completely closed applications. The only window between you and your games is a proprietary audiovisual stream.

I have a feeling this is going to be such a contentious issue that it is probably worth a separate blog post, but I am finding it hard to be unhappy about it, because as a consumer you inherently know what you are getting when you use OnLive. It is completely intuitive that you are not the owner of the game and that you are playing it through a thin client. It's akin to watching something on pay-per-view instead of buying a DVD. The DVD will often come with all sorts of EULAs and onerous DRM that try to bully you into something as close to the pay-per-view experience as possible. When you watch something on PPV, you typically have no expectation that you own it, so ownership is not even an issue.

Indie friendly?

World of Goo and Dejobaan's AaaaaAAaaaAAAaaAAAAaAAAAA!!! are already on the service as launch titles, which is a good sign. I briefly talked to OnLive at GDC and the impression I got is that they definitely want indie games, but they have limited resources so can only sign up a few at a time. We'll see if they let Overgrowth or Lugaru on.

Final thoughts

I think that OnLive is the most impressive demo of cloud computing to date. Video games are pretty much the most intensive desktop applications out there, stressing every part of a computer's hardware and requiring updates in real-time. The fact that OnLive has apparently tackled this beast pretty well opens up a ton of possibilities.

I was under the impression -- and probably other people were too -- that OnLive was going to have a grand opening today, but it looks like they will in fact be having a more modest one, letting people in slowly on a first-come, first-served basis.

If you sign up for their "founding members program" you'll get an email explaining:

There are a limited number of available Accounts in each region of the contiguous United States. Founding Member Waiting List registrants deemed eligible for the Offer will be sent email invitations in the order that valid sign-ups were received in regions that become available.

What do you think about OnLive and cloud gaming? Is it going to be a sea change in how we play games, or will it burn through its investor funding before it takes off?