Pipeline Dreams

When I read and hear about this fabulous OnLive service that everyone is suddenly talking about at GDC, I’m immediately reminded of the Phantom, which I hasten to point out is still not the centerpiece of my living room. After all, while OnLive may have what the Phantom never did — credibility — the basic idea is now running six years old, and still seems like the stuff of myths and legends that computer engineers tell their kids to put them to sleep at night. Pardon my enormous skepticism, but I believe as much in the practicality of OnLive as I do in the existence of leprechauns and unicorns.

To his credit, founder Steve Perlman is doing a masterful job of publicly selling the idea that this service should be seen as something more than an improbable silicon dream, and the promise of “interactive video compression” technology to deliver high-definition gaming in real time sure sounds fancy. But, really, am I now meant to buy into the idea that some company I’ve never heard of has partnered with the world’s major publishers to deliver gaming from some nebulous cloud of data a thousand miles away at the speed of virtual light, and all on the cheap?

But, even as I question the technology itself, it's in the practicality of building a long-term consumer base that I find my real stumbling block. Call it lack of vision, but I don't see gamers, a historically finicky genus, getting all warm and fuzzy about the idea of relying on their internet connection and flawless latency every time they feel like accessing content for which they have paid top dollar. I can feel the furious heat from a distant future where two hours of downtime means that your entire library of games has been held captive.

But, for the time being let’s give OnLive the benefit of the doubt. Let’s say, for the sake of argument, that in a year or so you can go to the store, buy a fifty dollar piece of equipment, plug it into your television or computer and rely on distant processors and routers to consistently deliver latency-free gaming. Suddenly, everyone with even a rudimentary PC now has convenient, controlled access to a library of games.

Well, that answers everyone’s problems, doesn’t it? I mean, here it is! This is it, the future — the road to the one-console dream.

I long to be enthusiastic. I want to be optimistic. On paper, this seems feasible, not just because it models what seems to be the natural evolution of gaming but because it answers so many issues along the way. At first, I couldn’t quite figure out how this virtual unknown was able to get companies like EA or Take-Two to even give them an appointment, and then I realized exactly how publisher-friendly this kind of platform is. Think about how many problems OnLive potentially addresses:

Lost Revenue to the Used Market
Piracy
DRM
Licensing Fees
The Costs of Cross-platform Development
Scalability

With a proof of concept and a good PowerPoint presentation, it’s really not that surprising that publishers might be falling over themselves at the prospect. And, let’s face it, the timing is right. The major console manufacturers are reeling from the magnificent costs of getting the latest generation rolling. There’s no enthusiasm from consumers or manufacturers to even think about the next generation. No, that doesn’t mean that Sony and Microsoft will be enthusiastic about conceding their hard-fought market share, but it’s not difficult to imagine that if publishers found an alternative that addressed their concerns, their support for traditional consoles might flag.

From an industry point of view, it all makes sense. Practically, however? Well, that’s another story. Now may be the time to begin exploring this kind of service, but it’s hard to believe that the broadband infrastructure is there for enough people to invest and get a good result. Beyond that, there’s the question of adoption from what has become one of the most significant segments of the market, casual gamers. This is exactly the sort of thing they hate about video games. You just get them comfortable with their Wii and all of a sudden you tell them that now they need to do something entirely different and alien. There’s something very hardcore and impersonal about OnLive, and while I can see myself trying the service out, it’s not as easy to imagine a lot of parents and grandparents getting on board with a service-based system delivering games from a network cloud.

Face it, you can't really put OnLive subscriptions under the Christmas Tree.

And, then there’s Nintendo. While I can imagine vague scenarios in which Microsoft and Sony might collapse under pressure from partners and crippled budgets, the Big N by comparison seems indefatigable. Go ahead and try to imagine a realistic scenario where a Mario game is delivered through OnLive, I dare you.

So basically then, this is another online games delivery service, and so I have to ask why this is better than GameTap or Steam. Get right down to it, and they are asking me to pay a premium to subscribe to a service where I can buy the same games for what’s sure to be the same price, but instead of using physical or otherwise relatively permanent media, I must now rely on an internet connection and distant servers to properly process and deliver the content. Remind me again why I would want to pay a fee to not have my games?

It’s a buzzworthy concept that is titillating on the surface, but seems rife with impracticalities when closely examined. I’m just not sure that gamers in general are ready to abandon gaming as a product and look at it instead as an indistinct service.

Comments

TheGameguru wrote:
Indignant wrote:

I know Comcast and other cable providers have placed caps on how much you can download a month. Go over that cap and you have to pay per gig. If OnLive ever comes to light, it could easily get killed by gamers' internet providers hitting them with hefty charges after they go over the cap.

What's stopping them from forming partnerships with various ISPs?

All of the potential heartache over network neutrality wouldn't stop those partnerships, but it could decrease the likelihood that any such partnership would be profitable for OnLive. If the ISPs couldn't charge a premium rate to customers for OnLive service, OnLive would need to compensate the ISPs somehow for that additional bandwidth; to get the extra money for that expense, OnLive would have to pass the buck onto their potential consumers through a higher subscription rate, which just makes the barrier for entry into OnLive even worse.

(Of course, if the political landscape surrounding network neutrality changes in the future, then those partnerships could easily become more agreeable.)

Zelos wrote:

They're (presumably) providing the equivalent of a ~3 GHz CPU, a high-end GPU and 1-2 GB of RAM per user; I doubt a couple of gigabytes of hard disk space per user on the server for saved games is going to be a problem.

I think it's likely a problem if you're going to be playing more than one or two hefty games at once. I know I easily use more than that much disk space, but I could well be a spendthrift that way. I've got a big pile of Oblivion, Gothic 3, and other single-player RPG save games on my gaming PC at home, for example.

And is that "per user" or "per paid account"? If three of the four people in my family want to play these games, do I need a separate account for each, or do we all end up sharing the 2 GB or so?

TheGameguru wrote:
Indignant wrote:

I know Comcast and other cable providers have placed caps on how much you can download a month. Go over that cap and you have to pay per gig. If OnLive ever comes to light, it could easily get killed by gamers' internet providers hitting them with hefty charges after they go over the cap.

What's stopping them from forming partnerships with various ISPs?

They said in the press conference that no such partnerships are currently in the works. Since they're supposed to launch in less than a year, they'd better get cracking on that.

Edited because I've now read what you wrote :)

Thank you for bringing sense to this vaporware.

Anyone who knows anything about networking knows that this is nothing but a pipedream unless there have already been incredible advances in communication technologies (it takes time to roll such technology out). They've said the service can be used on a 1.5 Mbps internet connection, but who has 1.5 Mbps guaranteed to anywhere on the net? The other aspect they fail to discuss is latency, which is the real killer for this idea. Any WoW player knows that latency is the ultimate fun killer.
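A back-of-envelope calculation makes the bandwidth skepticism concrete. Assuming illustrative numbers of my own (720p at 60 frames per second, and a 5 Mbps link for the HD tier, since 1.5 Mbps is presumably the standard-definition figure), the codec would need a compression ratio in the hundreds:

```python
# Back-of-envelope: compression ratio needed to stream 720p60 game video.
# All figures are illustrative assumptions, not OnLive's published specs.
width, height, bytes_per_pixel, fps = 1280, 720, 3, 60

raw_bps = width * height * bytes_per_pixel * 8 * fps   # uncompressed bitrate
link_bps = 5_000_000                                   # assume a 5 Mbps link

ratio = raw_bps / link_bps
print(f"raw stream: {raw_bps / 1e9:.2f} Gbit/s")
print(f"compression needed: {ratio:.0f}:1")
```

Codecs of the H.264 era do hit ratios of that order, but typically by buffering and looking ahead across multiple frames, which is exactly the latency a real-time game cannot afford.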

There is no point thinking beyond these simple facts.

OnLive truly is The Phantom of 2010.

Well, I've had a little change of heart today. The technology may not be as far off as we expect. If you have HD now, when you watch the news it's broadcasting HD to your set at 29.97 frames per second.

The kicker is that games require control. When you are playing a game you are, in a sense, changing the channel multiple times per second. And everyone who's had HD knows changing the channel takes at least a half second.
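The "changing the channel" point can be put in rough numbers. Here is a hypothetical input-to-photon latency budget; every figure is an assumption I've chosen only to show the shape of the problem, not a measured value:

```python
# Hypothetical round-trip latency budget for cloud-rendered gaming.
# Every figure below is an assumed, illustrative value in milliseconds.
budget_ms = {
    "controller input to server": 25,  # one-way network transit
    "game simulation + render":   17,  # one frame at ~60 Hz
    "video encode":                5,  # conservative vs. OnLive's ~1 ms claim
    "server to client":           25,  # one-way network transit
    "decode + display":           17,  # decode plus one display refresh
}
total = sum(budget_ms.values())
print(f"input-to-photon delay: {total} ms")
```

Under these assumptions the player waits close to 90 ms to see any response to a button press, and the only lines in that budget the service can realistically shrink are the two network legs.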

Parallax Abstraction wrote:
TheGameguru wrote:
Indignant wrote:

I know Comcast and other cable providers have placed caps on how much you can download a month. Go over that cap and you have to pay per gig. If OnLive ever comes to light, it could easily get killed by gamers' internet providers hitting them with hefty charges after they go over the cap.

What's stopping them from forming partnerships with various ISPs?

They said in the press conference that no such partnerships are currently in the works. Since they're supposed to launch in less than a year, they'd better get cracking on that.

So what? Are you saying for sure, then, that it's impossible? We've done deals with Comcast in 6 months.

All of the potential heartache over network neutrality wouldn't stop those partnerships, but it could decrease the likelihood that any such partnership would be profitable for OnLive. If the ISPs couldn't charge a premium rate to customers for OnLive service, OnLive would need to compensate the ISPs somehow for that additional bandwidth; to get the extra money for that expense, OnLive would have to pass the buck onto their potential consumers through a higher subscription rate, which just makes the barrier for entry into OnLive even worse.

So you have first hand knowledge of all the cost structures in place?

TheGameguru wrote:
All of the potential heartache over network neutrality wouldn't stop those partnerships, but it could decrease the likelihood that any such partnership would be profitable for OnLive. If the ISPs couldn't charge a premium rate to customers for OnLive service, OnLive would need to compensate the ISPs somehow for that additional bandwidth; to get the extra money for that expense, OnLive would have to pass the buck onto their potential consumers through a higher subscription rate, which just makes the barrier for entry into OnLive even worse.

So you have first hand knowledge of all the cost structures in place?

My point is completely speculative, but I'd like to think that speculation is at least somewhat informed by reality. I would think that any partnership an ISP takes on with OnLive would have to involve some sort of direct monetary compensation from OnLive, since any ISP (in the US, at least) would likely have the FCC yelling at them if they thought they could make up that money from the customer through premium service charges (e.g. the net neutrality complaint that the FCC upheld against Comcast last year).

In this scenario, OnLive will need to find a way to accommodate that payment to their ISP partners ('cause they ain't getting that bandwidth for free) through their business model, which would likely involve passing costs down to the customers. They can either do that through increasing price points across their hardware or by ramping up the subscription fees accordingly, both of which would directly impact any potential customer buy-in.

My point is completely speculative
They can either do that through increasing price points across their hardware or by ramping up the subscription fees accordingly, both of which would directly impact any potential customer buy-in.

It's speculative, but you're sure they only have an A or B option?

I feel absolutely safe in saying that this will never work as advertised.

Why? Because they're doing it all wrong. They're rendering frames and then compressing and delivering the result as video. This is hideously stupid, massively inefficient. They should be delivering the geometry for the remote terminal to render and display. This would be enormously more bandwidth- and latency-effective, and bandwidth and latency are the choke points, not rendering capacity.

And if the remote terminal is running the rendering, why not just put the CPU locally and run the whole game there?

There's a reason why people mostly don't use X Window dedicated terminals anymore; they almost all run local machines with full computing capacity. It's just easier and more effective to have the computing power there instead of somewhere far away.

This solution offers me absolutely nothing I don't already have, except maybe driver headaches. And it imposes a whole bunch of painful compromises.

Not going to fly.

Why? Because they're doing it all wrong. They're rendering frames and then compressing and delivering the result as video. This is hideously stupid, massively inefficient. They should be delivering the geometry for the remote terminal to render and display. This would be enormously more bandwidth- and latency-effective, and bandwidth and latency are the choke points, not rendering capacity.

Not really.. it would be more work doing that split of duties and then attempting to sync the actions.. rather than the way they are doing it.

Not going to fly.

Really? I mean this is getting out of hand. Are we all that smart and successful here that we have figured everything out before everyone else? Is everyone here a multi-millionaire with several successful companies in our past? Is anyone here actually speaking out of any sort of real experience with this sort of technology?

Edit.

I want to add.. that companies and technology like this have the potential to "shake things up", which in the end is always a good thing. Even if it ends up being way too early, this is something we should all watch to see where things end up.. start thinking about how other things will be impacted by this and in what ways it benefits other industries related to the various technology bits.

If I'm Comcast I'm drooling over this.. imagine the ability to sell for another $10 the ability to watch all your subscribed channels in almost real time on any web browser in the world.

I think that the fact we're still talking about this says two things:

1. The idea is interesting
An interesting idea can make people talk and consider the possibilities. This venture has succeeded in doing exactly that, though I wonder exactly why. Doesn't this seem like a lot of talk and not much else?

2. A lack of knowledge makes the idea seem possible
The only reason this idea has gained any traction is because most people have no idea that the technical requirements for this idea to work simply do not exist in the real world (i.e. where consumers/customers might actually use it).

For those wasting valuable brain cells on #1, please see #2. The whole thing just stinks of Phantom 2.0, or is it 3.0?

I think at this point the conversation is dead. I see a lot of arguments as to why people think it will fail with no information to back it up. Just a lot of uneducated guessing as to why you think it won't work, and that doesn't really help the conversation move forward. This thread is going around in circles with everybody pretty much saying the same things.

I think we get it. Some of you think this is going to fail miserably and some think it's not as far fetched as the others assume.

I think Gameguru is right, even if it fails to work as advertised the tech may be useful in a number of different ways in other areas. The company has in the past proven they can develop good tech. So who knows what might happen. The beta will allow people to get first hand experience with it and once that happens we'll know more.

Until then, can we at least agree to disagree?

If they really had a way to encode HD video with 1 ms of added lag, as they claim, while still maintaining high quality, they'd be too busy rolling in the tens or hundreds of millions they could make selling that technology to everyone in the world. They would own the video compression field.

They wouldn't even bother wasting time with games; that's small potatoes.

As some article said, it would be like inventing teleportation, and going into business to compete with Netflix. "Instant delivery of movies through our awesome teleport tech!"

Gaald wrote:

I think at this point the conversation is dead. I see a lot of arguments as to why people think it will fail with no information to back it up. Just a lot of uneducated guessing as to why you think it won't work and that doesn't really help the conversation move forward. This thread is going around in the circles with everybody pretty much saying the same things.

IMO, uneducated guessing is what got this thing made in the first place.

This is a glorified video codec combined with a remote keylogger, and a fat pipe needed to demonstrate its "miraculous power".

The lag is unsolvable. Client prediction is not a solution for lag. Video compression artifacting is unsolvable. The expensive special conditions on the ISPs' side to accommodate it will never be fulfilled, due to the product's absolute lack of niche appeal thanks to the existence of modern video game consoles and computers.

This thing cannot even be used for medical imaging, because those guys are forbidden from using compressed/artifacted imagery.

It's got nowhere to go.

This is the best thread ever. If I feel down I can just read this and feel ten times better about myself.

I hope to god this isn't brought up on next week's conference call.

TheGameguru wrote:

This is the best thread ever. If I feel down I can just read this and feel ten times better about myself. :)

Really? You're going with that?

TheGameguru wrote:
My point is completely speculative
They can either do that through increasing price points across their hardware or by ramping up the subscription fees accordingly, both of which would directly impact any potential customer buy-in.

It's speculative, but you're sure they only have an A or B option?

Okay, what other options do you see in front of them?

And, to clarify further, I'm actually not saying that this thing is doomed from the outset. I agree that it's silly to simply disqualify some of the stuff surrounding the tech out of hand, as though the company has somehow not considered these factors; the product has apparently been in development for something like seven (!) years (if the discussion from the GDC episode of ListenUP is to be believed). I'm just questioning the idea of OnLive being able to leverage ISP partnerships to mitigate the bandwidth-cap issue which, ultimately, only impacts subscribers of particular ISPs anyway.

Certis wrote:
TheGameguru wrote:

This is the best thread ever. If I feel down I can just read this and feel ten times better about myself. :)

Really? You're going with that?

Really? Your first post in this thread is in response to my reply to some of the more ridiculous things posted here? Seriously? Sorry, I'll get on board with the over-the-top uneducated negativity.. since that apparently is what is condoned.

wow.

With respect, you can check your PM. No need to drag any further conversation into the thread.

Here is a nice little piece from Eurogamer that sums up OnLive's current feasibility.

http://www.eurogamer.net/articles/gdc-why-onlive-cant-possibly-work-article

Nice idea, but practically? I mean, let's face it, how many of us are actually happy with the existing speed of our internet connections? So to be reliant on a gaming resource that is dependent on it does seem a pipedream for now, sadly..

But I am still curious to see if they do pull it all together and wish them luck as so much thought and development has gone into it.

I don't think lag is an issue. Most of us here played CS and TFC over a modem dialup, considering ourselves lucky if "56K" delivered at least 42Kbps, and finding ping below 200ms entirely enjoyable, and anything below 100ms suitable enough for competitive clan matches. Besides, as noted above, this business model would look very enticing for all the Comcasts and Akamais of the world to participate in. It should be entirely achievable for them to deliver, say, a sub-10ms roundtrip to an OnLive server farm co-located somewhere very close to you.

But GPUs are indeed not scalable. At the current state of the art, you simply cannot get away from the reality of a GPU-bound app requiring its own GPU. A possible way around this could be using the power of multi-core CPUs and virtualizing GPUs. This is actually the way industry pundits are predicting for the entire graphics industry (although there are no signs so far that this may actually come true anytime soon).

Combined with virtualization and MPP, this could address the insane computational power requirements of GPUs. But it would mean three things:

1) games would most likely have to be modified in order to take full advantage of this
2) Nvidia's and AMD/ATI's graphics businesses would be undercut, and they'd refuse to cooperate
3) in all the likelihood, this stuff will require something a tad beefier than the current datacenter commodity blades

These factors are not individually insurmountable, but together they may just turn out to be a chunk too big for OnLive to chew through.

Gorilla.800.lbs wrote:

I don't think lag is an issue. Most of us here played CS and TFC over a modem dialup, considering ourselves lucky if "56K" delivered at least 42Kbps, and finding ping below 200ms entirely enjoyable, and anything below 100ms suitable enough for competitive clan matches.

Those games have client-side prediction, though, to help counteract the effect of latency.
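For readers unfamiliar with the term, client-side prediction works roughly like the sketch below (names and numbers are made up for illustration): the client applies its own inputs immediately, then reconciles when an authoritative server update arrives. A video-streaming service has no local game state to predict with, which is why this particular escape hatch is closed to OnLive.

```python
# Minimal sketch of client-side prediction with server reconciliation.
# Structure and values are illustrative, not from any real engine.

def apply_input(pos, move):
    """Advance the (toy, one-dimensional) player position by one input."""
    return pos + move

# The client simulates ahead, remembering inputs the server hasn't confirmed.
pos = 0
pending = []  # (sequence_number, move) pairs awaiting server acknowledgment
for seq, move in enumerate([1, 1, 1, 1]):
    pos = apply_input(pos, move)          # respond instantly, no waiting
    pending.append((seq, move))

# A server update arrives: authoritative position after input #1.
server_pos, last_acked = 2, 1

# Reconcile: rewind to the server's state, then replay unacknowledged inputs.
pos = server_pos
pending = [(s, m) for s, m in pending if s > last_acked]
for _, move in pending:
    pos = apply_input(pos, move)

print(pos)  # the client lands back on its locally predicted position
```

The player sees an instant response to every input, and latency only shows up when the server disagrees with the prediction; a pure video stream, by contrast, can show nothing until a frame makes the full round trip.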