All For One

At first glance it might seem convenient to occupy a world where all my gaming occurred on a single platform. It is a dream convincingly illustrated by industry insiders from David Jaffe and Denis Dyack to Gerhard Florin and Keita Takahashi: the promise of high-quality games at low price points, which is the sort of line we're fed every time the industry wants to lurch down some new path of supposed innovation. But, in a rare twist on the old story, the numbers behind the one-console philosophy actually support a scenario where the cost of doing business would drop so dramatically, and the market would open so wide, that it's tempting to believe we would be the beneficiaries.

I think Silicon Knights' Denis Dyack provides perhaps the most convincing and reasonable argument for the potential of a single platform, which may be why he's become the concept's knight in shining armor. Dyack not only posits that this is a potentially positive force, but suggests that the industry will inevitably move to a single platform standard. For a gamer like me, who owns a PC, Xbox 360, PlayStation 3 and a DS, the idea of reducing to a single platform – potentially one on the low end of the cost spectrum – comes with roughly a $1,000 head start.

So would it work?

Simplistic arguments are too often the most vulnerable to the subtleties of real-world application, so when I tell you that Dyack's arguments incorporate complicated concepts that can't be easily explained without a correspondence course in economics, see it as both blessing and curse. Theories of commodification and performance oversupply may make for poor pull quotes, but they are powerful, tidal forces in the business of technology. Dyack lives, breathes and makes video games in the real world, where nothing that might spin the momentum of a multi-billion dollar industry is ever simple.

So I asked an expert to put these theories into something I could wrap my noodle around. Here's how I understand the inevitability of a one-console standard as Dyack describes it.

This is a future where the manufacture of video game hardware looks a lot like the market for DVD players. There would exist a single standard on which the platform is built, and hardware manufacturers like Microsoft or Sony would be analogs to Toshiba, Philips, Panasonic and others in the DVD market. Dyack puts the cell phone industry forward as a prime example of commodification and an analog for the video game industry: companies still compete on hardware, but it is all built off compatible standards. In the way that you can receive a call across cell phone brands or watch a DVD on basically any player, so too would gaming move to a single console standard shared across hardware made by Sony, Microsoft, Nintendo, Panasonic or Toshiba.

The knee-jerk concern is that a console standard would somehow stifle competition, but the historical evidence of commodification doesn't really support the fear. Tech sectors aren't burdened by standards, and just as Motorola, Apple and other cell phone manufacturers still manage healthy competition, so too would the players in the gaming industry. We are not talking about a scenario where, say, Microsoft manages to corner the market and everyone is buying Xbox 360s. If anything, an open standard would be a force for reducing manufacturing costs, opening up the market to more competition. On the development and publishing end, things get even better.

A standardized system of tools for creating games on a single console would dramatically lower the costs of development. We already see evidence of this in middleware like Havok or DirectX, which saves developers money and man-hours by providing standardized tools for delivering content in games. In an environment where it costs millions of dollars just to put the tools in place to begin development of current-gen (formerly known as next-gen) games, the idea of standardized tools and a unified consumer base – where you no longer have to support multiple platforms to maximize sell-through – is incredibly enticing.

The benefits for smaller development houses are even more staggering. Taking digital art as a rough analog, imagine that a JPG, PNG or TIFF is the equivalent of a video game. With those standards in place, a digital artist has options for creating his work that range from expensive proprietary software like Photoshop to open-source tools like GIMP. By adopting the standards – and, as a by-product, eliminating the approval process necessary to get a game on a major console – the creators of products are back in the driver's seat.

And, for consumers, the traditional promise of lower prices remains the carrot perpetually dangled from the end of that stick of promises. But from a broader perspective, gone would be the days of exclusives and the necessity of purchasing multiple platforms, and we would potentially see more inventive and innovative games as developers no longer need a bona fide hit every time they come up to bat. Along with the traditional AAA multi-million dollar budgeted games, we'd see the potential for a broader range of content as small developers became able to experiment, similar to what we see from companies like PopCap and GarageGames.

It could be, as proponents of the single console standard point out, a win-win-win scenario.

But before we march onward to digital utopia, a moment must be given to the other side of the equation. While a lot of good may come from the commodification of the gaming industry, I'd argue that Dyack is premature to imply that the forces driving the industry are inexorable in their tidal surge toward the one-console standard. And, in fact, that might not be a bad thing.

We can see standardization at work now in the high-definition DVD war, which will ultimately settle on either HD-DVD, Blu-Ray or some combination of the two as the long-term standard, just as VHS beat out Betamax a few decades ago. This force is so strong because, from a consumer's point of view, the difference between the two formats is not significant. But is it so easy to draw such distinct lines in the sand between the 360 and the PS3, or more dramatically still, the PS3 and the Wii? The argument seems to be, historically speaking at least, that as these systems add features like internet connectivity, DVD playback, media streaming and so on, they move toward commodification by shifting the competition away from what a video game is and into the realm of what features you can add onto this standard idea of a video game. But put three consumers in a room – one who plays Wii Sports, one who plays JRPGs and a third who plays Halo 3 competitively online – and ask them to describe a video game, and you might be hard pressed to find any kind of standard. The market for video games is, if anything, more divided on the point than ever, as three strong systems with marked differences compete in a market that seems as far from standardization as ever.

And as for the hardware manufacturers themselves, they are only interested in standardization if their hardware becomes the standard. Simply put, the strongest argument against the feasibility of commodification is the intense stubbornness of the players involved. Where publishers are lukewarm on the idea at best, and dismissive of it otherwise, and retailers are enjoying the benefits of a nice hot console war, the only people in the industry with a convincing stake in the matter are people exactly like Denis Dyack. That's not a criticism by a mile; an industry where smaller independent developers find strong footing on an even playing field is certainly attractive, but without a strong push to standardize from consumers, retailers, publishers and, to some degree, even the hardware manufacturers themselves, this seems like a nice dream at best.

And, for us consumers, the price of standardization would not be a lack of competition but a lack of innovation. Let's not forget, first of all, how long consumers resisted the transition just from VHS to DVD; now imagine, with one standard for gaming, how drawn out the transition between gaming generations might become. Standardization encourages stagnation, and developers, publishers and manufacturers are unlikely to keep pushing the borders of new technologies unless there is competition over hardware. Dyack's concerns about performance oversupply may eventually become valid as the technology leaps between generations deteriorate, much as has occurred in the processor market, but the difference between a PS2 and a PS3 is still great enough to be significant. The competition between Microsoft, Sony and Nintendo remains an innovative, if expensive, force.

The single console argument finds some strength in claiming that simplifying the market would benefit a broader consumer base, but in practical terms it's hard to suggest that the video game industry is suffering on the heels of yet another record-breaking year. Consumers may grudgingly be forced to choose which console they will spend their money on, but consumers would have to start choosing not to spend their money at all, or overwhelmingly force the issue in a single direction, for natural commodification to occur. Despite the PS3's disappointing performance versus its lofty expectations, even that system has seen broad adoption. Standardization of gaming platforms is not something the market is demanding.

It is nice to imagine that prices would drop with a standard, but I would argue that the competition between console makers this generation alone has dropped prices as fast as or faster than any kind of enforced standard would. Even cell phones remain expensive propositions, usually requiring multi-year commitments for basic, entry-level phones, and one can spend as much on a decent BlackBerry or iPhone as on an Xbox 360. DVD players aren't exactly handed out in Cracker Jack boxes either, and they took years to drop in price, compared to a PlayStation 3 that has enjoyed multiple price drops and is among the cheapest Blu-Ray players on the market.

So, as consumers, what the one-console revolution might bring us is basically the same games we get now, for basically the same price, with less hardware innovation and a broader focus on mass market appeal. In the same way that watching the industry become as bloated and vacuous as the movie industry would be like watching your kid grow up to become a divorce lawyer (apologies to divorce lawyers, but it's not like I'm the first guy to call you out), the commodification of gaming might not be an ideal option. Frankly, I like that video games push boundaries and exist on the bleeding edge of technology. I remain unconvinced that the gains from making life easier for developers and publishers would outweigh the potential stagnation of the industry – which is, after all, about software more than hardware – or dramatically benefit consumers. More than that, the entrenched conflict in the current generation makes the entire argument seem moot, since no company is likely to agree on any kind of standard unless it's theirs.

To me, the one-console world seems like a pipe dream, and as a consumer, not even a very good one.

Comments

PyromanFO wrote:
Elysium wrote:

PCs aren't standardized in the way I'm talking about here.

But they are standardized in the way Denis Dyack is talking about: DirectX and Windows.

/concur
I totally agree. These standards make things playable on infinitely customizable hardware.

boogle wrote:

I totally agree. These standards make things playable on infinitely customizable hardware.

Ever since the days of "open systems" in the 80s this has been a lie propagated by people who have an interest in making you think standards are good and you can just put together commodity pieces to build anything you want.

The problem is that margins on non-commodity items are always higher than those on generics, so this grand dream has never, ever, come to pass.

The modern PC is just another example of this. Those software standards ultimately depend on vendor-specific pieces of software (drivers, compatibility layers, etc) that promise interoperability but never quite deliver.

Hence the constant chance that any given game simply will not run on your "standard" software platform because of the configuration of the hardware underneath.

In contrast, all PS2s are for all practical purposes exactly the same.

A true turnkey hardware system would require that all the various vendors get on board and give up whatever distinguishes them from the standard. Given that Sony, Microsoft and Nintendo are at least partly in the business of selling hardware, I don't see why they'd be motivated to do this any more than the Unix vendors of the 80s were.

The two biggest problems for a unified console will come from Microsoft and Nintendo.

Nintendo actually makes money off hardware sales so as long as they feel they could make more money or have a more constant revenue stream by doing it themselves, they will. I can see Microsoft and Sony still blasting each other over their superiority despite the manufactured goods being nigh identical. And thus, I can see Nintendo making more money as they aren't investing countless millions into that moot marketing campaign.

Microsoft will insist that the OS be a flavor of Windows. I can't imagine Sony or Nintendo having any inclination to accept this. I think a unified console should run on some sort of Linux/Unix. So Microsoft would have to change Windows CE to run a lot like Linux for this to be remotely possible.

I also think at some point the console will diverge again because of Nintendo's penchant for unique control schemes. It may as well be a different console if one party creates unique games that only work with a unique control scheme that is supported by said party.

When game developers talk of standardization, they're talking about a virtual machine to write games against. This is how cell phones and HD-DVD/Blu-Ray work now. Standardization for software is not done at the hardware level anymore. You have a piece of virtual machine software and a standard for that virtual machine, and then anyone can implement the hardware any way they like, as long as it can execute the basic instructions in the virtual machine format. For cell phones and Blu-Ray the virtual machine is Java, for instance. A lot of the PC's problems come from trying to standardize on the actual machine instead of on a virtual machine like .NET or Java. Microsoft seems to be trying to change this with XNA, which runs game code on a .NET virtual machine.
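The virtual-machine idea above can be sketched in miniature. Everything here is hypothetical – a toy instruction set invented for illustration, not how any real console, Java ME or .NET runtime works – but it shows the core point: the *standard* is the instruction format, and each vendor is free to implement the interpreter (or hardware) underneath it however they like.

```python
# Toy illustration of VM-level standardization (all names hypothetical).
# The "game" ships as bytecode for a tiny stack machine; any vendor can
# ship their own conforming implementation of run() and the same game
# binary works unchanged on all of them.

def run(bytecode):
    """One vendor's implementation of the (hypothetical) VM spec."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":          # push a constant onto the stack
            stack.append(arg)
        elif op == "ADD":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":         # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()

# The same "game" bytecode, independent of any vendor's hardware.
game = [("PUSH", 6), ("PUSH", 7), ("MUL", None)]
print(run(game))  # 42
```

Another vendor could implement `run()` in silicon, or as a JIT compiler, and the `game` bytecode would not change – which is the sense in which cell phone Java and Blu-ray's Java layer standardize software without standardizing hardware.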

Computer games, or at least the kinds of computer games that most of us here are interested in playing, require substantial computational horsepower. Games, and maybe video editing, are among the only things the general public will use their computers for that actually tax the limits of their machines' processing abilities. Virtual machines have a tremendous processing overhead that negates their usefulness for anything other than casual games. When was the last time you saw a dual Mac/Windows release running in a Java virtual environment?

Furthermore, more computational power translates directly to new/innovative/different game experiences.

CD/VHS/DVD/HD-DVD/Blu-Ray/cellphone standardization is only possible because those types of applications have fixed and defined processing requirements.

The only way game console standardization will happen is if computers stop getting faster and more capacious. That's probably not going to happen within our lifetimes.

Furthermore, more computational power translates directly to new/innovative/different game experiences.

It does? How?

Given the choice between, say, Dwarf Fortress on the one hand and Crysis on the other, I wouldn't hesitate to pick the one that doesn't require a $3500 homebuilt machine to run.

I don't argue that more computational power may sometimes translate to different game experiences, but I'd say that "innovative" experiences generally come from innovative use of existing resources, rather than just putting a bigger engine in the same boring car. Hasn't the Wii taught us anything?

Actually, I can see the PC coming back in vogue as the standardized gaming platform. I mean, what's the Wii proving right now? That graphics aren't everything for many people, and that you can have fun without state-of-the-art graphics.

What happens when integrated graphics chips in 2 or 3 years are as powerful as today's 8800 GT GPUs? I know graphics can always get better, but, as the Wii has shown, at some point they become good enough, and game design really takes over, along with other factors.

And so at some point everyone has a pc that can do 8800gt graphics. Why do they then need to buy a console?

Man if they make laptops super easy to hook up to TVs then you have a console already and can download games to it from the 'net. Laptops are getting more and more popular.

A wireless gamepad is all you'd need.

Look at how many play WoW today: 2.3 million current subscribers in the US, and it's a 3+ year old game. It can run decently on integrated graphics on a MacBook afaik.

psu_13 wrote:
boogle wrote:

I totally agree. These standards make things playable on infinitely customizable hardware.

Ever since the days of "open systems" in the 80s this has been a lie propagated by people who have an interest in making you think standards are good and you can just put together commodity pieces to build anything you want.

The problem is that margins on non-commodity items are always higher than those on generics so this has grand dream has never, ever, come to pass.

The modern PC is just another example of this. Those software standards ultimately depend on vendor-specific pieces of software (drivers, compatibility layers, etc) that promise interoperability but never quite deliver.

Hence the constant chance that any given game simply will not run on your "standard" software platform because of the configuration of the hardware underneath.

Regardless of hardware, the point is that the drivers allow it all to communicate with a common OS and hence act as a unified system while maintaining hardware plurality. The fact that there are some compatibility issues is mere testament to the evolving nature of this cornucopia of hardware. The main problem currently, as trip1ex points out above, is that some developers are obsessed with the bleeding edge rather than the low end, where the majority of systems are.

It does? How?
Given the choice between, say, Dwarf Fortress on the one hand and Crysis on the other, I wouldn't hesitate to pick the one that doesn't require a $3500 homebuilt machine to run.

Dwarf Fortress wouldn't run on a C-64. It wouldn't run on a Nintendo DS. Its rogue-like graphics disguise the fact that it is doing a ton of computational upkeep in real time.

The last two major game genres, the RTS and the FPS, were directly enabled by the development of computers powerful enough to run them.

Even the most advanced current games have only the most trivial and limited physics simulations. Sandbox games rely on a lot of smoke and mirrors to hide the fact that you can interact with only the minuscule portion of the game world directly surrounding the player.

It's also worth noting that a $1200 rig will run Crysis almost as well as a $3500 one...

Why are people still overblowing the cost of computers? It's not nearly as expensive to build a great gaming rig as people keep saying it is.

Thin_J wrote:

It's also worth noting that a $1200 rig will run Crysis almost as well as a $3500 one...

Why are people still overblowing the cost of computers? It's not nearly as expensive to build a great gaming rig as people keep saying it is.

Aye, my recently bought rig (for $1200 no less!) actually set Crysis to "Very High" settings when I hit the optimize button. Made me happy. I didn't even have to build it!

I don't argue that the tools aren't there to move this forward, but as you point out in the very next paragraph the will to do it is completely lacking on all ends of the spectrum.

Not true; what I said was that every developer and publisher will be screaming at the top of their lungs for this. All it takes is one hardware manufacturer to come on board for it to happen. And whether or not that happens, it's still going to happen on the back end regardless, which is only going to increase the temptation for someone to make a console based on the VM and allow every game they can get their hands on to work with it.

Either way it's going to make it cheaper to make games, and that benefits the consumer whether they drop the price or not. Cheaper games mean less risk for publishers, which means more innovation and more homebrew/indie gaming.

Also, I'm not sure that the fact that some games are made with UE3 means it's a standard exactly. It's a small-scale example of the sort of large-scale possibilities that exist with a standard – like, say, DirectX, which is a standard for Windows game development (also, you're leaving out that there's a lot of personal computer architecture out there that isn't standardized).

There is no PC gaming hardware that is not standardized around the DirectX interface. Unless it supports DirectX, it's useless.

Computer games, or at least the kinds of computer games that most of us here are interested in playing, require substantial computational horsepower. Games, and maybe video editing, are among the only things the general public will use their computers for that actually tax the limits of their machines' processing abilities. Virtual machines have a tremendous processing overhead that negates their usefulness for anything other than casual games. When was the last time you saw a dual Mac/Windows release running in a Java virtual environment?

Virtual machines do not necessarily mean a hit in performance. You build an actual machine to conform to the virtual machine spec; it then runs as fast as "natively compiled" x86 code, using the virtual part only on non-performance-critical sections. There are tons of cell phones that implement Java in hardware, for instance. I think you'd be surprised, if you looked at an enterprise server room, just how much of it is running on virtualized hardware.

Please don't use cell phone software as a way to show off the right way to do a mono-platform setup for the industry. It's a lot more nightmarish on the coding side than you believe.

I have a pretty decent idea of how cellphone development works, and the fact that cellphone game developers have to worry about things like screen size and processor power instead of things like learning a different language for every cellphone just shows you how well the Java VM has worked for the cellphone platform. The only problem with cellphone development is the fact that every single person has one. If only all game developers had this problem. And while there are a zillion types of cellphones, the fact that you can even support them all in the same language is amazing in itself.

Denis Dyack's examples of DVD players and cell phones are poor examples of how standardization would apply to game consoles. In the case of DVD players, streaming MPEG-2 video off a standard disc format is a whole, whole lot simpler than a whole series of standardized APIs for all the aspects of gaming.

As for cell phones, the fact that different cell phones can all make and receive calls on the same network does not mean they are compatible to run the same software. He is mistaking protocol compatibility for software execution compatibility. The assumption that implementing the same call handling protocol makes them software compatible is less valid than assuming that all devices with a web browser handling HTTP & HTML are software compatible with each other. Cell phone operating systems and programmability are much more fragmented than in the PC space.

As was said earlier, the layer of compatibility is essentially a VM, and in the case of cell phones a Java VM is common. However, the Java that is commonly implemented on cell phones is Java ME. ME has a horribly minimal API, compared to what is needed for a decent game. It becomes the lowest common denominator of functionality. And despite minimal API and standardization, apps are not without tweaking from device to device.

All I know is that after buying a console I have saved myself twice that amount on PC upgrades. If there were a single standard and playing video games were as generic as playing a DVD, then gaming would be easier. Of course, the hackers would have a field day.

I can't see standardization in the way of just one console, but I can see standardization where the underlying technology and coding are the same, and where you could theoretically have various brands with their own console (differentiating on design, specific features, etc.) while a game would work in any of the consoles you threw it in, and even in a PC.
That would be great in many ways, even if unlikely.
The only negative aspect I could see would be the dumbing down of PC gaming (yeah, I'm a PC fanboy...) to the standard of consoles. In the end, as I see it, some games just work better with a mouse and keyboard (and others work better with controllers); it would be sad to get that taken away from me. One shared platform would really have to offer both.
The best example of standardization might really be the TV: there is rarely anything new coming out separating TV owners from each other, although differentiation in the TV market is still there. It's mostly been HD and, for some, digitized signals that have recently split the market into segments, and even that will fade away pretty fast as everyone adjusts to the new standards.

It's really not how the mobile market is now; it's very fragmented, with Windows-based phones, Symbian-based phones (which afaik aren't even very compatible between different versions), the iPhone, touchscreen vs. non-touchscreen, etc.

To get away from the Microsoft bashing that comes with discussing the PC as a gaming platform these days: wasn't one of the goals of DX10 to standardize the PC as a gaming platform a bit? It sets minimum requirements for graphics cards (a video card can't support just some DX10 features afaik; it's either all or none, which wasn't the case for earlier DX versions) and removes some of the backward compatibility that had existed between DX1 through 9. It remains to be seen whether this will lead to more standardization for PC gaming, but it shouldn't be that difficult to achieve by setting some fairly straightforward requirements for hardware makers to fit DX.
While the PC will never be standardized in hardware (luckily), moving toward more standardization in the sense of "hardware requirements to be considered a gaming PC" might help with the innate problems of PC game development, and maybe even move the PC and the different consoles closer to each other – where, even if they never merge into one single platform, they at least get so close that making your game for all available platforms is easy and cheap enough to be the standard.

What annoys me currently is not whether it costs me $1000+ to have a decent PC or $500 to have an Xbox; it's the fact that you nearly have to have an Xbox, a PS3, a Wii AND a PC to be able to play the games you want. That's not exactly going to save me money in the "PCs are so expensive!" argument.
So far, instead of giving up on the PC, I've just given up on the consoles instead, which doesn't make it less sad; there are plenty of games on most of the consoles I would love to own.
I can't really see more standardization as a bad thing, although I'm also pretty sure most of the console makers would fight to the death to avoid it. They obviously would like to end up as the next DVD standard or similar – not by inviting everyone else to the party, but by beating the others out of the competition. Standardization goes against that.
Only Microsoft, really, could be interested in some standardization (they surely must have some wet dreams of seeing the "Windows" OS brand merging the PC and consoles, and then the TV).

A true turnkey hardware system would require all the various vendors get on board and give up whatever distinguishes them from the standard. Given that Sony, Microsoft and Nintendo are at least partly in the business of selling hardware

They are in the hardware-selling business, yeah, but they are already mostly throwing in components others made for them. Even with a unified standard, they could still differentiate on design and special features ("Buy our console, it can make coffee too!!"-kinda stuff), just like you can get TVs in all flavors even though, in the end, they can all receive the exact same TV broadcasts.