Nextbox rumors..

Re: the problem of ballooning budgets / lack of innovation in AAA games, I agree with polq37 that while there may be some bumps along the way, in the long run it's a self-correcting problem. The largest AAA releases may stagnate somewhat (and arguably already have), but I don't think there's anything wrong with a system in which the real innovation happens in the budget and indie space. The new ideas that prove themselves in that arena can be iterated on (or stolen, if you want to be cynical about it) later by larger-budget teams. The innovation trickles up, the money trickles down, and everyone wins. In a perfect world, anyway.

polq37 wrote:

I don't buy the latter-half-of-2013 info. If Microsoft is ordering large numbers of chips now, and if the GPU is based on the Radeon 6000/7000 (depends on the rumor - my guess is that it will draw on elements from both series), then it just doesn't make sense for it to be nearly two years away. The tech would be really antiquated by that point.

Holiday 2012 seems much more likely.

Also, the fact that we've seen almost no info on any Holiday 2012 titles suggests to me that they are being designed with the next-gen in mind.

The trend lately seems to be a "soft" launch in the spring or early summer with only a handful of titles and maybe even a few side features that aren't implemented until the first round of system software patches, followed by a big advertising push and big software releases the following Christmas season. I could definitely see Microsoft doing that in 2013 rather than rushing for a Christmas 2012 launch.

I think you're right, hbi2k. The same generation that's given us 4 Halos and 7 CODs has also given us amazing games on the PSP, DS, XBLA, PSN and the PC indie space. And, as I said earlier, I suspect that AAA budgets will start to level off a bit. Of course, AAA budgets are probably already untenable, judging by what's happening to THQ and other devs, but I think there's hope that these things self-correct a bit. I know every generation people like to say things are never going to look better graphically, but we really are hitting a wall with TV resolutions and the like, which means the next generation isn't going to be as big a jump as going from the PlayStation 2 era to now.

MannishBoy wrote:
Speedhuntr wrote:
Scratched wrote:

I predict the nextbox will be powered by electricity.

My sources tell me it'll either be hamster-wheels or perpetual motion.

Silly people. It's THC.

I say magnets.

CheezePavilion wrote:
MannishBoy wrote:
Speedhuntr wrote:
Scratched wrote:

I predict the nextbox will be powered by electricity.

My sources tell me it'll either be hamster-wheels or perpetual motion.

Silly people. It's THC.

I say magnets.

You're all wrong. The next Xbox will be powered by the gratuitous angst and insults of XBL kiddies.

MeatMan wrote:
CheezePavilion wrote:
MannishBoy wrote:
Speedhuntr wrote:
Scratched wrote:

I predict the nextbox will be powered by electricity.

My sources tell me it'll either be hamster-wheels or perpetual motion.

Silly people. It's THC.

I say magnets.

You're all wrong. The next Xbox will be powered by the gratuitous angst and insults of XBL kiddies.

That's far more power than any console will need. It will be sending power back to the grid.

DSGamer wrote:

I know every generation people like to say things are never going to look better graphically, but we really are hitting a wall with TV resolutions and the like, which means the next generation isn't going to be as big a jump as going from the PlayStation 2 era to now.

I think the jump is going to be less how good games look, and more how complicated they can be / how many systems they can simulate at once while still looking that good. I mean, 1080p / 60 fps is likely to be the ceiling for what TVs can output for the next little while anyway, so the big graphical leaps are going to be less "making a better-looking Uncharted" and more "making a game that looks as good as Uncharted, with the complexity and scope and freedom of a Skyrim." I think that's sort of the next big hurdle, at least in the AAA space.
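
Just to put rough numbers on that (pure back-of-the-envelope; the 70/30 split below is something I made up):

```python
# Rough frame-time budget math: once resolution and refresh rate stop moving,
# extra horsepower mostly shows up as milliseconds you can spend on things
# other than pixels. All numbers below are invented for illustration.

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given target frame rate."""
    return 1000.0 / fps

for fps in (30, 60):
    budget = frame_budget_ms(fps)
    # Hypothetical split: the more of the budget rendering eats,
    # the less is left for AI, physics and other simulation.
    rendering = 0.7 * budget
    simulation = budget - rendering
    print(f"{fps} fps: {budget:.1f} ms/frame "
          f"(~{rendering:.1f} ms rendering, ~{simulation:.1f} ms simulation)")
```

The point being that once resolution and frame rate are pinned, extra horsepower mostly buys you milliseconds for everything that isn't pixels.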

hbi2k wrote:

I think the jump is going to be less how good games look, and more how complicated they can be / how many systems they can simulate at once while still looking that good. I mean, 1080p / 60 fps is likely to be the ceiling for what TVs can output for the next little while anyway, so the big graphical leaps are going to be less "making a better-looking Uncharted" and more "making a game that looks as good as Uncharted, with the complexity and scope and freedom of a Skyrim." I think that's sort of the next big hurdle, at least in the AAA space.

How about "making a game that looks as good as Skyrim, with the complexity and scope and freedom of an Ultima."

This has been your '80s CRPG troll.

Actually it's not totally a troll since there's a grain of truth to it. Imagine, for instance, if Dwarf Fortress could look like Skyrim, or Skyrim could play like Dwarf Fortress.

However, just as AI isn't a problem you can solve by throwing more computing power at it (as anyone who actually works on AI will tell you), more POWERRR won't, by virtue of pure horsepower, lead to more complex games. Many 20-year-old games were arguably more complex than modern ones, and it will take a cultural shift to make that kind of complexity desirable again, rather than a supposed technological advance to make it feasible.

I would argue, though, that the race for "prettier" has been a stumbling block on the way to better AI. The answer this generation was that other people would be the AI. The problem with that, though, is that it turns out some people are tools when they play online, and some people just want to play SP games. So I can see the industry not just throwing the horsepower at better AI, but also not needing to focus as much on graphics and spending that time on AI instead. That's me being hopeful.

Gravey wrote:

However, just as AI isn't a problem you can solve by throwing more computing power at it (as anyone who actually works on AI will tell you), more POWERRR won't, by virtue of pure horsepower, lead to more complex games. Many 20-year-old games were arguably more complex than modern ones, and it will take a cultural shift to make that kind of complexity desirable again, rather than a supposed technological advance to make it feasible.

I don't think it takes a technological advance to make games more complex-- I agree with you that devs have been able to make complex games since the early days. I do think that technology can take a complex game and make it prettier, though. Even then, I think the biggest leaps forward will take place in the realm of middleware (since that's the only feasible way to generate the assets to make a large 3D open-world game look all purty), so the connection to hardware is only indirect, but it's real.

Gravey wrote:

How about "making a game that looks as good as Skyrim, with the complexity and scope and freedom of an Ultima."

This has been your '80s CRPG troll.

Actually it's not totally a troll since there's a grain of truth to it. Imagine, for instance, if Dwarf Fortress could look like Skyrim, or Skyrim could play like Dwarf Fortress.

However, just as AI isn't a problem you can solve by throwing more computing power at it (as anyone who actually works on AI will tell you), more POWERRR won't, by virtue of pure horsepower, lead to more complex games. Many 20-year-old games were arguably more complex than modern ones, and it will take a cultural shift to make that kind of complexity desirable again, rather than a supposed technological advance to make it feasible.

I agree with the general idea behind this post.

Something about developers that no one really says is that most of them aren't great. I'm not suggesting that developers don't have a hard job, or that there isn't a huge amount of blood, sweat and tears that goes into a game; lots of skill is required to make something that actually gets onto shelves and hopefully sells. What I'm getting at is that few developers are ever going to make something really exceptional, because that exceptional thing is really f'ing hard and most of them aren't up to it. Competent AI? Sure. Brilliant AI? That's a rare thing. That's before you add the commercial pressure to make something that sells rather than something that's great, and with more money in the game the balance is going to tip further away from great. 'Great' is a byproduct now, or it comes from a developer with enough muscle to call the shots themselves.

The other thing I'm not looking forward to is the first round of games on the new hardware (unless it's a very close evolution of the old gen). The first few years are always littered with games that are shoehorned onto the new platform: generally old-gen games kicked up a bit with more graphical complexity laid on top.

CheezePavilion wrote:
MannishBoy wrote:
Speedhuntr wrote:
Scratched wrote:

I predict the nextbox will be powered by electricity.

My sources tell me it'll either be hamster-wheels or perpetual motion.

Silly people. It's THC.

I say magnets.

You have no idea.

Floomi wrote:
CheezePavilion wrote:
MannishBoy wrote:
Speedhuntr wrote:
Scratched wrote:

I predict the nextbox will be powered by electricity.

My sources tell me it'll either be hamster-wheels or perpetual motion.

Silly people. It's THC.

I say magnets.

You have no idea.

Nextbox will include its own nuclear power plant, and will thus be the very first energy-neutral console ever. Processing the nuclear waste will be a bundled Kinect game, called Half-Life: Kinect.

If the Nextbox includes a Blu-ray player, it will probably be HDMI-only for DRM reasons. My laptop won't output to VGA when playing Blu-ray movies; I expect the Nextbox won't either.

(not strictly directly related to nextbox, but what the hell)

I was reading the RPS interview with the Larian top dog, and out of a whole lot of awesome (go read it) the following was said:

You’re better off aiming for small projects and making sure you can get some revenue behind you, or you will never work on your own games.

As it applies to the nextbox, I think the thing with powerful systems isn't that it's a lottery whether a developer is successful; it's that while more power gives you more potential to do awesome things, it also gives you more rope to hang yourself with.

This generation there have been plenty of companies that over-extended themselves beyond what they could support and went down the drain of history. What I think will have to happen for the nextbox is that the real successes will be the well-managed studios, and that doesn't really require a new generation. The ability to cram a 50GB disc full of data is admirable, but it's not a criterion for success. It really makes me wonder if the next gen will be a supercharged this-gen rather than an 'overhaul and start from scratch'.

NextBox Red Ring Error Codes will generate a disintegration beam that auto-targets concerned users via Kinect-based optics, thus resolving all user hardware issues immediately upon system detection.

Hardware failure rate is expected to hover around 0.00 percent, with equivalent support costs across the ecosystem.

polq37 wrote:
WipEout wrote:

Honestly, the thought of a new generation of consoles kinda scares me, from an industry point of view.

A more powerful system does mean a chance at better graphics in games, but that means a higher cost in development, as more artists, engineers, and time are needed to develop and implement the assets. If AAA devs want to push the graphical envelope (and they obviously will), I'm betting what we're going to see in the next generation is stagnation in the AAA market-- sure, graphics will be pushed, but development cycles are so time-consuming and cost so much that I bet there will be very little innovation in gameplay (look at how much they had to pull back on Deus Ex, even with 4 years of development).

I think these fears are overblown. Back in the 90s, industry folks were saying the same kinds of things about Hollywood; the tremendous costs of summer blockbusters were crushing the movie industry and failed films were driving studios into bankruptcy.

Hollywood adapted. Movie production costs for summer blockbusters continued to go up, but Hollywood developed lots of different movies at different price points for different audiences. Also, the summer blockbusters became more reliable with more competent production teams behind them. Some are even very good movies. The same thing is happening in video games right now.

The big difference being that Hollywood/film had unions that fought for the employees and came to something of a middle ground with the production companies. The video game industry does not, and has shown little sign of moving in that direction. Financially the game industry might be mirroring that behavior, and production-wise it definitely is, but speaking from the inside, I can only hope that's the direction we move in. And even then, I'd be a bit worried-- I'd rather not lose my job to an out-sourced Chinese artist, thank you.

Also, I'm not sure that pointing to Deus Ex 3 is useful in this case. Other companies are executing great AAA products on rational schedules. Bethesda has been great at releasing its games on target. Activision and EA are reliably releasing their big AAA franchise games on schedule for the holiday season.

Bethesda managed to play their cards right: they developed their own engine, so there's less overhead on new IPs compared to starting from scratch, and they're able to concentrate on design and art (though design and innovation in gameplay varies only slightly from game to game). Same for Naughty Dog, Rockstar SD, etc. Activision and EA are two of the worst publishers when it comes to the turn-and-burn employment policy. They both-- Activision and Infinity Ward especially-- ramp up employment when they're busy, only to over-work the talent and then toss them just before the project is completed. You do recall that Infinity Ward isn't the same Infinity Ward that developed MW2, right? And that it took three developers, including the revamped IW, to make MW3? For that much force behind the game, was there really anything unique or evolved in MW3 compared to its predecessors?

Again, this isn't to say that the AAA space will be dead this next round, but it is certainly going to show the stagnation that we've seen so far, and my fear is that bigger, better systems will become another bullet point in the list of reasons the AAA space needs to burn through and throw away its employees to make a game. In other words, it's already profitable for the big dogs to burn through talent; I'm afraid that a bigger system means those same developers will see that practice as absolutely necessary, and others will follow suit.

What additional processing power will do for AI systems is allow more AI to be run at once. So maybe, instead of zombies, the processor budget might allow for masses of more complex enemies. Or maybe instead of Assassin's Creed having townspeople just walk paths with limited random actions, they could do more complex things.

I think the additional power might also allow for a bit better physics, both from the GPU and the CPU. Maybe destructible environments will become the norm instead of the exception, and ragdoll bodies won't act so weird.
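
A toy way to picture that tradeoff (every number here is invented, nothing measured from a real engine):

```python
# Toy illustration of "more AI at once": how many agents fit in the slice of a
# frame you're willing to give the AI. Every number here is made up.

FRAME_BUDGET_MS = 1000.0 / 60   # one frame at 60 fps (~16.7 ms)
AI_SHARE = 0.25                 # fraction of the frame spent on AI

def max_agents(cost_per_agent_ms):
    """How many agents of a given per-tick cost fit in the AI slice."""
    return int(FRAME_BUDGET_MS * AI_SHARE / cost_per_agent_ms)

print("cheap zombie-style agents:", max_agents(0.02))    # scripted shamble-at-player behaviour
print("schedule/pathfinding agents:", max_agents(0.2))   # richer townsfolk-style behaviour
```

Double the CPU time you can give the AI and you can either double the crowd or make each member of it noticeably smarter; you don't get both for free.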

Those are the kinds of things I look forward to as well, Mannish. That, plus what DS said upthread (that in terms of graphical fidelity we're hitting a wall, to say nothing of the enormous budgets and practices it would take to push AAA graphics any further), makes me hope that this next generation is more of an iteration built on top of current tech. Personally, I'd like to see developers continue developing in an environment they're already familiar with, so they can not only figure out a system that works and is sustainable for everyone involved (like Bethesda, Telltale, and Naughty Dog have), but also advance design as well as graphics, since they'll be building from a solid base and won't have to worry about rebuilding their engines or creating new ones from scratch. At the very least, I hope that Sony doesn't make another system that everyone has to re-learn and start all over on. We all recall how much beauty Team Ico was able to eke out of the PS2 in its final hours, because they had spent the time learning the system and working to its strengths. I just don't want to see developers have to figure out a new system's strengths and faults all over again.

I think it would be really funny if it was just an improved this-gen, and the next 10 Call of Duty games are still using a polished Quake 3 engine, and still sell blockbuster numbers.

Scratched wrote:

I think it would be really funny if it was just an improved this-gen, and the next 10 Call of Duty games are still using a polished Quake 3 engine, and still sell blockbuster numbers.

I think that's a big reason a lot of us fear the next generation rather than look forward to it. There's a lot of innovating on XBLA, PSN, etc. on these current platforms. And AAA titles are at risk of being as stagnant as they are now.

MannishBoy wrote:

Also, a 6670 in a console where there are fewer abstraction layers between hardware and game code is a bit different than the equivalent card in a PC running full-blown Windows.

Good point. Also, I realized before I went to sleep last night that consoles only need to target 1080p for the time being. So there's probably less need for a smoking fast GPU at this point.

I should go look up the benchmarks on the 6670 and the old X1800. It would be interesting to see how well this GPU handles 1080p compared to the X1800, since I think most 360 games ran/run at 720p.
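
For reference, the raw pixel counts (simple arithmetic, nothing about per-pixel shader cost):

```python
# Pixel counts for the two resolutions in question. This is only about how many
# pixels have to be filled per frame; it says nothing about per-pixel shader cost.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080)}

base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} million pixels ({pixels / base:.2f}x 720p)")
```

So native 1080p is about 2.25x the pixels of a typical 360 game, before you even get to nicer shaders.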

Duoae wrote:

But that was true for the launch of the last generation too. So it's not unexpected. Consoles will never be "cutting edge" tech. It just costs too much per unit.

Not quite. As I already found, the 360 shipped with a graphics processor that was on-par with stuff hitting the market the month prior. A 6670 is going to be over a year old if the 360++ ships in the fall. And if you do a little more research you find that the 6670 is basically a rebranded 5770 so we're actually dealing with silicon that was designed for 2010. My current PC has a 5850 which is the 5770's older and bigger sibling.

EDIT: Regarding processing power... my understanding is that this generation's hardware was woefully starved for RAM. Both the 360 and PS3 focused exclusively on pretty images to show off the HDTV features, to the detriment of the RAM you need for things like AI and other systems. This has been brought up by a number of devs, the most vocal of which was Chris Hecker.

Like I said earlier, if MS is low-balling on the GPU then hopefully that means we're going to see more RAM and maybe a quad-core instead of a tri-core CPU (arguably clock speed doesn't matter as much these days), which should allow for more interesting games this next generation. However, I doubt we're going to see anything that approaches the graphical fidelity of the infamous Samaritan demo, which was running on a beast of a PC with tri-SLI GTX 580 cards. We'll probably see a good number of the DX11 effects highlighted in that demo video, though. Bokeh depth of field is the new bloom :p
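
To make the "starved for RAM" complaint concrete, this is the kind of napkin math people point to. The 512 MB total is the 360's real figure; every line item below is an invented ballpark:

```python
# Napkin math for a 512 MB console (the 360's total unified RAM).
# Every line item below is an invented ballpark, not a real title's budget.
TOTAL_MB = 512

budget_mb = {
    "OS / system reserve": 32,
    "textures and geometry": 300,
    "audio and streaming buffers": 60,
    "animation, scripts, UI, misc": 70,
}

spoken_for = sum(budget_mb.values())
print(f"spoken for: {spoken_for} MB")
print(f"left for AI / world-simulation state: {TOTAL_MB - spoken_for} MB")
```

Whatever the real split on any given title, the slice left over for remembering what the world and its NPCs are doing tends to be tiny, which is basically the complaint.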

shoptroll wrote:
MannishBoy wrote:

Also, a 6670 in a console where there are fewer abstraction layers between hardware and game code is a bit different than the equivalent card in a PC running full-blown Windows.

Good point. Also, I realized before I went to sleep last night that consoles only need to target 1080p for the time being. So there's probably less need for a smoking fast GPU at this point.

I should go look up the benchmarks on the 6670 and the old X1800. It would be interesting to see how well this GPU handles 1080p compared to the X1800, since I think most 360 games ran/run at 720p.

Duoae wrote:

But that was true for the launch of the last generation too. So it's not unexpected. Consoles will never be "cutting edge" tech. It just costs too much per unit.

Not quite. As I already found, the 360 shipped with a graphics processor that was on-par with stuff hitting the market the month prior. A 6670 is going to be over a year old if the 360++ ships in the fall. And if you do a little more research you find that the 6670 is basically a rebranded 5770 so we're actually dealing with silicon that was designed for 2010. My current PC has a 5850 which is the 5770's older and bigger sibling.

EDIT: Regarding processing power... my understanding is that this generation's hardware was woefully starved for RAM. Both the 360 and PS3 focused exclusively on pretty images to show off the HDTV features, to the detriment of the RAM you need for things like AI and other systems. This has been brought up by a number of devs, the most vocal of which was Chris Hecker.

Like I said earlier, if MS is low-balling on the GPU then hopefully that means we're going to see more RAM and maybe a quad-core instead of a tri-core CPU (arguably clock speed doesn't matter as much these days), which should allow for more interesting games this next generation. However, I doubt we're going to see anything that approaches the graphical fidelity of the infamous Samaritan demo, which was running on a beast of a PC with tri-SLI GTX 580 cards. We'll probably see a good number of the DX11 effects highlighted in that demo video, though. Bokeh depth of field is the new bloom :p

Given the cheapness of DDR3 RAM at this point, I would be surprised if the next Xbox didn't have at least 4GB of RAM.

For anyone interested, here's the Anandtech review of the 6670: http://www.anandtech.com/show/4278/a...

Now maybe there's some secret sauce or something they'd add for a 360 derivative, but the picture isn't pretty: the card fails to crack 60 fps at 1680x1050, which would seem to rule out 1080p gaming.

EDIT 2: Scratch that. Devs will scale their games down to make this work at 1080p if that's the target resolution. Unless the GPU's framebuffer is memory-starved and can't store enough frames ahead of time.
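
Quick back-of-the-envelope on the framebuffer point (assuming a plain 32-bit colour buffer and ignoring MSAA and HDR formats, which only make it worse):

```python
# Rough render-target sizes. Real engines add depth, MSAA and HDR targets on top,
# so treat these as lower bounds.
def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    colour = buffer_mb(w, h)
    # double-buffered colour plus one depth buffer, as a crude minimum
    print(f"{name}: ~{colour:.1f} MB per buffer, ~{3 * colour:.1f} MB minimum")
```

Which is roughly why the 360's 10 MB of eDRAM was comfortable at 720p and painful beyond it; the concern is less total RAM than whether the fast memory the GPU renders into can hold the targets.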

Despite that, it does give some hope that maybe next generation we might not be shelling out $300-600 for a new console.

EDIT: If the rumors about the WiiU being a derivative of the 4850/4870 series chips are true, it stands to reason that the WiiU could stand toe-to-toe with this rumored version of the new XBox, based on the benchmarks I linked. However, the WiiU would be lacking features like tessellation that were added in the 5000 series cards.

Really the only three things I'd like to see in the next generation hardware would be more texture memory, larger hard drives (or SSDs), and built-in wi-fi. Otherwise, the current generation still has enough headroom to push these consoles for another 3-5 years.

shoptroll wrote:

EDIT 2: Scratch that. Devs will scale their games down to make this work at 1080p if that's the target resolution. Unless the GPU's framebuffer is memory-starved and can't store enough frames ahead of time.

No they won't. Look at the games this-gen that drop resolution for eye-candy.

Despite how "everyone" on PC hardware is running 1080p, I think it's more realistic to see 720p with good image quality aspects like anti-aliasing. Last-gen was also running mostly on DX9-ish hardware and engines that couldn't do anti-aliasing because of the lighting techniques.

I'd also wonder how "everyone" is supposed to be running 1080p displays; I don't think they are. A good 720p benefits everyone.

WipEout wrote:

Really the only three things I'd like to see in the next generation hardware would be more texture memory, larger hard drives (or SSDs), and built-in wi-fi. Otherwise, the current generation still has enough headroom to push these consoles for another 3-5 years.

I wouldn't be surprised if someone other than Nintendo offers a version of their console with an SSD. I expect the WiiU to use a small amount of flash memory for internal storage and to rely on customers supplementing that with SDHC cards or a USB hard drive.

Scratched wrote:

A good 720p benefits everyone.

As a Wii owner I won't deny that. The Wii pushes out some great visuals at 480p if the developer is smart about their art direction.

The only reason I think 1080p might be pushed is so that MS has something to beat their chest about during the launch hype.

The thing is, selling to geeks isn't what they'll be beating their chest over. Geeks who get hot under the collar over resolution aren't the mass market that they need to recoup the R&D on something like a console. I'll be very surprised if the launch of nextbox resembles the launch of this-box at all.

Scratched wrote:

Geeks who get hot under the collar over resolution aren't the mass market that they need to recoup the R&D on something like a console. I'll be very surprised if the launch of nextbox resembles the launch of this-box at all.

Right, which is probably why they've been focusing a lot more on streaming video and Kinect with the recent dashboard updates. There's also a big vacuum from J Allard's departure, so I really wonder what the overarching vision for the next system is, if there even is one. Chances are it's going to try to compete not just with Nintendo and Sony but also with the Apple TV and other set-top boxes, especially if the rumors of TVs + Siri pan out.

It's reasonably certain that the next Xbox will have 1080p / 60 fps gaming as its baseline... so I would expect that, on the whole, the hardware should be powerful enough to drive that, with optimization on the part of savvy developers.

I also think that, for the most part, developers already have much higher-resolution textures and assets running internally, so the shift from Xbox 360 to NextBox development shouldn't be that difficult.

You kind of have to assume that they'll also want to support 3D, if for no other reason than a back-of-the-box quote. I'd assume that would mean 720p at 60 fps as the minimum 3D spec, so that each eye ends up running at 30 fps.
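
For what it's worth, the arithmetic behind that guess (nothing vendor-specific, just the obvious doubling):

```python
# Stereoscopic 3D means rendering the scene twice per displayed frame
# (one view per eye), so the per-eye rate is half the rendered-view rate.
RENDERED_VIEWS_PER_SECOND = 60
EYES = 2

print("effective frame rate in 3D:", RENDERED_VIEWS_PER_SECOND // EYES, "fps per eye")
print("time budget per rendered view:", round(1000 / RENDERED_VIEWS_PER_SECOND, 1), "ms")
```

So a machine that can comfortably do 720p at 60 fps in 2D is the natural floor for a 30-fps-per-eye 3D mode.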

The marketing message needs to be simple, so whether it's 720p scaled or 1080p native, the box is going to say 1080p resolution or 1080p compatible. The details don't matter.

Otherwise, consumers would react with a simple "Oh man, I could set my Xbox 360 / PS3 to 1080p... this new machine can't even do that?"