Gabe Drops the Bomb

Over at Tech Report they have an article up describing the latest Half-Life 2 debacle. It seems NVidia's newest graphics cards don't cut the mustard when it comes to Half-Life 2: a Geforce FX 5900 Ultra manages 30fps where a Radeon 9800 Pro gets 60fps, and the rest of the Geforce line goes downhill from there.

The article details a presentation Gabe Newell gave at ATI's Shader Day event, which laid out the disappointing performance of NVidia's chipsets and even took shots at their benchmark "tweaking". Valve said that while the game was in development, NVidia's latest drivers would strip visual quality from Half-Life 2 to gain performance. To quote one of the slides Mr. Newell presented on the various ways drivers lie to benchmarks: "Our customers have traditionally made purchase decisions based heavily on benchmark data ... (because benchmarks are lied to) our customers will be pissed."

The presentation also detailed how Valve wrote an NVidia-specific path separate from their DX9 renderer; even then, the results were disappointing. Valve also stated they were as surprised as anyone that NVidia's hardware didn't handle Half-Life 2 very well. Long story short, if you want to play HL2 on your newest NVidia card, you're going to have to lose some of the eye candy.

Comments

I'm going to take my Radeon 9500 Pro out and give it a hug.

Make sure you touch your case first so you're grounded.

Gabe is clearly one of those candles that burns twice as bright but half as long. But I don't fear him like I do Roy Batty.

Good idea! Send it to me then, I want to give it a hug as well.

9600 Pro is better than 9500 Pro for DX9, right? Why is it generally cheaper than the 9500 Pro?

It just is, Alien. It's a slightly optimized yet "slower" card in most cases. I am hugging my 9800 Pro right now :O

How is it that every other game company has no problems making NVidia hardware work right? It sounds like Carmack had no problem getting relatively equal performance out of the ATI and NVidia cards. I find it funny that people are blaming NVidia instead of Valve, who should have realized this pretty early on in engine development.

Or maybe I am just a bitter FX 5600 owner.

I think you have a point, Winter. I'd be very interested to hear WHY Valve couldn't bring the FX's performance in line with the Radeon's.

You know, aside from ATI giving them sacks of cash or something.

Well, from what I have read, the newest Geforce cards have never fully implemented DX9, and Half-Life 2 is the first game to take advantage of all of it. When the demo and benchmarking tool come out in a couple of weeks, we can all see for ourselves.

Carmack can because he didn't depend on all the features of DX9. DX9's development was heavily influenced by ATI, but only because NVidia refused to cooperate. So Carmack doesn't have a problem because he has been going to NVidia every few weeks to see what they have, then seeing what ATI has, then writing for that. Valve targeted DX9 and went with it.
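If you want a picture of the difference, here's a minimal sketch of how an engine might pick a vendor-specific render path at startup versus just targeting generic DX9. All the names are invented for the example; this is not Valve's or id's actual code, and a real engine would query the vendor and shader support through DirectX or OpenGL rather than plain strings:

    #include <cstdio>
    #include <cstring>

    // Hypothetical render paths an engine might ship (names made up).
    enum RenderPath { PATH_DX8_FALLBACK, PATH_DX9_GENERIC, PATH_NV3X_SPECIAL };

    // Pick a path from what the driver reports.
    RenderPath pick_path(const char* vendor, bool has_ps20) {
        if (!has_ps20)
            return PATH_DX8_FALLBACK;      // old hardware: drop the DX9 eye candy
        if (std::strcmp(vendor, "NVIDIA") == 0)
            return PATH_NV3X_SPECIAL;      // hand-tuned vendor path, Doom 3 style
        return PATH_DX9_GENERIC;           // straight DX9, the path Valve targeted
    }

    int main() {
        const char* names[] = { "DX8 fallback", "generic DX9", "NV3x special" };
        std::printf("ATI    -> %s\n", names[pick_path("ATI", true)]);
        std::printf("NVIDIA -> %s\n", names[pick_path("NVIDIA", true)]);
        return 0;
    }

The per-vendor approach means more work every time a new chip ships, which is exactly the treadmill Carmack is on and presumably what Valve wanted to avoid by targeting DX9 straight.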

It still positively amazes me that Gabe doesn't seem too concerned that a sizable chunk of the people who want to buy this game might be alienated by the fact that it is either going to perform like ass or look like ass on their big expensive new video cards. I am still hesitant to blame NVidia, even if they haven't been behaving in much of a respectable manner lately. Ah well, guess I'll just run in 640x480 16-bit colour and be happy. Or not.

I wouldn't worry, Winter. I think the HL2 performance situation will get worked out for NVidia. It just has to, or Gabe may get shot!

Oh, I wouldn't say I am worried. I won't buy it until this little "problem" is fixed. There are tons of great games coming out that will need my attention and that WILL work quite well with the system I have built. To be honest, Half-Life 2 the game is probably going to rock, but Half-Life 2 the experience is going to create problems: the Steam issues, the release date issues, the NVidia issues. I am getting less excited about this game by the day, and more excited for other stuff.

So two big bombs this week from Valve: you can't play the new HL1 patch, CS 1.6, or HL2 at your LAN party unless your LAN party is connected to the internet, thanks to those stupid Steam shenanigans, and you will get crappy performance from the graphics card leader that every other game gets good performance out of. What next? Will the packaging be designed to give you cancer on purpose? Or maybe just some sort of erectile dysfunction?

Your pal,
a.

Well, since Valve wrote the game using most of DX9's advanced features, they can't be blamed for the low performance of cards that don't properly utilize those features.

Maybe it's not that they were in bed with ATI, but that they didn't want to get in bed with anyone and were trying to remain neutral.

Whose fault is it that NVidia's new cards aren't truly designed for DX9 compliance and performance? Certainly not the game developer's. That's like saying it's Valve's fault that the game won't perform well on bargain Wal-Mart computers built from substandard parts.

Okay, this might be a bit late, but I am sure I saw an article by Carmack the other day about how Doom 3 had to have a special renderer to get performance out of the NVidia card, and that the ATI card was superior (not by leaps and bounds, but you get the idea).

Something about the internal render path not being 32-bit.

Please correct me if I am wrong; I couldn't find a link to it.
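For what it's worth, the quality tradeoff of a lower-precision path is easy to picture: a 32-bit float keeps 23 mantissa bits, while a 16-bit half keeps only 10 (and NV30's fixed-point format keeps fewer still). Here's a crude, purely illustrative sketch that truncates mantissa bits to mimic that rounding; it ignores the exponent-range differences between the real formats:

    #include <cstdio>
    #include <cstring>
    #include <cstdint>

    // Simulate a lower-precision shader format by truncating mantissa bits
    // of a 32-bit float. fp32 has 23 mantissa bits; fp16 has 10.
    float truncate_mantissa(float value, int keep_bits) {
        std::uint32_t bits;
        std::memcpy(&bits, &value, sizeof bits);
        std::uint32_t drop = 23u - keep_bits;  // mantissa bits to zero out
        bits &= ~((1u << drop) - 1u);          // truncate toward zero
        float out;
        std::memcpy(&out, &bits, sizeof out);
        return out;
    }

    int main() {
        float v = 0.123456789f;
        std::printf("full 32-bit float: %.9f\n", v);
        std::printf("fp16-ish mantissa: %.9f\n", truncate_mantissa(v, 10));
        return 0;
    }

Lose enough of those bits over a long shader and you get visible banding, which sounds like exactly the kind of visual quality Gabe was accusing the drivers of trading away.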