Nvidia and HL2

I thought I would give this subject a topic in the forums instead of continuing the discussion in the comments section of the article. The following quote is from Winter in the comments thread on the front page.

How is it that every other game company has no problems making the NVidia hardware work right? It sounds like Carmack had no problems getting relatively equal performance out of the ATI and NVidia cards.

Well, the short answer is that every game up until Half-Life 2 has been either OpenGL-based or at most DX8 compliant. The same is true of Carmack and Doom 3: he is using OpenGL, and since OpenGL 2.0 has not been officially released yet, he is doing all of his shader coding in assembly. The whole point of DX9 is to give programmers a unified, higher-level API to write shader code with. Beyond that, DirectX 9 allows more instructions per shader program and specifies higher color precision.
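To put that in concrete terms, here's a minimal sketch of what the DX9 route looks like (this assumes the DirectX 9 SDK headers and the d3dx9 library; the shader and names are just made up for illustration). It's a trivial HLSL pixel shader compiled against the ps_2_0 profile, instead of hand-writing the equivalent shader assembly:

#include <d3dx9.h>
#include <cstdio>

// A trivial HLSL pixel shader: sample a texture and darken it.
// 'float' math in a ps_2_0 shader has to run at 24-bit precision or better.
static const char kPixelShader[] =
    "sampler baseTex : register(s0);             \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR  \n"
    "{                                           \n"
    "    return tex2D(baseTex, uv) * 0.5;        \n"
    "}                                           \n";

int main()
{
    LPD3DXBUFFER shader = NULL, errors = NULL;
    // Compile the high-level source for the DX9 ps_2_0 target.
    HRESULT hr = D3DXCompileShader(kPixelShader, sizeof(kPixelShader) - 1,
                                   NULL, NULL, "main", "ps_2_0",
                                   0, &shader, &errors, NULL);
    if (FAILED(hr) && errors)
        printf("%s\n", (const char*)errors->GetBufferPointer());
    if (shader) shader->Release();
    if (errors) errors->Release();
    return 0;
}

Under OpenGL today the same effect means writing the fragment program by hand in assembly, which is exactly what Carmack is stuck doing for Doom 3 until OpenGL gets its own high-level shading language.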

Of course, Valve did mention that they coded an NVidia-specific path for the game. According to Valve, this mixed-mode path uses 16-bit precision wherever it can without sacrificing image quality, but they also say this specialized codepath took them five times as long to write as the general DX9 codepath.
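As a rough sketch of what "mixed mode" means in practice (same assumptions as above, and again purely illustrative): HLSL has a 16-bit 'half' type and, if I'm reading the SDK right, a D3DXSHADER_PARTIALPRECISION compile flag, so a shader can drop to 16-bit math wherever the result won't visibly change:

#include <d3dx9.h>

// The same sort of shader rewritten with 16-bit 'half' math where color
// precision isn't critical; the compile flag below asks for partial
// precision throughout the shader.
static const char kMixedModePS[] =
    "sampler baseTex : register(s0);            \n"
    "half4 main(float2 uv : TEXCOORD0) : COLOR  \n"
    "{                                          \n"
    "    half4 c = tex2D(baseTex, uv);          \n"
    "    return c * 0.5;                        \n"
    "}                                          \n";

HRESULT CompileMixedModeShader(LPD3DXBUFFER* shader, LPD3DXBUFFER* errors)
{
    return D3DXCompileShader(kMixedModePS, sizeof(kMixedModePS) - 1,
                             NULL, NULL, "main", "ps_2_0",
                             D3DXSHADER_PARTIALPRECISION,
                             shader, errors, NULL);
}

Deciding shader by shader which calculations can safely run at 16 bits without the artifacts showing up on screen is presumably where that five-times development cost comes from.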

Let's not forget this isn't the first sign of bad DX9 performance in Nvidia's latest $400 mega-card. There was, of course, the whole 3DMark 2003 fiasco, as well as John Carmack's plan file, which said the following:

The NV30 runs the ARB2 path MUCH slower than the NV30 path.
Half the speed at the moment. This is unfortunate, because when you do an
exact, apples-to-apples comparison using exactly the same API, the R300 looks
twice as fast, but when you use the vendor-specific paths, the NV30 wins.

The reason for this is that ATI does everything at high precision all the
time, while Nvidia internally supports three different precisions with
different performances. To make it even more complicated, the exact
precision that ATI uses is in between the floating point precisions offered by
Nvidia, so when Nvidia runs fragment programs, they are at a higher precision
than ATI's, which is some justification for the slower speed. Nvidia assures
me that there is a lot of room for improving the fragment program performance
with improved driver compiler technology.
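For anyone curious, the three precisions Carmack is talking about on the NV30 are 32-bit float, 16-bit float, and 12-bit fixed point, while the R300 does everything at 24-bit float. Going from memory of the NV_fragment_program extension (so treat the exact syntax as a sketch, not gospel), the vendor-specific path gets to ask for the cheap formats explicitly:

// Rough NV30-style fragment assembly, for illustration only: H0-H63 are
// 16-bit registers, MULH is a 16-bit multiply, and o[COLH] is the
// half-precision color output. The generic ARB2 path has no way to request
// any of this, so everything runs at the slow full precision instead.
static const char kNv30FragmentProgram[] =
    "!!FP1.0\n"
    "TEX H0, f[TEX0], TEX0, 2D;\n"
    "MULH o[COLH], H0, f[COL0];\n"
    "END\n";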

Of course, this explains why the mixed-mode path in HL2 makes the Nvidia card perform better. Later in Carmack's plan file he says this:

For developers doing forward looking work, there is a different tradeoff --
the NV30 runs fragment programs much slower, but it has a huge maximum
instruction count. I have bumped into program limits on the R300 already.

That also indicates a general performance lead for ATI in shader program execution. With three different sources telling us the ATI cards are going to run general DX9 code faster than Nvidia's latest, I suspect Nvidia may be the party at fault here, not Valve.

Further, let's go with a logical argument. Would it really make sense for Valve to alienate the huge installed base of Nvidia card users if they didn't think this was a major issue? No, it wouldn't. The thinking here is that if users are going to upgrade to a new DX9 part for HL2, then Valve wants to let them know which card is going to be the best buy.

Errr... can someone clear this up for me? Will you be able to play Half-Life 2 in multiplayer without connecting to Steam? 'Cause if you can't, then they can kiss my cash goodbye. Anything that forces you to buy a router or other extra crap just to play over a LAN sucks pretty majorly. I regularly host LAN parties and I don't need to connect to the net for anything so far. I wouldn't like to be anyone who hosts small LAN parties and has only a 56k dialup!

Nobody has the "final" answer about Steam yet, Keg. As soon as any more solid details are made available, I will post them.

Hey Flux, did you get my page? Give me a call.

Stric9 summed up what I was trying to say in the comments much better.

Of course, none of this really affects me since I have a GeForce 4400, and those are only DX8 cards anyway; they run Half-Life 2 about as well as they're ever going to. It only affects the 5x00 series, as those are the ones that are supposed to have DX9 support. They actually run slower than the 4600 in DX8. I feel sorry for people who bought one.

I just noticed on Evil's site there's a link to a petition on petitiononline.com about needing an internet connection to play HL2 in single player and at LAN parties. As stated previously, there's no way I'm buying a game with single-player content that requires me to connect to Steam every damn time!

EDIT: They really need to clarify the petition and put up a link to where it actually says that you'll need an internet connection to play the single player. I do wish Valve would say something to clarify their position too!

All of this makes me happy that I'm still running my GF4 4400 card. I'm holding out a while before upgrading again.

I actually understood that from before, but my point is that any company that gives their customers a substandard product that will pretty much alienate half their target audience is probably not long for this world. Considering how long the NVidia parts have been out, they could have tried to eke out some decent performance. The same goes for NVidia making a non-standard part. It just seems strange to me that two cards (FX vs Radeon) with roughly equal performance in most applications all of a sudden differ drastically in JUST ONE application. And you know a company like Valve could have had access to GeForceFX parts long before anybody else did. It just seems so primitive now to have to "suffer" at 30 fps instead of the ride of graphical luxury that 60 fps brings. I am glad I did not buy the FX5900, or I would be truly pissed off about this. At both companies.

Sorry for two replies in a row, but it looks like NVidia answered Gabe with a press release of some sort, as reported by 3dgpu.com, Evil Avatar, etc. URL to the news post below:
http://www.3dgpu.com/modules/news/article.php?storyid=374
My take is nobody is taking the blame. Ah well, it makes for entertainment, if nothing else.

My personal feeling is that this will be fixed, much as the apparently blown-out-of-proportion texture problem with the game was fixed very quickly.

looks like NVidia answered Gabe with a press release of some sort

This press release reinforces my belief that Nvidia is doing something fishy. For one, it says they worked closely with Valve during the entire development of HL2, which tells me Valve decided they were being deceived or misled by Nvidia somehow. Otherwise, why not tell a company you were working closely with this info in private?

Further, to ease everyone's fears, Nvidia says they don't understand why Valve is not using the 50-series drivers. Yet Valve supposedly specifically requested that reviewers not use the 50-series drivers with the HL2 benchmark that is going to be released soon, as they have concerns about some of the optimizations going a bit too far.

I will reiterate: how many developers will have to point the finger at Nvidia before everyone finally realizes Nvidia screwed up with the whole NV30 series? This isn't some two-bit development house making these claims; it's two of the major players in the industry (Valve and Futuremark). What could these two companies possibly have to gain by attacking Nvidia?

No LAN play without the Internet?!

...

[Silence of the Lambs]
Valve! YOU DON'T KNOW WHAT PAIN IS!!!
[/Silence of the Lambs]

"winter" wrote:

I actually understood that from before, but my point is that any company that gives their customers a substandard product that will pretty much alienate half their target audience is probably not long for this world. Considering how long the NVidia parts have been out, they could have tried to eke out some decent performance. The same goes for NVidia making a non-standard part. It just seems strange to me that two cards (FX vs Radeon) with roughly equal performance in most applications all of a sudden differ drastically in JUST ONE application. And you know a company like Valve could have had access to GeForceFX parts long before anybody else did. It just seems so primitive now to have to "suffer" at 30 fps instead of the ride of graphical luxury that 60 fps brings. I am glad I did not buy the FX5900, or I would be truly pissed off about this. At both companies.

Well... that's actually very simple to answer.

Count how many titles have been released so far that make extensive use of PS 2.0 and other DX9 features.

That's why you haven't seen the FX's true colors. Bottom line: Nvidia banked on Cg and OpenGL, created a card that's more DX8 than DX9, and has now been exposed.

And it won't be just one application. If you read the slides Gabe used, you'll see that if a developer plans on extensive use of DX9 features, they will have to write custom code just to get it to run decently on the FX series.

Everyone needs to read the article on anandtech.com posted last night.

Also, get the new HL2 video at www.fileshack.com ....WOW.

"TheGameguru" wrote:

That's why you haven't seen the FX's true colors. Bottom line: Nvidia banked on Cg and OpenGL, created a card that's more DX8 than DX9, and has now been exposed.

Didn't Carmack have the same problems working on Doom 3? Wouldn't that mean they didn't bank on OpenGL? I was under the impression they just didn't implement the DX9-level features well, meaning if you call the same features from OpenGL you'd have the same problems. Carmack had to write his own NV30 path too, right?

Not saying you're wrong on the Cg/DX8 thing; I just thought OpenGL was having the same problems.

"Pyroman[FO" wrote:

""]

"TheGameguru" wrote:

That's why you haven't seen the FX's true colors. Bottom line: Nvidia banked on Cg and OpenGL, created a card that's more DX8 than DX9, and has now been exposed.

Didn't Carmack have the same problems working on Doom 3? Wouldn't that mean they didn't bank on OpenGL? I was under the impression they just didn't implement the DX9-level features well, meaning if you call the same features from OpenGL you'd have the same problems. Carmack had to write his own NV30 path too, right?

Not saying you're wrong on the Cg/DX8 thing; I just thought OpenGL was having the same problems.

Yes... Carmack stated that he's had to develop a specific ARB path for the FX cards that frequently drops rendering to 16-bit as opposed to 24/32-bit.

You're right, overall OpenGL is suffering under the NV30, but my impression is that it suffers less than DX9 does.

Bottom line: Nvidia really screwed the pooch here. You have to be either a die-hard Nvidiot or just plain ill-informed to purchase one of Nvidia's current offerings.

Carmack stated that he's had to develop a specific ARB path for the FX cards that frequently drops rendering to 16-bit as opposed to 24/32-bit.

I swear, you people just make words up sometimes!

You're right, overall OpenGL is suffering under the NV30, but my impression is that it suffers less than DX9 does.

In his last plan file he states that the 5900 was actually a bit faster in its custom ARB path than the 9800, so it is suffering quite a bit less. But of course there is no higher-level shading API for OpenGL yet, so he had to write all the paths in assembly anyway.
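For anyone who hasn't seen what writing those paths "in assembly" actually looks like, here's a minimal sketch of the kind of thing the standard ARB path involves (assuming a GL context with ARB_fragment_program available and the extension entry points already resolved, say through GLEW; the names are just for illustration):

#include <GL/glew.h>

// Hand-written ARB_fragment_program assembly: modulate the base texture by
// the interpolated vertex color. On this path every temporary is full
// precision, which is where the NV30 gives up its speed.
static const char kArbFragmentProgram[] =
    "!!ARBfp1.0\n"
    "TEMP base;\n"
    "TEX base, fragment.texcoord[0], texture[0], 2D;\n"
    "MUL result.color, base, fragment.color;\n"
    "END\n";

GLuint LoadFragmentProgram(void)
{
    GLuint prog = 0;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)(sizeof(kArbFragmentProgram) - 1),
                       kArbFragmentProgram);
    return prog;
}

Multiply that by every material and light combination in an engine and it's easy to see why nobody wants to maintain extra vendor-specific copies of it.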

"Stric9" wrote:
You're right, overall OpenGL is suffering under the NV30, but my impression is that it suffers less than DX9 does.

In his last plan file he states that the 5900 was actually a bit faster in its custom ARB path than the 9800, so it is suffering quite a bit less. But of course there is no higher-level shading API for OpenGL yet, so he had to write all the paths in assembly anyway.

Well, Christ, I'd sure hope so, seeing as it's doing internal rendering more than half the time at 16-bit, in contrast to 24-bit for the Radeons.

I'm really stoked about HL2, but I really want to see the numbers on Doom 3 jump up for ATI. They are lagging too much for my liking.

"Flux" wrote:

I'm really stoked about HL2, but I really want to see the numbers on Doom 3 jump up for ATI. They are lagging too much for my liking.

Numbers? Have there been any recent benchmarks released for Doom 3?

Not recent, just the benchmarks from a few months ago. I just want an update!

"Flux" wrote:

Not recent, just the benchmarks from a few months ago. I just want an update! :)

Oh... those. I wouldn't worry about that; they are so old and will not be indicative of any sort of final performance.

Well, in the end I am glad I ordered a Radeon and did not stick with NVidia. But who knows, in two years we might all be buying S3 or something. :)

And Carmack sez:

No doubt you heard about the GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative of future DX9 games (including Doom III), or is it just a special case of the HL2 code preferring ATI features, as NVIDIA suggests?

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.

Amen to what Carmack said.

As for what people have been saying about Valve "alienating" their Nvidia customer base, most people have missed one key detail: Half-Life 2 can be run in DX8 mode, in which the FX cards absolutely rock the performance house.

The issue lies with running the game in full DX9 mode. Nvidia's driver philosophy relies on optimizing at a per-application level, whereas ATI, after suffering a reputation for bad drivers for years on end, has successfully worked to make its drivers fully compliant and optimized for general DX9 codebases.
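For what it's worth, the DX8-versus-DX9 split comes down to what the driver reports in its device caps. Here's a rough sketch of how an engine might pick a path (assuming an already-created IDirect3D9 interface; the function and path names are hypothetical):

#include <d3d9.h>

enum RenderPath { PATH_DX7, PATH_DX8, PATH_DX9 };

// Pick a rendering path from the pixel shader version the driver exposes.
// A game (or the user) can still force the DX8 path on cards that report
// 2.0 support but run it slowly, which is the escape hatch the FX series
// ends up needing.
RenderPath ChooseRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return PATH_DX7;

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9;   // full DX9 path: ps_2_0 shaders, high precision
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8;   // DX8 path: ps_1_x shaders, where the FX does fine
    return PATH_DX7;
}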