I thought I would give this subject its own topic in the forums instead of continuing the discussion in the comments section of the article. The following quote is from Winter in the comments thread on the front page.
How is it that every other game company has no problems making the NVidia hardware work right? It sounds like Carmack had no problems getting relatively equal performance out of the ATI and NVIdia cards
Well, the short answer to this is that every game up until Half-Life 2 has been either OpenGL-based or at most DX8 compliant. The same is true of Carmack and Doom 3: he is using OpenGL. Since OpenGL 2.0 has not been officially released yet, Carmack is doing all of his shader coding in assembly. The whole point of DX9 is to give programmers a unified, higher-level API for writing shader code. Beyond that, DirectX 9 allows more instructions per shader program and specifies higher color precision.
Of course, Valve did mention that they coded an Nvidia-specific path for the game. According to Valve, this mixed-mode path uses 16-bit precision wherever possible without sacrificing image quality, but they also say this specialized codepath took them five times as long as the general DX9 codepath.
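For context on what that 16-bit precision actually costs, here is a small illustrative sketch (assuming the FX's partial-precision mode behaves like standard IEEE half precision, which Python's struct module can round-trip through):

```python
import struct

def roundtrip_fp16(x):
    """Round a float through IEEE 754 half precision (16-bit)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# fp16 has only a 10-bit mantissa, so above 2048 only every other
# integer is representable -- fine for many color values, not for all math.
print(roundtrip_fp16(2048.0))  # 2048.0 -- still exact
print(roundtrip_fp16(2049.0))  # 2048.0 -- the exact value is unrepresentable
```

That is why "wherever possible without sacrificing image quality" takes real engineering time: someone has to decide, shader by shader, where that kind of rounding is invisible and where it is not.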
Let's not forget this isn't the first sign of bad DX9 performance in Nvidia's latest $400 mega-card. There was of course the whole 3DMark 2003 fiasco, as well as John Carmack's .plan file, which said the following:
The NV30 runs the ARB2 path MUCH slower than the NV30 path. Half the speed at the moment. This is unfortunate, because when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins.

The reason for this is that ATI does everything at high precision all the time, while Nvidia internally supports three different precisions with different performances. To make it even more complicated, the exact precision that ATI uses is in between the floating point precisions offered by Nvidia, so when Nvidia runs fragment programs, they are at a higher precision than ATI's, which is some justification for the slower speed. Nvidia assures me that there is a lot of room for improving the fragment program performance with improved driver compiler technology.
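To make the precision numbers concrete: the commonly reported mantissa widths (not from the quote itself) are 10 bits for Nvidia's half precision, 16 bits for ATI's 24-bit internal format, and 23 bits for full 32-bit floats. The relative precision of each format is just a power of two:

```python
# Relative precision (spacing between representable values at 1.0)
# is 2**-mantissa_bits. Mantissa widths below are the commonly cited
# figures for each format, listed here as assumptions.
formats = [
    ("fp16 (NV30 partial)", 10),
    ("fp24 (R300 internal)", 16),
    ("fp32 (NV30 full)",     23),
]
for name, bits in formats:
    print(f"{name}: epsilon = 2**-{bits} = {2.0 ** -bits:.3e}")
```

The fp24 figure lands between the two Nvidia options, which is exactly the "in between" situation Carmack describes.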
Of course, this explains why the mixed-mode path in HL2 makes the Nvidia card perform better. Later in Carmack's .plan file he says this:
For developers doing forward looking work, there is a different tradeoff -- the NV30 runs fragment programs much slower, but it has a huge maximum instruction count. I have bumped into program limits on the R300 already.
That also indicates a general performance lead for ATI in shader program execution. With three different sources telling us that ATI cards run general code faster than Nvidia's latest, I suspect Nvidia, not Valve, may be the party at fault here.
Further, let's consider a logical argument. Would it really make sense for Valve to alienate the huge installed base of Nvidia card owners if they didn't think this was a major issue? No, it wouldn't. The thinking here is that if users are going to upgrade to a new DX9 part for HL2, then Valve wants to let you know which card is the best buy.
Errr... can someone clear this up for me? Will you be able to play Half-Life 2 in multiplayer without connecting to Steam? 'Cause if you can't, then they can kiss my cash goodbye. Anything that forces you to buy a router or other extra crap just to play over a LAN sucks pretty majorly. I regularly host LAN parties, and I don't need to connect to the net for anything so far. I wouldn't like to be anyone who hosts small LAN parties and has only 56k dialup!
MUST...STOP ...PLAYING ....SWG! MUST GET REAL LIFE BACK AGAIN!
Nobody has the "final" answer about Steam yet, Keg. As soon as any more solid details are made available, I will post them.
PSN ID: Haul_N_Oats
Hey Flux, did you get my page? Give me a call.
Tobyus
[color=red]Still searching for the perfect game...[/color]
[size=10]Last edited by Tobyus on Sep 14, 2006 - 02:06 PM; edited 1,000,000 times in total[/size]
Stric9 summed up what I was trying to say in the comments much better.
Of course, none of this really affects me since I have a GeForce4 4400, and those are only DX8 anyway; they run Half-Life 2 about as well as they're going to. It only affects the 5x00 series, as those are the ones that are supposed to have DX9 support. They actually run slower than the 4600 does in DX8. I feel sorry for people who bought them.
I just noticed on Evil's site there's a link to a petition on petitiononline.com about needing an internet connection to play HL2 in single player and at LAN parties. As stated previously, there's no way I'm buying a game with single-player content that requires me to connect to Steam every damn time!
EDIT: They really need to clarify the petition and put up a link to where it actually says that you'll need an internet connection to play the single player. I do wish Valve would say something to clarify their position too!
All of this makes me happy that I'm still running my GF4 4400 card. I'm holding out a while before upgrading again.
Switch: SW-5816-4534-9106
I actually understood that before, but my point is that any company that gives its customers a substandard product that will alienate half its target audience is probably not long for this world. Considering how long the Nvidia parts have been out, they could have tried to eke out some decent performance. The same goes for Nvidia making a non-standard part. It just seems strange to me that two cards (FX vs. Radeon) with roughly equal performance in most applications suddenly differ drastically in just one application. And you know a company like Valve could have had access to GeForce FX parts long before anybody else did. It just seems so primitive now to have to "suffer" at 30 fps instead of the graphical luxury that 60 fps brings. I am glad I did not buy the FX 5900, or I would be truly pissed off about this. At both companies.
Sorry for two replies in a row, but it looks like Nvidia answered Gabe with a press release of some sort, as reported by 3dgpu.com, Evil Avatar, etc. URL to the news post below:
http://www.3dgpu.com/modules/news/article.php?storyid=374
My take is that nobody is taking the blame. Ah well, it makes for entertainment, if nothing else.
My personal feeling is that this will be fixed, much as the apparently blown-out-of-proportion texture problem with the game was fixed very quickly.
Money can't buy you happiness... but it can buy you a boat big enough to sail right up next to it! - David Lee Roth
This press release strengthens my belief that Nvidia is doing something fishy. For one, it says they worked closely with Valve during the entire development of HL2, which tells me Valve decided they were being deceived or misled by Nvidia somehow. Otherwise, why not tell a company you were working closely with this info in private?
Further, to ease everyone's fears, Nvidia says they don't understand why Valve is not using the 50-series drivers. Yet Valve supposedly specifically requested that reviewers not use the 50-series drivers in the HL2 benchmark that is going to be released soon, as they have concerns about some of the optimizations going a bit too far.
I will reiterate: how many developers will have to point the finger at Nvidia before everyone finally realizes Nvidia screwed up with the whole NV30 series? This isn't one two-bit development house making these claims; it has been two of the major players in the industry (Valve, Futuremark). What could these two companies possibly have to gain by attacking Nvidia?
No LAN play without the Internet?!
...
[Silence of the Lambs]
Valve! YOU DON'T KNOW WHAT PAIN IS!!!
[/Silence of the Lambs]
Anyone who posted in this thread is a racist.*
*Except me. - Certis
Well... that's actually very simple to answer.
Count how many titles have been released so far that make extensive use of PS 2.0 and other DX9 features.
That's why you haven't seen the FX's true colors. Bottom line: Nvidia banked on Cg and OpenGL and created a card that's more DX8 than DX9, and now they have gotten exposed.
And it won't be just one application. If you read the slides that Gabe used, you'll see that if a developer plans on extensive use of DX9 features, they will have to develop custom code just to get the game to run decently on the FX series.
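As a rough sketch of what that per-vendor branching looks like from an engine's point of view (all names here are hypothetical, not Valve's actual code):

```python
def pick_render_path(supports_ps20, vendor, fast_full_precision):
    """Choose a render path for a card; hypothetical illustration only."""
    if not supports_ps20:
        return "dx8"          # GeForce4-class and older hardware
    if vendor == "nvidia" and not fast_full_precision:
        return "nv3x_mixed"   # hand-tuned path with 16-bit partial precision
    return "dx9_generic"      # the standard full-precision DX9 path

print(pick_render_path(True, "ati", True))       # dx9_generic
print(pick_render_path(True, "nvidia", False))   # nv3x_mixed
print(pick_render_path(False, "nvidia", False))  # dx8
```

The branching itself is trivial; the expense is that the "nv3x_mixed" branch implies a second, hand-tuned copy of every affected shader.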
Aint nothing new about the world order..it's been playing since the day they put George Washington on a quarter
Delivering Truth while the 10% deliver lies.
Everyone needs to read the article on anandtech.com posted last night.
Also, get the new HL2 video at www.fileshack.com... WOW.
Didn't Carmack have the same problems working on Doom 3? Wouldn't that mean they didn't bank on OpenGL? I was under the impression they just didn't implement the DX9 features well, meaning if you call the same features from OpenGL you'd have the same problems. Carmack had to write his own NV30 path too, right?
Not saying you're wrong on the Cg/DX8 thing, I just thought OpenGL was having the same problems.
Yes... Carmack stated that he's had to develop a specific ARB path for the FX cards that frequently drops rendering to 16-bit as opposed to 24/32-bit.
You're right that overall OpenGL is suffering under the NV30, but my impression is that it suffers less than DX9 does.
Bottom line: Nvidia really screwed the pooch here. You have to be either a die-hard Nvidiot or just plain ill-informed to purchase one of Nvidia's current offerings.
I swear, you people just make words up sometimes!
The thing about smart people is they seem like crazy people to dumb people -- Thing I saw on the Internet
In his last .plan file he states that the 5900 was actually a bit faster in its custom ARB path than the 9800, so it is suffering quite a bit less. But of course there is no higher-level shading API for OpenGL yet, so he had to write all the paths in assembly anyway.
Well, Christ, I'd sure hope so, seeing as it's doing internal rendering more than half the time at 16-bit, in contrast to 24-bit for the Radeons.
I am really stoked about HL2, but I really want to see the Doom 3 numbers jump up for ATI. They are lagging too much for my liking.
Numbers? Have there been any recent benchmarks released for Doom 3?
Not recent, just the benchmarks from a few months ago. I just want an update!
Oh... those. I wouldn't worry about that. They are so old and will not be indicative of any sort of final performance.
Well, in the end I am glad I ordered a Radeon and did not stick with Nvidia. But who knows, in two years we might all be buying S3 or something :).
No hay banda!
And Carmack sez:
Amen for what Carmack said.
As for what people have been saying about Valve "alienating" their Nvidia customer base, most people have missed one key detail: Half-Life 2 can be run in DX8 mode, in which the FX cards absolutely rock the performance house.
The issues lie with running the game in full DX9 mode. Nvidia's driver philosophy relies on optimizing at a per-application level, whereas ATI, after suffering a reputation for bad drivers for years on end, has successfully worked to make its drivers fully compliant and optimized for general DX9 codebases.