Half-Life 2 Performance Benchmark Preview
by Anand Lal Shimpi on September 12, 2003 12:34 AM EST
Posted in: GPUs
By now you've heard that our Half-Life 2 benchmarking time took place at an ATI event called "Shader Day." The point of Shader Day was to educate the press about shaders and their importance, and to give a little insight into how ATI's R3x0 architecture is optimized for the type of shader performance necessary for DirectX 9 applications. Granted, there was a huge marketing push from ATI, despite efforts to tone down the usual marketing present at these sorts of events.
One of the presenters at Shader Day was Gabe Newell of Valve, and it was in Gabe's presentation that the information we published here yesterday was revealed. According to Gabe, during the development of Half-Life 2, the development team encountered some very unusual performance numbers. Taken directly from Gabe's slide in the presentation, here's the performance they saw initially:
Taken from Valve Presentation
As you can guess, the folks at Valve were quite shocked. With NVIDIA's fastest offering unable to outperform a Radeon 9600 Pro (the Pro suffix was omitted from Gabe's chart), something was wrong, given that in any other game, the GeForce FX 5900 Ultra would be much closer to the Radeon 9800 Pro in performance.
Working closely with NVIDIA (according to Gabe), Valve ended up developing a special codepath for NVIDIA's NV3x architecture that made some tradeoffs in order to improve performance on NVIDIA's FX cards. The tradeoffs, as explained by Gabe, were mainly using 16-bit precision instead of 32-bit precision for certain floating-point operations and defaulting to Pixel Shader 1.4 (DX8.1) shaders instead of the newer Pixel Shader 2.0 (DX9) shaders in certain cases. Valve refers to this new NV3x codepath as a "mixed mode" of operation, as it is a mixture of full-precision (32-bit) and partial-precision (16-bit) floats, as well as Pixel Shader 2.0 and 1.4 code. There is clearly a visual tradeoff made here, which we will get to shortly, but it was necessary in order to improve performance.
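To get a feel for why dropping from 32-bit to 16-bit floats is a genuine visual tradeoff, here is a small illustrative sketch of our own (not Valve's code, and not shader code) that uses Python's standard-library half-precision packing to show how values get quantized when squeezed into 16 bits:

```python
import struct

def to_fp16(x: float) -> float:
    """Round x to the nearest IEEE 754 half-precision (16-bit) value."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A 16-bit float has only 10 mantissa bits, so representable values get
# sparse quickly. Color math and specular terms computed at this precision
# can visibly band or shimmer compared to the full-precision result.
for x in (0.1, 1.0 / 3.0, 100.05, 2049.0):
    print(f"{x:>12.6f} -> fp16 {to_fp16(x):>12.6f}")
```

Running this shows, for example, that 2049.0 cannot be represented at all at 16 bits (it collapses to 2048.0), and that everyday fractions like 0.1 pick up error in the fourth decimal place. Per-pixel, that error is what the "mixed mode" path trades for speed.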
The resulting performance that the Valve team saw was as follows:
Taken from Valve Presentation
We had to recap the issues here for those who haven't been keeping up with the situation as it unfolded over the past 24 hours, but now that you've seen what Valve has shown us, it's time to dig a bit deeper and answer some very important questions (and of course, get to our own benchmarks under Half-Life 2).
111 Comments
dvinnen - Friday, September 12, 2003 - link
#31: I know what I said. DX9 doesn't require 32-bit. It's not in the spec, so you couldn't write a shader that uses more than 24-bit precision.

XPgeek - Friday, September 12, 2003 - link
Well #26, if the next gen of games does need 32-bit precision, then the tides will once again be turned, and all the "my ATI is so much faster than your NVIDIA" folks will have to just suck it up and buy another new card, whereas the GFFX's will still be plugging along. By then, who knows, maybe DX10 will support 32-bit precision on the NVIDIA cards better... BTW, I'm still loading down my GF3 Ti500, so regardless, I will have crappy perf. But I also buy cards from the company I like, that being Gainward/Cardex NVIDIA-based boards. No ATI for me, also no Intel for me. Why? Because it's my choice. So it may be slower, whoopty-doo!
For all I know, HL2 could run like crap on AMD CPUs as well, so I'll be in good shape then with my XP2400+ and GF3.
Sorry, I know my opinions don't matter, but I put 'em here anyhow.
Buy what you like, don't just follow the herd... unless you like having your face in everyone's ass.
Anonymous User - Friday, September 12, 2003 - link
#28 Not 24-bit, 32-bit.

Anonymous User - Friday, September 12, 2003 - link
Yeah, like mentioned above, what about whether or not AA and AF were turned on in these tests? Do you talk about it somewhere in your article? I can't believe it's not mentioned, since this site was the one that made a detailed (and excellent) presentation of the differences between ATI's and NVIDIA's AA and AF back in the day.
Strange that your benchmarks appear to be silent on the matter. I assume they were both turned off.
Anonymous User - Friday, September 12, 2003 - link
>> "thus need full 32-bit precision." <<

Huh? Wha?
This is an interesting can of worms. So in a few months' time, if ATI sticks to 24-bit, or cannot develop 32-bit precision, the tables will have turned on the current situation - but even more so, because there would be no workaround (or optimization).
Will ATI users in the future accuse Valve of sleeping with Nvidia because their cards cannot shade with 32-bit precision?
Will NVIDIA users claim that ATI users are "non-compliant with DirectX 9"? Will ATI users respond that 24-bit precision is the only acceptable DirectX 9 standard, and that Valve are traitors?
Will Microsoft actually force manufacturers to bloody well wait and follow the standard?
And finally, who did shoot Colonel Mustard in the Dining Room?
Questions, Questions.
dvinnen - Friday, September 12, 2003 - link
#26: It means it can't cheat and use 16-bit registers to do it, and needs a full 24-bit. So it would waste the rest of the register.

Anonymous User - Friday, September 12, 2003 - link
#26 That was in reference to the FX cards. They can do 16- or 32-bit precision. ATI cards do 24-bit precision, which is the DX9 standard. 24-bit is the DX9 standard because it's "good enough": it's much faster than 32-bit, and much better looking than 16-bit. So 16-bit will wear out sooner. Of course, someday 24-bit won't be enough, either, but there's no way of knowing when that'll be.
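To put the precision figures being debated above in perspective, here is a quick back-of-the-envelope sketch (our illustration, not from the article or the comments) of the relative step size each shader float format can resolve, based on their published mantissa widths (fp16 = s10e5, fp24 = s16e7, fp32 = s23e8):

```python
# Mantissa bits determine the smallest distinguishable relative step
# (machine epsilon) of each floating-point shader format:
#   fp16 - NVIDIA NV3x partial precision (10 mantissa bits)
#   fp24 - ATI R3x0 / DX9 PS 2.0 minimum (16 mantissa bits)
#   fp32 - NVIDIA NV3x full precision    (23 mantissa bits)
formats = {"fp16": 10, "fp24": 16, "fp32": 23}
for name, mantissa_bits in formats.items():
    eps = 2.0 ** -mantissa_bits
    print(f"{name}: relative step ~ 2^-{mantissa_bits} = {eps:.2e}")
```

The gap between each format is roughly two decimal digits of precision, which is why 24-bit sits comfortably between "fast but banding-prone" 16-bit and "accurate but slow on NV3x" 32-bit.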
Anonymous User - Friday, September 12, 2003 - link
Valve says no benchmarks on Athlon 64! :-/ Booo!
Quote:
http://www.tomshardware.com/business/20030911/inde...
"Valve was able to heavily increase the performance of the NVIDIA cards with the optimized path but Valve warns that such optimizations won't be possible in future titles, because future shaders will be more complex and will thus need full 32-bit precision."
The new ATI cards only have 24-bit shaders!
So would that leave ALL current ATI cards without any way to run future Valve titles?
Perhaps I do not understand the technology fully, can someone elaborate on this?
Anonymous User - Friday, September 12, 2003 - link
I agree with #23; in terms of money-making power, the ATI/Valve combo is astounding. ATI's design is superior, as we can see, but the point is that ATI is going to get truckloads of money and recognition for this. It's a good day to have stock in ATI; let's all thank them for buying ArtX!

Anonymous User - Friday, September 12, 2003 - link
I emailed Gabe about my 9600 Pro, but he didn't have to do all this just for me :D I love it.