Half-Life 2 Performance Benchmark Preview
by Anand Lal Shimpi on September 12, 2003 12:34 AM EST - Posted in
- GPUs
It's almost ironic that the one industry we deal with that is directly related to entertainment has been the least exciting for the longest time. The graphics world has been littered with controversies surrounding very fickle things as of late; the majority of articles you'll see relating to graphics these days don't have anything to do with how fast the latest $500 card will run. Instead, we're left to argue about the definition of the word "cheating". We pick at pixels with hopes of differentiating two of the fiercest competitors the GPU world has ever seen, and we debate over 3DMark.
What's interesting is that all of the things we have occupied ourselves with in recent times have been present throughout history. Graphics companies have always had questionable optimizations in their drivers, they have almost always differed in how they render a scene and yes, 3DMark has been around for quite some time now (only recently has it become "cool" to take issue with it).
So why is it that in the age of incredibly fast, absurdly powerful DirectX 9 hardware we find it necessary to bicker about everything but the hardware? Because, for the most part, we've had absolutely nothing better to do with this hardware. Our last set of GPU reviews was focused on two cards - ATI's Radeon 9800 Pro (256MB) and NVIDIA's GeForce FX 5900 Ultra, both of which carried a hefty $499 price tag. What were we able to do with this kind of hardware? Run Unreal Tournament 2003 at 1600x1200 with 4X AA enabled and still have power to spare, or run Quake III Arena at fairytale frame rates. Both ATI and NVIDIA have spent countless millions of transistors, expensive die space and even sacrificed current-generation game performance in order to bring us some very powerful pixel shader units with their GPUs. Yet, we have been using them while letting their pixel shading muscles atrophy.
Honestly, since the Radeon 9700 Pro, we haven't needed any more performance to satisfy the needs of today's games. If you take the most popular game in recent history, the Frozen Throne expansion to Warcraft III, you could run that just fine on a GeForce4 MX - a $500 GeForce FX 5900 Ultra was in no way, shape or form necessary.
The argument we heard from both GPU camps was that you were buying for the future; that a card you would buy today could not only run all of your current games extremely well, but you'd be guaranteed good performance in the next generation of games. The problem with this argument was that there was no guarantee when the "next generation" of games would be out. And by the time they arrive, prices on these wonderfully expensive graphics cards may have fallen significantly. Then there's the fact that how well cards perform in today's pixel-shaderless games honestly says nothing about how DirectX 9 games will perform. And this brought us to the joyful issue of using 3DMark as a benchmark.
If you haven't noticed, we've never relied on 3DMark as a performance tool in our 3D graphics benchmark suites. The only times we've included it, we've either used it in the context of a CPU comparison or to make sure fill rates were in line with what we were expecting. With 3DMark 03, the fine folks at Futuremark had a very ambitious goal in mind - to predict the performance of future DirectX 9 titles using their own shader code designed to mimic what various developers were working on. The goal was admirable; however, if we're going to recommend something to millions of readers, we're not going to base it solely on one synthetic benchmark that may or may not be indicative of the performance of future games. The difference between the next generation of games and what we've seen in the past is that the performance of one game is much less indicative of the performance of the rest of the market. As you'll see, we're no longer memory bandwidth bound - we're finally going to start dealing with games whose performance is determined by their pixel shader programs and how the GPU's execution units handle them.
All of this discussion isn't for naught, as it brings us to why today is so very important. Not too long ago, we were able to benchmark Doom3 and show you a preview of its performance; but with the game being delayed until next year, we have to turn to yet another title to finally take advantage of this hardware - Half-Life 2. With the game almost done and a benchmarkable demo due out on September 30th, it isn't a surprise that we were given the opportunity to benchmark the demos shown off by Valve at E3 this year.
Unfortunately, the story here isn't as simple as how fast your card will perform under Half-Life 2; of course, given the history of the 3D graphics industry, would you really expect something like this to be without controversy?
111 Comments
Anonymous User - Friday, September 12, 2003 - link
I previously posted this in the wrong place, so let me just shamelessly repost it here:
Let me just get my little disclaimer out before I dive into being a devil's advocate - I own both a 9800 Pro and an FX 5900 NU and am not biased toward either ATi or nVidia.
With that being said, let me take a shot at what Anand opted not to speculate about, and that is the question of the ATi/Valve collaboration and their present and future relationship.
First of all, the FX's architecture is obviously inferior to the R3x0 in terms of native DX9, but that is not going to be my focus. I would rather debate a little about the business/financial side of the ATi/Valve relationship. That's the area of my expertise, and looking at this situation from a financial angle might add another twist to this.
What got my attention are Gabe Newell's presentation slides, which omitted small but significant things like the "Pro" behind the R9600, and his statement about "optimizations going too far" without actually going into specifics, other than that the new Detonators don't render fog. Those are small but significant details that add a little oil to the very hot issue of "cheating" in regards to nVidia's "optimizations".
But I spoke of the financial side of things, so let me get back to it. After clearly stating how superior ATi's hardware is to the FX, and how much effort they have invested to make the game work on the FX (which is absolutely commendable), I can not help but notice that all this perfectly leads into the next great thing. Half-Life 2 will be bundled with a new line of ATi cards (or vice versa), and ATi is just getting ready to offer a value DX9 line. Remember, that was the only area they had not covered, and nVidia was selling truckloads of FX 5200s in the meantime. After they have demonstrated how poorly the FX flagship performs, let alone the value parts, isn't it a perfect lead-in to selling shiploads of those bundled cards (games)? Add to that Gabe's shooting down of any optimization efforts on nVidia's part (simply insinuating "cheats"), and things are slowly moving in the right direction. And to top it all off, Valve explicitly said that future additions will not be done for DX8 or the so-called mixed class, but exclusively for DX9. What is Joe Consumer to do then? The only logical thing - get him/herself one of those bundles.
That concludes my observations on this angle of the newly emerged attraction, and I see only good things on the horizon for ATi stockholders.
Feel free to debate, disagree and criticize, but keep in mind that I am not defending or bashing anybody, just offering my opinion on a part I considered just as interesting as the hardware performance.
Anonymous User - Friday, September 12, 2003 - link
Wow... I buy a new video card every 3 years or so... my last one was a GF2 Pro... hehe... I'm so glad to have a 9800 Pro right now. Snif.. I'm proud to be Canadian ;-)
Anonymous User - Friday, September 12, 2003 - link
How come the 9600 Pro hardly loses any performance going from 1024 to 1280? Shouldn't it be affected by only having 4 pipelines?
Anonymous User - Friday, September 12, 2003 - link
MUHAHAHA!!! Go the 9600 Pros, I'd like to bitch slap my friends for telling me the 9600s will not run Half-Life 2. I guess I can now purchase an All-In-Wonder 9600 Pro.
Anonymous User - Friday, September 12, 2003 - link
Man, I burst into a coughing/laughing spree when I saw an ad using nVidia's "The way it's meant to be played" slogan. Funny thing is, I first noticed the ad on the page titled "What's Wrong with Nvidia?"
Anonymous User - Friday, September 12, 2003 - link
booyah, i hope my ti4200 can hold me over at 800x600 until i can switch to ATI! big up canada
Anonymous User - Friday, September 12, 2003 - link
You can bet your house nVidia's 50 drivers will get closer performance, but they're STILL thoroughly bitchslapped... Ppl will be buying R9x00s by the ton. nVidia better watch out, or they'll go down like, what was its name, 3dfx?
dvinnen - Friday, September 12, 2003 - link
Hehe, I concur. Seeing a 9500 on there would have been nice. But what I really want to see is some AF turned on. I can live with no AA (ok, 2x AA), but I'll be damned if AF isn't going to be on.
Anonymous User - Friday, September 12, 2003 - link
Anand, you guys rock. It's because of your in-depth reviews that I purchased the Radeon 9500 Pro. I noticed the oddity mentioned of the small performance gap between the 9700 Pro and the 9600 Pro at 1280x1024. I would really like to see how the 9500 Pro is affected by this (and all the other benchmarks). If you have a chance, could you run a comparison between the 9500 Pro and the 9600 Pro? (I guess what I really want to know is if my 9500 Pro is better than a 9600 Pro for this game.)
Arigato,
The Internal
Pete - Friday, September 12, 2003 - link
(Whoops, that was me above (lucky #13)--entered the wrong p/w.)