You guys don't seem to understand how software works, I guess. ATI and NVIDIA aren't making the same cards, and the coding has a lot to do with performance.
Nvidia can run Doom III better than ATi cards? But then why would Half-Life 2, a game with less graphics and shading, run differently? It's all done with coding: good for ATI, which runs Half-Life best, and good for Nvidia, which runs Doom III best. It doesn't matter anyway; your ATI and NVIDIA cards will soon be obsolete, so who cares?
#162 - I'm there with ya. Whenever it's time for me to upgrade, I buy both a GPU and a CPU that are around the $100-$150 price point.
Funny, I always seem to get a nice-performing machine at that price without resorting to eating ramen noodles to support my hardware fix! :)
I also work on computers/network infrastructure systems all day, so I guess it's just like the mechanic who drives a junker car - I am never anywhere near bleeding-edge computer technology.
Silly peeps, the Ti4200 ($76) costs a fair bit less than $100 - usually almost half the price of the entry-level ATI card at that performance standard (the 9500, at $113). It's not as good a card, agreed - but if you're gloating because you paid 2x-6x to get a better card, and lo and behold, you did - well, DUH! Yeah, the FX class of card from Nvidia is (ahem) disappointing - okay, sadly disappointing - and Nvidia does suck for pushing a card priced way over its performance - but come on! Some people pay up to $400 so they can 'better' play a few $50 games! You gotta love this market, right? In 6-9 months, the market will drive the technology to $99 or less (it always does) - then I buy. Maybe you early adopters who get 'cut' by the 'bleeding edge' will learn a little patience so supply and demand has a chance to work, instead of getting suckered by this hype-driven marketing. Don't get me wrong - I have always liked ATI - just not to the point where I'd give them twice the price without getting twice the performance. Price it under $100, and I'll be an ATI customer again.
Won't be long, actually.
Maybe I'm 'out' $80 for buying this 'crappy' card. Oh well - at least it's not $150-$400 due to inadequate research or leapfrogging technology. (Come on - how many of you have a $200-$400 gfx card you had to salvage on eBay, or that's in your little bro's machine?) The technology evolves too quickly for me to waste my money - but that's just me; to those of you who can afford to pay for performance, I tip my hat.
This is the nature of technology, after all - it's good when new technology makes old technology crap, and competition makes better pricing for everyone. The system ain't perfect - but learn from it. All this "nyah nyah your card sucks" shit is juvie.
Take care.
Great review! It puts in perspective what Valve did with their presentation in telling the truth about the current line of NV hardware. One thing I have been reading everywhere is that the Det50s will be great for NV cards and DX9 performance, but nobody mentions that the Cat 3.8s will do the same for ATI hardware.
How dare some of you criticize Valve's work. Have you actually read or seen some of the demonstrations? The physics alone are far superior to those of Doom III. As for the graphics? Where can you honestly say D3 is better? Sure, the game looks great, but they only have to model one or two kinds of environments: a freaking spaceship and maybe some outside terrain! The HL2 folks have to make at least 8 different environments. And as for the gameplay, HA, don't even bring that up. Weeee, I'm shooting zombies on a spaceship, weeeee, no strategy or AI, weeee. About the only thing D3 offers is per-poly collision.
Anand might have journalism skills, but if you're going to tell people to wait until midnight for an article, then at least be prepared to deliver, not sit around and upload your article at 12:25am or some shit like that. That's like me saying I'm going to get something to eat and I'll be back at 3pm, then showing up at 1am your time.
Anand, this sucks ass!!!!!!!!! WTF, you said in the article it'd be up today at 12AM! Hope your pussy ass is running an NV5900 so you can't play HL2 at DX9 settings :p
This is almost sadistic, declaring a deadline for something this eagerly awaited and then not doing anything. Other staffers seem to be awake, though... SAVE THE WORLD PLEASE, etc. etc.
Anand's done this before ("next week we will...", "we will look at X two weeks from now...", etc.) without any results. Not that I'm having a go - his journalism is fantastic - but it's better to *mean* what you say when you say these things. It comes across as rather unprofessional otherwise.
Maybe they're putting the finishing touches on it? Seriously though, keeping people waiting like this... ten minutes is perfectly fine, thirty is barely acceptable. I bet it'll come at like 4:30AM :P
I'm poster #126, but since OpenGL seems able to encompass and work with NVidia's hardware 'limitations', why couldn't MS have made DX9 more flexible, given that NVidia would have made any issues known beforehand? MS seems to have let NVidia down rather dramatically.
First off, I'm no fanboy. I only buy hardware on price/performance. As such, I must say my only reaction to (recent) nVidia customers is sympathy. I would certainly be upset. I can't think it's entirely (perhaps even mostly) nVidia's fault, given their R&D took place before the DX9 standard came out - the point when their hand was already played. Is it MS's fault, then, having been informed of Nvidia's hardware specs? Perhaps. My hope is that future games will come out with OpenGL support, so that Nvidia customers will have an alternative. That would make such unreasonably poor framerates a temporary problem with a few soon-to-be-released DX9-only games. Most people already expect the Doom3 engine to become the most widely adopted engine - perhaps this will ensure it even more so? As someone else pointed out, OpenGL is currently not mature enough that everyone is looking to implement it the way id has in that game. Doom3 should definitely change that. In any case, don't berate NVidia too much; the optimisations are for their customers' benefit, given they have arguably been failed by DX9. Nonetheless, these benches are going to scare a lot of people who don't understand the technical reasons. The current Nvidia cards *still are* good cards for everything except DX9.
To cut it short, please don't gloat at or deride each other. A lot of people have been unlucky. And surely, all being consumers, aren't we all in the same boat? I don't understand the gloating or fanboyism.
I think people with FXs shouldn't worry too much. OpenGL may become more of a standard than DX9 - what game developer wants to alienate a large segment of the market? If you own an FX and HL2 was one of your most anticipated games, then you won't be able to run it in DX9 in its full glory. Instead you'll have to run it at a very high resolution with 4xAA, etc. - IMHO that's not bad, not bad at all. If that's not enough, then buy ATI - it's your money.
I've finished my ramblings. Peace all.
To anyone who has said that Valve has no coding skills: I'd LOVE to see you make a game as good-looking and as fast as HL2. If you can't, then you have NO RIGHT to judge the competence of Valve's programmers, as they are obviously better than you. I'd say HL2 proves they are extremely talented, and with everything that HL2 is throwing at the CPU and GPU, those numbers seem fine. Keep in mind, the game isn't out yet.
(Personally, I'm very happy to see that my AIW 9700 Pro will run HL2 just fine :D)
IRT #115: VALVe didn't make CS; they merely bought it when it became extremely popular, then proceeded to ruin it by making it so noob-friendly that it has lost all its depth and strategy. CS is now a shitty DM game.
HL2 will be good, but I think most people are more interested in HL2 mods than in the actual game, most of which will be retail-only. Making a mod for HL2, with its advanced gfx etc., will be a lot more difficult than it was for HL1; making maps for HL2 might even require a team, let alone an entire new mod.
The hype around HL2 is getting so ridiculous that it can only be a disappointment if people aren't careful. I didn't think HL1 was amazing - it was very good, but not amazing. If you expect the world from HL2 then you will probably be disappointed, whereas someone who expects an OK game will be impressed if it turns out to be merely a good game.
I'm just wondering, does anyone remember when the first sneak peeks of Doom3 came out and the Catalyst drivers were broken for them? The 9800 Pro got like 10fps; shortly after, the situation was resolved. I have an AIW 9800 Pro, but I also have a BFG FX 5600 256 that overclocks nicely to above Ultra specs. Just wait until after the game ships to decide what is best and where it fits into the big picture.
NV is saying that the Det50s will be "THEIR BEST DRIVERS EVER." I guess they'll be great at running normal DX9 code without optimizations. HELL NO - what they are going to do is even more of this: no fog, FX12 precision, crap IQ, no AF+AA at high res. But ATI also has "THEIR BEST DRIVERS EVER" coming, and I know they will be straight DX9 with no optimizations: FP24 precision, awesome IQ, AF+AA at high res. Too bad, NV - the shit hit the fan; this generation is crap. And the 9600 is coming to the sub-$100 market in a few months. NV has lost the high- and mid-range markets to ATI, and the DX9 low end will be ATI's domain too. If you own NVIDIA stock, sell it until NV40 - this gen is crap!!
Basically, the nVidia performance stinks either way, IMHO. If the new 5x.xx drivers fix it, then so be it; that will be great for those cards, and then they can run future Valve games. The game runs fine on a Ti4600 using DX8.
However, the new ATI cards only have 24-bit shaders! So would that leave ALL current ATI cards without any way to run future Valve titles?
Perhaps I do not understand the technology fully; can someone elaborate on this?
Here is how your performance will be, depending on the card you have: if you have a DX9 ATI card, your performance will be OK. If you have a current DX9 nVidia card, your performance will not be nearly as good as an equivalently priced ATI card. Of course nVidia will release a "fixed" card within about 12 months; it would be suicide not to.
If you're using a DX8.x part, like a Ti4200 or an equivalent-generation Radeon, then the performance will be roughly the same on either.
Likewise, Doom3 is rendered using OpenGL, and therefore whatever card you own will run it as well as it can run OpenGL; DirectX compliance will have no effect on your FPS in Doom. Some websites have benchmarks for OpenGL games; you can review those to get a good impression of how each kind of card will perform.
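To make the DX8-vs-DX9 split concrete, here's a minimal sketch of how an engine might pick a codepath from the Direct3D 9 device caps. The enum names and tier mapping are illustrative assumptions, not Valve's actual logic:

```cpp
#include <d3d9.h>

// Hypothetical codepath selector: picks a rendering path from the pixel
// shader version the driver reports. The names here are made up for
// illustration; a shipping engine's logic is more involved.
enum RenderPath { PATH_DX7_FIXED, PATH_DX8_SHADERS, PATH_DX9_PS20 };

RenderPath ChoosePath(IDirect3D9* d3d, UINT adapter)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return PATH_DX7_FIXED;                      // no usable HAL device

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_DX9_PS20;      // Radeon 9500+ / GeForce FX class
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8_SHADERS;   // GeForce 3/4 Ti, Radeon 8500 class
    return PATH_DX7_FIXED;         // TNT2 / GeForce 2 class, fixed function
}
```

The catch, and arguably the whole story of this thread, is that a GeForce FX reports PS 2.0 support in its caps, so a check like this would put it on the DX9 path; per the comments below, Valve reportedly defaults the FX 5200/5600 to DX8 anyway, because caps say nothing about speed.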
Well, I'm still running on my Ti4400 - I'll wait to see how "horrible" it is before I make any changes.
I do think it's funny, though. I've got some Radeon and nVidia cards here. (I'm never cutting-edge, though - my fastest PC is only a P4 1.8.)
What a silly thing to gloat or argue about. I was never fond of ATI because I was never satisfied with the timeliness or reliability of their drivers in the past (maybe that's changed now, I'm not sure). When I upgrade, I just buy whatever the best card is in the $150-$175 range.
As for the question of who is conspiring with whom: that's silly as well. There is absolutely nothing wrong with a company partnering with another, or even making its product work better with another's - even if that's not what is going on here. There isn't anything illegal or nefarious about it. It's called the free market. So your silly talk about a class-action lawsuit against nvidia is meritless. They sold you a card that does work with (is compatible with) DirectX 9. Now, since the card came out BEFORE the standard, and DX9 games came out AFTER the card, it was your own choice to purchase a card that may not be the BEST card.
Some of you need a serious lesson in market economics. The power you have is to vote with your wallets. That can start with a healthy debate of the facts, but this is a useless debate of speculation and conspiracy theories.
Valve's biggest interest is to provide a game to the largest audience possible and to make gaming folks happy. That nets them more profits. And that's the wonderful thing about a free-market economy. I highly doubt Valve would intentionally do anything to alienate the nvidia market, since there is a gigantic installed base of nvidia-based GPUs.
I'll play HL2 on my P4 1.8 with a GF4 Ti4400 128M card. If I want to turn on ALL the eye candy and run at more FPS, then I'll have to spend some cash and build myself a new gaming rig. My guess is that it will probably run pretty well and I'll be perfectly satisfied on my current machine. After the initial release, I'm guessing that future patches of HL2 and the Det5s will eke out a couple extra FPS, and that will just be an added bonus.
I will buy HL2 for a couple of reasons. The first is rewarding Valve's loyalty to their customers. I spent probably $50 buying HL1 back in 1998, and I got 5 good years of gaming enjoyment. They maintained patches and support WAY BEYOND any reasonable expectations for a game. I got some REALLY good extended play out of their game with FREE mods like CS and DoD. I will buy HL2 to show that their loyalty to me will be rewarded in turn. I'd like to send a message to other game developers that open-platform games and long-term support are what we insist on and reward with our money. The other reason is Valve showing that they will accept nothing less than cutting edge. HL1 was groundbreaking at the time, and HL2 looks like it will be the same.
Dude, the problem with Nvidia is not that they intended to build a crappy card based on 3dfx technologies. They went with the .13-micron process at TSMC, which was rather new and too bleeding-edge at the time. ATI is using the more conservative .15-micron process, which is easier to get good yields out of, so they are fine right now.
From what I heard, if Nvidia had been more conservative back when they designed the NV30, they would have been OK. ATI decided to wait and let everyone else at TSMC (including Nvidia) work out all the process problems before jumping aboard.
It had taken Jen-Hsun Huang many a market cycle and an enormous amount of money, but on that fateful Friday in December, 3dfx was finally his. Yes, the ancient Voodoo technology was his at last... but, unbeknownst to him, so was its curse.
"There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. "
Not at all. All you had to do was pay attention to the supposedly "useless" synthetic Pixel Shader 2.0 benchmarks that have been around for MONTHS. They told exactly the same story: the GeForce FX family of cards has abominably bad PS 2.0 performance unless you tailor-make a partial-precision path specifically for them. That the partial-precision path results in lower image quality isn't as important... when your drivers detect SCREEN CAPTURING and then MANIPULATE THE OUTPUT QUALITY to make it appear better!
If nvidia had designed a part that ran DX9 code properly and with decent speed, there would be no discussion here. The fact is they didn't. And their only recourse until NV40 has been to manipulate their drivers to an almost unbelievable extent, reducing quality and introducing arbitrary clip planes at will.
I don't own an ATI PS2.0 capable part, but it has been obvious since the introduction of the 9700 that it is the one to buy.
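For anyone wondering what a "partial precision path" actually looks like in practice, here's a rough sketch assuming DX9-era HLSL, where the only difference is declaring values as `half` (which compiles to the ps_2_0 `_pp` modifier, FP16 on NV3x) instead of `float`. The shader text and the vendor check are illustrative, not anyone's real code:

```cpp
// Two versions of the same trivial pixel shader as HLSL source.
// NV3x runs 'float' math at FP32 (slow) and 'half' math at FP16 (fast,
// lower precision); R3x0 ignores the hint and always computes at FP24.
static const char* kFullPrecisionPS =
    "sampler baseMap;\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    float4 c = tex2D(baseMap, uv);\n"
    "    return c * c.a;\n"
    "}\n";

static const char* kPartialPrecisionPS =
    "sampler baseMap;\n"
    "half4 main(half2 uv : TEXCOORD0) : COLOR {\n"
    "    half4 c = tex2D(baseMap, uv);\n"    // 'half' -> _pp modifier
    "    return c * c.a;\n"
    "}\n";

// Hypothetical per-vendor selection, the kind of fork a special NV3x
// codepath implies. PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI.
const char* PickShaderSource(unsigned vendorId)
{
    return (vendorId == 0x10DE) ? kPartialPrecisionPS : kFullPrecisionPS;
}
```

Multiply that tiny fork by every shader in a game, plus testing where FP16 visibly breaks, and Newell's "five times the development time" claim starts to sound plausible.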
#106) The original HL is based on the original Quake engine, which was built on OpenGL. If you notice, the DX version of HL sucks - it sucks a lot. It looks terrible and is buggy, and it isn't supported as fully as the OpenGL version; there are a bunch of lighting effects in the OpenGL version that aren't in DX mode. The only time I use DX mode is to debug. Now imagine porting the vastly more complex HL2 over.
To get a good overview of this thread minus most of the fanboyism, use this index with my opinions:
#24 - right on! Too bad this thread didn't end here, though.
#33 - um... no.
#39 - 320x240 (just like Quake in the 486 days).
#42 - agree.
#44 - 1993.
#62 - because of IPC (efficiency).
#71 - correct!
#72 - LOL.
#76 - BINGO.
#80 - correct.
#81 - it is ALWAYS a marketing war, silly.
#86 - heavy denial (and when the fairy godmother came, she turned him into a real DX9 card...).
#93 - e x a c t l y.
#103 - DON'T READ THIS ONE... I actually did, and always fell asleep and fell out of my chair. Nah, just kidding... but it's the best "summary" of this thread.
#106 - Actually, if you've ever done Fortran/VB/C/C++/Java/Perl, then you would know that "programming" isn't just fun-time... it's a lot of work, and it sucks if you have to do it twice.
The reason Valve went with DX9 over the new OGL engine is that DX9 is more mature than the new OGL standard... which isn't even officially released yet.
#104, they did find a way to run Nvidia cards smoothly: DX8. They can't guarantee full support with all cards, but they have done their best. I don't think HL2 was geared towards ATI cards; instead, that's just how it worked out.
I wonder if Doom 3 will have any similar problems with Nvidia cards...
Well, IMHO I think Valve will suffer more because they aren't reaching the whole market of Nvidia users out there. For those guys who want to spend the extra money on hardware just to play HL2, good for you, but I personally don't have the money to throw around like that every time I want to play the latest game.
Yes, they are. I doubt saying this is going to affect anything, but seriously, these flames are getting WAY out of hand.
People with GeForce FXes who are sore about their cards being slow in HL2: them's the breaks, it's not your fault. Who knew? But don't accuse Valve of going Futuremark on the community until there's at least a grain of evidence. And don't tell me "Valve and ATi working together is evidence enough," because IIRC the partnership happened after Valve saw how the FXes did with HL2, and thus after NV30. Ahem.
As for you ATi folks: yes, it's nice. There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. The FXes do perfectly fine in normal benchmarks, with the obvious exception of NV30. (The 5200 Ultra and the regular 5600 are pretty bad cards too, in my opinion, since the 5200U is as expensive as the Ti4200 and a lot slower, and the regular 5600 is pricier than the Ti4200 and slightly slower. You know, the Ti4200 really is a good card... uh oh, I've gotten sidetracked.) The FX 5600 Ultra 400/800 was the best midrange card around (well, before anybody knew about HL2 performance), even if it was tricky to find, and the FX 5900 Ultra dominated the high end. (The 9800 Pro came close, but unless one of them cheated, I'd say it was a win for nVidia, albeit a small one.) FX owners didn't make a stupid choice; they probably decided on an nVidia card because of benchmarks or because of good experiences with the brand. Okay? No more of this. It's stupid and immature.
And to people on both sides of the line, just because someone says something stupid is no reason to flame them. Maybe they're trolling, probably not, but either way you'll do better just politely explaining why what they said was incorrect and/or illogical. Name-calling just makes you look worse. And if there's an all-out flame, ignore it.
Why am I putting this in a comment thread? Hmm. I guess I have too much time on my hands. OTOH, this HAS gotten sort of ridiculous... well, whatever. It's not as if anyone's going to pay attention to any of what I typed, they'll just skip over it and say something about how stupid those goddamned blind fanATIcs have to be if they don't realize that Valve is totally being bribed by ATi and the Evil Moon People to cripple FXes in HL2 or how stupid those goddamned blind nVidiots are to buy GeForce FXes when they obviously should have a tech demo of HL2 on hand. Eh, I tried.
Whoever made the comment about OpenGL and DirectX was very right; Doom III is a very different game, and the FXes seem to only fail with lots of DX9 usage. They certainly perform well in OGL, though, looks like.
God, I remember all the reviews saying the FX 5200 Ultra was decent because while it was slower than the comparably priced Ti4200, it was DX9. Ha. =(
Since this is a video card thread-thingy, I guess I should end by stating what sort of video card I have and either insinuating that my use of it makes me unbiased (if I use a card from the company that I just explained my problems with, or if I use some cruddy aging integrated thing) or explaining that just because I use it doesn't mean I'm biased (if I use a card from the company I backed up). (You're supposed to include that in posts on these things, usually at the end, just like how in CPU-related posts you have to make a joke about cooking food on a Prescott, or how in heatsink-related posts you have to mention that your current [insert cooling solution here] does just fine for you-- and if it's a non-air-cooled system, you are required to make a happy emoticon afterwards, possibly one with sunglasses. If you don't do these things your opinion is automatically invalidated.) Well, I'm not going to, because then someone would almost certainly call me a fanboy.
I'd actually put some value in what Gabe says if he weren't on the payroll of ATI. ATI and Valve have been working together for quite some time now. Now let's really think about this: Gabe = former Microsoft worker. Microsoft = known for making BS claims and undercutting the competition. Valve = makes more money if ATI cards sell better. Hmmm, should I really trust this guy? Probably not. I'll wait for a non-biased third party to say Nvidia fucked up DX9. Till then, I put Gabe right next to the stats on AMD's site comparing the P4 and Athlon, and the Microsoft-sponsored study showing Windows 2003 is faster than Linux.
As for the development taking 5 times longer: I have very little respect for the staff at Valve in that area. They have repeatedly shown that they aren't competent when it comes to coding.
I think the chances of successfully suing nVidia for misrepresentation and fraud will go over as successfully as that one against Intel for the P4's lack of performance. Anyone notice how it just sort of faded from the media spotlight and then dried up? Never sue a big corporation under a Republican government: Nobody gives a shit about the people - a kinder gentler America: Step over the little guys, not on them.
Vis-a-vis Doom III: I remember reading just recently on HardOCP or ShackNews that TeH Carmack wasn't too impressed with the NV3x cards either, and had to write a special mixed-mode path to address performance issues too.
Something somewhere went wrong, and that something happened to be Nvidia, AGAIN. There should be a class-action lawsuit for fraud against Nvidia for selling supposed DX9 cards when the game publisher and Microsoft say that Nvidia wasn't following the DX9 coding standard. And to boot, Gabe says that it took them 5 times longer to code for Nvidia cards, which drives the cost of the game higher. So here you have the game being written in DX8 code just so the fake-DX9 Nvidia cards are playable. As Gabe said, "I'd be pissed" if I owned an Nvidia card. So, all you people who paid such a high price for your 5900 Ultra: get together and sue Nvidia. These companies have to be stopped sometime and held accountable for their actions. You have Gabe with all the proof you need to show fraud. Stand up and be counted, and tell them "You're Not Going To Take It Anymore."
I feel fairly unbiased in saying this (as my current card of choice is an integrated Intel 810 graphics chip *wooo*), but I think it *isn't* fair to start abusing nVidia over this surprising lack of performance. My hat goes off to ATI, because if you look at what they've done as a company with their line of products, from the Mach64 in the past to being an industry leader today (a position they share with nVidia), there has been an amazing amount of growth. The flip side is that nVidia has also achieved great things and is actually a younger company. It's already been mentioned that nVidia pioneered (*awaits flames*) 32-bit rendering depth when the industry was focused on 16-bit (and back in those days I owned and endorsed 3dfx stuff, but once you saw games like Quake3 and UT running in 32-bit, the difference was noticeable), and the reason nVidia did well was because they made good products. All I can say is wait for the final product, and let us all remember that 3dfx dominated the 3D hardware market in the gaming community, and then they bet it all on some so-so hardware and lost. One last thing: if you look at the full range of current and upcoming games, nVidia and ATI share the benchmark leads - in a lot of reviews I've seen, the 5900 Ultra wins over the 9800 Pro, and in others it's vice versa. You can almost compare it to the CPU field, where Intel dominates performance but loses on price. (IMHO.) With all that said, I will probably buy a 9600 Pro and an Athlon because of the price/performance ratio, and I'm confident I'll at least get playable performance in HL2 and Doom3. Thanks, etherboy
Seems like NVidia fooled buyers selling DX9 cards, especially in the low end of the market. I just bought a 5200 card after seeing some benchmarks and seeing nothing really faster on the ATI side (9000, 9100, 9200). The point was: at equal price and performance, I'd take the DX9 card for the future. Now I feel fooled :(
A lot of people here are comparing Doom III to HL2. That comparison just doesn't work: you have to remember that Doom III was coded using the OpenGL API and HL2 was coded using the Direct3D API.
With that said, OpenGL is a bit more flexible in its support for hardware, since it's an open standard that vendors can extend. Direct3D, on the other hand, is a bit more rigid in its design. It takes about two years for a GPU to be designed, and these chips were being designed long before MS nailed down the DX9 spec. Nobody is really to blame; Nvidia just picked the wrong way to design their card for DX9 compatibility.
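That flexibility is concrete in OpenGL's extension mechanism: a renderer can ask the driver what it supports at runtime and pick a vendor path accordingly, which is roughly how Doom III can offer an NV30-specific path alongside the vendor-neutral ARB2 one. A minimal sketch (it assumes a current GL context, and the selection logic is mine, not id's):

```cpp
#include <GL/gl.h>
#include <cstring>

// Inspect the extension string and choose a fragment-shading path.
// GL_NV_fragment_program and GL_ARB_fragment_program are real extensions
// of the era; the path names just mimic Doom III's informally.
const char* ChooseFragmentPath()
{
    const char* ext =
        reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (ext == nullptr)
        return "none";                        // no current GL context
    if (std::strstr(ext, "GL_NV_fragment_program"))
        return "NV30";   // NVIDIA-specific: can exploit FX12/FP16 registers
    if (std::strstr(ext, "GL_ARB_fragment_program"))
        return "ARB2";   // vendor-neutral DX9-class path
    return "ARB";        // legacy fallback for older hardware
}
```

DX9 has no equivalent escape hatch: everyone runs the same shader model under the same precision rules, which is exactly where NV3x hurts.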
Speaking as a 9500 Pro owner, it seems to me that nVidia has gotten itself into fairly hot water. Absorbing 3dfx was not a smart move, as it really didn't bring anything to the table that nVidia did not already have, either in the marketplace or in development. They need to think "lean and mean," like they did in the old days, when they addressed the issue of the day (32-bit colour back then), rather than hedging around it, breaking benchmarks, and generally carrying on like a spoiled and over-indulged child.
FINALLY A WORD FROM ANAND. But why not just post it in one article Anand? Seems a waste of time since either Wesley or Evan will post yet another article today. :)
Let me see... ATI-only conference, ATI footing the bill, ATI is a business partner of Valve, ATI wins! What a surprise! Of course, it wouldn't be any different if it were an Nvidia-only conference, would it? BTW, I'll buy no game that won't run on the video card I have.
Valve actually spent 5x more time on the NV3x path than they did on the default DX9 path, and the FX still got owned. So anyone accusing Valve of not taking the time to code their game for the FX series needs to have their head checked.
This is just the first DX9 game (well, there was also Tomb Raider, which showed the same difference in performance) which confirms what 3DMark03 showed us, back when cheating at it wasn't allowed.
I'll wait until I see it run on my FX 5600. If UT2004 is optimized for NVidia, I'll take that trade-off.
Love the 'my card is better than your card' crap, though... but remember, NVidia owners laugh at you ATI guys all the time, so enjoy it while it lasts...
I think this is a marketing war - mainstream cards are the bulk of sales, and whoever dominates that sector almost FORCES game producers to make products for THOSE cards - regardless of implementation...
"What more, he [Newell] said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem. "
"Half-Life 2 has a special NV3x codepath - even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2."
"ATI didn't need these special optimizations to perform well and Valve insists that they have not optimized the game specifically for any vendor."
What's with "not optimized for any vendor" alongside a "special NV3x codepath"??? Valve is contradicting themselves!!!
"The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;"
"the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra"
Geez, pit the Radeon 9600 Pro vs 5900 Ultra in the current crop of games and we all know that's not true.
1) Valve's stories don't exactly gel very well. I suspect they built Half-Life 2 from the ground up with ATi-class hardware in mind, as per the standard DirectX 9 specifications. (Nvidia has only itself to blame, because they didn't follow the specifications.) So, in the end, Valve had to extend the development time to include the NV3x codepath, which obviously isn't working well enough.
2) Valve and ATi are probably in bed together, as shown by HL2 being bundled with ATi Radeons. However, ATi did an excellent job with the 9700/9800 Pro, besting everything Nvidia could conjure up in every market segment, not only in performance but in price too.
3) Nvidia stumbled big time with the GeForce FX; it's overpriced compared to any equivalent ATi card as of now. (The merging of 3dfx and Nvidia technologies just didn't make the cut.)
Conclusion:
1) Radeon 9xxx are the best buys as of now. They run the current crop of games and also HL2 very well.
2) GeForce FXes are overpriced and still can't beat the Radeon 9xxx convincingly. For people with GeForce FXes now: I guess you made the wrong purchase decision for this generation of graphics cards.
3) For the time being, ATi owners are doing the laughing, though not the final laugh just yet. Speaking as a consumer, I hope the competition continues to heat up, for I will buy whichever has the best price/performance.
Hehe, I got a 9800 Pro 256MB (I know, I know, the extra 128MB makes no difference, but I'm a sucker for the numbers!) and I nearly fainted when I saw the price of the equivalent FX card. Now I'm LMAO about this, but I feel really sorry for the guys who paid a fortune for an FX and are probably in denial at the moment. It's a real pain that games are now fighting gfx card technology instead of being able to enhance their software with it. I think we will see the reverse of this FX situation when Doom III comes out, though!
Note that when anyone says their DX9 code is "not vendor-specific" that the reason NVIDIA's been having so much trouble is that MS basically sold the DX9 spec to ATI in no small part because of its constant squabbles with NVIDIA. Contrary to popular opinion, these hardware architectures were actually far along in development well before DX9 was nailed down. In a reversal of the DX8 development, DX9 was basically a software/API description of the R300. People bitch about NVIDIA using 16 and 32 bit FP and "not being to-spec," but you must realize that these major architectural decisions (all-FP24 vs. fixed+FP16+FP32, 64 instructions and multiple outputs vs. 4k instructions and one 128bit output supporting packing of multiple smaller-precision elements, etc.) were being weighed and worked out well before MS came down with the DX9 spec. The spec was developed, as usual, with the involvement of all the players in the hardware world, but ALL of the bitchy specifics were handed to ATI. Admittedly, this has happened in the past with NVIDIA, but it's particularly problematic once the DX spec starts defining code paths and internal representations for these immensely complex stream programs in today's vertex and fragment units. As such, though it's clearly an important target which NVIDIA bungled largely in its business relationship with Microsoft, DX9 could easily be considered an ATI-specific codepath as sticking to spec forces a very non-optimal code path for the way the NV3x pixel shader is architected.
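To put numbers on the precision formats that paragraph keeps referring to: FP16 is s10e5, ATI's FP24 is s16e7, and FP32 is s23e8. A trivial sketch of what those layouts mean for per-register storage (my arithmetic, as a rough proxy for the register-file pressure at issue):

```cpp
#include <cstdio>

// Shader float layouts at issue (sign/exponent/mantissa bit counts):
// FP16 = s10e5, FP24 = s16e7, FP32 = s23e8.
struct FpFormat { const char* name; int sign, exponent, mantissa; };

int main()
{
    const FpFormat fmts[] = {
        { "FP16 (NV3x partial precision)", 1, 5, 10 },
        { "FP24 (R3x0, the DX9 minimum)",  1, 7, 16 },
        { "FP32 (NV3x full precision)",    1, 8, 23 },
    };
    for (const FpFormat& f : fmts) {
        const int bits = f.sign + f.exponent + f.mantissa;
        // Bytes for a 4-component register: a crude proxy for the
        // register-file and internal-bandwidth cost discussed above.
        std::printf("%-32s %2d bits/component, %2d bytes per float4\n",
                    f.name, bits, bits * 4 / 8);
    }
    return 0;
}
```

Running it shows a float4 register costs 8, 12, or 16 bytes depending on format - one blunt way to see why an all-FP32 path leaves NV3x with much less effective register space than R300's all-FP24 design.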
From what I've gathered... HL2 = HL + new gfx + physics.
By new gfx, I just mean that they finally figured out how to make the Quake engine load textures that are > 8-bit, and then they read some soft-shadowing/bump-mapping tutorials and cut/pasted that code in there as well.
Concerning the confusingly low system requirements #39 was referring to:
If you're running a TNT/GF2 or possibly (?) a GF3, you'll probably have to turn OFF the fancy gfx that have gotten HL2 half its hype, just so you can play it instead of watching it play (i.e., a slideshow). So you'll basically be playing a physics-upgrade mod for HL, along with all the new content for HL2 (maps/textures/models/story, etc.).
------------------------
As for the comparison between HL2 and Doom3 that you lamers can't give up on:
HL2 undoubtedly has more dynamic gameplay than Doom3. Doom3 definitely has more atmospheric, mood-driven gameplay than HL2.
IMO, there is no such thing as better gameplay, just as there is no such thing as a more fun game. It's just a matter of preference.
A product is what a product is... if you prefer apples, eat apples; if you prefer oranges, eat oranges...
If it's so important to you to argue why one is better than the other, then you're a politician...
Terrorists are politicians too, y'know.
To all of you who bought new hardware to play the game before it even comes out... I just avoid gambling altogether and wait for the game to come out first.
NVidia's hardware renders at 16- and 32-bit precision: 32-bit is too slow, and 16-bit renders with IQ loss thanks to the lesser precision.
Also, R3x0 hardware renders 8 textures per pass, while NVidia renders 4 or 8 textures per pass depending on the code. Using single texturing and advanced DX code (i.e., DX9), the engine works at 4 textures per pass, even when using smaller-precision shader code. The problem is the hardware, not the drivers.
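Taking that poster's numbers at face value (I can't vouch for them), the cost of fewer texture units per pass is easy to model: the geometry has to be redrawn once per pass. A toy calculation:

```cpp
#include <cstdio>

// Toy model of the claim above: if a surface needs N texture lookups and
// the hardware can bind at most K per pass, the geometry must be drawn
// ceil(N/K) times. The 8-vs-4 figures are the poster's, not measured.
int PassesNeeded(int textures, int unitsPerPass)
{
    return (textures + unitsPerPass - 1) / unitsPerPass;  // ceiling division
}

int main()
{
    const int textures = 8;  // a heavily layered DX9 surface, say
    std::printf("R3x0, 8 units/pass: %d pass(es)\n",
                PassesNeeded(textures, 8));
    std::printf("NV3x slow case, 4 units/pass: %d pass(es)\n",
                PassesNeeded(textures, 4));
    return 0;
}
```

Under those assumed numbers, an 8-texture surface costs one pass on R3x0 and two full passes of vertex work and framebuffer traffic on the slow NV3x case, before precision even enters the picture.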
I find this all quite fascinating. Half-Life was the first game I played on the first computer I owned. I was running nVidia then and have been ever since (currently a Ti4200). I am about to upgrade and have been researching the latest DX9 cards for hours a day, and I must say that, without question, ATI will be getting my cash this time. From everything I've read/seen/heard, they have produced a superior product... PERIOD. (Please, no "in the future..." posts, because I could be dead before nVidia catches up. I care about NOW.)
I've never owned an ATI card, and I've owned more than a few nVidia cards (currently a 4600 in my main rig)... so I think I can make this statement without bias:
Some of you guys are desperate to make yourselves feel better about your ultra-expensive nVidia FX cards. It's pathetic and sad.
Personally, I am going to wait and see how my 4600 runs HL2 before deciding if I need to upgrade (I can live without max eye candy)... and if I do, I will probably buy a 9600 Pro, simply because it seems it might be the best price/performance value for HL2.
I have LITTLE patience for buggy drivers, though, and I find my 4600 with the 40.72 Dets to be ultra-stable, so ATI's drivers had better not piss me off. :D
God, I love all the nvidiots out there saying Valve's programmers suck. Not everything is software; this is mostly a hardware problem. Valve came up with a workaround for nVidia not following spec - be happy with that. Just because nVidia made "wonder drivers" a couple of years ago doesn't mean it always works out that way. DX9 calls for 24-bit precision. That's what ATi uses and what Valve decided to use. Nvidia decided to use 16 and 32 for some reason. This is also why nVidia doesn't like 3DMark03: because they didn't follow spec, and they're mad about it.
Geezus, I'm not sure which fanbois are worse, the nvidiots or the fanATIcs (DigitalWanderer et al). Right now I'm leaning towards the fanATIcs, but only because the nvidiots are more or less hushed up these days.
It is pretty remarkable - Valve have come out and said "Well, ATI cards are great and they work properly, but nVidia cards don't run properly. Also we can't make FSAA work in our game, we don't know how. But we're really good programmers and it's the nVidia card that is at fault."
Obviously Valve has a different team of developers than when they did the original Half-Life.
#31, if making a game run properly on NV3x hardware entails not using proper DX9 high-precision rendering, then yes, Valve is guilty of optimizing for ATi. Otherwise, nobody is to blame but Nvidia (for designing the NV3x) and Microsoft (for designing the DX9 specifications).
Fudge, I guess I will wait for the 9800XT, because those numbers off the 9800 Pro are pitiful. But certainly not as pitiful as Nvidia's. Nvidia, you lost this round. See ya next year.
"Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game;"
"Valve insists that they have not optimized the game specifically for any vendor."
Don't these statements seem to contradict each other? I assume the first is Anand's opinionated statement. The second is Valve insisting no optimizations. I have to believe the second.
If ATI does hold a slim lead, I'm sure the GeForce FX 5950 or an optimized driver update will remedy that. :-)
Well, I've been waiting for HL2 to be released before getting my new vid card, but I guess this clinches it. It'll be ATI. :) Heck, if I'm lucky maybe there'll be a 9900 pro out by then. :)
Figures - Nvidia's "code it crappy now, fix it later" mentality finally bit them in the ass. They shove out crappy code that barely works in the minor games, and then release optimizations... errrmm... patches... for every major game.
Stick with good ol' ATI. Sure, their drivers might be slightly buggy, but at least they actually fix the problems.
Remember, we still know very little about performance, even though he said it is getting 60 FPS at 1024x768 on the 9800 Pro. Does that mean with anti-aliasing on or off? How about anisotropic filtering? You can't just forget those when you argue the 9800 performs better.
Until we see exactly what is being used in the tests, we can't accurately judge the performance.
Also, the comment about it running in DX8 on 5200 and 5600 cards means they understand the market and are making it playable. When they commented some time back about it running on a 4600, you have to remember that was when there was lots of debate regarding AA issues and they weren't very sure. That was one problem they saw as more on Nvidia's side, and they said they would try to make a workaround for ATI cards.
Also, we've seen the wonders Nvidia can do with drivers, so we can expect something to change the situation now. I'm sure Nvidia has known about these problems and is working on fixes or enhancements for speed.
Let me also say that I'm not claiming ATI isn't doing well - they are doing great, and this has come at a very good time for them. If they can get the next generation of cards to outdo Nvidia as well, it will be on top of the 9700 and 9800 having given them the performance lead for a long stretch already.
What I'm looking forward to is seeing the AA and AF results, as well as whether they made any other optimizations, such as for Hyper-Threading...
I could have sworn I mentioned somewhere here before that the truth would come out in the fullness of time...I guess the time is full now!
To any nVidia enthusiasts saying that the Det 50s will fix this: you might want to check out our story on the Det 50s over at www.elitebastards.com... they're comparable in performance to the 45.23 set! :lol:
"Gabe Newell: Valve and NVIDIA both know that we have a lot of shared customers, and we've invested a lot more time optimizing that rendering path to ensure the best experience for the most customers." - Makes you wonder what would happen if they hadn't. 20fps anyone?
#40, I think if you have a low-end gfx card they want you to run it in DX6.
#39, Vice City is a terrible game. GTA3 was great; Vice City is fucking shit dribbling out of a gorilla's ass in comparison.
It makes sense if you think of it in terms of what kind of detail you'll be able to milk out of the card. A GeForce FX will play HL2 better than a GF2, but not at the level you would expect from a premium-priced card.
I don't quite understand how it allegedly ran "fine" on a 4600 (they claimed they ran HL2 with that card and it ran fine) but runs like shit on a 5600 and even on a 5900. So running it in DX8 mode on a 5600 will only make it "playable" (30+ fps)? Then how is a TNT card in the minimum requirements? Won't a GF2 run it in DX8 mode also? How can it be playable on that and on a GF4, but not on an FX card? That doesn't make sense to me.
Also, someone mentioned STALKER. Well, they've shown STALKER being played on an FX 5200, and it seemed to run pretty well to me. It had a lot of grass too, so I don't think the details were on "low." I own a 9800 Pro, but that still doesn't make me think it's normal to claim HL2 runs well on a 4600 but very slowly on a 5900, and at only 60fps on a 9800 Pro. It doesn't make sense.
HL2 will beat the crap out of Doom3 any day. Just look at the gameplay. Why is GTA: Vice City so great? Certainly not for the graphics - not that HL2 has anything to be ashamed of in that department. I have a feeling that Doom 3 will look great but play like crap, while HL2 will have it all :)
BTW: the leaked Doom3 demo works surprisingly well on my R9700, and if I recall correctly it was first shown off running on a Radeon 9700. I know because I had just bought a GeForce4 Ti4400 for $450 and was shocked that they didn't use a Ti4600.
Hell, why did they even call it DX9? Why did nvidia and ati make video cards specifically for DX9? </sarcasm off> Performance-wise, I'd say the difference isn't enough to give you a stroke, but it's certainly noticeable.
Why can't people make the distinction between DirectX and OpenGL? Every time a DX9 game comes out and performs way better on R3xx than on NV3x-based GPUs, are they going to keep citing Doom III? It's not even a DX9 game, or a DX9-class game! The NV3x even needs a special pathway in that game, running at lower precision, just to beat the R3xx. I doubt even the guys developing STALKER can make the NV3x outshine the R3xx in a full-blown DX9 game.
What would you prefer to play, based on the previews/videos/etc.: Doom3 or HL2? I vote HL2, hands down. id hasn't made a compelling game since Doom2. Just an opinion, don't take offense; of course Carmack & Co. are extremely gifted.
Anand, change the title text! When I read it, I get this image of you jumping up and down with an ATi flag, advertising their hardware or something. Shitloads can change before HL2 is released in November, and maybe it "rocks on nVidia" too by release.
You'll lose some of the fancier water effects, among other things. It turns off some mode of bump mapping on models too, I think. I saw a chart once; you can try looking for it.
I don't want to take this into fanboy territory, but Valve weren't "crappy coding guys" when HL1 came out several years ago, were they? In fact, one could argue that if HL2 is designed to work on systems that are 2+ years old, they can't be all that bad, can they?
Is it just me, or is everyone ignoring the situation with Doom 3 benchmarks running noticeably faster on nVidia's hardware?
Okay, let's try something; I'll restate what Anand said, but swap nVidia for ATi and HL2 for Doom3: "With the NV3x codepath, nVidia is the clear performance leader under Doom3, with the FX 5900 Ultra hitting around 60 fps at 10x7. The R9800 is noticeably slower. The FX 5700 Ultra performs well - it is a good competitor of the R9800. nVidia didn't need these special optimizations to perform well, and Carmack insists that they have not optimized the game specifically for any vendor."
Okay, just to make sure, guys: go read Anand's Doom3 article. You'll see all of the above holds (the FX 5700 bit too, if you extrapolate what we know).
You've also got to remember that regardless of designs/pipes/whatever, *both* the FX59 and R98 are about 120M transistors, and nVidia is actually clocking those transistors faster (though ATi can make up for this by using 24-bit instead of 32-bit per-unit precision). Unless nV's engineers were plain stupid designing the chip (unlikely), they have hardware that's easily on a level with the R98, and later drivers are likely to exploit this further. ATi's basically been optimising their R300/350 for years now, so there's probably not nearly as much headroom for compiler optimisations in later drivers.
Basically I think when the Det.50's come out, the FX59 will even up with the R98 in HL2, and totally trounce it in Doom3. Maybe not, but just thought I'd restore some balance here :)
nV basically overshot a bit with their NV3x hardware generation while ATi stuck to DX9 fundamentals, so they've been sorta screwed from the start. I reckon they'll kick ass with NV40 though. I hope ATi do as well!
#21: Doom III's looks with Half-Life 2's interaction ("great gameplay") could be possible, with great framerates on both ATi and Nvidia cards - but NOT with Valve's crappy coding guys behind the wheel. That takes a Carmack or a Sweeney.
Nvidia is, I bet, a bit desperate about this. How else do you explain those ridiculous "The Way It's Meant To Be Played" logos they're marketing so aggressively?
Oh well, at least I only have a Ti4200, so it doesn't come as such a shock that performance will be sucky! It'll be ATI for me next time, for sure. If the 9800 Pro is pulling 60fps at 1024x768 and is way ahead of the FX... wow, I feel for you owners of $400 FX 5900s.
You know Anand, I somehow had a feeling you were working up a Half Life 2 article or a Doom 3 article... maybe I'm starting to pay too much attention to your writing ;)
It's good to see an article from you after so much time! :)
That's the most shocking news I have come upon in my gaming life... I feel sad for those with FX cards... Let's face it: if these were not Nvidia cards but ATI cards, I would have been really sad.
I'm guessing he has the Half Life 2 benchmark Valve was talking about when HL2 was first announced, you know, the benchmark that was supposed to come out 2-3 weeks before HL2 shipped?
Steam comes out tomorrow, so maybe the benchmark will follow, and that's why he can't post now?
It can't possibly be how slowly NV3X runs the ARB2 rendering path, NV3X's extreme brute force approach, or the massive fundamental differences between NV3X and R3XX. It has to be a conspiracy!
People who bought Radeon 9500 Pros before prices went sky-high, and people who bought Radeon 9500s and soft-modded them: you may now spend the next week laughing at people with GeForce FX 5600s. Ready... set...
No, it softmods just fine. Runs hella fast and all. Just some driver issues: for like a week, no OpenGL programs would work. I did everything I could think of short of reformatting and couldn't get it to work. Then one day it just started working at random. Go figure.
Looks like the tables are turning.
Hehe, the 9600 Pro hangs with the best NVIDIA has to offer. You fanboys blaming developers rather than NVIDIA are a joke.
Miaow.
This is complete CRAP!!!! I don't buy this for one minute, especially with a BIG fat advertisement for an ATI card underneath. Pffff, bullshit.
Is the first Half-Life game any good? Somebody gave me the Platinum collection of the game for Christmas, but I haven't played it yet.
lol... Guys, go to bed... continue with life as you'd normally do. Don't you dare stay up for a review! What does it matter, today or tomorrow? :-)
ITS HERE! ITS HERE! HAHAHHAHAHHA
"Tomorrow night at midnight"? What the hell does that mean? Is that September 12th at 00:00, or September 12th at 24:00, which is September 13th? That dude should have at least given a straight date, not "tomorrow night at midnight".
Anyone know any cool websites we can go to while we all wait?
Valve gave the deadline, right? And they are in Seattle, so probably Western time.
Patience, I live in the west and it's only 9:45.
Oh, I get it, he must've meant midnight "California time".
12AM, actually; 12PM is noon. Technically he's a day and thirty-four minutes late, but I'm sure we all knew he meant 12AM on September 12th, not 12AM on September 11th.
Did you guys hear that Anand is coding for 3D Realms now? :P
I'm going to bed. ZzzZZzzZZZzZZZZZz...
*leaves angry* Now I'll be dead tired for school AND no article :(
Where is it at???
It's freaking 12:30am and still no benchmarks. Good night.
I'm calling it a night at 12:30... :(
Agreed.
MAN, THE GUY FELL ASLEEP. WHAT AN ANTICLIMAX.
/me thinks about going to sleep
YO, WHERE'S THE BENCHMARKS? I'VE BEEN WAITING SINCE LAST NIGHT (I'M IN AUSTRALIA, BTW).
It's there on the right side of the page, but the link is not working yet, arrrrrg!
I CAN'T STAND THE WAITING
I hope he didn't fall asleep ;[
I'm waiting...
TIME!!
20 more minutes.
Are Anand's benchmarks coming out today?
Hey... it's almost midnight... where is our article? :) ... I know... 26 mins to go.
FUNNY THAT I DON'T READ A THING IN THIS WHOLE THREAD ABOUT EA GAMES OPTIMIZING FOR NVIDIA CARDS SPECIFICALLY, BASED ON THEIR PARTNERSHIP.
First off, I'm no fanboy. I only buy hardware on price/performance. As such I must say my only reaction to (recent) nVidia customers is sympathy. I would certainly be upset. I can't think it's entirely (perhaps even mostly) nVidia's fault given their R&D took place before the DX9 standard came out - the point when their hand was already played. Is it MS's fault then having been informed of Nvdia's hardware specs? Perhaps. My hope is that future games will come out with OpenGL support such that Nvidia customers will have an alternative. That would make such unreasonable poor framerates a temporary problem with some soon to be realeased sole DX9 games. Most expectation is already that the Doom3 engine will become the most widely adopted engine - perhaps this will ensure it even more so? As someone else pointed out OpenGL currently is not mature that everyone is looking to implement it in that game. Doom3 should definitely change that. In any case don't berate NVidia too much, the optimisations are for their customers benefit given they have been arguably failed by DX9. Nonetheless these benches are going to scare alot of people who don't understand the technical reasons. The current Nvidia *still are* good cards albeit under everything except DX9.To cut it short please don't gloat or deride eachother. Alot of people have been unlucky. And surely all being consumers aren't we all in the same boat - I don't understand the gloating or fanboyism.
I think people with FXs shouldn't worry too much. OpenGL may become more of a standard than DX9 - what game developer wants to alienate a large segment of the market? If you own an FX and HL2 was one of your most anticipated games, then you won't be able to run it in DX9 in its full glory. Instead you'll have to run it at a very high resolution with 4xAA etc - IMHO that's not bad, not bad at all. If that's not enough, then buy ATI - your money.
I've finished my ramblings. Peace all.
Anonymous User - Thursday, September 11, 2003 - link
#124, they're just pissed at having forked out the moolah for the WRONG card for HL2. Feels good to be on the "right" side for once.akdjr - Thursday, September 11, 2003 - link
to anyone who has said that valve has no coding skills, i'd LOVE to see you make a game as good looking and as fast as hl2. if you can't, then you have NO RIGHT to judge the competence of valve's programmers, as they are obviously better than you. i'd say HL2 proves they are extremely talented, and with everything that HL2 is throwing at the cpu and gpu, those numbers seem fine. keep in mind, the game isn't out yet. (personally, i'm very happy to see that my aiw 9700 pro will run hl2 just fine :D )
Anonymous User - Thursday, September 11, 2003 - link
#121, wrong thread.Anonymous User - Thursday, September 11, 2003 - link
IRT #115: VALVe didn't make cs, they merely bought it when it became extremely popular, then proceeded to ruin it by making it so noob friendly it has lost all its depth and strategy
cs is now a shitty dm game
hl2 will be good but i think most people are more interested in hl2 mods rather than the actual game
most of which will be retail only and making a mod for hl2 with its advanced gfx etc will be a lot more difficult than it was for hl1
making maps for hl2 might even require a team, let alone an entire new mod
the hype around hl2 is getting so ridiculous that it can only be a disappointment if people aren't careful - i didn't think hl1 was amazing, it was very good but not amazing. if you expect the world from hl2 then you will probably be disappointed, whereas someone who expects an ok game will be impressed if it only turns out to be a good game
Anonymous User - Thursday, September 11, 2003 - link
http://www.robyncom.com/ : AMD Athlon64 3200+ with the MSI motherboard and the h/s too (don't trust the h/s)
Anonymous User - Thursday, September 11, 2003 - link
I'm just wondering, does anyone remember when the first sneak peeks of Doom3 came out and the Catalyst drivers were broken for them? The 9800pro got like 10fps; shortly after, the situation was resolved. I have an AIW 9800pro, but I also have a BFG FX5600 256 that OCs nicely to above Ultra specs. Just wait until after the game ships to decide what is best and where it fits into the big picture.Anonymous User - Thursday, September 11, 2003 - link
NV is saying that Det50 will be "THEIR BEST DRIVERS EVER" - i guess they will be great running normal DX9 code without optimizations. HELL NO - what they are going to do is even more: no fog, 12-bit precision, crap IQ, no AF+AA at hi res. But ATI also has "THEIR BEST DRIVERS EVER" - i know they will be DX9 with NO optimizations, 24-bit precision, awesome IQ, AF+AA at hi res. Too bad, NV - the shit hit the fan; this generation is crap. And the 9600 is coming to the sub-$100 market in a few months. NV has lost the high and mid range market to ATI, and DX9 in the low end will be ATI's domain too. If you own NV stock, sell it until NV40 - this gen is crap!!Anonymous User - Thursday, September 11, 2003 - link
Basically the nVidia performance stinks either way, IMHO. If the new 5x.xx drivers fix it, then so be it - that will be great for those cards, and then they can run future Valve games. The game runs fine on a Ti4600 using DX8.
However, the new ATI cards only have 24bit shaders!
So would that leave ALL current ATI cards without any way to run future Valve titles?
Perhaps I do not understand the technology fully, can someone elaborate on this?
Anonymous User - Thursday, September 11, 2003 - link
Here is how your performance will be, depending on the card you have: If you have a DX9 ATI card, your performance will be OK.
If you have a current DX9 nVidia card, your performance will not be nearly as good as an equivalently priced ATI card. Of course nVidia will release a "fixed" card within about 12 months; it would be suicide not to.
If you're using a DX8.x part, like a Ti4200 or an equivalent generation Radeon, then the performance will be roughly the same.
Likewise, DooM3 is rendered using OpenGL, and therefore whatever card you own will run as well as it can run OpenGL. DirectX compliance will have no effect on your FPS in Doom. Some websites have benchmarks for OpenGL games, you can review these to get a good impression of how each kind of card will perform.
Anonymous User - Thursday, September 11, 2003 - link
lol@113Anonymous User - Thursday, September 11, 2003 - link
Well I'm still running on my Ti4400 - will wait to see how "horrible" it is before I make any changes. I think it is funny though. I've got some radeon and nvidia cards here. (I'm never cutting edge though - my fastest PC is only a P4 1.8.)
What a silly thing to gloat or argue about. I was never fond of ATI because I was never satisfied with the timeliness or reliability of their drivers in the past (maybe that's changed now, I'm not sure.) When I upgrade I just buy whatever the best card is at the $150-$175 range.
As for the point of who is conspiring with whom: that's silly as well. There is absolutely nothing wrong with a company partnering with another, or even making their product work better with another's - even if that's not what is going on here. There isn't anything illegal or nefarious about it. It's called the free market. So your silly talk about a class-action lawsuit against nvidia is meritless. They sold you a card that does work with (is compatible with) DirectX9. Now since the card came out BEFORE the standard, and DX9 games came out AFTER the card, it's your own choice to purchase a card that may not be the BEST card.
Some of you need a serious lesson in market economics. The power you have is to vote with your wallets. Now that can start with a healthy debate of the facts, but this is a useless debate of speculation and conspiracy theories.
Valve's biggest interest is to provide a game to the largest audience possible, and to make gaming folks happy. That nets them more profits. And that's the wonderful thing about a free market economy. I highly doubt Valve would want to intentionally do anything to alienate the nvidia market, since there is a gigantic install base of nvidia based GPUs.
I'll play HL2 on my P4 1.8 w/GF4 Ti4400 128M card. If I want to turn on ALL the eye-candy and run at more FPS, then I'll have to spend some cash and build myself a new gaming rig. My guess is that it will probably run pretty well and I'll be perfectly satisfied on my current machine. After the initial release I'm guessing that future patches of HL2 and the Det5s will eke out a couple extra FPS, and that will just be an added bonus.
I will buy HL2 for a couple reasons. The first is rewarding Valve's loyalty to their customers. I spent probably $50 buying HL1 back in 1998 and I got 5 good years of gaming enjoyment. They maintained patches and support WAY BEYOND any reasonable expectations for a game. I got some REALLY good extended play out of their game with FREE mods like CS and DoD. I will buy HL2 to show that their loyalty to me will be rewarded in turn. I'd like to send a message to other game developers that open-platform games and long-term support are what we insist on, and reward with our money. The other reason is Valve showing that they will accept nothing less than cutting edge. HL1 was groundbreaking at the time, and HL2 looks like it will be the same.
PandaBear - Thursday, September 11, 2003 - link
Dude, the problem with Nvidia is not that they intended to build a crappy card based on 3Dfx technologies. They went with the .13 micron process at TSMC, which was rather new and too bleeding edge at the time. ATI is using a more conservative .15 micron process that is easier to get good yield out of, so they are fine right now. From what I heard, if Nvidia had been more conservative back when they designed NV30, they would have been OK. ATI decided to wait and let everyone else at TSMC (including Nvidia) work out all the process problems before jumping aboard.
Anonymous User - Thursday, September 11, 2003 - link
It had taken Jen-Hsun Huang many a market cycle and an enormous amount of money, but on that fateful Friday in December, 3dfx was finally his. Yes, the ancient Voodoo technology was his at last.. but, unbeknownst to him, so was its curse.Anonymous User - Thursday, September 11, 2003 - link
"There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. "Not at all. All you had to do was pay attention to the supposedly "useless" synthetic Pixel Shader 2.0 benchmakrs that have been around for MONTHS.
They told exactly the same story - the GeForceFX family of cards have abominably bad PS2.0 performance unless you tailor-make a partial precision path specifically for them. That the partial precision path results in lower image quality isnt as important............
....when your drivers detect SCREEN CAPTURING and then MANIPULATE THE OUTPUT QUALITY to make it appear better!
If nvidia had designed a part that ran DX9 code properly and with decent speed, there would be no discussion here.
The fact is they didn't. And their only recourse until NV40 has been to manipulate their drivers to an almost unbelievable extent, reducing quality and introducing arbitrary clip planes at will.
I don't own an ATI PS2.0 capable part, but it has been obvious since the introduction of the 9700 that it is the one to buy.
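To put a rough number on what that partial-precision trade-off looks like, here is a minimal sketch in Python, using numpy's float16/float32 as stand-ins for the FP16 and FP32 shader formats (an assumption for illustration - the real hardware formats and the _pp rounding rules differ in detail):

    import numpy as np

    # Toy per-pixel computation: a specular-style power term evaluated
    # across a gradient, the way a pixel shader would see it.
    x = np.linspace(0.0, 1.0, 11)

    full = x.astype(np.float32) ** 32     # full-precision path
    partial = x.astype(np.float16) ** 32  # partial-precision ("_pp") path

    for xi, f, p in zip(x, full, partial):
        print(f"x={xi:.2f}  fp32={float(f):.7f}  fp16={float(p):.7f}  "
              f"err={abs(float(f) - float(p)):.7f}")

Small per-pixel errors of this sort are what show up on screen as banding and washed-out highlights when a driver silently drops to lower precision.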
dvinnen - Thursday, September 11, 2003 - link
#106) Original HL is based off the original Quake engine, which was built on OpenGL. If you notice, the DX version of HL sucks, it sucks a lot. It looks terrible and is buggy. It isn't supported as fully as the OpenGL version. There are a bunch of lighting effects in the OpenGL version that aren't in DX mode. The only time I use DX mode is to debug. Now imagine porting the vastly more complex HL2 over.Anonymous User - Thursday, September 11, 2003 - link
To get a good overview of this thread minus most of the fanboyism, use this index with my opinions:
#24 right on! too bad this thread didn't end here though.
#33 .. um...no
#39 320x240 (just like in Quake on 486 days)
#42 agree
#44 1993
#62 because of IPC (efficiency)
#71 correct!
#72 LOL
#76 BINGO
#80 correct
#81 it is ALWAYS a marketing war, silly.
#86 heavy denial (and when the fairy godmother came, she turned him into a real DX9 card...)
#93 e x a c t l y
#103 DONT READ THIS ONE... I actually did, and always fell asleep and fell out of my chair. Nah, just kidding... but it's the best "summary" of this thread.
#106 Actually if you've ever done Fortran/VB/C/C++/Java/Perl then you would know that "programming" isn't just fun-time... it's a lot of work, and it sucks if you have to do it twice.
Personally I think everybody should chill and read this article: http://www.3dcenter.org/artikel/cinefx/index_e.php
Anonymous User - Thursday, September 11, 2003 - link
ha... i am glad i don't have the money to buy all those new cards... sticking with my "old" ti4200 :)Anonymous User - Thursday, September 11, 2003 - link
106 - The reason valve went with DX9 over the new OGL engine is that DX9 is more mature than the new OGL standard... which isn't even officially released yet.
Icewind - Thursday, September 11, 2003 - link
#106, Doom 3 is built on the new Glide engine, Duh.Anonymous User - Thursday, September 11, 2003 - link
In HL1 we had the choice of OGL or DX. Why can't a game support both formats now? Would the game have to be completely redesigned to support OGL, or are the 2 formats so much more different from each other than they were in 1998?
AgaBooga - Thursday, September 11, 2003 - link
#104, they did find a way to run Nvidia cards smoothly: DX8. They can't guarantee full support with all cards, but they have done their best. I don't think HL2 would be geared towards ATI cards; instead that's just how it worked out. I wonder if Doom 3 will have any similar problems like this... with Nvidia cards
Anonymous User - Thursday, September 11, 2003 - link
Well, IMHO I think Valve will suffer more because they aren't reaching the whole market of Nvidia users out there. For those guys who want to spend the extra money on hardware just to play HL2, good for you, but I personally don't have the money to throw around like that every time I want to play the latest game.Anonymous User - Thursday, September 11, 2003 - link
Yes, they are. I doubt saying this is going to affect anything, but seriously, these flames are getting WAY out of hand. People with GeForce FXes who are sore about their cards being slow in HL2, them's the breaks, it's not your fault. Who knew? But don't accuse Valve of going Futuremark on the community until there's at least a grain of evidence. And don't tell me "Valve and ATi working together is evidence enough," because IIRC, the partnership happened after Valve saw how the FXes did with HL2.. and, thus, after NV30. Ahem.
As for you ATi folks, yes, it's nice. There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. The FXes do perfectly fine in normal benchmarks, with the obvious exception of NV30. (The 5200 Ultras and 5600 regular are pretty bad cards too, in my opinion, since the 5200U is as expensive as the Ti4200 and a lot slower, and the 5600 regular is pricier than the Ti4200 and slightly slower. You know, the Ti4200 really is a good card. .. uh oh. I've gotten sidetracked.) The FX 5600 Ultra 400/800 was the best midrange card around (well, since nobody knew about HL2 performance), even if it was tricky to find, and the FX 5900 Ultra dominated the high end. (The 9800 Pro came close, but unless one of them cheated I'd say it was a win for nVidia, albeit a small one.) They didn't make a stupid choice, they probably decided on an nVidia card because of benchmarks or because of good experiences with them. Okay? No more of this. It's stupid and immature.
And to people on both sides of the line, just because someone says something stupid is no reason to flame them. Maybe they're trolling, probably not, but either way you'll do better just politely explaining why what they said was incorrect and/or illogical. Name-calling just makes you look worse. And if there's an all-out flame, ignore it.
Why am I putting this in a comment thread? Hmm. I guess I have too much time on my hands. OTOH, this HAS gotten sort of ridiculous... well, whatever. It's not as if anyone's going to pay attention to any of what I typed, they'll just skip over it and say something about how stupid those goddamned blind fanATIcs have to be if they don't realize that Valve is totally being bribed by ATi and the Evil Moon People to cripple FXes in HL2 or how stupid those goddamned blind nVidiots are to buy GeForce FXes when they obviously should have a tech demo of HL2 on hand. Eh, I tried.
Whoever made the comment about OpenGL and DirectX was very right; Doom III is a very different game, and the FXes seem to only fail with lots of DX9 usage. They certainly perform well in OGL, though, looks like.
God, I remember all the reviews saying the FX 5200 Ultra was decent because while it was slower than the comparably priced Ti4200, it was DX9. Ha. =(
Since this is a video card thread-thingy, I guess I should end by stating what sort of video card I have and either insinuating that my use of it makes me unbiased (if I use a card from the company that I just explained my problems with, or if I use some cruddy aging integrated thing) or explaining that just because I use it doesn't mean I'm biased (if I use a card from the company I backed up). (You're supposed to include that in posts on these things, usually at the end, just like how in CPU-related posts you have to make a joke about cooking food on a Prescott, or how in heatsink-related posts you have to mention that your current [insert cooling solution here] does just fine for you-- and if it's a non-air-cooled system, you are required to make a happy emoticon afterwards, possibly one with sunglasses. If you don't do these things your opinion is automatically invalidated.) Well, I'm not going to, because then someone would almost certainly call me a fanboy.
This post is way too long. It ends now.
dvinnen - Thursday, September 11, 2003 - link
they should make you register to post. These kiddie flames are getting annoying.Anonymous User - Thursday, September 11, 2003 - link
#27, here is something that explains some of the differences between DX8 and DX9 with visuals: http://ati.com/vortal/r300/dx9demo/main.html
Anonymous User - Thursday, September 11, 2003 - link
I'd actually put some value in what Gabe says if he wasn't on the payroll of ATI. ATI and Valve have been working together for quite some time now. Now let's really think about this: Gabe = former Microsoft worker. Microsoft = known for making BS claims and undercutting the competition. Valve = makes more money if ATI cards sell better. Hmmm, should I really trust this guy? Probably not. I'll wait until a non-biased third party says Nvidia fucked up DX9. Till then, I put Gabe right next to the stats on AMD's site comparing the P4 and Athlon, and the study sponsored by Microsoft that shows Windows 2003 is faster than Linux. As for the 5 times longer development, I have very little respect for the staff at Valve in that area. They have repeatedly shown that they aren't competent when it comes to coding.
Anonymous User - Thursday, September 11, 2003 - link
I think the chances of successfully suing nVidia for misrepresentation and fraud will go over as successfully as that suit against Intel for the P4's lack of performance. Anyone notice how it just sort of faded from the media spotlight and then dried up? Never sue a big corporation under a Republican government: nobody gives a shit about the people - a kinder, gentler America: step over the little guys, not on them. Vis-a-vis Doom III, I remember reading just recently on HardOCP or ShackNews that TeH Carmack wasn't too impressed with the NV3x cards either, and had to write a special mixed-mode path to address performance issues too.Anonymous User - Thursday, September 11, 2003 - link
Anonymous User - Thursday, September 11, 2003 - link
it sounds to me like MOST of you guys are prepubescents, in which case i can understand 99 posts over such a stupid post in the first place. NVIDIA will find a way to patch their crap.
And in any case, as if any one game ever determines the merit of a graphics card. please, wake up kids.
Anonymous User - Thursday, September 11, 2003 - link
*Where the hell did that pitchfork go*Anonymous User - Thursday, September 11, 2003 - link
Something somewhere went wrong. And that something happened to be Nvidia, AGAIN. There should be a class action lawsuit for fraud against Nvidia: selling supposed DX9 cards when the game publisher and Microsoft say that Nvidia wasn't following the DX9 coding standard. And to boot, Gabe says that it took them 5 times longer to code for Nvidia cards, which drives the cost of the game higher. So here you have the game being written in DX8 code just so the fake DX9 Nvidia cards are playable. As Gabe said, "I'd be pissed" if I owned a Nvidia card. So, all you people that paid such a high price for your 5900 Ultra: get together and sue Nvidia. These companies have to be stopped sometime and be held accountable for their actions. You have Gabe with all the proof you need to show fraud. Stand up and be counted and tell them "You're Not Going To Take It Anymore".
Anonymous User - Thursday, September 11, 2003 - link
I feel fairly unbiased in saying this (as my current card of choice is an integrated Intel 810 graphics chip *wooo*) but I think it *isn't* fair to start abusing nVidia over this surprising lack of performance. My hat goes off to ATI, because if you look at what they've done as a company with their line of products, from the Mach64 in the past to being an industry leader today (which they share with nVidia), there has been an amazing amount of growth. The flip side is that nVidia has also achieved great things and is actually a younger company. It's already been mentioned that nVidia pioneered (*awaits flames*) 32-bit rendering depth when the industry was focused on 16-bit (and back in those days I owned and endorsed 3dfx stuff, but once you saw games like Quake3 and UT running in 32-bit the difference was noticeable), and the reason nVidia did well was because they made good products. All I can say is wait for the final product, and let us all remember that 3dfx dominated the 3D hardware market in the gaming community, and then they bet it all on some so-so hardware and lost. One last thing:
If you look at the full-range of current and upcoming games, then nVidia and ATI share the benchmark-leads together, but in a lot of reviews I've seen the 5900 Ultra wins over the 9800Pro, then vice-versa. You can almost compare it to the CPU field where Intel dominates performance, but loses on price. (IMHO)
With all that said I will probably buy a 9600Pro, and an Athlon, because of the price/performance ratio, and I'm confident I'll at least get playable performance on HL2 and Doom3.
Thanks,
etherboy
Anonymous User - Thursday, September 11, 2003 - link
Seems like NVidia fooled buyers by selling "DX9" cards, especially in the low end of the market. I just bought a 5200 card after seeing some benchmarks and seeing nothing really faster on the ATI side (9000, 9100, 9200).
The point was: at equal price and perf., I take the DX9 card for the future.
Now I feel fooled :(
NerdMan - Thursday, September 11, 2003 - link
A lot of people here are comparing DOOM III to HL2. This is just not possible. You have to remember that DOOM III was coded using the OpenGL API and HL2 was coded using the Direct3D API. With that said, OpenGL is a bit more flexible with support for hardware, since it's an open standard. Direct3D, on the other hand, is a bit more rigid in its design. It takes 2 years for a GPU to be designed. These chips were being designed long before MS nailed down the DX9 spec. Nobody is really to blame; Nvidia just picked the wrong way to design their card for DX9 compatibility.
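To illustrate that flexibility point: an OpenGL engine can query the driver's extension string at runtime and pick a vendor-specific path, which is how Doom III can hand NV3x its own lower-precision path. A minimal sketch of the idea in Python (the extension string here is a hard-coded assumption; a real renderer would read it via glGetString(GL_EXTENSIONS)):

    # Hypothetical extension string; a real renderer would query the driver.
    EXTENSIONS = "GL_ARB_multitexture GL_ARB_fragment_program GL_NV_fragment_program"

    def pick_render_path(ext_string):
        """Choose a fragment path the way an OpenGL engine might."""
        exts = set(ext_string.split())
        if "GL_NV_fragment_program" in exts:
            # Vendor extension: lets NV3x run lower-precision math where it's fastest.
            return "NV3x vendor path"
        if "GL_ARB_fragment_program" in exts:
            return "generic ARB fragment path"
        return "fixed-function fallback"

    print(pick_render_path(EXTENSIONS))  # -> NV3x vendor path

Direct3D 9 has no comparable vendor-extension mechanism; apart from the partial-precision hint, it's one spec for everyone.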
Anonymous User - Thursday, September 11, 2003 - link
Has anyone here actually put both a 9800 and a 5900 side by side on a bench with HL2? No... then shut up LMAO. You all follow the crowd like sheep BAHHHHHHHHHH
Anonymous User - Thursday, September 11, 2003 - link
Anybody got any specs for R360? - looks like I'm buyin ATI from now on...Icewind - Thursday, September 11, 2003 - link
What makes this even sadder is the little Nvidia fanboys trying to cover up for Nvidia. Pathetic.Anonymous User - Thursday, September 11, 2003 - link
It seems to me, as a 9500 pro owner, that nVidia's gotten themselves into fairly hot water. Absorbing 3dfx was not a smart move, as it really hasn't brought anything to the table that nVidia did not already have either in the marketplace or in development. They need to think "lean & mean", like they did in the old days, where they addressed the issue of the day (32-bit colour back then), rather than hedging around it, breaking benchmarks, and generally carrying on like a spoiled and over-indulged child.Anonymous User - Thursday, September 11, 2003 - link
ha ha ha, nvidia suckers crying now
enjoy your video card Nvidia owners
anyhow, nvidia fan boys can still play Half-Life 1
Anonymous User - Thursday, September 11, 2003 - link
FINALLY A WORD FROM ANAND. But why not just post it in one article Anand? Seems a waste of time since either Wesley or Evan will post yet another article today. :)Anonymous User - Thursday, September 11, 2003 - link
Let me see... ATI-only conference, ATI footing the bill, ATI is a business partner of Valve, ATI wins! What a surprise! Of course, it wouldn't be any different if it was an Nvidia-only conference, would it? BTW, I'll buy no game that won't run on the video card I have.
--Old Man Gamer
Anonymous User - Thursday, September 11, 2003 - link
edit: of *not* taking the time...Anonymous User - Thursday, September 11, 2003 - link
Valve actually spent 5x more time on the NV30 path than they did on the default DX9 path, and the FX still got owned. So anyone accusing Valve of not taking the time to code their game for the FX series needs to have their head checked. This is just the first DX9 game (well, there was also Tomb Raider, which showed the same difference in performance) which confirms what 3DMark03 (at the time when cheating at it wasn't allowed) showed us.
Anonymous User - Thursday, September 11, 2003 - link
ATI cards WANK WANK WANKAnonymous User - Thursday, September 11, 2003 - link
I'll wait until I see it run on my FX 5600. If UT2004 is optimized for NVidia, I'll take that trade-off. Love the 'my card is better than your card' crap though... but remember, NVidia owners laugh at you ATI guys all the time, so enjoy it while it lasts...
Anonymous User - Thursday, September 11, 2003 - link
I think this is a marketing war - mainstream cards are the bulk of sales, and whoever dominates that sector almost FORCES games producers to make products for THOSE cards - regardless of implementation... "What's more, he [Newell] said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem."
Anonymous User - Thursday, September 11, 2003 - link
I quote from Anand:"Half-Life 2 has a special NV3x codepath - even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2."
"ATI didn't need these special optimizations to perform well and Valve insists that they have not optimized the game specifically for any vendor."
What's with "Not optimized for any vendor" and "NV3x codepath"??? Valve is slapping themselves!!!
"The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;"
"the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra"
Geez, pit the Radeon 9600 Pro vs 5900 Ultra in the current crop of games and we all know that's not true.
1) Valve's stories don't exactly gel very well. I suspect they built Half-Life 2 from the ground up with ATi-class hardware in mind, as per the standard DirectX9 specifications. (Nvidia just gotta blame themselves cause they didn't follow the specifications.) So, in the end Valve had to extend the development time to include the NV3x codepath, which obviously isn't working well enough.
2) Valve and ATi are probably in bed together, as shown by HL2 bundling with ATi Radeons. However, ATi did an excellent job with the 9700/9800 Pro, besting everything Nvidia can conjure up in every market segment, not only in terms of performance but price too.
3) Nvidia stumbled big time with the GeforceFX; it's overpriced compared to any equivalent ATi card as of now. (The merging of 3DFX and Nvidia technologies just didn't make the cut.)
Conclusion:
1) Radeon 9xxx are the best buys as of now. They run the current crop of games and also HL2 very well.
2) GeforceFXs are over-priced, and still can't beat the Radeon 9xxx convincingly. For people with GeforceFXs now, I guess you made the wrong purchase decision in regards to this generation of GFX cards.
3) For the time being, ATi owners are doing the laughing, but not the final laugh as of yet. Speaking as a consumer, I hope that the competition continues to heat up, for I will buy whichever has the best price/performance.
Anonymous User - Thursday, September 11, 2003 - link
I tried both the 5900 and 5900 Ultra (both overclocked) and couldn't see any great difference over my Ti4600 - so I sent them both back. I'm glad I did. I'm gonna sit on my money until reviews/benchmarks for the R360/NV38 come in....
Anonymous User - Thursday, September 11, 2003 - link
Should I run HL2 in DX8 or DX9 on my S3 Virge? Maybe I should have paid the extra dollars for a 486DX and not bought the cheapo 486SX...
FPC - Frames Per Century rules..
;)
Anonymous User - Thursday, September 11, 2003 - link
hehe, I got a 9800 pro 256mb (i know, i know, the extra 128mb makes no difference but im a sucker for the numbers!) and I nearly fainted when I saw the price for the equivalent FX card. now im LMAO about this, but I feel really sorry for the guys who paid a fortune for an FX, who are probably in denial at the moment. It's a real pain that games are now fighting gfx card technology instead of being able to enhance their software with it. I think we will see the reverse of this FX situation when Doom III comes out though!Anonymous User - Thursday, September 11, 2003 - link
Note that when anyone says their DX9 code is "not vendor-specific" that the reason NVIDIA's been having so much trouble is that MS basically sold the DX9 spec to ATI in no small part because of its constant squabbles with NVIDIA. Contrary to popular opinion, these hardware architectures were actually far along in development well before DX9 was nailed down. In a reversal of the DX8 development, DX9 was basically a software/API description of the R300. People bitch about NVIDIA using 16 and 32 bit FP and "not being to-spec," but you must realize that these major architectural decisions (all-FP24 vs. fixed+FP16+FP32, 64 instructions and multiple outputs vs. 4k instructions and one 128bit output supporting packing of multiple smaller-precision elements, etc.) were being weighed and worked out well before MS came down with the DX9 spec. The spec was developed, as usual, with the involvement of all the players in the hardware world, but ALL of the bitchy specifics were handed to ATI. Admittedly, this has happened in the past with NVIDIA, but it's particularly problematic once the DX spec starts defining code paths and internal representations for these immensely complex stream programs in today's vertex and fragment units. As such, though it's clearly an important target which NVIDIA bungled largely in its business relationship with Microsoft, DX9 could easily be considered an ATI-specific codepath as sticking to spec forces a very non-optimal code path for the way the NV3x pixel shader is architected.Anonymous User - Thursday, September 11, 2003 - link
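For what it's worth, the practical difference between those formats is mantissa width: FP16 carries 10 mantissa bits, FP24 carries 16, and FP32 carries 23. A quick sketch of the rounding error each introduces, modeling each format crudely by rounding a value to N mantissa bits (real hardware rounding differs in detail):

    import math

    def quantize(value, mantissa_bits):
        """Round a float to a given number of mantissa bits (crude format model)."""
        if value == 0.0:
            return 0.0
        exp = math.floor(math.log2(abs(value)))
        scale = 2.0 ** (mantissa_bits - exp)
        return round(value * scale) / scale

    x = 0.7071067811865476  # e.g. a normalized vector component
    for name, bits in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
        q = quantize(x, bits)
        print(f"{name}: {q:.10f}  (error {abs(q - x):.2e})")

FP24 lands between the two NVIDIA options, which is the crux of the whole argument: the precision the spec requires is one the R3xx does natively and the NV3x can only match at FP32 speed.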
Chazz, very well put.Adul - Thursday, September 11, 2003 - link
I was post 73 btwAnonymous User - Thursday, September 11, 2003 - link
Expect Anand to have his number in about 21 hours and 10 minutes ;)Anonymous User - Thursday, September 11, 2003 - link
ummmm... my 2 cents: from what i've gathered.. HL2 = HL + new gfx + physics.
by new gfx, i just mean that they finally figured out how to make the Quake engine load textures that are > 8-bit, and then they read some soft shadowing/bumpmapping tutorials and cut/pasted that code in there as well.
concerning the confusingly low sys requirements #39 was referring to:
if you're running a TNT/GF2 or possibly (?) GF3, you'll probably have to turn OFF the fancy gfx that have gotten HL2 half the hype. just so you can play it, instead of watch it play (ie slideshow). so you'll basically be playing a physics upgrade mod for HL. along with all the new content for HL2 (maps/textures/models/story etc).
------------------------
as for the comparison between HL2/Doom3 that you lamers can't give up on:
HL2 undoubtedly has more dynamic gameplay than Doom3. Doom3 definitely has more atmospheric, mood-driven gameplay than HL2.
imo, there is no such thing as better gameplay. just as there is no such thing as a more fun game. it's just a matter of preference.
a product is what a product is.. if you prefer apples, eat apples.. if you prefer oranges, eat oranges....
if it's so important to you to argue why one is better than the other, then ur a politician..
terrorists are politicians too y'know.
to all u who bought new hardware to play the game before it comes out... i just avoid gambling altogether and wait for the game to come out first.
-Chazz
Anonymous User - Thursday, September 11, 2003 - link
#62: DX9 is based on 24-bit precision.
NVidia's hardware renders at 16 and 32 bit. 32 bit is too slow, and 16 bit renders with IQ loss thanks to the lesser precision.
Also, R3x0 hardware renders 8 textures per pass, while NVidia renders 4 or 8 textures per pass depending on the code. Using single texturing and advanced DX code (i.e. DX9), the engine works at 4 textures per cycle, even when using smaller-precision shader code. The problem is the hardware, not the drivers.
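Taking the poster's figures at face value (they are his claim, not established fact), the cost difference is simple arithmetic: at 4 textures per cycle instead of 8, the same texturing work takes twice the cycles. A toy calculation with made-up workload numbers:

    # Toy math using the 4-vs-8 textures/cycle figures claimed above.
    # The workload and clock are hypothetical round numbers.
    texture_fetches_per_frame = 2_000_000
    core_clock_hz = 400e6  # assume a ~400 MHz core

    for name, tex_per_cycle in [("R3x0-style, 8/cycle", 8), ("NV3x DX9 path, 4/cycle", 4)]:
        cycles = texture_fetches_per_frame / tex_per_cycle
        ms = cycles / core_clock_hz * 1000
        print(f"{name}: {cycles:,.0f} cycles -> {ms:.3f} ms of texturing per frame")

Twice the cycles for the same frame is exactly the sort of gap no driver update can fully paper over.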
Anonymous User - Thursday, September 11, 2003 - link
#54: WTF are you talking about?
I've run games here as old as Rogue Squadron and DarkStone, and the not-so-old HomeWorld... they all run fine, even when using AA and AF.
Anonymous User - Thursday, September 11, 2003 - link
I find this all quite fascinating. Half-Life was the first game i played on the first computer i owned. I was running nVidia then and have been ever since (current = Ti-4200). I am about to upgrade and have been researching for hours a day about the latest DX9 cards, and must say that without question, ATI will be getting my cash this time... and from everything ive read/seen/heard... they have produced a superior product... PERIOD (please no "in the future..." posts cause i could be dead before nVidia catches up... i care about NOW)Anonymous User - Thursday, September 11, 2003 - link
I've never owned an ATI card and I've owned more than a few nVidia cards (currently a 4600 in my main rig).. so I think I can make this statement without bias: some of you guys are desperate to make yourselves feel better about your ultra-expensive nVidia FX cards. It's pathetic and sad.
Personally, I am going to wait and see how my 4600 will run HL2 before deciding if I need to upgrade (I can live without max eye candy)... and if I do I will probably buy an 9600Pro simply because it seems it might be the best price/performance value for HL2.
I have LITTLE patience for buggy drivers though and find my 4600 w/40.72 Dets to be ultra stable so ATI's drivers better not piss me off. :D
dvinnen - Thursday, September 11, 2003 - link
God, I love all the nvidiots out there saying valve's programmers suck. Everything is not software; it is mostly hardware problems. Valve came up with a workaround for nVidia not following spec. Be happy with that. Just because nVidia made "wonder" drivers a couple years ago doesn't mean that it always works out that way. DX9 calls for 24-bit precision. That's what ATi uses and what Valve decided to use. Nvidia decided to use 16 and 32 for some reason. This is also why nVidia doesn't like 3DMark03: because they didn't follow spec, they are mad about it.Anonymous User - Thursday, September 11, 2003 - link
Geezus I'm not sure which fanbois are worse, the nvidiots or the fanATIcs (DigitalWanderer et al). Right now I'm leading towards the fanATICs but only because the nvidiots are more less hushed up these days.Anonymous User - Thursday, September 11, 2003 - link
#64: Where was #63 saying that nVidia cards were the best? He simply said Valve programmers suck. You sound like the fanboy.Anonymous User - Thursday, September 11, 2003 - link
#63 Give it up. Your Nvidia fanboy days are over with. Your card failed, period.Anonymous User - Thursday, September 11, 2003 - link
It is pretty remarkable - Valve has come out and said "Well, ATI cards are great and they work properly, but nVidia cards don't run properly. Also we can't make FSAA work in our game, we don't know how. But we're really good programmers and it's the nVidia card that is at fault." Obviously Valve has a different team of developers than the one that did the original Half-Life.
Anonymous User - Thursday, September 11, 2003 - link
Care to explain how a video card that has only 9.6GB/s of bandwidth (the 9600 Pro) manages to shame the 27.2GB/s 5900 Ultra? Give it about a month for Valve to release a patch for Nvidia cards. Pretty stupid of 'em to do so.
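Those bandwidth figures do check out, for what it's worth: memory bandwidth is just bus width times effective memory clock. A quick check, assuming the commonly quoted clocks (~600 MHz effective DDR on the 9600 Pro's 128-bit bus, ~850 MHz on the 5900 Ultra's 256-bit bus):

    # Bandwidth = bus width (in bytes) * effective memory clock (transfers/s).
    cards = {
        "Radeon 9600 Pro": (128, 600e6),        # 128-bit bus, ~600 MHz effective
        "GeForce FX 5900 Ultra": (256, 850e6),  # 256-bit bus, ~850 MHz effective
    }
    for name, (bus_bits, clock) in cards.items():
        print(f"{name}: {bus_bits / 8 * clock / 1e9:.1f} GB/s")
    # -> 9.6 GB/s and 27.2 GB/s, matching the figures quoted above.

Which rather answers the question: if a card with a third of the bandwidth keeps up, the HL2 bottleneck is shader arithmetic, not memory.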
Shimmishim - Thursday, September 11, 2003 - link
You are such a tease Anand...do you do this to your girl as well? :)
Anonymous User - Thursday, September 11, 2003 - link
31, If making a game run properly on NV3x hardware entails not using proper DX9 high precision rendering, then yes.. Valve is guilty of optimizing for ATi. Otherwise, nobody is to blame but Nvidia (for designing the NV3x), and Microsoft (for designing the DX9 specifications).Anonymous User - Thursday, September 11, 2003 - link
24, NV3x codepath translation: Render with crappy precision because our card can't handle real DX9 rendering at anything faster than a snail's pace.Anonymous User - Wednesday, September 10, 2003 - link
Fudge, I guess I will wait for the 9800XT cause those numbers off the 9800 Pro are pitiful. But certainly not as pitiful as Nvidia. Nvidia, you lost this round. See ya next year.digitalwanderer - Wednesday, September 10, 2003 - link
"If ATI does hold slim lead, I'm sure GeForce FX 5950 will remedy that or an optimized driver update. :-)" Yeah, ATi holds a slim lead... with their mid-range card over nVidia's best! :lol:
nVidia's day of reckoning is coming...probably tomorrow. ;)
Anonymous User - Wednesday, September 10, 2003 - link
im so happy, I have a radeon 9600 pro, and im just so very happy.Anonymous User - Wednesday, September 10, 2003 - link
"Half-Life 2 has a special NV3x codepath that was necessary to make NVIDIA's architecture perform reasonably under the game;""Valve insists that they have not optimized the game specifically for any vendor."
Don't these statements seem to contradict each other? I assume the first is Anand's opinionated statement. The second is Valve insisting no optimizations. I have to believe the second.
If ATI does hold a slim lead, I'm sure the GeForce FX 5950 or an optimized driver update will remedy that. :-)
Anonymous User - Wednesday, September 10, 2003 - link
Buggy? ATI cards have some of the worst drivers in existence if you're not playing brand spanking new games.Anonymous User - Wednesday, September 10, 2003 - link
Well, I've been waiting for HL2 to be released before getting my new vid card, but I guess this clinches it. It'll be ATI. :) Heck, if I'm lucky maybe there'll be a 9900 pro out by then. :)AgaBooga - Wednesday, September 10, 2003 - link
In an interview with Gabe Newell, here: http://www.gamersdepot.com/interviews/gabe/001.htm - he mentions FSAA. I wonder if they worked everything out and got it working...NerdMan - Wednesday, September 10, 2003 - link
Figures, Nvidia's "code it crappy now, fix it later" mentality finally bit them in the ass. They shove this crappy code out that barely works in the minor games, and then release optimizations... errrmm.. patches... for every major game.Stick with good 'ole ATI. Sure, their drivers might be slightly buggy, but at least they actually fix the problems.
AgaBooga - Wednesday, September 10, 2003 - link
Remember, we still know very little about performance, even though he said that it is getting 60 FPS at 1024x768 on the 9800 Pro. Does that mean with anti-aliasing on or off? How about anisotropic filtering? You can't just forget those when you argue the 9800 performs better. Until we see exactly what is being used in the tests, we can't accurately judge the performance.
Also, the comment about it running in DX8 on 5200 and 5600 cards means that they understand the market and are making it playable. When they commented some time back about it running on a 4600, you have to remember that was when there was lots of debate regarding AA issues and they weren't very sure. That was one problem they attributed more towards Nvidia, and they said they would try to make a workaround for ATI cards.
Also, we've seen the wonders Nvidia can do with drivers some time back, and so we can expect something to change the situation now. I'm sure that Nvidia has known about these problems and is working on fixes or enhancements for its speed.
Let me also say that I'm not saying ATI isn't doing well; they are doing great, and this has fallen at a very good time for them if they can get the next generation of cards to outdo Nvidia, because the 9700 and 9800 have given them the higher performance for a much longer time.
What I'm looking forward to is seeing the AA and AF results, as well as whether they made any other optimizations, such as for hyper-threading...
Anonymous User - Wednesday, September 10, 2003 - link
haha take that blind nvidia fanboys
some people will do anything to defend Nvidia
digitalwanderer - Wednesday, September 10, 2003 - link
ROFLMFAO~~~~~ I could have sworn I mentioned somewhere here before that the truth would come out in the fullness of time... I guess the time is full now!
To any nVidia enthusiasts saying that the Det 50s will fix this, you might want to check out our story on the Det 50s over at www.elitebastards.com ....they're comparable in performance to the 45.23 set! :lol:
Anonymous User - Wednesday, September 10, 2003 - link
surely it's close to illegal what they're doing - unfair practices / stifling competition like MicrosoftAnonymous User - Wednesday, September 10, 2003 - link
"Gabe Newell: Valve and NVIDIA both know that we have a lot of shared customers, and we've invested a lot more time optimizing that rendering path to ensure the best experience for the most customers." - Makes you wonder what would happen if they hadn't. 20fps anyone?Anonymous User - Wednesday, September 10, 2003 - link
Here's something very interesting that sort of answers my question: http://www.gamersdepot.com/hardware/video_cards/at...
(benchmark results)
-Big D.
Anonymous User - Wednesday, September 10, 2003 - link
#24, who cares about D3?!? Doom is the same ole crap it always was! Nothing's changed. It's not 1996 anymore :PAnonymous User - Wednesday, September 10, 2003 - link
Boy am I glad I bought a 9800Pro just yesterday! Decided to skip nvidia for the first time in years, and it seems I made the right choice! Phew!Anonymous User - Wednesday, September 10, 2003 - link
#40 i think if you have low end gfx cards they want you to run it in dx6
#39 vice city is a terrible game
gta3 great vice city is fucking shit dribbling out of a gorillas ass in comparison
Anonymous User - Wednesday, September 10, 2003 - link
It makes sense if you think of it in terms of what kind of detail you'll be able to milk out of the card. A GeforceFX will play HL2 better than a GF2, but not at the level you would expect for a premium-price card.Anonymous User - Wednesday, September 10, 2003 - link
HAHAHA Nvidia had it coming to them, like to see them "Optimize" this one.Anonymous User - Wednesday, September 10, 2003 - link
I don't quite understand how it allegedly ran "fine" on a 4600 (they claimed they ran HL2 with that card and it ran fine) but it runs like shit on a 5600 and even on a 5900. So running it in DX8 mode on a 5600 will only make it "playable" (30+ fps)? Then how is a TNT card in the min requirements? Won't a GF2 run it in DX8 mode also?... How can it be playable on that and on a GF4, but not on an FX card? That doesn't make sense to me.
Also, someone mentioned STALKER. Well, they've shown STALKER being played on an FX 5200, and it seemed to run pretty well to me. It had a lot of grass too, so I don't think the details were on "low." I own a 9800 Pro, but that still doesn't make me think it's normal to claim HL2 runs well on a 4600, but very slowly on a 5900 and only at 60fps on a 9800 Pro. It doesn't make sense.
-Big D.
Anonymous User - Wednesday, September 10, 2003 - link
HL2 will beat the crap out of Doom3 any day. Just look at the gameplay. Why is GTA: Vice City so great? Certainly not for the graphics - not that HL has anything to be ashamed of in that department. I have a feeling that Doom 3 will look great but play like crap, while HL2 will have it all :)
BTW: the leaked Doom3 demo works surprisingly well on my R9700, and if I recall correctly it was first shown off running on a Radeon 9700. I know because I had just bought a GeForce4 Ti4400 for $450 and was shocked that they didn't use a Ti4600.
Anonymous User - Wednesday, September 10, 2003 - link
lol@#37Anonymous User - Wednesday, September 10, 2003 - link
Hell, why did they even call it dx9? Why did nvidia and ati make video cards specifically for dx9? </sarcasm off> Performance-wise, I'd say the difference isn't enough to give you a stroke, but it's certainly noticeable.Anonymous User - Wednesday, September 10, 2003 - link
Why can't people make the distinction between DirectX and OpenGL? Every time a DX9 game comes out and performs way better on R3xx than NV3x based GPUs, are they going to keep on citing Doom III? It's not even a DX9 or DX9-class game! The NV3x still needs a special pathway in that game that runs at lower precision to beat the R3xx. I doubt even the guys developing Stalker can make the NV3x outshine the R3xx in a full-blown DX9 game.Anonymous User - Wednesday, September 10, 2003 - link
"Valve recommends running geforce fx 5200 and 5600 cards in dx8 mode in order to get playable frame rates."The differences from DX8/9 are VERY minimal. And its not bumb-mapping on the NPCs that you lose it's simply shading of the skin to look sweaty.
Anonymous User - Wednesday, September 10, 2003 - link
#26.... valve weren't "crappy coding guys" when HL1 came out several years ago, were they? At that point they used id Software's Quake 1/2 engine.
Anonymous User - Wednesday, September 10, 2003 - link
"tomorrow night at midnight"is illogical.
Midnight is the very first event of morning, not evening.
Micronaut - Wednesday, September 10, 2003 - link
Didn't specifically program for one or the other? But it runs like crap on nVidia and great on ATI? Where's my BS flag?
(not that I really care, I won't play HL2)
Anonymous User - Wednesday, September 10, 2003 - link
What would you prefer to play, based on the previews/videos/etc: Doom3 or HL2? I vote HL2 hands-down. id haven't made a compelling game since Doom2. Just an opinion, don't take offense; of course Carmack & Co are extremely gifted.Anonymous User - Wednesday, September 10, 2003 - link
Anand, change the title text! When I read this i get this image of u jumping up and down with an ATi flag, advertising their hardware or something. Shitloads can change before HL2 is released in November, and maybe it "rocks on nVidia" too on release.
Anonymous User - Wednesday, September 10, 2003 - link
You'll lose some of the fancier water effects, among other things. Turns off some mode of bump mapping on models too, i think. Saw a chart once; you can try looking for it.Anonymous User - Wednesday, September 10, 2003 - link
Um, what's the diff. between running HL2 in DX8/9? Anyone?
Anonymous User - Wednesday, September 10, 2003 - link
I don't want to take this into fanboy territory, but valve weren't "crappy coding guys" when HL1 came out several years ago were they? In fact, one could argue that if HL2 was designed to work on systems that are 2+ years old they can't be all that bad can they?Anonymous User - Wednesday, September 10, 2003 - link
I LOVE MY ATI 9700 PRO MUHAHA -=0)~Anonymous User - Wednesday, September 10, 2003 - link
Is it just me, or is everyone ignoring the situation with Doom 3 benchmarks running noticeably faster on nVidia's hardware? Okay, let's try something: I'll restate what Anand said, but swap nVidia for ATi and HL2 for Doom3:
"
- with the NV3x codepath, nVidia is the clear performance leader under Doom3, with the FX5900U hitting around 60 fps at 10x7. The R9800 is noticeably slower.
- the FX5700 Ultra performs well - it is a good competitor of the R9800;
- nVidia didn't need these special optimizations to perform well and Carmack insists that they have not optimized the game specifically for any vendor.
Okay, just to make sure guys, go read Anand's Doom3 article. You'll see all the above holds (if you extrapolate what we know, the FX5700 bit too).
You've also got to remember, regardless of designs/pipes/whatever, *both* the FX59 and R98 are about 120M transistors, and nVidia is actually clocking those transistors faster (though ATi can make up for this by using 24-bit instead of 32-bit per unit memory). Unless nV's engineers were plain stupid designing the chip (unlikely), they have hardware that's easily on a level with the R98, and later drivers are likely to exploit this further. ATi's basically been optimising their R300/350 for years now; there's probably not nearly as much headroom for compiler optimisations in later drivers.
Basically I think when the Det.50s come out, the FX59 will even up with the R98 in HL2, and totally trounce it in Doom3. Maybe not, but just thought I'd restore some balance here :)
nV basically overshot a bit with their NV3x hardware generation while ATi stuck to DX9 fundamentals, so they've been sorta screwed from the start. I reckon they'll kick ass with NV40 though. I hope ATi does as well!
GS
Anonymous User - Wednesday, September 10, 2003 - link
#21: Doom III's looks with Half-Life 2's interaction ("great gameplay") could be possible, with great framerates on both ATi and Nvidia cards. But NOT with valve's crappy coding guys behind the wheel. That takes a Carmack or a Sweeney.Anonymous User - Wednesday, September 10, 2003 - link
Bruhahaha nvidia fans, in your face! Sure glad I cashed out for a 9700 :)Anonymous User - Wednesday, September 10, 2003 - link
How about Half-Life 2's great gameplay versus Doom 3's horrible didn't-I-do-this-before-in-the-first-two-Dooms-and-all-the-Quakes?Anonymous User - Wednesday, September 10, 2003 - link
Maybe valve just can't code? It's the other way around with Doom III: Half-Life 2's crap-looking engine versus Doom III's great-looking engine.Anonymous User - Wednesday, September 10, 2003 - link
Nvidia is, I bet, a bit desperate about this. How else do you explain those ridiculous "The Way It's Meant To Be Played" logos they're marketing so aggressively?Anonymous User - Wednesday, September 10, 2003 - link
Oh well, at least I only have a Ti4200, so it doesn't come as such a shock that performance will be sucky! It'll be ATI for me next time for sure. If the 9800Pro is pulling 60fps at 1024x768 and is way ahead of the FX... wow, I feel for you owners of $400 FX5900s.AgaBooga - Wednesday, September 10, 2003 - link
You know Anand, I somehow had a feeling you were working up a Half-Life 2 article or a Doom 3 article... maybe I'm starting to pay too much attention to your writing ;) It's good to see an article from you after so much time! :)
Anonymous User - Wednesday, September 10, 2003 - link
im glad i got the 9700pro, it was that or wait until nv came out w/ a competing card when i bought itAnonymous User - Wednesday, September 10, 2003 - link
Somewhat of an odd news piece. oh well good to see i have the right video card though =)Anonymous User - Wednesday, September 10, 2003 - link
That's the most shocking news i have come upon in my gaming life... i feel sad for those with FX cards... Let's face it, if this were not Nvidia cards but ATI cards, i would have been really sad.sandorski - Wednesday, September 10, 2003 - link
Glad I went with a 9600 Pro, though Nvidia wasn't tempting for me at any rate. Damn though, what happened to Nvidia?Anonymous User - Wednesday, September 10, 2003 - link
#4, im one of the early 9500 pro owners and im LMAO with u =)Anonymous User - Wednesday, September 10, 2003 - link
This is kinda good news. I'm guessing he has the Half-Life 2 benchmark Valve was talking about when HL2 was first announced - you know, the benchmark that was supposed to come out 2-3 weeks before HL2 shipped?
Steam comes out tomorrow, so maybe the benchmark will follow, and that's why he can't post now?
Anonymous User - Wednesday, September 10, 2003 - link
Or #6, yeah, they're in bed: ati paid valve a ton of money just so it would run like crap on nvidia's hardware. I mean come on, it's because Nvidia went the cheap route on their cards; they put in crap for hardware muscle to push dx9.
Anonymous User - Wednesday, September 10, 2003 - link
i'm sure nvidia will release drivers that will make it much more competitiveAnonymous User - Wednesday, September 10, 2003 - link
Yeah, #6, that's it. They're in bed together. It can't possibly be how slowly NV3X runs the ARB2 rendering path, NV3X's extreme brute force approach, or the massive fundamental differences between NV3X and R3XX. It has to be a conspiracy!
Anonymous User - Wednesday, September 10, 2003 - link
yo anand, its rejoice not rejoyceAnonymous User - Wednesday, September 10, 2003 - link
Illogical to code a game that won't perform well with Nvidia cards. I guess Valve and ATI are in bed together.Anonymous User - Wednesday, September 10, 2003 - link
Sweet jesus!Anonymous User - Wednesday, September 10, 2003 - link
People who bought Radeon 9500 Pros before prices went sky high, people who bought Radeon 9500s and soft-modded them, you may now spend the next week laughing at people with GeForce FX 5600s. Ready.. set..dvinnen - Wednesday, September 10, 2003 - link
no, it softmods just fine. Runs hella fast and all. Just some driver issues. For like a week no OpenGL programs would work. Did everything I could think of short of reformatting; couldn't get it to work. Then one day it just started working at random, go figure.Anonymous User - Wednesday, September 10, 2003 - link
You said bad things about your 9500? What the.. oh, wait. Was it a 9500 plain that refused to softmod? In that case, ouch :( I'm really happy I decided to go with an ATI card. It's no 9800 Pro, but it should come close.
dvinnen - Wednesday, September 10, 2003 - link
Lord, forgive me for all the bad things I've said about my 9500. I take them back now.
Sucks to be an nVidia owner, eh?