79 Comments

  • Nuke Waste - Thursday, December 16, 2004 - link

    Would it be possible for AT to update the timedemos to Source Engine 7? Steam "graciously" updated my HL2 platform, and now none of my timedemos work!
  • The Internal - Friday, December 3, 2004 - link

    Which x700 XT card was used? How much RAM did it have?
  • VortigernRed - Tuesday, November 23, 2004 - link

    "Remember that we used the highest detail settings with the exception of anisotropic filtering and antialiasing, "

    That is not what the screenshot on page 2 shows. There, the water detail is set to "reflect world", not "reflect all".

    I would be interested to see how that affects the performance in your benchmarks with water in them; some sites are showing larger wins for ATI, and it seems possible that this setting is the difference.

    It certainly looks much better in game with "reflect all", but it does affect performance.
  • Warder45 - Sunday, November 21, 2004 - link

    I'd like to know what you guys think about Xbit's and other reviews that have ATI way ahead in the numbers due to turning on Reflect All instead of just Reflect World.

    http://www.chaoticdreams.org/ce/jb/ReflectAll.jpg
    http://www.chaoticdreams.org/ce/jb/ReflectWorld.jp...

    Some screenshots.
  • Counterspeller - Friday, November 19, 2004 - link

    I forgot about my specs: P4 3.0, 3 HDs (8, 16, and 60GB), P4P800-E Deluxe motherboard, Samtron 96BDF screen.
  • Counterspeller - Friday, November 19, 2004 - link

    I don't understand... I have a GeForce 256 DDR, and the ONLY game that I have not been able to play is DOOM 3, only because it asks for 64MB of VRAM and I only have 32. I'd like to play HL2, but I don't have it. Perhaps it'll be like D3... not enough VRAM, and in that case, the second game I can't play with this board.

    What I don't understand is this: how can anyone complain because game X or game Y "only" gives us 200 fps? Can YOU see 200 fps? We're happy with 24 fps on TV and 25 fps in the theaters, and we're bitchin' about some game that only gives us 56.7 fps instead of the "behold perfection" 67.5. I know there is a difference, and yes, we can see that difference, but is it useful in terms of gameplay? Will you be fragged because of a 1, 2, or even 3 fps difference between you and your opponent? Stupidity gets us fragged, not fps. I believe that anything below 30-40 fps may look nice, but it's unplayable when it comes to action games. I'm happy with 60; anything above that is extra.

    I have played many demanding games with this very board, and I can say that yes, some parts are demanding on it. But I never lost because of it.

    To sum up: I don't understand this war between ATI lovers and NVIDIA lovers. I've been using the same board for years, and I never needed to change it. Unless it crumbles, I'll stick with it.
  • TheRealSkywolf - Friday, November 19, 2004 - link

    I have an FX 5950, and I have turned on the DX9 path; things run great. First of all, the graphics don't look much better: you see slight differences in the water and in some bump mapping, but minor things.
    So I guess it's time for ATI fans to shut up; both the FX and the 9800 cards run the game great.
    Man, Doom 3 showed all the bells and whistles, why wouldn't HL2? I think it's very unprofessional of Valve to do what they did.
  • SLI - Friday, November 19, 2004 - link

    Umm, why was the Radeon P.E. not tested?
  • nthexwn - Thursday, November 18, 2004 - link

    I've also noticed that having the Steam client running in the background can place quite a load on your entire system! After downloading all the content to cut down on network/disk/buffering weirdness, I did some benchmarking of UT2004 with the ons_dria demo from nvnews and noticed that my fps drops by up to 10 when Steam is running in the background!

    Might it be possible to compare performance between the retail version of Half-Life 2 and the Steam version available for internet purchase, to see if there's any sort of performance difference? Or does the retail version just run through an offline Steam client anyway? (I bought over the web.)
  • cryptonomicon - Thursday, November 18, 2004 - link

    If your game crashes when switching to fullscreen, it is because you have refresh rate overrides in place.

    Add:
    -width X -refresh Y

    to your command line. For example:
    -width 1024 -refresh 100

    That fixed all my video problems.
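
    A fuller example, in case it helps; the values below are only illustrative, so match them to a resolution and refresh rate your monitor actually supports. If I remember right, you set this under the game's properties in Steam, and -height works alongside -width:

    -width 1280 -height 1024 -refresh 85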
  • meatless - Thursday, November 18, 2004 - link

    #10 - That's a pretty stupid thing to say. Kyle used the cards that his readers were most likely to buy; I know I wouldn't waste my money on a non-BFG NVIDIA 68xx card, and I know most other gamers wouldn't either. It's part of [H]'s focus on doing real-world-style benches instead of OMG LETZ C IF NV RULZ ATI 2DAY IN HL2!!111111

    With all that said, it's great to see stiff competition in the video card arena, finally; it should make for exciting product lines on the next go-round.
  • Jedi2155 - Thursday, November 18, 2004 - link

    #64
    There is a hidden HL2 MP in the game... however, it's not yet complete...

    quote

    11/17/2004 22:58 PST | Half-Life 2 | by MarmaladeMan
    HL2 World is reporting that they've found a working Half-Life 2 multiplayer built in to standard retail HL2. Here's the story, including how to do it:
    Here's how:
    net_start
    sv_lan 0
    deathmatch 1
    maxplayers (whatever you want)
    map (mapname)
    restart
    It will add you to the master server and it works. I know, it looks like the leak, but I assure you this is the retail HL2.
    They have a screenshot, as well as a test map for you to check out if interested. Head on over to HL2 World for the full story.

    http://www.hl2world.com/

    /quote
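
    As a concrete example, a full console session might look like the following; the player count and map name here are just placeholder values (any retail single-player map name should load, though I haven't tested which ones actually work well for deathmatch):

    net_start
    sv_lan 0
    deathmatch 1
    maxplayers 8
    map d1_trainstation_01
    restart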
  • jonmcc33 - Thursday, November 18, 2004 - link

    Get back to Half-Life 2? Why? There's no point other than the fact it's a pretty single-player game. If I wanted single player, then I would have raved about Max Payne 2, which I didn't. Why Valve didn't think to make a Half-Life 2 MP side is beyond me. That's where the market is these days. Single-player games, you play them once and you are done. Multiplayer is always changing. I don't want to wait for any stupid MP mod either. Curse you, Valve, for making us wait a year longer and then only giving us one piece of the cake!
  • TrungRacingDev - Thursday, November 18, 2004 - link

    You do realize that the FX 5900 defaults to DirectX 8.1, right? If you think it's beautiful now... try a DirectX 9.0 card =)
  • Motley - Thursday, November 18, 2004 - link

    I'm glad I didn't read this article before actually playing HL2.

    My system:
    P4 3.4GHz, 2GB Ram
    5900 Ultra video card
    ASUS P4P800 Motherboard

    I was playing HL2 at 1280x1024 with 6xAA and 16x anisotropic filtering, with everything else turned up to maximum. Besides a half-second stutter just after loading a new level, the game played GREAT, looked GREAT.

    Then again, maybe I'm not expecting the world, but I can say that I was pleased. Maybe the X800 or 6800 can turn out better numbers; HOWEVER, at no time did I feel that I needed (or even wanted in the slightest) faster frame rates or smoother gameplay. It just owned from the beginning to the end.
  • southernpac - Thursday, November 18, 2004 - link

    Anand, in light of the significant DX9 (HL2) performance advantage of the ATI X800 XT over the NVIDIA 6800 Ultra, would you today favor the X800 XT PE graphics card in combination with the MSI K8N Neo2 motherboard? In your last High-End Buyers Guide (August 30) you recommended the NVIDIA 6800 Ultra be used with the MSI K8N Neo2 because Wesley thought that motherboard performed "a bit better" with an NVIDIA card. What would your recommendation be today? Can we anticipate another High-End Buyers Guide this month (it's been 3 months)? Bill
  • blckgrffn - Thursday, November 18, 2004 - link

    I know, I know, but if a GF2 can play at 800x600 MQ, then maybe they can handle 1024x768... that would mean a bunch of my friends wouldn't have to upgrade from their $60 cards, and they would be overjoyed :)
  • araczynski - Thursday, November 18, 2004 - link

    Yawn, I'm too busy enjoying the game (6800GT) to read the article and/or care which card is better :) I'm playing at 1600x1200 0AA/4AF (2.4@3.3/1GB) and have absolutely no complaints, other than knowing that the game will eventually end :(
  • Jeff7181 - Thursday, November 18, 2004 - link

    #57... poorly :)
  • blckgrffn - Thursday, November 18, 2004 - link

    I would also like to see how the 9200/9000 series Radeons perform, and if you have extra time, the 8500/9100.

    Again, Thanks!
  • Jeff7181 - Thursday, November 18, 2004 - link

    #16... that's correct, although the only REAL observation that needs to be made is that Half-Life 2 makes heavy use of pixel shaders, which is very GPU dependent, and GPUs are just now growing the required testicles to process those shaders :)
  • blckgrffn - Thursday, November 18, 2004 - link

    Anand -

    I would like to see how the 6600 performs. As an 8 pipe card, it should perform better than the 9600xt and a little under a 9700 Pro, but it would be interesting to see if that is true. It is a great budget PCIe card along with the x700.

    Thanks!
    Nat
  • eva2000 - Thursday, November 18, 2004 - link

    Nice review. I downloaded your demos to run on an AMD64 3700+ @ 12x222 = 2664MHz with 1GB BH-5 @ 222MHz 2-2-2-6 1T and a PowerColor X800XT PE @ 520/560, and all demo results were within 3-4fps of the review's :)
  • Live - Thursday, November 18, 2004 - link

    Good reading as always. I would like to see minimum FPS though. I find it very important to see how low the cards drop when stressed; you can't see that with only average FPS.
  • housecat - Wednesday, November 17, 2004 - link

    So... where are the NVIDIA SLI versus ATI results??

    Muwahahaha.
  • Avalon - Wednesday, November 17, 2004 - link

    Hey Anand, I have an interesting request. Could you try Rivatuner on your 6800, unlock its pipes, and then bench it again? :P
    Just kidding. Actually, I'm glad you've confirmed what I've been thinking...that AF hasn't been doing much for me. Since I'm running on a lowly 9700, I think I'll just turn it off now, and enjoy a nice speed boost.
  • PrinceGaz - Wednesday, November 17, 2004 - link

    How about throwing a GF 5600 and maybe even a GF 5200 in as well for part 2, as an awful lot of people have them. Ultra versions of either if you prefer.

    I don't have one myself as I'm still using a Ti4200, but it would be interesting to see how they stack up in the DX8 codepath against the Ti4600 you are planning to test. And it should be worth a giggle to see just how "fast" the 5600 or 5200 can manage the DX9 codepath :)

    Given the resolution-scaling graphs this review included, and how the fastest cards were generally CPU limited with that A64 4000+ once the resolution was dropped to 1024x768, I'm not sure how much a CPU scaling article for part 3 will show that can't already be quite accurately guesstimated from how different CPUs generally tend to perform in games. But a comparison of the Athlon 64 4000+ against an Athlon XP, a Prescott, a Northwood, and if time permits a fast P3, Duron, and Celeron also, would be great.
  • nthexwn - Wednesday, November 17, 2004 - link

    In reply to Jeff7181 (#14):

    I have a Radeon 9700 Pro with the 4.11 drivers and I'm having the same problems with my LCD (Samsung SyncMaster 710T @ 1280x1024)! The refresh rate is set to 70Hz, and with vsync I either get 35 fps (interleaving frames to every other refresh) or 70 fps (matching frames to the refresh rate)... Since our cards are from different companies, I'm guessing it's a problem with the game itself...

    I've tried both triple buffering and altering the DVI frequency (I don't know if that would even help) and neither solves the problem...

    It's rather irritating because I actually PLAY my games instead of just gawking at the benchmark scores (I'm one of those lucky people who has some free time!), and the screen looks like a Freddy Krueger job without vsync on! :*(

    Also, when the game switches between 70 and 35 there is a bit of a stall which, even though 35 fps is still playable, can ruin online play in CS:S! Especially since player models running onto the screen tend to temporarily stress the card enough to make it hitch up on me, in which time said player usually caps me in the head and moves on! :*(

    I suppose we could type fps_max 35, or fps_max 42.5 if it accepts floating-point values (you could also just set your monitor to 80Hz and set fps_max to 40), in the console, but limiting the framerate to those values isn't what I'd call an ideal solution...
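
    To spell out the workaround: a line like the following in the console, or in autoexec.cfg so it sticks, is what I mean (assuming fps_max really does accept these values):

    // cap the framerate at half the 70Hz refresh so vsync stops bouncing between 70 and 35
    fps_max 35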

    Oh well...

    GREAT GAME! GREAT HARDWARE! GREAT WEBSITE!
  • smn198 - Wednesday, November 17, 2004 - link

    I've got a 9800SE 128MB (256-bit) card and would like to know how that compares. I fried my 9500 Pro making it into a 9700 Pro, so it won't do 3D anymore (artifacts, then crashes) :(

    Which of the graphics cards being tested would have similar performance to a 9800SE (256-bit RAM)?
  • ElMoIsEviL - Wednesday, November 17, 2004 - link

    "The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison."

    I should post you a picture... the x700XT is available at Future Shop in Canada and has been for about a week now :)

    Although not my cup of tea, they are selling quite well, I'm told.

    But then again, ATi cards always sell well in Canada... so well that ATi usually cannot fill the demand (with the USA taking so many of the chips, lol).
  • ElMoIsEviL - Wednesday, November 17, 2004 - link

    Well... for one thing, the numbers are not even close to what other sites are showing, and secondly, where's the x800XT PE?

    It's the card I own (albeit mine is clocked at 650/625).

    It's good to see ATi in the lead by such significant margins, and to see that the game can be easily played at 1600x1200 with 4xAA and 8xAF on an x800XT PE. Also great to see that the game runs well without the final HL2 drivers from ATi (yeah, the 4.12s are only interim; the full 4.12s are going to be fully optimised).

    The biggest surprise is how well the 6600GT performed; although it lost convincingly to the x700XT, it still put on a good showing.

    BTW, other sites are showing the x800 Pro beating the 6800 Ultra with the same drivers, albeit using an Athlon FX-55.

    Meh,

    Looks like ATi can probably offer even greater performance at lower resolutions, judging by how close the 1600x1200 results are to the lower-resolution ones.
  • SMT - Wednesday, November 17, 2004 - link

    Anand,

    My flashlight worked throughout Nova Prospekt. Are you sure yours wasn't available?
  • abravo01 - Wednesday, November 17, 2004 - link

    Was the 6800GT used in the test 128MB or 256MB? There's a huge price difference around here: if it was the 128MB version, then it's definitely the best buy.
  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    The AA benchmarks actually used 8X Aniso as well.

    Take care,
    Anand
  • OriginalReaper - Wednesday, November 17, 2004 - link

    On pages 8 and 9 you discuss AA and AF, yet on pages 10, 11, 12, and 13 you only list 4xAA being used. Did you forget to note the 8xAF in the results, or did the benchmark only use 4xAA?

    Thanks.
  • CU - Wednesday, November 17, 2004 - link

    I think an investigative article that shows what hardware becomes a bottleneck for HL2, and when, would be great. I look forward to it.

    "Any other requests?

    Take care,
    Anand"

    Can you send me all the hardware when you are done testing HL2? :-)
  • Cybercat - Wednesday, November 17, 2004 - link

    Nice, I wanted to know how the 9550 performed, mostly to see how it compares with the FX 5200. Is that the 128-bit or the 64-bit memory interface version? I'm pretty excited about the 6200 as well, since it's finally a budget card that performs better than the Ti4200. The performance leap this generation is spectacular.

    Overall, I think you left the other guys in the dust with this one.

    And on the subject of the X700 Pro, it's kind of an odd card, because with its price range (the 128MB version at about $180, the 256MB at $200), it's unclear what card it's competing with; it acts like a fifth wheel in this way. People would much rather buy an X700XT or 6600GT instead, since they're in the same general price range. Only thing is, like you said, the X700XT isn't widely available yet, making the X700 Pro a stopgap for now and giving NVIDIA the clear win in the mainstream market until ATI can start shipping the more competitive card. That's the only thing saving the X700 Pro right now from being completely pointless.
  • Anand Lal Shimpi - Wednesday, November 17, 2004 - link

    Thanks for all of the comments guys. Just so you know, I started on Part 2 the minute the first article was done. I'm hoping to be done with testing by sometime tomorrow and then I've just got to write the article. Here's a list of the new cards being tested:

    9600XT, 9550, 9700, X300, GF 6200, GF 5900XT, GF4 Ti 4600, GF4 MX440

    I'm doing both DX9 and DX8 comparisons, including image quality.

    After Part 2 I think I'll go ahead and do the CPU comparison, although I've been thinking about doing a more investigative type of article into Half Life 2 performance in trying to figure out where its performance limitations exist, so things may get shuffled around a bit.

    We used the PCI Express 6600GT for our tests, but the AGP version should perform quite similarly.

    The one issue I'm struggling with right now is the fact that the X700 XT is still not available in retail, while the X700 Pro (256MB) is. If I have the time I may go back and run some X700 Pro numbers to make this a more realistic present-day comparison.

    Any other requests?

    Take care,
    Anand
  • Cybercat - Wednesday, November 17, 2004 - link

    You guys made my day comparing the X700XT, 6800, and 6600GT together. One question though (and I apologize if this was mentioned in the article and I missed it), did you guys use the PCIe or AGP version of the 6600GT?
  • Houdani - Wednesday, November 17, 2004 - link

    18: Many users rely on hardware review sites to get a feel for what technology is worth upgrading and when.

    Most of us have financial constraints which preclude us from upgrading to the best hardware; therefore we are more interested in knowing how the mainstream hardware performs.

    You are correct that it would not be an efficient use of resources to have AT repeat the tests on hardware that is two or three generations old ... but sampling the previous generation seems appropriate. Fortunately, that's where part 2 will come in handy.

    I expect that part 2 will be sufficient in showing whether or not the previous generation's hardware will be a bottleneck. The results will be invaluable for helping me establish my minimum level of satisfaction for today's applications.
  • stelleg151 - Wednesday, November 17, 2004 - link

    Forget what I said in #34...
  • pio!pio! - Wednesday, November 17, 2004 - link

    So how do you softmod a 6800NU to a 6800GT?
    Or unlock the extra pipes...
  • stelleg151 - Wednesday, November 17, 2004 - link

    What drivers were being used here, 4.12 + 67.02??
  • Akira1224 - Wednesday, November 17, 2004 - link

    Jedi

    lol I should have seen that one coming!
  • nastyemu25 - Wednesday, November 17, 2004 - link

    I bought a 9600XT because it came boxed with a free coupon for HL2, and now I can't even see how it matches up :(
  • coldpower27 - Wednesday, November 17, 2004 - link

    These benchmarks are more in line with what I was predicting: the x800 Pro should be equal to the 6800 GT due to similar pixel shader fillrate, while the X800 XT should have an advantage at higher resolutions due to the higher fillrate from its higher clock.

    Unlike DriverATIheaven :P.

    This is great; I am happy knowing NVIDIA's current generation of hardware is very competitive in performance in all aspects when at equal amounts of fillrate.
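
    Back-of-the-envelope numbers, using the stock clocks as I understand them (treat these as approximate):

    X800 Pro: 12 pipes x 475MHz = 5.7 Gpixels/s
    6800 GT:  16 pipes x 350MHz = 5.6 Gpixels/s
    X800 XT:  16 pipes x 500MHz = 8.0 Gpixels/s

    So the Pro and the GT are within a couple of percent of each other, while the XT has roughly 40% more fillrate to spend at high resolutions.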
  • Da3dalus - Wednesday, November 17, 2004 - link

    In the 67.02 Forceware driver there's a new option called "Negative LOD bias"; if I understand what I've read correctly, it's supposed to reduce shimmering.

    What was that option set to in the tests? And how did it affect performance, image quality and shimmering?
  • alexlck - Wednesday, November 17, 2004 - link

    In the AT_coast_05.dem map, does the GF6800U really take no performance penalty with 4xAA @ 1024x768?
  • HardwareD00d - Wednesday, November 17, 2004 - link

    hey, #27, I was gonna say that ;)
  • jediknight - Wednesday, November 17, 2004 - link

    Well, it's obvious from the benchmarks. They don't lie.

    ATI RULZ NVIDIA SUXORZ!!

    (lol@#3)
  • bob661 - Wednesday, November 17, 2004 - link

    Do you need HL2 to play CS: Source? Thanks.
  • wien - Wednesday, November 17, 2004 - link

    #24 There is.. It's called Counter-Strike: Source
  • bob661 - Wednesday, November 17, 2004 - link

    Anyone know if there's multiplayer support in HL2? Thanks.
  • L1FE - Wednesday, November 17, 2004 - link

    Nice and thorough comparison. That 6600GT looks more and more enticing...
  • Rekonn - Wednesday, November 17, 2004 - link

    Great article, looking forward to reading the next one with slower CPUs. Being CPU limited with an A64 4000+ is a little scary; I wonder what kind of fps an XP 3200+ gets when paired with an AGP 6600GT. (Still running an overclocked Barton 2500+.)
  • Jalf - Wednesday, November 17, 2004 - link

    I'm surprised at how well NV stacks up... And good to see the 6800 GT beat the X800 Pro. Not because I'm an NV fan, but simply because it makes it easier to choose. When the 6800 GT wins over the equivalent ATI card, even in an ATI-optimized game, then it's kinda easy to choose what to buy... :D
    It's a lot harder with the other cards, where both companies score some wins in different games.
  • Regs - Wednesday, November 17, 2004 - link

    Yeah, I'm hoping a CPU roundup will come after part two! I can afford a 400 dollar video card, but not paired with a 700 dollar AMD CPU.

    I did notice a lot of stuttering in my gaming experience with an A64 3000+ and a 6800 GT/1024MB PC3200. I was playing at 1280x1024 with 4x/8x and max details, so likely I would have to cut out the 8x aniso to have smooth gameplay. I don't know if that was what Anand was mentioning with the "shimmering" of textures and the Manhattan calculations.
  • ballero - Wednesday, November 17, 2004 - link

    It'd be nice to see a comparison between CPUs.
  • Jalf - Wednesday, November 17, 2004 - link

    To those wanting benchmarks on older hardware, remember that this is a hardware site, not a games review site.

    Their focus is on the hardware, and honestly, few hardware enthusiasts can get excited about an 800MHz CPU or a GeForce 3. ;)

    For AT, HL2 is a tool to compare new *interesting* hardware. It's not the other way around.
  • CU - Wednesday, November 17, 2004 - link

    I would also like to see slower CPUs and 512MB systems tested. It seems all recent cards can run it fine, so it would be nice to see how other things affect HL2.
  • CU - Wednesday, November 17, 2004 - link

    Based on the 6800nu vs 6600gt results, I would say that HL2 is being limited by fillrate and not bandwidth. I say this since they both have about the same fillrate, but the 6800nu has around 40% more bandwidth than the 6600gt. So unlocking extra pipes and overclocking the GPU should give the biggest increase in fps. Anyone want to test this?
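
    Rough math behind that claim, assuming stock clocks (someone correct me if I have these wrong):

    6800nu: 12 pipes x 325MHz = 3.9 Gpixels/s fill; 256-bit @ 700MHz DDR = 22.4 GB/s
    6600GT: 8 pipes x 500MHz = 4.0 Gpixels/s fill; 128-bit @ 1000MHz GDDR3 = 16.0 GB/s

    Nearly identical fillrate, but 22.4/16.0 works out to exactly 40% more bandwidth for the 6800nu; so if the two cards score about the same, the bottleneck is fillrate (or the shader work behind it), not memory bandwidth.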
  • Jeff7181 - Wednesday, November 17, 2004 - link

    ... in addition... this is a case where minimum frame rates would be very useful to know.
  • Jeff7181 - Wednesday, November 17, 2004 - link

    Those numbers are about what I expected. I'm sorta thinking that triple buffering isn't working with the 66.93 drivers and HL2, because with vsync enabled it seems like the frame rate is either 85 or 42.

    I also suspected that anisotropic filtering wasn't particularly necessary... I'll have to try without it and see how it looks... although with 4xAA and 8xAF I'm still getting acceptable frame rates.
  • nserra - Wednesday, November 17, 2004 - link

    #8, I never heard of the 6800's extra pipes being unlocked; where did you see that? Aren't you confusing it with the ATI 9500 cards?
  • MAME - Wednesday, November 17, 2004 - link

    Make some budget video card benchmarks (Ti4200, plus or minus), and possibly a slower CPU or less RAM, so that people will know if they have to upgrade.
  • Akira1224 - Wednesday, November 17, 2004 - link

    #8 That's not a fair comparison. Yes, at the moment it would seem the 6800NU is a better buy. However, if you go to Gameve you will find the XFX 6600GT (clocked at PCIe speeds) for $218. That's a much better deal than your example using Newegg. You talk about a $5 difference... if you are a smart shopper you can get upwards of a $50 difference.

    THAT makes the 6600GT the better buy, especially when you consider that the market this card is aimed at is not the same market that will softmod their cards to unlock pipes. Either way you go, you will get great performance.

    I digress off topic... sorry.
  • nserra - Wednesday, November 17, 2004 - link

    You didn't use overclocked NVIDIA cards like HardOCP did. Kyle has the nerve to say he used stock clocks, but those BFG OC cards are overclocked from the factory. Just 25MHz, but it's something.

    Very good review!!! Better than the NVIDIA GeForce 6600GT AGP review, where something was missing.
  • Kovie - Wednesday, November 17, 2004 - link

    Mistype; I meant to say the 6600GT is being gouged.
  • Kovie - Wednesday, November 17, 2004 - link

    "Recently a number of users have asked that we compare the $300 GeForce 6800 to the $200 GeForce 6600GT to see if the added cost is truly worth it."

    Actually, we asked to compare the currently $245 6600GT (Newegg) against the currently $250 6800 (Outpost). Once the 6800GT stops being gouged and goes down to its supposed price, it will be a better buy. Right now, the $5 difference between them and the ability to potentially unlock the extra pipes on the 6800 make it a better buy.
  • Le Québécois - Wednesday, November 17, 2004 - link

    For my part, I'm more curious about slower CPUs, to see how much they affect the FPS.
  • mikecel79 - Wednesday, November 17, 2004 - link

    Where are the ATI 9600 and 9500 series cards in this? They are DX9 cards also.
  • LocutusX - Wednesday, November 17, 2004 - link

    So... I wonder how all the poor souls who went with GF59xx's are feeling now... ;)

    But yes, both manufacturers' "current-gen" parts are doing very well.
  • ciwell - Wednesday, November 17, 2004 - link

    I find the 6800 vs the 6600GT results to be intriguing as the 6600GT stacks up very nicely. I wonder how the comparison is in other games though.
  • Akira1224 - Wednesday, November 17, 2004 - link

    I know the flames are going to start soon. I would like to say great job to both Nvidia and ATI. Both cards are spectacular this round and we should all be impressed with the tech being shown in this roundup. To anyone who is gonna start with the ATI RULZ NVIDIA SUXORZ or vice versa lets all just save it. The performance is so close either way you can't lose. For the record I have a 6800GT.
  • Jalf - Wednesday, November 17, 2004 - link

    Well, I can give you the results with my hardware. :)

    I'm running an ancient Geforce 2 GTS (32mb) and Athlon TBird 1400 MHz.

    I haven't noted down actual FPS values, but in 800x600, with medium-ish settings, it runs perfectly smoothly. That's impressive, if you ask me. :P

    So I doubt you'll have a problem. :)
  • ksherman - Wednesday, November 17, 2004 - link

    I wonder how old, old hardware will do... maybe they'll go as far back as the 8500 and Ti400's... (cuz that's what I have ;)
