ATI Radeon X800 GT: A Quality Mid-range Solution
by Josh Venning on September 28, 2005 12:05 AM EST
Posted in GPUs
The Card, Specs and Test
Our Radeon X800 GT happens to be made by PowerColor, and it looks about the same as other X800s with the exception of the sticker on the heatsink. As we've mentioned in the past, however, it's not the looks but the performance that makes a good part. Let's talk about the card's specifications.
The X800 GT is something of a compromise between the high end of ATI's X700 line and the low end of the X800 series. Specifically, it has the same number of pixel pipelines as the X700, but the memory bandwidth of the X800 Pro. We find this an interesting approach to bridging the gap between the X700 and the X800, and we're curious to see how it performs. Here is a table comparing a few of the parts that we'll be testing.
Card Comparison
Card | Pixel Pipelines | Vertex Pipelines | Core Clock | Memory Clock | Price
Radeon X800 GT | 8 | 6 | 470MHz | 495MHz | $160
Radeon X800 | 12 | 6 | 390MHz | 350MHz | $200
GeForce 6600 GT | 8 | 3 | 500MHz | 500MHz | $160
We will also be testing the GeForce 6800 ($200) and the Radeon X800 XT ($325) to give us a broader performance comparison. We chose these cards based on their relative closeness in price and performance; the Radeon X800 XT is the exception with its higher price, and it is included here for reference. The X800 XT will obviously dominate in framerate except in tests that are severely CPU-limited (e.g. Unreal Tournament), as it represents a much higher class of graphics card.
Note that the X800 GT, X800, 6800, and X800 XT all have a 256-bit memory bus, while the 6600 GT only has a 128-bit memory bus. This should theoretically give the X800 GT an edge over the 6600 GT at higher resolutions and with anti-aliasing enabled. Conversely, the X800 GT can't push as many pixels per second due to its slower core clock (and it lacks certain features like SM3.0), so games that lean more heavily on raw processing power should do worse on the X800 GT than on the 6600 GT. In short, ATI trades more of one thing for less of another in order to compete with the 6600 GT.
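To put rough numbers on that trade-off, here is a minimal back-of-the-envelope sketch (ours, not part of the original review) that computes theoretical memory bandwidth and peak pixel fill rate from the specs in the table above. It assumes DDR memory on both cards, so the effective data rate is twice the listed memory clock, and one pixel per pipeline per clock.

```python
# Back-of-the-envelope theoretical throughput for the two $160 cards.
# Assumes DDR memory (effective data rate = 2x listed clock); real-world
# numbers will be lower due to overhead and game-specific bottlenecks.

def memory_bandwidth_gbs(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Theoretical bandwidth in GB/s: bus width in bytes * effective DDR rate."""
    return (bus_width_bits / 8) * (mem_clock_mhz * 2) * 1e6 / 1e9

def pixel_fill_rate_gpix(pipelines: int, core_clock_mhz: float) -> float:
    """Peak pixel fill rate in Gpixels/s: one pixel per pipeline per clock."""
    return pipelines * core_clock_mhz * 1e6 / 1e9

cards = {
    "Radeon X800 GT": {"bus": 256, "mem": 495, "pipes": 8, "core": 470},
    "GeForce 6600 GT": {"bus": 128, "mem": 500, "pipes": 8, "core": 500},
}

for name, c in cards.items():
    bw = memory_bandwidth_gbs(c["bus"], c["mem"])
    fill = pixel_fill_rate_gpix(c["pipes"], c["core"])
    print(f"{name}: {bw:.1f} GB/s bandwidth, {fill:.2f} Gpixels/s fill rate")

# Radeon X800 GT: 31.7 GB/s bandwidth, 3.76 Gpixels/s fill rate
# GeForce 6600 GT: 16.0 GB/s bandwidth, 4.00 Gpixels/s fill rate
```

On paper, the X800 GT has roughly twice the memory bandwidth of the 6600 GT but about 6% less raw fill rate, which is exactly the compromise described above.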
Here is the system configuration that we used in our tests:
MSI K8N Neo4 Platinum/SLI motherboard
AMD Athlon 64 FX-55 Processor
2x512MB OCZ 2-2-2-6 1T DDR400 RAM
Seagate 7200.7 120 GB Hard Drive
OCZ 600 W PowerStream Power Supply
48 Comments
Leper Messiah - Wednesday, September 28, 2005 - link
Hm. ATi is really sucking recently. My 9800pro gets better results than that thing in some tests... the performance should be much better than a 6600gt, I mean only 30MHz less clock, more vertex shaders, 256-bit memory bus, etc... drivers? I dunno. Kinda funny to see the 6800nu getting last, though.
yacoub - Wednesday, September 28, 2005 - link
BS! I mean maybe if you purposely ignore the 6800GT and X800XL that sell for around $250, sure, you could pretend there's a reason to be frustrated and stuck between getting a $400 power card and a 9800Pro, but the reality is quite different.
This card is clearly pointless and a year or two late at this price point. (And if it were released a year ago, you know it would have cost a lot more, meaning it would have been equally pointless then as well.)
yacoub - Wednesday, September 28, 2005 - link
Would anyone honestly spend $160 on a brand new GPU that can't even push beyond 20-30fps in most modern games? What the heck's the point?? Spend $80 more and get an X800XL and at least be able to PLAY the games instead of slideshow them. Also, correct me if I'm wrong, but isn't the fps listed in Anandtech tests the PEAK fps and not the average fps? If so, that means there's a good chance that every time there's any real action on screen, your fps are dipping down to the teens or single digits. Yeah, that's worth paying $160 for. @___@
jkostans - Wednesday, September 28, 2005 - link
The X800GT is actually a very capable gaming card. I just built a system with one and it ran everything I threw at it very nicely. Not much of a difference between this system and the last one I built with an x800xl. Definitely not a slideshow in any game (doom3, farcry, f.e.a.r, hl2 all ran smooth).
wharris1 - Wednesday, September 28, 2005 - link
I realize that the release of the x1600/R530 won't be until December, but I was wondering what the chance of it being released in AGP form would be, and if so, how delayed that version would be. Are any of the next-gen cards (7800/7600?, x1800/x1600) going to be released as AGP at any time? If not, I'll bite the bullet and get either an x800 XL or GTO; if there will be AGP versions of the newer cards, I'll probably wait until they come out.
coldpower27 - Wednesday, September 28, 2005 - link
I am sorry, I must have missed it: where are the memory configurations of the cards you tested? Assuming you used the PCI-E versions of all cards due to the motherboard choice.
X800 Vanilla = 128MB or 256MB???
X800 GT = 256MB???
6800 PCI-E = 325MHz core, 300MHz (600MHz effective) memory & 256MB???
6600 GT this is obvious at least, 128MB.
OrSin - Wednesday, September 28, 2005 - link
Can we get a benchmark for a non-FPS game? And don't say EQ2, because that's pretty close to one in terms of play style. We don't all play FPS games; can we get an RTS or even an RPG in the benchmarks? I can understand not using them in every test, but for the mid- and low-range cards, that's what people are playing more than Doom 3.
PrinceGaz - Wednesday, September 28, 2005 - link
Hear, hear. If necessary to save time, they should dump one or two of the FPS games and replace them with an RTS, a driving game, and a flight/space-sim in order to provide true variety.
jkostans - Wednesday, September 28, 2005 - link
I think this is the first review I've seen where the 6600GT was the better overall card. Every other review has them neck and neck in most games, with a few victories going to the x800gt and doom3 going to the 6600gt. This review seems a little off... but what do I know.
coldpower27 - Wednesday, September 28, 2005 - link
Hmm, they look fairly even to me still. Each has its own strengths and weaknesses.