48 Comments
ixelion - Wednesday, September 28, 2005 - link
I think it's important to note that if one is building a mid-range gaming machine and looking for the best bang for the buck, then one would be better off with an AGP system with a 6800nu from a decent manufacturer. Getting this card to 16 pipelines at its default clock of 350MHz (BFG), you get 5.6GPixel fillrate, which I think surpasses all of the cards in the article.
One should also consider that ATI is releasing some lower-level cards using the higher-end cores, which *might* be good cards for pipeline unlocking, provided they don't lock the pipes.
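The pipelines-times-clock arithmetic behind that 5.6GPixel figure can be sketched as follows; the per-card pipeline counts and clocks are the ones quoted in this thread, so treat them as assumptions rather than verified specs:

```python
# Theoretical pixel fillrate = pixel pipelines x core clock.
# Card figures are the ones quoted in this thread, not verified specs.
def pixel_fillrate_gpix(pipelines, core_mhz):
    """Return theoretical pixel fillrate in GPixels/s."""
    return pipelines * core_mhz / 1000.0

cards = {
    "6800nu unlocked to 16 pipes @ 350MHz (BFG)": (16, 350),
    "6800 vanilla (12 pipes @ 325MHz)":           (12, 325),
    "6600 GT (8 pipes @ 500MHz)":                 (8, 500),
}

for name, (pipes, mhz) in cards.items():
    print(f"{name}: {pixel_fillrate_gpix(pipes, mhz):.1f} GPixel/s")
```

The 16-pipe, 350MHz case works out to 5.6 GPixel/s, versus 3.9 and 4.0 for the stock 6800 and 6600 GT.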
acx - Wednesday, September 28, 2005 - link
6800 is performing better in EQ with 1600x1200 4XAA than 1280x1024 4XAA??
JarredWalton - Wednesday, September 28, 2005 - link
Hmmm... looks like a typo. It looks like 21.6 instead of 11.6 would be more appropriate, given the other charts. Not like either is really good.
DerekWilson - Wednesday, September 28, 2005 - link
well ... actually ... it's not a typo ... we tested and retested ... but kept getting the same thing.
we can't explain it. sorry we glossed over it, but we are looking into it. we should have mentioned it.
yacoub - Wednesday, September 28, 2005 - link
Hey, can you change the teaser line on the frontpage to read, "ATI's answer to a question nobody asked."? That would be much more humorous and accurate. As much as I love my ATi cards, I am really disappointed in them just throwing away all of their initiative this year with long product delays, paper launches, and putting out cards that don't meet the needs of the main gaming fanbase.
sri2000 - Wednesday, September 28, 2005 - link
A bit OT, but from the 2nd to last page: "While 236 Watts might seem like a lot, keep in mind that with NVIDIA's 7800 series cards, we've seen power draws as high as 280 Watts..."
I'm no expert, but if you're looking at systems where the max draw is 280 watts, then what would be the point of having a PSU of more than 300 watts (as long as that 300W PSU is of high quality)?
Wouldn't getting 400, 500, or even 600W power supplies just be a waste of money - falling victim to manufacturers' bogus marketing?
Just askin'
Pythias - Wednesday, September 28, 2005 - link
Thing is, cheap power supplies rated at 300 rarely hit 300W, and if you look at the systems, most of that draw is on the 12V rail. Also, if you get a good power supply rated higher than you need, it's going to be more efficient handling smaller loads. That means lower energy bills.
sri2000 - Wednesday, September 28, 2005 - link
One reason I brought it up is I saw reviews of the Antec Phantom PSUs at silentpcreview.com, and from what they say, it appears that the main difference between the two models (350W and 500W) is that the 350 is fanless, and the 500 has a fan which only kicks in when temperature/power draw warrants it.
So I'm wondering how many of these high-watt PSUs are merely rated higher because they have more aggressive cooling systems, and they're actually not of any better quality than their lower-rated siblings.
check out this link:
http://www.silentpcreview.com/article28-page4.html
I guess the reason companies like Dell use relatively low-wattage PSUs is that they're less expensive & higher power units just aren't needed - and they're quieter too.
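The wattage question being kicked around here comes down to where a PSU sits on its efficiency curve. A toy sketch of the arithmetic; the efficiency curve below is invented purely for illustration, not measured data from any real unit, and the article's 236W figure is treated as DC load for simplicity:

```python
# Toy model: wall (AC) draw for a given DC load at a given efficiency.
# The efficiency curve is invented for illustration only.
def wall_draw_w(dc_load_w, efficiency):
    """AC power pulled from the wall to deliver dc_load_w."""
    return dc_load_w / efficiency

def efficiency_at(load_fraction):
    """Hypothetical curve that peaks near full load."""
    return 0.60 + 0.15 * load_fraction

dc_load = 236.0  # system draw figure from the article, assumed DC here
for rating in (300, 500):
    frac = dc_load / rating
    eff = efficiency_at(frac)
    print(f"{rating}W PSU: {frac:.0%} loaded, ~{eff:.0%} efficient, "
          f"~{wall_draw_w(dc_load, eff):.0f}W at the wall")
```

Under a curve shaped like this, the smaller unit runs closer to its peak and wastes less at the wall; with a flatter real-world curve the gap shrinks, which is why the actual published curves matter.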
kleinwl - Wednesday, September 28, 2005 - link
That is incorrect. An overspec power supply will not be more efficient than a lower (but sufficient) PSU. The best efficiency is typically near full load. A half-loaded PSU will be much less efficient than a mostly loaded PSU. Just look at the efficiency curves and where they are rated.
bob661 - Wednesday, September 28, 2005 - link
Where can I get these efficiency curves?
Leper Messiah - Wednesday, September 28, 2005 - link
Hm. ATi is really sucking recently. My 9800pro gets some better results than that thing... the performance should be much better than a 6600gt, I mean only 30MHz less clock, more vertex shaders, 256-bit memory bus, etc... drivers? I dunno. Kinda funny to see the 6800nu getting last though.
yacoub - Wednesday, September 28, 2005 - link
quote: "For the majority of us who aren't able to go right out and pick up the most powerful card available for upwards of $400, finding the best option for your price range can be frustrating."
BS! I mean, maybe if you purposely ignore the 6800GT and X800XL that sell for around $250, sure, you could pretend there's a reason to be frustrated and stuck between getting a $400 power card or a 9800Pro, but the reality is quite different.
This card is clearly pointless and a year or two late at this price point. (And if it were released a year ago, you know it would have cost a lot more, meaning it would have been equally pointless then as well.)
yacoub - Wednesday, September 28, 2005 - link
Would anyone honestly spend $160 on a brand new GPU that can't even push beyond 20-30fps in most modern games? What the heck's the point?? Spend $80 more and get an X800XL and at least be able to PLAY the games instead of slideshow them.
Also, correct me if I'm wrong, but isn't the fps listed in Anandtech tests the PEAK fps and not the average fps? If so, that means there's a good chance every time there's any real action on screen your fps are dipping down to the teens or single digits. Yeah, that's worth paying $160 for. @___@
jkostans - Wednesday, September 28, 2005 - link
The X800GT is actually a very capable gaming card. I just built a system with one and it ran everything I threw at it very nicely. Not much of a difference between this system and the last one I built with an X800XL. Definitely not a slideshow on any game (Doom 3, Far Cry, F.E.A.R., HL2 all ran smooth).
wharris1 - Wednesday, September 28, 2005 - link
I realize that the release of the X1600/R530 won't be until December, but I was wondering what the chance of it being released in AGP form would be, and if so, how delayed that version would be. Are any of the next-gen cards (7800/7600?, X1800/X1600) going to be released as AGP at any time? If not, I'll bite the bullet and get either an X800 XL or GTO; if they will have AGP versions of the newer cards, I'll probably wait until they come out.
coldpower27 - Wednesday, September 28, 2005 - link
I am sorry, I must have missed it - where are the memory configurations of the cards you tested? Assuming you used the PCI-E versions of all cards due to motherboard choice.
X800 Vanilla = 128MB or 256MB???
X800 GT = 256MB???
6800 PCI-E = 325/300 (600MHz effective) & 256MB???
6600 GT this is obvious at least, 128MB.
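Memory configuration matters beyond capacity, since theoretical bandwidth is bus width times effective memory clock. A quick sketch; the bus widths and clocks below are commonly quoted figures for these cards, assumed here rather than taken from the article:

```python
# Theoretical memory bandwidth = bus width (bits) / 8 x effective clock.
# Bus widths and clocks are commonly quoted figures, assumed here.
def bandwidth_gb_s(bus_bits, effective_mhz):
    """Return theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000.0

print(f"6600 GT (128-bit @ 1000MHz eff.): {bandwidth_gb_s(128, 1000):.1f} GB/s")
print(f"6800    (256-bit @  700MHz eff.): {bandwidth_gb_s(256, 700):.1f} GB/s")
```

The 256-bit bus is where the plain 6800 keeps its edge over the 6600 GT even with a lower core clock.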
OrSin - Wednesday, September 28, 2005 - link
Can we get a benchmark for non-FPS? And don't say EQ2, because that's pretty close to one in terms of play style. Not all of us play FPS - can we get an RTS or even an RPG in the benchmarks? I can understand not using them in all tests, but for the mid and low range cards, that's what people are playing more than Doom 3.
PrinceGaz - Wednesday, September 28, 2005 - link
Hear, hear. If necessary to save time, they should dump one or two of the FPS games and replace them with an RTS, a driving game, and a flight/space-sim in order to provide true variety.
jkostans - Wednesday, September 28, 2005 - link
I think this is the first review I've seen where the 6600GT was the better overall card. Every other review has them neck and neck in most games, with a few victories going to the X800GT and Doom 3 going to the 6600GT. This review seems a little off... but what do I know.
coldpower27 - Wednesday, September 28, 2005 - link
Hmm, they look fairly even to me still. Each has its own strengths and weaknesses.
bupkus - Wednesday, September 28, 2005 - link
What would be a good minimum fps for UT2004?
tuteja1986 - Wednesday, September 28, 2005 - link
I saw the X800GTO selling at $280AUD, which is cheap since 6600GTs sell around $250-$300AUD in Australia. Anyways, I read the X800GTO review at http://www.tweaktown.com/document.php?dType=review... and thought the X800GTO was great for its price in Australia. Anyways, if I do upgrade at the end of this year, it would be either the X1600XT or 7600GT, whenever they come out.
AtaStrumf - Wednesday, September 28, 2005 - link
Man, you guys sure take your time (probably all those useless 7800 GTX reviews took their toll). At least you could have included the X800 GTO (and 9800 Pro for reference - same spec, old tech), but that said, it is one of the better GPU reviews lately. Just one gripe: you should have made it VERY CLEAR that the 128 MB X800 GT is much slower frequency-wise than the 256 MB one.
I must say I'm more than a bit disappointed in the X800 GT. It sure looked better on paper. The 6600 GT still seems to be the better card overall (1280x1024 no AA - which is what the great majority uses).
Here's hoping that the X1600 brings something better.
arturnow - Wednesday, September 28, 2005 - link
ATi responds to the GeForce 6600GT after one year. Congratulations!!!
CrystalBay - Wednesday, September 28, 2005 - link
For $200, FTW...
DerekWilson - Wednesday, September 28, 2005 - link
we're waiting for one ... but you might end up looking in another direction before we get to it.
imaheadcase - Wednesday, September 28, 2005 - link
"several titles coming out in the near future that will use the same engine. Quake 4 and Enemy Territory: Quake Wars"
Those are terrible examples - that's one way to not get on Doom 3's side. lol
Case in point: download the multiplayer video of Quake 4... you will laugh so much you'll wonder if it's still the Quake 2 engine. It does not even look changed from the last Quake.
Pete - Wednesday, September 28, 2005 - link
One note, I think you listed the effective rather than actual RAM speed for the 6600GT in the table on p.2.
DerekWilson - Wednesday, September 28, 2005 - link
First, Josh wrote this one (though Jarred did some editing). Second, I just fixed the problem -- you were correct.
Pete - Wednesday, September 28, 2005 - link
Pete - Wednesday, September 28, 2005 - link
Josh! I meant Josh! :)
JarredWalton - Wednesday, September 28, 2005 - link
It's okay, Paul - we know how names with the first letter can get mixed up. ;)
A few of the paragraphs are mine, but I doubt anyone would be able to pick them out. LOL. It's like "Where's Waldo": where's the paragraph written by a different editor?
ViRGE - Wednesday, September 28, 2005 - link
It went up nearly 2 days ago then disappeared, and now it's finally back up. What happened?
DerekWilson - Wednesday, September 28, 2005 - link
Technical difficulties :-) We worked it out though.
overclockingoodness - Wednesday, September 28, 2005 - link
Whenever you see an article disappear like that, just know that they published it accidentally. Although I must say, you guys have been having technical difficulties for quite a bit lately.
I probably wouldn't mind if the ATI R520 article went up early due to technical difficulties. ;)
DerekWilson - Wednesday, September 28, 2005 - link
hehe ... If we could possibly get it done early, that might be a problem :-) But we'll be working hard and late on that one.
rqle - Wednesday, September 28, 2005 - link
On the Power Consumption page, is that the WHOLE system power draw (CPU, HD, video card, RAM, board) or is it just the video card?
rqle - Wednesday, September 28, 2005 - link
nvm, didn't read it clearly the first time through.
nourdmrolNMT1 - Wednesday, September 28, 2005 - link
My 9800 Pro is seriously taking a beating. Maybe this Christmas I'll get a whole new inside, since upgrading my GPU means upgrading my mobo too.
hmm
ShadowVlican - Wednesday, September 28, 2005 - link
oh man.. these cards are supposed to cater to different price points, but now there's so many cards that it makes it even more confusing to buy... i'll stick with 6600GT because of purevideo (next pc = htpc)
Jep4444 - Wednesday, September 28, 2005 - link
ATI's next-generation cards are supposedly one-upping nVidia's PureVideo with H.264 encoding, and while you're at it, you may want to look into the whole Avivo thing, since it seems to be up your alley (although I guess a lot of it is marketing; I'm just referring to the concept in general).
That aside, I don't know why you recommend the X800XT to those needing to play at those settings while completely ignoring the X800XL, which at times has reached price levels below $250.
drinkmorejava - Wednesday, September 28, 2005 - link
But how does the 6600GT compare in SCCT with SM3.0 on? It's not an unbiased test if you're not using the cards to the best of their abilities. SM3 was built to give a performance boost that would encourage people to buy cards with it; no sense in leaving this out.
Why?
coldpower27 - Wednesday, September 28, 2005 - link
Remember, the 6800's strength lies in situations where AA & AF are applied. Its overall pixel fillrate is only 3.9GPixel compared to the 6600 GT's 4.0GPixel; if not memory bandwidth limited, there is potential for the 6600 GT to outperform the 6800 Vanilla. Vertex shader power also doesn't matter all that much, as the amount the 6600 GT has seems to be sufficient. Pixel shader fillrate is one of the most important indicators of performance when comparing across the same architecture.
Cybercat - Wednesday, September 28, 2005 - link
I was wondering about this myself. I've seen a number of benchmarks from other sources showing the 6800 to be the better performer. I hope Josh used a genuine NV41/42 6800 rather than just taking a NV45 and cutting it down.
Kagjes - Wednesday, September 28, 2005 - link
hmm, could someone plz tell me what's the overclocking like with 6800s? is it worth buying at all?
DerekWilson - Wednesday, September 28, 2005 - link
they have different strengths
jkostans - Sunday, September 25, 2005 - link
I just built a computer for a buddy with an X800 GT 256MB card plus an A64 3500+ in it. The 3500+ overclocked to 2.63GHz Prime95-stable, and the video card was running solid at 580MHz core / 595MHz memory and looped 3DMark tests all night without a single problem. Probably the best bang-for-the-buck system I've built so far. Performance-wise, better than the 3800+ and X800 XL system I built prior to it (stock speeds), and a lot cheaper.
Thatguy97 - Wednesday, June 24, 2015 - link
Don't see how the X800 GT was a quality mid-range solution, as the X800 XL and X800 were much better cards.