44 Comments
SuperDuper28 - Monday, September 13, 2004 - link
Anyone got any info on a release date? The only thing I have seen is "sometime in September," but I don't see any place to preorder or any manufacturers advertising it.
As far as benchmarks: I don't care how much memory it has, how many pipes, how many bits, what the core and memory are clocked at, etc. I've seen lots of different benches on different sites with different rigs, and in all of them it performs close to a 6800 and X800, and in some cases slightly outperforms them. I don't know how to explain it, but that's the way it is, and $200 for this card compared to $400+ for the others is a no-brainer.
Thatguy97 - Saturday, May 9, 2020 - link
Corona will kill us all
JarredWalton - Thursday, September 9, 2004 - link
Actually, mickyb, it's not so much that NVIDIA neglected DX9 performance as it is ATI neglecting OpenGL performance. If a game doesn't make any use of the special features of the GF6 cards (i.e. SM3.0 extensions), the X800 XT PE has a very large advantage in pixel rates and vertex rates. They're both 16x1 pixel and 6 vertex designs, but the ATI card runs at 520 MHz compared to 450 MHz on the 6800UE. That's a 15% performance advantage in clock speed, which correlates pretty well with the Source Stress Test results. So, I agree with you to a point, but really ATI just needs to get their OpenGL drivers up to par.
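To put rough numbers on that clock-speed point, here is a quick back-of-the-envelope sketch (Python, purely illustrative). It just multiplies the pipe counts and clocks quoted above, so the outputs are theoretical peaks rather than anything measured.

```python
# Theoretical peak rates from pipe counts and core clocks, using the figures
# quoted in the comment above (16x1 pixel, 6 vertex units, 520 MHz X800 XT PE
# vs. 450 MHz 6800 Ultra Extreme). Real-world throughput will differ once
# drivers, memory bandwidth, and shader load come into play.

cards = {
    "X800 XT PE": {"clock_mhz": 520, "pixel_pipes": 16, "vertex_units": 6},
    "6800 Ultra Extreme": {"clock_mhz": 450, "pixel_pipes": 16, "vertex_units": 6},
}

for name, c in cards.items():
    pixel_rate = c["clock_mhz"] * c["pixel_pipes"]     # peak Mpixels/s
    vertex_index = c["clock_mhz"] * c["vertex_units"]  # relative vertex throughput (units x clock)
    print(f"{name}: ~{pixel_rate} Mpix/s peak, vertex index {vertex_index}")

print(f"Clock-speed advantage: {520 / 450 - 1:.1%}")   # ~15.6%, i.e. the "15%" above
```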
mickyb - Wednesday, September 8, 2004 - link
**** WARNING: RANT ****
I am spending way too much time on this article, but every time I read the benchies on the 6800 and X800 series, I keep asking what is going on with NV and DirectX, and ATI and OpenGL. I have a hard time believing that the two APIs are so different that the hardware has to be designed for them. I think NV spent way too much time in the closet with Quake III and ATI in their closet with HL2. Both engines are going to have a presence in other games, and I don't want to have to pick a video card based on the games that I play. If this continues, it will turn into the same war that the consoles have. I chose a PC so I can play all games. Maybe this will all correct itself after the next driver releases.
JarredWalton - Wednesday, September 8, 2004 - link
I don't have any insider information, and I haven't signed any NDAs, so I can say that NF4 is probably due out within the next few months. That could be wrong, of course.
There will be a 6600 vanilla part, Visual, but due to the use of different RAM and other factors, we do not have benches for that yet. It was mentioned on the second page, fourth paragraph. I'm sure they're coming soon.
Finally, regarding the 6800 vanilla vs. 6600GT, there are two things to consider. First, how much faster is the 6800 when running on the same system? Generally, it ranges from slightly faster to about 20% faster when the system is essentially the same, so paying 50% more for at most 20% more performance probably isn't worth it to most people. The bigger problem, however, is that the 6600GT is a PCIe part. There should be AGP bridged versions in the future, but how far off are they? After the "paper launch" of the X800 and 6800 cards, I wouldn't expect an AGP 6600GT for another six weeks at best.
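For what it's worth, the value argument works out roughly like this. This is just an illustrative sketch: the ~$300 6800 price is my assumption derived from the "50% more" figure above, not a number from the article.

```python
# Quick value check, assuming ~$200 for the 6600GT and ~$300 for a 6800 vanilla
# (the "50% more" above), with a best case of ~20% higher performance.
price_6600gt, price_6800 = 200.0, 300.0   # assumed street prices, USD
perf_6600gt, perf_6800_best = 1.00, 1.20  # 6600GT normalized to 1.0, 6800 "up to 20% faster"

extra_cost = price_6800 / price_6600gt - 1        # 0.50 -> 50% more money
extra_perf = perf_6800_best / perf_6600gt - 1     # 0.20 -> at most 20% more speed
print(f"Extra cost: {extra_cost:.0%}, extra performance (best case): {extra_perf:.0%}")
print(f"Perf per dollar, 6600GT: {perf_6600gt / price_6600gt:.4f}")
print(f"Perf per dollar, 6800:   {perf_6800_best / price_6800:.4f}")
```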
I wouldn't be surprised to see NF4 and SLI and 6600GT all reach widespread availability around the same time. Well, that's what I would try to shoot for if I were Nvidia, anyway. If NF4 gets delayed due to technical issues, though, who knows?
Visual - Wednesday, September 8, 2004 - link
Oooh, reading mickyb's post, imagining two of these things, each with a TV tuner, with the SLI bridge between them... mmmm :)
The 3D performance would be perfect for today, and you'd have 4 monitors (or 2 monitors/2 TVs) max... and if it could work without disabling NF4's IGP, 5/6 monitors, just trashing ATI's HydraVision!
Two tuners giving you picture-in-picture, or recording independently of what you're watching... Or one card could have a digital tuner and the other an analog one... there'd be something for everyone :)
I'm drooling :)
Visual
P.S. Quite off-topic, but what's known about nVidia's plans for NF4, and is it expected soon?
Visual - Wednesday, September 8, 2004 - link
Seeing this card named "GT" I wonder if there will be cheaper versions of it... any info on that from nVidia?It'd be good to see a comparison between this card, the cheaper x600XT and the bit more expencive vanilla 6800, all on the same system. I'm pretty sure you guys won't bother to make another article for this, but at least if you see something on the web I hope you'll post in the news section :)
Also, I wonder if the 6800 is worth the extra cash in your opinion. I had almost decided on getting one, but this article makes me consider waiting again...
Thanks for the article,
Visual
mickyb - Wednesday, September 8, 2004 - link
Not sure any of this is viable, but going along with my previous comment, this looks like it would be a good candidate for SFF systems. It provides good performance at the right price. Furthermore, it supports SLI. For non-SFF systems, it is a nice way to get to the performance your games require incrementally. In practice, it is always a challenge to approach any performance target incrementally, because by the time you are ready to step up, it is time to get a whole new system.
Alright, back to SFF. It would be cool if this were the onboard graphics candidate for NF4. Couple this chip with video-in and it could finally compete with ATI in the AIW space. Currently I would always go for ATI, because of the performance vs. all-round-use trade-off. Here is where I think NV could have something: since this chip supports SLI, it would be really something if NF4 had this video onboard, and if you wanted a little more power, you could just add another card. I don't know about you all, but I don't like buying a nice motherboard just to have to disable an almost-passable video card because it couldn't run the next game.
JarredWalton - Wednesday, September 8, 2004 - link
First, perhaps the disclaimer wasn't big enough, but running the tests on different platforms is never an optimal solution. This can and will affect the results in various instances. A 3400+ is a very fast chip, but the 3.4 GHz P4EE is still going to beat it in many situations. However, going back and running hundreds of benchmarks on different platforms really isn't an option. At some point in the future, I'm sure there will be a more equal roundup of the cards, but it's just an approximate measurement of performance anyway.If you look at the "performance estimates" in the GPU Cheatsheet article we recently published, you can see some additional support for why the 6600 GT would sometimes beat the 6800 vanilla. In pixel performance, the raw clockspeed of the 6600GT puts it just a fraction faster than the 6800, while it's slower in bandwidth by a relatively large amount. They're also pretty close in vertex processing power. Change the CPU from a 2.2 GHz 1MB L2 A64 to a 3.4 GHz 2 MB L2 P4EE and the closeness of the cards can end up pushing the GT ahead in cases where the P4EE is faster.
(I may have the wrong number of vertex pipelines as well, which could mean that the 6600GT has substantially more vertex processing power. Any word on that, Derek? I had 3 vertex pipelines as "rumor" on the 6600 chips, but the benchmarks seem to indicate that it might have more.)
Without knowing the internal structure of the chips and cards, we can't say for sure, but there could be occasions where an 8x1 chip makes more efficient use of its resources than a 12x1 chip. Perhaps certain latencies are higher on the NV40 than on the NV43, or maybe a 12x1 pipeline setup just taxes the memory in a different fashion.
Relative to the older generation hardware, it should come as no surprise that a 500 MHz 8x1 design is generally faster than a 412 MHz 8x1 design (9800XT) or a handicapped 475 MHz 4x2 design (5950U). At lower resolutions, the memory bandwidth advantage isn't as much of a factor, and the 6600 has the superior architecture. In bandwidth-hungry games, the 6600GT would likely lose out, but in PS/VS-heavy operations, the raw clock speed will be very beneficial.
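As a rough illustration of these estimates, the sketch below multiplies core clock by pipe count and effective memory clock by bus width. The clocks and bus widths are commonly quoted reference specs that I'm assuming for the sake of the example, not figures from the article, and peak numbers like these ignore efficiency differences between architectures.

```python
# Back-of-the-envelope peak fill rate and memory bandwidth, using commonly
# quoted reference clocks (assumed here, not taken from the article).
# Peak numbers ignore architectural efficiency, so treat them as rough guides.

cards = [
    # name,            core MHz, pixel pipes, eff. mem MHz, bus width (bits)
    ("GeForce 6600GT",   500,  8, 1000, 128),
    ("GeForce 6800",     325, 12,  700, 256),
    ("Radeon 9800XT",    412,  8,  730, 256),
    ("GeForce 5950U",    475,  4,  950, 256),   # 4x2: 4 pixel pipes, 2 TMUs each
]

for name, core, pipes, mem, bus in cards:
    fill_mpix = core * pipes          # peak Mpixels/s
    bw_gbs = mem * bus / 8 / 1000     # peak GB/s
    print(f"{name:15s} fill ~{fill_mpix:5d} Mpix/s, bandwidth ~{bw_gbs:4.1f} GB/s")
```

On those assumed figures the 6600GT edges the 6800 on raw pixel rate (roughly 4000 vs. 3900 Mpix/s) while giving up about 30% of the memory bandwidth, which lines up with the "fraction faster in pixel performance, slower in bandwidth" estimate above.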
kmmatney - Tuesday, September 7, 2004 - link
What! No Quake3 benchmarks? :)
trenzterra - Tuesday, September 7, 2004 - link
Are you reviewing a 256MB or a 128MB card? I can't imagine a 128MB card beating the hell out of the X800 and even their 6800.
coldpower27 - Tuesday, September 7, 2004 - link
Oops, I mean it's probably Socket 604, so they need two Xeons, preferably Noconas, but those aren't LGA775 :D
coldpower27 - Tuesday, September 7, 2004 - link
Yeah, that would be possible in a way, I believe.
You can bench the Pentium 4 Prescott 3.4 GHz on i875P with DDR400 2-2-2-5 for the AGP GPU.
Then you can bench the Prescott 3.4 GHz LGA775 on the i925X platform with DDR2-533 4-4-4-12 for the PCI Express GPU; that would be roughly equivalent.
Or you can supplement both with the Pentium 4 EE 3.4 GHz, if you have both the S478 and LGA775 editions of those two processors.
It's too bad you can't bench SLI, but it's hard to expect them to; they'd need Xeons on the Tumwater chipset :S And isn't that LGA775 only too? So they'd need two Noconas???
JackHawksmoor - Tuesday, September 7, 2004 - link
Looks like a great card, but it's probably even better than what they're showing here, since they're comparing it to weaker cards in systems with better CPUs...
Really needs to be redone with everything but the video cards kept constant.
DEMO24 - Tuesday, September 7, 2004 - link
Why the heck is that card beating a 6800? Hopefully the 6800 will pull ahead more than that with newer drivers.
Cygni - Tuesday, September 7, 2004 - link
RTFA
Bored Guy - Tuesday, September 7, 2004 - link
Anyone know if the 6600 GPU will be available in an AGP interface anytime soon?
8NP4iN - Tuesday, September 7, 2004 - link
Nice to see a $200 card beating a $450 9800XT or 9800 Pro... can't wait to upgrade when nForce4 comes out.
Carfax - Tuesday, September 7, 2004 - link
I was just wondering, because I know the 65.76 drivers would have raised the 6800 series' performance as well!
ZobarStyl - Tuesday, September 7, 2004 - link
The point of the article was to compare the PCI-E mid-range, and guess what: if the 6600GT is 200 bucks, its direct price competition is the X600 XT, which can't even hold a candle to the 6600GT. If they release the AGP version at 200, it's still a great competitor to the 9800 Pro on performance alone, plus the added feature set is a bonus. nV is definitely taking advantage of the complete lack of a midrange from ATi.
ViRGE - Tuesday, September 7, 2004 - link
Carfax, they have a standard testbed, so the numbers for the other NV cards come from previous benchmarks. If they upgraded the drivers, they'd have to re-run the benchmarks on everything else.
Cybercat - Tuesday, September 7, 2004 - link
Very nice performance. Best mainstream card in a LONG time.
Carfax - Tuesday, September 7, 2004 - link
WHY were two sets of drivers used?!? Why couldn't you just use the 65.76 drivers for both the 6600GT and the rest of the Nvidia cards?
ViRGE - Tuesday, September 7, 2004 - link
And continuing on #20's tangent, the 5950 Ultra beats the 6600GT by 200%? That definitely isn't right.
railer - Tuesday, September 7, 2004 - link
I don't think those Jedi Knight results are correct. The 9800XT beats the 9700 Pro by 300%? I think not...
Doormat - Tuesday, September 7, 2004 - link
"That isn't to say that they are less power hungry than an AGP card that requires external power, but that the PCI Express slot supplies enough voltage to the card that it doesn't need any more juice."Nitpick: the AGP and PCIE slots provides enough voltage, but the main restriction is current. Each spec is designed to deliver so many amps of current at the specified voltage. As cards get bigger and badder, they draw more current, and need the extra power hookups.
LoneWolf15 - Tuesday, September 7, 2004 - link
I'm puzzled as to why the 6600GT beats the 6800 (straight) so often. Doesn't the 6800 have a full 256-bit path?
neogodless - Tuesday, September 7, 2004 - link
Yes, fix "looses" on the final page!Crazy to see a Radeon 9700 Pro do so poorly... very surprised it's doing poorly compared to the 9600/X600. Is that right? Doesn't seem right to me...
Jalf - Tuesday, September 7, 2004 - link
#14: Yeah, but it looks kinda weird that half the charts only show those two cards, while the other half shows the full spectrum :)
Falloutboy - Tuesday, September 7, 2004 - link
Looks pretty good to me. Even at its worst it's still on par with a 9800XT, and in a lot of games it's besting the X800 Pro. Looks like a pretty good deal at 200 bucks; it'll probably go cheaper once it hits the stores in mass.
Saist - Tuesday, September 7, 2004 - link
ksherman: overclock a GF2 MX? I think you have that confused with a GF4-4200. I've never been able to successfully OC a GF2 MX.
mcveigh: Nvidia does have intentions to add DVI->component adapter support to Forceware. Good luck on it being stable, though.
Jalf: If you read the beginning of the article, you'll note that Anandtech was originally going to compare the 6600GT to the ATi Radeon X600 series because there was no "underpowered" X800-PE to compete with the 6600GT. All of those two-card charts were showing the PCIe 6600GT vs. the nearest (under-pricepoint) Radeon PCIe product.
Questar: Read my earlier note and stop trolling, please. It's rather obvious why the charts suddenly changed if you bother to read the words and not just the pretty pictures. Most of the article was comparing PCIe cards to AGP cards. Please, think before you troll.
Illissius: It's not really that odd. The GF6 tech is present in full, so the 6600GT does benefit from the better memory controller and other optimizations. However, as we notice, once we start enabling filtering, the card is easily decimated by the competition. I think I'll stick with my 9800 Pros for now.
mickyb: if memory serves correctly, the stock cooling fan was ~50 dB back in May. It was still a little obnoxious for a fan, but nowhere near as bad as its older brethren. Looking at the card Anandtech appeared to have, I'd guess the noise was probably in the 40-50 dB range. Doesn't look like Nvidia changed much.
DeathByDuke - Tuesday, September 7, 2004 - link
Argh, keep the charts consistent!
Also, it would make more sense to directly compare to the 9800 XT/Pro, as they are in the same price bracket, unlike the X600/9600. Meh, the same applies to all the other sites. I don't give a damn that the 9800 isn't on PCI Express; tests have shown, as with AGP 8x vs. AGP 4x, that there's no damn difference outside the margin of error. So an AGP 6600 should perform the same. At least we'd know then whether this £150 9800 Pro is worth it against a £150-£200 6600. I'm off to pray the X700 reviews don't pit it against a GeForce PCX 5300, because that's precisely what comparing a 6600 to a 9600/X600 is.
ksherman - Tuesday, September 7, 2004 - link
Yeah, I see it now... not sure how I missed that :D. It certainly sounds like a kickin' card! One thing I was disappointed about in the article was that you didn't try to overclock the card... That was one of the things that made the GeForce2 MX a great buy. I'm not sure if that still carries over to current-gen cards, but it would be interesting to see how well it OCs.
mcveigh - Tuesday, September 7, 2004 - link
Read the article. In short: not yet, but they are expected to.
ksherman - Tuesday, September 7, 2004 - link
Do they make a non-PCIe version? I really don't want to spend the money to convert to Intel.
mcveigh - Tuesday, September 7, 2004 - link
Anyone know if it can take a DVI-to-component adapter? I heard a rumor the 6600 series would be able to do this like the Radeons can.
FuryVII - Tuesday, September 7, 2004 - link
Yea, "nobody 'looses'".mickyb - Tuesday, September 7, 2004 - link
How loud is this card? I need something quieter than what I have. I built an SFF system for my stereo rack, and it looks like this card may be the ticket.
Jalf - Tuesday, September 7, 2004 - link
Yeah, I wondered about that too. Why did some of the charts only show two cards? I wouldn't call it a piece of shit article, and the card does look like really great value, but I did wonder about that. :)
Still, I'd call it a good article, and a good card.
Questar - Tuesday, September 7, 2004 - link
OMFG, I can't believe what has happened to this place.
Can we please at least have the charts consistent from one page to another? Let's see: on this page I'll make a chart with a 6800U and an X600, then on this page I'll throw in 10 other cards, and on the next page I'll take two out!
What a piece of shit article.
Illissius - Tuesday, September 7, 2004 - link
Didn't your card have 256MB of memory, by any chance? It's very, very odd how it pulls away from the 9800XT and 6800 at higher resolutions in some games, when by all logic, if there's any change at all compared to lower resolutions, the opposite should be happening...
TheAudit - Tuesday, September 7, 2004 - link
I'll take it.tfranzese - Tuesday, September 7, 2004 - link
Looking VERY nice. I can't wait to see SLI comparisons once the end of the year comes. Should be an interesting analysis.
coldpower27 - Tuesday, September 7, 2004 - link
GeForce 6600 GT, ah, such advanced technology :)