yacoub - Thursday, September 15, 2005 - link
I can't WAIT 'til these are available ... so I can pick up an X800 XL dirt cheap! :D
IntelUser2000 - Thursday, September 15, 2005 - link
Is it too much to ask for the editors to do a test of power consumption for at least mobile video cards?? If not desktop ones.
quanta - Wednesday, September 14, 2005 - link
According to Digitimes [1], the XT, XL, and LE have 32, 24, and 16 pipelines respectively. Maybe ATI is using the GeForce FX style of arranging pipelines, or splitting the fragment shader from the raster op engine?
[1] http://www.digitimes.com/news/a20050914A7037.html
JarredWalton - Wednesday, September 14, 2005 - link
All we can say is that AnandTech has a source of information that is quite convincing, and it states 16 pipelines for the X1800 XT/XL and 12 for the Pro. The Inquirer, Digitimes, etc. can report whatever they want, but we do not have their same sources, and our sources disagree with their sources right now. Time will tell who is correct. One of the goals of AnandTech is to do our best to report truth rather than rumor - or if we report rumor, we make it clear that it is speculation and not fact.
KristopherKubicki - Wednesday, September 14, 2005 - link
On the contrary - we have all the sources those guys do, and then some, of course.
The difference between AnandTech and other websites is that we have a strict policy: we only publish information that is internally documented, and we have a copy of the document. We don't publish information without a roadmap/memo/etc.
Kristopher
SimonNZ - Thursday, September 15, 2005 - link
As I said before, with the last generation of cards, in an apples-to-apples comparison (e.g. X850 XT PE vs. 6800 Ultra), ATI was simply better at running Half-Life 2, and the same went for NVIDIA in Doom 3. So it's not always about what power you are putting out; it's about how efficiently you use what you've got. Until benchmarks are out we can speculate all we want :) My money is on an even split across different benchmarks and games.
Stas - Wednesday, September 14, 2005 - link
They better have more than 16. Otherwise it's a complete failure and no high frequencies will save them. I'm really looking forward to the new cards. And I hope they will continue the tradition and KICK nVIDIA's ASS! :)
mistersnail - Wednesday, September 14, 2005 - link
Although I agree that it would be nice to see a continuous trend of general whoopage on ATI's part, I have to say that pipelines and even clockspeed have little effect on how performance turns out. It all depends on the design of the pixel and vertex pipes. A good example of this is the FX5800 (Blow Dryer Edition). The 9700 Pro had 8 pixel pipes, 4 vertex pipes, and a 256-bit DDR memory bus, running at 325MHz/620MHz. The 5800 Ultra had 4x2 pixel pipes, 3 vertex pipes, and a 128-bit DDR2 bus, which ran at 500MHz/1GHz. The 9700 Pro blew the FX5800 completely out of the water just because it had a better design. Even though the clocks were high, pure ingenuity came out on top. The same will (or won't, it's yet to be determined) happen here.
Just because the R520 may be a 16-pipe design, it doesn't mean that it'll be inferior to a 24-pipe design. Personally, I think the R520 will have more vertex processing power due to the high clockspeed. Assuming they use 8 vertex pipes, the R520's 8x600MHz is a lot better than the G70's 8x430MHz. Then again, maybe the R520's vertex pipes will be weaker than the G70's and it'll all balance out. Who knows? Anand does...
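To put rough numbers on that 9700 Pro vs. FX5800 example, here is a minimal back-of-the-envelope sketch, using only the bus widths and clocks quoted in the comment above (treat them as the comment's figures, not verified specs):

```python
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes x effective clock."""
    return (bus_bits / 8) * effective_mhz / 1000.0

def fill_rate_mpixels(pixel_pipes: int, core_mhz: float) -> float:
    """Peak pixel fill rate in Mpixels/s: pipes x core clock."""
    return pixel_pipes * core_mhz

# Radeon 9700 Pro: 8 pipes @ 325MHz core, 256-bit DDR @ 620MHz effective
print(fill_rate_mpixels(8, 325))     # 2600.0 Mpixels/s
print(mem_bandwidth_gbs(256, 620))   # ~19.8 GB/s

# GeForce FX5800 Ultra: 4 pipes @ 500MHz core, 128-bit DDR2 @ 1GHz effective
print(fill_rate_mpixels(4, 500))     # 2000.0 Mpixels/s
print(mem_bandwidth_gbs(128, 1000))  # 16.0 GB/s
```

Despite the FX5800 Ultra's much higher clocks, the wider bus and extra pipes leave the 9700 Pro ahead on both raw metrics, which is the comment's point: design matters more than frequency.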
SimonNZ - Wednesday, September 14, 2005 - link
I guess it's also worth taking into consideration which next-gen games are optimized to run on ATI or NVIDIA architecture... just like Half-Life 2 and Doom 3. Either way, this doesn't look like the monster of a card I was hoping to see.
tayhimself - Wednesday, September 14, 2005 - link
X800 X800GT X800GTO X800XL X800 X800XL X800XT X800XTPE X800XT X850XT PE X850
Now only 3 X1800?? What a shame.
raz3000 - Wednesday, September 14, 2005 - link
Actually, you missed a few:
X800
X800GT
X800GTO
X800PRO
X800SE
X800LE
X800PRO
X800XL
X800XT
X800XT
X850 PRO
X850XT
X850XTPE
and I think there's also an X850LE/SE somewhere (it's an OEM).
Griswold - Thursday, September 15, 2005 - link
You mentioned the X800XT twice. :P
mistersnail - Wednesday, September 14, 2005 - link
Well, see, it just goes to show how many cards there are, and most of them didn't come out until after the initial release. It also shows how confusing the lineup is, regardless of whether you're informed or Joe Six Pack.
Jep4444 - Wednesday, September 14, 2005 - link
I never heard of the X800LE.
Griswold - Thursday, September 15, 2005 - link
There is an X800LE; a friend of mine got one. Can't say I've seen many being sold, though.
mistersnail - Wednesday, September 14, 2005 - link
Well, you have to realize that at first, the GT, GTO, XL and X850 series didn't exist. So, at the release of the R420, there was the X800, Pro, XT and XT-PE. Cards like the LE, GT, XL, and X850 were released later to fill in gaps in the market. Give it some time ^^
Jep4444 - Wednesday, September 14, 2005 - link
The X800SE came out close to launch; the X800 came out at the same time as the XL.
jkostans - Wednesday, September 14, 2005 - link
I can't wait to see how these cards stack up to their equivalently priced NVIDIA parts. The most I've ever spent on a video card was $350 for a GeForce 2 Ultra 64MB DDR, and I doubt I'd ever go higher than that. I love having the fastest card available, but it seems like that's no longer a possibility for me. Makes me sad; maybe one day ATI and nVIDIA will stop being so greedy and price their cards more sanely. Or maybe they will give me a free one because I'm better than everyone else at everything...... (yeah that includes you a$$hole)
OvErHeAtInG - Wednesday, September 14, 2005 - link
LOL
Bottom of page 1: "R580 is essentially a clock ramp and pipe ramp of R520, but both of those details have not been disclosed yet (even to AIBs)." So no 24-pipe part? Disappointing....
mistersnail - Wednesday, September 14, 2005 - link
I don't think you understood. A pipe and clock ramp implies that there will be more pipes in the R580. It'll probably be a 24-pipe GPU. Roadmaps disclosed that the R520 would be a four-quad (4x4=16) GPU and the R580 would be a six-quad (6x4=24) GPU.
Does anyone know how many vertex pipes the R520 has? : ( I can't remember where, but I think I once saw it would have up to 10 vps. Is this true? You'd think 8 VPs working at 600MHz would blow the G70 out of the water in terms of vertex processing power. I need to do more research regarding GPUs. Does anyone have any good references?
yacoub - Thursday, September 15, 2005 - link
And me still waiting for my four-speed, dual-quad, positraction 409...
JarredWalton - Wednesday, September 14, 2005 - link
I asked Kris this exact question. The answer: we don't know. The roadmaps/PDFs don't say anything about vertex pipelines. However, consider this:
G70: 8 vertex @ 430 MHz / 4 = 860 MV/s
R520: 6 vertex @ 600 MHz / 4 = 900 MV/s
Basically, with the higher clock speed there's no point in having more than 6 vertex pipelines. With R580, if they move to 24 pixel pipelines, it would make more sense to go to 8 vertex pipelines. 32 pixel pipes would probably need 10 vertex pipes. Then there's the whole "unified architecture" that we're moving towards.
Anyway, the main point is that I have yet to see anything officially stating that R520 has 6, 8, 10, or whatever pipes. Everything is pretty much a guess, and Occam's Razor suggests that if 16/6 was good for last generation, and R520 has 16 pixel pipes, it probably has 6 vertex pipes again. :p
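Jarred's arithmetic generalizes to a one-line formula: peak vertex rate = vertex pipes x core clock / cycles-per-vertex. A minimal Python sketch, keeping his assumption of 4 cycles per vertex and treating the pipe counts as this thread's speculation rather than confirmed specs:

```python
def vertex_rate_mvs(pipes: int, core_mhz: float, cycles_per_vertex: int = 4) -> float:
    """Peak vertex throughput in millions of vertices per second (MV/s)."""
    return pipes * core_mhz / cycles_per_vertex

print(vertex_rate_mvs(8, 430))  # G70: 860.0 MV/s
print(vertex_rate_mvs(6, 600))  # R520 with the guessed 6 pipes: 900.0 MV/s
print(vertex_rate_mvs(8, 600))  # R520 if it had 8 pipes: 1200.0 MV/s
```

Even with only 6 pipes, the higher clock keeps the hypothetical R520 slightly ahead of the G70, which is why the extra pipes may not be needed.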
Questar - Wednesday, September 14, 2005 - link
What if the 16-pipe card performs like a 24-pipe 7800?
Is an eight-cylinder engine always better than a six?
jkostans - Wednesday, September 14, 2005 - link
Exactly, this is a new architecture we're talking about. Chances are it's a good deal more efficient than the last generation of cards.
nserra - Wednesday, September 14, 2005 - link
"First of all, ATI's traditional core design can do "more" per clock cycle (at least on the R3xx design) than NVIDIA."Ati 9700 with just 275Mhz core speed, and ONLY 270Mhz(540Mhz) memory speed killed any card, even the ones that worked at 500Mhz core and 500Mhz(1000Mhz) memory speed (nvidia 5800).
Put all these new cards at 275MHz (memory and core) if possible (underclock them) to see who does more work.
I don't think the phrase is correct for the R4xx design, since it has higher memory and core speeds than the NVIDIA 6xxx and 7xxx.
arturnow - Wednesday, September 14, 2005 - link
"First of all, ATI's traditional core design can do "more" per clock cycle" - I have to disagree. X850XT has higher fill rate and memory bandwidth then GeForce 7800GT but slower in most games...Griswold - Wednesday, September 14, 2005 - link
The X850XT also has 16 pipes vs. the 20 pipes of the 7800GT. And the GT is only faster in "most" games, not all, and then also not by that much. So, all in all, what they said sounds right.
arturnow - Wednesday, September 14, 2005 - link
Of course not. Who cares if it's 20 pipelines or 16 pipelines? It all depends on the core clock and the pipelines. The X850XT PE is 540/1180 MHz, the 7800GT 400/1000 MHz; theoretically the Radeon is faster, but in games the GeForce prevails :-]
Griswold - Wednesday, September 14, 2005 - link
I don't think you understand what you're talking about there. The X850XT PE is faster in a few games at certain resolutions with AA/AF cranked up. That is due to clock speed and weaker AA/AF modes. But in most games under normal conditions, the 7800GT is faster - due to the 4 more pipes...
Jep4444 - Wednesday, September 14, 2005 - link
The pipeline deficit is theoretically negated by the higher clock speed, so you are actually wrong.
The reason the X850XT PE is slower is because it's based on older technology and hasn't had any efficiency improvements made since the Radeon 9700 days, which is why GeForce 6800 cards operate at lower clock speeds for the same performance.
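For what it's worth, the "theoretically negated" claim checks out on paper. A quick sketch using the clocks quoted in this exchange (both cards assumed to have 256-bit memory buses):

```python
def fill_rate_mpixels(pipes: int, core_mhz: float) -> float:
    return pipes * core_mhz  # Mpixels/s

def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    return (bus_bits / 8) * effective_mhz / 1000.0  # GB/s

print(fill_rate_mpixels(16, 540))    # X850XT PE: 8640 Mpixels/s
print(fill_rate_mpixels(20, 400))    # 7800GT:    8000 Mpixels/s
print(mem_bandwidth_gbs(256, 1180))  # X850XT PE: ~37.8 GB/s
print(mem_bandwidth_gbs(256, 1000))  # 7800GT:    32.0 GB/s
```

On paper the Radeon leads on both metrics, which is exactly why the game results favoring the 7800GT point to per-pipe efficiency rather than raw throughput.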
Cybercat - Wednesday, September 14, 2005 - link
This is false. There are actually quite a few improvements and optimizations to the efficiency of the engine in the R420 over the original R300 architecture. The basic principle and design is still there, but you have to take into account other smaller features that make no small difference altogether.
Jep4444 - Wednesday, September 14, 2005 - link
Either way, since R300 and R420 are based on the same architecture, the level of improvement from R300 to R420 is likely to be smaller than that from R420 to R520. Also factor in the move to SM3.0 (and consequently the move from FP24 to FP16/32) and we'll definitely see changes in performance at the pipeline level.
arturnow - Wednesday, September 14, 2005 - link
No, you're trying to tell me that a 32-pipeline card will always be better than a 16-pipeline one, and it's not true :] By that logic, making the R520 16-pipeline would be suicide :]
KHysiek - Wednesday, September 14, 2005 - link
Before game devs bump up game requirements to match these absurdly priced cards.
wien - Wednesday, September 14, 2005 - link
Any game dev that did that would be a complete idiot. The graphics high end accounts for a small fraction of the market. Why would they make a game only for that small fraction, when they can lower the requirements and sell millions upon millions?
imaheadcase - Wednesday, September 14, 2005 - link
That sums it up. It used to be that 6 months after a top-of-the-line card was released, you could get it for $200. Now if you look at it, you will see you have to wait OVER a year to get that same price vs. performance.
Seems like ATI and NVIDIA are going backwards, imo... who cares about performance when it's adding 30%+ onto the cost of building a PC? The whole idea about hardware is that even if performance doubles, price should not.
A case in point: I run a 9700 Pro, which used to be the best card. When it first came out it was $350; I got mine for $120 about 6 months after launch. When your best card now is topping out around $500-600, and 6 MONTHS after release you're seeing less than a $20 drop, then something is wrong.
Just my 2 cents
yacoub - Thursday, September 15, 2005 - link
Totally agree. Prices are ridiculous.
AnnoyedGrunt - Wednesday, September 14, 2005 - link
I have a hard time believing you got a 9700pro for $120 six months after launch, unless you bought it used.
I got a 9700pro in Jan 2004 for $180-190 (purchased thru newegg), and that was more than a year after launch if I recall correctly.
Aside from that minor quibble, I do agree with your overall point. It does seem as though the vendors are much slower in lowering the price of previous-gen "high-end" cards to bring them to the mid range. Instead, they release faster cards that are also much more expensive.
I will admit that I got my first, second, and third 3D cards (Voodoo 1, GF 1 SDR, GF3 Ti200) for about $200 each. The first two cards were definitely high end at the time, while the GF3 was second in line to the Ti500. It does seem somewhat odd that graphics cards are about the only computer component that has gotten higher and higher in price, instead of lower and lower like most other parts. On the other hand, I think the performance of the GPU has increased at a greater rate than that of the CPU over the last few years, so maybe the higher cost is somewhat justified (but it definitely is annoying).
I'm currently using an X800XL which I bought for $300 when they first arrived, and I don't plan on upgrading for at least another year, so I'll be interested in the next, next gen from NV and ATI.
-D'oh!
100proof - Wednesday, September 14, 2005 - link
I'm of the same opinion.. It used to be that the CPU was the most expensive part of a build from a buyer's standpoint, but over the past three years that has changed entirely.. Video card manufacturers have consistently increased prices to the point that it's not affordable for most people to purchase a current-generation card.. If ATi and NVIDIA think they can continue this present trend of pushing MSRPs to new levels, they'll eventually find they've priced themselves out of most consumers' budgets..
Sh0ckwave - Wednesday, September 14, 2005 - link
The fastest CPUs are still more expensive than the fastest graphics cards, though (4800+ X2 $880, FX-57 $1000).
imaheadcase - Wednesday, September 14, 2005 - link
But you don't need the fastest CPU out there to get the latest and greatest, unlike with a graphics card.
IKeelU - Wednesday, September 14, 2005 - link
Would you be more satisfied if they released $300 vid cards every 18 months? It would definitely cost less to own the latest-and-greatest.
I wouldn't, because having super-high-end video cards is a good thing, no matter how much they cost. This is what makes PCs such a great platform: if you want to pay more to get better graphics, you have the ability to do so. Game devs will always target the most popular platform, so you will never *need* a top-of-the-line card to get a good experience (HL2 kicked ass on my 3.5-year-old PC).
You no longer need a high-end CPU to enjoy games because, generally, user demand for game improvements in CPU-intensive functions has gone down (only with the popularity of realistic physics has demand gone up, but with dedicated physics cards on the way, it will go down once again, just like when GPUs took over transform and lighting).
yacoub - Thursday, September 15, 2005 - link
"Would you be more satisfied if they released $300 vid cards every 18 months? "As opposed to $600 cards every 6 months, thus costing twice the price and being outdated three times as quickly? Yes, I'd rather have a $300 purchase every 18 months, which is about as frequently as a person currently needs to upgrade a videocard anyway. The Radeon 9800 Pro 128mb card has been out around 24 months now, IIRC, and it is just now needing to be replaced by a 7800-series card to run very high resolutions smoothly in the latest games. So yeah, $300 every 18 months is about right.
xsilver - Wednesday, September 14, 2005 - link
What's wrong is people who are WILLING to pay $500-600 for a video card.
It's all about supply and demand.
Also, for as long as I can remember, high-end PCs have cost a little over $2K US. Over time, many things have come down in price, so expensive video cards just make it possible to keep the total system cost around the same mark.
tonyou - Tuesday, September 13, 2005 - link
The price for the 512MB X1800 XT looks like a steal if it can debut at the same price NVIDIA had for the 256MB 7800GTX. Damn, and I just bought a 7800GTX, hopefully I won't regret it!
Cybercat - Tuesday, September 13, 2005 - link
"First of all, ATI's traditional core design can do "more" per clock cycle (at least on the R420 design) than NVIDIA."refering to what? obviously not shader ops...
jonny13 - Tuesday, September 13, 2005 - link
Being that the GTX has been out for so much longer and the prices have dropped since release, the GTX retails for about the same price as the X1800 PRO. That might be a tough sell for ATI unless these cards are monsters at all price points. Either way, it should be interesting to see how the new cards perform, as they look competitive on paper.
Pete84 - Tuesday, September 13, 2005 - link
Heck, ATi has a card for EVERY price point!!! When are the mid- and low-range GeForce 7's going to come out?
coldpower27 - Tuesday, September 13, 2005 - link
When NVIDIA is ready to debut products on their 90nm process. The 6800 GT and 6800 will do in the meantime. They are basically feature complete; speed-wise there may be a problem.
xsilver - Wednesday, September 14, 2005 - link
I don't see how the 6800GT can be a "problem" -- it kicks! The only reason the X800XL is competitive right now is price; when these ATI cards come out, NVIDIA will surely revise their whole pricing scheme.
Looking at the speculation right now, the ATI cards may only perform marginally better than the NVIDIA counterparts - not quite the revolutionary "kick ass" chip everyone's been expecting.
Griswold - Wednesday, September 14, 2005 - link
Yeah, but what if they've got a 24 or 32 pipe version up their sleeve? I planned to buy a 7800GT for my new box this year, but after finding out how bad the visual quality (texture flickering) is compared to older GeForce cards and the current ATI cards, I'm not so sure what I should do right now. I'm even tempted to get a very cheap card from either the current ATI line or the 6xxx series from NV and replace it later with a 7800GT (if NV can fix their visual problem with drivers) or the next ATI line - if it's superior.
At any rate, it will be a step-up from my trusty 9700pro. :)
patrick0 - Wednesday, September 14, 2005 - link
Texture flickering has been fixed with the new driver release.
Griswold - Thursday, September 15, 2005 - link
Really? Would be good news.
nserra - Wednesday, September 14, 2005 - link
What, more NVIDIA optimizations? Never heard that before....
Griswold - Wednesday, September 14, 2005 - link
Check out this article: http://tinyurl.com/9kwzn
nserra - Wednesday, September 14, 2005 - link
I didn't know that....
But why don't Tom's, Anand, Xbit, etc. say anything about it?
But what is NVIDIA trying to achieve? SiS Xabre image quality levels?
I don't understand - the AF performance hit is much lower than AA's. Why remove quality from it, if AF is much more important (with less of a performance cost than AA) for achieving higher image quality levels?
Why did NVIDIA disable the "old" AF of the GeForce 3/4 (and FX)? Is it impossible to support both at the hardware and driver level?
Griswold - Thursday, September 15, 2005 - link
It seems that FPS is what sells hardware these days.. ATI is no exception, though their image quality was not as low as the NVIDIA counterparts'. This episode taught me a lesson though. I value image quality very highly, especially when I put down several hundred bucks for a single piece of hardware. I will check and double-check any vid card from either company before I even consider upgrading in the future.
Somebody mentioned that the newest drivers fixed the texture flickering; gonna have to check that out somehow before I order the 7800GT I planned to buy.
Slappi - Tuesday, September 13, 2005 - link
So for $50 more you get 1250MHz memory vs. 1000MHz memory, a 550MHz core vs. a 500MHz core, AND 256MB more memory?!?
I smell BS.
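Taking those quoted numbers at face value (and assuming a 256-bit bus and 16 pipes for both parts - this thread's guesses, not confirmed specs), the $50 buys a sizable theoretical bump:

```python
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    return (bus_bits / 8) * effective_mhz / 1000.0  # GB/s

print(mem_bandwidth_gbs(256, 1250))  # 40.0 GB/s for the pricier part
print(mem_bandwidth_gbs(256, 1000))  # 32.0 GB/s -> 25% more bandwidth
print(16 * 550, 16 * 500)            # 8800 vs. 8000 Mpixels/s, +10% fill rate
```

On paper that is a lot of extra card for $50, hence the skepticism about the MSRPs.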
KristopherKubicki - Tuesday, September 13, 2005 - link
MSRPs have a difficult time translating into retail.
Kristopher
knitecrow - Tuesday, September 13, 2005 - link
When does the NDA expire?
Chadder007 - Tuesday, September 13, 2005 - link
Looks like the ATI parts are all priced just WAY too high. I guess I'll probably be getting last gen.
Ozz1113 - Tuesday, September 13, 2005 - link
~50 watts, that's pretty amazing. Should be quiet and cool too.
smn198 - Wednesday, September 14, 2005 - link
Why are the 60W top end cards going to be dual slot?Leper Messiah - Wednesday, September 14, 2005 - link
Hm. I don't think the X1800's TDP is only 60 watts; IIRC it's the X1600s that max out at a 60-watt envelope.
Looks interesting, but WE WANT BENCHIES!
pxc - Tuesday, September 13, 2005 - link
When the NDA expires.
It will be interesting to see how the new X1800 series performs.