NVIDIA GeForce GTX 670 Review Feat. EVGA: Bringing GK104 Down To $400
by Ryan Smith on May 10, 2012 9:00 AM EST
In a typical high-end GPU launch we’ll see the process take place in phases over a couple of months if not longer. The new GPU will be launched in the form of one or two single-GPU cards, with additional cards coming to market in the following months and culminating in the launch of a dual-GPU behemoth. This is the typical process as it allows manufacturers and board partners time to increase production, stockpile chips, and work on custom designs.
But this year things aren’t so typical. GK104 wasn’t the typical high-end GPU from NVIDIA, and neither it seems is there anything typical about its launch.
NVIDIA has not been wasting any time in getting their complete GK104 based product lineup out the door. Just 6 weeks after the launch of the GeForce GTX 680, NVIDIA launched the GeForce GTX 690, their dual-GK104 monster. Now only a week after that NVIDIA is at it again, launching the GK104 based GeForce GTX 670 this morning.
Like its predecessors, GTX 670 will fill in the obligatory role as a cheaper, slower, and less power-hungry version of NVIDIA’s leading video card. This is a process that allows NVIDIA to not only put otherwise underperforming GPUs to use, but to satisfy buyers at lower price points at the same time. Throughout this entire process the trick to successfully launching any second-tier card is to try to balance performance, prices, and yields, and as we’ll see NVIDIA has managed to turn all of the knobs just right to launch a very strong product.
| | GTX 680 | GTX 670 | GTX 580 | GTX 570 |
|---|---|---|---|---|
| Memory Clock | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 4.008GHz GDDR5 | 3.8GHz GDDR5 |
| Memory Bus Width | 256-bit | 256-bit | 384-bit | 320-bit |
| FP64 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 | 1/8 FP32 |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 40nm | TSMC 40nm |
Like GeForce GTX 680, GeForce GTX 670 is based on NVIDIA’s GK104 GPU. So we’re looking at the same Kepler design and the same Kepler features, just at a lower level of performance. As always the difference is that since this is a second-tier card, NVIDIA is achieving that by harvesting otherwise defective GPUs.
In a very unusual move for NVIDIA, for GTX 670 they’re disabling one of the eight SMXes on GK104 and lowering the core clock a bit, and that’s it. GTX 670 will ship with 7 active SMXes, all 32 of GK104’s ROPs, and all 4 GDDR5 memory controllers. Typically we’d see NVIDIA hit every aspect of the GPU at once in order to create a larger performance gap and to maximize the number of GPUs they can harvest – such as with the GTX 570 and its 15 SMs & 40 ROPs – but not in this case.
Meanwhile clockspeeds turn out to be equally interesting. Officially, both the base clock and the boost clock are a fair bit lower than on GTX 680: GTX 670 will ship at a 915MHz base clock and a 980MHz boost clock, which are 91MHz (9%) and 78MHz (7%) lower than GTX 680 respectively. However, as we've seen with GTX 680, GK104 spends most of its time boosting, and not necessarily just at the official boost clock. Taken altogether, depending on the game and the specific GPU, GTX 670 has the capability to boost to within 40MHz or so of GTX 680, putting it within about 3.5% of the clockspeed of its more powerful sibling.
As for the memory subsystem, like the ROPs it has not been touched at all. GTX 670 will ship at the same 6.008GHz memory clockspeed as GTX 680 on the same 256-bit memory bus, giving it the same 192GB/sec of memory bandwidth. This is particularly interesting, as in the past NVIDIA has always turned down memory clocks on second-tier cards, and typically disabled a memory controller/ROP combination as well. Given that GK104 is an xx4 GPU rather than a full successor to GF110 and its 48 ROPs, it would seem that NVIDIA is concerned about their ROP and memory performance and is not willing to sacrifice it on GTX 670.
Taken altogether, this means that at base clocks GTX 670 has 100% of the memory bandwidth, 91% of the ROP performance, and 80% of the shader performance of GTX 680. This puts GTX 670's specs notably closer to GTX 680 than GTX 570 was to GTX 580, or GTX 470 to GTX 480 before it. In other words, GTX 670 won't trail GTX 680 by as much as GTX 570 trailed GTX 580 – or conversely, GTX 680 won't have quite the same lead as GTX 580 did.
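As a quick sanity check, those percentages fall straight out of the published unit counts and clocks (a minimal sketch; per-unit throughput is assumed to scale linearly with clockspeed, which ignores GPU Boost, and GTX 680's 1006MHz base clock is derived from the 91MHz delta stated above):

```python
# Back-of-the-envelope check of the GTX 670 vs. GTX 680 ratios quoted above.
gtx680 = {"smx": 8, "rops": 32, "base_mhz": 1006, "mem_ghz": 6.008, "bus_bits": 256}
gtx670 = {"smx": 7, "rops": 32, "base_mhz": 915,  "mem_ghz": 6.008, "bus_bits": 256}

def mem_bandwidth(card):
    # effective GDDR5 data rate (GHz) x bus width (bits) / 8 bits per byte = GB/sec
    return card["mem_ghz"] * card["bus_bits"] / 8

mem_ratio = mem_bandwidth(gtx670) / mem_bandwidth(gtx680)
rop_ratio = (gtx670["rops"] * gtx670["base_mhz"]) / (gtx680["rops"] * gtx680["base_mhz"])
shader_ratio = (gtx670["smx"] * gtx670["base_mhz"]) / (gtx680["smx"] * gtx680["base_mhz"])

print(f"{mem_bandwidth(gtx670):.0f}GB/sec; mem {mem_ratio:.0%}, ROPs {rop_ratio:.0%}, shaders {shader_ratio:.0%}")
# -> 192GB/sec; mem 100%, ROPs 91%, shaders 80%
```

With one SMX disabled the shader deficit (7/8 of the units at 91% of the clock) is the only sizable cut, which is why GTX 670 lands so close to GTX 680.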
As for power consumption, the gap between the two is going to be about the same as we saw between GTX 580 and GTX 570. The official TDP of the GTX 670 is 170W, 25W lower than the GTX 680's. Unofficially, NVIDIA's GPU Boost power target for GTX 670 is 141W, 29W lower than the GTX 680's. Thus, like the GTX 680, the GTX 670 has the lowest TDP for a part of its class that we've seen out of NVIDIA in quite some time.
Moving on, unlike the GTX 680 launch NVIDIA is letting their partners customize right off the bat. GTX 670 will launch with a mix of reference, semi-custom, and fully custom designs with a range of coolers, clockspeeds, and prices. There are a number of cards to cover over the coming weeks, but today we’ll be looking at EVGA’s GeForce GTX 670 Superclocked alongside our reference GTX 670.
As we’ve typically seen in the past, custom cards tend to appear when GPU manufacturers and their board partners feel more comfortable about GPU availability and this launch is no different. The GTX 670 launch is being helped by the fact that NVIDIA has had an additional 7 weeks to collect suitable GPUs compared to the GTX 680 launch, on top of the fact that these are harvested GPUs. With that said NVIDIA is still in the same situation they were in last week with the launch of the GTX 690: they already can’t keep GK104 in stock.
Due to binning GTX 670 isn’t drawn from GTX 680 inventory, so it’s not a matter of these parts coming out of the same pool, but realistically we don’t expect NVIDIA to be able to keep GTX 670 in stock any better than they can GTX 680. The best case scenario is that GTX 680 supplies improve as some demand shifts down to the GTX 670. In other words Auto-Notify is going to continue to be the best way to get a GTX 600 series card.
Finally, let's talk pricing. If you were expecting GTX 570 pricing for GTX 670, you're going to come away disappointed. Because NVIDIA has designed GTX 670 to perform closer to GTX 680 than past second-tier cards did to their flagships, they're also setting the price higher. GTX 670 will have an MSRP of $399 ($50 higher than GTX 570 at launch), with custom cards going for higher yet. This should dampen demand some, but we don't expect it will be enough.
Given its $399 MSRP, the GTX 670 will primarily be competing with the $399 Radeon HD 7950. However from a performance perspective the $479 7970 will also be close competition depending on the game at hand. AMD's Three For Free promo has finally gone live, so they're countering NVIDIA in part by bundling Deus Ex, Nexuiz, and DiRT Showdown with most 7900 series cards.
Below that we have AMD’s Radeon HD 7870 at $350, while the GTX 570 will be NVIDIA’s next card down at around $299. The fact that NVIDIA is even bothering to mention the GTX 570 is an interesting move, since it means they expect it to remain as part of their product stack for some time yet.
Update 5/11: NVIDIA said GTX 670 supply would be better than GTX 680 and it looks like they were right. As of this writing Newegg still has 5 of 7 models in stock, which is far better than the GTX 680 and GTX 690 launches. We're glad to see that NVIDIA is finally able to keep a GTX 600 series card in stock, particularly a higher volume part like GTX 670.
Spring 2012 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| | $999 | GeForce GTX 690 |
| | $499 | GeForce GTX 680 |
| Radeon HD 7970 | $479 | |
| Radeon HD 7950 | $399 | GeForce GTX 670 |
| Radeon HD 7870 | $349 | |
| | $299 | GeForce GTX 570 |
| Radeon HD 7850 | $249 | |
| | $199 | GeForce GTX 560 Ti |
| | $169 | GeForce GTX 560 |
| Radeon HD 7770 | $139 | |
Comments
Spunjji - Friday, May 11, 2012
Thank you for being a voice of sanity in an otherwise brutally argumentative and deeply sad comments section. +1 to you.
CeriseCogburn - Friday, May 11, 2012
"Voice of sanity" your amd fanboy friend, got the 43% larger amd wafer die size "cost drop not a problem" a bit overlooked, not to mention the 3 added free games additional cost.
Is it a voice of reason to claim the largest base cost of the card at 43% greater is no problem since the "dies" are "about the same size" ?
ROFL... tsk tsk.
SlyNine - Saturday, May 12, 2012
It's 43% greater now.
Also you seem to forget the 7970 wins in 5 out of 10 of AnandTech's benchmarks.
Since you're going to argue this with me I'll put it out right now.
Dirt 3 (on the MOST intensive test, 5760x1200 min frames): tied, but to me min frames are more important, so I'd rather have AMD in that situation.
Now you can argue, but NVidia wins at the other resolutions. But since this is the ONLY time it even gets below 60, this is the ONLY test where it really makes a difference.
Shogun 2: AMD, big time. Yeah, nvidia wins when FPS is over 100, but AMD wins by a lot when FPS is at a premium. With a driver fix I'm sure it will be a lot closer.
Batman: basically a tie. Yeah, Nvidia takes it, but c'mon, 1 fps when it matters most. My guess is if they added 4x AA to the 3-screen mode AMD would take it.
Portal 2: Nvidia kills AMD, esp. at the high res, because that's where fps are low enough that the diff matters.
Battlefield 3: Nvidia kills AMD again, and again when FPS matters.
SC2: Nvidia is faster. FPS is so high it doesn't matter, but AMD is catching up fast, and at the 5760 res I wonder if AMD wouldn't win, and by then FPS might actually matter.
Skyrim: same as SC2. AMD is catching up fast at the higher resolution; if it keeps going, AMD might come out ahead where FPS is low enough that the difference matters.
Civ 5: tie. With the trend, Nvidia might be better at higher res here.
Portal 2 and BF3 are the two situations in Anand's testing suite where Nvidia is MUCH better.
But other than that, FPS either doesn't matter or AMD is winning where FPS is low enough that the difference matters.
As far as future games, we have NO idea which card might be better, but AMD does seem to have more raw power, and has more RAM.
In compute AMD won 2, loses one by like 7%, and then actually loses one by a lot. Of course the 7970 doesn't have a CUDA score, so how do you count that as a loss? That's stupid.
CeriseCogburn - Sunday, May 13, 2012
You make excuses across the board for the amd card, and nVidia's card is a smoother experience anyway if you want to glom onto min frames - and we haven't even used things like adaptive v-sync (better min frame rates for nVidia), nor did you figure in the enormous driver difference.
It's just such a huge gap when everything is considered that it's beyond ridiculous to go for the amd card, as this amd-favoring reviewer even admits.
Have your favorite brand, but you've got to stretch and spin to justify it.
Galidou - Sunday, May 13, 2012
I could just stop answering you if it weren't for the fact that you're being so disrespectful. I have a little problem with people lacking respect; I have to let them know they are, even more so when they don't realize they are being disrespectful...
CeriseCogburn - Sunday, May 13, 2012
You have never answered anything, late-comer troll Galidou; you're a pure 100% trolling personal attacker right now, fella, in all your posts so far. You have said absolutely nothing, so it is clear you should never have posted.
Galidou - Monday, May 14, 2012
I'm not attacking your person, just the way you throw your arguments at people, calling them names like they are pure ignorant worthless living zombies... it just feels that way...
Gastec - Tuesday, November 13, 2012
"You make excuses across the board for the amd card, and nVidia's card is a smoother experience[..] - and we haven't even used things like..."
We? WE?? YOU are from NVIDIA???? And you post here and admit it? I think you can get fired for doing this.
Or maybe you are not from nVidia but because you use a nVidia card you, for some very disturbing reason, feel like you are part of the company?
CeriseCogburn - Friday, May 11, 2012
Okay, first of all substantial competition is the GTX 590 and the 6990, which both still beat the overpriced amd card that lost 3 and only won 2 compute benchmarks, in that 2.5+ month evil amd price scalping period before the massive smack down the nVidia 680 delivered.
Now nVidia made a third move, the 670, not the initial move as you spoke about it, and this third move is another massive smackdown on the already smacked down by the 680 failing and utterly depleted value 7970 has to endure.
Nice try pretending the 1st smack down just occurred, but once again, what else to expect from an amd fanboy, and also clearly why another amd fanboy immediately thanked you a perfectly leveled headed post. LOL
Now onto your other ridiculous spew, based on facts not twisted perceptions.
You note the die sizes of the competing products, and conclude at first glance there is no reason amd cannot drop its price (again) - you avoid the again, twice - (once for $$$ (yes, 3 figures), twice for 3 games added, now a 3rd time coming) - but whatever, let's take your nonchalant die size info and do the little math amd fanboys now desperately want to avoid.
300mm sq. nVidia vs 365mm sq Amd - doesn't look so bad does it ?
Unfortunately, the Amd die is well over 40% LARGER :)
Sorry about that amd fanboy brainfart .... you forgot to multiply for AREA, hence size/cost of the wafer....
300x300 vs 365x365
90,000 nVidia wafer area vs 133,225 huge 43% + more amd wafer cost.
So let's get this straight - do you still not really see a problem ?
yankeeDDL > " AMD will need to drop the prices and I see really no reason why they couldn't, as they have just a marginally larger die size (300mm2 vs 365mm2) on the same fab/technology. "
So 43% plus more base cost, no problem going a hundred bucks + games costs less... ?
R O F L
Thank you, as the amd fanboy said, for being such a clear thinking person with a calm and fair mind... (rolls eyes)
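[For the record, the figures being argued over are already areas: the "mm2" in yankeeDDL's quote denotes square millimeters, so the relative silicon area is a simple quotient rather than the result of squaring the numbers again. A quick check, using the die sizes as quoted in the thread:]

```python
# Die sizes quoted in the thread are areas in mm^2, not side lengths in mm,
# so comparing them needs no further squaring.
gk104_mm2 = 300   # NVIDIA GK104, as quoted
tahiti_mm2 = 365  # AMD Tahiti, as quoted

area_ratio = tahiti_mm2 / gk104_mm2
print(f"Tahiti is {(area_ratio - 1):.0%} larger than GK104 by area")
# -> Tahiti is 22% larger than GK104 by area
```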
SlyNine - Saturday, May 12, 2012
It's hard to say the 680 was a smack down when most people couldn't even get the card.
You seem to be angered by the 7970. I agree that it was not a good deal. But it's not a bad card either.