nForce4 SLI Roundup: Painful and Rewarding
by Wesley Fink on February 28, 2005 7:00 AM EST - Posted in Motherboards
The Roundup
The four motherboards in this roundup represent all of the motherboards currently on the market that support SLI. This includes the three boards from the largest tier 1 manufacturers: Asus, Gigabyte, and MSI. The surprise in this group is the inclusion of the very early DFI nForce4 SLI and Ultra boards from a tier 2 manufacturer. Why is this a surprise? Tier 1 manufacturers generally get favored treatment from chipset makers because of the sheer volume of board sales that they represent. This means that they get chips first, reference designs first, and lots of manufacturer design help. In this launch in particular, nVidia made it clear that none of the tier 2 manufacturers would get SLI chips or design assistance until after the tier 1 launch of SLI motherboards.

DFI basically went its own way in the design of its SLI and Ultra motherboards so that it could get to market when the tier 1 SLI boards arrived. This is why the DFI is unique in design and unique in the ability to mod an Ultra to SLI, as we discussed in Morphing nForce4 Ultra into nForce4 SLI. You can also find a launch review of the DFI Ultra and SLI chipsets in DFI nForce4: SLI and Ultra for Mad Overclockers.
Other motherboards in this roundup received extensive pre-production coverage. Anand covered a prototype of the MSI K8N Neo4 SLI in Taiwan, the Asus pre-production board served as the reference board for the nVidia SLI launch, and the Gigabyte SLI received a First Look in late November.
Since those early reviews and announcements, we have been looking for an opportunity to compare actual production models of all of these motherboards using consistent test procedures and benchmarks. The buzz surrounding these SLI boards, the huge sales that they are generating, and the fact that single-video-card performance is the same on nForce4 SLI and nForce4 Ultra made the SLI roundup the perfect vehicle for launching our new motherboard test suite.
All testing for the SLI roundup was performed from scratch for each of the boards. We felt that the production models deserved a fresh look to help you better decide which SLI board, if any, to choose for your next system.
Comments
Rike - Monday, February 28, 2005
And then he double posts and can't spell "graphs." (not "graphes"!) *bangs head on wall* Oh well. To err is human. Happy Monday! :)

Wesley Fink - Monday, February 28, 2005
#33 - That is correct, but to implement PCIe Ethernet, the manufacturer must use a PHY gigabit Ethernet controller. In fact, as I state in the review, all 4 SLI boards implement PCIe on Gigabit #1, and all 4 boards have dual gigabit Ethernet. Most implement PCI on Gigabit #2, with the results you can see in the Ethernet performance charts.

#30 - It IS a significant point, and I thought we were clear that the 3132 is PCIe. I will add that to the chart to remove any confusion. However, there is another side to MSI using PCIe for all the on-board features: with both PCIe LANs and the PCIe SATA2 add-on, there is nothing left from the 20 available lanes for PCIe expansion slots.
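For readers keeping score, a back-of-the-envelope lane budget makes the squeeze clear. This is only a sketch using common nForce4 SLI figures; the per-device lane counts are our assumptions, not MSI's published block diagram:

```python
# Back-of-the-envelope PCIe lane budget for an nForce4 SLI board.
# Per-device lane counts below are illustrative assumptions, not
# MSI's published block diagram.
TOTAL_LANES = 20        # nForce4 SLI exposes 20 PCIe lanes
GRAPHICS_LANES = 16     # one x16 slot, or two x8 slots in SLI mode

onboard = {
    "PCIe gigabit LAN #1": 1,
    "PCIe gigabit LAN #2": 1,
    "Si3132 SATA2 controller": 1,
}

remaining = TOTAL_LANES - GRAPHICS_LANES - sum(onboard.values())
print(f"Lanes left over for x1/x4 expansion slots: {remaining}")  # -> 1
```

The raw count leaves roughly one lane over, but the chipset can only expose a limited number of independent PCIe links, which would explain why nothing usable remains for expansion slots.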
Rike - Monday, February 28, 2005
Minor typo: HL2 resolutions on the graphes are listed as 16,000 x 1200 instead of 1600 x 1200. Either that or you're using a seriously wide screen! ;)

mechBgon - Monday, February 28, 2005
Wes, an academic point: unless something's changed with nVidia southbridges, the nForce3/4 gigabit Ethernet controller isn't a PCI-based device; it's native to the southbridge and rides the HyperTransport bus. If you're getting >900 Mbit/sec in your test, it's pretty obvious it's not on a 32-bit PCI bus ;)
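The numbers back this observation up. A quick sketch using nominal bus figures (theoretical peaks only; real-world PCI efficiency is considerably lower):

```python
# Why sustained >900 Mbit/s rules out 32-bit/33 MHz PCI: the shared
# bus's theoretical peak barely clears gigabit wire speed, and
# arbitration/protocol overhead eats well into it in practice.
BUS_WIDTH_BITS = 32
BUS_CLOCK_HZ = 33_000_000

pci_peak = BUS_WIDTH_BITS * BUS_CLOCK_HZ   # 1,056,000,000 bits/s
gbe_wire_speed = 1_000_000_000             # gigabit Ethernet line rate

print(f"PCI theoretical peak: {pci_peak / 1e6:.0f} Mbit/s")  # 1056
print(f"Headroom over GbE wire speed: "
      f"{(pci_peak - gbe_wire_speed) / 1e6:.0f} Mbit/s")      # 56
```

With only about 5% theoretical headroom, shared with every other device on the bus, a controller sustaining over 900 Mbit/s almost certainly sits on a faster interconnect such as HyperTransport.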
AlanStephens - Monday, February 28, 2005

#28 - I know for a fact that Creative doesn't support Dolby Digital encoding. I wish they did, though. Here is a quote from Creative's Knowledge Base on this: "Computer games written with support for 3D audio do not require a Dolby Digital Interactive Content Encoder (DICE) to output multichannel sound, with no exceptions. Sound devices that support the real-time encoder technology from Dolby will simply receive the multichannel wave file output and encode it in real time to a somewhat modified Dolby Digital bitstream. Creative does not support the Dolby Digital Interactive Content Encoder on any of its sound cards.
The only difference between a Sound Blaster card and an audio card that has a real-time encoder is that you can make a one-wire, digital connection from your audio card to your home theater receiver and enjoy discrete multichannel sound from the game. However, there will be a continuous, slight delay, known as "latency", as the encoder creates and transmits the bitstream, and of course the compression scheme being used is "lossy" (i.e. not bit-accurate).
If you want to enjoy 3D audio in 3D enabled PC games in multichannel surround sound with a Sound Blaster card, it is recommended that you connect the analog outputs of the sound card directly to the analog inputs of the receiver."
EODetroit - Monday, February 28, 2005
I wanna know the answer to #28's question. I've been looking for the next Soundstorm... i.e., I want to output a Dolby Digital 5.1 (or better) signal through an S/PDIF connection from a non-pre-encoded source, like only Soundstorm can. Can the MSI do this now too?
RyanVM - Monday, February 28, 2005
Why didn't you guys bother to note that the Si3132 SATA controller is PCIe? I think that's a fairly significant point in comparison to the PCI Si3114 controller, and it likely explains why the Si3132 was faster.

Lakku - Monday, February 28, 2005
I wish you would have discussed 6600GTs in more detail. I am perturbed at a number of sites saying the 6600GT is not worth it for SLI. Specifically, from X-Bit (though many have echoed it): "We guess it is the 6600GT SLI configuration that's not very appropriate". The only fact behind this statement is the lack of a 256MB 6600GT card, which means its high-resolution and FSAA capabilities are limited in some games. But so what? No one has discussed REAL WORLD prices of 6800GT and Ultra cards. They range from 430 to 800 dollars, for ONE card. Yet sites claim it's better to just get a 6800GT rather than two 6600GTs.

I picked up my 6600GT for 170 bucks, brand new retail. It overclocked to 550/1100 easily and I kept it at that, even though it went quite a bit higher on air cooling; that gave me another 5%-10% or so over stock speed. I could get two for just over 350, almost $100 cheaper, and in a majority of tests, that setup equals or BEATS a 6800GT. It only suffers at 1600x1200 with FSAA, where its smaller frame buffer and 128-bit interface are a hindrance. But for such a HUGE price difference, is it not worth it? If I had bought two outright (my plan was to get a start on a PCIe-based system now and buy next-generation cards for SLI later), I could use that extra 100 toward the GF7600GT (I am assuming this name, and that if the next generation is 24 pipes, the mainstream part will be 12), which, when SLI'd, should match today's 700 to 1000 dollar video combinations for under 400.

The point is that I wish someone would take an IT-type approach to this situation instead of badmouthing SLI or the price of running an SLI setup. Get SLI now, be cost-effective with 6600GTs, and you have a great platform for the next generation of cards (if SLI still exists then, but nVidia seems to have put its eggs in this basket, so I assume it will be around for a while). Is SLI worth it? I say the answer SHOULD be a resounding yes if the same scaling holds true in the future: you can always get two mainstream cards to equal ONE high-end card, for a much cheaper price.
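Laying out the comment's own price figures makes the argument concrete (street prices at the time varied widely; these are just the numbers quoted above):

```python
# Cost comparison using the figures quoted in the comment above;
# actual street prices varied.
PRICE_6600GT = 170                # quoted retail price for one 6600GT
PRICE_6800GT_RANGE = (430, 800)   # quoted range for one 6800GT/Ultra

sli_pair = 2 * PRICE_6600GT       # two 6600GTs for SLI
savings = PRICE_6800GT_RANGE[0] - sli_pair

print(f"Two 6600GTs: ${sli_pair}")                      # $340
print(f"Savings vs. the cheapest 6800GT: ${savings}+")  # $90+, i.e. 'almost $100'
```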
chup - Monday, February 28, 2005

Is the SB Live! 24 really capable of encoding an audio stream into a Dolby Digital stream?