
  • darkrequiem - Tuesday, November 16, 2004 - link

    What I'd really like to see is how Doom 3 performs with this SLI setup running in Ultra quality mode. They recommend a 512MB card for this mode, and here we have a total of 512MB between the two cards. It'd be interesting to see how many FPS can be achieved at this quality setting.
  • MightyB - Wednesday, November 10, 2004 - link

    Hmm, I'm gonna wait for the ATI solution and hope they will make it possible to use two different types of cards. Would love to have the new X800 All in Wonder matched with another X-series card :-) This way I can get great performance along with the much better ATI picture quality and even TV.. :-)

    And before you flame me for being an ATI fanboy.. I own an Nvidia card.. I'm talking about 2D (Windows desktop) and movies when I refer to picture quality. I see no real difference in games!

    Best regards
    mightyB
  • MiLaMber - Sunday, October 31, 2004 - link

    This article is perhaps slightly annoying to those of us who have a GeForce 6800GT, like myself, given that the NF4 is a no-go.
    Looks like a possible upgrade to PCI Express will be on the cards in 18 months, but I see this as a good thing: better motherboards, a more mature PCI Express solution.

    Guess I could always sell the 6800GT AGP when the PCIe version comes out though, huh, lol, and then start considering SLI again and indeed NF4.
  • Reflex - Sunday, October 31, 2004 - link

    That's his point: because each card does 50% no matter how much work a section of the screen has, it won't necessarily utilize both cards fully. If you're wandering through UT2k4 and there are no vehicles in the sky, but there is a massive ground battle going on, you will see hardly any benefit from the Alienware setup, but you'd see a huge benefit from the nVidia SLI solution, since it will give the card rendering the top half of the screen more to do rather than letting it sit idle.

    Believe what you want, but I have a bad feeling that the Alienware tech will never actually appear on the market. It was announced months ago, and if it arrived now it would more or less be an ATI-only solution, since if you had nVidia cards you could do SLI natively without Alienware's technology (and nVidia's solution should be faster most of the time). It was a good attempt by them, but at this point it's not worth pursuing.
  • GhandiInstinct - Sunday, October 31, 2004 - link

    Swaid: Your ratios are a bit off; did you see the E3 demonstration? Maybe you should have another look at it. The ratio is always 50/50 even if the lower or upper half has a bit more work to be done; the thing is, each card handles only half the screen resolution, making the cards work a lot less than if they were single. Nevertheless, it's always seamless no matter how high your graphics settings.

    Ever heard of frame locking? Frame locking synchronizes display refresh and buffer swaps across multiple cards, preventing visual artifacts and ensuring image continuity in multi-monitor (or multiple video card) applications like simulations.
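
    The frame-locking idea reduces to a barrier: no card presents its frame buffer until every card has finished the frame. Here's a toy sketch in Python, with a thread barrier standing in for the swap-sync hardware (illustrative only, not any vendor's actual API):

```python
import threading
import time

swap_barrier = threading.Barrier(2)  # one party per card

def render(card_id, frame):
    # stand-in for real rendering work; uneven sleeps simulate uneven load
    time.sleep(0.001 * card_id)

def swap_buffers(card_id, frame):
    print(f"card {card_id}: presented frame {frame}")

def render_loop(card_id, frames=3):
    for frame in range(frames):
        render(card_id, frame)   # each card draws at its own pace...
        swap_barrier.wait()      # ...but nobody swaps until both are done
        swap_buffers(card_id, frame)

threads = [threading.Thread(target=render_loop, args=(c,)) for c in (1, 2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```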
  • Reflex - Sunday, October 31, 2004 - link

    Hell, I'm an anti-nVidia guy and I still recognize this as a nice deal for those of us who'd love a cheap way to upgrade in the future when a secondary card will be cheap.

    I still won't be buying it, however; the 2D performance of nVidia solutions is still crap at high res, and I spend a lot more time in front of a web browser than I do in front of games. But since Civ3 is the most recent game I have purchased, I don't imagine it's really much of an issue for me. A Parhelia would meet my gaming needs just fine. ;)
  • Swaid - Sunday, October 31, 2004 - link

    GhandiInstinct,
    His logic is correct, while yours, on the other hand, needs to be double-checked. In your case (X2/Video Array), take the example of a situation where the upper half of the screen has a more complex scene (or more activity) going on, thus making Card 1 (the video card rendering the upper half) work at 100% while Card 2 (the video card rendering the bottom half) only needs to work at 80% to keep up with Card 1. That's not an efficient solution. This is where SLI's technology can really shine. If you take that same scene and use SLI, Card 1 will now render the upper 40% while Card 2 will render the bottom 60%, and this keeps the two GPUs at 100% load. Now that is an efficient solution. That scenario pretty much applies to all FPS games. Dynamic load balancing appears to be the better 'logical' solution. But I am willing to bet that you are an ATI fan, or maybe it has to do with some sort of loyalty towards Alienware, so SLI leaves you feeling left behind, hence your unwillingness to reason. There are many articles that try to explain the whole situation with SLI.
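
    The dynamic split Swaid describes amounts to a feedback loop on the split line. A toy sketch in Python, assuming per-card frame times are available to the driver (hypothetical, not NVIDIA's actual logic):

```python
# Nudge the horizontal split toward the faster card until both GPUs take
# the same time per frame.
def rebalance(split, time_top, time_bottom, step=0.01):
    """split = fraction of the screen height given to the top card."""
    if time_top > time_bottom:
        split -= step   # top card is the bottleneck: shrink its share
    elif time_bottom > time_top:
        split += step   # bottom card is the bottleneck: grow the top share
    return min(0.9, max(0.1, split))

# Pretend the upper part of the scene is 1.5x as expensive per line as the
# lower part, so render time is proportional to (share x complexity).
split = 0.5
for _ in range(50):
    split = rebalance(split, time_top=split * 1.5, time_bottom=(1 - split))
print(f"top card settles at about {split:.0%} of the screen")  # ~40%
```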
  • Zebo - Sunday, October 31, 2004 - link

    Oh, and thanks Anand for this exclusive. I'm with the camp that says "single card" and "frequent upgrades", only because of noise, heat, power, and the very low resale value when I'm finally done with the two cards.
  • Zebo - Sunday, October 31, 2004 - link

    More options are always a good thing. How anyone can be down on this tech is beyond me.
  • GhandiInstinct - Saturday, October 30, 2004 - link

    SLI requires special circuitry to be incorporated into GPUs and, for extra speed gain, into core-logic. Alienware’s Video Array technology does not require any special logic to be incorporated into graphics or system chips.

    This makes it less driver-prone than SLI.

    Over.
  • GhandiInstinct - Saturday, October 30, 2004 - link

    I don't understand why you think X2's split-screen rendering will be worse... Even if the scenes get more complex, it's still only half the screen! So if a single X800XT renders the complex scene at 70fps, then two will get choppy? Your logic is FLAWED!!!
  • TrogdorJW - Saturday, October 30, 2004 - link

    61 - Sokaku, I wasn't any more rude than you were in your original post. You were incorrect in your claims, as #62 pointed out. I'll repeat: it was a claim without a whole lot of thought/research behind it. Certainly SLI isn't a huge step forward, but to call it a step backwards is ludicrous. By that logic, SMP would also be a step backwards, and dual-core would be pointless as well. Obviously, the 22-year-old webmaster knows quite a few things that you don't. Being wrong at the top of your lungs is why pure guesses aren't used when writing any professional-level article.

    One thing I find odd is that there's mention of the new Scalable Link Interface SLI doing either screen division - i.e. one card renders the top 2/3 and the other renders the bottom 1/3 - or Alternate Frame Rendering (AFR). I thought ATI created and patented AFR back with their Rage MAXX card, just like 3dfx created and patented Scan Line Interleave. (One problem with Scan Line Interleave, for those that don't realize this, is that it basically makes AA impossible to do without a massive performance hit. That's why NVIDIA calls the new SLI Scalable Link Interface.) I can't see ATI allowing NV to use AFR technology without a lawsuit, unless there was some other agreement that we haven't heard about.
  • IamTHEsnake - Saturday, October 30, 2004 - link

    Come on ATi, surprise me!!!!!!!
  • GhandiInstinct - Saturday, October 30, 2004 - link

    In addition, I hate overhyping brand new technology, it's so pointless. I can see analyzing this 6 months from now after people have been using it and more benchmarks are revealed.
  • GhandiInstinct - Saturday, October 30, 2004 - link

    No one is debating that SLI delivers phenomenal performance. The issue is with the limit on manufacturing creating an "SLI monopoly" lol.

    Everyone knows Nvidia cheats on visual quality and that ATI's cards perform better in an overwhelming number of games. So if I had $400 to spend on my SLI setup, I'd go for the latter cards. Get it? It's nothing complex here, folks.
  • mkruer - Saturday, October 30, 2004 - link

    Here is my 2 cents on SLI.

    I think there is a misconception here: buying two 6600GTs at the start, thinking it is going to be cheaper than a single 6800GT, is incorrect. The TCO (total cost of ownership) argument for people jumping on it right away just is not there. Currently it is just as expensive as picking up a single-card solution.

    Here is where SLI makes sense.

    12-18 months down the line, the next latest-and-greatest game will arrive, demanding twice the processing power that you currently have. Now you could purchase the bleeding-edge graphics card for another $400 US, or you could pick up another 6800GT for half that and get nearly double the performance (which would also translate into the same performance as the new card), if not better. TCO is now about 75% of picking up a new bleeding-edge $400 US card.

    So I guess my recommendation to all of you out there thinking of picking up two 6600GTs is: don't. Spend the same amount of money and get the 6800GT, and in the next 12-18 months pick up a second 6800GT for half the price, and you will still be getting the same performance as nVidia's next generation, but for half the cost.
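
    The arithmetic behind that claim, spelled out (all prices hypothetical, assuming a second 6800GT really does sell for about half its launch price 12-18 months on):

```python
card_now = 400                      # one 6800GT today
second_card_later = card_now / 2    # the same card, 12-18 months on
sli_path = card_now + second_card_later       # $600 total

new_card_later = 400                # next bleeding-edge card instead
upgrade_path = card_now + new_card_later      # $800 total

print(f"SLI path costs {sli_path / upgrade_path:.0%} of the upgrade path")
# -> 75%, mkruer's figure
```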

    Possible future.

    ATI is undoubtedly working on a similar solution, and possibly working on a few "flaws" in nVidia's current design, namely the SLI bridge connection. I suspect that in the future the SLI bridge connection will disappear completely and instead be migrated to the last 8x of the 16x PCI-E connection, thereby creating a direct point-to-point connection between the two cards. The advantage is that both cards could then share their collective memory, similarly to how AMD does with its processors between memory banks. This would allow two 256MB cards to truly act as one 512MB card.
  • Tides - Saturday, October 30, 2004 - link

    dual core gpus in the future?
  • Sokaku - Saturday, October 30, 2004 - link

    #62 - PrinceGaz

    Thanks for clearing that up, I stand corrected. :-)
  • Ivo - Saturday, October 30, 2004 - link

    The enthusiast market, where the two-graphics-card SLI solution is positioned, is something like F1 for cars: it advertises and proves new technologies, but it doesn't sell directly in profitable quantities. Probably the mainstream market will never adopt it, if only because it is too expensive and too noisy. Nevertheless, a modified SLI solution, with an IGP and ONE graphics card, could still be interesting for this market. In that case the card, a 3D accelerator, should sit idle for non-intensive 3D applications, and the SLI should support an effective combination of two unequal GPUs.
  • Denial - Saturday, October 30, 2004 - link

    I'm glad I can buy anything I want at my job, as the CFO doesn't know what a 6800GT is. "Uhhh, it keeps the flux capacitor cool."

    How long till the SLI boards come out? Better yet, that dual SLI board from Tyan. I hope all this is out before the new year so I can slip it in with all the other end-of-year hardware purchases.
  • PrinceGaz - Saturday, October 30, 2004 - link

    The geometry processing *should* be shared between the cards to some extent, as each card only renders a certain area of the screen when it is divided between them, so polygons that fall totally outside its area can be skipped. At least that's what I imagine happens, but I don't know for sure. Obviously, when each card is rendering alternate frames, the geometry calculation is effectively totally shared between the cards, as they each have twice as long to work on it to maintain the same framerate.
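
    The per-region skipping imagined above might look like this crude sketch (purely illustrative; a real driver would cull at a very different level):

```python
def triangles_for_card(triangles, y_min, y_max):
    """Keep triangles whose screen-space Y range overlaps [y_min, y_max)."""
    kept = []
    for tri in triangles:                 # tri = ((x0,y0), (x1,y1), (x2,y2))
        ys = [y for _, y in tri]
        if max(ys) >= y_min and min(ys) < y_max:
            kept.append(tri)              # touches this card's band
    return kept

scene = [((0, 10), (5, 20), (9, 15)),     # sits in the upper band
         ((0, 500), (5, 520), (9, 510))]  # sits in the lower band
top = triangles_for_card(scene, 0, 384)       # card A: upper band of 768 lines
bottom = triangles_for_card(scene, 384, 768)  # card B: lower band
print(len(top), len(bottom))  # 1 1 -> each card skipped the other's triangle
```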

    As for the 105% improvement of 6800GT SLI over a single 6800GT in Far Cry at 1600x1200, all I can say is: No! It's against the laws of physics! That, or the drivers are doing something fishy.
  • stephenbrooks - Saturday, October 30, 2004 - link

    --[I do not need a 21 years young webmaster’s article as a reference for making these claims,]--

    So what's wrong with 21 y/o webmasters then, HUH? I could be very offended by that. :)

    I'm surprised no one's picked this up before, but I just love the 105% improvement of the 6800GT in Far Cry at the highest res. I'm surprised because normally when a review has something like that in it, a load of people turn up and say "No! It's against the laws of physics!" Well, it isn't _technically_, but it makes you wonder what on Earth has gone on in that setup to make two cards more efficient per transistor than one.

    [Incidentally, does anyone here know _for sure_ whether or not the geometry calculation is shared between these cards via SLI?]
  • PrinceGaz - Saturday, October 30, 2004 - link

    #61 Sokaku- your understanding of the new form of SLI (Scalable Link Interface) is incorrect. You are referring to Scan-Line Interleave which was used with two Voodoo 2 cards.

    Using Scalable Link Interface, one card renders the upper part of the screen, the other card renders the lower part. Note that I say "part" of the screen, instead of "half"-- the amount of the screen rendered by each card varies depending on the complexity so that each card has a roughly equal load. So if most of the action is occurring low down, the first card may render the upper two-thirds, while the second card only does the lower third.

    The current form of SLI can also be used in an alternative mode where each card renders every other frame, so card A does frames 1,3,5 etc, while card B does frames 2,4,6 etc.
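
    As a toy dispatch rule, that alternate-frame mode is one line (illustrative only):

```python
# Odd frames go to card A, even frames to card B.
def card_for_frame(frame_number):
    return "A" if frame_number % 2 == 1 else "B"

print([(f, card_for_frame(f)) for f in range(1, 7)])
# [(1, 'A'), (2, 'B'), (3, 'A'), (4, 'B'), (5, 'A'), (6, 'B')]
```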

    However regardless of which method is used, SLI is only really viable when used in conjunction with top-end cards such as the 6800Ultra or 6800GT. It doesn't make sense to buy a second 6600GT later when better cards will be available more cheaply, or to buy two 6600GTs together now when a single 6800GT would be a better choice. Therefore the $800+ required for two SLI graphics-cards will mean only a tiny minority ever use it (though some fools will go and buy a second 6600GT a year later no doubt).
  • Sokaku - Saturday, October 30, 2004 - link

    #49 "Do you have some reference that actually states this? Seems to me like it's just a blatant guess with not a lot of thought behind it.":

    I often wonder how come people are this rude on the net; probably it's because they don't sit face to face with the ones they talk to. I do not need a 21 years young webmaster's article as a reference for making these claims; I think, therefore I claim. And if you want a conversation beyond this point, sober up your language and tone.

    In an SLI configuration, card 1 renders the 1st scan line, card 2 the 2nd scan line, card 1 the third, card 2 the fourth, and so on.

    It is done this way because it’s easier to keep the cards synchronized. If you had card 1 render the left half and card 2 render the right, then card 1 may lag seriously if the left part of the scene is vastly more complex than the right part.
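
    A toy sketch of that scan-line split (which, as PrinceGaz notes above, is the old Voodoo2-era Scan-Line Interleave, not the new Scalable Link Interface):

```python
# Scan lines simply alternate between cards, so pixel work splits roughly
# 50/50 by construction, no matter where the on-screen complexity sits.
def card_for_scanline(line):   # lines numbered from 1
    return 1 if line % 2 == 1 else 2

card1 = sum(1 for y in range(1, 601) if card_for_scanline(y) == 1)
card2 = sum(1 for y in range(1, 601) if card_for_scanline(y) == 2)
print(card1, card2)  # 300 300 -> an even split of a 600-line screen
```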

    So, in SLI both cards need to do the complete geometry processing for the entire frame. When the cards then render the pixels, they only have to do half each.

    Thus, a card needs a geometry engine that is twice as fast (at a given target resolution) as the pixel rendering capacity on one card, because the geometry engine must be prepared for the SLI situation.

    If the geometry engine were exactly (yeah I know, it all depends on the game and all) matched to the pixel rendering process, you wouldn't gain anything from a SLI configuration, because the geometry engine would be the bottleneck.

    This didn't matter back in the Voodoo2 days, because all the geometry calculations were done by the CPU; the cards only did the rendering, and therefore nothing was "wasted". Now the CPU offloads the calculations to the GPU, hence the need to have twice the triangle calculation capacity on the GPU.
  • AtaStrumf - Saturday, October 30, 2004 - link

    First of all congrats to Anand for this exclusive preview!

    Now, to those who think the next GPU will outperform 6800GT SLI: you must have been living under a rock for the last 2 years.

    How much faster is the 9800XT than the 9700 Pro? Not nearly as much as two 6800GTs are faster than a single 6800GT, is the correct answer!

    Now consider that the 9700 Pro and 9800XT are 3 generations apart, while 6800GT/SLI are, let's say, one generation apart in terms of time to market.

    How can you complain about that?!?! And don't forget that the 6800GT is not a $500 high-end card!

    If you get one 6800GT now and one in, say, 6 months, you're still way ahead performance-wise compared to buying the next top-of-the-line $500 GPU, you only spend about $200 more, and you get to do it in two payments.

    This is definitely a good thing as long as there are no driver/chipset problems.

    Last but not least: just as we have seen with CPUs, GPUs will probably also hit the wall with the next generation, and SLI is to GPUs what dual core is to CPUs, only a hell of a lot better.

    My only gripe is that SLI chipset availability will probably be a big problem for quite some time to come, and I would not buy the first revision, so add an additional 4 months for this to be a viable solution.

    Me being stuck on S754 and AGP may seem like a problem, but I intend to buy a 6800GT AGP sometime next year and wait all this SLI/90nm/dual-core stuff out. I'll let others be free beta testers :-)
  • thomas35 - Saturday, October 30, 2004 - link

    Pretty cool review once again.

    Though most people seem to have missed one big glaring thing: SLI, while a nice pretty toy for games, is going to have a huge impact on the 3D animation industry. Modern video cards have hugely powerful processors on them, but because AGP isn't duplex and can't work with more than one GPU at a time, video-card-based hardware rendering isn't used much. But now, with PCI-E and SLI, I can take two powerful professional cards (FireGLs and Quadros) and have them start helping to render animations. This means that rather than adding extra computers to cut down on render times, I just add in a couple of video cards. And that, in turn, means I have more time to develop a high-quality animation than I would have in the past.

    So in the middle of next year, I'll be buying a dual (dual-core) CPU system with dual video cards to render. And then I'll have the same power as I get from the 6 computers I own now, in a standard ATX case.
  • Nick4753 - Saturday, October 30, 2004 - link

    Thanks Anand!!!!
  • Reflex - Saturday, October 30, 2004 - link

    Ghandi - Actually, the nVidia SLI solution should outperform X2 in most scenarios. The reason is that their driver will intelligently load-balance the rendering job across both graphics chips rather than simply split a scene in half and give each chip a half. Much of the time there is more action in one part of a scene than the other, so the second card would be largely wasted at those times. On average, the SLI solution should outperform X2.

    X2 has other drawbacks as well. Few techies really want to buy a complete system, preferring to build their own. So something like X2, which can only be acquired with a very overpriced PC that I could build on my own for a lot less money (and use nVidia's SLI if I really need that kind of power), is not a very attractive solution for a power user. You also point out the drivers as an advantage when they are truly a drawback. What kind of experience does Alienware have with drivers? How long do they intend to support that hardware? I can get standard ATI or nVidia drivers dating back to 1996; will Alienware guarantee me that in 8 years that rig will still be able to run, at the least, an OS with full driver support? What kind of issues will they have, seeing as they have never had to do that before? Writing drivers is NOT a simple process.

    I have nothing against the X2, but I do know it was first mentioned months ago and I have yet to see any evidence of it since. It would not surprise me if they ditched it and just went with nVidia's solution. At this point they would more or less just be doing the design for ATI, as anyone who wants to do SLI with nVidia cards can now do it without paying a premium for Alienware's solution.

    Your comment about nVidia vs. ATI is kinda odd. It really depends on what games you play as to which you prefer. Myself, I am perfectly happy with my Radeon 9600SE; yes, it's crappy, but it plays Freedom Force and Civilization 3 just fine. ;)
  • GhandiInstinct - Saturday, October 30, 2004 - link

    I don't consider my statement any form of criticism; it is merely a realization that the high-end user might want to wait for X2, because the current consensus is that Nvidia cards don't live up to the hype compared to ATI's. Ask that high-end user whether he'd rather waste that $200 on dual Nvidias or dual ATIs?
  • SleepNoMore - Saturday, October 30, 2004 - link

    I think you could build this and RENT it to friends and gawkers at 20 bucks an hour. Who knows maybe some gaming arcade places will do this?

    Cause that's what it is at this point: a really cool, ooh-ahh, novelty. I'd rather rent it as a glimpse at the future than go broke trying to buy it. The future will come soon enough.

    The final touch - for humor - you just need to build it into a special case with a Vornado fan ;) to make it totally unapologetically brute force funky.

  • Dasterdly - Saturday, October 30, 2004 - link

    I'm willing to settle :p
    Also, two GPUs on one card, or even on one chip, would be good. Probably what ATI should/will do now to keep up.
    I had the V2 12MB and it was the fastest card for playing my games for more than a year. After that I bought another one, and it was good for another year or so, till the GF2 GTS came.
    With the product cycles bumped up by Nv (and everyone else, to compete) to 6 months, I don't know if it would be worth it till they reach their cap.
  • Grishnakh - Saturday, October 30, 2004 - link

    Well, human beings seem to be predisposed to criticize what they just don't need.
    If you think SLI is nothing to you, that means you just don't need these behemoths and will never buy nF4 SLI, K8T890, etc., so SLI doesn't concern you.
    And honestly, I wonder what kind of loss there can be for nVidia? If you don't need it, fine; most products meet your demand. If you need it, better! You pay double, and so the company earns double.
    SLI is a little like dual CPUs: there is always a certain population, though not a large one, that needs it.
  • GhandiInstinct - Friday, October 29, 2004 - link

    Well, X2 utilizes each GPU to render half the screen, making for a more efficient cooperative effort than SLI. Plus you won't need to keep updating your drivers like with SLI, and the drivers will come straight from Alienware.

    It's more appealing to use any combination of GPUs you want rather than SLI. So if I want the best performance, I have to pay a premium and be stuck with Nvidia again? Not making that mistake again...
  • caliber fx - Friday, October 29, 2004 - link

    I wonder why a lot of you are saying that the driver needs to be "specially written" for a game, because even Anand said that "In our conversation with NVIDIA, we noted that this technology should work with all software without any modification necessary". If you are talking about driver tweaking, then even single-GPU solutions are guilty of that one; the tweaks toward the NV30, or ATI with their AI solution, are just a few examples, and I bet that if the previewer had had more time with the system in the right place, he would have run many other applications. I think most of you have gotten dual-core CPUs mixed up with SLI, and I don't blame you, because there are so many just-introduced features that are currently not in use in a lot of software, like AMD64, SSE3, PS 3.0, and multithreading. Funny thing: if there are games out there that can take advantage of all these features to the fullest, I can't imagine what that would produce, and the sad thing is that all these features could be implemented on one machine. Also, that Alienware solution seems less efficient than SLI.
  • GhandiInstinct - Friday, October 29, 2004 - link

    I'm sure everyone agrees that the drawback with this technology is that it only supports inferior Nvidia GPUs.

    I'm looking forward to Alienware's X2 technology, which combines any GPU combination with a much more efficient architecture.
  • TrogdorJW - Friday, October 29, 2004 - link

    My only question is about the small HSF on the NF4 Ultra chipset, which appears to sit directly underneath the second PCIe slot. Kind of odd, that. How difficult was it to install the cards in that board, Anand? It will also be interesting to see how performance changes over time. With their higher clock speed, I'd think SLI 6600GTs should do better than a 6800GT. Seems like a driver optimization problem to me, although the lack of RAM might also come into play.

    And #11, what was that crap about requiring more geometry processing power to do SLI!? Do you have some reference that actually states this? Seems to me like it's just a blatant guess with not a lot of thought behind it. A card might need to do more geometry work in SLI relative to non-SLI, but twice as much? Hardly. I have a strong suspicion that the vast majority of applications do not come near to maxing out the geometry processing of current cards. Look at 6600GT vs. X700XT: 3 vertex pipelines vs. 6 vertex pipelines. Why then does the 6600GT win out in the majority of tests?
  • Reflex - Friday, October 29, 2004 - link

    #44: Why would DD encoding be a selling point? It is a compression algorithm, among other things, and as a result it will degrade your sound quality. It makes sense for DVDs, but for quality PC audio it makes no sense at all. If you want multi-channel (sound on your back speakers), just use analog connections and specify in the control panel for whatever card you're using that you'd like it; most give the option.

    Contrary to popular misconception, Dolby Digital, while nice for movies, is a bad thing for PC audio in general. It is one of the reasons that the SoundStorm is not considered a high-end solution, despite how nVidia marketed it. Regardless, if you use a digital connection and you have a DD source (a DVD movie, for instance), your sound card, no matter what brand, will pass that signal through to your receiver and allow it to decode DD.
  • DrumBum - Friday, October 29, 2004 - link

    Is it possible to run three monitors off of an SLI setup and run an extended desktop across all three?

    (play a game or watch a DVD across three monitors)
  • Mrvile - Friday, October 29, 2004 - link

    Wow nVidia totally blew ATI away in Farcry (which is weird cuz Farcry is Direct3D) according to http://www.anandtech.com/video/showdoc.aspx?i=2044... benchmarks. But these are kinda old benchies, from May...
  • gplracer - Friday, October 29, 2004 - link

    I think this is a good solution for the time being. If I were going to build a new system, I would want the NF4 with SLI capabilities. What if someone bought this board and one 6800GT? Then at a later date, could they buy another, newer nvidia card and run it in SLI, or would it have to be the exact same card? Also, no one has noted that this SLI capability is great for AMD and not so good for Intel; some people will want this, and Intel has nothing to offer that I am aware of.
  • ImJacksAmygdala - Friday, October 29, 2004 - link

    Thanks for the article.

    I think I will skip the nForce3 and nForce4 boards. I hear there will even be HT problems with the nForce4 A03 silicon, and I don't feel like rolling the dice on any other problems.

    I'm not sold on SLI anymore either. I have the cash for it, but I'm weighing the extra cost of 2 high-end cards against just getting the latest and greatest every 1.5 to 2 years. I'm concerned about the extra heat and noise as well.

    I would have much rather had SoundStorm than SLI. I think I will just wait and see if a Dolby Digital Live 5.1 encoding sound solution shows up before I upgrade to an AMD64 system. Intel has Dolby Digital Live 5.1 encoding, so maybe Creative will soon too.
  • Lord Banshee - Friday, October 29, 2004 - link

    Can you please test SPECviewperf 7.1.1 or above with the next SLI mobo you test? A lot of us 3D modelers want to know if SLI will be a benefit.
  • CrystalBay - Friday, October 29, 2004 - link

    GJ Anand, you scooped everyone (other review sites) again... :)
  • bob661 - Friday, October 29, 2004 - link

    mrdudesir

    See #34.
  • mrdudesir - Friday, October 29, 2004 - link

    I don't get why everyone is bitching about the added cost for people who don't want it. There is no added cost if you don't want SLI; just buy a board based on the NF4 Ultra chipset. It's the exact same chipset, just with no SLI. In fact, if anything, SLI lowers prices, because it leaves a new top-of-the-line chipset above the NF4 Ultra, so the Ultra doesn't have to be the absolute best and hence is cheaper.
  • nserra - Friday, October 29, 2004 - link

    I already had dual Voodoo2 SLI, and besides the extra speed (and not always that), it gave me nothing more....

    This is not that brilliant:
    1st - Needs motherboard support, and a special/specific one at that (Voodoo2 didn't)
    2nd - Doesn't bring any new features besides extra speed (play at 1280x1024 instead of 1024x768?)
    3rd - More heat and power required.
    4th - The driver must support the game (I don't know if Voodoo2 also needed this)
    5th - It will prolong your PC how? Does the SLI 6600GT have the same functionalities/features as future products (NV50)? Don't think so.
    6th - Price, price, price.....
    7th - Voodoo2 also had a version of SLI on a single board, a much cleverer solution for the immediate term, since every board would accept it.
    8th - I bet there will be incompatible games (Voodoo2 had to disable SLI in some games in order to work/play)
    ....
  • Reflex - Friday, October 29, 2004 - link

    #35: If you do not wish to use the second slot for graphics, it is still a fully functioning PCI Express slot you can use for *anything* else, so it is not wasted board space at all.
  • Reflex - Friday, October 29, 2004 - link

    #9: There will be no add in SoundStorm solution. The group that developed that technology at nVidia has been dissolved and moved on to other projects.

    Just as well, it was not a quality solution anyways.
  • bob661 - Friday, October 29, 2004 - link

    The hardware does exist. You can buy 6600GT's right now on Newegg.
  • haris - Friday, October 29, 2004 - link

    SLI is an option on the motherboard. Great. SLI might work because of the driver, but doesn't the hardware have to exist for the feature to be used in the driver?

    What if Nvidia/ATI have to use up valuable board space for a feature that will only be used by high-end users? That means everyone else is paying for a feature that they don't want or will never use. I don't like the idea that I might be paying extra for my card because one person out of ten thousand (or whatever the % of high-end to average users is) wanted that feature.
  • bob661 - Friday, October 29, 2004 - link

    I think some of these guys are mad because the motherboard that suits their needs won't be considered "the best". For some, it's an image thing. If it isn't, then why do you care that SLI is even available? Just buy the NF4 Ultra. Then there are some who come here just to piss people off.
  • bob661 - Friday, October 29, 2004 - link

    Two GPUs on one card is more expensive, and there would probably be some heat issues.
  • Pete - Friday, October 29, 2004 - link

    Whoops. NV43 has only four ROPs, while NV40 has sixteen. So SLIed 6600GTs still have only half the ROPs as a single 6800GT. Mah bad.
  • Tides - Friday, October 29, 2004 - link

    SLI is meant for one thing, HIGH END. It's like spending 800 on an Athlon FX. Before now the option wasn't there, now it is. What's the problem?
  • Pete - Friday, October 29, 2004 - link

    Thanks for the preview, Anand (and MSI). One note:

    "At 1280 x 1024 we see something quite unusual, the 6800GT gains much more from SLI than the 6600GT. The 6800GT received a 63.5% performance boost from SLI while the 6600GT gets "only" a 45.7% improvement; given the beta nature of the drivers we'll avoid hypothesizing about why."

    Not enough RAM? 12x10 4xAA is getting pretty RAM-intensive, no? That's one of the reasons I'm not that excited about SLI'ing two 6600GTs to the level of a 6800GT, but without the extra breathing room afforded by 256MB.

    Two questions for you, too, Anand:

    (1) The 6600GT is 500MHz core, 8 pipes, 4 ROPs, 500MHz 128-bit memory. The 6800GT is 350MHz core, 16 pipes, 8 ROPs, 500MHz 256-bit memory. All else being equal, I'd have thought the SLI'ed 6600GTs would way outperform the 6800GT, because together they have the same specs plus a 40% higher core clock (see the rough arithmetic sketched after this comment). Is this just a matter of some efficiency lost to SLI overhead?

    (2) Is there a way to tell if the cards are rendering in "SLI" or AFR mode, or even to force one or the other? I'd be curious to know which helps which app more.
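
    Rough arithmetic behind question (1), using theoretical fill rate only (core clock x pixel pipelines). This deliberately ignores the 6600GT's 4 ROPs, its 128-bit memory bus, and SLI overhead, any of which may be the real answer:

```python
sli_6600gt = 2 * (500 * 8)   # two cards: 500 MHz x 8 pipes each = 8000 Mpix/s
one_6800gt = 350 * 16        # one card:  350 MHz x 16 pipes     = 5600 Mpix/s
print(f"{sli_6600gt / one_6800gt:.2f}x")  # ~1.43x, i.e. Pete's 40% clock edge
```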
  • justauser - Friday, October 29, 2004 - link

    I don't get it. Why not just put two GPUs on one 16x card. This bridge thing is so hokey.
  • Tides - Friday, October 29, 2004 - link

    Better yet, don't buy the SLI version of the mobo; there ARE 3 versions of NF4 boards, after all.
  • Tides - Friday, October 29, 2004 - link

    Why are people complaining about an additional feature on motherboards that they are in no way forced to use? It's like having 2 AGP slots on a motherboard: it's ROOM FOR UPGRADE. What's wrong with that?
  • xsilver - Friday, October 29, 2004 - link

    I think the performance boost is viable; you just need to know when to buy.

    6600GT SLI is close to a 6800GT in most benchies, and the ones where it isn't may be down to driver issues rather than performance... however, two 6600GTs don't equal a 6800GT in price. But in, say, 12 months' time, will a new 6600GT plus what the old 6600GT cost be equal to or less than the original price of a 6800GT?
    The new mainstream product in 12 months' time should still perform worse than 6600GTs in SLI.
    Think of it as getting a good card on "layaway" (am I saying this right? I'm not in the US :)

    The other viable approach is of course having 2x 6800GT and saying "I've got the best performance money can buy".... again, you should not be superseded within 12-18 months.


  • haris - Friday, October 29, 2004 - link

    This is a horrible move by Nvidia. Several people have already stated so because of some of the main problems: Heat, noise, power requirements, and SLI may only work if the driver supports that specific game/engine. It might work out great for them since they will be able to get people to pay for two cards instead of just getting a more powerful single card solution which will work just as well if not better in every game. For most people, by the time they would be ready to upgrade a low-mid range card, it would probably still be more cost effective to just buy a new card.

    I love the performance boost as much as the next guy/girl, but I still think that this is just plain stupid.
  • KraftyOne - Friday, October 29, 2004 - link

    Thanks bob661! Anyone have any info on AGP x700's? If ATI gets them out first, they get my money... :-)
  • HardwareD00d - Friday, October 29, 2004 - link

    Those benchmarks are amazing! I wasn't going to shell out the dough for SLI, but now I'm going to reconsider.

    I was glad to read that the 2 PCIe slots being only x8 will not really be a performance issue. A lot of people are down on nForce4 because it won't do two x16 slots. F***ing A, nVidia, Nice JOB! Now if only NF4 had SoundStorm...
  • bob661 - Friday, October 29, 2004 - link

    See http://www.theinquirer.net/?article=19340 for 6600GT AGP availability.
  • KraftyOne - Friday, October 29, 2004 - link

    Yes, this is all fine and great, but when will the x700 and/or 6600GT be available for AGP ports for those of us who can't afford all the latest and greatest?
  • HardwareD00d - Friday, October 29, 2004 - link

    I have only two words to say:

    Anand Rules!
  • Nyati13 - Friday, October 29, 2004 - link


    On page one it says the SLI slots are electrically x8 slots instead of x16. That is not correct: they are x8 data slots, but they still provide the full x16 electrical power needed by the graphics card.

    Jeremy
  • mongoosesRawesome - Friday, October 29, 2004 - link

    What PSU were you using for these tests?
  • suave3747 - Friday, October 29, 2004 - link

    I would expect that a 6800 Ultra Extreme SLI setup will not be outdone by a new nVidia card until at least a year from now. And at that time, buying the second one would push you from mid-range back towards the top for much less than the new card would cost.

    This is brilliant by nVidia, because:

    A. It allows you a way to buy half of the GPU setup that you want now, and half later. That's a great plan for a budget-oriented consumer, and it will make GPU purchases a lot easier for parents to swallow on holidays. It also allows for someone to spend $1200 on a ridiculously powerful GPU setup.

    B. It will keep their high end cards of today selling well all the way into next year. The way the market is now, people want a new-gen card. They don't want a 5950 right now, they'd rather buy a 6600. This will keep 6800 series and 6600 GTs selling all the way through 2005.
  • Subhuman25 - Friday, October 29, 2004 - link

    2 video cards - no thanks.
    Price factor, heat factor, noise factor, space factor, extra & new technology factor (guinea pig).
    No thanks to this avenue.
    I feel sorry for the poor saps who buy into this crap and spend $800+ alone for 2 x 6800GTs, only to be outperformed in a year by a newer-generation single-card setup.
    I highly doubt this will ever become a trend. If you think so, then ask how many people have dual-CPU systems.
  • jkostans - Friday, October 29, 2004 - link

    Hmm, let's see: I can get a 6800GT for less than two 6600GTs, and the 6800GT is faster..... Let me think about this: no SLI for me! MAYBE if you want a mid-range card now, you get the 6600GT, upgrade when another 6600GT is very cheap, and have the equivalent of a low-end card by then? I still think it's worth dropping the extra cash on the 6800GT. But I guess when it came time to upgrade, I could just add another 6800GT? I kinda doubt it.
  • Sokaku - Friday, October 29, 2004 - link


    #12: I'm with you on that one; however, we did see how it went with 3dfx's SLI solution.

    I had a Voodoo2, and when it came to the point where I wanted more power, I could have bought another Voodoo2; however, the graphics cards available at that point outperformed a dual Voodoo2 configuration considerably, so as an upgrade path it was never feasible.
    I'm afraid that if I buy a 6800Ultra, when the time comes I would not buy another one, because at that point we'll have a single 8600Ultra way outperforming dual 6800Ultras...


    #13 by lebe0024: I assume you've been banned 23 times from this forum, and I sure hope you'll be banned for the 24th time.
  • lebe0024 - Friday, October 29, 2004 - link

    Shut your pie hole #11
  • xsilver - Friday, October 29, 2004 - link

    #11 .... how about this -- "if they could, they would"

    nvidia is here to make money, after all..... a single-card solution should certainly be cheaper to produce than a dual one, but I don't think that's feasible right now, so that's why it's not made
  • Sokaku - Friday, October 29, 2004 - link

    I find it horrible that NVIDIA is taking up SLI again. "Why?" you probably wonder... Well, in order to be able to gain anything from SLI, NVIDIA has to overpower the GPU considerably when it comes to geometry handling. In fact, the geometry engine must be able to outperform the pixel renderer by a factor of 2, should the customer happen to use the card in an SLI configuration. This means that people who do NOT want an SLI configuration will have to pay for a geometry engine that is way more powerful than needed.

    Think of the additional cost of making an SLI configuration: you need a motherboard prepared for it, you need dual graphics cards, GPUs and all. And what problem does this solve?

    Well, basically all it does is give you twice the memory bandwidth and pixel render capacity.

    This could be solved much more cost-effectively by doubling the data bus width and keeping the solution on one card. Also, this way the geometry engine would be dimensioned to exactly match the capacity of the renderer.

    This is a step back in innovation and not a step forward.
  • smn198 - Friday, October 29, 2004 - link

    ...meaning different RAM manufacturers/speeds on the graphics cards.
  • smn198 - Friday, October 29, 2004 - link

    Has anyone heard if you can mix differing versions of the same card, e.g. one 6600GT from Asus and the other from MSI? Anyone heard of any tests with this and different RAM?

    (I think the dual 6600GT upgrade path makes up for the lack of SoundStorm. Hope they hurry up and make their add in SS boards tho!)
  • TimTheEnchanter25 - Friday, October 29, 2004 - link

    Any word on when PCI-E versions of the 6800GT will start showing up in stores? I'm not waiting for SLI boards, but I'm getting worried that the 6800GTs won't be out by the time nForce4 boards are in stores.
  • ballero - Friday, October 29, 2004 - link

    How about testing the latest patch by Crytek with HDR enabled?


  • Pete84 - Friday, October 29, 2004 - link

    #4 Why would you want SLI for the 6200? It is the lowest end card - think 9200. No real use, other than Civ III and Firefox.
  • SDA - Friday, October 29, 2004 - link

    Maybe, maybe not, #3. Modern PSUs have a lot more juice than the systems actually use. I wouldn't be surprised if a 380W Tagan could handle an A64 rig with dual 6800Us.

    Anyway, yeah, this is decidedly sexy. What I like about it is that it actually has appeal for normal people as well... say you want to buy a midrange system that you can upgrade easily later. Get a dual PCI-E 16x motherboard and a 6600GT, then add in another 6600GT whenever you feel like you want a performance boost! Shame SLI isn't available on the 6200...
  • ukDave - Friday, October 29, 2004 - link

    1600x1200 high detail for any current game, that certainly is impressive. I'm an 'ATi Fanboy', but congrats to nVidia, damn fine job.

    I think even my luberly Tagan might have issues running two of these 6800 Ultras :o
  • keitaro - Friday, October 29, 2004 - link

    Here's an interesting idea... take SLI setup, plug in 2 17in to 19in LCD monitors, do some benchmarks and play with that setup for a while. Then tell us about the experience. :)
  • xsilver - Friday, October 29, 2004 - link

    Sweet.... Sweet candy ...... drools... wants now

    I'm no fanboy, but those ATI fanboys will have to give in to the fact that nvidia now has the "fastest" card, albeit at twice the cost.

    Finally, on the issue of SLI: if the exact same card must be used to enable SLI, is there any info on the future availability of these cards? E.g. if you buy a 6600GT now, in 18 months this card may not be available anymore; it might be a 6700GT by then. Will these be compatible? Forcing you to upgrade before the product is phased out may not be so good.

    Also, just noticed that in the Far Cry 1600x1200 test SLI is more than 100% faster... how the hell does that work? Margin of error?
