42 Comments
Fluppeteer - Friday, July 29, 2005 - link
I'm not sure how board-specific this would be (although the BIOS could easily get in on the act), but I notice nVidia are claiming a big readback speed increase on the Quadro FX4500 over the FX4400 (2.4GB/s vs 1GB/s). This doesn't seem to apply to the 7800GTX in the GPUbench report I managed to find, but it's the kind of thing which could be massively driver- and BIOS-dependent.

I know this is a more artificial figure than the games which have been run, but significant jumps like this (along with the increased vector dot rate) make these cards much more attractive than the 6800 series for non-graphical work. Would it be possible to confirm whether this speed-up is specific to the Quadro board, or whether it applies to the consumer cards too? (Either by a little bit of test code, or by running some artificial benchmarks.)

Just curious. Not that I'll be able to afford a 4500 anyway...
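For what it's worth, here's roughly the sort of test code I mean - a minimal readback-timing sketch, assuming PyOpenGL with GLUT is available (the window setup is the bare minimum, and the buffer size is just a placeholder):

```python
# Rough GPU->CPU readback bandwidth estimate (a sketch, not a validated benchmark).
import time
from OpenGL.GL import glFinish, glReadPixels, GL_RGBA, GL_UNSIGNED_BYTE
from OpenGL.GLUT import (glutInit, glutInitDisplayMode, glutInitWindowSize,
                         glutCreateWindow, GLUT_RGBA)

W, H, N = 1024, 1024, 100              # 4 MB per RGBA8 readback, 100 repeats

glutInit()
glutInitDisplayMode(GLUT_RGBA)
glutInitWindowSize(W, H)
glutCreateWindow(b"readback test")     # gives us a current GL context

glFinish()                             # drain any pending work before timing
start = time.time()
for _ in range(N):
    glReadPixels(0, 0, W, H, GL_RGBA, GL_UNSIGNED_BYTE)
glFinish()
elapsed = time.time() - start

mbytes = W * H * 4 * N / 1e6
print(f"readback: ~{mbytes / elapsed:.0f} MB/s")
```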
tmehanna - Thursday, July 28, 2005 - link
ALL 7800GTX cards at this point are manufactured by nvidia and sold as-is by the "vendors". The ONLY physical difference is the logo on the cooler. If some vendors screen and OC their cards before selling, clock speeds would be the only difference. ANY performance or heat dissipation differences at similar clock speeds are MERELY manufacturing variances.
DerekWilson - Thursday, July 28, 2005 - link
Not true. Vendors have some BIOS control over aspects of the cards that are not exposed to users. We have not been able to confirm any details from any vendor or NVIDIA (as they like to keep this stuff under wraps), but temp, heat, and noise (and even overclockability) could be affected by video BIOS settings.
We don't know the details; we need more clarification. In the meantime, these are the numbers we are seeing, so we will report them. If we are able to get the information we need to really say why we see these differences, then we will definitely publish our findings.
lambchops3344 - Wednesday, July 27, 2005 - link
No matter how much better a card does, I'm always going to buy eVGA... I've saved more time and money with the Step-Up program, and their customer support is so good too.
NullSubroutine - Tuesday, July 26, 2005 - link
I read an article about how CPU performance is tapering off (Murphy's Law or Moore's Law, I forget which) while GPU performance has continued to increase, and shows signs that it will keep increasing. I also remember an article where Nvidia or ATI (I can't remember which) was asked about any "dual core" GPUs in development. They answered that if you really look at the hardware, GPUs are already like multiprocessors, or something to that effect. Perhaps this could be the reason for the clock speed questions? It would seem logical to me that their technology doesn't run like a typical CPU, because each "processor" runs at a different speed? I think you might understand what I'm trying to say - at least I hope so, because I'm failing miserably at... what was I saying?
Gamingphreek - Monday, July 25, 2005 - link
Not sure if this has already been discussed in earlier articles, but the 7800GTX, as everyone (including myself) has seen, seems bottlenecked at every resolution except 16x12. And then with AA and AF enabled, the X850XT seems to catch up. While the averages might be the same, has AnandTech ever thought of including the minimum and maximum framerates on their graphs?
Thanks,
-Kevin Boyd
Fluppeteer - Monday, July 25, 2005 - link
Just wanted to thank Derek and Josh for clarifying the dual link situation. MSI don't mention anything about dual link, but after the debacle with their 6800"GT" I'm not sure I'd have trusted their publications anyway... If *all* the 7800GTXs are dual link, I'm more confident (although if there's actually a chance to try one with a 30" ACD or - preferably - a T221 DG5 in a future review, I'd be even happier!)
Good review, even if we can expect most cards to be pretty much clones of the reference design for now.
DerekWilson - Monday, July 25, 2005 - link
We'll have some tests with a Cinema Display at some point ...
But for now, we can actually see the Silicon Image TMDS used for Dual-Link DVI under the HSF. :-)
Fluppeteer - Monday, July 25, 2005 - link
Cool; it'd reassure me before I splash out! (Although I'm still hoping for the extra RAM pads to get filled out - got to hate 36MB frame buffers - but with the Quadro 4500 allegedly due at SIGGRAPH it shouldn't be long now.)
Sounds like the same solution as the Quadro 3400/6800GTo, with the internal transmitter used for one link and the SiI part for the other. I don't suppose you've pulled the fan off to find out the part number?
I'd also be interested in knowing whether the signal quality has improved on the internal transmitter; nVidia have a bad record with this, and the T221 pushes the single link close to the 165MHz limit (and the dual link, for that matter). People have struggled with the 6800 series, even in Quadro form, where the internal transmitters have been in use. It'd be nice to find out if they're learning, although asking you to stick an oscilloscope on the output is a bit optimistic. :-) These days this probably affects people with (two) 1920x1200 panels as well as oddballs like me with DG5s, though.
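To put rough numbers on that limit - this is entirely my own back-of-envelope arithmetic, and the ~5% blanking overhead is an assumed round figure:

```python
# Approximate TMDS pixel clock for a mode: width * height * refresh * blanking
# overhead. Single-link DVI tops out at 165 MHz; dual link at 2 x 165 = 330 MHz.
def pixel_clock_mhz(w, h, hz, overhead=1.05):   # ~5% reduced blanking (assumed)
    return w * h * hz * overhead / 1e6

for w, h, hz in [(1920, 1200, 60), (3840, 2400, 17), (3840, 2400, 33)]:
    clk = pixel_clock_mhz(w, h, hz)
    link = "single link" if clk <= 165 else ("dual link" if clk <= 330 else "beyond dual link")
    print(f"{w}x{h}@{hz}Hz ~ {clk:.0f} MHz -> {link}")
```

The T221 modes sit right up against both ceilings, which is why marginal signal quality shows up there first.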
On the subject of DVI, I don't suppose nVidia have HDCP support yet, do they? (Silicon Image do a part which can help out, or I believe it can be done in the driver.) It's really a Longhorn thing, but you never know...
Now, if only nVidia would produce an SLi SFR mode with horizontal spanning which didn't try to merge data down the SLi link, I'd be able to get two cards and actually play games on two inputs to the T221 (or two monitors); the way the 7800 benchmarks are going, 3840x2400 is going to be necessary to make anything fill rate limited in SLi. (Or have they done this already? There was talk about Quadros having dual-card OpenGL support, but I'm behind on nVidia drivers while my machine's in bits.)
Thanks for the info!
(Starts saving up...)
meolsen - Wednesday, July 27, 2005 - link
Neither eVGA NOR MSI advertise that their card is capable of driving the resolutions that would suggest that the dual-link DVI is enabled.
E.g., MSI:
Advanced Display Functionality
• Dual integrated 400MHz RAMDACs for display resolutions up to and including 2048x1536 at 85Hz
• Dual DVO ports for interfacing to external TMDS transmitters and external TV encoders
• Full NVIDIA nView multi-display technology capability
Why would they conceal this feature?
Fluppeteer - Friday, July 29, 2005 - link
"Advertise" is perhaps a strong word, but the PDF data sheet on the eVGA web sitedoes say that one output is dual link (even though the main specifications say
the maximum digital resolution is 1600x1200, which is nonsense, like all resolution
claims, even for most single link cards).
I couldn't (last I looked) find anything about dual link support on the MSI site.
But then, MSI have in the past ignored that the 6800GTo was dual link, and then
claimed that their (real) 6800GT *was* dual link, and that the SiI transmitters
were unnecessary... (Although I'm still mystified how the PNY AGP 6600GT seems to
have dual dual link support without external transmitters.)
I'm presuming both heads have analogue output, btw (I only ask because the GTo,
for some astonishing reason, only has digital output on its single link head).
Past experience (with the 6800) suggests that the reason none of the manufacturers
mention it is that very few people actually know what dual link DVI *is*. A lot
probably haven't tried it - there being, last I looked, only three monitors which
can use it anyway, two of which are discontinued. nVidia caused a lot of confusion
by claiming support in the chipset and putting an external transmitter on their
reference card, which most manufacturers left off without updating their specs.
Unfortunately, nVidia seem to fob off all their tech support to the manufacturers,
who aren't always qualified to answer questions - I've not found anywhere to send
driver feature requests, for example. Seeing the external transmitter make it to
released boards is a vast relief to me.
Now the Quadro 4500 has been announced, I'm hoping the 512MB boards will appear
(and they might be DDL). Fingers crossed.
DerekWilson - Thursday, July 28, 2005 - link
Yes. Again, the SI TMDS for dual-link is on the PCB. So far there are no 7800 cards that we have seen without dual-link on one port.
NVIDIA didn't even make this clear at their initial launch. But it is there. If we see a board without dual-link, we'll let you know.
Wulvor - Monday, July 25, 2005 - link
For that extra $4 you are also paying for a longer warranty. eVGA has a 1+1 warranty, so 1 year out of the box and another year when you register online at eVGA. MSI on the other hand has a 3 year warranty, and BFG a lifetime warranty.
It must be the corporate purchaser in me - $4 is well worth the extra year (or 2) - but I guess if you are going to be on the "bleeding" edge, then you are buying a new video card every 6 months anyway, so who cares?
smn198 - Monday, July 25, 2005 - link
A suggestion:
Regarding measuring the card's noise output and the way you measured the sound:
"We had to do this because we were unable to turn on the graphics card's fan without turning on the system."
Would it be possible to measure the voltages going to the fan when the card is idle and under full load? Then supply the fan with these voltages when the system is off, using a different power supply such as a battery (which is silent) and a variable resistor.
It would also be interesting to see a graph of how the noise increases when going from idle to full load over 10 minutes (or however long it takes to reach maximum speed) on cards which have variable-speed fans. Instead of trying to measure the noise with the system on, again measure the voltage over time; then, using your battery, variable resistor, and voltage meter, recreate the voltages, and use this in conjunction with the voltage/time data to produce noise/time data.
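Something like this could stitch the two measurements together - a minimal sketch, where every number is a made-up placeholder for what you'd actually log and calibrate (assumes numpy):

```python
import numpy as np

# (seconds, volts) logged from the fan header while the card heats up under load
volts_vs_time = np.array([(0, 5.0), (120, 6.2), (300, 7.5), (600, 8.1)])

# (volts, dBA) calibrated with the fan driven from the battery rig, system off
dba_vs_volts = np.array([(5.0, 31.0), (6.0, 34.5), (7.0, 38.0), (8.0, 41.0)])

t, v = volts_vs_time.T
cal_v, cal_dba = dba_vs_volts.T
dba = np.interp(v, cal_v, cal_dba)    # map each logged voltage to a noise level

for ti, di in zip(t, dba):
    print(f"t={ti:4.0f}s  ~{di:.1f} dBA")
```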
Thanks
DerekWilson - Monday, July 25, 2005 - link
We are definitely evaluating different methods for measuring sound. Thanks for the suggestions.
Just to be clear, even after hours of looping tests on the 7800 GTX overclocked to 485/625, we never once heard an audible increase in the fan's speed.
This is very much unlike our X850 parts, which spin up and down frequently during any given test.
We have considered attempting to heat the environment to simulate a desert-like climate (we've gotten plenty of email from military personnel asking about heat tolerance on graphics cards), but it is more difficult than it would seem to heat the environment without causing other problems in our lab.
Suggestions are welcome.
Thanks,
Derek Wilson
at80eighty - Tuesday, July 26, 2005 - link
"We have considered attempting to heat the environment to simulate a desert-like climate [...] but it is more difficult than it would seem to heat the environment without causing other problems in our lab"
Derek, if you really wanna simulate desert-like heat in the room, may I suggest inviting Monica Bellucci to your lab... should work like a charm :p
reactor - Monday, July 25, 2005 - link
I've been using MSI cards for a few years now; their fans always seem to run at top speed, and I've found they usually run at higher RPMs (slightly louder) than other manufacturers'. I think that explains why the card is cooler while drawing more power, and why you didn't notice a difference in sound as the card was stressed. I'm not entirely certain, but that's from my own experience with MSI cards.
Good article, looking forward to the BFG.
yacoub - Monday, July 25, 2005 - link
"As you can see, The EVGA slightly outperforms the MSI across the board at stock speeds."Either I'm reading it wrong or you mis-wrote that line, since I see the e-VGA normal and OC'd, the NVidia reference, and the MSI OC'd, but no MSI at stock speeds. Thus it's hard to compare th EVGA stock speeds vs the MSI stock speeds when one of them isn't on the charts.
DerekWilson - Monday, July 25, 2005 - link
Check the bold print on the Performance page --
MSI stock performance is the same as the NVIDIA reference performance at 430MHz ...
To compare stock numbers, compare the green bar to the EVGA @ 450/600.
Sorry for the confusion, but we actually tested all the games a second time and came up with the exact same numbers. Rather than add another bar, we thought it'd be easier to just reference the one.
If you guys would rather see multiple bars for equivalent results across the board, we can certainly do that.
Thanks,
Derek Wilson
davecason - Monday, July 25, 2005 - link
Since the MSI card drew a lot more power than expected but remained cooler than the eVGA card, I was thinking that some of the excess may be due to the cooling of the card itself. Maybe the fan on the MSI card works harder than the one on the eVGA card.
The people at AnandTech could test the power usage of the stock video-card cooling fans independently to see what their effect is on power load. This may explain the 6 extra watts used by the MSI card. This information might be mildly useful to a person who was already stressing out their power supply with other things (such as several hard drives). Does anyone think that is worth doing?
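As a sanity check on whether a fan alone could even account for that gap - rough numbers, made up purely for illustration:

```python
# A fan's electrical draw is roughly volts * amps, so a meter on the fan header
# is enough to see whether it explains the ~6 W difference. Ratings are invented.
for label, volts, amps in [("idle", 7.0, 0.30), ("load", 12.0, 0.45)]:
    print(f"{label}: ~{volts * amps:.1f} W")    # ~2.1 W idle, ~5.4 W at full speed
```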
DigitalDivine - Monday, July 25, 2005 - link
I like the new style, but is there a way to put the reply box back on the main comments page? Posting quick replies without going to another page would be nice.
----------------------------------
I wonder when they will release a $400 7800, hopefully a 24-pipe card... my 9800 Pro is so "old" now.
at80eighty - Monday, July 25, 2005 - link
"but is there a way to put the reply box back to the main comments page, posting quick replies without going to another page would be nice"
Bears repeating. (The formatting options are nice, tho!)
Brian23 - Monday, July 25, 2005 - link
Bears?!?!
at80eighty - Monday, July 25, 2005 - link
You, sir, may have latent gay issues to work on
not that there's anything wrong with that...
bob661 - Monday, July 25, 2005 - link
Some of my best friends are gay. :)
xsilver - Monday, July 25, 2005 - link
Just had a thought.
Are there any games confirmed that need PS3.0 or any features present in this card or the 6800 series?
The only game that I can think of is UT'07, where it was stated that the 6800 series was the first to be able to support such high poly counts.
Are there any games stated to come out before then that would require an upgrade from an ATI card (9800/X800)?
This comes on the back of the Ti4600 owners who can't play BF2 at all.
at80eighty - Monday, July 25, 2005 - link
Maybe I missed the memo - but why the hell has AT gone /.?!!?! AARRGH!!
Anand, I really liked the prior comments style! Very simple and easy to read. Keep the filters if you like, just gimme back the old layout!!
/puts pacifier back in the mouth :p
bersl2 - Monday, July 25, 2005 - link
Ha ha!
I like it Slash-ed up!
p3r2y - Monday, July 25, 2005 - link
Dude... an XFX comes with a 490/1300 factory overclock, the largest of any card, and comes with BF2. Period.
InTheFlow - Monday, August 1, 2005 - link
I'd love to see this card reviewed right after the BFG card.
imaheadcase - Monday, July 25, 2005 - link
I know how to show FPS in BF2, but does it have an in-game benchmark?
Curious how you benchmark BF2, as I would like to see how some cards compare (or don't compare, I should say).
Good review
DerekWilson - Monday, July 25, 2005 - link
A lot of people have been asking, so here ya go ... links to our Battlefield 2 benchmark demo files:
[L]http://images.anandtech.com/reviews/video/bf2/atfi...[/L]
[L]http://images.anandtech.com/reviews/video/bf2/atfi...[/L]
There are instructions somewhere on battlefield2.com that describe how to use the demo.cmd file, but you will need to edit this file to set resolutions other than 800x600.
Our data is compiled from the last 1700 frames of our demo run. The many thousand other frames that can result come from the load screen and aren't useful to show performance.
Spacecomber - Monday, August 1, 2005 - link
What version of the game were these made under, Derek? Are they for the unpatched game? I couldn't get the demo to play. It would load the map, then it would crash me to the desktop while it was checking player assets, or something to that effect. This was on a version of the game with the 1.02 patch.
Space
DerekWilson - Monday, July 25, 2005 - link
By the way, since it's the last 1700 frames, you've got to go into the frametimes CSV file and manually calculate an average over the last 1700 lines of the 3rd column (after you've split the columns on ';') for every test you do. It's kind of a pain, but for people that care, there it is.
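If you'd rather not do that by hand, something along these lines should work (the filename is a placeholder, and the ms-to-FPS conversion assumes the column holds frame times in milliseconds - check your own file first):

```python
with open("frametimes.csv") as f:
    rows = [line.strip().split(";") for line in f if line.strip()]

vals = [float(r[2]) for r in rows[-1700:]]    # 3rd column, last 1700 frames
avg = sum(vals) / len(vals)
print(f"mean of column 3 over last {len(vals)} frames: {avg:.3f}")
print(f"as FPS (if column 3 is frame time in ms): {1000.0 / avg:.1f}")
```

bob661 - Monday, July 25, 2005 - link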
How do you turn on the FPS in BF2? Thanks.
Spacecomber - Monday, July 25, 2005 - link
Use the tilde (~) key to access the console and enter this command: "renderer.drawFps 1" (no quotes).
You can find these tips and a lot more in the Battlefield Tweak Guide that I mentioned in my first link in the post above.
Space
Spacecomber - Monday, July 25, 2005 - link
You'll need to create (or find one for downloading) a "demo" file, which you can then run with the timedemo feature of the demo.cmd script file.
A couple of sources for general information on downloading the demo.cmd script, creating demos, converting them to AVIs, and running timedemos:
From the BF2 Tweak Guide: http://www.tweakguides.com/BF2_6.html
EA UK's BF2 Forum Thread on the BF2 Recorder: http://forum.eagames.co.uk/viewtopic.php?t=934
And an Overclockers.com.au article on running BF2 benchmarks (mentioned in the BF2 Tweak Guide): http://www.overclockers.com.au/article.php?id=3841...
HTH,
Space
p3r2y - Monday, July 25, 2005 - link
Anand has an article about different GPUs in BF2, stupid.
Sea Shadow - Monday, July 25, 2005 - link
Great review, and props for filtering all the random spam.
I can't wait to see the BFG review, as it will help me decide which 7800 I am going to get.
xsilver - Sunday, July 24, 2005 - link
Congrats to Anand on adding the filters -
no more dumb "first post" or "in Soviet Russia" posts anymore.
Anyway, with regard to the card:
Why on earth would you not buy the eVGA card, as it comes with BF2 - one of the few games that taxes the card? Even if you have the game already, selling it would net you an extra $30 at least.
Kudos to eVGA for including a good game.
DerekWilson - Monday, July 25, 2005 - link
Actually, Chronicles of Riddick can stress the card too. It's also a really good game.
If the choice is just between BF2 and Riddick and you already have BF2, I would *definitely* suggest Riddick.
Nothing beats $500 for the 450MHz card with BF2 right now though.
arswihart - Sunday, July 24, 2005 - link
dumb sticker