Antialiasing Performance
With midrange cards, dropping resolution a little and enabling antialiasing is usually an option. We tend to prefer higher resolutions and higher quality settings, especially in an age where games like Oblivion and Splinter Cell: Chaos Theory sometimes force a choice between HDR and antialiasing. Hopefully we'll see fewer such compromises in the future. For now, we've selected three of the games we tested to evaluate AA performance for our midrange group.
Battlefield 2
We see the ~140fps CPU limitation of the Core 2 Extreme X6800 having less of an impact on the X1900 XT, while the rest of the pack scales similarly with or without AA enabled. Our 6600 GT was unable to render 1920x1440 with 4xAA due to its 128MB of memory, but it isn't playable with AA above 1024x768 anyway. While the high end of our test shows the top three cards playable at 1920x1440 with 4xAA, our 7600 GT can't be pushed past 1600x1200. The X1600 XT is stuck somewhere between 1024x768 and 1280x1024, depending on how smooth the gamer wants BF2 to run.
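The 6600 GT's 128MB problem is easy to sanity-check with some back-of-the-envelope arithmetic. The sketch below assumes 32-bit color, a 32-bit depth/stencil buffer, and double buffering; actual driver overhead and memory layout will differ, so treat the numbers as a rough illustration rather than vendor figures.

```python
# Rough framebuffer memory estimate for 1920x1440 with 4x MSAA.
# Assumptions (not vendor specs): 32-bit color, 32-bit depth/stencil,
# double-buffered resolve targets.
width, height = 1920, 1440
pixels = width * height
samples = 4          # 4x multisample AA
bpp = 4              # bytes per pixel (32-bit color)

msaa_color = pixels * samples * bpp   # multisampled color buffer
msaa_depth = pixels * samples * bpp   # multisampled depth/stencil buffer
resolve = pixels * bpp * 2            # resolved front + back buffers

total_mb = (msaa_color + msaa_depth + resolve) / (1024 * 1024)
print(f"~{total_mb:.0f} MB of framebuffer alone")  # ~105 MB
```

That leaves almost nothing of a 128MB card's memory for textures and geometry, which is consistent with the 6600 GT simply failing to render at this setting.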
As with our non-AA test, the X1900 XT leads at the ~$300 price point, while the X1900 GT leads the 7900 GT in value without sacrificing performance. At the same time, the extra cost of stepping up from a 7600 GT to an X1900 GT looks well worth it if resolutions above 1600x1200 are desired for Battlefield 2.
Half-Life 2: Episode One
This time the 6600 GT runs out of gas at 1280x1024 with 4xAA enabled. At the same time, every card other than the (stock) X800 GTO and X1600 XT is playable at 1600x1200 with 4xAA, which is a fairly good alternative to 1920x1440 without AA in Half-Life 2: Episode One. Enabling a little AA does bring more life to the game. Since most of the midrange cards we tested can pull it off, and a good many people don't run higher than 1600x1200 anyway, this is a great option.
Quake 4
Due to the low contrast edges in most of Quake 4's art and design, antialiasing is usually a little overkill. We'd prefer to run at a higher resolution or with uncompressed normal maps (ultra quality) rather than with AA enabled. But as id Software favors OpenGL, we decided it would be beneficial to talk about antialiasing under Quake 4. As in our other tests, the 6600 GT and its 128MB of RAM just can't handle 4xAA at 1920x1440. We might care about this if the game were at all playable above 800x600 with 4xAA. The X1900 GT maintains its performance lead over the 7900 GT with AA enabled, but only the X1900 XT can hang on to playability at 1920x1440 with 4xAA. We do see good performance from the X1900 GT and 7900 GT at 1600x1200, though. X1600 XT users will need to stop at 1024x768 if they want to enable 4xAA with high quality settings under Quake 4.
Comments
DerekWilson - Thursday, August 10, 2006 - link
look again :-) It should be fixed.

pervisanathema - Thursday, August 10, 2006 - link
You post hard-to-read line graphs of the benchmarks that show the X1900XT crushing the 7900GT with AA/AF enabled. Then you post easy-to-read bar charts of an O/Ced 7900GT barely eking out a victory over the X1900XT in some benchmarks, and you forget to turn on AA/AF.
I am not accusing you guys of bias, but you make it very easy to draw that conclusion.
yyrkoon - Sunday, August 13, 2006 - link
Well, I cannot speak for the rest of the benchmarks, but owning a 7600GT AND Oblivion, I find the Oblivion benchmarks not accurate. My system:
Asrock AM2NF4G-SATA2
AMD AM2 3800+
2GB Corsair DDR2 6400 (4-4-4-12)
eVGA 7600GT KO
The rest is pretty much irrelevant. With this system, I play @ 1440x900 with high settings, similar to the benchmark settings, and the lowest I get is 29 FPS under heavy combat (lots of NPCs on screen, and attacking me). Average FPS in town: 44 FPS; wilderness: 44 FPS; dungeon: 110 FPS. I'd also like to note that, compared to my AMD 3200+ XP / 6600GT system, the game is much more fluid / playable.
Anyhow, keep up the good work guys; I just find your benchmarks wrong from my perspective.
Warder45 - Thursday, August 10, 2006 - link
The type of chart used just depends on whether they tested multiple resolutions vs. a single resolution. Similar to your complaint, I could say they are biased towards ATI by showing how the X1900XT had better marks across all resolutions tested, yet only tested the 7900GT OC at one resolution, not giving it the chance to prove itself.