ATI's New Leader in Graphics Performance: The Radeon X1900 Series
by Derek Wilson & Josh Venning on January 24, 2006 12:00 PM EST - Posted in GPUs
Image Quality, Feature Tests, and Power
Something we'd like to look at a bit more in-depth for this review is image quality. It's no secret that due to ATI and NVIDIA's differences in rendering graphics, there is always going to be some variation in the look of the graphics from one brand to another. Most times this variation is too subtle to notice, but upon closer inspection, certain patterns tend to emerge.
With Black and White 2, we can see how well the in-game maximum AA does at cleaning up the image. Note how there is a significant difference between the edges in the pictures without AA and with "high" AA enabled by the game. However, we don't see the same kind of difference between the image without AA enabled and the one with maximum quality enabled in the graphics driver. This is a good example of in-game AA doing a much better job, both in quality and performance, than the max quality settings in the control panel. We suspect that Black and White 2 has implemented a custom AA algorithm and has issues running stock MSAA algorithms. For this reason we recommend using Black and White 2's in-game AA instead of the control panel's AA settings.
Both ATI and NVIDIA hardware look great and render similar images, and luckily for ATI there is an upcoming patch that should improve performance.
Battlefield 2 gives us a good view of how the maximum quality settings in the control panel (specifically transparency AA) fix certain graphical problems in games. Fences in particular have a tendency to render inaccurately, especially when looking through them at certain angles. While you can see that the in-game AA without adaptive or transparency AA cleans up a lot of jagged edges (the flag pole for instance), it still has trouble with parts of the fence.
As for power, we ran the multitexturing and pixel shader feature tests in 3DMark06 and measured the maximum power load with our trusty Kill-A-Watt. This measures AC power at the wall, before the PSU, so it reflects total system draw rather than the graphics cards alone.
We can see the CrossFire and SLI systems pull insane amounts of power, but even as a single card the X1900 XTX is a very hungry part.
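For readers who want to turn wall readings like these into a rough per-card figure, subtracting an idle baseline and scaling by PSU efficiency gives a ballpark estimate. This is a minimal sketch: the 80% efficiency figure and the sample wattages are assumptions for illustration, not measurements from this review.

```python
# Rough estimate of a graphics card's draw from wall ("Kill-A-Watt") readings.
# All numbers here are illustrative, not values measured in this review.

def estimate_card_power(load_wall_w, idle_wall_w, psu_efficiency=0.80):
    """Approximate the extra DC power the card drew under load by taking the
    increase in AC draw at the wall and scaling by an assumed PSU efficiency."""
    delta_ac = load_wall_w - idle_wall_w   # extra AC power at the wall
    return delta_ac * psu_efficiency       # approximate DC-side increase

# Hypothetical readings: 410 W under the feature tests, 180 W idle.
print(round(estimate_card_power(410, 180), 1))  # → 184.0
```

The caveat is that load also raises CPU and chipset draw, so this attributes slightly too much power to the card; it is an upper-bound estimate at best.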
120 Comments
blahoink01 - Wednesday, January 25, 2006 - link
Considering the average framerate on a 6800 Ultra at 1600x1200 is a little above 50 fps without AA, I'd say this is a perfectly relevant app to benchmark. I want to know what will run this game at 4x or 6x AA with 8x AF at 1600x1200 at 60+ fps. If you think WOW shouldn't be benchmarked, why use Far Cry, Quake 4 or Day of Defeat? At the very least WOW has a much wider impact as far as customers go. I doubt the total sales for all three games listed above can equal the current number of WOW subscribers.
And your $3000 monitor comment is completely ridiculous. It isn't hard to get a 24-inch widescreen for 800 to 900 bucks. Also, finding a good CRT that can display greater than 1600x1200 isn't hard, and that will run you $400 or so.
DerekWilson - Tuesday, January 24, 2006 - link
we have looked at World of Warcraft in the past, and it is possible we may explore it again in the future.
Phiro - Tuesday, January 24, 2006 - link
"The launch of the X1900 series no only puts ATI back on top, "
Should say:
"The launch of the X1900 series not only puts ATI back on top, "
GTMan - Tuesday, January 24, 2006 - link
That's how Scotty would say it. Beam me up...
DerekWilson - Tuesday, January 24, 2006 - link
thanks, fixed
DrDisconnect - Tuesday, January 24, 2006 - link
It's amusing how the years have changed everyone's perception of what a reasonable price for a component is. Hard drives, memory, monitors and even CPUs have become so cheap that many have lost perspective on what being on the leading edge costs. I paid $750 for a 100 MB drive for my Amiga, $500 for a 4x CD-ROM, and I remember spending $500 on a 720 x 400 Epson colour inkjet. (Yeah, I'm in my 50's.) As long as games continue to challenge the capabilities of video cards and the drive to increase performance continues, the top end will be expensive. Unlike other hardware (printers, memory, hard drives), there are still performance improvements to be made that the user will perceive. If someday a card can render so fast that all games play like reality, then video cards will become like hard drives are now.
finbarqs - Tuesday, January 24, 2006 - link
Everyone gets this wrong! It uses 16 PIXEL-PIPELINES with 48 PIXEL SHADER PROCESSORS in it! The pipelines are STILL THE SAME as the X1800XT: 16!!!!!!!!!! Oh yeah, if you're wondering, in 3DMark 2005 it reached 11,100 on just a single X1900XTX...
DerekWilson - Tuesday, January 24, 2006 - link
semantics -- we are saying the same things with different words.
Fill rate as the main focus of graphics performance is long dead. Doing as much as possible to as many pixels as possible at a time is the most important thing moving forward. Sure, both the X1900 XT and X1800 XT will run glquake at the same speed, but the idea of the pixel (fragment) pipeline is tied more closely to lighting, texturing and coloring than to rasterization.
actually this would all be less ambiguous if opengl were more popular and we had always called pixel shaders fragment shaders ... but that's a whole other issue.
DragonReborn - Tuesday, January 24, 2006 - link
I'd love to see how the noise output compares to the 7800 series...
slatr - Tuesday, January 24, 2006 - link
How about some Lock On: Modern Air Combat tests? I know not everyone plays it, but it would be nice to have you guys run your tests with it, especially when we are shopping for $500-plus video cards.