MultiGPU Update: Two-GPU Options in Depth
by Derek Wilson on February 23, 2009 7:30 AM EST - Posted in GPUs
Call of Duty World at War Analysis
As with previous CoD installments, this game tends to favor NVIDIA hardware. The updated World at War graphics engine looks quite good while still delivering strong performance and scalability.
[Benchmark graphs: 1680x1050 / 1920x1200 / 2560x1600]
In this test, even though we disabled the frame rate limit and vsync, single-GPU solutions seem limited to around 60 frames per second. This is part of why we see better-than-linear scaling with more than one GPU in some cases: it's not magic, it's that single-card performance isn't as high as it should be. We don't stop seeing artificial limits on single-GPU performance until 2560x1600.
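To illustrate the effect, here is a minimal sketch (the frame rates below are hypothetical, not our measured data) of how a cap on single-card results inflates the apparent multi-GPU scaling:

# Hypothetical numbers only -- not measured results from this review.
capped_single = 60.0   # single card held near 60 fps by the limiter
true_single = 75.0     # what the card might manage with no limit at all (assumed)
dual_gpu = 130.0       # dual-GPU result, well past the cap (assumed)

apparent_scaling = dual_gpu / capped_single   # ~2.17x, looks better than linear
true_scaling = dual_gpu / true_single         # ~1.73x, the more realistic picture
print(f"apparent: {apparent_scaling:.2f}x, true: {true_scaling:.2f}x")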
[Benchmark graphs: 1680x1050 / 1920x1200 / 2560x1600]
SLI rules this benchmark, with GT200-based parts coming out on top across the board. This game scales very well with multiple GPUs, most of the time coming in at over 80% (the exception is the 9800 GTX+ at 2560x1600). At higher resolutions, the AMD multiGPU options do scale better than their SLI counterparts, but the baseline NVIDIA performance is so much higher that it doesn't make a big practical difference.
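For reference, "scaling" here means the extra performance the second GPU adds over a single card; a quick sketch with made-up numbers:

def scaling_percent(single_fps, multi_fps):
    # Extra performance the additional GPU delivers over one card, in percent.
    return (multi_fps / single_fps - 1.0) * 100.0

# Hypothetical example: 80 fps on one card, 148 fps with two -> 85%, i.e. "over 80% scaling"
print(scaling_percent(80.0, 148.0))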
[Benchmark graphs: 1680x1050 / 1920x1200 / 2560x1600]
In terms of value, the 9800 GTX+ (at today's prices) leads the way in CoD. Though it offers the most frames per second per dollar, it is a good example of the need to weigh both absolute performance and value: it only barely squeaks by as playable at 2560x1600.
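As a rough sketch of how such a value metric works (the prices and frame rates below are placeholders, not the review's data):

# Placeholder numbers only, to show the frames-per-second-per-dollar calculation.
cards = {
    "9800 GTX+": {"fps": 55.0, "price": 130.0},
    "Radeon HD 4870 1GB": {"fps": 70.0, "price": 200.0},
}
for name, data in cards.items():
    print(f"{name}: {data['fps'] / data['price']:.3f} fps per dollar")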
Because we see very good performance across the board, multiple GPUs are not required even for the highest settings at the highest resolution. The only card that isn't quite up to the task at 2560x1600 is the Radeon HD 4850 (though the 4870 512MB and 9800 GTX+ are both borderline).
95 Comments
DerekWilson - Monday, February 23, 2009 - link
It really is a great looking game for an MMO. It's not the most played MMO around, but it is definitely the easiest to test. There is an area near the beginning where the player is alone in the environment, and it's always the same time of day and all that stuff... It takes out some of the factors that make getting consistent data out of other MMOs incredibly difficult. I've never had any real "issues" with it or with the results either; it's been very consistent. It does add value, and it's clear that games can be coded in a way that looks this good and performs like this one, so we feel it's important for getting a better feel for what's out there and what's possible.
IKeelU - Monday, February 23, 2009 - link
Not really a big deal, but could you cut out the offhand game review comments when introducing benchmarks? E.g.: "Crysis Warhead, while not the best game around..." It feels out of place in a hardware analysis.
SiliconDoc - Wednesday, March 18, 2009 - link
And Derek disses Far Cry 2 and Oblivion, where nvidia slaughters ati - then Derek praises Bioshock, where ati has an edge. Derek CAN'T HELP HIMSELF.
SiliconDoc - Wednesday, March 18, 2009 - link
Oh yes, and below, don't forget Age of Conan, which favors the ati card - Derek can't stop drooling all over the place. Then come to CoD, where nvidia once again slaughters - red blood everywhere - and Derek says "do we really need another war game~" or the like.
Derek is red fan central and cannot stop himself.
The0ne - Monday, February 23, 2009 - link
This game is poorly programmed in the first place; does it deserve to even be included in the benchmark tests? Yes, it has the programming necessary for the test, but it's poorly programmed.
IKeelU - Monday, February 23, 2009 - link
The fact that CryEngine 2 is taxing on today's hardware (and that Crytek will no doubt use derivatives of it in future games) makes it very useful in benchmarks. I hope reviewers keep using it. But by all means, feel free to disassemble Crytek's binaries and point out their code's weaknesses. Yeah, I thought not.
poohbear - Wednesday, February 25, 2009 - link
What do you mean they shouldn't include Crysis Warhead??? It's the seminal game for seeing how graphics performance holds up and for getting an idea of how a particular video card will perform in the future. CryEngine 2 is the most advanced graphics engine on the market. If a video card can provide 30 fps on CryEngine at your resolution, then it's good to last you for at least 2 years.
Razorbladehaze - Wednesday, February 25, 2009 - link
Yeah.... NO. I totally disagree with it being the most advanced. It is a decent game engine, especially for benchmarking, but...
In all reality, the STALKER Clear Sky revamped X-Ray engine is far and away more advanced and superior in almost every way. It is about the same or better in regards to taxing the system (low frame rates do not necessarily mean the game is taxing the system). Since these are also used in similar FPS titles, they would make an interesting comparison.
I would really like to see Anand include a Clear Sky bench (there is a premade one available), or swap it in for Crysis or Crysis Warhead. Either way, it's no big deal; many other sites post results with a CS bench that I view all the time.
DerekWilson - Monday, February 23, 2009 - link
I'll take care of it.
Stillglade - Monday, February 23, 2009 - link
I would love to see more info about the 4850 X2 1GB version. For over $50 cheaper, is the 1GB memory enough to compete? Is it worth paying 24% more for the 2GB version that you reviewed here?