F.E.A.R.
F.E.A.R. has its own built-in benchmark consisting of a fly-through of different game elements: a firefight between soldiers, an explosion with fire, and a pass over rippling water. We tested the cards with most of the high-end settings enabled, turning off soft shadows as we have in the past. We've found that the performance cost of soft shadows isn't worth it on any card, given how poor the effect looks in this game.
As with all the games, we again tested without antialiasing and with filtering set to "trilinear." Since this is another somewhat fast-paced first-person shooter, we consider an average of no less than 25 fps to be acceptable for playable performance.
This is another game that seems to favor ATI hardware a bit over NVIDIA. We again see the Sparkle 7300 GS Ultra 2 at the bottom of the list, and it's not really playable even at 800x600, just as in Rise of Legends. The Gigabyte X1300 manages to pull off a playable 29 fps at this resolution. The Gigabyte and HIS X1600 Pro cards do very well up to 1280x1024, and for those who don't mind being limited to 1024x768, the X1300 Pro offerings from the same companies will run this game fairly well. The emerging theme in this article, however, is that the 7600 GS performs better than either of these cards (by a margin that depends on the game), and it's hard for ATI to compete with the value of the Gigabyte 7600 GS at current prices.
49 Comments
Josh Venning - Thursday, August 31, 2006 - link
I also forgot to mention that some people use their PCs in home theater systems as well. This is another case where you want as little noise from your computer as possible.
imaheadcase - Thursday, August 31, 2006 - link
That was not always the case; I still use my 9700 Pro, and when the fan went out a year ago it kept working like a charm without it. It was the high-end card of its time, let's hope those days come by again :D
eckre - Thursday, August 31, 2006 - link
What a great review, when tom did their silent VC review, they included a grand total of three cards...pfft. nice job anand. I have the 7600GT, very sweet and 0dB is oh so nice.
Josh Venning - Thursday, August 31, 2006 - link
We just wanted to say thanks to all for your comments; we are still trying to make sure we've caught any errors. (There are actually only 20 cards in the roundup, not 21.) As Derek said, these cards were included in the article because we requested any and all silent cards that the manufacturers were willing to give us to review. That's also why we have more cards from ASUS and Gigabyte than the others.
Olaf van der Spek - Thursday, August 31, 2006 - link
Because the video card industry hasn't introduced a design as bad as the NetBurst architecture.
epsilonparadox - Thursday, August 31, 2006 - link
No, they've introduced worse. When they recommend a second PSU just for graphics, or even a single 1kW PSU, they've taken Intel's lack of thermal control to a whole new level.
DerekWilson - Thursday, August 31, 2006 - link
graphics cards use much much less power in 2D mode than in 3D mode -- and even their 3D power saving capabilities are really good. This is especially true when you consider the amount of processing power a GPU delivers compared to a CPU.
Theoretical peak performance of a current desktop CPU is in the 10-15 GFLOPS range at best. For a GPU, theoretical peak performance is at least one order of magnitude larger reaching up over 200 GFLOPS in high end cases.
I'm not saying we can reach these theoretical peak rates on either a CPU or a GPU, but a GPU is doing much much more work under load than a CPU possibly could.
Keep in mind we aren't even up to 1 GHz on GPU cores. On the CPU front, Intel just shortened the pipeline and decreased clock speeds to save power -- doing more work in one cycle. This is exactly what a GPU does.
And the icing on the cake is the sheer number of options on the silent GPU front. Neither AMD nor Intel makes a fast desktop CPU that can be (easily) passively cooled. These parts are a testament to the efficiency of the GPU.
On the flip side, ATI and NVIDIA push their high end parts way up in clock speed and power consumption trying as hard as possible to gain the performance crown.
There are plenty of reasons GPUs draw more power than a CPU under load, but a lack of thermal control or an inefficient design is not one of them. It's about die size, transistor count, and the total amount of work being done.
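Derek's order-of-magnitude comparison can be sketched with rough peak-FLOPS arithmetic. The clock speeds, unit counts, and FLOPs-per-cycle figures below are illustrative 2006-era assumptions, not measured numbers:

```python
# Theoretical peak = clock (GHz) x parallel units x FLOPs per unit per cycle.
def peak_gflops(clock_ghz, units, flops_per_unit_per_cycle):
    return clock_ghz * units * flops_per_unit_per_cycle

# Hypothetical dual-core desktop CPU: 2.4 GHz, 2 cores, ~2.5 FLOPs/cycle each.
cpu = peak_gflops(2.4, 2, 2.5)    # ~12 GFLOPS

# Hypothetical high-end GPU: 650 MHz, 24 shader units, ~16 FLOPs/cycle each.
gpu = peak_gflops(0.65, 24, 16)   # ~250 GFLOPS

print(f"CPU ~{cpu:.0f} GFLOPS, GPU ~{gpu:.0f} GFLOPS, ~{gpu / cpu:.0f}x apart")
```

Even with generous CPU assumptions, the GPU comes out more than an order of magnitude ahead, which is the point about how much more work a GPU does under load.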
JarredWalton - Saturday, September 2, 2006 - link
I disagree with Derek, at least in some regards. The budget and midrange GPUs generally do a good job of throttling down power requirements in 2D mode. The high-end parts fail miserably in my experience. Sure, they consume a lot less power than they do in 3D mode, but all you have to do is look at the difference between using a Radeon Mobility X1400 and a GeForce Go 7800 in the Dell laptops to see the difference in battery life (http://www.anandtech.com/mobile/showdoc.aspx?i=276...). In 2D mode, graphics chips still consume a ton of power, relatively speaking -- with a lot of that probably going to the memory as well. A lot of this can be blamed on transistor counts and die size, but I certainly think that NVIDIA and ATI could reduce power more. The problem right now is that power use is a secondary consideration, and ATI and NVIDIA both need a paradigm shift similar to what Intel had with the Pentium M. If they put serious resources into designing a fast but much less power-hungry GPU, I'm sure they could cut power draw quite a bit in both idle and load situations.
That's really the crux of the problem though: resources. Neither company has anywhere near the resources that AMD has, let alone the resources that Intel has. Process technology is at least a year behind Intel if not more, chip layouts are mostly computer generated as opposed to being tweaked manually (I think), and none of the companies have really started at square one trying to create a power efficient design; that always seems to be tacked on after-the-fact.
GPUs definitely do a lot of work, although GFLOPS is a terrible measure of performance. The highly parallel nature of 3D rendering does allow you to scale performance very easily, but power requirements also scale almost linearly with performance when using the same architecture. It would be nice to see some balance between performance scaling and power requirements... I am gravely concerned about what Windows Vista is going to do to battery life on laptops, at least if you enable the Aero Glass interface. Faster switching to low-power states (for both memory and GPU) ought to be high on the list for next-generation GPUs.
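Jarred's point about linear scaling can be illustrated with the same kind of back-of-the-envelope math: if both throughput and power grow linearly with the number of shader units, performance per watt stays flat, so widening a chip buys speed but not efficiency. All numbers below are made up for illustration:

```python
# If performance and power both scale linearly with unit count,
# perf/watt is unchanged -- only an architectural change improves it.
def perf_per_watt(gflops, watts):
    return gflops / watts

base   = perf_per_watt(120, 60)    # hypothetical midrange part: 2.0 GFLOPS/W
scaled = perf_per_watt(240, 120)   # same design, twice as wide: still 2.0

print(base == scaled)  # True: doubling units doubled speed and power alike
```

This is why the comment argues for a Pentium M-style redesign rather than more width: within one architecture, scaling up performance drags power along with it.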
DaveLessnau - Thursday, August 31, 2006 - link
I'm wondering why Anandtech tested Asus' EN7800 GT card instead of their EN7600 GT. That card would be more in line with Gigabyte's 7600 GT version and, I believe, is more available than the 7800 version. In the near future, I'd like to buy one of these silent 7600GTs and was hoping this review would help. Oh, well.
DerekWilson - Thursday, August 31, 2006 - link
You can get a really good idea of how it would perform by looking at Gigabyte's card. As I mentioned elsewhere in the comments, we requested all the silent cards manufacturers could provide; if we don't have a card, it's likely because they were unable to get it to us in time for inclusion in this review.