Noise and Heat
Here are the meat and potatoes of our comparison between the 11 GeForce 6600 GT cards today: how quiet and efficient are the cooling solutions attached to the cards? We are about to find out.

Our noise test was done using an SPL meter in a very quiet room. Unfortunately, we haven't yet been able to baffle the walls of the lab with sound-deadening material, and the CPU and PSU fans were still running as well. But in each case, the GPU fan was by far the loudest contributor to the SPL in the room. Thus, the SPL of the GPU fan is the factor that drove the measured SPL of the system.
Our measurement was taken at 1 meter from the caseless computer. Please keep in mind when looking at this graph that everyone perceives sound and decibel levels a little differently. Generally, though, a one dB change in SPL is about the smallest change in volume that is perceivable. Somewhere between a 6 dB and a 10 dB difference, people perceive the volume of a sound to double. That means the Inno3D fan is more than twice as loud as the Galaxy fan. Two newcomers to our labs end up rounding out the top and bottom of our chart.
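As a rough sketch of that rule of thumb, perceived loudness can be modeled as doubling for every fixed number of decibels. The helper below is a simplification we're assuming for illustration, not a rigorous psychoacoustic model; it converts an SPL difference into an approximate loudness ratio:

```python
def loudness_ratio(delta_db, db_per_doubling=10.0):
    """Approximate perceived loudness ratio for an SPL difference,
    using the rule of thumb that every `db_per_doubling` dB
    doubles the perceived volume."""
    return 2.0 ** (delta_db / db_per_doubling)

# A 10 dB gap under the common 10-dB-per-doubling rule:
print(loudness_ratio(10.0))                        # 2.0 -- twice as loud
# The more sensitive 6 dB end of the article's range:
print(loudness_ratio(10.0, db_per_doubling=6.0))   # ~3.17x
```

Under either end of the 6-10 dB range, a gap of 10 dB or more between the loudest and quietest fans works out to at least a doubling of perceived volume.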
The very first thing to notice about our chart is that the top three spots are held by our custom round HSF solutions with no shroud. This is clearly the way to go for quiet cooling.
Chaintech, Palit, and Gigabyte are the quietest of the shrouded solutions, and going by our rule of thumb, the Palit and Gigabyte cards may be just barely audibly louder than the Chaintech card.
The Albatron, Prolink, and XFX cards show no audible difference between them, and all three are very loud. That the Sparkle card measures a little over 1 dB lower than the XFX card is a little surprising: the two use the same cooling solution with a different sticker attached. Of course, you'll remember from the XFX page that XFX appears to have attached a copper plate and a thermal pad to the bottom of the HSF. The fact that Sparkle's solution is more stable (while XFX's springs put tighter pressure on the GPU) could explain the slight difference in sound here.
All of our 6600 GT solutions are fairly quiet. These fans are nothing like the ones on larger cards, and the volume levels are nothing to be concerned about. That said, the Inno3D fan did have a whine to it that we could have done without. It wasn't shrill, but it was clearly higher-pitched than the low drone of the other fans we listened to.
NVIDIA clocks its 6600 GT cores at 300 MHz when not in 3D mode, and since large sections of the chip are not in use, not much power is needed, and less heat is dissipated than if a game were running. But there is still work going on inside the silicon, and the fan is still spinning its heart out.
Running the coolest is the XFX card. That extra copper plate and tension must be doing something for it, and glancing down at the Sparkle card, perhaps we can see the full picture of why XFX decided to forgo the rubber nubs on their HSF.
The Leadtek and Galaxy cards come in second, turning in good showings in both the noise and idle temperature tests.
We have the feeling that the MSI and Prolink cards had their thermal tape or thermal glue seals broken at some point at the factory or during shipping. We know that the seal on the thermal glue on the Gigabyte card was broken, as this card introduced us to the problems of handling 6600 GT solutions that lack good four-corner support under the heatsink. We tried our best to reseat it, but we don't think that these three numbers are representative of what the three companies can offer in terms of cooling. We will see similar numbers in the load temperature graphs as well.
Our heat test consists of running a benchmark over and over and over again on the GPU until we hit a maximum temperature. There are quite a few factors that go into the way a GPU is going to heat up in response to software, and our goal in this test was to push maximum thermal load. Since we are looking at the same architecture, only the particular variance in GPU and the vendor's implementation of the product are factors in the temperature reading we get from the thermal diode. These readings should be directly comparable.
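In pseudocode terms, the procedure amounts to polling the thermal diode while the benchmark loops in the background and stopping once no new maximum appears for a while. The sketch below assumes a hypothetical `read_temp` callable standing in for the driver's diode reading; it is an illustration of the stopping criterion, not the tool we actually used:

```python
import time

def find_max_temp(read_temp, poll_interval=5.0, stable_polls=6):
    """Poll a temperature source while a benchmark loops in the
    background; stop once `stable_polls` consecutive readings fail
    to set a new maximum, and report that maximum in degrees C.
    `read_temp` is a hypothetical callable exposing the diode."""
    max_temp = float("-inf")
    unchanged = 0
    while unchanged < stable_polls:
        t = read_temp()
        if t > max_temp:
            max_temp = t
            unchanged = 0
        else:
            unchanged += 1
        time.sleep(poll_interval)
    return max_temp
```

The key point is that the loop only terminates once the temperature has plateaued, which is why the demo has to run "over and over" rather than for a fixed number of passes.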
We evaluated Doom 3, Half-Life 2, 3DMark05, and UT2K4 as thermal test platforms. We selected resolutions that were not CPU bound while trying very hard not to push memory bandwidth beyond saturation. Looping demos in different levels and different resolutions with different settings while observing temperatures gave us a very good indication of the sweet spot for putting pressure on the GPU in these games, and the winner for the hardest hitting game in the thermal arena is: Half-Life 2.
The settings we used for our 6600 GT test were 1280x1024 with no AA or AF. The quality settings were cranked up. We looped our at_coast_12-rev7 demo until a stable maximum temperature was reached.
We had trouble in the past observing the thermal diode temperature, but this time around, we set up multiple monitors. Our second monitor ran at 640x480x8@60 in order to minimize the framebuffer impact. We kept the driver control panel open to the temperature page on the second monitor while the game ran and observed the temperature fluctuations. We still really want an application from NVIDIA that can record these temperatures over time, as the part heats and cools very rapidly. This would also eliminate any impact from running a second display. The performance impact was minimal, so we don't believe the temperature impact was large either. Of course, that's no excuse for not trying to do things in the optimal way. All we want is an MBM5 plugin; is that too much to ask?
It is somewhat surprising that Galaxy is the leader in handling load temperatures. The fact that it shipped with the lightest overclock of the bunch probably helped a little, but most of the other cards were running at about 68 degrees under load before we overclocked them. The XFX card slips down a few slots from its idle standing on account of its relatively low core overclock, while our core clock speed leader moves up to become the second coolest card in the bunch, making this a very exciting graph.
For such a loud fan, we would have liked to see Inno3D cool the chip a little better under load, but its placement in the pack is still very respectable. The Sparkle card again shows that XFX had some reason behind its design change: the added copper plate really helped, even though some stability was lost.
The Gigabyte (despite my best efforts to repair my damage), MSI, and Prolink cards were all way too hot, even at stock speeds. I actually had to add a clamp and a rubber band to the MSI card to keep it from reaching over 110 degrees C at stock clocks. The problem was that the thermal tape on the RAM had come loose from the heatsink. With the tape no longer holding the heatsink down to both banks of RAM, the two spring pegs alone couldn't keep it from lifting off the top of the GPU. We didn't notice this until we started testing because the HSF had pulled away less than a millimeter. The MSI design is great, and we wish that we could have had an undamaged board. MSI could keep this from happening by putting a spacer between the board and the HSF on the side opposite the RAM, near the PCIe connectors.
84 Comments
1q3er5 - Thursday, December 16, 2004 - link
errr weird how the Albatron, despite its so-called HSF mounting problem, scored so high on all the tests (albeit a bit loud) and didn't get an award! Also looks like LEADTEK changed the design of the board a bit
http://www.leadtek.com/3d_graphic/winfast_a6600_gt...
They added a heatsink on the RAM and you may also notice that the shroud now extends right over the other chips on the card.
miketus - Thursday, December 16, 2004 - link
Hi, has anybody got experience with the Albatron 6600GT for AGP?

geogecko - Monday, December 13, 2004 - link
Personally, I'd be willing to spend the extra $15-20 to get a decent HSF on these cards. Of course, the first one I buy will go in an HTPC, which will all be passively cooled, so the HSF in this case doesn't matter, because I'll just be removing it. However, for my PC, I sure would like a decent quality HSF. It would stink to have a $200 card burn up in your PC because of a $10 HSF setup.
WT - Monday, December 13, 2004 - link
Interesting that GigaByte used a passive HSF on their 6800 card (with great results), but went with a craptastic fan on the 6600GT. I have an MSI 5900 and didn't want to settle for the cheesy MSI 5900XT card's HSF setup, so we are seeing the same thing occur with the 6600GTs .... cut costs by using a cheaper HSF. Excellent article .. I found it answered every question I had left on the GT cards, further convincing me to buy the 6800 series.
DerekWilson - Sunday, December 12, 2004 - link
#49 -- it was a problem with our sample ... the actual issue was not a design flaw, but if the design (of most 6600 GT cards) were different, it might have been possible for our sample to have avoided breakage. That's kind of a complicated way of saying that you should be alright as long as you are careful with the card when you install it.
After it's installed, the way to tell whether you have a problem is to run a 3D game/application in windowed mode. Open display properties and click on the system tab. Hit the Advanced button and select the NVIDIA tab. Select the temperature option, and if you see temperatures of 90 degrees C or higher, you probably have a problem.
If your temp is lower than that, you're fine.
Vico26 - Sunday, December 12, 2004 - link
Derek, was the MSI 6600 GT a broken piece, or is there a problem with the heatsink design? Plz let me know, as I bought the MSI card on the same day as you published the article. Now I am shocked, and I would like to find a solution - a new cooling system? Am I able to install it myself (I'm not a professional)?
Anyway many thanks, I should have waited a day...
DerekWilson - Sunday, December 12, 2004 - link
http://www.gfe.com.hk/news/buy.asp

Nyati13 - Sunday, December 12, 2004 - link
What I'd like to know is where are the Galaxy 6600GTs available? I've looked at some e-tailers that I know of, and searched pricewatch and e-bay, and there aren't any Galaxy cards for sale.

geogecko - Sunday, December 12, 2004 - link
Well, I actually meant to say something in that last post. Anyway, short and sweet - that's the way I like these articles. Who wants to spend more than about 15-30 minutes to find out which card is best for them?
I do think that the HDTV thing could have been looked at, but other than that, it's a great article.
geogecko - Sunday, December 12, 2004 - link