GeForce 6800 GS: Where Two Generations Meet

The very successful GeForce 6800 graphics card, based on the 0.11-micron NV42 graphics processor, has a number of indisputable advantages: a simple PCB design, support for advanced technologies like Shader Model 3.0 and HDR, and very low power consumption and heat dissipation achieved thanks to the more advanced tech process. Priced at about $199, this 12-pipelined card would be nearly ideal if it were not for its slow memory, clocked at 350 (700) MHz, which is a very low clock rate by today's standards. So, despite the 256-bit memory bus, it is the memory that prevents the GeForce 6800 from showing its full potential, especially in high resolutions and with full-screen antialiasing enabled. For example, in our PowerColor X800 GT review the 12-pipelined GeForce 6800 was often slower than the 8-pipelined RADEON X800 GT in the "eye candy" test mode, i.e. with full-screen antialiasing and anisotropic filtering enabled, solely because of the sheer difference in memory bandwidth (32GB/s against 22.4GB/s).

The ATI RADEON X800 GT is in fact an answer to the NVIDIA GeForce 6600 GT, while the GeForce 6800 is supposed to fight another market opponent, ATI's RADEON X800 GTO, which has 12 pixel pipelines and GDDR3 memory clocked at 1GHz. So, we suspect it will also have some advantage over the GeForce 6800 in high resolutions, especially with FSAA turned on. Meanwhile, the 0.11-micron NV42 chip is known to have a very high frequency potential and can easily be overclocked from 325 to 450MHz and higher, with its power consumption remaining low.

It was also clear that NVIDIA did not have a product to oppose the yet-unavailable RADEON X1600 XT with. The GeForce 6800 doesn't suit this role for obvious reasons, and it also belongs to a lower price category ($199 against $249). The GeForce 6800 GT could serve that purpose, but the NV40 and NV45 chips are manufactured at IBM facilities and carry a high cost, so NVIDIA doesn't have much room for price adjustment with them.
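To illustrate the bandwidth gap mentioned above, here is a minimal sketch of ours (in Python, not part of the original review) that derives peak memory bandwidth from bus width and effective memory clock; the figures are the ones quoted in the text.

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8          # a 256-bit bus moves 32 bytes per transfer
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# Figures quoted in the review:
print(peak_bandwidth_gb_s(256, 700))    # GeForce 6800:     22.4 GB/s
print(peak_bandwidth_gb_s(256, 1000))   # GeForce 6800 GS:  32.0 GB/s
print(peak_bandwidth_gb_s(128, 1380))   # RADEON X1600 XT: ~22.1 GB/s
```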
Taken all together, these facts called for an obvious solution. The potential of the inexpensive NV42 chip could be exploited to add more vigor to the GeForce 6800 by increasing the GPU clock rate and equipping it with 256MB of faster memory. That's the background behind NVIDIA's new product, the GeForce 6800 GS, which is expected to replace the more expensive GeForce 6800 GT and to challenge the RADEON X1600 XT as well as the RADEON X800 XL. We'll examine this graphics card today to see how power-economical and how fast it is in comparison with competing solutions.

Today's graphics processors are often pin-compatible with their predecessors so that developers don't have to spend time and money on designing new PCBs. An example of this approach is the RADEON X8 series from ATI Technologies: the entire series, from the X800 GT to the X850 XT Platinum Edition, uses unified PCBs with minor variations. NVIDIA's GeForce 7800 and GeForce 6800 GPUs are also physically compatible, so the company didn't have to develop the PCB for the new product from scratch. It already had the GeForce 7800 GT printed circuit board. Simple (i.e. cheap) and rather compact, it can normally power a 20-pipelined GPU working at 400MHz and memory clocked at 1GHz, so it was sure to suffice for the 12-pipelined NV42, even clocked at a high frequency. As you have already guessed, that PCB became the foundation of the GeForce 6800 GS graphics card: if it were not for the cooler and the connectors (DVI-I and D-Sub), you couldn't tell the new card from a GeForce 7800 GT.

NVIDIA abandoned the efficient but noisy cooling system of the GeForce 7800 GT and equipped the reference sample of the GeForce 6800 GS with the cooler from the GeForce 6800 GT. In brief, the cooler consists of three parts: a plastic casing with a blower, a GPU heatsink, and an L-shaped heatsink with a heat pipe for cooling the memory chips. The L-shaped heatsink on our card was electrochemically blackened for better heat transfer, while in the original cooler it had been simply painted black. The pipe transfers heat to the heatsink section that is blown by the fan. This way the GDDR3 memory clocked at 500 (1000) MHz is cooled effectively. For comparison, the memory chips on RADEON X800 GTO and RADEON X1600 XT cards are not cooled at all.

The GPU heatsink is made of copper, although NVIDIA had earlier used all-aluminum heatsinks of the same shape. It was ASUS Computer that first employed a copper heatsink on a GeForce 6800, in its unique V9999 Gamer Edition graphics card, which had the same technical characteristics as today's GeForce 6800 GS but worked on the AGP platform. Generally speaking, copper has better heat conductivity but worse heat capacity than aluminum. In other words, a copper heatsink takes heat off the GPU faster and more effectively, but requires a stronger airflow. The blower installed in the GeForce 6800 GS cooling system can create this airflow, but its noise characteristics may be too much for a sensitive ear; we'll talk about that in the next section of the review. The main heatsink is fastened to the PCB with four spring-loaded screws, so proper contact with the GPU is guaranteed.

The fan speed control system lacks the feedback found on graphics cards from ATI Technologies, but it can work in three fixed-speed modes: 2D, Low Power 3D and Performance 3D. The speeds in these modes can be adjusted to some extent with the RivaTuner utility, so you can reduce the noise from the card a little in a particular mode.
Typical dark-gray thermal paste with low thermal resistance is employed as the thermal interface between the heatsink's sole and the GPU die. The memory chips touch the heatsink through cloth pads soaked in white thermal paste, as on other NVIDIA products. The cooling system is satisfactory overall. It used to cope nicely with 0.13-micron 16-pipelined NV40/45 chips, so it should handle a 0.11-micron 12-pipelined NV42 as well.

We removed the memory heatsink to check the marking. NVIDIA employed popular K4J55323QF-GC20 GDDR3 memory chips from Samsung here. They have 256Mb capacity, 2.0V voltage and 2.0ns access time, which means they are rated to work at 500 (1000) MHz. We then took off the main heatsink to find an ordinary NV42 chip, which we described quite a while ago in our MSI NX6800 review. As you remember, this GPU has 12 pixel and 5 vertex processors and can work at frequencies over 400MHz. The GPU frequency is declared to be 425MHz in the GeForce 6800 GS specification, which is 100MHz above the frequency of the GeForce 6800. Combined with the 500 (1000) MHz memory frequency, this should ensure a tremendous performance gain: the GeForce 6800 GS seems to have much more raw power than the GeForce 6800.

The card is not equipped with a VIVO chip, although there is a place for it on the PCB. NVIDIA probably tried to reduce the cost of the product as much as possible, but some graphics card manufacturers will likely equip their versions of the GeForce 6800 GS with an appropriate chip from Philips. The GeForce 6800 GS fully supports NVIDIA's SLI technology. Unfortunately, we couldn't test it in this mode since we had only one sample of the GeForce 6800 GS on our hands, but as soon as we get a second one, we'll tell you how fast such SLI configurations can be.

Power consumption has become a crucial parameter of any modern graphics card, and we of course checked how much juice NVIDIA's new product needs. Officially, the GeForce 6800 GS can eat up to 70 watts at maximum, but this is a purely theoretical number, hardly ever reached under real conditions. We checked the power consumption of the card on a special testbed configured as follows:

>Intel Pentium 4 560 processor (3.60GHz, 1MB L2 cache)
>Intel Desktop Board D925XCV
>PC4300 DDR2 SDRAM (2 x 512MB)
>Samsung SpinPoint SP1213C hard disk drive (Serial ATA-150, 8MB buffer)
>Microsoft Windows XP Pro SP2, DirectX 9.0c

The GPU was put under load by running the third 3DMark05 subtest in a loop at 1600x1200 resolution with 4x FSAA and 16x anisotropic filtering enabled. We performed the measurements with a Velleman DVM850BL digital multimeter (its 0.5% accuracy suits our purpose well). And here are the results: the high frequencies and the 256 megabytes of onboard memory must be fed amply. The peak power consumption of the GeForce 6800 GS is high, closely approaching that of the GeForce 6800 GT. Even the 0.11-micron tech process couldn't help the new graphics card much. The RADEON X1600 XT consumes 13 watts less, but it has a 0.09-micron GPU, only 4 texture-mapping units, and 4 memory chips (the GeForce 6800 GS carries eight memory chips). Curiously, the new card almost does not use the 3.3V power line, consuming a mere 0.1W from it. This is a characteristic feature of the PCB and power circuit of the GeForce 7800 GT that we mentioned in our NVIDIA GeForce 7800 GT review. Overall, our expectations about the power consumption of the GeForce 6800 GS came true. The card requires about 55W under load, i.e. about as much as the GeForce 6800 GT needs.
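As a rough illustration of how such rail-by-rail measurements add up (this is our own sketch, not the review's measurement procedure), total board power is simply the sum of voltage times current over the external 12V connector, the slot's 12V line and the slot's 3.3V line. The current values below are made up purely for the arithmetic; only the negligible 3.3V draw and the ~55W total correspond to figures quoted in the text.

```python
# Hypothetical example: board power as the sum of V*I over the supply lines.
# The amp readings here are illustrative placeholders, not measured values.
rails = {
    "external 12V connector": (12.0, 3.6),   # (volts, amps) -> 43.2 W
    "PCI Express slot 12V":   (12.0, 1.0),   # 12.0 W
    "PCI Express slot 3.3V":  (3.3, 0.03),   # ~0.1 W, matching the tiny 3.3V draw noted above
}

total_w = sum(volts * amps for volts, amps in rails.values())
print(f"Total board power: {total_w:.1f} W")  # ~55.3 W in this made-up example
```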
The new card is just a little worse than the RADEON X800 XL in this parameter, and noticeably worse than the newer RADEON X1600 XT. On the other hand, the power consumption remains at a reasonable level, not even exceeding 60 watts. Theoretically, the card could do without an external power source altogether and feed from the PCI Express slot alone, as the ATI RADEON X800 XL with its slightly lower power consumption does, for example. Most likely, this was not implemented for technical reasons: the power circuit borrowed from the GeForce 7800 GT probably requires an external power source. This should positively affect the stability of operation and the overclockability of the GeForce 6800 GS, by the way.

Noise, Overclocking, 2D Quality

Our GeForce 6800 GS being an engineering sample, its cooler could only work in two modes. The fan rotates at full speed and is rather loud until the OS and the ForceWare driver are loaded. After that the fan speed goes down and its noise diminishes, though it does not vanish completely. The fact is, the fan-control system on our sample of the card was programmed in such a way that the fan speed was constant in all the modes (2D, Low Power 3D and Performance 3D). RivaTuner agrees with us: the sliders stand at 53% for all three modes. Of course, the speed-control system will be set up properly in the final revision of the card, and its acoustic characteristics should then be like those of the GeForce 6800 GT, which is not silent, but is not annoyingly loud either. Moreover, some graphics card makers will surely equip their versions of the GeForce 6800 GS with quieter coolers.

We were much pleased with the overclockability of our sample of the card. We easily overclocked the GPU from the default 425MHz to 500MHz. The cooling system probably prevented it from going higher (the GPU was stable even at 540MHz, but there were some visual artifacts on the screen). The real surprise, however, was that the memory could work at 680 (1360) MHz, a fantastic achievement for 2.0ns memory chips! We know of many cases when Samsung's K4J55323QF-GC20 could work at 550-600 (1100-1200) MHz, but not higher. Could NVIDIA have hand-picked chips especially for the engineering sample of the GeForce 6800 GS, or did they simply increase the memory voltage? Whatever the case, we could not be mistaken: the monitoring module of RivaTuner reported the frequency gain, this gain resulted in an appropriate performance gain in tests, and the card was absolutely stable all the while.

You can rarely meet a graphics card today that provides anything other than excellent image quality in 2D applications. The new product from NVIDIA was no exception, giving out a crystal-sharp picture in all display modes supported by our monitors. We observed no fuzziness, ghosting or any other undesired effects. The next section of the review deals with the performance of the new graphics card in games.
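For reference, here is a quick calculation of ours (not part of the original text) of the relative clock gains achieved during this overclocking session:

```python
def gain_percent(default_mhz: float, overclocked_mhz: float) -> float:
    """Relative clock increase in percent."""
    return (overclocked_mhz / default_mhz - 1) * 100

print(f"GPU:    {gain_percent(425, 500):.1f}%")    # ~17.6%
print(f"Memory: {gain_percent(1000, 1360):.1f}%")  # 36.0%
```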
Testbed and Methods

>AMD Athlon 64 4000+ CPU (2.40GHz, 1MB L2 cache)
>ASUS A8N-SLI Deluxe mainboard (NVIDIA nForce4 SLI chipset)
>OCZ PC3200 Platinum EL DDR SDRAM (2 x 1GB, CL2-3-2-5)
>Samsung SpinPoint SP1213C (Serial ATA-150, 8MB buffer)
>Creative SoundBlaster Audigy 2 sound card
>Cooler Master Real Power 450 power supply (RS-450-ACLY, 450W)
>Dell P1130 and Dell P1110 monitors (21”, 1800x1440@75Hz max display mode)
>Microsoft Windows XP Pro SP2 with DirectX 9.0c
>ATI Catalyst 5.9 (with RADEON X1000 support)
>NVIDIA ForceWare 81.87 (according to NVIDIA, this version is an analog of the WHQL-certified ForceWare 81.85, but adds support for the GeForce 6800 GS and some other new products)

We set up the ATI and NVIDIA drivers in the following way:

ATI Catalyst 5.9:
>CATALYST A.I.: Standard
>Mipmap Detail Level: Quality
>Wait for vertical refresh: Always off
>Adaptive antialiasing: Off
>Temporal antialiasing: Off
>Quality AF: Off
>Other settings: default

NVIDIA ForceWare 81.87:
>Image Settings: Quality
>Vertical sync: Off
>Trilinear optimization: On
>Anisotropic mip filter optimization: Off
>Anisotropic sample optimization: On
>Gamma correct antialiasing: On (for GeForce 7 only)
>Transparency antialiasing: Off (for GeForce 7 only)
>Other settings: default

We select the highest graphics quality settings in each game, identical for graphics cards from ATI and NVIDIA. Where possible, we use the games' integrated benchmarking tools (to record and replay a demo and measure the playback speed in frames per second); otherwise we measure the frame rate with the FRAPS utility. Whenever possible, we record minimal as well as average fps rates to give you a fuller picture. We turn on 4x full-screen antialiasing and 16x anisotropic filtering in the "eye candy" test mode from the game's own menu if possible; otherwise we force the necessary mode from the driver. We don't test the "eye candy" mode if the game engine doesn't support FSAA.

Besides the NVIDIA GeForce 6800 GS, the following graphics cards took part in this test session:

>GeForce 6800 GT (NV45, 350/1000MHz, 16pp, 6vp, 256-bit, 256MB). For details see our article Leadtek WinFast PX6800 GT TDH Graphics Card: Overclocker's Dream
>GeForce 6800 (NV42, 325/600MHz, 12pp, 5vp, 256-bit, 128MB). For details see our article MSI NX6800-TD128E (GeForce 6800) in SLI-Configurations: High Speed at a Reasonable Price?
>RADEON X1600 XT (RV530, 590/1380MHz, 12pp, 5vp, 128-bit, 256MB). For details see our article ATI RADEON X1600 XT: Mainstream Performance Redefined Once Again?
>RADEON X800 XL (R430, 400/980MHz, 16pp, 6vp, 256-bit, 256MB). For details see our article ATI's 0.11 Micron Chip on the Run: ATI RADEON X800 XL Review

These games and applications were used as benchmarks:

First-Person 3D Shooters:
>Battlefield 2
>The Chronicles of Riddick
>Doom 3
>Far Cry
>F.E.A.R.
>Half-Life 2
>Pariah
>Project Snowblind
>Quake 4
>Serious Sam 2
>Unreal Tournament 2004

Third-Person 3D Shooters:
>Prince of Persia: Warrior Within
>Splinter Cell: Chaos Theory

Simulators:
>Colin McRae Rally 2005
>Pacific Fighters

Strategies:
>Age of Empires 3
>Warhammer 40,000: Dawn of War

Semi-synthetic benchmarks:
>Aquamark3
>Final Fantasy XI Official Benchmark 3

Synthetic benchmarks:
>Futuremark 3DMark03 build 360
>Futuremark 3DMark05 build 120

Performance in First-Person 3D Shooters

Battlefield 2

The GeForce 6800 GS is generally as fast as the GeForce 6800 GT (perhaps a little slower in high resolutions without FSAA, since the new product's fill rate is lower).
The RADEON X1600 XT is slower than the GeForce 6800 GS in almost every resolution, being limited in speed by its four TMUs. The overclocking gain we managed to get from our GeForce 6800 GS is proportional to the impressive frequency gain and amounts to 30% in some cases.

Conclusion

So, is the new graphics card from NVIDIA a worthy successor to the passing-away GeForce 6800 GT, and is it a worthy rival to the new ATI RADEON X1600 XT? We can answer both questions in the affirmative now that we've seen the GeForce 6800 GS in action. The new member of the GeForce 6800 family turned out to be faster than the older one, sometimes by as much as 30-40%! The injection of megahertz steroids into the GeForce 6800 architecture brings about the desired effect.

In most games and applications the GeForce 6800 GS leaves no chance to the RADEON X1600 XT, a not-yet-available product from ATI Technologies. In a few cases the two graphics cards have the same speed, and the RADEON X1600 XT won in 3DMark05, but this is hardly an achievement. Considering the identical price of these two graphics cards (about $250), the GeForce 6800 GS enjoys a complete victory, making the RADEON X1600 XT a much less appealing product for gamers. It is yet too early to make any final verdicts, since the latter device obviously suffers from insufficient driver optimization, but the general trend is unfavorable for ATI Technologies. The RV530 graphics processor has a performance-hampering feature: it has only four texture-mapping units, which automatically become a bottleneck in games that operate with high-resolution textures (a quick fill-rate estimate is given below). Another drawback is the 128-bit memory bus. Working at 690 (1380) MHz, it provides a maximum bandwidth of 22.1GB/s, while the 256-bit memory bus of the GeForce 6800 GS clocked at 500 (1000) MHz easily provides 32GB/s and positively affects the performance of the card in high resolutions and/or with full-screen antialiasing enabled. If the price of the RADEON X1600 XT remains the same, it will not stand a chance against the GeForce 6800 GS, except in applications where low power consumption and advanced video support are crucial. Note also that the GeForce 6800 GS is already shipping, while the RADEON X1600 XT will be coming to market in mass quantities only starting from November 30. This is a definite advantage for NVIDIA ahead of the imminent Christmas sales season.

As for the ATI RADEON X800 XL, the GeForce 6800 GS is slower in some cases, mostly when FSAA and anisotropic filtering are turned on. But each time, the new card from NVIDIA could overtake the X800 XL through overclocking. Moreover, the RADEON X800 XL does not support Shader Model 3.0 and HDR, which may be important features for PC gamers.

The excellent overclockability of the GeForce 6800 GS needs to be mentioned, too. The combination of the GeForce 7800 GT printed circuit board with the 0.11-micron NV42 chip proved to be very overclocker-friendly. We managed to speed up our sample of the card from the default 425/1000MHz to an impressive 500/1360MHz. The ensuing performance gain lifted the speed of the card to the level of the more expensive GeForce 6800 Ultra, and this is probably not the limit: we could have achieved higher frequencies if we had replaced the cooling system with something more efficient. Thus, the GeForce 6800 GS is not just a high-performance solution at a relatively low price. It may become a sensation among overclockers!
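To put the texture-sampling argument above into rough numbers, here is a small sketch of ours (not from the original review) estimating peak texel fill rate as TMU count times core clock, using the unit counts and frequencies quoted in the article:

```python
def texel_fill_rate_mtexel_s(tmus: int, core_clock_mhz: float) -> float:
    """Peak texel fill rate: one texture sample per TMU per clock."""
    return tmus * core_clock_mhz

print(texel_fill_rate_mtexel_s(12, 425))  # GeForce 6800 GS: 5100 Mtexel/s
print(texel_fill_rate_mtexel_s(12, 325))  # GeForce 6800:    3900 Mtexel/s
print(texel_fill_rate_mtexel_s(4, 590))   # RADEON X1600 XT: 2360 Mtexel/s
```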
It's not certain yet how much room for price adjustment NVIDIA has: the 6800 GS PCB is obviously cheaper than the PCB employed in the GeForce 6800 GT, but more expensive than that of the GeForce 6800. Theoretically, the GeForce 6800 GS has some reserve for further price reduction, and considering that the NV42 is produced in mass quantities and the chip yield is high, the price of GeForce 6800 GS graphics cards may go down by a few dozen dollars within a few weeks of the release. On the other hand, NVIDIA may not want to reduce the price because of the virtual lack of competition.

So, a highly appealing product has emerged in between the GeForce 6 and GeForce 7 series. It is probably not destined to live long, because NVIDIA is already preparing the G72 GPU, a mass-market version of the G70, but the GeForce 6800 GS will surely fulfill its purpose: to be the killer product in NVIDIA's lineup during the Christmas sales season.

Highs:
>High performance
>Efficient cooling system
>Shader Model 3.0
>HDR support
>Hardware HD content playback acceleration
>HDTV support
>Excellent overclockability
>256-bit memory bus
>SLI-ready

Lows:
>Noticeably loud
>Rather high power consumption