Initially, NVIDIA used the G80 in two video cards: the GeForce 8800 GTX and the GeForce 8800 GTS.

GeForce 8800 Series Specifications (GeForce 8800 GTX / GeForce 8800 GTS)
Number of transistors: 681 million / 681 million
Core clock (including dispatcher, texture units, ROPs): 575 MHz / 500 MHz
Shader (stream processor) clock: 1350 MHz / 1200 MHz
Number of stream processors: 128 / 96
Memory clock: 900 MHz (1.8 GHz effective) / 800 MHz (1.6 GHz effective)
Memory interface: 384-bit / 320-bit
Memory bandwidth: 86.4 GB/s / 64 GB/s
Number of ROPs: 24 / 20
Memory size: 768 MB / 640 MB
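
For reference, the bandwidth figures in the table follow directly from the bus width and the effective memory clock. A minimal sketch of that arithmetic in Python:

```python
# Memory bandwidth = (bus width in bytes) x (effective memory clock)
def mem_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    return (bus_width_bits / 8) * effective_clock_mhz / 1000  # GB/s

print(mem_bandwidth_gb_s(384, 1800))  # GeForce 8800 GTX -> 86.4
print(mem_bandwidth_gb_s(320, 1600))  # GeForce 8800 GTS -> 64.0
```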

As you can see, the transistor counts of the GeForce 8800 GTX and 8800 GTS are the same, because both cards use the identical G80 GPU. As already mentioned, the main difference between these GPU configurations is two disabled banks of stream processors, 32 shaders in total, which reduces the number of active shader units from 128 on the GeForce 8800 GTX to 96 on the GeForce 8800 GTS. NVIDIA also disabled one ROP partition, cutting the ROP count from 24 to 20.

The core and memory frequencies of the two cards also differ slightly: the GeForce 8800 GTX core runs at 575 MHz versus 500 MHz for the GeForce 8800 GTS. The GTX's shader units operate at 1350 MHz, the GTS's at 1200 MHz. The GeForce 8800 GTS also uses a narrower 320-bit memory interface and 640 MB of slower memory running at 800 MHz, whereas the GeForce 8800 GTX has a 384-bit memory interface and 768 MB of memory at 900 MHz. And, of course, a completely different price.

The video cards themselves are very different:


As you can see in these photos, the GeForce 8800 reference boards are black (a first for NVIDIA). With their cooling modules installed, both the GeForce 8800 GTX and 8800 GTS occupy two slots. The GeForce 8800 GTX is slightly longer than the GeForce 8800 GTS: 267 mm versus 229 mm. As previously stated, the GeForce 8800 GTX also has two PCIe power connectors. Why two? The maximum power consumption of the GeForce 8800 GTX is 177 W. However, NVIDIA says this is an extreme case that occurs only when all functional units of the GPU are loaded to the maximum; in normal games during testing the card consumed 116-120 W on average, 145 W at most.

Since each external PCIe power connector on the video card is rated for a maximum of 75 W, and the PCIe slot itself also supplies at most 75 W, a single external connector plus the slot would deliver only 150 W - not enough for 177 W - so a second external PCIe power connector had to be added. With two connectors, NVIDIA gives the 8800 GTX solid headroom. The maximum power consumption of the 8800 GTS, by the way, is 147 W, so it gets by with a single PCIe power connector.
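
To put the connector arithmetic in numbers, here is a quick sketch using only the 75 W ratings mentioned above:

```python
# PCIe power budget of the GeForce 8800 GTX, using the 75 W ratings quoted above
SLOT_W = 75        # PCI Express x16 slot
CONNECTOR_W = 75   # each external 6-pin PCIe connector

def available_power(external_connectors: int) -> int:
    return SLOT_W + external_connectors * CONNECTOR_W

peak_draw = 177  # W, maximum quoted for the GeForce 8800 GTX
print(available_power(1), available_power(1) >= peak_draw)  # 150 False -> not enough
print(available_power(2), available_power(2) >= peak_draw)  # 225 True  -> solid headroom
```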

Another feature of the GeForce 8800 GTX reference board design is a second SLI connector, a first for NVIDIA GPUs. NVIDIA does not officially state the purpose of the second SLI connector, but journalists managed to get the following from the developers: "The second SLI connector on the GeForce 8800 GTX provides hardware support for possible expanded SLI configurations. Only one SLI connector is used with current drivers. Users can attach the SLI bridge to either the first or the second set of contacts."

Based on this, and on the fact that nForce 680i SLI motherboards come with three PCI Express graphics (PEG) slots, we can conclude that NVIDIA plans to support three-way SLI in the near future. Another possibility would be a dedicated card for SLI physics, but that does not explain why the GeForce 8800 GTS lacks a second SLI connector.

It can be assumed that NVIDIA reserves its GX2 “Quad SLI” technology for the less powerful GeForce 8800 GTS, while the more powerful GeForce 8800 GTX will operate in a triple SLI configuration.

If you remember, NVIDIA's original Quad SLI graphics cards are closer in performance to the GeForce 7900 GT than to the GeForce 7900 GTX, since the 7900 GT has lower power consumption and heat output. It is natural to assume that NVIDIA will follow the same path with the GeForce 8800: gamers with three-PEG-slot motherboards will be able to speed up the graphics subsystem by building a triple-SLI 8800 GTX configuration, which, judging by the 8800 GTS specifications, will in some cases outperform a Quad SLI system.

Again, this is just a guess.

The cooler of the GeForce 8800 GTS and 8800 GTX is a dual-slot, ducted design that exhausts hot air from the GPU outside the computer case. It consists of a large aluminum heatsink, copper and aluminum heat pipes, and a copper plate pressed against the GPU. The whole assembly is blown through by a large radial fan, which looks a little intimidating but is actually quite quiet. The 8800 GTX cooling system is similar to that of the 8800 GTS, only with a slightly longer heatsink.


In general, the new cooler handles the GPU quite well while remaining nearly silent, much like the coolers of the GeForce 7900 GTX and 7800 GTX 512MB, although the GeForce 8800 GTS and 8800 GTX are slightly more audible. In some cases you have to listen closely to hear the fan at all.

Production

All GeForce 8800 GTX and 8800 GTS cards are produced under contract for NVIDIA. This means that whether you buy a card from ASUS, EVGA, PNY, XFX or any other manufacturer, they all come from the same factory. NVIDIA does not even allow vendors to overclock the first batches of GeForce 8800 GTX and GTS cards: they all go on sale at the same clock speeds regardless of brand. Vendors are, however, allowed to install their own cooling systems.

For example, EVGA has already released its e-GeForce 8800 GTX ACS3 Edition with a unique ACS3 cooler. The ACS3 card is enclosed in a single large aluminum cocoon emblazoned with the letters E-V-G-A. For additional cooling, EVGA placed an extra heatsink on the back of the card, directly opposite the G80 GPU.

Beyond cooling, manufacturers of the first GeForce 8800 cards can differentiate their products only through warranty terms and the bundle of games and accessories. For example, EVGA bundles its cards with Dark Messiah, while the BFG GeForce 8800 GTS comes with a BFG T-shirt and a mouse pad.

It will be interesting to see what happens next - many NVIDIA partners believe that for future releases of GeForce 8800 video cards, NVIDIA's restrictions will not be so strict, and they will be able to compete in overclocking.

Since all the cards come off the same line, every GeForce 8800 has two dual-link DVI connectors and HDCP support. In addition, it became known that NVIDIA does not plan to change the memory sizes of the GeForce 8800 GTX and GTS (for example, a 256 MB GeForce 8800 GTS or a 512 MB 8800 GTX). At least for now, the standard configuration is 768 MB for the GeForce 8800 GTX and 640 MB for the GeForce 8800 GTS. NVIDIA also has no plans for an AGP version of the GeForce 8800 GTX/GTS.

Driver for 8800

NVIDIA has made a few changes to the GeForce 8800 driver that are worth mentioning. First of all, the traditional Coolbits overclocking option has been removed and replaced by NVIDIA nTune: if you want to overclock a GeForce 8800, you will need to download the nTune utility. This is probably good news for owners of nForce-based motherboards, since nTune can be used not only to overclock the video card but also to configure the system. Everyone else - say, those who have upgraded to Core 2 on a 975X or P965 motherboard - will have to download a 30 MB application just to overclock the video card.

Another change we noticed in the new driver is that there is no longer an option to switch to the classic NVIDIA control panel. Hopefully NVIDIA will bring this feature back, as the old panel was well liked by many, unlike the new control panel interface.

For more than a year after the release of video cards based on the NVIDIA GeForce 8800 chips, the situation on the graphics accelerator market remained extremely unfavorable for the buyer. An overclocker willing to pay a tidy sum for a top-end video card simply had no alternative. The competitor from ATI (AMD) appeared later and ultimately could not keep up with the GeForce 8800 GTX, let alone the subsequent GeForce 8800 Ultra, so there was no need to cut prices on these cards. As a result, throughout this period the prices of the GeForce 8800 GTX and Ultra stayed at the same very high level, and only a few could afford such video cards.

However, the top price segment has never been the defining or priority one for makers of graphics chips and video cards. Leadership in this class is certainly prestigious for any company, but from an economic point of view the mid-range is the most profitable. Yet, as recent tests of the mid-range contenders AMD Radeon HD 3850 and 3870 have shown, the performance of such cards is unsatisfactory in modern games and essentially unacceptable in quality modes. The NVIDIA GeForce 8800 GT is faster than this pair, but it also falls short of comfortable performance in DirectX 10 games. What comes next, if one is able to pay extra? Until yesterday there was essentially nothing, since in terms of price there is literally an abyss between the GT and the GTX.

But technical progress does not stand still: the new NVIDIA G92 chip, manufactured on a 65-nm process, allowed the company not only to attract overclockers with the quite successful GeForce 8800 GT, but also, yesterday, December 11 at 17:00 Moscow time, to announce a new product - the GeForce 8800 GTS 512 MB. Despite the unassuming name, this graphics accelerator differs significantly from the regular GeForce 8800 GTS. In today's material we will get acquainted with one of the first GeForce 8800 GTS 512 MB cards to appear on the Russian market, check its temperatures and overclocking potential and, of course, study the newcomer's performance.


1. Specifications of video cards participating in testing

The technical characteristics of the newcomer are presented in the following table alongside those of NVIDIA's GeForce 8800 family:

Specifications (columns: GeForce 8800 GT | GeForce 8800 GTS | GeForce 8800 GTS 512 MB | GeForce 8800 GTX / Ultra)
GPU: G92 (TSMC) | G80 (TSMC) | G92 (TSMC) | G80 (TSMC)
Process technology, nm: 65 (low-k) | 90 (low-k) | 65 (low-k) | 90 (low-k)
Die area, sq. mm: 330 | 484 | 330 | 484
Number of transistors, million: 754 | 681 | 754 | 681
GPU clock, MHz (shader clock): 600 (1512) | 513 (1188) | 650 (1625) | 575 / 612 (1350 / 1500)
Effective video memory clock, MHz: 1800 | 1584 | 1940 | 1800 / 2160
Memory size, MB: 256 / 512 | 320 / 640 | 512 | 768
Supported memory type: GDDR3 for all models
Memory bus width, bit: 256 (4 x 64) | 320 | 256 (4 x 64) | 384
Interface: PCI Express x16 (v2.0) | PCI Express x16 (v1.x) | PCI Express x16 (v2.0) | PCI Express x16 (v1.x)
Unified shader processors: 112 | 96 | 128 | 128
Texture units: 56 (28) | 24 | 64 (32) | 32
Rasterization units (ROPs): 16 | 20 | 16 | 24
Pixel / vertex shader model: 4.0 / 4.0 for all models
Video memory bandwidth, GB/s: ~57.6 | ~61.9 | ~62.1 | ~86.4 / ~103.7
Theoretical maximum fill rate, Gpix/s: ~9.6 | ~10.3 | ~10.4 | ~13.8 / ~14.7
Theoretical maximum texture sampling rate, Gtex/s: ~33.6 | ~24.0 | ~41.6 | ~36.8 / ~39.2
Peak power consumption in 3D mode, W: ~106 | - | - | ~180
Recommended power supply, W: ~400 | ~400 | ~400 | ~450 / ~550
Reference card dimensions, mm (L x H x T): 220 x 100 x 15 | 228 x 100 x 39 | 220 x 100 x 32 | 270 x 100 x 38
Outputs: 2 x DVI-I (dual-link), TV-Out, HDTV-Out, HDCP | 2 x DVI-I (dual-link), TV-Out, HDTV-Out | 2 x DVI-I (dual-link), TV-Out, HDTV-Out, HDCP | 2 x DVI-I (dual-link), TV-Out, HDTV-Out
Additionally: SLI support for all models
Recommended price, USD: 199 / 249 | 349 ~ 399 | 299 ~ 349 | 499 ~ 599 / 699

2. Review of BFG GeForce 8800 GTS 512 MB OC (BFGR88512GTSE)

The latest video card from a company well known to overclockers comes in a very compact box, decorated in dark colors.

Comparative testing of four video cards: GeForce 8800GTS 512 and 8800GT

Let's take a look at the GeForce 8800GTS 512 boards and compare them with the cheaper GeForce 8800GT and the veteran GeForce 8800GTX. Along the way, we will break in a new test bench and collect DirectX 10 driver flaws.

With the release of the new GeForce 8800GTS 512 series, NVIDIA has significantly strengthened its position. The new product replaced the more expensive, hotter and bulkier GeForce 8800GTX, and its only drawback compared to its predecessor is the narrower 256-bit memory bus (versus 384 bits). The newcomer received not only cuts, however, but also some improvements: the number of texture units was increased from 32 to 64, which partly compensates for the simplifications, the clock frequencies were raised relative to the predecessor, and the amount of video memory is easily expanded to 1 GB simply by installing larger-capacity chips, which some manufacturers have already begun to do. Yet even though the GeForce 8800GTS 512 replaced the GeForce 8800GTX, its main competitor is not its predecessor but its closest relative, the GeForce 8800GT, and the whole point is the latter's lower price. The GeForce 8800GTS 512 and GeForce 8800GT differ little from each other, since the GeForce 8800GT is a cut-down version of the GeForce 8800GTS 512 that, oddly enough, reached the market before the full-fledged version. Both video cards carry 512 MB of video memory and, as today's study showed, identical memory chips. The main differences lie in the GPU: in the GT version some of its functional blocks are disabled. More details are given in the table below:

As you can see, the GeForce 8800GT differs from its older sister in having its universal processors cut to 112 and its texture units cut to 56. The cards also initially differ in clock speeds, but this does not matter for today's review, since almost all the cards have been factory overclocked. Let's find out how the differences on paper translate into reality.

Leadtek 8800GTS 512

The designers from Leadtek chose a bright orange color to draw attention to their video card, and they were absolutely right: the novelty will not go unnoticed.
The front of the box shows a scene from a fictional shooter, beneath which are the card's technical characteristics and a note about the bonus - a full version of Neverwinter Nights 2.
The reverse side of the box lists the card's characteristics, the package contents and standard information from NVIDIA. The package includes:
  • S-video > S-video + component out splitter;
  • DVI > D-sub adapter;
  • CD with drivers;
  • CD with Power DVD 7;

The Leadtek 8800GTS 512 is based on the reference design familiar from GeForce 8800GT boards. Outwardly, the newcomer is distinguished by its "two-story" cooling system which, unlike its predecessor's, exhausts hot air out of the computer case. The advantages of such a solution are obvious, and the reason for using an improved cooling system is most likely not that the "new" chip runs hotter, but that a buyer paying more money has every right to a better product. To be honest, the GeForce 8800GT reference cooler does not cope with its duties particularly well.
The reverse sides of the GeForce 8800GTS 512 and GeForce 8800GT look almost the same, the difference being that the 8800GTS 512 has all of its components mounted. We will see the differences later using the Leadtek 8800GT as an example; for now, let's get under the hood of the new product.
Having removed the cooling system, we can again confirm that the boards are identical. However, pay attention to the right side of the board, where the power subsystem is located: where the GeForce 8800GT has only empty solder pads, the Leadtek 8800GTS 512 is densely populated with components. The GeForce 8800GTS 512 thus has a more sophisticated power subsystem than the GeForce 8800GT. This is not surprising: the GeForce 8800GTS 512 runs at higher frequencies and consequently places stricter demands on power quality.
There are no external differences between the G92 chip in Leadtek 8800GTS 512 and the G92 chip in GeForce 8800GT video cards.
The new video card uses the same Qimonda chips with a 1.0 ns access time as the GeForce 8800GT. A set of eight chips makes up 512 MB of video memory. The rated frequency for such chips is 2000 MHz DDR, but the actual frequency set on the card is slightly lower.
The card's cooler is aluminum with a copper plate. This combination of the two materials has long been used and achieves the required performance at lower weight and cost.
The machining of the copper "core" is at a satisfactory level, but no more.
After removing the casing from the cooling system, a striking picture appears: no fewer than three heat pipes carry heat away from the copper base to different parts of the heatsink made of aluminum fins. Such a layout evens out the heat distribution, and the large dimensions of the heatsink should benefit cooling quality - which cannot be said of the GeForce 8800GT reference cooler. That one also has three heat pipes, but they are noticeably smaller, as is the heatsink itself.

Differences, overclocking and efficiency of the cooling system


The differences from the GeForce 8800GT are the number of universal processors, increased from 112 to 128, and the operating frequencies of the entire GPU.
In Leadtek 8800GTS 512, the frequencies correspond to the recommended ones and are equal to 650/1625 MHz for the graphics processor and 1944 MHz for the video memory.

Now - about the heating of the video card, which we will check using the Oblivion game with maximum settings.


The Leadtek 8800GTS 512 warmed up from 55 degrees at idle to 71 degrees, and the fan was almost inaudible. This was not enough for overclocking, however, so with the help of RivaTuner we raised the fan speed to 50% of its maximum.
After that, the GPU temperature did not rise above 64 degrees, while the noise remained low. The Leadtek 8800GTS 512 overclocked to 756/1890 MHz for the GPU and 2100 MHz for the video memory. Such high frequencies were unattainable for the GeForce 8800GT, apparently because of its simpler power supply system.

Well, let's get acquainted with the next participant in our testing today - the ASUS EN8800GTS TOP video card.

ASUS EN8800GTS TOP


When looking at the packaging of powerful ASUS video cards, you may get the feeling that it contains not a video card but, say, a motherboard. It is all about the large dimensions: in our case the box is noticeably bigger than that of the first participant in today's test. The large front of the package accommodates a big image of the brand's archer girl and a sizable chart claiming a 7% speed advantage over the "regular" GeForce 8800GTS 512. The "TOP" abbreviation in the card's name indicates factory overclocking. A minus of the packaging is that it is not obvious the card belongs to the GeForce 8800GTS 512 series, but by and large these are trifles. At first it is surprising how little information there is on the box; the truth, however, reveals itself later, quite literally.
You only need to pick the box up by its handle for it to open like a book at the first breath of wind. The information under the cover is devoted entirely to ASUS' proprietary utilities, in particular ASUS Gamer OSD, which now can not only change brightness/contrast/color in real time but also show the FPS value, record video and take screenshots. The second utility described, Smart Doctor, is designed to monitor the card's supply voltages and frequencies and also allows you to overclock it. It should be said that the ASUS utility can change both GPU frequencies, that is, the core and the shader domain, which brings it very close to the well-known RivaTuner utility.
The reverse side of the box contains a bit of everything, in particular, a brief description of the Video Security utility, designed to use a computer as a "smart" online video surveillance system.
The card's bundle follows the "nothing extra" principle:
  • adapter for powering PCI-express cards;
  • adapter S-video > component out;
  • DVI > D-sub adapter;
  • bag for 16 discs;
  • CD with drivers;
  • CD with documentation;
  • short instruction for installing a video card.

Externally, the video card is almost an exact copy of Leadtek 8800GTS 512, and this is not surprising: both cards are based on the reference design and, most likely, produced at the same factory by order of NVIDIA itself, and only then sent to Leadtek and ASUS. To put it simply, today a card from Leadtek could well become a card from ASUS, and vice versa.
It is clear that the reverse side of the video card also does not differ from that of Leadtek 8800GTS 512, except that they have different branded stickers.
Under the cooling system there is nothing unusual either. The power circuitry on the right side of the board is fully populated; in the center is the G92 GPU with 128 active stream processors and eight memory chips totaling 512 MB.
The memory chips are manufactured by Qimonda and have an access time of 1.0 ns, which corresponds to a frequency of 2000 MHz.
The appearance of the GPU does not reveal its noble origin, just like in Leadtek 8800GTS 512.
The cooling system of the ASUS EN8800GTS TOP video card is exactly the same as that of the Leadtek 8800GTS 512 video card: a copper "core" is built into the aluminum radiator to remove heat from the GPU.
The polishing quality of the copper core is satisfactory, as with its predecessor.
The heat from the copper core is distributed over the aluminum fins using three copper heat pipes. We have already seen the effectiveness of this solution on the example of the first card.

Rated frequencies and overclocking

As we have already said, the TOP prefix after the name of the video card indicates its factory overclocking. The nominal frequencies of the novelty are 740/1780 MHz for the GPU (against 650/1625 MHz for Leadtek) and 2072 MHz for video memory (against 1944 MHz for Leadtek). Note that for memory chips with 1.0 ns access time, the nominal clock frequency is 2000 MHz.
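
For reference, relative to the recommended 8800 GTS 512 clocks this factory overclock works out roughly as follows (a quick calculation from the figures above; the ~7% figure on the box presumably refers to overall game performance rather than raw clocks):

```python
# Factory overclock of the ASUS EN8800GTS TOP versus the reference 8800 GTS 512 clocks
reference = {"core": 650, "shader": 1625, "memory": 1944}  # MHz
asus_top  = {"core": 740, "shader": 1780, "memory": 2072}  # MHz

for domain in reference:
    gain = (asus_top[domain] / reference[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
# core: +13.8%, shader: +9.5%, memory: +6.6%
```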

We managed to overclock the card to the same frequencies as the Leadtek 8800GTS 512: 756/1890 MHz for the GPU and 2100 MHz for the video memory at a fan speed of 50% of the maximum.

Well, now let's go down a step and get acquainted with two video cards of the GeForce 8800GT class.

Leadtek 8800GT

The Leadtek 8800GT is a typical representative of the GeForce 8800GT series and, in fact, differs little from the majority. The point is that GeForce 8800GT cards are cheaper than the "advanced" GeForce 8800GTS 512, which does not make them any less interesting.
The Leadtek 8800GT box is almost the same as that of the more expensive 8800GTS 512; the differences are a thinner box, no carrying handle and, of course, the name of the card. The "Extreme" inscription after the card's name indicates factory overclocking.
The back of the box carries brief information about the card, its advantages and a list of the bundle. Incidentally, our sample was missing the Neverwinter Nights 2 game and the installation instructions.
The new package includes:
  • adapter for powering PCI-express cards;
  • S-video > S-video + component out splitter;
  • DVI > D-sub adapter;
  • CD with drivers;
  • CD with Power DVD 7;
  • CD with the full version of the game Neverwinter Nights 2;
  • brief instructions for installing a video card.

The Leadtek 8800GT video card is made according to the reference design and differs only in the sticker on the cooling system cover.
The reverse side of the video card does not stand out either, however, after getting acquainted with the GeForce 8800GTS 512 video card, the missing row of chip capacitors on the left of the board attracts attention.
The cooling system is made according to the reference design and is well known to us from previous reviews.
When examining the printed circuit board, one notices the absence of components on the right side of the card which, as we have already seen, are mounted on the 8800GTS 512 version. Otherwise it is a quite ordinary board with a G92 graphics processor cut down to 112 stream processors and eight memory chips that together form 512 MB.
Like the previous participants in today's tests, the memory chips of Leadtek 8800GT are manufactured by Qimonda and have an access time of 1.0 ns, which corresponds to 2000 MHz.

Rated frequencies and overclocking

As already mentioned, the Leadtek 8800GT comes with a standard factory overclock. Its nominal frequencies are 678/1700 MHz for the GPU and 2000 MHz for the video memory. Despite such a considerable factory overclock, however, the card did not show the best result in manual overclocking: only 713/1782 MHz for the GPU and 2100 MHz for the video memory. Recall that the participants of previous reviews overclocked to 740/1800 MHz for the GPU and 2000-2100 MHz for the memory. Note also that we achieved this result at the maximum fan speed of the cooling system, since, as we have already said, the GeForce 8800GT reference cooler does not cope with its duties particularly well.

Now let's move on to the next participant of today's testing.

Palit 8800GT sonic


The face of the Palit 8800GT Sonic box is a battle frog in spectacular armor. Silly, but very funny! However, our life consists of such nonsense, and being reminded of it once in a while does no harm. Turning from fun to business, pay attention to the lower right corner, where a sticker lists the card's frequencies and other characteristics. The newcomer's frequencies are almost the same as those of the GeForce 8800GTS 512: 650/1625 MHz for the graphics processor and 1900 MHz for the video memory, only 44 MHz less than the 8800GTS 512.
The reverse side of the box does not contain anything remarkable, because everything interesting is located on the front side.
The new package includes:
  • adapter for powering PCI-express cards;
  • adapter S-video > component out;
  • S-Video > RCA (composite) adapter;
  • DVI > D-sub adapter;
  • DVI > HDMI adapter;
  • CD with drivers;
  • CD with the full version of the game Tomb Raider: Legend;
  • brief instructions for installing a video card.
It should be noted that this is the first GeForce 8800GT-class card with a DVI > HDMI adapter to pass through our test lab; previously only some cards of the AMD Radeon family came with such an adapter.
And here is the first surprise! The Palit 8800GT sonic video card is based on a printed circuit board of its own design and is equipped with a proprietary cooling system.
The reverse side of the card also has differences, but it is still difficult for us to judge the pros and cons of the new design. What we can fully judge is the quality of the card's assembly.
Since the standoffs between the GPU heatsink and the board are shorter than the gap between them, and the heatsink is fastened with screws without any damping pads, both the board and the graphics chip substrate are noticeably bent. Unfortunately, this can lead to damage; the problem is not the strength of the PCB material but the traces, which can crack under tension. It is not at all certain that this will happen, but the manufacturer should pay more attention to how cooling systems are attached to its video cards.
The cooling system is made of painted aluminum and consists of three parts - for the GPU, the video memory and the power subsystem. The base of the GPU heatsink shows no special machining, and a solid gray mass is used as the thermal interface.
The changes in the printed circuit board design affected the power subsystem: small components were replaced with larger ones and their layout was changed. Otherwise, we have before us the familiar GeForce 8800GT with the G92 graphics processor and eight video memory chips totaling 512 MB.
Like the rest of today's testers, the memory chips are manufactured by Qimonda and have an access time of 1.0 ns.

Cooling efficiency and overclocking

We will test the effectiveness of the proprietary cooling system used on the Palit 8800GT Sonic with the game Oblivion at maximum settings, as usual.


The video card warmed up from 51 to 61 degrees, which, in general, is a very good result. However, the fan speed increased noticeably, as a result of which the already not quiet cooling system became clearly audible against the general background. Therefore, it is difficult to recommend a video card from Palit to lovers of silence.

Despite changes in the power subsystem and improved cooling, the Palit 8800GT sonic video card overclocked to the usual frequencies of 734/1782 MHz for the GPU and 2000 MHz for the video memory.

So we have finished getting acquainted with the participants of today's testing, and therefore we will move on to reviewing the test results.

Testing and Conclusions

Today's testing differs not only in that we are comparing four video cards, but also in that we carried it out on a different test bench from the one you are familiar with, configured as follows:

The change of test platform is due to the fact that we originally planned to test the Leadtek 8800GTS 512 and ASUS EN8800GTS TOP in SLI mode, but unfortunately the ASUS card did not survive our abuse to the end of the tests, and the idea fell through. We have therefore decided to postpone SLI testing to a separate article once the necessary hardware is in our hands; for now we will limit ourselves to tests of single cards. We will compare seven video cards, one of which is a GeForce 8800GTS 512 overclocked to 756/1890/2100 MHz. For comparison we added a GeForce 8800GT and a GeForce 8800GTX running at NVIDIA's recommended frequencies. To make it easier to navigate, here is a table with the clock frequencies of all test participants:

Video card name GPU frequency, core / shader unit, MHz Effective video memory frequency, MHz
Leadtek 8800GTS 512 650 / 1625 1944
ASUS EN8800GTS TOP 740 / 1780 2072
Leadtek 8800GT 678 / 1674 2000
Palit 8800GT 650 / 1625 1900
Overclocked GeForce 8800GTS 512 (on the diagram 8800GTS 512 756/1890/2100) 756 / 1890 2100
GeForce 8800GT (8800GT on the diagram) 600 / 1500 1800
GeForce 8800GTX (8800GTX on the diagram) 575 / 1350 1800

We have used ForceWare 169.21 and ForceWare 169.25 drivers for Windows XP and Windows Vista respectively. We will traditionally start our acquaintance with the test results with 3DMark tests:
The 3DMark results show, of course, who is stronger and who is weaker, but the differences are so small that there are no obvious leaders. Still, it is worth noting that the most expensive participant, the GeForce 8800GTX, took the last places. To complete the picture, we need to look at the results of the gaming tests, which, as before, we ran with 4x anti-aliasing and 16x anisotropic filtering.
In Call of Duty 4, it is noticeable that the Leadtek 8800GT is almost on a par with the Leadtek 8800GTS 512, the ASUS EN8800GTS TOP is not far behind the overclocked GeForce 8800GTS 512, and the regular GeForce 8800GT brings up the rear. The winner was the GeForce 8800GTX, apparently thanks to its wider (compared with the other participants) memory bus.
In Call of Juarez under Windows XP, the Leadtek 8800GTS 512 is almost on a par with the GeForce 8800GTX, which is no longer saved by its wider memory bus. Note that the Leadtek 8800GT does not lag behind them, and at 1024x768 even outperforms them, thanks to its higher frequencies compared with the other two cards. The leaders are the card from ASUS and the overclocked GeForce 8800GTS 512, while the penultimate place again goes to the Palit card, just ahead of the GeForce 8800GT.
Under Windows Vista, Call of Juarez had problems at 1600x1200: large drops in speed and, in places, very severe stuttering. We assume the problem is a lack of video memory in such a heavy mode; whether that is so we will check in the next review using the ASUS 8800GT with 1 GB of video memory. Note right away that the GeForce 8800GTX had no such problems. Judging by the results at the two lower resolutions, the balance of power has not changed compared to Windows XP, except that the GeForce 8800GTX reminded us of its noble origin, though it did not become the leader.
In the Crysis game under Windows XP, the alignment of forces has changed a little, but in fact everything remains the same: the Leadtek 8800GTS 512 and Leadtek 8800GT video cards are approximately on the same level, the ASUS EN8800GTS TOP video cards and the overclocked GeForce 8800GTS 512 are the leaders, and the last place goes to the video card GeForce 8800 GT. We also note the fact that as the resolution grows, the gap between the overclocked GeForce 8800GTS 512 and GeForce 8800GTX narrows due to the latter's wider memory bus. However, high clock speeds still prevail, and yesterday's champion remains out of work.
The 1600x1200 problem under Windows Vista did not spare Crysis either, bypassing only the GeForce 8800GTX. As in Call of Juarez, there were surges in speed and, in places, very severe drops in performance, sometimes below one frame per second. Judging by the results at the two lower resolutions, this time the Leadtek 8800GTS 512 outperformed its younger sister and took third place. The first places went to the ASUS EN8800GTS TOP, the overclocked GeForce 8800GTS 512 and the GeForce 8800GTX, which finally took the lead at 1280x1024.
In Need for Speed ProStreet the GeForce 8800GTX leads, and at 1024x768 by a wide margin. It is followed by the Leadtek 8800GTS 512, then the ASUS EN8800GTS TOP and the overclocked GeForce 8800GTS 512, while the last places go to the GeForce 8800GT and the Palit 8800GT Sonic. Since the GeForce 8800GTX came out on top, we can conclude that the game depends heavily on video memory bandwidth, which also suggests why the overclocked versions of the GeForce 8800GTS 512 turned out slower than the non-overclocked one: apparently the reason is increased video memory latencies caused by raising its clock frequency.
In Need for Speed Carbon we see a familiar picture: the Leadtek 8800GTS 512 and Leadtek 8800GT are roughly on a par, the overclocked GeForce 8800GTS 512 and the ASUS EN8800GTS TOP take first place, and the GeForce 8800GT takes last place. The GeForce 8800GTX looks decent, but nothing more.
In Oblivion, it is striking that at 1024x768 the overclocked GeForce 8800GTS 512 and the ASUS EN8800GTS TOP took the last places. We assumed that the culprit was memory latencies increased by the higher clock frequency, and we were right: after lowering the memory frequency of the overclocked GeForce 8800GTS 512 to nominal, it delivered over 100 frames per second. As the resolution grows the situation returns to normal, and the former outsiders become leaders. It is also noteworthy that the Leadtek 8800GT outperforms the Leadtek 8800GTS 512, most likely thanks to its higher shader clock.
The Prey game turned out to be undemanding to all video cards, and they settled down according to their clock frequencies. Except that the GeForce 8800GTX behaved a little differently, but this is understandable, because it has a wider memory bus, and the game depends heavily on its bandwidth.

Conclusions

The purpose of today's testing was to find out how much these video cards differ from one another and how justified the higher price of the "advanced" GeForce 8800GTS 512 is. The GeForce 8800GTS 512 outperforms the GeForce 8800GT in specifications, including the number of active functional blocks inside the GPU. The obvious advantages of the new GeForce 8800GTS 512 cards are a high-quality, quiet cooling system and higher overclocking potential than the GeForce 8800GT. The card from ASUS deserves special attention: thanks to its factory overclock it occupies a leading position. Of course, you can overclock a card yourself, and most likely any GeForce 8800GTS 512 will reach the frequencies of the ASUS card. On the whole, we note once again that the new family of video cards based on the G92 graphics chip has turned out very successful and may well replace the recent leader, the GeForce 8800GTX.

Pros and cons of individual video cards:

Leadtek 8800GTS 512

Pros:
  • good overclocking potential;
  • good equipment;
  • bright and convenient packaging.
Minuses:
  • not noticed.

ASUS EN8800GTS TOP

Pros:
  • factory overclock;
  • high-quality cooling system;
  • good overclocking potential.
Minuses:
  • too large and inconvenient packaging.

Leadtek 8800GT

Pros:
  • factory overclock;
  • decent kit.
Minuses:
  • not noticed.

Palit 8800GT sonic

Pros:
  • factory overclock;
  • alternative cooling system;
  • decent kit.
Minuses:
  • heavily curved board in the GPU area;
  • noticeable fan noise.
It is well known that flagship graphics adapters in the highest price range are, first of all, a public demonstration of the developer's technological achievements. Although these solutions are deservedly popular with enthusiast gamers, they never account for the bulk of sales. Not everyone is able or willing to pay $600 - a sum comparable to the cost of the most expensive modern game console - for a graphics card alone, so the main contribution to AMD/ATI and Nvidia revenues comes from less expensive but far more mass-market cards.

On November 9 of last year, Nvidia announced its first consumer graphics processor with a unified architecture and DirectX 10 support. The novelty was described in detail in our article Directly Unified: Nvidia GeForce 8800 Architecture Review. Initially it formed the basis of two new graphics cards, the GeForce 8800 GTX and GeForce 8800 GTS. As you know, the older model showed itself perfectly in games and may well be considered the choice of an enthusiast undeterred by its price, while the younger model took its rightful place in its price category - below $500 but above $350.

$449 is not a very high price for a new-generation product with full DirectX 10 support that can offer the user a serious level of performance in modern games. Nevertheless, Nvidia decided not to stop there, and on February 12, 2007 presented a more affordable GeForce 8800 GTS 320MB model with an official price of $299, seriously strengthening its position in this sector. These two graphics cards are the subject of today's review. Along the way, we will find out how critical the amount of video memory is for the GeForce 8 family.

GeForce 8800 GTS Specifications

To evaluate the qualities and capabilities of both GeForce 8800 GTS models, we should remind our readers of the characteristics of the GeForce 8800 family.


All three GeForce 8800 models use the same G80 graphics core, consisting of 681 million transistors, plus an additional NVIO chip containing the TMDS transmitters, RAMDAC and so on. Using such a complex chip in adapters belonging to different price categories is not the best option in terms of the cost of the final product, but you cannot call it unsuccessful either: Nvidia gets the opportunity to sell rejected GeForce 8800 GTX samples (chips that failed frequency binning and/or have some defective blocks), and the production cost of video cards sold at prices above $250 is hardly critical. This approach is actively used both by Nvidia and by its sworn competitor ATI; just recall the history of the G71 graphics processor, which can be found both in the inexpensive mass-market GeForce 7900 GS and in the powerful dual-chip monster GeForce 7950 GX2.

The GeForce 8800 GTS was created in exactly the same way. As the table shows, this video adapter differs significantly from its older brother: it not only has lower clock speeds and some of its stream processors disabled, but also a smaller amount of video memory, a narrower memory bus, and some of its TMUs and rasterization units inactive.

The GeForce 8800 GTS has 6 groups of stream processors, 16 ALUs each, for a total of 96 ALUs. The main rival of this card, AMD Radeon X1950 XTX, has 48 pixel processors, each of which, in turn, consists of 2 vector and 2 scalar ALUs - 192 ALUs in total.

It would seem that in pure computing power the GeForce 8800 GTS should be seriously inferior to the Radeon X1950 XTX, but there are several nuances that make such an assumption not entirely valid. The first is that the GeForce 8800 GTS stream processors, like the ALUs in Intel NetBurst, run at a much higher frequency than the rest of the core - 1200 MHz versus 500 MHz - which already means a very serious boost in performance. The second nuance follows from the architectural features of the R580 GPU. Theoretically, each of its 48 pixel shader execution units can execute 4 instructions per clock, not counting branch instructions, but only 2 of them can be of the ADD/MUL/MADD type, while the remaining two are always ADD instructions with a modifier. Accordingly, the efficiency of the R580 pixel processors will not be maximal in all cases. The G80 stream processors, on the other hand, have a fully scalar architecture, and each of them can execute two scalar operations per clock, for example MAD+MUL. Although we still do not have exact data on the architecture of Nvidia's stream processors, in this article we will look at how the new unified architecture of the GeForce 8800 is more advanced than the Radeon X1900 architecture and how this affects speed in games.
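
As a rough illustration of the argument above, here is a back-of-the-envelope comparison of peak scalar instruction issue using only the unit counts and clocks quoted in the text - a crude metric that ignores instruction mix and scheduling:

```python
# Peak scalar instruction issue per second, using only the figures cited above.
# G80 (GeForce 8800 GTS): 96 scalar ALUs at 1200 MHz, up to 2 ops per clock (MAD + MUL).
g80_gts = 96 * 1200e6 * 2

# R580 (Radeon X1950 XTX): 48 pixel processors at 650 MHz, up to 4 instructions per clock,
# of which only 2 are full ADD/MUL/MADD (the other two are ADD-with-modifier).
r580_peak     = 48 * 650e6 * 4
r580_full_alu = 48 * 650e6 * 2

print(f"G80 GTS:  {g80_gts / 1e9:.1f} Gops/s")          # ~230.4
print(f"R580 max: {r580_peak / 1e9:.1f} Gops/s")        # ~124.8
print(f"R580 ADD/MUL/MADD only: {r580_full_alu / 1e9:.1f} Gops/s")  # ~62.4
```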

As for texturing and rasterization performance, judging by the specifications the GeForce 8800 GTS has more texture units (24) and rasterizers (20) than the Radeon X1950 XTX (16 TMUs, 16 ROPs), but its clock frequency (500 MHz) is noticeably lower than that of the ATI product (650 MHz). Thus, neither side has a decisive advantage, which means that gaming performance will be determined mainly by how well-designed the micro-architecture is rather than by the numerical advantage in execution units.
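
The "no decisive advantage" point is easy to check with a simple units-times-clock estimate. A minimal sketch, counting one operation per unit per clock (texture throughput can be counted in other ways, e.g. per filtered sample, which is why other tables quote higher figures):

```python
# Crude per-clock throughput estimate: number of units x core clock
def rate_gops(units: int, clock_mhz: int) -> float:
    return units * clock_mhz / 1000  # billions of operations per second

# GeForce 8800 GTS: 24 TMUs, 20 ROPs at 500 MHz; Radeon X1950 XTX: 16 TMUs, 16 ROPs at 650 MHz
print("texels/s, 8800 GTS vs X1950 XTX:", rate_gops(24, 500), rate_gops(16, 650))  # 12.0 vs 10.4
print("pixels/s, 8800 GTS vs X1950 XTX:", rate_gops(20, 500), rate_gops(16, 650))  # 10.0 vs 10.4
```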

It is noteworthy that the GeForce 8800 GTS and the Radeon X1950 XTX have the same memory bandwidth of 64 GB/s, although the GeForce 8800 GTS uses a 320-bit memory bus with GDDR3 memory running at 1600 MHz, while the Radeon X1950 XTX comes with 2 GHz GDDR4 memory on a 256-bit bus. Given ATI's claims about the superiority of the R580 ring-bus memory controller over Nvidia's conventional controller, it will be interesting to see whether the Radeon gains any advantage at high resolutions with FSAA enabled against its next-generation competitor, as happened with the GeForce 7.

The less expensive version of the GeForce 8800 GTS with 320 MB of memory, announced on February 12, 2007 and intended to replace the GeForce 7950 GT in the performance-mainstream segment, differs from the regular model only in the amount of video memory. In effect, to obtain this card Nvidia only needed to replace the 512-Mbit memory chips with 256-Mbit ones. A simple and technologically straightforward solution, it allowed Nvidia to assert its technological superiority in the popular $299 price category. Later we will find out how this affected the new product's performance and whether a potential buyer should pay an extra $150 for the 640 MB model.

In today's review the GeForce 8800 GTS 640MB is represented by the MSI NX8800GTS-T2D640E-HD-OC video adapter. Let's look at this product in more detail.

MSI NX8800GTS-T2D640E-HD-OC: packaging and bundle

The video adapter arrived in our lab in a retail package - packed in a colorful box along with all related accessories. The box turned out to be relatively small, especially in comparison with the box from MSI NX6800 GT, which at one time could compete with Asustek Computer packages in terms of dimensions. Despite its modest size, MSI packaging traditionally comes with a convenient carrying handle.


The design of the box, in soothing white and blue tones, is easy on the eyes; the front side is decorated with an image of a pretty red-haired angel girl, so there is no sign of the aggressive motifs so popular among video card makers. Three stickers inform the buyer that the card is factory overclocked, supports HDCP and comes with the full version of Company of Heroes. On the back of the box you can find information about Nvidia SLI and MSI D.O.T. Express. The latter is a dynamic overclocking technology which, according to MSI, can increase the video adapter's performance by 2-10%, depending on the overclocking profile used.

Opening the box, in addition to the video adapter itself, we found the following set of accessories:


  • Quick Installation Guide;
  • Quick User Guide;
  • DVI-I > D-Sub adapter;
  • YPbPr/S-Video/RCA splitter;
  • S-Video cable;
  • 2 x Molex > 6-pin PCI Express power adapter;
  • CD with drivers and MSI utilities;
  • two-disc edition of Company of Heroes.

Both manuals are made in the form of posters; in our opinion they are too simple and contain only the most basic information. The pursuit of language count - there are 26 languages in the quick start guide - means that nothing particularly useful can be gleaned from it beyond the basics of installing the card in a system. We feel the guides could be a little more detailed, which would help inexperienced users.

The driver disc contains an outdated version of Nvidia ForceWare (97.29) as well as a number of proprietary utilities, among which MSI DualCoreCenter and MSI Live Update 3 deserve special mention. The former is a unified control center that allows you to overclock both the video card and the CPU; however, for full functionality the program requires an MSI motherboard equipped with a CoreCell chip and is therefore of little use to owners of boards from other manufacturers. MSI Live Update 3 tracks driver and BIOS updates and conveniently installs them over the Internet - a handy approach, especially for those who do not want to deal with the intricacies of manually updating the video adapter's BIOS.

The inclusion of the full version of the popular tactical RTS Company of Heroes deserves special praise for MSI. This is truly a top-tier game, with excellent graphics and thoroughly developed gameplay; many players call it the best game in the genre, as confirmed by numerous awards, including the title of Best Strategy Game of E3 2006. As we have already noted, despite belonging to the real-time strategy genre, Company of Heroes boasts modern graphics on the level of a good first-person shooter, so it is perfect for demonstrating the capabilities of the GeForce 8800 GTS. In addition to Company of Heroes, the discs include a demo version of Warhammer 40,000: Dawn of War - Dark Crusade.

We can confidently call the MSI NX8800GTS-T2D640E-HD-OC bundle a good one, thanks to the full version of the very popular tactical RTS Company of Heroes and MSI's user-friendly software.

MSI NX8800GTS-T2D640E-HD-OC PCB design

For the GeForce 8800 GTS model, Nvidia has developed a separate, more compact PCB than the one used to manufacture the GeForce 8800 GTX. Since all GeForce 8800s are delivered to Nvidia partners ready-made, practically everything that will be said below applies not only to MSI NX8800GTS, but also to any other GeForce 8800 GTS model, be it the version with 640 or 320 MB of video memory.


The GeForce 8800 GTS PCB is significantly shorter than the GeForce 8800 GTX board. Its length is only 22.8 centimeters versus almost 28 centimeters for the flagship model GeForce 8. In fact, the dimensions of the GeForce 8800 GTS are the same as those of the Radeon X1950 XTX, even slightly smaller, since the cooler does not protrude beyond the PCB.

Our sample of the MSI NX8800GTS uses a board with a green solder mask, although the product is shown on the company's website with the more familiar black PCB. Currently both "black" and "green" GeForce 8800 GTX and GTS cards are on sale. Despite the numerous rumors circulating on the Web, there is no difference between such cards apart from the PCB color itself, which the official Nvidia website also confirms. So what is the reason for this "return to the roots"?

There are many conflicting rumors on this score. According to some, the black coating is more toxic than the traditional green one; others believe the black coating is harder to apply or more expensive. In practice this is most likely not the case: as a rule, solder masks of different colors cost the same, which rules out extra problems with masks of particular colors. The simplest and most logical explanation is probably the right one: cards of different colors are produced by different contract manufacturers - Foxconn and Flextronics. Moreover, Foxconn apparently uses coatings of both colors, since we have seen both "black" and "green" cards from this manufacturer.


The GeForce 8800 GTS power supply system is almost as complex as that of the GeForce 8800 GTX and even contains more electrolytic capacitors, but it has a denser layout and only one external power connector, which is why the printed circuit board could be made much shorter. The same digital PWM controller handles GPU power management as on the GeForce 8800 GTX, a Primarion PX3540. Memory power is managed by a second controller, an Intersil ISL6549, which, incidentally, is absent on the GeForce 8800 GTX, where the memory power scheme is different.

The left part of the PCB, where the main components of the GeForce 8800 GTS - the GPU, NVIO and memory - are located, is almost identical to the corresponding section of the GeForce 8800 GTX PCB, which is not surprising, since developing the entire board from scratch would require significant financial and time costs. Besides, it would most likely be impossible to simplify the board significantly for the GeForce 8800 GTS by designing it from scratch, given the need to use the same G80 and NVIO tandem as on the flagship model. The only visible difference from the GeForce 8800 GTX is the absence of the second MIO (SLI) "comb"; in its place there is a footprint for a technological connector with latches, possibly serving the same function, but left unsoldered. Even the 384-bit memory bus layout has been preserved, and the bus was cut to the required width in the simplest way: instead of 12 GDDR3 chips, only 10 are installed. Since each chip has a 32-bit bus, the 10 chips together provide the necessary 320 bits. Theoretically nothing prevents the creation of a GeForce 8800 GTS with a 384-bit memory bus, but such a card is extremely unlikely to appear in practice; a full-fledged GeForce 8800 GTX running at lower frequencies stands a better chance of being released.


The MSI NX8800GTS-T2D640E-HD-OC is equipped with ten 512-Mbit GDDR3 Samsung K4J52324QE-BC12 chips, operating at a supply voltage of 1.8 V and a rated frequency of 800 (1600) MHz. According to Nvidia's official specifications for the GeForce 8800 GTS, this is exactly the memory frequency the adapter should have. But the MSI NX8800GTS version we are examining carries the letters "OC" in its name for a reason: it is factory overclocked, so its memory runs at a slightly higher 850 (1700) MHz, raising bandwidth from 64 GB/s to 68 GB/s.

Since the only difference between the GeForce 8800 GTS 320MB and the regular model is the halved amount of video memory, 256 Mbit memory chips are simply installed on this card, for example, Samsung K4J55323QC/QI series or Hynix HY5RS573225AFP. Otherwise, the two GeForce 8800 GTS models are identical to each other down to the smallest detail.

The marking of the NX8800GTS GPU differs somewhat from that of the GeForce 8800 GTX processor: it reads "G80-100-K0-A2", whereas the chip on the reference flagship card is marked "G80-300-A2". We know that GeForce 8800 GTS production can use G80 samples that have defects in some functional blocks and/or failed frequency binning. Perhaps it is these features that are reflected in the marking.

The 8800 GTS processor has 96 of 128 stream processors active, 24 of 32 TMUs and 20 of 24 ROPs. The standard GeForce 8800 GTS has a base GPU frequency of 500 MHz (513 MHz actual) and a shader frequency of 1200 MHz (1188 MHz actual), but on the MSI NX8800GTS-T2D640E-HD-OC these parameters are 576 and 1350 MHz, which corresponds to the frequencies of the GeForce 8800 GTX. How this affects the performance of the MSI product, we will find out later in the section on gaming test results.

The NX8800GTS output connector configuration is standard: two DVI-I connectors capable of operating in dual-link mode and a universal seven-pin mini-DIN connector that allows you to connect both HDTV devices via the analog YPbPr interface and SDTV devices using the S-Video or Composite interface. The MSI product has both DVI connectors carefully covered with rubber protective caps - a rather meaningless, but pleasant trifle.

MSI NX8800GTS-T2D640E-HD-OC: Cooling System Design

The cooling system installed on MSI NX8800GTS, as well as on the vast majority of GeForce 8800 GTS from other graphics card vendors, is a shortened version of the GeForce 8800 GTX cooling system described in the corresponding review.


The differences come down to a shortened heatsink and heat pipe, which transfers heat from the copper sole in contact with the GPU heat spreader, and a different placement of the flat U-shaped heat pipe pressed into the base, which is responsible for spreading the heat flow evenly. The aluminum frame on which all parts of the cooler are mounted has numerous protrusions at the points of contact with the memory chips, the power transistors of the voltage regulator and the NVIO chip. Reliable thermal contact is ensured by traditional inorganic-fiber pads impregnated with white thermal paste. For the GPU, a different, but also familiar, thick dark-gray thermal paste is used.

Since the cooling system contains relatively few copper elements, its mass is low, and mounting it does not require special backplates to prevent fatal bending of the PCB: eight ordinary spring-loaded bolts fastening the cooler directly to the board are enough. Damage to the GPU is practically ruled out, since it is equipped with a heat-spreading cap and surrounded by a wide metal frame that protects the chip from possible skewing of the cooling system and the board from excessive bending.

The radiator is blown by a radial fan with an impeller diameter of about 75 mm, which has the same electrical parameters as in the GeForce 8800 GTX cooling system - 0.48A/12V, and is connected to the board via a four-pin connector. The system is covered with a translucent plastic casing in such a way that hot air is blown out through the slots in the mounting plate.

The design of the GeForce 8800 GTX and 8800 GTS coolers is thoughtful, reliable, time-tested, almost silent in operation and provides high cooling efficiency, so it makes no sense to replace it with anything else. MSI only replaced the Nvidia sticker on the casing with its own, repeating the artwork on the box, and put another sticker with its logo on the fan.

MSI NX8800GTS-T2D640E-HD-OC: noise and power consumption

To assess the noise level generated by the MSI NX8800GTS cooling system, a Velleman DVM1326 digital sound level meter with a resolution of 0.1 dB was used. The measurements were made using a weighted A-curve. At the time of measurement, the background noise level in the laboratory was 36 dBA, and the noise level at a distance of one meter from a working bench equipped with a passively cooled graphics card was 40 dBA.






In terms of noise, the cooling system of the NX8800GTS (and any other GeForce 8800 GTS) behaves exactly like the system installed on the GeForce 8800 GTX. The noise level is very low in all modes; in this respect the new Nvidia design surpasses even the excellent GeForce 7900 GTX cooler, which was previously rightfully considered the best in its class. Complete silence without any loss of cooling efficiency can only be achieved by installing a water cooling system, especially if serious overclocking is planned.

As our readers know, reference samples of the GeForce 8800 GTX from the first batches refused to run on the testbed we use to measure the power consumption of video cards. However, most of the newer cards of the GeForce 8800 family, including the MSI NX8800GTS-T2D640E-HD-OC, worked without problems on this system, which has the following configuration:

Processor Intel Pentium 4 560 (3.60GHz, 1MB L2);
Intel Desktop Board D925XCV motherboard (i925X);
Memory PC-4300 DDR2 SDRAM (2x512MB);
Hard drive Samsung SpinPoint SP1213C (120 GB, Serial ATA-150, 8 MB buffer);
Microsoft Windows XP Pro SP2, DirectX 9.0c.

As we have reported, the mainboard at the heart of the measuring platform has been specially modified: measuring shunts fitted with connectors for the measuring equipment are inserted into the power lines of the PCI Express x16 slot. The 2xMolex -> 6-pin PCI Express power adapter is equipped with the same shunt. A Velleman DVM850BL multimeter with a measurement error of no more than 0.5% is used as the measuring tool.
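The arithmetic behind such shunt-based measurements is simple: Ohm's law converts the voltage drop across a shunt of known resistance into a current, and multiplying by the rail voltage gives the power on that rail. The sketch below only illustrates the principle; the shunt resistance and the voltage drops in it are made-up example values, not the parameters of the actual testbed.

```python
SHUNT_OHMS = 0.01  # assumed shunt resistance, for illustration only

def rail_power_w(shunt_drop_v: float, rail_voltage_v: float,
                 shunt_ohms: float = SHUNT_OHMS) -> float:
    """P = V_rail * I, where I = V_shunt / R_shunt (Ohm's law)."""
    current_a = shunt_drop_v / shunt_ohms
    return rail_voltage_v * current_a

# Hypothetical voltage drops on the PCI Express slot lines and the external
# 6-pin connector; the card's total power is the sum over all monitored rails.
total_w = (rail_power_w(0.055, 12.0)    # slot, 12V line
           + rail_power_w(0.008, 3.3)   # slot, 3.3V line
           + rail_power_w(0.060, 12.0)) # external 6-pin connector, 12V
print(f"{total_w:.1f} W")
```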

To load the video adapter in 3D mode, we use the first SM3.0/HDR graphics test from the Futuremark 3DMark06 suite, run in an endless loop at 1600x1200 with forced 16x anisotropic filtering. Peak 2D load is emulated with the 2D Transparent Windows test from the Futuremark PCMark05 suite.

Thus, after completing the standard measurement procedure, we were able to obtain reliable power consumption data not only for the MSI NX8800GTS-T2D640E-HD-OC, but for the entire Nvidia GeForce 8800 family.











The GeForce 8800 GTX is indeed ahead of the previous "leader", the Radeon X1950 XTX, in power consumption, but only by 7 watts. Considering the enormous complexity of the G80, 131.5 watts in 3D mode can safely be considered a good result. Both additional power connectors of the GeForce 8800 GTX carry approximately the same load, which does not exceed 45 watts even in the heaviest mode. Although the GeForce 8800 GTX PCB allows for an eight-pin power connector in place of one of the six-pin ones, it is unlikely to be needed even with a significant increase in GPU and memory clock frequencies. In idle mode, the economy of the Nvidia flagship leaves much to be desired, but this is the price of 681 million transistors and a shader processor frequency that is huge by GPU standards. This high idle power consumption is partly due to the fact that the GeForce 8800 family does not lower its clock frequencies in this mode.

Both versions of the GeForce 8800 GTS have much more modest appetites, although they cannot boast the economy of Nvidia cards based on the previous-generation G71 core. The single power connector on these cards carries a much heavier load, in some cases reaching 70 watts or more. The power consumption of the GeForce 8800 GTS variants with 640 and 320 MB of video memory differs only slightly, which is not surprising, because this parameter is the only difference between the cards. The MSI product, running at higher frequencies, consumes more than the standard GeForce 8800 GTS - about 116 watts under load in 3D mode, which is still less than the Radeon X1950 XTX. In 2D the AMD card is of course much more economical; however, video adapters of this class are purchased specifically for 3D use, so this parameter is not as critical as power consumption in games and other three-dimensional applications.

MSI NX8800GTS-T2D640E-HD-OC: overclocking features

Overclocking members of the Nvidia GeForce 8800 family involves a number of peculiarities that we consider necessary to tell our readers about. As you probably remember, the first representatives of the seventh GeForce generation, based on the 0.11-micron G70 core, could raise the frequencies of their ROPs and pixel processors only in 27 MHz steps, and if the overclock fell short of that value, there was practically no performance gain. Later, in cards based on the G71, Nvidia returned to the standard overclocking scheme with 1 MHz steps; in the eighth GeForce generation, however, clock frequency discreteness has appeared again.

The scheme of distributing and changing clock frequencies in the GeForce 8800 is rather non-trivial, because the shader processor units in the G80 operate at a much higher frequency than the rest of the GPU: the ratio is approximately 2.3 to 1. Although the main frequency of the graphics core can change in steps smaller than 27 MHz, the frequency of the shader processors always changes in 54 MHz (2x27 MHz) steps, which creates additional difficulties during overclocking, because all utilities manipulate the main frequency and not the frequency of the shader "domain". However, there is a simple formula that lets you determine the frequency of the GeForce 8800 stream processors after overclocking with sufficient accuracy:

OC shader clk = Default shader clk / Default core clk * OC core clk


Where OC shader clk is the resulting (approximate) shader frequency, Default shader clk is the stock shader processor frequency, Default core clk is the stock core frequency, and OC core clk is the overclocked core frequency.
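As an illustration, here is a minimal sketch of this calculation in Python. Snapping the result to a multiple of 54 MHz reflects the shader-domain stepping described above; the exact rounding rule used by the hardware is our assumption based on the frequencies we observed.

```python
def oc_shader_clk_mhz(default_shader_mhz: float, default_core_mhz: float,
                      oc_core_mhz: float, shader_step_mhz: int = 54) -> int:
    """Estimate the GeForce 8800 shader-domain clock after a core overclock."""
    raw = default_shader_mhz / default_core_mhz * oc_core_mhz  # the formula above
    return round(raw / shader_step_mhz) * shader_step_mhz      # snap to 54 MHz steps

# MSI NX8800GTS OC Edition (576/1350 MHz stock) with the core raised to 675 MHz:
print(oc_shader_clk_mhz(1350, 576, 675))  # 1566 MHz, matching what we saw in practice
```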

Let's take a look at the behavior of the MSI NX8800GTS-T2D640E-HD-OC during overclocking using the RivaTuner 2 FR utility, which can track the real frequencies of the various areas, or "domains", of the G80 GPU. Since the MSI product has the same GPU frequencies (576/1350) as the GeForce 8800 GTX, the following information is also valid for Nvidia's flagship graphics card. We increased the main GPU frequency in 5 MHz steps: a fairly small step that is also not a multiple of 27 MHz.


An empirical test confirmed that the main frequency of the graphics core does indeed change in variable steps of 9, 18 or 27 MHz, and we failed to find any pattern in these changes. The frequency of the shader processors always changed in 54 MHz steps. Because of this, some frequencies of the main G80 "domain" turn out to be practically useless for overclocking, and using them will only lead to extra GPU heating. For example, it makes no sense to raise the main core frequency to 621 MHz - the shader unit frequency will still be 1458 MHz. Thus, GeForce 8800 overclocking should be carried out carefully, using the above formula and checking the monitoring data in RivaTuner or another utility with similar functionality.

It would be illogical to expect serious overclocking results from the already overclocked version of the NX8800GTS, however, the card unexpectedly showed quite good potential, at least from the GPU side. We managed to raise its frequencies from the factory 576/1350 MHz to 675/1566 MHz, while the NX8800GTS steadily passed several 3DMark06 cycles in a row without any additional cooling. The processor temperature, according to Riva Tuner, did not exceed 70 degrees.

The memory took overclocking much worse, since the NX8800GTS OC Edition uses chips rated for 800 (1600) MHz that already run above their nominal frequency, at 850 (1700) MHz. As a result, we had to stop at 900 (1800) MHz, since further attempts to raise the memory frequency invariably led to a freeze or a driver failure.

Thus, the card showed good overclocking potential, but only for the graphics processor: the relatively slow memory chips did not allow their frequency to be raised significantly. For them, reaching the GeForce 8800 GTX memory clock should be considered a good achievement, and a 320-bit bus at this frequency already provides a noticeable bandwidth advantage over the Radeon X1950 XTX: 72 GB/s versus 64 GB/s. Of course, the overclocking result may vary from one MSI NX8800GTS OC Edition sample to another and with the use of additional measures, such as modifying the card's power circuitry or installing water cooling.

Test platform configuration and test methods

The comparative performance study was conducted on platforms with the following configuration.

Processor AMD Athlon 64 FX-60 (2 x 2.60GHz, 2 x 1MB L2)
Abit AN8 32X motherboard (nForce4 SLI X16) for Nvidia GeForce cards
ASUS A8R32-MVP Deluxe motherboard (ATI CrossFire Xpress 3200) for ATI Radeon cards
Memory OCZ PC-3200 Platinum EL DDR SDRAM (2x1GB, CL2-3-2-5)
Hard drive Maxtor MaXLine III 7B250S0 (Serial ATA-150, buffer 16MB)
Creative Sound Blaster Audigy 2 sound card
Power supply Enermax Liberty 620W (ELT620AWT, rated power 620W)
Dell 3007WFP Monitor (30", Max Resolution 2560x1600)
Microsoft Windows XP Pro SP2, DirectX 9.0c
AMD Catalyst 7.2
Nvidia ForceWare 97.92

Since we consider the use of trilinear and anisotropic filtering optimizations unjustified, the drivers were configured in a standard way, which implies the highest possible quality of texture filtering:

AMD Catalyst:

Catalyst A.I. Standard
Mipmap Detail Level: High Quality
Wait for vertical refresh: Always off
Adaptive antialiasing: Off
Temporal antialiasing: Off
High Quality AF: On

Nvidia ForceWare:

Texture Filtering: High Quality
Vertical sync: Off
Trilinear optimization: Off
Anisotropic optimization: Off
Anisotropic sample optimization: Off
Gamma correct antialiasing: On
Transparency antialiasing: Off
Other settings: default

Each game was set to the highest possible graphics quality level, and the games' configuration files were not modified. Performance data was collected either with the game's built-in tools or, in their absence, with the Fraps utility. Wherever possible, minimum performance figures were recorded as well.

Testing was carried out at the three resolutions standard for our methodology: 1280x1024, 1600x1200 and 1920x1200. One of the goals of this review is to evaluate how the amount of video memory on the GeForce 8800 GTS affects performance. Besides, the specifications and prices of both variants of this video adapter allow us to count on a fairly high level of performance in modern games with FSAA 4x, so we tried to use the "eye candy" mode wherever possible.

FSAA and anisotropic filtering were activated from within the games; where this was not possible, they were forced through the appropriate settings of the ATI Catalyst and Nvidia ForceWare drivers. Testing without full-screen anti-aliasing was used only for games that do not support FSAA for technical reasons, or when FP HDR was used, since the GeForce 7 family cards participating in the tests do not support FSAA and FP HDR simultaneously.

Since our task was, among other things, to compare the performance of graphics cards that differ only in the amount of video memory, the MSI NX8800GTS-T2D640E-HD-OC was tested twice: at its factory frequencies and at frequencies lowered to the GeForce 8800 GTS reference values of 513/1188/800 (1600) MHz. In addition to the MSI product and the reference Nvidia GeForce 8800 GTS 320MB, the following video adapters took part in the testing:

Nvidia GeForce 8800 GTX (G80, 576/1350/1800MHz, 128sp, 32tmu, 24rop, 384-bit, 768MB)
Nvidia GeForce 7950 GX2 (2xG71, 500/1200MHz, 48pp, 16vp, 48tmu, 32rop, 256-bit, 512MB)
AMD Radeon X1950 XTX (R580+, 650/2000MHz, 48pp, 8vp, 16tmu, 16rop, 256-bit, 512MB)

The following set of games and applications was used as test software:

3D First Person Shooters:

Battlefield 2142
Call of Juarez
Far Cry
F.E.A.R. Extraction Point
Tom Clancy's Ghost Recon Advanced Warfighter
Half Life 2: Episode One
Prey
Serious Sam 2
S.T.A.L.K.E.R.: Shadow of Chernobyl


Three-dimensional shooters with a third-person view:

Hitman: Blood Money
Tomb Raider: Legend


RPG:

Gothic 3
Neverwinter Nights 2
The Elder Scrolls IV: Oblivion


Simulators:

X3: Reunion

Strategy games:

Command & Conquer 3: Tiberium Wars
Company of Heroes
Supreme Commander


Synthetic gaming benchmarks:

Futuremark 3DMark05
Futuremark 3DMark06

Playtests: Battlefield 2142


Up to 1600x1200 there is no significant difference between the two GeForce 8800 GTS versions with different amounts of video memory, although at 1600x1200 the junior model already trails the senior one by about 4-5 frames per second, with both performing quite comfortably. The top resolution of 1920x1200, however, is a turning point: the GeForce 8800 GTS 320MB abruptly drops out of the running, with average fps more than 1.5 times lower and minimum fps two times lower. Moreover, it also loses to the cards of the previous generation. Either video memory is running short, or there is a problem with how the GeForce 8800 family manages it.

The MSI NX8800GTS OC Edition is noticeably ahead of the reference model starting at 1600x1200, but it naturally cannot catch up with the GeForce 8800 GTX, although at 1920x1200 the gap between these cards becomes impressively narrow. Evidently, the difference in memory bus width between the GeForce 8800 GTS and GTX matters little here.

Playtests: Call of Juarez


Both GeForce 8800 GTS models show the same level of performance at all resolutions, including 1920x1200. This is quite natural considering testing with HDR enabled but FSAA disabled. Operating at nominal frequencies, the cards are inferior to the GeForce 7950 GX2.

The overclocked MSI version achieves parity at high resolutions, which make little sense to use in this game even if you have a GeForce 8800 GTX in your system. For example, at 1600x1200 Nvidia's flagship graphics card averages just 40 fps, with drops to 21 fps in graphics-heavy scenes. For a first-person shooter, such figures can hardly be called truly comfortable.

Playtests: Far Cry


The game is no longer young and not well suited for testing modern high-end video adapters. Despite the use of anti-aliasing, noticeable differences in their behavior can only be seen at 1920x1200. The GeForce 8800 GTS 320MB runs low on video memory here and is therefore outperformed by the model equipped with 640 MB by about 12%. However, given Far Cry's modest requirements by today's standards, the player is in no danger of losing comfort.

MSI NX8800GTS OC Edition goes almost on a par with GeForce 8800 GTX: Far Cry clearly doesn't use the latter's power.


Due to the nature of the scene recorded at the Research level, the readings are more varied; already at 1600x1200 you can see performance differences between the various members of the GeForce 8800 family. Moreover, the lag of the 320MB version is already visible here, even though the action takes place in the enclosed space of an underground cave. The performance difference between the MSI product and the GeForce 8800 GTX at 1920x1200 is much larger than in the previous case, since the performance of the shader processors plays a more important role in this level.




In the FP HDR mode, the GeForce 8800 GTS 320MB no longer experiences problems with the amount of video memory and is in no way inferior to its older brother, providing a decent level of performance in all resolutions. The version offered by MSI gives another 15% speed boost, but even the version running at standard clock speeds is fast enough to use 1920x1200 resolution, and the GeForce 8800 GTX will no doubt provide comfortable conditions for the player at 2560x1600 resolution.

Playtests: F.E.A.R. Extraction Point


The visual richness of F.E.A.R. demands corresponding resources from the video adapter, and a 5% lag of the GeForce 8800 GTS 320MB is already visible at 1280x1024; at the next resolution, 1600x1200, it sharply grows to 40%.

The benefits of overclocking the GeForce 8800 GTS are not obvious: both the overclocked and the regular versions allow you to equally successfully play at 1600x1200. At the next resolution, the speed gain from overclocking is simply not enough to reach a comfortable level for first-person shooters. Only the GeForce 8800 GTX with 128 active shader processors and a 384-bit memory subsystem can do this.

Playtests: Tom Clancy's Ghost Recon Advanced Warfighter

Since GRAW uses deferred rendering, FSAA is technically impossible in it, so the data is given only for the anisotropic filtering mode.


The advantage of MSI NX8800GTS OC Edition over a conventional reference card grows as the resolution increases, and at 1920x1200 it reaches 19%. In this case, it is these 19% that allow us to achieve an average performance of 55 fps, which is quite comfortable for the player.

As for the comparison of two GeForce 8800 GTS models with different video memory sizes, there is no difference in their performance.

Playtests: Half-Life 2: Episode One


At 1280x1024 we run into a limitation on the part of our test system's central processor: all cards show the same result. At 1600x1200 differences emerge, but they are not fundamental, at least for the three GeForce 8800 GTS variants: all three provide a very comfortable performance level. The same can be said about 1920x1200. Despite the high-quality graphics, the game is undemanding of video memory, and the GeForce 8800 GTS 320MB is only about 5% behind the older and much more expensive model with 640 MB of memory on board. The overclocked version of the GeForce 8800 GTS offered by MSI confidently takes second place after the GeForce 8800 GTX.

Although the GeForce 7950 GX2 posts higher scores than the GeForce 8800 GTS at 1600x1200, we should not forget the problems that can arise with a card that is, in fact, an SLI tandem, as well as the noticeably lower texture filtering quality of the GeForce 7 family. The GeForce 8800 GTS also has driver problems, but it has more promising capabilities and, unlike the GeForce 7950 GX2, every chance of shedding its "childhood diseases" soon.

Playtests: Prey


The GeForce 8800 GTS 640MB doesn't show any advantage over the GeForce 8800 GTS 320MB, perhaps because the game uses a modified Doom III engine and doesn't show much appetite for video memory. As in GRAW, the higher performance of the NX8800GTS OC Edition lets owners of this video adapter count on a fairly comfortable game at 1920x1200; for comparison, the regular GeForce 8800 GTS demonstrates the same figures at 1600x1200. The flagship of the line, the GeForce 8800 GTX, is beyond competition.

Playtests: Serious Sam 2


The brainchild of the Croatian developers at Croteam has always strictly demanded 512 MB of video memory from the video adapter, punishing anything less with a monstrous drop in performance. The memory of the inexpensive GeForce 8800 GTS version was not enough to satisfy the game's appetites: as a result, it managed only 30 fps at 1280x1024, while the version with 640 MB of memory on board turned out to be more than twice as fast.

For some unknown reason, the minimum performance of all GeForce 8800 cards in Serious Sam 2 is extremely low, which may be due either to architectural features of the family, which, as is known, uses a unified architecture without a division into pixel and vertex shaders, or to flaws in the ForceWare drivers. For this reason, GeForce 8800 owners will not be able to achieve complete comfort in this game.

Playtests: S.T.A.L.K.E.R.: Shadow of Chernobyl

Eagerly awaited by many players, the GSC Game World project finally saw the light of day after many years of development, some 6 or 7 years after its announcement. The game turned out to be ambiguous, yet multifaceted enough that it is hard to describe in a few phrases. We will only note that, compared with one of the early builds, the engine has been significantly improved. The game received support for a number of modern technologies, including Shader Model 3.0, HDR, parallax mapping and more, but it has not lost the ability to work in a simplified mode with a static lighting model, providing excellent performance on less powerful systems.

Since we are aiming for the highest level of visual quality, we tested the game in the full dynamic lighting mode with maximum detail. In this mode, which among other things involves HDR, there is no FSAA support; at least this is the case in the current version of S.T.A.L.K.E.R. Since the game loses much of its attractiveness with a static lighting model and DirectX 8 effects, we limited ourselves to anisotropic filtering.


The game's appetites are anything but modest: with maximum detail, even the GeForce 8800 GTX cannot deliver 60 fps at 1280x1024. However, it should be noted that at low resolutions the main limiting factor is CPU performance, since the spread between the cards is small and their average results are quite close.

Nevertheless, a certain lag of the GeForce 8800 GTS 320MB from its elder brother can already be seen here, and as the resolution increases, it only gets worse, and at a resolution of 1920x1200, the younger representative of the GeForce 8800 family simply lacks the available video memory. This is not surprising, given the scale of the game scenes and the abundance of special effects used in them.

In general, we can say that the GeForce 8800 GTX does not provide a significant advantage over the GeForce 8800 GTS in S.T.A.L.K.E.R., and the Radeon X1950 XTX looks just as good as the GeForce 8800 GTS 320MB. The AMD solution even outperforms the Nvidia one in some respects, since it works at 1920x1200; however, using this mode makes little practical sense given the average performance of 30-35 fps. The same applies to the GeForce 7950 GX2, which, by the way, is somewhat ahead of both its direct competitor and the junior model of the new generation.

Playtests: Hitman: Blood Money


Earlier we noted that the presence of 512 MB of video memory provides such a video adapter with some gain in Hitman: Blood Money at high resolutions. Apparently, 320 MB is also sufficient, since the GeForce 8800 GTS 320MB is almost as good as the regular GeForce 8800 GTS, regardless of the resolution used; the difference does not exceed 5%.

Both cards, as well as the overclocked version of the GeForce 8800 GTS offered by MSI, allow you to successfully play at all resolutions, and the GeForce 8800 GTX even allows you to use better FSAA modes than the regular MSAA 4x, as it has the necessary performance headroom for this.

Playtests: Tomb Raider: Legend


Despite the use of settings providing maximum graphics quality, the GeForce 8800 GTS 320MB handles the game just as well as the regular GeForce 8800 GTS. Both cards make 1920x1200 in "eye candy" mode available to the player. The MSI NX8800GTS OC Edition slightly outperforms both reference cards, but only in average fps - the minimum stays the same. The GeForce 8800 GTX does not achieve a higher minimum either, which suggests that this figure is determined by some peculiarity of the game engine.

Playtests: Gothic 3

The current version of Gothic 3 does not support FSAA, so testing was done using anisotropic filtering only.


Despite the lack of full-screen anti-aliasing, the GeForce 8800 GTS 320MB is seriously inferior not only to the regular GeForce 8800 GTS but also to the Radeon X1950 XTX, staying only slightly ahead of the GeForce 7950 GX2. With 26-27 fps at 1280x1024, this card is not well suited for Gothic 3.

Note that the GeForce 8800 GTX outperforms the GeForce 8800 GTS by 20% at best. It seems that the game is not able to use all the resources that Nvidia's flagship model has. This is evidenced by the slight difference between the regular and overclocked version of the GeForce 8800 GTS.

Playtests: Neverwinter Nights 2

Starting from version 1.04, the game allows you to use FSAA, but HDR support is still a work in progress, so we tested NWN 2 in "eye candy" mode.


As already mentioned, the minimum playability threshold for Neverwinter Nights 2 is 15 frames per second, and the GeForce 8800 GTS 320MB balances on this edge already at 1600x1200, while for the version with 640 MB of memory, 15 fps is the minimum figure below which its performance never drops.

Playtests: The Elder Scrolls IV: Oblivion

Without HDR, the game loses a lot of its appeal, and although the opinions of the players differ on this point, we tested TES IV in the mode with FP HDR enabled.


The performance of the GeForce 8800 GTS 320MB depends directly on the resolution used: at 1280x1024 the newcomer can compete with the highest-performing cards of the previous generation, but at 1600x1200 and, especially, 1920x1200 it loses to them, yielding up to 10% to the Radeon X1950 XTX and up to 25% to the GeForce 7950 GX2. Nevertheless, this is a very good result for a solution with an official price of only $299.

The regular GeForce 8800 GTS and MSI's overclocked variant feel more confident and deliver comfortable first-person shooter-level performance at all resolutions.


While examining two versions of the GeForce 7950 GT, which differ in the amount of video memory, we did not find any serious differences in performance in TES IV, however, in a similar situation with two versions of the GeForce 8800 GTS, the picture is completely different.
If at 1280x1024 they behave identically, then already at 1600x1200 the version with 320 MB of memory falls more than two times behind the version equipped with 640 MB, and at 1920x1200 its performance drops to the level of the Radeon X1650 XT. It is quite obvious that the issue here is not the amount of video memory as such, but the way the driver allocates it. It is likely that the problem can be fixed by tweaking ForceWare, and we will check this once new versions of the Nvidia drivers are released.

As for the GeForce 8800 GTS and MSI NX8800GTS OC Edition, even in the open spaces of the Oblivion world they provide a high level of comfort in all resolutions, although, of course, not around 60 fps, as in closed rooms. The most powerful solutions of the previous generation are simply not able to compete with them.

Playtests: X3: Reunion


The average performance of all representatives of the GeForce 8800 family is quite high, but the minimum performance is still at a low level, which means that drivers need to be improved. The results for GeForce 8800 GTS 320MB are the same as for GeForce 8800 GTS 640MB.

Playtests: Command & Conquer 3: Tiberium Wars

The Command & Conquer real-time strategy series is probably familiar to anyone who is even slightly fond of computer games. The continuation of the series, recently released by Electronic Arts, takes the player into the well-known world of confrontation between GDI and the Brotherhood of Nod, joined this time by a third faction in the form of alien invaders. The game engine is up to modern standards and uses advanced special effects; in addition, it has one peculiarity - an fps limiter fixed at around 30 frames per second. This may be done to limit the speed of the AI and thus avoid giving it an unfair advantage over the player. Since the limiter cannot be disabled by regular means, we tested the game with it enabled, which means we paid attention first of all to the minimum fps.


Almost all test participants are able to provide 30 fps at all resolutions, except for the GeForce 7950 GX2, which has problems with SLI mode. Most likely, the driver simply lacks the appropriate support, since the official Nvidia ForceWare driver for Windows XP for the GeForce 7 family was last updated more than six months ago.

As for the two GeForce 8800 GTS models, they demonstrate the same minimum fps and therefore provide the same level of comfort for the player. Although the model with 320 MB of video memory is inferior to the older model at 1920x1200, 2 frames per second is hardly a critical difference, which, given the same minimum performance, again does not affect the gameplay. Only the GeForce 8800 GTX, whose minimum fps never drops below 25 frames per second, can guarantee completely smooth control.

Playtests: Company of Heroes

Due to problems with FSAA activation in this game, we decided not to use the "eye candy" mode and tested it in pure performance mode with anisotropic filtering enabled.


Here is another game where the GeForce 8800 GTS 320MB is inferior to previous-generation cards with a non-unified architecture. In fact, the $299 Nvidia solution is usable at resolutions no higher than 1280x1024 even with anti-aliasing turned off, while the $449 model, which differs in only one parameter - the amount of video memory - copes successfully even at 1920x1200. However, the latter is also within reach of AMD Radeon X1950 XTX owners.

Playtests: Supreme Commander


Supreme Commander, unlike Company of Heroes, does not impose strict requirements on the amount of video memory. In this game, the GeForce 8800 GTS 320MB and the GeForce 8800 GTS show equally high results. Some additional gain can be obtained through overclocking, as the MSI product demonstrates, but even that does not reach the level of the GeForce 8800 GTX. However, the available performance is enough to use all resolutions, including 1920x1200, especially since its fluctuations are small and the minimum fps is only slightly lower than the average.

Synthetic benchmarks: Futuremark 3DMark05


Since by default 3DMark05 uses a resolution of 1024x768 and does not use full-screen anti-aliasing, the GeForce 8800 GTS 320MB naturally demonstrates the same result as the regular variant with 640 MB of video memory. The overclocked version of the GeForce 8800 GTS supplied to the market by Micro-Star International boasts a beautifully even result - 13800 points.






Unlike the overall score, which is obtained in the default mode, the individual tests are run in "eye candy" mode. In this case, however, that had no effect on the performance of the GeForce 8800 GTS 320MB: there was no noticeable lag behind the GeForce 8800 GTS even in the third, most resource-intensive test. The MSI NX8800GTS OC Edition took a stable second place after the GeForce 8800 GTX in all cases, confirming the results of the overall standings.

Synthetic benchmarks: Futuremark 3DMark06


Both versions of the GeForce 8800 GTS behave in the same way as in the previous case. However, 3DMark06 uses more complex graphics, which, combined with the use of FSAA 4x in some tests, may paint a different picture. Let's see.






The results of the individual test groups are also as expected. The SM3.0/HDR group uses a larger number of more complex shaders, so the advantage of the GeForce 8800 GTX is more pronounced there than in the SM2.0 group. The AMD Radeon X1950 XTX also looks better when Shader Model 3.0 and HDR are actively used, while the GeForce 7950 GX2, on the contrary, does better in the SM2.0 tests.




After enabling FSAA, the GeForce 8800 GTS 320MB really starts losing to the GeForce 8800 GTS 640MB at 1600x1200, and at 1920x1200 the new Nvidia solution cannot pass the tests at all due to lack of video memory. The loss is close to twofold both in the first and second SM2.0 tests, despite the fact that they are very different in terms of the construction of graphic scenes.






In the first SM3.0/HDR test, the impact of video memory size on performance is clearly visible even at 1280x1024. The younger model GeForce 8800 GTS lags behind the older one by about 33%, then, at 1600x1200, the gap increases to almost 50%. The second test, with a much less complex and large-scale scene, is not so demanding on the amount of video memory, and here the lag is 5% and about 20%, respectively.

Conclusion

Time to take stock. We tested two Nvidia GeForce 8800 GTS models, one of which is a direct competitor to the AMD Radeon X1950 XTX, while the other is aimed at the $299 mainstream performance sector. What can we say now that we have the gaming test results?

The older model, with an official price of $449, performed well. In most tests the GeForce 8800 GTS outperformed the AMD Radeon X1950 XTX, and only in some cases did it show performance equal to the AMD solution or fall behind the dual-processor GeForce 7950 GX2 tandem. However, given the exceptionally high performance of the GeForce 8800 GTS 640MB, we would not compare it directly with products of the previous generation: they do not support DirectX 10, and the GeForce 7950 GX2 additionally suffers from noticeably worse anisotropic filtering quality and from potential problems caused by the incompatibility of certain games with Nvidia SLI technology.

The GeForce 8800 GTS 640MB can definitely be called the best solution in the $449-$499 price range. However, it is worth noting that the new generation of Nvidia products is still not cured of its childhood diseases: Call of Juarez still shows flickering shadows, and Splinter Cell: Double Agent, although it works, requires a special launch procedure with version 97.94 of the drivers. At least until cards based on AMD's new-generation graphics processor appear, the GeForce 8800 GTS has every chance of holding its rightful place as "the best $449 accelerator". Nevertheless, before purchasing a GeForce 8800 GTS, we would recommend checking the new Nvidia family's compatibility with your favorite games.

The new GeForce 8800 GTS 320MB at $299 is also a very good purchase for the money: support for DirectX 10, high-quality anisotropic filtering and a decent level of performance at typical resolutions are just some of the advantages of the new product. Thus, if you plan to play at 1280x1024 or 1600x1200, the GeForce 8800 GTS 320MB is an excellent choice.

Unfortunately, this technically very promising card, which differs from the more expensive version only in the amount of video memory, is sometimes seriously inferior to the GeForce 8800 GTS 640MB, not only in games with high video memory requirements, such as Serious Sam 2, but even in games where there used to be no difference between cards with 512 and 256 MB of memory. These include, in particular, TES IV: Oblivion, Neverwinter Nights 2, F.E.A.R. Extraction Point and some others. Given that 320 MB of video memory is clearly more than 256 MB, the problem is evidently related to its inefficient allocation, but, unfortunately, we do not know whether it is caused by flaws in the drivers or by something else. Nevertheless, even with the shortcomings described above, the GeForce 8800 GTS 320MB looks much more attractive than the GeForce 7950 GT and Radeon X1950 XT, although the latter will inevitably drop in price with the arrival of this video adapter.

As for the MSI NX8800GTS-T2D640E-HD-OC, it is a well-equipped product that differs from the Nvidia reference card in more than just the packaging, accessories and a sticker on the cooler. The video adapter is overclocked by the manufacturer and in most games provides a noticeable performance boost over the standard GeForce 8800 GTS 640MB. Of course, it cannot reach the level of the GeForce 8800 GTX, but extra fps are never superfluous. These cards are apparently selected for their ability to run at higher frequencies; at least our sample showed quite good overclocking results, and it is possible that most NX8800GTS OC Edition cards can be overclocked well beyond what the manufacturer has already done.

The inclusion of a two-disc edition of Company of Heroes, considered by many reviewers to be the best strategy game of the year, deserves special praise. If you are serious about buying a GeForce 8800 GTS, this MSI product has every chance of becoming your choice.

MSI NX8800GTS-T2D640E-HD-OC: pros and cons

Advantages:

Improved performance over reference GeForce 8800 GTS
High performance at high resolutions with FSAA
Low noise
Good overclocking potential
Good bundle

Flaws:

Insufficiently debugged drivers

GeForce 8800 GTS 320MB: advantages and disadvantages

Advantages:

High performance in its class
Support for new anti-aliasing modes and methods
Excellent quality anisotropic filtering
Unified architecture with 96 shader processors
Future proof: DirectX 10 and Shader Model 4.0 support
Efficient cooling system
Low noise

Flaws:

Insufficiently debugged drivers (problem with video memory allocation, poor performance in some games and/or modes)
High power consumption
