The trend toward multi-chip hi-end solutions only intensifies every year, and both GPU makers have managed to introduce a second generation of video cards carrying two GPUs of a unified architecture. AMD went even further, allowing a tandem of two Radeon HD 4850s to appear as well, while the flagship operates at higher frequencies and is equipped with GDDR5 memory, just like the Radeon HD 4870. For technological reasons it is easier to build a high-performance solution from two mid-range chips than from one more powerful chip, but this drives up the final power consumption and, accordingly, heat output. Besides, the application must properly support SLI or CrossFire mode, otherwise the accelerator performs no better than a single card.

Initially, AMD and NVIDIA each followed their own path in developing dual-chip solutions: the former built a compact device on a single PCB, the latter produced tandems in the form of a "sandwich" of two separate boards cooled by a common heatsink. The latter approach is naturally less preferable, since the use of two half-cards raises the final cost of the product and requires efficient heat removal from graphics processors facing each other. But the Californians have recently switched their flagship GeForce GTX 295 to a new design that uses only one circuit board. Rumors even circulated on the Web about the new product's allegedly greater length and higher power consumption compared to the old version of the card. Whether that is so, we will try to find out in this review.


Inno3D GeForce GTX 295 Platinum Edition

Representing the new revision of the GeForce GTX 295 accelerator is a video card from Inno3D bearing the rather grand name Platinum Edition. The card comes in a small box with a picture of a warrior girl, which also lists the adapter's main characteristics and the two games included in the kit.


Package contents of the Inno3D GeForce GTX 295 Platinum:
  • DVI/D-Sub adapter;
  • DVI/HDMI adapter;
  • audio cable (S/PDIF);
  • disc with the game Warmonger;
  • CD with Far Cry 2 game;
  • disk with drivers;
  • 10% discount coupon for the purchase of five games.



The design of the GeForce GTX 295 Rev. B differs fundamentally from its predecessor, both in the placement of all components on a single board and in the 85 mm fan located in the center, which exhausts air both outside the system unit and inward through the end of the card. Something similar was used back in the days of the GeForce 7900 GTX.





Two heatsink plates are screwed to the reverse side, cooling some of the memory chips. A similar, but one-piece, solution is used on the Radeon HD 4870 X2.

In size, the new product does not differ at all from the old revision, although reports on the Web claimed the accelerator was slightly longer than the previously released GeForce GTX 295.



In addition, the card has lost its HDMI connector, so modern monitors and TVs now have to be connected the old-fashioned way, through an adapter. The remaining peripheral connectors, namely two Dual Link DVI ports, remain in place. Naturally, there is no HDTV connector. The bracket also carries an indicator reporting correct connection of the power cables; nothing has changed in this regard.



Like its predecessor, the new revision of the GTX 295 is equipped with only one MIO interface, which allows two identical cards to be combined in Quad-SLI mode. NVIDIA's technical department, with some hesitation, confirmed that a pair of different accelerators can work in this mode, but all our attempts to boot with two video cards of different revisions were unsuccessful.

Structurally, the cooling system has much in common with that of its competitor, the Radeon HD 4870 X2, but, as noted above, the adapter in question is equipped with a large fan instead of a blower; part of the air, after cooling one GPU's heatsink, enters the case through a hole in the rear shroud.



The cooler cover is easy to remove, unlike on the first cards based on the 65 nm G200, and to prevent the plastic from rattling during operation, small springs are installed on the guides around the perimeter.



The cooler consists of an aluminum base that cools the memory, the NF200 switch chip, two NVIO2 chips and the power circuitry, while two separate heatsinks with heat pipes cool the GPUs.





The heatsinks are small, even smaller than those on the single-chip GeForce GTX 200 series cards; each has a copper insert and two heat pipes that transfer heat to thin aluminum fins.



The heatsinks are blown by a fan with a maximum speed of about 3400 rpm. In normal operation the fan speed does not rise above 3100 rpm, and despite this the system works much quieter than blower-type coolers.

The disassembled cooling system looks like this:



NVIDIA's engineers had to work hard on the printed circuit board design, because compared to the Radeon HD 4870 X2 the Californian tandem carries more memory chips and more auxiliary silicon.





But in the end the arrangement of components on the board turned out quite compact, thanks both to moving some of the memory chips to the back side of the card and to grouping the auxiliary chips and power circuitry in the center of the board. The power subsystem, as on the old revision of the GTX 295, is split: three phases per GPU and one for the memory. The card is powered by one 8-pin and one 6-pin connector.



The card carries G200-400 and G200-401 revision B3 GPUs. There are no protective frames, and none are needed, given that the cooler base serves as a stiffening rib for the entire board.



Twenty-eight GDDR3 Hynix H5RS5223CFR-N0C chips (fourteen for each processor) designed for an effective frequency of 2000 MHz are used as memory.



The total amount is 1792 MB, but the amount available to applications is only half that, since in SLI mode, as in CrossFire, the data in each GPU's memory is duplicated. The memory bus is 448 bits per "half" of the card.
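Since each GPU keeps its own full copy of the working data, the usable amount is half the installed total. A minimal sketch of that arithmetic, using this card's figures (the variable names are ours; the 512 Mbit per-chip density follows from 28 chips adding up to 1792 MB):

```python
# Total vs application-visible VRAM on a dual-GPU card (illustrative sketch).

CHIP_MBIT = 512      # each Hynix H5RS5223CFR chip holds 512 Mbit
chips_total = 28     # 14 chips per GPU

total_mb = chips_total * CHIP_MBIT // 8   # installed: 28 * 64 MB = 1792 MB
per_gpu_mb = total_mb // 2                # 896 MB attached to each GPU
visible_mb = per_gpu_mb                   # SLI/CrossFire duplicate the data,
                                          # so applications see only one copy

print(total_mb, visible_mb)  # 1792 896
```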

The operating frequencies of the Inno3D GeForce GTX 295 Platinum almost match the reference ones: 576/1242 MHz (core and shader domain) and 2016 MHz (memory). The memory frequency is thus raised 18 MHz above the reference value.



When switching to 2D mode, to reduce power consumption, the cards first drop the frequencies to 400/800 and 600 MHz (core and memory), then to 300/600 and 200 MHz.

The card overclocked to 684/1476 MHz (raster and shader domains) on the chips and 2376 MHz on the memory, which is very good for a dual-chip solution. Our GeForce GTX 295 Rev. A sample, for example, failed to reach such frequencies.



Besides the overclocking potential, the adapter's thermals were pleasing: no more than 82 °C on the GPUs when testing in games and no more than 86 °C when running FurMark.

For comparison with the new product we used an old-revision video card from XFX, the XFX GF GTX295 576M 1792MB DDR3. The card comes in a large green-and-black box that lists only the product's features and the bundled Far Cry 2 game.



Package contents:
  • DVI/D-Sub adapter;
  • audio cable (S/PDIF);
  • HDTV adapter;
  • Molex/PCI-E power adapter;
  • disk with drivers;
  • CD with Far Cry 2 game;
  • instructions.



The design of the card fully corresponds to the reference: the same "sandwich" as the GeForce 9800 GX2, but to save money it has no back cover and is equipped with a simpler front one.





The card has two Dual Link DVI ports and HDMI; next to them on the bracket is the power-cable indicator. There is one MIO interface for combining identical cards in SLI mode, but our two test samples of different revisions refused to work together.



The cooling system design differs slightly from the GeForce 9800 GX2's, but, as before, part of the air is exhausted into the system unit and part outside the case.



To power the adapter, the half-cards carry two connectors: one 6-pin and one 8-pin. Disconnecting the cables is now easier, though not as convenient as it would be if the connectors were placed along the edge of the board.

The graphics processors operate at 576/1242 MHz and the memory, with a total capacity of 1792 MB, at 1998 MHz, which fully matches the reference specifications. Transitions to 2D mode are the same as on the card discussed above.



The card overclocked to only 612/1332 MHz on the chips and 2232 MHz on the memory, which is significantly below the result of the GeForce GTX 295 Rev. B.
The temperature of one of the cores during overclocking reached 98 °C, with the blower running at 85% of its maximum speed. That was in games; in the FurMark benchmark the GPU warmed up to 100 °C with the blower already at 100 percent. The roar of such a system is not hard to imagine.

Force3D Radeon HD 4870 X2

To compete in the upper price segment, AMD last year released a dual-chip solution based on the RV770, which managed to take the crown first from the GeForce GTX 280 and then from the GTX 285 on the updated G200. At first the 65 nm process did not let NVIDIA combine two graphics cores on one video card, but the transition to a thinner process node made it possible to release a competitor for the Radeon HD 4870 X2: the GeForce GTX 295.

A solution based on a pair of RV770s came to us for testing in OEM configuration, i.e. with no accessories at all, so let's go straight to the product description. The Black Force3D Radeon HD 4870 X2 video card follows the reference design and differs from similar adapters only by a branded sticker on the cooling system. Compared to the single-chip Radeon HD 4870, this adapter is longer, matching GeForce GTX 200 series products in size.





The set of peripheral connectors is standard: two Dual Link DVI with the ability to output digital audio via a DVI/HDMI adapter from the built-in multi-channel audio codec, HDTV output and one interface for connecting a bridge in CrossFire mode.



The cooling system consists of an aluminum base that cools the memory, the switch chip and the power components, a pair of copper heatsinks for the GPUs, and a memory heatsink plate on the back of the card.



The blower, installed at the edge of the adapter, pushes air through one heatsink first; the heated air then cools the second one less efficiently, and the difference between the cores can reach 15 °C (75 versus 90). The cooler of the new GeForce GTX 295 revision looks much more appealing by comparison.



Unlike the new revision of the GeForce GTX 295, the printed circuit board has plenty of free space on the right side; all the main components are grouped in the center and near the peripheral connectors: the two graphics processors, a PLX PEX8647 switch chip, half of the memory (the rest is moved to the back side of the card) and the power circuitry.



The RV770 processor is equipped with a protective frame that prevents the core from chipping when installing the cooling system.



The switch chip for connecting two GPUs was previously used on the Radeon HD 3870 X2, but now it supports the PCI Express 2.0 bus, unlike PCI-E 1.1 in the old version.



Sixteen Hynix H5GQ1H24MJR-T0C GDDR5 memory chips totaling 2048 MB (1024 MB per GPU) are rated for an effective frequency of 4000 MHz. The memory bus is 256 bits per GPU.



The operating frequencies of the Force3D Radeon HD 4870 X2 fully correspond to the reference ones and are equal to 750/3600 MHz, chip and memory respectively. When switching to 2D mode, the frequencies are reduced to 507/2000 MHz.



The card overclocked to 792 MHz on the GPUs and 3800 MHz on the memory, which is not particularly impressive, but considering how hot the accelerator runs it can be considered quite worthy.
As for heating, the temperature of one of the processors during overclocking reached 90 °C, the other only 75 °C. In FurMark the temperature rose to 93 °C and the card sounded more like a vacuum cleaner than a gaming solution, so if possible it is better to replace the reference cooler, for example with Arctic Cooling's Accelero XTREME 4870X2.

BFG GeForce GTX 285 OC

The American company's products have appeared on the domestic market only recently and, while not widely available, have already won over a certain category of users.

The BFG GeForce GTX 285 OC comes in a small box with a picture of a muscle-bound magician; it also lists the card's operating frequencies alongside the reference specifications so the difference can be appreciated right at the store shelf.



Package contents:
  • DVI/D-Sub adapter;
  • DVI/HDMI adapter;
  • HDTV adapter;
  • audio cable (S/PDIF);
  • Molex/PCI-E power adapter;
  • instructions;
  • warranty memo;
  • advertising insert;
  • disk with drivers;
  • stickers.



The BFG GeForce GTX 285 OC fully complies with the reference products and differs only in a branded sticker on the cooling system.





The set of peripheral connectors is the same: two Dual Link DVI and HDTV. There are a couple of MIO interfaces for combining such cards in SLI mode. The card is powered by two six-pin connectors.



The accelerator's operating frequencies differ slightly from the reference GeForce GTX 285: 666/1512 MHz on the chip and 2484 MHz on the memory, i.e. 18/36 MHz higher for the core and shader domain, with the memory at the reference value. The transition to 2D mode is the same as on the XFX GF GTX295 576M 1792MB DDR3.



The overclocking of the card was not particularly impressive and amounted to 678/1548 MHz for the chip and 2646 MHz for the memory.



During overclocking the chip temperature reached 90 °C, with the blower spinning at its full 100%, about 3160 rpm. The cooler certainly cannot be called quiet, but for hi-end products this has become the norm.

Characteristics of the cards in question
Video adapter | Inno3D GTX 295 Platinum | XFX GF GTX295 576M 1792MB DDR3 | Force3D Radeon HD 4870 X2 | BFG GeForce GTX 285 OC
Core | 2x G200b | 2x G200b | 2x RV770 | G200b
Transistor count | 2x 1.4 billion | 2x 1.4 billion | 2x 956 million | 1.4 billion
Process technology, nm | 55 | 55 | 55 | 55
Stream processors | 2x 240* | 2x 240* | 2x 800* | 240
Texture units | 2x 80 | 2x 80 | 2x 40 | 80
Render units | 2x 28 | 2x 28 | 2x 16 | 32
Core frequency (nominal), MHz | 576 | 576 | 750 | 666 (648)
Shader domain frequency (nominal), MHz | 1242 | 1242 | 750 | 1512 (1476)
Memory bus, bits | 2x 448 | 2x 448 | 2x 256 | 512
Memory type | GDDR3 | GDDR3 | GDDR5 | GDDR3
Memory size, MB | 2x 896 | 2x 896 | 2x 1024 | 1024
Memory frequency (nominal), MHz | 2016 | 1998 | 3600 | 2484
DirectX version supported | 10 | 10 | 10.1 | 10
Interface | PCI Express 2.0 | PCI Express 2.0 | PCI Express 2.0 | PCI Express 2.0

* The AMD and NVIDIA GPU architectures differ, so stream processor counts are not directly comparable.
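The footnote matters because peak shader throughput is conventionally computed differently for the two architectures: each GT200 stream processor is credited with up to 3 FLOPs per shader clock (MAD plus MUL), while each RV770 ALU is credited with 2 FLOPs (MAD) per core clock. A rough sketch under those common conventions, using the table's nominal frequencies (the helper function is ours):

```python
# Theoretical peak single-precision throughput, in GFLOPS.

def gflops(sp_count: int, flops_per_clock: int, clock_mhz: float) -> float:
    return sp_count * flops_per_clock * clock_mhz / 1000.0

# GT200b: 240 SPs, 3 FLOPs/clock at the shader-domain frequency.
gtx285 = gflops(240, 3, 1476)        # single GPU at reference clocks
gtx295 = 2 * gflops(240, 3, 1242)    # two GPUs at 1242 MHz
# RV770: 800 ALUs, 2 FLOPs/clock at the 750 MHz core frequency.
hd4870x2 = 2 * gflops(800, 2, 750)

print(round(gtx285), round(gtx295), round(hd4870x2))  # 1063 1788 2400
```

These peak numbers explain AMD's raw-compute advantage on paper, even though, as the tests below show, it rarely translates into a proportional lead in games.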

Test stand and test conditions

Testing was carried out on the following stand:

  • Processor: Intel Core 2 Duo E8500 (3.16 GHz @ 4.27 GHz, FSB 450 MHz);
  • Motherboard: ASUS Rampage Formula (Intel X48);
  • Cooler: Noctua NH-U12P;
  • RAM: G.Skill F2-8800CL5D-4GBPI (2x2048 MB, DDR2-1100, 4-4-4-12-2T, dual channel);
  • Hard disk: Samsung HD252HJ (250 GB, SATA2);
  • Power supply: Silver Power SP-S850 (850 W);
  • Case: Chieftec BA-01B-B-SL;
  • Operating system: Microsoft Windows Vista x86 SP1;
  • Drivers: NVIDIA GeForce 185.85 and ATI Catalyst 9.4.
The drivers were configured for maximum quality. PhysX version 2.9.09.2 was installed; GPU PhysX acceleration was left enabled except when testing in 3DMark Vantage.

The following applications were used for testing:

  • 3DMark'06 - v1.1.0, default settings;
  • 3DMark Vantage - v1.0.1, "Performance" and "High" profile, basic tests;
  • The Last Remnant Benchmark - no settings other than resolution;
  • Cryostasis TechDemo - "High" preset;
  • Far Cry 2 - built-in benchmark, "Ultra High" profile;
  • S.T.A.L.K.E.R.: Clear Sky Benchmark - "Ultra" quality, Day test, DirectX 10.1 support selected for Radeon HD 4870 X2;
  • H.A.W.X. - maximum quality, SSAO - Very High, support for DirectX 10.1 was selected for the Radeon HD 4870 X2 card;
  • Crysis - v1.2, used Crysis Benchmark Tool 1.05, standard demo, quality settings - "Very High";
All measurements were carried out at a screen resolution of 1680x1050 (except 3DMark'06 and 3DMark Vantage with the "Performance" profile) to create a more or less serious load on the accelerators. In addition, image quality enhancement modes were used in all applications that allowed it: 4x or 8x anti-aliasing and/or 16x anisotropic filtering. AA and AF were not forced in the drivers.

On graphs where there is no legend, the average and minimum fps are indicated.

Test results



In the older of the two 3DMark synthetic suites the Radeon HD 4870 X2 leads. However, the Inno3D card easily closes the gap thanks to its good overclocking potential. The single-chip NVIDIA flagship is not far behind the top cards in the light mode, but with anti-aliasing it trails the GeForce GTX 295 by up to 40%.





In 3DMark Vantage the lead passes to NVIDIA's dual-chip cards; AMD's flagship lags far behind, and the gap only widens with resolution (up to 28%). The single-chip GeForce GTX 285 cannot compete with its older siblings, yet only about 20% separates it from the Radeon HD 4870 X2, a gap that no overclocking helps it close.

The Last Remnant



The meager difference between the NVIDIA dual-chip cards and the GeForce GTX 285 immediately catches the eye. The Radeon HD 4870 X2's result is very low; looking at our previous reviews, you will see roughly the same numbers from a regular Radeon HD 4870. So CrossFire does not work in this game, and SLI, it seems, barely works either.

Cryostasis TechDemo



Because this test uses the NVIDIA PhysX engine, Radeon accelerators do not look good here, but in the actual game PhysX can be disabled and Radeon cards played comfortably. As for the GeForce adapters, the dual-chip cards lead, but by no more than 20%, and by minimum fps the difference is even smaller.

Far Cry 2



Far Cry 2 produces unusual results in the light mode. All the dual-chip cards trail the GeForce GTX 285, and overclocking brings no performance gain, sometimes even a slight drop in the final result, which suggests the drivers are to blame. Let's see how things change with anti-aliasing enabled.



In the heavy mode things look more or less reasonable for the NVIDIA cards, but not for the Radeon, which loses about 25% of its performance and becomes the outsider. The dual-chip GeForce cards, meanwhile, show roughly the same results as in the light mode, outperforming the GeForce GTX 285 by 14%, although by minimum fps they barely differ; that small gap can be explained by this test's high CPU dependence.

S.T.A.L.K.E.R.: Clear Sky



Dual-GPU cards from NVIDIA and AMD demonstrate almost identical results, but the Radeon HD 4870 X2 still has a minimal advantage in terms of minimum fps. High overclocking potential easily brings Inno3D GeForce GTX 295 Platinum to the lead. The gap between the single-chip GeForce GTX 285 and more expensive accelerators is about 45%.



With anti-aliasing enabled, both GeForce GTX 295s show somewhat mixed results in this game. Their minimum fps sags badly, trailing even the GeForce GTX 285 by this metric. It seems the drivers for NVIDIA's dual-chip cards still need considerable work.

H.A.W.X.



In this application, the undisputed leader is the Radeon HD 4870 X2, especially in heavy mode with anti-aliasing. It has a 70% advantage in this mode over the GeForce GTX 295 and more than a twofold advantage over the GeForce GTX 285!



In Crysis the dual-chip NVIDIA flagships confidently take the lead again. The updated GeForce GTX 295's good overclocking potential helps the Inno3D card outpace all rivals. The GeForce GTX 285, although far behind the older cards, is not so far behind the Radeon HD 4870 X2 in minimum fps.



With anti-aliasing enabled, the Radeon HD 4870 X2 loses much less performance than all the GeForce cards and at stock clocks even outruns the GeForce GTX 295, though it again cannot match it in minimum fps. In this heavy mode the GeForce GTX 285 lacks raw compute power, and its gap behind the leaders only grows.

Power consumption

A Seasonic Power Angel was used to measure the total power consumption of the system. The video adapters were loaded with a 10-minute run of FurMark, three flights over the island in Crysis at Very High quality with 8x AA, and the first three tests of Devil May Cry 4 at Super High quality with 8x MSAA. The screen resolution was always 1680x1050. Idle power consumption was measured 10 minutes after the tests completed.







So, the new revision of the GeForce GTX 295 proved less power-hungry under load: peak consumption in the first two tests was 10 W lower than with the old GTX 295, while in the third application consumption was nearly equal, differing by only 2 W. At idle the system with the new card also drew less, by 5 W. The Radeon HD 4870 X2 broke all records for "economy": under load the system with it consumed 100 W more in FurMark, 50 W more in Crysis and 10 W more in DMC 4 than with the GeForce GTX 295, and at idle the difference reached almost 20 W. Clearly not a result in AMD's favor.

Conclusions

No sooner had the flagship based on a pair of G200s been released than a simplified design appeared, with a more efficient and quieter cooling system, lower power consumption (by our measurements) and better overclocking potential. Why could such a solution not have been released right away, instead of the original two-board contraption? The question remains open.

But be that as it may, the new revision of the GeForce GTX 295 turned out to be extremely successful, probably the best representative of the two-chip video cards ever produced. And given the low power consumption and higher performance than the Radeon HD 4870 X2, the new product becomes the best choice in the hi-end class.

As for the Radeon HD 4870 X2, this accelerator does not always beat the GeForce GTX 295, while its power consumption is much higher, especially with anti-aliasing enabled.

We thank the following companies for providing the test equipment:

  • Max Point for the Silver Power SP-S850 PSU;
  • Noctua for the Noctua NH-C12P cooler and Noctua NT-H1 thermal paste.
    Resting on one's laurels is a very pleasant occupation, but an extremely dangerous one, especially in the gaming industry and 3D graphics, where the situation can change overnight and you must be constantly ready to respond. Nvidia, which suffered seriously from ATI's unexpected massive attack with its new RV770 architecture, should understand this like no one else. Fortunately, Nvidia decided not to give up without a fight, and its first serious step was moving the G200 architecture to the 55 nm process. The result was very successful: as our testing showed, the new version of the GeForce GTX 260 Core 216 not only outpaced the Radeon HD 4870 1GB in most tests but also showed significantly lower power consumption. This paved the way for a solution capable not only of competing on equal terms with the Radeon HD 4870 X2, but of taking the title of "fastest gaming accelerator" away from it.

    The problem could be solved in only one way: by creating a dual-processor video adapter based on the G200. This can, of course, be considered a departure from Nvidia's professed principle of relying on the most powerful single-chip solutions, but in war as in war: no increase in the G200's frequency potential achieved by process optimization would let it fight alone against a pair of RV770s running in CrossFire mode. ATI's doctrine proved its superiority in practice, so Nvidia had no choice but to cast prejudice aside and try to beat its main competitor at its own game. Previously such a move was impossible, since a dual-chip solution based on the 65 nm G200 would have been excessively hot and power-hungry, but the transition to 55 nm made the venture a reality.

    Yes, high-performance and expensive hi-end graphics cards bring their creators little revenue; the bulk of sales comes from mass-market solutions costing up to $200. But they play another, no less and in some ways even more important role than simply earning profit. As already mentioned, the flagship defines the face of the squadron. Powerful solutions are a kind of battle banner demonstrating a company's technological capabilities, which matters greatly in attracting potential buyers and thus ultimately affects the company's market share. It is enough to recall ATI's position before the release of the Radeon HD 4000: the company had plenty to offer customers in the low-cost sector, and nevertheless it was rapidly losing ground in the market.

    Although the G200, even in its 55 nm variant, is not the most suitable GPU for building a dual-GPU video adapter, from the above point of view the release of the GeForce GTX 295 should be seen as a necessary step taken by Nvidia in response to the overly long dominance of the Radeon HD 4870 X2. While some observers believe that Nvidia's move away from betting on maximum-performance single-chip solutions is a temporary measure, we believe the company will continue to pursue the new strategy that has proven its superiority. This assumption is supported by our preliminary data on the next-generation graphics cores Nvidia is developing.

    But back to the goals of today's review. In it, we will try to cover Nvidia's new flagship solution comprehensively and put it against the Radeon HD 4870 X2 in a number of popular games in order to find out how much the GeForce GTX 295 can claim to be the champion of 3D gaming graphics.

    Nvidia GeForce GTX 295 vs. ATI Radeon HD 4870 X2: face to face

    The previous round, in which the GeForce GTX 280 and Radeon HD 4850 X2 fought in the ring, Nvidia lost. The new fighter has more impressive characteristics and claims absolute leadership, but it will face a very serious opponent in the Radeon HD 4870 X2. Let's compare their specifications:

    Both fighters look very impressive, each in its own field: the Radeon HD 4870 X2 has a monstrous head start in computing power, partly offset, however, by the peculiarities of its superscalar shader architecture, while the GeForce GTX 295 takes its revenge in texture and raster operations, at least on paper. Given the higher frequency of its execution units, this gives the newcomer every chance of winning in practice. A nice bonus is hardware PhysX acceleration: the GeForce GTX 295 can either use the usual scheme with dynamic distribution of computing resources or assign physics acceleration to one GPU, fully reserving the second one's resources for graphics.

    However, there is one characteristic in which the GeForce GTX 295 trails the Radeon HD 4870 X2: the amount of local video memory available to applications, 896 MB versus the opponent's 1024 MB. In theory this can hurt performance at high resolutions combined with full-screen anti-aliasing, especially in new games with serious video memory demands. As for bandwidth, thanks to its 448-bit buses the GeForce GTX 295 barely trails the Radeon HD 4870 X2 in this parameter. Among other shortcomings one can note the lack of DirectX 10.1 support, of an integrated audio core and of a full-fledged VC-1 hardware decoder, but since the GeForce GTX 295 claims the title of "fastest gaming card" and is clearly not meant for HTPC systems, these drawbacks are not too significant.
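    The bandwidth claim is easy to check: peak bandwidth is the bus width in bytes times the effective memory clock. A quick sketch with the reference figures (GDDR3 at 1998 MHz effective for the GTX 295, GDDR5 at 3600 MHz effective for the HD 4870 X2; the helper name is ours):

```python
# Per-GPU peak memory bandwidth at reference clocks (1 GB = 1e9 bytes).

def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    return bus_bits / 8 * effective_mhz / 1000.0

gtx295 = bandwidth_gb_s(448, 1998)    # 448-bit GDDR3 per GPU
hd4870x2 = bandwidth_gb_s(256, 3600)  # 256-bit GDDR5 per GPU

print(round(gtx295, 1), round(hd4870x2, 1))  # 111.9 115.2
```

    About 111.9 GB/s against 115.2 GB/s per GPU, which bears out the statement that the narrower GDDR5 bus and the wide GDDR3 bus end up nearly equal.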

    The GeForce GTX 295 inherited its SLI implementation entirely from its predecessor, the GeForce 9800 GX2.

    Overall, the newcomer is clearly armed to the teeth and determined to dethrone the Radeon HD 4870 X2, the reigning king of 3D. But before we find out how well Nvidia succeeded, let's take a closer look at one of the GeForce GTX 295 variants. Today's guinea pig is the EVGA GeForce GTX 295+.

    EVGA GeForce GTX 295+: packaging and bundle

    The entire series of EVGA products using the Nvidia G200 as the graphics core shares a unified package whose design varies only slightly from model to model. The EVGA GeForce GTX 295+ is no exception, arriving on store shelves in a standard black box of relatively small dimensions, decorated with a bright stripe and wrapped in plastic film.


    There are few differences from the packaging of the EVGA GeForce GTX 260 Core 216 Superclocked we reviewed earlier: the stripe crossing the box has turned dark red and gained a pattern of EVGA logos, and the lettering has gone from gray to silver. The design has nevertheless retained its sense of rigor and solidity. Unfortunately, to the very common mistake of specifying the wrong memory type, DDR3 instead of GDDR3, another, probably intentional, one has been added. We mean the amount of video memory: although the total is 1792 MB, only half of it, 896 MB, is actually available to applications, since in modern homogeneous multi-GPU systems the data is duplicated for each GPU.

    On the back of the box, as before, there is a window through which you can see a section of the board with the serial-number sticker that provides the warranty and entitles the user to participate in the EVGA Step-Up program. Here the latter looks rather odd: the EVGA GeForce GTX 295+ is currently the fastest gaming graphics adapter, and it is unlikely that within the 90-day Step-Up window the company will be able to offer anything significantly more powerful as an upgrade. Hypothetically, the EVGA GeForce GTX 200 lineup could gain an even more overclocked GeForce GTX 295, say with "Superclocked" in the name, but it is well known that in the vast majority of cases factory overclocking brings no serious performance gain that would justify swapping an already purchased card for the same but pre-overclocked model.

    The packaging's protective properties are commendable: instead of the plastic container used for less expensive EVGA models, a polyurethane foam tray with cut-out recesses is used here, in which, covered with lids, the video adapter and its set of accessories rest. Given the very high price of the product, the bundle is somewhat surprising in its asceticism:


    DVI-I → D-Sub adapter
    2x PATA → 6-pin PCIe power adapter
    2x 6-pin PCIe → 8-pin PCIe power adapter
    S/PDIF connecting cable
    Quick Installation Guide
    User's manual
    EVGA logo sticker
    CD with drivers and utilities

    Everything you need to install and fully operate the card as part of a powerful gaming platform is included in the kit, but nothing more: there are no frills such as the full version of Far Cry 2 that was bundled with the EVGA GeForce GTX 260 Core 216 Superclocked. The lack of a DVI-I → HDMI adapter could be explained by the fact that the card is equipped with a dedicated HDMI port, but, looking ahead, we can say that such an adapter would have been useful because of the peculiarities of multi-monitor support inherent in dual-processor Nvidia graphics cards.

    On the driver disc, in addition to the drivers themselves and the electronic version of the user manual, you can find useful utilities such as Fraps and EVGA Precision. The latter is a fairly handy tool for overclocking, fan speed adjustment and monitoring the graphics adapter's temperature.

    In general, the packaging of the EVGA GeForce GTX 295+ deserves high praise, both for its design and for its protective properties, but the bundle is clearly unworthy of a solution in the highest price category. The inclusion of a popular game would have seemed logical, especially since one of the EVGA products we reviewed earlier, despite its significantly lower cost, pleased the buyer with the full version of Far Cry 2.

    EVGA GeForce GTX 295+ PCB design

    The representative of the new, third generation of Nvidia dual-processor graphics cards is very similar to the second generation, GeForce 9800 GX2, both externally and by the design solutions implemented in it.






    However, if in the case of the competing Radeon HD 4870 X2 it was quite possible to do without a two-board layout, which is what AMD did, for the GeForce GTX 295 such a layout is a pressing necessity. The reason is simple: placing two huge G200b chips on one board, together with the routing of two 448-bit memory buses, would be impossible without a significant increase in the size of the whole structure, and the 27-centimeter length of cards based on a single G200 chip is already the maximum allowable for most modern ATX cases. Thus, the two-board layout is dictated by technological necessity, not by any engineering miscalculation. Going forward, Nvidia's move to GDDR5 memory should pave the way for simpler, cheaper dual-GPU graphics cards.

    As in the GeForce 9800 GX2, the GeForce GTX 295 boards are mounted face to face and share a single cooling system. This solution is quite controversial in terms of thermal efficiency, since even the 55nm version of the G200 cannot be called cool, and the components of both boards will inevitably heat each other through the common heatsink; but, as mentioned above, this arrangement is the only way to create a dual-processor graphics adapter based on the G200 while staying within the allowable length and height. The developers of the GeForce GTX 295 should also be praised for not cluttering the mounting plate with connectors, as was the case with the GeForce 9800 GX2: almost the entire "second floor" is occupied by slots that eject heated air out of the system case, although part of the air is still exhausted into the case.



    Unlike the GeForce 9800 GX2, the procedure for disassembling the GeForce GTX 295 is not too complicated: it is enough to remove the protective cover and the mounting plate and to unscrew all the screws that secure the boards to the cooling system, which is also the main load-bearing element; after that, it only remains to separate the structure into its component parts, carefully overcoming the resistance of the thermal paste.









    The extremely dense layout of both boards confirms the idea that a single-board version of the GeForce GTX 295 is impossible, even though part of the area of each of the two boards is occupied by a figured cutout serving as an air intake. The boards communicate with each other via two flexible cables connecting the connectors located on the left side.


    Each of the GeForce GTX 295 boards carries an independent four-phase GPU power regulator controlled by a Volterra VT1165MF PWM controller; however, while the top board is powered entirely from an 8-pin PCI Express 2.0 connector rated for a load of up to 150 W, the bottom board clearly draws part of its power from the power section of the PCI Express x16 slot. An Anpec APW7142 controller appears to be responsible for powering the memory.


    Next to the 6-pin power connector there is a 2-pin S/PDIF input, which is used to pass an external audio stream from the sound card through to HDMI. Its presence on the bottom board is natural, since the HDMI connector is also installed there. Interestingly, the dedicated HDMI port required the installation of a second NVIO chip, so the GeForce GTX 295 carries two of them. However, simultaneous operation of all three outputs is supported only with SLI mode disabled, which is rather pointless, since in this case the GeForce GTX 295 loses its main advantage: high performance in games. Support for dual-monitor configurations in SLI mode has been implemented since version 180 of the GeForce drivers; however, it is not as complete as in ATI CrossFireX technology, and the secondary monitor can switch off if a game runs on the primary one in full-screen mode.


    An nForce 200 chip is used as a bridge, which is also installed on some motherboards to support Nvidia SLI technology. It is a PCI Express 2.0 bus switch that supports 48 PCIe lanes and dual GPU direct mode.


    Each of the GeForce GTX 295 boards carries 14 GDDR3 Hynix H5RS5223CFR-N0C chips with a capacity of 512 Mbit (16M x 32), rated for a supply voltage of 2.05 V and a frequency of 1000 (2000) MHz. This is the frequency at which the memory should run according to Nvidia's official specifications, but EVGA subjected it to a slight overclock, so in this version of the GeForce GTX 295 the memory operates at 1026 (2052) MHz.

    Of course, it would be tempting to equip the new flagship of the GeForce GTX 200 line with two 1 GB memory banks on a 512-bit bus, but this would significantly complicate an already complicated design, so the developers compromised, endowing their brainchild with two banks of 896 MB each on a 448-bit bus. Thus, the total video memory of the GeForce GTX 295 is 1792 MB, of which, as in all homogeneous multi-GPU solutions, half is available to 3D applications. This should be enough even at 2560x1600, and yet in some cases a tandem of two separate GeForce GTX 280s can theoretically demonstrate higher performance thanks to its larger video memory. The peak bandwidth of the GeForce GTX 295 memory subsystem should be 224 GB/s, but in the EVGA version it is slightly higher, reaching 229.8 GB/s, which is almost equal to that of the Radeon HD 4870 X2 (230.4 GB/s).
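These bandwidth figures follow from bus width and effective memory clock; a sketch using the numbers quoted above (the helper itself is illustrative, not vendor code):

```python
# Peak memory bandwidth: bytes per transfer x effective transfer rate,
# summed over both GPUs' independent memory buses.
def bandwidth_gbps(bus_bits: int, effective_mhz: float, gpus: int = 1) -> float:
    return bus_bits / 8 * effective_mhz / 1000 * gpus

reference = bandwidth_gbps(448, 2000, gpus=2)  # stock GTX 295: 224.0 GB/s
evga      = bandwidth_gbps(448, 2052, gpus=2)  # EVGA overclock: ~229.8 GB/s
radeon    = bandwidth_gbps(256, 3600, gpus=2)  # HD 4870 X2: 230.4 GB/s
```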


    The GPUs are labeled G200-400-B3, meaning they are a newer revision of the G200b than those installed on the GeForce GTX 260 Core 216, which are labeled G200-103-B2. The chips were produced in week 49 of 2008, between November 30 and December 6. The core configuration is atypical for G200-based solutions: although all 240 shader and 80 texture processors are active in each of the two GPUs, some RBE units are disabled, since the memory controller configuration is tightly tied to them. Of the eight raster operation sections present in the core, seven are active, which is equivalent to 28 RBEs per core. Thus, each "half" of the GeForce GTX 295 is a cross between the GeForce GTX 280 and the GeForce GTX 260 Core 216. The official clock speeds correspond to the latter, 576 MHz for the main domain and 1242 MHz for the shader domain, but in the version under review EVGA bumped those figures up to 594 and 1296 MHz, respectively.
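The tie between RBEs and the memory bus can be expressed numerically; the sketch below assumes, per the figures above, four RBE units and one 64-bit memory channel per raster operation section:

```python
ROPS_PER_SECTION = 4       # RBE units per raster operation section
BUS_BITS_PER_SECTION = 64  # each section owns one 64-bit memory channel

def backend_config(active_sections: int) -> tuple:
    """Return (active RBE units, memory bus width in bits)."""
    return (active_sections * ROPS_PER_SECTION,
            active_sections * BUS_BITS_PER_SECTION)

# GeForce GTX 280: all 8 sections active -> 32 RBEs, 512-bit bus
# Each GTX 295 GPU / GTX 260 Core 216: 7 sections -> 28 RBEs, 448-bit bus
```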


    The card is equipped with three connectors for monitors: two DVI-I and one HDMI. The former are wired to the master GPU and can be used simultaneously in SLI mode, but the latter is wired to the slave core and can therefore only be used with SLI disabled. This is a rather strange technical decision that devalues the very presence of a dedicated HDMI port; for comparison, in the GeForce 9800 GX2 such a port was connected to the master GPU alongside one of the DVI-I ports.


    There is a blue LED next to one of the DVI ports, indicating that this port is the master port and should be used to connect the main monitor. Another LED, located at the HDMI connector, indicates the presence of problems with the card's power, and if there are none, it glows green. In addition, there is a single MIO port on the bottom board, which serves to organize two GeForce GTX 295s into a quad SLI system.

    EVGA GeForce GTX 295+: Cooling Design

    The cooling system of the GeForce GTX 295 is similar to that of the GeForce 9800 GX2 and, in addition to performing its main function, serves as the load-bearing element of the video adapter's design, since both boards are attached to it. Technically, the system is a kind of double-sided "sandwich": on its outer sides are copper heat exchangers in direct contact with the graphics cores, as well as protrusions that remove heat from other elements requiring cooling, while the filling is a thin-finned aluminum radiator connected to the heat exchangers by flat heat pipes.






    We did not dare to disassemble the cooling system, since its parts are securely glued together, but the main structural features are visible in the pictures anyway. The radiator fins are set at an angle to the mounting plate, so only part of the heated air leaves the system case through the slots in it, and the rest is expelled into the case from the end of the card, where a special cutout is provided in the casing. Compared to the GeForce 9800 GX2, though, the proportion of hot air expelled outside has increased significantly. As already mentioned, the aluminum bases of the system have a number of protrusions that provide thermal contact with the memory chips, NVIO chips, and power elements of the GPU voltage regulators. In the first two cases, the fibrous pads impregnated with white thermal paste that are traditional for Nvidia solutions serve as the thermal interface, and in the last case a very thick gray thermoplastic mass is used. In addition to the two main heat exchangers responsible for cooling the graphics cores, there is a third one with a much smaller area that cools the PCI Express switch chip; here the dark-gray thick thermal paste familiar from most modern graphics cards is used.



    At the back of the "sandwich" there is a 5.76 W radial fan that blows air through the radiator and is connected to the top board via a four-pin connector. Air is drawn in both from above and from below the video adapter, through corresponding holes in the printed circuit boards. The boards themselves are screwed to the bases of the cooling system with 13 spring-loaded screws each. From above, the cooling system is covered by a metal protective casing with a rubberized coating that is pleasant to the touch, but underneath there is only a plastic overlay with an EVGA sticker covering the ferrite cores of the voltage regulator coils.

    In general, this design can hardly be called optimal, especially considering the expected heat dissipation of the GeForce GTX 295 in the region of 220-240 W; however, like the video adapter as a whole, it is the product of a compromise that the developers, forced to fit within the given dimensions, accepted out of necessity. Most likely, the cooling system will cope with its task, but it would be unreasonable to expect outstanding thermal or noise characteristics from it. In the next chapter of our review, this assumption will be put to experimental verification.

    EVGA GeForce GTX 295+ Power Consumption, Thermals, Overclocking and Noise

    Unlike its ideological predecessors, which were only stopgap solutions, the GeForce GTX 295 seriously claims the role of flagship, designed to demonstrate Nvidia's technical superiority; therefore, characteristics such as power consumption, temperature and noise are of significant interest. To find out how the newcomer fares in these respects, we carried out the corresponding measurements.

    To study the level of energy consumption, a specially equipped stand with the following configuration was used:

    Processor Intel Pentium 4 560 (3.6 GHz, LGA775)
    Motherboard DFI LANParty UT ICFX3200-T2R/G (ATI CrossFire Xpress 3200)
    Memory PC2-5300 (2x512 MB, 667 MHz)
    HDD Western Digital Raptor WD360ADFD (36 GB)
    Power Supply Chieftec ATX-410-212 (power 410 W)
    Microsoft Windows Vista Ultimate SP1 32-bit
    Futuremark PCMark05 Build 1.2.0
    Futuremark 3DMark06 Build 1.1.0

    According to the standard methodology, the first SM3.0/HDR test of the 3DMark06 package was used to create the load in 3D mode, running in a loop at a resolution of 1600x1200 with forced FSAA 4x and AF 16x. Peak 2D mode was emulated using the 2D Transparent Windows test included in PCMark05. The last test is relevant in light of the fact that the Windows Vista Aero window interface uses the capabilities of the graphics processor.






    The transfer of the G200 to the 55nm process had a very beneficial effect on its electrical characteristics: the maximum recorded power consumption of the GeForce GTX 295 did not exceed 215 W, significantly lower than that demonstrated by the Radeon HD 4870 X2. Contrary to preliminary forecasts, the GeForce GTX 295 did not turn out to be a fire-breathing monster at all, which gives ATI reason to think seriously about the efficiency of its technologies: it turns out that, on the same process node, a pair of RV770s consumes significantly more in total than two G200b chips, despite a much smaller transistor count!

    As for the layout of the power lines, as expected, one of the GeForce GTX 295 boards is powered entirely from the 8-pin PCIe 2.0 connector, while the other, along with the 6-pin PCIe 1.0 connector, uses the power lines of the PCI Express x16 slot. Note that under load the chokes of the card's voltage regulators emitted a distinctly audible squeak, but we cannot yet say whether this behavior is typical of all GeForce GTX 295 cards without exception or is a quirk of our sample. Regarding power-supply requirements, Nvidia recommends units from 680 W upward, providing a total load current on the +12 V line of at least 46 A. In light of the power consumption figures we obtained, these recommendations look frankly inflated, and for Nvidia's new card we can safely recommend any high-quality power supply rated at 500-550 W.

    We already know that the 55nm version of the G200 has noticeably better overclocking potential than the old one, so we attempted to overclock our GeForce GTX 295. Starting from 594/1296 MHz for the core and 1026 (2052) MHz for the memory, we managed to reach 650/1418 MHz and 1200 (2400) MHz, respectively. A pretty good result for a card equipped with two G200b chips cooled by a single and, moreover, not the largest heatsink. Unfortunately, due to time constraints we did not manage to run the overclocked card through the full test suite, limiting ourselves to Crysis and Far Cry 2, as well as the popular 3DMark Vantage.


    During overclocking, the card's thermals were monitored. The measurements produced the following data:



    Miracles don't happen. Two cores dissipate more heat than one, and here they are cooled by a single heatsink, so their temperatures quite naturally turned out to be higher than those of single-processor G200-based cards, even in 2D mode, where the clock speeds of both cores are automatically reduced to 300/600 MHz. Fairly high temperatures were also recorded in 3D mode, although readings of 82-86 degrees Celsius have long ceased to be anything extraordinary for modern high-performance video adapters. The only concern is that the GeForce GTX 295 cooling system does not expel all the hot air out of the system case: some of it keeps circulating inside the computer, so before buying this video card you should take care of good case ventilation.

    Despite the tight layout, the GeForce GTX 295 demonstrated very good noise characteristics for its class:



    The reference cooling system of the GeForce GTX 295 is not only quieter than that of the Radeon HD 4870 X2, which has received plenty of fair criticism, but even slightly outperforms the reference cooler of the GeForce GTX 280 in acoustic terms. The card cannot be called completely silent but, firstly, we could not force it to raise its fan speed even after prolonged testing, and secondly, the spectral composition of the noise is quite comfortable: to the ear it comes across as a slight rustling of air, whereas the noise spectrum of the Radeon HD 4870 X2 contains the clearly audible, annoying buzz of the fan turbine. Thus, Nvidia continues to lead the way in developing quiet and reasonably efficient cooling systems for its graphics cards. Besides the question of the RV770's disproportionately high power consumption, ATI should also think about this, since its current reference cooling designs cannot be called quiet even by a stretch.

    Test platform configuration and testing methodology

    The EVGA GeForce GTX 295+ performance comparison study was conducted on a test platform with the following configuration:

    Processor Intel Core i7-965 Extreme Edition (3.2 GHz, 6.4 GT/s QPI)
    Motherboard Asus P6T Deluxe (Intel X58)
    Memory Corsair XMS3-12800C9 (3x2 GB, 1333 MHz, 9-9-9-24, 2T)
    Maxtor MaXLine III 7B250S0 hard drive (250 GB, SATA-150, 16 MB buffer)
    Power supply Enermax Galaxy DXX EGX1000EWL (power 1 kW)
    Dell 3007WFP Monitor (30”, maximum resolution 2560x1600@60 Hz)
    Microsoft Windows Vista Ultimate SP1 64-bit
    ATI Catalyst 8.12 for ATI Radeon HD
    Nvidia GeForce 181.20 WHQL for Nvidia GeForce

    Graphics card drivers have been tuned to provide the highest possible quality of texture filtering with minimal impact from default software optimizations. Transparent texture anti-aliasing was enabled, while multisampling was used for both architectures, since ATI solutions do not support supersampling for this feature. As a result, the list of ATI Catalyst and Nvidia GeForce driver settings looks like this:

    ATI Catalyst:

    Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
    Catalyst A.I. Standard
    Mipmap Detail Level: High Quality
    Wait for vertical refresh: Always Off
    Enable Adaptive Anti-Aliasing: On/Quality

    NVIDIA GeForce:

    Texture filtering - Quality: High quality
    Texture filtering - Trilinear optimization: Off
    Texture filtering - Anisotropic sample optimization: Off
    Vertical sync: Force off
    Antialiasing - Gamma correction: On
    Antialiasing - Transparency: Multisampling
    Multi-display mixed-GPU acceleration: Multiple display performance mode
    Set PhysX GPU acceleration: Enabled
    Select the multi-GPU configuration: Enable multi-GPU mode
    Other settings: default

    The composition of the test package has been subjected to some revision to better match the current realities. As a result of the revision, the following set of games and applications was included in it:

    3D First Person Shooters:

    Call of Duty: World at War
    Crysis Warhead
    Enemy Territory: Quake Wars
    Far Cry 2
    Left 4 Dead
    S.T.A.L.K.E.R.: Clear Sky


    Three-dimensional shooters with a third-person view:

    Dead Space
    Devil May Cry 4


    RPG:


    Fallout 3
    Mass Effect


    Simulators:

    Race Driver: GRID

    X³: Terran Conflict


    Strategies:

    Red Alert 3
    World in Conflict


    Synthetic tests:

    Futuremark 3DMark06
    Futuremark 3DMark Vantage

    Each of the games included in the set of test software was configured to provide the highest possible level of detail, and only the tools available in the game itself to any uninitiated user were used. This means a fundamental rejection of manual modification of configuration files, since the player is not required to be able to do this. For some games, exceptions were made, dictated by one or another consideration of necessity; each of these exceptions is mentioned separately in the relevant section of the review.

    In addition to the EVGA GeForce GTX 295+, the testers included the following graphics cards:

    Nvidia GeForce GTX 280 (G200, 602/1296/2214 MHz, 240 SP, 80 TMU, 32 RBE, 512-bit memory bus, 1024 MB GDDR3)
    Nvidia GeForce GTX 260 Core 216 (G200b, 576/1242/2000 MHz, 216 SP, 72 TMU, 28 RBE, 448-bit memory bus, 896 MB GDDR3)
    ATI Radeon HD 4870 X2 (2xRV770, 750/750/3600 MHz, 1600 SP, 80 TMU, 32 RBE, 2x256-bit memory bus, 2x1024 MB GDDR5)

    The first two cards from the above list were also tested in SLI mode, and the Radeon HD 4870 X2 was supplemented with a single Radeon HD 4870 1GB to assemble and test a Radeon HD 4870 3-way CrossFireX system.

    Testing was carried out at resolutions of 1280x1024, 1680x1050, 1920x1200 and 2560x1600. Wherever possible, standard 16x anisotropic filtering was supplemented with 4x MSAA anti-aliasing. Anti-aliasing was enabled either through the game's own settings or, in their absence, forced through the appropriate settings of the ATI Catalyst and Nvidia GeForce drivers. By popular demand from readers, some games were additionally tested with CSAA 16xQ forced for Nvidia solutions and CFAA 8x + Edge-detect Filter for ATI solutions. Both modes use 8 color samples per pixel, but Nvidia's algorithm provides twice as many coverage samples, while ATI's algorithm applies an additional edge-smoothing filter, which the company says makes it equivalent to MSAA 24x.

    To obtain performance data, we used the tools built into each game, with original test recordings played back wherever possible. Whenever possible, not only average but also minimum performance was recorded. In all other cases, the Fraps 2.9.8 utility was used in manual mode, with each test run three times and the final result averaged.

    Playtests: Call of Duty: World at War


    All test participants, with the exception of the GeForce GTX 280, are so powerful that they easily hit the performance ceiling in this game. Only at 2560x1600 do we obtain meaningful data, from which it follows that the ATI Radeon HD 4870 X2 cannot hold off its new dual-processor rival: a successful fight requires help in the form of another RV770 core.


    When using extreme anti-aliasing modes, Nvidia solutions have an obvious advantage in the less resource-intensive CSAA 16xQ algorithm, which provides 16 coverage samples but only 8 color samples, while ATI's CFAA 8x + Edge-detect Filter imposes an additional load on the GPU in the form of an edge-smoothing filter. As a result, the Radeon HD 4870 X2 shows the worst result among all the cards tested in these modes. Moreover, the practical scope of extreme FSAA modes is limited to 1280x1024, and the gain in image quality is so meager that you practically need a microscope to see it.

    Nvidia GeForce GTX 295: MSAA 4x / CSAA 16xQ

    ATI Radeon HD 4870 X2: MSAA 4x / CFAA 8x + Edge-detect


    The verdict is simple and logical: the minimum gain in fine detail smoothing is clearly not worth such a monstrous drop in performance.

    Playtests: Crysis Warhead


    Nvidia's new dual-GPU flagship leads confidently at all resolutions, trailing only slightly behind the much bulkier and hotter GeForce GTX 280 SLI tandem. However, at 2560x1600 what we feared comes to pass: the GeForce GTX 295 runs short of its 896 MB of local video memory. Still, the overall results are so low that this loss is of no practical consequence and is only of theoretical interest.

    Playtests: Enemy Territory: Quake Wars

    ET: Quake Wars has an average performance limiter fixed at 30 fps, since in multiplayer all events are synchronized at 30 Hz. To obtain more complete performance data, this limiter was disabled via the game console. Since testing relies on the game's internal tools, no information on minimum performance is available.


    In this case, the GeForce GTX 295 is also somewhat inferior to the GeForce GTX 280 SLI tandem, which can be explained by the lower bandwidth of the memory subsystem, coupled with the use of textures of large volume and resolution in the game. Nevertheless, it outperforms the Radeon HD 4870 X2, especially at 2560x1600, where the advantage of the novelty reaches 22%.


    But an attempt to use extreme anti-aliasing modes exposes a weakness of Nvidia's cards, and in a field where they have traditionally been strong: games using the OpenGL API. If at 1280x1024 the picture does not differ much from what is seen with the usual MSAA 4x, then at higher resolutions ATI's solutions take a sharp lead. At the same time, the strange behavior of the GeForce GTX 295 cannot be explained by a lack of video memory: there was no significant difference in performance between it and a pair of GeForce GTX 280s in SLI. As for image quality, the differences are almost invisible to the naked eye, especially at resolutions from 1680x1050 upward, so the point of using the extreme modes is not merely small, it is practically non-existent.

    Playtests: Far Cry 2


    The behavior of the GeForce GTX 295 is within the framework of preliminary forecasts - the performance it demonstrates is approximately at the level of the GeForce GTX 260 Core 216 SLI tandem and slightly lower than the performance of the GeForce GTX 280 SLI tandem. The advantage over the Radeon HD 4870 X2 is insignificant and does not exceed 12-15%.


    An attempt to use extreme anti-aliasing modes does not entail immediate retribution in the form of a drop in performance below acceptable values, but it significantly affects the minimum performance, and also makes it impossible to use a resolution of 2560x1600.

    Nvidia GeForce GTX 295: MSAA 4x / CSAA 16xQ

    ATI Radeon HD 4870 X2: MSAA 4x / CFAA 8x + Edge-detect


    As with Call of Duty: World at War, the screenshots show no significant improvement in image quality. There is a difference, but you have to hunt for it under magnification with special utilities such as The Compressonator; in motion, the improvement in smoothing quality is simply impossible to notice. Another argument that the extreme modes so actively promoted by the leading graphics developers are more of an advertising gimmick than a way to genuinely improve image quality in practice.

    Playtests: Left 4 Dead

    The game is based on the Source engine and has built-in testing tools, which, unfortunately, do not provide information about the minimum performance.


    Due to the use of the Source engine, the game is quite modest in its requirements, and all test participants can easily provide a comfortable level of performance in it at resolutions up to 2560x1600 inclusive. Only a single GeForce GTX 280 stands out from the overall picture. Note that the factory overclocking of the EVGA GeForce GTX 295+ allowed it to slightly outperform the GeForce GTX 280 SLI tandem.


    The use of high-quality (according to the developers) anti-aliasing modes gives more interesting results: firstly, ATI's solutions lose ground, and secondly, at 2560x1600 the GeForce GTX 295 falls noticeably behind the GeForce GTX 280 SLI due to the smaller amount of video memory available to applications: 896 MB versus 1024 MB. The difference in image quality is even less noticeable than in previous tests, since Left 4 Dead belongs to the "survival shooter" genre and most of its scenes are quite dark.

    Game Tests: S.T.A.L.K.E.R.: Clear Sky

    To ensure an acceptable level of performance in this game, we decided to forgo FSAA, as well as such resource-intensive options as "Sun rays", "Wet surfaces" and "Volumetric smoke". Testing used the "Enhanced full dynamic lighting" (DX10) mode; for ATI cards, DirectX 10.1 mode was additionally enabled.


    Thanks to the concessions described above, all test participants coped with the task of delivering an acceptable level of performance, with the exception of the single GeForce GTX 280, and Nvidia solutions demonstrated a slightly higher minimum performance at resolutions up to 1920x1200 inclusive. But at 2560x1600, ATI solutions took first place in this respect, and the Radeon HD 4870 3-way CrossFireX system even set something of a record, outperforming its rivals by more than 25%.

    Playtests: Dead Space


    Unlike the Radeon HD 4870 X2, the GeForce GTX 295 has no problems with multi-GPU support in this game, although it does not demonstrate outstanding scaling over a single GeForce GTX 280 either. It only remains to wait for ATI's multi-GPU solutions to receive similar support, and the wait should be practically painless, since even in their current state they deliver acceptable performance at 2560x1600.

    Playtests: Devil May Cry 4


    All multi-processor graphics solutions in this game are superbly scalable, and ATI's three-processor system naturally takes the lead, since it has the largest number of GPUs. The GeForce GTX 295 outperforms the Radeon HD 4870 X2 by 11-26%, depending on the resolution, but against the background of indicators that do not fall below 70 frames per second, this difference looks insignificant and, of course, does not affect the comfort of the gameplay.


    High-quality anti-aliasing modes seriously increase the load on the graphics subsystem, but only the Radeon HD 4870 X2 noticeably loses ground, perhaps due to the presence of only 32 RBE blocks, while its rival has 56 such blocks. Nevertheless, at a resolution of 2560x1600 ATI's solution still provides an acceptable level of performance, albeit balancing on the verge of a foul. The GeForce GTX 295, on the other hand, feels great, but this is largely due to the less resource-intensive anti-aliasing algorithm. However, in both cases, improvements in picture quality are almost impossible to notice, since the game is very dynamic.

    Playtests: Fallout 3


    Starting with a resolution of 1920x1200, a certain advantage of ATI's solutions becomes obvious, and at a resolution of 2560x1600 it no longer raises any doubts. Nevertheless, the GeForce GTX 295 looks quite worthy, yielding very slightly to the GeForce GTX 280 SLI tandem in performance, but significantly surpassing it in other consumer qualities, including cost.

    Game Tests: Mass Effect


    At 1280x1024, the advantage of the GeForce GTX 295 over the Radeon HD 4870 X2 is almost imperceptible; at the next two resolutions it grows to 14% and then to 26%, but at 2560x1600 it again drops to almost zero, and the Radeon HD 4870 3-way CrossFireX takes the lead. However, in the latter case, none of the participants can provide the minimum acceptable performance.

    Game Tests: Race Driver: GRID


    Throughout the test, ATI solutions retain their advantage in average speed, but up to a resolution of 2560x1600 their minimum performance is almost the same as that of Nvidia solutions. With an overall performance level in the region of 100-140 frames per second, it would be absurd to claim that the player can feel a difference on the order of 10-20 frames per second. Moreover, even at 2560x1600 the minimum performance of the multi-GPU solutions does not fall below 60 frames per second, which is an excellent result, especially against the backdrop of the GeForce GTX 280, one of the most powerful single-chip graphics cards. It looks like multi-GPU solutions have finally achieved victory, at least in the sector of the most productive gaming cards.

    Game Tests: X³: Terran Conflict


    As noted earlier, the game prefers ATI architectures and, at the same time, is not too demanding on Nvidia solutions - unlike most other tests, in X³ a single GeForce GTX 280 is practically not inferior to multi-GPU solutions. At the same time, of all the Nvidia solutions presented in the review, only the new GeForce GTX 295 and the GeForce GTX 280 SLI tandem can provide an acceptable level of minimum performance at a resolution of 1680x1050.

    Game Tests: Red Alert 3

    The game contains a non-disableable average performance limiter, fixed at around 30 frames per second.


    Albeit through brute force, the GeForce GTX 295 managed to overcome the performance problems Nvidia solutions have in Red Alert 3. At least when using FSAA 4x, the speed remains acceptable at resolutions up to and including 1920x1200. It is noteworthy that SLI tandems built from discrete cards cannot manage this, even though they exchange data through the same nForce 200 switch, only located on the motherboard.

    Game Tests: World in Conflict


    All dual-processor solutions from Nvidia demonstrate almost the same level of performance and are noticeably ahead of their rivals from the ATI camp. Only at a resolution of 2560x1600 does the Radeon HD 4870 3-way CrossFireX platform take the lead, and it becomes the only one capable of providing an acceptable minimum speed at this resolution.

    Synthetic benchmarks: Futuremark 3DMark06









    The results obtained in 3DMark06 are hardly representative when it comes to testing high-end accelerators. Since the test package by default uses a resolution of 1280x1024 without anti-aliasing, a modern graphics adapter is not fully loaded, making it very difficult to predict the influence of all possible factors on the final result. We cannot yet explain the behavior of the Radeon HD 4870 3-way CrossFireX bundle; however, repeated testing produced a similar result, still inferior to that of a single Radeon HD 4870 X2. As for the GeForce GTX 295, the figures obtained indicate performance at the level of the GeForce GTX 260 Core 216 SLI tandem, which, in principle, is not far from the truth and is confirmed by the results of the gaming tests.

    Synthetic benchmarks: Futuremark 3DMark Vantage

    To minimize the CPU's impact, testing in 3DMark Vantage uses the "Extreme" profile, with a resolution of 1920x1200, FSAA 4x and anisotropic filtering. To complete the performance picture, the results of the individual tests are from now on taken across the entire range of resolutions.






    As expected, the GeForce GTX 295 did not set records in the overall standings. Although it outperformed the Radeon HD 4870 X2 by a huge margin, a pair of discrete GeForce GTX 280s operating in SLI mode turned out to be somewhat faster; however, the factory overclocking undertaken by EVGA helped its card catch up.


    At all resolutions except 2560x1600, the GeForce GTX 295, operating at official frequencies, demonstrates a much better result than the GeForce GTX 260 Core 216 SLI bundle, but is slightly inferior to the GeForce GTX 280 SLI tandem. At 2560x1600 this lag disappears, and factory overclocking allows the EVGA GeForce GTX 295+ to take second place after the three-GPU Radeon HD 4870 3-way CrossFireX system, which has much more impressive computing power.


    In the second test, the aforementioned ATI solution also leads, but at all resolutions, and in addition the Radeon HD 4870 X2 manages to significantly reduce its lag. Apparently, the special effects created using ray tracing require serious computing power, and ATI still has a solid advantage in this area. The position of the GeForce GTX 295 remains the same - between the GeForce GTX 260 Core 216 SLI and the GeForce GTX 280 SLI. In the latter case, the lag is small and can easily be compensated for by a slight overclocking.

    Conclusion

    Summing up, we can say with confidence that Nvidia has for the first time managed to create not just a competitive dual-processor graphics adapter, but the best solution in its class, significantly surpassing the analogous development from Advanced Micro Devices both in performance in modern games and in a number of other important consumer qualities, including power consumption, noise and heat generation. Not only the design engineers but also the programmers did a good job - for the first time in our practice, a multi-GPU system created by Nvidia showed almost no compatibility or performance problems, outperforming the ATI Radeon HD 4870 X2 in this area as well.

    This once again confirms the simple thesis stated at the beginning of the review: it is very dangerous to rest on your laurels, especially in a field such as gaming graphics accelerators, where everything changes very quickly and you must constantly be ready to repel an unexpected attack from a competitor. After a long series of defeats, Nvidia has finally won a landslide victory, one that allows it to rightfully claim the title of technology leader, which, of course, should have a beneficial effect on the company's image and the popularity of its products.

    Let's take a closer look at the performance of the Nvidia GeForce GTX 295.



    Already at a resolution of 1280x1024, the new flagship of the GeForce GTX 200 series demonstrated an overwhelming advantage, yielding to the former king of 3D, the Radeon HD 4870 X2, in only two tests - Race Driver: GRID and X³: Terran Conflict - and in both cases maintaining a comfortable performance level. The average advantage of the GeForce GTX 295 over the Radeon HD 4870 X2 was about 19%. That might not seem like much, but taking into account the significantly lower power consumption, heat dissipation and noise, it puts ATI's solution in a very dangerous position. If we compare the GeForce GTX 295 with the GeForce GTX 280, the advantage varies from 7 to 68%, averaging 38%. In fact, this marks the final end of the era of high-performance single-chip graphics cards - it is hard to imagine how powerful and complex a monolithic GPU would have to be to surpass the GeForce GTX 295.



    The summary test results for a resolution of 1680x1050 show approximately the same picture: despite a significant lag in theoretical computing power, the GeForce GTX 295 is ahead of the Radeon HD 4870 X2 in most tests, with the exception, again, of the aforementioned Race Driver: GRID and X³: Terran Conflict. Fallout 3 joined them, but a lag of less than 2% can be ignored. On average, the advantage of the GeForce GTX 295 was about 17%.



    The transition to a resolution of 1920x1200 does not change much in the overall picture, except that the GeForce GTX 295's lag in Race Driver: GRID shrinks to 5%, while in Fallout 3, on the contrary, it grows to 4%. On average, the new product outperforms the Radeon HD 4870 X2 by 20%, and the GeForce GTX 280 by 61%. Given the card's reasonable power and thermal figures - at least for such a monster - such results can rightly be called excellent.



    At a resolution of 2560x1600, the GeForce GTX 295 for the first time ran into a previously predicted problem: a lack of local video memory. However, this only manifested itself in the extremely demanding Crysis Warhead, where even the most powerful modern graphics cards can hardly provide acceptable speed at resolutions above 1280x1024. In any case, 896 MB of video memory per GPU is a compromise that the developers of the GeForce GTX 295 were forced to make. Creating a similar design carrying two gigabyte memory banks with 512-bit access on its boards may not be impossible, but its cost would surely be unacceptable for Nvidia, already forced to put up with the high production cost of the G200 and its 1.4 billion transistors. The advantage over the Radeon HD 4870 X2 at this resolution is the most modest, averaging only 8%; still, one should not forget the lower noise and power consumption demonstrated by the GeForce GTX 295.

    As for the EVGA GeForce GTX 295+, it is an exact copy of the reference card; most likely, we will not see versions of the GeForce GTX 295 with a unique design on the market at all. The only differences from the Nvidia reference card are EVGA branded stickers and a slight factory overclock, which does not give the card a significant advantage but in some cases brings it on par with the GeForce GTX 280 SLI tandem in performance. The product is characterized by good overclocking potential but a very poor bundle, unworthy of a video adapter in the highest price range - at such a price, the buyer has the right to count on at least one high-quality and popular game in the kit. On the other hand, for this money you get the fastest gaming video adapter, one that also has good noise and power characteristics - and that in itself is no small thing.

    EVGA GeForce GTX 295+: pros and cons

    Advantages:

    Today's best performance in modern games
    Outperforms Radeon HD 4870 X2 in most tests
    Using the 55nm version of G200
    Wide choice of FSAA modes
    Minimal performance impact of FSAA
    GPU PhysX acceleration support
    Hardware support for HD video decoding
    Support for S/PDIF audio output via HDMI
    Relatively low power consumption and heat dissipation
    Relatively low noise level
    Good overclocking potential

    Flaws:

    Inferior to the Radeon HD 4870 X2 in the amount of video memory available to applications
    Performance bias towards texture processors and RBE
    Lack of support for DirectX 10.1 and Shader Model 4.1
    Incomplete hardware support for VC-1 decoding
    No integrated sound core
    Maximum performance may depend on software support
    Meager bundle
    High price


    Today, few people remember the GTX 295 video card. Its specifications are of little use now, and it can hardly run even the most undemanding modern game. However, it can still be found for sale on forums and marketplaces. What makes it remarkable for buyers?

    A Thorny Path

    The characteristics of the GTX 295 were carefully planned. Nvidia's approach was prompted by unexpected "attacks" from its main competitor, ATI. The "red" developers had released a new architecture that proved popular and in demand among buyers. Nvidia had to solve this problem quickly and launch a new leader onto the market.

    The first attempt to fend off the persistent competitor was the release of a video card based on the G200 architecture. The new product managed to show excellent results in tests while achieving a lower level of power consumption. Nvidia was on its way not only to creating an equal competitor to the new Radeon HD 4870 X2, but also to taking the title of the fastest graphics accelerator.

    Solution

    Before the characteristics of the GTX 295 became known to the world, a strategy had to be developed. Nvidia decided to create a dual-processor adapter based on a previously released architecture. Many experts were initially surprised by this decision, since Nvidia had previously stated that it was prepared to develop only single-processor models with powerful hardware.

    The company was driven to such a deviation from its principles by a fierce attack from its competitor. In addition, the move to a 55-nm process technology largely predetermined further actions: unlike their 65-nm predecessors, a pair of these chips produced much less heat. Still, there were skeptics who considered the decision a failure.

    Comparison

    After the release of the Nvidia GTX 295, its characteristics were compared with those of the Radeon HD 4870 X2. It was for the sake of superiority over ATI's model that Nvidia took such a difficult path.

    Both models are built on 55-nm technology, and each card has its own strengths. The X2, for example, offers higher raw computing power, while the Nvidia GTX 295 has much better texture and raster performance, higher execution-unit frequencies and hardware acceleration support.

    The model from ATI proved better in the amount of local video memory. The GTX 295 received its characteristics in accordance with the company's strategy: the developers wanted to create the fastest gaming graphics card, so they omitted DirectX 10.1 support, an integrated sound core, a full hardware decoder, and so on.

    Modifications

    Like all Nvidia video cards, this one received a reference model and modifications. Interestingly, none of the well-known manufacturers undertook a deep redesign of the new product; only a few slightly improved versions from EVGA, Zotac, Inno3D and XFX appeared on the market. In general, all these GTX 295 graphics cards received the same characteristics, but the differences were still noticeable.

    EVGA

    One of the most popular modifications turned out to be the EVGA GeForce GTX 295. The characteristics of the video card were not much different from the parameters of the reference model. However, some differences could be noticed.

    Externally, the model seems very cumbersome. There is nothing surprising in this, since the manufacturer had to place two large chips and a pair of 448-bit buses on the boards. Both boards inside the video card share a single cooling system.

    On the one hand the option seems quite good; on the other, it raises many questions, since even the 55-nm chip still runs quite hot. So that the adapter would not turn out too large and could fit without problems into an ATX form factor case, both boards had to be placed "face to face". The developers arranged the slots and connectors so that the cooling system could expel hot air from the case.

    Each board of the EVGA GeForce GTX 295 has its own independent four-phase power regulator. However, only the upper board carries the external power connectors and is designed for loads up to 150 watts.

    Memory on the boards is provided by 14 GDDR3 chips of 512 Mbit each. The chips operate at a voltage of 2.05 V and a frequency of 1000 MHz. In the EVGA GeForce GTX 295 these reference figures were slightly improved: the frequency was raised to 1026 MHz.

    Since we have a dual-processor video card, it houses two GPUs. Each works with 240 shaders and 80 texture processors. In the reference model, the chip operates at 576 MHz in the main domain and 1242 MHz in the shader domain. EVGA was able to overclock frequencies to 594 and 1296 MHz, respectively.
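The factory overclock amounts to only a few percent per domain; a quick sketch makes the gains explicit (clock values taken from the review; the percentages are computed here, not quoted from it):

```python
# Factory overclock of the EVGA GeForce GTX 295+ versus Nvidia's
# reference clocks, in MHz (figures as stated in the review).
reference = {"core": 576, "shader": 1242, "memory": 1000}
evga      = {"core": 594, "shader": 1296, "memory": 1026}

for domain in reference:
    gain = (evga[domain] / reference[domain] - 1) * 100
    print(f"{domain}: {reference[domain]} -> {evga[domain]} MHz (+{gain:.1f}%)")
```

The output shows gains of roughly 3% on the core, 4% on the shader domain and under 3% on memory, which is why the review calls the overclock "slight".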

    Pros and cons of EVGA GeForce GTX 295

    The modified version is practically no different from the reference one, and this is perhaps its main disadvantage. Today, companies spoil us with substantial factory overclocking and ample headroom for raising frequencies at home; in the case of this modification, there were no noticeable improvements.

    What the buyer got was essentially a well-made clone of the reference version. A unique design should not be expected, since dual-processor models are a priori difficult to change visually. But EVGA did raise the memory and processor frequencies on its own. In 2009, users were convinced that this was the best graphics card for gaming.

    Zotac

    The modification from this manufacturer was released in a branded box, which was no different from those released earlier. The company does not change its signature black and gold combinations to this day. The box contained a huge amount of information and drawings. The main characteristics of the Zotac GeForce GTX 295 were also written right there.

    The design of the modification is no different from the previous model or the reference version: the same two boards assembled "face to face", sharing a unified cooling system with several copper heatsinks and heat pipes.

    Due to the fact that there is not much space on the boards, it was decided to reduce the memory bus to 448 bits. This decision also affected the rasterization channels. But the developer managed to keep the number of unified processors - 240. The GPU operates at a frequency of 576 MHz, and the shader pipelines - at 1242 MHz. The figures are no different from those presented by Nvidia in the reference version.

    The video memory is made up of chips with a total capacity of 1792 MB. They operate at a voltage of 2.05 V and an effective frequency of 2 GHz.

    To improve the 295, enthusiasts tried to overclock the processor and memory frequencies. Since the model can hardly be called cool-running, and raising voltages and frequencies always brings a temperature jump, stable cooling had to be ensured: some simply opened the case to let air flow in, while others installed an additional fan to increase the efficiency of the cooling system.

    They managed to raise the core clock to 670 MHz and the shader frequency to 1440 MHz. The increase averaged around 15%, which is actually an excellent result. The video memory was overclocked to 2214 MHz.

    Pros and cons of Zotac GeForce GTX 295

    Unlike the EVGA modification, this model went on sale with reference clocks. The only differences are a modified plastic shroud covering the boards and the cooling system's shroud design. Otherwise, we have a reference model with Nvidia's stock specifications.

    The 295 from Zotac was not the most popular. Buyers would have liked to see factory overclocking and surprises in the box; instead, they received the usual bundle and standard parameters. Still, the possibility of manual overclocking is a definite plus, and the overclocking results were quite good.

    Other modifications

    Against the background of its competitors, the modification from Inno3D looks good. The first thing worth highlighting is a thoughtful cooling system. The board has an aluminum plate with cutouts for chips. Two radiators were placed on top of them, between which there is one compact fan.

    The amount of memory is 1792 MB, operating at a frequency of 2016 MHz. The processor speed is at reference, but overclocking helped raise it to 684 MHz, and the memory frequency could be increased to 2376 MHz.

    The XFX GF GTX295 576M 1792MB modification likewise received no special parameters to make it stand out from the rest. Its one distinction is that the developer bundled the game Far Cry 2, which turned out to be an undoubtedly effective marketing move.

    Conclusions

    Many users believed that a GTX 295 with 1860 MB of memory existed. The characteristics of such a model cannot be found on the Internet, since no such version was ever sold: the GTX 295 in every modification was produced with 1792 MB of memory and an effective frequency of 2 GHz.

    Although the dual-processor model first came to market built from two boards, just a few months later variations appeared with a compact single-board design and an efficient cooling system. Nevertheless, this did not affect the characteristics of the model in any way. It immediately occupied its niche in the hi-end class, having fulfilled its main task: to defeat the main competitor, the Radeon HD 4870 X2.

    Of all the modifications, the EVGA GeForce GTX 295 turned out to be the most successful. This version of the video card became the most popular, since it came with factory overclocking out of the box. Given that many buyers today specifically seek out overclocked versions, this was undoubtedly a significant advantage.

    And while the factory overclock did not provide the huge boost that modern models offer, the slight increase did affect overall performance. In the hands of a specialist, the frequencies of the EVGA GeForce GTX 295 could be raised even further at home, increasing the gaming potential of the accelerator accordingly.

    The most powerful 3D accelerator to date is, of course, the dual-processor GeForce GTX 295. It was created as a counterweight to AMD's flagship, the dual-processor Radeon HD 4870 X2 video card. The NVIDIA video card turned out to be much more powerful and productive, surpassing its competitor in most parameters. Unfortunately, it appeared on the market somewhat late, when AMD's position as market leader had strengthened, and could not turn the tide. Nevertheless, the title of formal leader among 3D accelerators belongs to GeForce, not Radeon.

    Design features

    Despite the fact that the GeForce GTX 295 and the Radeon HD 4870 X2 are conceptually identical, they differ significantly in terms of design. This is due to the peculiarities of the graphics controllers used in them.

    The RV770 graphics processor found in the Radeon HD 4800 series supports a 256-bit memory bus, while NVIDIA's GT200 supports a 512-bit one. In addition, the external interface controller is integrated into the RV770, whereas for the GT200 it is implemented as a separate chip. When creating its dual-processor video card, AMD got by with a single printed circuit board carrying two 256-bit buses connecting the two 3D chips to memory, a PCI Express bus switch, two 1024 MB banks of GDDR5 memory, and power circuitry. As a result, AMD's dual-processor video card differs little in technological complexity from the competitor's single-processor models, which carry, in addition to the graphics controller itself, a second chip responsible for external interfaces.

    As we have already said, architecturally the GeForce GTX 295 does not differ from its competitor. From the system's point of view, it is two video cards with a PCI Express interface, each with its own array of local memory and set of output ports. They are connected not through the system logic (the motherboard chipset) but through an additional PCI Express switch. The video cards operate in SLI mode, rendering frames alternately and writing them to the output video buffer of the one to which the monitor is connected.

    Structurally, the GeForce GTX 295 is made differently than the Radeon HD 4870 X2.

    It actually consists of two printed circuit boards, only turned by the front sides to each other. On one board there is the first GT200 graphics chip, an NVIO chip (responsible for external interfaces) and an nForce 200 bridge (PCI Express switch), on the other - the second GT200 with its own NVIO chip.

    It was not possible to place all the components on one board. Moreover, the memory bus width of each of the constituent video cards had to be reduced from 512 to 448 bits, and the amount of memory was cut to 2 x 896 MB (128 MB for each 64-bit channel, of which each 3D accelerator has seven). The cooling system is located between the two boards; it consists of an aluminum chassis in contact with the memory chips, transistors and auxiliary chips, copper inserts between the graphics cores, a set of fan-shaped radiator fins, and a fan with longitudinal blades. The fan sits at one end of the card, while the exhaust vents are at the other end, which faces outside the case, and also on top.
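The bus and memory figures above follow directly from the seven 64-bit channels per GPU; a short sketch of the arithmetic, using only numbers stated in the review:

```python
# Per-GPU memory configuration of the GeForce GTX 295:
# seven memory channels, each 64 bits wide with 128 MB attached.
channels = 7
bits_per_channel = 64    # bus width contributed by each channel
mb_per_channel = 128     # memory attached to each channel, in MB

bus_width = channels * bits_per_channel   # 448-bit bus per GPU
mem_per_gpu = channels * mb_per_channel   # 896 MB per GPU
total_mem = 2 * mem_per_gpu               # 1792 MB on the whole card

print(bus_width, mem_per_gpu, total_mem)  # 448 896 1792
```

Dropping the eighth channel of the full GT200 is exactly what turns the 512-bit/1024 MB configuration into the 448-bit/896 MB one used here.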

    The front side is covered with a rubber-coated metal casing bearing the logo of the video card manufacturer.

    The set of video outputs of the standard (reference) GeForce GTX 295 consists of two DVI and one HDMI. The developers have already abandoned the analog output to TV, and the DisplayPort connector was considered redundant due to the small distribution of monitors with its support.

    At the top, the video card has one SLI connector for linking to another NVIDIA video card, as well as an S/PDIF input for full-fledged HDMI output (with both video and audio).

    Despite the rather complex design, the reference GeForce GTX 295 has tolerable dimensions; for example, it is comparable in size to the single-processor GeForce GTX 285. The installed video card blocks only one adjacent slot (dual-slot cooling system), and its length is about 27 cm. The declared maximum power consumption of the GeForce GTX 295 at nominal frequencies is 289 W; the system power supply must be rated at least 700 W, and two power connectors are required, one of which must be 8-pin.

    Of course, a "sandwich" type video card is quite difficult to manufacture, which negatively affects its production cost and final price. Fearing for the competitiveness of its products, NVIDIA has already significantly reduced the prices of all its video cards, including the dual-processor one, and in order not to operate at a loss it must look for ways to reduce costs. A simplified single-board design of the GeForce GTX 295 has been prepared by several video card manufacturers, but such models have not yet reached the mass market. Most video cards of this series currently available are standard reference models, differing only in packaging, bundle and the logo on the front panel.

    This option is offered to customers by Innovision.

    There are few differences compared to the reference video card - the clock frequencies of graphics controllers (but not their shader units) and memory chips are slightly increased, which should not significantly affect performance.
    Clock speed data (obtained using RivaTuner):

    Otherwise, this is a standard (in terms of design, features, bundle and cost) GeForce GTX 295, the flagship of the current line of NVIDIA video cards.

    Testing

    In fact, the GeForce GTX 295 has one competitor - the already mentioned Radeon HD 4870 X2 video card. Is it theoretically capable of showing the same level of performance? In principle, yes.

    Although each of the two 3D processors on the AMD video card has only a 256-bit memory bus, GDDR5 memory, unlike the GDDR3 used on NVIDIA video cards, transfers data at four times the base clock frequency (quad data rate). Therefore, a 256-bit bus with GDDR5 memory is comparable to a 512-bit bus with GDDR3 memory; in this parameter the AMD video card even wins. But do not forget that the NVIDIA GT200 GPU has higher processing power and efficiency thanks to a well-designed and optimized architecture.
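The quad-data-rate argument can be sanity-checked with rough per-GPU bandwidth numbers. Note that the 900 MHz GDDR5 base clock and 1000 MHz GDDR3 clock used below are illustrative assumptions for this sketch, not figures quoted in the review:

```python
# Rough per-GPU memory bandwidth: bus width (bits), clock (MHz),
# and transfers per clock (4 for GDDR5, 2 for GDDR3).
def bandwidth_gb_s(bus_bits: int, clock_mhz: float, transfers: int) -> float:
    # bits -> bytes, MHz -> GB/s
    return bus_bits / 8 * clock_mhz * transfers / 1000

hd4870x2_gpu = bandwidth_gb_s(256, 900, 4)   # 256-bit GDDR5, assumed 900 MHz base
gtx295_gpu = bandwidth_gb_s(448, 1000, 2)    # 448-bit GDDR3, assumed 1000 MHz

print(hd4870x2_gpu, gtx295_gpu)  # 115.2 112.0 (GB/s per GPU)
```

Under these assumed clocks the narrower GDDR5 bus indeed edges out the wider GDDR3 one, which is the review's point.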

    However, a dual-processor AMD video card will cost the user less. The difference sometimes reaches 30% when it comes to "inexpensive" manufacturers like Palit. However, the Revolution 700 Deluxe model is not a basic reference variant of the Radeon HD 4870 X2, but rather an "advanced" video card with an original cooler, factory overclocking and an expanded set of I/O ports.

    Also, for comparison, we provide data on the GeForce GTX 285 video card - the most powerful single-processor model from the current NVIDIA line. Our copy was also released by Palit, but this time it belongs to the base, not overclocked series, and is distinguished by a moderate price (which is 60-70% lower than that of a dual-processor NVIDIA video card).
    Summary data for three video cards:

    The composition of the test stand:
    - Intel Core 2 Duo E8300 processor, overclocked to 3.82 GHz;
    - 2 GB DDR3-1067 memory from Apacer;
    - motherboard Gigabyte EP35C-DS3R (Intel P35);
    - 1 kW FSP Everest 1010 power supply with 8-pin power connector.

    Testing was carried out in the Windows Vista SP1 environment; NVIDIA graphics driver version 185.85, AMD Catalyst 9.6.

    Performance. Test results in demanding 3D games convincingly prove the serious advantage of the dual-GPU GeForce GTX 295 graphics card.
    Maximum performance (anti-aliasing and anisotropic filtering off):

    Improved quality (AA 4x, AF 16x, anti-aliasing of transparent textures):

    Yes, there are cases when the Radeon HD 4870 X2 does not lag behind (and even pulls ahead by up to 10%), but more often the GeForce's performance is 30-60% higher. As for the single-processor GeForce GTX 285, a gap of 60-70% is par for the course for it - which, incidentally, closely matches the price difference between the 285 and 295 models.

    Temperature and noise. The weak point of the GeForce GTX 295 is power consumption. Recall that even in a well-overclocked system, such a video card will account for at least half of the total power draw. Fortunately, this applies only to 3D: in 2D mode the card halves the frequencies of the graphics chips, and the memory frequency drops to the minimum mark of 100 MHz. Under full load the graphics chips heat up considerably (according to our data, up to 80 °C), but the noise level stays within the bounds of decency and is not noticeable against the background of the case fans.
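As a rough illustration of the "half of total power" claim: the 289 W maximum is the review's figure, while the ~200 W estimate for the rest of a load-bearing system (CPU, board, drives) is purely an assumption made here for the sketch:

```python
# Share of total system draw taken by the GTX 295 under 3D load.
gpu_max = 289          # W, declared maximum of the GeForce GTX 295
rest_of_system = 200   # W, assumed draw of CPU/motherboard/drives under load

share = gpu_max / (gpu_max + rest_of_system) * 100
print(f"GPU share of system power: {share:.0f}%")  # about 59%
```

Even with a generous allowance for an overclocked CPU, the card plausibly dominates the power budget, which is why a 700 W supply is recommended.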

    Part of the hot air is discharged outside through the slots in the mounting plate, and part - inside the housing through the open part of the casing. Therefore, you need to take care of proper ventilation of the case, otherwise the video card will quickly "warm up" all the components of the system.

    Conclusions

    The GeForce GTX 295 is truly the most powerful graphics card on the market - the most powerful in terms of both performance and power consumption. It demonstrates record performance in 3D games (in Crysis at maximum settings and a resolution of 1680x1050 - more than 45 fps), but it requires a power supply unit rated at least 700 watts. The very high price (over $500 in our market), even before counting the cost of a high-quality case, makes it inaccessible to most users - which is natural, since this is the "top" product of its generation.

    The choice of manufacturer when buying a GeForce GTX 295 does not really matter. All video cards of the first series are made in the same place. It is enough to focus on the cost of the product. The Inno3D GeForce GTX 295 video card has a standard design, basic package and almost standard clock speeds, but it costs no more than its counterparts.

    According to www.gigamark.com