If you don't follow the video card market almost daily, it is really hard to keep track of the differences between the many nVidia graphics chips available today. To help you understand the differences among these chips, we have compiled the following table:
Chip | Core Clock | Memory Clock | Memory Interface | Memory Transfer Rate | Pixels per clock | DirectX |
--- | --- | --- | --- | --- | --- | --- |
GeForce 4 MX 440 AGP 8x | 275 MHz | 512 MHz | 128-bit | 8.1 GB/s | 2 | 7 |
GeForce MX 4000 | 250 MHz | * | 32-bit or 64-bit or 128-bit | * | 2 | 7 |
GeForce FX 5200 | 250 MHz | 400 MHz | 64-bit or 128-bit | 3.2 GB/s or 6.4 GB/s | 4 | 9.0 |
GeForce FX 5200 Ultra | 350 MHz | 650 MHz | 128-bit | 10.4 GB/s | 4 | 9.0 |
GeForce FX 5500 | 270 MHz | 400 MHz | 64-bit or 128-bit | 3.2 GB/s or 6.4 GB/s | 4 | 9.0 |
GeForce FX 5600 | 325 MHz | 550 MHz | 128-bit | 8.8 GB/s | 4 | 9.0 |
GeForce FX 5600 Ultra | 500 MHz | 800 MHz | 128-bit | 12.8 GB/s | 4 | 9.0 |
GeForce FX 5700 LE | 250 MHz | 400 MHz | 128-bit | 6.4 GB/s | 4 | 9.0 |
GeForce FX 5700 | 425 MHz | 600 MHz | 128-bit | 9.6 GB/s | 4 | 9.0 |
GeForce FX 5700 Ultra | 475 MHz | 900 MHz | 128-bit | 14.4 GB/s | 4 | 9.0 |
GeForce FX 5800 | 400 MHz | 900 MHz | 128-bit | 14.4 GB/s | 8 | 9.0 |
GeForce FX 5800 Ultra | 500 MHz | 1 GHz | 128-bit | 16 GB/s | 8 | 9.0 |
GeForce FX 5900 XT | 390 MHz | 680 MHz | 256-bit | 21.7 GB/s | 8 | 9.0 |
GeForce FX 5900 | 400 MHz | 850 MHz | 256-bit | 27.2 GB/s | 8 | 9.0 |
GeForce FX 5900 Ultra | 450 MHz | 850 MHz | 256-bit | 27.2 GB/s | 8 | 9.0 |
GeForce FX 5950 Ultra | 475 MHz | 950 MHz | 256-bit | 30.4 GB/s | 8 | 9.0 |
GeForce PCX 5300 | 325 MHz | 650 MHz | 128-bit | 10.4 GB/s | 4 | 9.0 |
GeForce PCX 5750 | 475 MHz | 900 MHz | 128-bit | 14.4 GB/s | 4 | 9.0 |
GeForce PCX 5900 | 350 MHz | 500 MHz | 256-bit | 17.6 GB/s | 8 | 9.0 |
GeForce PCX 5950 | 475 MHz | 900 MHz | 256-bit | 30.4 GB/s | 8 | 9.0 |
GeForce 6200 | 300 MHz | 550 MHz | 128-bit | 8.8 GB/s | 4 | 9.0c |
GeForce 6200 LE | 350 MHz | 550 MHz | 64-bit | 4.4 GB/s | 2 | 9.0c |
GeForce 6200 (TC) | 350 MHz | 666 MHz * | 32-bit or 64-bit | 2.66 GB/s or 5.32 GB/s * | 4 | 9.0c |
GeForce 6500 (TC) | 400 MHz | 666 MHz * | 32-bit or 64-bit | 2.66 GB/s or 5.32 GB/s * | 4 | 9.0c |
GeForce 6600 | 300 MHz | 550 MHz * | 64-bit or 128-bit | 4.4 GB/s or 8.8 GB/s * | 8 | 9.0c |
GeForce 6600 DDR2 | 350 MHz | 800 MHz * | 128-bit | 12.8 GB/s * | 8 | 9.0c |
GeForce 6600 LE | 300 MHz | * | 64-bit or 128-bit | * | 4 | 9.0c |
GeForce 6600 GT | 500 MHz | 1 GHz | 128-bit | 16 GB/s | 8 | 9.0c |
GeForce 6600 GT AGP | 500 MHz | 900 MHz | 128-bit | 14.4 GB/s | 8 | 9.0c |
GeForce 6800 LE | 300 MHz | 700 MHz | 256-bit | 22.4 GB/s | 8 | 9.0c |
GeForce 6800 XT | 325 MHz | 600 MHz | 256-bit | 19.2 GB/s | 8 | 9.0c |
GeForce 6800 XT AGP | 325 MHz | 700 MHz | 256-bit | 22.4 GB/s | 8 | 9.0c |
GeForce 6800 | 325 MHz | 600 MHz | 256-bit | 19.2 GB/s | 12 | 9.0c |
GeForce 6800 AGP | 325 MHz | 700 MHz | 256-bit | 22.4 GB/s | 12 | 9.0c |
GeForce 6800 GS | 425 MHz | 1 GHz | 256-bit | 32 GB/s | 12 | 9.0c |
GeForce 6800 GS AGP | 350 MHz | 1 GHz | 256-bit | 32 GB/s | 12 | 9.0c |
GeForce 6800 GT | 350 MHz | 1 GHz | 256-bit | 32 GB/s | 16 | 9.0c |
GeForce 6800 Ultra | 400 MHz | 1.1 GHz | 256-bit | 35.2 GB/s | 16 | 9.0c |
GeForce 6800 Ultra Extreme | 450 MHz | 1.1 GHz | 256-bit | 35.2 GB/s | 16 | 9.0c |
GeForce 7100 GS (TC) | 350 MHz | 666 MHz * | 64-bit | 5.3 GB/s * | 4 | 9.0c |
GeForce 7200 GS (TC) | 450 MHz | 800 MHz * | 64-bit | 6.4 GB/s * | 4 | 9.0c |
GeForce 7300 SE (TC) | 225 MHz | * | 64-bit | * | 4 | 9.0c |
GeForce 7300 LE (TC) | 450 MHz | 648 MHz * | 64-bit | 5.2 GB/s * | 4 | 9.0c |
GeForce 7300 GS (TC) | 550 MHz | 810 MHz * | 64-bit | 6.5 GB/s * | 4 | 9.0c |
GeForce 7300 GT (TC) | 350 MHz | 667 MHz | 128-bit | 10.6 GB/s | 8 | 9.0c |
GeForce 7600 GS | 400 MHz | 800 MHz | 128-bit | 12.8 GB/s | 12 | 9.0c |
GeForce 7600 GT | 560 MHz | 1.4 GHz | 128-bit | 22.4 GB/s | 12 | 9.0c |
GeForce 7800 GS | 375 MHz | 1.2 GHz | 256-bit | 38.4 GB/s | 16 | 9.0c |
GeForce 7800 GT | 400 MHz | 1 GHz | 256-bit | 32 GB/s | 20 | 9.0c |
GeForce 7800 GTX | 430 MHz | 1.2 GHz | 256-bit | 38.4 GB/s | 24 | 9.0c |
GeForce 7800 GTX 512 | 550 MHz | 1.7 GHz | 256-bit | 54.4 GB/s | 24 | 9.0c |
GeForce 7900 GS | 450 MHz | 1.32 GHz | 256-bit | 42.2 GB/s | 20 | 9.0c |
GeForce 7900 GT | 450 MHz | 1.32 GHz | 256-bit | 42.2 GB/s | 24 | 9.0c |
GeForce 7900 GTX | 650 MHz | 1.6 GHz | 256-bit | 51.2 GB/s | 24 | 9.0c |
GeForce 7950 GT | 550 MHz | 1.4 GHz | 256-bit | 44.8 GB/s | 24 | 9.0c |
GeForce 7950 GX2 ** | 500 MHz | 1.2 GHz | 256-bit | 38.4 GB/s | 24 | 9.0c |
GeForce 8400 GS *** | 450 MHz / 900 MHz | 800 MHz | 64-bit | 6.4 GB/s | 16 | 10 |
GeForce 8500 GT *** | 450 MHz / 900 MHz | 666 MHz or 800 MHz | 128-bit | 10.6 GB/s or 12.8 GB/s | 16 | 10 |
GeForce 8600 GT DDR2 *** | 540 MHz / 1.18 GHz | 666 MHz or 800 MHz | 128-bit | 10.6 GB/s or 12.8 GB/s | 32 | 10 |
GeForce 8600 GT GDDR3 *** | 540 MHz / 1.18 GHz | 1.4 GHz | 128-bit | 22.4 GB/s | 32 | 10 |
GeForce 8600 GTS *** | 675 MHz / 1.45 GHz | 2 GHz | 128-bit | 32 GB/s | 32 | 10 |
GeForce 8800 GS *** ^ | 550 MHz / 1,375 MHz | 1.6 GHz | 192-bit | 38.4 GB/s | 96 | 10 |
GeForce 8800 GT *** ^ | 600 MHz / 1.5 GHz | 1.8 GHz | 256-bit | 57.6 GB/s | 112 | 10 |
GeForce 8800 GTS *** | 500 MHz / 1.2 GHz | 1.6 GHz | 320-bit | 64 GB/s | 96 | 10 |
GeForce 8800 GTS 512 *** ^ | 650 MHz / 1,625 MHz | 1.94 GHz | 256-bit | 62.08 GB/s | 128 | 10 |
GeForce 8800 GTX *** | 575 MHz / 1.35 GHz | 1.8 GHz | 384-bit | 86.4 GB/s | 128 | 10 |
GeForce 8800 Ultra *** | 612 MHz / 1.5 GHz | 2.16 GHz | 384-bit | 103.6 GB/s | 128 | 10 |
GeForce 9600 GSO *** ^ | 550 MHz / 1.35 GHz | 1.6 GHz | 192-bit | 38.4 GB/s | 96 | 10 |
GeForce 9600 GT *** ^ | 650 MHz / 1,625 MHz | 1.8 GHz | 256-bit | 57.6 GB/s | 64 | 10 |
GeForce 9800 GTX *** ^ | 675 MHz / 1,688 MHz | 2.2 GHz | 256-bit | 70.4 GB/s | 128 | 10 |
GeForce 9800 GTX+ *** ^ | 738 MHz / 1,836 MHz | 2.2 GHz | 256-bit | 70.4 GB/s | 128 | 10 |
GeForce 9800 GX2 ** *** ^ | 600 MHz / 1.5 GHz | 2 GHz | 256-bit | 64 GB/s | 128 | 10 |
GeForce GTX 260 *** ^ | 576 MHz / 1,242 MHz | 2 GHz | 448-bit | 112 GB/s | 192 | 10 |
GeForce GTX 280 *** ^ | 602 MHz / 1,296 MHz | 2.21 GHz | 512-bit | 141.7 GB/s | 240 | 10 |
* The manufacturer can set up a different memory clock rate or interface, so pay attention: not all video cards based on this chip have these exact specs. The memory transfer rate will depend on the interface width and clock rate used. See how to calculate it below.
** GeForce 7950 GX2 and GeForce 9800 GX2 use two graphics processors in parallel (SLI mode). The specs published are for just one of the chips.
*** GeForce 8, 9, and 200 series chips use two clocks: the higher one drives the shader unit and the lower one the rest of the chip. The shader unit is unified, meaning that these chips don't have separate pixel shader and vertex shader units; for these chips, the "Pixels per clock" column lists the number of unified shader processors. Read our article GeForce 8 Series Architecture for more information.
^ Based on PCI Express 2.0, which doubles the per-lane transfer rate from 2.5 Gbps to 5 Gbps when a PCI Express 2.0 motherboard is used.
(TC) means TurboCache, a technology that allows the video card to simulate more video memory by using part of the main system RAM as video memory. Read our tutorial on this subject for a better understanding of this feature.
At first, nVidia’s profusion of model names and suffixes may seem confusing. The GeForce FX 5700 Ultra works at a higher clock than the GeForce FX 5900, GeForce FX 5900 Ultra, and GeForce FX 5900 XT, which may lead you to think that a GeForce FX 5700 Ultra is faster than the chips in the 5900 series.
But that is not really so. Chips from the GeForce FX 5900 series access memory 256 bits at a time, while chips from the FX 5700 series access it 128 bits at a time. At the same memory clock, that makes the memory access performance of the 5900 series twice that of the 5700 series. For instance, the GeForce FX 5700 Ultra would have to access its memory at 1,700 MHz – almost double its actual 900 MHz memory clock – to reach the memory performance of the GeForce FX 5900 Ultra.
Here is another example: from the table you might think the GeForce 6600 GT is faster than the GeForce 6800 because it has a higher clock rate (500 MHz against 325 MHz). But the GeForce 6800 accesses memory 256 bits at a time while the GeForce 6600 GT accesses it 128 bits at a time, and the GeForce 6800 also processes 12 pixels per clock cycle, against eight on the GeForce 6600 GT.
The right way to compare the memory performance of different chips is through their memory transfer rate, which is calculated with the formula (memory clock × interface width in bits) / 8. The division by eight converts the result from bits to bytes per second.
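To make the formula concrete, here is a minimal sketch in Python (the function name is ours, for illustration only) that reproduces transfer rates listed in the table:

```python
# Minimal sketch of the memory transfer rate formula above.
# The memory clock is the effective rate listed in the table, in MHz;
# the interface width is in bits. Dividing by 8 converts bits to bytes.

def memory_transfer_rate_gbs(memory_clock_mhz, interface_bits):
    """Return the memory transfer rate in GB/s."""
    return memory_clock_mhz * interface_bits / 8 / 1000

print(memory_transfer_rate_gbs(900, 128))   # GeForce FX 5700 Ultra -> 14.4 GB/s
print(memory_transfer_rate_gbs(850, 256))   # GeForce FX 5900 Ultra -> 27.2 GB/s
print(memory_transfer_rate_gbs(1000, 128))  # GeForce 6600 GT -> 16.0 GB/s
print(memory_transfer_rate_gbs(600, 256))   # GeForce 6800 -> 19.2 GB/s
```

This also shows why the GeForce 6800 beats the GeForce 6600 GT in memory performance despite its much lower memory clock.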
Another difference is in pixel throughput: the graphics processor of the FX 5900 series processes eight pixels per clock cycle, while the chips of the other series discussed so far process only four. In other words, despite having a higher clock, the GeForce FX 5700 Ultra delivers lower graphics processing performance than the chips from the FX 5900 series, as they process twice as many pixels when working at the same clock (simply put, the GeForce FX 5700 Ultra would have to work at twice its clock to match the performance of the GeForce FX 5900 Ultra).
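The same kind of back-of-the-envelope calculation works for pixel throughput: multiplying the core clock by the number of pixels processed per clock gives the theoretical fill rate. Again a minimal sketch in Python, with an illustrative function name:

```python
# Minimal sketch of the theoretical fill rate calculation described above:
# core clock in MHz times pixels per clock gives megapixels per second.

def fill_rate_mpixels(core_clock_mhz, pixels_per_clock):
    """Return the theoretical fill rate in Mpixels/s."""
    return core_clock_mhz * pixels_per_clock

print(fill_rate_mpixels(475, 4))  # GeForce FX 5700 Ultra -> 1900 Mpixels/s
print(fill_rate_mpixels(450, 8))  # GeForce FX 5900 Ultra -> 3600 Mpixels/s
```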
Therefore, it is not correct to compare graphics chips by their clock rates alone.
We must be careful with the GeForce FX 5900 XT, too. While ATI uses the suffix "XT" to indicate high-end chips (e.g., Radeon 9800 XT), nVidia uses the same letters to indicate the low-end chip of a series (see table).
You also have to be very careful with low-end video cards based on nVidia chips, because they can use clock rates and memory interfaces different from those listed in the table. For example, you can find GeForce FX 5200, GeForce FX 5500, and GeForce 6600 cards with either a 64-bit or a 128-bit interface, and we have even seen GeForce FX 5200, GeForce FX 5500, and GeForce 6200 cards with a 32-bit interface on the market!
As for the DirectX version supported by each chip and the corresponding Shader Model, check the table below:
DirectX | Shader Model |
--- | --- |
7.0 | None |
8.1 | 1.4 |
9.0 | 2.0 |
9.0c | 3.0 |
10 | 4.0 |
For a detailed discussion on the subject, read our DirectX tutorial.