nVidia GeForce FX5200 (NV34) Video Card Review
By: Artem Semenkov
The NV30 was announced quite a long time ago. But video cards based on the nVidia GeForce FX 5800 (NV30) chip are priced beyond the reach of many, and some users are simply unwilling to pay extra for speed they don't badly need. Usually, after the release of a flagship model, a cut-down version follows, which allows the manufacturer to cover all market sectors.
As of May 2003, nVidia is producing three chips based on the FX architecture. To be more precise, FX 5600 video cards are to hit mass sales only next month; for now, production stock is being piled up. To be absolutely honest, as we predicted, the NV30 will never be released to the mainstream market: it is aimed solely at the professional Quadro product line, where price does not play the decisive role it does on the consumer market. In the road maps of all video card manufacturers, the NV30 has been replaced with the NV35. Nevertheless, we did include the NV30 in our comparative table... Much water will have flowed by the time the NV35 reaches real-life retail.
Comparison data for the FX family video cards:
Note: the recommended core and memory clock speeds for the NV31 and NV34 chips have been revised several times.
The table compared process technology (microns), transistor count (millions), core clock (MHz), memory clock (MHz) and memory bus width (bits); of the three chips, only the NV30 pairs its 128-bit bus with DDR II memory.
Today we are presenting benchmarks of nVidia GeForce FX 5200 (NV34) video cards; in the near future we'll follow up with an article on the mid-range GeForce FX 5600 (NV31) chip.
Distinctions of the NV34 Chip
To date, nVidia is releasing a line of three GPUs of the FX family, one for each of the three market sectors, with each chip produced in two variants: regular and Ultra:
- 5800 (Ultra) - NV30. High-end, a replacement for the Ti4800
- 5600 (Ultra) - NV31. Middle class, a replacement for the Ti4200
- 5200 (Ultra) - NV34. Low-end, a replacement for the MX440
As we can see, nVidia has given up using the MX label to designate its low-end products. You can now forget about the MX...
Making the chip and video card cheaper comes at a price. Here is a list of the main features that distinguish the high-end NV30 from the NV34, i.e. what is missing in the NV34:
- 0.13 Micron Process Technology - allows placing a greater number of semiconductor components on the chip and increasing the speed of the 256-bit core. The FX 5200 series uses the 0.15-micron process instead.
- Intellisample Technology - a new FSAA technology that smooths out rough, jagged image outlines at least 50% better than before. It also allows tuning the color spectrum to compensate for differences between light and color perception and the way the monitor reproduces them. Besides, this technology uses improved anisotropic filtering, which reduces texture distortion by introducing dynamic corrections to the image. The FX 5200 does not offer the Z-compression or color compression inherent to this technology: the chip simply isn't powerful enough to implement them.
- 8 Pixel Pipelines - allow outputting up to 8 pixels per clock. In our case (the 5200) there are only 4.
- 400 MHz RAMDAC - in the 5200 series, the video DAC runs at 350 MHz.
- DDR II memory - instead of cutting-edge DDR II, the FX 5200 uses regular DDR.
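The practical cost of these cuts can be illustrated with simple theoretical throughput arithmetic. The sketch below is illustrative only: the clock speeds used are assumed figures, not specifications from this review (nVidia revised the official ones several times).

```python
def pixel_fillrate_mpix(core_mhz: float, pipelines: int) -> float:
    """Theoretical pixel fill rate in megapixels per second:
    one pixel per pipeline per core clock."""
    return core_mhz * pipelines

def memory_bandwidth_gbs(mem_mhz: float, bus_bits: int) -> float:
    """Theoretical bandwidth in GB/s for double-data-rate memory:
    two transfers per clock on a bus_bits-wide bus."""
    return mem_mhz * 1e6 * 2 * (bus_bits / 8) / 1e9

# Assumed clocks for illustration: 325 MHz core for an FX 5200
# versus 500 MHz for an FX 5800; both have 128-bit memory buses.
print(pixel_fillrate_mpix(325, 4))     # 4 pipelines -> 1300 Mpix/s
print(pixel_fillrate_mpix(500, 8))     # 8 pipelines -> 4000 Mpix/s
print(memory_bandwidth_gbs(200, 128))  # 200 MHz DDR -> 6.4 GB/s
```

Even before drivers and real workloads enter the picture, halving the pipeline count cuts the theoretical fill rate substantially, which is why FSAA and anisotropic filtering hit this class of card hardest.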
The nVidia GeForce FX 5200 Chip
The chip has 45 million transistors and is built on the 0.15-micron process technology.
The GeForce FX 5200 offers 2 pixel pipelines and 4 texture units. But all this is very relative, because these days you can no longer judge a video card simply by its count of units of one type or the other: the driver configures the chip's operation for each particular scene of a computer game.
All in all, the 3D capabilities of the NV34 do not differ from those of the NV30/NV31: the NV34 supports the DirectX 9 API and, thus, shaders 2.0 and 2.0+. There are differences, of course: the GeForce FX 5200 chip does not offer IntelliSample optimizations.
The memory interface of the GeForce FX 5200 also differs from that of its higher-end brother, the NV30. The chip uses a standard DDR memory controller, which in theory results in substantial performance drops, especially with anisotropic filtering and FSAA enabled.
GeForce FX 5200 chips do not support HDTV, but they can boast an integrated TV encoder, a TMDS transmitter and two integrated 350 MHz RAMDACs. Today, however, this is unlikely to surprise anyone.
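What a RAMDAC's frequency means in practice is the maximum pixel clock it can output, which bounds the available resolution and refresh-rate combinations. A rough sketch of that bound, assuming a typical ~32% blanking overhead (an approximation, not a figure from this review):

```python
def required_pixel_clock_mhz(width: int, height: int,
                             refresh_hz: int,
                             blanking: float = 1.32) -> float:
    """Approximate DAC pixel clock (MHz) needed for a display mode,
    including horizontal/vertical blanking overhead (~32%)."""
    return width * height * refresh_hz * blanking / 1e6

# A 350 MHz RAMDAC has comfortable headroom for 1600x1200 at 100 Hz:
print(required_pixel_clock_mhz(1600, 1200, 100))  # ~253 MHz < 350 MHz
```

By the same estimate, the very highest modes (e.g. 2048x1536 at 85 Hz) push past 350 MHz, which is where the 400 MHz RAMDAC of the higher-end chips earns its keep.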
Over its long development period and short life, the GeForce FX 5200 has had its "recommended" core and memory clock speeds changed at least 10 times. The problem is that the first tests of pre-production samples demonstrated performance so astonishingly low that launching video cards in such a state onto the market would have been out of the question, even fatal. We'd better refrain from publishing those first raw results - that simply wouldn't be fair to nVidia - but I can assure you they were much lower than those of the MX440. The cause lay primarily in buggy Detonator drivers unfit for an entirely new chip architecture, and in the really low clock speeds of the new chips. Gradually things evened out: the standard recommended clock speeds were raised, every new driver revision streamlined the cards' performance, and the yield ratio bit by bit approached reasonable technology norms. In the comparison table at the beginning of the article we listed the "original" core and memory clock values, but they may prove different in reality, as you can see for yourself when buying a card in a shop near you. Video card manufacturers are not shy about varying these values within wide ranges...
Direct competitors of the new chips are the Radeon 9000 and 9000 Pro, which will soon be replaced by the Radeon 9200 and 9200 PRO.