Leadtek WinFast GeForce2 GTS

Review date: 19 May 2000.
Last modified 03-Dec-2011.

 

Monstrous, horrifying, skin-peeling speed.

Brutal, sphincter-loosening, skull-munching power.

Un-namable, worshipped-by-Lovecraftian-cults, planet-crushing bad-assedness.

Let's face it, that's what we all want from a PC, isn't it?

WinFast GeForce2

People who are into word processing and a spot of light Web browsing can stop reading here. Because this review is of the $AU695 Leadtek WinFast GeForce2 GTS, an AGP graphics card which offers a level of 3D performance substantially higher than, well, than anything you could buy before NVIDIA's new GeForce2 GTS chipset came out.

The GeForce2 is pretty deuced fast for ordinary 2D graphics - business applications, paint and drawing programs, non-3D games. But that doesn't matter. It's difficult to find a graphics card on the market today that isn't faster for 2D than anybody really needs.

Top-end consumer cards like the GeForce2 aren't meant for such pedestrian pursuits. They're aimed, first and foremost, at 3D game players. These cards can put in a surprisingly good showing in professional 3D design applications, as well - they give a good slice of the performance of far more expensive professional 3D boards. But most people who buy top-spec 3D boards do so to get the very smoothest, highest resolution 3D mayhem possible. And it is safe to say that the GeForce2 fills this particular need more spectacularly than anything else to date.

Big iron

Like many other graphics chipset manufacturers, NVIDIA, maker of the GeForce2, sells the chips to various card-makers, who all build their own cards using it. Leadtek made good cards with the earlier GeForce chipset, and they've made a good one using the new chipset, too.

Giant chip cooler

The first thing you notice about Leadtek's version of the GeForce2 is its, well, utterly freakin' gigantic chip cooler.

The main GeForce2 chip has been reduced to a 0.18 micron manufacturing process (the original GeForce is 0.22 micron), which cuts heat output. But it now runs, by default, at 200MHz, versus 120MHz for the old model. So there's still plenty of heat to get rid of.

But video cards can't ship with very big chip coolers; if the cooler sticks up too far, it'll foul a card mounted in the next expansion slot. Many video card overclockers (people who run the core and/or RAM faster than stock speed, of which more in a moment) retrofit their cards with after-market oversized heat sinks and just give up on putting anything in the slot, or slots, invaded by the new cooler. Manufacturers can't do that, though, so if they want to provide lots of cooling, the heatsink has to go outwards, not upwards.

Hence, this.

The fan in the middle of the thing's a bit bigger than normal by itself, but it's surrounded by a forest of pins; the chip cooler shades, but doesn't touch, the RAM chips that surround the main chip.

Apart from the amazing cooling hardware, there's nothing at all remarkable about this board.

Underside view

Like all other GeForce2 cards in this first wave, it's got 32 megabytes of video memory (this picture shows the other four RAM chips on the underside of the card). 64Mb versions will be along soon enough, but that much video memory only helps you in current games when you're running very high resolutions.

Which, by the way, is something the GeForce2 is exceedingly happy to do.

Essentially, if you play 3D games, adding a GeForce2 card to your system will let you wind the resolution up - if your current card isn't that great, make that WAY up - and still enjoy the same or better frame rates.

This is because the resolution you run a 3D game in, when you're using a 3D accelerator card, has nothing to do with your CPU load. More polygons on screen (higher object detail, or more people running around...) means more CPU load, but the CPU knows nothing about the resolution. It just spits the geometry data out into the graphics card, which handles painting the screen.
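
If you feel like putting rough numbers on that, here's a scrap of Python - my own back-of-the-envelope, not anything from NVIDIA or Leadtek - showing how fast the pixel-painting workload grows with resolution, while the geometry load on the CPU stays put:

    # Pixels the card has to paint each second at an illustrative 60fps.
    # None of this arithmetic involves the CPU-side geometry work at all.
    target_fps = 60
    for width, height in [(640, 480), (1024, 768), (1280, 960), (1600, 1200)]:
        megapixels_per_second = width * height * target_fps / 1e6
        print(f"{width}x{height}: {megapixels_per_second:.0f} million pixels/sec to fill")

Going from 640 by 480 to 1600 by 1200 is more than six times as many pixels per frame, and the graphics card wears all of that extra work.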

Octuple pipelined single-pass Bussard ramscoop Jeffries tubes!

NVIDIA, as always, is full of techno-marketing-speak about the superiority of the GeForce2. We've heard it before, and it means about as much as it did the last time.

If you're looking for an in-depth technical appraisal of the GeForce2, allow me to recommend the ones on Tom's Hardware, AnandTech and Sharky Extreme.

NVIDIA's buzzword when they released the first GeForce was "GPU" - Graphics Processing Unit. Now, any video card with some sort of hardware acceleration of some feature or other can be said to be a "GPU"; the GeForce was just the first one to have transform and lighting (T&L) implemented in hardware, taking even more load off the system CPU.

But no matter; the "world's first GPU" marketing line got NVIDIA plenty of column inches.

With the GeForce2 GTS, to give it its full name, the three-letter acronym on the end is the new buzzword. It stands for "Giga Texel Shader". This means more than GPU did, but it's basically just an evolutionary change - the GeForce2 can, theoretically, steam along at 1.6 billion textured pixels ("1.6 gigatexels") per second, versus a mere 480 megatexels per second for the original GeForce.

Apart from that, the real improvements in the GeForce2 are pretty much limited to some tweaks to the transform and lighting engine, and somewhat faster memory than even the GeForce DDR (Double Data Rate). The DDR GeForce had 300MHz memory (150MHz clock speed, with two transactions per clock tick; that's what the DDR part means); the GeForce2 has 333MHz memory.
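
For the curious, those headline numbers seem to fall straight out of the clock speeds and the usual spec-sheet figures - four pixel pipelines on both chips, one texture unit per pipeline on the GeForce and two on the GeForce2, and a 128 bit wide memory bus. Those pipeline and bus-width figures are my assumptions, not anything out of Leadtek's box, but the arithmetic, as a bit of Python, works out like this:

    # Assumed specs: 4 pixel pipelines on both chips; 1 texture unit per
    # pipeline on the GeForce, 2 per pipeline on the GeForce2; 128 bit bus.
    def texel_rate(core_mhz, pipelines, textures_per_pipe):
        return core_mhz * pipelines * textures_per_pipe   # megatexels per second

    print(texel_rate(120, 4, 1))         # GeForce 256:  480 megatexels/sec
    print(texel_rate(200, 4, 2) / 1000)  # GeForce2 GTS: 1.6 gigatexels/sec

    def memory_bandwidth_gb(effective_mhz, bus_bits=128):
        return effective_mhz * 1e6 * (bus_bits / 8) / 1e9  # gigabytes per second

    print(memory_bandwidth_gb(300))      # GeForce DDR: about 4.8GB/sec
    print(memory_bandwidth_gb(333))      # GeForce2:    about 5.3GB/sec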

The parade of features that the GeForce has and the GeForce2 improves isn't actually all that important for the average video card buyer. The GeForce2 is, indeed, substantially faster than the GeForce, even the DDR version, as we'll see in a moment. But the super-fancy 3D extras on all GeForce flavours are, unfortunately, not yet particularly well supported by games.

This is because game development generally takes one to two years, and many development teams lock down the technical capabilities of the 3D "engine" they create (or license from someone else) relatively early in the process. There are always detail and resolution adjustments you can make, but whole new levels need to be made to take advantage of cards like the GeForce2, which can animate multiple dynamic lights, realistic water and other amazing things.

Levels that look fabulous on a GeForce2 will be an unplayable slideshow on the far less capable cards that most game players are still running. Since game developers can't afford to make what amounts to a whole new game for the top few per cent of the market, buyers of bleeding edge cards like the GeForce2 don't get to see the true amazingness of their hardware in anything but the manufacturer's demos.

Which are utterly spectacular. Unfortunately, you don't get 'em with the Leadtek card.

Software

If you want to see the GeForce2's full power, you can download movies of the various new demos from here. But you can't download the demos themselves. Most of the older GeForce demos are available for download here, but the new ones haven't made it onto NVIDIA's site yet. They're supposed to just be bundled with the video cards.

Leadtek didn't include demos with their GeForce cards, and they don't include them with their GeForce2 ones either. You instead get one driver CD - which, as usual for Leadtek, contains drivers for umpteen other cards as well as the GeForce2 - and one software disc, containing Leadtek's current bundle-pack.

There's Colorific colour calibration software and 3Deep gamma correction software, which both now come from E-Color. You also get Leadtek's own WinFastDVD, a DVD playback package with a funny looking interface but perfectly acceptable playback quality. Of course, you need a DVD-ROM drive to use it.

Oh, and the driver disc also contains something called Cult3D, which threatens to let people put 3D objects in Microsoft Office and Adobe Acrobat documents.

And that's it. As usual with Leadtek products, it's a one-bundle-fits-all situation, with no special GeForce2-related software.

Drivers

Traditionally, buyers of brand new video cards get the v1.0 drivers on the disc that comes with the card. Seasoned users know it's a better idea to hit the card and/or chipset manufacturers' Web sites and suck down the current drivers right away; don't even install the card until you've got proper drivers.

NVIDIA has, for quite a while now, rolled all of the drivers for its entire range of cards into one blob. You get the driver pack for your operating system from the download site here, and it doesn't matter whether you're running a low-end TNT2 Vanta board or a GeForce or any of the other zillion and three flavours of card that NVIDIA keep creating.

When you've got a "vanilla" card like this one, with nothing but an ordinary VGA connector and no fancy extras, you might as well use the standard NVIDIA driver set. Lots of GeForce and other NVIDIA card owners do; the drivers from the card manufacturer are usually just lightly facelifted versions of the "reference" drivers.

Driver releases from NVIDIA dried up a while ago, after they made the v3.68 driver pack available for download. v3.78 "Release Candidate" drivers have been made available as well, but the drivers you get from the plain NVIDIA download site are still v3.68.

The reason we haven't seen any new official drivers is that NVIDIA's been busy creating the new "v5" driver series. The version 5 drivers have not been officially released yet, though; there are just various "leaked" versions around the place.

The earlier versions of the leaked drivers were obviously not fully baked. The reason was simple enough - NVIDIA was tweaking up the drivers for the GeForce2, with the changes trickling down to the drivers for the other cards in the range, but working out the bugs for these other cards' drivers was a low priority. People making GeForce2 cards got the new GeForce2 driver code to build their drivers on; that was all that needed to be ready for human consumption.

The v5 drivers give great speed with earlier cards - they're impressively faster than earlier versions - but have poor compatibility with some software. Lots of games fell over with the original v5.08 drivers, and the latest (at the time of writing) v5.16 ones are better, but still far from bulletproof. This hasn't stopped lots of users of earlier NVIDIA cards from choosing the v5 drivers, though; if they work with what you want to play, the speed gain's well worth it.

You can find exhaustive lists of official and unofficial driver sets for NVIDIA and other graphics cards at sites like Reactor Critical, here.

Fortunately, you don't have to muck about with all this to use the Leadtek card. Leadtek have a tradition of providing good drivers, and updating regularly and promptly. The driver version on the CD is v1.02, and a better, faster v1.03 driver is already available for download from here.

Performance

It has come to my attention that reviews of video cards will not be taken seriously unless they include at least one big fancy graph, preferably displaying six or fewer actual data points. Accordingly, without further ado, here is just the sort of graph that seems to be required:

Fancy graph

What?

You want more detail?

Oh, all right.

I compared the GeForce2, running its stock drivers and at stock speed, with an ASUS V6600 GeForce 256 board (reviewed here), running the v3.68 drivers and also at stock speed.

The test machine's powered by a 750MHz Athlon on an ASUS K7M motherboard.

In quick WinTune testing, the GeForce2 was slightly faster for 2D - which, as mentioned above, is not important since both cards are ridiculously fast. But it was about twice as fast for Direct3D and OpenGL, in 1280 by 960 resolution, 32 bit colour. Some of this is down to the faster drivers; most of it's down to the faster hardware.

Then I tried good old Quake 2 - out of date, but good for comparative benchmarks with older hardware. I used 1280 by 960 again, but in 16 bit colour, because 32 bit adds nothing to the look of this older engine.

For the undemanding demo2, hectic-deathmatch Massive1 and worst-case-scenario Crusher demos, the GeForce scored, respectively, 82, 76 and 64 frames per second. Which is excellent, at that resolution; the old rule of thumb was that anything better than 30 frames per second in Crusher meant you had a deathmatch-worthy computer, and people were usually running 1024 by 768 or lower to get even that.

The GeForce2, though, clocked in at 136, 124 and 91 frames per second, respectively. Better than 60% faster for the simpler demos, better than 40% faster for the heavyweight one.

Treemark

In TreeMark, NVIDIA's specially made demo/benchmark that shows off the advantages of hardware transform and lighting, the GeForce managed 41.8 frames per second for the Simple test and 12.2 for the Complex one.

The GeForce2 managed 73.1 and 20.5 frames per second, respectively, for 75% and 68% victories.

In 3DMark 2000, Mad Onion's show-off Direct3D benchmark, the GeForce managed 3755 3DMarks in the default 1024 by 768 test, and the GeForce2 scored 5270 - 40% faster.
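
If you want to check those percentages yourself, the sums are trivial; here they are as Python, using the scores above:

    # Percentage speed-up of the GeForce2 over the GeForce, from the scores above.
    def percent_faster(new, old):
        return (new / old - 1) * 100

    print(round(percent_faster(136, 82)))     # Quake 2 demo2:   66% faster
    print(round(percent_faster(91, 64)))      # Quake 2 Crusher: 42% faster
    print(round(percent_faster(5270, 3755)))  # 3DMark 2000:     40% faster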

The difference between the two cards vanishes in lower resolutions. Try 640 by 480 or 800 by 600 and you'll find the new cards score little better than the old ones, because they both draw frames faster than the CPU can feed them fresh data, no matter how fast your CPU is. But the advantage is clear once you get above 1024 by 768.

And you can make the GeForce2 even faster.

Tweaking

For a start, the v1.03 drivers make the Leadtek GeForce2 faster. The 3D benchmark scores went up by a bit more than 5% overall when I updated the drivers.

And then there's overclocking.

The NVIDIA reference drivers have had overclocking abilities built in for some time now. Leadtek's version of the drivers makes it easier to do, though, because you don't need to do any registry tweaking to make the overclocking options accessible, and you don't need to restart your computer when you decide to start using them. Just click the "Speed Runner" button in the WinFast Display Settings tab of Display Properties, and you're in business.

Overclocking utility

Video cards like this, with separate core and memory clocks, let you tell which clock you've wound up too far. If you grossly over-overclock either setting, your computer will just instantly crash. But if you've just somewhat over-overclocked the card's RAM, you'll get little white dots sprinkling the screen, as locations in the frame buffer RAM are misread.

If the RAM's OK, but you've over-overclocked the card's core slightly, you'll get geometry errors. Things will show through other things when they shouldn't, for instance. These errors are your sign to shut the game down right now and wind the clock down, if you can. But you usually only get to say "Oh, look, geometry err-" before the computer hangs.

Fortunately, this is no big deal. 2D graphics are much less demanding than 3D, so an over-pumped card won't have any trouble displaying Windows again after a reboot, and you can try less audacious settings.

After a bit of fiddling and a handful of these reboots, I settled on 219MHz core, 396MHz RAM; a 9.5% and an 18.9% overclock, respectively. 9.5 per cent is a pretty lame result from a board with such a gigantic chip cooler on it. Other reviewers have reported more impressive results, but not much more impressive. Oh well; it's not as if you're paying a big premium for the giant heatsink. And the RAM - which isn't touched by the huge cooler - seems happy to run at a more impressive overclock, which is significant, since RAM speed seems to be the limiting factor for GeForce2 performance at high resolutions.
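
In case you're wondering where those overclock percentages come from, it's just the wound-up speed divided by the stock speed; a quick Python sanity check:

    # Overclock percentages, from the stock 200MHz core and 333MHz effective memory.
    stock = {"core": 200, "memory": 333}
    wound_up = {"core": 219, "memory": 396}
    for part in stock:
        print(part, round((wound_up[part] / stock[part] - 1) * 100, 1), "% overclock")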

The speed improvement from this much overclocking only worked out at about 4% in 1024 by 768; in 1600 by 1200, the improvement was more like 13%. At higher resolutions, even the super-fast RAM on the GeForce2 isn't fast enough, and overclocking it helps. Core overclocking has less of an effect for GeForce2 boards; it doesn't matter how fast the core is if the RAM can't accept and deliver data quickly enough.

So, overall, overclocking this card is no big deal, unless you get one happier to run fast than the one I checked out. Another ten per cent or so is nice to have, of course, but it's not worth getting too excited over.

Which leaves us with a question - what do you do when you've got a fast CPU, and a ludicrously fast graphics card, but you're not running super-high resolutions because your monitor isn't big enough? What can be done to burn some of that outrageous extra frame rate?

Anti-aliasing, that's what.

Smooth customer

Because images on your screen are all made of square pixels, high-contrast diagonal lines have steps - "jaggies". It's inescapable, until display devices get good enough that the pixels are too small to see.

But you can cure jagginess, by "anti-aliasing" the image. Anti-aliasing fills the steps with pixels of intermediate colours. Do it correctly, and the picture looks a lot less "computery", but doesn't lose detail.

The GeForce2, with its outrageous speed, can do Full Screen Anti-Aliasing (FSAA) quite easily. Here's how it works.

Suppose you've got a 16 pixel diameter circle, like this:

Jaggy circle

This is blown up to 64 by 64, so you can see the jaggies clearly; it's actually only 16 pixels across.

To make an anti-aliased version, you can render the circle bigger than it's going to be displayed - say, a genuine 64 by 64 pixels:

High-res circle

It's easy for 3D graphics systems to generate views of the 3D world of arbitrary dimensions; you just need enough processing power.

Once you've got your high-resolution version of the image, you scale it down, averaging out the values of the pixels as you squish 'em so that, for instance, two black and two white pixels, averaged down, give you one grey one. The result:

Anti-aliased circle

Doing this with a 16 by 16 image is one thing. Doing it in 1024 by 768 is quite another. Double the dimensions of the image and you've got four times as many pixels to generate; then you have to burn more power scaling the high res version down to the resolution you're actually displaying on the monitor.
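
For what it's worth, here's the whole render-big-then-shrink trick as a few lines of Python and NumPy. This is purely an illustration of the arithmetic - the function names are mine, and it's got nothing to do with how the card's hardware actually goes about it:

    import numpy as np

    def supersample_downscale(image, factor=4):
        """Average each factor-by-factor block of an oversized rendering down
        to one displayed pixel - a simple box filter, as in the circle example."""
        h, w = image.shape[:2]
        assert h % factor == 0 and w % factor == 0
        blocks = image.reshape(h // factor, factor, w // factor, factor, -1)
        return blocks.mean(axis=(1, 3))

    # Toy example: "render" the 64 by 64 circle, then average it down to the
    # 16 by 16 version that actually gets displayed, anti-aliased edges and all.
    yy, xx = np.mgrid[0:64, 0:64]
    big_circle = ((xx - 31.5) ** 2 + (yy - 31.5) ** 2 <= 30 ** 2).astype(float)
    small_circle = supersample_downscale(big_circle[..., np.newaxis], factor=4)
    print(small_circle.shape)  # (16, 16, 1), with grey pixels along the edges

Do the same thing to a 1024 by 768 frame rendered at twice the size in each direction and you're generating and then averaging down more than three million pixels per frame, which is where all that spare fill rate goes.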

But if you've got power to burn, you can do it - and the GeForce2 does. FSAA control is built right into the drivers, for OpenGL and Direct3D.

For the OpenGL FSAA, go to Display Properties -> WinFast Display Settings -> Advance -> OpenGL Settings, and then scroll down to the bottom of the options list and check the "Enable full scene antialiasing" option.

If you want to turn on the Direct3D FSAA for games that don't explicitly support it already - which, at the moment, means pretty much all games - you'll need to go to Display Properties -> WinFast Display Settings -> Advance -> More Direct3D... -> Antialiasing, and check the "Force antialiasing in all applications" option. The Antialiasing tab isn't there in the stock v1.02 drivers, but the newer ones have it.

In OpenGL, the performance hit from FSAA is large but not intolerable. It chops the frame rate to about a third of the normal figure; if you're starting out at more than a hundred frames per second, that means you still get a playable frame rate afterwards.

In Quake 2, I could even use FSAA in 1280 by 960 mode. In Quake 3, the greater amount of game data taking up card memory meant I had to drop to 1024 by 768, and the frame rate was no good even then - 17.1 frames per second in the standard demo1 test, versus 70.7 without anti-aliasing. In 800 by 600 it scored a tolerable 28.5 fps with FSAA, versus 84 without, but if you have to lose that much resolution to enable a smoothing technology, much of the benefit is lost.

Direct3D FSAA isn't ready for prime time yet. I tried it in Counter-Strike (a modification for the Quake-engine-based game Half-Life, which supports Direct3D as well as OpenGL). In resolutions above 640 by 480, the frame rate was OK - but the response of the game to controls was laughably drunken. There was a scene-detail-related delay between pressing a move key and something actually happening. Even in 640 by 480, more detailed areas dragged - it felt as if, the moment you looked at something a bit complicated, you got stuck in molasses.

And Counter-Strike is a pretty darn lightweight game, as far as scene complexity goes.

Resolutions below 640 by 480 were fine regardless of scene detail, but dropping your resolution that far just to turn on an image-smoothing option is silly. No doubt future driver revisions will get Direct3D FSAA working properly.

Counter-Strike worked fine with FSAA in OpenGL mode, though; here's a detail shot that shows it in action:

FSAA in action

Click the picture for a bigger chunk of the scene.

FSAA, if it works at a good enough speed in the game of your choice, is groovy. If you've got a smallish monitor and can't profitably use monster resolutions anyway, FSAA is a great way to get a better picture without dropping tons of money on a new screen. If you've got a 19 inch or better monitor, though, then you're not going to get acceptable results from FSAA in higher resolutions - if you can use it at all - and you might as well just wind up the resolution and enjoy the frame rate.

Other features

The GeForce2 has digital HDTV decoding abilities, but that's not as amazing as it sounds. Digital HDTV is, basically, just high resolution MPEG-2 video. And being able to decode it is no good if you don't have a signal source. No "HDTV decoding capable" video card - of which there are quite a few - has a tuner built in. The tuner has to be a separate card, and they don't seem to exist yet. Which is fair enough, since HDTV broadcasting barely exists anywhere in the world yet, either.

A lot of GeForce2 boards - but not this one - also have TV out. If you're really fired up about playing games on your TV, bear in mind that you do not need an ultra-studly video card like this one to do it. You can get a TV-out capable card with just about any popular video chipset.

You can't use resolutions higher than 800 by 600 for TV output - the cheap TV encoders on video cards generally can't handle higher resolutions, but even if they could, TVs are so fuzzy that higher resolutions don't look any better, and small details like status display text are illegible.

TVs also have a maximum real refresh rate of 50Hz (cycles per second), interlaced, if you're in a PAL or SECAM standard country like Australia, or 60Hz, interlaced, if you're in an NTSC place like the USA. The "interlaced" part means you don't even get 50 or 60 real frames per second, because each frame's carved up into two "fields" - the even numbered lines are drawn, then 1/60th or 1/50th of a second later you get the odd numbered lines.

Now, 60Hz is the lowest refresh rate most computers use; only on old and cruddy systems do you have to put up with interlacing. TVs get away with it because they're fuzzy; on razor sharp computer monitors with lots of fine detail and horizontal lines, 50/60Hz interlace is intolerably flickery.

If you've only got 60 fields per second, you can't possibly see more than 30 full frames per second. And if your computer spits out, say, 90 frames per second in 800 by 600, then each displayed field will be stitched together from horizontal stripes of two or three different rendered frames. Add the interlacing, which throws away one line out of every two, and only about a third of the image data your computer emits at 90fps ever makes it to the screen; the TV discards the rest.
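
A quick bit of Python bookkeeping for the sceptical - my sums, using the numbers above:

    # 90fps of 800 by 600 going into a 60 field per second interlaced TV signal.
    fps, width, height = 90, 800, 600
    fields_per_second, lines_per_field = 60, height // 2

    rendered = fps * width * height                          # pixels drawn per second
    displayed = fields_per_second * width * lines_per_field  # pixels the TV shows per second
    print(displayed / rendered)  # 0.333... - two pixels in three never reach the screen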

If all you're doing is 800 by 600 and frame rates much above 60fps aren't useful to you, then you'll probably have a hard time telling the difference between a cheap TNT2-based card, for instance, and a much more expensive GeForce 2.

And don't think DVD playback's a big feature for super-powered cards, either; the "motion compensation" they can do in hardware to accelerate DVD MPEG-2 decoding is there to reduce CPU load, on slow systems, at the price of a little image quality.

You get better results if you don't use the special hardware decoding feature - and, in any case, motion compensation's available on all sorts of cheap video cards, too. Assuming the TV encoder's OK, any old cheap video card with a TV output will give you DVD playback results just as good as those from a cutting edge GeForce2. All you need is a decently fast processor, and just about anything from 400MHz upwards should be fine.

Competitors

If you're willing to look a few weeks down the track, the GeForce2 isn't as clear a winner as the original GeForce was. The original GeForce and the later DDR version were substantially quicker than anything anybody else had on the market at the same time. Later releases from Matrox and S3 and ATI came close to GeForce performance, but nothing pipped it; until the GeForce2 hit the streets, the GeForce 256 DDR was still the king of the hill, and at lower resolutions the plain GeForce wasn't far behind.

Now, 3dfx has its Voodoo 5 products very-nearly-released - although us Australians can expect to wait a while longer before they make it to the retail channel. All of the new 3dfx products - Voodoo 4, 5 and 6 - are based on a scalable chipset called the VSA-100, which allows 3dfx to just add more chips to the card to increase performance.

The dual-chip V5 5500 isn't as fast as the GeForce2. But it doesn't lose by a mile, and it's slightly less expensive. Since it's got 64Mb of video memory to the standard GeForce2 boards' 32, it'll probably be an attractive prospect when it goes on sale.

The huge quad-chip V5 6000 (which, frankly, looks like a joke-card that someone created in Photoshop) can be expected to comfortably blow the GeForce2 away. But you can expect it to cost twice as much as the 5500, which means its Aussie pricing will probably exceed $AU1300. Just a shade out of most people's price range.

ATI's upcoming Radeon 256 chipset will probably be faster than a GeForce2. If it comes out mid-year as they're promising, and doesn't cost an amazing amount of money, and isn't hobbled by bad drivers (which have been something of an ATI tradition...), it'll be a great piece of gear. If it comes out much later, though, it'll be an also-ran; there's a new NVIDIA chipset, code-named NV20, due around September.

If you're shopping for a superpowered 3D card at the moment, though, it seems the only real choice is between different models of GeForce2. And the Leadtek card's a good option.

Put up against the less impressive 3D cards that most game players are still using, the GeForce2 won't just win. It'll kick sand in their faces, pull their underpants over their heads, jam them into a garbage bin and roll them down a steep hill.

If you find that to be a pleasing image, then a GeForce2 is for you.

And now, please excuse me. There are games to be played.

