ASUS AGP-V3800TVR, Diamond Viper V770 and Leadtek WinFast 3D S320 II TNT2 graphics cards

Review date: 19th May 1999.
Last modified 03-Dec-2011.

 

NVidia's TNT2 chipset has well and truly arrived, even here in backward little Australia. That's good. But the hapless consumer hunting for a fast 2D/3D graphics card is likely to find themselves wondering which of several apparently rather similar TNT2 cards is actually right for them. That's bad.

V3800TVR graphics card

To simplify the situation, I've checked out ASUS's all-singing, all-dancing AGP-V3800TVR...

S320 II graphics card

...Leadtek's mid-featured WinFast 3D S320 II...

Viper V770 graphics card

...and Diamond's bargain basement OEM Viper V770. They cost, respectively, $445, $289 and only $245 (all prices in Australian dollars). What's the difference, and is it worth the money?

Bang per buck

When I reviewed Diamond's Viper V550 (read the review here), which uses the original TNT chipset and has 16Mb of video memory, it cost $469. It's down to $250 as I write this, and can only drop further.

The TNT2 chipset is, generally speaking, like the original TNT, only faster. When NVidia was spruiking the TNT, they claimed a lot of very exciting benchmark figures which, as it turned out, the actual chipset couldn't manage, simply because it couldn't be clocked high enough. It was fast, but not that fast.

The TNT2 is that fast.

The S320 II I checked out has 16Mb of video RAM, the same as most TNT boards, is roughly 40% faster, and is roughly 40% cheaper than the original TNT boards were when they were as young as TNT2 is now. Which makes it, in bang per buck terms, about twice as good value. The other two cards, considering their specification levels, offer similar value.
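If you want to check that "twice as good value" claim, the arithmetic is simple. Here's a rough sketch in Python, using the approximate figures above:

# Rough bang-per-buck arithmetic, using the approximate figures above.
# Value is taken as speed divided by price, normalised to the original TNT.
tnt_speed, tnt_price = 1.0, 1.0      # original TNT at launch, as the baseline
tnt2_speed = tnt_speed * 1.4         # "roughly 40% faster"
tnt2_price = tnt_price * (1 - 0.4)   # "roughly 40% cheaper"

value_ratio = (tnt2_speed / tnt2_price) / (tnt_speed / tnt_price)
print(f"Bang per buck ratio: {value_ratio:.2f}x")  # about 2.3x - "about twice"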

The future looks bright.

What you get

All three cards are AGP boards. This, of course, means you need an AGP slot on your motherboard, and an AGP-aware operating system, like Windows 98. Windows 95 Release B will do, with the right AGP/USB drivers installed for your motherboard, but 98 makes it easier.

The two main kinds of TNT2 are differentiated by clock speed; there's the plain version, used by the three cards in this comparison, which has a default core graphics clock speed of 125MHz and a memory clock speed of 150MHz. There's also the "Ultra" version, clocked by default at 150 and 183MHz respectively. I review the Ultra version of the V3800 here.

This is of special interest to users because, as with other recent graphics chipsets, the TNT2's clock speeds can be altered easily with a simple tweaking utility. Of which more later.

The two more expensive boards have a heatsink-plus-fan cooler on the toasty-warm main TNT2 chip, and the ASUS one is removable, via a couple of spring-pin doodads, so you could easily upgrade it if you liked. The V770, like the earlier V550 (reviewed here), has only a plain heatsink; it's chunkier than the heatsinks on the other two cards, but it's unlikely to cool the chip as well, even in a very well ventilated computer, let alone the cable-packed undercooled sauna that is the average PC case. Adding a fan to the V550 is very simple (I describe how to do it here); the V770 doesn't have the handy mounting holes, but shouldn't be much more complex to modify. I'll put up a page on the subject shortly.

Aside from the chip coolers, the only card that deviates very visibly from NVidia's reference design is the S320 II, which has a toroidal choke sticking up next to the RAM; this might account for the fact that I found the card to be capable of a little more speed than the others.

The ASUS AGP-V3800TVR is so much more expensive than the other two cards because it's a version of the V3800 with more connectors, and more memory. The versions of the other two cards which I checked out for this comparison each have 16Mb of video memory. The V3800 is available with 16Mb - indeed, there are versions of it with as little as 8Mb of video memory and nothing but a VGA connector on the back - but the V3800TVR I checked out has the VGA connector, 32Mb, video in and out connectors (TV-type video, not computer-monitor-type video), and a connector for optional 3D glasses, to boot. The VR-100 LCD shutter glasses to match are now available in Australia, but I'm afraid they more or less suck. I review them here.

Whether more than 16Mb of video memory is actually useful is something I'll deal with later, but it's a fact that a lot of buyers will go for 32Mb TNT2 boards just for the comforting feeling that nobody else's, ah, memory, is bigger.

The S320 II I checked out has 16Mb of video memory and video out, but that's where its fancy features stop. The cheapo OEM V770 has 16Mb and no connectors beyond the standard VGA jack. But one simple monitor connector is all that a lot of buyers need or want; a cheap card that doesn't skimp in other areas is just the ticket for the average game-player.

Both of the cards with video connectors also come with an S-Video cable, a composite cable, and an S-Video to composite adapter lead, which won't work on anything but a video card for reasons I'll explain below.

The only TNT2 option missing on all of these cards is digital flat panel output - TNT2s can have a dedicated DDWG (Digital Display Working Group) compliant output for LCD panel monitors of up to 1280 by 1024 resolution, which bypasses the inelegant digital-to-analogue-to-digital system that gets the picture onto most LCD screens today. Flat panels are still very expensive compared with conventional monitors of equivalent size, but the price is coming down.

All three cards come with a CD-ROM containing driver software, supporting utilities, documentation and so on. The AGP-V3800TVR and the S320 II are both proper boxed retail cards, with a software bundle. There is, of course, a retail version of the V770, but the one I reviewed ain't it; all the cheap OEM V770 comes with is a driver disk and a little bag containing some jumper blocks, with an explanatory note on how to use them. The other two cards are set, by default, to use AGP 4X, the new super-fast AGP speed that'll arrive later this year. Motherboards that don't support 4X - which, at the moment, means all of them - should automatically fall back to whatever speed they do support. Apparently, though, some older boards don't work with auto-configuring 4X cards, so the V770 has you set the mode manually: it comes set to 2X, and shifting a couple of jumpers selects 4X.

Software

The V770's lack of a software bundle could be, for many buyers, a selling point; the games and utilities you get with most 3D cards are often elderly and/or just plain lousy, and it's nice to be able to save a few bucks by not paying for the extra coasters.

The V770's CD, however, does contain Zoran's SoftDVD, a passable software DVD player, and also gives you the option to buy some cut-price games if you actually want them. Diamond's "Gamezone" program lets buyers who live in the USA, Canada or Europe buy two games (choosing from Asteroids, Shogo, SiN or Fighter Squadron) for $US12.95, and pick up more (from the same list, but with the excellent Starsiege: Tribes added as well) at a smaller but still decent discount. I wholeheartedly approve of this approach; it may be less straightforward than including the games in the box, and it may be useless for Australians like me at present, but it means that people who really do want a cheap bundled game or two can get them, without forcing the rest of the buyers to get the games too.

The V3800, besides the utilities on its driver disk, comes with a couple of games - Turok 2: Seeds of Evil and Extreme-G 2.

The S320 II comes with something more closely resembling the traditional Video Card Software Bundle Of Questionable Value. If you like playing with mystifying "3D world creation" software and cut-down 3D rendering packages, there's a couple of 'em here to amuse you.

It's not all dross, though - you get the PowerDVD software DVD player, which isn't state-of-the-art but still works OK, and there's also Digital Video Producer, an acceptable home-level video editing package - although, since the S320 II has no video capture facility, you'll need some other way to get your video into the computer. Given the low price of the card, the Leadtek bundle is OK.

Drivers, drivers, drivers...

In most areas of human enterprise, you can expect a product that's made it to market to be more or less bug-free. In the computer world, the extreme complexity of the products and the cut-throat competition to be first to market with hot new gear means that the best manufacturers aren't the ones that put a working product on the market first, but the first to make their initially somewhat broken product work properly via patches, BIOS upgrades and driver updates. It's widely accepted that the v1.0 drivers for any hot new video card will have problems, and this is a good reason to hold off for a month or two, whereupon you'll be able to buy the same card for less and download some drivers that actually work.

Fortunately, any TNT2 card should work with NVidia's standard Detonator drivers, which support TNT, TNT2 and the low-end Vanta and Model 64 (I review a Model 64 card here), and upon which all of the other drivers are based.

The Detonator drivers also include support for the extra instruction sets of current AMD K6-2 and Intel Pentium III processors, and the option to turn this support off; some users swear they get higher frame rates with plain K6 or P-II processors when they turn off the extra instruction support, but I and every proper benchmarker I've ever met disagree. The "disable support for enhanced CPU instruction sets" option has been described as "the placebo switch".

The existence of the Detonator drivers is a lifesaver for cards like the ASUS V3800, whose stock drivers are nonfabulous.

Show-off interface

The ASUS driver CD has everything that opens and shuts, and lets you install not just the drivers but also updated AGP drivers, DirectX 6, the Live3800 video capture utility, Acrobat Reader for the on-disc manual and the Tweak Utility that lets you overclock the card. The opening screen of the installer program lets you see the specs of the various versions of the V3800.

The sizzle is thus quite impressive, but the steak is not so great; the v1.00 V3800 drivers stunk for OpenGL Quake 2. Things showed through other things, which meant powerups were visible through walls, and objects like the player's gun strobed hideously. The stock drivers achieved slightly better frame rates at a given clock speed than the Detonator drivers, which had none of the nasty glitches, but I suspect the speed improvement was because the broken ASUS drivers weren't actually drawing as many polygons.

Fortunately, the ASUS Taiwan site now offers updated drivers, here, which save you from having to use the Detonator drivers. The Detonators are fine for plain 2D/3D, but they don't let you activate the ASUS board's video input, or the optional 3D glasses - which are bundled with higher spec versions of the card, and which I review here.

If you buy the ASUS card, do not use the stock drivers. Download the better ones before you even install the card.

Diamond's standard V770 drivers seemed to work fine, although they really didn't do anything that the plain Detonator drivers didn't. The Ultra version of the V770 comes with a modified version of InControl Tools that lets you tweak clock speed with a "Turbo Gauge" feature, but for the regular version I checked out you'll need an outboard utility like the shareware PowerStrip, which you can download from EnTech here, or the excellent little free utility TNTClk.

Diamond's drivers caused a strange moment of system paralysis whenever I brought up Display Properties, although I haven't verified this on another machine and it may have been due to the frantic video driver shuffling I'd been doing all day. Since first putting up this comparison, I've received feedback from another user with the same odd problem, so it would appear it's not just me. The drivers have since been updated; check out the V770 page here to see if the new ones fix the problem.

The extra InControl Tools features of the Diamond driver could be handy. Earlier versions of InControl Tools were just another unremarkable taskbar utility for Windows 95/98, with resolution setting, keyboard shortcuts, quick access to applications and so on. The current version, InControl Tools 99, hooks into the standard NVidia driver feature (which the plain Detonator drivers have) that lets you set 3D tweaks separately for every game or application that uses a TNT2 board's 3D mode (the driver automatically notices new 3D programs when you run them). InControl Tools 99 lets you export the particular tweaks you set up for a given program so you can give them to another user. This gives expert users an easy way to "instant-tweak" other systems, and to standardise frame rate benchmarks, and should also be a time-saver if you've spent time twiddling your settings and need to reinstall Windows.

Leadtek's card was, I think, the very first TNT2 to be released by anybody, and that, plus the fact that the Leadtek driver team seem to be unusually prolific, means that there have already been two fresh driver versions and a BIOS update for the S320 II. The current version has built-in overclocking sliders and seems bugless to me.

To repeat - it's fine to buy a TNT2 card like the V3800, whose stock drivers are dodgy, and just use the Detonator drivers until the manufacturer comes up with something better. The NVidia drivers actually have just about all of the features that the brand-name drivers have, although the brand-name versions may make some of them easier to use. Any actual extras are often not much more than window dressing, or can easily be replaced by utilities like PowerStrip.

Detonator setup interface
The standard Detonator setup interface.

Video hits

Every graphics card, by definition, has "video out" capability, but when you see the phrase on a specification list it means composite and/or S-Video output, which you can plug into a TV or VCR or projector, or a variety of more esoteric devices. The two more expensive cards in this comparison both have video out, and the ASUS board has video in, as well.

In both cases, the video jacks are an "extended S-Video" design, with a connector that'll work as a straight S-Video jack (S-Video and composite cables are included with both cards), and an adapter lead that uses normally fallow pins on the S-Video mini-DIN connector to give you plain composite output as well. This means that the adapter lead will NOT give you composite output from a plain S-Video connection on a camera, though it shouldn't do any harm if you plug it in by mistake.

Both video-capable cards have PAL and NTSC compatible output (you select the video mode with jumpers), so they'll work no matter what your local TV format is. You can play your games on a big screen, or record graphics to tape. For most users this isn't actually much of a selling point, as the resolution of the average TV is crummy. You can't use output resolutions above 800 by 600 when connected to a video display, and 640 by 480 is more like what most TVs display (for the lowdown on video terminology, by the way, check out my guide here).

If you've got a giant screen TV or, better yet, a projector, playing games on it can still be fun. Low res though it is, a wall full of Quake is darned impressive. There are other applications, too; use Winamp to play music for a party and pump the output of a funky visualisation plugin to a cheap rented projector (or dump the whole performance to a five hour videotape!), or use another video card with capture ability in a separate PC to record game action so you can make a movie of it for demonstration purposes (or, again, dump it to tape, or get adventurous and try to loop the video out back into the video in and Test Your Multitasking Ability...), or knock up title graphics for your hot new indie movie on the cheap.

A lot of better projectors these days have VGA in, though, and most users don't have a big enough TV, or enough tolerance of relatively-chunky graphics, to make video out useful for anything. If you don't need it, don't pay for a graphics card that's got it.

Video in, such as is provided by the V3800, is another matter again.

The V3800 handles video capture with the Live3800 utility. Live3800 works passably well, but it's got some very rough edges. It's set up as if it's meant to be configured with the output directed to a video monitor, which doesn't make a lot of sense, because that's not actually how it works. And it doesn't have a convenient alternative setup mode, so doing things like changing video mode (Live3800 defaults to NTSC) requires clicking a menu button and navigating "on screen menus" strongly reminiscent of those provided by current VCRs.

Live3800 also defaults to the none-too-efficient Cinepak codec, which has the advantage of ubiquity (if you've got Windows 95 or better, you've got Cinepak), but which makes big, big AVI files.

I switched Live3800 to use the Intel I.263 codec, which can make much smaller files (how does an improvement of a factor of 20 grab you?), and it captured just fine. But while Live3800 remembers the video resolution and colour depth you set, it forgets the codec when you close it, so you have to select it again every time you use the program.

It also appears to handle the video files very poorly - if you capture a new clip to an existing file, Live3800 just overwrites the file from beginning to end, and if the new clip ends up smaller than the old one, the file stays the same size. It took me a moment to figure out why a high-quality I.263 clip that should have been less than 1/15th of the size of the previously captured Cinepak version was, instead, EXACTLY the same size.
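If you're wondering how that can happen, here's a minimal sketch of the failure mode (the filename is hypothetical; this just demonstrates what writing to an existing file without truncating it does):

# Demonstrates why a shorter capture "overwriting" a longer file leaves
# the file size unchanged: opening a file in read/write mode and writing
# from the start doesn't truncate the old data past the new end point.
import os

with open("capture.avi", "wb") as f:    # stand-in for the old Cinepak clip
    f.write(b"x" * 15_000_000)          # 15 million bytes of "old" data

with open("capture.avi", "r+b") as f:   # open WITHOUT truncating
    f.write(b"y" * 1_000_000)           # write 1 million bytes of "new" data

print(os.path.getsize("capture.avi"))   # still 15000000 - the old tail remains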

That said, if you've got a reasonably powerful computer to handle the real-time compression, this is a worthwhile basic video acquisition option. You can use three capture resolutions, the middle one of which is feasible at 25 frames per second with most codecs on a 400MHz or higher machine and the top one of which offers better-than-VHS resolution, although I doubt you'll be able to capture more than 15 frames per second with any current hardware. It's not a pro video capture solution, but it's not blooming bad for the money.
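For a feel for the data rates involved, here's some back-of-envelope arithmetic; the capture resolution and the compression ratios are illustrative assumptions, not Live3800's actual figures:

# Back-of-envelope capture data rates. The resolution and compression
# ratios here are illustrative guesses, not Live3800's real numbers.
def raw_rate_mb_per_sec(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps / 1_000_000

raw = raw_rate_mb_per_sec(384, 288, 2, 25)   # a middling capture, 16 bit colour
print(f"Uncompressed: {raw:.1f} MB/s")       # about 5.5 MB/s
print(f"Cinepak-ish  (10:1): {raw / 10:.2f} MB/s")
print(f"I.263-ish   (200:1): {raw / 200:.3f} MB/s")  # a factor of 20 smaller again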

Setting up

If you've got an AGP/USB compatible operating system and motherboard, installing an AGP graphics card like a TNT2 is simple enough. In Windows 98, you just set the system to use the standard plain VGA driver, shut down, remove your old video card, plug in the new one, restart, put up with whatever amount of autodetection nonsense Windows chooses to favour you with and then run the setup program for the drivers you really want. If you've got Internet access, the rule of thumb is to not even look at the drivers provided on the included CD (which are likely to be the dreaded v1.0...), and go straight to the manufacturer's site for the latest versions.

Diamond, at the time of writing, doesn't have any updated TNT2 drivers available for download at all, but ASUS's updated drivers are on their Taiwan site (as mentioned above), and Leadtek's driver download page has their latest S320 II software ready for you to snarf.

Testing

All TNT2 boards support outrageous 2D resolutions - a maximum of 2048 by 1536, which not many screens can display at all, let alone clearly, and high refresh rates in more realistic resolutions. All but the highest resolutions are useable for 3D, too; unless your monitor is larger than 24 inches, you'll be able to use all of its phosphor and then some.

My lazy and lightweight benchmarking effort consisted of Quake 2 framerate tests and a quick buzz with WinTune 98, a mini-benchmark program which is to Ziff-Davis' monstrous WinBench as a roller-skate is to a Star Destroyer but which produces quite reliable and repeatable figures and is, I think, adequate for testing two closely related chipsets like the TNT and TNT2. I ran the tests on my 450MHz Celeron A-powered computer, with an old model ASUS P2B motherboard. I didn't use any special tweaks in Quake 2, other than leaving vsync in the off setting.

At a given clock speed, and using NVidia's January 27th Detonator drivers, all three TNT2 boards test out at exactly the same speed, or as near to it as the result accuracy permits. Which is what you'd expect, since the cards use exactly the same chipset. The first board I benchmarked was the S320 II, and I tested it against my old Viper V550, with the V550 running at its stock speed (90MHz core, 110MHz memory). The Quake 2 results:

Quake 2 framerate results

The S320 II in 32 bit mode is uniformly faster than the older TNT board in 16 bit, beating it by roughly 25% to 50% across the board. These figures should improve considerably more with the release of the 3.21 patch for Q2; apparently a change in the implementation of multitexturing with the TNT2 drivers has caused a significant performance hit, which is why Voodoo 3 cards beat TNT2 handily for Q2, but more or less level-peg in other tests.

Even at the current performance level, the TNT2 is no slouch.

Quake 2 is a highly CPU-intensive game, and the advantage delivered by the newer card decreases with increasing CPU load - the two standard Q2 deathmatch demos, massive1 and crusher, are, respectively, very demanding and very, very, very demanding of the CPU. Single player Quake 2 doesn't push as many polygons around, and so the faster hardware shines more brightly - but the crusher demo represents about the heaviest load any Q2 game will ever impose, and even in 32 bit mode the TNT2 keeps the frame rate above 30fps in 1024 by 768.

In other words, this card will play Quake 2 at more than acceptable speed at resolutions higher than 1024 by 768, especially if you're happy with the very marginally lower image quality of 16 bit rendering. In 16 bit, with a 450MHz or better processor, 1280 by 1024 will run nicely, or maybe even 1600 by 1200, if you've got a monitor that can display it acceptably. 1600 by 1200, even for games, is too high a resolution for pretty much any 17" monitor on the market, and most 19" monitors too. Any half-decent current model monitor can accept a 1600 by 1200 signal and put an image on the screen, but they don't actually have fine enough dot pitch for the higher resolution to really look any better than a lower one. When I play 1024 by 768 Quake 2 on a 15" screen, I have a hard time reading the message text. 1600 by 1200 on a 17" is worse.

WinTune results

The WinTune results show that the TNT2 doesn't seem to be any faster for plain 2D 16 bit video at 1024 by 768 than the original TNT. Since the original TNT is a great deal faster for 2D than any ordinary PC user could possibly need, this is not a particularly worrying problem. In 32 bit colour, the TNT2 has a 20% advantage over the older chipset, and its Direct3D and OpenGL speed scores vary from 36% to 64% faster than the TNT's.

And this is just when it's running at stock speed.

Overclocking

Overclocking lets you wring a bit more speed out of your hardware, by running the chips faster. I'm a fan of it, provided you can get a real performance increase without losing reliability. The faithful Celeron that powers my test machine has been chugging away at 450MHz, instead of its sticker speed of 300MHz, for months. A 50% speed increase for free is nothing to sneeze at.

Leadtek provide an overclocking utility as an integrated part of their video driver - in an earlier version of the drivers there was a separate oc.zip file to download, but the current drivers have their "Speed Runner" overclocking sliders built in. The ASUS drivers have a separate Tweak Utility which looks like PowerStrip's unsophisticated country cousin, and, as mentioned above, the V770 Ultra card's version of InControl Tools (which I haven't seen) lets you overclock the Diamond card, too.

Leadtek's earlier drivers set the S320 II's core and memory clock speeds, respectively, to 140 and 160MHz. The current ones use the more pedestrian 125 and 150MHz speeds. The sliders go to 160 and 175MHz, and I'm happy to say that the S320 II I checked out exhibited no problems with the sliders cranked up all the way. Compared with the 140 and 160MHz stock speeds, this gave 10% higher framerate in Quake 2 for the simple demo2 test. This dropped to 8% and 3% for the more CPU-dependent massive1 and crusher tests. WinTune reported no improvement at all in 2D speeds, and 9% and 5% respectively for OpenGL and Direct3D. At this speed, the card ran stably for hours - but this is on a cool night, in my very well ventilated case (see how I made The Wind-Tunnel PC here). I am unsure how successful this much overclocking would be in a normal, poorly-cooled PC case.

The more conservative default speed set by the current drivers cuts another 10% or so off the framerate, so the maximum-overclock setting gave a more dramatic improvement. Using EnTech's PowerStrip, I managed to wind the card up to 165MHz core speed, at which it ran reliably, but anything more started to cause visible glitches in 3D mode, which are the harbingers of a crash.

The V3800 didn't overclock quite as well. Its default setting is also the standard 125/150MHz, and its Tweak Utility lets you wind up the clock speed, although it's not nearly as comprehensive a tool as PowerStrip. Then again, if you don't register PowerStrip, it resets some settings to their defaults every now and then.

The most I managed to wring out of the V3800, with any tool, was 160/175MHz, at which speed it beat the S320 II's 160/175MHz Quake 2 demo2 results by about 5% with its dodgy, ugly standard drivers, or by about 0% with the Detonator drivers. Glitches that mean polygons don't get drawn can do wonders for performance!

The fan-free V770 could be expected not to overclock as well as the active-cooled cards, and such was indeed the case; PowerStrip let me squeeze it up to only 145/165MHz. Some testers have reported much higher memory clock speeds, up to a rather implausible 200MHz; nothing over 165MHz was stable for me. The V770's performance at this slightly lower speed was what you'd expect; about 7% off the pace of the S320 II. Big deal, I say; look at the price!

I'm inclined to think I maybe got an unusually good S320 II, or perhaps that big, non-reference-design inductor on the S320 II helps high speed performance significantly. If I were you, I wouldn't make a buying decision based on my sample of only one of each card, especially since there's really not much in it anyway. It's a basic rule of computing that any performance increase below 10% isn't noticeable, but you should be able to get more than that out of pretty much any standard TNT2 whose stock drivers run it at 125/150MHz speed.

The latest beta version of PowerStrip can, theoretically, wind both the graphics and memory clocks up to 200MHz. It is entirely possible that a computer sitting outdoors in Siberia on a windy first of January at midnight will actually not exhibit a Festival of Video Glitches when turned up this far.

Frankly, even without overclocking, the TNT2 may be faster than you need, if you're a Quake 2 player who doesn't own a monstrous monitor. Fortunately, games that tax your video system more than Quake 2 are already plentiful, and the more demanding Quake 3 is just around the corner!

16Mb or 32?

Current video cards are reaching new memory capacity heights. A few years ago, 32Mb was plenty of RAM for a whole Windows computer. Now that much memory comes on a video card, and it's a lot faster, too.

But are the 32Mb TNT2 boards like the V3800TVR/32Mb actually worth the extra money over 16Mb ones like the S320 II and V770 I played with?

If you ask me, no.

The TNT2 is an AGP 4X board, which means it can pull texture data from main memory via the Accelerated Graphics Port at more than 900 megabytes per second, provided your motherboard supports AGP 4X as well. AGP 2X is all that current motherboards support; we'll have to wait for Intel's Camino chipset, in the third quarter of 1999, for 4X.
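The bandwidth arithmetic, for the curious - AGP runs a 32 bit data path from a 66MHz base clock, with one, two or four transfers per clock depending on the mode:

# AGP bandwidth arithmetic: 66MHz base clock, 32 bit (4 byte) data path,
# with 1, 2 or 4 transfers per clock depending on the mode.
BASE_CLOCK_MHZ = 66.6
BUS_WIDTH_BYTES = 4

for mode, transfers in (("AGP 1X", 1), ("AGP 2X", 2), ("AGP 4X", 4)):
    mb_per_sec = BASE_CLOCK_MHZ * BUS_WIDTH_BYTES * transfers
    print(f"{mode}: about {mb_per_sec:.0f} megabytes per second")
# AGP 4X: about 1066MB/s - comfortably "more than 900 megabytes per second".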

Assuming you've got AGP 4X, having lots of texture memory on board isn't terribly important. PCI and slower AGP graphics cards with too little memory to store all of the textures for a given game level can cause significant pauses as they suck texture data from the computer as needed; with AGP 4X, the performance hit should be much smaller.

Even AGP 4X is slower than genuine on-board storage, but it should reduce texture loading pauses sufficiently that only the pathologically picky will notice them.

If you've only got an AGP 2X motherboard, an extra 16Mb of video card memory will help more. On my old P2B, texture-load pauses with the 16Mb TNT2 card are as bad as they were with the old 16Mb AGP 2X TNT - which is not too surprising. But I don't care very much, because the pauses are still not bad enough to bother me. I think the significantly lower price of the 16Mb boards more than makes up for the small, albeit noticeable, performance loss. Of course, games that don't need more than 16Mb of texture memory are not unknown, and for them there's no difference between the RAM levels at all.

3Dfx's competing Voodoo 3 boards have only 16Mb of RAM, and don't support AGP texturing at all. They share this limitation with a lot of 3D cards, which may plug into an AGP slot but aren't actually any faster at anything than PCI versions. So far, AGP's fancy features have in fact been something of a bust; 16Mb of onboard storage is rather a lot, and has proved to be more than acceptable for the vast majority of gamers.

As far as 2D performance goes, there's no reason at all to pay for the extra memory. The 16Mb TNT2 can output 1920 by 1200 32 bit video at a 75Hz refresh rate. You need at least a 24 inch monitor to make use of that. Any questions?
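The frame buffer arithmetic backs this up (a quick sketch, ignoring the small amount of extra memory the card uses for things like the mouse cursor):

# How much video memory a 2D desktop actually needs: just the frame buffer.
def framebuffer_mb(width, height, bits_per_pixel):
    return width * height * (bits_per_pixel // 8) / (1024 * 1024)

print(f"{framebuffer_mb(1920, 1200, 32):.1f} MB")  # about 8.8MB - barely half of a 16Mb card
print(f"{framebuffer_mb(2048, 1536, 32):.1f} MB")  # the TNT2's 2D maximum: about 12MB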

Q3Test results

The Q3Test preview version of Quake 3 Arena asks considerably more of the video subsystem than Quake 2 does - partly because it's a great deal prettier, and partly because it's not finished yet and thus uses more power than the final version will.

Q3Test screenshot
Q3Test on the S320 in 16 bit mode, 1024 by 768.

In 1024 by 768, with every pretty-feature turned on but textures still set to the default "compressed" setting, Q3Test was playable but noticeably slow in 32 bit colour. Dropping to 16 bit mode caused practically no loss of image quality, and the framerate hopped up to a more than acceptable speed. Increasing texture quality to 16 bit made the sky - which has transparent parallax scrolling clouds - look a little better, but made no perceptible difference to anything else. Transparent weapon effects, like the sky, are improved by using more bits for textures and the whole display, but weapon effects happen so quickly that I, for one, can't tell the difference without grabbing screenshots.

Increasing texture quality to 32 bit caused a huge performance drop on the two 16Mb cards, as clearly the total storage needed for textures now exceeded the on-board storage of the card, and "texture thrashing" disk access and pauses happened every time I entered a room with textures in it that the card didn't have on board any more. Again, the increase in image quality, even with the game set to 32 bit overall, was very slight. The 32Mb ASUS board kept cruising along, silky smooth, with 32 bit textures, but I wouldn't pay the extra just for that.

(Start of purely editorial opinion:)

By the way - is anybody else confused by people who apparently spend a third of their time tweaking their system to get 80 frames per second in the 3D game of their choice, another third of their time peering minutely at comparative screenshots of these games (which, by definition, represent something they're only going to see for 1/80th of a second before another frame comes along), and the remaining third of their time posting to Usenet about why barely-visible differences in static shots clearly make A Big Difference in their enjoyment of the game?

If you consider the realism of a parallax-scrolled variable-transparency sky to be an important feature in your 3D card, I can only presume you spend a fair amount of time in your games staring at the sky. You can save time and money, and be shot a lot less, by just leaving your house and staring at the sky for real.

Well, you'll probably be shot a lot less.

It depends on where you live.

(End of editorial opinion. Thank you for your patience.)

Q3Test, unlike most current games, can set its colour depth independently of the depth set for the Windows desktop. Earlier TNT2 drivers caused the game to slow down gigantically when you changed colour depth - selecting menu items took some tens of seconds, and actually playing was completely impossible. The more recent driver releases have cured this problem. As driver problems go, it wasn't much of a one.

I also encountered some odd texture errors while using the older Detonator drivers with Q3Test:

Q3 on brown acid

Imagine playing a game that looks like this, but with all of the deeply wrong textures pulsing away like crazy as they change their opinion about exactly how wrong they want to be. It's a trip, I'm tellin' you. There seemed to be no rhyme or reason to what caused the problem; sometimes switching to a different texture depth fixed it, sometimes just wandering around a bit caused things to come right. If it happened, it happened from the moment I loaded the level; it never came on unexpectedly in mid-game.

I'm told that this problem actually can just suddenly start happening in the middle of a game, but fortunately I've also now been told the cure - bring down the console with the tilde (~) key and type "r_drawstrips 1". Voila. This was with the original release of Q3Test, though; the later versions don't seem to do it.

Overall

The TNT2 chipset is not, overall, the fastest PC 3D video gaming hardware currently available. That honour goes, by a small margin, to 3Dfx's Voodoo 3. TNT2 is faster for some things, but Voodoo 3 is faster for most things, including Quake 2, and will probably still be faster by a small margin when Q2 v3.21 comes out. But the TNT2 is by no means far behind, has a bigger feature set - Voodoo 3 is 16 bit only for 3D - and is, at the moment in Australia, considerably cheaper.

The local Voodoo 3 distributors here in Australia would appear to be importing cards one by one on Learjets. I can think of no other reason why the Leadtek card, which costs $US130 in the USA, should cost $289 here (for reference, the Australian dollar is presently worth about 67 U.S. cents), while a base model Voodoo 3 2000 board, which sells for the same price in the USA as the Leadtek board, should cost $379 locally. The higher-spec Voodoo 3 boards are even more excitingly expensive at the moment - the 3500 model, $US250 in the USA, lists for a whopping $650 here! No doubt this will change in the very near future, but if you want to buy a video card right now, there's really no contest on price.

The TNT2 has quite a few features which the Voodoo 3 lacks, but which actually don't make a huge amount of difference. 32 bit rendering, for instance, can improve the look of games that use fog and smoke effects, but at the cost of a significant performance hit. If you're into single player, 32 bit rendering on the brutally fast TNT2 will probably be perfectly acceptably fast at 1024 by 768, or at even higher resolutions. In multiplayer, though, you'll get more frags if you drop to 16 bit. In games like Quake 2 that barely benefit visually from 32 bit rendering, there's no point using it, and Voodoo 3 doesn't lose out at all.

The TNT2 also supports textures up to 2048 by 2048 pixels in size, versus a mere 256 by 256 for Voodoo 3. This sounds impressive, until you find out that games that actually use textures larger than 128 by 128 are practically unknown, because enormous textures will hopelessly bog even super-fast cards. They eat up texture memory and processing power like nobody's business. Video cards will have to become a lot more powerful before large texture support becomes a useful feature.
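The memory arithmetic makes the problem plain, assuming plain uncompressed 32 bit textures with no mipmaps:

# Memory consumed by a single uncompressed 32 bit texture, without mipmaps.
def texture_mb(size):
    return size * size * 4 / (1024 * 1024)

for size in (128, 256, 1024, 2048):
    print(f"{size} by {size}: {texture_mb(size):.2f} MB")
# 128 by 128:   0.06 MB
# 256 by 256:   0.25 MB
# 2048 by 2048: 16.00 MB - one maximum-size texture fills a 16Mb card by itself.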

The only important feature Voodoo 3 has that TNT2 lacks is Glide support. 3Dfx's 3D API was important when accelerated 3D cards were a novelty and the original Voodoo Graphics chipset was state of the art, but pretty much everything on the shelves today works perfectly with OpenGL or Direct3D (even Starsiege: Tribes does, now, with the 1.6 patch from tribesplayers.com), and so Glide compatibility is a much less important issue. If you want to play old, Glide-only games, though, TNT2 is not for you.

On every other front, though, it's TNT2 all the way. When local Voodoo 3 pricing returns to the land of the sane, TNT2 and Voodoo 3 will be essentially neck and neck - but if you buy a cheap TNT2 right now this minute, you won't be sorry.

For my money, of these three, it's a toss-up between the S320 II and the Viper V770. If you want video in, the ASUS card is the way to go - maybe the 16Mb version, to save a bit of money - but I don't need it. The S320 II has a fan and good driver support; the V770 is dirt cheap, has OK drivers and works fine with the Detonator drivers too. If you can find a no-video-out S320 II for about the same price as the V770, buy it; otherwise, the Diamond product gets my nod.

As I said before, though, there's not much in it. I don't think there's a single genuinely bad TNT2 card on the market at the moment.

Ain't life grand?

 

ASUS V3800TVR/32Mb

Pros:

  • TV in and out
  • Tons of memory
  • More features than most users need

Cons:

  • Dodgy drivers

Diamond Viper V770

Pros:

  • Great price
  • No frills
  • Jumper AGP setup (handy for older machines)

Cons:

  • No cooling fan

Leadtek S320 II

Pros:

  • Mature drivers
  • Easy to overclock
  • Nice and cheap

Cons:

  • Not much!

The PCI alternative

The cards reviewed here are all AGP devices, but ASUS also make a PCI version of the V3800, for those unfortunate enough not to have an AGP slot - or for those who have an AGP slot, but not one with a hefty enough power supply to run one of today's amp-hungry video cards.

Some people decry PCI versions of AGP cards because they think the PCI version must be slower. This isn't so.

AGP, and particularly AGP 2X and the cutting edge AGP 4X, is faster than PCI for data transfer. But, with 16 or 32Mb of on-board RAM, TNT-series cards generally don't need to transfer data very often; in most 3D games, the RAM gets stocked with all of the texture data at the beginning of each level, and this happens at the mercy of the hard drive, which limits the maximum possible data transfer rate more than PCI ever could.
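Some rough transfer-time arithmetic shows why the bus isn't the bottleneck at level load; the hard drive figure here is an assumed, era-typical sustained read rate, not a measurement:

# Rough level-load transfer times for a 16MB texture set. The hard drive
# figure is an assumed late-90s sustained read rate, not a measured one.
TEXTURE_SET_MB = 16
rates_mb_per_sec = {
    "Hard drive (assumed)": 8,
    "PCI bus": 133,
    "AGP 2X": 533,
}
for device, rate in rates_mb_per_sec.items():
    print(f"{device}: {TEXTURE_SET_MB / rate:.3f} seconds")
# The drive takes a couple of seconds; either bus takes a small fraction
# of one. The bus is not the bottleneck.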

Games that generate lots of textures on the fly, like Starsiege: Tribes, or games that have a gigantic texture set for a given level that's notably bigger than the graphics card's on-board storage and thus requires lots of "flogging" during play, ought to benefit from the faster transfer of data from main memory to graphics card memory that AGP enables.

In the real world, though, this isn't necessarily the case. Tribes languished with pathetic performance on TNT and TNT2 cards for some time, allegedly because the NVidia drivers just didn't do AGP texturing right. At the same time, PCI 3Dfx cards worked just fine, because the Glide drivers worked much better than the OpenGL ones.

And very few games have greater-than-16Mb texture sets; even the Quake III Arena Demo Test, released a few days ago as I write this update, doesn't overflow the memory of a 16Mb card with any of its four standard levels, even if you set everything to maximum quality.

If you've got an application that requires AGP texturing, a PCI card ought to be slower. But you probably don't have such an application, and the drivers may well trip you up anyway. The card's chipset is no slower, and the overall performance for the vast majority of tasks is identical.

TNT2 Ultra review

I've also checked out ASUS' AGP-V3800 Ultra, their highest-spec version of the V3800. The review is here. It's faster than any of the cards in this comparison - but is it worth the money?

Geforce review!

NVidia's successor to the TNT2 chipset is the much faster GeForce. I review Leadtek and ASUS GeForce cards here.

Glossary

AGP: The Accelerated Graphics Port is based on the PCI standard, but clocked at least twice as fast to accommodate the demands of 3D graphics. AGP lets the graphics board rapidly access main memory for texture storage.

Codec: Short for compressor/decompressor, a codec is software or hardware for compressing and decompressing data. In the narrow definition used here, codecs are just software, and they're used for compressing and decompressing video and audio files.

Colour depth: The number of distinct colours that a piece of hardware or software can display. It's referred to as depth, and sometimes as bit depth, because of the concept of overlapping, stacked "bitplanes", planar arrays of ones and zeroes that, together, define the colour of each pixel. The more bitplanes there are, the more bits per pixel, and the more bits per pixel, the more possible colours - number of colours equals two to the power of the number of bitplanes. 16 bits gives you 65536 possible colours, and 24 bit offers 16.8 million. Cards that do more than 24 bit use the extra bits for mixing channels and other funky stuff - 24 bit is more colours than the eye can discern already.
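The arithmetic, if you want to check it:

# Colours as a function of bit depth: two to the power of the bitplane count.
for bits in (8, 16, 24):
    print(f"{bits} bit: {2 ** bits:,} colours")
# 8 bit: 256, 16 bit: 65,536, 24 bit: 16,777,216 - the "16.8 million".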

This is significant for gaming, because running your games in 32 bit mode may be prettier, but will be slower. The image quality difference is not a large one; in Quake 2 you have to look hard to see the vague banding on walls in order to tell you're in 16 bit mode, and in a real game you don't have much time for that. Games with funkier engines that do fog mixing and similar tricks benefit more visually from 32 bit, but since going for 16 bit will let you run a higher resolution at the same speed, most gamers opt for fewer colours.

OpenGL games inherit the colour depth of the desktop when you run them; if you're running 16 bit in Windows, that's what the game'll be. Remember this if you run your favourite game and it seems strangely slow; check your desktop colour depth. Direct3D games choose their own colour depth, and may or may not be switchable between 16 and 32 bit mode. Some, like Incoming, come in different versions for different colour depths.

Direct3D: Microsoft's own 3D graphics Application Programming Interface (API), which serves the same function as OpenGL and Glide - programmers can use the API to get their software to work on any hardware with Direct3D support, instead of having to write their own drivers for every 3D board out there.

Gamma: Adjusting the "gamma" of an image or of an image acquisition device gives you a way of brightening or darkening without losing as much detail as a straight brightness adjustment. Gamma adjustment works by changing the brightness of pixels according to how bright they currently are - the closer a pixel is to the extremes (black and white) the less it's changed, with the largest changes for pixels at the 50% grey level.
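In code, a simple power-law gamma adjustment looks something like this (a minimal sketch; real video drivers do the same sort of thing with lookup tables):

# Power-law gamma adjustment on 0-255 pixel values. For gamma > 1,
# everything in between the extremes gets brightened.
def gamma_adjust(value, gamma):
    return round(255 * (value / 255) ** (1 / gamma))

for v in (0, 64, 128, 192, 255):
    print(f"{v:3d} -> {gamma_adjust(v, 1.5):3d}")
# 0 -> 0 and 255 -> 255: black and white are pinned in place, while
# the values in the middle of the range move the most.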

Glide: 3Dfx's native 3D graphics standard, as used by the Voodoo cards of all flavours. When a game has rendering options that say something like "Standard OpenGL" and "3Dfx OpenGL", the second option's Glide.

OpenGL: The platform-independent 3D graphics interface standard, with different flavours developed by Silicon Graphics and Microsoft. Does much the same thing as Direct3D and Glide, but does it on any computer you care to name.

Refresh rate: It's not enough that a given graphics system support the resolution and colour depth you want. It must also do it at a reasonable refresh rate. Refresh rate, measured in Hertz (Hz), is the number of times per second the screen is "repainted" with the image. Refresh rates below about 72Hz cause visible flicker; higher rates don't. Different people have different thresholds of annoyance when it comes to screen flicker, but staring at a 60Hz screen all day is an unpleasant experience for pretty much anyone. In gaming, refresh rate is not so critical, because you're generally not staring intently at relatively stationary objects in great fields of solid colour. But you still want 75Hz or so, if you can get it.
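Resolution and refresh rate together set the pixel clock the card's RAMDAC has to generate, which is why high refresh rates get harder to deliver as resolution climbs. A rough sketch - the 1.3 blanking overhead factor is a typical rule of thumb, not an exact figure:

# Approximate RAMDAC pixel clock needed for a given mode. The 1.3 factor
# for horizontal/vertical blanking overhead is a rule of thumb.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.3):
    return width * height * refresh_hz * blanking_overhead / 1_000_000

print(f"{pixel_clock_mhz(1024, 768, 75):.0f} MHz")   # about 77MHz
print(f"{pixel_clock_mhz(1600, 1200, 75):.0f} MHz")  # about 187MHz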

 

Making Quake 2 benchmarks

Frame rate measuring with Quake 2 is very easy. Run Q2. Go to the Video menu option, set the mode you want, and set "sync every frame" to "no". With frame sync turned on, your framerate can never exceed your monitor refresh rate, so you won't find out if your video system is faster.

Hit escape a couple of times to activate your changes, and bring down the console by pressing the tilde (~) key. Type "timedemo 1" to go into timing mode. Now type "map demo2.dm2", or "map demo1.dm2" if you want a slightly higher number (the first demo is a bit less complex than the second; the second is more commonly used as a benchmark).

To get an idea of your system's deathmatch performance you'll need to grab yourself a deathmatch demo, far and away the most popular of which are massive1, which is quite nasty, and crusher, which is pretty much a worst case scenario.

You can get the massive1 and crusher demos from here. Make a new directory in quake2/baseq2/ called demos, and put them in there. Now you can call them from the Quake 2 console just like demo2; "map crusher.dm2", for example.

If you turn off sound you'll get a few more frames per second. Testing without sound is the only way to completely fairly compare systems with different sound cards, because different cards cause different CPU loads. Given that the difference is likely to be only in the order of 3 frames per second or so, though, it's not a very big deal.

To turn off sound, bring down the console and type

s_initsound 0 <enter>

cd_nocd 1 <enter>

snd_restart <enter>


