ASUS V7100/T GeForce2 MX graphics card
Review date: 18 July 2000. Last modified 03-Dec-2011.
Top-end PC graphics cards are marvellous, um, tools. Yes, valuable productivity tools, that's what they are.
Oh, the heck with it. Let's be honest, here. For every person that uses a cutting-edge graphics card for 3D CAD on the cheap, there are a thousand that just want to blast 3D zombies at a zillion frames per second. They're not tools, they're toys. But they're still marvellous.
They typically carry a rather alarming price tag, though. $AU500 for a basic card with the hottest current chipset on it is common enough; $AU600 or $AU700 is more like it for cards with extra features, bigger software bundles or just a premium brand name.
The latest and greatest hardware is always, without exception, crummy value for money compared with gear that led the field a mere six months ago, but performance-hounds still grit their teeth and buy it, and try to ignore the gentle hourglass hiss of resale value trickling away minute by minute.
What the world needs is a graphics card with most of the grunt of the current king of the 3D chipsets, NVIDIA's GeForce2 GTS, but at a much more reasonable price.
Well, here it is. It's called the GeForce2 MX. It's not as fast as the GeForce2, but it's still quite a little steamer, thank you very much. And, when they hit the local market around the end of this month, MX boards will cost $AU300 or so, give or take about $AU25 depending on their extra feature set. Which is less than half the price of a GTS card.
The deep-discount US stores are, as I write this, taking orders for basic MX cards for less than $US130, including shipping.
Just because it's called a GeForce2, though, doesn't necessarily mean that a video card's going to be fast.
NVIDIA have a great enthusiasm for covering every market segment. But they also keep coming up with new and improved graphics card chipsets. And they don't want to keep making their once-the-fastest, now-not-so-hot graphics chips. So they make cut-down versions of their newest and greatest chipsets that perform more or less like their older chipsets.
The most heavily re-versioned NVIDIA chipset is the old RIVA TNT2, which predated the original GeForce. There are now no fewer than five variants of that chipset, from the still-quite-speedy Ultra version to the not-so-quick Vanta. There's a big difference between an Ultra and a Vanta, and the slower version may be cheap, but it's arguably not very good value for money.
Thankfully, the GeForce2 MX is not another Vanta.
The card I checked out was an evaluation sample version of ASUS' upcoming AGP-V7100/T. The "T" variant is going to be the middle-spec version; it's got TV output as well as the standard video connector. There'll be a cheaper "Pure" variant with just the video connector, and a fancier "DVI" version, with a second video output for flat panel monitors that use the DVI connector standard.
ASUS will be making versions of all three cards with either 16 or 32Mb of video memory, but the 16Mb ones probably won't be sufficiently cheaper that many distributors will bother importing them into Australia.
32Mb of video memory is worthwhile for current 3D games; 16Mb is a bit on the low side, especially for newer cards that can manage respectable speeds in high resolutions, where each frame takes a lot of RAM.
The V7100/T has both composite and S-Video connectors on the back, as well as the 15 pin VGA socket. You can't use both video output connectors at once, but separate sockets are a better solution than the non-standard combo-plug that many cards use to cut costs; those cards come with an adaptor lead, without which you can't use composite output at all. Other cards again have S-Video output only, which is all very well if you've got a funky TV or VCR with Y/C input, but no good for a lot of people.
With both kinds of connector, you can play your games on your big TV (in 800 by 600 or 640 by 480; TVs aren't very sharp screens), or use your DVD-ROM drive equipped computer as a cheap DVD player, and not have to worry about making the awful discovery that Tab A can by no means be plugged into Slot B.
Unlike pretty much every other speedy video card these days, the GeForce2 MX chipset doesn't need a fan-and-heatsink chip-cooler; a simple heatsink is adequate.
The GeForce2 GTS, despite being brutally fast, doesn't need a ton of cooling compared with the earlier GeForce cards. That doesn't stop some manufacturers from putting titanic chip coolers on their GeForce2 cards (Leadtek, I'm looking at you...), but they don't need to.
The MX version of the GeForce2 chipset is simpler than the GTS, but just as efficiently made, and so there's no need for active cooling. The MX draws, and radiates, roughly half the power of its big sibling.
It's easy enough for people who want to overclock their video card - run it faster than stock - to add a fan to the standard MX heatsink, or pop it off completely and replace it with something else that fits the standard-spacing mounting holes in the video card circuit board. I'll deal with overclocking shortly.
My sample card didn't come with retail packaging or software, just drivers on a CD-R. ASUS promise to update their somewhat creaky standard software bundle for the MX; you'll get the driver disc, of course, but also Soldier Of Fortune (a fine game for anyone who likes it when enemies react appropriately, and bleed realistically, when shot in amusing places), an OK DVD player program (to use it, you of course also need a DVD-ROM drive), and 3Deep gamma correction software.
The GeForce2 MX is rated at less than half of the fill rate of the GeForce2 GTS. Fill rate, measured in megapixels per second, tells you how fast a graphics card can paint the screen. The higher your resolution, the more pixels you need to paint per frame, and the fewer frames per second you can possibly draw from a given fill rate.
The GTS is rated at 800 megapixels per second; the MX manages only 350. The MX's RAM is much slower, too, though it's also much cheaper; it's 166MHz plain Single Data Rate SDRAM. The GTS's memory runs at the same clock speed, but it's Double Data Rate, processing two transactions per clock tick, so its effective bandwidth is twice as high.
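The fill-rate-to-frame-rate relationship is simple enough to sketch. Taking the rated figures above, and ignoring everything else that limits frame rate - overdraw, geometry load, CPU speed - the theoretical ceiling works out like this (illustrative arithmetic only; real games fall well short of it):

```python
# Theoretical frame rate ceiling from fill rate alone.
# Ignores overdraw, geometry load and CPU limits, so real
# games will always come in well under these numbers.
def max_fps(fill_rate_mpixels, width, height):
    pixels_per_frame = width * height
    return fill_rate_mpixels * 1_000_000 / pixels_per_frame

for name, rate in (("GeForce2 MX", 350), ("GeForce2 GTS", 800)):
    print(f"{name}: {max_fps(rate, 1280, 960):.0f}fps ceiling at 1280 by 960")
```

Which is why resolution matters so much: quadruple the pixels per frame and you quarter the ceiling.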
Apart from the memory, the only big difference between the GTS and the MX is that the more expensive chipset has four rendering pipelines, and the cheaper one has only two. This makes a difference, but not as much of a one as you might think.
At its default speed - 175MHz core, 166MHz RAM - the V7100 benchmarks a bit slower than a GeForce DDR. It's about 10% off the old DDR's pace.
But the GeForce DDR was NVIDIA's flagship chipset before the GeForce2 came out. As I write this, even the cheapest DDR boards still cost more than $AU450. Nine-tenths of that performance for two thirds of the price ain't a bad deal at all.
I was interested to see if there was room to goose a significant amount more performance out of the V7100. The GeForce and GeForce2 GTS cards aren't tremendously overclockable; GeForce DDR cards, in particular, tend to already be pretty much on the RAM speed redline; DDR memory's generally begging for mercy by design, and you canna change the laws o' physics.
The included ASUS V7100 beta drivers didn't allow overclocking. Well, they probably would have with ASUS' purpose-made tweak utility, but the disc didn't have it and I couldn't be bothered scaring it up. So I hit Reactor Critical's download page and grabbed the v5.32 unofficial-beta-if-they-set-fire-to-your-computer-it's-your-problem driver set.
Apply a simple registry patch like this one, and these standard drivers give you access to an overclocking panel which lets you wind the core and RAM speeds up.
Getting to the setting-changer's a bit of a pain, though. You've got to go to Display Properties -> Settings -> Advanced -> GeForce2 MX -> Additional Properties... -> Hardware Options to get there, and you've got to restart once to make clock-tweaking available, and press a silly Test New Settings button every time you change the clock speed, even if you're reducing it.
The Hyundai SDRAM on the V7100 (HY57V653220B 0019A TC-6 64 megabit chips, in case you care) has a rated speed of 6 nanoseconds, which means it shouldn't be able to run any faster than its 166MHz stock speed.
Hyundai, though, tend to under-specify their SDRAM. It can often run quite a lot faster. And that's the case here.
After fooling around a little, I found that 210MHz was stable for both core and RAM speed. A 20% and a 26.5% overclock, respectively. I set up a 2CoolPC ducted fan (reviewed here) to blow on the card, rather than bodge a fan onto it. The heatsink didn't get more than lukewarm during the tests, though; I wouldn't be surprised if, in a well ventilated case, an overclocked MX board turned out to be stable enough with no fan of its own.
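For the record, those overclock percentages are just ratios against the 175MHz core and 166MHz RAM stock speeds:

```python
# Percentage overclock from stock clock speed to new clock speed.
def overclock_percent(new_mhz, stock_mhz):
    return (new_mhz / stock_mhz - 1) * 100

print(round(overclock_percent(210, 175), 1))  # core: 20.0
print(round(overclock_percent(210, 166), 1))  # RAM: 26.5
```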
At 210/210, the MX creeps ahead of the DDR GeForce by a few per cent.
The test machine was powered by a 700MHz AMD Athlon CPU - a pleasingly cheap option, these days - and it delivered unnecessarily fast results in older 3D games, and perfectly acceptable ones in current software.
Going for big silly numbers by running good old Quake 2's "demo2", which is a recording of a single player game without the serious polygon-pushing that deathmatch requires, the MX delivered 114 frames per second in 16 bit colour at 1280 by 960 resolution. 1024 by 768 was 156fps, 800 by 600 was 195fps, and 640 by 480 was CPU-limited at 207fps.
Switching to the rather more demanding Quake 3: Arena, the MX also did well on the fairly-hectic-deathmatch "Demo1". In 640 by 480 it hared along at 103 frames per second, was still unnecessarily fast with 81fps at 800 by 600, delivered a more than acceptable 52fps in 1024 by 768, and only bogged down at 1280 by 960, with a less than stellar 32fps. This included some nasty slideshow moments when translucent brushes were overlaid on each other.
But this was all in High Quality mode, with 32 bit everything, which doubles the data a video card has to move. Run it in 16 bit colour from a reasonably beefy processor, and the MX is thoroughly playable for Q3A deathmatch in 1280 by 960. 1024 by 768 is pleasingly fast even in 32 bit colour.
Mad Onion's 3DMark 2000, which gives a good idea of real-world Direct3D game performance, reported a respectable 3751 3DMarks at stock speed, running the default 1024 by 768 16 bit demo, and an imposing 4343 3DMarks with the card overclocked. Which beats a DDR GeForce by a fairly thick hair.
The overclock's worth the effort, I think, even if it doesn't deliver a monstrous performance difference. If you've got an unhealthy fascination with overclocking the MX, check out the Anandtech article on this exact subject here.
Because most DDR GeForce boards can't run their RAM much faster than the stock speed, it's not really worth winding them up.
Higher core speed, without higher RAM speed, only helps you when the graphics card's not drawing a high resolution screen. The more pixels the card has to handle per frame, the more it's leaning on its RAM and the more likely the graphics chipset is to be waiting for the memory to suck data in from one place and blow it out to another.
In low resolution modes, an overclocked GeForce DDR board is likely to show a speed improvement right in line with the core speed overclock - better than 30%, if you can get that much out of the core. If you're playing around with Full Screen Anti-Aliasing (FSAA), where the card internally renders a higher resolution version of the screen and then scales it down for display to reduce the "jagginess" of diagonal lines, then you're likely to see a real improvement.
But if you've got a big enough monitor that you can display 1280 by 960 or higher resolutions clearly, the high RAM load means that core overclocking won't help you. And FSAA, in case you're wondering, will be ludicrously slow, if it's possible at all. Ask for only 2X FSAA oversampling of a mere 1280 by 960 screen in 32 bit colour, and every frame will weigh in, before the card scales it, at a terrifying 18.75 megabytes of data.
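That 18.75 megabyte figure comes from straightforward arithmetic, assuming "2X oversampling" means the card renders double the resolution in each axis - which is what the number implies - at four bytes per 32 bit pixel:

```python
# Size of the internally rendered FSAA frame, before the card
# scales it down for display. Assumes 2X-per-axis oversampling
# and 32 bit colour (4 bytes per pixel).
def fsaa_frame_mb(width, height, factor=2, bytes_per_pixel=4):
    pixels = (width * factor) * (height * factor)
    return pixels * bytes_per_pixel / 2**20  # bytes to megabytes

print(fsaa_frame_mb(1280, 960))  # 18.75
```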
Realistically, you can't expect to be able to get much more than 10% more performance out of a GeForce DDR by overclocking. You're unlikely to even notice the difference.
In middling resolutions and only 16 bit colour, you can get better than 15% more performance out of a GeForce2 MX, assuming you can wind it up to 210/210. Again, though, as the resolution climbs the RAM speed will dominate the core speed; the MX core is not tremendously less powerful than the full GeForce2 GTS, but the slower RAM hurts it.
Higher resolutions, though, aren't useful for most people. Most PCs have 15, 17 or, at most, 19 inch monitors on them, and that means the highest resolution they can clearly display (unless the monitor's particularly good) will be 800 by 600, 1024 by 768 and 1280 by 960, respectively. Sure, modern monitors can accept much higher resolution signals, but they just don't have enough phosphor dots to show them sharply.
So there's not much point in running your cheap 17 inch screen at its outrageous maximum resolution, and there's not much point getting a video card that can run games at breakneck speed in that resolution. 1024 by 768 will look pretty much the same, and run nice and fast on a cheaper card.
For $AU300 or so, this thing's significantly faster than a plain, non-DDR original GeForce board, even before you start fooling around with overclocking. By the time the MX is locally available, plain GeForce boards will probably cost about the same as it.
Wind up the core and RAM speeds, and the MX gives you GeForce DDR performance, for quite a bit less money. It's not far behind even without overclocking. And I doubt the DDR boards will fall in price enough to make them better value.
Where the GeForce DDR and the GeForce2 GTS shine, of course, is in very high resolutions. Even overclocked, the GeForce2 MX's RAM lets it down if you want to go much above 1280 by 960. At 1600 by 1200, a DDR GeForce will show an MX a clean pair of heels, and a GeForce2 GTS will be a dot in the distance.
But Ludicrous Resolution is beyond the reach of most buyers. If $AU600 for a video card's more than you want to pay, then $AU2000 for a monitor isn't likely to fill your soul with boundless joy, either.
For real people who'd like to pay as little real money as possible for their next video card, the GeForce2 MX looks like a little ripper, and the ASUS V7100 series looks like a perfectly good implementation of the chipset. Thoroughly recommended.
Review ASUS V7100/T kindly provided by DMA.