ATI Radeon DDR OEM graphics card

Review date: 5 November 2000.
Last modified 03-Dec-2011.

 

Lots of people ask me what video card they should buy. They don't generally just want something that can do a decent resolution and refresh rate for office applications, to replace some ghastly old 1Mb board; just about anything'll do in that case.

Rather, they want something with a bit of 3D grunt, almost invariably for games. Or a lot of 3D grunt. Or an awe-inspiring, pants-wetting amount of 3D grunt. Or just more 3D grunt than any of their mates' computers can manage. And maybe TV output as well, for playing games or watching DVD movies on the bigger screen.

At the moment, my usual recommendation is "A GeForce2 MX card from pretty much any manufacturer, if you've got a small-to-medium monitor. A GeForce2 GTS or even GeForce2 Ultra card, if you've got a big monitor. Some variety of TNT2 if you want to save money."

These are all Nvidia chipsets, you might note. They come on boards put together by various manufacturers, because Nvidia just sells the chips and makes no retail graphics cards of its own. But all boards with a given Nvidia chipset are fundamentally identical.

If you want dual monitor outputs, then a Matrox G400 or G450 Dual Head board might suit you better than a TwinView GeForce2 MX (I talk more about this issue here). If you don't have an AGP slot and so need a PCI card, 3dfx makes faster ones than any other manufacturer.

If you're using a Mac, all bets are off, because you can't get Mac drivers for anything with an Nvidia chipset. Of which more in a moment.

But for ordinary single-monitor AGP cards for PCs, when 3D speed matters, Matrox cards are too slow, and 3dfx cards are too expensive, and nobody else makes anything that can compete with the current 3dfx and Nvidia hardware.

Except ATI.

Somewhat reasonably priced ATI Radeon cards are, finally, available in Australia, and I got an OEM (Original Equipment Manufacturer) one to play with.

This is a Radeon DDR, using Double Data Rate memory for twice the RAM bandwidth of the basic Single Data Rate boards. The $AU715 card's got 64Mb of this pricey memory.
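If you're wondering where that "twice the bandwidth" figure comes from, here's a back-of-the-envelope Python sketch. The 128 bit memory bus width is my assumption - it's what cards of this class generally used - not a number off ATI's spec sheet.

    # Peak video RAM bandwidth, back-of-the-envelope style.
    # Assumption (not from the review): a 128 bit memory bus, as used
    # by the Radeon DDR and the GeForce2 family.
    BUS_BITS = 128

    def bandwidth_mb_per_sec(clock_mhz, ddr):
        """Theoretical peak memory bandwidth, in megabytes per second."""
        transfers_per_clock = 2 if ddr else 1  # DDR moves data on both clock edges
        return clock_mhz * transfers_per_clock * BUS_BITS / 8

    print(bandwidth_mb_per_sec(166, ddr=False))  # SDR at 166MHz: ~2656 MB/s
    print(bandwidth_mb_per_sec(166, ddr=True))   # DDR at 166MHz: ~5312 MB/s

Same clock speed, twice the data per clock; that's the whole trick.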

64Mb GeForce2 GTS boards sell for $AU750 or more, versus $AU600 or so for the 32Mb version. The main reason for the 32Mb GeForce2 MX's sub-$AU350 price is that it uses the much cheaper SDR RAM.

What you get

OEM graphics cards are meant, as the name suggests, for system assemblers, not retail buyers. But you can buy them retail, and it's not a bad idea.

They're cheaper, because they have no frills. You don't get a box, you get a bubble-wrap bag. You don't get a fancy software bundle, though you do get a driver/manual CD, and a couple of video cables - RCA for composite video, mini-DIN for Y/C S-Video.

Rear panel connectors

This Radeon has both flavours of TV out in addition to the usual 15 pin VGA connector, so you can plug it into ordinary TVs with a composite input, or better models with a Y/C one. There's also a composite video input on the card for video capture purposes.

Chip cooler

The Radeon main chip doesn't run particularly hot. The Mac variants of the card just have a plain heat sink; this board has a slimline heat-sink-and-fan cooler.

Performance

The Radeon's fast. But it's not super-fast.

I compared it with a GeForce2 MX (Abit's Siluro MX, reviewed here) and a GeForce2 GTS (Leadtek's WinFast GeForce2 GTS, reviewed here) in a 700MHz AMD Duron system with 128Mb RAM. The 700MHz Duron's an excellent mid-range processor these days, not to mention very pleasingly cheap. It's got about half the power of the current king of the hill, the 1.2GHz Athlon, but it's about one-sixth as expensive.

I used the current release driver versions for all of the boards: the v6.31 Detonator drivers (download them here) for the Nvidia-based cards, and the v3.056 drivers (from here) for the Radeon.

The first half of my patented Quick 'N' Lazy™ Benchmarking Session was done with Quake 2. It's well behind the technology curve these days, but the Q2 engine lives on. An enhanced Quake 2 engine powers Half-Life, and the Half-Life modification Counter-Strike is the most popular server-based multiplayer game on the Internet today. Cutting edge tech it ain't, very fun to play it is, and it's also handy for checking results against older cards, when Q2 was where it was at.

I ran the usual three Q2 test demos to get frame rate numbers; the stock Demo2, the multiplayer-action Massive1, and the worst-case-scenario Crusher. The first one tells you what your frame rate in the single player game's going to be, the second one tells you what you'll get in a hectic multiplayer game, and the third one is an insane blast-fest whose average frame rate isn't much better than the worst frame rate you'll ever see in a real multiplayer Q2 match.
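For anyone who wants to reproduce this sort of test: you turn on timedemo mode and play each demo back flat-out, and the engine reports an average frame rate in its console when it's done. Here's a minimal Python wrapper that does it from the command line - it assumes the quake2 executable is on your path and the demo files are sitting in baseq2/demos, so adjust to taste.

    # Run the three benchmark demos back to back and let Quake 2
    # report the average frame rate for each. Assumes quake2 is on
    # the path and demo2.dm2, massive1.dm2 and crusher.dm2 are in
    # baseq2/demos.
    import subprocess

    for demo in ["demo2", "massive1", "crusher"]:
        # "+timedemo 1" makes the engine render every demo frame as
        # fast as it can; "+demomap" plays the named demo.
        subprocess.run(["quake2", "+timedemo", "1", "+demomap", f"{demo}.dm2"])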

The GeForce2 GTS isn't much faster than the MX for 1024 by 768 and below - which means that owners of 17 inch or smaller monitors, which can't clearly display much better than 1024 by 768, really shouldn't buy the more expensive card. But above 1024 by 768 the GTS beats the MX handily, especially when there's a lot going on, and the main reason is its much faster memory. In 1280 by 960, the GTS was 17%, 43% and 63% faster than the MX for Demo2, Massive1 and Crusher, respectively.

1280 by 960 or the nearly-the-same 1280 by 1024 is an excellent game resolution for 19 inch monitors, which are the biggest that most people can afford. Annoyingly, though, the Radeon doesn't seem to support either resolution for full-screen OpenGL games like Quake 2. So I tested it in 1152 by 864 and 1600 by 1200, 16 bit colour.

The Radeon turned out slower than the GTS for the lower resolution, but faster for the higher one, and averaged out at much the same speed. The difference isn't a huge one.

Difficult-to-take-seriously benchmark outfit Mad Onion's 3DMark 2000 utility gives decent numbers on a computer's Direct3D game performance. The default demo runs in only 1024 by 768, 16 bit colour, and so short-changes cards like the Radeon with expensive superfast RAM. But I gave it a whirl anyway. The final, collective "3DMarks" score rolls together the CPU, RAM and graphics card performance, so it's not very useful for inter-computer comparisons of video card speed unless the video card's the only component that differs in the compared machines. Which was just the situation in my test.

The GeForce2 MX got 4151 3DMarks, the GTS got 5181, and the Radeon got a rather disappointing 4272.
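To put those numbers in perspective, here's the arithmetic:

    # Relative 3DMark 2000 scores from the 1024 by 768, 16 bit test.
    scores = {"GeForce2 MX": 4151, "Radeon DDR": 4272, "GeForce2 GTS": 5181}
    mx = scores["GeForce2 MX"]
    for card, score in scores.items():
        print(f"{card}: {score} ({100 * score / mx - 100:+.1f}% versus the MX)")
    # The Radeon ends up only about 3% ahead of the much cheaper MX,
    # while the GTS manages about 25%.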

When I bumped the resolution up to 1600 by 1200, 32 bit colour, though, the Radeon managed a more satisfying 2160 3DMarks. For that kind of CPU power, that's on par with a GeForce2 GTS, and without any tweaking.

The Radeon has some performance oddities, which explain its ordinary 16 bit results. Its 16 bit rendering is about as fast as its 32 bit, when it should be considerably faster at the lower and less demanding bit depth. The Radeon's 32 bit speed is fine compared with the opposition, but people going for more frame rate at the expense of some image quality by dropping the colour depth won't get much joy from a Radeon.
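To see why 16 bit ought to be faster, consider just the framebuffer traffic. This little model is mine, not ATI's - the overdraw figure's a guess, and it ignores texture fetches entirely - but it shows the shape of the problem: a memory-bound card should get a big frame rate boost from halving the bit depth.

    # Rough framebuffer traffic per frame. Per pixel drawn, the card
    # writes a colour value and reads and writes the Z-buffer; 16 bit
    # mode halves all three. The overdraw factor is an illustrative
    # guess, not a measurement.
    def frame_traffic_mb(width, height, bytes_per_value, overdraw=2.5):
        pixels_drawn = width * height * overdraw
        per_pixel = 3 * bytes_per_value  # colour write + Z read + Z write
        return pixels_drawn * per_pixel / 1e6

    print(frame_traffic_mb(1600, 1200, 2))  # 16 bit: ~29MB per frame
    print(frame_traffic_mb(1600, 1200, 4))  # 32 bit: ~58MB per frame

On the Radeon, that halving of memory traffic doesn't show up as extra frame rate, which is the oddity.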

Overall, with current driver versions, the Radeon seems to be nothing very special for the money if you're not running a great big monitor. But neither's the GeForce2 GTS. It's silly to pay all that money for warp-drive RAM speed if you're not going to run monster resolutions and actually use the memory you've paid for.

If 1600 by 1200's a worthwhile resolution for you, the Radeon seems to perform about as well as the GeForce2 GTS. Both of the expensive cards trample the GeForce2 MX, and everything else on the market with SDR RAM.

When the Radeon first came out, its drivers were somewhat flaky but it actually performed better than a GeForce2 GTS in medium-to-high resolutions and 32 bit colour. Now, though, the Nvidia drivers have taken a bigger leap than have the ATI ones, and the Radeon doesn't have the same edge.

Clock speed

ATI's original press-release specs for the Radeon said it'd run at 200MHz core and RAM speed, but the retail boards came out at 183/183MHz.

Well, the 64Mb ones did, anyway. The 32Mb retail cards run at 166MHz. ATI won no friends by failing to make this fact clear.

All of the hot graphics cards of the moment have software-settable clock speed for RAM and core. For enthusiasts, the stock speed's just the starting point for a thrilling journey to overclocked nirvana. But many users don't have a clue about all this.

ATI don't quote megahertz figures for any Radeon if they can help it, because the Radeon architecture runs at lower clock speeds than its competitors - like, for instance, the GeForce2 GTS, with its stock 200MHz core speed.

Clock speed doesn't tell you how fast a video card is unless all other factors are equal - comparing on a pure megahertz basis is like assuming that a car whose engine has an 8,000RPM redline must be 14% faster than a completely different car that tops out at 7,000RPM.

But, say ATI, Joe Sixpack doesn't know that. And ATI don't think they're going to get a chance to tell Joe before he buys some other video card, assuming that it must be faster 'cos it's got more of them meggle-hurtz thingies. So ATI don't quote clock speeds.

Which would all be well and good if all Radeons had the same clock speed. But they don't.

This OEM 64Mb board is a 166MHz unit, too. All OEM Radeons are.

The 166-to-183MHz gap is barely more than a 10% clock speed difference, and it has very little real world impact. But anyone buying an OEM Radeon and expecting to get the whole retail enchilada sans only cosmetic frills may still be disappointed.

You can't find out a Radeon's clock speed by looking at the status tabs in the ATI drivers. Unlike many other manufacturers these days, ATI don't give you any clock speed indicators or adjusters in the stock drivers.

The very handy shareware video utility Powerstrip, though, now supports the Radeon, and lets you see and fool with its settings.

I fiddled with Powerstrip to see whether the OEM Radeon was particularly overclockable. It isn't, but it does seem happy to run all day at 183/183, and 190/190 was fine too.

The 6 nanosecond Hyundai RAM started producing amusing glitches at the 200MHz clock speed mark - the usual single-pixel twinkles, but also rather stylish coloured lighting blobs flashing randomly on walls in Quake 2. It's not every day that you see a graphics card that goes into Disco Fever Mode when excessively overclocked. 190MHz was glitch-free, and so might 195MHz have been, but 2.6% differences don't thrill me enough to spend time finding out.

Retail Radeons apparently come with faster 5.5 nanosecond RAM, which can deal with a 210MHz clock speed. And you can wind up the core speed considerably more on just about any Radeon - 220MHz isn't out of the question. Since you're generally waiting for the RAM, though, core speed cranking makes close to no difference unless you're running in such a low resolution that your frame rate, unless it's CPU-limited, will be stupidly high anyway.

Nvidia-based cards have this same problem, by the way. You can pump the core speed up quite a lot, but there's not much point to it without a magic spell to increase the RAM's bandwidth.
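If you want that intuition in ten lines, here's a toy model. The fps-per-megahertz constants are made up for illustration, not derived from any benchmark; the min() is the point - frame rate follows whichever of the core and the RAM taps out first.

    # Toy model: frame rate is capped by whichever of the core and
    # the RAM runs out of puff first. The scaling constants are
    # invented for illustration.
    def fps(core_mhz, ram_mhz, core_fps_per_mhz=0.5, ram_fps_per_mhz=0.3):
        return min(core_mhz * core_fps_per_mhz, ram_mhz * ram_fps_per_mhz)

    print(fps(166, 166))  # 49.8 - RAM-limited at stock speed
    print(fps(220, 166))  # 49.8 - a 33% core overclock buys you nothing
    print(fps(220, 190))  # 57.0 - wind up the RAM and the frame rate follows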

I only saw about a 5% difference in 1600 by 1200 benchmarks between 166 and 190MHz RAM and core speeds on the test Radeon, despite the 14% overclock. The difference was smaller in lower resolutions. You wouldn't notice it at all in the real world.

And, again, the same applies to the current Nvidia boards. Overclock if you want; put ridiculous after-market cooling systems on your video card and see if you can get some more out of it if you must. But if you don't bother, you won't be missing much.

TV out

Settings 1

ATI are proud adherents to the Fisher-Price School of Interface Design...

Settings 2

...but their driver software has all of the usual options, plus less-often-seen ones like proper image controls. This helps considerably for TV output, though whether you can actually get the image properly centred and sized depends on your TV as much as it does on your video card.

ATI make much of the Radeon's on-board MPEG-2 decoding abilities, which take a lot of load off the CPU when you're playing a DVD, but it doesn't really matter if you've got a fairly modern processor. If you don't mind having a computer that responds sluggishly while the DVD's playing - and, since you're probably just sitting and watching the movie, it's unlikely to be a problem - then software decoding's fine. Software players work on any old video card.

Rage Theater chip

The Radeon's TV in/out functions are handled by the Rage Theater chip, which gives the card above-average TV output quality.

The Rage Theater is also capable of doing Macrovision copy protection, which DVD playback software demands before it'll send video to a TV. Macrovision aims to foul up a VCR's automatic gain adjustment feature, producing annoying brightness variation, but almost all televisions are unaffected by it.

This means that if you want to play DVDs that activate Macrovision - most do - your TV-out cable has to be hooked up straight to the TV, not go via a VCR. This is no big deal as long as your TV's got a video input, so you don't have to rely on the VCR to convert the composite or Y/C signal to an RF modulated antenna-socket signal.

Mac daddy

The Radeon's of particular interest to Macintosh users, because it's almost the only choice in Macintosh super-performance video adapters at the moment. It bloomin' well shouldn't be, if you ask me, but it is.

The story behind this provides a salutary lesson in information technology cynicism.

A few old and grizzled computer users may remember when the PCI (Peripheral Component Interconnect) standard was first released, back in the steam and gaslight days of 1993. They may also remember that one of the big selling points for the new expansion bus was that a PCI card for one system could be used in any other system with PCI slots.

Take the card out of a PC, stick it in a DEC Alpha box. Or a Sun box. Or a Mac. No worries. Everybody would have PCI slots, everyone would be compatible. All you needed was the right drivers for whatever architecture and operating system on which you wanted to use your nifty multi-platform PCI device.

Well, that didn't happen, did it?

Because the IBM compatible's the 900 pound gorilla of the everyday-hardware market, most manufacturers didn't bother to make drivers to suit any other platform. Even if their devices actually were hardware compatible, without a driver they weren't useable on anything but the platform the vendors decided to support, which was almost always the PC.

AGP (Accelerated Graphics Port) has the same problem. You're supposed to be able to use any AGP graphics card in any computer with an AGP slot. All you need are drivers. But you can't get 'em.

As a result, loads of Macintosh users can, if they like, buy a PC PCI or AGP graphics card with a perfectly serviceable Nvidia chipset on it, plug it into the appropriate slot on their Mac, and then just sit and look wistful, because it isnae goin' tae work, laddie. If it's not a 3dfx Voodoo 3, 4 or 5 card, you'll have no drivers and your Mac won't want to know it.

You can use a Radeon with the Mac. Drivers exist.

But you can't use this Radeon. The Radeons that work with the Mac don't work in PCs, and vice versa. Sheesh.

The Radeon Mac Edition, as ATI call it, is a 32Mb DDR board that comes in AGP and PCI flavours. It's a factory option for various current Macs; there's even a version that fits inside the teeny Power Mac G4 Cube. You can upgrade various older Macs, too; see the Radeon Mac Edition FAQ here for more information.

If you don't go for the Radeon option in a new Mac, you get the same ATI Rage 128 Pro that's been shipping in Macs for a while. It's a pumped-up version of the older Rage 128, and it gives you roughly TNT2-level performance. Rage 128s are another card type that I don't recommend, not because they're rubbish but because Nvidia boards are cheaper.

Conclusion

If you're looking for a top-spec graphics card for your Macintosh, a Radeon DDR is, at the moment, the best you can buy. But this Radeon is not that Radeon.

This Radeon is a worthy opponent for the GeForce2 GTS; it'll be a bit slower than the faster-clocked GeForce2 Ultra (which hasn't quite made it to Australia yet), but it's still a darn fast card for super-high-resolution gaming, at a decent price.

Most people, though, don't need a card this fast; the advantage at medium resolutions is small, and the price difference is big.

ATI plan to phase out the Rage 128 chipset shortly and replace it with a cut-down Radeon, the "RV100". That board will probably be a much better option for most people; who knows, I may even add it to my stock list of recommendations.

The Radeon DDR is a high end card for people with big monitors. If you haven't got a 21 inch - or at least 19 inch - screen, then you don't need a card like this.


Buy a Radeon!
Aus PC Market doesn't sell this product any more (click here to see their current video card-related products), but you can still try a price search at DealTime!


