Atomic I/O letters column #24
Originally published in Atomic: Maximum Power Computing
Reprinted here August 2003.
Last modified 16-Jan-2015.
I suffered that horrible sinking feeling in the pit of my stomach the other day. You know the one - the feeling you get when you break a pin on the IDE connector of a part that you can probably never buy again. Worse still, it was on a friend's computer! I managed to destroy pin 1 on a superfloppy drive. However, the next day, to my great relief, when I hooked it up anyway - it worked!
I vaguely remember from my college days that many of the pins don't actually do anything, but have no idea which ones are important. Can you please tell me the pin assignments for IDE?
Well, every pin does something (except for pin 20, used for keying; that one's only mechanically functional), but seven of 'em are earth pins.
You didn't cut one of those; you cut the Reset IDE pin, which is what's used to soft-reset the drive. Soft-resetting isn't often needed. By default, it happens when you reset the PC, and it can cure a drive that's been left in an insane state by a bad driver or some similar mishap, but a drive that isn't currently nuts will work fine if you reset the computer without resetting the drive as well. You can always reset your drives, whether or not they've got a pin 1, by turning the computer off and back on.
You can find the standard IDE connector pinout here.
I was wondering, wouldn't it be great if we could use heavy water for cooling a PC down? Wouldn't that work? Isn't heavy water used in reactors? Why can't we apply the same method with the humble PC?
Heavy water in a PC liquid cooling system would work no better than regular water, but it'd be way, way more expensive.
Heavy water is, chemically, practically identical to ordinary water. It is denser; a heavy water ice cube will sink in normal water. But apart from that, and slightly higher melting and boiling points, all heavy water does differently is moderate stray neutrons better than regular water - it slows them down while absorbing fewer of them. That's why it's used in some breeder reactors to make plutonium from uranium; fast neutrons are slowed down enough by bouncing off heavy water nuclei that uranium nuclei can then capture them.
Neutron slowing isn't a high priority for PC water cooling systems, and heavy water costs thousands of dollars per litre. Furthermore, if you try to buy some, people are apt to think that you're considering turning a city into a smoking grease spot. All this makes heavy water less than totally interesting as a PC coolant.
Why do ATA hard drive spindle speeds top out at 7200 RPM [OK, 10,000RPM for some server ATA drives now], while SCSI drives go to 15,000 RPM? Surely the interface isn't a bottleneck, because today's ATA drives can't get anywhere near filling the bandwidth offered by ATA/100/133, and now Serial ATA. Is it just that HDD manufacturers want to give their faster drives a SCSI interface so they can justify slugging people an extra $350 for them?
There is an artificial market segmentation thing going on here, but there's more to it as well.
Faster rotating drives need better motors and bearings, but there's no reason at all why that level of engineering can't be applied to ATA drives. And, as you say, current ATA interfaces have bandwidth to spare.
The dead hand of marketing isn't the only thing to blame for the absence of high rotational speed ATA drives in the consumer marketplace, though. Drives that spin much faster than 7200RPM run hot enough that they're not happy in the average case.
Hard drives get hot because there's air inside them; air friction on the spinning platters makes heat. 10,000RPM-plus drives make enough heat that they need extra cooling. If you've got unusually good case ventilation and/or an unusually low ambient temperature, you can run top-spec SCSI drives without adding anything more to keep them cool. But if you drop those same drives into an ordinary minitower case, they're likely to die before their time.
Manufacturers therefore want to keep high-RPM drives out of the mainstream. They'd get showers of warranty returns if they didn't.
The recently released 10,000RPM ATA server drives are, for consumer applications, not interesting products at all. This is partly because they cost a lot more per megabyte, and partly because they don't have very high capacity. Their high rotational speed gives them low rotational latency - the time you have to wait for a given spot on the disk to spin around under the heads - but their low data density hurts their sustained transfer rate.
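To put numbers on the latency side of that trade-off, here's a minimal sketch - plain arithmetic, nothing drive-specific assumed: on average, you wait half a revolution for the spot you want to come around under the heads.

```python
# Average rotational latency is half a revolution;
# one revolution takes 60 / RPM seconds.
def avg_rotational_latency_ms(rpm):
    seconds_per_revolution = 60.0 / rpm
    return seconds_per_revolution / 2 * 1000  # convert to milliseconds

for rpm in (5400, 7200, 10000, 15000):
    print(f"{rpm:>5}RPM: {avg_rotational_latency_ms(rpm):.2f}ms average latency")
```

A 10,000RPM drive thus averages about 3ms of rotational latency against a 7200RPM drive's 4.17ms - a real win for seek-heavy server workloads, but no help at all for sustained transfers.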
The Serial ATA Western Digital Raptor has a capacity of only 36.7 hard drive manufacturers' gigabytes (that's 36,700,000,000 bytes, which is 34.18 real 1,073,741,824-byte gigabytes), but it costs more than a 120Gb Parallel ATA drive.
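The marketing-gigabytes-to-real-gigabytes conversion is just arithmetic; here's a quick Python check, using the Raptor's quoted capacity:

```python
# Drive makers count a gigabyte as 10**9 bytes; operating systems
# count it as 2**30 (1,073,741,824) bytes.
marketing_bytes = 36.7 * 10**9        # the "36.7Gb" on the box
real_gb = marketing_bytes / 2**30     # what the OS will report

print(f"{real_gb:.2f}Gb")  # about 34.18Gb
```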
Many servers need fast seek operations more than they need monster sustained transfer rates, so drives like the Raptor are suitable. If you need more transfer rate, you can always use an array of drives.
I recently bought a GeForce4 Ti4200 card with 128Mb DDR memory. My motherboard is an ASUS CUSL2-C with a Pentium III 933 that is not being overclocked (shameful I know). The problem I have found is that my motherboard only allows a maximum AGP aperture size of 64Mb. Does this mean I am losing half the memory performance of my video card? How much of a performance increase would I see if I were able to increase the AGP aperture size?
No, you're not losing any performance. A larger aperture size would give you a barely measurable speed increase, if it gave you any at all.
The AGP aperture size sets the amount of system memory that can be shared with the graphics card, using the AGP bus to allow the card to behave as if it had more memory than it does, and store textures in main memory when it runs out of RAM of its own.
Even if you're using AGP 8X, though, main memory via AGP will still be quite a lot slower than video card memory. If you actually use AGP texturing while you're playing a game, performance will drop substantially.
Some video card and motherboard combinations misbehave with default AGP aperture settings, causing 3D games to crash. In those cases, you need to set the aperture size to some particular size, usually 64 or 128Mb. Setting the aperture size larger than that will give you a tiny performance gain, at best; with modern video cards that have tons of onboard memory, little of the assignable memory is ever likely to be used.
Setting an aperture size larger than the amount of system memory you have shouldn't ever help (you can't share memory that isn't there...), but it does apparently do something, for some systems. "Something", though, doesn't mean "anything you'll notice without running benchmark programs".
Setting a small aperture size, by the way (32Mb or 16Mb), will disable AGP entirely, and leave your video card behaving like a "0.5X AGP" PCI card.
Basically, if your system doesn't have any problems, don't worry about the AGP aperture size.
I am planning on buying an Asus A7N8X Deluxe mainboard and an AGP-V8640 Ultra Deluxe video card, but I have read in places that AGP 4X cards are not compatible with AGP 8X mainboards. Is this correct? If I buy this combination of mainboard and video card will I end up with a very expensive paperweight?
While it's hardly unknown for some motherboards to dislike some video cards, there's no compatibility barrier between 8X and 4X. Stick a card that can only manage 4X in an 8X slot and the motherboard should just run it at 4X, the same as happens with 2X cards in 4X motherboards.
There can be a compatibility issue going the other way, though - plugging an 8X-capable card into a 4X slot. If the older motherboard can't handle the 0.8 volt signalling that the 8X card wants, it won't work.
I'd like to see some information regarding some snazzy aluminium cases vs the old beige boxes. Maybe some temperature measurements with boxes of very similar dimensions, and even cover any air holes to ensure a fair trial (although they're part of the case, so maybe leaving them open would be an idea). You could use the same hardware inside (chip/heatsink/mobo) and measure the temperature inside, CPU temp, and even the surface temps of the metal.
Computers in aluminium cases generally run cooler than the same hardware would in a steel case, but that's not because of the aluminium. It doesn't hurt that aluminium conducts heat better than steel does, but it's the fact that aluminium cases commonly have more fans in them that makes the real difference.
Most of the world's PCs have only one throughflow ventilation fan in them - the single exhaust fan in the PSU. Many better machines have one intake fan as well, and maybe even another exhaust fan, but lots of cases that can accept multiple fans don't actually have them installed.
Many aluminium cases, on the other hand, come standard with extra fans. Most of the current midi-tower Lian Li cases, for instance, come with one exhaust fan on the back, one exhaust on the top, and two front intake fans.
If you're interested in value for money, buy yourself a steel case with the features you want, including extra fan mount spots. You'll probably pay a couple of hundred bucks less than you would for the same thing in aluminium.
Last week I bought Age of Mythology because it was recommended by some of my friends. It installed fine, but when it came to running it gives me this error: "This graphics card is not supported by Age of Mythology. Please check http://www.microsoft.com/games/ageofmythology for a full list of supported cards. Age of Mythology will now exit. Video Card 0: nv4_disp.dll NVIDIA GeForce2 MX/MX 400 Vendor (0x10DE) Device (0x110)."
I've got a 900MHz Duron on a Shuttle MK20N, with 256Mb PC133 SDRAM, 64Mb GeForce2 MX 400 PCI graphics card, and Hercules MUSE XL sound card. I have installed all the latest drivers for every piece of hardware I have, including the latest Detonator drivers, DirectX 9, Via 4in1's, etc.
I tried the demo of AOM and it worked fine. I am able to run every other game I have. Is there a patch available to fix this problem? If so where do I get it from?
Your graphics card is explicitly supported by AOM (every Nvidia chipset from the original TNT onwards should work...), but there's a driver problem somewhere. If, as you say, you've already installed the latest drivers for everything in sight, and if none of those driver installs are screwed up (which they may well be, possibly to the point where nothing but a Windows reinstall will fix them; this is particularly likely if you're running Win98 or ME), then it could be a DirectX problem.
Changing your graphics driver to the default VGA one, then reinstalling DirectX, then reinstalling the graphics driver, may help.
I have been getting annoyed by the "Send To" function when you right-click on a file. When Send To is highlighted, my CD-ROM spins up, causing the computer to lag until it has read what is in the drive. I have deleted the programs in the Send To list, but cannot find the CD-ROM drive in the folder. How can I completely remove the Send To function?
Run Regedit (Start -> Run -> type "regedit", without the quotes, and press enter) and navigate to HKEY_CLASSES_ROOT\AllFilesystemObjects\shellex\ContextMenuHandlers\Send To. There should be one value in there, called (Default), of the type REG_SZ, and with a pile of hex as its data ({7BA4C740-9E81-11CF-99D3-00AA004AE837}). Double-click it and delete the value data, and Send To should vanish. No rebooting needed.
Save the chunk of hex in a text file, in case you ever want Send To back; you'll need to replace the original value to restore the function.
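If you'd rather keep that backup as something double-clickable, the same restore step can live in a .reg file - a sketch using the CLSID above, which Regedit will merge when you double-click it:

```reg
Windows Registry Editor Version 5.00

; Puts the original Send To handler CLSID back, restoring the menu entry.
[HKEY_CLASSES_ROOT\AllFilesystemObjects\shellex\ContextMenuHandlers\Send To]
@="{7BA4C740-9E81-11CF-99D3-00AA004AE837}"
```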
After reading your "LCDs, CRTs and geese" article, I realized that I get eyestrain every time I sit at my computer because I cannot raise my screen's refresh rate above 60Hz, while WinXP recognizes my CRT as a "Default Monitor". How can I rectify this?
I bought my 17" Osborne MO117 from a friend without any device driver, and I've been subsequently unable to find any drivers on the Net. On two occasions I've tried raising the refresh rate, firstly under Advanced Display Properties (to 75Hz) and secondly with RefreshForce (to 85Hz), but the screen just blinks its power light after rebooting, forcing me on both occasions to restore my previous settings.
I also tried uninstalling the monitor within Device Manager with the objective of choosing Windows' Plug and Play driver, only for XP to automatically install it as a "Default Monitor".
I know my monitor supports higher refresh rates. What can I do to raise the refresh rate within Windows and during gaming to save my eyes from constantly becoming irritated?
The MO117 is a rebadged Philips Brilliance 17a - all of the Osborne monitors were rebadged Philips screens. If you want better than 60Hz, you're going to have to set the monitor to 1024 by 768; it can do that at 75Hz, but can only manage 60Hz for higher resolutions.
75Hz isn't a very exciting refresh rate, but it's not too awful, and 1024 by 768 is all you can fit clearly on a 17 inch tube anyway.
You can attack this problem from two directions. Here's the high-effort, low-return one.
Go to Control Panel -> System -> Hardware tab -> Device Manager, expand the Monitors entry, and double-click the monitor you want to change. Now go to the Driver tab, select Update driver..., Install from a list..., Don't search..., uncheck Show Compatible Hardware, and select the Philips 17ACM38 driver, which should be an OK match for your screen. You can pick any driver you like, though; as long as you don't end up asking the monitor to do something it can't do, you'll be fine.
If Windows insists on discovering three monitors every time you restart - and to think we thought it'd stop doing that when it stopped using Win95 code - then you can make the same change for all of the buggers. You only actually need to change the one monitor that Windows thinks it's using, though.
RefreshForce should be able to solve your problem; it's the easier method. It lets you just plain tell Windows what refresh rates you want for each resolution and - wait for it - it even works in 3D mode. You can do all the fiddling you like in the regular display settings, but Win2000 and WinXP will still want to run at 60Hz in 3D mode. RefreshForce is the best way of fixing that, so far.
If RefreshForce doesn't work, it could be because you're changing refresh rates for one of the pseudo-monitors that Windows has mistakenly detected, rather than the one it's actually using. RefreshForce should indicate which one's "Active" in its list of monitors; if in doubt, do 'em all.
You can get rid of plethoras of unnecessary monitors by editing the registry, but if Windows is nuts enough that they're there in the first place, it may well be nuts enough to believe that the relevant registry keys are undeletable, so never mind.
And then, there's multi-monitor setup.
[Holds flashlight under face, adopts spooky tone] Imagine TWO monitors, each with their own zombie clones in the device list that use the wrong drivers, getting re-detected every time you restart...