Atomic I/O letters column #93
Originally published 2008, in Atomic: Maximum Power Computing
Reprinted here April 30, 2009. Last modified 16-Jan-2015.
Last night, the network switch next to my desk started making a sparky noise and emitting a "fried electronics" smell. All of the lights were still on (on the switch, and in the house...), but I unplugged it right away of course. It's mains powered - IEC socket on the back.
But it was still working - even if apparently on fire - when I unplugged it. So it seems to me that it might be fixable. Is this actually likely?
If it's an ordinary home-or-small-office switch then it's not likely to be economical to repair, if you have to pay someone else to do it.
If you do it yourself, though, it may be a very easy job indeed.
NOTE: If you're new to all this, I suggest you at least build a couple of low-voltage electronics kits before going anywhere near mains power. Also, read the excellent sci.electronics.repair FAQ.
Inside all sorts of mains-powered computing devices is a power supply that takes the mains and converts it to the DC voltages the device needs to run. (Plugpacks are the same thing, just in a separate enclosure.) If there's an intermittent contact on the mains side of that power supply - if, for instance, years of cable tension has cracked the circuit trace next to one of the mains plug's contacts - then mains voltage - especially in 220-240VAC countries - is more than enough to arc over the little gap, and burn the circuit board.
Hence, alarming noise, nasty smell. And a device that may indeed keep working, until the gap gets too big for the arc to cross.
There are many other things that can cause similar symptoms and are not nearly as easy to fix. But you can indeed quite safely unplug your switch, open it up, and look for scorches on the power supply board.
If there are exploded capacitors or burned transformers or chips with bits blown out of 'em, then you're out of luck. But if it's just one little arcing gap, you should be able to assault it with a wire brush or pointy scraper to clear away char and expose some clean copper on the PCB to solder to, then bridge the gap with the most elegant blob of solder you can manage. If the gap's long enough to make a solder-blob impractical, you can make a little link out of suitably chunky hook-up wire.
AGAIN: If you drip solder all over the inside of something, or make a solder blob so big that it shorts to the casing when you put the thing back together, please do not sue me.
(I suggest you sue Microsoft. They've got more money.)
I have an old 10" car audio subby lying around, not being used any more. I'm building a cheap ghetto-style home theatre setup, and thought it would be great to hook that sub into my system. It's an old "Dominator" sub and amp which I got for very little $ from Autobarn many, many years ago. I was wondering if a computer PSU would be strong enough to power it, or would I need something a bit more beefy? The only other 240-to-12V converters I have handy are an abundance of old UPS units.
As I said, this is a very ghetto setup, so I don't mind being a bit dodgy... as long as it's not a fire hazard.
Most car subwoofers are tuned for maximum mid-bass boom, and will just add a lot of unfocused rumble to a home theatre system. It might be impressive at first, but it'll probably just annoy you after a while. You can calm them down with a bit of equalisation and get something listenable, though.
What power supply you'd need to run the 12V amplifier in your house depends on how powerful the amp is, and how far up you're going to turn it. For low volumes, any old PC PSU would probably work just fine. Note also that cheap car audio equipment commonly has highly exaggerated sticker power ratings; just because it says "5000 WATTS" on the label doesn't mean you need that much power to run it. You'd need to run it with some kind of power meter - the ten-amp range on a cheap multimeter in series with the amp will do for readings up to 120W at 12V - to figure out what it really draws.
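If you want to put numbers on that measurement, the arithmetic is just P = V × I. Here's a quick Python sketch; the ten-amp figure is the typical ceiling of a cheap multimeter's current range, and the example reading is made up:

```python
# Rough power-draw estimate for a 12V car amplifier, measured by putting
# a cheap multimeter's ten-amp current range in series with the amp's
# supply lead. Figures here are illustrative, not real measurements.

SUPPLY_VOLTS = 12.0     # nominal 12V supply
METER_MAX_AMPS = 10.0   # typical cheap-multimeter current-range ceiling

def power_draw_watts(measured_amps: float) -> float:
    """Instantaneous draw: P = V * I."""
    return SUPPLY_VOLTS * measured_amps

# The meter tops out at ten amps, so the most you can measure this way:
ceiling = power_draw_watts(METER_MAX_AMPS)   # 120.0 watts

# A hypothetical real-world reading of 2.5 amps at sensible volume:
example = power_draw_watts(2.5)              # 30.0 watts
```

If the amp's real draw sits well under the PSU's 12-volt rail rating, you're fine; if the meter pegs at ten amps, you've found your answer the other way.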
Note also that you could easily chop the 12V amp out of the circuit and just connect the subwoofer's driver, the physical speaker in the box...
...directly to a normal mains-powered amplifier instead. The driver in the sub will very probably be a four-ohm unit rather than the eight ohms that's normal for home stereo equipment, but that's no problem for modern amplifiers; all you'd need to do is set the sub volume lower than you otherwise would.
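The reason you'd wind the volume down is simple Ohm's-law arithmetic: for a given amplifier output voltage, delivered power is V²/R, so a four-ohm driver pulls twice the power of an eight-ohm one. A quick Python sketch (the ten-volt figure is just an example, not a real amplifier spec):

```python
def power_into_load(volts_rms: float, impedance_ohms: float) -> float:
    """Power delivered into a resistive load: P = V^2 / R."""
    return volts_rms ** 2 / impedance_ohms

# Same amplifier output voltage into the two common speaker impedances:
eight_ohm_watts = power_into_load(10.0, 8.0)  # 12.5 watts
four_ohm_watts = power_into_load(10.0, 4.0)   # 25.0 watts, i.e. double
```

Hence the four-ohm sub plays louder, and works the amplifier harder, at any given volume-knob setting.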
I am about to upgrade my PC at home and I am a bit perplexed about whether spending extra for faster bus speed is going to help performance much. There don't seem to be a lot of useful comparisons of performance around that focus on FSB speed. I could buy the standard "sweet-spot" priced Intel Core 2 Quad with PC6400 DDR2 RAM and that will all run at 800MHz FSB. Or, I could buy the more expensive Q9x CPU with DDR3 and an appropriate motherboard and run at 1333MHz across the board (including the video card as well, so that's elegant). Question is, will the extra "buck" give me any noticeable "bang"?
What do I do with my system? Rendering and editing video would be the most demanding thing, but I have a copy of Crysis here I can't even play on my current system. Given I am running an AMD 32-bit Athlon XP single-core at 200MHz FSB I am guessing any upgrade will bring tears of joy to my eyes, but I am one of those people that upgrades every 3+ years so I want this one to last as long as possible.
Bus speed, by itself, won't make much difference. This has been true for the whole of PC history, back to the days when CPUs first started running at multiples of the RAM speed and people were first trying to get textured polygons onto the screen.
Increasing bus speed is still the usual way you overclock a CPU, but overclockers' motherboards always let you decouple other buses from the CPU bus speed, so even if you intend to overclock, it's not vitally important that you have high-speed-capable RAM. Keeping the CPU and RAM more or less "in sync", overclocked by the same amount, will generally give you more performance, but nothing you'll notice without benchmarking.
If you get the more expensive system with 1333MHz "across the board", by the way, you certainly won't actually be running the PCIe video card at 1333MHz. The CPU and RAM buses will be running that fast (RAM actually 667MHz, but with two transfers per clock), but PCIe's default clock speed is only 100MHz. The clock speed of the graphics processor may be 1333MHz, but that doesn't matter, because it's completely decoupled from the other buses.
The (relatively) low clock speed of PCIe is not a problem. This is because the 100MHz clock is just a metronome-like reference for the much faster actual PCIe transfer hardware, which moves at least 2.5 billion serial data transfers per second, not a mere 100 million, and also because multiple PCIe "lanes" can transfer data in parallel, for anything that's plugged into a PCIe socket that's bigger than the simple "x1" type.
PCIe x1 has only one lane, giving it ceiling bandwidth, for version 1 of the PCIe spec, of a mere 312.5 megabytes per second. (You can't send that much real-world user data per second, but the usually-quoted 250-megabyte-per-second speed is somewhat realistic.) But PCIe x16 - the standard video-card slot for modern PCs - has sixteen lanes, giving tons of bandwidth.
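The lane arithmetic works out like this. PCIe 1.x signals at 2.5 gigatransfers per second per lane, but its 8b/10b encoding means only eight of every ten bits on the wire are payload, which is where the usually-quoted 250MB/s-per-lane figure comes from. A quick Python sketch:

```python
def lane_bandwidth_mb_s(lanes: int) -> float:
    """Usable PCIe 1.x bandwidth for a given lane count, in megabytes/sec."""
    raw_bits_per_sec = lanes * 2.5e9     # 2.5 GT/s line rate per lane
    data_bits_per_sec = raw_bits_per_sec * 8 / 10  # 8b/10b encoding overhead
    return data_bits_per_sec / 8 / 1e6   # eight bits per byte, then to MB

x1_slot = lane_bandwidth_mb_s(1)    # 250.0 MB/s
x16_slot = lane_bandwidth_mb_s(16)  # 4000.0 MB/s
```

Protocol overhead shaves the real-world number down a bit further, but the sixteen-fold difference between slot sizes is the point.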
Many modern motherboards do let you increase the PCIe speed, but some hardware in any real computer - as opposed to a bare-bones overclocking-contest rig - is probably not going to like it.
Any time you're limited by the speed with which data can be pumped to the video card from main memory, performance will probably suck anyway. It's the same as it was back in the AGP days and before; a faster video bus will make performance in this situation less dreadful, but it'll still be dreadful.
Modern video-card-slot bandwidth is much higher than the bandwidth of the built-in RAM on old 3D cards, but modern video cards have proportionally faster onboard RAM, and the 3D software people want to run has increased in size pretty much proportionally as well. So you can run an old 3D game like Quake II at lightning speed on all sorts of cheap entry-level hardware today; a modern version of those cut-rate video adapters that have no RAM of their own at all and just share system RAM will be more than fast enough. But if you want to run Crysis or Fallout 3, and your video card doesn't have enough RAM and has to shunt data from main system memory all the time, you'll be in the same situation as someone trying to play Quake II, in 1998, at 1024 by 768, on a Riva 128.
Bulk data transfer to (or even from) the video card should not occur while you're doing something where frame-rate matters. If it does, you can only solve the problem by getting a video card with more memory, or just winding down the prettiness in your games until the thrashing ceases.
When this column was originally published in Atomic magazine in 2008, I told Mark that I'd definitely go with the DDR2 system if I were him, because DDR2 RAM was incredibly cheap but DDR3 was still fairly expensive, though not nearly as ludicrously expensive as it had been a year earlier. DDR2 has gotten even cheaper in the meantime, but DDR3 prices have dropped a larger proportional amount, so it's now arguable that an Intel Core 2 or AMD Socket AM3 system with DDR3 RAM actually will give you reasonable value for money. And if you drop the few hundred more dollars to get a Core i7 system, you have to use DDR3.
Low-end Core 2 Duos and Quads have always been great little overclockers, though, so even if you get a DDR2 system, you'll probably be able to wind the CPU up to the sticker speed of a more expensive DDR3 system, and only lose out for tasks that lean heavily on system RAM speed.
If you're doing scientific computation or other stuff that really hits the memory hard, then extra RAM speed is worth the money. Video editing generally doesn't qualify here, because any time it's thrashing the RAM it's probably also thrashing the CPU and/or hard drive, and the multi-bottleneck design of the modern PC means that lengthening just the RAM stave of the barrel will not let it hold any more rendering-speed water.
I've had a Logitech MX Duo (the MX700 mouse with matching keyboard) for a very long time now. Of course, the mouse is the much better part of the pair, and I like it so much that I am seriously considering replacing the microswitches in the one that I already wore out.
The keyboard is not so great, but I have fallen in love with having a volume control and mute button on it. Since I'm not really interested in the gigantic-delete-button motif seen on most higher-end keyboards being sold today, I'll probably be looking at some sort of buckling-spring model. Which means giving up my media controls - unless an external add-on (similar to a numeric pad) exists.
There are plenty of USB controller doodads meant for audio and video producers, but they're huge overkill for this application. I think you might be pretty happy, though, with plain old hotkeys for play/pause/next/last functions (Winamp, for instance, defaults to using Ctrl-Alt plus the Insert/Delete block of keys for these functions; you just turn on "Global Hotkeys" to make them work), plus a Griffin "PowerMate".
I've never played with a PowerMate, but they're apparently very pleasant to use, not to mention rather snappy-looking.
(Several hackers have whipped up their own PowerMate-alike from mouse-guts and other salvaged gear. A VCR head drum has the weight and slick bearings of the Griffin product, but there are many other possibilities.)