Atomic I/O letters column #87
Originally published 2008, in Atomic: Maximum Power Computing
Last modified 16-Jan-2015.
Home audio receivers often let you switch between speaker sets - "A" or "B" - for speakers in, say, your office and on your deck.
I have my monitor on an arm, so I can use it at my computer in my office, or extend the arm through the bedroom door and use it as a TV while in bed. I have a Logitech X540 speaker system in my office ("A"). I want to put another X540 system in my bedroom, and switch to it ("B") when I'm using the computer from bed.
Is there a reasonable way to make this happen?
There's no cheap and elegant way to do this, but it can be done, probably for only about twenty bucks plus cables. Just switch the analogue audio, all 5.1 channels of it.
$10 A/V switchboxes are all over eBay and the discount stores - they're often called something like "Video Game Selector Switch", because they're meant to let you connect two or three analogue audio/video sources to the single set of connectors on a TV. Heck, here's one for $US5.63 including delivery to anywhere in the world. I'm sure it's every bit as excellent as you'd expect for that kind of money.
The simplest ones, like the above suspiciously-cheap example, just have sets of three RCA sockets for hookups - two for audio, one for composite video. Fancier models (we're still talking only ten bucks per) have S-Video sockets, too.
These selectors are very simple inside, with no impedance-matching transformers or anything; just a sliding switch with two or three throws and a buttload of poles, connecting one set of inputs at a time to the single output.
Because of this simplicity, you can easily use such a selector backwards, to send one source to one of two or three destinations depending on the switch setting. There's your audio destination selector, right there.
There are some resistors in the selector I've got here which suggest to me that trying to hack audio through the S-Video connectors isn't a good idea, though. So that leaves you with three channels to play with, because using the composite video connectors for audio should be A-OK.
If all you want to switch is stereo audio, possibly plus a single subwoofer channel, a single one of these switches should do just fine. If you want 5.1 channels, just buy two switches, and distribute your audio wires between them. Now you have to flick two cheesy plastic switches to get all of your audio going to the second destination, but that's no big deal.
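If you do go the two-box route, which channels go to which box is arbitrary, as long as you keep track. As a sketch, here's one hypothetical wiring plan in Python (the box names and socket-to-channel assignments are just my invention, not anything printed on the hardware):

```python
# Hypothetical plan for spreading all six line-level channels of a 5.1
# system across two cheap three-RCA A/V selector boxes. The yellow
# composite-video socket carries line-level audio perfectly well.
wiring = {
    "box 1": {"white": "front left", "red": "front right", "yellow": "centre"},
    "box 2": {"white": "rear left",  "red": "rear right",  "yellow": "subwoofer"},
}

for box, sockets in wiring.items():
    for colour, channel in sockets.items():
        print(f"{box}: {colour} socket -> {channel}")
```

Flick both boxes' switches together and all six channels follow you to the bedroom.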
For cabling, unless you got a selector switch with 1/8th inch stereo plug inputs and outputs, you'd need adapters to turn those computer-standard connectors into RCAs and then back into 1/8th inch plugs. All you ought to need, though, are three standard 1/8th-inch-to-twin-RCA-plug leads for each of the inputs and outputs on the selector.
Note that cheap speaker switchboxes also exist, for hooking up more than one set of speakers to an amplifier with only one set of speaker connectors; see this one and this one, for instance. They're a good solution to the version of this problem where you're happy to use one normal hi-fi amplifier, but they're not made for line-level switching.
Oh, and you can get cheap switchboxes for three-wire component video too; here's one with three inputs that costs less than $US14 delivered. Don't expect to be able to switch "VGA" RGB through a cheap box like this, but you probably actually could use it with no trouble at all for five-channel audio switching.
I have a question related to quad-core CPUs.
How does the operating system see this processor? As two CPUs (since there are two dies), or as four CPUs (one per core)?
What about hyper-threading? My Core 2 Duo doesn't seem to support HT, so I guess the C2Q doesn't either.
I'm more concerned about Windows XP Pro, as it seems the limit for that is 2 physical CPUs.
You'll be pleased to learn that Microsoft's artificial processor-number licensing limits apply per individual physical CPU package, not per core.
So if you've got a dual-core CPU, that's one "processor", as far as Windows licensing goes.
This policy started out when the first Hyper-Threaded P4s turned up. They looked like two CPUs to the OS but really weren't at all, and it would have been ludicrous to insist on a dual-processor license to use one.
Modern Intel CPUs aren't Hyper-Threaded any more, mainly because the Pentium M core that begat the Core series didn't support it. HT will apparently be coming back in the almost-out-now Nehalem chips, though, making four-core chips look like eight, and eight-core chips look like 16.
Back in the present day, even though Core 2 Quad CPUs have two whole physical dual-core CPUs on the silicon inside them, they still only count as one processor from Microsoft's point of view. So you'll still be fine even if you have WinXP Home, which only supports a single "processor".
(If you decide to boot Windows 95/98/ME for some reason, it will of course only use one core of your CPU. Only the Windows NT lineage supports multiple CPUs.)
Microsoft have a somewhat dated page on this subject here.
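If you're curious what your operating system actually sees, most languages can tell you the logical processor count; here's a quick check in Python (which I'm using purely for illustration):

```python
import os

# os.cpu_count() reports logical processors: one per core, doubled
# again on Hyper-Threaded chips. Microsoft's licensing limits count
# physical CPU packages ("sockets"), not this number.
logical_cpus = os.cpu_count()
print(f"Logical processors visible to the OS: {logical_cpus}")
```

A Core 2 Quad will show four here, but it's still one "processor" to the licence police.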
I have several old APC SmartUPS UPSes in my garage attic that are no longer useful because their batteries are shot, and APC no longer sells direct replacements. These old SmartUPS devices have pretty good chargers and inverters in them. So I'm thinking about just using my own replacement batteries.
Is there any reasonable limit on how much battery capacity I can insert into one of these devices? Could I put in batteries with twice the original amp-hour capacity? Five times? Ten times?
I'm not so much worried about destroying the UPSes - as I mentioned, they're just collecting dust (and probably mouse droppings), anyway. I'm worried about sinking a bunch of money into deep-cycle lead-acid batteries for this project, only to learn that it won't work like I hope.
It's possible for a large battery capacity to over-strain some charger designs, or trip a sanity-check circuit - if a charger notices that it's put ten times as much energy into its battery as it thinks the battery should be able to accept, it may decide to stop, in case this indicates a house fire in the making.
But realistically, I don't think either of these problems is likely to arise with consumer UPSes. Their charge circuits are current-limited, so they may take a long time to charge a giant battery bank but won't have any other problems. And they're not smart enough to complain about strangely lengthy charge times.
The whole idea of a big-ass battery bank is that only a once-in-50-years blackout should be able to completely drain it, so expensive deep-cycle batteries are not necessarily required. I suggest you buy the finest, cheapest, off-brand car batteries your local big-box store has to offer, and build your first battery bank out of those. That's what I did. My two batteries have now gone a couple of years with no maintenance; their capacity's not what it was, I think, but they can still run a few hundred volt-amps for as long as the power is likely to be out.
It's difficult to put an exact figure on the capacity, for UPS purposes, of non-deep-cycle batteries, as I explain in that UPS Upgrade piece. But you should be able to count on an easy ten repeatable amp-hours from even the crappiest of small $40 12V car batteries. 20Ah is not out of the question from cheap batteries, provided the charger clicks on pretty soon after that much discharge to get 'em back out of the sulfation danger zone.
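As a rough sanity check before you buy batteries, you can estimate runtime from capacity and load. This little calculation assumes the load's volt-amps are close to its watts, and guesses at 80% inverter efficiency, so treat the result as ballpark only:

```python
def runtime_hours(capacity_ah, battery_volts, load_va, inverter_efficiency=0.8):
    """Rough UPS runtime: usable stored energy divided by the load's draw."""
    energy_wh = capacity_ah * battery_volts           # stored energy, watt-hours
    return energy_wh * inverter_efficiency / load_va  # hours at this load

# Two cheap 12V car batteries in parallel, 10Ah of repeatable
# capacity each, feeding a 200VA load:
print(round(runtime_hours(capacity_ah=20, battery_volts=12, load_va=200), 2))
```

That works out to just under an hour, which covers the vast majority of blackouts.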
(On the subject of which, I think "desulfator" gadgets do actually work, and allow you to revive a lead-acid battery that's been over-discharged and sulfated into uselessness. I've not yet actually demonstrated this by personal experiment, though.)
The relatively gentle charging and discharging that car batteries will get from a domestic UPS also means that they're likely to behave themselves very nicely, and not bubble sulfuric acid out all over your floor or anything. My two dirt-cheap batteries haven't missed a beat.
I was perusing a random messageboard and saw a Google ad for "Killer NICs".
A two to three hundred dollar "gaming" network card.
Sounded kind of ridiculous to me to start off with, but then I got to reading the details...
I knew some NICs load the CPU more than others, depending on how they work, but that was about the extent of my knowledge.
Are these things worth the dosh? Ever heard of them?
The Killer NICs have been around for more than two years now, and both the original M1 model and the cheaper, less ridiculous (no giant pointy Klingon heat sink) K1 do, more or less, what it says on the tin.
They're probably not actually worth buying, though.
The Killer NICs aren't traditional network cards. They're a whole very specialised computer on an expansion card, which really does offload pretty much all network-card duties, provided a driver exists for your operating system.
The onboard wired network interfaces that most people use are, in contrast, primarily software-based; the driver, and thus the CPU, does all of the heavy lifting. This approach is a good way to make cheap hardware, which is why it's been used for printers, modems, RAID cards and all sorts of other stuff for many years now. But it's obviously better to offload that work if you're doing something that's CPU-limited and need more cycles for the primary task.
The question is whether your game actually is CPU-limited enough that the relatively small load of the network interface makes a difference.
With the current drivers, the Killer NICs apparently work very well, but you can still only expect a frame rate and ping improvement of a few per cent, and you'll only get that when your CPU's the limiting factor - which it won't necessarily be.
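You can put a ceiling on that improvement with a quick Amdahl's-law-style estimate. If the network stack costs, say, three per cent of your CPU time (a figure I've picked purely for illustration), then offloading all of it can't buy more than about a three per cent frame rate gain, and then only when the game is completely CPU-bound:

```python
def max_speedup(offloaded_fraction):
    """Amdahl-style ceiling: if networking eats this fraction of CPU time,
    offloading all of it speeds a fully CPU-limited game by at most this factor."""
    return 1.0 / (1.0 - offloaded_fraction)

# Hypothetical 3% of CPU time spent on networking:
gain_percent = (max_speedup(0.03) - 1) * 100
print(f"Best-case frame rate gain: {gain_percent:.1f}%")
```

And the moment the graphics card becomes the bottleneck instead, the gain drops toward zero.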
As far as I know, everybody's only compared the Killer NICs with cheap software-assisted NICs, too. There are plenty of server-class network cards that at least approach the Killer NIC design, but cost rather less. I bet that if you put one of those in your game box you'd get benchmarks pretty darn close to a Killer NIC.
The Killer NICs have other features - a firewall, a freakin' BitTorrent client and so on - but you can get all of that stuff from cheap outboard hardware that'll do it just as well, and not require your computer to be powered up for it to work.
So feel free to spend $200 or more on a Killer NIC if you like, but I can't help but think that putting that money toward your next CPU and/or graphics card upgrade would give you a lot more return.