Dan's Data letters #107
Publication date: 28 May 2004.
Last modified 03-Dec-2011.
I've read your article about a DIY UPS system with long battery life. My question is, what would be suitable as either a backup battery power supply for a router, or a UPS for a router? Ideally, a UPS function (assuming that means the backup power kicks in when mains power goes down) would be nice, but the emphasis would be on battery life (maybe five or so hours).
Your DIY UPS seems like overkill, particularly when the router runs off a powerpack/transformer which outputs 12V 1.5A DC; the router only uses about eight watts.
I'm thinking that something that could be installed between the plugpack and the router, rather than at the mains power circuit, could be an idea, but I'm not sure how to get this happening. A commercial UPS would also be fine, but what type of VA capacity would be needed to run an 8 watt router (plus whatever the powerpack uses) for any significant period?
(If you're wondering, I've got a laptop which will last about five hours on batteries, but the ADSL router is useless without power.)
The very weediest of computer UPSes would do the router-powering job well, and probably for a long time, too; a small UPS running from a single 7Ah sealed lead acid brick battery could probably run an eight watt load for eight hours, even after you take inverter and plugpack losses into account.
Something like this, for instance.
I don't know what that thing's battery capacity is, but looking at its specs suggests it'd run your router for at least a few hours.
If you want at least five hours, a slightly beefier UPS would do the job. Still not hideously expensive.
The volt-amp (VA) capacity of the UPS is irrelevant, here; that just tells you how much power the UPS's inverter can deliver, which will always be much more than your humble router needs. The UPS battery capacity is what matters, for small loads that you want to be powered for hours.
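To put rough numbers on the runtime claim above, here's a back-of-envelope sketch. The battery figures match the 12V 7Ah brick mentioned earlier; the 70% combined efficiency for the inverter plus plugpack is an assumption, not a measured spec for any particular UPS:

```python
# Back-of-envelope UPS runtime estimate for a small constant load.
battery_voltage = 12.0   # volts
battery_capacity = 7.0   # amp-hours
efficiency = 0.7         # assumed inverter + plugpack losses
load_watts = 8.0         # the router

energy_wh = battery_voltage * battery_capacity  # 84 Wh stored
usable_wh = energy_wh * efficiency              # ~59 Wh delivered
runtime_hours = usable_wh / load_watts

print(f"Estimated runtime: {runtime_hours:.1f} hours")
```

That lands in the seven-to-eight-hour ballpark, though real lead acid batteries deliver less than their rated capacity at light loads over long discharges are kinder to them than short heavy ones, so treat it as an estimate either way.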
The advantage of using a UPS, besides simplicity of setup, is that you could hang a powerboard off it and run multiple other low-current mains-powered devices that you'd like to ride out a blackout. VCR, clock, little stereo, cordless phone base station, et cetera.
Alternatively, as you say, you could use a dedicated low voltage solution. The problem with this is that any battery you just put in parallel with the plugpack output could, possibly, interact badly with the plugpack when the power's out (though you're probably OK there) and, more importantly, probably wouldn't get charged properly by the plugpack. A lead acid battery will become unhappy quite quickly if it's not float charged properly, and ten NiMH or NiCd cells may add up to a quite clean 12 volts, but will suck a ton of current when not fully charged and just hooked straight up to a 12V plugpack, and probably blow a fuse in it.
A 12V sealed lead acid battery and a smart float charger would probably be adequate, though. The main problems I think you could face here are the router not liking the higher-than-12V output of the charger (its output will peak at around 14V), or the charger freaking out when the router's sucking juice in parallel with the battery.
Frankly, all of this looks like too much hassle to me. Just get a basic UPS, plug the plugpack into it, and you're done.
I have a friend of a friend of a friend who has an unused Austar dish attached to the roof of their house, and who doesn't really want it there. I plan to remove it for them, and take it away free of charge. So now I find myself in possession of a fine ol' Austar satellite dish.
What sort of stuff can I use this for?
(Can I place a microwave oven element at the focus and toast a nearby city? Are they any good for wireless networking, or mobile phone transception?)
I advise against fooling with microwave magnetrons, unless you like the idea of ending your life as an amusing statistic.
DVI CRT TLAs
I am thinking of buying a used 21 inch CRT monitor on eBay. I have noticed that some newer models have both VGA and DVI inputs or some (like IBM P275) even have dual DVI input. I know that DVI input gives much better picture quality on an LCD, but can a CRT take advantage of digital input, or does it just take the analog component as DVI-to-VGA adapters do?
Even if a CRT can take advantage of DVI input, is it as noticeable as with an LCD? Is it worthwhile for me to pay for a DVI-enabled CRT?
DVI doesn't, necessarily, even look better than VGA on LCD screens. There can be a difference, yes, even between DVI and VGA from a good video card over a high quality cable, but usually VGA over an ordinary cable from an ordinary video card looks very nearly as good as DVI. You'd think the very inefficient process involved (digital in the video card RAM to analogue at the VGA output to digital again inside the LCD screen...) would pollute the image more than it does. But, generally, it looks fine. If the timing's right, which it ought to always be with a modern auto-sync LCD, everything should look great.
(The above reference to "a high quality cable", by the way, pretty much just means high quality plugs. HD15 VGA connectors today are generally asked to operate at way higher frequencies than the connector was originally intended to pass. A half-decent three-BNC-plug lead, if your monitor has sockets for it, can improve the image noticeably at high resolutions and refresh rates compared with HD15. Interestingly, the DVI-I connector with its extra analogue pins is a superior analogue video connector to the old HD15. You'll very probably have to use a DVI-to-VGA plug adapter to connect a monitor to a DVI-I socket on the back of your video card, though, and that puts at least one HD15 back in the cable chain, so the advantage is lost.)
CRTs with DVI inputs that actually accept digital input - which I think some of them do - are doing digital-to-analogue conversion inside the monitor, instead of on the video card. This has the potential to give superior image quality (the later in the signal path the D-to-A conversion happens, the better), but you're still at the mercy of the D-to-A converter quality. Just as different video cards have different VGA output quality, different digital-DVI-equipped CRTs will no doubt give different image quality, all other things being equal.
All other things won't be equal, of course; different kinds of CRT are likely to make a much bigger difference to the image quality than will different D-to-A converters.
So, on a set of headphones I can measure the ohms by attaching a multimeter to the ground and either of the other sections of the 1/8th inch plug, right? Does a higher value necessarily mean better output (assuming a good amp to drive it)?
Yes, probing a headphone plug as you describe (the ground contact is the biggest one, farthest from the point of the plug) will give you the DC resistance of one side of the headphones. AC impedance can vary dramatically depending on the input signal frequency, but the DC resistance does give you a decent idea of the relative impedance of different sets of headphones.
Higher impedance doesn't necessarily mean better headphones, though. Many high quality headphones do have pretty high impedance, which means they probably sound quieter for a given voltage input than lower impedance 'phones, but that's not necessarily so - distance of the transducers from the ears makes a big difference, and higher impedance doesn't imply lower efficiency either. Impedance-to-sound-quality isn't a fixed relationship, either. High-impedance 'phones are harder for the usual built-into-a-component kind of headphone drivers to push to more than moderate volume, but you can generally get decent volume out of pretty much any combination of driving device and headphones.
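To illustrate why higher impedance usually means quieter for a given source voltage, here's the basic P = V²/R arithmetic. The 32 ohm and 300 ohm figures are just illustrative impedances, not specs for any particular headphones:

```python
# Power delivered into headphones from a fixed-voltage source: P = V^2 / R.
v_rms = 1.0  # volts RMS at the headphone output (illustrative)

power_mw = {z: (v_rms ** 2 / z) * 1000 for z in (32, 300)}
for z, p in power_mw.items():
    print(f"{z} ohm: {p:.1f} mW")
```

The 32 ohm 'phones get nearly ten times the power of the 300 ohm ones from the same voltage, which is why high impedance 'phones need a source with more voltage swing; whether they end up louder or quieter also depends on their efficiency, as above.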
You wouldn't happen to know of a nice little metal (preferably) LED flashlight that runs off 1 or 2 AAs and has a pushbutton tailcap switch? I've looked all over and seen none. All the flashlights I like seem to run off all sorts of lithium cells.
Alternatively, you could kit out a Mini Maglite with an LED lamp and a tailcap switch.
Here's an example of the kind of switch I mean; they're apparently quite widely available. I don't know whether this particular one is any good; it was just an early Google hit.
I bought a "new" battery from eBay for an older, but perfectly adequate, notebook, a Compaq Armada M300. The M300 takes either a "skinny" battery, or a longer life "fat" battery. The one I got was the fat version, which looks like this.
Seems that HP no longer has knowledge of the original part numbers, and this is what they offer as an equivalent.
As was to be expected, the battery life from the new pack is not as good as "new". But the price was very cheap, and the life is somewhat more than the "skinny" version. The skinny version specs: 14.4v, 1.96 amp-hour. Fat specs: 14.4v, 2.7Ah. The skinny version looks like it contains something sized like an AA cell, the fat version looks more like C sized diameter.
That being said, if I destroy the one that I just got, I won't shed any tears. Is it possible for me to wire in new, fresh cells? I can solder well, although an EE I ain't.
A quick look on Google shows that cylindrical lithium ion cells are not easily found, and given the shapes of the battery packs (tubular), that's what's inside. A pic I found confirms this.
So: Can replacement cells be found for sale? Can I create an even longer life pack? I'd be happy getting the fat pack back to where it was when truly new.
Yes, you probably can re-cell the old pack, but I wouldn't try it. Lithium ion cells are touchy about charging, which is part of the reason why there's so much circuitry on the side of that dissected pack in the scan you pointed out. You can't just swap some other higher capacity LiI cells into a pack and be confident that they'll work properly.
The worst that's likely to happen if you do this sort of thing with NiCd or NiMH cells is that a "smart" charger won't know that the pack capacity's increased and will assume that it's taking longer to charge because Something Is Terribly Wrong, and therefore leave you with a less than complete charge. With lithium ion, all bets are off; if the charger delivers more current than they expect, or charges them too long, they can and will die spectacularly, possibly destroying your laptop, luggage, desk or lap in the process.
You can run into similar problems with NiCds and NiMHs, if you replace super-high-current-capacity cells (like the ones used in radio control applications) with higher capacity cells that can't handle the same drain and charge current, but even then you're very unlikely to see a fire or explosion. Unhappy LiI cells like to share the pain.
But yes, you can find replacement LiI cells here and there. There's a niche market for LiI and lithium polymer (I talk about the two a bit in the "Battery stats" letter here) cells among advanced R/C hobbyists (they're popular among indoor radio controlled plane builders) and robotics folk. They use specialised chargers, though, which your laptop doesn't.
See CheapBatteryPacks.com for an example of this kind of dealer. They have an impressive warning page that you have to pass to get to the loose lithium polymer cells. Most dealers, like Batteries America, only sell built packs of larger cells.
If you can get functionally equivalent cells to the fat pack's original ones - which you may be able to do, though as you say they're hardly easy to find - then you should indeed be able to swap them in and be fine. Do not accept anything that just looks the right size, though.
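For what it's worth, comparing the two packs you mention by stored energy rather than amp-hours alone (using the spec figures from your letter) shows what a re-cell could plausibly buy you:

```python
# Watt-hours = volts * amp-hours, using the reader's quoted pack specs.
skinny_wh = 14.4 * 1.96  # "skinny" pack
fat_wh = 14.4 * 2.7      # "fat" pack
gain_percent = (fat_wh / skinny_wh - 1) * 100

print(f"Skinny: {skinny_wh:.1f} Wh, fat: {fat_wh:.1f} Wh")
print(f"Fat pack stores about {gain_percent:.0f}% more energy")
```

So the fat pack's advantage when healthy is a bit under 40%; genuinely higher capacity replacement cells could beat that, but only with the charging caveats above.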
I'm currently engaged in a university project which aims to make use of LEDs for medical purposes. All well and fine so far, however, the LEDs have to be white (or blue, to be used in conjunction with a phosphor of some sort), no larger than about 2cm in diameter, and (this is the tricky one) capable of producing an optical output of 300mW.
I've been able to find some appropriate power specs for Lumileds' blue models, and with some help from their customer service department a few in the white, but in general elsewhere, all white LEDs appear to be rated in photometric units such as millicandela. The manufacturers' failure to mention useful things such as the active area of the device so that a value for luminance can be obtained is proving to be a little bit frustrating. I was wondering if you knew of some resource that gives enough data to calculate output in radiometric units for various diodes. Emission spectra would be a pure bonus!
I can't really help you, but Lumileds sprinkle a few radiometric stats in among the luminous flux specs on their Luxeon datasheets.
The Luxeon Star datasheet, for instance, reports the royal blue Batwing and Lambertian pattern Stars as outputting a typical radiometric power of 150mW (and points to a design guide, I think this one, which you may or may not find enlightening).
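When a manufacturer only quotes photometric numbers, you can get a rough radiometric figure for a near-monochromatic LED from the luminous efficacy relation: luminous flux = 683 lm/W × V(λ) × radiant power, where V(λ) is the photopic sensitivity at the emission wavelength. A sketch, with approximate V(λ) values and illustrative (not datasheet) lumen figures:

```python
# Rough luminous-to-radiometric conversion for near-monochromatic LEDs.
# V(lambda) values are approximate photopic sensitivity figures.
V_LAMBDA = {455: 0.048, 470: 0.091, 555: 1.0}

def radiant_watts(lumens, wavelength_nm):
    """Estimate radiant power in watts from luminous flux in lumens."""
    return lumens / (683 * V_LAMBDA[wavelength_nm])

# Ten lumens of royal blue is far more radiant power than ten lumens
# of green, because the eye is so insensitive to deep blue:
print(f"{radiant_watts(10, 455) * 1000:.0f} mW at 455 nm")
print(f"{radiant_watts(10, 555) * 1000:.0f} mW at 555 nm")
```

This doesn't work for white LEDs, whose broad spectrum needs a full spectral integration, which is exactly why the millicandela-only ratings are so unhelpful for this project.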
I don't know how much of a pure-colour Luxeon emitter's output would be eaten by the phosphor coating over the blue die in the white emitters, though.
There are radiometric stats on their page for the "Dental Blue" Luxeons, too, which I presume you've seen.
It looks as if the LXHL-LRD5 five watt Dental Blue unit will suit you - but you'll want more than a two centimetre wide heat sink on the back of that sucker. Perhaps an application for a small water block, and the mouth-washing hose...
There Can Be Only One
You may already be familiar with Livermore's famous fire station light bulb, but I was not... until this weekend. Needless to say, 4W isn't going to blind any of Livermore's finest, but I think it's amazing it's still around and kicking.
The Livermore bulb is, as you might expect, something of an electrical celebrity.
As you say, though, four watts ain't much. Particularly not four watts of carbon filament light.
This particular bulb is indeed remarkably long-lived; carbon filaments are much more fragile than tungsten ones, and it's unusual for a carbon bulb that old to be serviceable even if it's never been plugged in.
But it's not actually all that difficult to make a light bulb that'll burn for a very long time. We could certainly mass produce "200 year bulbs" today, and sell them for about the same price as regular bulbs. The secret is the efficiency/lifespan trade-off; make a bulb with a long, thick, inefficient filament and you've made a bulb that'll last a long time. That's pretty much all there is to it.
The four watt Livermore bulb probably only emits around 15 lumens of light, versus something in the vicinity of 60 lumens for a four watt bulb manufactured with modern technology. That's still quite atrocious lumens per watt by the standards of fluorescent lights, but modern cheap tungsten/nitrogen light bulbs nevertheless beat the crud out of every carbon filament bulb that ever there was.
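Putting those estimates side by side as luminous efficacy (the carbon and tungsten figures come from the numbers above; the fluorescent figure is an assumed ballpark, not a spec):

```python
# Rough luminous efficacy (lumens per watt) comparison.
sources = {
    "carbon filament (Livermore bulb)": (15, 4),  # lumens, watts
    "modern 4W tungsten bulb": (60, 4),
    "compact fluorescent (assumed)": (240, 4),    # roughly 60 lm/W
}
efficacy = {name: lm / w for name, (lm, w) in sources.items()}
for name, lm_per_w in efficacy.items():
    print(f"{name}: {lm_per_w:.2f} lm/W")
```

A factor of four from carbon to tungsten, and another factor of four or so from tungsten to fluorescent; longevity and efficiency pull in opposite directions for filament bulbs, which is the whole trick behind the Livermore bulb's endurance.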