Dan's Data letters #74
Publication date: 17 November 2003.
Last modified 15-Aug-2012.
I recently got hold of a TFT monitor with a scratched screen. However, we worked out that it was just the cover (polarising filter?) on the front of the monitor that was scratched. Having removed the polarising filter from similar models before, we know that they are relatively easy to peel off. However, with the filter off, the display appears pure white. Bah.
What is the proper name for this thin sheet (is it polarising filter?) and have you any idea what sort of place would sell them? I live in the UK, but any company worldwide would be fine.
Some LCDs have a protective glass or plastic front element, some don't. Many have some kind of one-piece sandwich. If the screen turns white when you remove the front element, then yeah, there's definitely a polariser in there.
What you need is going to just be called a "polariser" (or "polarizer", for more search hits...). Little bits of polarising film can be had from all sorts of optics/surplus/educational outlets quite cheaply, but you want a good-sized sheet of the stuff.
Shortly after this page went up, a reader suggested a lateral-thinking solution - polarised sunglasses! This would have the added bonus that nobody not wearing the magic shades could see what you were doing.
Regrettably, this is not as great an idea as it sounds, because polarised sunglasses contain a vertical polariser - their aim in life is to block the horizontally polarised light that forms most "glare".
Computer LCDs, though, use a 45-degree-angle front polariser.
So if you wear polarised shades and look at an LCD with your head level, you'll see the screen pretty much as normal, but rather darker than you'd expect from the apparent darkness of the glasses. Tilt your head one way and the screen will look brighter; that's the polariser alignment you'd need to make the LCD viewable if it was missing its front element. Tilt your head the other way and the screen will go black.
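The brightness changes you see as you tilt your head follow Malus's law: light passing through two polarisers is attenuated by the square of the cosine of the angle between their axes. A minimal sketch of the numbers, assuming a vertical sunglasses polariser and a 45-degree LCD front polariser as described above:

```python
import math

def transmission(angle_between_degrees):
    """Malus's law: fraction of polarised light passed by a second
    polariser at the given angle to the first."""
    return math.cos(math.radians(angle_between_degrees)) ** 2

# Head level: sunglasses (vertical) vs LCD polariser (45 degrees apart).
level_head = transmission(45)   # 0.5 -- screen visible, but darker

# Tilt one way: axes align, maximum brightness.
tilt_toward = transmission(0)   # 1.0

# Tilt the other way: axes crossed, screen goes black.
tilt_away = transmission(90)    # effectively 0.0
```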
If you made glasses with a correctly-angled polariser in them - cardboard glasses, as used for "3D specs" or "fireworks" glasses would work fine, at least for demonstration purposes - then you actually could make a monitor that (almost) only you could read. Some well-connected security contractor probably has a turnkey version of this for ninety thousand dollars per unit right now.
After many years of complaining that my sci-fi ornaments clash with the traditional Christmas theme on our family Christmas Tree, my wife finally bought me my own artificial tree. I have practically all of the Star Trek and Star Wars ornaments released by Hallmark, all of which plug into a string of standard lights so they can light up and/or make sounds as appropriate.
I want to make the tree look tasteful, but cool. I want to build on the sci-fi theme in some way, such as by adding cold cathode fluorescent lamps or bright LEDs, but all the ones I've seen are made to plug into car or computer electrical systems - usually they don't just plug into the wall. Whatever I do needs to be able to plug into a standard US wall outlet.
My electrical/electronic skills are minimal, so anything that requires an understanding of resistances, capacitors, or ohms is likely to burn my house down.
Any ideas, or am I out of luck?
If you buy gear that's meant to decorate a PC, with handy-dandy four-pin PC power sockets, you can just run it from a PC power supply. Since most of these devices have passthrough power connectors, you won't necessarily even need extension or Y-adapter leads. The only trick is getting the PSU to turn on when it's not connected to a motherboard.
Old AT power supplies have a switch on the end of a wire that turns 'em on; one of them would be fine, but they're not very easy to find these days. The PSU in an abandoned 386 box would probably suit you nicely, though a PSU that old won't necessarily still work. Suitable machines used to be easy to find in charity shops, and by the side of the road on council cleanup days, but time has now marched on.
Modern ATX power supplies turn on when pin 14 on the ATX connector (the only green wire) is connected to ground (any black wire). A paper clip is adequate to do this:
...or, for a more permanent but soldering-free solution, you could just cut and strip the green wire and a black one, twist them together, and insulate the join with tape. This sort of loose twist connection is electrically lousy, but there's no current to speak of flowing here and only low voltage, so it's no big deal to do this in the time-honoured manner of The Incompetent Auto Electrician.
Even the dodgiest of "235 watt" no-name PSUs ought to be more than adequate to run lots of PC lights.
I have a 17 inch CTX something or other monitor, but I need more screen space! My 64Mb Radeon 9000 Pro has a single VGA output, a TV-out and a DVI-out. I've been trying to discover whether a cable exists to adapt the DVI to VGA so I can use one of my spare monitors for that little extra Photoshopping/C-programming-with-help-windows space. I found one Web site that is totally confusing in its description of such an adapter, one minute claiming that it'll let me use my analogue VGA monitor from a DVI output, but then halfway down the page insisting that it's only suitable for use with "TFT/LCD" monitors.
If I've worked it out right, the genders of the cable on that site are reversed, which makes it DVI monitor into VGA socket.
Can you clear up some of this mess?
There are two kinds of DVI connector - DVI-I and DVI-D. Both have a grid of 24 pin-holes for the digital signal, but DVI-I also has extra pin-holes on the right hand side of the connector that carry a normal RGB signal. Most, if not all, consumer DVI-connectored video cards have DVI-I connectors, which means they can be used with DVI-to-VGA adapters. Those adapters are just pin-converters; all they do is connect the RGB signal pins to a normal VGA socket.
You can see the kind of connector I'm talking about in my old review here. Note that some Radeon 9000s, I'm told, only have a DVI-D connector; you can tell by looking, though:
DVI-D doesn't have the same collection of extra holes on the right hand side.
I think the best explanation for the product description on the site you mention is that someone there just didn't really know what they were talking about. To my knowledge, all DVI-to-VGA adapters are the same electrically, whether they're a solid plug adapter or a short cable physically.
That page also talks about DVI-A, which is a version of the DVI-I connector where only the analogue-signal pins are connected to anything. This sounds pretty bizarre to me, but I suppose something somewhere might use it.
For VGA-plug-adapter purposes, DVI-A and DVI-I should be exactly the same.
There's also no gender reversal going on, despite the basic adage of plug adapters which states that No Matter What It Is, It's The Wrong Gender.
Both VGA and DVI monitor cables terminate in a male (pins) plug; both VGA and DVI graphics card connectors are female (pin sockets). The adapter you mention, like other DVI-to-VGA adapters, plugs into a DVI-I (or, I suppose, -A) socket, and gives you a VGA socket, to which you can connect your monitor.
I have a Windows NT 4.0 file server, running pretty new hardware (P4 2.4C, 2x512Mb DDR400 and all that normal jazz), but it has one thing odd about it.
This box has a cheap Adaptec IDE controller with 4 CD-ROM drives attached. This is used because the office receives 3-4 replacement CDs a month with updated tax data. These CDs are required for some installed software to work. The problem is that any time you open up File Manager or do anything on the server that makes it read the drives, the CD-ROMs all start accessing at once. This blinking CD-ROM access light Christmas tree effect is pretty annoying; I was wondering if you had a way to stop it?
I'd do it, and speed up access as well, by dumping the CDs to disk files and using a CD-ROM-drive-emulating package to make Windows think the image files are drives. Instant ultra-fast "CD" access is then yours, provided of course that you have enough spare hard disk space.
Assuming the discs aren't copy protected in some way that makes image file extraction difficult (only fun software generally has such limitations...), this is easy to do for free. Use the software of your choice to make ISO files from the CDs, and then set those files up as "drives" using something like DAEMON Tools. DAEMON Tools supports up to four "drives", neatly matching the number of real drives you've got now.
Many Windows CD-burning packages can make ISO files, and so can other utilities like WinISO, but Nero can't; it can only make its own special "NRG" image format, which you can convert to ISO with some other program whose name escapes me now.
(A reader's now pointed out to me that current versions of DAEMON Tools support NRG files as well, which is nice.)
The rigmarole involved with creating ISO files on Windows is rather ridiculous, by the way, since ISO files are just bit-for-bit copies of the data on a CD.
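To illustrate just how simple the job is in principle: an ISO is nothing but the disc's 2048-byte data sectors copied out in order, so on a system that exposes the drive as a raw device, "making an ISO" is a straight block copy. A sketch (the device path is an assumption; it varies by operating system and drive):

```python
SECTOR_SIZE = 2048  # standard CD-ROM data sector size

def dump_iso(device_path, iso_path):
    """Copy a CD's data sectors straight into an image file.
    device_path might be something like /dev/cdrom on a Unix-ish box;
    adjust for your own machine."""
    with open(device_path, "rb") as src, open(iso_path, "wb") as dst:
        while True:
            chunk = src.read(SECTOR_SIZE * 512)  # read roughly 1MB per pass
            if not chunk:
                break
            dst.write(chunk)
```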
In letters #73, you wrote: "Please, at least, work with one hand behind your back."
When you're working on high-voltage equipment, keeping one hand behind your back makes it impossible to get a shock up one arm, across your chest (which is where you keep various important parts of yourself) and down the other arm. You can still end up dead even if you're working one-handed; if the object in question is running from ordinary mains power (not through an isolation transformer) and you have one foot reasonably well earthed, for instance. But it's harder.
Keeping one hand behind your back is also a good reminder that you're not engaged in ordinary idle tinkering.
I own a Sony Multiscan 20SE monitor, which I bought second-hand. It's got a 20 inch display, a not-quite-flat Trinitron tube and five BNC connections on its rear (the only connection available), and it's pretty old now. It's connected to my PC with a BNC to VGA cable.
Do I need a special video card to run the monitor? Do I need a special power cable? I'm using normal cables, but it gives no picture.
No, you don't need a special video card, or cable. The "multiscan" in the monitor's name tells you that, like every other normal PC monitor these days, this one can accept various input frequencies. Any ordinary graphics card should be fine.
From what I can see in the (lousy) spec sheets for this monitor that I can find on the Web, it ought to have a standard 15 pin input connector, as well as the BNC sockets, but I figure you would have noticed one if it was there.
Your BNC cable ought to work, but you might need to change a switch setting on the back of the screen (to tell it to use a different sync signal), or change some other setting to tell it to use the BNC inputs.
If no amount of control fiddling helps, you could, of course, just have a dud monitor.
Can you tell me how to configure a 56K modem to only dial out under Windows 2000 and XP? I have done it previously in NT, and there are plenty of articles on the Web that refer to the procedure for NT 4, but I can't for the life of me work out how to do it in 2000/XP. The reason I ask is that we have an application here that requires a dial-up link to communicate with an outside vendor on an occasional basis, and from a security perspective I would rather not have a modem attached to a networked PC waiting to answer calls.
I wasn't aware that 2000/XP - or any other version of Windows I remember using, for that matter - defaulted to even turning auto-answer on for a modem, much less binding a network connection to that modem so that someone who dialed in could actually connect to your computer and/or LAN after it answered.
As far as I know, you have to set all that stuff up manually if you want it (in XP, Network Connections -> New Connection Wizard -> Next -> Set up an advanced connection...), which of course most people don't.
I realise it's hard to believe that Microsoft would choose a default setting for something which is both secure and non-annoying, but I really think they have, in this case.
I have been looking at some of the charger/dischargers for 7.2 volt radio controlled car batteries, and have found some of them to be quite steep in price. The GMVIS Commander for $AU599, for instance!
Can I not simply use an adjustable power supply of some sort to do the same job? I realise that there are probably charging curves that have to be followed to get the most out of the batteries, but surely the advantage of being able to quick or trickle charge with the power supply and being able to use the power supply for other uses outweigh a dedicated charger?
The GMVIS Commander is an RS232-interface, automatic-cycle, multi-output charger that may be useful for hobby shops, battery assemblers and people who're actually making money racing model cars, but which mainly exists for the convenience of wankers.
There are quite a few sub-$AU100 peak detecting chargers that're more than good enough for domestic six and seven cell pack charging. Most of the cheapies aren't mains powered; the idea is you run them from your car battery when you're at the track, and from a 12-volt-ish power supply - a cheapo car battery charger is adequate - when you're at home. The total cost still ends up pretty low, and you've got a separate power supply/car charger/whatever as well, which as you say is a useful thing. An adjustable bench supply set to 12-13.8V is fine to run one of these chargers, provided it's got sufficient current capacity.
It's possible to charge batteries directly with a power supply - wind up the voltage until you get the current you want, and tweak it up a few more times over the course of the charge - but you have to monitor the pack voltage carefully. Picking end-of-charge for NiCds is relatively easy (when the voltage plateaus and then starts dropping, even by a hundredth of a volt, it's done), but NiMH is harder (it just plateaus and doesn't drop, and the cells are meant to get hot when fast charged). Get distracted once during a fast charge from a bench supply and you can come back to an exploded battery or burning workbench. The extra money for a simple pushbutton peak charger is well worth it.
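The NiCd end-of-charge rule above is easy to express as code. A minimal sketch of "negative delta V" peak detection, assuming you're sampling the pack voltage during the charge: the charge is done as soon as the voltage has dropped a small amount below the highest value seen so far.

```python
def charge_finished(voltage_samples, delta_v=0.01):
    """Negative-delta-V peak detection for NiCd fast charging: return
    True once the pack voltage has fallen delta_v (default a hundredth
    of a volt) below its peak."""
    peak = float("-inf")
    for v in voltage_samples:
        if v > peak:
            peak = v          # voltage still climbing (or plateaued)
        if peak - v >= delta_v:
            return True       # voltage fell off its peak: stop charging
    return False
```

A real charger, of course, samples continuously and also watches temperature and total charge time as backstops, since NiMH cells (as mentioned above) may never show the voltage drop at all.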
I was assaulted by a banner ad at a site I recently visited. It actually looked cool enough to pop in the browser. Check it out. Is it worth $US99?
Well, let's just look at the sample images, shall we... oh. There aren't any. Click on the thing that looks like sample images and you get an order form.
There are several of these "digital camera binocular" products out there. Bushnell's ImageView, for instance (which, by the way, sells for less than $US70, as I write this). Around the same price level there's also Meade's CaptureView 8X22, and a few Celestron VistaPix models. The cheaper ones of those also cost well under $US100.
Guess what? None of these companies offer any sample images, either! This may tip you off to the fact that these products aren't much good.
It's hard to find reviews of any of them on the Web, but what there is generally says that even the fancy models aren't very exciting, while the cheap ones generally get downright lousy user writeups.
I don't think any of these products are advertised in a really honest way. If the cameras in these things used the same optics as the binoculars, everything would be simple (and probably look pretty good). But that'd be difficult and expensive, so they don't. The cameras are, instead, slung between the binocular barrels, and they have their own cruddy lenses, just like every other cheap 640 by 480 still/video/web camera out there. These lenses are just a lot more telephoto than the usual medium-wide ones.
Some of the bino-cameras have the same field of view as the binoculars they're built into; some have a bit less. Assuming the horizontal field of view of camera and binoculars is the same (vertical field of view won't be, since the binocular field is a circle and the camera field is a rectangle), then it's fair to say the camera has the same "magnification", but resolution comes into this as well. A dinky 640 by 480 camera (like the ones on all of the sub-$US100 binoculars) that has, say, a 10-degree horizontal field of view, will only lay as many pixels across a given object as will my 3072-by-2048-pixel Canon EOS-D60 using a lens with a 48-degree horizontal field of view. All I need for that is a 40mm focal length - actually, because of the smaller-than-35mm sensor in my D60, I only need about a 25mm wide-angle lens.
If my lens is optically better than the one on the bino-camera (which it practically has to be; it's not hard to find quite cheap 35mm lenses with a wide range of focal lengths and excellent optical quality), then a 640 by 480 crop out of the middle of my picture will beat the binocular-cam pic for quality, by several nautical miles. And if I use an inexpensive mildly telephoto lens - say a mere 100mm lens, given the field of view of a 160mm lens by the smaller sensor in my camera - I'll be able to lay far more pixels over a given subject than any of the bino-cams can manage.
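The arithmetic above, spelled out: what matters for resolving power is how many pixels you lay across each degree of horizontal field of view, and on that score the bino-cam and the D60-with-40mm-lens example come out dead even.

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_degrees):
    """Angular resolution: pixels per degree of horizontal field of view."""
    return horizontal_pixels / horizontal_fov_degrees

bino_cam = pixels_per_degree(640, 10)   # 64 pixels per degree
d60_40mm = pixels_per_degree(3072, 48)  # also 64 pixels per degree
```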
You'd expect this, of course, because my camera is an expensive pro digital SLR. Pay 20 times as much and you ought to get better quality, massive flexibility, funky styling and a complimentary backrub.
But similar crops out of the middle of pictures from various digital cameras that're a lot cheaper than mine will also be streets ahead of the bino-cam pics.
All of the bino-cams have fields of view that put them at more than 500mm focal length, in 35mm-film-camera-equivalent terms. 600mm or more, for some of the bino-cams. That's a pretty darn long lens, and it means you'd better have some support if you're taking pictures in anything other than bright sunlight. Hand-holding the binoculars for shutter speeds of 1/500th of a second or slower is going to give you blurry pictures, and you may be surprised at how long the shutter speeds they use are. Their lenses aren't physically big, so even though the sensors inside are teeny little webcam chips, they're still unlikely to manage very low F-numbers, which means longer exposure times at a given light level.
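The 1/500th figure comes from the usual hand-holding rule of thumb (a general photographic guideline, not something the bino-cam makers quote): to avoid visible camera shake, you want a shutter speed of roughly one over the 35mm-equivalent focal length, in seconds, or faster.

```python
def slowest_handheld_shutter(equivalent_focal_length_mm):
    """Rule-of-thumb slowest safe hand-held shutter speed, in seconds,
    for a lens of the given 35mm-equivalent focal length."""
    return 1.0 / equivalent_focal_length_mm

# A 500mm-equivalent bino-cam wants 1/500th of a second or faster.
slowest_handheld_shutter(500)  # 0.002 seconds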
But OK, the things are pretty cheap. You're not paying much more than you'd pay for a basic pair of similarly little binoculars by themselves.
But if you ditched the digicam idea and dropped the same $US100-odd on a pair of undecorated binoculars, you could get an 8X or 10X set with objective lenses larger than 40mm.
The objective lens size (it's the second number in the binocular description - 8X25 binoculars have 8X magnification and 25mm objectives) determines the amount of light the binoculars collect, which in turn largely determines the brightness and clarity of the image. For everything but bright daytime viewing, bigger objective lenses are better. Binoculars with big objectives are actually great for a lot of basic astronomy; light-gathering is more important than magnification for a lot of sky-viewing.
And, for super-tele photography, a proper digital camera with a decently high maximum resolution and a long maximum focal length will easily beat these binoculars for quality.
There are several such cameras with street prices below $US400 these days - the very groovy Minolta DiMAGE Z1, Olympus C-720 Ultra Zoom and C-740 Ultra Zoom, and the Fuji FinePix S5000 Z, for instance. Even with a mere 2000-odd pixels of horizontal resolution, the fact that these cameras have a field of view less than twice as wide as that of the binocular-cams means they've got around twice as much real resolving power, not to mention showing you more of the scene around the 0.3 megapixel rectangle that the bino-cam can image. And this is before we even consider the large quality difference, pixel for pixel, between the cheap webcam-quality sensors in the bino-cams and the much better ones in the proper digicams.
No, these proper cameras aren't as cheap as the bino-cams. And they're not as convenient as having a quality camera built into your binoculars either. But nobody actually sells such a thing yet, no matter what the banner ads say.