Atomic I/O letters column #151
Originally published 2014, in PC & Tech Authority
(in which Atomic magazine is now a section)
Reprinted here August 28, 2014. Last modified 16-Jan-2015.
HDMI to VGA to composite to RF modulator to...
I wanted to use an old 19-inch LCD with only a "VGA" input socket as a second monitor, and my graphics card has a spare HDMI output. So I bought an HDMI-to-VGA cable for three bucks on eBay and... it doesn't work. Might as well not be plugged in at all.
The other output on my graphics card is DVI, so I got an HDMI-to-DVI cable and used it to plug the second output into my main monitor, and that worked. So now I'm thinking that maybe I need to leave the main monitor plugged into the HDMI output and get a DVI-to-VGA cable to make the second monitor work. Before I make an eBay cable emporium slightly richer again, though, will this actually work? And what was the problem with the FIRST cable?
The detail you're missing here, which cheapo cable vendors do not often clearly explain, is that there is no way at all for a mere cable to connect an HDMI output to an analogue "VGA" monitor. Some digital video connectors have pins that can carry an analogue signal in parallel with the digital, but HDMI, like the newer DisplayPort, does not. An HDMI cable can carry digital audio along with digital video, but not analogue video. You'd need an active converter box to turn HDMI into VGA; a good one of those would cost you more than a new 19-inch monitor.
DVI can deliver an analogue signal, on the four pins with a cross in the middle on one side of the connector (see the DVI pinout diagram on Wikipedia for the full layout). Only the "DVI-I" version of DVI actually has those pins.
(Well, actually there's also "DVI-A", which has only analogue pins; that's basically just a cable standard for anybody perverse enough to use a DVI socket for input to an analogue monitor.)
If your video card has the pins, you can plug in a DVI-to-VGA cable and run an old monitor, subject to the monitor's resolution and refresh-rate limitations. Otherwise, you can't. HDMI is, signal-wise, very similar to DVI - which is why HDMI-to-DVI cables work - but only on the digital side. There is no analogue HDMI option.
So, you are probably now wondering, what the heck are people doing selling HDMI-to-VGA cables on eBay?
As far as I can see, they're trolling for suckers.
There are some weird industrial and other non-standard devices that actually accept HDMI data on some rewired version of the 15-pin D-sub "VGA" connector (which is extraordinarily antiquated, by the way). But almost everybody who buys one of these cables is probably trying to do what you tried to do, and is doomed to failure. Or worse - if the non-standard cable connects a voltage on one end to a ground on the other, you're gonna have a bad time.
EBay teems with people selling bizarre cables like this; my favourite is the USB cable with an A plug on each end. (A-to-A USB cables do legitimately exist, but they have data-transfer "bridge" electronics in the middle.)
But if, as is usually the case, the cable is just a plain cheap straight-through A-to-A piece of wire, then it is a cable for doing nothing, for destroying one or two USB adapters, or for setting a computer on fire, depending on how kindly the gods are feeling when you plug it in.
UPDATE: After this page went up, a reader contacted me to say that in his experience the super-cheap HDMI-to-VGA adapters you can find on eBay actually work.
Some of the very cheapest hits in that eBay search are for the abovementioned simple cables, or those irritating multi-item drop-down-menu eBay listings where you can buy a useless cable, or a more expensive actual converter, but the low price of the useless cable is the one you see in the search results. Actual little converter boxes start from less than six US dollars, though, so if I needed one I'd definitely buy one, sight unseen.
Is it safe to run a network cable to your neighbour?
I'd like to share my local network with my next-door neighbour, but Wi-Fi is really flaky between us. I presume I could make it work with a directional or maybe high-gain omni antenna, but our walls are spitting distance apart, so I could really easily just string an Ethernet cable across.
I vaguely remember reading about this being a very bad idea for important electrical reasons, but I can't remember why. Is it?
In the olden days, when consumer Ethernet meant 10Base2 over coaxial cable, it was indeed a dreadful idea to string network cables between buildings. The shield on 10Base2 is earthed, earths in different premises can be at different potentials, and all sorts of weirdness could result. It wasn't really a huge electrocution or fire risk, but my definition of "huge", like my opinion on the amusingness of deadly electrical traps, may not be the same as yours.
If "in a hundred years, who's gonna care?" does not sum up your attitude to immediate risks, I suggest you seek a second opinion about any of my advice.
Today, ordinary Ethernet cables are all Unshielded Twisted Pair (UTP), which as the name suggests has no shield, and no earth conductor. So it's still hardly best network-engineering practice to hang Ethernet cables out the window, and a bare Cat5 cable waving in the breeze may not survive a great deal of weather. But the only real danger it poses, besides putting the carefully-tended computers in your house on the same network as possibly malware-infested cesspits next door, is just barely possibly allowing a major electrical fault or lightning strike or something in one building to fry a computer in the other.
If you're running network cables in conduit through a building - and possibly also in conduit between your windows, if you do that to keep the weather off the wire - then I'd use special "plenum-rated" cable that, most importantly, won't carry fire through the conduit like a fuse. Whether or not you use fancy cable and conduit, cables between buildings may also invalidate some aspect of your home insurance in some way. But in your situation, I would totally just fish some cable through a garden hose or something for protection, tape it in place, and call it good.
I must remind you once more, however, that my advice is not always entirely reliable.
(It's gotten to the point where friends of ours still allow their children to play with me, but make clear to them that my suggestions regarding interesting ways to spend our time together are not to be given the weight of authority normally due to instructions from adults.)
Yesterday I discovered that the reason my computer had been slow was that a drive power cable had kinked into the CPU fan and jammed it, so the processor was thermal-throttling all the time.
I taped the cable away from the CPU and all was fine again, but the computer had been like this for DAYS. Does this mean I need a new CPU fan? It seems to be working OK, but I can't believe being stuck for DAYS ON END could have done it any good.
I killed a cheap cordless drill once by repeatedly grinding it to a halt while using a hole saw, and I vaguely remember that giving my Tamiya Hotshot full throttle when its wheels were jammed with rocks was a really bad idea too.
Is it just a power thing, with the drill and model car motors being much more powerful than the CPU fan one? Is it superior modern technology? Should I get a new CPU fan just to be safe?
Your CPU fan is almost certainly perfectly undamaged, because it's fundamentally different from the ones in cordless tools and R/C cars. Well, it's fundamentally different from the ones in older and/or cheaper tools and cars, at any rate.
Computer-fan motors are all "brushless", with a spinning permanent magnet that's rotated by a series of electromagnet coils that're energised sequentially, so the coils that are energised are always the ones in the right location to pull the magnet along. To accomplish this, they need controller electronics to keep track of the spindle location and switch the coils.
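If you like to see a mechanism as pseudocode, here's a toy sketch of that coil-switching idea. It's not real fan firmware - the coil count, angles and sensor are my assumptions for illustration - but it shows the basic trick: look at where the rotor magnet is, then energise the coil just ahead of it.

```python
# Toy sketch (not real firmware) of brushless-fan commutation:
# the controller watches the rotor position (a Hall sensor, in a real fan)
# and energises whichever coil is placed to pull the magnet onward.

COILS = 4  # assumed four stator poles, typical of a small PC fan

def coil_to_energise(rotor_position_deg):
    """Pick the coil just 'ahead' of the rotor magnet, so it pulls
    the magnet forward instead of just holding it where it is."""
    step = 360 / COILS                 # 90 degrees of rotation per coil
    current_coil = int(rotor_position_deg // step)
    return (current_coil + 1) % COILS  # the next coil around the circle

# Step the rotor through a full turn and show the firing order:
sequence = [coil_to_energise(angle) for angle in range(0, 360, 90)]
print(sequence)  # each coil fires in turn: [1, 2, 3, 0]
```

Note that if the rotor stops, the "next" coil just sits there energised at its normal modest current; nothing in this scheme forces the current up when the motor stalls, which is part of why stalling a fan is so harmless.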
Simple high-power direct-current motors, on the other hand, are of the "brush" or "brushed" type, with stationary magnets and spinning coils.
The coils in a brush motor are energised by a slotted "commutator" on the spindle, onto which the titular "brushes" - actually, little hard sintered blocks based on graphite or copper - are pushed by springs.
Brush motors are cheap and simple to use: just connect them to a DC voltage with two wires and they spin. Reverse the voltage and they spin the other way (though not necessarily at the same speed). But brushed motors develop maximum torque - and draw maximum electrical power - when they're not moving at all. In a stuck-drill or jammed-wheels situation, battery-punishing, wire-smoking, gear-stripping maximum torque is exactly what you do not want.
Brushless motors draw a bit more power when stalled than when turning, but typically not enough to do any harm, and their control circuitry seldom has a problem with being stalled. Low-powered brushless motor drivers are also quite simple and cheap to make - especially if, as in computer fans, you only want the motor to spin one way. And there are no brushes and commutator to wear out. So they're a good choice for computer fans.
High-powered brushless motors have been around for some time as well, but until quite recently they've been too expensive for normal hobby and cordless-tool applications. That's changing now. R/C hobbyists will soon be robbed of the thrill of changing out worn brushes, and cordless-tool users will no longer enjoy the smell of commutator-spark ozone.
As the weather warms up, my old (2009) overclocked PC (Win7 64 bit, Core i7 920 at 3.5GHz) is becoming flaky. The system periodically hangs, usually at a moment when a new task or something suddenly starts - something as simple as opening a new browser window can do it, though running games strangely seldom seems to.
I figured more errors with more temperature indicated a cooling problem, so as you've recommended for diagnosing this I took the side off the case and pointed a big desk fan in there. It maybe helped a bit, but not a lot. And just today, I started getting these magnificent bluescreen errors that aren't on screen quite long enough for me to write them down (I'll photograph the next one!), but which include the terms "clock interrupt", "secondary processor" and "time interval".
"Secondary processor"? Is that the GPU or something? Can you give me a pointer on what to try first?
The error you're getting is an absolutely classic "over-overclocked CPU" bluescreen. It can mean other things, but given the computer you're using, the probable culprits are CPU first and power supply a distant second, only because a flaky PSU can cause almost anything.
What is probably happening is that after years of higher current blowing conductive atoms down the minuscule wires of your CPU ("electromigration", which is actually a real thing that happens), the processor's becoming marginal at its above-spec speed. It's easy to see if this is the case; just go into BIOS setup, return the CPU to its default speed and voltage, and see if the problem clears up.
If you now want more CPU speed than you can get at stock, you'll need a new CPU, but unfortunately LGA 1366 CPUs like yours can no longer be had new for reasonable prices - Intel stopped making them in 2012. There are still plenty of cheap used LGA 1366 chips out there, but they may be in worse shape than the one you've already got. So the most sensible solution at the moment, if slowing the CPU down cures the crashes, is to just live with your stock-speed 920 and save for an upgrade to a current CPU, motherboard and RAM.