Atomic I/O letters column #137
Originally published 2012, in Atomic: Maximum Power Computing
Reprinted here January 17, 2013. Last modified January 18, 2013.
Why do wireless network speeds keep getting faster?
I've managed to retain some information gleaned during my university days, and the simple-enough Shannon-Hartley theorem says that to increase the data rate, you must increase either the bandwidth or the signal-to-noise ratio.
So why don't we just do that from the beginning? Why bother with 2G, 3G, 4G, Wi-Fi a/b/g/n, et cetera? Why not just go to 100G (or whatever) straight away? Are telephone companies just trying to give us a reason to upgrade our phones every two years?
Am I missing something? Is there some technical limitation, maybe battery life for phones? Is it difficult to design circuitry that uses more than (X) amount of bandwidth (hertz I mean)?
I've read some of Richard Feynman's books, and the impression I got was that we (the human race), know everything there is to know about how photons interact with electrons, and CPUs have come a long way. What more information do we need to be able to design the greatest wireless network in the whole wide galaxy?
All advances in the bandwidth of a given way of connecting network nodes - a copper cable of particular characteristics, a given optic fibre, a given slice of radio-frequency bandwidth - are indeed limited by basic information theory. The actual bandwidth of the whole system, though, is determined by how well the network nodes use the connection. This is partly how close to the theoretical maximum their internal bandwidth and transmit/receive speed are, and partly how well they cope with numerous real-world data-transfer obstacles.
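That theoretical maximum is the Shannon-Hartley limit the letter mentions, and it's a one-line calculation. A minimal sketch (the 20 MHz channel width and 25 dB SNR figures here are illustrative assumptions, not a claim about any particular Wi-Fi standard):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative figures: a 20 MHz channel at 25 dB signal-to-noise ratio.
snr_db = 25
snr_linear = 10 ** (snr_db / 10)          # 25 dB is about a 316-to-1 power ratio
capacity = shannon_capacity_bps(20e6, snr_linear)
print(f"{capacity / 1e6:.0f} Mbit/s")     # about 166 Mbit/s theoretical ceiling
```

Real hardware gets nowhere near its channel's Shannon ceiling at first; each generation of gear claws back a bit more of the gap.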
The result, to drag out the tired but surprisingly widely applicable car analogy, is sort of like how motors work in theory and in practice.
The maximum theoretical efficiency of any heat engine, like the motor in a car, is set by the Carnot limit, which is determined by the difference in temperature between the hot and cold "ends" of the engine - the burning fuel and the ambient air, for an internal-combustion engine. This maximum theoretical efficiency for a car engine can easily be 70% or more.
A real-world four-stroke engine is lucky to achieve more than 30% efficiency, though.
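The Carnot limit is also a one-liner. A sketch with illustrative temperatures (the 1000 K hot-side figure is an assumption for demonstration, not a measured engine value):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum possible efficiency of any heat engine: 1 - Tc/Th (kelvin)."""
    return 1 - t_cold_k / t_hot_k

# Illustrative figures: ~1000 K working gas against ~300 K ambient air.
print(f"{carnot_efficiency(1000, 300):.0%}")  # 70%
```

The gap between that 70% and the real-world 30% is the same kind of gap that separates a radio channel's Shannon limit from what actual transceivers achieve.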
And then, there are advances in technology.
If you took your laptop and access point back to 1940 they'd still work, and you could save a lot of Alan Turing's time by giving them to him. But there's no way that anybody in 1940 could duplicate them. They could reverse-engineer some of the hardware (commercial electron microscopes were first made in 1939!) and end up with an electronic computer industry rather sooner than we did in this timeline. (They'd probably be grateful if you threw in a local copy of Wikipedia and some PDFs of engineering texts, and perhaps another laptop so they could use one while studying the other...).
But no arrangement of electronic components manufactured in 1940 could duplicate computing hardware made today.
The same applies for hardware made in 1980, 1990 and 2000; the differences are just smaller.
You could probably make an 802.11-something system from components available in 1990 (802.11 a and b actually came out in 1999), but it'd be cumbersome and expensive and power-hungry and in no way a consumer-market product. In 1980 the Motorola 68000 was a workstation-class CPU; in 1985 it was a powerful-home-computer CPU; in 1996 it was a pocket-PDA CPU; today it's what you use to control a washing machine if a Z80 isn't enough.
And then, there's software and/or firmware.
Wired networking is bad enough; wireless networking is mildly miraculous. Low-frequency low-bandwidth radio will go through all sorts of things, but Wi-Fi and modern mobile phones work in the much more finicky microwave range. The network topology is pretty chaotic, there are dreadful obstruction and echo and crosstalk problems, access-point antennas are small and network-adapter antennas are often very small, and the list goes on. A mixture of theoretical maths and a lot of empirical testing gives us better and better wireless data systems as the years go by, but once again you usually need more processing power and storage space to implement each new refinement of the system.
If you'd like to see another example of this sort of technological evolution, look at the history of the phone-line modem, from double-digit bits per second to tens of thousands. Along the way, more than a few people found they were actually wrong about the theoretical limit to the bandwidth capability of the channel, not because the Shannon-Hartley theorem is wrong, but because empirical testing and fiendish ingenuity revealed the pipe was actually wider than we thought.
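You can run the same Shannon-Hartley arithmetic on a voice-grade phone line. The figures below are commonly quoted ballpark values, used here as assumptions: roughly 3.1 kHz of usable bandwidth and around 35 dB of SNR.

```python
import math

# Shannon capacity of a voice-grade phone line, using illustrative
# ballpark figures: ~3.1 kHz of bandwidth, ~35 dB signal-to-noise ratio.
bandwidth_hz = 3100
snr_linear = 10 ** (35 / 10)  # 35 dB is about a 3162-to-1 power ratio

capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)
print(f"{capacity_bps / 1000:.1f} kbit/s")  # about 36 kbit/s
```

Note how close 33.6-kilobit modems came to that number. The later "56k" modems didn't break the Shannon limit; they changed the channel assumptions, exploiting the digital trunk on the telephone company's side of the line.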
There's even a further automotive analogy for this situation: Dragsters!
Using the engineering calculations that predict the performance of a normal car, you can get a pretty definite maximum possible speed for a vehicle with a particular configuration and engine power.
Modern dragsters are MUCH faster than that.
This is primarily because of an evolutionary engineering process in which real-world results - in particular, the interaction of giant sticky low-pressure tyres with the dragstrip surface - revealed the previous engineering assumptions to be incorrect.
I use Picasa as an image viewer, because it can open everything, but when you view digital camera raw files with it, it insists on doing an "auto levels" sort of thing to punch up the preview of the image. This takes a few seconds and makes it impossible to delete an image while it's in progress and is generally sucky and horrible. How the hell do you turn this off?
You don't. Google in their wisdom have decided that everybody likes that feature.
If for some reason you do not, you'll have to switch to a different image viewer, at least for RAW files. FastStone Image Viewer is free and fast, and supports several raw formats.
The IT setup at my 20-person company is... a mess. It's probably on thedailywtf.com somewhere.
The main computer guy is on holiday now, and I'm doing his job kind of like a dog with a Rubik's Cube. Our in-house stock-management application screwed up for the nth time yesterday, and someone managed to get through to IT guy and he said he has a patched executable that fixes it, and now he's e-mailed that to me and then dropped off the Internet and into scuba-diving gear or something. But I'm wondering whether he accidentally (on purpose?!) sent me some kind of virus. The attached file is called newexe.piz, and IT guy says I should just rename it to newexe.zip and extract the file.
Why this and not a ZIP in the first place? If I copy this around everywhere and it's a virus, I'll catch shit for it, not IT guy.
Searching turns up .piz as a format used by "Pizazz Plus", but I'm pretty sure that isn't it.
It's very probably perfectly fine. He probably renamed it to get it past e-mail filters that block EXEs and ZIPs and so on, specifically to prevent people from mailing malware to each other. Such a filter should, of course, do something smarter like feed attached files to an actual virus scanner, but plenty of mail servers have simpler filters put in place in 1995 and never improved.
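If you'd like some reassurance before you even rename it, note that a ZIP file starts with recognisable "PK" magic bytes regardless of what its extension says. A minimal sketch (the file is built here for demonstration; "newexe.piz" is just the filename from your letter):

```python
import zipfile

def looks_like_zip(path: str) -> bool:
    """Check the ZIP magic bytes instead of trusting the file extension."""
    with open(path, "rb") as f:
        return f.read(4) in (b"PK\x03\x04", b"PK\x05\x06", b"PK\x07\x08")

# Demonstration only: build a ZIP, give it the suspicious ".piz" name.
with zipfile.ZipFile("newexe.piz", "w") as z:
    z.writestr("newexe.exe", b"not really an executable")

print(looks_like_zip("newexe.piz") and zipfile.is_zipfile("newexe.piz"))  # True
```

A correct signature doesn't prove the contents are safe, of course; it just confirms the file really is a renamed ZIP and not something stranger.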
If you're doubtful about this or any other file's bona fides, by the way, just bounce it off a virus scanner yourself. I like "Jotti's malware scan", which feeds anything you upload to it to twenty different virus scanners.
(Sending confidential business software to the Web site of some guy in the Netherlands may be hazardous to your employment. From what you say, though, it doesn't seem very likely to make your situation any worse than it already is.)
I have this laptop that's about 2 years old and one day Windows went off the wall and just would not boot up properly, even in safe mode. There was a corrupt file in the Windows system files, so I thought I'd either have to repair it or reinstall Windows altogether. But every time I try to repair Windows with the CD, it stops at the loading bar and just stays like that for hours, not loading anything. So I thought let's just get rid of it all and start fresh, but it still refuses to do anything. As soon as I format the hard drive with the Windows CD and try to start installing Windows it just freezes up.
I am using Windows XP for this laptop as it originally had Vista when I bought it, I just reformatted it with XP. Any ideas what potential problems could cause this?
It sounds very much like a dying hard drive. I wouldn't bet my life on that, but the symptoms match exactly.
Consider this an excuse to upgrade to an SSD!
(Or just get a cheap new hard drive with much more capacity than your old one.)
Given that it seems 95% of Windoze "updates" are really just security fixes for IE v.whatever that I don't use, do you think it's worth the trouble? I've turned off the auto update reminder because I grew tired of Windoze's constant fault reminders and admissions. I check for driver/BIOS updates on my self built AMD boxes when I'm bored, but all seems well. The few updates I have tried seemed a bit wonky at times, but I was thinking of Win7 SP1.
Hey, don't forget the Windows Genuine Advantage updates too! Where would legitimate customers be without the sneaking suspicion that upgrading your computer may make it think you're a pirate?
But seriously, you're right that the majority of updates aren't a big deal. Most are patches and shims to "improve compatibility" with one thing or another that probably wasn't written right, and no given user is likely to encounter more than a very few of the things that get patched. But on the off-chance that you do, you might as well install those updates.
Microsoft Security Essentials, though, is definitely worth installing and updating. Even if you're wise to the ways of viruses and botnets, even if you never use the administrator account unless you really have to, it's still only the work of a moment to accidentally install malware, and modern malware can be a huge pain to remove. The malware/anti-malware arms race changes all the time, so you should never assume that some program that worked great five years ago can still handle everything today, even if it's still regularly updated. But Microsoft Security Essentials may currently have the best hassle-to-effectiveness ratio of all Windows anti-malware tools.
Microsoft's older, simpler Malicious Software Removal Tool is also still regularly updated and worth having. Yes, it may never find anything amiss on an experienced user's computer, but it's not as if it, or Security Essentials, costs money. The same applies to security updates for applications.
I've had updates set to "check, but let me choose which to download and install" for many years, and I think that's the best option for experienced users. Remember that just because "all seems well" does not necessarily mean that it is. It's perfectly possible for a Windows machine to be botnet-infected without any symptoms obvious to the user.