Dan's Data letters #54
Publication date: July 2003.
Last modified 03-Dec-2011.
Why on earth do digital cameras, DVRs and the new (in Australia) mobile video phones utilise MPEG compression to do their stuff, when DivX compresses video streams far better?
Does it have anything to do with them having to be captured with a lossless compression (MJPEG)? Is it for legal reasons?
Basically, it's because MPEG compression is fast, and DivX compression is not. Also, MPEG in its various standardised flavours is, well, standardised, while DivX isn't.
MPEG compression can be done by cheap purpose-built hardware. You just stick in your device a chip that accepts a stream of raw images and squirts out an MPEG bitstream. Similar hardware exists for MJPEG compression, which is why baby digital cameras that produce MJPEG AVIs now exist; they sure as heck don't have an Athlon in there, running from two AA cells.
It's not inconceivable that such hardware could exist for DivX - which was originally just a hacked version of Microsoft's MPEG4 implementation, but is now quite different - but it doesn't yet. DVD players that can do DivX playback are starting to pop up, but I don't think they support all of the DivX flavours yet (there's that standardisation problem; there's a big compatibility break between DivX 3.x and 4.x), and nothing but a PC can make DivX files, as far as I know.
When there's no dedicated compression hardware, you either have to roll your own (very expensive) or use a general purpose CPU (very inelegant, and fairly expensive). And then you get a device that produces video files that Grandma can't view, because she hasn't installed DivX.
MJPEG can be lossless, but isn't when it's used in small devices. It's editable, because each frame is an individual image and none of them are encoded as "like the last frame, only with the following changes..."; this lets you cut and splice MJPEG without re-rendering anything. Each frame, though, is normally JPEG compressed, like a regular JPEG (more correctly referred to as "JFIF") image. Lossless MJPEG basically just leaves out the data reduction stage of the compression, and is used in non-linear editing suites; it produces vastly larger files than lossy MJPEG.
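The intra-versus-inter-frame trade-off can be sketched in a few lines. This toy encoder is my own illustration, not how MPEG actually works, but it shows the basic deal: storing only the changes between frames is much smaller, but every frame except the first then depends on the one before it.

```python
# Toy sketch (an illustration, not real MPEG) of why inter-frame coding
# compresses better than intra-only coding, and why intra-only MJPEG is
# easy to cut: every frame stands alone.
frames = [[0] * 8, [0] * 7 + [1], [0] * 7 + [1]]  # three near-identical frames

# Intra-only (MJPEG-style): every frame stored whole.
intra_cost = sum(len(f) for f in frames)

# Inter (MPEG-style): first frame whole, then only the changed samples.
def delta(prev, cur):
    return [(i, v) for i, (p, v) in enumerate(zip(prev, cur)) if p != v]

inter_cost = len(frames[0]) + sum(
    len(delta(frames[i - 1], frames[i])) for i in range(1, len(frames)))

print(intra_cost, inter_cost)  # 24 9 - but cutting the inter stream at
                               # frame 2 needs frame 1 to decode it
```

Real codecs are vastly more sophisticated than this, of course, but the principle is the same; and it's also why inter-frame compression takes more grunt to encode, which is the whole hardware problem described above.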
Recently, my laptop has been showing some strange behaviour. Under the "Processes" tab of the Windows Task Manager (in Win2000 Pro) the "System Idle Process" has started showing up as using a large percentage of processor power. We're talking high nineties here. "CPU Usage" however, is reported as being between 2% and 7% at the same time. All of this is while the computer isn't actually doing anything. When there is a call on CPU power, like say, the mouse pointer being run in circles across the screen, then the System Idle Process' percentage drops accordingly (much like a distributed computing client process would) for the duration of the activity.
Normally, if a process uses that much processor power then CPU usage also goes up and, even more telling, the laptop's fan starts up automagically to cool the now active processor. The fan is not on.
And I'm confused.
I've checked for spyware using Spybot. I've booted from the Win2000 CD and chosen M for Murde... I mean R for Repair. No luck.
Besides nuking from orbit, do you have any idea what I can do to fix this?
Nothing needs fixing. The System Idle Process is one of the default Win2000 processes, and all it does is indicate the total CPU time that's not being used.
The two to seven per cent CPU usage you're seeing is normal. Even when you're not running any apps, Windows will always manage to use a few per cent of your CPU time, no matter how fast the processor is. It's just clever that way.
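The arithmetic is the whole story: the idle process figure and real CPU usage are just two halves of the same 100 per cent, so of course one drops exactly as the other rises.

```python
# "System Idle Process" is simply the CPU time nothing else used,
# so it and real usage always add up to (roughly) 100%.
cpu_usage_percent = 5  # a hypothetical Task Manager reading
system_idle_percent = 100 - cpu_usage_percent
print(system_idle_percent)  # 95
```

A high idle figure is therefore good news, not bad; it'd only be worth worrying about if some process other than System Idle Process were hogging the CPU.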
I often go to LANs, and have often thought of replacing my CRT with an LCD with a good response time. However, I would like to have some sort of hard plastic cover to place over the screen when transporting it. Do these sorts of products exist?
LCDs vary enough in size that it's hard to make one cover that'd fit a range of models, but CaseAce make some LCD carriers. There are simpler options, too; a piece of thick cardboard is likely to be perfectly adequate. You could secure it with tape or big rubber bands, or go the whole hog and use Velcro dots on the corners of the screen surround.
My office is finally considering upgrading some of our Win98SE systems to something more manageable like XP. Our biggest concern is to get something more stable than Win 98. The computers are primarily used for productivity (Microsoft Office) in addition to running some database software (Timberline, specifically). The slowest of these computers is a P-III 500 with 256Mb SDRAM on a 100MHz FSB, with the fastest being up towards 1GHz and 256Mb SDRAM on a 133MHz bus.
Will 256Mb be enough? Or will $US60 for an extra 256Mb be worth it? Processor upgrades are pretty much out of the question as we're not willing to buy new motherboards, and no one is willing to sell us Slot 1 processors. So, will a 500MHz P-III handle it? Or is it more feasible to upgrade to Windows 2000? I know MS tells us that 300MHz is all that's recommended for XP, but as they make money off of me buying it, they'd like to tell me that a 486 is plenty.
A 256Mb, 500MHz P-III will do for XP, for ordinary office work. It won't be a snappy performer, but you won't be tearing your hair out either.
Microsoft's minimum recommended system, as you note, is a 300MHz CPU and 128Mb of RAM. That CPU speed wouldn't be a killer, but XP on a 128Mb system will chug pretty hideously. 192Mb is pretty much the realistic minimum for XP, and 256Mb is better again.
Even if you're not doing heavy multitasking, working on giant databases or doing some other RAM-hungry task, though, more RAM than 256Mb is likely to be worth the money. Especially if you get it dirt cheap; used PC100 and PC133 (also compatible) RAM is available very cheaply on the auction sites. Shop carefully and you'll be able to pay about half what Crucial would charge you (without buying from someone with 2 feedback, 50% negative).
Windows 2000 is not much more lightweight than XP. WinXP is basically just a big fat service pack for Win2000.
Do you know of a place to buy the foam used in fan filters (preferably somewhere in Australia)?
I've got some Jaycar 80mm fan filters (which are black, plastic, and flimsy), but they're a pretty expensive way to buy foam, and they're not great for airflow when combined with the usual holes in a case.
There are various open cell foams that should be suitable; upholstery places and rubber specialists (like good old Clark Rubber) sell them. You need thin slices of the stuff, though, which general foam suppliers might not have. If they sell air conditioner filter foam, though, that's likely to be suitable. Many hardware stores have that.
UPDATE: After I put this page up, a reader recommended disposable electrostatic dry sweeper cloths, cut to size, as making great fan filters.
I wholeheartedly support this use for the things. Anything to avoid actual dusting.
With regard to your article on fuel cells, I have a question or maybe two about the information you gave on battery technologies.
You compared the energy densities of different chemistries in terms of joules per kilogram - shouldn't the "energy density" be joules per cubic metre?
Obviously this would give a slightly different relationship between the different chemistries but it is perhaps a more important measurement for a lot of portable devices. Am I right in thinking that lithium ion (Li-ion) batteries are less dense than nickel metal hydride (NiMH), and as a result the J/m^3 scores for these technologies are closer together than the J/kg scores?
Also, do you know how this applies to lithium polymer batteries? I am guessing, but I would have thought Li-Polymer would be less dense still than Li-ion.
Now that you mention it, yes, energy density should be expressed as energy per unit volume, if only because the term doesn't really make sense otherwise.
The reason why it's used in the way it is, though, is that while battery size isn't unimportant - especially for tiny mobile devices - battery weight matters much more for most applications, including portable gadgets. A normal-sized camcorder can have another ten cubic centimetres of battery if it needs it; you can squeeze that in somewhere, even if it's just a bulbous battery sticking out the back. Extra weight is much more of a problem for portable devices, and for larger things like electric cars.
There wouldn't be much of a market for a battery that stored a gigajoule per kilogram but had a density of one kilogram per cubic kilometre, but current battery technologies aren't that spectacularly different, and line up the same way in both metrics - lithium ion beats nickel metal hydride beats nickel cadmium for energy per kilogram and for energy per cubic metre.
The only outlier on the graph is poor old lead acid, which has OK energy per volume but is very heavy. That, along with its slow charge time, is why you don't see lead acid batteries in consumer devices, or in any but the more cheap and cheerful kinds of electric vehicle.
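You can see how the two metrics relate with some ballpark arithmetic. The figures below are illustrative round numbers I've assumed for the sketch, not datasheet values - real cells vary a lot by design - but the conversion itself is just multiplication: energy per volume equals energy per mass times mass density.

```python
# Illustrative, assumed round numbers, NOT datasheet values.
# Volumetric energy density (Wh/L) = gravimetric (Wh/kg) x density (kg/L).
cells = {
    # name:      (Wh/kg, kg/L)
    "Li-ion":    (120, 2.5),
    "NiMH":      (70, 2.9),
    "NiCd":      (50, 3.0),
    "lead acid": (35, 2.3),
}

for name, (wh_per_kg, kg_per_l) in cells.items():
    wh_per_l = wh_per_kg * kg_per_l
    print(f"{name}: {wh_per_kg} Wh/kg, {wh_per_l:.0f} Wh/L")
```

With any plausible set of numbers, the ranking of the three consumer chemistries comes out the same by either measure; only lead acid's high density moves it around much between the two scales.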
Lithium polymer is generally a little ahead of lithium ion, but not necessarily; so far, the two chemistries aren't actually very different.
I've always understood that refresh rates can contribute to eyestrain and headaches, but have never known the relation in which it occurs. I'm running a Hitachi CM771 19" CRT at 1600x1200 and 60Hz, would this cause headaches?
It depends on what effect flicker has on you. Some people can use a nasty low refresh rate like this for days on end without a problem; most people find it quite obnoxious; some people get headaches from it.
Note that the CM771 can do 75Hz at 1600 by 1200, which isn't great but is better than what you're looking at now. Most video adapters have no trouble delivering that signal, so you should check the display setup on your PC.
If you drop to 1280 by 1024 or the squarer 1280 by 960, either of which should display more clearly than 1600 by 1200 (which the monitor doesn't actually have enough phosphor to clearly display), you should be able to do 85Hz.
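The reason lower resolutions allow higher refresh rates is that a CRT's limit is really its maximum horizontal scan frequency. A back-of-envelope check, assuming roughly 5 per cent blanking overhead (real video timings vary, so treat the exact numbers as a sketch):

```python
# Rough horizontal scan rate a CRT mode needs: visible lines times
# refresh rate, plus an assumed ~5% blanking overhead (real timings vary).
def h_freq_khz(visible_lines, refresh_hz, blanking_overhead=1.05):
    return visible_lines * refresh_hz * blanking_overhead / 1000

print(round(h_freq_khz(1200, 75), 1))  # 1600x1200 at 75Hz -> 94.5 kHz
print(round(h_freq_khz(1024, 85), 1))  # 1280x1024 at 85Hz -> 91.4 kHz
```

So 1280 by 1024 at 85Hz actually asks slightly less of the monitor's scan circuitry than 1600 by 1200 at 75Hz does, while flickering considerably less.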
I read your article on P4 overclocking and I couldn't resist the urge to upgrade again. I was looking at motherboards and I stumbled across MSI's Neo series of motherboards. I was wondering, in your mighty reviewing wisdom, what was your take on D.O.T. (Dynamic Overclocking Technology) and CoreCell? They sound a little too good. Would the D.O.T. be better than manual overclocking? Would I be able to get more performance myself, or would I get more by doing it "dynamic" style?
The closer I looked at these "technologies", the less impressive they seemed.
If you run MSI's supremely goofy looking CoreCell utility, then these mobos can automatically slow down the Northbridge and CPU fans if those chips aren't hot, and can also automatically wind up the clock speed when the CPU seems to have lots to do.
The fan feature exists on other boards - like the Abit IC7 I've got in this PC, for instance.
The overclocking thing is kind of weird. Why wouldn't you want the CPU to be running at the maximum speed it can stably manage all the time, not just when the motherboard thinks its load is high? OK, it'll use a bit less power and produce a bit less heat if you wind it down when it's not got anything to do, but the difference won't be large.
I'm not sure by how much the D.O.T. feature can increase the CPU speed; its five levels of operation are helpfully named after military ranks, and don't seem to have any numbers attached. The mobo has no way to tell how much overclock the CPU can actually stand, so I presume the maximum isn't very large. Whatever it does, you could do manually anyway.
I was interested in your article on projectors, mainly because I have one myself (an ASK Impression A4, since you asked). I don't actually use it for my computer display - I'm satisfied with my 19" CRT display - but I do have it hooked up to an Xbox, for gaming and DVD fun.
Now this is an older projector, and was bought cheaply second hand, without a manual. After a few weeks' use, it developed display problems - all the colours seemed to separate, leaving a distorted image. I simply put this down to an older bulb, but when I replaced the original bulb, the problem remained - and I've lived with it since.
But in your article, you wrote "(CRT) projectors give nice results for video, but ... they need regular tweaking to keep the three guns lined up" - which sounds a little like it could be the answer to my problem. I've Googled around, but information seems scarce on this model, and I don't expect you to have any knowledge of it yourself, but do you think it's possible that my problem is a simple one? It'd be nice to be able to watch (and play with) Buffy on my wall without the vampires having ghosts.
Whatever it is, it's not a gun alignment problem, because your projector's not a CRT (well, unless it's something very different from the one described here).
If you've got "registration" problems - red, green and blue not sitting on top of each other as they should, so a white thing has noticeable coloured fringes - then it's an internal alignment issue that may or may not be fixable. I'd think that whatever's out of place can be put back in again, but I don't know what the inside of your projector looks like.
After discovering your sordid past as an Amiga user and writer, I feel compelled to ask: What's your take on the AmigaOne?
Being an old Amiga fan myself and someone who held on to those halcyon days a bit longer than most, I still have an Amiga 1200 sitting in the cupboard, and a couple of A500s in the shed, and my PC has AIAB/WinUAE installed. I can't help but be hopeful for them. But alas, I feel an Amiga revival probably lacks much of a chance in today's industry.
Wow - all you need to do to make the AmigaOne run non-retargetable Amiga software (which is, unless I'm much mistaken, the only software that runs on Amigas whose job isn't done as well or better by PC or Mac alternatives; even the current releases of Amiga OS4 aren't, themselves, retargetable!) is connect it to an Amiga motherboard.
Hurrah! That's MUCH more convenient than leaving the Amiga motherboard inside an Amiga and running your software on that!
The AmigaOne, like various previous reborn-Amiga micro-projects, looks like an entertaining hobby box for people for whom the journey is the destination. Particularly those who can't bear to run Linux on the same platform as anyone else in the world. Anybody who thinks the AmigaOne has any real relevance to the modern computing world, though, is severely mistaken.
I am planning to simply hook an LED up to a round battery with some solder for an upcoming dance party, but I was wondering if you had any ideas as to housing it, or an on/off switch?
I don't know how happy coin cells will be about being soldered to. Since this is a low current connection, you'd probably be OK to just press the leads to the battery and secure them with a blob of glue - hot-melt, epoxy, whatever.
Since the resulting assembly will be so small, there are any number of appropriately goofy things you could install it in. A ping-pong ball (with a bit of padding). A matchbox (room for several!). A badge (stuck to the back of the badge, with the LED bent to stick out the front). I don't know of any specifically made small housings, though.
Most Photon-type lights achieve switching with a cunning design that lets them repeatably press one leg of the LED against one side of the battery, when the button's pressed. Engineering something like that yourself isn't easy. You could, however, stick one leg of the LED to the battery, with the other one held in place by its own springiness, and stick a bit of paper in that gap to turn it off. Given the basic disposability of the concept, unless you want to fancy it up into a battery in your pocket and a wire to the LED(s), there's not a lot of point fiddling around with regular switches.
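The reason you can get away with no current-limiting resistor at all in these things is the coin cell's own internal resistance. A rough Ohm's-law sketch - all three figures below are assumptions for illustration, not measured values:

```python
# Why an LED straight across a coin cell survives: the cell's internal
# resistance limits the current. All figures are assumed, not measured.
v_cell = 3.0        # nominal lithium coin cell voltage
v_led = 2.0         # assumed LED forward voltage drop
r_internal = 15.0   # assumed cell internal resistance, ohms

current_ma = (v_cell - v_led) / r_internal * 1000
print(round(current_ma, 1))  # ~66.7mA - hard on the LED, but it works
```

That's more current than most LEDs are rated for, but they'll generally tolerate it for the life of the cell, which is all a disposable dance-party blinkenlight needs.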
Do those stick-on "Internal Cell Phone Antenna" things actually friggin' work or what?
I came across this, but it looks a bit suspect. No explanation, their "electrical engineer" isn't named, and they refer me to websites to purchase that are not exactly the best known sources in the world.
So what's the deal - hype or fact?
Trying to figure out whether they do or not just using the signal strength "bars" on a phone display is very difficult. The number of bars you see will vary depending on which way the phone's facing (in all three dimensions), where you are relative to the phone antenna and the nearest base station, various external conditions (varying from day to day or from minute to minute, if you're near a road with traffic or the weather's squally), and precisely where you stand in the room. Modern digital cellular phones operate up in the gigahertz range, and those weeny little radio waves are easily blocked.
And, on top of all of that, the signal strength bars probably don't actually have much to do with the real signal quality.
To my knowledge, whenever anybody's bothered to test one of these "boosters" properly - in a real RF lab, with a calibrated input signal - they've failed miserably.
There's a bit more about them here.
They're also mentioned in passing here.
The page you mention (well, its mirror here, anyway) is mentioned in the first of those pieces, as is the fact that there was no reply from the contact address at the bottom of it.
There are real cell phone signal boosters; they're amplified repeaters that cost hundreds of dollars, at least. They work. They're the same technology that allows you to make mobile phone calls from tunnels, in the basements of big buildings, and in similar places where you've got nothing like line of sight to a normal cell phone tower.
The quality of the stick-on versions can, in contrast, be assessed by examining the kind of people that're trying to sell them.