Ask Dan: All about LCDs

Date: 18 March 2007
Last modified: 3 December 2011.


I'm in the market for a large-panel widescreen display in the near future. I'm interested in the 22 inch Samsung SyncMaster 225BW [which Aus PC Market currently sell for $AU555.50 including Sydney metro delivery - Australian shoppers can click here to order!], but I really don't know much about any of them!

What should I look out for? I realise that a 5ms response time is good, but are the "overdrive" features on some of the Asus products really capable of 2ms?

What does G to G mean? Is DVI better than VGA? Et cetera!

It's a new and scary world when I can't confidently say "Bah, begone with your high latency TFT filth! I'll stick with my hulking CRT goodness."


The super-low response times quoted for modern LCD monitors are somewhat misleading.

Modern LCDs do have black-to-white-to-black response times that would have been exciting when I reviewed a "super fast" 30ms monitor five years ago.

But as far as official ISO-certified numbers go, current screens fall far short of the tiny millisecond figures quoted in the marketing glossies.

As I explained in that old review, it's natural for LCD monitors to take longer to change their pixel state when you ask the pixels to change by a smaller amount. That's called "grey-to-grey" or "GTG" response time, and there's no clear standard for measuring it.

Grey-to-grey changes happen whenever you're not asking one subpixel (each pixel's made up of a red, a green and a blue subpixel) to go all the way from fully on (which, confusingly, means that subpixel is black; a powered LCD element blocks light) to fully off (which means it's as bright red, green or blue as it can manage). The LCD elements are driven to the desired brightness by a control voltage, and (to a first approximation) the closer they are already to the brightness you're looking for, the longer it will take them to reach the target brightness.
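You can sketch that idea in a few lines. This is a toy model, not real liquid-crystal physics - the only thing it's built to show is that when the drive step is small, the pixel settles slowly:

```python
import math

# Toy model (illustrative only): the pixel approaches its target
# exponentially, with a rate constant proportional to the size of the
# drive step. Small grey-to-grey steps are therefore driven gently,
# and settle slowly.
def settle_time(start, target, tolerance=0.02, gain=1.0):
    """Time for the pixel to get within `tolerance` of `target`."""
    step = abs(target - start)
    if step <= tolerance:
        return 0.0
    rate = gain * step                    # big swings are driven harder
    return math.log(step / tolerance) / rate

full_swing = settle_time(0.0, 1.0)        # black to white
small_gtg  = settle_time(0.4, 0.6)        # mid-grey to nearby grey
print(f"full swing: {full_swing:.1f}  grey-to-grey: {small_gtg:.1f}")
```

The grey-to-grey transition covers a fifth of the distance, but takes about three times as long to settle - which is the shape of the problem, if not the real numbers.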

For small brightness changes this doesn't matter much, because you can't easily see the difference anyway. But for medium changes it can be very visible. That's why a mouse cursor waved around on a grey background may be much more noticeably blurry, on an older LCD, than when it's waved around on a black or white background.

To get around this problem, all LCD manufacturers - not just Asus - have come up with "Response Time Compensation" tricks, which do things like overdriving the control voltage until the subpixel gets to the right brightness, then chopping it to the value you actually want. These tricks are quite sophisticated now, and they're what give us all of these tiny response time figures for modern LCDs, like the 5ms for the Samsung 225BW.
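Here's a minimal sketch of the overdrive trick, again with made-up numbers. Real monitors do this with calibrated per-transition tables in the panel electronics, but the principle is the same: ask for more than you want, then chop back:

```python
# Toy first-order pixel model (illustrative numbers only).
def simulate(drive_plan, start=0.0, rate=200.0, dt=0.001, steps=20):
    """Step a pixel toward whatever target drive_plan(t) asks for."""
    level, trace = start, []
    for i in range(steps):
        target = drive_plan(i * dt)
        level += rate * (target - level) * dt
        trace.append(level)
    return trace

# Asking politely for 0.6, versus overdriving toward 1.0 for 4ms first:
plain     = simulate(lambda t: 0.6)
overdrive = simulate(lambda t: 1.0 if t < 0.004 else 0.6)

# After 4ms the overdriven pixel is already near 0.6;
# the plain one is still well short of it.
print(round(plain[3], 2), round(overdrive[3], 2))
```

Both pixels end up at the same brightness; the overdriven one just gets most of the way there much sooner, which is exactly what the grey-to-grey spec measures.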

This is standard marketing practice, of course. When the black-white-black numbers looked best, they quoted those. Now the grey-to-grey numbers look better, so they quote those instead.

The ISO 13406-2 standard for LCD monitors mainly talks about dead pixels and viewing angles and so on, but it also defines response time as a full black-to-white-to-black cycle. By that measure, modern LCDs are all still up around 16ms, if not worse.

(It kind of stands to reason that you can't get a black-white-black response time better than a sixtieth of a second, 16.7 milliseconds, out of an LCD monitor with a 60Hz refresh rate. No matter how fast the pixels respond, if you only get a new frame every 16.7 milliseconds, the shortest possible period of time that can have three frames of any colour in it is a hair over 16.7 ms. Come in right at the end of one frame, see the whole of the next one, see a tiny bit of the start of the one after that. It's still physically possible to measure LCD response speed independent of refresh rate, but it's not very relevant to real users, when even LCDs that accept higher frequency input probably still only actually refresh at 60Hz. That's right - they just throw away any extra frames in the input signal.)
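The arithmetic in that parenthesis is easy to check:

```python
# Frame period at a 60Hz refresh rate:
frame_ms = 1000 / 60                 # about 16.7ms

# A black-white-black cycle needs pieces of three consecutive frames:
# a sliver of the first, all of the second, a sliver of the third.
# So the shortest span that can contain one is a hair over frame_ms.
sliver_ms = 0.01                     # arbitrarily tiny slice of a frame
shortest_bwb_ms = sliver_ms + frame_ms + sliver_ms

print(round(frame_ms, 1), round(shortest_bwb_ms, 2))
```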

Relatively slow black-white-black response is not actually a serious problem for most applications. Modern LCDs are quite fast enough for video and action games. But the two kinds of response time do explain why some people still see blurring in certain circumstances. They're not fooling themselves; for large brightness changes, modern screens don't come close to the tiny response time numbers on the spec sheet.

But wait, there's more.

It's normal for LCD monitors, whether you're using DVI or VGA input, to have some tens of milliseconds of "input lag", as they buffer the incoming data in their panel driver hardware. This doesn't make the image blur, but it does make LCDs that much slower than a pure analogue monitor to get an image onto the screen. This can affect audio/video sync in movie playback, and make games feel slightly more sluggish too, but not everybody can notice the difference. I'm pretty good at spotting, and being annoyed by, minor lip-sync problems in video; most people don't seem to notice errors below 100ms.

(I also guarantee you that most people will notice the difference a great deal more if they think they've got a "slow monitor" than if they actually have got one, but have never heard of input lag. And yes, in case you were wondering, 30ms of monitor input lag does make something of a nonsense of gamers' rabid attempts to scrape that last few milliseconds off their ping time with special routers and TCP/IP tweaks.)
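To put those numbers side by side (the figures below are illustrative round numbers from the discussion above, not measurements of any particular monitor or network):

```python
# Rough latency budget for a 60Hz LCD setup (illustrative numbers).
monitor_input_lag_ms = 30        # typical LCD buffering, as above
frame_period_ms = 1000 / 60      # you also wait for the next refresh
ping_tweak_saving_ms = 3         # hypothetical gain from router/TCP tweaks

display_side_ms = monitor_input_lag_ms + frame_period_ms
print(round(display_side_ms, 1), ping_tweak_saving_ms)
# The display side of the budget dwarfs what the network tweaks save.
```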

Input lag ought to be slightly worse for "VGA" input, since the monitor has to turn the analogue VGA signal into digital before it can even start buffering it. VGA also cannot, theoretically speaking, be as accurate to the video card's orders as digital-all-the-way DVI. In practice, though, it's seldom easy to tell the difference even with two monitors next to each other showing the same content. Unless your VGA cable's lousy, it's next to impossible to tell the difference if you're switching between modes on the one screen, even at quite high resolutions.

Since even cheap video cards have DVI output these days, you might as well use it. But it's nice to have VGA input on your monitor, too; if nothing else, it should let you easily connect one monitor to two computers at once.

The persistence of vision of your eyes, incidentally, makes it very hard to see changes briefer than about ten milliseconds unless there's a big difference between the frames. Two similar frames will tend to be overlaid on top of each other, as far as your retinas are concerned.

So a few LCDs are now emulating CRTs by artificially inserting their own black frames, or just a super-fast scrolling black bar, to make fast-moving objects clearer. They're deliberately adding flicker to the flickerless LCD, in other words. Freaky.

X-bit Labs have an excellent piece on all of this stuff, including the mysteries of contrast ratio (perfectly fine, for modern LCDs) and colour gamut (getting better and better for LCDs, but cheap LCDs are already more than good enough for almost everyone on the planet) here.


I need to replace an aging CRT monitor and am interested in something like the Samsung SyncMaster 940BW [yours for a very reasonable $AU352 including Sydney delivery from m'verygoodfriends at Aus PC Market; Aussie shoppers can click here to order!].

The problem is that I've only got onboard video, and my budget won't extend to a new monitor and a graphics card. I'm running a Gigabyte 7VM400M-RZ with a Via S3G KM400 chipset, which I suspect will only run to 1024 by 768.

Do you know if I can get this to run at the 1440 by 900 resolution of the 940BW? If I can't, will the monitor be watchable if I run it at 1280 by 768 for the six months it'll take me to save up for a video card?


If you're talking about 3D mode, then yes, 1024 by something or other is the highest resolution at which you can expect anything vaguely resembling a playable frame rate in even quite old games.

If you're only concerned about 2D mode, though (Kim got back to me and explained that this was indeed what she meant), the KM400's integrated Via "UniChrome" graphics adapter should be able to drive anything up to 1920 by 1200 with no problems, if not more.

Any old graphics adapter has been able to drive all normal monitors to their limits in 2D mode for many years, now. Only now that we've got four megapixel screens at the top of the monitor market do you actually need anything special for 2D, and even then only because you need dual-link DVI to drive those giant 30 inchers at a reasonable refresh rate.
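The dual-link requirement is simple arithmetic: single-link DVI tops out at a 165MHz pixel clock, and a 30 inch panel wants more pixels per second than that.

```python
# Single-link DVI is limited to a 165MHz pixel clock.
single_link_pixels_per_sec = 165_000_000

# A 2560x1600 30-inch panel at 60Hz needs more than that, even before
# you count the blanking intervals a real video signal also carries.
needed = 2560 * 1600 * 60
print(needed, needed > single_link_pixels_per_sec)
```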

I don't know whether there are any weird widescreen monitor aspect ratio issues with the KM400, though. I've been able to find some, but not many, people complaining about such problems. Most of the complaints are from Linux users (who have driver problems that Windows users don't), and I can't see anybody complaining about problems with your particular motherboard and widescreen monitors. But I still wouldn't bet my life that it'll be plain sailing.

I'd bet a small amount of money that everything'll be OK with nothing more drastic than an update to the latest motherboard BIOS version, though, and there's a good chance that it'll work fine right off the bat. Widescreen monitors are not exotic technology any more, and your motherboard is not ancient.

Regarding watchability at lower resolutions - for suitably small values of "watchable", and assuming widescreen is not a problem, it'll be fine.

Old LCDs used to do nearest-neighbour scaling to enlarge lower resolutions to the full screen size. Or they just displayed the lower resolution with a black border around it, which for my money looked better than the nearest-neighbour Lego-fest (for a fine example of why nearest neighbour is bad, I recommend this old review of a very underwhelming digital camera).

Every LCD for quite a lot of years now, though, has done nice smooth scaling for lower-than-native resolutions, so you just get a fuzzy display. For productivity that's lousy, but for many games, and movie viewing, it's acceptable enough.
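The difference between the two scaling approaches is easy to see on a single row of pixels (the pixel values here are made up for illustration):

```python
# Nearest-neighbour scaling: each output pixel just copies the closest
# input pixel, so a hard edge stays hard and blocky.
def nearest(row, out_len):
    n = len(row)
    return [row[int(i * n / out_len)] for i in range(out_len)]

# Linear interpolation: output pixels blend their two nearest input
# pixels, so a hard edge becomes a soft grey ramp - the "fuzzy" look.
def linear(row, out_len):
    n = len(row)
    out = []
    for i in range(out_len):
        pos = i * (n - 1) / (out_len - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

edge = [0, 0, 255, 255]       # a hard black-to-white edge
print(nearest(edge, 8))       # blocky: every value stays 0 or 255
print(linear(edge, 8))        # fuzzy: intermediate greys appear
```

Real monitor scalers interpolate in two dimensions and with fancier filters, but the effect on a sharp edge is the same: the Lego goes away and a little blur arrives in its place.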

Think of it as free anti-aliasing!

Australian shoppers in search of a selection of fine LCD monitors may care to peruse the wares of Aus PC Market.