Dell UltraSharp 3007WFP-HC LCD monitor

Review date: 3 August 2007.
Last modified 03-Dec-2011.

 

For ages now, my monitor of choice has been a Samsung 1100p Plus "21 inch" CRT. The scare quotes are there because this monitor, like all CRTs, has a smaller viewable screen area than its size specification suggests. It still has a 20 inch viewable diagonal, though; along with an old Samsung 753DF 16-inch-diagonal screen (even basic PC graphics cards have for years now been able to drive two monitors), the 21-incher gave me quite a lot of screen real estate.

I stuck with that old CRT for more than four years, mainly because it worked perfectly well (when set up properly...). It was also (barely) tolerably sharp at 1600 by 1200.

After all that time, though, I could no longer get the focus very close to right on one part of the screen without making another part fuzzy, and something was going slightly intermittent and making the image go "boing" from time to time.

It was time for a change.

A big change, naturally.

The only really meaningful screen size upgrade from a 20-inch plus a 16-inch, I said very reasonably to myself, is one of those outrageous thirty inch behemoths.

So I bought one.

Vast monitor

A Dell UltraSharp 3007WFP-HC.

The 3007WFP-HC, as indicated by the ever-increasing number of letters after its name, is the third major revision (in fairly quick succession) of Dell's monster monitor. All of the thirty inchers - including the ones sold by other companies, like Apple, who were first into this market - have the same 2560 by 1600 resolution, and require a dual-link DVI connection to run at their full resolution.

But the panels (which are almost never made by the company that "makes" the monitor) have been improving, and the prices have been falling.

There were two revisions of the first Dell 30-incher, the 3007WFP. It started out costing $US1999 if you bought it when it was on special (which is the only time when you should buy any product from Dell, the Discount Persian Carpet Warehouse of the computer world). The Australian non-discount list price was $AU2898 or something.

Now, there's the 3007WFP-HC, launched at the end of 2006. It has a wider colour gamut thanks to an improved LG.Philips LM300WQ1 Super In-Plane Switching (S-IPS) LCD panel. And, when on special, it also has a much lower price.

$AU1799 delivered?

Sold!

(As I write this, it's back up to a strangely precise $AU1,998.70. The discount's currently shifted to the poky little 24 incher.)

I remember when you could pay that much for a CRT screen, and not even get a 21 incher. I also remember the slightly more recent time when a fifteen inch LCD could set you back well over $AU3000 - with a mere 18 incher way over $AU6000.

The 3007WFP-HC still isn't cheap, but $AU650 per square foot of screen area (the actual viewable diagonal is a mere 29.7 inches) looks like a bargain to me.
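
If you'd like to check that figure, the arithmetic's simple enough. Here's a quick sketch, using the on-special price and the 29.7-inch, 16:10 viewable diagonal mentioned above:

```python
import math

# Back-of-the-envelope dollars-per-square-foot for the 3007WFP-HC.
price_aud = 1799                    # the delivered on-special price
diagonal_in = 29.7                  # viewable diagonal, in inches
aspect_w, aspect_h = 16, 10         # 2560 by 1600 is a 16:10 panel

unit_diag = math.hypot(aspect_w, aspect_h)
width_in = diagonal_in * aspect_w / unit_diag    # about 25.2 inches
height_in = diagonal_in * aspect_h / unit_diag   # about 15.7 inches

area_sq_ft = width_in * height_in / 144          # 144 square inches per square foot
print(f"{area_sq_ft:.2f} square feet")           # about 2.75
print(f"${price_aud / area_sq_ft:.0f} per square foot")  # about $653
```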

Everybody seems to love the 3007WFP-HC, so I didn't do a whole lot of in-depth preparation for the arrival of my new four-megapixel friend. A few things about it therefore surprised me. Rather than go over the same ground as every other reviewer, I'll concentrate on these surprises.

The basics

The 3007WFP-HC's special larger-colour-gamut panel is shared by the HP LP3065, which came out a bit earlier and has three DVI inputs to the Dell's one. Beyond that one DVI port, the 3007WFP-HC gives you absolutely no other inputs - no analogue "VGA", no component video, no nothing. It's got a USB hub and a multi-slot memory card reader in it, but that's it for extras.

And I really do mean that's it. Like the panels in Apple's monitors for some time now, this LG.Philips panel is beautiful but brainless. It has no video processing hardware, so the only actual controls on the monitor are three capacitive touch switches on the front. One power button, one plus button, one minus button. The plus and minus change the screen brightness. And that's all you get.

There's a rinky-dink on-screen-display application for Windows systems (which won't work if you don't plug the monitor's USB cable into your computer), but all it does is show the brightness setting. There's no contrast control, no colour temperature, nothing. Just brightness. Everything else you have to set in software in your operating system.

This isn't as much of a limitation as it would have been a few years ago, though, because the driver software for every half-decent graphics chipset these days includes various colour response adjustments, from gamma correction to colour temperature adjustment. If a particular screen's factory settings aren't horribly wrong, you shouldn't actually need more than a little adjustment.
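
If you've never poked at those driver controls, gamma correction is nothing fancy - just a lookup table that remaps each channel's 256 possible values. A minimal sketch (the 0.9 gamma value is an arbitrary example, not a recommendation):

```python
# A minimal sketch of the per-channel gamma lookup table a graphics
# driver builds. With this convention, gamma below 1.0 darkens the
# midtones and gamma above 1.0 lightens them.
def gamma_lut(gamma):
    return [round(255 * (i / 255) ** (1 / gamma)) for i in range(256)]

lut = gamma_lut(0.9)    # arbitrary example value
print(lut[128])         # the 50%-ish midtone 128 remaps to about 119
```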

I've noticed other reviewers often remark on - and sometimes complain about - the WFP-HC's super-saturated, "cartoonish" default colour rendition. Extra-punchy colours are fine for most people and great for games, but they're a disaster if you want to do serious design or image processing.

I have nothing to contribute to the WFP-HC colour debate, because almost the first thing I did with it after hooking it up was calibrate the sucker.

If it's stupid but it works, it's not stupid.

The ColorVision Spyder wasn't really made for a screen this big, but it worked fine with a little persuasion.

Now, the monitor's colour response looks A-OK to me. Yes, there's a faint purplish glow to very dark colours around the edges of the screen unless you sit unrealistically far back (it looks sort of like a reflection of something that isn't there, and is unavoidable with LCDs in general and S-IPS panels in particular), but even that is barely noticeable almost all of the time. And there's nothing else to report.

(UPDATE: I've done enough Photoshopping using the new monitor now that I'm quite familiar with the difference between its unadjusted response and its mild-mannered behaviour after profiling. Yes, the difference is large, and can cause great confusion among those unschooled in colour management.)

If you're doing really colour-critical work then no consumer LCD will do, and many CRTs won't cut it either. But it is my considered opinion that nine out of ten people who profess great concern for the colour accuracy of their screen would not actually notice anything, or find any part of their job harder, if you sneakily swapped their carefully generated screen profile for a whole different one.

Hardware monitor calibrators still cost a couple of hundred US bucks, but that expense can easily be split between you and however many friends you can rope in. You don't need to recalibrate monitors very often (realistically, once'll do it, for LCDs), so it's very easy to share a calibrator.

On the subject of calibration and colour response and all that - the 3007WFP-HC's panel has, of course, proper eight-bit-per-channel (24 bit depth total) colour resolution.

CRT monitors are analogue devices (not counting ancient ones from the days of CGA and EGA), so they can display as many gradations of colour as the input signal can bring them. LCDs, being digital devices, each have a fixed colour resolution of their own, no matter how fine-grained the incoming signal is - and cheaper LCDs are commonly only six-bit-per-channel devices.

Cut-rate six-bit LCDs (all with old-style twisted-nematic-plus-film panels, I think) are still very easy to find. Dell still have a few of those in the cheap seats of their lineup.

6-bit screens can simulate proper 24-bit colour by using spatial dithering, or with Frame Rate Control (FRC), in which the monitor flickers one colour after another to approximate the desired in-between colour - temporal dithering, if you will. Both strategies are good enough for a lot of users, and the cheap 6-bit TN panels have the super-fast response that sells monitors to gamers. But they still aren't a great choice for photo retouching, and often show noticeable colour banding on some images, even if calibration hasn't cut down the portion of their colour gamut they're allowed to use.
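
If you're wondering how flickering between two colours can buy you the missing two bits, here's a toy simulation of the FRC idea. The frame count and target value are made up for the example; real panels do this per subpixel, in hardware:

```python
# Toy Frame Rate Control (temporal dithering) simulation: a 6-bit panel
# fakes an 8-bit level by flickering between the two nearest levels it
# can actually show, so the time-averaged brightness comes out right.
def frc_frames(target_8bit, n_frames=4):
    lo = target_8bit // 4 * 4        # nearest displayable level below (6-bit = steps of 4)
    hi = min(lo + 4, 252)            # next displayable level up
    frames, owed = [], 0
    for _ in range(n_frames):
        owed += target_8bit - lo     # brightness we owe the viewer this frame
        if owed >= 2:
            frames.append(hi)        # pay some back with the brighter level
            owed -= 4
        else:
            frames.append(lo)
    return frames

frames = frc_frames(130)             # 130 sits between displayable levels 128 and 132
print(frames, sum(frames) / len(frames))   # [132, 128, 132, 128] averages to 130.0
```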

Panel colour gamut and bit depth interact in another way, too.

"Colour space" is a good descriptive name, here; if you plot displayable values of red on one axis, green on a second and blue on a third, you define a 3D space whose size and shape varies with the R, G and B capabilities of a display device. A theoretical screen that only had red and green subpixels on it with no blue (such screens actually exist - big LED display boards with only red and green LEDs on them, for instance) would have an RGB colour "space" that was two-dimensional, with no extension into the blue at all.

The larger the colour gamut - and you shouldn't try to compare three-dimensional colour spaces with single numbers, but the WFP-HC's "92% of NTSC" gamut does indeed have the thick end of 1.3 times the volume of a typical "72%" LCD's - the larger the space over which the finite number of colour settings the panel can manage is spread.

If you think of each possible setting - starting from 0,0,0 and working through all 16,777,216 possible settings of 24-bit colour to 255,255,255 - as a point in the space defined by the gamut, then those points will obviously be further apart when the gamut is larger, all other things being equal.

If you had some incredible monitor with a colour gamut 50 times that of NTSC but still an eight-bit panel (ten-bit panels are already appearing at the high end), then the difference between the green defined by 0,100,0 and the green defined by 0,101,0 would be quite noticeable. And, furthermore, if you displayed a picture made for a boring old 72%-of-NTSC monitor on that screen, the colours in the picture would take up a tiny fragment of the available gamut, and the monitor would have to try to display them with only the colour-points that fell within that tiny fragment. The result would be very heavily banded, or quite obnoxiously dithered, or both.
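
You can put rough numbers on both of those claims. Bear in mind that "percent of NTSC" figures are really areas on a chromaticity diagram, so treating their ratio as a volume ratio is a hand-waving simplification, not rigorous colorimetry:

```python
# Rough numbers for the gamut-versus-bit-depth argument above.
print(f"92% vs 72% of NTSC: {92 / 72:.2f}x")    # about 1.28 - "the thick end of 1.3"

# The same 2^24 colour points, spread over ever-larger gamuts, get
# further and further apart; a 50-times-NTSC gamut spreads them about
# 69 times more thinly than an ordinary 72%-of-NTSC screen does.
points = 2 ** 24
for gamut in (0.72, 50.0):
    print(f"{gamut:>5}x NTSC: {points / gamut:,.0f} points per unit of gamut")
print(f"spread ratio: {50.0 / 0.72:.0f}x")       # about 69
```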

It's already possible for some picky users to see this sort of thing happening when they view 72% content on a 92% screen. Realistically, though, it's only a problem for the few more-than-100%-of-NTSC LCD monitors (all with LED backlighting, I think) that're starting to trickle into the high-end market, but which don't necessarily have extra colour resolution to match their larger gamut.

Still, it's another thing to bear in mind if you're the one designer out of ten who really needs precise colour control. Just as record producers often use pretty ordinary speakers as studio monitors, some designers may be better off with a 72% screen than with a wide-gamut one.

Getting back to more prosaic concerns, the minimum brightness setting for the 3007WFP-HC is still pretty bloody bright. The maximum brightness is down a bit from the non-HC model, at a mere 300 candelas per square metre, but that's still outrageously bright. Not nearly as bright as sunlight on paper, but way brighter than anybody should set a normal indoor desktop monitor.

Ideally, your monitor shouldn't be any brighter than a well-lit book (a concept which is probably news to the 60Hz-CRT brigade who, today, don't know how to adjust their laptop's screen brightness...). But I can't turn the 3007WFP-HC down that far. Well, not without opening the thing up and fooling with the backlight power supply or something.

I've rigged up a quick-'n'-dodgy bias light behind the monitor to reduce eyestrain, and JediConcentrate and the Darken bookmarklet help to reduce the number of minutes I spend with millions of bright white pixels tanning my retinas. And, over the last few weeks, I've gotten more used to it. But my old CRT was still easier on the eyes in this department - although I more than made up for that with all the squinting I did at its blurry image.

Funny numbers

Like virtually every other LCD on the market today, the 3007 is alleged to have "178°" horizontal and vertical viewing angles. That's about as far as you can go before the bezel around the screen interrupts your view, but it's silly for another reason too. LCD monitors today obviously have far fewer problems with image weirdness when you view them from an angle, but they also very obviously do still look different when looked at from only, say, 20 degrees off axis. Colour and contrast don't stay the same, as they do when you view something like a wall poster from an angle.

The reason for the goofy 178-degree figures is that the specification allows for a very large drop in screen contrast ratio. The standard contrast-drop figure for angle specifications is a factor of ten - in other words, black and white that differ in brightness by a factor of 300 when you're looking at the screen dead-on are allowed to differ by only a factor of 30 when you're at the most extreme viewing angle.
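
Run those numbers and you can see how little the spec promises:

```python
# What a 10:1 contrast-drop allowance does to a 300:1 panel at the
# edge of its rated "viewing angle".
head_on = 300        # black-to-white brightness ratio, viewed dead-on
drop = 10            # the standard allowed contrast-drop factor
print(f"{head_on // drop}:1 still counts as 'within viewing angle'")   # 30:1
```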

Some LCD monitors also have a more realistic 5X-contrast viewing angle figure, but I wouldn't make any bets about such a specification being available for any particular monitor, if I were you.

From all normal viewing angles for a single computer user, the 3007WFP-HC is fine. There are still obvious contrast differences with angle, which manifest in the usual LCD-monitor way as a sort of glowy reflection-y kind of effect that moves around the screen as you move your head, but the effect is quite mild and doesn't bother me at all.

The 3007WFP-HC also has very low response time figures, which don't mean a great deal, as I explain here. That article also mentions input lag, which is apparently also very low for the Dell. I've noticed no blurring or significant lag, ever, for anything.

(I've now measured the amount of power the 3007WFP-HC consumes, by the way. Read all about it here.)

Dead pixels

Back in the bad old days of super-expensive LCDs, you weren't even guaranteed to get a panel whose pixels all worked. Every LCD pixel is composed of a red, a green and a blue "subpixel", each of which is operated by its very own transistor.

So on a 2560 by 1600 screen like this, the factory has to make twelve million, two hundred and eighty-eight thousand flawless transistors for the screen itself to work perfectly. That's peanuts compared with the number of transistors in a CPU, but CPU transistors don't have to be see-through.

In the LCD olden days, manufacturing error rates were so high that virtually no panels had zero defects. Things are much better now, though.

The 3007WFP-HC's panel is apparently in pixel error class II as defined by ISO standard 13406-2. That allows a rather worrying five malfunctioning subpixels per million pixels (with further rules about stuff like how close they can be to each other), which'd be as many as 20 dud subpixels on a monster screen like this.
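
For the record, the arithmetic behind those numbers:

```python
# Subpixel transistor count, and the ISO 13406-2 Class II allowance,
# worked through for this 2560 by 1600 panel.
pixels = 2560 * 1600                    # 4,096,000 pixels
subpixels = pixels * 3                  # one transistor per subpixel
per_million = 5                         # Class II malfunctioning-subpixel allowance
allowed = pixels / 1_000_000 * per_million
print(f"{subpixels:,} subpixel transistors")           # 12,288,000
print(f"up to {allowed:.2f} permitted dud subpixels")  # 20.48 - call it 20
```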

As far as I can see, though, there are actually only two subpixel defects on this screen. One red subpixel never gets below about 25% brightness when it's meant to be off, and one green subpixel near it has a minimum brightness of something like 10%.

I only discovered the green defect just now, while I was trying to find the red one again. If I don't display a black screen and pore over it obsessively, both defects are completely invisible.

And this isn't a cherry-picked Special Very Nice Monitor To Send To Reviewers. This sort of trivial defect seems to be all you're at all likely to get, these days.

IMAX computing

Users of 30-inch monitors face the terrible, terrible problem of how to effectively use all of that space. You don't often want to maximise a folder or document window on a screen this big; either you'll end up with a lot of white space and important program buttons separated by a vast expanse of nothing, or you'll get lines of text 300 or more characters long, which are difficult to read.

I've been using WinSplit Revolution to manage this problem. It's a neat little Windows utility that makes it easy to bounce (most) windows around the screen and quickly resize them to take up the amounts of screen you probably want them to occupy. Two panes, each 1280 by 1600, give you a couple of twenty inch portrait-aspect-ratio "screens" that work great for many tasks. And also occasionally give me Amiga flashbacks.
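
The twenty-inch-portrait-screen claim checks out if you run the numbers, too (same 29.7-inch, 16:10 viewable diagonal as before):

```python
import math

# Split the 16:10, 29.7-inch-diagonal screen down the middle: each
# 1280 by 1600 pane has about the diagonal of a 20-inch portrait monitor.
diag = 29.7
unit = math.hypot(16, 10)
width = diag * 16 / unit                 # about 25.2 inches
height = diag * 10 / unit                # about 15.7 inches
pane_diag = math.hypot(width / 2, height)
print(f"{pane_diag:.1f} inch pane diagonal")   # about 20.2 inches
```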

A vast monitor is, of course, also great for watching video - but not all video. Most video files play just fine in fullscreen mode at full resolution, but some use codecs that need too much CPU horsepower for scaling, or weird resolutions that can't be scaled by whole-number amounts, or something. Result: Lousy frame rate when you switch to full four-megapixel mode.

This problem's easily fixed with any half-decent media player. You just change the fullscreen resolution to 1280 by 800 or something and you're fine, although you'll now have to wait a moment for the monitor to resync when you go to fullscreen mode.
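
The list of fullscreen resolutions that scale up to 2560 by 1600 by whole-number amounts is short, if you want it:

```python
# Resolutions that map onto the 2560 by 1600 panel with whole-number
# scaling - each source pixel becomes an exact square block of pixels.
native_w, native_h = 2560, 1600
for factor in range(2, 9):
    if native_w % factor == 0 and native_h % factor == 0:
        print(f"{native_w // factor} by {native_h // factor} ({factor}x scaling)")
        # prints 1280 by 800, 640 by 400, 512 by 320 and 320 by 200
```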

Unfortunately, Flash video is seldom viewed in a half-decent media player. I can't find a way to play online Flash video, like YouTube clips, fullscreen any more.

The latest beta version of the Flash player has hardware acceleration and perfectly cures the problem when I'm playing Flash files in a separate media player, but the browser plugins either aren't fixed, or don't work with the YouTube-style fullscreen scaling, or just plain don't install (in the case of the ActiveX IE plugin, which I don't want to use anyway).

If you have to view YouTube videos in the standard-sized window on a monitor this size you may experience flashbacks from the early days of "multimedia" software, with a hilariously tiny video window (5.5 inch diagonal!) in the middle of a vacant wasteland of screen real estate (26/27ths of the screen!). I'm using the Greasemonkey script YouTube Googler as a stopgap measure; it stretches YouTube video to fit the browser window, and I can use WinSplit Revolution to instantly make that window an appropriate size.

(YouTube Doubler is not as helpful as its name might suggest.)

YouTube itself may solve this problem, as it's switching to the H.264 codec, which ought to scale better.

The scaling problems apply to games as well, of course. Recent 3D games require a very, very muscular graphics card if you want them to run smoothly at 2560 by 1600. It's easy enough to work around this if you don't feel like buying a GeForce 8800 GTX along with your new monitor, of course; just run your games at 1280 by 800 - or even 640 by 400 - and you'll be fine.

I've found that Supreme Commander runs just fine at full resolution with my relatively humble GeForce 7900 GT. Fullscreen 3D action is pushing it...

Supreme Commander at 2560 by 1600

...but splitting the screen and putting the easy-to-compute cartographic map on one half of it solves the problem nicely.

Very large pixels.

Other possibilities also suggest themselves.

Overall

A monitor should just be an abstract hole into data-space. You shouldn't see the monitor, you should see the image. Ideally, you shouldn't even be able to perceive the pixels on the screen, but we're a long way from the few hundred pixels per inch and resolution-independent software that'd make that possible.
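
For the record, this screen is nowhere near that pixel density:

```python
import math

# Pixel density of the 3007WFP-HC: the 2560 by 1600 grid on a 29.7-inch
# viewable diagonal works out to about 102 pixels per inch.
ppi = math.hypot(2560, 1600) / 29.7
print(f"{ppi:.0f} pixels per inch")
```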

In the meantime, the LG.Philips LM300WQ1 panel is, for general purpose use, the best single computer display device in existence. Better panels - even higher resolution ones - exist, but they're only available in extremely expensive special-purpose monitors, like those used in medical imaging and exotic CAD applications. The high-end designers' monitors have better colour rendition than the LM300WQ1, but I'm pretty sure none of them are anything like as big or high-resolution.

So even if you're a billionaire, the LM300WQ1 panel is, as of 2007, the best that money can buy for normal computing. And the Dell 3007WFP-HC is the cheapest monitor that uses it.

And yes, you do get used to it. At first it was as if someone'd replaced my monitor with a drive-in movie screen, but now it's just... a screen.

You know what's next, don't you.


Review monitor kindly provided by Dell, after I kindly paid them for it.


