Ash's Ramblings

It's all about the pixels

At some point over the last few years it became the de facto standard for 'high definition' to refer to a picture size of 1920 by 1080 pixels. Allow me to rant on why I think this is dumb and wrong.

Firstly, 'high'. When DVDs came out, they were capable of producing (in PAL territories anyway) a progressive picture 576 lines high. This is now known as 'standard definition', despite the fact that before DVDs pretty much all video was interlaced, meaning each field carried only half the lines, so there were really only 288 lines of visible video at any one time. High def is obviously higher, but it is just that: higher. Not high. High def only came in because TVs are getting bigger. Compare 'high' definition to the definition of, say, a cinema camera and it looks very low indeed. So in 20 years' time when everyone has an 80-inch screen in their front room, 'high definition' will start to look really pixelly, and you'll probably find that 'extra high definition' and 'super extra amazingly high definition' will need to supersede high def. They really should have called it 'digital video generation 2' or something like that, so they can go for 3, 4, 5, etc next time.

But secondly, and more importantly, people say 'definition' when they actually mean 'resolution'. By only taking into account the image resolution (the frame size in pixels) when defining 'high definition' you end up with some pretty shocking pictures that are, in my opinion, wrongly classed as high definition. Heck, the word 'definition' actually means clarity, so why is it that a blocky, low-bitrate video stream can be classed as high definition just because it's 1080 pixel lines high, while a crystal-clear, higher-bitrate stream displayed at 576 lines is considered standard definition, even if it has a clearer picture? The answer is simple: 'high definition' is nothing more than a marketing term. It has about as much meaning as the 'V' in 'DVD'.
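To put some rough numbers on that, here's a quick back-of-envelope sketch. The bitrates below are made-up but plausible illustrations, not measurements of any real channel: a starved 'high definition' stream versus a generously encoded PAL DVD.

```python
# Illustration only: the bitrates here are assumptions, not measurements.

def bits_per_pixel(width, height, fps, bitrate_bps):
    """Average number of bits the encoder can spend per pixel per frame."""
    return bitrate_bps / (width * height * fps)

# A starved 'high definition' broadcast: 1920x1080 at 25 fps, 4 Mbit/s
hd_starved = bits_per_pixel(1920, 1080, 25, 4_000_000)

# A well-encoded PAL DVD: 720x576 at 25 fps, 8 Mbit/s
sd_decent = bits_per_pixel(720, 576, 25, 8_000_000)

print(f"Starved HD: {hd_starved:.3f} bits per pixel")
print(f"Decent SD:  {sd_decent:.3f} bits per pixel")
```

With those (assumed) figures the 'standard definition' disc gets roughly ten times as many bits to describe each pixel as the 'high definition' stream, which is exactly why the smaller picture can look clearer.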

A brief analogy: go into any decent camera shop and the salespeople will (correctly) tell you that megapixels are pointless, it's the lens that's important. More and more cameras, and even phone cameras, are being sold with 8, 10, even 12 megapixel sensors... but if you don't have a decent lens and CMOS sensor then it's only producing 12 megapixels of rubbish. Video is exactly the same... you can have a high def camcorder, but if it's storing hours of video on a poxy 2GB SD card then you may as well be recording in standard def and upscaling it later, it will look just as bad. I'm picking on home-video types with HD camcorders here, but professionals aren't flawless; try watching some low-popularity digital TV channel (e.g. Sky 3, ITV4) on a full-HD setup and you'll see how bad the picture is. There is no trickery or half-truth going on, the picture is indeed 'high definition', at least by its universally recognised definition, but it's a low bitrate and this is why it looks crap and blocky.

I think it's time we stopped thinking purely in terms of pixel resolution and more in terms of bitrate. We also need to redefine the phrase 'high definition' to better reflect the reality of digital video... it's not just about the picture resolution. As for me, I'm going to start saying 'high-res' rather than 'high-def'; it's more technically accurate. You're welcome to join me if you like.