February 18, 2007

Pseudo-HD: why resolution isn't everything

Last December, I finally bought an HD set. It's a Sony Bravia 46" LCD. The picture is gorgeous. Sony may be writing the book on how not to launch a game console, but their televisions are still quite nice.

For Christmas, Meredith got me the first season of Battlestar Galactica on DVD. A few weeks ago, we watched the miniseries from the DVD using my Xbox 360's DVD player. The DVD is in standard-def, of course, but the 360 does a decent job of upscaling it to the 1080i output. You'd never mistake it for HD, but it was passable.

The day after we watched that, I noticed that Universal HD on Comcast was showing the same BSG miniseries. I started watching it to see how much better it looked in HD.

It didn't.

High-def only specifies the resolution. Universal HD / Comcast (not sure who compresses the signal; probably Comcast) was broadcasting the show at 1920x1080 pixels, but they were some awfully fuzzy pixels. Crank the bitrate down far enough, and the detail that can go into those pixels drops dramatically, making the picture look no better, and sometimes worse, than something with less resolution.
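
To put rough numbers on it: divide the bitrate by the pixels per second and you get an average bit budget per pixel. The bitrates below are illustrative guesses rather than actual DVD or Comcast figures, and the quick Python sketch assumes an effective 30 frames per second for both formats:

```python
def bits_per_pixel(bitrate_bps, width, height, fps=30):
    """Average bits available per pixel per frame at a given bitrate."""
    return bitrate_bps / (width * height * fps)

# Illustrative bitrates only -- I don't know what Comcast actually uses.
dvd_sd  = bits_per_pixel(6_000_000, 720, 480)     # DVD-ish SD stream
lean_hd = bits_per_pixel(8_000_000, 1920, 1080)   # heavily squeezed 1080i
full_hd = bits_per_pixel(19_000_000, 1920, 1080)  # roughly the ATSC broadcast max

print(f"SD DVD:       {dvd_sd:.2f} bits/pixel")   # ~0.58
print(f"squeezed HD:  {lean_hd:.2f} bits/pixel")  # ~0.13
print(f"full-rate HD: {full_hd:.2f} bits/pixel")  # ~0.31
```

1080 has six times the pixels of DVD-resolution video, so at a similar bitrate each HD pixel gets a fraction of the bits a DVD pixel does, and that's where the fuzz comes from.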

This isn't just about HD vs. SD. Some digital TV providers broadcast SD signals at 480x480 and then scale them up to full SD (640x480, if you assume square pixels). At a given bitrate, that often looks better than encoding the same signal at 640x480, because the full-resolution version has to spread the same bits over more pixels. The effect is strongest at low bitrates; with enough bits to go around, you should definitely use the higher resolution.
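
The same bits-per-pixel arithmetic shows the size of that advantage. A quick sketch with a made-up 2.5 Mbps SD stream (actual cable bitrates vary by provider and channel):

```python
# Hypothetical 2.5 Mbps SD cable stream; real figures vary by provider.
bitrate, fps = 2_500_000, 30

for width, height in [(480, 480), (640, 480)]:
    bpp = bitrate / (width * height * fps)
    print(f"{width}x{height}: {bpp:.3f} bits/pixel")  # 0.362 vs. 0.271

# 480x480 gets 640/480 = ~1.33x the per-pixel bit budget, which at low
# bitrates tends to matter more than the lost horizontal resolution.
```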

But Universal HD is a joke at the bitrate they're using on Comcast.

Posted by Mike at February 18, 2007 12:55 PM