“4K” is the latest buzzword in the TV world.
Most advertising would have you think that 4K is akin to the leap we made from standard to high-definition TVs. Since HD made everything clearer, more pixels should only make your content look better, right?
The problem: That isn’t totally true.
Instead, when you’re buying your next TV, there’s another feature you should be more focused on: HDR.
Taking a step back: Standard definition (SD), high definition (HD), and 4K (or Ultra HD) refer to a characteristic called resolution — the number of pixels, the tiny dots of color, that make up a display.
A common HDTV has a resolution of 1080p. In simple terms, it’s 1,920 x 1,080 pixels: 1,920 pixels across the display horizontally, and 1,080 pixels down it vertically.
A 4K TV simply boosts that pixel count: Usually, 4K refers to a display resolution of 2160p, or 3,840 x 2,160 pixels. That’s four times the total pixels of a 1080p picture, and the “4K” label comes from the roughly 4,000 pixels running across the screen horizontally.
(Technically, 4K isn’t the same as 2160p, but the technical differences are so minor that it doesn’t really matter.)
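The “four times the pixels” claim is easy to check with a bit of arithmetic:

```python
hd = 1920 * 1080    # 1080p: 2,073,600 pixels
uhd = 3840 * 2160   # 2160p: 8,294,400 pixels

print(uhd / hd)     # 4.0 — exactly four times the pixels
```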
Of course, companies love the “4K” buzzword, since it sounds bigger and better than normal HD, and 2160p sounds like a bigger, more appealing number than 1080p. If a 4K TV has four times the pixels of a typical HDTV, it should look four times better, right?
Of course, that’s not the case.
4K isn’t worse than 1080p, but your eyes are physically incapable of noticing those extra pixels unless you have a fairly large TV set and plan on sitting close to it.
This Carlton Bale article puts it into perspective: From about five feet away, you’d need something like an 84-inch TV to see the additional sharpness. With a more common 42- or 50-inch TV, you’d have to sit about two to three feet away. So, it’s not going to happen, basically.
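You can sketch the math behind those distances yourself. A common rule of thumb is that a 20/20 eye resolves detail down to about one arcminute; once a single pixel subtends less than that, extra pixels are invisible. The function below is a rough back-of-the-envelope sketch using that rule (assuming a 16:9 screen), not Carlton Bale’s exact methodology, and the numbers it produces are in the same ballpark as the article’s:

```python
import math

def max_benefit_distance_ft(diagonal_in, vertical_pixels):
    """Farthest viewing distance (in feet) at which a 20/20 eye can still
    resolve individual pixels, using the one-arcminute rule of thumb
    and assuming a 16:9 screen."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # screen height from the diagonal
    pixel_in = height_in / vertical_pixels           # size of one pixel
    one_arcmin = math.radians(1 / 60)                # ~0.00029 radians
    return pixel_in / math.tan(one_arcmin) / 12      # inches -> feet

print(round(max_benefit_distance_ft(84, 2160), 1))  # 84-inch 4K TV: ~5.5 ft
print(round(max_benefit_distance_ft(50, 2160), 1))  # 50-inch 4K TV: ~3.3 ft
```

Sit farther back than those distances, and a 4K panel of that size looks no sharper to you than 1080p.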