This article was originally published at www.display-central.com.
As with any new technology, there is often a clash between the engineering developers and the marketeers. (To this day, one of my pet peeves continues to be the way audio amplifier power is defined, but that’s another story.) Ultra-High Definition Television, unfortunately, continues that trend. What the term actually means, and how it is modified, doesn’t currently present a clear picture, if you’ll forgive the pun.
The origin of the term Ultra-High Definition (or Ultra-HDTV, UHD TV, etc.) goes back at least to 2002, when NHK described an “Ultra-high-definition, Wide-screen System with 4000 Scanning Lines”, and I’m sure astute readers will find even earlier references. Now that it’s out of the research labs and actually in retail stores, the mayhem begins. The CEA saw this coming, so back in June it released the official CEA Ultra High-Definition Display Characteristics V2, which stated that a TV, monitor or projector may be referred to as “Ultra High-Definition” if it meets certain minimum performance attributes, including a display with at least 3840 pixels horizontally and at least 2160 vertically. Of course, all this means is that a product carrying the CEA-trademarked UHDTV logo must meet these performance requirements.
Therein lies the next obfuscation opportunity: actual vs. “capable”. Yes, friends, history repeats itself. Remember when the market was flooded with “HD-capable” monitors that couldn’t actually display HDTV, and the ever-annoying “Real HD”? Of course, it’s happening again. It reminds me of “new and improved”. You mean I was a bozo to buy the original? Are there other definitions? Of course there are.
Last year, in several specifications, SMPTE updated what UHDTV is, or at least what some related terms mean. In SMPTE ST 2036-1, it defines “UHDTV1” (or UHD-1) as the 3840 × 2160-pixel image format, and UHDTV2 (or UHD-2) as the 7680 × 4320-pixel image format. One assumes that UHDTV is the union of these characteristics; aside from a variety of the usual frame rates, other pixel arrays are not defined. Which brings up another factual fallacy: just what are “4K” and “8K”? SMPTE takes pains to declare that 4K is “a term used to describe images of 4096 × 2160 pixels although sometimes applied to UHDTV1 images … this term should not be used when referring to UHDTV1”.
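For readers who like to see the numbers, here is a minimal Python sketch of the arithmetic behind these formats (the dictionary labels are my own shorthand, not wording from any SMPTE or ITU document). It shows why 4096 × 2160, the digital-cinema container, really is a different animal from UHD-1: its aspect ratio reduces to 256:135 (about 1.90:1) rather than the 16:9 of the television formats.

```python
from fractions import Fraction

# Pixel formats discussed above; comments note where each is defined.
formats = {
    "UHD-1 (UHDTV1)": (3840, 2160),   # SMPTE ST 2036-1
    "UHD-2 (UHDTV2)": (7680, 4320),   # SMPTE ST 2036-1
    "DCI 4K":         (4096, 2160),   # digital-cinema 4K container
}

for name, (w, h) in formats.items():
    ratio = Fraction(w, h)  # reduces the aspect ratio to lowest terms
    print(f"{name}: {w} x {h} = {w * h / 1e6:.2f} Mpx, "
          f"aspect {ratio.numerator}:{ratio.denominator} ({w / h:.3f})")
```

Running it confirms that UHD-2 has exactly four times the pixels of UHD-1 (both 16:9), while “true” 4K adds width, not height.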
Similarly, SMPTE states that 8K is “a casual term for UHDTV2 images,” and “should not be used when referring to UHDTV2.” And, not to ignore its hard work, the ITU similarly recommends (R-REC-BT.2020-1-201406-I) that the pixel counts of 7680 × 4320 and 3840 × 2160 should be used “for UHDTV programme production and international exchange”. If we want to blame someone for the term “4K,” we could hang it on…