We now know that the new 4th generation Apple TV announced for this fall won't ship with 4K UHD capability. But we don't know why Apple declined to include this feature and opened itself up to opportunistic competition, such as Amazon's Fire TV. I have a theory, and it reflects how my thinking has changed over the past few weeks.
4th gen Apple TV. Image credit: Apple
When Apple first announced its 4th gen Apple TV, it seemed most, including me, were at a loss to explain why Apple wouldn't include support for 4K UHD. After all, the new iPhone 6s family will shoot 4K video and the latest iMacs will display 4K video.
tl;dr. I've been doing some reading lately about the TV industry side of 4K UHD, looking for answers, and I've found increasing support for my theory that 4K as a resolution standard (3840 × 2160) isn't the only issue here. Rather it's the splintering of the industry around the emerging High Dynamic Range (HDR) technology. And Apple, I believe, is waiting for the dust to settle.
Normally, I pick one standout article of the week to highlight. However, 4K UHD HDR is a complicated subject. To explain the situation requires an appeal to several articles. Here are four that will bring us all up to speed, in increasing order of complexity and depth.
- 4K primer. "4K changes the whole picture"
- An introduction to some technology and 4KTV nomenclature. "What is Samsung's SUHD?"
- HDR 101: "What are HDR TVs and why is every manufacturer selling them now?"
- HDR Next Level: "HDR Is Here—But Don't Rush Out to Buy a New TV Just Yet."
After you read all four articles, you'll have a pretty good idea what's going on in the 4KTV industry today. However, for the impatient, I'll do my best to explain what's going on and the impact on Apple.
The BIG TV Industry Problem
For a few years now, the talking points for 4KTVs have been 1) lack of content, 2) home bandwidth to support streaming, and 3) the retina effect. The first two problems can be solved by technology, but the third is an unsolvable problem related to the visual acuity of the human eye.
Charts have been constructed that show how close one has to be to a 4KTV of a given (diagonal) size in order to gain the benefit of the higher resolution. Apple customers are well aware of this in relation to Retina displays. For example, for a 65-inch (165 cm) 4KTV, one has to sit at 8 ft (2.4 m) or closer to obtain a noticeable benefit.
Image credit: Carlton Bale
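That distance threshold follows from the commonly cited rule of thumb that the eye resolves detail down to about one arcminute per pixel. A minimal sketch of the arithmetic (the 16:9 panel geometry and the 1-arcminute figure are standard assumptions, not values from the chart itself):

```python
import math

def max_benefit_distance_ft(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    """Farthest viewing distance (in feet) at which pixels of the given
    display are still individually resolvable, assuming a 16:9 panel and
    ~1 arcminute of visual acuity. Beyond the 1080p set's distance, its
    pixels already blend together, so 4K's extra pixels add nothing."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # horizontal panel width
    pixel_in = width_in / horizontal_px              # width of one pixel
    # Distance at which one pixel subtends the acuity angle.
    distance_in = pixel_in / math.tan(math.radians(acuity_arcmin / 60))
    return distance_in / 12

# For a 65-inch panel, 1080p already looks "retina" beyond roughly 8.5 ft,
# which is why 4K only pays off when you sit closer than that.
print(round(max_benefit_distance_ft(65, 1920), 1))  # ~8.5 ft
print(round(max_benefit_distance_ft(65, 3840), 1))  # ~4.2 ft
```

The ~8.5 ft result lines up with the chart's figure of roughly 8 ft for a 65-inch set; small differences come down to the exact acuity value assumed.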
Knowing that this problem gets written about a lot (in a negative way) and will never be solved, the industry had to come up with a way to make 4KTVs look startlingly better, independent of resolution alone. And they found it. It's called HDR, introduced above. This effect is also well known to Apple customers because it's been a feature of the iPhone camera since 2010. However, including HDR in the source video and then transmitting that video in a streaming environment is much more difficult than in still photography. We can surmise that Apple video engineers know all about that.
The Fracturing of the Industry
One would think that an industry beset with customer resistance to another TV technology leap, memories of the ugly Blu-ray/HD-DVD war, new UHD disc standards, and the limits of human physiology would unite into a single HDR standard. Alas, they have not.
Articles #3 and #4 above tell the sorry story of a chaotic 4KTV industry, each manufacturer with its own smart TV OS and each with its own views about how HDR should be implemented. That leads to unique naming of each TV maker's vision. Samsung calls it "Peak Illuminator Ultimate." LG calls it "Ultra Luminance." And so on.
An HDR demo. Credit: 4K Ultra HD Review
The bad news is that while firmware can always be updated, HDR requires advanced phosphors/LCD crystals. Article #2 contains an explanation of how Quantum Dots (nanocrystals) can provide that wider dynamic range of luminance. If your old 4KTV lacks them and has a limited brightness range (measured in candelas/m²), or isn't OLED, it's never going to be able to implement a great version of HDR.
The worse news is that as each maker develops its own HDR implementation, the transmission of streaming video becomes more complex if not universally impossible. Today, certain video sources are best paired with certain brands of 4KTVs. As Jan Ozer in article #4 above explains it on page 2:
As a consumer, we seem to be entering a world where the set you buy is determined by the content you want to watch—Samsung for Amazon, LG and Sony for Netflix, and Vizio for Vudu. Call me overly frugal, but for a $2,000 purchase, that seems unrealistic. At least initially, you’d have to expect that this paradigm, like the Blu-ray vs. HD-DVD controversy, will slow sales until one or two standards emerge.
When, say, just two HDR standards emerge, it's reasonable for the TV makers to configure the electronics to switch between them. But not four or five.
Of course, none of this is meant to suggest that one cannot go out today and buy a Sony smart 4KTV (with an Android OS) or an LG 4KTV (with webOS) and watch what limited 4K content there is while upscaling the rest. (BTW, here's a full list of the OS each different smart TV uses.) Apple, however, is in a position of wanting everything to just work for the average consumer.
One can imagine that, given the turbulence in the industry with HDR and the still rather sparse content, Apple felt that the best course for this new 4th gen Apple TV would be to go with what customers know and understand—HDTV and Apple's own strengths with iOS and app development. I haven't seen a better theory.
1080p HDTV is ironclad and pervasive. It's the best user experience here in the fall of 2015. Comparison charts will be constructed by competitors showing that their product proudly has the 4K box checked, but now you know that nothing is as simple as it seems.
Next page: the tech news debris for the week of September 14. Making the Apple Pencil work.