The Cord Cutting Fantasy Isn’t Delivered With Just an Apple TV

4 minute read
| Columns & Opinions

The 4th generation Apple TV is a very nice device, designed to fit seamlessly into a modern HDTV home entertainment system. But for those with a cord cutting mindset trying to make the transition, the total solution is very complex. One needs a multitude of resources, and only one component is supplied by Apple.

Cord Cutting

For starters, the cord cutter needs a roadmap. Which essential services add up to what monthly bill? And what will the new hardware cost?

In terms of a roadmap, there’s nothing like a certain kind of graphic to convey information about multiple competitors. Business Insider has constructed such a chart: “This chart shows how absurdly complicated Netflix’s competitive landscape has become.” Not only does this chart size up the “competitive landscape” in streaming video but it also serves as a roadmap for those people trying to construct a cord cutter package that meets their needs.

Streaming services

See original article for full size chart.

For example, one might select Netflix as the anchor and add, say, CBS All Access and then add a sports package. What’s cool about the chart is that the pricing is included in each box, so one can estimate total monthly charges. I liked this chart a lot for that reason, but it also got me thinking about why the rate of cord cutting remains so low.
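Since each box on the chart carries a price, totaling a candidate bundle is simple addition. A minimal sketch of that estimate (the prices below are illustrative placeholders, not the chart's actual figures):

```python
# Hypothetical cord-cutter bundle. All prices are made-up
# placeholders for illustration, not the chart's real numbers.
bundle = {
    "Netflix": 9.99,          # anchor service (assumed price)
    "CBS All Access": 5.99,   # broadcast add-on (assumed price)
    "Sports package": 24.99,  # e.g. an MLB.TV-style plan (assumed price)
}

# Sum the per-service monthly charges to estimate the total bill.
monthly_total = sum(bundle.values())
print(f"Estimated monthly total: ${monthly_total:.2f}")
```

Swapping services in and out of the dictionary makes it easy to compare a few candidate bundles against the current cable bill before committing.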

Cord Cutting Agony

My first reaction was that cutting the cord is a very complicated process. If one has, say, recently upgraded a cable DVR, then one has committed to a new contract that's very expensive to break. Even if one can terminate the TV service after a pleasant (!) conversation with customer service, it might complicate the bundling on the ISP side if one is with, say, Comcast/Xfinity or Time Warner. And then one may have to spend some time ripping out hardware, reconfiguring, selecting a streaming box, and then selecting streaming services from the chart above.

That’s a big challenge for many families that have conflicting interests. Plus, because it’s a major undertaking, the scope and detail of a guide like this is still very limited. One almost needs an entire magazine, on paper, full of resources, scenarios, diagrams and step-by-step instructions. Complicating that is the arrival of 4K UHD TVs, HDR requirements, 4K UHD Blu-ray players and the need to upgrade, perhaps, a non-cable 4K DVR.

It’s a crazy mess, and the prospects are enough, I suspect, to cause most families to just stay with what they have, doing occasional, piecemeal replacements. One thing that can help is to find a CEDIA professional in your town who can advise and help implement.

Cord Nevers have an advantage: it's easy to start with something simple and build on it. In fact, it's probably a good idea for families with conventional systems to start over in, say, a den, and build a cord cutter system as an engineering prototype. Later, the configuration can be duplicated in other rooms.

Apple to the Rescue!?

If Apple so desired, there’s a lot more the company could do to make this process simpler and more friendly with cool, advanced hardware. For example,

  1. A simple 55- or 65-inch (OLED) 4K UHD TV with HDR that doesn’t phone home, betraying our privacy.
  2. A 4K OTA DVR (APFS on Flash/SSD) with a (thankfully) modest monthly subscription fee billed to iTunes.
  3. A compatible 5th generation 4K UHD Apple TV.

Everything would “Just work.” Now that would be cool.

Next page: The Tech News Debris for the Week of July 11th. iPhone stress relief.

7 Comments

  1. geoduck

    We’ve been exploring cord cutting and, well, it’s not been going well.
    Netflix Canada has much less content than the US version. I find I use Dailymotion more than anything else. Then there’s the time of year: it’s summer now, and my wife can use MLB.COM for her baseball fix, but come fall she’ll want to see the Vikings and Gopher sports, which we can’t seem to find online anywhere. Also, since we didn’t completely cut the cord, just cut back to a bare minimal package, we’re finding that we’re drifting back to turning on cable. It’s just there, easy, simple, a no-brainer. So yes, as absurdly priced as it is, cable is just an easier solution than cobbling together something out of multiple services that would likely cost us as much. I expect our experiment to come to an end this September.

  2. geoduck

    Oh, and one more thing. We keep looking at Apple TV and other such boxes, and it seems that whatever we want to watch has to be on our cable subscription to show up. Well, in that case, why bother with the box?

  3. skipaq

    We just switched from DirecTV to a small local cable provider that resells a handful of satellite channels along with local stations. Our monthly bill dropped $38. We also switched from Brighthouse internet service to DSL, saving another $15 per month.

    We use Netflix but are growing frustrated with the trend of pulling series or seasons from availability. This has happened in the middle of watching some series. More and more we are simply buying a couple of season passes a month in iTunes for what we want to watch. It costs less than the various bundled cable and satellite packages we have tried.

    We just won’t put up with the constant price increases from both Brighthouse (now owned by Charter) and DirecTV. Six to eight times a year our bill would go up.

  4. leeeoooooo

    I cut my cable a year and a half ago when I realized that what I was watching most of the time, MSNBC and Marvel’s Agents of SHIELD, were both available online. I quickly added a 3rd generation Apple TV with a Hulu subscription and haven’t looked back. I tried HBO Now a couple of times, but I find I really don’t watch it enough to justify the expense.
    I’m completely uninterested in sports, so that helps. I’m finding most of my supplemental viewing content comes from YouTube.

  5. orubin

    There are a few interesting side effects of “cord cutting” that often seem to be overlooked. Right now, it looks good because there is a lot of content that makes it from broadcast and cable TV to “cordless” services, so the idea seems very enticing. But who do you think is paying for that content? Right now it’s cable and satellite systems. But as subscribers go away and revenue drops, so does the vast array of programming. (The same argument can be made for a la carte services.) As much of the new programming disappears, what will be left is what mainstream America pays for, or a lot of independent low-budget schlock. Both are reasons for worry if you like programs outside the mainstream, something you get with hundreds of channels creating programming.

    Secondly, the cost is not necessarily going to be cheaper. When you start adding up HBO Go, Showtime to Go, Netflix, Amazon Prime, Hulu, CBS’s paid service, and the paid streaming services that will emerge as more people go streaming (and you will need them all, as each keeps exclusive content on its own system), you will pay the same or maybe more and, in return, get only a handful of channels where you once had hundreds. Sure, it’s “on demand,” but so is my Dish Network system for the most part (I do not work for them), and I have a LOT of choice right now. It may not be better. I suspect it won’t. It certainly will not be cheaper when you add in all the subscriptions and the cost of higher speed internet.

    I like the idea of cord cutting at times; just don’t fool yourself into thinking you will get more programming for less money. In the long run, it will be the other way around.

  6. wab95


    Greetings as I pen this post from my mother’s basement. Literally.

    You’ve provided much food for thought in this week’s PD, however, I’ll confine myself to but one.

    There is a unifying theme between Ed Wenck’s article about the future, in which he cites Ray Kurzweil that humans are ‘by our nature, linear’ with respect to our growth and adaptation, and Taha Khalifa’s piece on science catching up with science fiction, in which he poses a valid question; ‘At IFA 2015, one of the world’s leading trade shows for consumer electronics and technology applications, technology was brought to life, and the futuristic devices of our favourite sci-fi movies were launched; which brought me to think of a vital question, were those devices “smart?”’ I’ll focus on ‘smart’ in the sense of ‘a good idea’.

    This is a question that frequently goes both unasked and, when asked, unanswered in a coherent, consensual fashion, if for no other reason than there is no single authoritative voice to speak for society writ large, itself not a monolith but a diverse set of communities strewn across the planet with various cultural predispositions. To the extent that it is answered, it tends to be with our wallets, in which consumers, both personal and enterprise, either adopt a device (e.g. the iPhone) or it fails to even ignite the public appetite (e.g. Google Glass).

    This consumer behaviour, however, needs to be interpreted with caution and context. Some devices address an extant need, whether that need is expressed or latent, and can be embraced within the constraints of current social norms, like the iPhone. Other devices, like Glass, present more of a challenge, in that current social norms, or at least the rules and laws that govern them, need modification in order to accommodate the device. A device might in fact be a great idea, but either ahead of its time, or not sufficiently thought through to permit widespread adoption without social discomfort, if not disruption.

    An important observation that often goes unremarked, and when remarked, whose implication to device creation and adoption is seldom applied to which technologies should be pursued and are ‘smart’ in the sense that they are timely, appropriate and necessary. This goes back to the theme that humans are linear. Wherever we are in our social evolution, apart from the visionaries amongst us, our horizon is bent by the gravitational pull of our social, cultural and technological condition, and the experiences and expectations that these suggest. All one need do is look at the future predicted by a previous generation (e.g. 1927’s Metropolis) to appreciate how quaint many of that generation’s notions about their potential ‘future’ were, and how these were shaped by their experience and culture. Indeed, one can even see it in more modern offerings, like ST:TNG, in which software-related security issues, which any adolescent today would expect, are wholly absent in the Federation’s flagship, not to mention access to the ship’s vital systems. Late 20th Century humanity simply wasn’t there yet.

    Kurzweil’s assertion that we are linear is plausible insofar as the rate at which society grows without disruption; however, it is incomplete to the point of being incorrect with respect to how we grow. Our technology, as we have discussed in this forum many times, is a socially and culturally transformative experience. It alters both how we see ourselves today, and what we both aspire to and expect for ourselves tomorrow. In short, the interplay between ourselves and our tech is fundamentally organic, not linear. Organic growth is not linear, but an expressed interaction between nascent potential and the environment in which that growth occurs; e.g. a person may have the potential to grow to over 6 feet in height, but their childhood diet may constrain that growth. The tech we adopt, and how it feeds our cultural behaviour and expectations, in turn shapes the direction we take, and the environment constrains how far and how fast we get there.

    For tech companies the lesson is obvious. Future directions in tech development are not, and have never been, a slavish and uncritical pursuit of the dreams of our forefathers (to use a phrase), but an organic outgrowth of our real life’s trajectory. And when this is married to vision, genius and daring, it can witness a cognitive and technological leap of which a previous generation had never dreamt.

  7. wab95


    I just went back and re-read my post and found a typographical error that renders the first sentence in paragraph 6 uninterpretable. It reads (change italicised),

    “An important observation that often goes unremarked, and when remarked, whose implication to device creation and adoption is seldom applied to which technologies should be pursued and are ‘smart’ in the sense that they are timely, appropriate and necessary. ”

    It should read,

    “An important observation that often goes unremarked, and when remarked, whose implication to device creation and adoption is seldom applied, is which technologies should be pursued and are ‘smart’ in the sense that they are timely, appropriate and necessary. ”

    Whether autocorrect is to blame, or my thinking faster than I can type, it would sure be great if we could go back and correct our posts.

    That said, one thing I failed to emphasise in discussing which devices are ‘smart’, in the sense of being a good idea, is that, occasionally, devices appear to be created simply because a company can create them based on a newly minted technology, and often based on a legacy science fiction idea without adaptation to society’s current direction. The credit-card-sized pocket organiser featured in Arthur Clarke’s ‘The Fountains of Paradise’ comes to mind, which came to market in about 1987 and lasted for all of about a minute, maybe two. But others could include Samsung’s conceptualisation of the smart watch, which had the thematic consistency of a chimera, and was little more than a legacy concept of how many miniaturised technologies could be shoe-horned into a wrist device.

    These devices fail to correct for how society’s direction and culture have already been altered since that science fiction idea was first conceived. These devices (or ideas if they get beyond being lab prototypes) can fail simply because they are an idea from the past, proposed for a future that is not now, nor will likely ever be, or as conceived, are woefully incomplete and inadequate.

    Humanoid robots, in my view, fit this category. Engineers have found that true robots (automatons that are designed to do repetitive tasks more quickly than can a human) do not need to look humanoid, and are in fact better suited for their tasks when their design is optimised for that task.

    That is a fundamentally different issue from that of androids, with which ‘robots’ should not be confused. Androids are artificial humans for whom we have yet to create a consensus justification beyond the titillating, let alone the technology and socio-legal framework for coexistence.
