Page 2 – The Tech News Debris for the Week of October 3rd.
Why We Shouldn’t Laugh at the AirPods
UBS financial analyst Steve Milunovich has presented an interesting theory about Apple. The discussion is best presented by CNBC in “Don’t laugh at AirPods — they’re part of Apple’s ‘ambient paradigm’.”
The Milunovich note is presented by author Arjun Kharpal:
The ambient paradigm consists of many devices providing different input/output methods that can be flexibly utilized depending on the situation (sitting, walking, running, driving). Collectively, these devices offer the capability of earlier products and more, delivered as a seamless user experience.
Devices become extensions of one another rather than discrete computing platforms. It is an expression of what Tim Cook has described as ‘iOS everywhere.’
Analyst Milunovich goes on to note how Apple products, when first introduced, seem orthogonal or unrelated to any master plan. In time, however, Apple connects the dots with integrative thinking, and we see where things are going. One example is the Apple Wallet, which paved the way for Apple Pay. Or the new Home app in iOS 10, which is probably filed away on the back page but will soon emerge as essential. Milunovich continues:
Apple often introduces important long-term technology in a more modest form that reveals only part of its eventual capability. For instance, Apple appears to have settled on the recent version of the Watch as a fitness play, but we expect health and Internet of Things interactions to become integral over time.
Another fascinating notion presented is that of “Scarcity Theory.”
… new technology makes previously scarce resources abundant while creating new scarce resources. Users are both overserved and underserved at the same time, creating demand for the new job-to-be-done enabled by new capabilities. Winning vendors identify and own the next scarce resource.
All this is why Apple is undervalued, according to UBS. As Apple integrates its ambient products and resolves (and creates) scarcity issues, the customer becomes more immersed in, and dependent on, the ecosphere that’s been created. New software services can be the glue that propels new growth.
It’s been rumored that Apple will use an OLED display in the next iPhone in 2017. On the other hand, the company might use OLEDs only until another technology matures: micro-LED. Few know for sure, but here’s a story about the OLED side: “Apple And Sharp Meeting Up Over OLED Screens For iPhone 8.”
As predicted, Comcast is rolling out its 1 TB monthly data plan. While Business Insider dramatically paints this as terribly restrictive and deeply unpopular, I don’t see it as anything draconian by Comcast. If you read the BI article and my previous analysis, you’ll conclude it’s all a tempest in a teapot.
There are conflicting opinions in the press about how annoyed Apple may or may not be with Intel. The likely answer is that Apple is both annoyed and not annoyed in a kind of Schrödinger superposition of states. If you’d like to see both sides of the argument, here’s a good one: “Apple and Intel Parting Ways? Don’t Be Ridiculous.”
On the other hand, the Broadwell late delivery fiasco might have permanently convinced Apple to bypass Intel’s production snafus. See this from Jonny Evans: “Will next year’s iPads be faster than Macs?” Last week’s Particle Debris also raised this specter.
Finally, which country do you think has the most malware-infected devices? The U.S.? Here’s a sobering article at CNBC.
Particle Debris is generally a mix of John Martellaro’s observations and opinions about a standout event or article of the week (preamble on page one) followed on page two by a discussion of articles that didn’t make the TMO headlines, the technical news debris. The column is published most every Friday except for holidays.
One thought on “Artificial Intelligence Research, Unintended Consequences and Sex”
Pardon the duplication; I’m reposting a comment that I meant to put on this article.
The problem with voice recognition (and this just happened to me with iMessages when I tried it again after reading this article) is that if you use a new word, it makes a mistake and you have to correct it manually. That defeats the purpose of using dictation. Okay, it’s too much to ask the app to know a word it hasn’t heard before, but there must be a way to make the correction through the voice interface as well. Now that seems to be a big problem for a silicon mind: how does iMessages’ natural language interface distinguish between a command and actual text without reserving words that signal that a command or text is about to follow? Humans easily make this distinction in conversation. A stenographer easily figures out whether what she is hearing is text to be transcribed or an aside. And if she makes a mistake, it is easily corrected through conversation. She easily infers intent; she has a good idea of what’s in the head of the person giving dictation.
I think this is a challenge that pervades all of AI: inferring idiosyncratic intent.
There is a much-reproduced experiment in psychology (a field notorious for irreproducible experimental results) showing that humans, even as young as two years old, understand that ‘people other than myself have their own thoughts and own sets of knowledge’ and, more importantly, can make accurate guesses about what another person is thinking and knows.
Earlier this week, news came out that this experiment was run on bonobos, chimps, and orangutans, and the key finding is that they also exhibit this ability. (Well, maybe not to the extent that we do.) Here…
As far as I know computers can’t do this, and as long as they don’t, AI systems will not be able to do the Star Trek-like things that people hope they can do in the future.