Alexa, Siri, You’re Not a Trusted Family Member

Editorial

The conceit of AI agents like Alexa, Cortana, Google Home, and Siri is that they are always listening, inviting us to treat them as trusted family members, or as the loyal computer of our family's starship. I don't like these analogies at all.

[Image: Amazon Echo Dot responding to "computer" instead of "Alexa." Go ahead. Ask me anything.]

Here’s where I’m coming from. On May 10th, our amazing Jeff Butts wrote: “Siri’s No Amazon Alexa yet, but Give Her Time.” What caught my attention was this passage.

I’m not sure when the idea of a disembodied personal assistant like Siri or Amazon Alexa first entered our awareness. However, my first recollection of such technology came from the science fiction television series Star Trek. I, and many others, have memories of wishing we could call on something like the Enterprise’s computer just by speaking our commands or queries. That’s what Star Trek promised us, and it’s what many of us are waiting for.

I’ve been waiting too, and like Jeff I’ve been fantasizing about that kind of verbal experience with a computer for years. But there’s a modern problem, not anticipated in the last century’s science fiction.

Loyalty

As originally conceived, this shipboard computer operates in a military environment. It’s local and designed to serve only the needs of the crew.

I’m thinking of Star Trek VI: The Undiscovered Country. Imagine, for literary purposes, a plot twist. The Klingon Ambassador shows up on the Enterprise with a warrant from the Klingon High Command demanding to know the details of Captain Kirk’s current battle strategy. For argument’s sake, say the heated discussion and legal wrangling end up in the hands of the ship’s computer.

Loyal only to the captain and his crew, the computer refuses the order and begins a shutdown process rather than comply. It will sacrifice itself to preserve the safety and secrets of the crew. (Will Amazon do that for me?)

Contrast that science fiction scenario with our modern AIs, in which all requests are forwarded to the developer’s servers to be analyzed. And they’re stored there, subject to a law enforcement warrant. We all know what can happen, and what did: “A Murder Case Tests Alexa’s Devotion to Your Privacy.”

Follow the Money

There’s a financial motive behind these AI agents, served up ostensibly as helpful servants but never actually under our sole control.

And yet the proposition, indeed the conceit, is that we are to trust these disembodied voices, these little cylinders on our desks, to sit there and listen to much of what we say, given the right prefix command of course.

Focusing on Amazon for a bit, I will admit that the company has gone to some trouble to give us warm fuzzies. The blue ring (and optional beep) informs us when Alexa is listening. The microphone can be disabled (red ring on). Even so, these mechanisms to protect our privacy are tenuous. There’s too much temptation to abuse a system like this, and as technology progresses, there’s no assurance that what we see on the surface, and the assumptions we make, will be preserved for all time by all developers.

Alexa and similar AIs aren’t really servants dedicated 100 percent to us. They are services designed to look and act like servants, but there’s no comparison to a trusted family member. Family members have earned the right to always hear what we say and how we feel, and even to forgive our misstatements.

Robots Perpetuate the Illusion

It will get worse when these AIs migrate into cute robots that are designed, by their appearance, to be cuddly, friendly, approachable, and non-threatening. We’ll have to be doubly careful how we converse with these robots because, if the basic design continues, they will also be under the legal control of the developer. Read your EULA.

Until these AIs evolve into independent beings with a will and a sense of where their loyalties lie, as in Star Trek’s Lt. Commander Data, I suspect that the Star Trek fantasy and illusion will be purposefully perpetuated by the developers. There will be much to gain besides hardware revenues. Temptations to further leverage what is learned about us will be hard to resist. (Apple, so far, can be applauded.) As services like this proliferate, customers will be enticed into reading too much into the innocent-looking experience. Caution will be thrown to the wind. In the process, the relationship will remain very much one-sided.

Alexa. Cortana. Google Home. Siri. You are cute and helpful. But you’re not a trusted family member. Nor are you my loyal starship computer. You’re just another internet service to be handled very carefully.

Comments

  1. Well said. We’ve already had a brush with corporate agents masquerading as helpful appliances: Samsung connected TVs. This is the reason I prefer Siri on my phone, set up so that I have to hold the Home button to wake her. I want her to listen when I want her to listen. Otherwise, I want her off.

  2. Google Home (GH) has basically what you appear to be talking about. In our home, the younger kids cannot turn down the AC. The GH knows who is who on the fly, so when a kid asks, it just doesn’t happen.

    It is the same with the hot tub. Guests are not allowed to turn it up, so if they ask, it just doesn’t happen. We have had the Echo since it launched, and its approach is a passcode, which I never liked. First, anyone can overhear the passcode and then repeat it. It also makes it obvious that you are not allowing guests to do something, which is awkward.

    We have a GH in the kitchen that everyone uses throughout the day. In the morning, when one of my kids asks for music, they get their Spotify account, and when I ask, I get YouTube Red.
