Sexploitation. It's a word we like to think is confined to unsavory websites and isn't approved of in polite society. And yet. Just as with many other technology developments that can be misused, 3-D printing and robot technology have enabled the construction of, if you will, android sex dolls. There are few legal constraints on this, and we can probably expect to see it escalate quite a bit before social forces learn how to deal with it in a positive way.
The "Mark 1" Android. Image credit: Reuters.
The Particle Debris standout article of the week is from Wired: "The Scarlett Johansson Bot Is the Robotic Future of Objectifying Women." For reference, here's the original article via Reuters: "Building a humanoid Hollywood star."
One might think the Wired piece is an overly sensational article, but it's based on a sober realization of the social implications. Here's a quote; the reference is to actress Scarlett Johansson.
News broke on Friday about a Hong Kong designer who made a robot that looks just like the award-winning actress—although Ricky Ma, the robot’s creator, wouldn’t name the actress he modeled the bot on, choosing instead to call it Mark 1. It took Ma eighteen months and over $50,000 to complete the project, which he constructed on his patio with a 3-D printer and software that he taught himself how to use.
Aside from the creepy factor, the legal aspects are explored next by Wired:
But Ma made the robot in his house and may not necessarily profit. “So here Scarlett Johansson might face more First Amendment pushback,” said Calo. [a law professor at the University of Washington.] Still, according to Reuters, Ma does hope that an investor will buy his robot, which could give Scarlett Johansson clearer grounds to take legal action.
This article raises all kinds of interesting issues.
- What if the commercial version of such an android looked like no other person on earth? There's no one to sue. Does that constitute a legal green light?
- Should laws be enacted to prevent the construction of what are, essentially, sophisticated sex toys in the form of a human being?
- More to the point, does the construction and use of such devices lead to a consumer mentality that's dangerous? By analogy, Twitter has created, in some cases, an abusive environment directed against innocent women. Here's another: just as the occasional athlete, accustomed to professional violence, sometimes has a hard time controlling strong emotions with a partner, would the pervasive use of such commercial devices unshackle or promote bad social impulses with real human beings?
- Would a major tech giant like Apple or Google ever consider such a project? (Some pieces are moving into place, such as Siri, Atlas and Liam.)
Given enough exploration, one can probably think of other factors that come into play with this kind of technology.
From the movie Ex Machina.
The concept is not new. This fantasy, if you will, has been explored in science fiction for many years now. The most recent example is the movie Ex Machina. But it's quite another thing to confront the reality of the technology and explore the implications in real life (and the courts) than it is to explore the mere concept in a dramatic film or TV show.
Siri and Other AIs
In somewhat of an oblique fashion, the subject of Artificial Intelligence (AI) agents comes in here. I saw another, related article that explored the use of female voices for Apple's Siri, Microsoft's Cortana, and Amazon's Alexa: "On The iPhone And The Women In Star Trek." The argument in this case goes like this:
But the default voice for all these digital assistants—Apple, Amazon, Google, and Microsoft—comes from executives and engineers and employees who are decidedly male; roughly 80-percent, on average, and that percentage goes much higher when only engineering and executive jobs are considered.
Does Siri, in a female voice, objectify women?
The author goes on to conclude, "Sex sells." In this case, however, I believe the author's argument is weak. There are technical reasons why, say, Apple would select a female voice as the default for environments that would be expected to be noisy, such as restaurants. Many aircraft cockpits use a female voice in audible alerts, nicknamed "Bitching Betty," because formal testing has shown that certain frequencies and voices cut through noise and distraction in an emergency. The same reasoning applies to an AI assistant. Note that, in Settings, Apple's Siri can be changed to a male voice.
I don't know about the other AIs, but if there's no option to change the voice to male, then the PixoBebo author has a stronger case.
All in all, the emergence of these specific technologies is getting a healthy discussion going about these matters. In time, and it will take some time, we'll move from science fiction fantasies and dramas to a collective and firm social and legal grasp of the situation. And how to deal with it.
Next page: The Tech News Debris for the Week of April 4th. Microsoft's Metamorphosis. The coming Tesla-Chevy war.