Have you committed adultery if you sleep with a robot? Are you guilty of murder if your robotic prosthesis arm malfunctions and kills someone? Could you work for a robot boss?
There was a time when such questions were considered only by writers of science fiction. But these ethical and practical questions are already being raised, and they will have to be answered in the coming decades as robotic technology advances, as the book reviewed in the following piece from yesterday’s newspaper makes clear.
Introducing Robot Ethics
Thirty years ago, few people envisioned just how completely computers would be integrated into our everyday lives; today, they’re everywhere. In Robot Ethics: The Ethical and Social Implications of Robotics, Patrick Lin (a science ethicist), Keith Abney (a philosopher of science) and George Bekey (a computer scientist) argue that the same is true about robots. Today, they are technological oddities; tomorrow, they’ll be ubiquitous and indispensable. That’s why, they write, we need “the emerging field of robot ethics.”
In their introduction to the book, which is a collection of essays on robot ethics by philosophers, lawyers, and scientists, Lin, Abney, and Bekey point out that people have been thinking about the ethics of robotics for millennia. Isaac Asimov’s three laws of robotics are only the most recent entry in a long tradition. “Homer,” the editors write, “described in his Iliad the intelligent robots or ‘golden servants’ created by Hephaestus, the ancient Greek god of technology… Leonardo da Vinci conceived of a mechanical knight that would be called a robot today.” But the need for a serious inquiry into robot ethics is now greater than ever before, because robots are now advanced enough to participate, on their own, in the ethical world:
[I]n August 2010, the U.S. military lost control of a helicopter drone during a test flight for more than thirty minutes and twenty-three miles, as it veered toward Washington, D.C., violating airspace restrictions meant to protect the White House…. In October 2007, a semiautonomous robotic cannon deployed by the South African Army malfunctioned, killing nine “friendly” soldiers and wounding fourteen others….
Already, robots are taking care of our elderly and children…. Some soldiers have emotionally bonded with the bomb-disposing PackBots that have saved their lives, sobbing when the robot meets its end.
Already, fascinating moral questions are emerging. If a robot malfunctions and harms someone, who is responsible — the robot’s owner, its manufacturer, or the robot itself? Under what circumstances can robots be put in positions of authority, with human beings required to obey them? Is it ethically wrong for robots to prey upon our emotional sensitivities — should they be required to remind us, explicitly or implicitly, that they are only machines? How safe do robots need to be before they’re deployed in society at large? Should cyborgs — human beings with robot parts — have a special legal status if their parts malfunction and hurt someone? If a police robot uses its sensors to perform a surveillance operation, does that constitute a search? (And can the robot decide if there is probable cause?)
Some of these questions are speculative; others are uncomfortably concrete. Take this example involving (what else?) robot sex, from an essay by David Levy:
Upmarket sex dolls were introduced to the Korean public at the Sexpo exposition in Seoul in August 2005, and were immediately seen as a possible antidote to Korea’s Special Law on Prostitution that had been placed on the statute books the previous year. Before long, hotels in Korea were hiring out “doll experience rooms” for around 25,000 won per hour ($25)…. This initiative quickly became so successful at plugging the gap created by the antiprostitution law that, before long, establishments were opening up that were dedicated solely to the use of sex dolls… These hotels assumed, quite reasonably, that there was no question of them running foul of the law, since their dolls were not human. But the Korean police were not so sure. The news website Chosun.com… reported, in October 2006, that the police in Gyeonggi Province were “looking into whether these businesses violate the law . . . Since the sex acts are occurring with a doll and not a human being, it is unclear whether the Special Law on Prostitution applies.”
It seems inevitable, Levy writes, that more advanced “sexbots” will push this issue even more to the fore, forcing lawmakers to figure out just which aspects of prostitution they want to outlaw.
Levy’s sexbot example is emblematic of a theme running through this collection of essays: The ethical problems posed by robots aren’t just about the robots. They’re also about old, familiar human behaviors which we must reconsider once robots are introduced. How will spouses feel, Levy asks, about the use of sexbots? Some will see it as adultery, others as intrinsically meaningless. The answer, Levy argues, really has nothing to do with the robots themselves. “It will depend very much,” he writes, “on the sexual ethics of the relationship itself when robots do not enter the picture.”
Let’s consider the questions I posed to open this post:
Have you committed adultery if you sleep with a robot?
To have sex with a doll seems to me to be little more than complicated masturbation. But what if one develops an emotional attachment to the robotic doll? That may sound weird, but folks have been known to become attached to their cars, trucks, clothes, photos — all manner of things, and none of them provide sexual gratification.
Are you guilty of murder if your robotic prosthesis arm malfunctions and kills someone?
I don’t see how it could rise to the level of murder, but what if the prosthesis required regular maintenance and one continuously neglected to perform it? In such a case, I could see a district attorney going for a charge of involuntary manslaughter.
Could you work for a robot boss?
I think some of us already do. Take certain delivery drivers. While they may have a human “boss,” their day’s work is dictated by a computer that determines which stops they will make and, often, in which order.
I think the near future will bring some interesting and challenging developments in the field of robotics. We might well see robotic cops or soldiers. When you get sick, you might first have to be diagnosed by a robot doctor before you get to see a human one. And who knows, if the field advances far enough, we might one day even see the Massachusetts Supreme Judicial Court leading the way again by making it legal to marry your robotic lover.
What are your answers to the three questions?
Do you think this is all just foolish fantasy or can you foresee a time when these and the questions in the article will have to be faced?
And can you think of others who are, directly or effectively, working for a robot boss?