Morality for robots?

Book cover of "The Machine Question: Critical Perspectives on AI, Robots, and Ethics." Credit: The MIT Press

On the topic of computers, artificial intelligence and robots, Northern Illinois University Professor David Gunkel says science fiction is fast becoming "science fact."

Fictional depictions of robots have run the gamut from the loyal Robot in "Lost in Space" to the killer computer HAL in "2001: A Space Odyssey" and the endearing C-3PO and R2-D2 of "Star Wars" fame.

While those robotic personifications are still the stuff of fiction, the issues they raise have never been more relevant than they are today, says Gunkel, a professor of communication technology.

In his new book, "The Machine Question: Critical Perspectives on AI, Robots, and Ethics" (The MIT Press), Gunkel ratchets up the debate over whether, and to what extent, intelligent and autonomous machines of our own making can be considered to have legitimate moral responsibilities and any legitimate claim to moral treatment.

"A lot of the innovation in thinking about machines and their moral consideration has been done in science fiction, and this book calls upon fiction to show us how we've confronted the problem," Gunkel says. "In fact, the first piece of writing to use the term 'robot' was a 1920s play called 'R.U.R.,' which included a meditation on our responsibilities to these machines."

Ethics is typically understood as being concerned with questions of responsibility for and in the face of an "other," presumably another person.

But Gunkel, who holds a Ph.D. in philosophy, notes that this cornerstone of modern ethical thought has been significantly challenged, most visibly by animal rights activists but also increasingly by those at the cutting edge of technology.

"If we admit the animal should have moral consideration, we need to think seriously about the machine," Gunkel says. "It is really the next step in terms of looking at the non-human other."

The NIU professor points out that real decision-making machines are now ensconced in business, personal lives and even national defense. Machines are trading stocks, deciding whether you're creditworthy and conducting clandestine drone missions overseas.

"Online interactions with machines provide an even more pervasive example," Gunkel adds. "It's getting more difficult to distinguish whether we're talking to a human or to a machine. In fact, the majority of activity on the Internet is machine traffic—that is, machine to machine. Machines have taken over; it has happened."

Some machines even have the ability to innovate or become smarter, raising questions over who is responsible for their actions. "It could be viewed as if the programmer who writes the original program is like a parent who no longer is responsible for the machine's decisions and innovations," Gunkel says.

Some governments are beginning to address the ethical dilemmas. South Korea, for instance, created a code of ethics to prevent human abuse of robots—and vice versa. Meanwhile, Japan's Ministry of Economy, Trade and Industry is purportedly working on a code of behavior for robots, especially those employed in the elder-care industry.

Ethical dilemmas are even cropping up in sports, Gunkel says, noting recent questions surrounding human augmentation. He points to the case of South African sprinter and double amputee Oscar Pistorius, nicknamed "Blade Runner" because he runs on two prosthetic legs made of carbon fiber.

In 2008, Pistorius was barred from competing in the Beijing Olympics over concerns that his prostheses gave him an unfair advantage. That decision was successfully challenged, and Pistorius competed in the 2012 London Games.

Similar concerns about the fairness of human augmentation can be seen in the recent crisis "concerning pharmacological prosthetics, or steroids, in professional baseball," Gunkel says. "This is, I would argue, one version of the machine question."

But Gunkel says he was inspired to write "The Machine Question" because engineers and scientists are increasingly bumping up against important ethical questions related to machines.

"Engineers are smart people but are not necessarily trained in ethics," Gunkel says. "In a way, this book aims to connect the dots across the disciplinary divide, to get the scientists and engineers talking to the humanists, who bring 2,500 years of ethical thinking to bear on these problems posed by new technology.

"The real danger," Gunkel adds, "is if we don't have these conversations."

In "The Machine Question," Gunkel frames the debate, which in recent years has ramped up in academia, where conferences, symposia and workshops carry provocative titles such as "AI, Ethics, and (Quasi) Human Rights."

"I wanted to follow all the threads, provide an overview and make sure we're asking the right questions," Gunkel says.

He concludes in his new book that the moral community has indeed been far too restrictive.

"Historically, we have excluded many entities from moral consideration and these exclusions have had devastating effects for others," Gunkel says. "Just as the animal has been successfully extended moral consideration in the second-half of the 20th century, I conclude that we will, in the 21st century, need to consider doing something similar for the intelligent machines and robots that are increasingly part of our world."

"The Machine Question" is available for purchase through The MIT Press, amazon.com and numerous other book sellers. Gunkel is author of two other books, "Hacking Cyberspace" and "Thinking Otherwise: Philosophy, Communication, Technology."

